XR-HIVE - SINES Lab for Research on Immersive Technologies

Vision

At the Hub of Immersive Virtual Environments, our vision is to be at the forefront of research in multimodal interaction and immersive technologies.

We aspire to lead the way in creating extraordinary, immersive experiences across diverse application domains, pushing the boundaries of human-computer interaction and redefining the possibilities of immersion in a digital world.

We aim to be a trusted resource for industry and academia, helping to drive advancements that have a tangible impact on a variety of sectors.

Mission

Our mission at the Hub of Immersive Virtual Environments is to drive innovation through rigorous research and experimentation. We are dedicated to exploring, developing, and harnessing cutting-edge technologies to advance the state of immersive visualization and intuitive interaction.

Through collaboration, experimentation, and a commitment to excellence, we seek to empower our community, our partners, and the world at large with transformative insights and solutions that enhance the way we connect with technology, information, and each other.

Our Research Focus

At the HIVE, we investigate avenues of multimodal interaction for developing enriched experiences. Our work is primarily focused on four broad areas:

  1. Virtual and Augmented Reality: Immersive visualization techniques for various applications
  2. Haptic Interaction: Modeling high-realism force feedback for various applications
  3. Brain-Computer Interfaces: Cognitive state estimation and neurofeedback for enhanced experiences
  4. Computer Graphics / Vision: Out-of-the-box solutions for conventional problems in CG and CV

Research Funding

In Progress

Completed

Research Team

Current

  1. Amna Khan PhD CSE (2022-Present)
  2. Warda Ayaz MS CSE (2024-Present)

Alumni

  1. Muhammad Hamza Saleem (2024-2025)
    Comparative Evaluation of Cybersickness Mitigation Techniques using a Unified Scoring System in VR
  2. Hifsa Shahid (2024-2025)
    Emotional Responses to Design Elements in Virtual Environments
  3. Mian Muhammad Fatik Owais (2022-2024)
    Electric Muscle Stimulation for Haptic Feedback in Virtual Reality Environment
  4. Maira Sohail (2022-2024)
    Immersive Virtual Reality based Gamified Stereochemistry Learning
  5. Fiza Azam (2023-2024)
    Collaborative Task Performance via Real-Time Interaction with Intelligent Virtual Agents
  6. Raheela Raza (2023-2024)
    Exploring provision of hints in a puzzle game and their influence on engagement and performance
  7. Urwa Ejaz (2023-2024)
    Investigating the Neural Correlates of Stiffness Perception using Force and Pseudo-Haptic Feedback
  8. Muhammad Adil Talay (2020-2024)
  9. Sofia Mohammad (2023-2024)
    An exploration of strategies for effective placement of advertisements in the Metaverse
  10. Amna Naeem (2022-2023)
    Reinforcement Learning Based Agent Training for User Privacy in Metaverse
  11. Maria Maqbool (2021-2023)
    Empowering eco-friendly habits - Designing interactive virtual environments for attitude and behaviour change towards energy conservation
  12. Kiran Firdaus (2022-2023)
    Human stress classification using EEG in response to stand-up comedians’ clips
  13. Irsa Abbasi (2020-2023)
    Developing a virtual reality approach towards a better understanding of different types of enzymes
  14. Ahmad Javaid (2021-2022)
    Analysis of vestibulo-ocular effects on motion sickness in flight simulation
  15. Syeda Yumna Nasir (2020-2022)
    Pseudo-haptic feedback through mid-air action for learning of chemical bond strengths
  16. Hafsa Tahir MS CSE (2020-2022)
    Force feedback for collision avoidance in UAV teleoperation through virtual corridors
  17. Amna Khan (2021-2021)
    Game-induced emotion analysis using electroencephalography
  18. Neelam Shoaib (2020-2021)
    Virtual reality based procedural memorization of general aviation light aircraft
  19. Attia Nafees ul Haq (2020-2021)
    Pure mental state detection using EEG
  20. Muhammad Ali Bilal (2018-2021)
    Cognitive workload analysis in visual and auditory task using EEG signals
  21. Umar Shahid (2019-2020)
    EEG based mental workload assessment using machine learning
  22. Muhammad Adil Talay (2018-2020)
    Few-shot metric learning for remote sensing image scene classification
  23. Zain ul Abideen (2018-2020)
    Development of a cost effective training system for small arms shooting training
  24. Hassam Ahmed Malik (2018-2020)
    Effect of haptic feedback on pilot/operator performance during flight simulation
  25. Amal Fatemah (2018-2019)
    Design of an integrated pipeline for the visualization of 3D molecular models to study the effects on spatial learning ability
  26. Hasnain Rashid (2017-2019)
    Automatic cell detection and counting of microscopic images using machine learning
  27. Aroosh Fatima (2017-2018)
    Using deep learning for image and video compression
  28. Syed Rameez Rehman (2016-2018)
    A framework for cardboard based augmented reality
  29. Samin Kainat (2016-2017)
    Man made world image matching over wide baselines