XR Projects

Explore our cutting-edge XR projects that push the boundaries of augmented, virtual, and mixed reality. Each project showcases our expertise in creating immersive and interactive experiences.

Wrist Interaction


Wrist Interaction offers an engaging way for users to interact with digital content through natural hand movements, which is especially valuable in VR and AR applications. The interaction model provides intuitive control over objects and interfaces based on wrist orientation and movement, making it well suited to interactive learning environments, training simulations, and gaming interfaces.

Key Features and Applications:

Tracked Transform: By monitoring the user's hand or controller movements, the system can precisely activate and manipulate objects based on wrist orientation. This feature enables a more immersive and responsive experience, ideal for interactive learning environments and simulations.

Object Array Activation: Developers can specify multiple objects to be activated in sequence, such as numbers or symbols, providing a dynamic way to teach concepts like counting, sequencing, or categorization. This aspect is particularly beneficial for educational applications, where visual and interactive elements can greatly enhance understanding.

Elastic Scaling and Positioning: Smooth scaling animations and elastic positioning offer visual feedback that helps users understand the impact of their movements. This feedback loop is crucial for learning, as it reinforces the connection between physical actions and digital outcomes, making the interaction more intuitive and memorable.

Adaptive Offsets and Velocity Tracking: The system adjusts object positioning based on the speed and direction of wrist movements. This capability can be utilized in training simulations, where users can practice and refine their motor skills, or in gaming interfaces, where quick reflexes and precise control are necessary.

Spring Dynamics and Damping: Applying spring-like dynamics to object movements creates a natural, lifelike interaction. This feature enhances the realism of VR/AR experiences, making them more immersive and engaging for users; a brief sketch of this spring-and-damping behavior follows below.
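To make the feature list concrete, the following is a minimal Unity sketch of the kind of wrist-driven activation, velocity offset, and spring-damped motion described above. It is not the project's actual implementation; the field names, offsets, and thresholds are illustrative assumptions.

```csharp
using UnityEngine;

// Minimal sketch of the wrist-driven interaction described above.
// Field names, offsets, and thresholds are illustrative assumptions,
// not the project's actual API.
public class WristFollowSketch : MonoBehaviour
{
    [SerializeField] Transform wrist;              // tracked hand or controller transform
    [SerializeField] Transform target;             // object activated by the wrist
    [SerializeField] float velocityOffset = 0.05f; // metres of offset per m/s of wrist speed
    [SerializeField] float smoothTime = 0.15f;     // spring-damper response time
    [SerializeField] Vector3 activeScale = Vector3.one;

    Vector3 previousWristPos;
    Vector3 positionVelocity; // internal state for SmoothDamp
    Vector3 scaleVelocity;

    void Start()
    {
        previousWristPos = wrist.position;
    }

    void Update()
    {
        // Velocity tracking: how fast is the wrist moving this frame?
        Vector3 wristVelocity = (wrist.position - previousWristPos) / Mathf.Max(Time.deltaTime, 1e-5f);
        previousWristPos = wrist.position;

        // Adaptive offset: push the resting pose in the direction of movement.
        Vector3 desired = wrist.position + wrist.forward * 0.1f + wristVelocity * velocityOffset;

        // Spring dynamics and damping: SmoothDamp behaves like a critically
        // damped spring, giving the elastic feel without overshoot.
        target.position = Vector3.SmoothDamp(target.position, desired, ref positionVelocity, smoothTime);

        // Activation by wrist orientation: scale the object up only while the
        // wrist's local up axis faces roughly upward.
        bool palmUp = Vector3.Dot(wrist.up, Vector3.up) > 0.5f;
        Vector3 targetScale = palmUp ? activeScale : Vector3.zero;
        target.localScale = Vector3.SmoothDamp(target.localScale, targetScale, ref scaleVelocity, smoothTime);
    }
}
```

Vector3.SmoothDamp is used here as a stand-in for the spring behavior: it approaches the target like a critically damped spring, which produces the elastic feel without overshooting.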

Wrist Interaction serves as an effective learning tool by incorporating natural hand movements into digital interfaces, allowing users to intuitively explore and interact with content. This approach is ideal for educational tools, interactive exhibits, and immersive training programs.

Interactive Gauge Control in VR/AR


The UIGaugeController script provides an interactive and intuitive way for users to engage with virtual gauges using hand gestures in VR/AR environments. This system is particularly useful in educational simulations, automotive training, and other technical fields where users can learn and explore concepts like speed regulation, engine performance, and temperature control.

Key Features and Applications:

Hand-Tracked Control: By tracking the user's hand movements, the script enables precise control over the gauge needles, letting learners adjust a value directly and immediately see the result in an engaging, hands-on manner (see the sketch after this list).

Multi-Gauge Interaction: The system supports multiple gauge needles, each representing different metrics. This feature is particularly useful in training environments where users can simulate and monitor multiple variables simultaneously, enhancing their understanding of complex systems.

Adaptive Feedback: The script includes adaptive feedback mechanisms, such as elastic scaling and damping, which provide realistic responses to user input. This makes the interaction more natural and intuitive, crucial for creating believable simulations in automotive training, aviation control panels, and other technical fields.

Learning Through Interaction: The interactive nature of the gauges allows users to learn by doing, a method proven to be effective in educational settings. By directly manipulating the gauges, users can experiment with different scenarios, observe outcomes, and develop a deeper understanding of the underlying principles.
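As a rough illustration of how such a gauge might respond to a tracked hand, the sketch below maps the hand's position on the gauge face to a damped needle angle. It is not the UIGaugeController itself; the hand and needle references, angle range, and smoothing values are assumptions.

```csharp
using UnityEngine;

// Minimal sketch of hand-driven gauge control as described above.
// The tracking source, field names, and tuning values are illustrative assumptions.
public class GaugeNeedleSketch : MonoBehaviour
{
    [SerializeField] Transform hand;          // tracked hand or controller
    [SerializeField] Transform needle;        // gauge needle pivot
    [SerializeField] Transform gaugeCentre;   // centre of the gauge face
    [SerializeField] float minAngle = -120f;  // needle angle at the minimum value
    [SerializeField] float maxAngle = 120f;   // needle angle at the maximum value
    [SerializeField] float smoothTime = 0.1f; // damping for the needle

    float currentAngle;
    float angleVelocity;

    void Update()
    {
        // Project the hand position into the gauge's local space and read its
        // angle on the gauge face, then clamp it into the needle's range.
        Vector3 local = gaugeCentre.InverseTransformPoint(hand.position);
        float raw = Mathf.Atan2(local.y, local.x) * Mathf.Rad2Deg; // -180..180
        float value = Mathf.InverseLerp(minAngle, maxAngle, raw);  // 0..1, clamped
        float target = Mathf.Lerp(minAngle, maxAngle, value);

        // Damped approach keeps the needle from snapping to the new value.
        currentAngle = Mathf.SmoothDampAngle(currentAngle, target, ref angleVelocity, smoothTime);
        needle.localRotation = Quaternion.Euler(0f, 0f, currentAngle);
    }
}
```

Mathf.SmoothDampAngle handles the wrap-around at ±180° and supplies the damped, spring-like response described above; driving several needles is a matter of repeating the same mapping per needle.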

Overall, the UIGaugeController script offers a versatile platform for creating interactive, educational, and training experiences in VR/AR. Its ability to simulate real-world controls and provide immediate, intuitive feedback makes it an invaluable tool for learning and skill development in various fields.

VR Interactive Arrow Grid System


The VRArrowController and ArrowGrid scripts offer a compelling way to create interactive VR experiences where arrow objects respond to user hand movements. This system can be applied in various contexts, such as interactive exhibits, navigational aids, or educational tools that respond to user input.

Key Features and Applications:

Hand Tracking and Interaction: Using hand-tracking data, each arrow dynamically adjusts its orientation based on the user's hand position, giving an immediate sense of cause and effect (see the sketch after this list).

Real-Time Feedback: The arrows provide real-time feedback by rotating to face the nearest hand, offering an intuitive way for users to interact with the virtual environment. This interaction model can be applied in various contexts, such as guiding users through a virtual space or indicating points of interest.

Customizable Grid Layout: The ArrowGrid script allows developers to customize the number of rows and columns, as well as the spacing between arrows. This flexibility makes it easy to integrate the arrow grid into different types of VR environments, whether for gaming, training, or educational applications.

Enhancing User Experience: The dynamic nature of the arrow grid system enhances the user experience by providing immediate visual responses to hand movements. This responsiveness is particularly useful in training simulations, where users can learn and practice skills in an immersive and interactive setting.

Versatility in Applications: The system's versatility allows it to be adapted for various applications, including wayfinding in large virtual environments, interactive art installations, and more. The use of hand tracking and responsive visuals creates an engaging and immersive experience for users.
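The sketch below shows one way the grid layout and nearest-hand facing behavior could be put together in Unity. It is not the actual VRArrowController or ArrowGrid code; the prefab, hand transforms, and turn speed are illustrative assumptions.

```csharp
using UnityEngine;

// Minimal sketch of the arrow-grid behaviour described above.
// Field names, prefab, and rotation speed are illustrative assumptions.
public class ArrowGridSketch : MonoBehaviour
{
    [SerializeField] GameObject arrowPrefab;  // arrow mesh whose forward axis should face the hand
    [SerializeField] Transform leftHand;
    [SerializeField] Transform rightHand;
    [SerializeField] int rows = 4;
    [SerializeField] int columns = 6;
    [SerializeField] float spacing = 0.3f;
    [SerializeField] float turnSpeed = 10f;   // larger = snappier rotation

    Transform[] arrows;

    void Start()
    {
        // Lay the arrows out on a regular grid centred on this object.
        arrows = new Transform[rows * columns];
        for (int r = 0; r < rows; r++)
        {
            for (int c = 0; c < columns; c++)
            {
                Vector3 local = new Vector3(
                    (c - (columns - 1) * 0.5f) * spacing,
                    (r - (rows - 1) * 0.5f) * spacing,
                    0f);
                var arrow = Instantiate(arrowPrefab, transform);
                arrow.transform.localPosition = local;
                arrows[r * columns + c] = arrow.transform;
            }
        }
    }

    void Update()
    {
        foreach (var arrow in arrows)
        {
            // Each arrow turns toward whichever hand is currently closer.
            float toLeft = (leftHand.position - arrow.position).sqrMagnitude;
            float toRight = (rightHand.position - arrow.position).sqrMagnitude;
            Transform nearest = toLeft < toRight ? leftHand : rightHand;

            Vector3 dir = nearest.position - arrow.position;
            if (dir.sqrMagnitude < 1e-6f) continue;

            Quaternion look = Quaternion.LookRotation(dir, Vector3.up);
            arrow.rotation = Quaternion.Slerp(arrow.rotation, look, turnSpeed * Time.deltaTime);
        }
    }
}
```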

Overall, the VRArrowController and ArrowGrid scripts offer a versatile and innovative solution for creating interactive VR environments. Their ability to provide real-time feedback and customizable layouts makes them valuable tools for developers looking to enhance user engagement and create memorable VR experiences.

Pinch Twist Interaction


The PinchTwistController explores a different way of interacting with 3D objects: twisting a 3D mesh with hand gestures. This project demonstrates how hand-tracking technology can be used to manipulate objects in a virtual environment, opening up new possibilities for interactive design and user experience.

Key Features and Applications:

Hand-Tracking Based Control: The right hand controls the twist angle of the 3D mesh through pinching gestures, with the strength of the pinch and the hand's movement determining the angle. The left hand can trigger a continuous twist offset, allowing for dynamic and precise manipulation (see the sketch after this list).

Sensitivity and Threshold Adjustments: The system allows for fine-tuning of the twist sensitivity and movement thresholds, making it adaptable for various applications, from gaming to virtual training.

Interactive 3D Demonstration: This interaction model can be used in educational and design settings to demonstrate physical principles or explore creative concepts, making it a versatile tool for developers and educators alike.
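As a rough sketch of the twist deformation, the script below bends a mesh's vertices around its local Y axis in proportion to a pinch value. The pinch inputs are assumed to be supplied by whatever hand-tracking provider is in use, and the sensitivity and threshold values are illustrative, not the PinchTwistController's actual settings.

```csharp
using UnityEngine;

// Minimal sketch of the mesh-twist interaction described above. Pinch values
// are assumed to be set from external hand-tracking code; field names and
// tuning values are illustrative assumptions.
[RequireComponent(typeof(MeshFilter))]
public class PinchTwistSketch : MonoBehaviour
{
    [Range(0f, 1f)] public float rightPinchStrength; // 0..1 pinch from the right hand
    public bool leftPinchActive;                     // left-hand pinch adds a continuous offset

    [SerializeField] float twistSensitivity = 180f;  // degrees of twist at full pinch
    [SerializeField] float offsetSpeed = 45f;        // degrees per second while left pinch is held
    [SerializeField] float pinchThreshold = 0.2f;    // ignore pinches weaker than this

    Mesh mesh;
    Vector3[] baseVertices;
    float twistOffset;

    void Start()
    {
        // Work on an instance of the mesh so the shared asset is not modified.
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;
    }

    void Update()
    {
        if (leftPinchActive)
            twistOffset += offsetSpeed * Time.deltaTime;

        float pinch = rightPinchStrength < pinchThreshold ? 0f : rightPinchStrength;
        float twistAngle = pinch * twistSensitivity + twistOffset;

        // Rotate each vertex around the local Y axis by an amount proportional
        // to its height, producing the twist deformation.
        var vertices = new Vector3[baseVertices.Length];
        for (int i = 0; i < baseVertices.Length; i++)
        {
            Vector3 v = baseVertices[i];
            float angle = twistAngle * v.y;
            vertices[i] = Quaternion.AngleAxis(angle, Vector3.up) * v;
        }

        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```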

The PinchTwistController showcases how advanced hand-tracking and gesture recognition can be integrated into virtual environments, providing a rich and immersive user experience.