Exploring Projected + Motion Tracking Interfaces

Recently, I had the opportunity to delve into the topic of motion-tracking interfaces as I prepared for a meeting. It felt like I was coming full circle, as my previous experiences exploring the workings of self-driving cars and designing for VR/AR interfaces had already given me a foundational understanding of these technologies.

The first thing I did was start watching YouTube videos. I’ve been watching these kinds of interface videos for a year now, fascinated by them. I started by going back to the first Microsoft Surface tables from back in 2008, when they were still sci-fi interfaces.


This video is 17 years old, so it’s clear that projected interfaces and motion tracking aren’t exactly new. I first came across these concepts in 2011, during my degree, through the Microsoft Surface Table, now called PixelSense. (Side note: doesn’t that name sound like it should belong to an Android device?) While the technology was fascinating, it took some time for the right blend of commerce and innovation to create something people were actually willing to pay for. For a while, the technology seemed to be somewhat sidelined.

These systems represent a transformative shift in how we interact with digital environments by blending projection technology and advanced sensors to create touchless, intuitive interfaces on virtually any surface.

Notable examples of this technology include the Sony Xperia Touch, which turns flat surfaces into touchscreens with infrared sensors for gesture control, and Microsoft HoloLens, which, combined with projection and motion tracking, supports collaborative tasks in fields like surgery and engineering. LUMOplay uses interactive floor projections for entertainment and education, while Google’s Project Soli employs radar-based tracking for gesture control. Other innovations, such as SixthSense from the MIT Media Lab, project interfaces onto surfaces like palms, enabling intuitive gestural interaction, while Lightform enhances 3D objects with interactive projection mapping. Ultraleap adds another layer with ultrasound-based haptic feedback for touchless control.

This was the start of my interest in alternative interfaces, and where I started to think about UX across digital and physical interaction. Yet apart from concept cafés in Japan, where you could order from digitally projected menus, nothing appeared in London for a long time. So I dug into this and imagined a world where walls, tables, or even your hand become dynamic, interactive displays, where you can manipulate digital content with a simple gesture, without needing to physically touch a screen. This convergence of projection mapping, motion tracking, and real-time software processing has opened up exciting opportunities across a variety of industries, from healthcare and retail to automotive and education.

During the last couple of years, we’ve seen resurgent interest in this type of interface. For me, the spark was the cultural phenomenon of Squid Game and its collaboration with Immersive Gamebox, an interactive game company co-founded by Will Dean MBE and David Spindler, which developed the immersive “Squid Game” experience in-house through its game content studio.

We can see that by focusing on the content and storytelling, the technology becomes less of the main attraction and more of a complementary element: important, but secondary in focus. I also looked at practical examples like gesture-controlled retail displays, interactive medical imaging systems, and collaborative tools for education and industrial design. These highlight some of the cutting-edge technologies behind these experiences, such as depth cameras, radar sensors, and projection mapping systems.

Our goal today is to understand how these tools can not only enhance user experience but also align with our objectives by offering innovative, hands-free, and immersive solutions. Let’s explore how projected interfaces and motion tracking are shaping the future of interaction design.

Projected interfaces, combined with motion tracking, transform physical spaces into interactive digital environments by using projection technology and sensors to create touchless displays on any surface. These systems detect and interpret user movements, allowing interaction through gestures or motion. Projection systems, such as laser or DLP projectors, display images, videos, or interfaces onto surfaces like walls, tables, or even hands. Motion tracking sensors, including depth cameras, LiDAR, or camera-based systems like Leap Motion, monitor gestures and movements, while software processes this data in real-time to enable seamless interaction.
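
To make that pipeline a little more concrete, here is a minimal sketch of the calibration step almost all of these systems share: mapping a tracked point from camera space into the projector’s image space with a homography. The four point correspondences below are hand-picked placeholders, and OpenCV is assumed purely for illustration; a real installation would project a known pattern and detect it automatically.

```python
import numpy as np
import cv2

# Four reference points as seen by the tracking camera (pixels), and where
# those same points land in the projector's 1280x800 image. These
# correspondences are made up for illustration; real systems project a
# checkerboard or structured-light pattern and detect it automatically.
camera_pts = np.array([[102, 88], [538, 95], [530, 412], [110, 405]], dtype=np.float32)
projector_pts = np.array([[0, 0], [1280, 0], [1280, 800], [0, 800]], dtype=np.float32)

# Homography that warps camera coordinates onto the projected surface.
H, _ = cv2.findHomography(camera_pts, projector_pts)

def to_projector(x, y):
    """Map a tracked fingertip or ball position from camera space to projector space."""
    src = np.array([[[x, y]]], dtype=np.float32)
    dst = cv2.perspectiveTransform(src, H)
    return dst[0, 0]  # (x, y) in projector pixels

# Example: a hand detected at camera pixel (320, 240) becomes a cursor
# position in the projected interface.
print(to_projector(320, 240))
```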

Motion tracking monitors user movements, including gestures, body position, and head orientation, using technologies like infrared cameras or wearable sensors to translate actions into the game environment. For example, users can interact with virtual objects through hand tracking or navigate challenges by jumping or dodging. The HMD, likely a lightweight AR or VR headset, enhances the experience by overlaying virtual elements like HUDs or interactive objects while aligning content with the user’s perspective. Surround sound adds another layer of immersion through 3D auditory effects, making game sounds feel directionally realistic. The experience is powered by interactive game engines like Unity or Unreal Engine, which render visuals, process player inputs, and manage game logic in real-time. Similar systems, such as The VOID, CAVE, or Illuminarium, demonstrate how these technologies combine to create hyper-immersive experiences. In a Squid Game-themed setup, projection mapping could display scenes like “Red Light, Green Light,” while motion tracking ensures adherence to game rules, and HMDs or sensors enhance realism by tracking subtle movements. Together, these cutting-edge technologies immerse users in a dynamic and interactive gaming world.
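
As a toy illustration of the tracking logic in a “Red Light, Green Light” round, the sketch below snapshots every player’s position when the light turns red and flags anyone who drifts past a threshold. The `get_player_positions()` function is a stand-in for whatever sensor SDK a real installation would use (depth camera, radar, skeleton tracker), and the threshold is a made-up value.

```python
import math
import time

MOVE_THRESHOLD = 0.05  # metres of allowed drift during "red light" (illustrative value)

def get_player_positions():
    """Stand-in for the tracking system: returns {player_id: (x, y)} in metres.
    A real installation would read this from a depth camera or radar SDK."""
    raise NotImplementedError

def red_light_round(duration_s=3.0):
    """Snapshot everyone at the start of 'red light', then flag anyone who moves."""
    start = get_player_positions()
    eliminated = set()
    t_end = time.time() + duration_s
    while time.time() < t_end:
        for pid, (x, y) in get_player_positions().items():
            x0, y0 = start.get(pid, (x, y))
            if math.hypot(x - x0, y - y0) > MOVE_THRESHOLD:
                eliminated.add(pid)  # moved on red: out of the game
    return eliminated
```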

This leads on to the interactive pool tables from CueLight and Bounce. While immersive games that combine projection technology with pool or snooker are relatively rare, there are notable examples that enhance the traditional experience:

CueLight Interactive Pool Table System

Developed by Obscura Digital, the CueLight system transforms a standard pool table into an interactive art display. A high-definition projector mounted above the table projects motion graphics precisely mapped to the table’s surface. As balls move, the graphics react in real-time, creating effects like rippling water or fiery trails. This system has been installed in venues such as the Hard Rock Hotel & Casino’s Paradise Tower Penthouse in Las Vegas. 
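
Conceptually, a CueLight-style effect boils down to tracking the balls from the overhead camera and redrawing graphics beneath them every frame. The following rough sketch (my approximation of the idea, not Obscura Digital’s actual pipeline) uses OpenCV background subtraction to find moving balls and leave a fading trail behind them:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # overhead camera aimed at the table (assumed)
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
trail = None  # accumulated "fiery trail" layer drawn behind the balls

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if trail is None:
        trail = np.zeros_like(frame)

    # Moving balls show up as blobs in the foreground mask.
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    trail = (trail * 0.9).astype(np.uint8)  # fade the old trail each frame
    for c in contours:
        (x, y), r = cv2.minEnclosingCircle(c)
        if 5 < r < 40:  # plausible ball size in pixels (tune for your setup)
            cv2.circle(trail, (int(x), int(y)), int(r), (0, 128, 255), -1)

    # A real install would warp this composite through the calibration
    # homography and send it to the projector instead of a preview window.
    cv2.imshow("projection preview", cv2.add(frame, trail))
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```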


Wonderball at Bounce

While primarily applied to ping pong tables, the Wonderball system developed by Projection Artworks and the social entertainment brand Bounce showcases the potential of interactive projection in table games. This system uses projection mapping to turn a standard ping pong table into an interactive gaming experience, with various games and effects that respond to the ball’s movement. The technology combines ball-tracking with interactive projections, offering a glimpse into how similar systems could be adapted for pool or snooker. 

These innovations demonstrate how projection technology can enhance traditional table games, creating immersive and interactive experiences for players.

Projected interfaces using pool tables combine projection mapping or interactive projection technology to create dynamic and engaging digital experiences on the table’s surface. This concept blends the physical environment with digital interactions, allowing users to engage in real time. For example, projected games or challenges could transform the table into an interactive experience, where patterns or guides appear to assist players with their shots. Additionally, pool tables could serve as control interfaces for smart devices, allowing users to manage home automation systems, music, or lighting through projected controls. Such interfaces could also be used as training tools for pool players, highlighting optimal ball paths, target zones, or angles to improve skills. Augmented reality (AR) experiences could further enhance the concept, merging physical elements with digital graphics to create immersive gameplay or educational environments. During social events, projected images, videos, or media on the table could engage guests, creating a unique atmosphere. These systems typically rely on motion sensing, touch-based technologies, or cameras to detect user interaction, ensuring a responsive and interactive experience. This fusion of physical and digital elements provides entertainment, education, and utility in a novel way.
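
The training-aid idea in particular is mostly plain geometry: given an object ball and a pocket, the system computes the “ghost ball” position the cue ball must reach at contact, then projects the aim line onto the felt. Here is a minimal 2D sketch of that calculation, ignoring spin, throw, and friction, and using made-up coordinates:

```python
import math

BALL_RADIUS = 0.028575  # metres, standard pool ball (57.15 mm diameter)

def ghost_ball(object_ball, pocket):
    """Where the cue ball's centre must be at contact to send the object
    ball straight at the pocket (ignores spin, throw, and friction)."""
    ox, oy = object_ball
    px, py = pocket
    dx, dy = px - ox, py - oy
    dist = math.hypot(dx, dy)
    # Step back from the object ball, away from the pocket, by two radii.
    k = 2 * BALL_RADIUS / dist
    return (ox - dx * k, oy - dy * k)

# Illustrative positions in metres on the table surface.
cue = (0.30, 0.60)
obj = (1.10, 0.70)
pocket = (1.80, 1.00)

aim_point = ghost_ball(obj, pocket)
# The overlay would draw a line from `cue` to `aim_point`, mapped through
# the projector homography shown earlier, directly onto the cloth.
print(f"aim at {aim_point[0]:.3f}, {aim_point[1]:.3f}")
```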


You could argue that the Squid Game projected interfaces revisited and modernised the concept of interactive projection technologies, similar to systems like the CueLight Interactive Pool Table. Both utilize projection mapping to transform a physical game surface into a dynamic, responsive interface, enhancing the experience with real-time visuals and interactivity.

In the case of Squid Game, Immersive Gamebox employs projection technologies to create an entire room-based interactive environment where players engage in digital challenges. Similarly, earlier innovations like CueLight used projections to augment traditional games like pool, blending physical gameplay with digital effects. Most importantly, neither relies on VR headsets, which makes them group activities; in other words, they overcome the biggest limitation of VR/AR/XR devices.

This shows a continuous evolution of projected interfaces, adapting them for contemporary gaming narratives and immersive experiences. The Squid Game adaptation takes it a step further by integrating storytelling, competitive elements, and multi-player dynamics, making the technology even more engaging and relevant.
