Smart living: How coffee cups, bike handles and other everyday objects could boost productivity

Major tech companies are investing billions in augmented reality (AR) headsets and smart glasses, but how will this cutting-edge technology enhance daily life?

Image: a conceptual pair of glasses showing an augmented reality overlay.
AR overlays digital information onto the real world through devices like smartphones, tablets or AR glasses. GraspUI is set to expand the usefulness of this technology.

In a world where convenience and productivity are key, a new user interface called GraspUI – developed by an international team led by a computer scientist from the University of Bath – is redefining how we interact with technology.

Created by researchers at the Universities of Bath and Toronto, in collaboration with Meta Reality Labs, GraspUI uses augmented reality (AR) to turn everyday objects into smart tools that effortlessly connect you to your digital world.

Presented at this year’s ACM Designing Interactive Systems conference (one of the premier events for cutting-edge research on user interfaces), GraspUI transforms everyday movements – like picking up a mug, reaching for a pen or gripping a bike handle – into seamless digital interactions, allowing users to control devices with the natural grasping movements they already make.

Everyday movements transformed into smart digital interactions

AR overlays digital information onto the real world through devices like smartphones, tablets or AR glasses. It uses sensors and cameras to detect a person’s environment then displays relevant digital content – such as images, text, or animations – on top of the physical surroundings.

Major tech companies like Meta, Apple, and Google are investing heavily in AR headsets and smart glasses, which are expected to have over 1 billion users by 2028.

GraspUI is set to expand the usefulness of this technology. Wearing an AR device, users make subtle hand gestures to trigger actions that would otherwise require them to reach for their phone and juggle multiple apps.

The beauty of this interface is its simplicity. Something as simple as tapping your pen could tell your phone to turn off the light, and lifting your hand from the car’s steering wheel could prompt your phone to note your parking spot.

“We’re taking everyday objects and making them work for you using simple finger-based microgestures,” says Dr Adwait Sharma, the lead researcher behind GraspUI. “Rather than forcing you to adapt to technology, we’re seamlessly integrating it into your life.”

A glimpse into the future

Infographic: simple hand gestures that fit into your everyday life could trigger seamless digital interactions to boost productivity.

GraspUI might sound futuristic, but it’s designed to fit easily into today’s digital world. From your morning routine to your commute, this interface simplifies tasks in ways you never imagined. Here are examples of how it might work:

  • Morning Routine: As the user reaches for their toothbrush, their schedule for the day ahead automatically appears on their smart glasses.
  • Before Work: Placing a hand on their coffee cup reveals the weather forecast and unread emails.
  • Commute: Lifting a book during the morning commute silences the phone in their pocket, ensuring a distraction-free reading experience.
  • Workout: While cycling, the user swipes on the bike’s handlebar to browse social media and share workout progress.
  • At Work: After taking meeting notes, the user taps their pen to turn off the desk lamp.
  • On the Road: Squeezing the steering wheel before stepping out of the car saves the parking location for easy access later.
  • Home Arrival: Using a mid-air thumb gesture, the user can activate smart home appliances to prepare the home environment while walking away from their car.

Early testers have raved about the system’s intuitive design. One user interviewed for the paper describing GraspUI said: “You don’t have to think about it. It just works.”

Dr Sharma said: “GraspUI isn’t just about convenience – it’s about rethinking how we interact with technology.

“By focusing on those small, almost imperceptible actions we take every day, the system can free up time, reduce mental clutter, and make even the most mundane moments more productive. Whether it’s managing your schedule, checking your emails, or keeping distractions at bay, GraspUI’s subtle interface could change how you navigate your entire day.”

Tracking movement

GraspUI can be implemented on headsets or smart glasses with built-in cameras that track hand and object movements.

In a coffee cup demo, researchers trained AI to recognise seven grasping phases, display appropriate visual elements in real time and link gestures to commands. Future AI models could push this further by predicting user intent, anticipating actions and requiring only quick confirmations. This would make interactions faster and easier, freeing users to focus on what matters while the technology handles the rest.
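To illustrate the idea, the pipeline above can be sketched as a simple state machine that maps recognised grasp phases to interface actions. The seven phase names and the phase-to-action mapping below are illustrative assumptions, not the labels used in the GraspUI paper; in the real system, a trained vision model would emit the phase labels from camera input.

```python
# Hypothetical grasp phases for a coffee-cup interaction, in order.
PHASES = [
    "rest", "reach", "pre-grasp", "contact",
    "grasp", "manipulate", "release",
]

# Illustrative mapping from phase transitions to UI actions.
ACTIONS = {
    ("reach", "pre-grasp"): "show_widgets",      # surface nearby controls
    ("contact", "grasp"): "confirm_selection",   # lock in the touched widget
    ("manipulate", "release"): "hide_widgets",   # dismiss the overlay
}

def run_controller(phase_stream):
    """Emit a UI action for each recognised phase transition."""
    actions, prev = [], None
    for phase in phase_stream:
        if prev is not None and (prev, phase) in ACTIONS:
            actions.append(ACTIONS[(prev, phase)])
        prev = phase
    return actions
```

Feeding the controller a full grasp sequence, e.g. `run_controller(PHASES)`, would surface the overlay as the hand approaches the cup, confirm the selection on grasp, and dismiss the overlay on release.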

This research represents a transformative leap in AR interaction design and is happening alongside other hardware and software advancements in the industry. For example, Google’s Android XR platform, set to launch in 2025, has showcased the ability of Google’s large language model, Gemini, to navigate 3D maps, translate text, and provide contextual information – all through gestures and voice commands.

Also, Meta’s Orion glasses have demonstrated advanced displays, neural wristbands for precise control and AI-powered holograms. These innovations hint at a future where AR devices have the potential to replace smartphones. The once-distant vision of all-day wearable AR is quickly becoming a practical reality.

How it works

Find out how researchers worked with mixed-reality designers to explore gesture integration through grasping and envision practical applications for the technology.


The research

Read the paper in full