All eyes are on wearable AR
The next generation of wearable technology is here…and you wear it on your eyeball. Yup, AR contact lenses have arrived.
For years, technological advances have pushed toward blurring the divide between the physical and digital worlds. The current offering of AR (augmented reality) or ‘smart’ glasses is limited and expensive. Tech giants Meta and Apple are reportedly working on their own versions of the wearable tech, but we may be a few years away from a commercial release.
However, an innovation from Mojo Vision, the smart contact lens, paves the way for a new level of connectivity that makes sci-fi technology a reality today.
The Mojo Lens is a contact lens with a 14,000 pixel-per-inch MicroLED display (for comparison, the iPhone 13’s display is 460 pixels per inch). Measuring less than 0.5 mm in diameter with a pixel pitch of 1.8 microns, the display is the smallest and densest ever created for dynamic content. The fact that you can take the computing power of an AR headset and put it into a wearable contact lens is mind-blowing.
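To see how those numbers fit together: pixel pitch and pixel density are two views of the same spec. One inch is 25,400 microns, so a 1.8-micron pitch works out to roughly 14,000 pixels per inch. A quick, purely illustrative back-of-the-envelope check in Python (the 0.5 mm figure is used only as an upper bound):

# Rough sanity check: pixel pitch (microns) <-> pixels per inch (ppi).
MICRONS_PER_INCH = 25_400

def pitch_to_ppi(pitch_um: float) -> float:
    """Pixels per inch for a given pixel pitch in microns."""
    return MICRONS_PER_INCH / pitch_um

print(pitch_to_ppi(1.8))       # ~14,111 ppi -- in line with the quoted 14,000 ppi
print(MICRONS_PER_INCH / 460)  # ~55 microns -- the iPhone 13's pixel pitch, ~30x coarser
print(500 / 1.8)               # ~278 -- upper bound on pixels across a sub-0.5 mm display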
It also has a custom-configured accelerometer, gyroscope, and magnetometer that continuously track eye movements so that the AR imagery is held still as the user's eyes move.
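Mojo hasn’t published its stabilization algorithm, but the basic idea behind holding imagery still against eye movement can be sketched simply: estimate the eye’s rotation from the motion sensors, then draw the overlay counter-rotated by the same amount so it appears fixed in the world. Here is a minimal, purely illustrative sketch in Python; the function name and the flat angle math are my assumptions, not Mojo’s code:

def counter_rotate(world_pos_deg, eye_yaw_deg, eye_pitch_deg):
    """Illustrative world-locking: given where an overlay should sit in the
    world (degrees of visual angle) and the eye's current orientation as
    estimated from the motion sensors, return where to draw it on the
    eye-fixed display so it appears to stay put."""
    world_yaw, world_pitch = world_pos_deg
    # Display position = world position minus the eye's rotation.
    return (world_yaw - eye_yaw_deg, world_pitch - eye_pitch_deg)

# An icon pinned 5 degrees to the right of the user's initial gaze:
print(counter_rotate((5.0, 0.0), eye_yaw_deg=0.0, eye_pitch_deg=0.0))  # (5.0, 0.0)
# After a 5-degree saccade to the right, the icon is drawn at the display's
# center, so to the user it appears not to have moved at all.
print(counter_rotate((5.0, 0.0), eye_yaw_deg=5.0, eye_pitch_deg=0.0))  # (0.0, 0.0)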
Drew Perkins, Mojo’s CEO, was the first person to wear the completed prototype. It was the first-ever on-eye demonstration of a feature-complete augmented reality smart contact lens. Reflecting on the future uses of this technology, Perkins said in a blog post:
“We hope to see Mojo Lens change the lives of individuals with vision impairment by improving their ability to perform daily tasks that many of us take for granted. I envision amateur and professional athletes wearing Mojo Lens so they can train smarter, stay focused, and reach peak performance. Ultimately, this is a tool that can give people an invisible assistant throughout their day to stay focused without losing access to the information they need to feel confident in any situation.”
Mojo is committed to creating a next-generation computing experience in which information is presented to the user only when it is needed, and in a way that does not break their immersion in the experience itself. That means no reaching down to pull out a phone, staring at a screen, and losing focus on the people and world around them; instead, the technology and the user experience become one.
As immersive technology continues to advance, the interfaces we use to link the physical and digital worlds are virtually disappearing. Today we associate these experiences with cumbersome headsets or interactive touchscreens. This contact lens has shown us what is possible, and we’ll soon see our environments, or even our clothes, acting as the gateway to a new reality. It’s also possible that our very brainwaves could provide the commands that let AI-driven immersive systems know what we want.
Brain-computer interfaces (BCIs), also known as neural interfaces, connect the brain or nervous system to equipment such as digital devices or IT systems. Interfaces placed inside the brain or body are known as internal, invasive, or implanted technologies, as opposed to external, non-invasive, or wearable devices.
Even BCIs are not just a vision of the future; they are the here and now. The basic building blocks of neural interfaces have been around for years (we already have brain-controlled artificial limbs), and with the amount of investment neural technology is attracting, innovation is accelerating.
This merging of the digital with the physical transforms the nature of our experiences and the way we perceive and interact with the world. I’m excited for what’s next.