Facebook showcases wrist-worn AR interface concept

Facebook’s hardware strategy often looks pretty opaque from the outside. The company has done fairly well with Oculus sales amid pandemic demand. Even its Echo Show competitor Portal has seen a bump as people have been forced to socially distance. The company’s smartphone partnership with HTC, meanwhile, fell flat eight or so years back.

Earlier this year, reports surfaced that the company was working on its own Apple Watch competitor. The smartwatch was said to have a health focus and run on an open-source version of Android, which would, of course, make for an interesting alternative to Google's own Wear OS.

This week, the company highlighted another wrist-based wearable. The specifics of the project don't line up especially closely with those earlier reports, which could well mean these are two separate projects. Facebook is a big company, after all.

This particular project out of Facebook Reality Labs is more focused on providing an alternative computer interface. Specifically, it seems in line with the company’s augmented reality efforts.

Per yesterday’s blog post:

A separate device you could store in your pocket like a phone or a game controller adds a layer of friction between you and your environment. As we explored the possibilities, placing an input device at the wrist became the clear answer. The wrist is a traditional place to wear a watch, meaning it could reasonably fit into everyday life and social contexts. It’s a comfortable location for all-day wear. It’s located right next to the primary instruments you use to interact with the world — your hands. This proximity would allow us to bring the rich control capabilities of your hands into AR, enabling intuitive, powerful and satisfying interaction.

I will say that, based on the information presented, this seems more conceptual. As in, it could be the key to offering more seamless control for some future augmented reality system. And even then, it's presented as a step on the way to a more deeply integrated human-computer solution. How deeply you want Facebook to integrate with your neurons is apparently a question we're all going to have to ask ourselves in the not-too-distant future.

The interface is designed to use electromyography (EMG) sensors to interpret motor nerve signals at the wrist and translate them into input. Interestingly, the subject came up during a Clubhouse event featuring Mark Zuckerberg last night. After Pebble founder and YC partner Eric Migicovsky discussed his experiences dealing with Apple for his own smartwatch startup, the Facebook CEO said the following:

If you’re trying to build a watch, which we’re exploring as we talked about the wrist thing and I don’t want to call it a watch, but it’s the basic neural interfaces work that our Facebook Reality Labs team demoed some of our research about today. With the neural interface on the wrist, if you want that to integrate with the phone in any way, it’s just so much easier on Android than iOS. My guess is that this is an area where there probably should be a lot more focus. And I do think the private APIs are just something that makes it really difficult to have a healthy ecosystem.

“Exploring” seems like an operative word here. But it’s always cool/fascinating to see these projects in their early stages. Even if the promises might still seem a tad…overzealous.

Again, per the blog post:

EMG will eventually progress to richer controls. In AR, you’ll be able to actually touch and move virtual UIs and objects, as you can see in this demo video. You’ll also be able to control virtual objects at a distance. It’s sort of like having a superpower like the Force.
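To make the idea a little more concrete: at its simplest, an EMG input pipeline samples electrical activity from electrodes on the wrist, looks at the signal's amplitude over short windows, and fires a discrete event when a muscle activation pattern crosses some learned boundary. Here's a minimal, hypothetical sketch of that loop. Everything in it — the sample rate, channel count, threshold, and the idea of detecting a "pinch" with a simple RMS threshold rather than a trained classifier — is an illustrative assumption, not anything Facebook has published.

```python
import numpy as np

# Illustrative parameters only -- real EMG wristbands sample at kHz
# rates across many electrodes, and thresholds are tuned per user.
SAMPLE_RATE_HZ = 1000
WINDOW_MS = 50
PINCH_RMS_THRESHOLD = 0.5   # hypothetical activation boundary
REFRACTORY_MS = 200         # debounce: one pinch -> one event


def rms_envelope(window: np.ndarray) -> float:
    """Root-mean-square amplitude of one multichannel EMG window."""
    return float(np.sqrt(np.mean(window ** 2)))


def detect_pinches(emg: np.ndarray) -> list[int]:
    """Return sample indices where a 'pinch' gesture is detected.

    emg: array of shape (n_samples, n_channels).
    A production system would run a trained per-gesture classifier
    here; this sketch substitutes a simple amplitude threshold.
    """
    win = int(SAMPLE_RATE_HZ * WINDOW_MS / 1000)
    refractory = int(SAMPLE_RATE_HZ * REFRACTORY_MS / 1000)
    events, last_event = [], -refractory
    for start in range(0, len(emg) - win, win):
        if start - last_event < refractory:
            continue  # still inside the debounce window
        if rms_envelope(emg[start:start + win]) > PINCH_RMS_THRESHOLD:
            events.append(start)
            last_event = start
    return events


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic signal: low-level noise plus one burst of muscle activity.
    signal = rng.normal(0, 0.05, size=(2000, 8))
    signal[800:900] += rng.normal(0, 1.0, size=(100, 8))
    print("pinch events at samples:", detect_pinches(signal))
```

The interesting engineering lives in what this sketch waves away: real motor-nerve decoding has to separate intentional activation from incidental movement and adapt to each wearer, which is why "exploring" is doing so much work in Zuckerberg's quote above.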
