Meta Unveils New Wearables, AI Advancements, and a Glimpse of the Future

Daily News Egypt

Meta CEO Mark Zuckerberg announced a wave of updates across mixed reality, artificial intelligence, and wearables at the company’s annual Connect conference. The announcements build towards Meta’s vision of an open, more connected future, placing users at the centre of a new computing platform.

Mixed Reality Takes Centre Stage with Quest 3S and Hyperscape

Meta unveiled the Quest 3S, a lower-priced version of the Quest 3, starting at $299.99. The headset offers the same full-colour mixed reality capabilities as the Quest 3, including hand tracking, Touch Plus controllers, and access to the full range of experiences on Meta Horizon OS. The Quest 3S has a slightly narrower field of view and a different design than the Quest 3.

Alongside the Quest 3S, Meta announced “Hyperscape,” a new feature that brings photorealistic spaces into the metaverse. Using a 3D volume rendering technique called Gaussian Splatting, Hyperscape lets users interact in digital spaces that feel as though they are physically present in them. The feature allows creators to scan physical rooms and bring them into the digital world.
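
The article only names the technique, so for readers curious how splat-based rendering works in principle, here is a minimal, illustrative Python/NumPy sketch. It is not Meta’s Hyperscape pipeline: the random scene, the pinhole camera, and the isotropic (circular) Gaussians are simplifying assumptions, whereas real 3D Gaussian Splatting systems fit anisotropic Gaussians to photographs and rasterize them on the GPU.

```python
import numpy as np

# Illustrative sketch of the core idea behind Gaussian Splatting:
# the scene is a cloud of coloured 3D Gaussians, each of which is
# projected onto the image plane and alpha-composited front to back.
rng = np.random.default_rng(0)
N = 200                                              # Gaussians in the toy "scene"
means = rng.uniform(-1, 1, size=(N, 3)) + np.array([0.0, 0.0, 4.0])  # placed in front of the camera
colors = rng.uniform(0, 1, size=(N, 3))              # RGB colour per Gaussian
sigmas = rng.uniform(0.02, 0.08, size=N)             # isotropic world-space radius
opacity = np.full(N, 0.6)                            # per-Gaussian opacity

H = W = 128
focal = 100.0
cx, cy = W / 2, H / 2

# Project each Gaussian centre with a simple pinhole camera looking down +z.
z = means[:, 2]
u = focal * means[:, 0] / z + cx
v = focal * means[:, 1] / z + cy
sigma_px = focal * sigmas / z                        # screen-space radius of each splat

# Sort front to back so nearer splats occlude farther ones while compositing.
order = np.argsort(z)

image = np.zeros((H, W, 3))
transmittance = np.ones((H, W))                      # fraction of light still reaching each pixel

ys, xs = np.mgrid[0:H, 0:W]
for i in order:
    # 2D Gaussian footprint of splat i on the image plane.
    d2 = (xs - u[i]) ** 2 + (ys - v[i]) ** 2
    alpha = opacity[i] * np.exp(-0.5 * d2 / sigma_px[i] ** 2)
    image += (transmittance * alpha)[..., None] * colors[i]
    transmittance *= 1.0 - alpha

print("rendered image shape:", image.shape)          # (128, 128, 3)
```

The core idea carries over to the full method: a captured room is stored as a very large number of such Gaussians, and each frame is produced by projecting and compositing them rather than rendering a conventional mesh.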

Meta AI Expands its Capabilities

Meta AI is becoming more engaging and useful as its capabilities expand. The assistant can now process visual information, allowing users to ask questions about uploaded photos, such as identifying a flower or generating a recipe. It also supports voice interactions, so users can converse with Meta AI on Facebook, Messenger, WhatsApp, and Instagram DM.

Meta is also experimenting with automatic video dubbing and lip-syncing, allowing creators to reach wider audiences by generating content in different languages. The AI editing tools have been enhanced to allow users to make precise edits to real photos using everyday language.

Llama 3.2: Open-Source AI Advances

Meta released its new Llama 3.2 models, which include the company’s first major open vision models. Llama 3.2 understands both images and text, and the smaller models are optimized to run on devices such as smartphones and, eventually, glasses. Meta continues to advocate for open-source AI, stating that it is the “right path forward” because of its cost-effectiveness, customizability, trustworthiness, and performance.
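
As an illustration of what the open release means in practice, the sketch below shows one plausible way to run the smallest Llama 3.2 text model locally with the Hugging Face transformers library. This is not Meta’s official tooling; the checkpoint is gated (accepting Meta’s licence on Hugging Face is required), and the prompt, dtype, and device settings are illustrative assumptions.

```python
# Minimal sketch: local text generation with a small Llama 3.2 checkpoint.
# Assumes a recent transformers release plus the accelerate package.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # gated checkpoint on Hugging Face
    torch_dtype=torch.bfloat16,                # halves memory use on supported hardware
    device_map="auto",                         # places the model on a GPU if one is available
)

messages = [
    {"role": "user",
     "content": "In one sentence, why might a developer choose an open-weight model?"},
]

result = generator(messages, max_new_tokens=64)
# The pipeline returns the full chat; the last message is the model's reply.
print(result[0]["generated_text"][-1]["content"])
```

The larger 11B and 90B Llama 3.2 checkpoints add image understanding on top of a similar interface, at correspondingly higher hardware cost.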

Ray-Ban Meta Glasses Get Smarter and More Versatile

Meta announced new integrations for its Ray-Ban Meta glasses, including Spotify, Audible, and iHeart. Meta AI on the glasses now supports more natural, conversational interactions, so users no longer need to say “Hey Meta” before each query.

The glasses can now set reminders and perform actions based on text viewed through them. They also offer real-time speech translation, supporting Spanish, French, and Italian, with more languages planned for the future. The Ray-Ban Meta glasses are also now compatible with Transitions® lenses, giving users greater versatility.

Orion: A Glimpse into the Future of Augmented Reality

Meta unveiled a prototype of its first AR glasses, named Orion, which have been in development for 10 years. Orion features a minimalist design weighing less than 100 grams and offers wide field-of-view displays that support multitasking and entertainment across multiple virtual screens. The glasses combine voice controls, hand and eye tracking, and an electromyography (EMG) wristband for seamless input.

Orion’s key feature is its ability to overlay digital content directly onto the user’s view of the real world, rather than showing a camera-based representation of it. This represents a significant step towards a future in which augmented reality is an integral part of everyday life.
