Mark Zuckerberg says Meta has not shifted its focus from the metaverse to AI. He believes AR glasses will become a critical hardware platform for AI assistants.
FACTS
Mark Zuckerberg said yesterday in a new interview with The Verge that his company intends to build general artificial intelligence.
At the same time, he denied that Meta had pivoted from the metaverse to AI. “I don’t know how to more unequivocally state that we’re continuing to focus on Reality Labs and the metaverse,” Zuckerberg said, adding that Meta continues to invest more than $15 billion in the metaverse each year.
In a video, Zuckerberg outlines how AI and the metaverse could converge in the not-too-distant future, justifying the significant investments in the metaverse:
People are also going to need new devices for AI and this brings together AI and the metaverse. Because over time, I think a lot of us are going to talk to AIs frequently throughout the day. And I think a lot of us are going to do that using glasses because glasses are the ideal form factor for letting an AI see what you see and hear what you hear. So it’s always available to help out.
CONTEXT
Zuckerberg reiterates what CTO Andrew Bosworth stated in his 2023 year-end review: glasses are the ideal form factor for AI assistants because they show the world from a human perspective, are socially acceptable, can be worn all day, and allow wearers to focus on the real world rather than the hardware.
The core notion is that glasses are a superior hardware platform for AI assistants compared to current projects such as the Humane Ai Pin and Rabbit R1. The Ray-Ban Meta smart glasses are already available with rudimentary AI features, with more to come.
According to Bosworth, the unexpected breakthrough of large language models may alter Meta’s AR roadmap. The company’s smart glasses could become far more useful than planned, and Meta’s first true AR glasses may put a far earlier and stronger emphasis on contextual AI than anticipated.