AI dominated Mark Zuckerberg’s opening keynote at Meta Connect today — mentioned within the first 10 seconds and dozens of times throughout. The term “metaverse,” by contrast, came up only a couple of times, and there were no updates about Horizon Worlds. It’s not unexpected: Forrester predicted a metaverse winter for 2023, and only 12% of US online adults are eager to spend time exploring immersive digital worlds and platforms, according to Forrester’s Media And Marketing Benchmark Recontact Survey, 2023, published in July. Even Zuckerberg admitted during Meta’s Q2 earnings call, “I understand the discomfort that investors feel” regarding the metaverse.
In its place, Zuckerberg talked about three very loosely connected things:
- The new Meta Quest 3 mixed reality headset
- Advances in embedding AI into Meta’s apps
- Meta’s next-gen Ray-Ban smart glasses
Meta Quest 3 Won’t Scale Much Beyond Gamers
Despite Meta’s claims that the Quest 3 is “the first ‘mainstream’ mixed reality headset,” the masses simply aren’t yet interested in this category of devices. Just 8% of US online adults have a VR headset, according to Forrester’s Technology And Telecom Benchmark Recontact Survey, 2023, published in August. And among those who do, the most popular use case is playing video games.
Meta (and its headset competition) is plagued not just by a niche audience but also by headset fatigue. Forrester data shows that a typical VR headset session lasts about 45–50 minutes and that very few owners (just 9%) use their headsets daily. Until these devices become far less cumbersome to wear, their usefulness will remain limited.
Meta AI Already Demonstrates Tangible Utility
Arguably, the star of Meta Connect today was Meta AI, a first-gen assistant based on Meta’s Llama 2 language model. It can access real-time information through a partnership with Microsoft’s Bing search, and it embeds Emu (Meta’s “expressive media universe” image model) into its UI, enabling users to generate custom photorealistic images and stickers. Emu’s soon-to-launch Instagram integration for advanced AI photo manipulation will be one to watch, as it will bring immediate utility to the IG experience.
But Meta AI’s integration into the next-generation Ray-Ban smart glasses was perhaps today’s biggest surprise. Almost (but not quite yet) like the fictional Tony Stark’s J.A.R.V.I.S., the glasses let wearers interact with Meta AI using voice commands. As Zuckerberg put it, this is “an AI assistant that will hear what you’re hearing and see what you’re seeing.” Yes, this is both creepy and promising for the future of smart glasses.
Forrester clients: Let’s chat more about this via a Forrester guidance session.