We all know that frustrating feeling when you’re trying to listen to a friend in a noisy coffee shop or crowded bar, but all you hear is the espresso machine and glasses clinking. Meta is now rolling out a solution for this exact problem with its smart glasses. The new feature is called “Conversation Focus,” and it’s available now through the Early Access channel for Ray-Ban Meta and Oakley Meta HSTN users in the US and Canada.
This isn’t just a volume knob turned up loud, which would amplify the background noise along with everything else. Meta built something smarter. According to Digital Trends, the technology uses microphones built into the glasses frames to isolate sound coming from the person standing directly in front of you while actively suppressing background noise, creating a private “audio tunnel” between you and the person you’re talking to.
This directional focus sets Meta’s approach apart from other wearable audio tech. Features in devices like the AirPods Pro 2 are great for a general sound boost or hearing assistance, but Meta’s is built specifically for directional use. You need to face the person for the feature to work, and they need to be close, within about six feet.
Meta has been working on this technology for over six years
Meta has been developing this kind of technology for more than six years under its research program called “perceptual superpowers.” The name sounds like marketing talk, but the technology behind it is solid, and it solves a genuinely common problem. Until now, the main selling points of these smart glasses were fun extras, like taking hands-free photos or asking the built-in AI questions. Those are cool, but not essential. Conversation Focus changes that by turning the device into an accessibility tool without making it feel medical or bulky.
Activation seems straightforward. You won’t need to open a phone app while someone is talking to you; you can simply say “Hey Meta, start conversation focus” as a voice command. There’s also a physical option: long-press the touchpad on the glasses. That gesture is a smart touch, because shouting a voice command in a busy cafe can feel just as awkward as not hearing the person. Meta has been busy expanding its tech infrastructure lately, including its ambitious AI sea cable project.
Keep in mind this is still in Early Access, so it won’t be perfect right away. Meta is clear that this isn’t magic: it won’t help you hold a secret conversation at a rock concert. It’s built for “moderately noisy” places, like busy bars, coffee shops, and loud offices.
If you want to try this feature and see if it works as well in real life as it does in demos, you need to sign up through the Meta AI app. If this feature works well, we might finally be moving past the “gimmick” stage of wearable tech into something truly useful. The company has been rolling out various new features across its platforms, like the new chat feature on Threads.
Published: Jan 13, 2026 02:00 pm