When AI Moves Onto Your Face
Glasses with AI, a wristband that feels like telepathy, and what it means when meetings have more screens than faces
Sorry for the spam this week. A new piece today, because yesterday Meta dropped an update to their Ray-Ban smart glasses.
Most of you weren’t around when I started Fullstack HR (and that’s probably a good thing) because back then, I was deep into the Metaverse, crypto, AR, VR, all of it.
I was fascinated by how these things might reshape work. (I still am, though maybe not crypto so much these days…) I have the first-generation Meta Ray-Bans. I love them. For me, they’re the closest thing I’ve had to carrying a small AI assistant everywhere I go.
And yesterday, as I said, we got a new version. And yes, this may sound a bit like fanboy content, but I can’t help it; it’s too cool.
First: the display.
Meta has put a display into the glasses. Think back to Google Glass (yes, it looked ridiculous in practice, but the concept was amazing). Meta has finally made that idea look… good. And every review I’ve read says the same: it works surprisingly well. It’s sharp, bright, and functional. That alone opens up a ton of use cases.
Second: the wristband.
This, to me, is the really wild part. The band reads the electrical signals from your wrist muscles, so you can type or gesture in the air, almost like telepathy (check the video, which starts where the wristband is shown). It’s cool to see that this is a product you can (soon) buy.
Live AI
Live AI has been rolling out gradually on the Ray-Ban Meta glasses. So far, most of the interaction has been through audio, your phone, or the camera feed. What’s new with the Ray-Ban Display is a built-in display in the right lens, which lets you see messages, navigation, and responses directly, not just hear them.
The live demo of this didn’t work (big kudos to Zuckerberg for even trying; live demos are brutal), but still, the videos look way too cool.
But for me, what matters here is that AI is moving closer to us, literally onto our bodies. We’ve already got AirPods Pro 3 with live translation; now we’ve got glasses with displays and assistants. This is personal AI, body-near AI, whatever you want to call it.
It won’t happen overnight. Not everyone will walk around with AI on their face tomorrow. But the direction is obvious.
One side effect of this? It will put even greater pressure on us to run good meetings. Imagine a future where everyone has a display right in front of them. If the meeting is boring, if it feels like a meeting for the sake of having a meeting, people will just flick on the latest episode of Drive to Survive while nodding politely. Ok, maybe not, but in theory, doing something else while in a meeting will be easier than ever. And most likely, the social pressure of being in the same room will keep people from doing it. But the temptation will be there.
But one real use case I think we will see is in recruitment. Already today, candidates are using AI tools during interviews. Now imagine them wearing Ray-Bans with Live AI. They could get real-time prompts, facts, or even suggested answers without you noticing.
That puts hiring managers in an interesting position. Do you treat it as cheating, or as the new normal where people bring augmentation into every conversation?
Is it good? Is it bad? Yes.
And that is the point. These tools cut both ways. On the positive side, you can pull up documents, cross-check information, or get context while still being part of a discussion. That can make you sharper. On the negative side, it can split your attention, erode focus, and make genuine interaction harder to spot.
Like all technology, it depends on how we use it. And that is the big challenge we will face as AI gets this close to our bodies. It will not just change what we do at work, it will change how we show up in the room.
And by nature, I am a tech optimist; I want to believe that this will improve our lives rather than diminish them. I view this as another step in AI becoming an integral part of everyday life and work. What happens when job interviews are held with candidates wearing these? Or when managers rely on live overlays to make decisions?
We do not know yet. But we are getting closer to finding out.
(And I can finally write about AR and the Metaverse again.)