
Leaning Into Social Friction: My Week Wearing the Meta AI Ray-Ban Glasses

Reactions from one week of wearing the Meta glasses, including those awkward moments of "social friction"

This week I decided to bite the bullet and pick up a pair of the Ray-Ban Meta AI glasses to see what it feels like to enjoy the world in a more AI-supercharged way. I was particularly curious to experience these things:

What I Wanted To Learn From Wearing Smart Glasses

  1. What it's like to have audio playback (music, phone conversations, etc.) via the glasses' open-ear speakers vs. in-ear or over-the-ear options.

  2. How good the AI is at interpreting the real world for me, at my own level (obviously I've been thinking a lot about a kid-friendly learning-lens version of this with MuseKat).

  3. How natural vs. unnatural it feels both to wear a camera on my face and to interrupt my usual workflows with AI-supercharged responses.

  4. How effective a smart wearable device is at reducing my screen time.

Here are some of my initial thoughts and reactions, organized around these four areas.


1. Audio via open-ear speakers

Verdict: Excellent experience

I've been feeling for some time that I'm "missing out" on NYC through the technological escapism of wearing Bluetooth headphones out on the streets, in parks, and on the subway so often. I really like the lightweight, effective way the AI glasses offer a private listening experience without cutting me off from my surroundings as much.

With the glasses, it's still easy to engage with strangers on the street, catch a subway announcement, or hear a car coming up the block while listening to a podcast or music; I stay semi-attached to my surroundings rather than tuned out.

I was also impressed by the phone call experience: it felt really fun to take calls on the go without needing my phone or headphones (though I did confuse several people in the elevator and the park while I was seemingly "talking to myself").

I'm excited to see more diversified audio experiences that continue to take this even further.

2. Interpreting the real world, at my level

Verdict: Novel, but not nearly as good as it could be

This was the category I was most interested in. Notably, I have an incredibly high bar for how I like AI to talk to me, and I constantly develop customized AIs (or custom GPTs) with just enough context to help me interpret the world through a different lens. To put the smart glasses to the test, this week I tried the "look and see" feature on nearly every outing.

Some of the most useful things I've used it for:

  • Interpreting signs in the real world, in real time.
    "Meta, look at this subway map. I want to get from Times Square to Atlantic Avenue Barclays Center. What train should I take?"

  • Interpreting documents in real time, with short-term context.
    "Meta, look at the agenda for a conference I'm attending today. If I want to meet people to help me with go-to-market partnerships for a family-focused learning app for kids that I'm building, which sessions should I attend? Who should I try to meet?"

    (Photo: some subway art I snapped with the Meta glasses. A fun perk, but also a good reminder that you need to watch your head tilt while taking pics.)

  • Reading menus for me, suggesting items based on my preferences.
    "Meta, look at this menu. Which signature cocktail should I try if I like floral flavors?"
    "Tell me what items are gluten-free."

  • Interpreting art for me.
    "Meta, look at this mural on the wall. Tell me about it. Now read the plaque about the artist. Tell me more about them."

  • Asking questions about the world around me.
    "Meta, look. What's that on the rocks down there?"
    "What kind of flower is this? Tell me more about it."

  • Learning deeper historical facts about real-world items.
    "Meta, look. What building is that?"
    "The Flatiron Building."
    "Tell me about the history of the Flatiron Building."

  • Quizzing me on what I just learned.
    "Meta, let's see what I learned. Now create a five-question quiz based on what you just told me about the Flatiron Building. Ask me the questions one at a time."

  • Asking about book recommendations.
    "Meta, look at this book. Tell me what it's about."

But I very quickly hit the limits of what I wanted to do with these AI interpretations. Without the ability to customize the personality, retain longer-term context, or use the AI as a real-time ideation thought partner (through collaboratively created voice memos, for instance), I struggled to build on this corpus of creative energy. I also really wish it remembered more about me from one session to the next.

3. How natural or unnatural it feels

Verdict: Unnatural, sometimes socially creepy

I've had fun this week letting other people try on the glasses, and I now have a hilarious collection of images of me explaining to them how to take a picture with the glasses (images taken by the Meta glasses).

Overall, if you want to wear glasses like this around town today, you need to be prepared for a pretty high degree of social friction.

I've been confusing bartenders (by "talking to my menu"), confusing elevator passengers (by "talking to myself" while on a phone call), and creeping out colleagues, who ever so slightly modulate their behavior when they see me wearing them.

Yesterday, I attended a 300-person conference and decided to wear my AI glasses all day, just to see what it felt like to engage with people in close quarters in this new way. The event, Collective Future, positioned itself as "authentic, not agentic," so there was a good degree of irony in my walking around and sitting in on small-group discussions with a recording device on my face all day.

I had quite a few conversations like this:

"Are you wearing the Meta glasses?"
"Yes!"
"Wait, are you recording me right now...?"
"No." (beat) "But do you trust me...?"
"Um...I mean, I just met you..."
"Want to try them on?"

I had several weird moments in group sessions where my glasses suddenly started reading out text message updates from my husband or nanny about childcare logistics, and at least one moment, at a particularly inopportune time, when music that no one else could hear started playing loudly in my ear.

I'm not going to lie: this degree of uncomfortable social friction is not for everyone. A friend I saw at the conference told me that he wore the glasses around the city for a week but ultimately got heckled so much that he returned them. Interestingly, my experience has been quite different: I didn't get heckled, I got hit on.

4. Effectiveness at reducing screen time

Verdict: Maybe, but not as much as I'd want

I was really hoping that wearing the glasses would significantly reduce the screen time on my phone. According to my iPhone, my screen time is down 39% this week compared to last week. But I don't feel particularly less encumbered, and it's hard to tell whether that will hold as a long-term pattern.

I did find that using "voice dictation mode" to respond to texts reduced my need to check my phone with every buzz. And rather than walking around the world with a phone in my hand, checking social media, I was using my glasses to ask questions about the things around me, so in that sense I was more physically plugged in.

But since the glasses are still tied to the phone (by Bluetooth), you need your phone within arm's reach all the time, and I still fought the compulsion to pick it up and idly scan through my email in quiet moments, like on the subway or sitting on a park bench.


Conclusion

There's a lot to like about the idea of wearing a portable learning lens that helps you interpret the world around you, but I think we have a long way to go. In my next post on this topic, I'll share a few ideas for how this experience could be improved in the next generation of wearables.


#ai #technology #wearables