For years, weekend bike rides have been a sacred escape for me. Each pedal stroke helps dissolve the stress that accumulated during the week, and over time I've collected gadgets that improve these rides. But I learned the hard way that bringing too much equipment takes away from the ride itself, forcing you to manage a network of devices and battery levels instead of just riding the damn bike.
Enter the Ray-Ban Meta: smart glasses that have made my weekend rides simpler and a little more fun.
Instead of wearing sunglasses, a pair of headphones, and fumbling with my phone to take photos mid-ride, I now have one device that helps with all of it.
The Ray-Ban Meta smart glasses have been a surprise hit with more people than just me. Meta claims to have sold millions of the devices, and CEO Mark Zuckerberg recently said sales have tripled in the last year.
Several Reddit threads and YouTube videos suggest that plenty of people wear Ray-Ban Metas while cycling. Meta has taken notice, too: it is reportedly building a next generation of smart glasses with Oakley, built specifically for athletes.
I never expected to wear my Ray-Ban Metas on a bike. But a few months ago, I decided to try them.
Now, I wear these glasses on bike rides more than anywhere else. Meta got enough right with these smart glasses to convince me there's something here. They're almost a joy to use, and with a few updates, they could get there.
A key strength of the Ray-Ban Metas is that they are simply a solid pair of Ray-Ban sunglasses: mine are the Wayfarer style with transition lenses and a clear plastic frame.
I found they work well for bike rides, protecting my eyes from sun, dirt, and pollen. They sit comfortably under a bike helmet, though perhaps not perfectly. (More on that later.)
The killer feature of Meta's smart glasses is the camera built into the frames, just above the eyes. The glasses let me grab photos and videos of things I see on my rides simply by pressing a button on the upper-right corner of the frames, instead of fumbling with my phone, which always feels clunky and a bit dangerous on a bike.


While riding through Golden Gate Park in San Francisco last weekend, I used the Ray-Ban Metas to take photos of the beautiful Blue Heron Lake, the shrub-covered dunes where the park meets the Pacific Ocean, and the tree-lined path at the park's entrance.
Is the camera fantastic? No. But it's quite good, and I end up capturing moments I simply never would have if I weren't wearing the glasses. For that reason, I don't see the camera as a replacement for my phone's camera, but rather as a way to capture more photos and videos.
The feature I use most: the open-ear speakers in the arms of the glasses, which let me listen to podcasts and music without blocking out the sound of the people, bikers, and cars around me. Meta was far from the first company to put speakers in glasses; Bose had a solid pair for years. But Meta's take on open-ear speakers is surprisingly good. I was struck by the audio quality and by how little I miss traditional headphones on these rides.
I also found myself chatting a bit with Meta's AI assistant during my weekend rides. I recently asked it questions about the nature I was seeing throughout the park, like "Hey Meta, look and tell me what kind of tree this is?", as well as the origins of the historic buildings I passed.
I generally use bike rides as a way to disconnect from the world, so it felt counterintuitive to talk with an AI chatbot while riding. However, I found that these short queries fed my curiosity about the world around me without sucking me into a rabbit hole of content and notifications, which is what usually happens when I use my phone.
And, once again, the best part of these features is that they all come in one device.
That means fewer things to charge, less clutter in my bike gear bag, and fewer devices to manage during my ride.
Shortcomings
While the Ray-Ban Metas seem great for walking around, they clearly weren't designed with cycling in mind.
The Ray-Ban Metas often slide down my nose during a bumpy ride. When I'm hunched over the handlebars and look up to see what's ahead of me, the thick frames block my view. (Most cycling sunglasses have thin frames and nose pads to solve these problems.)
There are also limits to how the Ray-Ban Metas work with other apps, which is a problem. While I love taking photos and pausing music with the glasses, for anything else, my phone has to come out of my pocket.
For example, the Ray-Ban Metas have a Spotify integration, but I had trouble convincing the AI assistant to play specific playlists. Sometimes the glasses wouldn't play anything when I asked for a playlist, or they'd play the wrong playlist entirely.
I'd like to see these integrations improved and expanded to include more cycling-specific apps such as Strava or Garmin.
The Ray-Ban Metas also don't work very well with the rest of my iPhone, which is likely due to Apple's restrictive policies.
I'd love to be able to fire off texts or easily navigate with Apple Maps using my Ray-Ban Metas, but features like that may not be available until Apple releases its own smart glasses.
That leaves Meta's AI assistant. The AI feature is often touted as the main strength of these glasses, but I frequently found it lacking.
Meta AI's voice mode is not as impressive as AI products from OpenAI, Perplexity, and Google. Its AI voices sound more robotic, and I find its answers less reliable.
I tested the Ray-Ban Metas' recently launched live AI sessions, which were first revealed at the Meta Connect conference last year. The feature streams live video and audio from the Ray-Ban Metas to an AI model in the cloud, with the goal of creating a more fluid way to interact with your AI assistant and letting it "see" what you see. In practice, it was a hallucinated mess.
I asked the Ray-Ban Metas to identify some of the interesting cars I was biking past near my apartment. The glasses described a modern Ford Bronco as a vintage Volkswagen Beetle, even though the two look nothing alike. Later, the glasses confidently told me that a 1980s BMW was a Honda Civic. Closer, but still very different cars.
During the live AI session, I also asked the AI to help identify some plants and trees. The AI told me a eucalyptus tree was an oak. When I said, "No, I think it's a eucalyptus tree," the AI replied, "Oh yes, you're right." Experiences like these make me wonder why I'm talking to the AI at all.
Google DeepMind and OpenAI are also working on multimodal AI sessions like the one Meta offers with its smart glasses. But for now, these experiences feel far from finished.
I'd really love to see an improved version of AI smart glasses that I can take on bike rides. The Ray-Ban glasses are one of the most compelling devices I've seen yet, and I can imagine how wearing them on a ride would be a joy after a few key updates.