LIVE CLIPS
Episode 9-17-2025
Great to see you. Congratulations on a massive day. You got a bunch of fans here. Love to see it. Yeah, it's a fun one. You still winded from the run, or? No, that was a pretty conversational pace. That's conversational pace. Love it. Yeah. React to the Connect announcements. How do you envision the next phase of this with developers? I mean, there's so many cool ideas that I could imagine happening on Ray-Ban Display, but there's an immense amount of constraints operating in such a small format. What does this look like over the next couple years? Yeah, well, I mean, I think that there are two platforms here that are interesting. One is the display glasses and the other is the Neural Band. Sure. And I actually think both of them could evolve into important platforms by themselves. So the glasses, I actually think there it's pretty clear. Right. I mean, you saw there's the nav, where there's a bunch of different apps. We're going to try to start off with partnerships and start off getting some of the most used use cases and really nailing those and getting them in there. And then over time, hopefully we'll be able to open it up in some way. But I think we need to figure that out. The Neural Band, I think, is going to be an interesting platform by itself because, I mean, right now we basically designed it to be able to power glasses. I mean, that was the purpose. But there's no rule that says it can only be used to power glasses. I think that's an interesting thing to explore over time too. I mean, you can imagine, you know, something like this when you're sitting at home and watching TV being pretty cool too. So I think we need to figure out what direction this goes in over time. But this is a pretty good start. We've done this last year. The display is going to get all the attention, but the Neural Band is insane. I can't wait for people to try it. I mean, the fact that you can buy this in a couple of weeks is just insane.
Talk about the team's foresight around the intersection of glasses and AI, because now it seems incredibly obvious, right? This always-on, this live AI. But it wasn't that long ago that people thought these were two different sort of tech trees and they didn't see the convergence. Yeah, I mean, look, every new important technology needs a new class of devices in order to make it first class. And I think glasses have three main advantages that are going to make them the ideal candidate to be the next major computing platform. One is that they help preserve this sense of presence when you're there with another person. I mean, you take out your phone, you're gone from the moment. Glasses have the ability to bring that back. Two is that glasses, I think, are the ideal form factor for AI, because it's the only device type where you can let an AI see what you see, hear what you hear, talk to you throughout the day. Soon it's going to be able to just generate a UI visually for you in your vision, in real time. And then the third thing that glasses can do, it's really the only form factor that can bring together the physical world that you have around you with the digital world.
On the Gen AI side, do you have a view on how much AI content we're going to be seeing? People like to complain about AI slop, but I've seen some incredible AI-generated videos. I'm sure we've all seen Harry Potter Balenciaga. It clearly still had a human element in it. It wasn't just "make me something that gets likes." Yeah, there was a human touch that was enabled by AI technology. Right. Well, I think you're going to see, like with all other technology, that there's going to be good and there's going to be bad. And the most interesting content that I've seen that has been generated with AI, or that AI has been part of creating, has had a point of view that has come from a person. Sure. I do think what you're going to see is, yes, more purely generated AI content growing over time, and some of that is going to have real risks. Things like deepfakes trying to misrepresent what's happening. Some of it's going to be really inspiring and trying to help you. You know, you can imagine things like creating tutorials to learn how to do things that you couldn't do before. A creator might not have been able to do that before; now they can use tools that do just that. I also think you're going to see a lot of content that is sort of hybrid. We don't talk about this a lot because we're more focused on the extremes. But AI can help people just clean up photos, clean up videos, give every clip in a reel the same lighting. There's a lot of basic stuff that is actually, I think, a super important opportunity for creators. My view is, if you've built an audience that cares about you and cares about your content today, you're going to do really well over the next 10 years. You're going to be able to make more content, you're going to be able to make better content. And then the exciting thing is the entirely new categories of people who never thought to make something because it was really hard, or they never thought to learn. Right.
I mean, the beauty of Instagram early on, and even Facebook, was like: Facebook, you could just type out a message, hit return, post it. Instagram, you could take a picture, post it. Maybe you add a filter, maybe you don't. And it's just about reducing that friction. The question I have is how you think about the push and pull between keeping Instagram, you know, keeping Instagram Instagram. Right. People have an idea of what an app is, and then there's constantly pressure to add new things and do new things. But how do you think about that push and pull internally? So I think about our reason to be as inspiring creativity and helping people connect over that creativity. I see an amazing piece of stand-up that I know will really hit hard with my brother, and I send it to him, and then we talk about it. You're seeing the shares are higher than likes on a lot of reels. Yeah. So it's about sharing reels, it's about responding to stories, it's about connecting over your interests. Now, how people do that on Instagram is going to have to change, because how people communicate with their friends and how people entertain themselves inevitably changes. Often people think of Instagram as a feed of square photos. But if we didn't evolve, if we didn't add video, if we didn't add stories, if we didn't add DMs, if we didn't add reels, we wouldn't be here today. You wouldn't be asking me any questions. And so we have to figure out, how do we evolve?
See more and more people wearing two devices on their wrists. People are very comfortable with this. All right, we've got an after-party over at Meta's. Diplo is going to play. There you go. Please join me in welcoming Diplo. Fantastic. Well, we are moving over to our first guest of the stream, Chris Cox, the chief product officer at Meta. People are also starting to learn that you're a big runner and you've got the whole Diplo run club. Exactly. So what do you think? Should we run over? Ready for a run right now, take these things for a spin? Absolutely. All right, let's do it. Meta, play "Be Right There." Going for a run. And I think, I believe they're going to run right past us. We will wave to them when they run over here. Going for a light jog before hopping on the show. Love to see it. The warm-up. Great. And we are ready for our first guest of the show. Welcome to the stream, Chris Cox. Let's do it. Thanks so much for hopping on. How you doing? Welcome. This is Jordy. Here, grab a headset. There you go. Shades or not? Yeah, please. It's a little hard to wear over the head under the headset, but you can make it work. Nice. Yeah. Which ones are you grabbing? Well, I brought my own. I got these Navy. What do you daily drive? The full size. I like the Navy. Great. And they're transitioning. Here, pull up the mic a little bit so we can hear you. There you go. Great. Can you hear me? Loud and clear. Sweet. Yeah. So what does your organization look like right now? I mean, you've been at Meta for 20 years, right? Almost 20 years. Almost 20. Congratulations. I mean, it's a massive company. How do you fit in today? So I'm the CPO, Chief Product Officer. I lead the family of apps. So that's Facebook, Instagram, WhatsApp, Messenger, Threads, Edits. Working very closely with Alex and that team, building out all the AI stuff that we're doing.
I also lead our privacy team, the team that thinks about protecting user data. Yep. How has your frame of mind changed in the age of AI, around the trade-offs, the decisions around how you build the products? It's a new era for product guys. Yeah, it is. I mean, it's changing these days. It's changing like one week at a time. That's how much is changing. How people engineer. Prototypes can now be done, stuff can be done, in hours that used to take weeks. And part of what we're trying to do for the company is just encourage everybody, even if they know what they're doing, to take risks on trying to do things differently and to learn as quickly as they can, all the way down to the way infrastructure is built, the way bugs are detected, the way optimizations are made to ranking. For example, we've been ranking News Feed since 2006. We're now starting to deploy agents to think about how to do that themselves, and we're already seeing pretty interesting wins in terms of just making the experience better for people. So I would say it's changing very rapidly and it requires a huge amount of constant attention to make sure that we're staying on the edge. What about at the product level for consumers, and how you think about product quality? Historically it was easier to be like, does the button work or not? Yep. And now we're in an era where AI is probabilistic. You don't have the same ability to have consistency. How has that kind of shifted your thinking? A lot of it. I mean, AI can be used to detect edge cases a lot more easily, which is really important. AI can be used to scale a judgment to lots more types of people and lots more languages. For example, one of my favorite features on the glasses is live translations. And then one of my favorite features we've started to roll out on Instagram is captioning and lip-syncing, so that you can take any video creator's language and translate it into the native language of the viewer, along with lip-syncing.
This to me is very, very fundamental if you think about what it unlocks. It's kind of like Tower of Babel-level phenomenal to take any language and translate it into the voice of the listener. So it scales the kind of thing that's just pure human connection, but it does it in a way that's instantaneous and could let somebody who speaks a relatively small language family experience the rest of the Internet, or experience any speaker out there. Yeah, it'll be interesting to think about new superstars, Internet superstars, starting out default global, just because they're able to be instantly translated across the entire world. Exactly. Yeah. I mean, you said something about, you said you could scale a, what was that? A resolution or something. You had some word for it. But I'm interested to know how you think about the trade-offs between rethinking products entirely from the ground up in AI-native ways versus there are.
We're going live, Jordan. Going live. It's time. Three, two, one. Here we go. There he is. There he is. All right, here we go. No way. Here we go. Wow. Throw on some tunes. Live demo. Ballsy. High risk, high risk, high reward. And I will say, the speakers in the new Meta Ray-Bans have improved dramatically. Yeah, there you go. Just ripping emojis on the way in. Hey, there's Diplo. Good to see you, Wes. So the glasses can support live stream? They must. Maybe that's the one more thing we'll see. Here we go. Packed house at Meta Connect 2025. Here we go. We'll talk about these in a minute. Here we go. Welcome to Connect. All right, no chain. AI glasses and virtual reality. Our goal is to build great-looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms. And these ideas combined are what we call the metaverse. Now, glasses are the ideal form factor for personal superintelligence because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more. Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day, and very soon generate whatever UI you need right in your vision, in real time. So it is no surprise that AI glasses are taking off. This is now our third year shipping AI glasses with our great partner, EssilorLuxottica. And the sales trajectory that we've seen is similar to some of the most popular consumer electronics of all time. Now we are focused on designing glasses with a few clear values. Number one, they need to be great glasses first.
Now, before we get to any of the technology, the glasses need to be well designed and comfortable. And if you're going to wear glasses on your face all day, every day, then they need to be refined in their aesthetics and they need to be light. So in addition to working with iconic brands, we have spent years of engineering obsessing over how to shave every fraction of a millimeter and portion of a gram that we can from every pair of glasses that we ship. And I think that shows in the work. Number two, the technology needs to get out of the way. The promise of glasses is to preserve this sense of presence that you have when you're with other people. Now, this feeling of presence, it's a profound thing. And I think that we've lost it a little bit with phones, and we have the opportunity to get it back with glasses. So when we're designing the hardware.