Episode 29: Inside Facebook’s Ambitious Plan To Build Out the Next Dimension

Read the transcript below, or listen to the full interview on the First Contact podcast.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.

Laurie Segall: You are building out a new computing platform. You’re building out a new social world. And I don’t actually know if people realize what a big deal this is. 

Andrew Bosworth: I could not agree more. This is like the beginning of computing again. You know, this is going back to the 1960s, the 1970s, when the computers that we know now, even the ones we have in our pockets, really come back to the same fundamentals that were designed way back in the middle of the last century. And with VR and AR, suddenly it is a different paradigm.

He is sitting in one of the most influential seats at Facebook… building out a new virtual, social world. 

Andrew Bosworth – better known as Boz – was among the first 15 engineers at Facebook; he still remembers the exact day he started in 2006. Over the years he helped build News Feed, and he oversaw Facebook’s advertising efforts during the 2016 election.

And he’s also become known for his controversial and unfiltered opinions – which he sent out in company-wide, attention-grabbing memos.

Now, Facebook is pursuing an ambitious vision of virtual reality and augmented reality, and it’s Boz who’s in charge. 

There’s a lot at stake as Facebook builds out what has the potential to be a whole new dimension.

What will AR and VR look like down the line? And how do you make virtual interaction feel as natural as in-person interaction? What will this mean for the future of remote work? How close is too close in the virtual world? And when it comes to creating your virtual self… who owns your identity?

These are all ethical questions, and they’re the types of questions Facebook is looking at as they invest pretty heavily in building out AR and VR.

And as the company creates another social layer, I think it’s important to ask Boz – how confident is he that Facebook won’t re-create the problems they faced as a platform over the last decade?

But before we get to Boz – I want to tell you about something new from Dot Dot Dot that I am really excited about. It’s our new email newsletter – The Grey Area.

Each month The Grey Area confronts the complex issues facing technology and humanity – issues that aren’t necessarily black and white. 

In October, we’re exploring the controversial topic of technology and neutrality. Including an unfiltered perspective from a well-known Silicon Valley founder who says tech should NOT try to be neutral. And those building platforms now… should start building with that in mind.

It’s actually a fascinating take – and I hope you guys don’t miss out. You can sign up at Dot-Dot-Dot-Media-Dot-Com SLASH Newsletter. 

And now, it’s time for Boz. 

I’m Laurie Segall, and this is First Contact.

Laurie Segall: Boz, that’s what we call you. That’s the nickname. That’s the correct way I should describe you.

Andrew Bosworth: That’s right. You’re also welcome to call me Andrew. I do respond to both names.

Laurie Segall: Okay.

Andrew Bosworth: From time immemorial, people have preferred to call me Boz.

Laurie Segall: Great. Currently a VP at Facebook Reality Labs, but you’ve been at Facebook since 2006.

Andrew Bosworth: That’s right. January 9th, 2006.

Laurie Segall: January. I love, by the way, that you know the date that you started.

Andrew Bosworth: That’s a big deal. That’s a big deal. It was also two days after my birthday. I started, uh, it was, uh… My birthday is January 7th.

Laurie Segall: Wow. Do you remember your first day?

Andrew Bosworth: Vividly. Yeah, absolutely. Uh, it was, you know, a bunch of really great Facebook engineers joined at the same time that I did. Um, you know, Mark Slee, Dave Fetterman. It was a fun time for us and uh, quite a crazy change. I came from Microsoft. Uh, and so going to this little startup where like there was just no one even to greet, you just kind of wandered in and found yourself a desk. It was pretty different.

Laurie Segall: Hmm. What was your first conversation like with Mark that day? What was the first thing he had you do?

Andrew Bosworth: We had to go fix bugs, which is a tradition that continues to this day. He was like, “Hey, here’s your desk and here’s a computer, go fix some bugs.”

Laurie Segall: Right. I mean, for folks who are listening to this, context is you’re one of the first 15 engineers at the company. And you helped build News Feed, Messenger, Groups. So you have just been a part of every major milestone of the company to this day.

Andrew Bosworth: I’ve had a, a good fortune of working across a huge breadth of the company. Of course, there’s always so much more, a lot of things that I’ve, I’ve never gotten to, to work on that are great. Um, but I’ve, yeah, I’ve had, I’ve had a really fun set of projects to work on connecting people, uh, and, and creating these communication tools that people use every day. It’s really satisfying.

Laurie Segall: And also what an extraordinary time to be sitting in your seat. And, and also one of the things I love about you, having been in the industry for a while and having followed the company closely, is you’re, you’re kind of one of the, the executives that says how he feels, which as journalists, but just like as people, we, we really, uh, appreciate. But, but it is really fascinating because you’ve said some things throughout the years that always get a lot of attention. But you are a person at the company, at a big corporate company, who’s kind of known for saying how you feel.

Andrew Bosworth: I think it’s saying, saying how I feel, but also trying to bring voice to conversations that are important, is a bigger part of it. I think it’s tempting for any company that gets big to get comfortable or to get into a habit of not asking the hard questions. And I’ve always wanted to be someone, it doesn’t matter if I’m just a regular employee or if I’m an executive, I want to be somebody who invites the hard questions, who brings those to the surface to make sure that we’re always doing that work. And it’s never been more important than it is now.

Laurie Segall: Yeah. I want to get to, to what you guys are announcing and, and, um, virtual reality and augmented reality, which I just think is actually fascinating. You know, I think a lot of times in technology, everybody has one conversation and we totally don’t look the other way. And, and I think that you’re sitting in probably one of the most influential seats at Facebook, given everything that’s happening. You’re almost building out… you are building out a new computing platform. You’re building out a new social world. And I don’t think, I don’t actually know if people realize what a big deal this is. It, you know, I know people have been talking about VR for a long time. People have been talking about augmented reality. I know that, that Mark has, in his non-New Year’s resolutions, which he always posts on Facebook. He did this year a non-New Year’s resolution where he posted something about augmented reality. Fast forward to the pandemic and remote work, it seems to me that you have one of the most important roles at the company right now.

Andrew Bosworth: It’s hard to gauge its importance. You know, how would you weigh the importance of new computing platforms versus, uh, bringing greater integrity and privacy and security to existing platforms? I think they’re all important. Uh, I can certainly tell you, it’s one of the most fun jobs at Facebook right now. I really, I wanna double down, you said you don’t think people will know how important this is. I could not agree more. This is like the beginning of computing again. You know, this is going back to the 1960s, the 1970s, uh, when the computers that we know now, even the ones we have in our pockets, really come back to the same fundamentals that were designed way back in the middle of the last century. And with VR and AR, suddenly it is a different paradigm. It’s not just flat 2D windows that you directly manipulate somehow whether with your, a mouse or your finger, it’s, it’s like in the world, there’s a bunch of elements that the computer can’t control that it has to adapt to. Not only was that impossible previously from a standpoint of the displays, which still don’t exist yet, but we think they can, uh-

Laurie Segall: Mm-hmm.

Andrew Bosworth: … the sensors, all that, it’s also the AI, the, the intelligence you need to be able to be useful in that kind of scenario. So it does feel like we’re at the beginning of a really big arc in progress for technology. Whereas the mobile phone was maybe the end of the last arc of progress. And, and that’s exciting.

Laurie Segall: But gimme the sell, ’cause like I’ve been in this for a while, right. And how many times have we heard people say like VR, the next big bet is VR and, and AR, and like you, you and me both know that people have been saying this for a really long time. I think my instinct is saying, “No, no, no, something about this actually feels kind of different.” And maybe it’s because of all the external things happening with remote work and with our reliance on screens and our craving humanity in a different way, but something does feel different. So like Boz, like, give me the sell, like VR hasn’t really hit a hundred percent. Maybe you can argue with me on this. Um, but, but why now do you think this is the moment for virtual reality, augmented reality?

Andrew Bosworth: Well, in the case of virtual reality, you have the mobile phone to thank. Mobile phones created an incentive for technology to miniaturize, improve performance per watt, improve things like cameras and make them smaller and cheaper and higher fidelity and improve, uh, things like displays, very, very small, tightly packed displays. Without all of that, the physics of virtual reality, which yeah, have been around since the ’80s, were just unworkable. You know, some of the early, uh, VR headsets were so heavy they had to be suspended from the ceiling by steel cables. They used to call that the sword of Damocles. Uh, ’cause if it fell off, it would kill you while you were using VR. You know, today we’ve got Quest 2 weighing in at 15% lighter than the generation that was launched just a year and a half earlier, four times more powerful, 50% more pixels. And it’s $100 cheaper. That is, you know, the benefit that we have of working on a supply chain that was really developed for mobile phones, but works beautifully for VR as well. Not to mention a tremendous wealth now of 3D content, thanks to years of, of heavy investment in the industry around, uh, 3D gaming in particular. Uh, so I do think it does feel different. I, it reminds me, if you, if you went back and you had like a PalmPilot or you had a Handspring, those were awesome devices and you could glimpse in those devices what the iPhone would become, but they weren’t the iPhone. I think the previous generation of VR was kind of like those PalmPilot, Handspring type devices. Right? Yeah. I get it. Like, if you could do this, it would be cool, but you can’t do it yet. We can do it now. It’s exciting. Like, it’s here. If you’ve used it, you’re like, “Oh my gosh, this, this is it. This is what I’ve been waiting for.” Now you can use your hands. It can be very natural. 
Augmented reality, we can see, but we’re like, it’s, we’re still trying to solve some of the fundamental physics problems, you know, like how do you literally make certain wavelengths of light? How do you bend those wavelengths of light in the right way? Um, so we’re, we’re at a little bit more of a fundamental stage with AR, but the same technology should allow us to cross the bridge. And, and also huge advances in wireless technology are critical as well.

Laurie Segall: Do you think when it comes to the future, and I want to get into specifically some of the, the platforms that you guys are launching and, and what you guys just announced, but I mean, do you think that the future of Facebook is, um… You know, I look at, I look at the history of Facebook and I look at Instagram and WhatsApp and all of these different, um, products that are owned by Facebook, um, to, to some degree and that have been integrated with Facebook. Do you think that when we look at Facebook in 10 years, and maybe this is just to kind of talk about the stakes of this and to talk about how important, um, building out another world is and what will come along with that. Do you think that these worlds that you guys are building, the ones that we’re about to talk about, will be kind of the next dimension, the next layer of Facebook? That in maybe 10 years, we might not even be on the Facebook we know, we’ll be in these different worlds that you’re building today?

Andrew Bosworth: I love that you used the word layers there, ’cause that is how I think of it. You know, I don’t, we still make phone calls today, and that’s a layer of communication that we as a society laid the foundations for 100 years ago. And then we added text messaging, and we’re increasing the speed and the fidelity and the richness with which we can exchange information when we’re at a distance. Look, nothing is as good as being in the same room with somebody you love. That’s, that’s a high standard to meet, but can it be better than VC today? Absolutely. Like we can do better than this. And I think all the time about, uh, bowling. Laurie, you ever been bowling?

Laurie Segall: Totally. I was on a bowling league when I was younger. Just FYI.

Andrew Bosworth: I love that. What a, what a weird thing for us as a species to do. Can you imagine if we saw like an ant go bowling or like a dog bowling-

Laurie Segall: Right.

Andrew Bosworth: … we’d be like-

Laurie Segall: Uh-huh.

Andrew Bosworth: … it would be the greatest sensation of all time. What do we, we, why we go bowl- We go, you wanna have something, just any excuse for us to have a shared experience, to create memories, to have an excuse to go to a place and be together. And I think when you’re in virtual reality, like I’ve done a lot of, uh, you know, happy hours with friends over Portal and they’ve been great, but at some point they kind of drop off the calendar ’cause you don’t have a reason to do it. There’s not like a thing that anchors it. Uh, with virtual reality, with augmented reality, you potentially do have those things. Do I think so, do I think that they replace Facebook as we know it today? No. There’s still plenty of opportunity there where I wanna either, uh, asynchronously communicate through sharing and broadcasting or multi-casting, or I want to communicate one on one or do really richly… And by the way, video calling for two people is pretty great. You can see my full facial expression. It’s really rich. You have a good sense of my emotion. So there’s a lot… All that value still exists. There’s gonna be new forms of value. Um, which yeah, I think a year ago would have been maybe a tougher sell, but now that people are in lockdown, they’ve experienced what it’s like to be in quarantine. You get it. It’s like, Oh yeah, that’s, that’s a-

Laurie Segall: Yeah.

Andrew Bosworth: For some people, for immigrants, that’s what life is like every day, they don’t know anybody, their loved ones are far away and like that’s, that’s something they can do about it. So I really believe in this direction, uh, for us as a society. And I think it’s also important as we’re seeing now for people who wanna collaborate at work.

Laurie Segall: Yeah. You know, I remember I started covering tech in 2009, right out of the recession after 2008. And there was all this innovation that happened because you saw that there’s so much broken and, and I think this experience even being on Zoom and, and the fact that, you know, I don’t think we’ll go back to work in the same way. Although I hope we all go back to work in some capacity, but some of this will remain, right? These ideas of remote work. So there is a tremendous opportunity you know, for, for these platforms. And I can see that Facebook, you know, in many ways wants to own that, right? And, and of course, I think that comes with so many interesting questions, o- o- on the human side along with, with that, you know, but this experience right now we have is pretty broken, right? And I think people are sick of the Zoom apocalypse and, and will people, you know, wanna Zoom when, you know, when we are kinda going back to work and what will be that in between human connection, it seems like that’s the thing that, that you’re thinking about.

Andrew Bosworth: Well, I also, I also want to be clear, like, I don’t know that I wanna own it. I just wanna make sure there’s space for it. I mean, honestly, if you look at… There’s lots of examples of, of technology that we use, that we don’t primarily use to communicate or connect with people. Um, and this also goes back 70, 80 years. There’s a very deep divide in the history of computation where some people felt, “Hey, this is about a tool being useful for me as an individual and that would make me more powerful.” And there’s people who said, “No, no, no, this is a tool to connect with other people.” That’s why it’s so incredible that in 1968, that Engelbart, who debuted the computer mouse, also debuted video conferencing and shared document editing. You know, it was important to the early pioneers of the current generation of computing that this be not just about, “Oh, I can do spreadsheets more effectively,” but also enabling the internet. Ethernet came out of Xerox, like incredible leaps forward in our ability to connect across distances. And for us at Facebook, that is what we care the most about. And I do legitimately worry that if we’re not in there at the pioneering stage of these new technologies, that other technology providers will just cut that use case out, and it’ll still be great devices, they’ll be super useful to you, but they won’t help you connect with other people. And so I don’t need to own the whole thing. I’m happy to, to play in a lot of different systems. I need to make sure there’s space for this really valuable work to happen. And we’re the ones who care the most about it.

Laurie Segall: And I wanna get into that by the way, because I do think this idea of connection has come along with complicated questions. And so I want to talk about how you guys are thinking about that as you build out a new computer interface, as you build out these layers. But I don’t wanna speak around. I wanna talk specifics. Um, you guys launched quite a bit, you made quite a few announcements in the, in the last couple of weeks with Facebook Connect. So talk to me, I mean, let’s start with Horizon. I thought Horizon was-

Andrew Bosworth: Yeah.

Laurie Segall: … super interesting. It’s kind of weird to talk about it over a podcast, but like-

Andrew Bosworth: I know.

Laurie Segall: … if you can just like close your eyes and pretend like we’re there and like describe to people what you see when you’re wi- with Horizon.

Andrew Bosworth: Yeah. I mean, Horizon is a virtual world. It’s got things to do. There’s rooms, there’s spaces that, uh, hopefully a large community of creators can build out more and more spaces. And those could be performance spaces. If you wanted to, uh, be an artist or, or do poetry or do a performance. Those could be, uh, little game spaces. Like, uh, we played a little laser tag game. That could be, uh, a puzzle, you know, a little puzzle space. Escape rooms are one of the popular early ones that some of our internal devs have done. Um, again, to my point about bowling earlier, it’s a place where you and I could go and just have a shared experience and it’s social, you’ve got a, an avatar, I’ve got an avatar. Um, and, and they’re not high fidelity, uh, they’re, but you do get that sense. And what we, we’re all about at Facebook Reality Labs is that sense of presence. Y- that sense that I am with somebody, that sense of being with somebody and having an experience that the two of you share together. And Horizon is, it’s just an open beta right now. And, uh, it’s pretty cool. It’s early yet. We’re still kind of working out some of the scaling issues and getting the avatars just right and getting the quality just right. Um, but yeah, no, it’s, it’s, you can think of it as a virtual space for people to go and be together.

More from Boz after the break, and make sure you sign up for our newsletter at dotdotdotmedia.com/newsletter, we’ll be launching in October. 

Laurie Segall: You kinda touched on like this idea of social VR, right? Um, and being able to be around other people and be present with other people and do things together. I always think that’s fascinating and important, especially now. I also, when I look at these tech platforms being built, right, and you see the, the wonderful videos, and, and Boz, you guys put these together, like, so you know them better than, than anyone, like where it’s all the amazing things you can do together. And all of a sudden you’re in the virtual world and you’re playing games and you’re building things. It’s not like you guys are putting “warning and also XYZ”, right? And you gotta be really careful as you’re building out a, a new dimension of some sort, a new layer, that you don’t recreate many of the problems that you have on the Facebook OG platform, because obviously you guys have been, uh, dealing with many complicated issues over the last decade, even, you know, especially over the last five years. So what are you thinking about as you build out a new layer, um, you know, what are the ethical issues you’re thinking about? What are kind of some of the human problems you’re thinking about?

Andrew Bosworth: Yeah. They run a huge range. I mean, here we, we benefit so much actually from being a part of Facebook because we do stand on the shoulders of all the lessons learned, uh, over the last 10 or 15 years, uh, working through the platforms that we’ve built there. And we benefit from all the technology that’s been deployed around that. We also have a few additional advantages, for example, in Horizon, which you just talked about, if there is some kind of an abuse happening, you have the ability, which is unique to virtual reality, to literally freeze the world, find the offending individual and like disappear them. That person just doesn’t exist for you anymore. And you get to go about your day. And so you have a lot of power in virtual reality to control your own experience, which makes sense because, you know, you, it’s all just virtual. Uh, and so I think we’ve got two advantages in virtual reality, which are really valuable. One of which is yeah, the history and technology that Facebook brings to the table. And the second one is the nature of the medium is a little bit more empowering. And then, you know, the other thing we talked about at Connect last week is for example, Project Aria. Project Aria is a research vehicle that we’re rolling out 100, kind of hand-built, pairs of glasses that have sensor packages on them. They have outward-facing sensors. They have inward-facing sensors. They have GPS. Somewhere on the order of 100 Facebook employees and contractors, fully trained, are gonna be wearing them around in the Bay Area and Seattle. And this is in- inviting, very intentionally inviting a conversation about, “Hey, what is the nature of what we should expect or allow as a society when it comes to these types of devices?” To be clear about Project Aria, we’re being very careful with it. All data is quarantined for three days. It gives us time to scrub any faces out. We blur faces. We blur license plates. 
We don’t use the data at all with those things intact. Um, the people are wearing shirts, they’re all identifiable, but set aside the specifics, the more general question is like, “Hey, augmented reality could be incredibly valuable, uh, in a really specific use case.” You know, we’re, we’re partnering with Carnegie Mellon University to say, “Hey, could these help, uh, people who are vision impaired navigate physical spaces, right?” Uh, it’s, it’s a device that could help them be able to actually not see, but navigate physical spaces more effectively than they could otherwise. That’s pretty good, but it’s also got a bunch of cameras on it that are gonna like see other humans doing things in the world. Um, what is the impact, not just on the person wearing it, which is the major focus when it comes to like mobile phones, what is the impact to the people who aren’t wearing it? What is the impact to underserved or marginalized communities who come into contact with this technology? And so there’s really tremendous opportunity for good. And there’s, you know, obviously a huge amount of risk. Well, we’re now trying to start that public conversation today, uh, last week, I suppose, so that we… ’Cause, ’cause we’re years away from having a consumer product out. And so let’s have it, let’s figure it out as a society. What do we think is a good trade off? What is an acceptable level of protection? We’re not gonna get rid of all the harms, but we can hopefully find a balance that we find equitable as a society. Um, and so that’s a really big shift. I mean, for you covering Facebook for 10, 15 years, you know, like that’s a shift. We’re trying to get this stuff out way in advance of when the product arrives so that there’s no surprises.

Laurie Segall: Although, we all know you put a product out there and just people misuse it, right? People use it in all sorts of ways that will shock you. How could you have seen that Russia was gonna do what it did for the election, right? So it’s almost also, how do you anticipate the unintended consequences?

Andrew Bosworth: Yeah. So by definition, I suppose you can’t in, uh-

Laurie Segall: Yeah.

Andrew Bosworth: … find all the unintended consequences, but we can certainly do a lot of them. You know, we really, that’s, that’s really the work that we’ve been doing, certainly in the last five years, uh, really intensified the last, uh, three, where it’s like, “Hey, okay, what are all the forms of harm that you’re gonna try to get from nation state actors?” You’re right. We didn’t, we weren’t looking at nation states before, we are now. Uh, there is a list of them. Will we catch all of them? No, but I don’t think consumers really hold people to that standard. They just want you to like control the obvious negative externalities. Um, a- th- you know, you, people, you can make a mistake, you just don’t wanna make it twice. And so for me, at least I think we are trying to take all those lessons learned. We have what we call the responsible innovation principles, which are informed by the entire history of abuse that we’ve observed, uh, on other platforms, and saying, “Okay, let’s run everything we build through every single one of those types of abuse and understand what the risks are, what the opportunities are and how to, how to do our best to mitigate.”

Laurie Segall: I mean I think it’s gotta be so fascinating to, to be building this right now. Can you take us, um, take us to the inside Men… Are you even… You’re… I guess you’re not in Menlo Park right now. I mean like-

Andrew Bosworth: No.

Laurie Segall: … you guys are, I guess, take us to the, the, the virtual rooms where you guys are discussing some of these issues, right? Like I remember, um, covering virtual reality, I, a woman who had been harassed in the virtual world, right? And she talked about not having the physical control to like push someone away, but you hear people’s voices. I mean, that was insane to me, Boz, right? Like in, she talked about, you know, this idea that she couldn’t actually physically move someone away and they could continue to harass her. And, and this was through Oculus, right. But it was a ga- the game developers hadn’t really understood that harassment could actually happen like this in this world. So like, take me to the behind the scenes conversations that you guys are having, um, about these. Like, is there anything specific that you guys are talking about? I just feel like these jam sessions, because you, you talk about how you are trying to think through some of these scenarios before, they’ve gotta be pretty interesting. Um, take us to them.

Andrew Bosworth: Yeah. I mean, so, you know, to use the Horizon example, just ’cause it’s convenient-

Laurie Segall: Yeah.

Andrew Bosworth: That’s a scenario that we obviously did think through. Uh, and in fact, one of the debates we have inside is, what is the distance that you can allow avatars to get close to each other? Uh, because for some people, close talkers in virtual reality give them a sense, a sense of unease, but for other people, they wanna be able to do things like whisper quietly and have intimate conversations.

Laurie Segall: Right.

Andrew Bosworth: And so, it’s, that’s an example of the conversation we’re having right now inside the company of like, “Hey, like how much social distance is required in virtual reality for like maximum comfort?” When we get those situations, we err on the side of comfort, you know. Obviously you have to start by building a… If you don’t have a team that’s diverse, inclusive, and equitable, then you’re not gonna have even the eyes on the problems. You know? Uh, one of the things that I think we, we were lucky for early on is we had, uh, quite a few women in the Oculus organization who were testing the headsets out. It wasn’t working for hair, it wasn’t working for makeup. So now we’ve got the accessory program for Quest 2, which should allow for a more diverse variety of hairstyles. Mine is admittedly relatively easy to operate. For those who don’t know, I am bald. Um, but that’s obviously not a thing that you wanna optimize your headset around. Um, so there’s, for, you have to start with a team. You have to create space for those conversations. Uh, frankly, a lot of it is also working with external parties, with experts. You know, we announced two RFPs last week for over a million dollars, uh, on understanding the impact of technology like AR on underserved communities and underrepresented communities. And that’s worth it. Why would you expect us at Facebook to be able to do that work? Like, you know, we’re almost by definition not an underserved community. We have to be doing outreach in those spaces. You have to be paying attention and inviting the criticism and the hard conversation that comes. So it’s three things, right? One is like looking at historic harms that you observed. Those are kind of the best cases. You know how to manage those pretty well. And so you can talk through those. The second one is having a team that’s really agile and able to hear and understand criticism as it’s coming in and internalize it. 
And then you also have to have the reactive, “Okay, we didn’t see this one coming and no one did and something bad happened. How do we adapt quickly?” You have to have all three of those muscles. Uh, and we do it at different times.

Laurie Segall: You know, I was doing a, um, kinda, it was some kind of, uh, demo. Uh, it wasn’t the Horizon demo, it was another demo. I think that, that you know, was about kind of a work, uh, a virtual work type space. And they created an avatar for me. And I s- I mean, I swear to God, Boz like, I, you know, i- it was, first of all, I, e- everyone’s pantless, right? Like, we should just say that, right? Okay. I mean, it’s just weird I’m just saying like, it’s a little str- e- everyone’s legless. And, and I think like they made my body look very strange. And as a woman, I was like, this is just so weird. And it was all these dudes around me, like talking to me about work, it was a very strange experience where I, the, the… Let me just say this, like the person who sees the f- who can see the future and see the point was like, “Oh, this is, we’re onto something real.” But then I was saying to myself, as I was sitting there, um, legless, sorry, not pantless, legless like with a, my arms were like seven feet long it felt like, and I was surrounded by like tech kind of tech bros who were speaking at me about, you know, the future of work and I could barely get things to move. I was like, “This feels a little weird,” you know? So I, you gotta, I, this is why I ask you about these things ’cause I mean, I, I do think it’s probably the, the future and you’ve gotta be kinda thinking about the little things, if this is like the world that we’re building out, that we will eventually in some capacity be living in.

Andrew Bosworth: Yeah. So it sounds like you landed yourself in the uncanny valley, and, you know, it’s unquestionable that you really need representations in digital spaces to either be cartoonishly inaccurate, where no one expects accuracy, or really accurate, to the point that-

Laurie Segall: Right.

Andrew Bosworth: … you really are proud of it. You know, listen, hey, you may have combed your hair today. I oiled my beard. Like, we did things to look presentable, and this is just a video conference. This is a podcast, like, I made myself look pretty in the face for a podcast, or this is as pretty as I get. My apologies to the audience. But of course people care about how they present themselves in all spaces, digital spaces included. You can do either. The approach we’re taking with Horizon and Venues so far is very consistent. You know, it’s in line with the Facebook avatars that they’ve launched. It’s all built by this Facebook Reality Labs team. Uh, and then we do have a vision to get to what we call Codec Avatars, which are incredibly rich, realistic reproductions of faces. Bodies are pretty much always going to be estimated. Other than faces and hands, humans don’t really key on specific details that much about one another. Uh, faces and hands are a tremendously rich communication surface. We’re all tuned in our brains to identify small movements in those things. The rest of it, we can kind of estimate. Legs are particularly hard. I’m sitting down. Do I look short to you? I don’t know, like, how do we wanna play that? Virtual spaces do have some challenges relative to… the extra degrees of freedom cause some challenges as well. Um, but I do think you have to take that glimmer and realize, honestly, it’s not that far. We’re seeing tremendous vertical adoption of virtual reality. It’s early yet, but that’s how these things always start. And what I like about it is the place that we are is not a place free of problems. I don’t like to be in a place free of problems. It’s a place full of problems that I believe we can solve, and people will care when we do. That’s where I like to be. Like that’s where-

Laurie Segall: What do you, what do you mean by that?

Andrew Bosworth: Oh, man.

Laurie Segall: That’s interesting. What do you mean?

Andrew Bosworth: Consider News Feed. When I first joined Facebook, uh, you know, I worked with Chris Cox and Ruchi Sanghvi on News Feed, and we just knew it was gonna be a hit, because the way that people used Facebook before that was insane. They would click around profile to profile to profile to see what had changed. And we were like, “Oh, we can do better than this.” Uh, Messenger. People were already doing so many kinds of tricks and hacks to try to get chat to work on mobile phones-

Laurie Segall: Sure.

Andrew Bosworth: … to get around SMS fees, which were monstrous. Looking back, think about SMS fees, what a monstrous thing that was. How amazing is it that the internet’s freed us from, like, that 10-cents-a-message nonsense. I like it when you’re at the beginning of something and you’re like, “Oh, not only can I kinda glimpse it, and it’s pretty good, but I see a hundred things that can make it even better.” That’s where I think VR is.

Laurie Segall: Right.

Andrew Bosworth: VR has gotten good. And I just see like a hundred things ahead that can make it even better.

Laurie Segall: Totally. Any… But, but go back to News Feed, right? I remember when you guys launched News Feed, News Feed was the one that everybody was like, “Facebook is over,” right? Was that, was that the one that everyone was like-

Laurie Segall: Or, or e- everyone was very upset about it at first?

Andrew Bosworth: I think you’re describing every change we’ve ever made in history.

Laurie Segall: Okay.

Andrew Bosworth: No, I just-

Laurie Segall: No, I think these gu-, I feel like News Feed was- 

Andrew Bosworth: It was the first club.

Laurie Segall: … were there protests?

Andrew Bosworth: It was the-

Laurie Segall: Was, was that the one where there were protests?

Andrew Bosworth: News Feed was the first thing that we had done that people weren’t-

Laurie Segall: Yeah.

Andrew Bosworth: Uh, so, I don’t wanna over-fit the curve. I think sometimes when people really are upset, they really are upset. And sometimes when they’re upset, it’s because there’s been a change and there’s an adjustment period to that. And learning to distinguish those two is part of the art, I suppose, of being in these jobs. Let me give you my analogy for News Feed. We launched News Feed. Ever had a thing where you’re at a party and, you know, everyone’s talking, the music’s loud, everyone’s loud. And then for whatever reason the music gets cut. Everyone’s quiet. And the last thing that you said just hangs in the air and everyone can hear it. That’s what News Feed was like. We did that to everyone all at once. ’Cause everyone had been out there posting on walls and doing things, and technically those things were discoverable, but they didn’t think they would be discovered. And then we organized it differently. So we basically record-scratched the entire community. So the takeaway from that wasn’t, “Oh, we did it perfectly, ignore them.” No, we were like, “Oh man, we really screwed the rollout up.” We needed to tell them what we were doing, why we were doing it, roll it out steadily. And we didn’t do that back then. Uh, and so we learned from that. We really learn every time things happen: “Oh, okay, we screwed that up. Let’s not make that mistake again.” Um, so News Feed, yeah, News Feed certainly had a very strong visceral reaction. However, from a product standpoint, it solved the problem we were solving for almost immediately. We saw usage double kind of overnight, and never go back down, because people were having more success finding the kind of content they were looking for. They weren’t having to click around. The entire center of the homepage used to be just pokes, like the number of pokes. It was just like-

Laurie Segall: Yeah.

Andrew Bosworth: That was like the center-

Laurie Segall: By the way, poking was so creepy. Like why did you… I just feel like it’s a-

Andrew Bosworth: I, that, listen, that’s a, that’s a, that’s a question for somebody else.

Laurie Segall: It’s just there like… That’s why you guys, guys probably had too many, too many-

Andrew Bosworth: I’ve never.

Laurie Segall: … men working at Facebook.

Andrew Bosworth: I’ve never worked on, I’ve never worked on poke. That’s one, I, I can’t answer questions on that.

Laurie Segall: You know. But let me just say, so yes, News Feed was so disruptive to everything. And I’m not one to say, like, “Whoa, they should have never introduced News Feed.” It changed everything, right? It truly did.

Andrew Bosworth: It changed the web.

Laurie Segall: It changed the web, and everyone at first was kinda like, “Oh,” you know. Like we talked about, the party where you did this all at once. I really think that to some degree the stuff you’re working on, and maybe I could be completely wrong, but, depending on timing and what kinda comes out and this moment we’re in, could have similar impact, right? But we can’t deny the last years, right? We can’t deny the fact that, and let’s not oversimplify this and be like, tech is good and bad, right? News Feed also, you know, created misinformation, and people are now trying to figure out what truth is. And there are filter bubbles, and people have talked about the ad model on Facebook being one of the most disruptive and terrible things ever. So it created this host of problems as well. That doesn’t mean it should have gone away or should have never been done, but it did create this host of problems. So as you sit there, and I go back to my first question, yours is one of the most important seats at Facebook, which I don’t think people maybe realize, and I’m calling it one of the most important seats ’cause I think it’s a very important seat. I’m assuming you gotta be thinking about it like this, right? Like you’ve gotta be thinking about it with the same stakes as you guys did, maybe, with News Feed. Is that incorrect to say, or am I just being dramatic? I know you,-

Andrew Bosworth: No. I’m not –

Laurie Segall: … sometimes, I think sometimes you think journalists are dramatic, but, but I, I certainly feel passionate about that.

Andrew Bosworth: I love the passion, and I think it’s important. I don’t think journalists are dramatic. I think journalists are doing a great job. We live in tremendously interesting times, certainly much more interesting than the several decades that preceded them. Uh, and it deserves the degree of public debate that we’re having on the issue. So, you know, far from it, but I don’t agree with a lot of what you’ve just said, you know. Early on, I think I said News Feed was built by, the core team was three people, Ruchi Sanghvi included. Uh, there were far more women at the beginning of Facebook than people seem to recognize, maybe you’ve seen a, a fictional film-

Laurie Segall: Yeah.

Andrew Bosworth: … about it. Uh, don’t believe that, that’s fiction. So, so I don’t agree with a bunch of things that you just said. However, transitioning to the question of like, what is the thing that we’re doing? I, the work that I’m doing right now, feels important. Yeah. The capital I sense of the word, I think it’s important for society. I think it’s got tremendous opportunity for impact. Of course, it’s very hard to separate positive impacts from negative impacts and, and thinking those through really rigorously as something that we, we’ve said publicly, we weren’t doing in the early parts of Facebook. And we, that clearly was a mistake. It’s one that we’re, you know, we’re trying to rectify now with massive investment. Um, that’s one that I get the benefit of. It’s a mistake I don’t plan to make again. It’s a mistake that, uh, we actually don’t have to, you know, I get to benefit from all of Facebook’s learning on it and the technological investment on it comes to bear. I do think it’s very different though. You’re gonna find very different problems, you know.

Laurie Segall: Yeah.

Andrew Bosworth: Uh, News Feed dealt primarily in information, and it raises important questions about free speech and democracy, and who’s allowed to say things that aren’t true, uh, and who’s allowed to get distribution, who’s allowed to be listened to. Those are hard questions. The problems that we’ll deal with in virtual reality and augmented reality are different. I don’t think it’s gonna be at all like News Feed, because unfortunately, even as affordable as we have made Quest 2 at $300, $300 is pretty far away from $0. Um, and it’s gonna take a long time for us to continue to get this technology deployed. I’m worried about uneven access. I’m worried about, hey, can only rich people get access to this technology, which is potentially very empowering? Uh, what’s the problem that comes with that? That’s novel. That’s not something that you have to deal with for Facebook, because Facebook has an absolutely wonderful consumer-aligned business model-

Laurie Segall: Yeah.

Andrew Bosworth: … that allows us to deliver a tremendous amount of services that used to cost 10 cents a message for free. Uh, and we don’t think about that. Maybe because we have means, and we’re not thinking about all the people who are benefiting tremendously. So those are the types of things that do worry me a lot. Uh, and, and you know, obviously we’re, we’re not… We’re, we’re in a diff- different business than Apple, you know, we’re not charging, on these headsets, the amount that, that, you know, someone tryna make margin, uh, on a business would charge for them. So we are taking a different approach as a consequence of that. Um, so I think it’s very important technology. I think it’s important to empowering, uh, a workforce that’s global, that doesn’t have a geographic or economic mobility. Uh, if you follow Raj Chetty’s work first at Stanford and now at Harvard, um, like, so I think it’s important work and I do take it very seriously, uh, and I’m grateful for the resources I have at Facebook that help me do it better.

Laurie Segall: I think it’s so interesting what you say when you talk about the digital divide. I think that’s probably one of the most important things really on display right now. I mean, it’s always been an issue, but it couldn’t be-

Andrew Bosworth: Yeah.

Laurie Segall: … more on display right now. Um, you know, during the pandemic, where children are having to do remote schooling, there’s that image of kids trying to get Wi-Fi from a Taco Bell parking lot-

Andrew Bosworth: At a Taco Bell.

Laurie Segall: You know. So I wonder, I mean, I do think, you know, you talk about even the future of work, and I’ve looked at the demo you guys did. It’s really interesting, you know, where you put on the headset and you’re in this presence and there’s the ability to be around people. Um, but you’re absolutely right that that’s only for some people, um, you know, in an increasingly divided world. So what do you think is the solution, uh, as you kind of wade into this?

Andrew Bosworth: Uh, so a couple of things. From a technological standpoint, well, actually, this is a huge area of investment for Facebook, as you know. Internet.org was on the premise of, “Hey, can we get internet to more people, uh, at a more affordable price?” Um, and it’s wild to me that that became a controversial program in any sense. It is literally trying to fill a gap that other companies and public services have completely failed to fill, leaving people in a very precarious position as it relates to access to information, which ultimately, you and I agree, is probably access to education, to jobs, to a bunch of other pieces, potentially to mates. Like, it’s a huge issue. Um, and it’s one that we’re passionate about as a company. Uh, there’s things that we can do. So, one example I just saw recently: video calling. This kind of video calling takes a tremendous amount of bandwidth. You have to have not just an internet connection but a very good internet connection to sustain this over time. Um, and my heart goes out, you know, I have a kindergartner who’s on Zoom right now, uh, I think. And it’s hard enough right now for him, a five-year-old, to be in a Zoom class. Can you imagine if the video is cutting out, it’s choppy, he can’t contribute because when he unmutes, it’s too late? It’s awful.
So something that we could do, for example: we’ve seen demos internally where, um, we can use the type of technology that powers deepfakes, which we’re concerned about and doing a bunch of detection on, but instead say, “Hey, what if we recreate a little point cloud of your face, and then transmit it very lightweight over the wire, and then reconstitute it on the other side, so that you can actually have a richer, more lifelike video communication at lower bandwidths?” That’s the kind of research that my team is doing that I think could have a huge impact on our ability to communicate richly under a range of conditions. And indeed, if you look at our responsible innovation principles, things like, how does this behave in low-bandwidth conditions? How does this affect those who are economically disadvantaged? Those are things that we look at. Um, so yeah, this is an area that we’re all passionate about at Facebook, uh, and I think probably as an entire industry.
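The bandwidth argument Boz makes here is easy to put rough numbers on. The sketch below is a back-of-the-envelope illustration with assumed figures (a 720p call, 68 face landmarks per frame, a common rule-of-thumb compression ratio), not Facebook’s actual pipeline:

```python
# Illustrative comparison: streaming compressed video frames vs. streaming
# a small set of facial keypoints that get reanimated on the other side.
# All figures are assumptions for the sake of the estimate.

def video_bitrate_bps(width, height, fps, bits_per_pixel=0.1):
    # ~0.1 bits per pixel is a common rule of thumb for decent-quality
    # H.264-compressed video.
    return width * height * fps * bits_per_pixel

def keypoint_bitrate_bps(num_points, fps, bytes_per_point=6):
    # Each keypoint sent as three 16-bit coordinates (x, y, z).
    return num_points * bytes_per_point * 8 * fps

video = video_bitrate_bps(1280, 720, 30)   # a 720p, 30 fps video call
points = keypoint_bitrate_bps(68, 30)      # 68 landmarks per frame, 30 fps

print(f"video:     {video / 1e6:.2f} Mbps")   # ~2.76 Mbps
print(f"keypoints: {points / 1e3:.1f} kbps")  # ~97.9 kbps
print(f"savings:   ~{video / points:.0f}x")   # ~28x less data
```

Even with these conservative assumptions, the keypoint stream is an order of magnitude cheaper than video, which is the whole point of reconstituting a face on the receiving side.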

More from Boz after the break, and make sure to subscribe to First Contact wherever you listen to your podcasts so you don’t miss an episode. 

Laurie Segall: I’d be curious to know, you talked about Codec Avatars earlier.

Andrew Bosworth: For your mother –

Laurie Segall: I mean-

Andrew Bosworth: We gotta work on the insides of mouths. That’s what gets you. As soon as someone starts opening their mouth, it’s like, “Oh, okay, that’s fake.” But, uh-

Laurie Segall: Can you like describe it? Like if we were in VR and like, it’s like, I would be seeing you, but it really looks like you, right? Like I mean-

Andrew Bosworth: Yeah. Codec Avatars are extremely lifelike reproductions of somebody’s face, uh, and the musculature that powers their face. And what that hopefully allows us to do is have really high-fidelity interactions with lots of people at low bandwidth, because we’re not actually sending a video of your face. We’re sending a small number of key points and machine learning metadata that allows us to reanimate the avatar of you on the other side. And like I said, there’s the famous concept of the uncanny valley, where if things are only kind of lifelike, that’s very bad. They either need to be clearly representational or pretty much literal.

Laurie Segall: Right.

Andrew Bosworth: And Codec Avatars are clearly over the uncanny valley. They are on the other side; they are clearly good enough. The challenge we have with Codec Avatars is generating them. Right now it’s like a 30-minute session of you saying funny phrases that get your mouth to animate certain ways, and expressing certain emotions, in a camera-capture rig, to get to a Codec Avatar. That’s obviously not something that’s scalable. Can we get to where you can just take 10 pictures with your phone, uh, at home, and we can do it that way? We’ve got to do a lot to miniaturize that and hopefully deploy it as something that anyone could do over, over Messenger. You could say, “Hey, you know, I didn’t have time to shave today. I’m a mess. Let me just get my Codec Avatar in the game and animate it that way, uh, for this conversation.” Or maybe I’m in a low-bandwidth area. Or maybe there’s gonna be a lot of people on the call and it’s gonna start to break down. And then obviously, as you move to virtual or augmented spaces, it’s the only way to work. How else am I going to get, uh, you know, somebody to have a fireside conversation with me if I don’t have some kind of representation that looks like them and causes me to feel like, wow, we’re having a meaningful talk right now?

Laurie Segall: Yeah. I mean, I, I can see the virtual space, you know, the workspace, it’s kind of like, it is really kind of hard to take someone seriously if they’re kind of like a human pickle or something, you know what I’m saying? Like, it’s, it is, it is hard.

Andrew Bosworth: It’s so real. So I have a weekly meeting in virtual reality in kind of one of our Infinite Office prototypes. I had it on Monday, and literally my team is just outdoing themselves. One of my team members, Ficus, he comes in wearing a Santa Claus-like outfit, uh, and like a Captain Cook hat. Another guy came with a red Mohawk and a pi-, a parrot. We were having serious work conversations about serious topics, but at some point you’re just like, it is hard to focus, like, with it going on around you.

Laurie Segall: Mmm. Right. Right.

Andrew Bosworth: You know, we, we really wanna keep driving on Codec Avatars.

Laurie Segall: Well, so let me ask you, with Codec Avatars, it looks so real. This is where I sort of go, like, deepfake. Could someone take my identity and turn themselves into me with a Codec Avatar in the virtual space? Not to get too Black Mirror on you, but, you know, I think you guys have to anticipate these types of things. Could I pretend to be Boz and assume your identity with my Codec Avatar?

Andrew Bosworth: Yeah. This is exactly the kind of threat vector I’m talking about when I say, “Yeah, this is obviously something we’re worried about; we’re thinking of ways to ensure that you have unique possession of it.” This is actually an area where I think we have a pretty strong ability to create guarantees, uh, because the Codec Avatar will be somewhat computationally specific in terms of what it takes to create it. I don’t think other people will very readily, in the near term, be able to create their own Codec Avatar version of you. Um, and so we’ll be able to kind of store it and ensure it’s just for your exclusive access. Uh, I don’t think we’ll allow the loaning of avatars anytime soon. Um, deepfakes are a little different, right? Because they start with footage that’s already in public. And that’s something that, you know, it’s more Schrep and the AI team focusing on. How do you ensure the provenance of an image? That’s certainly one of the open areas of investigation, not just for Facebook, uh, but for the industry: to ensure that we have a greater sense of the provenance of an image, that it’s real, that it hasn’t been tampered with.
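At its simplest level, the “hasn’t been tampered with” half of provenance rests on cryptographic hashing: if a trusted record of an image’s digest exists, any later edit is detectable. Real provenance systems layer signed capture-time metadata on top of this; the snippet below is a generic, minimal sketch of the building block, not Facebook’s approach:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # A SHA-256 digest changes completely if even a single byte of the
    # image changes, so comparing digests detects tampering -- provided
    # the original digest was recorded somewhere trusted at capture time.
    return hashlib.sha256(image_bytes).hexdigest()

original = b"\x89PNG...stand-in for real image bytes..."
tampered = original + b"\x00"  # a one-byte edit

assert fingerprint(original) == fingerprint(original)  # deterministic
assert fingerprint(original) != fingerprint(tampered)  # edit detected
print("tamper check works")
```

The hard, open part Boz alludes to is everything around this primitive: who records the digest, how it stays bound to the image through resizing and re-encoding, and how capture devices attest to it in the first place.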

Laurie Segall: I mean, God, the applications. And I don’t want to get into it ’cause we don’t have too much time, but even thinking about the future of death and mortality and remembrance with Codec Avatars, I mean, there’s so many interesting use cases I’m imagining you guys could explore, especially since you have so much data on our lives and so much about us. I don’t know, it certainly seems like there’s a lot there.

Andrew Bosworth: Yeah, I think there’s real potential there. You know, one of the things we’re doing is working with Stanford on a project to, for example, do really rich, uh, volumetric captures of historic sites. Um, for those who haven’t been paying attention, over the last 20 years in particular, under some regimes, like the Taliban, really amazing historic sites have been destroyed, uh, systematically and intentionally. Uh, and that’s a loss for historians. It’s a loss for children. It’s a loss for people who would like to go see and experience that thing for themselves. So we are already trying to do those things. I recently saw a paper, not from Facebook, of somebody using machine learning to recreate what they thought the Roman emperors actually looked like, uh, using their sculptures and working backwards. Uh, and there’s an appetite for those things. So the idea of having that kind of potentially biographical, autobiographical exposure to people who are both living and deceased is very interesting to me.

Laurie Segall: Um, I wanna go back to Project Aria, ’cause we just kind of brushed over it. It’s really interesting. Um, you know, you talk about… Can you give us really quick, you know, what exactly it is? Um-

Andrew Bosworth: Yeah.

Laurie Segall: … one more time and, and I, I just want to dig into it a teeny bit more.

Andrew Bosworth: Yeah. I mean, it looks like a pair of conventional glasses, except that you’ll notice it’s got cameras facing outside, uh, it’s got cameras facing towards the wearer’s eyes, it’s got a GPS, and it’s connected to, you know, um, an app. The researcher who is wearing it has no access to the data, and it provides them no value. They get nothing out of this, except that we pay them to wear it around. What we get, once the data’s been scrubbed and quarantined and cleaned of identifying information, is we get to use it to validate, you know, what sensors do we need to provide augmented reality? For example, why do you need outward-facing sensors at all? For two reasons, one of which is to locate you in space. You know, for us to be able to put somebody on the sidewalk in front of you as a Codec Avatar, we have to know where the sidewalk is. Otherwise they’re gonna be floating, or their feet are gonna be under the surface. It’s not gonna look correct, and it’s gonna take you out of the experience. You’re not going to believe it. You wanna play a Jenga game on the table? How do you do that if you don’t know where the table is? And when I pull a piece out, I drop it. I need to know what the world looks like. So I need to localize you in space and understand the topography. The second thing is it’s potentially very useful. If I’m walking up to a restaurant and I’m looking at the menu, oh, my friend took a picture of one of these menu items, can we overlay that? So it’s potentially useful from a, uh, you know, giving-you-value standpoint of wearing these glasses once they have a display, which is obviously where we’re planning to go. At the same time, they raise these tremendous questions. Hey, who else does this video capture? Do I have access to the camera feed? Uh, can I post photos from it? Can the government subpoena access to the camera feed?
Uh, you know, these are tremendously deep questions. For us, we wanna capture as little data as possible. The reason is data capture is very expensive in augmented reality glasses. They’re tiny. They’re gonna fit on your face. We don’t have that much battery. We actually have to dissipate thermal energy without burning you, so we don’t have that much thermal space. So we would like to capture absolutely as little data as possible to deliver great experiences to you. So with these research glasses, we’re trying to figure out, “Hey, how much data do we need to deliver these experiences?” Um, what is different about egocentric data? You know, we can’t just use data from, for example, cars that have been driving around forever for street view, ’cause it turns out when you’re on the sidewalk, you’re walking under trees and there’s different lighting conditions. It’s not the same. So with it, how’s it performing in different weather conditions? How does it perform with a human wearing it? Can we even detect that? There’s a huge number of questions that we need to answer, and we want to answer them now, years before the technology is actually in a consumer device. So for us, we’ve got technological questions, that’s one goal, but we’ve also got social questions and, frankly, societal questions about the use and benefit of these technologies versus the tradeoffs.
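The “we have to know where the sidewalk is” problem has a well-known geometric core. Once the glasses’ tracking has recovered a ground plane, anchoring an avatar’s feet on it comes down to a ray-plane intersection: cast a ray from the camera and place the avatar where the ray meets the plane. The snippet below is a generic illustration with made-up numbers, not Aria’s actual code:

```python
# Minimal ray-plane intersection, the geometry behind anchoring virtual
# objects to a real surface detected by the headset's tracking.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D point where the ray hits the plane, or None."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = dot(sub(plane_point, origin), plane_normal) / denom
    if t < 0:
        return None  # plane is behind the camera
    return [origin[i] + t * direction[i] for i in range(3)]

# Camera 1.6 m above a flat ground plane (y = 0), looking 45 degrees down.
camera = [0.0, 1.6, 0.0]
gaze = [0.0, -1.0, 1.0]
feet = ray_plane_intersection(camera, gaze, [0.0, 0.0, 0.0], [0.0, 1.0, 0.0])
print(feet)  # [0.0, 0.0, 1.6]: on the ground, 1.6 m in front of the camera
```

A real system recovers the plane (and the camera pose) from the outward-facing sensors via SLAM, which is exactly why the glasses need those sensors at all.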

Laurie Segall: Yeah, I saw one person saying, um, you know, there are proactive steps that we should be taking: declaring biometric data health data, legislating more consumer protections, making privacy choices simpler and better informed. What do you think?

Andrew Bosworth: I don’t know the specifics that you’re referring to. I don’t know enough about what it means for them to be health data or not. Uh, I do think that fundamentally, uh, as Facebook has been saying for a while, we’d like to have a unified privacy, you know, legal framework that we can work within. Uh, Facebook has been really open about this. Like, you want legislation, and you want legislation written by people who understand technology enough for that legislation to make sense, which isn’t always a guarantee. And so we are very in favor of legislation that makes clear how to handle things like, uh, face recognition, uh, which you see as a patchwork right now, it’s coming up, you know, Illinois and Portland. Patchworks are hard. That’s really hard to deploy scaled technological solutions to. It’s hard to invest all of our energy in getting it right when there’s like four or five different jurisdictions, and that’s just in the United States, let alone Europe. And so I think for us, the more clarity we have on, like, “Hey, here’s the data framework. Here’s the privacy framework. Here’s what’s allowed,” the happier I’m gonna be, ’cause I can put 100% of my engineering resources on executing on that. I literally, like, don’t know the specifics; I just want to have a more unified framework. We do have teams who are spending a lot more time than I am on trying to make sure that there’s some progress there.

Laurie Segall: You have something called personal API. I’ve, I’ve read something you wrote about-

Andrew Bosworth: Yeah.

Laurie Segall: … uh, personal API, and learning when to say no. Um, can you talk to us about your personal API? And then I wanna talk about the last time you said no.

Andrew Bosworth: Yeah. So personal API is just about, uh, we exist in these egocentric bubbles where our worlds are so clear to us, and we sometimes don’t understand why it’s not clear to other people who are around us. And, like, why should it be clear? You have to tell them: “Hey, here’s who I am. Here’s what I’m trying to accomplish.” You know, for me, it’s like, “Hey…” Uh, I used to have these really strong visions of myself in one light, and it was hard for me when people would, even if they were complimenting me, compliment me in a way that didn’t align with my internal vision. That was a miss. I find it helps to talk openly about, “Hey, you know, I like to work on technology that allows people to connect.” It turns out I like to work on technology across a huge range: broadcast, multicast, one-on-one, virtual reality. I like building things that let two other people find some connection. Uh, it’s very satisfying to me, and it’s what I’ve dedicated myself to doing, and I enjoy that. And the more people understand that’s where I’m coming from, the easier it is to work with me, the easier it is to understand me. Uh, and I think Facebook, you know, we could do a better job as a company: hey, we’re trying to connect people, and we’ve got to do a better job of getting rid of the bad stuff, but there’s a lot of stuff that we really value, and we gotta figure out how to get the balance right between those things. So I write a lot about these things for the benefit… honestly, these are almost always hard-learned lessons for me. What ends up happening is I screw something up for 10 years, and then I finally have an epiphany, or somebody coaches me, God bless, and says, “Hey, you’re screwing this up.” And then I’m like, “Oh, right.” And then I try to write it so that hopefully somebody else gets that epiphany a little sooner than I did.
You know, that’s definitely been the story of my career: making mistakes, often out in public, often in embarrassing ways, learning from those, and trying to help other people with them. And so I actually wrote two notes. I wrote one called “Say Yes,” and I wrote one called “Say No,” and that’s intentional. I love working at these balances. A lot of times people are too instinctive to say no, ’cause they’re busy. They’re not saying no ’cause it’s a bad idea; they’re just saying, no, I’m not ready to hear a thing right now. And they need to say yes. However, just as often, people wanna say yes to everybody because it feels good to say yes, but at some point you say yes to so many things, you start letting people down, because you can’t do all the things. And so sometimes the best way to show respect to somebody is to say, “Hey, I’m sorry, I can’t, I’m not gonna do it,” or, “I’m just not interested in it.” Uh, at the risk of getting myself in trouble with some of your peers, I say no to interviews that I wanna do, interviews that I would love to do. Uh, people who wanna get me on a podcast. And I just say, “No, you know, I can’t commit to it, ’cause I take this stuff seriously, uh, and I wanna put in my time and make sure that every one of these I do is quality.” You’ll get to tell me how I did on that. Um, and so I do say no to other people, but I said yes to you. So it’s not all bad news.

Laurie Segall: Um, one of the things I’ve always thought was interesting… I mean, when you talk about mistakes, um, you know, you talk about learning from your mistakes. And I know we hear a lot of executives who talk about mistakes they’ve learned from, and we hear the company line over and over again. But I think the thing that people crave more than anything is just humans, right? And people just to be human. Um, and I think that goes for a lot. So, like, maybe you could just be human with us for a moment. What do you think is, like, the biggest mistake you’ve made?

Andrew Bosworth: Oh, it’s easy. It’s that stupid memo. The ugly memo that I wrote years back. Um, you know, I wrote a thing. It was pursuant to a bunch of internal conversations, which have since been lost to time. It was relevant to them. It made sense in the context of what they were… but I wrote it glibly. I think I wrote it in, like, five minutes. I didn’t edit it. Nobody reviewed it. I put it up there, people hated it. It kicked off a discussion that I thought was valuable. I was like, “Cool. I kicked that discussion off. Let me move on.” You know? And, uh, I’ve actually since gone back and written what I had intended to write the first time. The second one I wrote was thoughtful. It was really nuanced. It had all the points. And of course, no one cared about that one, because it was not this glib, shoot-from-the-hip thing. I think sometimes people confuse, uh, you know, being, eh, controversial with authenticity. That’s not authenticity. My authentic self is incredibly nuanced. You know, I have layers upon layers of feelings about things, and sometimes they conflict and I wanna work through them in all their meaningfulness. Actually… when that leaked, and it was really, you know, uh, embarrassing for me, it was embarrassing for the company, and I still to this day, you know, think back on how foolish it was to have written it in the first place the way that I did, um, I wrote to the company. I said, “Hey, the solution to this isn’t for me to write less, it’s for me to write more. It’s to not give the glib hot-take one-liner that’s, you know, punchy and gets a reaction. It’s to write the thoughtful, nuanced thing.” Um, and so, yeah, you live and learn. It’s an embarrassment to me still.
And I think it will be, uh, to the end of time, but it can also be a valuable lesson to me and to others. And I’ve tried to turn that into something positive. Um, but yeah, you know, I do tend to wear my mistakes on my sleeve. I do think we live, to your point, Laurie, in an age of authenticity – an age where, when we see something that’s perfect, we almost wanna tear it down more. When we see imperfections, it’s more relatable, it’s more understandable, it’s more real. And we trust that a little bit more. Um, and, well, at least I’m hoping that’s the case, because I’m a case study of it.

Laurie Segall: So you talk about how that’s not the authentic you, but the authentic you is thinking about some of these things in a much more nuanced way. Um, what are you thinking about now that you would say is, uh, the more appropriate, uh, Boz memo now, right? Um, what is the thing you’re stewing on now that you just think is super important, um, and needs attention and needs debate and needs discussion?

Andrew Bosworth: Yeah. For us, you know, I mean, there’s two pieces. One, from the body of work that the world is exposed to right now that I’ve put out, I think it’s just a question about, uh, the nature of democracies in general. You know, I think a lot of times these things are couched in terms of free speech, and sometimes we’re the ones who use that language. For me, it’s about democracy. It’s like, hey, you know, um, who’s allowed to have an opinion? Are people allowed to be wrong? Uh, should we return to an era of central gatekeepers? Who watches the watchers? Uh, you know, it’s a tremendous challenge. And it’s an area where I see a lot of nuance myself, and I don’t see as much nuance, um, in the public sphere and the conversation around it. It’s partisan all the way to the bottom. And listen, I’m partisan in a way, you know, I have politics. They’re not hard to discover for those who look. Um, but, like, uh, I do also believe in democracy, and I’m torn on some of these issues that sometimes my liberal brethren find so cut and dry. I’m a little more anxious about eventually taking that power and putting it in the hands of the other side when the democracy swings the other way. Um, so that’s one set of things where I wish we had a little more nuance collectively in the conversation. I’m not the only one there. Um, and this probably isn’t the time for it. I’m saying this to you ’cause you asked the question. But, you know, it’s National Voter Registration Day – everybody, go ahead and register to vote. I realize that this won’t be heard today, so whatever time it is, vote in the next election you can. I realize it’s not the time right now for that conversation, and I’m not pushing it, but it is in my head. And it’s something that I think about.
Day to day, though, more operationally for me, it’s thinking about how to get this technology out to people, you know – um, making it more accessible, making it more universal, making it more user-friendly, making it so it doesn’t feel weird to you to be in a conversation with people. Making it so that you do feel like your virtual self is a representation of your real self, and you’re comfortable with that. Um, like I said before, this is my favorite part of working on products. It does feel like the beginning of News Feed, or the beginning of the ads work that I did, or the beginning of Messenger, or the beginning of Groups. It feels that way. It feels like the potential is here. I can see it. I can feel it. I experience it every day. How do I get it to everyone? Everyone should have this power; everyone should have this opportunity. Um, and so that’s where I am day to day.

Laurie Segall: So tell me – you’ve been in the war room at Facebook, um, since 2006, with the highs, the lows, the criticism, the moments, you know, all of it. And we know there’s been tons of it. Um, why do you do what you do?

Andrew Bosworth: It really, for me, comes down to, uh, a joy at the small experiences that were not possible before that are possible now. Uh, two people connecting. And I don’t judge other people’s motivations. For some people, they want to empower, uh, civilizations and revolutions. And some people wanna just tell their mom and have their mom be using something. And some people want to be famous and they want to be in the news. I don’t know why. Uh, none of those things really motivate me very much. The only thing that I really find every day I get up is like, “Hey, there’s some people who tomorrow will do a thing that was not possible before I did the work that I do.” And that is immensely satisfying for me. I guess I think a little bit about legacy – leaving something that outlasts me, having done something that leaves some kind of impact on the world. The only impact on the world that matters is the impact you have on the people of the world, uh, and making their lives a little bit better. Uh, and I get a chance to do that every day, and it’s pretty dang fun.

Laurie Segall: Think you’d be at Facebook for life? Are you a Facebook lifer?

Andrew Bosworth: Well, if Mark’s asking, no. Yeah, no, I, um, I’m really happy at Facebook. I’m, you know, I certainly can’t predict more than a few years out in my life. But so far, at every turn, uh, I’ve found more new and exciting work that kept me deeply engaged. So, you know, I really am, uh, as excited today as I was, uh, that first day: January 9, 2006.

For more from Dot Dot Dot, sign up for our newsletter at dotdotdotmedia.com/newsletter. We’re launching in October with an exciting guest and topic: a prominent Silicon Valley founder who believes the toxicity found on the biggest platforms can be avoided simply by taking a stance from the get-go. He says that tech shouldn’t be neutral – it should be opinionated. It’s a fascinating take.

And follow along on our social media. I’m @LaurieSegall on Twitter and Instagram. And the show is @firstcontactpodcast on Instagram – on Twitter, we’re @firstcontactpod. 

And if you like what you heard, leave us a review on Apple podcasts or wherever you listen. We really appreciate it.

First Contact is a production of Dot Dot Dot Media, Executive Produced by Laurie Segall and Derek Dodge. This episode was produced and edited by Sabine Jansen and Jack Regan. The original theme music is by Xander Singh. 

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.