ASHLEY LLORENS: Can we start to set some audacious goals around enabling as many people as possible on the planet to live a long, healthy life, creating an atmosphere of shared prosperity, and what is the role of AI in doing that? To me, these big societal narratives should be at the top level of abstraction in terms of what we’re talking about. And then everything else is derived from that.
KEVIN SCOTT: Hi, everyone. Welcome to Behind the Tech. I’m your host, Kevin Scott, Chief Technology Officer for Microsoft.
In this podcast, we’re going to get behind the tech. We’ll talk with some of the people who have made our modern tech world possible and understand what motivated them to create what they did. So, join me to maybe learn a little bit about the history of computing and get a few behind-the-scenes insights into what’s happening today. Stick around.
CHRISTINA WARREN: Hello, and welcome to Behind the Tech. I’m Christina Warren, senior cloud advocate at Microsoft.
KEVIN SCOTT: And I’m Kevin Scott.
CHRISTINA WARREN: Today on the show, we’re joined by Ashley Llorens. Ashley is a scientist and engineer with a 20-year career in research and development of AI technologies at Johns Hopkins Applied Physics Laboratory, and he recently joined Microsoft Research. Ashley is also a hip-hop artist known as SoulStice, and one of his songs was actually featured in the Oscar-nominated film The Blind Side.
So, I know there are a lot of theories out there about why so many scientists are also musicians, and there’s this whole part about music being mathematical and scientists being good at recognizing the rules of music composition, but I’m curious what you think, Kevin.
KEVIN SCOTT: Well, you know, I think it’s one of those mysterious things, because you are absolutely right, there are a lot of programmers and computer scientists who seem to have serious interest in music.
But I don’t know many of them who are so serious about their music that they have a real recording career, and, like, I think that is one of the things that makes – one of many things that makes Ashley special.
CHRISTINA WARREN: No, without a doubt. And I can’t wait to hear you both talk about his various areas of expertise and kind of these dueling careers. So, let’s chat with Ashley.
KEVIN SCOTT: Our guest today is Ashley Llorens. Ashley is a scientist, engineer, and hip-hop artist. He worked for two decades at Johns Hopkins Applied Physics Laboratory developing novel AI technologies and served as founding chief of the lab’s intelligent systems center.
He was recently nominated by the White House Office of Science and Technology Policy to serve as an AI expert for the Global Partnership on AI. Alongside his career in engineering, Ashley actually began his working life as a hip-hop artist, and he serves as a voting member of the Recording Academy for the Grammy Awards.
About a month ago, Ashley joined Microsoft as a vice president, distinguished scientist, and managing director for Microsoft Research. Welcome to the show, Ashley – and to Microsoft.
ASHLEY LLORENS: Thanks so much, Kevin, great to be here.
KEVIN SCOTT: It’s so, so awesome to have you here at Microsoft. So, we always start these podcasts by talking a little bit about where you got started. So, I know you grew up in Chicago. How did you get interested in – like, I guess either music or technology?
ASHLEY LLORENS: Yeah, yeah, so, maybe we’ll go in order and I can just kind of create two contrasting scenes for you.
So, you know, my interest in music growing up in Chicago, south side of Chicago, south suburbs, just really immersed in music throughout my childhood. And hip-hop in particular was always fascinating to me. Just the degree of storytelling, you know, the beats, the sounds growing up in the kind of east coast vibes, listening to artists like Nas, west coast artists as well.
And it was, you know, it came – it grew to be something that, you know, I went from being a fan of to something I really wanted to contribute to – especially as I was sort of coming of age and wanting to express myself.
So, you know, the challenge with the – where we grew up and everything was, we didn’t really have many outlets for that energy at first, you know? And so, we did what we could. You know, just kind of put a visual in your head, we would go to, you know, like the – what was it? Circuit City at the time or what have you. We had, like, a boom box with two tape decks. And that was the recording studio.
And we were so proud of ourselves because we figured out how to do multi-layer vocals with a $20 mic and the two tape decks. You know, you record on the one tape and then you play it back and record over yourself and, you know, over some instrumental and you get the multi-track vocals. So that was, like, our, you know, $75 studio setup.
And so, you know, and it just kind of went – kind of grew from there. And, you know, the contrasting scene, and of course we can come back to, like, the career trajectories and things, but the contrasting picture that I’d love to paint is just around, you know, kind of the intellectual curiosity that was really a family value for us.
And, you know, really, both of my parents were teachers. My dad introduced me to two things early in life that really shaped my curiosity, one being theoretical physics, you know. I was quite young reading books by, like, Michio Kaku and not really understanding what string theory was, but really just being fascinated by it. And then Marvel Comics was the other thing. And, you know, just really, you know, the Infinity Gauntlet, something that’s been a topic of conversation within our family for a few decades. So, it was great to see it on the screen.
And so, you know, I just was really driven by a curiosity of understanding how the world works. Again, not as much of an outlet for that curiosity and that brings me to a story about outlets. You know, I actually ran one of my first electrical engineering experiments by peeling the paper off of a twist tie and then sticking it into an electrical outlet just to see what would happen, you know?
So, really, you have these two scenes kind of unfolding. And I say sharing some of the same fundamental motivations, you know, driven by curiosity, a passion for understanding people and technology, and really grew to, you know, a passion for having an impact on the world in these two different ways.
KEVIN SCOTT: Yeah, it’s so interesting that so many of the people that we chat with who have these large creative appetites, they’re sort of creative across a whole bunch of different dimensions, like, have that grounded in curiosity. You know, just this sort of voracious curiosity about how the world works and, like, why things are the way that they are.
I mean, it’s really funny, this electrical outlet story, like, I did exactly the same thing when I was four years old. (Laughter.)
It wasn’t a twist tie, but – and I forget what I jabbed in it, but I remember my mother screaming when, you know, there was this loud crack and, you know, her child was crying. And, you know, that sort of fearlessness that accompanies the curiosity I think is – you know, curiosity strikes me as something that you can certainly cultivate, but you know, it’s hard to create when it’s not there, whereas I do think fearlessness, like, you can sort of work really hard to become more fearless and, like, part of how fearless you can be is your environment.
ASHLEY LLORENS: Yeah, yeah, it’s really interesting. I’m always careful not to take too much credit for a lot of just the things that come naturally. I feel really fortunate, you know, to have these kind of fundamental drives and everything.
I think as children, you know, we’re sort of naturally driven by our curiosity to explore. And, you know, the outlet story is a great kind of illustration of that. I think as adults, a lot of times, if you’re not careful, the world will cause you to unlearn some of those fundamental drives and, you know, being – you know, so I think part of the thing that I’ve learned is to just allow myself to be driven by my curiosity and to behave fearlessly in that way and to ask questions and not be afraid to fail. And I think part of what I’m grateful for is just that I’ve managed not to unlearn those things as an adult, to kind of just behave childishly in that respect at this point in my life.
KEVIN SCOTT: Yeah, and so was your dad a physics professor?
ASHLEY LLORENS: He was a math teacher, just with a – with just a passion for math and science. But, yeah, he was a high school math teacher and a track coach.
KEVIN SCOTT: That’s awesome. So awesome. So, you know, like, I’m just sort of curious, like, when you stuck that stripped wire tie into the socket, like, what did your parents do?
ASHLEY LLORENS: You know, I kind of got away with it. Nobody was really around, and it was just – it was an immediate kind of fascination with the, you know, kind of the explosion and the smoke – the little mini explosion and the smoke that happened, and then an overwhelming sense of guilt that I kind of carried. And, you know, I don’t know if they’re listening to this podcast at some point, they may be surprised to learn that this was an experiment that I conducted. (Laughter.)
KEVIN SCOTT: Oh, that’s awesome. Yeah, I mean, it’s really – you know, we were talking about this on the last episode of the podcast with Jacob Collier, who’s also a musician who uses technology in interesting ways. And, you know, one of the things that we were talking about is how you preserve the fearlessness when people are making themselves vulnerable by exploring.
And having an environment, like, whether it’s your parents when you’re a kid or, you know, a set of colleagues or peers or your company or your institution that helps you – you know, they can figure out how to strike that line between, you know, sort of saying, “Wow, this is awesome that you are – you’re exploring this curiosity, but you know, like, here are some things that can be better but, like, man, you should be really proud of yourself for pushing in this way.”
Like, is that something that you got as a child or, like, something that you had to learn over time or –
ASHLEY LLORENS: It’s a great question. I think a lot about that as a parent, you know, of my own children. But also, in terms of leadership, you know, in a science and technology organization, but yeah, as a kid, you know, I mentioned this kind of intellectual curiosity as a family value. I never really got the sense that I was asking too many questions or that stupid questions were a bad thing. I thought my parents did a really good job of creating that kind of environment for us. I try to do the same for my kids. You know, I just get really used to saying, “That’s a great question.” You know, and just encouraging the asking of questions and entertaining the questions.
And it’s hard because, you know, there is such a thing as boundaries that you have to try to enforce as well. There is such a thing as, you know, what I say is going to go now, even though I’ve entertained, you know, your thoughts and things. So, you know, I – striking the right balance there.
But, you know, just like, you know, I try to do for my kids, I try to create that environment in a professional setting as well, always leading with, “Man, that’s a fantastic question, let’s take some time to explore that. Let’s make sure we hear everybody’s conflicting viewpoints before we go forward.” But I say in a similar way, you do have to kind of set a direction and go eventually, so it’s, you know, striking that balance.
KEVIN SCOTT: Yeah, well, and it’s sort of like the two stages, I think, of creativity in a group, like, you know, making sure that you hear everyone’s voice, and everyone feels free to be bold in their thinking, even when they’re vulnerable. Like, super important but, like, then you have to make a decision. Like, we called it at LinkedIn “disagree and commit.” You know, so it’s perfectly fine to disagree, but at some point, like, you have to make a choice about what you’re going to do and then everybody needs to commit to doing that. It’s a–
ASHLEY LLORENS: Disagree and commit. I like that.
KEVIN SCOTT: Yeah. (Laughter.) So, you know, you have like all this stuff that you are curious about as a kid, what’s your school look like? Did you have a strong music program? Strong science program? Like, who is helping you explore these things that you’re super interested in?
ASHLEY LLORENS: Yeah, it’s interesting. I’m trying not to get too much into, like, local politics, but the way the school systems – school districts work in Illinois, where I’m from, is that one set of schools is supported by the tax base around it. And it’s not necessarily like a shared set of resources across schools. So, you have big inequities, you know, in terms of the level of resources.
And so, I went to a school on the lower end of that spectrum just from a resourcing standpoint. So, that was always a challenge. Like, as I was leaving the school, it’s better now, but as I was leaving the school, it was in the process of, like, losing its sports programs, you know, to debts and things like that. So, that was a tough environment.
I would say I was still really fortunate to have some great role models. My math teacher, Mr. Amaro, you know, I’m going to go ahead and call out the name, was really just a great champion and would push us hard, you know. He was this gentleman of Cuban descent that was, you know, from Florida and spoke with an accent, but was always positive.
He would wake up – he would get to school at, like, 6:30 in the morning and would expect you to be there at that time. You know, if you were participating on his math team, which I did. So, you know, there was my Spanish teacher and the principal that, you know, I got a chance to work with in student government. So, you know, I would say maybe even in an environment that was sort of resource constrained like that, I think it’s possible to have, you know, some fantastic role models. So, I feel fortunate there.
KEVIN SCOTT: And so, when you were thinking about going to college and in college, like, how did you choose what to do since you had so many things you were interested in?
ASHLEY LLORENS: Yeah, yeah, you know, it’s funny because when I think back to high school and this – because you’re not really choosing a career at that point, you’re choosing a major. And so, you find yourself in the guidance counselor’s office. And it’s like, “Hey, if you’re smart, you do math or science.” You know? I’m not sure that that’s the right answer, necessarily, but that’s definitely a prevailing wind I felt.
So, it felt somehow more pragmatic to me to pursue that aspect, just from a career standpoint. So, I wound up – and it also jibed with a lot of the kind of intellectual curiosity that I mentioned. But I sort of have always stubbornly refused to give up the creative aspects as well. And so, I took narrative writing courses. And, you know, a lot of courses on the creativity side, too.
I probably would have had a minor in those things if the engineering school had allowed it. And I’d say that kind of duality tracked me, you know, into my career. So, if you fast-forward a bit to 2003, I was kind of graduating from graduate school and starting a record label at the same time.
And my plan was to just move into science and technology long enough to fund a record label, and then leave and go do music full time. Because even having advanced to that point, you know, after having gone through undergrad and kind of gotten my master’s and all, I still didn’t really believe that a career in engineering was for me. Like, I didn’t have very many role models that were professional engineers and scientists, so I wasn’t really sure if it was a place that I could kind of be myself, you know, be intellectually curious and be entrepreneurial and kind of have an impact on my own terms. Those things were very important to me.
So, you know, so I kind of set off. And what I found was, first of all, the opportunity to do both – so I kind of hung onto those things for as long as I could. For a good 10 years, I was doing both things. On the music side, just figuring out how to press records – vinyl records – and how to get my records into places like clubs in Tokyo, and then how to get myself into places like clubs in Tokyo to, you know to do performances. And it kind of took me around the world and many adventures.
But, you know, on a parallel path, I was really figuring out how to chart my own course within science and technology, how to be myself in doing those things, and really discovering a kind of really cool career path right at the intersection of science and engineering and having opportunities to be a principal investigator, even as a fairly young scientist, you know, for the Office of Naval Research and other sponsors.
So, publishing, you know, so if you think science, you know, engineering and music. So, publishing papers at academic conferences. Going to – flying overseas to do those conferences, but then leaving the poster session to go do a show at the club, you know.
Presenting research in Prague and then doing a show in Prague, you know, and then eventually, you know, being able to turn a lot of those scientific advancements into real-world technologies for the Navy and other sponsors. So, just a tremendous set of adventures as I kind of reflect on it.
KEVIN SCOTT: And what did you major in in college?
ASHLEY LLORENS: Yeah, so my undergrad was computer engineering. ECE is kind of a joint college anyway, but my undergrad was computer engineering, and then my master’s research focused on electrical engineering, digital signal processing. And then when I entered the professional environment, I immediately discovered machine learning, which eventually led me to artificial intelligence.
So, it’s interesting how, you know, 20 years – 20-some-odd years ago, you could go through a whole degree without ever bumping into machine learning. I didn’t discover machine learning until I got into my professional environment, but then I was kind of immediately hooked.
You know, early in my career, I wouldn’t have dared say I was into artificial intelligence. Like, that’s not what you said, you know, and you definitely didn’t talk about neural networks at all if you wanted to get your research funded. So, but you know, as I advanced from, you know, machine learning for human systems, you know, kind of as decision support tools for humans, to more robotics and autonomous systems, it naturally leads you to grander thoughts about artificial intelligence. You’re creating an agent that’s going to go out in the world and perceive and understand its environment and act on the basis of its own perception.
And in my case, you know, I spent a lot of time developing technologies for underwater robots – autonomous underwater systems. And so, you know, this is not a clean laboratory setting. This is the ocean. So, you’re creating something that has to go out there and fend for itself in an open environment.
And it naturally leads you to, you know, grander questions about more robust and generalizable forms of intelligence, which it’s been amazing to kind of have an opportunity to think more and more about as I advance in my career.
KEVIN SCOTT: Yeah, it’s – I mean, you and I probably were in school at the same time – I think I’m a little bit older than you are but like we were in school in the ‘90s, I’m guessing.
ASHLEY LLORENS: Yes.
KEVIN SCOTT: And, yeah, you’re totally right, like, there was, like, when I was getting my – or working on my PhD, I dropped out, much to my advisor’s chagrin. You know, like, you are absolutely right, like, you just didn’t mention neural networks if you wanted, you know, you wanted to graduate and get your papers published. (Laughter.)
So, it’s really amazing to see how much has changed just over the past 20 years and even accelerating over the past 10.
So, but I do want to go back for a minute and sort of chat about this interplay that must have existed between your professional career and your music career. Did they benefit each other?
ASHLEY LLORENS: Yeah, that’s a great question. And it was an interesting journey to kind of discover the interplay. I would have said something like not so much at the very beginning of that journey, but I think the intersections kind of have presented themselves to me over time and have been really satisfying to see.
On the music side, you know, especially as an independent artist with a very low budget, you know, you had to really be convincing to convince people to work with you for little or no money. Right? Or at least up front. And so, you know, you have to develop the skill of creating a pitch for yourself – a story that you know people would sort of buy into, a vision for where you were going. And then as, you know, the executive producer of an album, you’re a project manager, you know? It needs a budget, it needs milestones, you know, you’re going to have roles and responsibilities if you’re ever going to get anything out into the world.
And so, there’s interesting aspects of those things, obviously, transcend both. And then from a technology standpoint, you know, if you think about audio recording and, you know, audio engineering, it’s engineering. You know? It’s frequency selective signal processing, it’s filters, it’s gains and amplification and all kinds of things. And so, I definitely think from a technology standpoint, from a project management standpoint, and I’d add another dimension, too, just communications and storytelling really transcend both.
And so over time, again, you know, a lot of these intersections have sort of presented themselves. And I realized even without, you know, knowing it at the time that each was benefiting from the other.
One of the things that’s been fun about my music career is sort of figuring out how things happen and how things get done. So, at one point, you know, it was just a curiosity. Like, how do I get things into music and TV? Just another challenge, right?
So, what I discovered is that there’s these agents, right? And they, you know, they bring opportunities. And so, what you have is you have movie editors, video editors. A lot of times, they put these – they get to the end of their production, they have these reference songs in there. And so, you know, at some point, they – in The Blind Side, they got to the end and there was a scene where they had 50 Cent’s In Da Club. And they’re, like, well, we ain’t paying to license this song. So, then, they put out, you know, they put out the call, like, to all the agents.
And my agent comes to me and they said, “We need a replacement for 50 Cent’s In Da Club.” So, I call my guy in Ohio and I’m, like, okay, let’s, you know, let’s put something together. So, I made two songs for them. One was That Thing, and it – you know, it’s not really my voice, it’s me replacing that song. You know, it’s my physical voice, but not my voice as an artist, but it’s still fun. And that one wound up in The Blind Side.
The second one I made for that spot to replace the 50 Cent song was one that actually wound up in 50 Cent’s Power a few years later. So, that was – it was the two things I made to replace 50 Cent in the reference soundtrack. So, it’s kind of a cool story. But, no, I didn’t meet 50 Cent or anything like that.
KEVIN SCOTT: It’s fascinating that you said that. Like, one of the reasons I try to explain this stuff. So, I love to learn about stuff. And the process of learning, for me, is almost better than the end product that I, you know, end up with.
And so, like, that whole idea about, like, figuring out how all of this craziness works behind the scenes, like, that sounds so appealing, like, more appealing than the song, itself. And –
ASHLEY LLORENS: That’s exactly right. That’s the fun of it, for sure.
KEVIN SCOTT: Ton of fun.
ASHLEY LLORENS: Yeah.
KEVIN SCOTT: You know, it’s sort of interesting. Do you ever feel like the – that it’s easier for people to engage with and maybe even have more curiosity about the music work that you’ve done versus this machine learning work that you’re doing? Given that like everybody sort of understands music, right? Like, it’s just something innate in us that, like, we are musical. We appreciate music, you know, to the degree of, like, being a fan of musical artforms and, like, fans of the people who create it. But like, you don’t really have that with technology in the same way.
ASHLEY LLORENS: You know, it’s interesting. So, this was actually something I really worried about at the beginning of my career. And I have to confess, I actively tried to keep my music and work separate. And I was not very forthcoming about the music side of what I was doing at work because I wanted to be taken seriously as a technologist. I was already someone who was from an underrepresented demographic, you know, from that standpoint, being a black person, a black male in science and technology. So, to add that dimension, too, I was like, “Okay, let me just focus on, you know, the science and the tech when I’m at work.”
But even that over time, you know, I grew to realize that I could kind of bring my whole self and present both sides, but it was easier as I had a track record of scientific and technological accomplishments to back it up. It was hard when I was a blank slate coming out of college. So, you know, for better or worse, I’m not sure what the right and wrong of that was.
But, you know, I did wind up, for example, if you go on YouTube, you know, you’ll find a clip of me doing a hip-hop duo with the director of the Applied Physics Laboratory at one of the all-hands addresses. And so that was, like, the big coming-out party at work for the hip-hop side.
You know, on the technology side, though, 20 years ago or you know 15 years ago, you say you’re doing machine learning, people are like, “Oh, what’s that? That sounds curious. Machines learn? I never heard of that.”
Now, it’s a little different. You know? You know, people know what AI is. When you say you do AI or machine learning, they’re like, “Oh, tell me more. You know, I hear about this stuff all the time.” And that plus the fact that technology is forming such a profound substrate of our whole human experience now, I think people are just more naturally engaged and curious about technology because it’s such a part of our lives. And so, whereas it started out with the arts and music being more what people related to, now I don’t necessarily see a difference, you know, in terms of the levels to which people are engaged in these two things.
KEVIN SCOTT: So, you know, I think this whole notion of narrative, the thing that you were talking about a minute ago, was really important. So, like, you just sort of got, you know, in your musical career that you are telling stories and you sort of understood as a young professional that you had to tell a story about yourself. And I think with technologies like machine learning, narrative is also really important because they’re complicated. And if you go all the way to the bottom just in terms of how they’re implemented, the complexity of these things is very high.
Yeah, the narrative is really important because it’s such an important technology and it is having such a profound impact on what the future is looking like every day as it unfolds that people need to be able to understand how to engage with it to sort of, like, what do I think about this technology? What do I think about policy about this technology? What do I think about, you know, like, my hopes and my fears for the future of this technology?
So how, you know, have you thought much about, like, you know, the story of AI?
ASHLEY LLORENS: Yeah, absolutely. And maybe there’s a couple of sides – well, there’s many sides, but maybe two sides I’ll pick to explore there. One is absolutely the idea that AI has taken us in a bold new direction as a society. And I think it’s more important than ever that we can engage around these policy questions and really around the directions of AI, definitely outside of computer science and across disciplines. And so, we do need to create narratives. Even more than that, I think we need to create directions that we agree on that we want to take this technology.
A lot of times, I think people are discussing AI as something separate from human beings and human intelligence, and I think we need to be thinking of these two things as complementary.
So, what are our goals for these things, you know. Can we start to set some audacious goals around enabling as many people as possible on the planet to live a long, healthy life, creating an atmosphere of shared prosperity, and what is the role of AI in doing that?
To me, these big societal narratives should be at the top level of abstraction in terms of what we’re talking about. And then everything else is derived from that.
I think if we’re going to just let 1,000 flowers bloom and see where we land on this thing, I think we could wind up with some really unintended consequences, you know, from that.
KEVIN SCOTT: Yeah. I really, really agree. And I think, you know, too, if you have the wrong narrative, you could have unintended consequences as well. Like, one of the things that I have been telling people over and over again over the past handful of years as just sort of a useful device about thinking about the future of AI is that AI, like especially its embodiment in machine learning, is a tool. And just like any other tool that people have invented, like it’s a human-made thing and, like, humans use it to accomplish a whole wide variety of tasks.
And, you know, the tool is going to be as good or as bad as the uses to which we put it. And, you know, it’s just very, very important I think for us to, like, have a set of hopeful things that we’re thinking about for, you know, those uses of AI as, you know, we have our anxieties. And both are important. Like, you have to – you know, it will certainly be used for bad things.
But, you know, as with any technology, like, the hope is that there will be orders of magnitude more positive things and good things that people will attempt to do with it than the bad. And part of how we get to that balance of good versus bad is the stories that we’re telling ourselves right now about what it’s capable of and, like, what to be wary about.
ASHLEY LLORENS: I think that’s right on point. And, you know, we can even ask ourselves, you know, what does it mean to behave intelligently as a species? I actually think we’re getting to the point where we can start asking ourselves and holding ourselves to, you know, to some standard there.
You know, if you just think about artificial intelligence at a low level, you know, from an agent standpoint, you know, I think intelligence, itself, is the ability to achieve goals, to set and achieve goals. And then what do you have to do? You have to be able to have some understanding of the world around you, you know, through some mechanisms of perception, whether that’s kind of our human modalities or other kinds of modalities.
You have to decide on a course of action, you know, that best achieves your goals. And then you have to carry it out. Like, these are the things you do to be intelligent. So, then you extrapolate that to us as a species, because one of the hallmarks of human intelligence is our social intelligence, our ability to, you know, collectively set and pursue goals and things like that.
So, I think – I’m sort of – as you can see, I’m sort of cursed now to see everything through the lens of intelligence and, you know, artificial intelligence. This is just how I – my lens on the world. But I think it’s helpful. I think it’s useful. I think in order to behave intelligently as a species, we have to do some of these things that you’re talking about – setting some bold visions and directions and figuring out how to organize around those.
KEVIN SCOTT: Yeah. When I was a kid, the thing that really inspired me, I think, to become a scientist was the stories that I was reading. And, you know, I grew up in the ‘70s and ‘80s, and you had a whole mix of things. I read a bunch of science fiction that was sort of techno-utopian. It was these future worlds that had a bunch of technology, some of which now exists, some of which will probably never exist, and, you know, sort of people living these crazy, interesting lives, you know, full of drama, in these futuristic worlds.
And then you had some dystopian things as well. You know, I always sort of think about, like, these two different portrayals of AI. There’s Commander Data from Star Trek: The Next Generation, and then there’s The Terminator, you know, from the Terminator movies. You know, the latter is sort of the, you know, the anxious depiction of AI, like, what happens if you build machines that, you know, get out of control?
And the Commander Data, you know, version of AI I think is really, really interesting because I don’t know whether you’ve watched the Star Trek Picard series, which is the recent thing with Patrick Stewart and, like, Data, again, played this very, very important role in the story that they were telling. And, like, the interesting thing about Data is that even though he was an android, he was an artificial intelligence. He was always the thing that the writers in Star Trek used to shine a spotlight on humanity. So, like, he, you know, in some sense was, like, the most human character in the show and, like, they sort of used his artificiality as this plot device to sort of explore what our human nature is.
And, like, I think that sort of gets to what you were just talking about, like, I think, you know, AI may tell us an awful lot about who we are.
ASHLEY LLORENS: Yeah, I think that’s right. And I want to seize on the theme of dystopian thinking, because you and I were, you know, talking previously about kind of the Utopian thinking, you know, setting goals. And I do think it’s even, you know, increasingly important to do that dystopian thinking, to do it in a way that’s constructive.
You know, coming from kind of the federal science and technology space and advising, you know, within the Department of Defense and those kinds of circles, it’s easy to get focused on really kind of hyperbolic types of concerns, like the Terminator and Skynet and all those things.
So, I’d like us to do more dystopian thinking, but I’d also like us to ask the right questions, to take that mirror, you know, and kind of reflect, like you were saying with Commander Data. And speaking of mirrors, there’s a series on Netflix that I love called Black Mirror.
KEVIN SCOTT: Yeah.
ASHLEY LLORENS: And so, this is the reflection. I think these are the kinds of concerns we should really be thinking seriously about – ways in which people will use technologies, you know, in a way that hurts people. You know, whether through positive intent or negative intent.
I love the kind of Twilight Zone-esque, you know, ways in which the best intentions are, you know, kind of turned into unintended consequences through things like, you know, the killer bees episode, you know, as a kind of cautionary tale on autonomous systems, or the episode where the brain-computer interface that the gentleman is using imprisons him in his own mind.
I think these are – those particular questions may not be the right ones, but that way of thinking, you know, the dystopian thinking, but really asking the right questions, I think is important for us as we move forward.
KEVIN SCOTT: Yeah, it is an interesting ethical dilemma, I think, thinking about where the line is between inaction and, you know, sort of safeguarded action. Like, the safest thing to do in life is to, you know, sort of stay at home and don’t come into contact with anyone else and, you know, just – you can sort of surround yourself in this bubble of safety where you really can’t do much. And you can look at almost any substantial technology that we’ve ever built and, like, if you let your imagination run wild, you are going to be able to imagine, like, a huge number of bad things that you can do with the technology.
And if you let that imagination paralyze you and convince you not to build the technology at all, not to, you know, leave your house at the beginning of the day to engage with the rest of humankind, like, you miss out on these incredible things that help us become more human, I think.
And, you know, and then to solve, like, really, you know, important problems, like, you know, that help us be healthier and live longer and, like, have, you know, much higher quality of life, and, you know, that supports a larger population on the planet. And, like, if you just stripped away even the past 50 years’ worth of technological development, like, the world would be in a terrific amount of trouble, which is something we often forget.
You know, so, like, it’s that framework that you use to decide on, like, how you balance, you know, positive action versus inaction and, like, not get paralyzed, like, in one way or the other.
ASHLEY LLORENS: I think that’s right. I think you articulated the trade space there very well. And I don’t know the right answer –
KEVIN SCOTT: Neither do I.
ASHLEY LLORENS: This is the trade space to be aware of and always conscious of. I also think we need the right dose of humility about our understanding of the world and ourselves. We still have so much to learn, even about the ways that our own bodies and minds work, much less the very rich, interconnected, you know, ecological systems that we live inside of.
You know, I was in – I was actually moderating a panel at the National Academy of Sciences about science communication and the communication around uncertainty in science, and you know, the discussion – I’m not an expert in this area, but the discussion was about genetic manipulation and doing things like putting genetic manipulations out into populations of, say, insects or something to cause some population-level change.
And the idea of making a change like that in an ecological system that’s so interconnected, you know, and the repercussions and the unintended consequences that could happen, I’m not saying it’s necessarily the right or wrong thing to do, but certainly, I think we need to approach it with the right dose of humility and to be humble as we explore that trade space that you just laid out there.
KEVIN SCOTT: Yeah. Look, I think that notion of humility is extraordinarily important for so many reasons. Maybe the, you know, one of the more important reasons we need to have humility is just that we need to make sure that we’re not drinking our own Kool-Aid, so to speak, that we’re not overconfident about what it is we think all of our technology and all of our science is telling us.
One of the things that, you know, that I’ve tried to push back on a lot over the past 12 months is, you know, folks who are trying to take all of their incredible IQ and all of their incredible energy and apply it to, you know, helping with the pandemic. And, you know, it’s this horrible thing and we’ve got this tremendous sense of urgency. And, like, even though we have that urgency, like, I’ve tried really hard to help us or to, like, encourage people to not throw away our scientific process.
The scientific process is, like, so valuable because its only allegiance is to truth. And, like, that process of finding the truth is, like, incredibly messy and, like, even when you think you’ve found truth, like, you probably haven’t and, you know, just sort of constantly reassessing, like, what you think you believe and, like, how you arrived at those beliefs is integral to the way science works.
You know, and it’s sort of unforgiving, right? Like, if you insist dogmatically that this thing is true and, you know, like, you just haven’t used all of this, you know, apparatus of scientific rigor that we’ve built up over the centuries, you know, like, things will spectacularly fail.
So, anyway, you know, I – like, I think that humility is, like, you know, like, we really have to not think that we understand more than we do and, like, we, you know, we certainly shouldn’t be, you know, sort of insisting that we’ve found truths that we really haven’t proved out.
ASHLEY LLORENS: Yeah, yeah, so important. And maybe this is a great segue into aspects around diversity and inclusion, because I do think that can really be a superpower for us if we leverage it. Because like you said, it’s so messy, you know, getting from our subjective observations, even of experimental data, much less, you know, of lived experience. And everyone, even when you try to be unbiased, brings all that inherent world view into everything that you do.
And in some of these cases, I think our only hope is to integrate over enough of those conflicting world views to try to triangulate on what is objectively true, or at least get as close as we can. And there are so many aspects of diversity to really be focused on, you know, demographic diversity, etc. But you definitely need people that think differently, that have a different lived experience, etc. How do we create pipelines of those folks into these fields? How do we create safe spaces for people to think differently and pursue their careers differently? There are just so many aspects.
But I think if we get it wrong, we’re dooming ourselves, I think, to the kinds of group think that happen when you have a lot of folks that – when you’re not integrating, so to speak, over a lot of these differing world views.
KEVIN SCOTT: Yeah, I could not agree with you more. You know, I think the things that I’ve seen be most successful over my, you know, couple of decades as a professional in technology have been places where you have exactly what you said, that diversity of experiences, that diversity of perspectives, that diversity of backgrounds, and like, a real diversity of thinking.
Who knows how many wonderful things that you can dream up or how many things you can inspire in others because you are both a musician and an engineer? And like having more of that, like, I think is just so valuable. There are like infinite wonders in the world that we will never discover, which is like maybe one of those, you know, sort of optimistic slash depressing things to say. And, like we won’t even get to, you know, like, the interesting part of that infinity unless, like, everybody is bringing the best that they have to figuring out what the future is.
ASHLEY LLORENS: Yeah, I think that’s right. And you know, even as our technologies advance, etc., I think the kinds of global challenges that we have to step up to are getting greater and greater. And if you take this – you know, this simulation that we’re all kind of involved in and you fast-forward it enough years, you know, the statistically improbable things that we’re sort of afraid of, you know, they come to fruition. And the pandemic is, you know, an illustration. So, I do think we have to figure this out and get to the point where we’re sort of activating our collective intelligence in the best possible way.
KEVIN SCOTT: Yeah, and I think that, you know, if you think about some of these things, like preventing the next pandemic or, like, making the impact of the next pandemic less than, you know, the ones that came before it, or climate change, or, you know, sort of dealing with the demographic inversions that are happening in the industrialized world.
Population growth is going to slow down in the U.S. and Europe and China, Japan, and Korea, like it will start to slow down in India, and then it’s going to start booming in Africa. And so, you know, the thing that we know is that, you know, population growth is the thing that, you know, implies, like, the growth of ingenuity and creativity and whatnot. And so, like, you can start investing now in Africa so that as that population explodes over the coming decades, like, all of those energetic young people are going to be equipped to, like, help all of us old folks, like, figure out how to solve some of the big, big challenges that we’ve got facing us.
ASHLEY LLORENS: Yeah, I think that’s absolutely right. And, you know, even, again, I’m not an expert in everything that I’m curious about, but it also seems to me that a lot of our economic systems are sort of predicated on this notion of an expanding population.
KEVIN SCOTT: Yes.
ASHLEY LLORENS: You know, there’s some interconnectedness there. And so, you know, as these population trends start to invert, like you’re saying, we need to do some serious thinking about how our economic markets and things allocate resources and all that. So, there are some really substantive and fundamental things I think we’re going to need to be rethinking here.
KEVIN SCOTT: Yeah, I mean, I think you mentioned, like, one of the – you mentioned it a couple of times, actually. Like, one of the things that we should be focusing on with a tremendous amount of urgency is, like, how you can use machine learning to help with healthcare and aging. You know, I read this article last year in the New York Times about the eldercare crisis in Maine, where there just aren’t enough people to help take care of the aging population there. And so, like, it’s not a wage thing or anything. There’s no amount of money for which you can hire, you know, enough people to take care of all of the elderly.
And, you know, this is going to play out everywhere soon. I think in Japan, like, they already are, like, seriously thinking about the technology that has to be built to help make sure that, you know, you can have the elderly living a dignified life in their later years and, like, you still are able to pick up the slack in productivity in the population because you have fewer workers than you’ve ever had before. And, like, you preserve the ability of the workers that you do have to be able to do their jobs and, like, you know, do the creative things that they’re doing to, you know, build the future of those societies, you know, without being completely consumed with taking care of their parents and older relatives.
And, like, the only thing that sorts that out is technology. Otherwise, you’d just have a collapse in, like, the quality of life because there just aren’t enough people to do all of the work.
ASHLEY LLORENS: Yeah, absolutely. And then there I think we have a really amazing opportunity in automation and autonomous systems, where AI and robotics meet. You know, the ability to ingest data, do analysis, make decisions, but also to carry those decisions out.
Whether they’re in – you know, we have a lot of automation emerging in factory settings and controlled environments, but you know, I think a lot of the challenge, you know, comes in these more open environments, these less structured environments, environments with people and really making robotics and automation work in those kinds of settings. I just love to see more and more of our intellectual capacity thinking about some of those challenges in a human-centered way.
KEVIN SCOTT: Yeah, I could not agree more. So, we’re almost up on time here. So, I think the last thing I like to ask everyone is what you do for fun or in your spare time. It’s sort of a weird question, because I think almost everyone I chat with, like, thinks that their work is fun, and they don’t have much spare time. But I ask the question anyway.
ASHLEY LLORENS: Well, so I’m definitely one of those folks. I’m definitely one of those people that thinks my work is fun. But I love being a father. I love hanging out with my kids. And, you know, in pandemic times, it’s been all about video games and virtual social experiences through video game platforms have been just the thing that, you know, has kept the social fabric together, you know, among our kids and their friend circles and everything.
So, I’m someone who grew up on video games and, you know, had a Nintendo, you know, and an Atari and all those things. So, I love games. And then just kind of getting into whatever my kids are into in general, which right now is these kind of virtual games. I’ll be honest with you, those kinds of platforms, I’m not so crazy about. But I love spending time with my kids. I kind of just make a point to kind of get into whatever they’re into.
KEVIN SCOTT: So, what are your kids’ favorites?
ASHLEY LLORENS: Oh, man. They’re into Roblox right now.
KEVIN SCOTT: Yeah.
ASHLEY LLORENS: You know and Adopt Me is like a game in Roblox and Piggy. I don’t know if you’ve seen this.
KEVIN SCOTT: Yes.
ASHLEY LLORENS: You know, so, it’s those things. And it’s like, okay, well, I guess I have to get into these things to spend time with them. Sometimes I can’t come into their room, I have to come into the virtual server to see them. But I sometimes try to pull them back toward more console-oriented games, too, because that’s kind of my sweet spot.
KEVIN SCOTT: Yeah, it’s really fascinating. Like, my – I’ve got a 10-year-old and a 12-year-old, and the 10-year-old is, like, a legit Roblox tycoon. He’s learning all sorts of stuff about economics and you know, like, he has this facility with virtual worlds that will go with him the rest of his life.
It’s really fascinating to hear that your kids are into that, too. It’s an interesting shift.
ASHLEY LLORENS: Yeah, and a sign of the times, for sure.
KEVIN SCOTT: Well, thank you so much, Ashley, for taking time to chat with us today. And I, like, I’m so glad that you’re here at Microsoft and just can’t wait to be able to work more with you over the coming years.
ASHLEY LLORENS: Likewise, Kevin. Really appreciate the opportunity. Had a lot of fun today.
KEVIN SCOTT: Awesome.
CHRISTINA WARREN: Well, that was Kevin’s conversation with Ashley Llorens. And I was so fascinated by everything you talked about. First of all, he is brilliant. And as you were kind of saying at the top of the show, it’s really rare that you see someone who has achieved kind of parallel careers in two very different fields. Not that there might not be, as you said, kind of a mysterious connection between music and computer science, but on the face of it, in how they work, they are very, very different. I think it’s just so stunning that someone like Ashley exists, honestly.
KEVIN SCOTT: Yeah, and it’s so great to see him have the success that he’s had in both of these things that he has passion about. I think a lot of us get pushed into one direction or the other fairly early in our career. Like, I know you, for instance, like, you are a computer programmer, and you are a writer and a journalist.
And, like, those are also two things that look very different from one another on the surface, and I’m guessing that through your career, and I’m guessing Ashley had this as well, and like, I have my own experience with it, like, you get encouraged to, like, “Oh, you’ve got to focus, you’ve got to focus, you’ve got to pick one thing.”
CHRISTINA WARREN: Yeah.
KEVIN SCOTT: And, like, I always love seeing people like Ashley where it’s, like, no, I actually don’t have to pick one thing. I’m going to do both.
CHRISTINA WARREN: No. I totally agree. And you’re exactly right. I think most of us, we do have one thing that we have – we either have to pick or we have to focus more on. And I love that he’s had these parallel careers, but it was really also interesting to me hearing him talk about how at the beginning he tried to keep, you know, his music life separate from his technology work because he wanted to be taken seriously. And I’m glad he doesn’t have to do that anymore, that he can share his full self.
But it also makes me think, okay, in technology, people really need to be a lot more open-minded about the different backgrounds and interests that people have.
KEVIN SCOTT: Oh, I could not agree with that more. And, like, when he said that, it really, really, really resonated with me because you know, a little bit of it, I’m guessing, is imposter syndrome and, you know, you sort of feeling very uncomfortable early in your career about whether or not you belong in the – the place that you’ve chosen to be.
And, like, part of it is, like, legitimately these professions, like, have these notions of, you know, this is what it means to be a “blah.” And it’s, you know, whether it’s a medical doctor, lawyer, computer scientist, academic, programmer at a tech company. And the reality is, like, if we were just much more open about things and encouraged people to be their authentic selves early on and like helped them understand that, like, we all feel like imposters at some point or the other that, like, maybe we would, you know, we’d have more people doing more interesting things.
CHRISTINA WARREN: For sure. And, I mean, I think it also – and this is really evident in the work that Ashley does, it’s so important in AI to have different perspectives. And I think that’s why it’s amazing that we have someone like him who is an expert in that area who also has different perspectives than other people might, you know, who have come into this because of his curiosity that he was talking about that he’s had since he was a kid. I have a feeling that you would need to be curious to be able to do the things that he does. You need to have that sense of asking why and wanting to learn more. And I love that.
KEVIN SCOTT: Yeah, for sure. And I agree. Like, it is critically important in technology in general, and particularly with AI.
Like, these things that are going to have a high degree of influence over what the future looks like, you just really need a diverse set of people helping you build those things, because the technology itself has so much impact, like, you want it to be in the hands of as many people as humanly possible.
CHRISTINA WARREN: For sure. For sure. Well, I loved the conversation, and I can’t wait to see what Ashley does now that he’s at Microsoft.
KEVIN SCOTT: Yeah, me too.
CHRISTINA WARREN: Okay, that’s it for today’s episode. Thank you so much to Ashley for joining us today. And send us a message anytime at [email protected] and be sure to keep listening.
KEVIN SCOTT: See you next time.