RANA EL KALIOUBY: Empathy and emotions are at the center of how we connect and communicate as humans, and our emotional experiences drive our decision making, whether it’s a big decision or a small decision. It drives our learning, our memory. It drives this human connection.
KEVIN SCOTT: Hi, everyone. Welcome to Behind the Tech. I’m your host, Kevin Scott, Chief Technology Officer for Microsoft.
In this podcast, we’re going to get behind the tech. We’ll talk with some of the people who have made our modern tech world possible and understand what motivated them to create what they did. So, join me to maybe learn a little bit about the history of computing and get a few behind-the-scenes insights into what’s happening today. Stick around.
CHRISTINA WARREN: Hello, and welcome to Behind The Tech. I’m cohost, Christina Warren, Senior Developer Advocate at GitHub.
KEVIN SCOTT: And I’m Kevin Scott.
CHRISTINA WARREN: And today, we have a super exciting guest with us, Rana el Kaliouby, who is the co-founder of Affectiva. She’s the Deputy CEO of Smart Eye and an Emotion AI pioneer, among so many other things.
KEVIN SCOTT: I am just thrilled to be speaking with Rana today. You know, we’ve talked so much about AI on the show, but the convergence of AI and human emotion is an angle I don’t think we’ve ever really covered before, certainly not to the extent that I’m hoping to get into with Rana.
CHRISTINA WARREN: I can’t wait to listen to your conversation with Rana. So, let’s dive in.
KEVIN SCOTT: Rana el Kaliouby is an Egyptian-American scientist, entrepreneur, author and an AI thought leader who is on a mission to bring emotional intelligence to our digital world. She was the Co-founder and CEO of Affectiva, which was acquired by Smart Eye in 2021, where she now serves as the Deputy CEO. Rana is also general partner at AI Operators Fund, an early-stage AI-focused venture fund, as well as an Executive Fellow at the Harvard Business School. She’s the author of Girl Decoded, a memoir following her personal journey as a woman in technology. Rana, thank you so much for being here with us today.
RANA EL KALIOUBY: Thank you for having me. I look forward to it.
KEVIN SCOTT: So we always kick these conversations off by talking about how you first got interested in science or technology. Like was it when you were a little kid? Tell us a little bit about that.
RANA EL KALIOUBY: Yeah, so I am originally Egyptian. I was born in Cairo and my parents – it’s kind of a cute story.
My parents met in a programming class. My dad taught COBOL. And my mom in the ‘70s decided to learn programming. She must have been one of the very first Egyptian women to get into tech. And they met during that class, got married and moved to Kuwait. So they both had careers in tech.
And so we – I have two little sisters, and we grew up surrounded by technology. And I remember, like playing Atari video games, and my dad had these, like early, huge camcorders. And I was always fascinated by how technology brought our family together. It connected us, and this theme of how technology connects people has been really kind of a uniting thread across my career.
KEVIN SCOTT: That’s sort of – it’s very interesting, and I’d love to dig into that a little bit because some people, the fascination – early fascination with technology is about videogames, and for some people, it’s about – you know, just being able to communicate with other people, so like was there a particular thing, or was it just that technology was this gravitational pull that everyone around you was fascinated with?
RANA EL KALIOUBY: Yeah, I – I definitely think this whole, like – I remember there – we had this like little blue plastic chair. It probably cost like – I don’t know, like a dollar or something, but my dad used to plop me on it, and I think it was when I gave my first speeches ever, and he would sit there with the camcorder and record me like, you know, rambling.
And so that stuck with me. I remember the first program I wrote was a BASIC program, and I’m Muslim by background, but I wrote a program that kind of created a Christmas tree, and I remember that, right? So I just think it’s various things, and it’s just being surrounded by technology and being fascinated by technology. I think exposure is important too.
KEVIN SCOTT: Yeah, oh, I could not agree more. I’m convinced that for – well, it’s certainly an advantage, right? Like one of the things, one of the big things that I think folks struggle with is, if you get interested in technology or programming as a little kid, you just sort of have years and years of experience with it before you get to college, and you have to declare a major, and – like what is easy for you in college might be very hard for someone who didn’t have that exposure, and it has nothing to do with capability.
It’s just, you know, experience. So like – I mean, it’s wonderful that you had that environment where – where you’d – I mean, it sounds like it wasn’t just computers; it was a whole bunch of technological things. Your dad and mom must have been very enthusiastic nerds, which is great.
RANA EL KALIOUBY: Yeah, and I also think it takes away that fear, right? A lot of people have fear of technology, and when you’re exposed to it as a young kid, yeah, you don’t have – you have that curiosity but without the fear.
KEVIN SCOTT: Yeah, so from that seed that got planted very early, like how did things proceed? Did you take programming classes when you were – when you were in high school or middle school? Did you – you know, how did you sort of pursue this interest as you were a kid?
RANA EL KALIOUBY: I did a lot of programming in middle school and high school, not a lot, actually. Some. And then I decided to study computer science at the American University in Cairo, so I went back to Egypt and kind of declared a major in computer science and a minor in electrical engineering, but again I remember clearly, like I just kept getting drawn to this human computer interaction space, right? That touchpoint between humans and machines.
I was less interested in the operating system and the computer architecture classes, and I just became really fascinated by what is that touchpoint going to look like, and how can we create it and architect it in a thoughtful way that adds value to us as people?
And so I had a very clear plan. I said, okay. I graduated top of my class. I said, “Okay, I’m gonna go abroad, get a PhD and come back and teach at the American University in Cairo, AUC,” inspired by actually a role model.
So there weren’t a lot of women faculty. In the department, there was only one woman, Dr. Hoda, and I was like, “Oh, my gosh, she’s so cool.” Like, she’s so smart, but fashionable, and you know, kind. And so I was like, “I want to be like her,” so she kind of became my hashtag-goals, and I went abroad to get my PhD, thinking I’d come back and teach, and then I always like to say I outgrew my dream. My dreams evolved.
KEVIN SCOTT: That’s awesome. And did you feel supported in the environment that you were in? So like you had your mother, obviously, as a role model, and your dad was a programmer. You had this amazing, you know, computer science professor who was a role model, but like was there a struggle? And like all of us have struggle, right? But like did you feel like you were sort of supported?
RANA EL KALIOUBY: I think, by and large, I feel very blessed and lucky in that my parents – you know, we’re three daughters, right, in a culture where, often, there are expectations on being a woman, that are different than the expectations for being a man, but I do feel like my parents were pretty – in a way, they were very traditional, but they were also kind of very supportive of our education.
It was the biggest investment they’ve made. They put all their money into our education, so I feel very lucky that way, and I can almost, like, draw a straight line from their investment to where I am today. We traveled a lot as kids. They invested in our learning. We went to international schools, so we were very exposed to kind of a Western way of teaching and thinking.
But at the same time, you know, after undergrad, I got married very young. I was 19, and there – you know, I then decided to go abroad to Cambridge University to get my PhD. Both my parents and my in-laws basically said, “You’re married now, you’ve gotta – you know, stick with your husband, and you can’t just leave, and – you know, go pursue your career, so …”
And that was the beginning of this cultural tension between what I really wanted to do and my passion, but then the cultural norms and what society expects from you, and so there’s definitely a clash there.
KEVIN SCOTT: Yeah, so that is – I mean, it’s super interesting. I mean, I remember, for me, I grew up in rural central Virginia, and like my – I didn’t have – my mom had gone to, like, in the ‘60s, what at the time they called “secretary school,” and she, you know, learned a bunch of administrative skills and was a bank teller before she got pregnant with me and my brother.
And my dad and my entire lineage were all construction workers. And so I just remember being very tentative about moving away to go pursue my career, so it was super hard for me, and I didn’t do it until after undergrad, to like leave where I was raised to go do something that I was passionate about.
And it sounds like you not only did that, but like you also had all of this other stuff going on that made it hard. Like how on earth did you muster the courage to do this?
RANA EL KALIOUBY: Yeah, it takes a lot of courage and also a leap into the unknown, right, when you – you don’t really know how it’s going to work out. Often, like in my case, my family kept – you know, there was a lot of fear. They basically kept saying, “No, no, no.”
So I also moved to Cambridge University right around September 11th, where – you know, and at the time I used to wear a headscarf, so I was very clearly Muslim, and there was just a lot of fear, and to me – I don’t know, when you’re excited about something, and you’re passionate about it, it becomes the best motivator, I think.
KEVIN SCOTT: Yeah, but I mean – still, it’s really inspiring to hear you talk about this, because I know, those sometimes are the hardest things. It’s less about, “Can I get an A in this class,” or like, “Am I technically good enough?” It’s like, “Can I – you know, everything else that’s going on in my life, can I just make it work?”
And like, there’s just so much stuff that’s going on in everyone’s lives, and like you – you’re young and inexperienced, and so anyway I just always love to celebrate these stories about people who have – you know, have figured it out, and – you know, like talking about it, I think helps serve as a role model for other folks who – like can see, yeah, you know, like, “Rana was able to overcome a bunch of stuff to go do this amazing set of things.”
So it’s just great to hear.
RANA EL KALIOUBY: I do think one of my core values is this concept of courage, right, and being bold, and how it relates to faith, and just having that – yeah, taking that leap of faith into the unknown. You won’t always have the answers, and I like to say, like embark on this journey.
I tell my kids this all the time, like embark on the journey without attaching to outcomes, and it’ll – you know, it’ll open doors, it’ll create new opportunities. You often don’t know what they are, but it’s better than what you originally thought, so that’s kind of always encouraged, so –
KEVIN SCOTT: And speaking of that very good advice that you give your kids, that’s a good mindset to have, not only when you’re pursuing something like a PhD, but like when you’re thinking about having an entrepreneurial career. So like, talk a little bit about, like, you know, once you get to Cambridge, you’re working on this PhD, and then – you know, you sort of jump into this career, like what did those experiences look like?
RANA EL KALIOUBY: So my PhD was all about bringing emotional intelligence to machines, and I’m sure we’re going to dig into that a little bit more, but my plan was to finish my PhD and go back to Egypt and teach. And literally a month before my defense, I met this MIT professor, Professor Rosalind Picard, who I had been following for years.
I had read her book, and basically her book kind of pivoted my interest into kind of pursuing this idea of emotion AI, but I’d never had the opportunity to meet her in person. And very serendipitously, she was visiting Cambridge to give a talk, and she said, “I have …”
She emailed the lab, and she said, “I have, you know, an hour to kill, I’d love to meet some students.” I raised my hand, of course, and I booked a 15-minute slot with her, which turned into 45 minutes. And I spent weeks preparing for that meeting. I was going to demo my technology and kind of – you know, I had all of these ideas, and we totally hit it off.
And she said, you know, “I’d love to bring you on a postdoc in my group,” and I remember saying, “Oh, my God, this is like a dream come true, but I have this husband who’s been waiting for me, for the past four years, back in Egypt, and I kind of need to go back.”
And she said, “That’s fine, just commute.” So I started a four-year commute between Cairo and Boston. I would literally like spend a few weeks in Cairo and a few weeks in Boston, which is fun but also exhausting and not good for any marriage. That’s how I ended up at MIT, and then within a couple of years at the Media Lab, it was clear that we had a huge commercial potential for the technology, and we spun out.
KEVIN SCOTT: Well, so let’s talk a little bit about, like the technology itself. So like I’m really intrigued by this idea of emotional AI, because – like one of the things that we think about all of the time in all of the infrastructure that we do, and all the work that we’ve done with OpenAI, and like our very strong conviction around how fast these systems are becoming very capable, is you do want them to be able to be really useful to human beings.
I mean, one of the things that I wrote about in my book was this experience going to – back home to rural central Virginia, and I was talking to one of my friends who is the office administrator for a nursing home. And I talked a little bit about AI, and she’s like, “Yeah, I don’t know how AI would actually connect with the people in this home.” It may be able to do a bunch of diagnostic things as well as a doctor, but like that’s not the important thing. The important thing is being able to be in the lives of these people in a way that they can connect with.
So like, I think what you’re doing is very, very intriguing, so maybe talk a little bit about that and why you think it’s interesting.
RANA EL KALIOUBY: Yeah, I mean, it’s exactly what your friend is saying. Like empathy and emotions are at the center of how we connect and communicate as humans, and our emotional experiences drive our decision making, whether it’s a big decision or a small decision. It drives our learning, our memory. It drives this human connection.
It’s how we build trust, it’s everything, but if you look at how we think about technology, it’s often what I call the IQ of the device. You know, take ChatGPT, right? Very smart, very intelligent, but it’s all very focused on the IQ, the cognitive skills, and I felt like it was out of balance with our emotional and social intelligence skills, and I wanted to marry the IQ and the EQ in our machines, basically.
And so I had to go back. I’m a computer scientist by background, and I had to like study the emotion – the science of emotion, and how do we express emotions as human beings, and it turns out, 93% of how we communicate is nonverbal. It’s a combination of our facial expressions, our hand gestures, our vocal intonations, how much energy is in our voice, all of these things, and only 7% is in the actual choice of words you use.
So I feel like there’s been a lot of emphasis on the seven percent, but the 93% has been lost in cyberspace, and so I’m kind of reclaiming that 93%, using computer vision and machine learning technologies that weren’t available, you know, 20 years ago, but now they’re ubiquitous.
KEVIN SCOTT: Yeah, look, I think you are absolutely right. In 2016 I dialed back almost all of my social media consumption because you effectively have these machine learning systems, like particularly for businesses where their business model is engagement, so – like the more you engage with the platform, the more ads run, and the more money that you make. It is very easy to like get systems that get the engagement by triggering your amygdala and like keep you in this – and it’s very easy to be triggered by email.
Like I, all the time, have email conversations with colleagues, where I get super agitated by the email conversation and if I just jump into a video call with them, even, like not even face to face, but you know, what we’re doing right now, like in – in seconds, like all of the stuff that I was agitated about goes away.
So I’m just super intrigued by what you’re doing. Like how do we actually get this rolled out more broadly because I think you’re absolutely right. Like we’ve focused so much on the text and text is so rife with opportunity to get us emotionally activated in the wrong way.
RANA EL KALIOUBY: In the wrong way, right, because there’s a lot of confusion and ambiguity where you can clarify that when you see the face or hear the voice. I mean, I think what’s really cool about this, and what ended up being both the opportunity, but also the biggest challenge for Affectiva, you know, when we spun out of MIT, was that there are so many applications of this technology.
And so we tried to focus, but it was always this challenge of like, “Oh, my God, like there are so many cool applications,” and so some that I think are really powerful. One is the automotive industry where, you know, we ended up selling to Smart Eye, and they’re very focused on bringing driver monitoring solutions to the world, and so this idea of understanding driver distraction, and you know, if you’re texting while driving.
Well, we can tell that, using computer vision, right? Look at your eyes, your head movements, and if you have a phone in your hand, drowsiness, intoxication, even. We can – we’ve started like doing a lot of research to detect that, using cameras, you know, optical sensors. So automotive is one area.
We can do advanced safety solutions. We can look at, like is there a child seat in the car, and an infant in it, and often – not often, but about 50 kids or so get forgotten in the car every year, and they die of heat, so that’s very fixable. We can fix that.
Mental health is another space that I’m really fascinated by, that we know that there are facial and vocal biomarkers of mental health disease, depression, anxiety, stress, even Parkinson’s. So imagine if we – every time we hop in front of our computer, with people’s opt in, of course, we can kind of articulate your baseline, using machine learning.
We know your baseline, and then if you start deviating from it, we can flag that to you, a loved one, or – you know, a psychiatrist. So I think there’s a lot of opportunity, but we’re missing the scale. I think this is where Microsoft comes in, right?
KEVIN SCOTT: Yeah. Yeah, and we think about that all the time. I, one of the big things that we are attempting to do with AI is we want to make sure that it’s a platform that other people are using to build what they think is interesting and valuable, versus us sort of dictating to the entire universe what is interesting and valuable. And I mean, that is part of being a platform is – like the platform has to help you solve all of the problems that you have for building really great experiences for users.
So, yeah, so it’s super interesting to think about what you are doing.
RANA EL KALIOUBY: It’s, I mean, ultimately, too, I believe it’s going to be embedded in – it’s going to be like the natural human machine interface. We’re already seeing conversational interfaces, right? And then we’re going to see perceptual machines. And you bring all that together, and imagine if – you know, as you engage with your device, whether it’s a phone or in your car, your office or a home robot, it can kind of glean all of that information, and then nudge you to improve – you know, to improve your life, whether it’s to be happier or healthier, more productive, or – you know, whatever the KPI is, it can nudge you to do that better.
KEVIN SCOTT: Do you think at all about influence? So I mean, like with technology and the limit, where you want the flows of influence to be is like, you want the human to influence the technology. So like, I have a thing I want to accomplish, like I want to like be able to convey my intention to the machine and have it help me do this thing. What you want is less of the influence coming in, the other way, that there’s some sort of intention embedded in the thing that the person is unaware of, that is getting you to do something that you might otherwise not do. And so do – do you think about that, much, with what you’re doing?
RANA EL KALIOUBY: Definitely, because it’s a very blurry line between manipulation, right, and nudging towards positive behavior change, and who – where is that line, and who decides, right? And so in the case of social media, for example, it was optimized towards engagement, and I don’t know that that was good for each of us individually, right?
It wasn’t optimized for us. So I think that goes back to like the ethics of all of this, and how do we develop and deploy this in an ethical way? It’s almost not about the technology; it’s about the organizations and the teams that are deciding the use cases and the applications.
KEVIN SCOTT: Yeah, and I think a lot of it actually is about transparency. It’s about, you know, everybody being informed enough to understand, like, how the technology works, like not at the deepest levels, but, you know, this is the basic structure of these systems.
And I mean, like, my experience when I dial back on social media consumption, I’m sitting in there, you know, in 2016, reading a bunch of stuff that’s the same thing over and over and over again. It’s all opinion, and I’m getting agitated by it all the time. And I, like… I take a step back and I’m like, what am I doing? Like, I’m not learning anything. Like, this is not healthy. It’s almost like being on a diet of junk food, right? Like, you know, you sort of love eating the doughnut, but, you know, if all you eat is doughnuts, like, you’re going to get diabetes and die. (Laughter.)
RANA EL KALIOUBY: Yeah, there is a difference. How do you – you have kids, right?
KEVIN SCOTT: I do.
RANA EL KALIOUBY: How do you navigate that with – how old are your kids?
KEVIN SCOTT: They’re 12 and 14.
RANA EL KALIOUBY: OK, are they on TikTok?
KEVIN SCOTT: Oh yeah, totally on TikTok. And, like, we talk about it a lot. Like, I talk about, it’s like, you know, you’re on this… So, first and foremost, the thing that we try to do is to give them productive things that they can go do, that are interesting to them, that isn’t just passively consuming content.
So, my 14-year-old is a competitive rower and she, like, thinks she wants to be a cardiothoracic surgeon. So, she likes to study all of the – and she’s like, very, into her schoolwork. And she’s – so, she’s super busy. So, like, she’s got all of these other things that compete for her attention, and they’re all productive things.
And so, when she gets on TikTok, it’s – you know, I think it’s, my assessment as her parent is, like, it’s okay, doing it a little bit. But even then, like, we talk about it. It’s like, “Look, kiddo, this system is designed to capture your attention for as long as it can possibly hold on to it. And it really doesn’t care about anything else, other than you swiping up to the next video, and doing that over and over and over and over again.
“And there’s a very powerful machine learning system sitting there, watching what you’re doing and, like, trying to select the next thing that’s going into your feed that is… it doesn’t care whether you’re happy. It doesn’t care about whether it’s healthy. Like, all it cares is that you swipe up again.” And it’s like, “Now that you know this, like, do you really want to be doing this?”
And so, like, I think that’s the best I can do, because she’s 14. So, four years from now, she’ll fly the coop, and, like, I will, like… then I have no visibility into, like, what she’s doing. And so I’m just trying to give her some patterns that she can use to make good decisions for herself in the future, not just, you know, put walls around things that she thinks she wants. (Laughter.)
RANA EL KALIOUBY: Yeah, but I actually, this is awesome to hear, because I feel like I have a similar parenting philosophy, too. So, rule number one or approach number one is to keep them busy with other productive stuff, right? So, both my kids are just, like, very active, and they just have a lot of things that they’re interested in, which is awesome. And then if – when they use technology, think of themselves as not just consumers, but creators, right?
So, my son, he’s 13. He has a podcast, which is awesome, because now he has to think about – he has to learn these new tools, he has to kind of think about who am I going to invite. He has to think about the questions. So, it’s not just consuming, it’s producing. So, I try to kind of nudge them in that direction, but of course, he still uses TikTok.
KEVIN SCOTT: Well, and I think you’re totally right. Like, the act of producing makes you think about a whole bunch of things other than – I mean, like, and part of wanting to produce, I think, as a kid is like, oh, I want to, like, get TikTok famous or, like, I want people to pay attention to me.
But, like, it also does this other thing. It’s like, okay, like, I have to try to understand what it is other people want. Like, that’s a good thing. Like, I have to go learn the tools to do the producing. Like, that’s a good thing. They’re all useful, interesting tools. Like, I have to figure out how to compose content in a way where it’s compelling. Like, that’s – so, yeah, I think you’re totally right. Like, it is a very interesting activity that teaches you a lot of things. So, that’s smart.
RANA EL KALIOUBY: Yeah. I try, I try. There’s still a fair amount of video games and TikTok happening. I think it’s all out of whack.
KEVIN SCOTT: Well, the other thing – I did not intend this to be a podcast about parenting – but, you know, one of the other things, too, that I try to do as a parent is I realized that there’s a whole bunch of stuff that I did as a kid that was sort of really pointless. Like, I read a bunch of comic books. I, you know, watched cartoons whenever they aired on the networks. I collected Star Wars and G.I. Joe action figures and, you know, put on, like, these huge pretend campaigns.
And I look at what my kids are doing, and a lot of it is exactly the same thing. It’s just the medium is different now. And so, I try not to… I mean, like, my youngest loves Roblox, for instance. And so, she spends, like, all of her disposable income, like her, you know, $10 birthday presents and whatnot, on Roblox.
And at first, I’m like, oh my God, like, what is – what is this kid doing? Like, she’s spending all of our money on, like, you know, these virtual things. And I’m like, but wait, is that really any different from what I spent my money on that’s, like, all been disposed of now? Like, I wasn’t buying anything when I was a kid – (laughter) – that had enduring value as an artifact.
And so, like, part of it is just, you know, old bias or parent bias or whatnot. It’s like you just, I think, as a parent, have to appreciate the context that you’re in. And just because, you know, the medium or something about the context in 2022 is different from your context – probably a lot of it is the same thing. And, like, we all turned out okay. (Laughter.)
RANA EL KALIOUBY: Right, that’s true. Adam and I, my son, we have a lot of conversations around, because he was at some point also into Roblox, but just in general, buying these digital assets, right, which I’m perfectly fine spending the same amount of money on, like, a physical thing. And he’s like, “But it’s kind of the same.” So, we have these debates around what does it mean to be in the metaverse and, yeah, invest, and buy, and, like, kind of be immersed in that world.
KEVIN SCOTT: Yeah.
RANA EL KALIOUBY: Which is – it’s so fascinating.
KEVIN SCOTT: Yeah, and it is. And, like, I think, for my kids, it’s given us a bunch of teaching moments. Like, we talk about these digital assets. Like, a lot of the assets, even in the physical world or, like, in the adult world that we have, that we think are real, like, they’re just sort of imaginary. Like, they only have value, because like a whole bunch of people have decided that they have value.
Like, it’s – you have paper in your pocket, or maybe you don’t, and, you know, you’ve decided, like, this piece of paper, a dollar bill or a $10 bill – it’s only worth something because we’ve all agreed that it’s worth something, which we just sort of forget. Like, when the thing becomes so ubiquitous and, like, it’s, you know, just a commonplace thing to use it to exchange value.
But, like, in a sense, it’s really – I mean, from a 12-year-old’s perspective, without going into, you know, fiat currencies and economics and all sorts of things – like, you know, conceptually, it’s just not that much different.
RANA EL KALIOUBY: Right, totally.
KEVIN SCOTT: Well, so let’s talk a little bit about your company. So, it is yet another courageous thing, I think, to go from being an academic to being an entrepreneur. So, tell us what it was like to be a founder, like, straight out of post-doc.
RANA EL KALIOUBY: Yeah. I didn’t have a lot of examples around me, actually, and it was a steep learning curve. I mean, it’s one of the most fulfilling things I’ve done in my life, but also probably the most challenging. And one thing about, it’s back to this leap of faith, right? Like, I took a leap of faith into the unknown, probably being a little bit naïve.
I remember the very first roadmap we built, product roadmap we built, and we showed to investors, was Q1, we’re going to build the technology and bring it to the education market. Q2, we’re going to go to healthcare. Q3, we’re going to do – it was so naïve. Obviously, it took us, like, years to bring the product into one market. (Laughter.)
I remember also very early on, we were being courted by an investor, and he sent me an e-mail. And he said, “Send me your BS.” And I was like, I have no idea what this guy’s talking about. Like, the only BS I know, you can’t send in an e-mail. (Laughter.) And he, of course, meant balance sheet, but I didn’t know.
So, you know, so it was… I learned a ton. We ended up being venture backed, and, you know, we raised $50 million of strategic and venture funding. So, I figured it out on the go. But yeah, it was just, like, this crazy emotional rollercoaster.
KEVIN SCOTT: So, talk about what were some of the more challenging things that you had to deal with, because I know everybody has a slightly different experience. Like, I certainly have my own things, like lots of things that were hard, like, being at a startup.
RANA EL KALIOUBY: What was the hardest thing for you? What’s one hard thing?
KEVIN SCOTT: Well… so, I had been… I’d been around entrepreneurs my entire life. I just didn’t realize that what they were doing was the thing associated with this fancy word. And so, like, I knew a lot about what… some of what was required to do a startup. It’s like, nobody’s going to do anything for you. If you want a thing done, like, you just have to roll up your sleeves and get it done. There’s – you know, my dad was like a good example of this.
It’s like, you know, your job as a leader or, like, someone who’s, like, running a thing, is to empower other people to do what they are great at. And if that means that you’re left with sweeping the floors, or washing the dishes, or whatever it is that has to be done, like, that’s what you go do so that they can go do their best thing. So, like, that, I was already comfortable with.
But one of the hard things for me… I’d been a pre-IPO employee at Google, but, like, by the time I got there, you know, there were hundreds of engineers at the company, and, like, there was no existential risk of failure.
RANA EL KALIOUBY: Right, yeah.
KEVIN SCOTT: Like, Google was going to be fine by the time I got there. But when I went to my startup, like, it was not obvious that everything was going to be fine. Like, every day, I felt like I could come in and make some mistake that was going to end everything. And the weight of that got greater every day, when we had more and more people. So, like, I talked these really good people into coming to work for me, and, like, they had made a bet on me and the company and what we were doing together. And I just sort of… So, the hard thing for me was just figuring out how to bear the responsibility without breaking, and that was tough.
RANA EL KALIOUBY: Yeah, I can relate to that. I think the – kind of the story that captures that was two years in, we had raised, like, a small seed round. And then we were getting ready to raise a much larger round. And we had – we were basically going to be out of money, come August. And we were talking to a lot of investors, but nothing had materialized.
And we got a call from the venture arm of an intelligence agency. They said, you know, “We’re – we’ve been watching the company, love the technology. We want to put in $40 million,” which was a lot of money for us at the time. But the condition was we would pivot everything to focus on surveillance and security. And we had very strong core values around opt-in and respecting people’s privacy – it’s very personal data. We didn’t feel the technology was there yet to use it in this way.
And so, it was a very hard decision to turn that funding away, not knowing if we would survive past August, right? Like, because we didn’t have a lot of other – we didn’t have an option B. But we did turn it away, because it wasn’t in line with our core values, you know, as a company, and why we started the company.
And so, that was hard, I remember. And we hustled and we found the right investors who supported us through our journey, and they shared the same core values. But that was a big decision, because I looked – like you, I looked at my team, and I was like, oh my God, these… I might not be able to pay these people in a few months, and…
KEVIN SCOTT: Yeah. Yeah, I mean, those sorts of decisions are excruciating. And yeah, I think one of the hardest parts is, like, you just don’t know. And this is the thing in general, like, the further along you get in your career, like, you end up making these consequential decisions. And you have no idea until much later whether or not they were the right decision to make. (Laughter.)
RANA EL KALIOUBY: Right. Yeah, absolutely.
KEVIN SCOTT: Yeah.
RANA EL KALIOUBY: I think the thing I’m most proud of kind of in the Affectiva journey, and even now, is just the people, right, because you bring these people on board, they trust you, right, and kind of believe in your vision, and your strategy and roadmap. And I’m just proud because I look at these people, and they’ve developed professionally and personally in their careers. They’ve gotten married and have kids and settled. And I’m like, oh my God, this, like, makes me happy.
KEVIN SCOTT: Yeah, I totally agree with you. I think that is one of the most rewarding things about, you know, doing these things. So, I look at people that I – were on my teams at Google, people who came to work for us at AdMob, people who worked at various points with us at LinkedIn, and just to see what those folks are doing now, it just makes me enormously proud, like, maybe more than anything else that I do. Yeah, yeah, it’s awesome, you’re getting that as well.
I did want to ask you, so, you know, one of the things in technology that is not great is we’ve just had a hard time, since the 1980s, having real representation in the field. It is, you know, clearly biased in a particular set of directions. And, like, we don’t have as many amazing women in the field as we should. And, like, if you narrow down to AI, like, the representation gets even more narrow.
And so, you’re an example of a woman in AI who’s had real success. So, like, how do you think about that, so, you know, both in terms of your own personal experience and feelings, but as well as, like, you know, whether you like it or not, you’re sort of a role model now, and like, what do you do with that?
RANA EL KALIOUBY: Right. Yeah, as you can imagine, I’m super passionate about this. I will say, like, the journey of raising venture and strategic funding for Affectiva hasn’t been easy, because you’re often pitching to people who don’t at all look like me, and often haven’t seen examples of founders who look like me. So, it increases the risk factor, right, whether it’s conscious or subconscious.
So, it’s been tough, and it took me many, many years to own my voice, right, and not feel… and not feel less. I mean, I still kind of have this Debbie Downer voice in my head that basically just says, you know “you don’t fit,” right? And I have to kind of work through that.
But as a result, I now feel very passionate about paying it forward. At some point, I realized that, yes, I have an opportunity here to be a role model, because role models are powerful. When you can see somebody who looks like you who’s, you know, a few steps ahead, you can say, “wow, I can be like that person.”
And so, I take it very seriously. It’s also why I’ve started a fund. So, I don’t just invest in women or underrepresented minorities, but I definitely… you know, half of my investments so far are women-led AI companies, which is very unusual. And I just feel I – you know, I’ve been really lucky, and I have this opportunity to pay it forward.
KEVIN SCOTT: That’s so awesome. So, last couple of questions before we run out of time. So, what are you excited about in AI now? Like, where do you think we’re headed the next few years?
RANA EL KALIOUBY: I’m just really fascinated by how, with machine learning and AI and data, we can now really understand health and wellness. And, like, this intersection of AI and biology, I just think it’s going to truly transform how we think about our health and wellness. And I’m – you know, we all – we measure our health and our physical activity. We can now start tracking our internal body systems in various ways using various sensors.
But eventually, all this data needs to come together, and we’ll learn a lot. And then, with AI and predictive analytics, it can nudge you to change behavior for the better. So, I’m just really fascinated by that space.
KEVIN SCOTT: Yeah, and as am I. And I think it’s one of those places, even where the IQ of the large models is useful, like, not on its own. Like, you have to like build real products on top of it that have a bunch of the things that we talked about earlier. But yeah, I mean, the combination of, like, the powerful models, like, this ubiquitous data that we’ve got about, you know, like, what’s happening with you biometrically at any point in time, like being able to ingest all of the world’s research about, you know, yeah, I mean, it’s really fascinating to think about.
And we need it, too, right? Like – like, we have a – I don’t know what the demographics look like in Egypt. But, like, if you look across the world, it’s… you know, Italy’s population peaked years ago, and is contracting. Japan’s population peaked and is contracting. Germany’s population peaked and is contracting. I think 2022 is the year where China will reach peak population, and will be contracting from here on out.
So, like, we have this really interesting… and it’s not distant future, it’s near future, coming at us where well within our lifetimes, we will have this shift from, you know, what we have now to, like, more retired and aging folks than working age folks. You know, and aging folks will have higher healthcare needs and demands than younger people.
And so, you’ve got this dual problem of, you know, what takes up the slack in the productive workforce, and, like, what – you know, what helps people aging lead a healthy, like, dignified life? So, it’s yeah, something needs to change here. (Laughter.)
RANA EL KALIOUBY: Yeah, and this concept of lifespan and healthspan, right, how can you help people live healthier, longer. And I really do think technology is at the core of the solution, both in terms of diagnosis and personalized and precision medicine, but also in terms of care, right, like using technology and maybe home robots or home companions that can be more helpful. Obviously, a lot of ethics questions come up when we go there.
KEVIN SCOTT: Yeah. But, you know, I think this stuff that you’re talking about, like having things that… AI systems that both understand human emotion and that are able to emote in a way that lets them connect with other human beings. You know, my grandmother lives by herself. She’s 92 years old. Like, she’s well enough not to need to be in a, like, assisted care. Actually, she’s, like, robust, good health.
RANA EL KALIOUBY: Nice.
KEVIN SCOTT: But she’s alone. Like, her – you know, my grandfather died 10 or 15 years ago. I mean, she’s been by herself for a long while. And my mother talks to her five or six times a day on the phone, you know, and she has this community of people who look after her, that she’s been connected with her entire life.
But I do wonder whether there’s a role for technology to play in, like, giving her companionship, not just to make sure she doesn’t fall down, and, you know, like, an injury goes unnoticed, but, like, you know, so that she has some richness in her life, you know, with the companionship that she might not otherwise have.
RANA EL KALIOUBY: Right, yeah. Yeah, I think there’s opportunity there. But again, how do you design these in a way that’s thoughtful, you know, that isn’t spooky, and it doesn’t take away from her human-to-human connections, right?
KEVIN SCOTT: Correct.
RANA EL KALIOUBY: It’s not, like, replacing her… you know, her relationships with this robot. But it’s complementing it, yeah.
KEVIN SCOTT: Correct. Yeah, and I think in a sense, that is maybe harder than training a really big model and, you know, having it be able to pass standardized tests or, you know, do complicated mathematics or whatnot. Like, training something that has authentic, non-manipulative rapport with a human being is tough. (Laughter.)
RANA EL KALIOUBY: It’s tough. Yes, we definitely haven’t cracked that code yet.
KEVIN SCOTT: Cool. All right, so, one last question before we go, which I ask everyone. So, you have such interesting work that you do. You know, you’re a mom to two kids, so you’re super busy. But what do you do in your spare time for fun?
RANA EL KALIOUBY: For fun? Ooh, okay. So, I’m an avid Zumba dancer. I love dancing. And so, I’ve been doing it for six years. I’m very close to becoming a certified instructor.
KEVIN SCOTT: Wow.
RANA EL KALIOUBY: So, I don’t know if I’ll pivot my career to being a Zumba instructor ever, but it brings me joy. It’s good exercise, but it… most importantly, it just makes me – I always tell my team, “I’ll be a happier leader, I’ll be a better leader if you let me do my Zumba class. So, don’t schedule on top of it.”
KEVIN SCOTT: That is super, super cool. Well, thank you so much for taking time to chat with us today. And, you know, thank you for what you do. Like, it’s really inspiring to hear about your journey and to think that we’ve got someone like you off tackling these hard problems.
RANA EL KALIOUBY: Thank you so much for having me. It’s a true honor. I appreciate it.
KEVIN SCOTT: Awesome.
CHRISTINA WARREN: What a fascinating conversation with Rana el Kaliouby. So, one thing that I wanted to talk to you about, before we kind of dive into some of the Emotion AI stuff is, you know, you always start out these interviews by asking people how they got into tech.
And what really struck me with your conversation with Rana was how important it is not just, I think, to have access to certain things, but how important it is to have people who support you in your interests, because you have someone who grew up in a time and in a part of the world where not every set of parents would be as open to educating their children the way Rana’s were, certainly, you know, as open to exposing her to technology the way that they were.
And I was really struck by how important that is as an aspect, I think, especially when it comes to representation, that we sometimes omit a little bit.
KEVIN SCOTT: Yeah, I could not agree more, and I think, almost to a person, whenever you find someone who is having a successful career in technology, or in any other domain, there has to be support somewhere, because it’s so hard to be successful at anything. There’s so much pulling you in the other direction, like people telling you, you can’t do it, you know, like people trying to get you to doubt yourself, just institutional things that, like, are either aggressively or passive-aggressively pushing back against what it is you’re trying to accomplish – and then there’s just normal human emotion.
Like, you know, some of this stuff is just sort of hard. And people, I think, are naturally afraid of failing, and so, you have to figure out how to get the courage to get on top of that. And that is a lot that’s hard to deal with all on your own. So, having that support, like somebody, like, hopefully, lots of somebodies, who are there trying to help you to actually make it, I think, is really important.
And her story is fascinating, because she had a lot of things, you know, sort of that could have potentially pushed back against her achieving these really ambitious and awesome goals that she had for herself.
CHRISTINA WARREN: Absolutely, absolutely. But I think, likewise, you know, her background, but also her interests and the support she had and just, you know, the drive she has within herself, I think, makes her really well suited to doing what she’s doing, which in her own words is, you know, wanting to marry the worlds of AI and emotional intelligence – EQ and IQ, I think, is what she said, wanting to marry EQ and IQ, and AI.
One of the interesting stats that she pointed out was that, you know, 93% of how we communicate is nonverbal, but when it comes to AI, we’ve been focusing on that seven percent. And I was really thinking a lot about that, especially as, you know, the last few months, we’ve all been playing with so many various GPT-3 models. And those models, I think, have gotten really good at that seven percent, right? Like, they’re obviously not all the way there, but I think they’re pretty good at some of the written stuff.
From your perspective, as a technologist, as somebody who follows a lot of these things, from having, you know, conversations with people like Rana, what do you think we need to do next to attack that other 93%, those nonverbal cues which become so important to human interaction?
KEVIN SCOTT: Yeah, I think it’s maybe the most important question we should be asking right now. So, it’s easy for me, at least, and I think this is true for other people in technology, to get really carried away with the capability of a thing, so, like, what it can do, or, like, how it can do it. And, like, that’s certainly true with these powerful models.
Like, honestly, even though I’ve been very, very, very deep in this stuff for a really long time, and, you know, we have, in partnership with OpenAI, been, I think, on the forefront of this stuff for the past handful of years – and so, like, we can even see what we think is coming – I’ve been surprised that we accomplished as much as we did this year.
But even though all of that’s sort of a fascinating “what,” like the thing that’s really challenging with these things is, like, what you choose to go do with them. And I think this is some of what Rana is getting at.
So, part of what I hope that we have more conversations about in the coming year is how we can build all of this capability into applications, where you’re serving the user, and where you don’t actually forget the emotional element. I mean, the thing that we sometimes forget is that, you know, there are pretty sophisticated AI systems already that, you know, materialize your Twitter feed or, you know, figure out what it is that you’re going to see on TikTok when you open that app.
CHRISTINA WARREN: Yeah.
KEVIN SCOTT: And, you know, what those systems optimize for isn’t necessarily the full gamut of emotions that, you know, sort of enrich human beings. Like, a lot of it is like, oh, I’m going to show you something sensational to keep you, like, right here in this -
CHRISTINA WARREN: Right.
KEVIN SCOTT: In this experience. And so, the thing that I hope we can learn from folks like Rana is, like, how we can build these systems in ways where they’re provoking the right emotion, and where they can take emotional feedback from people to help you, yeah, help you basically feel the way that you want to feel, from interacting with your technology, not feel the way someone else wants you to feel. Like, that’s the difference between agency and manipulation.
CHRISTINA WARREN: I love that, I love that. And you’re right, that is the difference between the two. And I think that, you know, that’s certainly what Rana was talking about some of her core values and the things that, you know, she wasn’t willing to take money for, and wanting to talk more about transparency in these systems. And I think that is equally so important.
And so, I totally agree with you. And I’m excited, and I think that I’m heartened that we have people like Rana, who have the background to be able to marry these two worlds together, someone, you know, who has, you know, such a strong background in emotional intelligence, wanting to take that into AI, so that when these systems do come out, maybe we’ll do a little bit better than we did with kind of our gen 1, you know, algorithms which, you know, maybe like engage by enrage, right? Like, maybe we don’t do that in the future, which would be fantastic.
KEVIN SCOTT: Yeah. Let us hope, and maybe, you know, maybe even we use our AI systems to stand in between us in some of these things, so that, like, we can have more agency over, like, what we’re consuming and how we feel about it. (Laughter.)
CHRISTINA WARREN: Wouldn’t that be cool? Wouldn’t that be great if, like, if the AI could actually kind of be that little bit of a barrier to say, hey, I know this looks good, but this is actually going to make you feel really bad later on. Are you sure you want to do this? That would be - that would be really cool. All right –
KEVIN SCOTT: It’s like the little angel sitting on my shoulder, telling me not to eat the jelly doughnut. (Laughter.)
CHRISTINA WARREN: No, see, exactly, exactly. Kevin, we could call it Clippy. We already have the IP. And it’s like, are you sure? Are you sure you want to eat the doughnuts? And I mean, I think we’ve already got something there. I like it, I like it. (Laughter.)
KEVIN SCOTT: Yeah, me too. (Laughter.)
CHRISTINA WARREN: That is all the time we have for today. Thank you so much to Rana el Kaliouby for joining us. If you have anything that you would like to share with us any of your thoughts on emotional AI or anything else, please e-mail us anytime at [email protected]. And you can follow Behind The Tech on your favorite podcast platform or check out our full video episodes on YouTube.
Thanks for tuning in.
KEVIN SCOTT: See you next time.