Meet speaker - Rana el Kaliouby
What if our high-tech devices could decipher and respond to our emotions? Rana el Kaliouby, Ph.D., is on a mission to humanize technology with artificial emotional intelligence, or what she calls “Emotion AI.” She envisions and works to create a world where devices and technologies that once separated us will bring us together. It’s the next frontier in artificial intelligence, with commercial applications across industries – and it’s closer than we think.
A computer scientist, technologist, entrepreneur and business leader, Dr. el Kaliouby believes that “humanizing technology gives us a golden opportunity to re-imagine how we connect with machines, and, therefore, how we connect with each other.” Co-founder and CEO of Affectiva, the pioneer of Emotion AI, she invented the company’s award-winning emotion recognition technology. The Emotion AI platform combines facial expression and tone of voice to infer how a person is feeling, using deep learning and the world’s largest emotion data repository, built from more than five million faces analyzed across 75 countries. Teaching machines to measure and interpret human emotions has the potential to dramatically improve lives, with powerful applications such as helping doctors and nurses deliver better care, engaging students and personalizing their learning experience, and improving road safety through “emotionally aware” vehicles. It also promises to change the rules of consumer engagement forever by providing real-time insight into viewers’ emotional responses to brands, ads and other digital content. Always evolving in her work, Dr. el Kaliouby is now pioneering Human Perception AI, technology that can understand all things human.
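To make that multimodal idea concrete, here is a minimal illustrative Python sketch (not Affectiva’s actual system or API) in which two hypothetical models score the same set of emotions from face and voice, and a simple weighted average fuses the scores into one estimate.

# Illustrative sketch of multimodal emotion inference (late fusion of face and voice).
# The per-modality scores below are made up; a real system would obtain them from
# trained facial-expression and vocal-tone models, which are not shown here.

EMOTIONS = ["joy", "anger", "sadness", "surprise", "fear", "neutral"]

def fuse_emotion_scores(face_scores, voice_scores, face_weight=0.6):
    """Combine per-emotion probabilities from two modalities into one estimate."""
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (face_weight * face_scores.get(emotion, 0.0)
                          + (1.0 - face_weight) * voice_scores.get(emotion, 0.0))
    return max(fused, key=fused.get), fused

# Example with made-up scores from the two hypothetical models:
face = {"joy": 0.70, "neutral": 0.20, "surprise": 0.10}
voice = {"joy": 0.40, "neutral": 0.50, "sadness": 0.10}
label, scores = fuse_emotion_scores(face, voice)
print(label, round(scores[label], 2))   # joy 0.58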
As her popular TED Talk made apparent, Dr. el Kaliouby has the ability to simplify complex science and make it accessible to any audience. Her credentials as a popular spokesperson for science led to her co-hosting the PBS NOVA series “NOVA Wonders,” which premiered in April 2018. Founder of her own technology company, accomplished inventor, and one of few women to have successfully transitioned from academia to business, Dr. el Kaliouby and her work have received considerable recognition. A passionate advocate for innovation, ethics in AI and diversity, she was named to the 2018 Thinkers50 Radar list of management thinkers most likely to shape the future of how organizations are managed and led; Fortune named her to its 2018 40 Under 40 list of the most influential young people in business; Forbes named her one of America’s Top 50 Women in Tech; Inc. named her to its Female Founders 100 list; BostInno named her to its 50 on Fire list; and WIRED named her one of 25 Geniuses Who Are Creating the Future of Business. Recently inducted into the Women in Engineering Hall of Fame, Dr. el Kaliouby was also selected as a World Economic Forum (WEF) Young Global Leader and serves on WEF’s Global Future Council on AI and Robotics. She is often cited in and interviewed by top business and mainstream outlets, including The New Yorker, Wired, Forbes, Fast Company, The Wall Street Journal, The New York Times, TIME, Fortune, CNN, CBS and Reddit.
Prior to founding Affectiva, Dr. el Kaliouby was a research scientist at the MIT Media Lab, where she spearheaded applications of facial coding in mental health, autism and other research areas. Born and raised in Cairo, she received Bachelor of Science and Master of Science degrees in computer science from the American University in Cairo and a Ph.D. from the Computer Laboratory at the University of Cambridge.
Suggested Speaking Topics
How to Build AI That You Can Trust
As AI becomes more sophisticated, it will increasingly perform or assist with important tasks traditionally done by humans – from driving cars to teaching our children and caring for the sick and elderly. Given the nature of these roles, we must be able to trust the AI systems we interact with. It’s paramount that developers building these systems carry out their work with these considerations in mind and create the next generation of AI with trust as its cornerstone. Dr. Rana el Kaliouby, a leading pioneer in Human Perception AI, argues that the basis of trust is empathy, and that making AI empathetic is key to having humans trust its competence. In this presentation, Dr. el Kaliouby outlines her “5 Tenets of the Social Contract Between AI and Humans.” She argues that to avoid systemic bias, diversity must be reflected not only in the data used to build the AI capabilities of the future, but also in the teams developing those systems. More specifically, AI has to be attuned to the needs of humans, understanding our complex emotions and cognitive states and adapting to them appropriately. This framework will be essential to the companies and individuals forging the future, ensuring they make AI less artificial and more human.
How Emotion AI Could Transform Health Care
AI-based virtual assistants are evolving quickly, and more and more effort and resources are now being put toward making them emotionally intelligent – able to pick up on subtle cues in speech, inflection, gesture and expression to assess a person’s health and wellbeing. Nurse avatars and social robots – such as Catalia Health’s Mabu, an emotion-aware personal care companion for terminally ill patients – are already proving their ability to enhance patient-centered health care environments around the world, making routine processes more efficient and improving experiences and outcomes. But their full potential is even more profound, and far from realized, believes Dr. Rana el Kaliouby.
Emotion AI, she says, provides an opportunity to transform numerous aspects of health and wellbeing, from telemedicine and drug efficacy to mental health research and autism support. By analyzing vocal tone or facial expressions, for example, Emotion AI-enabled platforms could detect depression, or potentially even underlying chronic conditions such as heart disease. In telemedicine, mood-aware wearables could help monitor mental health and wellbeing, perhaps helping people with depression track their emotional state and share that data with their doctor.
Our emotional wellbeing and our health are intertwined, Dr. el Kaliouby explains. She discusses the future of health care through the lens of Emotion AI, delving into how far it has come and envisioning how far it can still go. Its promise is powerful.
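As a purely illustrative sketch of that wearable scenario (the daily mood scores would come from an emotion-sensing device, not supplied by hand as they are here), a mood log might be summarized for a clinician like this:

# Hypothetical mood log for a wearable: store daily mood scores (0 = very low,
# 1 = very positive) and summarize them for sharing with a clinician.
from statistics import mean

def summarize_week(daily_scores):
    """Return a small summary a patient might choose to share with their doctor."""
    return {
        "average_mood": round(mean(daily_scores), 2),
        "lowest_day": daily_scores.index(min(daily_scores)) + 1,
        "days_below_0_3": sum(1 for s in daily_scores if s < 0.3),
    }

# One week of simulated readings from a hypothetical emotion-sensing wearable:
week = [0.6, 0.5, 0.2, 0.4, 0.3, 0.7, 0.6]
print(summarize_week(week))
# {'average_mood': 0.47, 'lowest_day': 3, 'days_below_0_3': 1}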
Artificial Emotional Intelligence (Emotion AI) – What It Is and Why It Matters
Today’s devices work hand-in-hand with humans – at work, home, school and play. Dr. Rana el Kaliouby believes they can do much more. An expert in artificial emotional intelligence, or “Emotion AI,” Dr. el Kaliouby explores the valuable applications of humanized technology in media and advertising, gaming, automotive, robotics, health, education and more. She explains how machine learning works, explores the potential for the development of emotion chips, and addresses the ethics and privacy issues of Emotion AI. In her talks, Dr. el Kaliouby gives participants an inside look at the world’s largest emotion data repository, shares results from her research studying more than 5 million faces around the world, and reveals that the emoji mindset may soon be a thing of the past.
Emotion AI and the Future of Work
With any technological advance and automation of tasks, it’s inevitable that some jobs will change or disappear. At the same time, however, artificial intelligence (AI) will create many new types of jobs that did not exist before, augment our abilities and take over menial tasks, freeing humans to do higher-level strategic thinking and decision making. Computer scientist and AI expert Dr. Rana el Kaliouby envisions a world in which devices interact with us the same way we interact with one another – through conversation, perception and emotion. Before long, we won’t even remember when our relationship with technology was any different. Dr. el Kaliouby, who is building Emotion AI today, is working to ensure that in the near future our devices will have an emotion chip that can sense and react to people’s cognitive and emotional states with accuracy and empathy. Our mobile phones, cars, smart TVs, personal digital assistants and even our household appliances will be able to sense our emotional and cognitive states.
In this presentation, el Kaliouby reveals eight major ways that emotion AI will transform the future of work – from productivity companions and human-robot teams, to a complete re-imagination of our travel experiences and how we connect with one another – showing business leaders how new jobs will be created and why they must rethink how employees are trained (and re-trained) for the constantly evolving technology and business landscape.
Emotion AI Takes a Front Seat in Cars of the Future
Recent innovations around the autonomous car are shaking up the automotive industry. Beyond the issues of driverless or semi-autonomous vehicles, cars of the future are undergoing a fundamental shift in human-machine interaction. Consumers today crave more relational and conversational interactions with devices, as evidenced by the popularity of chatbots and virtual assistants like Siri and Alexa. Next-generation cars are emerging with advanced artificial intelligence systems that include conversational interfaces between the driver, the passengers, the vehicle itself and its controls – all connected to the Internet of Things (IoT) and to the mobile devices people use. Leveraging emotion recognition technology that senses and analyzes expressions of emotion, cars will soon be able to perceive our reactions and moods and respond accordingly. Dr. Rana el Kaliouby shows how these “emotionally aware” vehicles will benefit the automotive industry and consumers in a number of ways, such as improving road safety and enhancing the in-car experience through deep personalization.
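As a hypothetical illustration of that safety idea (not any automaker’s or Affectiva’s production system), the short Python sketch below maps a drowsiness estimate, which a real camera-based model would produce, to a graduated in-car response.

# Hypothetical in-cabin monitoring rule: escalate the vehicle's response as a
# drowsiness estimate (from a stand-in facial-analysis model) rises.

def respond_to_driver_state(drowsiness: float) -> str:
    """Map a drowsiness estimate in [0, 1] to a graduated in-car response."""
    if drowsiness < 0.3:
        return "no action"
    if drowsiness < 0.6:
        return "gentle alert: brighten cabin, suggest a break"
    if drowsiness < 0.85:
        return "strong alert: audio warning, tighten lane-keeping assist"
    return "critical: slow down and guide the vehicle to a safe stop"

# Simulated readings from a hypothetical camera-based model:
for reading in (0.1, 0.45, 0.7, 0.9):
    print(reading, "->", respond_to_driver_state(reading))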
Marketing: How Emotion Analytics Drive Deeper Connections with Consumers
The marketing holy grail is building emotional connections with consumers while creating personalized experiences unique to the brand. Yet marketers struggle to measure the success of their efforts. Dr. Rana el Kaliouby helps them learn to leverage emotion technology to strike the right chord. She points to examples drawn from the more than 1,400 brands already using it to measure and analyze how consumers respond to digital content such as videos, ads and even TV shows. She explains why – and how – emotion data helps media companies, brands and advertisers improve their advertising and media spend and predict sales uplift and virality. She also discusses experiential marketing and how adding real-time emotion awareness to digital and retail experiences makes such campaigns so successful. A case in point: Hershey’s “smile for a sample” machines, which use facial coding technology to dispense candy bars in exchange for smiles.
IoT: Mood-Aware Connected Home Devices that Can Make Life Better
The number of connected devices on and around us is growing exponentially (Gartner analysts project it will reach 25 billion by 2020). Ranging from wearables and cars to TVs, refrigerators, mirrors and social robots, these devices are designed to bring about positive behavior change, persuading or motivating users to do things differently, better, faster or more efficiently. But to be most effective, they need to be perceptual and in tune with our emotions, says Dr. Rana el Kaliouby. Imagine a fridge that works with you to eat healthier, or a wearable device and a television that team up to get you off the couch. Your bathroom mirror senses you’re a bit stressed, adjusts the lighting and turns on the right mood-enhancing music.
Dr. el Kaliouby demonstrates how mood-aware machines have the potential to transform our homes and major industries. Often asked what the future holds for mood-aware technology, she gives a simple answer: it will be ubiquitous, ingrained in the technologies we use every day, running in the background and making our interactions with technology more personalized, relevant, authentic and interactive.
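As a rough, hypothetical sketch of that background behavior (the mood label would come from an emotion-sensing device; here it is simply assumed), a mood-aware home might map a detected state to scene settings like this:

# Hypothetical mood-to-scene mapping for a connected home. The detected mood
# would come from an emotion-sensing device; in this sketch it is hard-coded.

SCENES = {
    "stressed": {"lights": "warm, dimmed to 40%", "music": "calm playlist",   "thermostat_c": 22},
    "tired":    {"lights": "soft white",          "music": "off",             "thermostat_c": 21},
    "happy":    {"lights": "bright",              "music": "upbeat playlist", "thermostat_c": 22},
}

def apply_scene(mood: str) -> dict:
    """Return the home settings for a detected mood, defaulting to no change."""
    return SCENES.get(mood, {"lights": "unchanged", "music": "unchanged", "thermostat_c": None})

print(apply_scene("stressed"))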
Personalized Learning that Leads to Dramatically Improved Outcomes
Artificial intelligence and Emotion AI bring us closer to realizing personalized education with the potential to dramatically improve learning outcomes. Today, in online learning environments, it’s not until students take a test that the instructor knows whether the instruction was effective; corrective action comes after the fact. But what if intelligent learning systems could provide a personalized learning experience? Such a system would know the individual and their unique learning style, sense levels of engagement or frustration, and adapt in real time. Dr. Rana el Kaliouby shares how these intelligent systems could offer a different explanation when a student is frustrated, slow down a bit in moments of confusion, and tell a joke when it’s time to have some fun – just the way a great teacher would.
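A toy sketch of that adaptation loop, with the engagement and frustration estimates assumed rather than inferred from a real emotion-sensing model, might look like this:

# Toy adaptive-tutoring rule: choose the next teaching action from estimated
# engagement and frustration (values a real system would infer from facial
# expressions; here they are simply passed in).

def next_action(engagement: float, frustration: float) -> str:
    """Pick a tutoring action from learner-state estimates in [0, 1]."""
    if frustration > 0.7:
        return "offer a different explanation and an easier worked example"
    if frustration > 0.4:
        return "slow down and recap the last step"
    if engagement < 0.3:
        return "tell a joke or switch to an interactive exercise"
    return "continue with the current lesson"

# Simulated (engagement, frustration) learner states:
for state in ((0.8, 0.1), (0.5, 0.5), (0.2, 0.2), (0.6, 0.9)):
    print(state, "->", next_action(*state))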