WEBVTT

1
00:00:00.310 --> 00:00:03.630
We have built, using only a camera and a watch,

2
00:00:03.630 --> 00:00:05.740
or using only a smartphone and a watch,

3
00:00:05.740 --> 00:00:09.810
the ability to monitor and forecast your stress,

4
00:00:09.810 --> 00:00:14.810
mood, and health with 78-87% accuracy already.

5
00:00:15.030 --> 00:00:16.340
Artificial intelligence

6
00:00:16.340 --> 00:00:19.160
has more computational power than ever.

7
00:00:19.160 --> 00:00:21.110
And that's enabling huge strides

8
00:00:21.110 --> 00:00:24.210
in new sciences like affective computing.

9
00:00:24.210 --> 00:00:26.770
Rosalind Picard at the MIT Media Lab

10
00:00:26.770 --> 00:00:29.740
is at the cutting edge of this new technology.

11
00:00:29.740 --> 00:00:32.890 line:15%
Affective computing is defined to be computing

12
00:00:32.890 --> 00:00:35.500 line:15%
that relates to, arises from,

13
00:00:35.500 --> 00:00:39.190 line:15%
or deliberately influences human emotion.

14
00:00:39.190 --> 00:00:42.890
It's actually intended to be an area

15
00:00:42.890 --> 00:00:47.830
that puts affect on the same playing field as cognition.

16
00:00:47.830 --> 00:00:50.340
We've had a lot of work on cognitive science,

17
00:00:50.340 --> 00:00:52.990
computing that recognizes logic,

18
00:00:52.990 --> 00:00:54.493
and reason and thinking.

19
00:00:55.330 --> 00:01:00.020
It's not about making computers emotional and unpleasant,

20
00:01:00.020 --> 00:01:02.470
and all these things that would be a nightmare.

21
00:01:02.470 --> 00:01:04.300
It's very much about bringing together

22
00:01:04.300 --> 00:01:07.570
a proper balance between affect and cognition.

23
00:01:07.570 --> 00:01:09.110
An affective state

24
00:01:09.110 --> 00:01:13.000
is maybe most simply described as an emotional state.

25
00:01:13.000 --> 00:01:16.420
But often it's more subtle than an emotional state.
26
00:01:16.420 --> 00:01:20.370
It could be simply that you're a little agitated,

27
00:01:20.370 --> 00:01:23.090
a little wound up, a little mellow,

28
00:01:23.090 --> 00:01:25.160
a little bit in an irritable mood,

29
00:01:25.160 --> 00:01:26.890
or a little bit in a happy mood.

30
00:01:26.890 --> 00:01:30.060
It may not look like anything outwardly.

31
00:01:30.060 --> 00:01:31.400
Part of the complexity

32
00:01:31.400 --> 00:01:34.460
is we have oversimplified notions of emotion.

33
00:01:34.460 --> 00:01:36.870
For example, we might think a smile means happy,

34
00:01:36.870 --> 00:01:39.690
but it turns out that 90% of the time when people

35
00:01:39.690 --> 00:01:42.700
are frustrated in our studies, they smile.

36
00:01:42.700 --> 00:01:45.160
And in fact, they don't just smile with like a mean smile.

37
00:01:45.160 --> 00:01:47.380
They actually smile with a "true smile"

38
00:01:47.380 --> 00:01:49.903
of happiness that psychologists said means happy.

39
00:01:52.010 --> 00:01:53.980
Reading emotions isn't easy,

40
00:01:53.980 --> 00:01:56.270
especially if you're a computer.

41
00:01:56.270 --> 00:01:57.870
The nuances of biometrics

42
00:01:57.870 --> 00:02:00.650
have been just too difficult to analyze.

43
00:02:00.650 --> 00:02:02.760
AI has changed that.

44
00:02:02.760 --> 00:02:04.620
When we're training the machine learning,

45
00:02:04.620 --> 00:02:06.900
we always say we need more data, we need more data.

46
00:02:06.900 --> 00:02:08.170
And then my students were like,

47
00:02:08.170 --> 00:02:09.670
oh, I want to build this mood recognizer.

48
00:02:09.670 --> 00:02:11.880
I'm like, oh, I think that's too hard.

49
00:02:11.880 --> 00:02:12.713
Well, it turned out,

50
00:02:12.713 --> 00:02:14.180
it really is hard when you just try

51
00:02:14.180 --> 00:02:15.560
to build a mood recognizer.
52
00:02:15.560 --> 00:02:16.660
But it turns out that if you try

53
00:02:16.660 --> 00:02:20.350
to build a mood and stress and health recognizer,

54
00:02:20.350 --> 00:02:22.900
ironically, if you actually make it even harder,

55
00:02:22.900 --> 00:02:25.110
if you give it three categories to learn,

56
00:02:25.110 --> 00:02:26.570
instead of just one,

57
00:02:26.570 --> 00:02:28.703
the machine learning actually does better.

58
00:02:29.720 --> 00:02:30.950
Kind of counterintuitive,

59
00:02:30.950 --> 00:02:32.600
but what it's actually doing inside

60
00:02:32.600 --> 00:02:34.500
is it's kind of exploiting the similarities

61
00:02:34.500 --> 00:02:37.560
and differences in these related tasks.

62
00:02:37.560 --> 00:02:39.680
And that's helping it do better at stress,

63
00:02:39.680 --> 00:02:41.420
at mood, and at health.

64
00:02:41.420 --> 00:02:44.090
We've now realized that we've got

65
00:02:44.090 --> 00:02:47.520
the ability to forecast and recognize tomorrow's,

66
00:02:47.520 --> 00:02:50.460
not just stress level, but mood level.

67
00:02:50.460 --> 00:02:52.483
Good/bad mood, very coarse.

68
00:02:53.951 --> 00:02:54.784
And health.

69
00:02:54.784 --> 00:02:55.617
Are you going to be healthy?

70
00:02:55.617 --> 00:02:57.890
Are you going to be likely calling in sick

71
00:02:57.890 --> 00:03:01.069
and kind of on a zero to 100 scale estimating where you are,

72
00:03:01.069 --> 00:03:04.053
with about 12 to 14 points of accuracy.

73
00:03:04.920 --> 00:03:07.640
So, with affective computing,

74
00:03:07.640 --> 00:03:10.530
AI can detect and predict emotions.

75
00:03:10.530 --> 00:03:12.410
What else can it do?
76
00:03:12.410 --> 00:03:15.170
I think one of the most exciting revelations

77
00:03:15.170 --> 00:03:18.040
we've had lately is instead of just trying

78
00:03:18.040 --> 00:03:21.460
to build an AI that empathizes and has all of

79
00:03:21.460 --> 00:03:25.620
this emotional intelligence, I'm much more excited now

80
00:03:25.620 --> 00:03:27.640
by seeing how we can build AIs

81
00:03:27.640 --> 00:03:30.010
that work with people, helping people

82
00:03:30.010 --> 00:03:32.230
be more emotionally intelligent.

83
00:03:32.230 --> 00:03:35.070
There's kind of a sweet spot where

84
00:03:35.070 --> 00:03:39.890
the AI is good at helping us detect maybe

85
00:03:39.890 --> 00:03:42.490
where we're not doing what we want to be doing.

86
00:03:42.490 --> 00:03:44.880
It's at that moment of stress that they need

87
00:03:44.880 --> 00:03:47.320
the technology to help maybe give them

88
00:03:47.320 --> 00:03:50.470
that little gentle nudge or reminder.

89
00:03:50.470 --> 00:03:53.580
And then you're empowering the person

90
00:03:53.580 --> 00:03:56.260
to actually take their learning to a whole new level.

91
00:03:56.260 --> 00:03:58.260
It's absolutely a great way to think about

92
00:03:58.260 --> 00:04:02.530
it as a life coach, as a just-help-you-up-

93
00:04:02.530 --> 00:04:04.110
your-game kind of coach.

94
00:04:04.110 --> 00:04:08.160
And what we wind up with is people who are just more amazing

95
00:04:08.160 --> 00:04:11.160
to work with and people who feel better at

96
00:04:11.160 --> 00:04:14.360
the end of the day and people who accomplish a lot more

97
00:04:14.360 --> 00:04:18.623
and have much greater opportunities tomorrow as well.

98
00:04:23.640 --> 00:04:25.850
So thanks to affective computing,

99
00:04:25.850 --> 00:04:28.080
AI can forecast your stress level,

100
00:04:28.080 --> 00:04:31.890
your mood, and your health, just like your weather app.
101
00:04:31.890 --> 00:04:34.080
Should you take on that big challenge

102
00:04:34.080 --> 00:04:36.960
or stay home until the storm passes?

103
00:04:36.960 --> 00:04:39.310
AI can help you decide.

104
00:04:39.310 --> 00:04:42.120
But what does this mean for organizations

105
00:04:42.120 --> 00:04:44.050
now that they can use AI to gather

106
00:04:44.050 --> 00:04:47.630
and analyze real-time data on teams' emotions?

107
00:04:47.630 --> 00:04:48.890
In our next video,

108
00:04:48.890 --> 00:04:50.957
Roz describes how some organizations

109
00:04:50.957 --> 00:04:53.340
are already using affective computing

110
00:04:53.340 --> 00:04:56.600
to improve team performance and employees' well-being.

111
00:04:56.600 --> 00:05:00.100
That positive mood can lead their entire brain

112
00:05:00.100 --> 00:05:02.200
to think differently.

113
00:05:02.200 --> 00:05:04.700
But she also points out potential pitfalls

114
00:05:04.700 --> 00:05:05.700
to avoid.

115
00:05:05.700 --> 00:05:07.440
Even if you have the best of intentions

116
00:05:07.440 --> 00:05:09.690
in observing their stress, this can backfire.