WEBVTT

1
00:00:00.840 --> 00:00:03.120
If you look at the way you use wearables,

2
00:00:03.120 --> 00:00:06.090
you feel like you have a little bit of a helper,

3
00:00:06.090 --> 00:00:07.120
something that's reminding you

4
00:00:07.120 --> 00:00:08.823
to take steps or take a call.

5
00:00:10.070 --> 00:00:12.420
You look at that trend of health and wellness

6
00:00:12.420 --> 00:00:14.560
and starting to add the ability

7
00:00:14.560 --> 00:00:16.830
to track every data point

8
00:00:16.830 --> 00:00:20.490
on how you're feeling, your emotion,

9
00:00:20.490 --> 00:00:23.760
your heart rate, where you're going every day.

10
00:00:23.760 --> 00:00:26.010
This is where I think there's been some fear.

11
00:00:27.390 --> 00:00:29.555
Ethics is a big question.

12
00:00:36.440 --> 00:00:37.710 line:15%
How should leaders think

13
00:00:37.710 --> 00:00:42.470 line:15%
about the ethical impact that their data has?

14
00:00:42.470 --> 00:00:46.547
Ten years ago, our majors in our educational systems

15
00:00:46.547 --> 00:00:48.163
were typically split.

16
00:00:49.780 --> 00:00:52.770
You had computer science, philosophy,

17
00:00:52.770 --> 00:00:54.903
economics, anthropology.

18
00:00:56.740 --> 00:00:58.530
Look at the financial collapse.

19
00:00:58.530 --> 00:01:01.910
This was a combination of society,

20
00:01:01.910 --> 00:01:05.600
economics, philosophy, humanity,

21
00:01:05.600 --> 00:01:08.080
reactions to things, emotion, right?

22
00:01:08.080 --> 00:01:10.263
It wasn't just one thing in a silo.

23
00:01:11.510 --> 00:01:14.030
What I'm really beginning to see as a major shift

24
00:01:14.030 --> 00:01:18.160
is the appreciation of merging those disciplines together

25
00:01:18.160 --> 00:01:22.390
to really understand how data and AI and ML

26
00:01:22.390 --> 00:01:24.293
are really going to impact the world.

27
00:01:26.800 --> 00:01:29.180
Right now, a lot of times we build technology

28
00:01:29.180 --> 00:01:31.650
for reasons that are great reasons

29
00:01:31.650 --> 00:01:34.300
but sometimes a little shortsighted.

30
00:01:34.300 --> 00:01:35.970
But if every person in a company

31
00:01:35.970 --> 00:01:39.050
is thinking of questions that are bigger than themselves,

32
00:01:39.050 --> 00:01:41.620
questions like making sure we have drinking water,

33
00:01:41.620 --> 00:01:42.930
questions like making sure

34
00:01:42.930 --> 00:01:45.400
the company survives a hundred years,

35
00:01:45.400 --> 00:01:48.893
how, ethically, these things are going to impact the world.

36
00:01:50.550 --> 00:01:52.490
It's not just about what they're building,

37
00:01:52.490 --> 00:01:54.050
it's who they're building it for,

38
00:01:54.050 --> 00:01:56.610
it's what societal impacts it has.

39
00:01:56.610 --> 00:01:58.100
Is it good for sustainability?

40
00:01:58.100 --> 00:02:00.300
And we're doing some of that now

41
00:02:00.300 --> 00:02:01.870
but it really is about bringing that

42
00:02:01.870 --> 00:02:03.960
into the culture and the ways of working

43
00:02:03.960 --> 00:02:05.640
versus just having the very kind of

44
00:02:05.640 --> 00:02:07.603
top-line CEO think about it.

45
00:02:10.340 --> 00:02:14.100
Once companies embrace ethics

46
00:02:14.100 --> 00:02:17.970
and really make it a part of their way of working,

47
00:02:17.970 --> 00:02:20.450
that is really when I think we've cracked the code

48
00:02:20.450 --> 00:02:24.883
in figuring this AI in the next decade question out.