Sam Ransbotham: Words matter. In particular, the words we use to describe technology within an organization matter. While IKEA product names may be hard to pronounce, the retailer is crystal clear about how it talks about technology. Today, we talk with Barbara Martin Coppola, the chief digital officer at IKEA Retail, about indirect benefits from artificial intelligence.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Ideas program at MIT Sloan Management Review.

Shervin Khodabandeh: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities across the organization and really transform the way organizations operate.
Sam Ransbotham: Today we're talking with Barbara Martin Coppola. She's chief digital officer for IKEA. Barbara, thanks for taking the time to talk with us.

Barbara Martin Coppola: Thank you for having me.

Sam Ransbotham: The title of our podcast is Me, Myself, and AI, so we tend to focus on the individual and the individual story behind technology. Let's start there. Can you describe your current role at IKEA?

Barbara Martin Coppola: I'm chief digital officer at IKEA Retail, which is the world's largest furnishing retailer, with 367 stores in 30-plus markets. And my role really is, I'm responsible for the overall digital development business for the company, as well as the digital transformation.

Sam Ransbotham: You've got a pretty significant background -- roles at Google, and YouTube, and Samsung, Texas Instruments. And then school in Spain and France. Can you connect things you've learned in some of those past roles to how you've applied them in your current role?
Barbara Martin Coppola: I have worked now in four different industries -- in semiconductors, then I went to consumer electronics, pure digital, and now I'm in retail. And in all those places, digital technology played a very, very important part. I'm connecting the dots between everything that I've learned in the past, both in business models, in different ways of organizing companies, different cultures, and leadership as well, to lead the company toward modernization, digitalization, and different business models. My time at Google, for instance, taught me what performance is, and what I mean by that is how to use agility, iterations, and measured outcomes of projects. At Samsung, which to me was the best example of execution I've seen in my life, I was based in South Korea and [saw] the power of the collective. So, how does it translate now to the environment at IKEA? Once the consensus is reached, then be clear about "OK, now we've got the decision; [let's] go."
I can go on and on, but at the end of the day, I think it's adapting to the different cultural norms and trying to bring [them] from other places and actually augment [them] for the benefit of the company, and the well-being of people, and the reward that it can be to achieve things together.

Shervin Khodabandeh: That's great. Barbara, how does AI fit into the overall digital road map?

Barbara Martin Coppola: AI is absolutely core -- essential. I really believe that data is the most important asset a company has today. AI is being applied pretty much every step of the way in the value chain of retailing. It has potential for pretty much everything, and so it is a choice of focus, and it's a choice of business outcomes and where do we put the data scientists to be creating magic, really. We focus the creation of AI toward different values or outcomes. That plus the appetite of the company to create amazing things -- knowing how AI can be a magic wand, if we want it to be, then the belief of the company that this is a key component for the success and [competitiveness] of the company matters a whole lot as well.
Sam Ransbotham: Is there a particular project that you can give us some details on that you're excited about, that's happened recently, or that your team has been involved with?

Barbara Martin Coppola: There's one, actually, that I love personally. It's about democratizing design. What do I mean by that? You know this feeling where you want to decorate something as beautiful as what you have seen in the store, but you don't know where to start? And so what we have done is just, with the comfort of your mobile phone, you can scan your room, take pictures along the way. And then, through visual AI, we actually give you back a picture, where you can move the furniture, you can delete it, and you can actually fit in 3D models of IKEA furniture that adapt to the size of your space. So imagine how powerful that is. It's visual AI. It's [from] an amazing team based in California called Geomagical Labs, and I am very, very, very excited. It's coming up in the coming months, and I think it's going to revolutionize this anxiety of filling up a space without being a decorator.
Shervin Khodabandeh: I want you to know that I am sitting at an IKEA desk that I bought 15 years ago. And it's moved with me.

Sam Ransbotham: Does it fit? You didn't know from virtual reality if it was going to fit there or not.

Shervin Khodabandeh: It's perfect. I love it so much. I arrange my house around it. And I have exactly your chair, also, which I bought with the desk.

Barbara Martin Coppola: Good. I have exactly the same setting here, actually. It's one of those that goes up and down, and so when you get tired of sitting in a Zoom [meeting], you just go up and you feel better.

Shervin Khodabandeh: But seeing that you have that makes me feel that I made the right choice.

Barbara Martin Coppola: Yes.

Shervin Khodabandeh: There must be so many other examples, particularly for a traditional retailer, that are underexplored or not even on their radar, because I look at your background, and you've been in digital-first companies where everything's been built on an understanding of digital and data and technology.
And then, at a place like IKEA or other iconic retailers, where -- I have to assume -- there is a transition or a transformation that needs to happen from old school to new school. What are some of the things you've observed, or some lessons or some advice for other retailers who are used to a different way of doing things? And now they have all this opportunity, the magic wand that you were talking about, but how do they know where to create that magic and what's the art of the possible with it?

Barbara Martin Coppola: It's a great question. I believe that AI is unlimited, and so it's really saying, "OK, what space do we want to get better at?" and have the open-mindedness to make different functions work together, especially digital functions and data scientists. I'll give you an example. IKEA went [from having] big blue stores outside of the main cities -- and that was the main business model -- to having not only a variety of different stores, but also a lot of digital touch points. And so that creates a lot of complexity. How does the flow of goods need to be operated so that the costs are not going through the roof?
And that challenge itself is so complex that it requires AI to be solved. There are many variables: There is demand forecasting; there is the size of the goods; there is the availability of items; there is price. And so, at the end of the day, AI applied to this space is pretty much the only way to operate the business in a modern way. So just with that, and [e-commerce] increasing so much during the recent years -- we have 5x'ed e-com in three years, actually, at IKEA -- we have saved, thanks to AI and enabling the stores to be fulfillment centers, the creation of 15 customer distribution centers. That is not only great economically, but it's great for the planet as well. And so there are multiple possibilities, in a positive way, [such] that when you start understanding the power of this, the demand is higher than what we can actually achieve. So then the next challenge is, how do we scale AI so that we can embed it everywhere in the company?

Shervin Khodabandeh: So to paraphrase you, it requires open-mindedness, and imagination, and focusing on "What are the things we want to do differently?"
But also, it's hard work, right? Because you've got to then get teams that are not used to working together -- the scientists and technology teams -- to work with store operators and managers. What do you think is the biggest misconception in the minds of traditional retailers, or just traditional companies, about AI?

Barbara Martin Coppola: Well, one that I think is fairly common is that AI will come and disrupt people. There is a fair amount of fear that all the knowledge that people have built -- a bit of a gut feeling managing the business -- would actually be displaced as well. But when people start to actually understand that it's augmenting them and not displacing them, and that it's at the service of human beings, and at the service of business, people really start demanding it. But it takes seeing it, using it, being in those cross-functional teams, being outside of one's comfort zone, and then being very happy to see that the positive outcome was not just a sort of black-magic technology; it was made by human beings and by this cross-functional team that created this.
So at the end, it's a human process after all.

Shervin Khodabandeh: So this digital transformation that you've taken IKEA on has probably been an even bigger cultural transformation. And you talked about this with the World Economic Forum, about how the purpose and mission and culture of IKEA will not change. But yet we're talking about [how] some elements of the culture and some openness to imagine or collaborate or rethink roles have to change. How do you navigate that balance? I mean, how do you keep the purpose and the DNA intact and infuse these radical -- sometimes radical -- changes into the company?

Barbara Martin Coppola: It's really important that people feel that their identity as a collective company does not change, and that it is rooted in the mission and the values of the company. That is somehow the compass; whatever one is facing, you always have that to come back to. It's a collective identity that is important to maintain and that normally should give you strength for actually adapting to new challenges.
And that is not easy, because adapting to new challenges at the speed of the change that we're seeing around us requires new leadership. It requires a leader who is able to not have all the answers, who is able to surround herself or himself with different skills and let go of ego to be able to listen and lead toward common achievement. And that is different from what I grew up with -- a type of leadership that was very, very self-assured and knew all the answers. And that is not what I believe is required right now to be able to succeed. So [it's] a huge change, not only in leadership but also in the culture, so that the company can move forward, adapt, and create and be happy in the process as well.

Sam Ransbotham: That "being happy" is important. One of the things that Shervin and I have just written about in a recent report is these cultural benefits from artificial intelligence. Like you said, so many people have this feeling of the fear of people [losing] jobs and "Oh, no!" -- the technology scare [that] maybe the magic wand is a dark magic and not good magic.
How do you make sure that that culture is progressing in the way that you want it to progress and that it's improving? How are you orchestrating that process?

Barbara Martin Coppola: It's interesting, because the fear [fades] when one gets closer to getting to know how the sausage is made, and how the outcome can be incredible, and how the success is collective; at the end of the day, it becomes actually a very powerful experience. And so for the people who have been exposed to what this can do for them, it becomes actually a transformation, I would say, in their own mentalities -- to move forward, to want more. At the same time, word of mouth is really important -- word of mouth about those experiences from people who are trusted in the company, who will speak to how the experiences went. [Having] the leadership that is not digital talking about it and celebrating it is really important as well. And then the vocabulary of the company changing, and how the whole management will be data-centric.
There's actually a sentence at IKEA that says, "We are people-powered and data-centric," which did not exist four years ago. And now it's one of the centerpieces, so there [are] a lot of important small signs as well as bigger strategies and, of course, talks and education and onboarding that need to happen all at the same time. And all in all, we're all human beings; we can be suspicious of the things that we don't know. When we get close to it, then we start feeling the power of knowledge, which is really great.

Sam Ransbotham: What you're describing is a virtuous cycle, then, of small improvements that lead to this word of mouth, that lead to a better appreciation and understanding and increase the knowledge -- rather than having to know everything, you're learning as you go.

Barbara Martin Coppola: Learning as you go. And this is the whole philosophy around testing, and iterating, and trying, failing, and starting again. When you think about it, it started being a digital practice, but it's now, I believe, widespread in the whole company.
It's de-risking the projects by making them really small, trying them; if they work, then you can scale. It lowers the risk, lowers the stress, and overall the company can be trying new things without the fear of having to be perfect all at once. And that is a fascinating thing to watch, to talk [about], and [see] how this [is] influencing everything from the financial way of steering the company to ways of working, all the way to creating and daring to do things that previously would take a lot more courage to do.

Shervin Khodabandeh: Yeah, and in many ways, this comfort with experimentation and imperfect results -- that's the recipe for learning. And also, you had to rely on judgment. And the other thing I really liked in what you said is, it's not a big bang of "From tomorrow, we're going to do things this way," but it's a journey. And I think you've given some very good examples of elements of that journey -- how, slowly, the hearts and minds of people will change. That's been very inspiring.

Barbara Martin Coppola: Absolutely. And if you throw in the accelerator of COVID in there, then here you go.
Imagine [it changing] from night to day, [with] the heart and the soul of IKEA, the stores, suddenly all closed. And pretty much everybody [was] transitioning to create fulfillment centers in the stores, because e-com was 10x-ing overnight. So imagine the adaptation that it takes from people who have worked in different roles to suddenly give that up and just jump into this new way of working and actually make it happen.

Sam Ransbotham: You mentioned earlier the idea of vocabulary, and I think it's interesting, the words you've chosen to talk about this. You talked about the COVID accelerator; you weren't talking about it in terms of the ... I don't know, I can think of other words that would be much more negative to frame that. It looks like you've found some ways to use that as an accelerator in what you're doing. I wanted to come back to some of what you were saying about sustainability, because I think there's a connection there as well.

Barbara Martin Coppola: Absolutely.
And to the comment that you just made, there is a quote, actually, from the founder of IKEA, Ingvar Kamprad, that says, "Never waste a good crisis." That explains a bit of the philosophy by which, in the middle of disruption, personal drama, and collective [worry], one can put the mind to saying, "Can we actually get something good out of this?" and, in spite of the fear, going, working together, throwing out all the old silos, and maybe slowness, and going into action, and actually succeeding cross-functionally at something that seemed difficult at the beginning. So overall, it's a remarkable human story of going higher in the midst of a drastic and horrible setting, really. But you had a question about sustainability.

Sam Ransbotham: What I was thinking about was, I was connecting your example of the virtual furniture, and you talked about 15 fulfillment centers that you didn't have to build. And so both of those things seem like ways that artificial intelligence can help with sustainability. First, clearly, the fulfillment example makes sense. But I was also thinking about your virtual reality one.
That's someone who isn't buying a bunch of furniture, taking it home, and deciding they don't like it, and then taking it back to the store. So I'm guessing you didn't start that project as a "Hey, we can save some carbon here," but it certainly is a nice benefit. Are there other areas where you're doing similar sorts of things?

Barbara Martin Coppola: Yes. First, I love how you think about this. I think there's definitely a beautiful benefit, sustainability-wise. It's one of the biggest priorities --

Sam Ransbotham: I'm glad just not to drag the furniture home. That's all; that's what I'm happy with.

Barbara Martin Coppola: Which is indeed another benefit. But what I would say is, sustainability is one of the core, core focus [areas] for IKEA, and so the company is transforming its business model into a circular business model. So that means that we would reuse the furniture; we would reuse the material. And when you think about that, the whole logistical aspect needs to be completely rethought. And in the midst of that comes data, traceability, and a whole new value chain that needs to be assembled together.
416 00:20:33.390 --> 00:20:35.710 It doesn't happen overnight, but it's 417 00:20:35.710 --> 00:20:38.710 happening in little chunks that are quite remarkable 418 00:20:38.710 --> 00:20:41.410 and are really cool. 419 00:20:41.410 --> 00:20:44.310 This past Black Friday, for instance, we 420 00:20:44.310 --> 00:20:48.160 allowed people to resell their old furniture to IKEA, 421 00:20:48.160 --> 00:20:50.670 and that was made [possible] through a website, [where] 422 00:20:50.670 --> 00:20:53.530 they would actually give their furniture back, 423 00:20:53.530 --> 00:20:56.970 and we would either resell it, or repurpose it, 424 00:20:56.970 --> 00:20:59.540 or just use the material again. 425 00:20:59.540 --> 00:21:03.470 Three hundred thousand pieces of furniture were resold. 426 00:21:03.470 --> 00:21:06.090 Imagine the amount of forests and material 427 00:21:06.090 --> 00:21:07.590 that that represents. 428 00:21:07.590 --> 00:21:12.610 So more and more, it's "How can we be intelligent and imagine 429 00:21:12.610 --> 00:21:16.410 a different business model, where affordability 430 00:21:16.410 --> 00:21:18.760 is equal to sustainability?" 431 00:21:18.760 --> 00:21:22.450 We do not want sustainability to be for just a few people. 432 00:21:22.450 --> 00:21:25.580 We want sustainability to be for everyone. 433 00:21:25.580 --> 00:21:27.220 And so that means that IKEA needs 434 00:21:27.220 --> 00:21:30.170 to figure out a way of making the whole value 435 00:21:30.170 --> 00:21:33.260 chain economically circular and valuable 436 00:21:33.260 --> 00:21:37.300 so that we fulfill the promise of, by 2030, not only 437 00:21:37.300 --> 00:21:40.497 [being] positive climate-wise, but also having 438 00:21:40.497 --> 00:21:41.580 a circular business model. 
439 00:21:41.580 --> 00:21:43.810 Sam Ransbotham: I'm sure you can see this 440 00:21:43.810 --> 00:21:46.300 from your sort of overview of what's 441 00:21:46.300 --> 00:21:48.080 happening within the organization, 442 00:21:48.080 --> 00:21:52.110 but what about the individual workers who are more in contact 443 00:21:52.110 --> 00:21:53.260 with customers? 444 00:21:53.260 --> 00:21:55.150 How do they sense these changes? 445 00:21:55.150 --> 00:21:57.870 I guess they can see, for example, the augmented reality 446 00:21:57.870 --> 00:21:58.380 app. 447 00:21:58.380 --> 00:22:00.120 Are there other changes that you're making 448 00:22:00.120 --> 00:22:01.738 visible to the front-line workers? 449 00:22:01.738 --> 00:22:02.780 And if so, what are they? 450 00:22:02.780 --> 00:22:06.440 Barbara Martin Coppola: Yes, there are visible changes 451 00:22:06.440 --> 00:22:08.800 to them -- not to the end consumer -- 452 00:22:08.800 --> 00:22:12.690 which is all the tools that we offer them for a much more 453 00:22:12.690 --> 00:22:14.300 efficient way of working. 454 00:22:14.300 --> 00:22:15.950 I'll give you one that we're trying 455 00:22:15.950 --> 00:22:17.900 that is actually quite cool. 456 00:22:17.900 --> 00:22:20.690 You know the IKEA cafeterias, and so people 457 00:22:20.690 --> 00:22:22.060 go with their trays. 458 00:22:22.060 --> 00:22:26.810 There is now a visual AI tool that scans the tray 459 00:22:26.810 --> 00:22:30.700 and is able to know how much the person needs to pay. 460 00:22:30.700 --> 00:22:36.120 And so the cashier is now free to interact, to counsel, 461 00:22:36.120 --> 00:22:37.550 to help people. 462 00:22:37.550 --> 00:22:40.950 And, to be honest, it's so much more rewarding 463 00:22:40.950 --> 00:22:43.880 than having the traditional cashier job. 
464 00:22:43.880 --> 00:22:48.280 And this is just another example to say, "Can we 465 00:22:48.280 --> 00:22:51.360 enable humans to do what humans do best 466 00:22:51.360 --> 00:22:55.530 and allow machines to do the more repetitive tasks, 467 00:22:55.530 --> 00:22:58.420 and liberate ourselves to have that connection 468 00:22:58.420 --> 00:23:03.020 and that humanity that we believe is very rewarding?" 469 00:23:03.020 --> 00:23:06.550 Sam Ransbotham: And I'll draw a contrast between the world that 470 00:23:06.550 --> 00:23:10.540 seems to be moving toward humanoid-looking robots 471 00:23:10.540 --> 00:23:13.213 to interact with customers, when what you've said here 472 00:23:13.213 --> 00:23:15.130 is, that's what the people working want to do. 473 00:23:15.130 --> 00:23:18.058 I think it's a beautiful tying of the function 474 00:23:18.058 --> 00:23:19.350 to the great application of it. 475 00:23:19.350 --> 00:23:21.950 Barbara Martin Coppola: It's this philosophy 476 00:23:21.950 --> 00:23:25.660 to think that technology and AI, in my opinion, 477 00:23:25.660 --> 00:23:28.350 need to be at the service of human beings. 478 00:23:28.350 --> 00:23:31.420 And so either they augment us, or they 479 00:23:31.420 --> 00:23:36.060 have this exponential benefit to get to an outcome faster. 480 00:23:36.060 --> 00:23:38.420 But at the end of the day, when you think about it, 481 00:23:38.420 --> 00:23:42.730 we control technology, we make it happen. 482 00:23:42.730 --> 00:23:46.590 And so technology is a bit of a reflection of who we are 483 00:23:46.590 --> 00:23:50.610 as humans, and that is something that we bring with us -- 484 00:23:50.610 --> 00:23:53.650 our positives and negatives -- when we do the technology. 
485 00:23:53.650 --> 00:23:56.540 And that's why putting a mirror to ourselves 486 00:23:56.540 --> 00:23:59.740 and looking at ourselves and understanding our creation 487 00:23:59.740 --> 00:24:03.300 is part of a lot of the ethical dilemmas 488 00:24:03.300 --> 00:24:05.225 that are ongoing in society at the same time. 489 00:24:05.225 --> 00:24:06.100 Sam Ransbotham: Good. 490 00:24:06.100 --> 00:24:07.725 I'd like to congratulate us all for not 491 00:24:07.725 --> 00:24:09.000 making any meatball jokes. 492 00:24:09.000 --> 00:24:11.530 Barbara Martin Coppola: Guess what: 493 00:24:11.530 --> 00:24:14.690 There is a candle with [the] scent of meatballs now. 494 00:24:14.690 --> 00:24:16.420 Believe it or not, it is true. 495 00:24:16.420 --> 00:24:19.640 Shervin Khodabandeh: Well, my parents are 88 and 87. 496 00:24:19.640 --> 00:24:24.130 And every week, their thing to do 497 00:24:24.130 --> 00:24:26.773 is, they go to the local IKEA store and have the meatballs. 498 00:24:26.773 --> 00:24:28.440 Barbara Martin Coppola: That is so cool. 499 00:24:28.440 --> 00:24:31.160 Shervin Khodabandeh: They've been doing it for years. 500 00:24:31.160 --> 00:24:32.690 Barbara Martin Coppola: I love it. 501 00:24:32.690 --> 00:24:34.610 You know, the veggie meatballs are getting a lot of traction 502 00:24:34.610 --> 00:24:35.110 now. 503 00:24:35.110 --> 00:24:37.310 And it's part of the whole sustainability movement, 504 00:24:37.310 --> 00:24:39.550 but still people prefer the meatballs. 505 00:24:39.550 --> 00:24:40.702 It's an icon of IKEA. 506 00:24:40.702 --> 00:24:41.410 It's crazy, yeah? 507 00:24:41.410 --> 00:24:43.863 Sam Ransbotham: Barbara, wonderful talking with you. 
508 00:24:43.863 --> 00:24:45.530 I think the one element that'll probably 509 00:24:45.530 --> 00:24:50.050 stick with the listeners is this indirect effect of technology, 510 00:24:50.050 --> 00:24:52.190 on things like culture that you keep mentioning, 511 00:24:52.190 --> 00:24:53.530 or on sustainability. 512 00:24:53.530 --> 00:24:55.730 We tend to think about these first-order effects 513 00:24:55.730 --> 00:24:57.500 of technology, and you've really brought 514 00:24:57.500 --> 00:24:58.750 out a lot of the second order. 515 00:24:58.750 --> 00:25:01.508 And you did characterize that technology as a magic wand, 516 00:25:01.508 --> 00:25:03.550 and I'm going to bristle a little bit about that; 517 00:25:03.550 --> 00:25:05.925 I don't like people to think that these things are magic. 518 00:25:05.925 --> 00:25:08.200 But what you focused on is more, I 519 00:25:08.200 --> 00:25:10.530 guess, the magician holding the wand rather than 520 00:25:10.530 --> 00:25:11.840 the wand itself. 521 00:25:11.840 --> 00:25:13.610 I think that's an important thing. 522 00:25:13.610 --> 00:25:15.610 Thank you for taking the time to talk with us. 523 00:25:15.610 --> 00:25:16.490 We've really enjoyed it. 524 00:25:16.490 --> 00:25:16.780 Thank you. 525 00:25:16.780 --> 00:25:17.480 Shervin Khodabandeh: Yeah. 526 00:25:17.480 --> 00:25:18.260 Thank you so much. 527 00:25:18.260 --> 00:25:18.927 It's been great. 528 00:25:18.927 --> 00:25:21.790 Barbara Martin Coppola: Thank you so much for having me. 529 00:25:21.790 --> 00:25:24.250 Sam Ransbotham: On our next episode, 530 00:25:24.250 --> 00:25:26.380 we talk with Sidney Madison Prescott, 531 00:25:26.380 --> 00:25:29.220 global head of intelligent automation at Spotify. 532 00:25:29.220 --> 00:25:30.140 Please join us. 533 00:25:30.140 --> 00:25:33.180 Allison Ryder: Thanks for listening 534 00:25:33.180 --> 00:25:34.680 to Me, Myself, and AI. 
535 00:25:34.680 --> 00:25:37.130 We believe, like you, that the conversation 536 00:25:37.130 --> 00:25:39.360 about AI implementation doesn't start and stop 537 00:25:39.360 --> 00:25:40.540 with this podcast. 538 00:25:40.540 --> 00:25:43.030 That's why we've created a group on LinkedIn, specifically 539 00:25:43.030 --> 00:25:44.140 for leaders like you. 540 00:25:44.140 --> 00:25:46.890 It's called AI for Leaders, and if you join us, 541 00:25:46.890 --> 00:25:48.910 you can chat with show creators and hosts, 542 00:25:48.910 --> 00:25:52.540 ask your own questions, share insights, and gain access 543 00:25:52.540 --> 00:25:55.020 to valuable resources about AI implementation 544 00:25:55.020 --> 00:25:57.120 from MIT SMR and BCG. 545 00:25:57.120 --> 00:26:02.190 You can access it by visiting mitsmr.com/AIforLeaders. 546 00:26:02.190 --> 00:26:04.960 We'll put that link in the show notes, 547 00:26:04.960 --> 00:26:07.390 and we hope to see you there. 548 00:26:07.390 --> 00:26:13.000