WEBVTT
1
00:00:00.120 --> 00:00:02.359
The topics and opinions expressed in the following show are
2
00:00:02.359 --> 00:00:04.000
solely those of the hosts and their guests and not
3
00:00:04.040 --> 00:00:06.919
those of W4CY Radio, its employees, or affiliates. We
4
00:00:07.000 --> 00:00:10.080
make no recommendations or endorsements for radio show programs, services,
5
00:00:10.119 --> 00:00:12.439
or products mentioned on air or on our website. No
6
00:00:12.560 --> 00:00:15.119
liability, explicit or implied, shall be extended to W4CY
7
00:00:15.160 --> 00:00:17.920
Radio or its employees or affiliates. Any questions or
8
00:00:17.960 --> 00:00:20.399
comments should be directed to the show hosts. Thank you
9
00:00:20.440 --> 00:00:25.480
for choosing W4CY Radio.
10
00:00:28.199 --> 00:00:31.600
What's Working on Purpose, anyway? Each week we ponder the
11
00:00:31.640 --> 00:00:34.799
answer to this question. People ache for meaning and purpose
12
00:00:34.840 --> 00:00:38.200
at work, to contribute their talents passionately and know their
13
00:00:38.240 --> 00:00:41.600
lives really matter. They crave being part of an organization
14
00:00:41.679 --> 00:00:44.399
that inspires them and helps them grow into realizing their
15
00:00:44.479 --> 00:00:47.520
highest potential. Business can be such a force for good
16
00:00:47.560 --> 00:00:51.200
in the world, elevating humanity. In our program, we provide
17
00:00:51.200 --> 00:00:54.240
guidance and inspiration to help usher in this world we
18
00:00:54.320 --> 00:00:58.600
all want. Working on Purpose. Now here's your host, Doctor
19
00:00:58.640 --> 00:00:59.719
Elise Cortes.
20
00:01:05.079 --> 00:01:07.000
Welcome back to the Working on Purpose program, which has
21
00:01:07.000 --> 00:01:08.959
been brought to you with passion and pride since February
22
00:01:09.040 --> 00:01:11.120
twenty fifteen. Thanks for tuning in this week. Great
23
00:01:11.159 --> 00:01:13.280
to have you. I'm your host, Doctor Elise Cortes.
24
00:01:13.879 --> 00:01:16.680
Most leaders are sitting on untapped human energy. I help
25
00:01:16.719 --> 00:01:19.519
them unlock it. That's my jam. I'm an organizational psychologist,
26
00:01:19.640 --> 00:01:22.239
logotherapist, workforce advisor, and the founder of the
27
00:01:22.280 --> 00:01:24.879
Gusto Now movement. But my real title that I go
28
00:01:24.959 --> 00:01:27.040
by pretty much every day is I just simply traffic
29
00:01:27.079 --> 00:01:30.319
in energy. Not the surface kind that a motivational or
30
00:01:30.400 --> 00:01:34.439
another engagement initiative offers, but the deeper force, what I call Gusto,
31
00:01:34.680 --> 00:01:39.040
the life force for performance that drives commitment, perseverance, and
32
00:01:39.120 --> 00:01:41.640
genuine ownership of a shared mission in the clients that
33
00:01:41.680 --> 00:01:44.519
we serve. You can learn more about how we work
34
00:01:44.560 --> 00:01:47.920
and how we can work together at gusto dash now dot com
35
00:01:48.079 --> 00:01:50.400
or my personal site is elisecortes dot com
36
00:01:50.519 --> 00:01:54.280
for information on speaking and books. On today's program we have
37
00:01:54.319 --> 00:01:57.120
back with us Steve Brown, a leading voice in the
38
00:01:57.239 --> 00:02:00.879
conversation on artificial intelligence. He is a former executive
39
00:02:00.879 --> 00:02:04.680
at Google DeepMind and Intel and has delivered hundreds of engaging,
40
00:02:04.719 --> 00:02:08.800
information-rich keynotes across five continents, inspiring audiences to
41
00:02:08.840 --> 00:02:12.479
take action with AI. He's the author of The Innovation Ultimatum,
42
00:02:12.719 --> 00:02:16.599
How Six Strategic Technologies Will Reshape Every Business in the
43
00:02:16.599 --> 00:02:19.280
twenty twenties, which we talked about in an earlier episode.
44
00:02:19.479 --> 00:02:22.919
Today we're talking about his newest book, The AI Ultimatum,
45
00:02:23.039 --> 00:02:26.439
Preparing for a World of Intelligent Machines and Radical Transformation.
46
00:02:26.800 --> 00:02:29.680
He joins us from Portland, Oregon. Steve, a hearty welcome
47
00:02:29.719 --> 00:02:30.960
back to Working on Purpose.
48
00:02:31.759 --> 00:02:33.120
Elise, lovely to be with you again.
49
00:02:33.840 --> 00:02:35.479
You know, as I told you in our exchange before
50
00:02:35.479 --> 00:02:37.840
we got on air, this is magnificent. It is a
51
00:02:37.919 --> 00:02:40.479
triumph, Steve Brown. You know this. The world needs
52
00:02:40.479 --> 00:02:45.199
to read this book, stat. So you're welcome. I might
53
00:02:45.240 --> 00:02:48.800
be a fan, I don't know, so.
54
00:02:49.039 --> 00:02:51.800
It was a labor of love, for sure. I'm glad you enjoyed it.
55
00:02:51.919 --> 00:02:55.319
I appreciate that, you know, as a fellow author, I
56
00:02:55.360 --> 00:02:58.120
appreciate when somebody really reads my books too, and or
57
00:02:58.280 --> 00:03:00.599
my book and has something to say and links to it.
58
00:03:00.639 --> 00:03:02.639
And I really really think there's so much in this.
59
00:03:02.800 --> 00:03:06.120
So I want to start with the foundation of your
60
00:03:06.120 --> 00:03:09.879
book that I think is really really beautiful. And so
61
00:03:09.960 --> 00:03:12.439
you talk about how the AI Ultimatum is more than
62
00:03:12.759 --> 00:03:17.599
technological adoption. It requires reimagining how value gets created when
63
00:03:17.680 --> 00:03:22.080
cognitive limitations dissolve, and how competitive advantage emerges when analytical
64
00:03:22.159 --> 00:03:26.240
capacity becomes abundant, and how human potential expands, and I love
65
00:03:26.319 --> 00:03:29.960
that, when augmented with artificial intelligence. I think that is
66
00:03:30.280 --> 00:03:33.639
tremendous, because as you well know,
67
00:03:34.080 --> 00:03:36.400
there is so much fear around it instead.
68
00:03:37.080 --> 00:03:40.639
Yeah, yeah, I mean leaders have a choice going into this.
69
00:03:41.120 --> 00:03:44.439
They can treat AI like any other technology, which they
70
00:03:44.479 --> 00:03:48.240
shouldn't, and apply what I call twentieth
71
00:03:48.240 --> 00:03:51.439
century thinking in the book, which is to use AI
72
00:03:51.639 --> 00:03:57.199
as a way to cut costs, boost efficiency, reduce headcounts,
73
00:03:57.280 --> 00:04:01.400
and do substitution, right, replacing people with technology. You could
74
00:04:01.400 --> 00:04:04.159
do that, and some leaders will try that, and it
75
00:04:04.199 --> 00:04:07.599
will be the wrong decision. It's the wrong decision, not
76
00:04:07.719 --> 00:04:10.960
just because there's a social impact on that. We can
77
00:04:10.960 --> 00:04:13.039
talk about that as much as you like, but because
78
00:04:13.039 --> 00:04:15.479
it's the wrong thing from a business standpoint to do.
79
00:04:16.279 --> 00:04:20.319
If you have two competitors applying twentieth century thinking, they
80
00:04:20.439 --> 00:04:24.920
use AI for substitution of labor. They reduce the people
81
00:04:25.000 --> 00:04:27.879
as far as they can, assuming that those two businesses
82
00:04:27.959 --> 00:04:31.959
have the same access to resources and technology and capital.
83
00:04:32.560 --> 00:04:35.279
They're going to end up being the same company, and
84
00:04:35.319 --> 00:04:37.600
then they fight against each other on price. Because the
85
00:04:37.600 --> 00:04:42.160
only differentiation you have is your people. It also means they're
86
00:04:42.199 --> 00:04:44.720
not going to be ready for the people who apply
87
00:04:44.879 --> 00:04:49.439
twenty first century thinking, which is to use AI to
88
00:04:49.639 --> 00:04:53.639
amplify the impact of the people you have. And, you
89
00:04:54.279 --> 00:04:58.879
know, if you are applying twentieth century thinking, maybe you'll
90
00:04:58.920 --> 00:05:01.759
reduce your costs, maybe you'll be a little bit more efficient,
91
00:05:02.240 --> 00:05:06.360
maybe you will improve your results by ten or fifteen or,
92
00:05:06.480 --> 00:05:09.959
if you're lucky, twenty or twenty five percent. If you apply
93
00:05:10.079 --> 00:05:12.959
twenty first century thinking, the approach I talk about in
94
00:05:13.000 --> 00:05:16.920
the book, maybe you can amplify your impact and think
95
00:05:16.959 --> 00:05:21.759
about boosting your results ten x or twenty x or
96
00:05:21.920 --> 00:05:25.360
fifty x. And the companies that do that will leave
97
00:05:25.759 --> 00:05:28.839
the twentieth century companies behind.
98
00:05:28.759 --> 00:05:30.600
I'm all about building those kinds of leaders, Steve. So,
99
00:05:30.680 --> 00:05:34.120
I really really appreciate how you distinguish those different ways
100
00:05:34.120 --> 00:05:38.040
of thinking. I thought it was really helpful to recognize
101
00:05:38.199 --> 00:05:42.360
that when you're talking about orchestrating and using technology to
102
00:05:43.040 --> 00:05:45.600
create that bigger value, you do distinguish three different kinds
103
00:05:45.600 --> 00:05:49.720
of intelligences. If you would, describe those for us.
104
00:05:49.240 --> 00:05:53.680
Oh, the three different types of agents. Yeah, well, the number of agents
105
00:05:53.680 --> 00:05:54.319
and what they are.
106
00:05:55.480 --> 00:05:58.040
Well, so first, what I was looking for is the human,
107
00:05:58.079 --> 00:05:59.600
the artificial, and the robotic first.
108
00:05:59.639 --> 00:06:04.160
Oh, yeah, we can do that first, okay, so thank you. Yeah.
109
00:06:04.240 --> 00:06:10.199
Leaders now preside over a blended workforce with three components,
110
00:06:11.160 --> 00:06:13.879
the human workers that they have had to manage in
111
00:06:13.920 --> 00:06:17.879
the past, and now digital employees in the form of
112
00:06:17.920 --> 00:06:21.680
AI agents. And if there's a physical dimension to their work,
113
00:06:21.959 --> 00:06:25.000
then robots, and some of those will be the types
114
00:06:25.000 --> 00:06:27.639
of robots we've seen in the past that have built
115
00:06:27.639 --> 00:06:29.959
our cars for the past forty years, but many of
116
00:06:30.000 --> 00:06:33.480
them will now be humanoid robots that are learning machines
117
00:06:33.879 --> 00:06:38.199
that learn by observing humans perform work and by practicing themselves. And
118
00:06:38.240 --> 00:06:42.079
then once one robot learns, they all learn. And the
119
00:06:42.160 --> 00:06:45.079
companies that thrive will be the ones that figure out
120
00:06:45.319 --> 00:06:50.079
how to blend all of those three components, agents, robots,
121
00:06:50.120 --> 00:06:55.800
and yes, still humans to create this workforce that amplifies
122
00:06:55.879 --> 00:06:58.360
the efforts of the human beings in the organization.
123
00:07:00.079 --> 00:07:02.199
I also appreciated and I really thought, you know, the
124
00:07:02.279 --> 00:07:05.000
kind of work that you were doing must be hopelessly fascinating.
125
00:07:05.040 --> 00:07:08.040
But what you were talking about, how you know, the
126
00:07:08.079 --> 00:07:10.759
aim of considering, looking at the task, what needs to
127
00:07:10.759 --> 00:07:12.560
be done in the organization, and then deciding which of
128
00:07:12.600 --> 00:07:15.279
these intelligences shall we pull on to provide that. So
129
00:07:15.360 --> 00:07:20.920
with the idea that humans provide relational intelligence, judgment,
130
00:07:21.040 --> 00:07:26.439
creativity, values; that you set direction, make ethical choices, create meaning, connection, trust.
131
00:07:27.040 --> 00:07:28.480
Those are the kind of things that we bring to
132
00:07:28.519 --> 00:07:30.720
the party. And of course our job in
133
00:07:30.800 --> 00:07:33.600
our world, Steve, is about doubling down on humanity
134
00:07:33.680 --> 00:07:36.160
to make those even stronger. And then of course the
135
00:07:36.240 --> 00:07:42.319
artificial handles analysis at superhuman scale, finds patterns across impossibly
136
00:07:42.399 --> 00:07:45.920
large data sets and operates continuously without fatigue. And then
137
00:07:45.959 --> 00:07:47.800
you already brought the robotic thing up. I think it's
138
00:07:47.839 --> 00:07:52.120
really important for leaders to distinguish just how distinctive those
139
00:07:52.160 --> 00:07:53.160
intelligences are.
140
00:07:54.759 --> 00:07:57.240
Yeah, and you always want to keep humans in the loop.
141
00:07:57.879 --> 00:08:00.839
You know, you can automate a lot, and AI
142
00:08:01.040 --> 00:08:06.120
is great for doing things at scale, for collapsing costs
143
00:08:06.160 --> 00:08:11.600
and time, for handling massive data sets, for delivering hyper
144
00:08:11.600 --> 00:08:14.560
personalization at scale for all of your customers so you
145
00:08:14.600 --> 00:08:17.800
can be giving them offers at an individual level. There's
146
00:08:17.839 --> 00:08:20.199
lots of great things you can do with AI, but
147
00:08:20.279 --> 00:08:22.279
at the end of the day, you want humans
148
00:08:22.319 --> 00:08:28.839
to be overseeing that. Agents feel like digital employees.
149
00:08:30.120 --> 00:08:32.120
You can communicate with them, you can give them quite
150
00:08:32.159 --> 00:08:36.159
complex tasks. They'll go off and work on those, break
151
00:08:36.200 --> 00:08:39.399
them down into subtasks and try and execute on them.
152
00:08:39.919 --> 00:08:42.360
They need oversight because they will make mistakes, they will
153
00:08:42.399 --> 00:08:45.639
hit a corner case where they can't quite get it right,
154
00:08:46.200 --> 00:08:49.120
they will get something wrong, they'll make an error, they
155
00:08:49.159 --> 00:08:54.279
will need guidance, and so you always want humans overseeing.
156
00:08:54.879 --> 00:08:58.279
What it means for us as people is we
157
00:08:58.320 --> 00:09:02.360
move from doing the work to designing the work and
158
00:09:02.399 --> 00:09:07.200
then ultimately overseeing the work of agents and robots. Everybody
159
00:09:07.240 --> 00:09:10.279
becomes a manager, no matter how junior they are in
160
00:09:10.320 --> 00:09:11.039
an organization.
161
00:09:11.320 --> 00:09:13.360
Yeah, that line really struck me in your book, that
162
00:09:13.399 --> 00:09:16.279
we're all, regardless of what we do in an organization,
163
00:09:16.279 --> 00:09:19.240
we're all going to be managers. And then I really
164
00:09:19.240 --> 00:09:23.559
really was quite taken with your idea of really
165
00:09:23.600 --> 00:09:27.360
helping organizations to focus on how to transform their business
166
00:09:27.360 --> 00:09:31.559
through your CEO framework. If you could describe that approach
167
00:09:31.600 --> 00:09:32.759
and what that entails.
168
00:09:33.080 --> 00:09:36.240
Yeah, I work with a lot of senior leadership teams, boards,
169
00:09:36.320 --> 00:09:40.360
management teams, and often you know a CEO will ask me, well,
170
00:09:40.919 --> 00:09:44.000
Where do we start, Steve? It all seems so overwhelming.
171
00:09:44.039 --> 00:09:46.919
We know we need to apply AI, we have limited resources.
172
00:09:46.919 --> 00:09:49.960
Where do we start? And I start them with the
173
00:09:50.000 --> 00:09:54.600
CEO framework, which is a way of finding balance. And
174
00:09:54.600 --> 00:09:57.480
this is all about finding balance, right. The main areas
175
00:09:57.519 --> 00:10:01.399
you can invest in with AI. You can invest in improving
176
00:10:01.440 --> 00:10:05.759
your customer experience. How do you hyper-personalize offers? How
177
00:10:05.799 --> 00:10:09.279
do you create better products and services? And so on. That's
178
00:10:09.320 --> 00:10:14.279
the C of CEO. The E is obvious: employees. How
179
00:10:14.320 --> 00:10:17.840
do you use AI to amplify the impact that your
180
00:10:17.840 --> 00:10:21.679
employees have, to offload low-value work that they find
181
00:10:21.720 --> 00:10:25.039
tedious and that they don't enjoy, taking the suck out
182
00:10:25.080 --> 00:10:27.600
of their jobs if you like. And then the O
183
00:10:27.919 --> 00:10:31.399
is for operations. How do you use AI to streamline
184
00:10:31.399 --> 00:10:34.879
operations and just make things all run much more smoothly.
185
00:10:35.159 --> 00:10:37.559
You want to do all three of those things. And
186
00:10:37.639 --> 00:10:40.480
so when you when you imagine all of the AI
187
00:10:40.600 --> 00:10:43.799
projects that you could apply, you sort of apply that
188
00:10:43.799 --> 00:10:46.600
filter and say, make sure they're balanced across the
189
00:10:46.679 --> 00:10:52.159
C, E, and O: customers, employees, and operations.
190
00:10:52.000 --> 00:10:53.840
A few things I want to call out about that really quick.
191
00:10:54.360 --> 00:10:58.879
So first you also suggest starting with, you know, the employees,
192
00:11:00.200 --> 00:11:03.759
to help support them and to see that this new
193
00:11:03.799 --> 00:11:07.720
technology is being focused on helping them and caring for them.
194
00:11:08.000 --> 00:11:10.559
And in the process of doing that, they learn,
195
00:11:10.600 --> 00:11:12.759
whoa, this is actually making my life easier here, and
196
00:11:12.759 --> 00:11:15.080
now they're starting to get bought in and now they
197
00:11:15.120 --> 00:11:17.720
want to help support the initiative versus resist it. And I
198
00:11:17.720 --> 00:11:20.679
thought that was really smart. If you're going to start someplace,
199
00:11:20.720 --> 00:11:22.360
why wouldn't that be a great place to start?
200
00:11:23.279 --> 00:11:24.960
Yeah, I mean you get good bang for the buck
201
00:11:25.039 --> 00:11:28.639
there because you're amplifying everything that your employees do if
202
00:11:28.679 --> 00:11:32.159
you use AI to support them. But also it's about
203
00:11:32.240 --> 00:11:35.679
driving inclusion and support. I work with one client and
204
00:11:35.679 --> 00:11:37.799
they were so proud because they were ahead of
205
00:11:37.840 --> 00:11:40.559
the game. They had agents. They'd built these agents and
206
00:11:40.600 --> 00:11:45.559
were rolling them out to employees, and the employees were
207
00:11:45.600 --> 00:11:48.320
up in arms. What are you doing? I don't want AI.
208
00:11:48.440 --> 00:11:51.759
Why are you rolling this out? And I asked the CEO.
209
00:11:51.799 --> 00:11:54.399
He's telling me the story and I said, so did
210
00:11:54.440 --> 00:11:58.759
you ask the employees what they wanted and how AI
211
00:11:59.879 --> 00:12:03.600
could help? Did you co-design these solutions? Oh, no, you know, we
212
00:12:03.759 --> 00:12:05.600
just we thought they'd love them. We just built them
213
00:12:05.600 --> 00:12:07.639
and rolled them out, and they hated them, and some
214
00:12:07.679 --> 00:12:09.320
of them threatened to leave, and some of them did
215
00:12:09.399 --> 00:12:14.279
leave. I said, well, why are you surprised? You know
216
00:12:14.559 --> 00:12:16.279
what you need to do. I mean, there's two reasons
217
00:12:16.320 --> 00:12:18.919
you want to include people in the design process when
218
00:12:18.960 --> 00:12:24.120
you're building AI solutions. One, you're trying to build supporters,
219
00:12:24.159 --> 00:12:28.720
not saboteurs, and you do that by showing people what's
220
00:12:28.759 --> 00:12:31.639
in it for me, right? If you co-design a solution,
221
00:12:32.120 --> 00:12:34.440
you can see how it's going to help you do
222
00:12:34.559 --> 00:12:37.240
your job. You're going to be much more oriented towards
223
00:12:37.279 --> 00:12:40.080
supporting it and making that rollout successful. It's just
224
00:12:40.519 --> 00:12:41.720
change management one O one.
225
00:12:41.759 --> 00:12:42.480
It's not hard.
226
00:12:44.120 --> 00:12:49.159
And the second reason is humans are creative and we
227
00:12:49.320 --> 00:12:51.559
find other ways to do things. So the way that
228
00:12:51.679 --> 00:12:54.600
leaders think work is done, and perhaps the way it's
229
00:12:54.639 --> 00:12:57.799
written down in policies and procedures, is never the way
230
00:12:57.840 --> 00:13:01.559
that work's actually done. And so if you design without
231
00:13:01.600 --> 00:13:05.000
including employees, what you end up doing is designing a solution
232
00:13:05.120 --> 00:13:07.279
for a world that does not exist, and then you're
233
00:13:07.320 --> 00:13:10.600
wasting everybody's time. So that's two great reasons why you
234
00:13:10.679 --> 00:13:14.039
include employees from day one to make sure that you
235
00:13:14.039 --> 00:13:16.200
have a successful, high impact rollout.
236
00:13:17.720 --> 00:13:20.639
Yeah, I really thought that was so important. And for
237
00:13:20.759 --> 00:13:24.000
the work that we're doing, Steve, helping really elevate leaders
238
00:13:24.039 --> 00:13:28.240
and being able to activate purpose as an operational imperative
239
00:13:28.279 --> 00:13:30.639
in organizations, that is such an important thing that we