April 7, 2026

The AI Ultimatum: Will You Replace Humans…or Rise With Them?


AI is no longer a future conversation—it’s here, and it’s forcing a reckoning. Will you use it to cut costs and commoditize talent, or to amplify human brilliance? In this episode, we unpack the real leadership challenge of AI: navigating disruption while unlocking empowerment. The winners won’t be those who automate the fastest, but those who invest wisely in elevating customers, energizing employees, and transforming operations with intention and purpose.

Working on Purpose is broadcast live Tuesdays at 6PM ET on W4CY Radio (www.w4cy.com), part of Talk 4 Radio (www.talk4radio.com) on the Talk 4 Media Network (www.talk4media.com). Working on Purpose is viewed on Talk 4 TV (www.talk4tv.com).

Working on Purpose Podcast is also available on Talk 4 Media (www.talk4media.com), Talk 4 Podcasting (www.talk4podcasting.com), iHeartRadio, Amazon Music, Pandora, Spotify, Audible, and over 100 other podcast outlets.

Become a supporter of this podcast: https://www.spreaker.com/podcast/working-on-purpose--2643411/support.

WEBVTT

1
00:00:00.120 --> 00:00:02.359
The topics and opinions expressed in the following show are

2
00:00:02.359 --> 00:00:04.000
solely those of the hosts and their guests and not

3
00:00:04.040 --> 00:00:06.919
those of W4CY Radio, its employees, or affiliates. We

4
00:00:07.000 --> 00:00:10.080
make no recommendations or endorsements for radio show programs, services,

5
00:00:10.119 --> 00:00:12.439
or products mentioned on air or on our website. No

6
00:00:12.560 --> 00:00:15.119
liability, explicit or implied, shall be extended to W4

7
00:00:15.160 --> 00:00:17.920
CY Radio, its employees, or affiliates. Any questions or

8
00:00:17.960 --> 00:00:20.399
comments should be directed to the show hosts. Thank you

9
00:00:20.440 --> 00:00:25.480
for choosing W4CY Radio.

10
00:00:28.199 --> 00:00:31.600
What's working on purpose, anyway? Each week we ponder the

11
00:00:31.640 --> 00:00:34.799
answer to this question. People ache for meaning and purpose

12
00:00:34.840 --> 00:00:38.200
at work, to contribute their talents passionately and know their

13
00:00:38.240 --> 00:00:41.600
lives really matter. They crave being part of an organization

14
00:00:41.679 --> 00:00:44.399
that inspires them and helps them grow into realizing their

15
00:00:44.479 --> 00:00:47.520
highest potential. Business can be such a force for good

16
00:00:47.560 --> 00:00:51.200
in the world, elevating humanity. In our program, we provide

17
00:00:51.200 --> 00:00:54.240
guidance and inspiration to help usher in this world we

18
00:00:54.320 --> 00:00:58.600
all want. Working on Purpose. Now here's your host, doctor

19
00:00:58.640 --> 00:00:59.719
Elise Cortes.

20
00:01:05.079 --> 00:01:07.000
Welcome back to the Working on Purpose program, which has

21
00:01:07.000 --> 00:01:08.959
been brought to you with passion and pride since February

22
00:01:09.040 --> 00:01:11.120
twenty fifteen. Thanks for tuning in this week. Great

23
00:01:11.159 --> 00:01:13.280
to have you. I'm your host, Doctor Elise Cortes.

24
00:01:13.879 --> 00:01:16.680
Most leaders are sitting on untapped human energy. I help

25
00:01:16.719 --> 00:01:19.519
them unlock it. That's my jam. I'm an organizational psychologist,

26
00:01:19.640 --> 00:01:22.239
logotherapist, workforce advisor, and the founder of the

27
00:01:22.280 --> 00:01:24.879
Gusto Now movement. But my real title that I go

28
00:01:24.959 --> 00:01:27.040
by pretty much every day is that I simply traffic

29
00:01:27.079 --> 00:01:30.319
in energy. Not the surface kind that a motivational talk or

30
00:01:30.400 --> 00:01:34.439
another engagement initiative, but the deeper force, what I call Gusto,

31
00:01:34.680 --> 00:01:39.040
the life force for performance that drives commitment, perseverance, and

32
00:01:39.120 --> 00:01:41.640
genuine ownership of a shared mission in the clients that

33
00:01:41.680 --> 00:01:44.519
we serve. You can learn more about how we work

34
00:01:44.560 --> 00:01:47.920
and how we can work together at gusto now dot com

35
00:01:48.079 --> 00:01:50.400
or my personal site is at elisecortes dot com

36
00:01:50.519 --> 00:01:54.280
for information on speaking and books. On today's program, we have

37
00:01:54.319 --> 00:01:57.120
with us back Steve Brown, a leading voice in the

38
00:01:57.239 --> 00:02:00.879
conversation on artificial intelligence. He is a former executive

39
00:02:00.879 --> 00:02:04.680
at Google DeepMind and Intel, and has delivered hundreds of engaging,

40
00:02:04.719 --> 00:02:08.800
information-rich keynotes across five continents, inspiring audiences to

41
00:02:08.840 --> 00:02:12.479
take action with AI. He's the author of The Innovation Ultimatum,

42
00:02:12.719 --> 00:02:16.599
how six strategic technologies will reshape every business in the

43
00:02:16.599 --> 00:02:19.280
twenty twenties, which we talked about in an earlier episode.

44
00:02:19.479 --> 00:02:22.919
Today we're talking about his newest book, The AI Ultimatum,

45
00:02:23.039 --> 00:02:26.439
Preparing for a world of Intelligent machines and radical transformation.

46
00:02:26.800 --> 00:02:29.680
He joins us from Portland, Oregon. Steve, a hearty welcome

47
00:02:29.719 --> 00:02:30.960
back to working on purpose.

48
00:02:31.759 --> 00:02:33.120
Elise, lovely to be with you again.

49
00:02:33.840 --> 00:02:35.479
You know, as I told you in our exchange before

50
00:02:35.479 --> 00:02:37.840
we got on air, this is magnificent. It is a

51
00:02:37.919 --> 00:02:40.479
triumph, Steve Brown. You know this. The world needs

52
00:02:40.479 --> 00:02:45.199
to read this book, stat. So you're welcome. I might

53
00:02:45.240 --> 00:02:48.800
be a fan, I don't know, so.

54
00:02:49.039 --> 00:02:51.800
It was a labor of love, for sure. I'm glad you enjoyed it.

55
00:02:51.919 --> 00:02:55.319
I appreciate that, you know, as a fellow author, I

56
00:02:55.360 --> 00:02:58.120
appreciate when somebody really reads my books too, and or

57
00:02:58.280 --> 00:03:00.599
my book and has something to say and links to it.

58
00:03:00.639 --> 00:03:02.639
And I really really think there's so much in this.

59
00:03:02.800 --> 00:03:06.120
So I want to start with the foundation of your

60
00:03:06.120 --> 00:03:09.879
book that I think is really really beautiful. And so

61
00:03:09.960 --> 00:03:12.439
you talk about how the AI Ultimatum is more than

62
00:03:12.759 --> 00:03:17.599
technological adoption. It requires reimagining how value gets created when

63
00:03:17.680 --> 00:03:22.080
cognitive limitations dissolve, and how competitive advantage emerges when analytical

64
00:03:22.159 --> 00:03:26.240
capacity becomes abundant, and how human potential expands, and I love

65
00:03:26.319 --> 00:03:29.960
that, when augmented with artificial intelligence. I think that is

66
00:03:30.280 --> 00:03:33.639
tremendous because, as you probably well know,

67
00:03:34.080 --> 00:03:36.400
there is so much fear around it, Steve.

68
00:03:37.080 --> 00:03:40.639
Yeah, yeah, I mean leaders have a choice going into this.

69
00:03:41.120 --> 00:03:44.439
They can treat AI like any other technology, which they

70
00:03:44.479 --> 00:03:48.240
shouldn't, and apply what I call twentieth

71
00:03:48.240 --> 00:03:51.439
century thinking in the book, which is to use AI

72
00:03:51.639 --> 00:03:57.199
as a way to cut costs, boost efficiency, reduce headcounts,

73
00:03:57.280 --> 00:04:01.400
and do substitution, right? Replacing people with technology. You could

74
00:04:01.400 --> 00:04:04.159
do that, and some leaders will try that, and it

75
00:04:04.199 --> 00:04:07.599
will be the wrong decision. It's the wrong decision, not

76
00:04:07.719 --> 00:04:10.960
just because there's a social impact on that. We can

77
00:04:10.960 --> 00:04:13.039
talk about that as much as you like, but because

78
00:04:13.039 --> 00:04:15.479
it's the wrong thing from a business standpoint to do.

79
00:04:16.279 --> 00:04:20.319
If you have two competitors applying twentieth century thinking, they

80
00:04:20.439 --> 00:04:24.920
use AI for substitution of labor. They reduce the people

81
00:04:25.000 --> 00:04:27.879
as far as they can, assuming that those two businesses

82
00:04:27.959 --> 00:04:31.959
have the same access to resources and technology and capital.

83
00:04:32.560 --> 00:04:35.279
They're going to end up being the same company, and

84
00:04:35.319 --> 00:04:37.600
then they fight against each other on price. Because the

85
00:04:37.600 --> 00:04:42.160
differentiation you have is your people. It also means they're

86
00:04:42.199 --> 00:04:44.720
not going to be ready for the people who apply

87
00:04:44.879 --> 00:04:49.439
twenty first century thinking, which is to use AI to

88
00:04:49.639 --> 00:04:53.639
amplify the impact of the people you have. And, you

89
00:04:54.279 --> 00:04:58.879
know, if you are applying twentieth century thinking, maybe you'll

90
00:04:58.920 --> 00:05:01.759
reduce your costs, maybe you'll be a little bit more efficient,

91
00:05:02.240 --> 00:05:06.360
maybe you will improve your results by ten or fifteen or,

92
00:05:06.480 --> 00:05:09.959
if you're lucky, twenty or twenty-five percent. If you apply

93
00:05:10.079 --> 00:05:12.959
twenty first century thinking, the approach I talk about in

94
00:05:13.000 --> 00:05:16.920
the book, maybe you can amplify your impact and think

95
00:05:16.959 --> 00:05:21.759
about boosting your results ten x or twenty x or

96
00:05:21.920 --> 00:05:25.360
fifty x. And the companies that do that will leave

97
00:05:25.759 --> 00:05:28.839
the twentieth century companies behind.

98
00:05:28.759 --> 00:05:30.600
And I'm out to build those kinds of leaders, Steve. So,

99
00:05:30.680 --> 00:05:34.120
I really really appreciate how you distinguish those different ways

100
00:05:34.120 --> 00:05:38.040
of thinking. I thought it was really helpful to recognize

101
00:05:38.199 --> 00:05:42.360
that when you're talking about orchestrating and using technology to

102
00:05:43.040 --> 00:05:45.600
create that bigger value, you do distinguish three different kinds

103
00:05:45.600 --> 00:05:49.720
of intelligences. If you would, describe those for us.

104
00:05:49.240 --> 00:05:53.680
Oh, the three different types of agents? Yeah, well, the number of agents

105
00:05:53.680 --> 00:05:54.319
and what they are.

106
00:05:55.480 --> 00:05:58.040
Well, so first, what I was looking for is the human,

107
00:05:58.079 --> 00:05:59.600
the artificial and robotic. First.

108
00:05:59.639 --> 00:06:04.160
Oh, yeah, we can do that first, okay, so thank you. Yeah.

109
00:06:04.240 --> 00:06:10.199
Leaders now preside over a blended workforce with three components,

110
00:06:11.160 --> 00:06:13.879
the human workers that they have had to manage in

111
00:06:13.920 --> 00:06:17.879
the past, and now digital employees in the form of

112
00:06:17.920 --> 00:06:21.680
AI agents. And if there's a physical dimension to their work,

113
00:06:21.959 --> 00:06:25.000
then robots, and some of those will be the types

114
00:06:25.000 --> 00:06:27.639
of robots we've seen in the past that have built

115
00:06:27.639 --> 00:06:29.959
our cars for the past forty years, but many of

116
00:06:30.000 --> 00:06:33.480
them will now be humanoid robots that are learning machines

117
00:06:33.879 --> 00:06:38.199
that learn by observing humans perform work and practicing themselves. And

118
00:06:38.240 --> 00:06:42.079
then once one robot learns, they all learn. And the

119
00:06:42.160 --> 00:06:45.079
companies that thrive will be the ones that figure out

120
00:06:45.319 --> 00:06:50.079
how to blend all of those three components, agents, robots,

121
00:06:50.120 --> 00:06:55.800
and yes, still humans to create this workforce that amplifies

122
00:06:55.879 --> 00:06:58.360
the efforts of the human beings in the organization.

123
00:07:00.079 --> 00:07:02.199
I also appreciated and I really thought, you know, the

124
00:07:02.279 --> 00:07:05.000
kind of work that you were doing must be hopelessly fascinating.

125
00:07:05.040 --> 00:07:08.040
But what you were talking about, how you know, the

126
00:07:08.079 --> 00:07:10.759
aim of considering, looking at the task, what needs to

127
00:07:10.759 --> 00:07:12.560
be done in the organization, and then deciding which of

128
00:07:12.600 --> 00:07:15.279
these intelligences shall we pull on to provide that. So

129
00:07:15.360 --> 00:07:20.920
with the idea that humans provide relational intelligence, judgment,

130
00:07:21.040 --> 00:07:26.439
creativity, values. They set direction, make ethical choices, create meaning, connection, trust.

131
00:07:27.040 --> 00:07:28.480
Those are the kind of things that we bring to

132
00:07:28.519 --> 00:07:30.720
the party. And of course our job in

133
00:07:30.800 --> 00:07:33.600
our world, Steve, is about doubling down on humanity

134
00:07:33.680 --> 00:07:36.160
to make those even stronger. And then of course the

135
00:07:36.240 --> 00:07:42.319
artificial handles analysis at superhuman scale, finds patterns across impossibly

136
00:07:42.399 --> 00:07:45.920
large data sets and operates continuously without fatigue. And then

137
00:07:45.959 --> 00:07:47.800
you already brought the robotic thing up. I think it's

138
00:07:47.839 --> 00:07:52.120
really important for leaders to distinguish just how distinctive those

139
00:07:52.160 --> 00:07:53.160
intelligences are.

140
00:07:54.759 --> 00:07:57.240
Yeah, and you always want to keep humans in the loop.

141
00:07:57.879 --> 00:08:00.839
You know, you can automate a lot of it, and AI

142
00:08:01.040 --> 00:08:06.120
is great for doing things at scale, for collapsing costs

143
00:08:06.160 --> 00:08:11.600
and time, for handling massive data sets, for delivering hyper

144
00:08:11.600 --> 00:08:14.560
personalization at scale for all of your customers so you

145
00:08:14.600 --> 00:08:17.800
can be giving them offers at an individual level. There's

146
00:08:17.839 --> 00:08:20.199
lots of great things you can do with AI, but

147
00:08:20.279 --> 00:08:22.279
at the end of the day, you want humans

148
00:08:22.319 --> 00:08:28.839
to be overseeing it. Agents feel like digital employees.

149
00:08:30.120 --> 00:08:32.120
You can communicate with them, you can give them quite

150
00:08:32.159 --> 00:08:36.159
complex tasks. They'll go off and work on those, break

151
00:08:36.200 --> 00:08:39.399
them down into subtasks and try and execute on them.

152
00:08:39.919 --> 00:08:42.360
They need oversight because they will make mistakes, they will

153
00:08:42.399 --> 00:08:45.639
hit a corner case where they can't quite get it right,

154
00:08:46.200 --> 00:08:49.120
they will get something wrong, they'll make an error, they

155
00:08:49.159 --> 00:08:54.279
will need guidance, and so you always want humans overseeing.

156
00:08:54.879 --> 00:08:58.279
What it means for us as people is we

157
00:08:58.320 --> 00:09:02.360
move from doing the work to designing the work and

158
00:09:02.399 --> 00:09:07.200
then ultimately overseeing the work of agents and robots. Everybody

159
00:09:07.240 --> 00:09:10.279
becomes a manager, no matter how junior they are in

160
00:09:10.320 --> 00:09:11.039
an organization.

161
00:09:11.320 --> 00:09:13.360
Yeah, that line really struck me in your book, that

162
00:09:13.399 --> 00:09:16.279
we're all, regardless of what we do in an organization,

163
00:09:16.279 --> 00:09:19.240
we're all going to be managers. And then I really

164
00:09:19.240 --> 00:09:23.559
really was quite taken with your idea of really

165
00:09:23.600 --> 00:09:27.360
helping organizations to focus on how to transform their business

166
00:09:27.360 --> 00:09:31.559
through your CEO framework. If you could describe that approach

167
00:09:31.600 --> 00:09:32.759
and what that entails.

168
00:09:33.080 --> 00:09:36.240
Yeah, I work with a lot of senior leadership teams, boards,

169
00:09:36.320 --> 00:09:40.360
management teams, and often you know a CEO will ask me, well,

170
00:09:40.919 --> 00:09:44.000
where do we start, Steve? It all seems so overwhelming.

171
00:09:44.039 --> 00:09:46.919
We know we need to apply AI, we have limited resources.

172
00:09:46.919 --> 00:09:49.960
Where do we start? And I start them with the

173
00:09:50.000 --> 00:09:54.600
CEO framework, which is a way of finding balance. And

174
00:09:54.600 --> 00:09:57.480
this is all about finding balance, right. The main areas

175
00:09:57.519 --> 00:10:01.399
you can invest in with AI: you can invest in improving

176
00:10:01.440 --> 00:10:05.759
your customer experience. How do you hyper-personalize offers? How

177
00:10:05.799 --> 00:10:09.279
do you create better products and services, and so on? That's

178
00:10:09.320 --> 00:10:14.279
the C of CEO. The E is obvious: employees. How

179
00:10:14.320 --> 00:10:17.840
do you use AI to amplify the impact that your

180
00:10:17.840 --> 00:10:21.679
employees have to offload low value work that they find

181
00:10:21.720 --> 00:10:25.039
tedious and that they don't enjoy, taking the suck out

182
00:10:25.080 --> 00:10:27.600
of their jobs if you like. And then the O

183
00:10:27.919 --> 00:10:31.399
is for operations. How do you use AI to streamline

184
00:10:31.399 --> 00:10:34.879
operations and just make things all run much more smoothly.

185
00:10:35.159 --> 00:10:37.559
You want to do all three of those things. And

186
00:10:37.639 --> 00:10:40.480
so when you imagine all of the AI

187
00:10:40.600 --> 00:10:43.799
projects that you could apply, you sort of apply that

188
00:10:43.799 --> 00:10:46.600
filter and say, make sure they're balanced across the

189
00:10:46.679 --> 00:10:52.159
C, E, and O, customers, employees, and operations. A few

190
00:10:52.000 --> 00:10:53.840
Things I want to call out about that really quick.

191
00:10:54.360 --> 00:10:58.879
So first you also suggest starting with, you know, the employees,

192
00:11:00.200 --> 00:11:03.759
to help support them and to see that this new

193
00:11:03.799 --> 00:11:07.720
technology is being focused on helping them and caring for them.

194
00:11:08.000 --> 00:11:10.559
And in the process of doing that, they learned,

195
00:11:10.600 --> 00:11:12.759
whoa, this is actually making my life easier here, and

196
00:11:12.759 --> 00:11:15.080
now they're starting to get bought in and now they

197
00:11:15.120 --> 00:11:17.720
want to help support the initiative versus resist it. And I

198
00:11:17.720 --> 00:11:20.679
thought that was really smart. If you're going to start someplace,

199
00:11:20.720 --> 00:11:22.360
why wouldn't that be a great place to start?

200
00:11:23.279 --> 00:11:24.960
Yeah, I mean you get good bang for the buck

201
00:11:25.039 --> 00:11:28.639
there because you're amplifying everything that your employees do if

202
00:11:28.679 --> 00:11:32.159
you use AI to support them. But also it's about

203
00:11:32.240 --> 00:11:35.679
driving inclusion and support. I work with one client and

204
00:11:35.679 --> 00:11:37.799
they were so proud because they were ahead of

205
00:11:37.840 --> 00:11:40.559
the game. They had agents. They'd built these agents and

206
00:11:40.600 --> 00:11:45.559
were rolling them out to employees, and the employees were

207
00:11:45.600 --> 00:11:48.320
up in arms. What are you doing? I don't want AI?

208
00:11:48.440 --> 00:11:51.759
Why are you rolling this out? And I asked the CEO.

209
00:11:51.799 --> 00:11:54.399
He's telling me the story and I said, so did

210
00:11:54.440 --> 00:11:58.759
you ask the employees what they wanted? And how did

211
00:11:59.879 --> 00:12:03.600
you co-design these solutions? Oh, no, you know, we

212
00:12:03.759 --> 00:12:05.600
just we thought they'd love them. We just built them

213
00:12:05.600 --> 00:12:07.639
and rolled them out, and they hated them, and some

214
00:12:07.679 --> 00:12:09.320
of them threatened to leave, and some of them did

215
00:12:09.399 --> 00:12:14.279
leave. I said, well, how are you surprised? You know

216
00:12:14.559 --> 00:12:16.279
what you need to do. I mean, there's two reasons

217
00:12:16.320 --> 00:12:18.919
you want to include people in the design process when

218
00:12:18.960 --> 00:12:24.120
you're building AI solutions. One, you're trying to build supporters,

219
00:12:24.159 --> 00:12:28.720
not saboteurs, and you do that by showing people what's

220
00:12:28.759 --> 00:12:31.639
in it for me. Right, If you co design a solution,

221
00:12:32.120 --> 00:12:34.440
you can see how it's going to help you do

222
00:12:34.559 --> 00:12:37.240
your job. You're going to be much more oriented towards

223
00:12:37.279 --> 00:12:40.080
supporting it and making that rollout successful. It's just

224
00:12:40.519 --> 00:12:41.720
change management one O one.

225
00:12:41.759 --> 00:12:42.480
It's not hard.

226
00:12:44.120 --> 00:12:49.159
And the second reason is humans are creative and we

227
00:12:49.320 --> 00:12:51.559
find other ways to do things. So the way that

228
00:12:51.679 --> 00:12:54.600
leaders think work is done, and perhaps the way it's

229
00:12:54.639 --> 00:12:57.799
written down in policies and procedures, is never the way

230
00:12:57.840 --> 00:13:01.559
that work's actually done. And so if you design without

231
00:13:01.600 --> 00:13:05.000
including employees, what you end up doing is designing a solution

232
00:13:05.120 --> 00:13:07.279
for a world that does not exist, and then you're

233
00:13:07.320 --> 00:13:10.600
wasting everybody's time. So that's two great reasons why you

234
00:13:10.679 --> 00:13:14.039
include employees from day one to make sure that you

235
00:13:14.039 --> 00:13:16.200
have a successful, high impact rollout.

236
00:13:17.720 --> 00:13:20.639
Yeah, I really thought that was so important. And for

237
00:13:20.759 --> 00:13:24.000
the work that we're doing, Steve, helping really elevate leaders

238
00:13:24.039 --> 00:13:28.240
and being able to activate purpose as an operational imperative

239
00:13:28.279 --> 00:13:30.639
in organizations, that is such an important thing that we

240
00:13:30.679 --> 00:13:32.759
can help drive in terms of the work that we're doing.

241
00:13:33.480 --> 00:13:35.360
And the last thing I wanted to talk about for

242
00:13:35.399 --> 00:13:38.519
this particular segment here that I just really appreciated is,

243
00:13:39.000 --> 00:13:41.399
I like how you talk about really trying to create

244
00:13:41.519 --> 00:13:45.200
amplified intelligence. You know, so you mentioned it before when

245
00:13:45.200 --> 00:13:49.519
we first started about prioritizing value creation and talent development

246
00:13:49.559 --> 00:13:53.639
over cost cutting, and from a lot of the news

247
00:13:53.639 --> 00:13:56.360
that I get, that isn't the focus. We're not focused

248
00:13:56.360 --> 00:13:59.279
on value creation and talent development. We are focused on

249
00:13:59.279 --> 00:14:01.919
how do we make this cheaper? So I

250
00:14:01.960 --> 00:14:03.440
love that notion. If you could speak a bit

251
00:14:03.480 --> 00:14:05.720
more to this idea of amplified intelligence.

252
00:14:06.279 --> 00:14:08.960
Yeah, leaders need to wake up and take a page

253
00:14:08.960 --> 00:14:12.759
out of Jensen Huang's book. Jensen Huang runs the most

254
00:14:12.840 --> 00:14:17.159
valuable company on the planet, Nvidia. In October twenty twenty-four,

255
00:14:17.399 --> 00:14:21.200
he was asked what's the plan for your workforce. And

256
00:14:21.240 --> 00:14:24.799
here's how he's thinking about workforce planning. He said, you know,

257
00:14:25.120 --> 00:14:27.399
I plan to grow the company to about fifty thousand

258
00:14:27.440 --> 00:14:29.559
people. We're at thirty-two thousand now. This is back in

259
00:14:29.600 --> 00:14:35.000
October twenty twenty four, fifty thousand people supported by one

260
00:14:35.159 --> 00:14:40.480
hundred million AI assistants across every group. So he's thinking

261
00:14:40.519 --> 00:14:43.759
not to deploy AI to go from thirty-two

262
00:14:43.840 --> 00:14:47.279
thousand to thirteen or something. He wants to keep growing

263
00:14:47.279 --> 00:14:50.440
his employees and then amplify them so he can have

264
00:14:50.679 --> 00:14:54.600
this huge impact. And he's planning. You know, if I

265
00:14:54.720 --> 00:14:57.159
was Jensen, I don't know his numbers, but if I

266
00:14:57.320 --> 00:14:59.279
was him, I'd be figuring out how do

267
00:14:59.320 --> 00:15:02.080
I take it from a four or five or six trillion

268
00:15:02.120 --> 00:15:06.039
dollar company to a fifty trillion dollar company using AI.

269
00:15:06.320 --> 00:15:08.799
And that's the mindset every leader needs, because if your

270
00:15:08.879 --> 00:15:12.360
competitor is thinking that way and you're still stuck in

271
00:15:12.399 --> 00:15:15.480
the well, how can we reduce costs? You're going to

272
00:15:15.519 --> 00:15:17.840
be gone five years from now. That is just not

273
00:15:17.879 --> 00:15:20.679
the right way to show up as a leader. You're

274
00:15:20.679 --> 00:15:24.000
doing a disservice to your organization and ultimately, as we'll talk

275
00:15:24.000 --> 00:15:29.279
about in the last segment, to the world. Yeah, if

276
00:15:29.360 --> 00:15:32.480
every leader thinks that way, all we're going to do

277
00:15:32.879 --> 00:15:37.759
is turn labor into capital. Right, That's it's the ultimate

278
00:15:38.519 --> 00:15:39.960
capitalism becomes ultimately.

279
00:15:39.639 --> 00:15:41.200
Successful animal, right there.

280
00:15:41.399 --> 00:15:47.159
Right, But unfortunately, here's the snag with that. Consumers are workers,

281
00:15:47.240 --> 00:15:49.559
and if there are no workers, there are no consumers

282
00:15:49.840 --> 00:15:54.960
and everything collapses. So you know, from a system-wide

283
00:15:55.000 --> 00:15:57.639
economic perspective, we need to be thinking that way, and

284
00:15:57.679 --> 00:16:00.240
that's what I meant by the AI Ultimatum. That's

285
00:16:00.279 --> 00:16:03.360
why I wrote the book. It's not just an innovate or

286
00:16:03.440 --> 00:16:06.399
die type of message. That's part of the AI Ultimatum,

287
00:16:06.440 --> 00:16:09.279
but it's we've got to do this sensibly and responsibly

288
00:16:09.360 --> 00:16:12.519
as a group, otherwise it all collapses.

289
00:16:13.000 --> 00:16:15.399
Thoughtfully indeed. And on that note, let's let our listeners,

290
00:16:15.399 --> 00:16:17.679
and viewers think about the notion of being thoughtful and

291
00:16:17.679 --> 00:16:21.480
intentional and conscious about these choices and decisions. I'm your host,

292
00:16:21.480 --> 00:16:23.919
Doctor Elise Cortes. I've been on the air with Steve Brown, who

293
00:16:24.279 --> 00:16:27.039
is a leading voice in the conversation on artificial intelligence.

294
00:16:27.360 --> 00:16:30.600
He's a former executive at Google DeepMind and Intel,

295
00:16:30.840 --> 00:16:34.600
and has delivered hundreds of engaging, information rich keynotes across

296
00:16:34.600 --> 00:16:38.000
five continents, inspiring audiences to take action on AI.

297
00:16:38.440 --> 00:16:41.720
You've been hearing us talk about how organizations are starting

298
00:16:41.759 --> 00:16:44.039
to approach it and how they can. After the break,

299
00:16:44.080 --> 00:16:46.360
we're going to talk about the importance though, of getting started.

300
00:16:46.440 --> 00:16:47.240
We'll be right back.

301
00:17:02.960 --> 00:17:06.519
Doctor Elise Cortes is a management consultant specializing in meaning

302
00:17:06.559 --> 00:17:10.400
and purpose, an inspirational speaker, and author. She helps companies

303
00:17:10.480 --> 00:17:14.279
visioneer for greater purpose among stakeholders and develop purpose inspired

304
00:17:14.359 --> 00:17:18.319
leadership and meaning infused cultures that elevate fulfillment, performance, and

305
00:17:18.400 --> 00:17:21.599
commitment within the workforce. To learn more or to invite

306
00:17:21.599 --> 00:17:24.359
Elise to speak to your organization, please visit her

307
00:17:24.440 --> 00:17:27.680
at elisecortes dot com. Let's talk about how to get

308
00:17:27.720 --> 00:17:36.480
your employees working on purpose. This is working on Purpose

309
00:17:36.599 --> 00:17:39.880
with doctor Elise Cortes. To reach our program today or

310
00:17:39.920 --> 00:17:42.680
to open a conversation with Elise, send an email to

311
00:17:42.720 --> 00:17:48.160
Elise, E-L-I-S-E, at elisecortes dot com. Now back

312
00:17:48.160 --> 00:17:49.519
to working on purpose.

313
00:17:54.480 --> 00:17:56.279
Thanks for staying with us, and welcome back to working

314
00:17:56.279 --> 00:17:59.119
on Purpose. I'm your host, doctor Elise Cortes, as I too,

315
00:17:59.200 --> 00:18:01.880
am dedicated to help create a world where organizations thrive

316
00:18:01.920 --> 00:18:04.440
because their people thrive, and they're led by inspirational leaders

317
00:18:04.440 --> 00:18:06.759
that help them find and contribute their greatness, and then

318
00:18:06.799 --> 00:18:08.480
we do business that betters the world. I keep

319
00:18:08.519 --> 00:18:10.599
researching and writing my own books. So one of my

320
00:18:10.680 --> 00:18:14.200
latest came out called The Great Revitalization, How activating meaning

321
00:18:14.200 --> 00:18:16.920
and purpose can radically enliven your business. And I wrote

322
00:18:16.920 --> 00:18:20.440
it to help leaders understand today's discerning workforce: what

323
00:18:20.480 --> 00:18:22.200
do they want from you to give their best and

324
00:18:22.240 --> 00:18:24.519
want to stick around? And then I provide you twenty

325
00:18:24.599 --> 00:18:26.920
two best practices to equip you to provide that through

326
00:18:26.960 --> 00:18:29.319
your leadership and your culture. You can find my books

327
00:18:29.359 --> 00:18:32.039
on Amazon or my personal site at elisecortes dot

328
00:18:32.039 --> 00:18:34.680
com. If you are just now joining us, my guest

329
00:18:34.720 --> 00:18:37.759
is Steve Brown. He's the author of The AI Ultimatum,

330
00:18:38.119 --> 00:18:41.200
Preparing for a World of Intelligent Machines and Radical Transformation.

331
00:18:42.440 --> 00:18:45.039
So one of the things that you say in the book,

332
00:18:45.079 --> 00:18:48.039
which I think most of us felt last year, was

333
00:18:48.279 --> 00:18:51.359
you've got to step on the train, you've got

334
00:18:51.359 --> 00:18:53.279
to step onto this platform. We want you to do

335
00:18:53.319 --> 00:18:57.079
it intelligently and thoughtfully, but you do need to get started.

336
00:18:57.119 --> 00:18:58.559
So if you could talk a little bit about this

337
00:18:58.680 --> 00:19:00.960
notion of just find a way to start.

338
00:19:03.319 --> 00:19:05.680
Yeah, if you're not started, you know,

339
00:19:06.119 --> 00:19:09.880
don't wait around. You need to try something, right, try something,

340
00:19:10.359 --> 00:19:13.079
and there are three big steps ahead of you.

341
00:19:14.880 --> 00:19:18.119
The first step is figure out how to enable people

342
00:19:18.279 --> 00:19:22.920
in your teams and to enable yourself with AI tools.

343
00:19:23.240 --> 00:19:26.839
So that means trying out AI, getting to use them,

344
00:19:27.039 --> 00:19:29.640
figuring out which tools you can apply to workflows in

345
00:19:29.680 --> 00:19:33.319
your business, and getting to use them. You know, I'm

346
00:19:33.319 --> 00:19:36.440
old enough to remember when PCs came into the world

347
00:19:36.519 --> 00:19:38.720
and we all had to learn how to use at

348
00:19:38.720 --> 00:19:40.799
the time, WordPerfect and Lotus one two three if

349
00:19:40.839 --> 00:19:46.440
you're old, or Numbers or Word, Excel, PowerPoint, Keynote, Google Workspace,

350
00:19:46.480 --> 00:19:48.880
whatever your thing is, right, you have to learn how

351
00:19:48.920 --> 00:19:51.400
to get the most out of those tools to have

352
00:19:51.519 --> 00:19:54.559
the greatest productivity and impact in the workplace. The same

353
00:19:54.680 --> 00:19:57.839
is true with AI-powered tools. People need to invest

354
00:19:57.880 --> 00:20:00.480
the time to learn how to get the most out of them.

355
00:20:00.680 --> 00:20:03.759
That's step one, enabling, and you need to do that

356
00:20:03.799 --> 00:20:06.880
step right away. If you're not doing it already, you know,

357
00:20:07.160 --> 00:20:10.920
find curiosity for what's possible here. Don't be afraid of AI.

358
00:20:11.640 --> 00:20:15.119
The old adage that you're not going to lose

359
00:20:15.160 --> 00:20:17.279
your job to an AI, you'll lose your job to

360
00:20:17.279 --> 00:20:20.039
a person who knows how to use AI better than you.

361
00:20:20.400 --> 00:20:21.160
It's true.

362
00:20:21.400 --> 00:20:25.400
So invest in yourself, build your own AI acumen. That is

363
00:20:25.440 --> 00:20:27.319
the first step of a three step process.

364
00:20:30.440 --> 00:20:32.400
So many great things to talk about here. One thing

365
00:20:32.440 --> 00:20:34.799
that I thought was interesting, just from a technical advantage

366
00:20:34.799 --> 00:20:36.440
point that I wanted you to speak to is you

367
00:20:36.519 --> 00:20:39.960
talked about how a successful AI portfolio balances different

368
00:20:40.000 --> 00:20:44.599
innovation types. And there are very specific things, including the looms,

369
00:20:44.599 --> 00:20:47.119
et cetera. If you could talk about that idea of

370
00:20:47.200 --> 00:20:49.759
a portfolio and some of this technical language that you

371
00:20:49.880 --> 00:20:52.359
brought in, looms, slide rules, things like that.

372
00:20:52.880 --> 00:20:54.440
Yeah, as I mentioned in the previous section, it's all

373
00:20:54.440 --> 00:20:59.039
about finding balance. The CEO framework is about balance between customers, employees,

374
00:20:59.079 --> 00:21:02.799
and operations. The other framework I offer is actually one from

375
00:21:02.880 --> 00:21:05.519
Roy Bahat, so I need to give Roy full credit.

376
00:21:05.599 --> 00:21:09.359
He's the head of Bloomberg Beta, which is their venture

377
00:21:09.400 --> 00:21:12.559
capital arm, and he came up with this notion of looms,

378
00:21:12.599 --> 00:21:16.599
slide rules and cranes, and I simplify it in the

379
00:21:16.599 --> 00:21:21.839
book to offload, elevate, and extend. Let me explain what

380
00:21:21.880 --> 00:21:23.720
I mean and what Roy means by that. When it

381
00:21:23.759 --> 00:21:28.240
comes to digital employees, AI agents, they fall into these

382
00:21:28.240 --> 00:21:32.480
three categories rather neatly. So a loom is an example

383
00:21:32.680 --> 00:21:36.720
of a piece of technology that completely offloads a task

384
00:21:36.960 --> 00:21:40.119
that a human used to do. Right, humans used to weave,

385
00:21:40.240 --> 00:21:42.400
and now there are looms. There are machines that do

386
00:21:42.480 --> 00:21:45.799
all of that. So there are agents that will offload

387
00:21:46.279 --> 00:21:48.920
tasks entirely, things that we just don't want to do.

388
00:21:48.920 --> 00:21:51.039
They'll take the suck out of our jobs. Right, So,

389
00:21:51.880 --> 00:21:55.000
booking meetings for five people and finding a meeting room

390
00:21:55.000 --> 00:21:58.400
and organizing their calendars, no one likes doing that. Have

391
00:21:58.519 --> 00:22:02.480
an AI agent do it, perfect job. Then there are

392
00:22:02.640 --> 00:22:06.480
slide rules. Now I'm just about old enough to remember

393
00:22:06.480 --> 00:22:08.599
what a slide rule is. I've never really used one.

394
00:22:08.720 --> 00:22:11.160
I'm not sure how to use one. But it was

395
00:22:11.200 --> 00:22:13.680
a piece of plastic that allowed you, when you're

396
00:22:13.680 --> 00:22:17.400
doing mathematical sums, to be able to speed up the

397
00:22:17.440 --> 00:22:21.279
process of figuring out answers to things. It doesn't help

398
00:22:21.319 --> 00:22:23.720
you do something you couldn't do before, it just makes

399
00:22:23.759 --> 00:22:28.400
it quicker. So that's an example of elevating performance, helping

400
00:22:28.440 --> 00:22:31.319
you do something you could already do better, whether that

401
00:22:31.440 --> 00:22:35.039
is being more creative, doing it faster, or doing it cheaper,

402
00:22:35.039 --> 00:22:38.240
whatever it might be, you're doing it better. Then the

403
00:22:38.279 --> 00:22:43.119
final category that Bahat talks about is cranes. You could

404
00:22:43.160 --> 00:22:48.440
not build Manhattan without cranes, right. Cranes are a technology

405
00:22:48.440 --> 00:22:52.559
that allows humans to extend their capabilities and do stuff

406
00:22:52.599 --> 00:22:56.440
they couldn't do before. It's the ultimate sort of rarefied

407
00:22:56.559 --> 00:22:59.440
category of agents that you want to be trying to

408
00:22:59.480 --> 00:23:02.440
build so that you can pair these agents with your

409
00:23:02.440 --> 00:23:08.119
employees and through the collaboration of an extend agent, a

410
00:23:08.680 --> 00:23:12.319
crane, and a human, you allow them to do something

411
00:23:12.319 --> 00:23:17.920
they couldn't do before. They become superhuman employees, extending their abilities.

412
00:23:18.119 --> 00:23:20.559
So that's the hardest category. If you can do that,

413
00:23:21.960 --> 00:23:24.920
you know you're really doing well. So again it's about balance.

414
00:23:25.160 --> 00:23:27.400
You want to make sure you don't just come up

415
00:23:27.440 --> 00:23:31.279
with offload agents. You want to balance of offload agents,

416
00:23:31.519 --> 00:23:34.079
elevate agents, and extend agents.
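The three agent categories can be sketched in code. This is a rough illustration, not anything from the book or the conversation: the `AgentType` enum, the sample agents, and the balance check are all invented for the sketch.

```python
from enum import Enum

class AgentType(Enum):
    """Roy Bahat's taxonomy, in the book's simplified terms."""
    OFFLOAD = "loom"        # fully takes over a task humans no longer want
    ELEVATE = "slide rule"  # speeds up something a human can already do
    EXTEND = "crane"        # lets a human do something previously impossible

# Hypothetical agent portfolio, tagged by category.
portfolio = {
    "meeting-scheduler": AgentType.OFFLOAD,
    "draft-accelerator": AgentType.ELEVATE,
    "protein-designer": AgentType.EXTEND,
}

def is_balanced(agents):
    """Balanced means at least one agent in every category, not just offload."""
    return set(agents.values()) == set(AgentType)

print(is_balanced(portfolio))  # prints True
```

A portfolio of nothing but offload agents would fail this check, which is the imbalance the conversation warns against.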

417
00:23:35.079 --> 00:23:38.759
A bit on that notion of the crane agent that

418
00:23:38.839 --> 00:23:42.240
does the extension. From a leadership standpoint, if

419
00:23:42.279 --> 00:23:46.160
you want to motivate somebody and you are helping them

420
00:23:46.200 --> 00:23:50.279
to step into their greatness, that's pretty darn motivating, right,

421
00:23:50.640 --> 00:23:52.640
you know. So that's one of those things that when

422
00:23:52.640 --> 00:23:55.200
you think about it, somebody would be tremendously motivated

423
00:23:55.200 --> 00:23:57.519
to have the opportunity to do that, to

424
00:23:57.599 --> 00:23:59.119
literally expand themselves.

425
00:24:00.599 --> 00:24:05.559
Yeah, and that's what true leadership is about now, leadership

426
00:24:05.559 --> 00:24:08.160
in the AI era. It used to be that managers

427
00:24:08.240 --> 00:24:12.640
were there to manage scarce resources. You're given a certain

428
00:24:12.680 --> 00:24:15.559
number of heads, you are given a budget, and you

429
00:24:15.640 --> 00:24:17.480
do your best to deliver the results you're asked to

430
00:24:17.519 --> 00:24:20.599
deliver within that framework. And you know, we've all been there,

431
00:24:20.640 --> 00:24:25.559
we've all done that. It's not fun, honestly. But now

432
00:24:25.759 --> 00:24:32.279
the opportunity, because AI enables you to collapse constraints of

433
00:24:32.400 --> 00:24:36.240
cost and time, you can start to free yourself up

434
00:24:36.240 --> 00:24:39.079
as a manager and step into a leadership position where

435
00:24:39.119 --> 00:24:42.680
your job is now not to manage resources, it is

436
00:24:42.799 --> 00:24:48.799
to build superhuman teams. By building teams where you're pairing,

437
00:24:49.400 --> 00:24:54.279
you're building collaborations of humans, agents, and robots working together

438
00:24:54.720 --> 00:24:58.440
where one plus one plus one equals seventeen. Right, that's

439
00:24:58.480 --> 00:24:59.559
the opportunity here.

440
00:25:00.000 --> 00:25:04.440
Oh, see, that was gold, Steve. Yeah, that's wonderful. Okay,

441
00:25:04.920 --> 00:25:06.960
So another thing that I thought was really useful and

442
00:25:07.000 --> 00:25:08.920
helpful for somebody who's trying to get their head around,

443
00:25:08.920 --> 00:25:10.799
how do I start on this thing, is you talk

444
00:25:10.880 --> 00:25:16.200
about this three horizons framework that organizations can consider for

445
00:25:16.240 --> 00:25:18.960
how they're starting to use it. If you could speak

446
00:25:19.000 --> 00:25:20.799
to those and give us maybe some examples.

447
00:25:21.680 --> 00:25:24.319
Yeah, and again this is about balance, right when you

448
00:25:24.440 --> 00:25:27.680
are trying to figure out what does my portfolio of

449
00:25:27.720 --> 00:25:30.359
AI projects look like. If you're a big enough organization,

450
00:25:30.759 --> 00:25:32.400
you're not just going to have one project, You're going

451
00:25:32.440 --> 00:25:35.319
to have multiple. So you want to spread across CEO.

452
00:25:35.680 --> 00:25:40.359
You want to spread across the extend, elevate, and offload,

453
00:25:40.720 --> 00:25:43.319
but you also want to think about different levels of

454
00:25:43.359 --> 00:25:47.039
investment and ROI essentially. And so I talk in the

455
00:25:47.079 --> 00:25:50.640
book about the IDC three horizons framework, which says you're

456
00:25:50.640 --> 00:25:55.839
going to have short term projects which are incremental in nature. Right,

457
00:25:55.839 --> 00:25:59.400
they're probably six to twelve months of effort, and they're

458
00:25:59.440 --> 00:26:01.799
going to take something that you're already doing and just

459
00:26:01.839 --> 00:26:04.680
make it better. So it's an incremental change. It's not

460
00:26:05.279 --> 00:26:08.599
you're not gonna win any awards or anything, but it's important.

461
00:26:08.640 --> 00:26:10.880
It's going to have an impact on your business. And

462
00:26:11.119 --> 00:26:13.599
there is the second horizon, which is your sort of

463
00:26:13.640 --> 00:26:17.640
medium term. It's probably twelve to eighteen months, where you're

464
00:26:17.640 --> 00:26:20.440
really looking for, sorry, not incremental change, but

465
00:26:20.680 --> 00:26:23.480
a revolutionary change of some sort. You're looking for a

466
00:26:23.519 --> 00:26:28.799
real breakthrough where you're disruptive. You're disrupting something you're already doing,

467
00:26:29.000 --> 00:26:32.880
disrupting your competition, but you're really making a significant impact.

468
00:26:32.960 --> 00:26:34.480
You're going to put a lot more effort into it

469
00:26:34.599 --> 00:26:36.240
because it's going to be a twelve to eighteen month

470
00:26:36.319 --> 00:26:40.240
time horizon, but you're hopefully getting some disruptive impact. Then

471
00:26:40.279 --> 00:26:43.000
there's the longer term stuff, which you're now in the

472
00:26:43.559 --> 00:26:47.279
three year project time horizon. And these are the projects

473
00:26:47.400 --> 00:26:51.599
where you change the game that everybody's playing. You are

474
00:26:51.680 --> 00:26:54.559
coming to market with an entirely new product, an entirely

475
00:26:54.599 --> 00:26:57.720
new service. You perhaps have a different business model, you

476
00:26:57.759 --> 00:26:59.960
are looking to really change the game with your competitors.

477
00:27:01.039 --> 00:27:03.799
And these are the big investments that you make which

478
00:27:03.839 --> 00:27:08.519
are going to pay off to transform the prospects for

479
00:27:08.559 --> 00:27:11.839
your entire organization. And you want to balance again between

480
00:27:11.880 --> 00:27:13.960
all of these. You want the short term, the medium term,

481
00:27:14.119 --> 00:27:16.200
and yes, you want to place some long term bets

482
00:27:16.720 --> 00:27:19.400
because you should bet that your competitors are doing that

483
00:27:19.440 --> 00:27:19.680
to you.
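The horizons split might be sketched as a simple classifier. The month boundaries follow the conversation above (six to twelve, twelve to eighteen, about three years); the function and the sample projects are invented for illustration:

```python
def horizon(months):
    """Bucket an AI project by time horizon, per the rough boundaries above."""
    if months <= 12:
        return "Horizon 1: incremental"   # improve something you already do
    if months <= 18:
        return "Horizon 2: disruptive"    # a real breakthrough on existing work
    return "Horizon 3: game-changing"     # new products, services, business models

# A balanced portfolio places bets across all three horizons.
projects = {"workflow tune-up": 9, "claims overhaul": 15, "AI-native product": 36}
for name, months in projects.items():
    print(f"{name}: {horizon(months)}")
```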

484
00:27:20.640 --> 00:27:22.480
I hope in this conversation, which is why I wanted

485
00:27:22.480 --> 00:27:24.079
to bring you on after I read your book,

486
00:27:24.119 --> 00:27:28.000
that this conversation is as exciting and inspiring for our

487
00:27:28.039 --> 00:27:30.160
listeners and viewers as it has been for me. This

488
00:27:30.319 --> 00:27:32.920
is opening up so many more vistas and being able

489
00:27:32.960 --> 00:27:35.839
to sort of you've helped us better understand how to

490
00:27:36.000 --> 00:27:39.799
navigate this, what's available, and then most importantly going

491
00:27:39.839 --> 00:27:41.920
to this next thing. For a lot of people

492
00:27:41.960 --> 00:27:44.000
who are scared to death of all of this, don't

493
00:27:44.079 --> 00:27:46.039
understand it, don't understand what their role is, and are

494
00:27:46.079 --> 00:27:50.480
only afraid of the negative consequences. I had to celebrate

495
00:27:50.480 --> 00:27:53.319
your notion of how to become robot-proof,

496
00:27:53.440 --> 00:27:56.480
which becomes a human advantage, and that when you're out speaking,

497
00:27:56.519 --> 00:27:59.240
you say that people are always asking you how do

498
00:27:59.359 --> 00:28:01.680
I become robot-proof? Or what should I tell

499
00:28:01.680 --> 00:28:04.200
my children or grandchildren to prepare for their

500
00:28:04.279 --> 00:28:06.759
future workplace? If you could speak to the answer you

501
00:28:06.799 --> 00:28:07.559
give them for that.

502
00:28:08.319 --> 00:28:09.839
Yeah, it's the number one question I get from my

503
00:28:09.880 --> 00:28:11.920
audiences when I do a keynote, and it doesn't matter

504
00:28:11.920 --> 00:28:14.759
what country I'm in, what continent I'm on, people are

505
00:28:14.759 --> 00:28:17.200
worried about, you know, am I going to be replaced?

506
00:28:17.200 --> 00:28:18.960
What do I tell my children? That's the sort of

507
00:28:19.000 --> 00:28:21.079
biggest question I get. What do I tell my children?

508
00:28:21.880 --> 00:28:23.279
And I tell them all the same thing, which is

509
00:28:23.920 --> 00:28:27.960
Number one. Invest in yourself, Learn how to use AI,

510
00:28:28.279 --> 00:28:32.359
be curious, lean into it. Use this stuff in your

511
00:28:32.359 --> 00:28:34.920
private life, figure out ways you can use it in

512
00:28:34.960 --> 00:28:38.759
your work life, get comfortable with it, and learn how

513
00:28:38.759 --> 00:28:41.279
to get the best results out of these AI tools

514
00:28:41.279 --> 00:28:44.559
so that you're able to amplify the value that you're

515
00:28:44.599 --> 00:28:47.319
providing to a company. You know. One of the things

516
00:28:47.359 --> 00:28:49.200
I talk about in the book is a guy I

517
00:28:49.279 --> 00:28:51.839
used to work with called Dave Todd. I'm still in

518
00:28:51.880 --> 00:28:55.000
touch with him. A great Scottish guy, and when email

519
00:28:55.079 --> 00:28:57.599
first came out back in the day when I still

520
00:28:57.599 --> 00:29:01.440
had a good head of hair, he refused to use email.

521
00:29:02.440 --> 00:29:03.960
He didn't like it, he didn't know how to type,

522
00:29:04.160 --> 00:29:07.839
and so he had his admin assistant print out every

523
00:29:07.839 --> 00:29:11.480
one of his emails. He would handwrite a reply on

524
00:29:11.599 --> 00:29:13.839
the paper and hand it back to her to type up.

525
00:29:14.599 --> 00:29:19.559
Can you imagine how productive he was versus everyone else, right? But

526
00:29:20.079 --> 00:29:24.400
people who refuse to use AI today are doing exactly that.

527
00:29:25.240 --> 00:29:27.119
So there's a whole section of my book called don't

528
00:29:27.119 --> 00:29:30.440
be Dave Todd because you need to lean into this

529
00:29:31.200 --> 00:29:33.480
and you don't have any choice. Same way Dave ultimately

530
00:29:33.480 --> 00:29:35.960
didn't have any choice. He learned to peck his way

531
00:29:35.960 --> 00:29:38.880
through writing emails with two fingers and he kept up

532
00:29:38.880 --> 00:29:41.000
with the world. But we all have to do it.

533
00:29:41.039 --> 00:29:45.119
So invest in yourself. Get comfortable with AI and learn how

534
00:29:45.160 --> 00:29:50.000
to use it and be adaptable, be willing to change

535
00:29:50.079 --> 00:29:54.119
the way you do what you do, and find the

536
00:29:54.200 --> 00:29:56.680
vision to be able to say, Okay, if I have

537
00:29:56.880 --> 00:29:59.480
AI in my life, how would I now do

538
00:29:59.559 --> 00:30:04.440
things, and ask those questions. It's about having what's

539
00:30:04.480 --> 00:30:10.160
called possibility thinking, asking questions like how could we, how

540
00:30:10.160 --> 00:30:14.240
could we do...? And if you hear yourself saying

541
00:30:14.599 --> 00:30:17.599
anything that resembles well, that's not how we do things

542
00:30:17.599 --> 00:30:21.000
around here, you realize you're probably going in the wrong direction.

543
00:30:21.960 --> 00:30:28.000
Yeah. Yeah, So continuing that conversation about how people ask you,

544
00:30:28.000 --> 00:30:30.000
you know, how do I remain relevant in today's world?

545
00:30:30.480 --> 00:30:32.200
We have to talk about. One of the things you

546
00:30:32.240 --> 00:30:34.440
say is you know, so when your child or grandchild

547
00:30:34.480 --> 00:30:36.480
asks you what they should do to remain competitive in

548
00:30:36.519 --> 00:30:39.920
the workplace, you tell them that they should lean into

549
00:30:39.960 --> 00:30:42.759
building their soft skills, those little pesky things that I've

550
00:30:42.759 --> 00:30:45.079
been teaching for years and years and years, and their

551
00:30:45.079 --> 00:30:48.440
ability to build human connection and nurture relationships, to negotiate,

552
00:30:48.480 --> 00:30:52.359
to inspire, to lead, to build a new business. So

553
00:30:52.640 --> 00:30:54.839
I think that's really solid. I was just by the

554
00:30:54.880 --> 00:30:56.519
way, I sat my twenty three year old daughter down

555
00:30:56.559 --> 00:30:58.759
and told her about you guys. I read this amazing

556
00:30:58.759 --> 00:31:02.240
book, Steve Brown. So I did this fast download and

557
00:31:02.319 --> 00:31:04.000
that's part of what I shared with her. And so

558
00:31:04.039 --> 00:31:06.000
if you could speak a little bit to you know,

559
00:31:06.039 --> 00:31:10.559
these, these not new, these distinguishing, you know, uniquely human

560
00:31:10.640 --> 00:31:13.640
skills that really distinguish us and help us remain competitive.

561
00:31:14.359 --> 00:31:17.000
Yeah. The shorthand phrase I use with my audiences for

562
00:31:17.039 --> 00:31:23.680
that is simple, double down on your humanity, lean into

563
00:31:23.720 --> 00:31:29.039
who you are, and develop the skills that are uniquely human,

564
00:31:30.200 --> 00:31:33.640
your social skills, your communication skills, your skills of persuasion.

565
00:31:34.519 --> 00:31:36.480
Because the jobs that will be left because we will

566
00:31:36.480 --> 00:31:41.279
automate a lot of things, AI will free us

567
00:31:41.799 --> 00:31:44.880
from repetitive, boring work so that we can focus on

568
00:31:44.920 --> 00:31:47.839
the things that really matter, which are about human connection

569
00:31:48.759 --> 00:31:55.400
and influence and negotiation and communication and inspiration and leadership.

570
00:31:56.880 --> 00:32:00.799
Build those skills. I mean, sure, learn how to use AI,

571
00:32:00.839 --> 00:32:05.039
that's very important too, but lean into becoming more human

572
00:32:05.839 --> 00:32:09.480
because that makes you uniquely valuable. People who have both

573
00:32:09.559 --> 00:32:12.519
of those who can span both of these skillsets, that

574
00:32:12.839 --> 00:32:15.960
know how to get the most out of AI and

575
00:32:16.079 --> 00:32:17.920
in some ways to get the most out of other

576
00:32:18.000 --> 00:32:20.880
human beings are going to be the ones that thrive

577
00:32:21.000 --> 00:32:23.640
and are valuable for the longest in the workplace.

578
00:32:23.960 --> 00:32:25.559
And of course that's the part that really made me

579
00:32:25.599 --> 00:32:27.359
giddy because that is exactly the work that we do.

580
00:32:27.960 --> 00:32:31.319
So now, yes, we'll lean on you to help them

581
00:32:31.359 --> 00:32:33.519
really learn how to use AI, but all the other

582
00:32:33.559 --> 00:32:36.640
doubling down on humanity piece. That's our jam. So we're

583
00:32:36.640 --> 00:32:39.519
a good team, Steve. You and I go on the road.

584
00:32:39.359 --> 00:32:42.640
Go on the road, listeners and viewers. Let's take a

585
00:32:42.680 --> 00:32:45.160
quick break here. I'm your host, doctor Elise Cortes. So

586
00:32:45.160 --> 00:32:46.519
we're on the air with Steve Brown. He's a

587
00:32:46.599 --> 00:32:49.880
leading voice in the conversation on artificial intelligence, a former

588
00:32:49.960 --> 00:32:52.880
executive at Google DeepMind and Intel, and he

589
00:32:52.960 --> 00:32:56.640
delivers rousing conversations on using AI. We'll be right back

590
00:32:56.640 --> 00:32:58.759
and after the break, we want to talk about going

591
00:32:58.799 --> 00:33:01.920
from offloading to orchestration when it comes to using AI.

603
00:34:03.559 --> 00:34:10.679
Thanks for staying with us, and

604
00:34:10.719 --> 00:34:12.719
welcome back to working on Purpose. I am your host,

605
00:34:12.760 --> 00:34:13.840
Doctor Elise Cortes.

606
00:34:14.039 --> 00:34:16.199
As you know by now, this program is dedicated to

607
00:34:16.320 --> 00:34:18.920
empowering and inspiring you along your journey to realize more

608
00:34:18.920 --> 00:34:22.039
of your potential. I mentioned the last book that came

609
00:34:22.039 --> 00:34:24.840
out that I wrote, The Great Revitalization, that contains those

610
00:34:24.880 --> 00:34:27.239
twenty two best practices to equip you to elevate your

611
00:34:27.239 --> 00:34:31.440
culture and your leadership. You can download on my website

612
00:34:31.480 --> 00:34:33.880
a free three page assessment to learn just how you're

613
00:34:33.880 --> 00:34:38.199
doing in terms of today's workforce. Go to gusto dash now dot

614
00:34:38.199 --> 00:34:40.719
com and you can grab it from the homepage if

615
00:34:40.719 --> 00:34:42.840
you are just joining us. My guest is Steve Brown.

616
00:34:42.920 --> 00:34:45.880
He is the author of The AI Ultimatum, Preparing for

617
00:34:45.920 --> 00:34:50.119
a World of Intelligent Machines and Radical Transformation. So there

618
00:34:50.119 --> 00:34:52.480
were so many things, so many pearls, it was hard

619
00:34:52.519 --> 00:34:55.840
to determine which, when there's only about fifteen topics that

620
00:34:55.880 --> 00:34:58.440
we can really hit here. But I definitely wanted you

621
00:34:58.519 --> 00:35:01.119
to say a bit more. You did already mention a

622
00:35:01.119 --> 00:35:04.639
little bit about this idea of orchestration, but if you

623
00:35:04.639 --> 00:35:07.880
could kind of talk us through how organizations can

624
00:35:07.880 --> 00:35:11.679
embrace that orchestration and those, what is it, four components

625
00:35:11.679 --> 00:35:11.960
of it?

626
00:35:13.360 --> 00:35:17.639
Yeah, I mean the orchestration is this idea of deploying

627
00:35:17.719 --> 00:35:23.000
a blended workforce, which is people, AI agents, so AI

628
00:35:23.039 --> 00:35:25.679
in the form of software, and then robots, AI in

629
00:35:25.719 --> 00:35:29.199
the form of hardware, and figuring out looking at every

630
00:35:29.199 --> 00:35:33.039
workflow you have, splitting it into tasks and asking the question,

631
00:35:33.119 --> 00:35:36.000
which of these tasks is still best done by a human,

632
00:35:36.360 --> 00:35:38.119
and in many cases it will still be a human

633
00:35:38.280 --> 00:35:41.719
is the best for the job, which is best done

634
00:35:41.760 --> 00:35:44.320
by an agent, and which is best done by a

635
00:35:44.360 --> 00:35:47.360
robot if it's a physical task. And then if it's

636
00:35:47.360 --> 00:35:50.360
done by an agent, what type of agent is it?

637
00:35:50.440 --> 00:35:52.559
Based on the conversation we had in the previous section,

638
00:35:52.800 --> 00:35:55.880
is it an offload agent, is it an elevate agent,

639
00:35:56.000 --> 00:35:58.599
is it an extend agent? And then look at the

640
00:35:58.639 --> 00:36:01.880
handoffs between them, because when you're handing off from one

641
00:36:01.960 --> 00:36:04.559
person to another in a workflow, you have to figure

642
00:36:04.559 --> 00:36:07.719
out how that information is handed from one person to another,

643
00:36:07.760 --> 00:36:10.599
whether it's pieces of paper that are handed down, or

644
00:36:10.880 --> 00:36:14.840
emails or data that's in a database. If you're going

645
00:36:14.840 --> 00:36:17.079
between a human and an agent, or an agent to

646
00:36:17.159 --> 00:36:19.239
a robot, you still have to do the same thing

647
00:36:19.400 --> 00:36:21.920
to figure out how does the information flow and what's

648
00:36:21.960 --> 00:36:24.400
the form that it flows in. How do you make

649
00:36:24.440 --> 00:36:27.400
it easy for a human to receive information from an

650
00:36:27.400 --> 00:36:31.199
AI for example. So you're figuring out this workflow, and

651
00:36:31.480 --> 00:36:35.480
so there's that level of orchestration. You also might orchestrate

652
00:36:35.559 --> 00:36:38.199
flows where agents are working with each other. You're going

653
00:36:38.239 --> 00:36:42.159
to have multiple agents working together to try and solve

654
00:36:42.159 --> 00:36:44.280
a problem. For you, and you're seeing a lot of

655
00:36:44.320 --> 00:36:48.719
people working on these solutions now where multiple agents work

656
00:36:48.760 --> 00:36:51.880
behind the scenes and they appear as one agent, but

657
00:36:51.960 --> 00:36:55.800
behind the scenes there's multiple scurrying away to get things

658
00:36:55.880 --> 00:36:58.159
done for you. And Walmart has taken that process. If

659
00:36:58.159 --> 00:37:01.039
you go to the Walmart app on your phone, if you

660
00:37:01.079 --> 00:37:05.119
have one, you will see an agent there called Sparky,

661
00:37:05.679 --> 00:37:09.000
and Sparky is actually a collection of agents. Behind the scenes.

662
00:37:09.000 --> 00:37:10.719
There are lots of agents, but it appears to you

663
00:37:10.800 --> 00:37:13.639
as one and Walmart's just going to keep adding new

664
00:37:13.679 --> 00:37:17.280
agents underneath Sparky over time, so Sparky will seem to

665
00:37:17.320 --> 00:37:19.760
get brighter and more capable and more able to help

666
00:37:19.760 --> 00:37:20.320
you with stuff.
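The orchestration described here, splitting a workflow into tasks, assigning each to a human, an agent, or a robot, and making every handoff explicit, might be modeled like this rough sketch; the task names, worker kinds, and handoff formats are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    assigned_to: str     # "human", "agent", or "robot"
    handoff_format: str  # how output flows to the next task: email, database row, etc.

# A hypothetical order-fulfillment workflow split into tasks.
workflow = [
    Task("take customer call", "human", "ticket in database"),
    Task("draft quote", "agent", "email to customer"),
    Task("pick items in warehouse", "robot", "scan event in database"),
    Task("resolve escalations", "human", "case notes"),
]

def handoffs(flow):
    """List each handoff: who hands to whom, and in what format."""
    return [(a.assigned_to, b.assigned_to, a.handoff_format)
            for a, b in zip(flow, flow[1:])]

for giver, receiver, fmt in handoffs(workflow):
    print(f"{giver} -> {receiver} via {fmt}")
```

Enumerating the handoffs this way surfaces exactly the question raised in the conversation: in what form does information move between a human, an agent, and a robot at each boundary.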

667
00:37:21.000 --> 00:37:24.480
That's awesome. It's so awesome. It's exciting, you know, somebody

668
00:37:24.519 --> 00:37:28.840
who believes in human potential and also just potential in general.

669
00:37:28.880 --> 00:37:33.039
This is why you get so excited about this stuff. Steve, Okay,

670
00:37:33.119 --> 00:37:34.239
so we go ahead.

671
00:37:34.599 --> 00:37:39.559
No, it's an exciting time. I mean, it's easy to get,

672
00:37:41.639 --> 00:37:41.840
You know.

673
00:37:42.239 --> 00:37:46.480
The news makes us feel a bit like we're losing our

674
00:37:46.519 --> 00:37:48.920
agency and oh my god, AI is going to come.

675
00:37:49.000 --> 00:37:51.639
We're all out of a job. But if we do

676
00:37:51.719 --> 00:37:55.840
this right, we're going to amplify our ability to do

677
00:37:55.920 --> 00:38:00.920
great stuff. We're probably going to accelerate scientific discovery in

678
00:38:00.960 --> 00:38:03.760
the next five to ten years. I suspect we will

679
00:38:03.800 --> 00:38:10.800
see major breakthroughs in physics, chemistry, biology, medical sciences, mathematics, engineering,

680
00:38:11.320 --> 00:38:15.079
material science that will benefit us all. And if we

681
00:38:15.159 --> 00:38:16.840
do it right, it's going to make it more fun

682
00:38:16.920 --> 00:38:19.800
to come to work because AI will do the stuff

683
00:38:19.800 --> 00:38:20.920
we don't really want to do.

684
00:38:21.199 --> 00:38:24.239
Right, I am with you, I got it, Okay. So

685
00:38:24.360 --> 00:38:28.079
now we've continued the conversation, and organizations have begun

686
00:38:28.199 --> 00:38:31.280
using AI. Now they've learned how to do that, the

687
00:38:31.480 --> 00:38:35.760
orchestration process, they've got these various ways of employing the

688
00:38:35.760 --> 00:38:39.920
intelligence and such. We get to this idea of AGI,

689
00:38:40.199 --> 00:38:43.199
artificial general intelligence. What is it and why is it

690
00:38:43.239 --> 00:38:43.840
so important?

691
00:38:45.119 --> 00:38:47.239
So AGI is a term that's been around for some time,

692
00:38:47.320 --> 00:38:49.840
and it means different things to different people. So let

693
00:38:49.880 --> 00:38:53.760
me tell you what it stands for. It's artificial general intelligence.

694
00:38:54.159 --> 00:38:55.920
It's this idea that you have an AI that is

695
00:38:55.960 --> 00:38:58.840
able to do any task that you want. And probably

696
00:38:59.199 --> 00:39:02.239
the easiest definition to think of is it's an AI

697
00:39:02.400 --> 00:39:05.880
that's as good at every task as any human on

698
00:39:05.920 --> 00:39:08.880
the planet. Perhaps the best human at that task on

699
00:39:08.920 --> 00:39:12.920
the planet. So how far away are we from that?

700
00:39:14.239 --> 00:39:16.199
Some people will tell you that's coming this year. Some

701
00:39:16.199 --> 00:39:20.400
people tell you it's already here in a way. Some

702
00:39:20.440 --> 00:39:22.800
people will tell you, depending on their definition of how

703
00:39:22.800 --> 00:39:25.840
stringent they are, right, can an AI come up with

704
00:39:25.920 --> 00:39:29.079
new ideas on its own? We're not quite there yet.

705
00:39:30.199 --> 00:39:34.320
That could be another five years, but we're pretty close

706
00:39:34.519 --> 00:39:36.960
to having an AI that is able to do anything

707
00:39:37.000 --> 00:39:41.119
that a human can do. Whether that's next year or

708
00:39:41.440 --> 00:39:45.119
ten years from now, in the grand scope of time,

709
00:39:45.239 --> 00:39:48.719
it's pretty soon. And if you have done the hard

710
00:39:48.760 --> 00:39:51.519
work to figure out where does AI fit in my

711
00:39:51.639 --> 00:39:55.480
organization and where does it not, because some

712
00:39:55.559 --> 00:39:58.519
tasks are best done by a human, then when the

713
00:39:58.599 --> 00:40:02.000
AI does become AGI, you're essentially turning the dial up

714
00:40:02.400 --> 00:40:05.760
on all of those AIs in your workflows to eleven.

715
00:40:05.800 --> 00:40:09.119
And that's a nice reference from a movie from the past,

716
00:40:10.199 --> 00:40:14.280
Spinal Tap. And so you're getting the benefit of this

717
00:40:14.639 --> 00:40:17.079
this moved AGI. But if you haven't done the hard work,

718
00:40:17.360 --> 00:40:20.239
you don't get that benefit. So that's the reason

719
00:40:20.320 --> 00:40:23.000
to do this now, even when the technology is perhaps

720
00:40:23.280 --> 00:40:26.679
still maturing, because once we get to that point and

721
00:40:26.760 --> 00:40:29.360
it could happen very quickly. You want to be able

722
00:40:29.360 --> 00:40:30.519
to take full advantage of it.

723
00:40:33.239 --> 00:40:36.639
Okay, now let's do a little bit of peering into

724
00:40:36.639 --> 00:40:39.360
the future. One of the things that was really fascinating

725
00:40:39.360 --> 00:40:41.239
about your book is how you talk about this, you know,

726
00:40:41.320 --> 00:40:45.480
this three-decade economic roller coaster we've stepped onto already,

727
00:40:45.519 --> 00:40:49.760
and you do reference Vinod Khosla, the co-founder of Sun

728
00:40:49.800 --> 00:40:52.079
Microsystems and a guy who was smart enough to be

729
00:40:52.119 --> 00:40:55.840
an early investor in OpenAI, which was amazing. But he

730
00:40:56.079 --> 00:40:58.880
has discussed this idea of what the next

731
00:40:58.880 --> 00:41:00.679
three decades are going to look like. If you could

732
00:41:00.719 --> 00:41:03.280
sort of speak a little bit to those notions, it's

733
00:41:03.400 --> 00:41:07.400
pretty... hang on to your seats, listeners and viewers, as

734
00:41:07.400 --> 00:41:08.119
he talks about this.

735
00:41:08.880 --> 00:41:12.280
Yeah, get the children inside, because yes, here we go.

736
00:41:13.079 --> 00:41:13.559
Yeah.

737
00:41:13.639 --> 00:41:15.280
Yeah. So I'll tell you what, Vinod Khosla and I

738
00:41:15.559 --> 00:41:17.199
have not met. I would love to.

739
00:41:18.159 --> 00:41:20.639
He's a really interesting guy and there's lots of videos

740
00:41:20.960 --> 00:41:23.360
of his on YouTube if this sounds interesting to you.

741
00:41:23.400 --> 00:41:25.599
But what he says is the rest of the twenty

742
00:41:25.639 --> 00:41:30.360
twenties will feel like a period of extreme productivity growth.

743
00:41:30.800 --> 00:41:34.920
People using AI to improve human productivity and you see

744
00:41:34.920 --> 00:41:38.320
this huge ramp. Then he sees the twenty thirties as

745
00:41:38.360 --> 00:41:42.679
a period of massive disruption where AI gets so good

746
00:41:43.119 --> 00:41:45.440
that it can do most of the work that needs

747
00:41:45.519 --> 00:41:49.199
to be done, whether that is physical work or knowledge work,

748
00:41:49.719 --> 00:41:53.760
and people are displaced from the workforce because we just

749
00:41:53.800 --> 00:41:56.760
don't need humans to do these things anymore. And at that

750
00:41:56.960 --> 00:41:59.880
point you have to figure out, well, how do we,

751
00:42:00.360 --> 00:42:03.119
you know, pay the mortgage and is it a mortgage still?

752
00:42:03.159 --> 00:42:06.079
And you know, how do we deal with the economy.

753
00:42:07.239 --> 00:42:10.400
If you can get through all of that, Khosla then says,

754
00:42:10.519 --> 00:42:12.719
in the twenty forties, you move into this period of

755
00:42:12.760 --> 00:42:17.599
abundance where we live lives of service, looking after each other.

756
00:42:17.679 --> 00:42:21.679
We don't have to work anymore. We're able to focus

757
00:42:21.719 --> 00:42:26.000
on artistic pursuits, discovery, figuring out what we want to

758
00:42:26.039 --> 00:42:27.599
do with our lives. It's sort of the very top

759
00:42:27.639 --> 00:42:32.280
of Maslow's hierarchy: self-actualization. And now to get there,

760
00:42:32.519 --> 00:42:35.400
to go between the twenty thirties and the twenty forties,

761
00:42:35.920 --> 00:42:39.239
you need to essentially reinvent the economy and the social contract.

762
00:42:39.719 --> 00:42:42.599
And that is not a conversation our politicians are having

763
00:42:42.679 --> 00:42:47.159
right now, and they need to. Unfortunately, Superman syndrome means,

764
00:42:47.440 --> 00:42:49.760
you know, politicians don't swoop in and solve a problem

765
00:42:49.880 --> 00:42:52.760
until it's very visible. They're not going to proactively solve

766
00:42:52.760 --> 00:42:55.320
a problem because they want that Superman moment when they

767
00:42:55.360 --> 00:42:58.880
can swoop in, appear to solve a problem and get votes.

768
00:42:59.079 --> 00:43:02.440
So they're not going to proactively fix this. But that's

769
00:43:02.440 --> 00:43:04.599
my biggest worry is how do we make that transition

770
00:43:04.719 --> 00:43:06.679
through the twenty thirties and.

771
00:43:07.000 --> 00:43:09.480
To that end, something that I have seen here or there,

772
00:43:09.519 --> 00:43:12.519
and we discussed very briefly in just a few other episodes,

773
00:43:12.559 --> 00:43:17.760
not very many, is when you talk about renegotiating

774
00:43:17.800 --> 00:43:20.039
the social contract, even what does the economy look like?

775
00:43:20.480 --> 00:43:23.400
Then we step into, you know, discussing things like

776
00:43:23.599 --> 00:43:27.199
universal basic income, and then you also talk about negative

777
00:43:27.239 --> 00:43:29.960
income tax. So let's situate what those are for our

778
00:43:29.960 --> 00:43:31.400
listeners and viewers.

779
00:43:31.880 --> 00:43:35.760
Yeah, there are multiple proposals out there. I think we're

780
00:43:35.760 --> 00:43:39.400
probably going to need a combination of all of them. Universal

781
00:43:39.440 --> 00:43:43.719
basic income is this idea that everybody gets a fixed

782
00:43:43.719 --> 00:43:47.519
amount of money from the government and that you know,

783
00:43:47.840 --> 00:43:50.679
if you work on top of that, good for you, but

784
00:43:50.960 --> 00:43:54.320
you have enough to live. And there are criticisms of

785
00:43:54.320 --> 00:43:57.440
that, probably quite rightly, which is that it doesn't necessarily motivate

786
00:43:57.519 --> 00:44:00.760
you to participate in the economy if there's a place

787
00:44:00.800 --> 00:44:04.480
you want to participate. So this idea of negative income

788
00:44:04.519 --> 00:44:07.480
tax is one that says if you earn below a

789
00:44:07.519 --> 00:44:10.599
certain threshold, you get money back from the government. If

790
00:44:10.599 --> 00:44:13.599
you earn above the threshold, you're paying money in as

791
00:44:13.639 --> 00:44:16.519
you would today. So it actually works out to be

792
00:44:16.599 --> 00:44:22.199
much more financially viable. UBI is very expensive, negative income

793
00:44:22.199 --> 00:44:26.920
tax is less expensive, and it maintains that work incentive.
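The mechanism just described can be made concrete with a small sketch: earnings below a threshold trigger a payment from the government that phases out as income rises, while earnings above the threshold are taxed as today. All figures here are illustrative assumptions for the sketch, not numbers from the conversation:

```python
# Hypothetical sketch of a negative income tax (NIT).
# The threshold, phase-out rate, and tax rate are made-up illustrative values.

def net_transfer(income: float,
                 threshold: float = 30_000.0,
                 phase_out_rate: float = 0.5,
                 tax_rate: float = 0.25) -> float:
    """Positive result = payment from the government; negative = tax owed."""
    if income < threshold:
        # The subsidy shrinks as earnings rise, which preserves
        # the work incentive that flat UBI is criticized for lacking.
        return (threshold - income) * phase_out_rate
    # Above the threshold you pay in, as under a conventional income tax.
    return -(income - threshold) * tax_rate

print(net_transfer(0))       # 15000.0 -- full subsidy at zero income
print(net_transfer(20_000))  # 5000.0  -- partial subsidy; working still pays
print(net_transfer(50_000))  # -5000.0 -- net taxpayer
```

Because payments go only to low earners and shrink with income, total outlays are far smaller than paying every person a fixed amount, which is the cost argument made above.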

794
00:44:27.760 --> 00:44:31.960
There are also proposals for things like UBS. Not the

795
00:44:32.079 --> 00:44:36.360
Swiss Bank, this is universal basic services. So there are

796
00:44:36.400 --> 00:44:40.760
people trying to figure out how could we deliver a

797
00:44:40.880 --> 00:44:47.760
package of services which is a place to live, food, water, electricity,

798
00:44:48.280 --> 00:44:51.639
broadband AI, you know, the things you need to live

799
00:44:51.679 --> 00:44:55.280
a modern life, for two hundred and fifty bucks a month.

800
00:44:56.519 --> 00:45:01.320
Could we use AI and the reduction in pricing that

801
00:45:01.360 --> 00:45:03.599
you would see? Because most of the cost of goods

802
00:45:03.639 --> 00:45:06.960
and services is human labor. If AI takes on most

803
00:45:06.960 --> 00:45:10.559
of that role, then everything becomes radically cheaper. Could you

804
00:45:10.639 --> 00:45:13.960
deliver that for two hundred and fifty bucks to cover everything that

805
00:45:14.000 --> 00:45:18.440
people have today? Maybe. People are working hard on trying

806
00:45:18.480 --> 00:45:19.280
to figure that out.

807
00:45:20.599 --> 00:45:23.559
Really important stuff to be thinking about, like not today,

808
00:45:23.599 --> 00:45:27.159
but like maybe ten years ago or five years ago. Okay,

809
00:45:27.159 --> 00:45:29.519
So we have to finish the show by talking about

810
00:45:29.559 --> 00:45:32.039
what you say is the leadership imperative. And you talk

811
00:45:32.079 --> 00:45:34.840
about, and this is really, you know, sobering when

812
00:45:34.880 --> 00:45:37.320
you think about this. You say, today's business leaders occupy

813
00:45:37.360 --> 00:45:40.840
a unique position in human history. They're not just managing

814
00:45:40.880 --> 00:45:44.119
companies through technological change. They are architects of the future

815
00:45:44.119 --> 00:45:48.519
of human-machine collaboration, and they're the last

816
00:45:48.760 --> 00:45:51.440
generation to manage just a purely human workforce.

817
00:45:52.119 --> 00:45:52.360
Right.

818
00:45:52.440 --> 00:45:54.400
So if you can talk about what is this imperative,

819
00:45:54.400 --> 00:45:55.400
and we'll close with that.

820
00:45:57.079 --> 00:45:59.719
So this is the AI Ultimatum writ large, which is

821
00:45:59.760 --> 00:46:04.440
why I named the book. Leaders have to

822
00:46:04.519 --> 00:46:08.920
recognize that they're not just making decisions now that affect

823
00:46:09.239 --> 00:46:13.480
their employees and their shareholders and perhaps their customers. The

824
00:46:13.559 --> 00:46:16.960
decisions that they make today, the way that they choose

825
00:46:17.039 --> 00:46:21.559
to use AI, either to replace people or to amplify them,

826
00:46:21.800 --> 00:46:24.559
and that's really the decision: twentieth-century versus twenty-

827
00:46:24.599 --> 00:46:29.639
first-century thinking, cost cutting versus value creation. The decisions

828
00:46:29.679 --> 00:46:34.880
they make collectively will determine how things go for humanity

829
00:46:35.280 --> 00:46:38.199
in the next ten years. So that's a pretty big responsibility.

830
00:46:38.280 --> 00:46:41.760
Leaders now have a responsibility that extends beyond the borders

831
00:46:41.760 --> 00:46:44.719
of their companies and it requires them. I mean, I've

832
00:46:44.719 --> 00:46:48.800
talked earlier on about there being three steps: enabling people

833
00:46:48.800 --> 00:46:53.480
with tools. We talked about the orchestration, having humans and

834
00:46:53.599 --> 00:46:57.679
agents and robots work together. That's step two. The third step,

835
00:46:58.000 --> 00:47:00.920
which is not in the book because it's fresh stuff,

836
00:47:01.400 --> 00:47:04.079
is AI first thinking. And this is where leaders need

837
00:47:04.119 --> 00:47:09.360
to have the courage, the curiosity, the wherewithal to have

838
00:47:09.440 --> 00:47:12.320
a vision which puts AI at the center of their business,

839
00:47:13.079 --> 00:47:17.119
not replacing people, but then wrapping people around that so

840
00:47:17.159 --> 00:47:19.960
that they can ten x, twenty x, one hundred x

841
00:47:20.199 --> 00:47:23.960
the impact that their organizations have and deliver more value

842
00:47:24.239 --> 00:47:29.280
for humanity. That requires re engineering on a grand scale.

843
00:47:29.920 --> 00:47:32.800
But if people have that vision, that courage, that boldness

844
00:47:32.800 --> 00:47:36.519
of vision, I'm excited about what the organizations are going

845
00:47:36.519 --> 00:47:38.480
to be able to do for people. And that's what

846
00:47:38.480 --> 00:47:39.800
it's ultimately all about.

847
00:47:40.960 --> 00:47:43.079
What a beautiful way to finish. And you are

848
00:47:43.159 --> 00:47:45.599
such a gift, Steve. I'm so grateful to know you,

849
00:47:45.679 --> 00:47:47.480
to learn from you, to be inspired by you and

850
00:47:47.559 --> 00:47:51.039
to now help evangelize this critical message. So thank you

851
00:47:51.079 --> 00:47:52.880
for writing this brilliant book, getting it out in the

852
00:47:52.920 --> 00:47:55.559
world now, and coming on Working on Purpose to talk

853
00:47:55.559 --> 00:47:55.960
about it.

854
00:47:56.360 --> 00:47:58.199
My pleasure. Really nice to be here.

855
00:47:58.559 --> 00:48:01.719
Thank you. If you want to learn more about Steve Brown

856
00:48:01.760 --> 00:48:04.639
and the work he does helping organizations steward the AI

857
00:48:04.719 --> 00:48:08.280
and technology journey, and/or his book, The AI Ultimatum,

858
00:48:08.639 --> 00:48:10.920
you can go to... where do we find you, Steve? Which

859
00:48:10.960 --> 00:48:13.840
website or LinkedIn? Where shall we find you?

860
00:48:13.840 --> 00:48:16.679
You can find me on LinkedIn and also www dot

861
00:48:16.760 --> 00:48:19.079
Steve Brown dot AI. Perfect.

862
00:48:20.079 --> 00:48:21.559
Last week, if you missed the live show, you can

863
00:48:21.599 --> 00:48:23.719
always catch the recorded podcast. We were on the air

864
00:48:23.800 --> 00:48:27.000
with doctor Arthur's here McCauley, the licensed clinical psychologist who

865
00:48:27.039 --> 00:48:29.880
has been treating clients for more than forty five years,

866
00:48:30.559 --> 00:48:33.920
talking about his latest book, Soul Fire: Igniting Inner Strength

867
00:48:33.960 --> 00:48:37.039
in a Fractured World, which is chock full of wisdom

868
00:48:37.079 --> 00:48:39.480
and insights from his years of sitting with people in

869
00:48:39.519 --> 00:48:42.719
their darkest moments and also with them as they're reaching

870
00:48:42.760 --> 00:48:45.800
for their highest heights. He shared many practices he teaches

871
00:48:45.840 --> 00:48:48.760
to help people ignite their best and stabilize their nervous

872
00:48:48.760 --> 00:48:52.039
systems in today's tumultuous times. Next week, we'll be on

873
00:48:52.039 --> 00:48:54.400
the air with Norman Wolf talking about his new book

874
00:48:54.679 --> 00:48:57.719
Leading the Living Organization: Transform How You Lead to Deliver

875
00:48:57.800 --> 00:49:00.039
Extraordinary Results. See you then. Let's get you and

876
00:49:00.079 --> 00:49:02.760
your leading into discovering how to elevate your operating system

877
00:49:02.800 --> 00:49:06.079
as a leader and create destination workplaces where legendary business

878
00:49:06.119 --> 00:49:07.960
takes place. Let's work on Purpose.

879
00:49:11.079 --> 00:49:13.719
We hope you've enjoyed this week's program. Be sure to

880
00:49:13.760 --> 00:49:17.400
tune into Working on Purpose featuring your host, doctor Elise Cortes,

881
00:49:17.519 --> 00:49:20.840
each week on W four CY. Together we'll create a

882
00:49:20.880 --> 00:49:25.480
world where business operates conscientiously, leadership inspires passion and performance,

883
00:49:25.599 --> 00:49:28.360
and employees are fulfilled in work that provides the meaning

884
00:49:28.400 --> 00:49:32.199
and purpose they crave. See you there, Let's work on Purpose.