Ginni Rometty: IBM CEO on Leadership, Power, and Adversity | Lex Fridman Podcast #362
Chapters
0:00 Introduction
1:19 IBM
10:59 Hiring
16:16 Leadership
23:00 Hard work
28:40 Adversity
35:38 Power
49:05 Sacrifice
54:10 Taking over as CEO
72:24 Negotiating
77:31 Deep Blue vs Garry Kasparov
82:49 IBM Watson
102:42 Work-life balance
109:46 Advice for young people
00:00:00.000 |
I've had to do plenty of unpopular things. I think anytime you have to run a company that endures 00:00:04.800 |
a century and has to endure another century, you will do unpopular things. You have no choice. 00:00:10.400 |
And I often felt I had to sacrifice things for the long term. And whether that would have been 00:00:16.000 |
really difficult things like job changes or reductions, or whether it would be things like, 00:00:22.240 |
"Hey, we're going to change the way we do our semiconductors," and a whole different philosophy, 00:00:29.360 |
you have no choice. And in times of crisis as well, you got to be... 00:00:35.040 |
The following is a conversation with Ginni Rometty, who was a longtime CEO, 00:00:41.680 |
president, and chairman of IBM. And for many years, she was widely considered to be one of 00:00:46.720 |
the most powerful women in the world. She's the author of a new book on power, leadership, 00:00:52.800 |
and her life story called "Good Power," coming out on March 7th. She is an incredible leader 00:01:00.880 |
and human being, both fearless and compassionate. It was a huge honor and pleasure for me to sit 00:01:07.200 |
down and have this chat with her. This is the Lex Fridman Podcast. To support it, 00:01:12.320 |
please check out our sponsors in the description. And now, dear friends, here's Ginni Rometty. 00:01:19.280 |
You worked at IBM for over 40 years, starting as a systems engineer, and you ran the company 00:01:24.880 |
as chairman, president, and CEO from 2011 to 2020. IBM is one of the largest tech companies in the 00:01:31.840 |
world, maybe you can correct me on this, with about 280,000 employees. What are the biggest 00:01:39.760 |
challenges running the company of that size? Let's start with a sort of big overview question. 00:01:45.280 |
The biggest challenges I think are not in running them, it's in changing them. 00:01:48.640 |
And that idea to know what you should change and what you should not change. Actually, people don't 00:01:54.320 |
always ask that question. What should endure, even if it has to be modernized, but what should endure? 00:01:59.360 |
And then I found the hardest part was changing how work got done at such a big company. 00:02:04.720 |
What were the parts that you thought should endure? The core of the company that was beautiful and 00:02:10.480 |
powerful and could persist through time, that should persist through time. 00:02:13.520 |
I'd be interested, do you have a perception of what you think it would be? 00:02:16.080 |
Do I have a perception? Well, I'm a romantic for the history of long-running companies, 00:02:23.040 |
so there's kind of a tradition there. As an AI person, to me, IBM has some epic sort of research 00:02:31.600 |
accomplishments where you show off, you know, Deep Blue and Watson, just impressive big moonshot 00:02:39.440 |
challenges and accomplishing those. But that's, I think, probably a small part of what IBM is. 00:02:44.400 |
That's mostly like the sexy public facing part. 00:02:48.160 |
Yeah. Well, certainly the research part itself is over 3,000 people, so it's not that small. That's 00:02:54.320 |
a pretty big research group. But the part that should endure ends up being a company that does 00:03:01.040 |
things that are essential to the world. Meaning, think back, you said you're a romantic. It was 00:03:09.200 |
the '30s, the social security system. It was putting the man on the moon. It was, you know, 00:03:13.680 |
to this day, banks don't run, railroads don't run without it. That is, at its core, it's doing mission 00:03:21.360 |
critical work. And so that part, I think, is at its core, it's a business to business company. 00:03:26.720 |
And at its core, it's about doing things that are really important to the world running and 00:03:32.640 |
being better. Running the infrastructure of the world, so doing it at scale, doing it reliably. 00:03:37.360 |
Yes, secure in this world, that's like everything. And in fact, when I started, 00:03:41.200 |
I almost felt people were looking for what that was. And together, we sort of, in a word, 00:03:45.680 |
it was to be essential. And the reason I love that word was I can't call myself essential. 00:03:51.280 |
You have to determine I am, right? So it was to be essential, even though some of what we did is 00:03:56.240 |
exactly what you said, it's below the surface for so many people. Because people say to me, well, 00:03:59.600 |
what does IBM do now? Right? And over the years, it's changed so much. And today, it's really a 00:04:06.240 |
software and consulting company. Consulting is a third of it. And the software is all hybrid cloud 00:04:12.160 |
and AI. That would not have been true, as you well know, back even two decades ago, right? So 00:04:17.200 |
it changes. But I think at its core, it's that be essential. You said moonshot, can't all be 00:04:22.560 |
moonshots because moonshots don't always work, but mission critical work. 00:04:26.400 |
So given the size, though, when you started running it, did you feel the sort of thing that 00:04:35.280 |
people usually associate with size, which is bureaucracy, and maybe the aspect of size that 00:04:41.680 |
hinder progress or hinder pivoting, did you feel that? - You would, for lots of reasons. I think 00:04:47.920 |
when you're a big company, sometimes people treat the process as the client itself. 00:04:53.280 |
I always say to people, your process is not your customer. There is a real customer here that you 00:04:59.760 |
exist for. And that's really easy to fall into because people are in service to this process. And 00:05:07.360 |
that's not right. And when you're big, the other thing, and boy, there's a premium on it, is speed, 00:05:13.760 |
right? That in our industry, you got to be fast. And go back, like when I took over and it was 2012, 00:05:20.800 |
we had a lot of catching up to do and a lot of things to do, and it was moving so fast. And as 00:05:26.800 |
you well know, all those trends were happening at once, which made them go even faster. And so 00:05:31.920 |
pretty unprecedented, actually, for that many trends to be at one time. And I used to say to 00:05:36.480 |
people, "Go faster, go faster, go faster." And honestly, I've tired them out. I mean, 00:05:43.440 |
it kind of dawned on me that when you're that big, that's a really valuable lesson. And it taught me 00:05:49.360 |
the "how" is perhaps more important than the "what." Because if I didn't do something to change how 00:05:54.560 |
work was done, like change those processes or give them new tools, help them with skills, 00:05:59.360 |
they couldn't. They'll just do the same thing faster. If someone hands you hiking 00:06:05.280 |
boots and says, "Now go run a marathon," you're like, "I can't do it in those boots." 00:06:09.040 |
But so you've got to do something. And at first, I think the ways for big companies, 00:06:14.880 |
I would call them like blunt clubs. You do what everyone does. You reduce layers. Because if you 00:06:19.280 |
reduce layers, decisions go faster. It's math. If there's less decision points, things go faster. 00:06:25.600 |
You do the blunt club thing. And then after that, though, it did lead me down a long journey of, 00:06:33.440 |
they sound like buzzwords, but if you really do them at scale, they're hard, around things like 00:06:37.600 |
agile. Because you've really got to change the way work gets done. And we ended up training, 00:06:44.880 |
God, hundreds of thousands of people on that stuff to really change it. 00:06:50.480 |
That's right. Because everybody talks about it. But the idea that you would really have small, 00:06:55.120 |
multidisciplinary teams work from the outside in, set those sort of interim steps, take the 00:07:01.760 |
feedback pivot, and then do it on not just products, do it on lots of things. It's hard to 00:07:07.280 |
do at scale. People always say, "Oh, I got this agile group over here of 40 people." But not when 00:07:12.560 |
you're a couple hundred thousand people. You got to get a lot of people to work that way. 00:07:15.760 |
- The blunt club thing you're talking about. So flatten the organization as much as possible. 00:07:19.600 |
- Yeah, yeah. I probably reduced the layers of management by half. 00:07:22.320 |
And so that has lots of benefits, right? Time to a decision, more autonomy to people. And then 00:07:31.680 |
the idea of faster clarity of where you're going. Because you're not just filtered through so many 00:07:37.520 |
different layers. And I think it's the kind of thing a lot of companies, if you're big, 00:07:42.720 |
have to just keep going through. It's kind of like grass grows. It just comes back, 00:07:46.080 |
and you got to go back and work on it. So it's a natural thing. But I hear so many people 00:07:52.240 |
talk about it, Lex, this idea of like, "Okay, well, who makes a decision?" You've often heard, 00:07:56.480 |
"Nobody can say yes, and everybody can say no." And that's actually what you're trying to get out 00:08:01.040 |
of a system like that. - So, I mean, your book in general, 00:08:06.320 |
the way you lead is very much about we and us, the power of we. But are there times when a 00:08:12.560 |
leader has to step in and be almost autocratic, take control and make hard, unpopular decisions? 00:08:17.840 |
- Oh, I am sure you know the answer to that. And it is, of course, yes. 00:08:21.600 |
- It's just fun to hear you say it. - It's fun to say it. Yeah. You know, 00:08:24.400 |
'cause I actually, A, there's a leader for a time, but then there's a leader for a situation, 00:08:29.840 |
right? And so I've had to do plenty of unpopular things. I think anytime you have to run a company 00:08:35.440 |
that endures a century and has to endure another century, you will do unpopular things. You have 00:08:40.720 |
no choice. And I often felt I had to sacrifice things for the long-term. And whether that would 00:08:46.880 |
have been really difficult things like job changes or reductions, or whether it would be things like, 00:08:53.520 |
"Hey, we're gonna change the way we do our semiconductors," and a whole different philosophy, 00:08:59.600 |
you have no choice. I mean, and in times of crisis as well, you gotta be... I always said it's not a 00:09:05.200 |
popularity contest. So that's... None of these jobs are popularity contests. I don't care if 00:09:10.480 |
your company's got one person or half a million. They're not popularity contests. 00:09:14.960 |
- But psychologically, is it difficult to just sort of step in as a new CEO and... 00:09:19.920 |
'Cause you're fighting against tradition, against all these people that act like experts of their 00:09:25.680 |
thing, and they are experts of their thing, to step in and say, "We have to do it differently." 00:09:29.840 |
- Yeah. When you gotta change a company, it's really tempting to say, "Throw everything else 00:09:34.400 |
out." Back to that, what must endure, right? But I know when I took over to start, I knew how much 00:09:39.040 |
had to change. The more I got into it, I could see, wow, a lot more had to change, right? 'Cause 00:09:44.160 |
we needed a platform. We'd always done our best when we had a platform, a technology platform. 00:09:49.280 |
You will go back in time and you'll think of the mainframe systems. You'll think of the PC. You'll 00:09:53.840 |
think of perhaps middleware. You could even call services a platform. We needed a platform, 00:10:00.080 |
the next platform here to be there. Skills. When I took over, we inventoried who had modern skills 00:10:07.200 |
for the future. It was two out of 10 people for the future. Not that they didn't have relevant 00:10:10.880 |
skills today, but for the future, two out of 10. Yikes, that's a big problem, right? The speed at 00:10:16.160 |
which things were getting done. So you got so much to do, and you say, "Is that a scary thing?" Yes. 00:10:23.360 |
Do you have to sometimes dictate? Yes. But I did find, and it is worth it. I know every big company 00:10:29.760 |
I know. My good friend that runs General Motors, she's had to change. Go back to what is core to 00:10:36.160 |
them. When you do that, back to be essential, we started with, "Hey, it's be essential." Then 00:10:42.560 |
the next thing I did with the team was say, "Okay, now this means new era of computing, 00:10:46.560 |
new buyers are out there, and we better have new skills. Okay, now the next thing, 00:10:50.880 |
how do you operationalize it?" It just takes some time, but you can engineer that and get people to 00:10:58.560 |
build belief. For the skills, that means hiring and it means training? Yes. Oh boy, that's a long, 00:11:06.560 |
skills is a really long topic in and of itself. I try to put my view in it. I learned a lot, 00:11:11.040 |
and I changed my view on this a lot. I'll go back at my very beginning, say 40 years ago, 00:11:17.840 |
I would have said at that point, "Okay." I was always in a hurry. I was interviewing to hire 00:11:23.600 |
people. I don't know how you hire people. 40 years ago, I'd be like, "Okay, I got to fit in these 00:11:27.600 |
interviews. I got to hire someone to get this done." Then time would go on. I'm like, "Oh, 00:11:31.520 |
that's not very good." In fact, someone once said to me, "Hey, hire the best people to work for you, 00:11:35.440 |
and your job gets a lot easier. Okay, I should spend more time on this topic, spend more time 00:11:39.760 |
on it." Then it was like, "Okay, hire experts." Okay, hired a lot of experts over my life. 00:11:45.520 |
Then I was really like an epiphany, and it really happened over my tenure running the 00:11:51.760 |
company and having to change skills. If someone's an expert at something and has just done that for 00:11:56.560 |
30 years, the odds of them really wanting to change a lot are pretty low. When you're in a 00:12:02.560 |
really dynamic industry, that's a problem. Okay, that was my first revelation on this. 00:12:08.960 |
Then when I looked to hiring, I can remember when I started my job, we needed cyber people. 00:12:14.960 |
I go out there and I look. Unemployment in the US was almost 10%. Can't find them. Okay, it's 10%, 00:12:20.880 |
and I can't find the people. Okay, what's the issue? Okay, they're not teaching the right 00:12:24.560 |
things. That led me down a path, and it was serendipity that I happened to do a review of 00:12:31.040 |
corporate social responsibility. We had this one little fledgling school in a low-income area, 00:12:36.160 |
and high school with a community college, we gave them internships, direction on curriculum. 00:12:41.440 |
Lo and behold, we could hire these kids. I said, "Hmm, this is not CSR. I just found a new talent 00:12:48.720 |
pool," which takes me to what I'm doing now in my post-retirement. I'm like, this idea that you don't 00:12:55.280 |
hire just for a college degree. We had 99% of our hires with college degrees and PhDs, and I'm all for it. 00:13:05.520 |
No, you should not be. I'm vice chair at Northwestern, one of the vice chairs. 00:13:09.280 |
But I said, "I just realized aptitude does not equal access." These people didn't have access, 00:13:15.520 |
but they had aptitude. It changed my whole view to skills first. And so now for hiring, 00:13:22.080 |
that's kind of a long story to tell you, the number one thing I would hire for now 00:13:25.760 |
is somebody's willingness to learn. And you can test, you can try different ways, 00:13:29.680 |
but their curiosity and willingness to learn, hands down, I will take that trait over anything 00:13:35.440 |
So the interview process, the questions you ask- 00:13:38.560 |
The kind of things you talk to them about is try to get at how curious they are about the work. 00:13:43.360 |
And testing. I mean, we triangulated around it lots of ways. And now look, 00:13:47.200 |
at the heart of it, what it would do is change. You don't think of buying skills, 00:13:52.080 |
you think of building skills. And when you think that way, with so many people, and I think 00:13:57.760 |
this country, many developed countries being disenfranchised, you got to bring them back 00:14:02.960 |
into the workforce somehow, and they got to get some kind of contemporary skills. 00:14:06.160 |
And if you took that approach, you can bring them back into the workforce. 00:14:10.480 |
Yeah, I think some interesting combination of humility and passion. Because like you said, 00:14:15.120 |
experts sometimes lack humility if they call themselves an expert for a few too many years. 00:14:22.240 |
So you have to have that beginner's mind and a passion to be able to aggressively, constantly 00:14:27.600 |
be a beginner at everything and learn and learn and learn. 00:14:30.320 |
You know, I saw it firsthand when we were beginning this path down cloud and AI, 00:14:35.760 |
and people would say, "Oh, IBM, you know, it's existential. They've got to change." And 00:14:41.840 |
all these things. And I did hire a lot of people from outside, very willing to learn new things. 00:14:46.560 |
"Come on in. Come on in." And I sometimes say, "Shiny objects. Trained in shiny objects. Come 00:14:51.760 |
on in." But I saw something, it was another one of these, "You're not a shiny object. I'm not 00:14:56.160 |
saying that." But I learned something. Okay, some of them did fantastic. And others, they're like, 00:15:04.640 |
"Well, let me school you on everything." But they didn't realize we did really mission-critical 00:15:09.200 |
work. And that'd break a bank. I mean, they would not understand the certain kind of security and 00:15:13.520 |
the auditability and everything they had to go on. And then I watched IBM people say, "Oh, 00:15:19.200 |
I actually could learn something." Some were like, "Yeah. Okay. I don't know how to do that. That's 00:15:22.480 |
a really good thing I could learn." And in the end, there was not like one group was a winner 00:15:26.800 |
and one was a loser. The winners were the people who were willing to learn from each other. 00:15:30.240 |
I mean, to me, it was very stark example of that point. And I saw it firsthand. So, 00:15:36.560 |
that's why I'm so committed to this idea about skills first. And that's how people should be 00:15:40.240 |
hired, promoted, paid, you name it. - Yeah. The AI in general, it seems like 00:15:47.360 |
nobody really understands now what the future will look like. We're all trying to figure it out. So, 00:15:52.960 |
what IBM will look like in 50 years in relation to the software business and AI is unknown. What 00:15:59.680 |
Google will look like, what all these companies, we're trying to figure it out. And that means 00:16:04.240 |
constantly learning, taking risks, all of those things. And nobody's really skilled in AI. 00:16:11.440 |
- You're absolutely right. That's right. I couldn't agree more with you on that. 00:16:16.400 |
- You wrote in the book, speaking of hiring, "My drive for perfection often meant I only 00:16:25.200 |
focused on what needed to change without acknowledging the positive. This could keep 00:16:29.600 |
people from trusting themselves. It could take me a while to learn that just because I could point 00:16:34.800 |
something out didn't mean I should. I still spotted errors, but I became more deliberate 00:16:39.840 |
about what I mentioned and sent back to get fixed. I also tried to curtail my tendency to micromanage 00:16:45.680 |
and let people execute. I had to stop assuming my way was the best or only way. I was learning 00:16:51.920 |
that giving other people control builds their confidence and that constantly trying to control 00:16:56.720 |
people destroys it." So, what's the right balance between showing the way and helping people find 00:17:04.560 |
the way? - That is a good question. Because, like a really flip answer would be, as it gets bigger, 00:17:13.440 |
you have no choice but to just, you know, you can't do it. You have to tell or show. I mean, 00:17:20.960 |
you've got to let people find their way because it's so big you can't, right? That's an obvious 00:17:26.160 |
answer. Scope of work. Bigger it gets, okay, I've got to let more stuff go. But, I have always 00:17:34.160 |
believed that a leader's job is to show as well. And I think there's like a few areas that are 00:17:43.760 |
really important that you always do. Now, it doesn't mean you're showing. So, like when it has 00:17:49.360 |
to do with values and value-based decisions, like I think it's really important to constantly 00:17:54.800 |
show people that you walk your talk on that kind of thing. It's super important. And I actually 00:18:02.800 |
think it's a struggle young companies have because the values aren't deeply rooted. And when a storm 00:18:08.080 |
comes, it's easy to uproot. And so, I always felt like when it was that time, I showed it. I got 00:18:15.520 |
taught that so young at IBM and even General Motors. In fact, I do write about that in the book. 00:18:23.120 |
First time I was a manager, I had a gentleman telling dirty jokes. And not to me, but to other 00:18:30.480 |
people. And it really offended people and some of the women. And this is the very early '80s. 00:18:38.480 |
And they came, said something. I talked to my boss. I'm a first-time manager. And he was 00:18:45.600 |
unequivocal with what I should do. He said, and this was a top performer, "It stops immediately 00:18:50.880 |
or you fire him." So, there are a few areas like that that I actually think you have to always 00:18:56.400 |
continue to role model and show. That to me isn't the kind that like when do you let go of stuff. 00:19:06.400 |
Yeah, whatever you're in service of. And the other thing was, I really felt it was really 00:19:10.960 |
important to role model learning. So, I can remember when we started down the journey 00:19:17.440 |
and we went on to this thing called the Think Academy. IBM's longtime motto had been Think. 00:19:22.960 |
And we said, "Okay, I'm going to make the first Friday of every month compulsory education." 00:19:27.440 |
And, okay, I mean everybody. Like everybody, I don't care what your job is. When the whole 00:19:33.200 |
company has to transform, everybody's got to kind of have some skin in this game and understand it. 00:19:37.280 |
I taught the first hour of every month for four years. Now, okay, I had to learn something. 00:19:43.600 |
But it made me learn. But I was like, "Okay, if I can teach this, you can do it." Right? I mean, 00:19:50.080 |
So, it was a compulsory Thursday night education for you. 00:19:53.120 |
I'm a little bit better prepared than that. But yes, you're so right. Yes. 00:19:58.880 |
So, like personality-wise, you like to prepare? 00:20:01.840 |
Yeah. But there's roots in that that go back deeply, deeply, deeply, deeply. And I think 00:20:06.240 |
it's an interesting reason. So, why do you? You're prepared, my friend. Yes, you are. You prepare. 00:20:16.800 |
But that's okay. I mean, you don't have to prepare everything. I don't prepare everything 00:20:20.400 |
No, but I unfortunately wing stuff. I save it to last minute. I push everything. I'm always 00:20:25.680 |
almost late. And I don't know why that is. I mean, there's some deep psychological thing we 00:20:30.160 |
should probably investigate. But it's probably the anxiety brings out the performance. 00:20:34.720 |
That can be. That's very true with some people. 00:20:36.400 |
I mean, so, I'm a programmer and engineer at heart. And so, programmers famously overestimate 00:20:41.840 |
or underestimate, sorry, how long something's going to take. And so, I just, everything, 00:20:47.520 |
always underestimate. And it's almost as if I want to feel this chaos of anxiety of a 00:20:54.240 |
deadline or something like this. Otherwise, I'll be lazy sitting on a beach with a piña 00:20:59.280 |
colada. I don't know. So, we have to know ourselves. But for you, you like to prepare. 00:21:03.600 |
For me. Yeah. It came from a few different places. I mean, one would have been as a kid, 00:21:08.640 |
I think, I was not a memorizer. And my brother is brilliant. He can read it once, boom, done. 00:21:17.440 |
And so, I always wanted to understand how something happened. It didn't matter what 00:21:21.040 |
it was I was doing. Whether it was algebra, theorems, I always wanted, don't give me the 00:21:24.960 |
answer. Don't give me the answer. I want to figure it out, figure it out. So, I could 00:21:27.440 |
reproduce it again and didn't have to memorize. So, it started with that. And then over time, 00:21:32.720 |
okay, so I was in university in the 70s. When I was in engineering school, I was the only woman. 00:21:39.360 |
You know, I meet people still to this day and they're like, "Oh, I remember you." I'm like, 00:21:42.080 |
"Yeah, sorry, I don't remember you. There were 30 of you, one of me." And I think you already 00:21:47.360 |
get that feeling of, okay, I better really study hard because whatever I say is going to be 00:21:50.640 |
remembered in this class, good or bad. And it started there. So, in some ways, I did it for 00:21:58.160 |
two reasons. Early on, I think it was a shield for confidence. The more I studied, the more prepared 00:22:04.720 |
I was, the more confident. That's probably still true to this day. The second reason I did it 00:22:10.480 |
evolved over time and became different to prepare. If I was really prepared, then when we're in the 00:22:17.040 |
moment, I can really listen to you. See, because I don't have to be doing all this stuff on the fly 00:22:21.920 |
in my head. And I could actually take things I know and maybe help the situation. So, it really 00:22:27.920 |
became a way that I could be present in the moment. And I think it's something a lot of people 00:22:34.720 |
miss, that being in the moment. I learned it from my husband. He doesn't prepare, by the way, at all. So, that's 00:22:38.800 |
not it. But I watched the in the moment part. The negative example. 00:22:43.040 |
No, no, no. And I'm not going to change that. As he says, he's a type C, I'm an A. 00:22:47.840 |
And I have been married 43 years and that seems to work. But that idea that you could be in the 00:22:53.440 |
moment with people is a really important thing. Yeah. So, the preparation gives you the freedom 00:22:58.560 |
to really be present. So, just to linger on, you mentioned your brother. And it seems like in the 00:23:06.000 |
book that you really had to work hard when you studied to sort of, given that you weren't good 00:23:13.360 |
at memorization, you really truly deeply wanted to understand the stuff and you put in the hard 00:23:17.200 |
work. And that seems to persist throughout your career. So, hard work often has, 00:23:23.120 |
sort of, negative associations. Well, maybe with burnout, with dissatisfaction. 00:23:29.200 |
Is there some aspect of hard work at the core of who you are that led to happiness for you? 00:23:38.000 |
I enjoyed it. So, I'll be the first. And I'm really careful to say that to people because 00:23:43.600 |
I don't think everyone should associate, "Gee, to do what you did, there's only one route there." 00:23:48.560 |
Right? And that's just not true. And I do it because I like it. In fact, I'm careful. And 00:23:53.840 |
as time goes on, you have to be careful as more and more people watch you. Whether you like it, 00:23:57.760 |
you're a role model or not. You are a role model for people. Whether you know it, like it, want it, 00:24:01.440 |
does not matter. I learned that the hard way. And I would have to say to people, "Hey, just because 00:24:05.200 |
I do this does not mean I do it for these reasons." Right? So, be really explicit. And I'd come to 00:24:11.360 |
believe, usually when people say the word power, I don't know, do you have a positive or negative 00:24:14.960 |
notion when I say the word power? We'll just do a- - Probably negative one, yeah. 00:24:18.080 |
- For some stereotype or some view that somebody's abused it in some way. You can read the newspaper, 00:24:23.840 |
somebody's doing something. Personal people, like I'll ask people, "Do you want power?" And they're 00:24:29.040 |
like, "Oh no, I'd rather do good." And I think the irony is you need power to do good. 00:24:35.280 |
And so, that sort of led me down to, as I thought about my own life, right? Because it starts in a, 00:24:42.640 |
like many of us, you don't have a lot, but you don't know that because you're like everybody 00:24:48.720 |
else around you at that time. And on one end, tragedy, right? My father leaves my mother, 00:24:54.240 |
homeless, no money, no food, nothing, four kids. She's never worked a day in her life outside of 00:25:00.080 |
a home. And the irony that here I would end up as the ninth CEO of one of America's iconic 00:25:06.080 |
companies. And now I co-chair this group, OneTen. And that journey, I said, "The biggest thing I learned 00:25:11.680 |
was you could do really hard, meaningful things in a positive way." So now you ask me about why 00:25:15.920 |
do I work so hard? I ended up writing the book in three pieces for this reason. When you really 00:25:22.720 |
think of your life and power, I thought it kind of fell like a pebble in water. Like there's a 00:25:28.880 |
ring about you really care about yourself and like the power of yourself, power of me. There's a time 00:25:35.360 |
it transcends to that you are working with and for others and another moment when it becomes like 00:25:39.680 |
about society. So my hard work, I'd ask you, one day sit and really think hard about, when you close 00:25:46.880 |
your eyes, who do you see from your early life, right? And what did you learn? And maybe it's not 00:25:52.080 |
that hard for you. I mean, it's funny the things then, if I really looked at it, it's no surprise 00:25:59.680 |
what I do today. And that hard work part, my great-grandma, as you and I were comparing notes 00:26:05.280 |
on Russia, right? And never spoke English, spoke Russian, came here to this country, was a cleaning 00:26:11.520 |
person at the Wrigley Building in Chicago. Yet if she hadn't saved every dime she made, 00:26:17.120 |
my mother wouldn't have a home and wouldn't have had a car, right? What did I learn from that? 00:26:22.000 |
Hard work. In fact, actually, when I went to college, she's like, "You know, you really should 00:26:25.440 |
be on a farm. You're so big and strong." That was her view. And then my grandmother, 00:26:31.520 |
another tragic life. What did she do though? And think how long, that's in the 40s, the 50s, 00:26:37.680 |
she made lampshades. And she taught me how to sew, right? So I could sew clothes when we couldn't 00:26:43.200 |
afford them. But my memory of my grandma is working seven days a week, sewing lampshades. 00:26:51.360 |
And then here comes my mom and her situation who climbs her way out of it. So I associate that 00:26:57.680 |
with, well, strong women, by the way, all strong women. And I associate hard work with how you are 00:27:07.040 |
sure you can always take care of yourself. And so I think that the roots go way back there. 00:27:11.840 |
And they were always teaching something, right? My great-grandma was teaching me how to cook, 00:27:15.040 |
how to work a farm, even though I didn't need to be on a farm. My grandma taught me, you know, 00:27:19.440 |
here's how to sew, here's how to run a business. And then my mother would teach us that, 00:27:23.600 |
"Look, with just a little bit of education, look at the difference it could make." Right? 00:27:27.040 |
So anyways, that's a long answer too. I think that hard work thing is really 00:27:31.760 |
deeply rooted from that background. - And it gives you a way out from hard times. 00:27:36.000 |
- Yeah. You know, I think I've seen you on other podcasts say, 00:27:39.520 |
"I thought I did. Do you want a plan B?" Didn't you say, "No, you would not like a plan B?" 00:27:46.240 |
- Yeah, I don't want a plan B. - Because you're like, 00:27:47.760 |
"I would prefer my back up against the wall," am I remembering? - You have a story like that. 00:27:50.960 |
You seem, at least at certain moments in your life, to do well in desperate times. 00:27:59.520 |
- True enough. True enough, that's true. I learned that very well. But I also think that maybe this 00:28:07.440 |
isn't the same kind of plan B. I think of it as, like I was taught, always be able to take care of 00:28:12.400 |
yourself. Don't have to rely on someone else. And I think that to me, so that's my plan B, 00:28:18.960 |
I can take care of myself. And it's even after what I lived through with my father, I thought, 00:28:24.240 |
"Well, this sets a bar for bad. After this, nothing's bad." And that is a very freeing thought. 00:28:30.400 |
- The being able to take care of yourself, is that, you mean practically, or do you mean just 00:28:35.280 |
a self-belief that I'll figure it out? - I'll figure it out and practically both, 00:28:40.720 |
- "I vividly remember the last two weeks of my freshman year when I only had 25 cents left. 00:28:48.160 |
I put the quarter in a clear plastic box on my desk and just stared at it. This is it, I thought, 00:28:54.640 |
no more money." So do you think there's some aspect of that financial stress, even desperation, 00:29:02.560 |
just being hungry, does that play a role in that drive that led to your success to be the CEO 00:29:09.840 |
of one of the great companies ever? - It's a really interesting question 00:29:12.800 |
because I was just talking to another colleague who's CEO of another great American company this 00:29:17.520 |
weekend. And he mentioned to me about all this adversity and he said, or I said to him, I said, 00:29:24.160 |
"Do you think part of your success is 'cause you had bad stuff happen?" 00:29:30.400 |
And he said, "Yes." And so I guess I'd be lying if I didn't say, I don't think you have to have 00:29:38.400 |
tragedy, but it does teach you one really important thing is that there is always a way forward, 00:29:43.840 |
always, and it's in your control. - And I think there's probably wisdom 00:29:47.440 |
for mentorship there, or whether you're a parent or a mentor, that easy times don't result in growth. 00:29:55.120 |
- Yeah, I've heard a lot of my friends and they worry, they say, "Gee, my kids have never had 00:29:59.040 |
bad times." And so what happens here? So I don't know, is it required? Maybe 00:30:06.320 |
not required, but it sure doesn't hurt. - You had this good line about advice you 00:30:11.760 |
were given that growth and comfort never coexist, growth and comfort never coexist. 00:30:17.280 |
And you have to get used to that thought. - If someone asked, I'd say that's 00:30:21.040 |
one of the more profound sort of lessons I had, and the irony is, it's from my husband, 00:30:28.240 |
which is even more funny, actually. - I'm glad you're able to, you could just steal it. 00:30:32.000 |
I mean, you don't have to give him credit. - Oh, I have, I have, shamelessly, as he'll tell you. 00:30:35.200 |
Okay, so the story behind growth and comfort never coexist, but honestly, I think it's been 00:30:42.240 |
a really freeing thought for me, and it's helped me immensely since. Mid-career, and as I write 00:30:49.680 |
about it in the book, I'm mid-career, and I'd been running a pretty big business, actually, 00:30:55.520 |
and the fella I work for is gonna get a new job, he's gonna get promoted. He calls me and he says, 00:30:58.880 |
"Hey, you're gonna get my job. I really want you to have it." And I said to him, "No way." I said, 00:31:04.400 |
"I'm not ready for that job. I got a lot more things I gotta learn. That is like a huge job, 00:31:09.200 |
round the world, every product line, development, you name it, every function, I can't do it." 00:31:14.160 |
He looked at me, he says, "Well, I think you should go to the interview." I went to the interview the 00:31:20.640 |
next day, blah, blah, blah. Guy says to me, looks at me and he says, "I wanna offer you that job." 00:31:25.680 |
And I said, "I would like to think about it." I said, "I wanna go home and talk to my husband 00:31:31.840 |
about it." He kinda looked at me, "Okay." I went home, my husband is sitting there and he says to 00:31:38.560 |
me, I went on and on about the story, et cetera, and he says, "Do you think a man would've answered 00:31:44.640 |
it that way?" And I said, "Hmm." He says, "I know you." He's like, "Six months, you're gonna be 00:31:51.360 |
bored. And all you can think of is what you don't know." And he said, "And I know these other 00:31:56.160 |
people. You have way more skill than them and they think they could do it." And he's like, "Why?" 00:32:01.920 |
And for me, it internalized this feeling that, and I am gonna say something that's a bit stereotyped, 00:32:10.000 |
that it resonates with many, many women, and I'll ask you if it does after, is that they're the 00:32:15.600 |
harshest critics of themselves. And so this idea that I won't grow unless I can feel uncomfortable, 00:32:21.840 |
doesn't mean I always have to show it, by the way. So that's what I meant: growth and comfort can never 00:32:26.000 |
coexist. So I was like, "He's exactly right." Now, the end of that story is I went in and I took the 00:32:32.320 |
job. I went back to the man who was really my mentor looking out for me, and he looked at me and 00:32:37.760 |
he said, "Don't ever do that again." And I said, "I understand." Because it was okay to be 00:32:43.360 |
uncomfortable. I didn't have to show it. I mean, now I would take stock of the things I can do, 00:32:48.960 |
right? And really think, or I look for times to be uncomfortable. Because I know if I am nervous, 00:32:55.360 |
like, I don't know if you're nervous to meet me. We never met in person. 00:33:00.720 |
- No, you're not. But if you were, it means you're learning something, right? 00:33:07.520 |
- I think it's interesting. Maybe you could speak to that, the sort of self-critical 00:33:12.560 |
thing inside your brain. Because I think sometimes it's talked about that women have that. 00:33:21.040 |
But I have that, definitely. And I think that's not solely a property of women in the workplace. 00:33:28.800 |
- But I also want to sort of push back on the idea that that's a bad thing that you should 00:33:33.360 |
silence. Because I think that anxiety, that leads to growth also. That's like this discomfort. So 00:33:39.760 |
there's this weird balance you have to have between that self-critical engine and confidence. 00:33:45.440 |
- You have to kind of dance. Because if you're super confident, people will value you higher. 00:33:49.200 |
That's important. But if you're way too confident, maybe in the short term you'll gain, 00:33:54.400 |
- Very good point. So I can't really disagree with that. And to me, even when I took on jobs, 00:34:00.160 |
I always felt people say, "Well, what point are you confident enough?" And I came to sort of 00:34:05.200 |
believe, again, a theme of my beliefs that if I was willing to ask lots of questions and understood 00:34:10.640 |
enough, that's all I needed to know. - Let me ask you about your husband 00:34:16.800 |
- So you write in the book, I'm just jumping around. Like I said, 00:34:21.440 |
I'm a bit of a romantic. So how did you meet your husband? 00:34:23.920 |
- So I met my husband when I was 19 years old. So I was a young kid. And I met him when I had a 00:34:32.160 |
General Motors scholarship. So I was at Northwestern University through my first two years, 00:34:37.600 |
had a lot of loans, financial aid. And a professor said, "Hey, you should sign up for this 00:34:43.440 |
interview. They're looking to bring forward diverse candidates through their management track. 00:34:47.920 |
Now, these programs don't exist anymore like that. They will pay your tuition, 00:34:51.360 |
your room and board, your expenses at Northwestern, other Ivy League schools, 00:34:55.520 |
these very expensive schools. And I think you'd be a good fit." I am eternally thankful for that 00:35:01.680 |
advice. I went and I interviewed. I actually got the scholarship. I mean, without it, I'd have 00:35:06.080 |
graduated with hundreds of thousands of dollars of debt. So part of that was in the summer, 00:35:11.600 |
I had to work in Detroit. I lived in a little room by a cement plant. Not theirs, but I mean, 00:35:18.320 |
- Very, very romantic. And the person who owned the house said, "Hey, I'm having a party. You're 00:35:22.960 |
not invited. I'm going to fix you up with someone tonight." And that turned out to be my husband. 00:35:28.480 |
And so it was a blind date is how we very first met. 00:35:31.680 |
- And then it was over. The story was written. - Yep. 00:35:35.200 |
- If it's okay, just zoom out to, you mentioned power and good power a few times. So if we can 00:35:42.240 |
just even talk about it. Your book is called "Good Power, Leading Positive Change in Our Lives, 00:35:46.880 |
Work, and World." What is good power? What's the essence of good power? 00:35:50.640 |
- Yeah. So the essence of it would be doing something hard or meaningful, but in a positive 00:35:57.520 |
way. I would also tell you, I hope one day I'm remembered for how I did things, not just for 00:36:05.920 |
what I did. I think that could almost be more important. And I think it's a choice we can all 00:36:09.600 |
make. So the essence to me of good power, if I had a contrast, good to bad, let's say, would be that 00:36:15.360 |
first off, you have to embrace and navigate tension. This is the world we live in. And 00:36:22.640 |
by embracing tension, not running from it, you bridge divides in a way that unites people, 00:36:28.720 |
not divides them. It's a hard thing to do, but you can do it. You do it with respect, 00:36:33.280 |
which is the opposite of fear. A lot of people think the way to get things done is fear. 00:36:37.440 |
And then the third thing would be, you got to celebrate some progress versus perfection. 00:36:44.000 |
Because I also think that's what stops a lot of things from happening. Because if you go for 00:36:48.320 |
whatever your definition of perfect is, it's either polarization or paralysis. I mean, 00:36:54.480 |
something happens in there versus no, no, no. Don't worry about getting to that actual exact 00:37:01.040 |
endpoint. If I keep taking a step forward of progress, really tough stuff can get done. 00:37:06.160 |
And so my view of that is like, honestly, I hope it can, I said it's like a memoir with purpose. 00:37:12.320 |
I'm only doing it. It was a really hard thing for me to do because I don't actually talk about all 00:37:17.360 |
these things. And I had to, nobody cares about your like scientific description of this. They 00:37:22.000 |
want the stories in your life to bring it alive. So it's a memoir with purpose. And in the writing 00:37:27.200 |
of it, it became the power of me, the power of we, and the power of us. The idea that you build a 00:37:33.120 |
foundation when you're young, mostly from my work life, the power of we, which says, I kind of, 00:37:40.560 |
in retrospect, could see five principles on how to really drive change that would be done in a 00:37:47.200 |
good way. And then eventually you could scale that, the power really of us, which is what I'm 00:37:52.880 |
doing about finding better jobs for more people now that I co-chair an organization called OneTen. 00:37:58.320 |
So that essence of navigate tensions, do it respectfully, celebrate progress, 00:38:06.800 |
and indulge me one more minute, these sort of, again, it's retrospect that I didn't know this 00:38:14.160 |
in the moment. I had to learn it. I learned it. I am blessed by a lot of people I worked with and 00:38:18.480 |
around. But some of the principles, like the first one says, if you're going to do something, 00:38:27.280 |
change something, do something, you got to be in service of something. Being in service of 00:38:34.000 |
is really different than serving, super different. And like, I just had my knee replaced 00:38:39.680 |
and I interviewed all these doctors. You can tell the difference that the guy who's going to do a 00:38:44.800 |
surgery: "Hey, my surgery went fine. I really don't care whether you can walk and do the stuff you 00:38:44.800 |
wanted to do again, because my surgery went fine. Your hardware is good." I actually had some 00:38:52.240 |
trouble. And I had a doctor who was like, you know, this doesn't sound right. I'm coming to you. 00:38:56.720 |
The surgery was fine. It was me that was reacting wrong to it. And he didn't stop until I could 00:39:03.120 |
walk again. Okay. There's a big difference in those two things. And it's true in any business 00:39:07.680 |
you have. A waiter serves your food. Okay, he served the food. He did his job. But did he care 00:39:13.360 |
that you had a good time? So that thought, to be in service of, it took me a while to get that, 00:39:18.000 |
like to try to write it, to get that across. Cause I think it's like so fundamental. 00:39:21.680 |
If people were really in service of something, you got to believe that if I fulfill your needs 00:39:28.800 |
at the end of the day, mine will be fulfilled. And that is that essence that makes it so different. 00:39:35.280 |
And then the second part, second principle is about building belief, which is I got to hope 00:39:40.320 |
you'll voluntarily believe in a new future or some alternate reality. And you will use your 00:39:45.200 |
discretionary energy versus me ordering you. You'll get so much more done. Then the third 00:39:50.880 |
is change and endure. We kind of talked about that earlier, focus more on the how and the skills. 00:39:56.800 |
And then the part on good tech and being resilient. So anyways, I just felt that 00:40:03.360 |
like good tech, everybody's a tech company. I don't care what you do today. 00:40:06.880 |
And there's some fundamental things you got to do. In fact, pick up today's, any newspaper, right? 00:40:12.400 |
ChatGPT. You're an AI guy. All right. I believe one of the tenets of good tech is, 00:40:18.720 |
it's like responsibility for the long term. It says, so if you're going to invent something, 00:40:23.280 |
you better look at its upside and its downside. Like we did quantum computing. 00:40:27.120 |
Great. A lot of great stuff, right? Materials development, risk management calculations, 00:40:33.120 |
endless list one day. On the other side, it can break encryption. That's a bad thing. 00:40:38.080 |
So we worked equally hard on all the algorithms that would withstand quantum. I think with ChatGPT, 00:40:44.960 |
okay, great. There are people working on it, but like, okay, 00:40:50.880 |
the things that say, hey, I can tell this was written with that, right? Because of the implications 00:40:56.320 |
on how people learn, right? It is not a great thing if all it does is do your homework, 00:41:00.160 |
that is not the idea of homework as someone who liked to study so hard. But anyways, you get my 00:41:05.200 |
point. It's just the upside and the downside. And there could be much larger implications 00:41:05.200 |
that are much more difficult to predict. And it's our responsibility to really work hard to figure 00:41:14.560 |
that out. I was talking about AI ethics a decade ago, and I'm like, why won't anybody listen to us? 00:41:20.480 |
That's another one of those values things that you realize, hey, if I'm going to bring technology in 00:41:24.480 |
the world, I better bring it safely. And that to me comes with when you're an older company that's 00:41:30.320 |
been around, you realize that society gave you a license to operate and it can take it away. 00:41:34.880 |
And we see that happen to companies. And therefore you're like, okay, like why I feel so strong about 00:41:41.200 |
skills. Hey, if I'm going to bring in technology that creates all these new jobs and job dislocation, 00:41:45.920 |
then I should help people get new skills. Anyways, that's a long answer to what 00:41:50.560 |
good tech is, but the idea is that, in retrospect, there's a set of principles you could look at 00:41:57.040 |
and maybe learn something from my sort of rocky road through there. 00:42:01.040 |
- But it started with the power of we, and there's that big leap, I think, that propagates through 00:42:06.960 |
the things you're saying, which is the leap from focusing on yourself to the focusing on others. So 00:42:11.360 |
that having that empathy, you've said at some point in our lives and careers, our attention 00:42:16.720 |
turns from ourselves to others. We still have our own goals, but we recognize that our actions 00:42:22.000 |
affect many, that it is impossible to achieve anything truly meaningful alone. So it's to you, 00:42:28.640 |
I think maybe you can correct me, but ultimate good power is about collaboration. And maybe 00:42:38.080 |
in large companies, like delegation on great teams. 00:42:42.240 |
- The ultimate good power is actually doing something for society. That would be my 00:42:45.520 |
ultimate definition of good power, by the way. 00:42:48.080 |
Yeah, but how it's done, right? The how it's done. And so, you know, when you said a leap, 00:42:56.800 |
do you think people make a leap when they go from thinking about themselves to others? 00:43:00.480 |
Do you think it's a leap or do you think it kind of just is a sort of slow point? 00:43:04.560 |
- I think the leap is in deciding that you will care about others. 00:43:13.040 |
It's like going to the gym for the first time. Yes, it takes a long time 00:43:17.840 |
to develop that and to actually care, but that decision that I'm going to actually care about 00:43:22.000 |
other human beings. Yeah. I think, or at least, like, yeah, it just feels like a deliberate action 00:43:29.280 |
- Yeah, because sometimes I think it happens a little, it's maybe not as deliberate. Yeah, 00:43:32.720 |
it's a little bit more gradual because it might happen because you realize that, geez, 00:43:36.320 |
I can't get this done alone. So I got to have other people with me. Well, how do I get them 00:43:40.240 |
to help me do something? So I think it does happen a little bit more gradually. And as you 00:43:45.680 |
get more confident, you start to not think so much that it's about you. And you start to think 00:43:50.400 |
about this other thing you're trying to accomplish. And so that's why I felt it was a little more 00:43:55.680 |
gradual. I also felt like I can remember so well, you know, this idea that, again, now we're in the 00:44:04.960 |
80s, 90s, I'm a woman, I'm in technology. And I was down in Australia at a conference. 00:44:11.920 |
And I gave this great speech, again, me, power of me, you know, I'm thinking I give this great 00:44:17.280 |
speech, financial services, this guy, man walks up to me after I think he's going to like ask me 00:44:21.200 |
some great question. And he said to me, I wish my daughter could have been here. And in that moment, 00:44:27.600 |
at that point, up to then, I'd always been about, look, please don't notice I'm a 00:44:33.200 |
woman, do not notice that I am, I just want to be recognized for my work. Crossing over from me to 00:44:39.440 |
we, like it or not, I was a role model for some number of people. And maybe I didn't want to be, 00:44:46.320 |
but that didn't really matter. So I could either accept that and embrace it or not. I think it's 00:44:51.120 |
a good example of that transition. I did have a little epiphany with that happening. And then I'm 00:44:55.840 |
like, okay, because I would always be like, no, I won't go on a women's conference. I won't talk 00:44:59.680 |
here. I won't, you know, no, no, no. But then I sort of realized, wait a second, you know, 00:45:04.560 |
that old saying, you cannot be what you cannot see. And I said to myself, well, oh, wait a second. 00:45:11.280 |
Okay. I am in these positions I have a responsibility to, and it's to others. And 00:45:17.440 |
that's what I meant. I felt like it can be somewhat gradual that you come and you may 00:45:20.720 |
have these like pivotal moments that you see it, but then you feel it and you sort of move over 00:45:25.840 |
that transom into the power of we. - You're one of the most powerful tech leaders ever. 00:45:31.760 |
And as you mentioned the word power, you know, the old saying goes, power corrupts and absolute 00:45:37.120 |
power corrupts absolutely. Was there an aspect of power where you had to resist 00:45:48.000 |
its ability to corrupt your mind to sort of delude you into thinking you're smarter than you are, 00:45:55.040 |
that kind of thing, all the ways. - That's very dangerous. I agree. I mean, 00:45:58.320 |
I think you got to be careful who you surround yourself with. That's how I would answer that 00:46:02.240 |
question. Right. And people who will hold the mirror up to you can be done in a very positive 00:46:06.240 |
way, by the way, it doesn't mean, you know, but that we're sycophants, you cannot have that. 00:46:11.120 |
Right. I mean, it's like, I always say to someone like, hey, listen to me, tell me, 00:46:15.040 |
I mean, tell me what would make me better or do something. Or I have a husband that'll do that for 00:46:19.360 |
me quite easily, by the way. He'll always tell me. - He's the one that kind of gives you some 00:46:23.600 |
criticism sometimes. - I have been surrounded myself with a number of people that will do that. 00:46:26.880 |
And I think you have to have that. I had a woman that worked with me for a very long time. And 00:46:32.160 |
at one time we were competitors. And then at some point she started to work for me and stayed with 00:46:36.880 |
me for quite a while. And she was one of the few people that would tell me the truth in, you know, 00:46:40.800 |
sometimes I'm like, enough already. And she'd be like, do not roll your eyes at this. And you 00:46:46.960 |
absolutely have got to have that. And I think it also comes, it'll go back to my complete 00:46:51.600 |
commitment to inclusion and diversity, 'cause you gotta have that variety around you. You'll get a 00:46:56.400 |
better product and a better answer at the end of the day. And so that, to resist that allure, 00:47:01.920 |
I think it's about who you surround yourself with. I think current politics would say that too. 00:47:06.240 |
- So you write about diversity, and in general you value it a lot. So 00:47:10.720 |
can you speak to almost like philosophically, what does diversity mean to you? 00:47:14.800 |
- Diversity to me means I'm gonna get a better product, a better answer. I value different views. 00:47:22.960 |
And so it's inclusion. So I always say inclusion, diversity is a number, inclusion is a choice. 00:47:29.280 |
And you can make that choice every single day. - That's a good line. 00:47:33.920 |
- I really believe, and I've witnessed it, that I've had, when my teams are diverse, 00:47:39.120 |
I get a better answer. My friends are diverse, I have a better life. I mean, all these kinds of 00:47:43.760 |
things. And so I also believe there's no silver bullet, no easy way. You have to 00:47:49.600 |
authentically believe it. I mean, do you authentically believe that diversity is a good thing? 00:47:54.320 |
- Yeah, but I believe that diversity, like broadly-- - Of thought. I very broadly define it. 00:47:59.840 |
- Yeah, so like there's, sometimes the way diversity is looked at, the way diversity is 00:48:05.520 |
used today is like surface level characteristics, which are also important, but they're usually 00:48:10.160 |
reflective of something else, which is a diversity of background, a diversity of thought, a diversity 00:48:14.960 |
of struggle. Some people that grew up middle-class versus poor, different countries, different 00:48:22.240 |
motivations, all of that. Yeah, it's beautiful when different people from very different walks 00:48:26.880 |
of life get together. Yeah, it's beautiful to see. But sometimes it's very difficult to get 00:48:31.440 |
at that on a sheet of paper of the characteristics that defines the diversity. 00:48:37.040 |
- I know, so it is. It's just like, oh, I can't hire exactly for, or if I'm trying to, 00:48:41.040 |
but I do know one thing, that when people say, well, I can't find these kind of people I'm 00:48:46.560 |
looking for, I'm like, you're just not looking in the right places. 00:48:48.400 |
- Right, you have to open up-- - You gotta really open up new pools. 00:48:50.800 |
- Yeah, you have to think, like everybody, you don't have to have a PhD, just like you said. 00:48:55.360 |
- I'm sorry to say it. I know it's very valuable what you have, trust me, but-- 00:48:59.280 |
- Well, just like you said, it could even be a negative. So you mentioned, 00:49:04.000 |
like for good power, you are a CEO, you were a CEO for a long time of a public company. 00:49:10.480 |
Were there times when there was pressure to sacrifice what is good for the world 00:49:16.240 |
for the bottom line in order to do what's good for the company? 00:49:20.960 |
- There were a lot of times for that. I mean, I think every company faces that today. 00:49:25.680 |
I always felt like there's so much discussion about stakeholder capitalism, right? Do you just 00:49:31.040 |
serve a shareholder or do you have multiple? I have always found, and I've been very vocal about 00:49:36.000 |
that topic; I participated when the Business Roundtable wrote up a new purpose statement that 00:49:40.960 |
had multiple stakeholders. I think it's common sense. Like if you're gonna be 100 years old, 00:49:45.920 |
you only get there because you actually do at some time balance all these different stakeholders 00:49:50.800 |
in what it is that you do, and short-term, long-term, all these trade-offs. And I always say 00:49:55.520 |
people who write about it, they write about it black and white, but I have to live in a gray 00:49:59.360 |
world. Nothing I've ever done has been in a black and white world, hardly. Maybe things of values 00:50:04.080 |
that I had to answer, but most of it is gray. And so I think back on lots of different decisions. 00:50:10.480 |
I think back, as you would well remember, you're a student of history. IBM was really one of the 00:50:17.120 |
originators of the semiconductor industry, and certainly of commercializing the semiconductor 00:50:21.440 |
industry. Great R&D and manufacturing, but it is a game of volume. And so when I came on, 00:50:28.640 |
we were still doing the R&D and manufacturing our own chips. We were losing a lot of money, 00:50:35.440 |
yet here we had to fight a war on cloud and AI. And so, okay, now shareholders would say, 00:50:41.200 |
"Fine, shut it down." Okay, those chips also power some of the most important systems that power 00:50:46.320 |
the banks of today. If I just shut it down, well, what would that do? And so, okay, 00:50:52.480 |
the answer wasn't just stop it. The answer wasn't just keep putting money into it. The answer was, 00:50:58.400 |
and we had to kind of sit in an uncomfortable spot till we found a way. I mean, it's going to sound 00:51:02.720 |
so basic, but you as an engineer understand it. We had to separate. It was a very integrated process 00:51:09.680 |
of research, development, and manufacturing. And you'd also, you'd be perfecting things in 00:51:13.440 |
manufacturing. And these were very high performance chips. We had to be able to separate those. We 00:51:19.280 |
eventually found a way to do that so that we could take the manufacturing elsewhere and we would 00:51:24.960 |
maintain the R&D. I think it's a great example of the question you just asked, because people 00:51:31.280 |
would have applauded, others would have been, "This is horrible." Or we had a financial roadmap 00:51:37.280 |
that had been put in place that said, "I'll make this amount of EPS by this date." There came a 00:51:41.520 |
time we couldn't honor it because we had to invest. And so, there's a million of these decisions. I 00:51:47.760 |
think most people that run firms, any size firm, they're just one right after another like that. 00:51:53.760 |
And you're always making that short and long tension of, "What am I giving up?" 00:51:58.160 |
- What is that partnership like with the clients? Because you work with gigantic businesses. 00:52:02.480 |
And what's it like sort of really forming a great relationship with them, understanding 00:52:11.200 |
what their needs are, being in service of their needs? 00:52:14.320 |
- Yeah. Very simple. Honor your promises. And that happens over time. I mean, in service of, 00:52:22.400 |
which is often why you can work with competitors, because if I am really in service of you and 00:52:27.360 |
you need something that takes two of us to do, that becomes easier to do. Because I really, 00:52:32.240 |
we both care, you get what you needed. And so, I can remember during one of the times I was on a 00:52:38.960 |
European trip and at the time there were a lot of, and this is still true, views about technology and 00:52:46.800 |
national technology giants and global ones and the pros and the cons. And countries want their 00:52:51.600 |
own national champions, quite obvious. I mean, if I'm France or Germany. And there was a lot 00:52:56.560 |
of discussion about security and data and who was getting access to what. And I can remember being 00:53:01.600 |
in one of the, I was with Chancellor Merkel, I had met her many times. She's very well prepared, 00:53:06.560 |
very well prepared every time, as you would know. And I started to explain all these things about 00:53:12.240 |
why, how, you know, how we don't share data, how, who it belongs to. Our systems never had back 00:53:18.080 |
doors. And she sort of stopped me. Like, you're one of the good guys. Like, stop. Now, that wasn't 00:53:24.480 |
about me personally. She's talking about a company that's acted consistent with values for decades, 00:53:30.080 |
right? So to me, how you work with those big kind of clients is you honor your promises. You say 00:53:37.200 |
what you do and you do what you say. And you act with values over a long period of time. And that, 00:53:42.080 |
to me, when people talk about values, it is not a fluffy thing. It is not a fluffy thing. It is a, 00:53:46.560 |
I mean, if I was starting a company now, I'd spend a lot of time on that. On, 00:53:53.520 |
you know, why we do what we do and why some things are tolerable and some aren't, 00:53:58.800 |
you know, what your fundamental beliefs are. And many people sort of zoom past that stage, 00:54:03.200 |
right? It's okay for a while. - And never sacrifice that. 00:54:07.040 |
- You would never sacrifice that. I don't think you can. 00:54:10.400 |
- So there was a lot of pressure when you took over as CEO and there was 22 consecutive quarters 00:54:16.320 |
of revenue decline between 2012 and the summer of 2017. So it was a stressful time. Maybe not, 00:54:22.080 |
maybe you can correct me on that. So as a CEO, what was it like going through that time, 00:54:28.720 |
the decisions, the tensions in terms of investing versus making a profit? 00:54:33.840 |
- I always felt that sense of urgency was so high, even if I was calm on the outside, 00:54:41.040 |
because you have one of the world's largest pensions. So, so many people depend on you. 00:54:46.880 |
You have a huge workforce. They're depending on you. You have clients whose businesses don't run 00:54:50.720 |
if you don't perform, et cetera. And shareholders, of course, right? And so, but I also am really 00:54:59.840 |
clear. This was perhaps the largest reinvention IBM ever had to undertake. I had a board that 00:55:04.720 |
understood that. In fact, some people, some of the headlines were like, this is existential, 00:55:09.440 |
right? I mean, nobody gives you a right to exist forever. And there aren't many tech companies that endure. You're the student of it. They are gone. They are all gone. And so if we didn't reinvent ourselves,
we were going to be extinct. And so now, but you're big and it's like changing, what's that old 00:55:26.480 |
saying? Can I change the wheels while the train's running or something like that? Or the engines 00:55:31.280 |
while the plane's flying? And that's what you have to do. And that took time. And so, you know, 00:55:36.720 |
Lex, do I wish it would have been faster? Absolutely. But the team worked so hard. And in 00:55:43.040 |
that timeframe, 50% of the portfolio was changed. It's a very large company. And if you would, 00:55:51.840 |
I also divested $10 billion of businesses. So if you would look at that growth rate without
divestitures and currency, which now today everyone talks about currency. Back then, 00:56:01.040 |
we were the only international guy. Net of divestitures and currency, the growth was flat. 00:56:07.120 |
Is flat great? No, but flat for a big transformation. I was really proud of the 00:56:12.000 |
team for what they did. That is actually pretty miraculous to have made it through that. I had 00:56:17.200 |
my little nephew one day and he would see on TV occasionally when there'd be criticism, 00:56:21.280 |
and he'd say, "You know, Auntie, does it make you mad when they talk mean?" 00:56:26.000 |
And I just looked at him and I said, "How do you feel?" I said, "Look,
I'm doing what has to be done." And I happened to be the one there. And if you have great conviction, 00:56:37.360 |
and I did, a great conviction, I knew it was the right thing. I knew it would be needed for IBM to 00:56:43.280 |
live its second century. And my successor, they have picked up, gone forward. I mean, you go back, 00:56:49.120 |
we did the acquisition of Red Hat. I mean, we had to find our way on cloud, right? We were 00:56:53.440 |
late to it. So we had to find our way. And eventually that led us to hybrid cloud. 00:56:57.920 |
- We did a lot of work with Red Hat back in 2017. - Oh, we'd always done a lot of work with them.
Actually, we were one of the first investors when they were first formed. But that was 2018. 00:57:08.880 |
We took quite a hit for it, even--oh, it was the largest software acquisition ever to that point.
But it is the foundation, right, of what is our hybrid cloud play today and doing very, very well. 00:57:20.720 |
But it had to take a short-term hit for that, right? Short-term hit for a very large $34 billion 00:57:26.480 |
acquisition. But for all of us, it was the right thing to do. So I think when you get really 00:57:32.080 |
centered on, you know it's the right thing to do, you just keep going, right? 00:57:35.680 |
So the team had the vision, they had the belief, and everything else, the criticism doesn't matter. 00:57:39.600 |
So we didn't always have exactly the right--this wasn't a straight arrow. But head down,
you know you're right, keep going. Okay, made a mistake. There's no bad mistake as long as 00:57:48.960 |
you learn from it, right? And keep moving. So yes, did it take longer, but we are the largest 00:57:53.600 |
that was there. - Could you maybe just on a small tangent, educate me a little bit? So Red Hat
originally is Linux, open source distribution of Linux, but it's also consulting. Well, it's-- 00:58:04.240 |
- A little bit of consulting, but it's mostly software distribution. - It's mostly Linux.
- It was mostly software. Yeah, absolutely. Absolutely.
So but today, IBM is very much, there's, you know-- 00:58:14.480 |
Most IT services in the world are done by IBM. There's so many varied--so basically if you have
issues, problems to solve in business, in the software space, IBM can help. 00:58:29.040 |
Yes, and so in my last year, our services business, we broke it into two pieces. And one piece was spun off into a company called Kyndryl, which is managed outsourcing. Keeping things
running, and they're off creating their own company. What IBM then retained is really the 00:58:45.840 |
part I built with PwC Consulting, the big consulting arm. And so today, the IBM of today in 2023 is,
you know, at least ending 2022, was 30% consulting, and the other 70% would be, 00:58:58.320 |
what would you consider software cloud AI? So hybrid cloud and AI is the other, and some 00:59:03.760 |
hardware, obviously. Still, the mainframe is modernized, alive, and kicking, and still running 00:59:09.760 |
some of the most important things of every bank you can think of practically in the world. 00:59:15.280 |
And so that is the IBM of today versus perhaps, you know, and Red Hat is a big piece and an 00:59:23.600 |
important part of that software portfolio. And they had some services with them for 00:59:27.600 |
implementation, but it wasn't a very large part. And it's grown by leaps and bounds, 00:59:33.200 |
you know, because originally the belief was everything was going to go to the public cloud. 00:59:36.880 |
And at least many people thought that way. We didn't. In fact, I mean, we tried. We 00:59:44.000 |
acquired a public cloud company. We really tried to work it. But what we found was a lot of the
mission critical work, it was tuned for consumer world. It wasn't tuned for the enterprise. 00:59:53.520 |
So then time is elapsing here though, and you got to be at scale. And we didn't have 01:00:00.560 |
any application. Remember, we're not an application company. So it wasn't like we had 01:00:04.720 |
an Office. We didn't have anything that, like, pulled things out to the cloud. And so as we looked for what we do best, we went back to who you are: we really know mission critical work.
We know where it lives today, and we know how to make it live on the cloud, 01:00:18.880 |
which led us down hybrid cloud. You know, that belief of what the real world would turn into: there'll be things on traditional that you'll never move, because it doesn't make sense. There'll be private clouds that, you know, have all the benefit of the cloud, but they just don't have, you know, infinite expansion. And then there'll be public clouds, and you're going to
have to connect them and be secure. And that's what took us down the path with Red Hat, that 01:00:40.400 |
belief. - The structure of that is fundamentally different than something that's consumer facing. 01:00:44.320 |
So the lesson you learn there is you can't just reuse something that's optimized for consumers. 01:00:49.520 |
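The workload split described above (some things stay on traditional systems, some go to a private cloud, the rest to public cloud) can be sketched as a toy placement policy. The workload traits and rules below are illustrative assumptions, not IBM's or Red Hat's actual criteria:

```python
# Hypothetical sketch of a hybrid-cloud placement decision.
def place_workload(w):
    """Pick an infrastructure tier for a workload described by a dict of traits."""
    if w.get("mainframe_dependent"):
        # Doesn't make sense to move: stays on traditional infrastructure.
        return "traditional"
    if w.get("mission_critical") and w.get("data_residency"):
        # Wants cloud benefits, but under controlled location and capacity.
        return "private-cloud"
    # Everything else gets elastic, effectively unlimited public capacity.
    return "public-cloud"

workloads = [
    {"name": "core-banking", "mainframe_dependent": True},
    {"name": "claims-processing", "mission_critical": True, "data_residency": True},
    {"name": "marketing-site"},
]
placements = {w["name"]: place_workload(w) for w in workloads}
print(placements)
```

The "hybrid" part is that all three tiers then have to be connected and secured as one environment, which is the layer the Red Hat acquisition addressed.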
- Yeah, very interesting point. It doesn't mean consumer companies can't move up in the enterprise 01:00:52.560 |
because obviously they have, right? But I think it's very hard to take something from the enterprise 01:00:58.080 |
- And because it got to be simple, consumable, all the things we talked about already. 01:01:01.920 |
- Plus you have to have the relationships with the enterprise. 01:01:03.920 |
- Yeah, very different. Yeah. I mean, you know our history, right? At one time we had the PC business 01:01:08.160 |
and, you know, the short answer to why we would not do that is it's a consumer facing business. 01:01:14.160 |
We were good at the enterprise and that consumer business, A, highly competitive, 01:01:18.400 |
got to be low cost, all the things that are not the same muscles necessarily of being in an 01:01:23.760 |
innovation driven, you know, technology business. - Yeah, but it is now Lenovo, I guess that's where it ended up. - That's right, Lenovo acquired it.
- They were extremely good at that, but not as good, as you're saying, at the enterprise.
- Yeah, Lenovo's very good at their PC world, yes, and they can sell into the enterprise, 01:01:43.040 |
right? But you as a consumer can go buy a Lenovo too. So look in China, right? Look in other places. 01:01:48.000 |
So that's what I mean by consumer, you know, an end device. And that was a big decision because 01:01:52.480 |
it would have been one of the last things that had our logo on it that sits in your hands, right? So 01:01:56.480 |
when a new generation says, "Well, what does IBM do?" right? - Was that a difficult decision? 01:02:00.560 |
Do you remember? - This is a long time ago now, 01:02:02.960 |
it's like 2005. So they're all difficult because it's not only things, it's people. 01:02:08.800 |
But it's back to knowing who you are, is how I would sum that up, right? And we were never
great at making a lot of money at that. And you can remember originally it was IBM PC, 01:02:20.560 |
then there were IBM clones, or they were called IBM clones back then as the field became highly, 01:02:26.960 |
highly competitive. And as things commoditized, we would often move them out and move on to the next innovation level. - But because of that, it's not as public facing.
- That's right. That's absolutely right. - Even though it's one of the most 01:02:41.200 |
recognizable logos ever. - Yeah, isn't it true? That is very true. 01:02:44.640 |
That is actually a very important point. And that is, you know, branding, as you say, 01:02:50.080 |
one of the most recognizable and a very highly ranked brand strength around the world. And so 01:02:55.840 |
that's a trade-off. I mean, I can't, you know, because there was a time you'd have something 01:03:02.160 |
of IBM in your home or a cash register, as an example, you'd walk into a store, actually, 01:03:06.720 |
they're still in places; that business went to Toshiba. - Can you speak to consulting a little bit? What
does that entail? To train up, to hire a workforce that can be of service to all kinds of different 01:03:18.240 |
problems in the software space, in the tech space. What's entailed in that? 01:03:22.320 |
- I mean, you have to value a different set of things, right? And so you've got to always stay 01:03:28.640 |
ahead. It's about hiring people who are willing to learn. It is about, at the same time, in my view, 01:03:36.320 |
it's what really drives you to be customer-centric. - Maybe you can educate me. I think consulting is 01:03:43.440 |
a kind of, you roll in and you try to solve problems that businesses have, like with expertise, 01:03:50.800 |
right? - Okay. - Is that the process of consulting? - Somewhat, right? So, fair enough. When you say 01:03:56.880 |
the word consulting, it's a really broad spectrum. I mean, I think people could be sitting here 01:04:00.880 |
thinking it does any, it could be, I just give advice and I leave, to all the way to, I run your 01:04:06.080 |
systems, right? And I think it's generally, people use the word to cover everything in that space. 01:04:12.480 |
So we sort of fit in the spot, which is, we would come in and live at that intersection of business 01:04:17.920 |
and technology. So yeah, we could give you recommendations and then we'd implement them 01:04:21.280 |
and see them through, because we had the technology to go to the implementation and see them through. 01:04:24.880 |
And at the time back then, there'd been five of those that had failed, where companies had bought other consulting firms. And so we were, okay, that was the great
thing about, I mean, the harrowing thing about it was, here, please go work on this. None of 01:04:38.960 |
the others have ever succeeded before. And yet on the other hand, the great promise was, you could 01:04:43.760 |
really, clients were dying at that time when we were doing that, to get more value out of their 01:04:48.400 |
technology and have it really change the way the business worked. So I think of it as how to 01:04:52.960 |
improve business and apply technology and see it all the way through. That's what we do today still. 01:04:56.720 |
- Yeah, to see it all the way through. Yes. So let me say, it's almost like a personal question. 01:05:01.920 |
So that was a big thing you were a part of that you led in 2002, that you championed and helped, 01:05:08.000 |
I should say, negotiate the purchase of Monday, the consulting arm of PricewaterhouseCoopers 01:05:14.640 |
for $3.5 billion. So what were some of the challenges of that that you remember, 01:05:19.840 |
personal and business? - At that time, PwC had to, really had to divest. And so they were in
peril of going to IPO, right? So we sort of swept in at that point and said, and we'd been thinking 01:05:33.760 |
about it a long time and started to work on that as an acquisition. So, kind of balancing which way 01:05:39.680 |
would they go, IPO or acquisition. And so the challenges are obvious and part of it's why they 01:05:44.720 |
went with us as an acquisition. Big difference to be a private firm than a public firm, very big. 01:05:49.520 |
I can remember one of the guys, he asked somebody, "How long have you been with IBM?" And the person 01:05:53.600 |
answered, "143 quarters." Okay, that's a little enlightening about a business model, right? 01:05:59.360 |
So we had the challenges of being private versus public. You have the challenge of when you acquire 01:06:05.040 |
something like that, as I say, you acquire hearts, not parts. They could leave. I mean, 01:06:10.400 |
you could destroy your value by them leaving. They can walk right out the door. I mean, yes, 01:06:14.480 |
you can put lots of restraints, but still, that you have there. And then we had to really build 01:06:20.400 |
a new business model that people and clients would see as valuable and be willing to pay for. 01:06:26.320 |
And so we had to do something that lived at that intersection and say that how this was unique is 01:06:31.760 |
what we were doing. So you had the people aspect, you had that they were going to be public and they 01:06:37.840 |
had always been private their whole life. And then you had the business model. So, and the others had 01:06:44.160 |
all failed that had tried to do this. So yeah, it was a tough thing to do. 01:06:50.320 |
What about the personal side of that? That was a big leap, step up for you. You've been at IBM 01:06:54.880 |
for a long time. This is a big sort of leadership, like very impactful, large scale leadership 01:07:05.360 |
decisions. What was that like? - So, unlike in my career earlier, 01:07:10.400 |
where I said I was changing jobs, I said I wasn't comfortable, et cetera. So now here, fast forward 01:07:15.440 |
10 years, and I'm like, okay, honestly, how I felt inside on one hand, I did what I learned, 01:07:21.360 |
like inventory what you know how to do. Like you have some good strengths that could work here. 01:07:26.400 |
The other part of me said, boy, this is really high profile. And I felt, and I can remember 01:07:32.640 |
saying to someone like, this is going to kill me or catapult, probably nothing in between. 01:07:38.560 |
- And that wasn't terrifying to you? That was okay? You were okay with that kind of pressure? 01:07:41.520 |
- I was okay with that because I felt I knew enough, you know, like these things I had, 01:07:47.200 |
and I'll tell you the one thing I felt I knew the best. Any consultants worth their weight,
they really do care that they deliver for an end client. And I felt I understood service to a 01:08:04.240 |
client so well, that what it meant to really provide value. So I knew we would have like 01:08:09.280 |
something that I knew the PwC people, more than anything, wanted to deliver value to those clients 01:08:14.720 |
they had, next to then developing their people, that those were like the really two things. 01:08:18.960 |
And that I could, and I also knew they felt they could do better if they had more technology. 01:08:23.680 |
And we did. So there really was a reason, you know, that I could really believe in. So I 01:08:28.160 |
authentically believed back to that point. And I also felt I had built some of those skills to be 01:08:33.760 |
able to do that. But I wouldn't call it terrifying, but make no mistake, Lex, it was very hard. And 01:08:44.080 |
it turned out to be extremely successful. By the time we ended, it was worth 19 and a half, 01:08:48.240 |
well, by the time I stepped away, I had run it for, oh goodness gracious, quite a long time. I'm
going to say seven or eight, nine years. And we were 19 and a half billion dollars. 01:08:58.960 |
It made 2.7 billion in profit. It was very consequential to IBM. But the fact that it was 01:09:04.960 |
consequential is also very, I mean, there was a time as we moved through it, I can even remember 01:09:10.400 |
it. We just weren't meeting the goals as fast as we should. And some of it was clients were like, 01:09:15.440 |
oh, now you're IBM. So, I mean, some things I knew would happen, but they happened so much faster. 01:09:21.360 |
It'd be things like clients would say, oh, IBM cares about a quarter. So let's negotiate every 01:09:24.880 |
quarter on these prices. And, you know, when they were private, they didn't have these issues. Well, 01:09:30.560 |
that had an impact on margins really fast. And so that ability- 01:09:35.760 |
You pick them up right away. And I thought, oh boy, I mean, if I don't get this turned around, 01:09:41.360 |
this is really a problem. And the team learned a lot of lessons. I mean, I learned people I had to 01:09:46.640 |
move out, that I learned that when people don't believe they can do something, they probably won't 01:09:51.520 |
do it. So, you know, we wanted to run the business at a certain level. I really did have some great 01:09:56.960 |
leaders, but they didn't really believe it could do that. And I finally had to come to terms with, 01:10:00.800 |
if you don't really believe in something, you really aren't going to probably make it happen 01:10:03.760 |
at the end of the day. And so we would change that. We would have to actually get some more 01:10:09.760 |
help to help us on doing so. But then it turned. And I can remember the day that we started 01:10:14.560 |
really getting the business to hum and start to, it was almost like, finally. And I gave the team 01:10:20.960 |
this little plaque, this little, it was kind of corny, paperweight thing. And I'm trying to remember if it was Thomas Edison, and he said, "Many of life's greatest failures are people who
gave up right before they were going to be successful." And it's so true. I mean, there 01:10:40.560 |
was also a governor of Texas who's passed, but she had said, someone said, "What's the secret 01:10:45.680 |
of your success?" And she said, "It's passion and perseverance when everyone else would have given 01:10:50.560 |
up." And I feel that's what that taught me. That taught me, like, no matter how bad this gets, 01:10:56.480 |
you are not giving up. Now you can't keep doing the same thing, like the doctor, this hurts, 01:11:01.120 |
oh, then stop doing it. You can't keep doing the same thing. We had to keep changing till we found 01:11:06.640 |
our right way to get the model to work right. And client work, we never had an issue and kept so 01:11:14.160 |
many of the people. And now we are 25 years almost later, and a number of them run parts of the IBM 01:11:21.440 |
business still today. So it's that old Maya Angelou saying, when you say, "What do I remember?" 01:11:27.200 |
They'll say, "You won't remember the specifics of this, but you'll remember how you felt." And 01:11:32.080 |
that's kind of how I feel. And I think they do too, the whole team does, of that. Like, 01:11:35.600 |
I'll get anniversary notes still on that, you know, when you've been through something like 01:11:40.960 |
that together with people. - So during the acquisition, the way you knew it was the right team was the ones that could believe that this consulting business can grow,
can integrate with IBM and all that. - Yeah, I was lucky. Look, I did things 01:11:56.240 |
that helped that. I mean, I knew that people joining us would feel more comfortable if they 01:11:59.840 |
had people leading it that they recognized, et cetera. But again, I learned those that didn't 01:12:06.160 |
then, I eventually had to take some action. But PwC Consulting had a lot of really dedicated leaders
to it and I give them a lot of credit. - Well, it's amazing to see a thing that 01:12:17.040 |
kind of started at that very stressful time and then it turned out to be a success. What about the acquisition itself? Is there something interesting to say about the,
like, what you learned about maybe negotiation? Because there's a lot of money involved too. 01:12:33.040 |
- To me, it was a win-win and we both actually cared that customers got value. So there was this 01:12:39.520 |
like third thing that had a benefit, not me, not them, there was this third thing. And then next to that--
- I like how you think that people would have the wisdom or what it takes to have great 01:12:50.640 |
negotiation. But yeah, so it's a win-win is one of the ways you can have successful negotiation. 01:12:54.720 |
- But it's like obvious to even say that, right? I mean, if you can, back to being in service of 01:12:58.480 |
something, we were both in service of clients. And then, you know, I always say,
when you have a negotiation with someone, okay, both parties always kind of walk away a little 01:13:09.920 |
bit. Okay, that's good. If they both walk away going, "God, I should have got a little bit more. 01:13:14.160 |
Okay, but it's okay if I should have got." Okay, they're both a little fussy. When one walks away 01:13:18.800 |
and thinks they did great and the other one did horrible, that deal is usually born bad. I mean, because they never work out that way. And I've always felt that way with negotiations: if you push too far down, you usually will be sorry you did that, you know?
- So don't push too far. I mean, that's ultimately what collaboration and empathy 01:13:40.080 |
means is you're interested in the long-term success of everybody together versus like 01:13:45.120 |
your short-term success. - And then you get the discretionary 01:13:47.040 |
energy from them versus like, "Okay, you screwed me here. I'm done," right? 01:13:50.720 |
- So let's rewind even further back. - Oh no. Oh no. Do you feel like this is
a nostalgia interview? Oh no. - Let me just ask the romantic question. 01:14:01.120 |
What did you love most about engineering, computer science, electrical engineering, 01:14:04.720 |
so in those early days from your degree to the early work? 01:14:08.160 |
- I love that logic part of it, right? And you do get a sense of completion at some point when 01:14:13.760 |
you reach certain milestones that, you know, like, yes, it worked or yes, that finite answer to that. 01:14:20.080 |
So that's what I loved about it. I loved the problem-solving of it. 01:14:22.880 |
- Computing, what led you down that path? Computing in general, what made you fall in 01:14:27.040 |
love with computing, with engineering? - It's probably that back to that desire, 01:14:32.480 |
wanting to know how things work, right? And so that's like a natural thing. You know, 01:14:36.160 |
math, I loved math for that reason. I always wanted to study how did that, you know, 01:14:39.760 |
how did it get that to work kind of thing? So it goes back in that time. But I did start, 01:14:45.200 |
when I went to, when I started at Northwestern, I was already in the engineering school, 01:14:48.400 |
but my first thought was to be a doctor, that that was far more noble, that I should be a 01:14:51.840 |
medical doctor until I could not pass human reproduction as a course. And I thought the 01:14:56.320 |
irony that I could not, I'm like, I got all these colored pencils, I got all these pictures, 01:15:00.880 |
this is not working out for me. - I'm gonna stick to math. 01:15:02.560 |
- It was the only course in my four-year college education I had to take pass/fail, 01:15:07.200 |
I guess, otherwise I risked, you know, impairing my grade point average. 01:15:11.840 |
- Engineering it is. So, but after about 10 years, you jumped from the technical 01:15:16.640 |
role of systems engineer to management, to a leadership role. Did you miss at that time 01:15:22.720 |
the sort of the technical direct contribution versus being a leader, a manager? 01:15:26.640 |
- That's an interesting point. Like I say, I've always been sort of a doer leader, you know, so. 01:15:31.600 |
- So you never lost that. - I never really did. Even, 01:15:35.520 |
you know, and I think this is really important for today. The best way people learn is experientially, 01:15:41.600 |
I think. Now, that's a generalization, because there are people who can learn in all different ways, right? So I've done things like with my whole team,
they all had to learn how to build cloud applications and write the code. And so,
you know, I don't care what your job is, write code, you know? And I remember when we were 01:16:01.920 |
trying to get the company to understand AI, we did something called a cognitive jam. Okay, 01:16:06.720 |
there's a reason we picked the word cognitive, by the way, instead of AI. Today we use the word AI. 01:16:10.480 |
It was really symbolic. It was to mean, this is to help you think, not replace your thinking. 01:16:17.920 |
There was so much in the zeitgeist about AI being a bad thing at that time. So that was why we picked 01:16:22.960 |
a mouthful of a word like cognitive. And it was like, no, no, this is to help you actually. So 01:16:28.000 |
do what, you know, do what you do better or do something you haven't yet learned. And we did 01:16:32.800 |
something called the cognitive jam, but the whole point was everybody in the company could volunteer, 01:16:36.720 |
get on a team. You either had to build something that improved one of our products, or did 01:16:41.120 |
something for a client, or solved a social issue with AI. And again, this goes back now, 10 years, and people did things from anti-bullying applications to, you know, railroad stuff to
whatever it was, but it got like a hundred thousand people to understand, you know, viscerally what is 01:17:01.600 |
AI. So that's a long answer to my belief around experiential. And so do you ever give it up? 01:17:07.280 |
I don't think so. Cause I actually think that's pretty good to get your hands dirty 01:17:10.960 |
in something. You know, you can't do it, you know, depending what you're doing, 01:17:16.160 |
So even a CEO, you try to get your hands dirty a little bit. 01:17:20.720 |
I've played, I mean, still, I'm not saying I'm any good at any of it anymore, but. 01:17:26.240 |
But it's that, yeah, it's that really understand, right? And not be afraid of. 01:17:30.400 |
Yeah. Like we mentioned at the beginning, IBM research has helped catalyze some of the 01:17:35.600 |
biggest accomplishments in computing and artificial intelligence history. So Deep Blue, IBM's Deep Blue versus Kasparov chess match in '96 and '97. Just to ask kind of like what your
perception is, what your memory is of it, what is that moment? Like this seminal moment, I believe 01:17:55.760 |
probably one of the greatest moments in AI history, when the machine first beat a human at a 01:18:02.960 |
You make a very interesting point, because it is like one of the first demonstrations of using a 01:18:07.200 |
game to like bring something to people's consciousness, right? And to this date, 01:18:10.960 |
people use games, right, to demonstrate different things. But at the time, it's funny. I didn't 01:18:18.800 |
necessarily think of it so much as AI, and I'll tell you why. I was, and I'm not a chess player, 01:18:23.200 |
you might be a chess player, so I'm not an expert at it. But I think I understand chess properly,
that chess has got a finite number of moves that can be made. Therefore, if it's finite, 01:18:32.640 |
it's really a demonstration of supercomputing, right? It's about the amount of time and how fast
it can crunch through to find the right move. So in some ways, I thought of it as almost a 01:18:42.160 |
bigger demonstration of that. But it is absolutely, as you said, it was a motivator, one of the big 01:18:48.160 |
milestones of AI, because it put in your consciousness that it's man versus this other thing.
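Her "crunch through the moves" framing is exactly how an exhaustive game-tree search works. Here is a minimal sketch using the much simpler game of Nim instead of chess; Deep Blue's real search added alpha-beta pruning, custom chips, and a handcrafted evaluation function, none of which are shown here:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def best_move(piles):
    """Exhaustive negamax over Nim: taking the last stone wins.

    Returns (score, move) for the player to move, where score is +1 if
    they can force a win and -1 otherwise, and move is (pile_index, take).
    """
    if sum(piles) == 0:
        return (-1, None)  # opponent took the last stone; we already lost
    best_score, best_mv = -2, None
    for i, n in enumerate(piles):
        for take in range(1, n + 1):
            child = list(piles)
            child[i] -= take
            child_score, _ = best_move(tuple(sorted(child)))
            score = -child_score  # a good position for the opponent is bad for us
            if score > best_score:
                best_score, best_mv = score, (i, take)
    return (best_score, best_mv)

# From piles (1, 3, 5) the first player can force a win
# by taking 3 stones from the third pile.
print(best_move((1, 3, 5)))  # -> (1, (2, 3))
```

Chess is finite in the same sense, but its tree is astronomically larger, which is why Deep Blue needed pruning and specialized hardware rather than the plain enumeration above.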
So you saw it as just a challenging computation problem, and this is a way to demonstrate hardware 01:19:02.880 |
But the thing is, there is a romantic notion that chess is the embodiment of human intellect, 01:19:09.360 |
I mean, intelligence, that you can't build a machine that can beat a chess champion in chess. 01:19:14.320 |
See, and I was blessed by not being a chess expert, so it wasn't like, 01:19:21.200 |
Well, that's probably required to not be paralyzed by the immensity of the task, 01:19:26.800 |
so that this is just solvable. But it was a very, very, I think that was a powerful moment, 01:19:32.720 |
so speaking just as an AI person, that reinvigorated the dream. 01:19:39.520 |
You were a little kid back then, though, right? In '95? You have to be, like, were you...
Um, it was awe-inspiring, because, especially sort of growing up in the Soviet Union, 01:19:59.040 |
you think, especially of Garry Kasparov and chess, like, your intuition is weak about those things. 01:20:06.880 |
I didn't see it as computation. I thought of it as intelligence, because chess, 01:20:13.680 |
for a human being, doesn't feel like computation. 01:20:16.800 |
It feels like some complicated relationship between memory and patterns and intuition 01:20:23.840 |
and guts and instinct and all of those, like... 01:20:28.240 |
If you watch someone play, that's what you would conclude, right? 01:20:30.400 |
So to see a machine be able to beat a human, I mean, you get a little bit of that with ChatGPT,
now, it's like, language was to us humans the thing that we kind of... 01:20:42.800 |
Surely, the poetry of language is something only humans can really have. 01:20:47.200 |
It's going to be very difficult to replicate the magic of 01:20:51.200 |
natural language without deeply understanding language. 01:20:55.120 |
But it seems like ChatGPT can do some incredible things with language
Through all the AI winters from the '60s, the promise of the... 01:21:10.960 |
It was, wow, this is possible for a simple set of algorithms 01:21:17.920 |
to accomplish something that we think of as intelligence. 01:21:20.160 |
So that was truly inspiring, that maybe intelligence, 01:21:27.680 |
And of course, now, the funny thing, what happens is the moment you accomplish it, 01:21:32.480 |
everyone says, "Oh, it's just brute force algorithms. It's silly." 01:21:37.840 |
Every single time you pass a benchmark, a threshold to win a game, people say, 01:21:49.040 |
And there's going to be a moment when we're going to have to contend with AI systems that 01:21:59.040 |
And you have to start to have some difficult discussions about, 01:22:08.480 |
And this is really exciting because that also puts a mirror to ourselves to see, 01:22:13.360 |
okay, what's the right way to treat each other as human beings? 01:22:19.360 |
- It is, because I always say it's a reflection of humanity. 01:22:26.000 |
Bad stuff in the past, you'll teach it bad stuff for the future. 01:22:29.280 |
Which is why I think efforts to regulate it are a fool's errand. 01:22:35.040 |
Because the technology itself is not inherently good or bad, 01:22:39.360 |
but how it's used or taught can be good or bad for sure. 01:22:42.640 |
And so that, to me, will unveil now a whole different way of having to look at technology. 01:22:49.120 |
- What about another magical leap with the early days of Watson 01:22:56.080 |
And what's your vision for Watson in general? 01:22:57.840 |
- Yeah, and it was really inspired by first chess, right? 01:23:01.280 |
And Kasparov, and then you come forward in time. 01:23:04.080 |
And I think what Watson did, because you used a really important word, 01:23:09.600 |
AI had kind of waxed and waned in these winters, right? 01:23:12.640 |
In and out, in and out, popular or not, more money, less money, in and out, 01:23:18.800 |
And so I think that was one of the first times it brought to the forefront of people like, 01:23:26.480 |
Because here it is playing against these two gentlemen. 01:23:29.520 |
And it did lose at first, and then finally won at the end of the day. 01:23:34.480 |
And what it was doing is making you say, "Hey, natural language." 01:23:39.200 |
It's actually understanding natural language. 01:23:40.800 |
It's one of the first demonstrations of natural language support 01:23:43.680 |
and bit of reasoning over lots of data, right? 01:23:47.200 |
And so that it could have access to a lot of things, 01:23:54.880 |
And I do think it brought to the consciousness of the public, 01:23:57.520 |
and in good ways and bad, because it probably set expectations very high of like, 01:24:02.160 |
And I still do believe that it has got the ability to change 01:24:11.680 |
That so many decisions are not optimal in this world. 01:24:16.080 |
And it's right or wrong what took us down a path of healthcare first with our AI. 01:24:23.120 |
And I think there's a really valuable lesson in what we learned. 01:24:26.240 |
One is that, I actually don't think the challenges are the technology. 01:24:30.800 |
But the challenges are the people challenges around this, right? 01:24:38.000 |
I mean, I saw that straight up with doctors and like, 01:24:43.760 |
meaning they're so busy in the way they've been taught to do something. 01:24:46.960 |
Do they really have time to learn another way? 01:24:49.840 |
I saw it was a mistake when you put it on top of processes that didn't change, 01:24:56.080 |
I mean, it was all human change management around it that were really its biggest challenges. 01:25:01.520 |
And another valuable lesson, we picked, back to usage, you think of IBM as moonshots. 01:25:06.320 |
We picked really hard problems to start with. 01:25:08.560 |
I think you see a lot of technology now starts with really simple problems. 01:25:12.400 |
And by that, it probably starts to build trust because I start little. 01:25:17.280 |
It's like, oh, I'm not ready to outsource my diagnosis to you, 01:25:20.480 |
but I'll get some information here about a test question. 01:25:28.640 |
And when you make a market, the choice of problem you work on gets to be very important. 01:25:33.520 |
When you're catching up, well, then it's a scale game. 01:25:37.520 |
But Watson proved, I think, I mean, I hope I'm not being too... 01:25:44.240 |
I think Watson brought AI back out a winner for the world. 01:25:47.760 |
And that since then, there's just been one company after another 01:25:52.160 |
And I have no regrets of anything that we did. 01:25:56.400 |
I mean, we probably rebuilt it many times over. 01:26:00.160 |
And today, to IBM, a Watson is more about AI inside of a lot of things, 01:26:05.920 |
if you think of it that way, which is more like an ingredient versus it's a thing in and of itself. 01:26:10.640 |
And I think that's how it'll bring its real value. 01:26:12.720 |
More as an ingredient, and it's so badly needed. 01:26:15.360 |
And even back then, the issue was so much data. 01:26:28.080 |
- So it's part of the suite of tools that you use when you go to enterprise 01:26:34.080 |
- Yeah, so AI for security, AI in automated operations, AI in your robotics, 01:26:38.480 |
AI on your factory floor, you know what I mean? 01:26:41.680 |
And I think, and that's why even to this day, thousands, I mean, thousands and thousands 01:26:46.160 |
of clients of IBM still have the Watson components that it's the AI being used. 01:26:50.320 |
So it became a platform is how I would say it, right? 01:26:53.040 |
And an ingredient that went inside and consultants, like you said, had to learn. 01:26:59.520 |
They had to learn: don't just put it on something. 01:27:04.560 |
You got to rethink how that thing should work, 01:27:06.880 |
because with the AI, it could work entirely differently. 01:27:09.760 |
And so I also felt it could open up and still will open up jobs to a lot of people, 01:27:15.040 |
because more like an assistant, and it could help me be qualified to do something. 01:27:19.200 |
And we even years ago saw this with the French banks, very unionized, 01:27:23.520 |
but that idea that you could, in this case, the unions voted for it 01:27:29.840 |
And so, and that's just part about being really dedicated 01:27:42.480 |
What do you think about the fact that HAL 9000 was named after IBM? 01:27:54.320 |
I have done, I've like researched this, tried to find any evidence and people have talked to, 01:27:58.480 |
was it really, one letter, it was one letter off, right? 01:28:01.760 |
But people don't know H is one letter off of I, A is one letter off of B, 01:28:07.760 |
That was the, I think that's a solution found afterwards, you know? 01:28:13.520 |
I do think it's one of the early demonstrations of evil AI. 01:28:18.560 |
I could push back on that because it's presented as evil in the movie 01:28:25.280 |
But it's a really interesting ethical question because the role of HAL 9000 01:28:33.840 |
And so the question that is a human question, it's not an AI question, 01:28:41.760 |
They pay very heavy costs for a vision, for a goal of a future that creates a better world. 01:28:51.520 |
And so that's the question certainly in space. 01:28:55.520 |
but the limited resources, who do I allocate my time and money and efforts? 01:28:59.760 |
Like I said, I've spent a decade talking about this question of AI ethics, right? 01:29:02.960 |
And that it needs really considerable, not just attention, 01:29:06.640 |
because otherwise it will mirror everything we love and everything we don't love. 01:29:12.000 |
And again, and that's the beauty in the eye of the beholder, right? 01:29:16.240 |
With what you're doing and what you're going to do, how do you think about it? 01:29:21.040 |
Do you think about the AI you're going to develop as having guardrails 01:29:29.200 |
So there's so many interesting ways to do this the right way, 01:29:35.280 |
I tend to believe that transparency is really important. 01:29:37.920 |
So I think some aspect of your work should be open-sourced 01:29:45.920 |
that creates a kind of forcing function for transparency of how you do things. 01:29:54.000 |
maybe it's because of the podcast and I've just talked to a lot of people, 01:30:03.520 |
Sometimes there's a pressure, you have a PR team, 01:30:05.920 |
you have to care for investors and discussion, so on, let's protect, 01:30:14.320 |
where you have incredible engineers doing fascinating work 01:30:17.600 |
and also doing work that's difficult, complex human questions being answered. 01:30:23.680 |
And we don't know about any of them as a society. 01:30:26.320 |
And so we can't really have that conversation. 01:30:28.720 |
Even though that conversation would be great for hiring, 01:30:31.440 |
it would be great for revealing the complexities of what the company is facing. 01:30:37.200 |
you understand that it wasn't malevolence or half-assedness 01:30:42.000 |
and the decision-making is just a really hard problem. 01:30:44.720 |
And so I think transparency is just good for everybody. 01:30:51.520 |
just having a lot of public conversations about this is serious stuff. 01:30:55.520 |
It's that AI will have a transformative impact on our society 01:31:03.280 |
through all kinds of ways we're not expecting, 01:31:05.360 |
which is social media recommendation systems. 01:31:07.760 |
They at scale have impact on the way we think, 01:31:17.360 |
like the kind of stuff we consume to grow and learn 01:31:19.520 |
and become better human beings, all of that, that's all AI. 01:31:22.800 |
And then obviously the tools that run companies on which we depend, 01:31:29.600 |
we need to know all about those AI decisions. 01:31:34.160 |
well, we don't want the AI to say these specific set of bad things. 01:31:39.280 |
- It's unfortunately, I don't believe it's possible to-- 01:31:48.480 |
by creating a set of cold mathematical rules. 01:31:52.240 |
- Unfortunately, it's all fuzzy and gray areas. 01:32:03.280 |
And that this is, I think back, it was probably, I don't know, 2015, 2016, 01:32:11.360 |
Notice the word transparency, that belief that with AI, 01:32:16.960 |
You should know the data that went into training it. 01:32:21.680 |
If it's being used, you have a right to know these things. 01:32:26.960 |
really powerful principles to be followed, right? 01:32:31.760 |
'cause here we were when we were working on particularly healthcare, 01:32:34.240 |
like, okay, you care who trained it and what, and where did, 01:32:38.320 |
and that's sort of, you know, that comes to your mind. 01:32:43.600 |
But it just in general, people won't trust the technologies, 01:32:47.120 |
I don't think, unless they have transparency into those things. 01:32:51.840 |
- I think a lot of people would like to know, sort of, 01:32:58.960 |
suffer from imposter syndrome, that self-critical brain. 01:33:02.240 |
So, you know, taking that big step into leadership, 01:33:05.760 |
did you at times suffer from imposter syndrome? 01:33:13.440 |
and sort of the confidence to really step up? 01:33:21.600 |
Like, no matter, like the bigger the job gets, 01:33:23.600 |
you turn and you look to the left and the right 01:33:38.240 |
So I do hear a lot of people talk about imposter syndrome, 01:33:47.200 |
I've spent some time helping people on that topic. 01:33:54.720 |
you have a right to be there like anyone else does 01:33:58.240 |
if you've prepared for that moment, you know? 01:34:04.800 |
like a confidence thing more than anything else. 01:34:18.000 |
I have an opportunity, I'm gonna stop propagating. 01:34:18.960 |
- You know, it's good or bad, I just focus on the work. 01:34:22.960 |
- One important lesson you said you learned from your mom 01:34:45.920 |
and you don't realize it when they're happening. 01:34:47.760 |
So most of my, I feel like most of my self-discovery, 01:34:51.920 |
it's been like something happens in a year or two 01:34:55.440 |
or some number later, I look back on it and say, 01:35:09.600 |
I've actually heard you say that on different podcasts 01:35:13.360 |
well, it depends, you know, like know yourself a bit, right? 01:35:19.680 |
like for me, there's moods when you're super self-critical, 01:35:25.440 |
and there's many, sometimes you're emotional, 01:35:40.000 |
regarding gender that you felt in your career? 01:35:50.480 |
- You know, I chose to never look at it, okay? 01:35:58.720 |
and '70s and the '80s where I think I was surrounded, 01:36:04.160 |
viewed our way to get ahead was just to work hard. 01:36:07.200 |
and that was the way you differentiated yourself. 01:36:13.520 |
You're always, you know, you learned a lot of things, 01:36:18.080 |
I'm very remindful that I have worked for companies 01:36:22.560 |
that are very steeped in those values of equal opportunity. 01:36:36.960 |
I get evaluated if my team has built up their skills. 01:36:39.920 |
So this is, you know, when you're really formative, 01:36:42.080 |
you're in a culture that that's what it's valuing, right? 01:36:49.040 |
"Did I ever feel I was held back for that reason?" 01:36:53.120 |
you know, I write about a few of the stories in the book, 01:36:55.680 |
I'm laying cables at night and the guys are at the bar. 01:36:58.160 |
Now, I didn't really wanna go with them to the bar anyways. 01:37:00.640 |
They'd be like, "We'll be back to get you, you know, bye." 01:37:09.280 |
back to my earlier story about being a role model, 01:37:25.600 |
and I was talking about media and about women CEOs 01:37:31.840 |
when it's a woman CEO, they call the person by name 01:37:33.920 |
and when it's a man, they call the company out, 01:37:36.400 |
not the person's name exactly associated with the issue." 01:37:39.200 |
And I said, "Yeah, well, I think you have to just understand 01:37:49.920 |
really can be blown out of proportion, right? 01:38:03.680 |
even some of my best friends, the first reaction is, 01:38:18.960 |
And I say, "No, the book, I really worked hard 01:38:32.960 |
it doesn't even matter whether there's a woman, 01:38:34.160 |
it could be another diverse group that feels it. 01:38:38.640 |
And that's why actually I'm okay talking about it 01:38:42.800 |
There were times in my life on my looks or my weight 01:38:53.600 |
Now, on the other hand, when there's so few of you 01:39:03.600 |
If you do good work, it's easier to be recognized. 01:39:13.040 |
like my advice to young women going to engineering, 01:39:18.240 |
and any new job you do is gonna be solving problems. 01:39:20.480 |
Things like that are what I take away from that 01:39:31.120 |
when I talk to incredible women like yourself, 01:39:44.480 |
- So you get like somebody that looks like you, 01:39:46.800 |
somebody that, and the category could be tiny, 01:39:58.560 |
put 1 million black employees into the middle class 01:40:01.840 |
get them the right skills, upwardly mobile jobs. 01:40:09.440 |
it just did regular leadership session at IBM 01:40:15.360 |
And here, these are extremely accomplished people. 01:40:27.760 |
I feel like the whole country is on my shoulder, 01:41:22.240 |
I mean, you never told me my interpretation is, 01:41:26.320 |
that you feel like in service of other people 01:41:36.480 |
that you keep asking me really hard questions? 01:42:02.480 |
very specific things about, like you said, science, 01:42:22.400 |
that all of these very well accomplished people 01:42:28.160 |
- And then so then regular people and young people, 01:42:46.080 |
"that the reason I never had children on my own 01:42:49.040 |
"was because I had already raised my family." 01:42:58.800 |
what was the right place to find a work-life balance for you 01:43:05.840 |
have time for away from work and be successful? 01:43:16.960 |
I mean, they will take everything they can from you. 01:43:23.760 |
And when people ask for, you know, I need a roof, 01:43:26.000 |
I'm like, okay, I had to come to terms with, 01:43:28.640 |
the criminal was me if I needed that balance. 01:43:36.160 |
because I am in extreme awe of people with children who work, 01:43:50.080 |
their pain is your pain every minute of the day. 01:43:58.960 |
And so when she couldn't go to the teacher meeting, 01:44:04.880 |
between my brother and I and my other two sisters. 01:44:06.720 |
And so I'm still, they still call me mama bear even. 01:44:11.280 |
I mean, I'm extremely protective of all of them. 01:44:24.640 |
and my husband came from a family where his father died 01:44:29.280 |
very similar end point, different reasons why he ended up, 01:44:37.120 |
And I don't want people to believe to do my job, 01:45:01.440 |
And so I talk about it because it was a choice we made. 01:45:10.160 |
what he had to do, I'd already felt that way. 01:45:16.640 |
Well, I like to think that for my little guys, 01:45:21.920 |
And there's no doubt though, the choices we made, 01:45:33.920 |
when you've got less people to have to take care of. 01:46:04.160 |
built some of the most sophisticated systems. 01:46:09.840 |
and just recently retired as a chief executive 01:46:53.840 |
was because they didn't feel confident to come back. 01:47:03.680 |
they're like, "You're right, not that much happened." 01:47:08.560 |
So, it was a long answer to your question about, 01:47:24.960 |
so you could keep great people in the workforce. 01:47:26.640 |
- So you mentioned you're friends with Mary Barra, 01:47:43.120 |
is I think she's one of the most authentic leaders out there, 01:47:46.800 |
- I mean, just very different companies, huge challenges. 01:47:49.920 |
- I worked there first, though, remember, right? 01:47:51.360 |
So I'm very, in some ways, I'm very beholden, right? 01:47:58.560 |
well, I'm a bit older, so, but circa that genre. 01:48:04.000 |
When you do anything hard, it takes time and perseverance, 01:48:07.840 |
And you can get that, where do you get the fuel for it? 01:48:15.360 |
or you can get it from your network or your relationships. 01:48:18.080 |
And I'm a firm believer relationships are from what you give, 01:48:22.880 |
Meaning you give, trust me, they will come back 01:48:33.200 |
there'll be a day I need Lex, he will be back. 01:48:33.200 |
even though I'm no longer still active as a CEO, 01:48:53.760 |
when I first became a CEO calling me and saying, 01:48:55.920 |
"Hey, it's a little lonely here, so let me talk to you." 01:48:59.360 |
And then when they became, I did the same for them. 01:49:04.640 |
And so it's a very supportive, almost to a T, 01:49:11.040 |
any of the women you could name who have been CEOs, 01:49:13.680 |
I would say, almost to a T, have all been very supportive. 01:49:18.400 |
another not-for-profit right now called Journey, 01:49:18.400 |
the Fortune's Most Powerful Women had started, 01:49:27.680 |
particularly diverse women, but women in general, 01:49:29.360 |
to more quickly be into positions of leadership and power. 01:49:37.600 |
in kind of creating this little group of fellows 01:49:41.520 |
- Friendship and love is core to this whole thing, 01:49:43.680 |
not just the success, but just the whole human condition. 01:49:46.480 |
Let me ask one last question, advice for young people. 01:49:49.920 |
You've had a difficult upbringing, a difficult life, 01:49:59.600 |
or just people in general who are struggling a bit, 01:50:02.000 |
trying to figure out how they can have a career 01:50:04.400 |
they can be proud of, or maybe a life they can be proud of? 01:50:10.320 |
is just one if you leave something a little bit better. 01:50:27.920 |
But my advice would probably, when I'm asked this, 01:50:31.360 |
I would tell them to ask more questions than give answers. 01:50:40.800 |
It's funny, I asked my husband the same question the other day. 01:50:58.000 |
like they're in such a hurry to somewhere, I don't know where. 01:51:00.240 |
And that if they just had patience and let life unfold, 01:51:03.600 |
I think they may be surprised where they ended up. 01:51:06.480 |
And actually, I think that's a really good answer, to be honest. 01:51:45.760 |
- Thanks for listening to this conversation with Ginni Rometty. 01:51:45.760 |
please check out our sponsors in the description.