Steve Viscelli: Trucking and the Decline of the American Dream | Lex Fridman Podcast #237


Chapters

0:00 Introduction
0:44 Ethnography
12:57 Challenges of driving a truck
31:36 Trucking industry: State of affairs
1:04:41 Future of autonomous trucks
1:30:57 Solving the automated truck dilemma
2:02:52 Role of society in automated trucking
2:30:01 Tesla and revolutionizing the trucking industry
2:49:41 Hope and final thoughts


00:00:00.000 | The following is a conversation with Steve Viscelli,
00:00:02.640 | formerly a truck driver and now a sociologist
00:00:06.020 | at the University of Pennsylvania,
00:00:07.960 | who studies freight transportation.
00:00:10.760 | His first book, "The Big Rig: Trucking and the Decline
00:00:14.080 | of the American Dream," explains how long haul trucking
00:00:17.640 | went from being one of the best blue collar jobs
00:00:20.020 | to one of the toughest.
00:00:22.120 | His current ongoing book project,
00:00:23.960 | "Driverless? Autonomous Trucks and the Future
00:00:26.680 | of the American Trucker," explores self-driving trucks
00:00:30.120 | and their potential impacts on labor and on society.
00:00:33.240 | This is the Lex Fridman Podcast.
00:00:36.440 | To support it, please check out our sponsors
00:00:38.640 | in the description.
00:00:39.800 | And now, here's my conversation with Steve Viscelli.
00:00:43.760 | You wrote a book about trucking called
00:00:47.300 | "The Big Rig: Trucking and the Decline of the American Dream"
00:00:51.120 | and you're currently working on a book
00:00:53.760 | about autonomous trucking called
00:00:55.540 | "Driverless? Autonomous Trucks and the Future
00:00:58.600 | of the American Trucker."
00:01:00.640 | I have to bring up some Johnny Cash to you
00:01:02.200 | 'cause I was just listening to this song.
00:01:03.560 | He has a ton of songs about trucking,
00:01:06.080 | but one of them I was just listening to,
00:01:08.560 | it's called "All I Do is Drive,"
00:01:11.560 | where he's talking to an old truck driver.
00:01:13.520 | It goes, "I asked him if those trucking songs
00:01:17.240 | tell about a life like his.
00:01:19.200 | He said, 'If you want to know the truth about it,
00:01:22.320 | here's the way it is.
00:01:24.160 | All I do is drive, drive, drive.
00:01:27.160 | Try to stay alive, that's the course.
00:01:29.480 | And keep my mind on my load, keep my eye upon the road.
00:01:32.280 | I got nothing in common with any man
00:01:34.760 | who's home every day at five.
00:01:36.360 | All I do is drive, drive, drive.
00:01:39.360 | Drive, drive, drive, drive."
00:01:41.960 | So I gotta ask you, same thing that he asked the trucker.
00:01:45.760 | You worked as a trucker for six months
00:01:47.720 | while working on the previous book.
00:01:51.180 | What's it like to be a truck driver?
00:01:55.020 | - I think that captures it.
00:01:56.420 | It really does.
00:01:59.700 | - Can you take me through the whole experience,
00:02:01.660 | what it takes to become a trucker,
00:02:03.980 | what actual day-to-day life was on day one, week one,
00:02:07.380 | and then over time, how that changed?
00:02:09.400 | - Yeah, well, the book is really about
00:02:12.280 | how that changed over time.
00:02:14.940 | So my experience, and I'm an ethnographer, right?
00:02:18.020 | So I go in, I live with people, I work with people,
00:02:22.500 | I talk to them, try to understand their world.
00:02:26.900 | - Ethnographer, by the way, what is that?
00:02:29.460 | The science and art of capturing the spirit of a people?
00:02:34.460 | - Yeah, life ways.
00:02:36.620 | I think that would be a good way to capture it.
00:02:38.620 | Try to understand what makes them unique as a society,
00:02:43.620 | maybe as a subculture.
00:02:47.660 | What makes them tick that might be different
00:02:50.460 | than the way you and I are wired.
00:02:54.420 | And to really sort of thickly describe it
00:02:57.000 | would be at least one component of it.
00:02:59.100 | That's sort of the basic essential.
00:03:01.060 | And then for me, I want to exercise
00:03:05.500 | what C. Wright Mills called the sociological imagination,
00:03:09.460 | which is to put that individual biography
00:03:13.260 | into the long historical sweep of humanity,
00:03:17.560 | if at all possible.
00:03:19.500 | My goals are typically more modest than C. Wright Mills'
00:03:22.940 | and to then put that biography
00:03:26.220 | in the larger social structure, right?
00:03:29.020 | To try to understand that person's life
00:03:31.540 | and the way they see the world,
00:03:34.080 | their decisions in light of their interests
00:03:36.380 | relative to others and conflict and power
00:03:38.700 | and all these things that I find interesting.
00:03:40.780 | - In the context of society and in the context of history.
00:03:43.860 | - Yeah.
00:03:44.700 | - And the small tangent, what does it take to do that?
00:03:47.940 | To capture this particular group, the spirit, the music,
00:03:52.940 | the full landscape of experiences that a particular group
00:03:57.880 | goes through in the context of everything else.
00:04:00.760 | You only have a limited amount of time
00:04:02.500 | and you come to the table probably with preconceived notions
00:04:06.040 | that are then quickly destroyed, all that whole process.
00:04:08.820 | So I don't know if it's more art or science,
00:04:11.480 | but what does it take to be great at this?
00:04:13.620 | - I do think my first book was a success
00:04:18.580 | relative to my goals of trying to really get at the heart
00:04:22.700 | of sort of the central issues
00:04:25.120 | and the lives being led by people.
00:04:28.100 | If I have a resource, a talent,
00:04:32.560 | it's that I'm a good listener.
00:04:35.220 | I can talk with anybody.
00:04:39.660 | My wife loves to remark on this,
00:04:41.900 | that I can sort of sit down with anyone.
00:04:44.580 | I think I learned that from my dad who worked at a factory
00:04:49.100 | and actually had a lot of truckers go through
00:04:51.700 | the gate that he operated.
00:04:53.800 | And he always had a story, a joke for everybody,
00:04:57.140 | kind of got to know everyone individually.
00:04:59.500 | And he just taught me that essentially everyone
00:05:03.020 | has something to teach you.
00:05:04.820 | And I try to embody that.
00:05:06.540 | Like that's the rule for me is every single person
00:05:11.540 | I interact with can teach me something.
00:05:14.780 | - I gotta ask you, I'm sorry to interrupt
00:05:16.820 | because I'm clearly of the two of us, the poorer listener.
00:05:19.720 | - I think you're a great listener.
00:05:23.100 | - Thank you.
00:05:23.940 | - I've been listening to the podcast,
00:05:25.020 | I think you're a great listener.
00:05:26.340 | - I really appreciate that.
00:05:28.480 | You've done a large number of interviews, like you said,
00:05:31.140 | of truckers for this book.
00:05:33.900 | I'm just curious, what are some lessons you've learned
00:05:38.500 | about what it takes to listen to a person enough,
00:05:43.240 | maybe guide the conversation enough
00:05:46.420 | to get to the core of the person, the idea,
00:05:49.260 | again, the ethnographer goal to get to the core.
00:05:54.060 | - Yeah, I think it doesn't happen in the moment, right?
00:05:58.560 | So I'm a ruminator.
00:06:01.940 | I just sit with the data for years.
00:06:05.820 | I sat with the trucking data for almost 10 full years
00:06:10.340 | and just thought about the problems and the questions
00:06:13.900 | using everything that I possibly could.
00:06:16.940 | And so in the moment, my ideal interview is,
00:06:20.340 | I open up and I say, tell me about your life as a trucker.
00:06:24.860 | And they never shut up and they keep telling me
00:06:28.580 | the things that I'm interested in.
00:06:29.800 | Now, it never works out that way
00:06:32.140 | because they don't know what you're interested in.
00:06:34.540 | And so it's, a lot of it is the, as you know,
00:06:38.260 | as I think you're a great interviewer, prep, right?
00:06:41.660 | So you try to get to know a little bit about the person
00:06:45.100 | and sort of understand kind of the central questions
00:06:48.620 | you're interested in that they can help you explore.
00:06:51.700 | And so I've done hundreds of interviews
00:06:55.660 | with truck drivers at this point.
00:06:57.620 | And I should really go back and read the original ones.
00:07:01.260 | They're probably terrible.
00:07:02.520 | - What's the process like?
00:07:03.400 | You're sitting down, do you have an audio recorder
00:07:05.520 | and also taking notes or do you do no audio recording,
00:07:07.960 | just notes or?
00:07:08.920 | - Yeah, audio recorder and, you know,
00:07:11.300 | social scientists always have to struggle with sampling,
00:07:13.820 | right, like who do you interview?
00:07:15.200 | Where do you find them?
00:07:16.040 | How do you recruit them?
00:07:17.560 | I just happened to have a sort of natural place to go
00:07:21.060 | that gave me essentially the population
00:07:23.160 | that I was interested in.
00:07:24.960 | So all these long haul truck drivers
00:07:26.460 | that I was interested in, they have to stop and get fuel
00:07:29.440 | and get services at truck stops.
00:07:31.960 | So I picked a truck stop at the juncture
00:07:35.160 | of a couple of major interstates,
00:07:37.060 | went into the lounge that drivers have to walk through,
00:07:40.200 | you know, with my clipboard and everybody who came through,
00:07:44.080 | you know, I said, "Hey, are you on break?"
00:07:46.720 | And that was sort of the first criteria was,
00:07:49.600 | do you have time, right?
00:07:50.800 | And if they said yes, I said, you know,
00:07:53.520 | I'd say I'm a graduate student, you know,
00:07:56.360 | at Indiana University, I'm doing a study,
00:07:59.000 | I'm trying to understand more about truck drivers,
00:08:01.260 | you know, will you sit down with me?
00:08:02.500 | And I think I probably asked like 103
00:08:06.880 | or 104 people to get the first 100 interviews.
00:08:09.920 | - That's pretty good odds.
00:08:11.880 | - It's amazing, right?
00:08:13.160 | - Wow, okay.
00:08:14.000 | - For, you know, any response rate like that for interview,
00:08:16.600 | I mean, these are people who sat down
00:08:17.920 | and gave me an hour, sometimes more of their time,
00:08:21.600 | just randomly at a truck stop.
00:08:23.160 | And it just tells you something about like,
00:08:26.360 | truckers have something to say, they're alone a lot.
00:08:30.240 | And so I had to figure out how to kind of
00:08:33.080 | turn the spigot on, you know,
00:08:35.280 | and I got pretty good at it, I think, yeah.
00:08:38.580 | - So they have good stories to tell
00:08:40.240 | and they have an active life in the mind
00:08:41.880 | because they spend so much time on the road,
00:08:43.720 | just basically thinking.
00:08:45.360 | - Yeah, there's a lot of reflection,
00:08:47.980 | a lot of struggles, you know,
00:08:50.600 | and it's, they take different forms, you know,
00:08:53.920 | one of the things that they talk about
00:08:55.160 | is the impact on their families.
00:08:56.960 | They say truckers have the same rate of divorce
00:08:59.280 | as everybody else, and that's because trucking saves
00:09:02.200 | so many marriages 'cause you're not around
00:09:04.120 | and ruins so many.
00:09:05.280 | And so it ends up being a wash.
00:09:07.040 | - So, you know, I had this experience,
00:09:12.880 | I met another person and he recognized me from a podcast.
00:09:16.480 | And he said, you know, I'm a fan of yours
00:09:19.040 | and a fan of Joe Rogan, but you guys never talk,
00:09:22.680 | you always talk to people with Nobel prizes,
00:09:25.000 | you always talk to these kinds of people.
00:09:27.160 | You never talk to us regular folk.
00:09:30.180 | And that guy really stuck with me.
00:09:33.520 | First of all, the idea of regular folk is a silly notion.
00:09:36.600 | I think people that win Nobel prizes
00:09:39.280 | are often more boring than the people,
00:09:41.160 | these regular folks in terms of stories,
00:09:43.160 | in terms of richness of experience,
00:09:45.800 | in terms of the ups and downs of life.
00:09:48.040 | And, you know, that really stuck with me
00:09:51.480 | 'cause I set that as a goal for myself
00:09:53.700 | to make sure I talked to regular folk.
00:09:56.620 | And you did just this, talking, again, regular folk,
00:10:02.920 | it's human beings, all of them have experiences.
00:10:07.520 | If you were to recommend to talk to,
00:10:13.100 | to talk to some of these folks with stories,
00:10:15.560 | how would you find them?
00:10:16.920 | - Yeah, so I do do this sometimes for journalists
00:10:19.640 | who will come and they wanna write about
00:10:21.640 | sort of what's happening right now in trucking.
00:10:24.000 | And I send them to truck stops.
00:10:26.520 | You know, I say, you know-- - Just go to truck stops.
00:10:27.800 | - Yeah, there's a town called Effingham, Illinois.
00:10:31.480 | And it's just this place where, you know,
00:10:34.080 | bunch of huge truck stops, tons of trucks,
00:10:36.440 | and really nothing else out there.
00:10:38.720 | You know, it's in the middle of corn country.
00:10:41.720 | And, you know, again, truckers, this, you know,
00:10:44.660 | sadly, I think, you know, the politics of the day,
00:10:48.120 | it's changing a little bit.
00:10:50.300 | I think there's a little, the polarization is getting
00:10:54.040 | to the trucking industry in ways that, you know,
00:10:57.640 | maybe we're seeing in other parts of our social world.
00:11:02.440 | But truckers are generally, you know,
00:11:04.400 | real open, sort of friendly folks.
00:11:07.680 | Now, some of them ultimately like to work alone
00:11:10.120 | and be alone.
00:11:11.380 | That's a relatively small subset, I think.
00:11:15.400 | But all of them are generally, you know,
00:11:17.500 | kind of open, you know, trusting,
00:11:20.160 | willing to have a conversation.
00:11:21.480 | And so, you go to the truck stop and you go in the lounge
00:11:25.440 | and there's usually a booth down there
00:11:27.480 | and somebody's sitting at their laptop or on their phone
00:11:30.520 | and willing to strike up a conversation.
00:11:32.160 | You should try that.
00:11:33.000 | You should, you know? - 100% will try this.
00:11:35.040 | Just again, we're just going from tangent to tangent.
00:11:38.680 | We'll return to the main question.
00:11:40.720 | But what do they listen to?
00:11:43.220 | Do they listen to talk radio?
00:11:45.720 | Do they listen to podcasts, audio books?
00:11:47.720 | Do they listen to music?
00:11:48.840 | Do they listen to silence?
00:11:50.940 | - Everything. - Everything.
00:11:52.080 | - Everything.
00:11:52.920 | Some, I mean, and some still listen to the CB,
00:11:55.440 | which, you know, is an ever-dwindling group.
00:11:59.440 | They call it the original internet, the citizens band.
00:12:02.200 | You know, they, back in the '70s,
00:12:03.880 | they thought it was gonna be the medium of democracy.
00:12:07.280 | And they love to just get on there and, you know,
00:12:10.160 | cruise along one truck after the other and chat away.
00:12:13.720 | Usually, you know, it's guys who know each other
00:12:15.560 | from the same company or happen to run into each other.
00:12:18.600 | But other than that, it's everything under the sun.
00:12:21.840 | You know, and that's, it's probably one of the stereotypes,
00:12:24.600 | and it's, I think it was more true in the past, you know,
00:12:28.160 | about the sort of heterogeneity of truck drivers.
00:12:32.240 | They're a really diverse group now.
00:12:33.860 | You know, there's definitely a large,
00:12:37.060 | still a large component of rural white guys
00:12:40.160 | who work in the industry.
00:12:42.420 | But there's a huge growing chunk of the industry
00:12:45.600 | that's immigrants, people of color, and even some women.
00:12:50.120 | Still huge barriers to women entering it,
00:12:52.680 | but it's a much more diverse place than most people think.
00:12:55.900 | - So let's return to your journey as a truck driver.
00:13:00.200 | What did it take to become a truck driver?
00:13:03.520 | What were the early days like?
00:13:05.120 | - Yeah, so this is, I mean, this is a central part
00:13:07.140 | of the story, right, that I uncovered.
00:13:09.900 | And the good part was that I went in
00:13:11.980 | without knowing what was gonna happen.
00:13:14.300 | So I was able to experience it as a new truck driver would.
00:13:19.140 | It's one of the important stories in the book
00:13:20.880 | is how that experience is constructed by employers
00:13:24.620 | to sort of, you know, help you think the way
00:13:28.060 | that they would like you to think about the job
00:13:29.900 | and about the industry and about the social relations of it.
00:13:34.160 | It's super intimidating.
00:13:35.640 | I say in the book, you know, pretty handy guy,
00:13:40.100 | you know, familiar with tools, machines,
00:13:41.860 | like, you know, comfortable operating stuff,
00:13:44.020 | like from time I was a kid.
00:13:46.780 | The truck was just like a whole 'nother experience.
00:13:50.580 | I mean, as I think most people think about it,
00:13:52.860 | it's this big, huge vehicle, right?
00:13:55.340 | It's really long, it's 70 feet long,
00:13:58.140 | it can weigh 80,000 pounds.
00:14:00.560 | You know, it does not stop like a car,
00:14:02.660 | it does not turn like a car,
00:14:04.400 | but at least when I started, and this has changed,
00:14:10.500 | it's part of the technology story of trucking,
00:14:13.100 | the first thing you had to do was learn how to shift it.
00:14:15.800 | And it doesn't shift like a manual car,
00:14:19.340 | the clutch isn't synchronized,
00:14:21.060 | so you have to do what's called double clutch.
00:14:24.020 | And it's basically the foundational skill
00:14:26.860 | that a truck driver used to have to learn.
00:14:29.520 | So you would, you know, accelerate,
00:14:32.060 | say you're in first gear, you push in the clutch,
00:14:35.000 | you pull the shifter out of first gear,
00:14:37.620 | you let the clutch out,
00:14:39.020 | and then you let the RPMs of the engine drop an exact amount,
00:14:44.020 | then you push the clutch back in,
00:14:46.580 | and you put it in second gear.
00:14:48.300 | If your timing is off,
00:14:50.340 | those gears aren't gonna go together.
00:14:52.580 | So if you're in an intersection,
00:14:54.740 | you're just gonna get this horrible grinding sound
00:14:57.220 | as you coast to a dead stop underneath the stoplight
00:15:02.080 | or whatever it is.
00:15:03.200 | So the first thing you have to do is learn to shift it.
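The double-clutch sequence described above is essentially a timing problem: in an unsynchronized gearbox, engine RPM has to fall to match the next gear's speed before the clutch goes back in, or the gears grind. A minimal sketch of that timing window (the RPM figures, decay rate, and tolerance below are illustrative assumptions, not numbers from the interview):

```python
def attempt_upshift(start_rpm, target_rpm, wait_ticks,
                    decay_per_tick=100, tolerance=75):
    """Sketch of the double-clutch upshift: clutch in, pull out of gear,
    clutch out, wait in neutral while engine RPMs fall, clutch in, try the
    next gear. The gears mesh only if RPM has dropped to within `tolerance`
    of the next gear's speed. All numbers are illustrative assumptions."""
    # RPMs fall steadily while the driver waits in neutral
    rpm_at_engage = start_rpm - wait_ticks * decay_per_tick
    return abs(rpm_at_engage - target_rpm) <= tolerance

# Engage too early or too late and you get the grinding-to-a-stop scenario:
print(attempt_upshift(1500, 1100, wait_ticks=1))  # False — RPMs still too high, gears grind
print(attempt_upshift(1500, 1100, wait_ticks=4))  # True  — timed right, clean shift
print(attempt_upshift(1500, 1100, wait_ticks=7))  # False — waited too long, missed the window
```

The point of the sketch is that the window is narrow and one-shot, which is why, as Steve describes, a mistimed shift in an intersection leaves you coasting to a stop.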
00:15:06.240 | And so at least for me and a lot of drivers
00:15:09.580 | who are going to private company CDL schools,
00:15:12.260 | what happens is it's kind of like a bootcamp.
00:15:14.640 | They ship me three states away from home,
00:15:17.600 | send you a bus ticket and say,
00:15:19.240 | "Hey, we'll put you up for two weeks."
00:15:21.560 | You sit in a classroom,
00:15:22.840 | you sort of learn the theory of shifting,
00:15:25.280 | the theory of kind of how you fill out your log book,
00:15:29.440 | rules of the road,
00:15:31.080 | you do that maybe half the day,
00:15:32.720 | and then the other half you're in this giant parking lot
00:15:35.540 | with one of these old trucks
00:15:36.900 | and just like destroying what's left of the thing.
00:15:40.020 | And it's lurching and belching smoke
00:15:43.460 | and just making horrible noises and like rattling.
00:15:46.000 | I mean, in these things, there's a lot of torque.
00:15:48.440 | And so if you do manage to get it into gear,
00:15:51.020 | but the engine's lugging,
00:15:52.280 | I mean, it can throw you right out of the seat.
00:15:54.540 | So it's like this bull you're trying to ride
00:15:57.960 | and it's super intimidating.
00:15:59.580 | And the thing about it is that for everybody there,
00:16:03.680 | almost everybody there, it's super high stakes.
00:16:07.640 | So trucking has become a job of last resort
00:16:11.040 | for a lot of people.
00:16:12.800 | And so they lose a job in manufacturing.
00:16:16.800 | They get too old to do construction any longer, right?
00:16:20.480 | The knees can no longer handle it.
00:16:24.000 | They get replaced by a machine, their job gets offshored.
00:16:27.480 | And they end up going to trucking
00:16:28.580 | because it's a place where they can maintain their income.
00:16:31.600 | And so it's super high stress.
00:16:34.540 | Like they've left their family behind,
00:16:36.360 | maybe they quit another job.
00:16:38.220 | They're typically being charged a lot of money.
00:16:40.180 | So that first couple of weeks,
00:16:41.900 | like you might get charged $8,000 by the company
00:16:45.060 | that you have to pay back if you don't get hired.
00:16:47.740 | And so the stakes are high
00:16:49.300 | and this machine is huge and it's intimidating.
00:16:52.460 | And so it's super stressful.
00:16:53.840 | I mean, I watched grown men break down crying
00:16:57.700 | about like how they couldn't go home
00:16:59.140 | and tell their son that they had been telling
00:17:01.420 | they were gonna go become a long haul truck driver
00:17:04.300 | that they'd failed.
00:17:05.240 | And it's kind of this super high stress system.
00:17:08.260 | It's designed that way partly
00:17:09.620 | 'cause as one of my trainers later told me,
00:17:11.980 | it's basically a two week job interview.
00:17:14.460 | Like they're testing you.
00:17:15.500 | They're seeing like, how's this person gonna respond
00:17:18.460 | when it's tough, when they have to do the right thing
00:17:21.340 | and it's slow and they need to learn something,
00:17:24.600 | are they gonna rush?
00:17:25.600 | Or are they gonna kind of stay calm, figure it out,
00:17:30.320 | nose to the grindstone?
00:17:31.560 | 'Cause when you're a truck driver, you're unsupervised.
00:17:34.400 | And that's what they're really looking for
00:17:35.800 | is that kind of quality of conscientious work
00:17:39.600 | that's gonna carry through to the job.
00:17:41.160 | - Well, so the truck is such an imposing part
00:17:43.680 | of a traffic scenario.
00:17:45.880 | So you said like turning,
00:17:47.760 | it stresses me out every time I look at a truck
00:17:50.000 | 'cause the geometry of the problem is so tricky.
00:17:53.980 | And so if you combine the fact that they have to,
00:17:56.280 | like everybody, basically all the cars in the scene
00:17:58.260 | are staring at the truck and they're waiting,
00:17:59.940 | often in frustration.
00:18:02.100 | And in that mode, you have to then shift gears perfectly
00:18:06.900 | and move perfectly.
00:18:08.420 | And if when you're new, especially,
00:18:11.100 | like you'll probably, for somebody like me,
00:18:12.700 | it feels like it would take years
00:18:14.560 | to become calm and comfortable in that situation
00:18:17.340 | as opposed to be exceptionally stressed
00:18:18.900 | under the eyes of the road,
00:18:22.380 | everybody looking at you, waiting for you.
00:18:24.340 | Is that the psychological pressure of that?
00:18:27.300 | Is that something that was really difficult?
00:18:28.980 | - Yeah, absolutely.
00:18:30.060 | Again, I saw people freeze up in that intersection
00:18:34.260 | as horns are blaring and the truck's grinding gears
00:18:38.820 | and you just can't, and they just shut down.
00:18:40.700 | They're like, "This isn't for me, I can't do it."
00:18:42.980 | You're right, it takes years.
00:18:45.140 | If trucking is not considered a skilled occupation,
00:18:49.900 | but my six months there, and I was a pretty good rookie,
00:18:54.220 | but when I finished, I was still a rookie,
00:18:56.500 | even shifting, definitely backing,
00:18:59.820 | tight corners in situations.
00:19:02.020 | I could drive competently, but the difference between me
00:19:05.500 | and someone who had two, three years of experience
00:19:09.500 | was it was a giant gulf between us.
00:19:13.420 | And between that and the really skilled drivers
00:19:16.800 | who've been doing it for 20 years,
00:19:19.540 | is still another step beyond that.
00:19:21.180 | So it is highly skilled.
00:19:22.660 | - Would it be fair to break trucking
00:19:24.180 | into the task of driving a truck into two categories?
00:19:28.420 | One is like the local stuff,
00:19:30.200 | getting out of the parking lot, getting into,
00:19:32.060 | getting into driving down local streets
00:19:35.060 | and then highway driving, those two tasks.
00:19:38.780 | What are the challenges associated with each task?
00:19:41.620 | You kind of emphasized the first one.
00:19:43.900 | What about the actual like long haul highway driving?
00:19:48.040 | - Yeah, so, I mean, and they are very different, right?
00:19:51.540 | And the key with the long haul driving
00:19:55.220 | is really a set of, the way I came to understand it
00:20:00.120 | was a set of habits, right?
00:20:03.260 | We have a sense of driving, particularly men, I think,
00:20:07.100 | have a sense of driving as like being really skilled
00:20:10.180 | is like the goal and you can kind of maneuver yourself
00:20:13.260 | in and out of tight spaces with great speed
00:20:16.540 | and braking and acceleration.
00:20:18.400 | For a really good truck driver,
00:20:22.240 | it's about understanding traffic and traffic patterns
00:20:26.860 | and making good decisions
00:20:28.100 | so you never have to use those skills.
00:20:30.420 | And the really good drivers,
00:20:33.060 | the mantra is always leave yourself an out, right?
00:20:37.600 | So always have that safe place that you can put that truck
00:20:40.980 | in case that four-wheeler in front of you
00:20:43.740 | who's texting loses control.
00:20:45.600 | What are you gonna do in that situation?
00:20:50.140 | And what really good truck drivers do on the highway
00:20:54.740 | is they just keep themselves
00:20:56.780 | out of those situations entirely.
00:20:59.500 | They see it, they slow down, they avoid it.
00:21:03.480 | And then the local driving is really something
00:21:06.940 | that takes just practice and routine to learn.
00:21:10.820 | This quarter turn, it feels like the back of the truck
00:21:14.020 | sometimes is on delay when you're backing it up.
00:21:16.720 | So it's like, all right, I'm gonna do a quarter turn
00:21:18.480 | of the wheel now to get the effect that I want
00:21:22.060 | like five seconds from now
00:21:24.060 | in where that tail of that trailer is gonna be.
00:21:26.780 | And there's just no, I mean,
00:21:28.420 | some people have a natural talent for that,
00:21:30.820 | spatial visualization and kind of calculating those angles
00:21:34.040 | and everything, but there's really no escaping the fact
00:21:37.920 | that you've gotta just do it over and over again
00:21:40.520 | before you're gonna learn how to do it well.
00:21:42.920 | - Do you mind sharing how much you were getting paid,
00:21:46.520 | how much you were making as a truck driver
00:21:48.760 | in your time as a truck driver?
00:21:50.600 | - Yeah, I started out at 25 cents a mile
00:21:54.160 | and then I got bumped up to 26 cents a mile.
00:21:56.800 | So we had a minimum pay, which was sort of a new pay scheme
00:22:03.600 | that the industry had started to introduce,
00:22:07.040 | 'cause there's lots of unpaid work and time.
00:22:10.200 | And so we had a minimum pay of $500 a week
00:22:12.680 | that you would get if you didn't drive enough miles
00:22:15.360 | to exceed that.
00:22:16.300 | You get paid in sort of,
00:22:19.520 | so you get paid when you turn the bills in,
00:22:21.760 | which is the paperwork that goes with the load.
00:22:24.320 | So you have to get that back to your company
00:22:28.480 | and then that's how they bill the customer.
00:22:30.320 | And so you might get a bunch of those bills
00:22:32.440 | that kind of bunch up in one week.
00:22:34.400 | So I might get a paycheck for $1,200.
00:22:38.280 | And I mean, I was a poor graduate student,
00:22:40.960 | so this was real money to me.
00:22:44.040 | And so I had the sort of natural incentive to earn a lot
00:22:49.040 | or to maximize my pay.
00:22:51.400 | Some weeks were that minimum, 500, very few.
00:22:54.160 | And then some I'd get 1,200, 1,300 bucks.
00:22:57.320 | Pay has gone up.
00:23:00.600 | Typical drivers are now starting in the 30s
00:23:03.400 | in the kind of job that I was in,
00:23:05.400 | 30-some cents per mile, 30 to 35.
00:23:09.000 | - So can we try to reverse engineer that math,
00:23:12.000 | how that maps to the actual hours?
00:23:14.520 | So the hours connected to driving are so widely dispersed.
00:23:18.960 | As you said, some of them don't count as actual work.
00:23:21.520 | Some of it does.
00:23:22.760 | That's a very interesting discussion
00:23:24.200 | that we'll then continue
00:23:25.640 | when we start talking about autonomous trucking.
00:23:28.000 | But you're saying all these cents per mile kind of thing.
00:23:31.720 | How does that map to average hourly wage?
00:23:38.040 | - Yeah, so I mean, and this is kind of the,
00:23:40.160 | this is also an interesting technology story in the end.
00:23:43.960 | And it's the technology story that didn't happen.
00:23:46.960 | So pay per mile was invented by companies
00:23:50.680 | when you couldn't surveil drivers.
00:23:52.480 | You didn't know what they were doing, right?
00:23:53.640 | And you wanted them to have some skin in the game.
00:23:56.000 | And so you'd say, here's the load.
00:23:58.440 | It's going from, for me, I might start in the Northeast,
00:24:03.320 | maybe in upstate New York with a load of beer.
00:24:05.720 | It's a, here's this load of beer,
00:24:07.560 | bring it to this address in Michigan.
00:24:09.440 | We're gonna pay you by the mile, right?
00:24:11.440 | If I was being paid by the hour,
00:24:12.920 | I might just pull over at the diner and have breakfast.
00:24:16.440 | So you're paid by the mile,
00:24:19.080 | but increasingly over time,
00:24:23.800 | the typical driver is spending more and more time
00:24:26.920 | doing non-driving tasks.
00:24:28.280 | There's lots of reasons for that.
00:24:30.320 | One of which is railroads have captured a lot of freight
00:24:33.000 | that goes long distances now.
00:24:34.560 | Another one is traffic congestion.
00:24:37.400 | And the other one is that drivers are pretty cheap.
00:24:39.640 | And they're almost always the low people on the totem pole
00:24:43.000 | in some segments.
00:24:44.400 | And so their time is used really inefficiently.
00:24:48.480 | So I might go to that brewery
00:24:51.300 | to pick up that load of Bud Light.
00:24:54.720 | And their dock staff may be busy
00:24:58.760 | loading up five other trucks.
00:25:00.920 | And they'll say, "Go over there and sit and wait.
00:25:03.440 | And we'll call you on the CB when the dock's ready."
00:25:05.820 | So you wait there a couple hours, they bring you in.
00:25:09.200 | You never know what's happening in the truck.
00:25:10.800 | Sometimes they're loading it with a forklift.
00:25:12.680 | Maybe they're throwing 14 pallets on there full of kegs.
00:25:16.280 | But sometimes it'll take them hours.
00:25:18.480 | And you're sitting in that truck
00:25:19.920 | and you're essentially unpaid.
00:25:22.320 | Then you pull out, you've got control
00:25:26.160 | over what you're gonna get paid
00:25:27.820 | based on how you drive that load.
00:25:29.480 | And then on the other end,
00:25:31.480 | you got a similar situation of kind of waiting.
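The pay scheme described here, a per-mile rate with a weekly minimum floor, spread over both driving hours and unpaid waiting, can be sketched to show how the effective hourly wage falls as unpaid time grows. Only the 25-cents-a-mile rate and the $500 weekly minimum come from the conversation; the weekly mileage and hour figures below are illustrative assumptions:

```python
def weekly_pay(miles, rate_per_mile, weekly_minimum=500.0):
    """Per-mile pay with a weekly minimum floor, as described in the interview."""
    return max(miles * rate_per_mile, weekly_minimum)

def effective_hourly_wage(miles, rate_per_mile, driving_hours, unpaid_hours,
                          weekly_minimum=500.0):
    """Spread the per-mile paycheck over ALL on-duty hours, paid or not."""
    pay = weekly_pay(miles, rate_per_mile, weekly_minimum)
    return pay / (driving_hours + unpaid_hours)

# Illustrative week at the 25-cents-a-mile rate from the interview
# (mileage and hour figures are assumptions, not from the source):
pay = weekly_pay(2500, 0.25)  # $625 for 2,500 miles
wage = effective_hourly_wage(2500, 0.25, driving_hours=50, unpaid_hours=15)
print(f"${pay:.2f}/week, ${wage:.2f}/hour")  # $625.00/week, $9.62/hour
```

The sketch makes the incentive problem concrete: every unpaid hour at a dock lowers the driver's effective wage but costs the shipper nothing, which is the inefficiency Lex raises in the next question.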
00:25:34.240 | - So if that's the way truck drivers are paid,
00:25:37.240 | then there's a low incentive for the optimization
00:25:40.480 | of the supply chain to make them more efficient, right?
00:25:43.360 | To utilize truck labor more efficiently.
00:25:47.000 | - Absolutely.
00:25:48.440 | So that's a technology problem that,
00:25:51.640 | one of several technology problems that could be addressed.
00:25:55.160 | I mean, so, if we just linger on it,
00:26:01.960 | what are we talking about in terms of dollars per hour?
00:26:06.600 | Is it close to minimum wage?
00:26:08.280 | Is it, you know, there's something you talk about.
00:26:10.660 | There was a conception or a misconception
00:26:15.160 | that truckers get paid a lot for their work.
00:26:19.780 | Do they get paid a lot for their work?
00:26:21.760 | - Some do.
00:26:23.300 | And I think that's part of the complexity.
00:26:26.020 | So, you know, what interested me
00:26:27.820 | as an ethnographer about this was,
00:26:29.900 | you know, I'm interested in the kind of economic conceptions
00:26:32.780 | that people have in their heads
00:26:34.300 | and how they lead to certain decisions in labor markets.
00:26:38.420 | You know, why some people become an entrepreneur
00:26:40.540 | and other people become a wage laborer,
00:26:43.100 | or, you know, why some people wanna be doctors
00:26:45.740 | and other people wanna be truck drivers.
00:26:47.560 | That conception, right, is getting shaped
00:26:50.640 | in these labor markets is the argument of the book.
00:26:53.860 | And the fact that drivers can hear,
00:26:57.420 | or potential drivers can hear about these, you know,
00:26:59.780 | workers who make $100,000 plus,
00:27:01.980 | which happens regularly in the trucking industry.
00:27:04.440 | There are many truck drivers
00:27:06.220 | who make more than $100,000 a year, you know,
00:27:09.740 | is an attraction, but the industry is highly segmented.
00:27:13.800 | And so the entry level segment,
00:27:16.420 | and we can probably get into this,
00:27:18.520 | but, you know, the industry is dominated by, you know,
00:27:22.580 | a few dozen really large companies that are self-insured
00:27:26.500 | and can train new drivers.
00:27:28.620 | So if you want those good jobs,
00:27:30.240 | you've gotta have several years, up until recently,
00:27:33.580 | now the labor market's becoming tighter,
00:27:35.060 | but you had to have several years of accident-free,
00:27:38.100 | you know, perfectly clean record driving to get into them.
00:27:41.800 | The other part of the segment, you know,
00:27:44.520 | those drivers often don't make minimum wage.
00:27:46.780 | But this leads to one of the sort of central issues
00:27:50.380 | that has been in the courts and in the legislature,
00:27:54.300 | in some states, is, you know,
00:27:56.880 | what should truck drivers get paid for?
00:27:59.300 | Right, the industry, you know,
00:28:01.140 | for the last 30 years or so has said, essentially,
00:28:04.340 | it's the hours that they log for safety reasons
00:28:07.580 | for the Department of Transportation, right?
00:28:11.260 | Now, since the drivers are paid by the mile,
00:28:14.520 | they try to minimize those,
00:28:15.960 | because those hours are limited by the federal government.
00:28:19.080 | So the federal government says,
00:28:19.980 | you can't drive more than 60 hours in a week
00:28:22.620 | as a long-haul truck driver.
00:28:24.320 | And so you wanna drive as many miles as you can
00:28:26.560 | in those 60 hours, and so you under-report them, right?
00:28:31.520 | And so what happens is the companies say,
00:28:34.500 | well, that guy, you know,
00:28:35.560 | he only said he logged 45 hours of work that week,
00:28:39.240 | or 50 hours of work.
00:28:40.640 | That's all we have to pay him minimum wage for.
00:28:43.640 | When, in fact, typical truck driver in these jobs
00:28:46.440 | will work, according to most people,
00:28:48.360 | would sort of define it as like,
00:28:49.520 | okay, I'm at the customer location,
00:28:50.880 | I'm waiting to load, I'm doing some paperwork,
00:28:52.800 | you know, I'm inspecting the truck, I'm fueling it,
00:28:55.840 | just waiting to, you know, get put in the dock,
00:28:58.200 | 80 to 90 hours would be sort of a typical work week
00:29:02.020 | for one of these drivers.
00:29:04.600 | And just look at that,
00:29:05.880 | they don't make minimum wage oftentimes.
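To make the arithmetic behind this concrete, here is a minimal back-of-the-envelope sketch. The per-mile rate, weekly mileage, and hours below are illustrative assumptions, not figures from the conversation.

```python
# Effective hourly wage for a driver paid by the mile.
# All inputs are illustrative assumptions for this sketch.

FEDERAL_MINIMUM_WAGE = 7.25  # USD per hour

def effective_hourly(rate_per_mile, miles_per_week, hours_worked):
    """Gross weekly pay divided by every hour actually worked
    (driving, waiting at docks, fueling, paperwork)."""
    return rate_per_mile * miles_per_week / hours_worked

# Hypothetical slow week: $0.30/mile, 2,000 miles, 85 total hours.
wage = effective_hourly(0.30, 2000, 85)
print(f"${wage:.2f}/hour")  # $7.06/hour, below the $7.25 federal minimum
```

On these made-up numbers the effective rate dips below the federal minimum; a faster week with less waiting pushes it back above, which is exactly the incentive structure the pay-by-the-mile system creates.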
00:29:07.320 | - Right, just to be clear,
00:29:08.780 | what we're dancing around here
00:29:10.440 | is that, whether a little bit over or a little bit under,
00:29:14.940 | most truck drivers seem to be making close to minimum wage.
00:29:21.000 | Like we maybe haven't made that clear.
00:29:23.960 | There's a few that make quite a bit of money,
00:29:27.200 | but like you're, as an entry,
00:29:29.960 | and for years you're operating essentially minimum wage,
00:29:34.960 | and potentially far less than minimum wage
00:29:37.800 | if you actually count the number of hours
00:29:39.960 | that are taken out of your life
00:29:42.760 | due to your dedication to trucking.
00:29:44.680 | - Well, if you count like the hours taken out of your life,
00:29:49.400 | then you gotta go, you know, maybe a full 24.
00:29:52.760 | - That's right, yeah, from family,
00:29:54.320 | from the high quality of life parts of your life.
00:29:59.320 | - Yeah, and there's a whole nother set of rules
00:30:01.980 | that the Department of Labor has,
00:30:03.840 | which basically say that a truck driver
00:30:06.500 | who's dispatched away from home for more than a day
00:30:09.820 | should get minimum wage 24 hours a day.
00:30:13.340 | And that could be a state minimum wage,
00:30:16.520 | but typically what it would work out to for most drivers
00:30:19.680 | is that, you know, a minimum,
00:30:21.240 | the minimum wage for a truck driver
00:30:23.000 | should be in the 50s of thousands, you know, $55,000, $60,000
00:30:26.920 | should be the minimum wage of a truck driver.
00:30:28.880 | And you probably heard about the truck driver shortage.
00:30:31.160 | Like if, you know, which I hope we can talk about,
00:30:35.760 | if the minimum wage for truck drivers
00:30:37.680 | is as it should be on the books at, you know,
00:30:40.280 | around $60,000, we wouldn't have a shortage of truck drivers.
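The Department of Labor rule described here reduces to simple arithmetic. The $7.25 federal rate and the days-dispatched figures below are assumptions for illustration; a state minimum wage would raise the floor.

```python
# Sketch of the rule described above: a driver dispatched away from
# home for more than a day is owed minimum wage for all 24 hours of
# each day. Inputs are illustrative assumptions.

FEDERAL_MINIMUM_WAGE = 7.25  # USD per hour

def annual_floor(days_dispatched, hourly_minimum=FEDERAL_MINIMUM_WAGE):
    """Annual pay floor if every dispatched day counts as 24 paid hours."""
    return hourly_minimum * 24 * days_dispatched

for days in (300, 330, 350):
    print(days, "days ->", round(annual_floor(days)))
# 300 days -> 52200, 330 -> 57420, 350 -> 60900: the mid-$50,000s to
# around $60,000 floor mentioned in the conversation.
```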
00:30:44.260 | - Oh, wow.
00:30:45.460 | And to me, 60,000 is not a lot of money
00:30:49.340 | for this kind of job.
00:30:51.380 | 'Cause you're, this isn't,
00:30:54.020 | this is essentially two jobs,
00:30:56.540 | and two jobs where you don't get to sleep with your wife
00:30:59.940 | or see your kids at night.
00:31:03.420 | That 60,000 is a very little money for that.
00:31:06.520 | But you're saying if it was 60,000,
00:31:09.560 | you wouldn't even have the shortage.
00:31:11.360 | - If that was the minimum.
00:31:12.560 | - If that was the minimum.
00:31:13.480 | - And I think that's what,
00:31:14.800 | now we have drivers who start in the 30s.
00:31:19.040 | But yeah, and I mean, so we're talking two, three jobs,
00:31:21.720 | really, when you look at the total hours
00:31:23.360 | that people are working at, you know,
00:31:25.560 | they can work over a hundred.
00:31:26.720 | If they're a trainer, you know,
00:31:28.560 | training other truck drivers,
00:31:29.720 | well over a hundred hours a week.
00:31:31.800 | - So a job of last resort.
00:31:34.280 | Maybe you can jump around from tangent to tangent.
00:31:37.540 | This is such a fascinating and difficult topic.
00:31:41.600 | I heard that there's a shortage of truck drivers.
00:31:46.480 | So there's more jobs than truck drivers
00:31:48.320 | willing to take on the job.
00:31:49.800 | Is that the state of affairs currently?
00:31:53.280 | - I mean, I think the way that you just put that is right.
00:31:56.480 | We don't have a shortage of people
00:31:59.840 | who are currently licensed to do the jobs.
00:32:02.880 | So I'm working on a project for the state of California
00:32:05.240 | to look at the shortage of agricultural drivers.
00:32:07.140 | And the first thing that the DMV commissioner of a state
00:32:11.440 | wanted to look at was, you know,
00:32:13.720 | is there actually a shortage of licensed drivers?
00:32:15.740 | He's like, I've got a database here
00:32:17.760 | of all the people who have a commercial driver's license
00:32:20.040 | who could potentially have the credential to do this.
00:32:22.700 | There are about 145,000 jobs in California
00:32:28.420 | that require a class A CDL,
00:32:31.480 | which would be that commercial driver's license
00:32:33.640 | that you need for the big trucks.
00:32:35.280 | About 145,000 jobs, the industry in their regular
00:32:41.400 | promotion of the idea that there's a shortage
00:32:43.880 | is always projecting forward and says,
00:32:46.160 | we're gonna need 165,000 or so in the next 10 years.
00:32:50.560 | There are currently like 435,000 people licensed
00:32:53.880 | in the state of California to drive one of these big trucks.
00:32:57.320 | So it is not at all an absence of people who,
00:33:01.780 | I mean, and again, going back to what we were talking about
00:33:04.140 | before, getting that license is not something
00:33:07.220 | that you just walk down to the DMV and take the test.
00:33:10.500 | Like this is somebody who probably quit another job,
00:33:13.980 | was unemployed and took months to go to a training school,
00:33:18.980 | paid for that training school oftentimes,
00:33:22.060 | left their family for months,
00:33:24.140 | invested in what they thought was gonna be
00:33:26.560 | a long-term career and then said, you know what?
00:33:29.260 | Forget it, I can't do it.
00:33:31.300 | - So yeah, so it's not just skill,
00:33:35.460 | it's like they were psychologically invested
00:33:37.540 | potentially for months, if not years,
00:33:39.340 | into this kind of position as perhaps a position
00:33:42.660 | that if they lose their current job, they could fall to.
00:33:46.220 | Okay, so that's an indication that there's something
00:33:48.540 | deeply wrong with the job if so many licensed people
00:33:51.820 | are not willing to take it.
00:33:53.300 | What are the biggest problems of the job
00:33:57.080 | of truck driver currently?
00:33:59.200 | - Yeah, the job, the problems with the job
00:34:01.540 | and the labor market, right?
00:34:02.700 | But let's start with the job, which is, again,
00:34:07.420 | just so much time that's not compensated directly
00:34:11.000 | for the amount of time.
00:34:12.980 | And that's just psychologically,
00:34:15.160 | and this was a big part of what I studied
00:34:18.000 | for the first book was that conception of like,
00:34:22.280 | what's my time worth, right?
00:34:24.460 | And like what truck drivers love oftentimes
00:34:28.720 | is that tangible outcome-based compensation.
00:34:33.000 | So they say, you know what?
00:34:35.000 | You know, honest days work, I work hard,
00:34:37.640 | I get paid for what I do, I drive 500 miles today,
00:34:40.540 | that's what I'm gonna get paid for.
00:34:42.600 | And then you get to that dock and they tell you,
00:34:44.760 | sorry, the load's not ready, go sit over there.
00:34:48.300 | And you stew.
00:34:49.720 | - And that wait can break you psychologically
00:34:51.600 | 'cause your time every second becomes more worthless.
00:34:56.600 | - Yeah.
00:34:58.720 | - Or worth less.
00:35:00.080 | - Yeah, and again, the industry is gonna say,
00:35:03.240 | for instance, okay, well, you know,
00:35:05.240 | they've got skin in the game, right?
00:35:06.320 | That argument about sort of compensation
00:35:08.080 | based on sort of output, right?
00:35:10.720 | But that's a holdover from when
00:35:11.880 | you couldn't observe truckers.
00:35:12.960 | Now they all have, you know, satellite-linked computers
00:35:15.740 | in the trucks that tell these large companies,
00:35:18.600 | this driver was, you know, at this GPS location
00:35:21.120 | for four and a half hours, right?
00:35:22.760 | So if you wanted to compensate them for that time directly,
00:35:25.920 | and the trucker can't control what's happening
00:35:28.080 | on that customer location, you know,
00:35:29.560 | they're waiting for that, you know,
00:35:31.320 | firm, that customer, to tell them, hey, pull in there.
00:35:34.120 | And so what it becomes is just a way
00:35:37.660 | to shift the inefficiencies and the cost of that
00:35:40.760 | onto that driver.
00:35:43.560 | Now it's competitive for customers.
00:35:45.560 | So, you know, if you're Walmart,
00:35:47.600 | you might have your choice of a dozen different
00:35:49.860 | trucking companies that could move your stuff.
00:35:52.120 | And if one of them tells you, hey,
00:35:53.920 | you're not moving our trucks in and out of your docks
00:35:56.560 | fast enough, we're gonna charge you
00:35:58.580 | for how long our truck is sitting on your lot.
00:36:01.200 | If you're Walmart, you're gonna say,
00:36:02.200 | I'll go see what the other guy says, right?
00:36:04.560 | And so companies are gonna allow that customer
00:36:08.160 | to essentially waste that driver's time, you know,
00:36:11.520 | in order to keep that business.
00:36:13.420 | - Can you try to describe the economics,
00:36:16.820 | the labor market of the situation?
00:36:18.440 | You mentioned freight and railroad.
00:36:20.960 | What is the sort of the dynamic financials,
00:36:25.480 | the economics of this that allow for such low salaries
00:36:31.720 | to be paid to truckers?
00:36:35.520 | Like what's the competition?
00:36:37.320 | What's the alternative to transporting goods via trucks?
00:36:41.780 | Like what seems to be broken here
00:36:43.320 | from an economics perspective?
00:36:44.880 | - Yeah, so it's, well, nothing.
00:36:47.300 | It's a perfect market, right?
00:36:50.600 | I mean, so for economists, this is how it should work, right?
00:36:53.460 | - But the inefficiencies, like you said, sorry to interrupt,
00:36:55.940 | are pushed to the truck driver.
00:36:59.020 | Doesn't that like spiral, doesn't that lead
00:37:02.020 | to a poor performance on the part of the truck driver
00:37:04.540 | and just like make the whole thing more and more inefficient
00:37:08.020 | and it results in lower payment
00:37:10.860 | to the truck driver and so on?
00:37:12.780 | It just feels like in capitalism,
00:37:17.020 | you should have a competing solution
00:37:19.720 | in terms of truck drivers,
00:37:21.700 | like another company that provides transportation
00:37:24.420 | via trucks that creates a much better experience
00:37:27.540 | for truck drivers, making them more efficient,
00:37:30.300 | all those kinds of things.
00:37:32.340 | How is the competition being suppressed here?
00:37:34.820 | - Yeah, so it is, the competition is based on who's cheaper
00:37:39.320 | and this is the cheapest way to move the freight.
00:37:42.040 | Now, there are externalities, right?
00:37:44.300 | I mean, so this is the explanation
00:37:46.820 | that I think is obvious for this, right?
00:37:49.620 | There are lots of costs that,
00:37:53.820 | whether it's that driver's time,
00:37:54.980 | whether it's the time without their family,
00:37:57.880 | whether it's the fact that they drive through congestion
00:38:02.660 | and spew lots of diesel particulates
00:38:05.620 | into cities where kids have asthma
00:38:08.200 | and make our commutes longer
00:38:09.760 | rather than more efficiently use their time
00:38:11.980 | by sort of routing them around congestion
00:38:14.820 | and rush hour and things like that.
00:38:17.320 | This is the cheapest way to move freight
00:38:21.080 | and so it's the most competitive.
00:38:23.580 | A big part of this is public subsidy of training.
00:38:26.720 | So when those workers are not paying for the training,
00:38:31.260 | you and I often are.
00:38:32.840 | So if you lose your job because of foreign trade
00:38:38.500 | or you're a veteran using your GI benefits,
00:38:43.500 | you may very well be offered training,
00:38:48.380 | publicly subsidized training to become a truck driver.
00:38:50.640 | And so all of these are externalities
00:38:53.200 | that the companies don't have to pay for
00:38:55.900 | and so this makes it the most profitable way to move freight.
00:38:58.660 | - So trucks are way cheaper than trains?
00:39:02.980 | - Well, over the long,
00:39:03.940 | so one of the big stories for these companies
00:39:07.720 | is that the average length of haul,
00:39:10.020 | which becomes very important for self-driving trucks,
00:39:12.940 | the average length of haul has been steadily declining
00:39:15.940 | over the last 15 years or so.
00:39:19.180 | You know, I love this industry-collected data
00:39:21.060 | from sort of the big firms that report it,
00:39:23.180 | but it's roughly been cut in half
00:39:25.620 | from typically about a thousand miles to under 500.
00:39:30.540 | And under 500 is what a driver can move in a day, right?
00:39:36.180 | So you can get loaded, drive and unload, you know,
00:39:40.960 | around 400 miles or something like that.
00:39:43.100 | - I wanna steal a good question
00:39:46.280 | from the Penn Gazette interview you did,
00:39:49.160 | which people should read, it's a great interview.
00:39:51.480 | Was there a golden age for long haul truckers in America?
00:39:55.320 | And if so, this is just a journalistic question,
00:39:58.320 | and if so, what enabled it and what brought it to an end?
00:40:02.560 | - Wow, I might have to have you read my answer to that.
00:40:05.740 | That was a few years ago,
00:40:07.620 | it'd be interesting to compare what I'll say, but.
00:40:10.820 | - I mean, one bigger question to ask, I guess, is like,
00:40:13.920 | you know, Johnny Cash wrote a lot of songs about truckers.
00:40:17.900 | There used to be a time when perhaps falsely,
00:40:22.140 | perhaps it's part of the kind of perception
00:40:24.340 | that you study with the labor markets and so on,
00:40:26.480 | there was a perception of truckers being,
00:40:28.980 | first of all, a lucrative job,
00:40:30.320 | and second of all, a job to be desired.
00:40:34.440 | - Yeah, so I mean, this is,
00:40:37.580 | the trucking industry, to me, is fascinating,
00:40:40.300 | but I think it should be fascinating to a lot of people.
00:40:43.580 | So the golden age was really two different kinds
00:40:47.820 | of markets as well, right?
00:40:51.100 | Today we have really good jobs and some really bad jobs.
00:40:54.500 | Back then, we had the Teamsters Union that controlled
00:40:57.580 | the vast majority of employee jobs,
00:41:00.980 | and even where they didn't, they had something called
00:41:03.380 | the National Master Freight Agreement.
00:41:05.520 | And this was Jimmy Hoffa, who led the union
00:41:10.420 | through its sort of critical period, and by the mid '60s
00:41:15.420 | had unified essentially the entire nation's
00:41:19.220 | trucking labor force under one contract.
00:41:22.280 | Now, you were either covered by that contract
00:41:26.180 | or your employer paid a lot of attention to it.
00:41:29.260 | And so by the end of the 1970s,
00:41:33.140 | the typical truck driver was making
00:41:34.900 | well more than $100,000 in today's dollars,
00:41:40.740 | and was home every night.
00:41:41.980 | That was, without a doubt,
00:41:45.420 | even more than unionized auto workers
00:41:47.940 | and steel workers, 10 to 20% more than those workers made.
00:41:52.940 | That was the golden age for sort of job quality,
00:41:56.340 | wages, Teamster power.
00:41:57.660 | They were, without a doubt, the most powerful union
00:42:00.340 | in the United States at that time.
00:42:03.160 | At the same time in the 1970s,
00:42:05.240 | you had the mythic long haul trucker.
00:42:09.160 | And these were the guys who were kind of on the margins
00:42:13.240 | of the regulated market,
00:42:15.000 | which is what the Teamsters controlled.
00:42:16.720 | A lot of them were in agriculture,
00:42:17.960 | which was never regulated.
00:42:19.560 | So in the New Deal, when they decided to regulate trucking,
00:42:22.360 | they didn't regulate agriculture
00:42:23.760 | because they didn't want to drive up food prices,
00:42:25.960 | which would hurt workers in urban areas.
00:42:28.240 | So they essentially left agricultural truckers out of it.
00:42:32.280 | And that's where a lot of the kind of outlaw,
00:42:34.480 | asphalt cowboy imagery that we get comes from.
00:42:40.720 | And I grew up with this, I know you didn't grow up in the US,
00:42:44.600 | but as a young child,
00:42:46.760 | and I'm a bit older than you,
00:42:48.720 | in the late '70s, there were movies and TV shows,
00:42:53.640 | and CBs were craze,
00:42:55.200 | and it was all these kind of outlaw truckers
00:42:57.920 | who were out there hauling some unregulated freight
00:43:00.660 | they weren't supposed to be, trying to avoid the bears,
00:43:03.200 | who are the cops,
00:43:04.040 | and with all this salty language,
00:43:07.320 | and these terms that only they understood,
00:43:10.800 | and the partying at diners, and popping pills,
00:43:14.000 | the California turnarounds.
00:43:15.840 | - So asphalt cowboys, truly.
00:43:17.600 | It's like another form of cowboy movies.
00:43:21.200 | - Oh, absolutely, absolutely.
00:43:22.800 | And I think that sort of masculine ethos of,
00:43:27.120 | like, you got 40,000 pounds of something you care about,
00:43:30.320 | I'm your guy.
00:43:31.520 | You need it to go from New York to California,
00:43:33.620 | don't worry about it, I got it.
00:43:35.160 | That's appealing, and it's tangible.
00:43:37.240 | And you think about people who don't wanna be paper pusher,
00:43:39.860 | and deal with office politics,
00:43:41.620 | like, just give me what you care about,
00:43:43.120 | and I'll take care of it.
00:43:43.960 | You just pay me fair.
00:43:45.560 | And that appeals.
00:43:47.000 | - You mentioned unions, Teamsters, Jimmy Hoffa.
00:43:49.820 | Big question, maybe difficult question,
00:43:53.560 | what are some pros and cons of unions,
00:43:55.760 | historically and today, in the trucking space?
00:43:58.840 | - Yeah, well, if you're a worker, there are a lot of pros.
00:44:03.000 | And I don't, you know, and this was one of the things
00:44:05.840 | I talked to truckers about a lot.
00:44:07.560 | - Yeah, what's their perception of Jimmy Hoffa,
00:44:09.600 | for example, and of unions?
00:44:11.800 | - Yeah, so, and this was probably one of the central
00:44:15.400 | hypotheses that I had going in there.
00:44:17.080 | And it may sound, you know,
00:44:18.640 | someone who does hard science, right?
00:44:21.240 | You may, if you're a social scientist,
00:44:23.640 | you know, sort of use that terminology,
00:44:25.040 | even other social scientists.
00:44:26.280 | - Hypothesis?
00:44:27.120 | - Yeah, you know, they don't like it.
00:44:28.560 | But I do like to think that way.
00:44:31.240 | And my initial hypothesis was that, you know,
00:44:33.920 | and it's very simple, that, you know,
00:44:36.560 | the tenure of the driver in the industry
00:44:39.600 | would have a strong effect on how they viewed unions.
00:44:43.240 | That, you know, somebody who had experienced unions
00:44:46.120 | would be more favorable,
00:44:48.240 | and someone who had not, would not be, right?
00:44:51.060 | And that turned out to be the case, without a doubt.
00:44:55.600 | But in an interesting way,
00:44:57.920 | which was that even the drivers
00:44:59.840 | who were not part of the union,
00:45:01.840 | who in the kind of public debate of deregulation
00:45:07.960 | were portrayed as these kind of small business truckers
00:45:12.760 | who were getting shut out by the big regulated monopolies
00:45:16.280 | and the Teamsters Union, you know,
00:45:17.680 | the corrupt Teamsters Union,
00:45:19.560 | even those drivers longed for the days of the Teamsters.
00:45:23.880 | Because they recognized the overall market impact
00:45:27.820 | that they had, that trucking just naturally
00:45:31.680 | tended toward excessive competition,
00:45:34.380 | that meant that there was no profit to be made.
00:45:37.500 | And oftentimes you'd be operating at a loss.
00:45:39.920 | And so even these, you know,
00:45:42.200 | the asphalt cowboy owner operators from back in the day
00:45:45.160 | would tell me when the Teamsters were in power,
00:45:48.360 | I made a lot more money.
00:45:50.960 | And, you know, this is, you know, unions,
00:45:53.680 | at least those kinds of unions, like the Teamsters,
00:45:57.280 | you know, there's, I think a lot of misconceptions today,
00:46:00.440 | sort of popularly about what unions did back then.
00:46:03.800 | They tied wages to productivity.
00:46:06.040 | Like that was the central thing
00:46:09.080 | that the Teamsters Union did.
00:46:10.640 | And, you know, there were great accounts
00:46:13.120 | of sort of Jimmy Hoffa's perspective
00:46:15.800 | for all his portrayal as sort of corrupt and criminal.
00:46:19.500 | And there's, you know, I'm not disputing
00:46:21.240 | that he broke a lot of laws.
00:46:23.060 | He was remarkably open about who he was and what he did.
00:46:29.400 | He actually invited a pair, a husband and wife team
00:46:32.160 | of Harvard economists to follow him around
00:46:35.860 | and like opened up the Teamsters books to them
00:46:39.040 | so that they could see how he was, you know,
00:46:42.080 | thinking about negotiating with the employers.
00:46:45.480 | And the Teamsters, and this goes back well before Hoffa,
00:46:48.880 | back to the, you know, 1800s,
00:46:52.940 | they understood that workers did better
00:46:55.980 | if their employers did better.
00:46:57.580 | And the only way the employers would do better
00:46:59.520 | was if they controlled the market.
00:47:01.900 | And so oftentimes the corruption in trucking
00:47:04.660 | was initiated by employers who wanted to limit competition.
00:47:07.860 | And they knew they couldn't limit competition
00:47:09.660 | without the support of labor.
00:47:10.980 | And so you'd get these collusive arrangements
00:47:13.020 | between employers and labor to say,
00:47:15.500 | no new trucking companies.
00:47:17.360 | There are 10 of us, that's enough.
00:47:19.100 | We control Seattle, we're gonna set the price
00:47:22.040 | and we're not gonna be undercut.
00:47:23.760 | When there's a shortage of trucks around, it's great,
00:47:27.300 | rates go up, but you get too many trucks.
00:47:30.260 | It's very often that you end up operating at a loss
00:47:33.300 | just to keep the doors open.
00:47:35.480 | You know, you don't have any choice.
00:47:36.580 | You can't, it's what economists call derived demand.
00:47:39.580 | You can't like make up a bunch of trucking services
00:47:41.820 | and store it in a warehouse, right?
00:47:43.140 | You gotta keep those trucks moving to pay the bills.
00:47:47.100 | - Can we also lay out the kind of jobs that are in trucking?
00:47:50.480 | What are the best jobs in trucking?
00:47:52.280 | What are the worst jobs in trucking?
00:47:53.840 | What are we, how many jobs are we talking about today?
00:47:56.960 | - Yeah.
00:47:57.800 | - And what kind of jobs are there?
00:47:59.900 | - So there are a number of different segments.
00:48:04.280 | And the first part would be, you know,
00:48:08.160 | the first question would be,
00:48:09.000 | are you offering services to the public
00:48:11.360 | or are you moving your own freight, right?
00:48:13.200 | So are you a retailer, say Walmart or, you know,
00:48:17.780 | a paper company or something like that,
00:48:19.260 | that's operating your own fleet of trucks?
00:48:22.180 | That's private trucking.
00:48:25.860 | For hire are the folks who, you know,
00:48:28.700 | offer their services out to other customers.
00:48:31.180 | So you have private and for hire.
00:48:33.100 | In general, for hire pays less.
00:48:38.020 | - Is that because of the, something you talk about,
00:48:40.780 | employee versus contractor situation,
00:48:43.760 | or are they all tricked or led to become contractors?
00:48:48.760 | - That can become a part of it as a strategy,
00:48:52.080 | but the fundamental reason is competition.
00:48:54.880 | So those private carriers aren't in competition
00:48:58.840 | with other trucking fleets, right,
00:49:00.360 | for their own in-house services.
00:49:02.200 | So, you know, they tend to pay more. And this, you know,
00:49:05.480 | raises the question of why private versus for hire,
00:49:07.960 | because for hire is cheaper, right?
00:49:09.880 | And so if you need that,
00:49:12.100 | if that trucking service is central to what you do
00:49:15.600 | and you cannot afford disruptions or volatility
00:49:18.060 | in the price of it, you keep it in-house.
00:49:19.860 | - You should be willing to pay more for that
00:49:21.500 | 'cause it's more valuable to you
00:49:22.620 | and you keep it in-house and that.
00:49:23.900 | So that's an interesting distinction.
00:49:25.520 | What about, and this is kind of moving
00:49:27.180 | towards our conversation, what can and can't be automated?
00:49:30.240 | How else does it divide the different trucking jobs?
00:49:36.180 | - So the next big chunk is kind of
00:49:38.180 | how much stuff are you moving, right?
00:49:40.360 | And so we have what's called truckload.
00:49:43.540 | And truckload means, you know, you can fill up a trailer
00:49:46.000 | either by volume or by weight,
00:49:48.380 | and then less than truckload.
00:49:50.440 | Less than truckload, the official definition
00:49:52.600 | is like less than 10,000 pounds.
00:49:54.300 | You know, this is gonna be a couple pallets of this,
00:49:57.420 | a couple pallets of that.
00:49:58.780 | The process looks really different, right?
00:50:00.980 | So that truckload is, you know, point A to point B.
00:50:04.180 | I'm buying, you know, a truckload of bounty paper towels.
00:50:08.500 | I'm bringing it into, you know, my distribution center.
00:50:11.740 | Go pick it up at the bounty plant,
00:50:13.940 | bring it to my distribution center, right?
00:50:15.660 | Nowhere in between do you stop.
00:50:18.020 | At least process that freight.
00:50:19.540 | Less than truckload, what you've got is terminal systems.
00:50:22.820 | And this is what you had under regulation too.
00:50:25.740 | And so these terminal systems, what you do
00:50:27.580 | is you do a bunch of local pickup and delivery,
00:50:29.640 | maybe with smaller trucks.
00:50:31.180 | And you pick up two pallets of this here,
00:50:33.580 | four pallets of this there.
00:50:35.100 | You bring it to the terminal,
00:50:36.260 | you combine it based on the destination.
00:50:38.400 | You then create a full truckload, you know, trailer,
00:50:43.060 | and you send it to another terminal
00:50:44.620 | where it gets broken back down,
00:50:46.060 | and then out for local delivery.
00:50:47.980 | That's gonna look a lot like if you send a package by UPS,
00:50:52.260 | right, they pick all these parcels, right?
00:50:54.660 | Figure out where they're all going,
00:50:55.820 | put them on planes or in trailers
00:50:57.660 | going to the same destination,
00:50:58.740 | then break them out to put them
00:50:59.740 | in what they call package cars.
00:51:02.780 | - Before I ask you about autonomous trucks,
00:51:06.260 | let's just pause for your experience as a trucker.
00:51:09.840 | Did it get lonely?
00:51:12.460 | Like, can you talk about some of your experiences
00:51:15.100 | of what it was actually like?
00:51:16.780 | Did it get lonely?
00:51:17.940 | - Yeah, no, I mean, it was,
00:51:19.300 | I didn't have kids at the time.
00:51:21.300 | Now I have kids, I can't even imagine it.
00:51:23.520 | You know, I've been married for five years.
00:51:28.580 | At the time, my wife hated it, I hated it.
00:51:31.780 | You know, I describe in the book
00:51:34.540 | the experience of being stuck,
00:51:36.940 | if I remember correctly, it was like Ohio,
00:51:39.360 | at this truck stop in the middle of nowhere,
00:51:42.540 | and like, you know, sitting on this concrete barrier
00:51:46.380 | and just watching fireworks in the distance
00:51:48.740 | and like eating Chinese food on the 4th of July.
00:51:51.980 | And you know, my wife calls me from like the family barbecue
00:51:55.540 | and our anniversary is July 8th.
00:51:57.460 | And she's like, "Are you gonna be home?"
00:51:59.140 | And I'm like, "I don't know, you know."
00:52:01.940 | I have a cousin whose husband drove truck,
00:52:08.260 | as a truck driver would say, drove truck for a while.
00:52:12.160 | And he told me, before I went into it,
00:52:15.400 | he was like, "The advantage you have is that you know
00:52:18.620 | "that you're not gonna be doing this long-term."
00:52:21.040 | Like, and Lex, I can't even, like,
00:52:24.540 | the emotional content of some of these interviews,
00:52:27.860 | I mean, I would sit down at a truck stop with somebody
00:52:30.460 | I had never met before and you know, you open the spigot.
00:52:33.940 | And the last question I would ask drivers was,
00:52:38.940 | by the time I really sort of figured out how to do it,
00:52:41.060 | the last question I would ask them is,
00:52:43.180 | you know, what advice would you give to somebody?
00:52:46.100 | Your nephew, you know, a family friend asks you
00:52:50.180 | about what it's like to be a driver and should they do it?
00:52:52.220 | What advice would you give them?
00:52:54.140 | And this question, some of these, you know,
00:52:57.140 | grizzled old drivers, you know, tough, tough guys,
00:53:00.860 | would, that question would, like,
00:53:03.100 | some of them would break down and they would say,
00:53:05.220 | I would say to them, "You better have everything
00:53:08.640 | "that you ever wanted in life already."
00:53:11.700 | Because I've had a car that I've had for 10 years,
00:53:15.340 | it's got 7,000 miles on it.
00:53:17.100 | I own a boat that hasn't seen the water in five years.
00:53:21.860 | My kids, I didn't raise them.
00:53:24.260 | Like, I'd be out for two weeks at a time.
00:53:27.380 | I'd come home, my wife would give me two kids to punish,
00:53:31.500 | a list of things to do, you know, on Saturday night.
00:53:34.580 | And I might leave out Sunday night or Monday morning.
00:53:37.380 | You know, I come home dead tired.
00:53:39.460 | My kids don't know who I am.
00:53:41.840 | And, you know, it was just like,
00:53:43.820 | it was heartbreaking to hear those stories.
00:53:46.900 | - And then before you know it, you know,
00:53:49.420 | life is short and just the years run away.
00:53:52.500 | - Yeah.
00:53:54.340 | - It's a hard question to ask in that context,
00:53:56.620 | but what's the best,
00:53:58.440 | what was the best part of being a truck driver?
00:54:02.200 | Was there moments that you truly enjoyed on the road?
00:54:08.140 | - Oh, absolutely.
00:54:08.980 | There was, there's definitely a pride and mastery of,
00:54:13.140 | you know, even basic competence
00:54:14.580 | of sort of piloting this thing safely.
00:54:17.060 | There's a lot of responsibility to it.
00:54:18.620 | That thing's dangerous and you know it.
00:54:21.300 | So there's some pride there.
00:54:23.260 | For me personally, and I know for a lot of other drivers,
00:54:26.340 | it's just like seeing these behind the scenes places
00:54:29.300 | that you know exist in our economy.
00:54:32.260 | And I think we're all much more aware of them now
00:54:35.860 | after COVID and supply chain mess that we have.
00:54:38.980 | I don't know if we'll talk about that,
00:54:40.360 | but you know, you get to see those places.
00:54:42.820 | You know, you get to see those ports.
00:54:44.300 | You get to see the place where they make the cardboard boxes
00:54:47.860 | that the Huggies diapers go in,
00:54:49.580 | or the warehouse full of Bud Light.
00:54:53.580 | I moved Bud Light from like upstate New York
00:54:56.660 | and the first load like went to Atlanta, you know?
00:54:59.860 | And then a couple months later,
00:55:01.480 | I circled back through that same brewery
00:55:03.700 | and I brought a load of Bud Light out to Michigan.
00:55:08.340 | And I was like, holy shit, all the Bud Light, like, you know,
00:55:11.980 | for this whole giant swath of the United States
00:55:14.260 | comes from this one plant,
00:55:15.540 | this cavernous plant with like kegs of beer.
00:55:17.820 | And you see that part of the economy
00:55:20.220 | and it's like, you're almost like you're an economic tourist.
00:55:24.260 | And I think all, everybody kind of appreciates that,
00:55:27.020 | like kind of, it's almost like a behind the scenes tour
00:55:29.660 | that wears off after a few months, you know?
00:55:32.540 | You start to see new things less and less frequently.
00:55:35.660 | At first, everything's novel and sort of life on the road.
00:55:38.540 | And then it becomes just endless miles of white lines
00:55:41.900 | and yellow lines and truck stops.
00:55:44.220 | And the days just blur together, you know?
00:55:47.780 | It's one loading dock after another.
00:55:49.620 | - So you lose the magic of being on the road?
00:55:52.460 | - Yeah, it's very rare the driver that doesn't.
00:55:55.100 | - You mentioned COVID and supply chain.
00:56:00.460 | While being, for a brief time,
00:56:04.900 | a member of the supply chain,
00:56:06.980 | what have you come to understand about our supply chain,
00:56:11.340 | United States and global,
00:56:13.260 | and its resilience against
00:56:17.140 | catastrophes in the world, like COVID, for example?
00:56:20.220 | - Yeah, I mean, we have built really long,
00:56:24.380 | really lean supply chains.
00:56:26.820 | And just by definition, they're fragile.
00:56:30.360 | You know, the current mess that we have,
00:56:34.340 | it's not gonna clear by Christmas.
00:56:37.260 | It will be lucky if it clears by next Christmas.
00:56:39.940 | - Can you describe the current mess in supply chain
00:56:41.980 | that you're referring to?
00:56:43.060 | - Yeah, so we've got pileups of ships
00:56:46.940 | off the coast of California, Long Beach,
00:56:50.220 | and LA in particular, in bad shape.
00:56:54.860 | You know, last I checked, it was around 60 ships,
00:56:56.900 | all of which are holding thousands of containers
00:57:00.540 | full of stuff that retailers were hoping
00:57:02.740 | was gonna be on shelves for the holiday season.
00:57:07.380 | Meanwhile, the port itself has stacks and stacks
00:57:10.460 | of containers that they can't get rid of.
00:57:12.780 | The truckers aren't showing up to pick up
00:57:15.760 | the containers that are there,
00:57:17.540 | so they can't offload the ships that are waiting.
00:57:21.620 | And why aren't the truckers picking it up?
00:57:26.220 | Partly because there's a long history of inefficiency
00:57:28.260 | in making them wait,
00:57:29.580 | but it's 'cause the warehouses are full.
00:57:31.940 | So we've had all these perverse outcomes
00:57:36.500 | that no one really expected.
00:57:38.020 | Like in the middle of all these shortages,
00:57:40.340 | people are stockpiling stuff.
00:57:43.340 | So there are suppliers who used to keep two months
00:57:47.240 | of supply of bottled water on hand,
00:57:50.660 | and after going through COVID and not having supply
00:57:53.620 | to send to their customers,
00:57:55.540 | they're like, "We need three months."
00:57:58.260 | Well, our system is not designed for major storage of goods
00:58:02.500 | to go up 50% in a category, it's lean.
00:58:05.740 | If you're a warehouse operator, you wanna be 90% plus.
00:58:08.660 | You don't want a lot of open bays sitting around.
00:58:10.740 | So we don't have 10% extra capacity in warehouses.
00:58:15.740 | We don't have 10% of,
00:58:18.060 | trucking capacity can fluctuate a bit,
00:58:19.780 | but you don't have that kind of slack.
00:58:23.620 | And now, I mean, and we saw this
00:58:26.180 | when people shifted consumption.
00:58:28.260 | And I get a little mad when people talk about panic buying
00:58:32.780 | as kind of the reason that we had all these shortages.
00:58:36.660 | It really, like it's preventing us from understanding
00:58:40.780 | the real problem there, which is that lean supply chain.
00:58:44.940 | Sure, there was some panic buying, no doubt about it,
00:58:47.920 | but we had an enormous shift in people's behavior.
00:58:51.840 | So with my sister and brother-in-law,
00:58:55.300 | I own a couple of small businesses and we serve food.
00:58:58.540 | So we get food from Sysco.
00:59:02.380 | Sysco couldn't get rid of food, right?
00:59:04.740 | Because nobody's eating out.
00:59:05.980 | So they've got 50 pound sacks of flour,
00:59:09.180 | sitting in their warehouse that they can't get rid of.
00:59:11.900 | They've got cases of lettuce and meat and everything else
00:59:14.940 | that's just gonna go bad.
00:59:16.220 | So that panic buying certainly exacerbated some things
00:59:20.780 | like toilet paper and whatever,
00:59:21.980 | but we saw just a massive change in demand.
00:59:25.580 | And our supply chains are based on historical data, right?
00:59:28.760 | So that stuff leaves Asia months before
00:59:32.580 | you wanna have it on the shelves.
00:59:34.460 | And you're predicting based on last year,
00:59:37.000 | what you want on that shelf.
00:59:39.820 | And so it's a, I guess at its best,
00:59:43.280 | it's a beautiful symphony of lots of moving parts,
00:59:47.000 | but now everyone can't get on the same page of music.
00:59:52.780 | - But it's not resilient to changes in en masse
00:59:58.560 | human behavior.
00:59:59.540 | So even like I read somewhere,
01:00:03.400 | maybe you can tell me if this is true in relation to food,
01:00:06.420 | it's just the change of human behavior
01:00:08.300 | between going out to restaurants versus eating at home.
01:00:11.900 | As a species, we consume a lot less food that way.
01:00:15.540 | Apparently what I read in restaurants,
01:00:18.440 | like there's a lot of food just thrown out.
01:00:20.620 | It's part of the business model.
01:00:22.180 | And so like you then have to move a lot more food
01:00:26.180 | through the whole supply chain.
01:00:28.060 | And now because you're consuming,
01:00:31.540 | there's leftovers at home,
01:00:32.900 | you're consuming much more of the food you're getting
01:00:37.180 | when you're eating at home,
01:00:38.620 | that's creating these bottleneck situations,
01:00:40.980 | problems as you're referring to,
01:00:42.540 | too much in a certain place, not enough in another place.
01:00:45.380 | And it's just the supply chain is not robust
01:00:47.860 | to those kinds of dynamic shifts in who gets what where.
01:00:52.860 | - Yeah.
01:00:54.160 | Yeah, I mean, so, and I have worked in agriculture a bit
01:00:57.440 | on sort of the supply side,
01:01:00.120 | and there are product categories
01:01:03.140 | where 30% of the crop raised does not get used,
01:01:07.460 | just gets plowed under or wasted.
01:01:09.940 | But here's the importance of this
01:01:12.200 | in sort of getting this right,
01:01:14.440 | not that like panic buying, blame the irrational consumer,
01:01:18.980 | look at the hard sort of truth
01:01:21.220 | of the way we've set up our economy.
01:01:24.260 | And I'll ask you this, Lex,
01:01:26.780 | I know you're a hopeful, optimistic person.
01:01:30.620 | - 100%, yes.
01:01:31.980 | - Yeah, I am too.
01:01:32.980 | I mean, I write about problems all the time,
01:01:34.860 | and so people think I'm sort of like just a Debbie Downer,
01:01:38.020 | you know, pessimist,
01:01:40.020 | but I'm a glass half full kind of guy.
01:01:43.100 | Like I want to identify problems so we can solve them.
01:01:47.420 | So let me ask you this,
01:01:48.380 | we've got these long, lean supply chains.
01:01:52.060 | In the future, do you see more environmental problems
01:01:57.060 | that could disrupt them,
01:02:00.500 | more geopolitical problems that could disrupt trade
01:02:05.500 | from Asia, you know, other institutional failures?
01:02:10.560 | Do those things seem, you know,
01:02:13.860 | potentially more likely in the future
01:02:16.300 | than they have been in say the last 20 years?
01:02:18.660 | - Yeah, it almost absolutely seems to be the case.
01:02:21.860 | So you then have to ask the question of
01:02:24.460 | how do we change our supply chains,
01:02:28.980 | whether it's making more resilient
01:02:31.060 | or make them less densely connected,
01:02:34.160 | you know, building, it's like, what is it?
01:02:38.820 | You know, the Tesla model for in the automotive sector
01:02:43.420 | of like trying to build everything,
01:02:45.060 | like trying to get the factory to do as much as possible
01:02:48.620 | with as little reliance on widely distributed sources
01:02:53.500 | of the supply chain as possible.
01:02:54.980 | So maybe like rethinking how much we rely
01:02:58.660 | on the infrastructure of the supply chain.
01:03:00.860 | - Yeah, I mean, you know, there are some basic,
01:03:03.780 | and I assume, right, that there are a lot of folks
01:03:08.140 | in corporate boardrooms looking at risk
01:03:10.580 | and saying that didn't go well,
01:03:13.080 | and maybe it could have even gone worse.
01:03:16.500 | Maybe we need to think about reshoring, right?
01:03:19.720 | At the very least, one of the things
01:03:22.740 | that I'm hearing about anecdotally
01:03:24.100 | is that they're storing stuff up, you know, when they can,
01:03:27.100 | right, which is, that's probably not sustainable, right?
01:03:31.340 | I mean, at some point, somebody in that corporate boardroom
01:03:34.060 | is gonna say, you know, guys, inventory is getting
01:03:36.660 | kind of heavy, the cost of that is like,
01:03:38.540 | can we really justify that much longer
01:03:40.060 | to the shareholders, right?
01:03:41.780 | We can back off and start, you know,
01:03:43.820 | things are back to normal, let's lean out.
01:03:45.500 | - Well, my hope is that there's a technology solution
01:03:48.380 | to a lot of aspects of this.
01:03:49.580 | So one of them on the supply chain side
01:03:51.420 | is collecting a lot more data,
01:03:53.040 | like having much more integrated and accurate representation
01:03:58.380 | of the inventory all over the place,
01:04:00.660 | and the available transportation mechanisms,
01:04:03.380 | the trucks, the all kinds of freight,
01:04:06.060 | and how in the different models
01:04:08.620 | of the possible catastrophes that can happen,
01:04:14.660 | like how will the system respond?
01:04:15.940 | So having a really solid model that you're operating under,
01:04:19.300 | as opposed to just kind of being in emergency response mode
01:04:23.500 | under poor incomplete information,
01:04:26.180 | which is what seems like is more commonly the case,
01:04:30.460 | except for things like you said, Walmart and Amazon,
01:04:34.100 | they're trying to internally get their stuff together
01:04:36.740 | on that front, but that doesn't help the rest of the economy.
01:04:40.220 | So another exciting technological development
01:04:44.900 | as you write about, as you think about is autonomous trucks.
01:04:48.100 | So these are often brought up in different contexts
01:04:52.300 | as the examples of AI and robots taking our jobs.
01:04:57.300 | How true is this?
01:04:58.700 | Should we be concerned?
01:05:00.260 | - I think they've really come to epitomize
01:05:03.740 | this anxiety over automation, right?
01:05:05.620 | Just, it's such a simple idea, right?
01:05:09.540 | A truck that drives itself,
01:05:11.820 | classic blue collar job that pays well,
01:05:15.260 | guy maybe with not a lot of other good options, right?
01:05:20.420 | To sort of make that same income easily, right?
01:05:23.700 | And you build a robot to take his job away, right?
01:05:27.380 | So I think 2016 or so,
01:05:32.420 | that was the sort of big question out there.
01:05:35.260 | And that's actually how I started studying it, right?
01:05:38.380 | I just wrapped up the book,
01:05:40.020 | just so happened that somebody who was working at Uber,
01:05:43.180 | Uber had just bought Otto, saw the book and was like,
01:05:45.740 | "Hey, can you come out and talk to our engineering teams
01:05:48.780 | "about what life is like for truck drivers
01:05:51.680 | "and maybe how our technology could make it better."
01:05:54.700 | And at that time, there were a lot of different ideas
01:05:58.620 | about how they were gonna play out, right?
01:06:00.220 | So while the press was saying,
01:06:03.340 | "All truckers are gonna lose their jobs."
01:06:05.300 | There were a lot of people in these engineering teams
01:06:08.100 | who thought, "Okay, if we've got an individual
01:06:10.140 | "owner operator and they can only drive
01:06:14.420 | "eight or 10 hours a day, they hop in the back,
01:06:18.640 | "they get their rest and the asset that they own
01:06:21.300 | "works for them."
01:06:22.780 | Right, so perfect, right?
01:06:24.800 | And at that time, there were a bunch of reports
01:06:28.660 | that came out and sort of basically what people did
01:06:30.540 | was they took the category of truck driver.
01:06:33.100 | Some people took a larger category from BLS
01:06:35.860 | of sales and delivery workers
01:06:37.680 | that was about three and a half million workers
01:06:40.220 | and others took the heavy duty truck driver category,
01:06:43.580 | which was at the time about 1.8 million or so.
01:06:46.740 | And they picked a start date and a slope
01:06:49.380 | and said, "Let's assume that all these jobs
01:06:52.220 | "are just gonna disappear."
01:06:53.500 | And really smart researcher, Annette Bernhardt
01:06:57.580 | at the Labor Center at UC Berkeley
01:07:00.700 | was sort of looking around for people
01:07:03.740 | who were sort of deeply into industries
01:07:06.600 | to complicate those analyses, right?
01:07:10.260 | And reached out to me and was like,
01:07:11.580 | "What do you think of this?"
01:07:12.540 | And I said, "The industry is super diverse.
01:07:15.160 | "I haven't given a ton of thought, but it can't be that.
01:07:17.820 | "It's not that simple, it never is."
01:07:21.200 | And so she was like, "Will you do this?"
01:07:23.800 | And I was like ready to move on to another topic.
01:07:26.300 | I had been in trucking for 10 years
01:07:28.700 | and that's how I started looking at it.
01:07:31.020 | And it is, it's a lot more complicated
01:07:33.660 | and the initial impacts and here's the challenge I think,
01:07:38.540 | and it's not just a research challenge,
01:07:40.620 | it's the fundamental public policy challenge
01:07:43.600 | is we look at the existing industry
01:07:46.920 | and the impacts, the potential impacts.
01:07:50.040 | They're not nothing.
01:07:53.040 | For some communities and some kinds of drivers,
01:07:56.240 | they're gonna be hard.
01:07:57.200 | And there are a significant number of them.
01:07:59.280 | Nowhere near what people thought.
01:08:01.280 | I estimate it's like around 300,000,
01:08:04.280 | but that's a static picture of the existing industry.
01:08:08.060 | And here's the key with this is,
01:08:10.700 | at least in my conclusion is,
01:08:14.100 | this is a transformative technology.
01:08:17.060 | We are not going to swap in self-driving trucks
01:08:20.860 | for human driven trucks and all else stays the same.
01:08:24.180 | This is gonna reshape our supply chains,
01:08:27.420 | it's gonna reshape landscapes,
01:08:29.340 | it's gonna affect our ability to fight climate change.
01:08:32.440 | This is a really important technology in this space.
01:08:37.680 | - Do you think it's possible to predict the future
01:08:40.180 | of the kind of opportunities it will create,
01:08:44.420 | how it will change the world?
01:08:46.720 | So like when you have the internet,
01:08:48.520 | you can start saying like all the kinds of ways
01:08:53.140 | that office work, all jobs will be lost
01:08:55.420 | because it's easy to network
01:08:57.040 | and then software engineering allows you
01:08:59.280 | to automate a lot of the tasks
01:09:01.140 | like Microsoft Excel does, you know.
01:09:03.320 | But it opened up so many opportunities,
01:09:08.040 | even with things that are difficult to imagine,
01:09:10.220 | like with the internet, I don't know, Wikipedia,
01:09:12.680 | which is widely making accessible information.
01:09:15.920 | And that increased the general education globally by a lot,
01:09:20.920 | all those kinds of things.
01:09:22.920 | And then the ripple effects of that
01:09:25.400 | in terms of your ability to find other jobs
01:09:29.080 | is probably immeasurable.
01:09:31.000 | So is it just a hopeless pursuit to try to predict
01:09:35.680 | if you talk about these six different trajectories
01:09:40.480 | that we might take in automating trucks,
01:09:44.380 | but like as a result of taking those trajectories,
01:09:47.640 | is it a hopeless pursuit to predict
01:09:49.160 | what the future will result in?
01:09:50.840 | - Yeah, it is.
01:09:52.160 | (laughing)
01:09:53.000 | It absolutely is.
01:09:54.680 | Because it's the wrong question.
01:09:56.920 | The question is, what do we want the future to be
01:09:59.280 | and let's shape it, right?
01:10:01.960 | And I think this is, you know,
01:10:04.280 | and this is the only point that I really wanna make
01:10:07.120 | in my work, you know, for the foreseeable future
01:10:10.200 | is that, you know, we have got to get out of this mindset
01:10:15.200 | that we're just gonna let technology kind of go
01:10:20.400 | and it's a natural process and whatever pops out
01:10:22.520 | will fix the problems on the backside.
01:10:24.360 | And we've got to recognize that one,
01:10:28.160 | that's not what we do, right?
01:10:30.920 | You know, and self-driving vehicles
01:10:33.320 | is just such a perfect example, right?
01:10:35.120 | We would not be sitting here today
01:10:37.200 | if the Defense Department, right,
01:10:38.520 | if Congress in 2000 had not written into legislation
01:10:43.520 | funding for the DARPA challenges, which followed,
01:10:48.040 | actually I think the funding came a couple of years later,
01:10:50.060 | but the priority that they wrote in 2000 was,
01:10:52.840 | let's get a third of all ground vehicles
01:10:55.520 | in our military forces unmanned, right?
01:10:58.440 | And this was before aerial unmanned vehicles
01:11:01.440 | had really sort of proven their worth.
01:11:02.860 | They would come to be incredibly like, you know,
01:11:05.040 | just blow people out of the, blow people's minds
01:11:07.880 | in terms of their additional capabilities,
01:11:09.760 | the lower costs, you know, keeping, you know,
01:11:12.600 | soldiers out of harm's way.
01:11:13.680 | Now, of course they raised other problems
01:11:15.480 | and considerations that I think we're still wrestling with,
01:11:17.640 | but that was even before that they had this priority.
01:11:21.160 | We would not be sitting here today
01:11:22.720 | if Congress in 2000 had not said, let's bring this about.
01:11:26.480 | - So they already had that vision, actually.
01:11:29.080 | I didn't know about that.
01:11:30.040 | So for people who don't know the DARPA challenges
01:11:32.720 | is the events that were just kind of like
01:11:36.340 | these seemingly small scale challenges
01:11:39.640 | that brought together some of the smartest roboticists
01:11:41.800 | in the world, and that somehow created enough of a magic
01:11:45.480 | where ideas flourished, both engineering and scientific,
01:11:51.600 | that eventually then was the catalyst
01:11:54.720 | for creating all these different companies
01:11:56.320 | that took on the challenge.
01:11:57.200 | Some failed, some succeeded,
01:11:58.780 | some are still fighting the good fight.
01:12:01.040 | And that somehow just that little bit of challenge
01:12:03.440 | was the essential spark of progress
01:12:07.420 | that now resulted in this beautiful up and down wave
01:12:10.840 | of hype and profit and all this kind of weird dance
01:12:15.160 | where the B word, billions of dollars
01:12:18.080 | have been thrown around and we still don't know.
01:12:21.120 | And the T word, trillions of dollars
01:12:23.360 | in terms of transformative effects of autonomous vehicles
01:12:25.680 | and all that started from DARPA and that initial vision
01:12:29.720 | of, I guess, as you're saying,
01:12:31.320 | of automating part of the military supply chain.
01:12:35.000 | - Yeah. - I did not know that.
01:12:36.240 | That's interesting.
01:12:37.060 | So they had the same kind of vision for the military
01:12:39.960 | as we're now talking about a vision for the civilian,
01:12:43.340 | whether it's trucking or whether it's autonomous vehicle,
01:12:45.740 | sort of a ride sharing kind of application.
01:12:48.360 | - Yeah, I mean, what an incredible spark, right?
01:12:51.840 | And just the story of what it produced, right?
01:12:57.720 | I mean, your own work on self-driving, right?
01:13:01.220 | I mean, you've studied it as an academic, right?
01:13:04.080 | How many great researchers and minds have been harnessed
01:13:08.440 | by this outcome of that spark, right?
01:13:11.300 | And I think this is sort of theoretically
01:13:13.400 | about technology, right?
01:13:14.280 | This is what makes it so great is that,
01:13:16.880 | this is what makes us human in my opinion, right?
01:13:18.440 | Is that you conceive of something in your mind
01:13:21.320 | and then you bring it into reality, right?
01:13:23.560 | I mean, that's what is so great about it.
01:13:26.500 | - Sometimes you're too dumb to realize how difficult it is
01:13:29.840 | so you take it on.
01:13:30.680 | (laughing)
01:13:31.560 | - Right.
01:13:32.400 | - And then eventually you're in too deep.
01:13:36.720 | So you might as well solve the problem.
01:13:38.720 | - Well, and maybe we're in that situation right now
01:13:41.320 | with self-driving.
01:13:42.280 | But, and so let me throw this out there.
01:13:44.320 | I'd be curious to hear your thoughts on it.
01:13:46.320 | But truck drivers always ask me, is this for real?
01:13:50.120 | Like, is this really doable?
01:13:51.640 | it's harder than they think, right?
01:13:53.440 | And they can't really do this.
01:13:55.720 | And at first I was like, look,
01:13:59.280 | this is like the defense department
01:14:01.520 | and basically the top computer science
01:14:04.280 | and robotics departments in the world.
01:14:07.720 | And now Silicon Valley with billions of dollars in funding
01:14:14.000 | and just some of the smartest, hardest working,
01:14:17.400 | most visionary people focused on what is clearly
01:14:20.600 | a gigantic market, right?
01:14:24.640 | And what I tell them is like,
01:14:26.760 | if self-driving vehicles don't happen,
01:14:31.200 | I think this will be the biggest technology failure story
01:14:34.640 | in human history.
01:14:35.680 | I don't know of anything else that has galvanized effort like this.
01:14:39.880 | I mean, you've had people in garages
01:14:41.800 | or weird inventors work on things
01:14:43.360 | their whole lives and come really close
01:14:45.120 | and it never happens and it's a great failure story, right?
01:14:48.160 | But never have we had like whole,
01:14:50.320 | I mean, we're talking about GM, right?
01:14:52.920 | I mean, these are not tech companies, right?
01:14:56.000 | These are industrial giants, right?
01:14:58.200 | What were in the 20th century,
01:15:00.440 | the pinnacle of industrial production in the world
01:15:03.240 | in human history, right?
01:15:05.320 | And they're focused on it now.
01:15:07.280 | So if we don't pull this off, it's like, wow.
01:15:11.320 | - It's fascinating to think about.
01:15:12.560 | I've never thought of it that way.
01:15:14.520 | There was a mass hysteria on a level
01:15:18.400 | in terms of excitement and hype
01:15:20.640 | on a level that's probably unparalleled in technology space.
01:15:23.640 | Like I've seen that kind of hysteria just studying history
01:15:26.400 | when you talk about military conflict.
01:15:28.760 | So we often wage war with a dream of making a better world
01:15:32.680 | and then realize it costs trillions of dollars.
01:15:34.800 | And then we step back and like, and go, wait a minute,
01:15:37.800 | what do we actually get for this?
01:15:40.120 | But in the space of technology,
01:15:41.600 | it seems like all these kinds of large efforts
01:15:44.080 | have paid off.
01:15:45.640 | This, you're right.
01:15:47.120 | It seems like, it seems like even GM and Ford
01:15:51.360 | and all these companies now are a little bit like,
01:15:54.520 | hey, or Toyota and even Tesla,
01:15:58.720 | like, are we sure about this?
01:16:01.800 | - Yeah.
01:16:02.640 | - And it's fascinating to think about
01:16:04.320 | when you tell the story of this,
01:16:06.120 | this could be one of the big first perhaps,
01:16:10.160 | but by far the biggest failures of the dream
01:16:14.120 | in the space of technology.
01:16:16.220 | That's really interesting to think about.
01:16:17.880 | I was a skeptic for a long time
01:16:19.840 | because of the human factor.
01:16:22.880 | Because for business to work in the space,
01:16:25.160 | you have to work with humans
01:16:26.320 | and you have to work with humans at every level.
01:16:29.240 | So in the truck driving space,
01:16:30.520 | you have to work with the truck driver,
01:16:32.480 | but you also have to work with the society
01:16:34.320 | that has a certain conception of what driving means.
01:16:36.960 | And also you have to have work with businesses
01:16:38.760 | that are not used to this extreme level of technology
01:16:43.760 | in the basic operation of their business.
01:16:48.640 | So I thought it would be really difficult
01:16:50.120 | to move to autonomous vehicles in that way.
01:16:53.160 | But then I realized that there's certain companies
01:16:56.200 | that are just willing to take big risks and really innovate.
01:17:00.280 | I think the first impressive company to me was Waymo
01:17:04.320 | or what was used to be the Google self-driving car.
01:17:07.520 | And I saw, okay, here's a company
01:17:11.120 | that's willing to really think long-term
01:17:13.520 | and really try to solve this problem,
01:17:15.640 | hire great engineers.
01:17:17.260 | Then I saw Tesla with Mobileye when they first had,
01:17:22.680 | I thought, actually Mobileye is the thing that impressed me.
01:17:25.800 | When I sat down, I thought,
01:17:27.200 | 'cause I'm a computer vision person,
01:17:28.440 | I thought there's no way a system could keep me in lane
01:17:33.440 | long enough for it to be a pleasant experience for me.
01:17:36.720 | So from a computer vision perspective,
01:17:38.160 | I thought there'd be too many failures,
01:17:39.560 | it'd be really annoying, it'd be a gimmick, a toy,
01:17:42.320 | it wouldn't actually create a pleasant experience.
01:17:44.880 | And when I first got a Tesla with Mobileye,
01:17:47.560 | the initial Mobileye system,
01:17:49.120 | it actually held the lane for quite a long time
01:17:52.600 | to where I could relax a little bit.
01:17:54.600 | And it was a really pleasant experience.
01:17:56.120 | I couldn't exactly explain why it's pleasant
01:17:58.640 | 'cause it's not like I still have to really pay attention,
01:18:01.120 | but I can relax my shoulders a little bit.
01:18:05.520 | I can look around a little bit more.
01:18:07.000 | And for some reason that was really reducing the stress.
01:18:10.560 | And then over time,
01:18:12.800 | Tesla with a lot of the revolutionary stuff they're doing
01:18:15.760 | on the machine learning space,
01:18:17.400 | made me believe that there's opportunities here to innovate,
01:18:22.000 | to come up with totally new ideas.
01:18:23.840 | Another very sad story that I was really excited about
01:18:27.720 | is Cadillac's Super Cruise system.
01:18:29.480 | It is a sad story because I think I vaguely read in the news
01:18:32.840 | they just said they're discontinuing Super Cruise,
01:18:36.120 | but it's a nice innovative way
01:18:39.040 | of doing driver attention monitoring
01:18:41.320 | and also doing lane keeping.
01:18:43.400 | And just innovation could solve this
01:18:45.680 | in ways we don't predict.
01:18:47.360 | And same with in the trucking space,
01:18:49.560 | it might not be as simple as journalists envisioned
01:18:52.760 | a few years ago, where everything's just automated.
01:18:55.720 | It might be gradually helping out the truck driver
01:19:00.160 | in some ways that make their life more efficient,
01:19:03.400 | more effective, more pleasant,
01:19:06.400 | remove some of the inefficiencies
01:19:09.280 | that we've been talking about in totally innovative ways.
01:19:12.520 | And that I still have that dream
01:19:14.880 | that I believe to solve the fully autonomous driving problem
01:19:18.680 | we're still many years away,
01:19:20.440 | but on the way to solving that problem,
01:19:22.920 | it feels like there could be,
01:19:25.060 | if there's bold risk takers and innovators in the space,
01:19:29.420 | there's an opportunity to come up with
01:19:31.880 | like subtle technologies that make all the difference.
01:19:36.160 | That's actually just what I realized
01:19:38.880 | is sometimes little design decisions
01:19:41.440 | make all the difference.
01:19:42.920 | It's the BlackBerry versus the iPhone.
01:19:45.100 | Why is it that having a glass screen,
01:19:48.800 | and using your finger for all of the work
01:19:51.880 | versus buttons, makes all the difference?
01:19:54.280 | This idea that now that you have a giant screen
01:19:57.200 | so that every part of the experience
01:20:00.100 | is now a digital experience.
01:20:01.940 | So you can have things like apps that change everything.
01:20:05.060 | When you're first thinking about
01:20:07.620 | whether you want a keyboard or not on a smartphone,
01:20:10.140 | you think it's just a keyboard decision,
01:20:13.380 | but then you later realize that by removing the keyboard,
01:20:17.840 | you're enabling a whole ecosystem of technologies
01:20:21.940 | that are inside the phone.
01:20:23.140 | And now you're making the smartphone into a computer.
01:20:25.500 | And that same way,
01:20:27.360 | who knows how you can transform trucks, right?
01:20:30.480 | By like automating some parts of it,
01:20:32.520 | maybe adding some displays,
01:20:36.480 | maybe giving the truck driver
01:20:39.880 | some control in the supply chain to make decisions,
01:20:43.160 | all those kinds of things.
01:20:44.480 | - Yeah.
01:20:45.800 | - So I don't know.
01:20:46.960 | So where are you on the spectrum of hope
01:20:51.640 | for the role of automation in trucking?
01:20:56.300 | - I think automation is inevitable.
01:20:59.560 | And again, I think this is really going to be transformative
01:21:04.560 | and it's gonna be,
01:21:06.200 | I've studied the history of trucking technology
01:21:09.680 | as much as I can.
01:21:11.000 | There's not a lot of great stuff written
01:21:12.960 | and you kind of have to piece it together;
01:21:14.160 | there isn't a lot of data, or places to know
01:21:16.920 | sort of volumes of stuff and how they're changing, et cetera.
01:21:19.280 | But the big revolutionary changes in trucking
01:21:24.520 | are because of constellations of factors.
01:21:28.000 | It's not just one thing, right?
01:21:29.680 | So Daimler builds a motorized truck in,
01:21:32.320 | I think it's 1896, right?
01:21:35.360 | Intercity trucking.
01:21:37.600 | So basically what they use that truck for
01:21:39.120 | is just to swap out horses, right?
01:21:41.840 | They basically do the same thing.
01:21:42.960 | The service doesn't really change, you know?
01:21:46.000 | And then World War I really spurs the development
01:21:48.800 | of sort of bigger, larger trucks,
01:21:50.720 | and spreads air-filled tires.
01:21:54.440 | And then we start paving roads, right?
01:21:57.880 | So now you've got paved roads,
01:22:00.480 | air-filled tires, and the internal combustion engine.
01:22:04.460 | Now you got a winning mix.
01:22:05.720 | Now it met with demand for people who wanted to get out
01:22:09.380 | from under the thumb of the railroads, right?
01:22:12.180 | So there was all of this pent up demand
01:22:14.720 | to get cheaper freight from the countryside
01:22:18.400 | into cities and between cities
01:22:20.820 | that typically had to go by rail.
01:22:22.760 | And so now, you know, 40 years
01:22:25.880 | after that internal combustion engine,
01:22:28.360 | it becomes this absolutely essential, right?
01:22:31.000 | This necessary but not sufficient piece of technology
01:22:34.360 | to create the modern trucking industry in the 1930s.
01:22:38.880 | And I think self-driving is gonna be,
01:22:41.680 | self-driving trucks are gonna be part of that.
01:22:43.680 | And the idea, I don't know, I guess we credit Jeff Bezos.
01:22:47.720 | The idea is, you know, okay, so Sam Walton,
01:22:52.320 | if we can do it like a slight tangent
01:22:54.060 | on sort of the importance of trucking to business strategy
01:22:57.080 | and sort of how it has transformed our world.
01:22:59.580 | The central insight that Sam Walton had
01:23:03.520 | that made him the giant that he was
01:23:06.160 | in influencing the way that so many people get stuff
01:23:10.480 | was a trucking insight.
01:23:12.520 | And so if you look at the way that he developed his system,
01:23:17.920 | you build a distribution center
01:23:19.680 | and then you ring it with stores.
01:23:22.440 | Those stores are never further out
01:23:25.040 | from that distribution center
01:23:26.280 | than a human-driven truck can drive back and forth
01:23:30.100 | in one day.
01:23:30.940 | And so rather than the way all of his competitors
01:23:34.960 | were doing it with sending trucks all over the place
01:23:38.000 | and having people sleep overnight
01:23:39.560 | and sort of making the trucking service fit
01:23:43.320 | where they had stores,
01:23:45.280 | he designed the layout of the stores
01:23:48.360 | to fit what trucks could do.
01:23:51.400 | And so transportation and logistics
01:23:53.960 | become Walmart's edge
01:23:57.360 | and allows them to dominate the space.
01:23:59.380 | That's the challenge that Amazon has now.
01:24:02.080 | They've mastered the digital part of it,
01:24:05.800 | and now they gotta figure out like,
01:24:07.520 | how do we dominate the actual physical movement
01:24:11.640 | that complements that?
01:24:14.280 | Others are obviously gonna follow,
01:24:16.720 | but the capabilities of these trucks
01:24:18.840 | is completely different
01:24:20.480 | than the capability of a human-driven truck.
01:24:22.880 | So if you're Smith packing, right?
01:24:25.560 | And you've got a bunch of meat in a warehouse
01:24:30.280 | and it's going to grocery distribution centers,
01:24:33.840 | you have that trucker probably come in the night before
01:24:37.320 | and you make him wait so that he has a full 10-hour break,
01:24:41.460 | which is what the law requires,
01:24:42.720 | so that he can get to the furthest reaches
01:24:45.760 | that he can of one of those stores, right?
01:24:48.300 | So he can drive his full 11 hours and bring that meat
01:24:51.680 | so it doesn't have to sit overnight
01:24:53.360 | in that refrigerated trailer, right?
01:24:55.520 | And so their system is based on that.
01:24:57.480 | Now, what happens when that truck
01:25:00.720 | can now travel two times as far, right?
01:25:04.160 | Three times as far?
01:25:05.880 | Now you don't need the warehouses where they were.
01:25:08.840 | Now you can go super lean with your inventory.
01:25:11.760 | Instead of having meat here, meat there, meat there,
01:25:14.480 | you can put it all right here.
01:25:16.280 | And if it's cheap enough,
01:25:18.160 | substitute those transportation costs
01:25:20.440 | for all that warehousing costs, right?
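The range arithmetic behind this warehouse-consolidation scenario can be sketched in a few lines. All numbers here (average speed, autonomous uptime) are illustrative assumptions, not figures from the conversation; only the 11-hour driving limit is mentioned above:

```python
# Back-of-envelope range comparison: human-driven vs. autonomous truck.
# Numbers are assumed for illustration.

AVG_HIGHWAY_SPEED_MPH = 55  # assumed average line-haul speed

def daily_range_miles(driving_hours_per_day: float) -> float:
    """Miles a truck can cover in one day at the assumed average speed."""
    return driving_hours_per_day * AVG_HIGHWAY_SPEED_MPH

# A human driver is capped at 11 driving hours per day by US hours-of-service rules.
human_range = daily_range_miles(11)

# An autonomous truck could, in principle, run close to around the clock
# (assume a few hours left over for fueling, inspection, loading).
autonomous_range = daily_range_miles(20)

print(f"human-driven daily range: {human_range:.0f} miles")
print(f"autonomous daily range:   {autonomous_range:.0f} miles")
print(f"range multiplier:         {autonomous_range / human_range:.1f}x")
```

Roughly doubling or tripling that daily radius is what lets one warehouse serve territory that used to need several.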
01:25:22.440 | So this is gonna remake landscapes
01:25:24.720 | in the same way that big box supply chains did, right?
01:25:29.280 | And then of course, the further compliment of that is,
01:25:32.920 | how do you then get it to people at their door, right?
01:25:37.640 | And the big box supply chain,
01:25:39.520 | it moves very few items in really large quantities
01:25:44.320 | to very few locations, pretty slowly, right?
01:25:48.380 | E-commerce aspires to do something completely different,
01:25:53.440 | right, move huge varieties of things in small quantities,
01:25:57.720 | virtually everywhere as fast as possible, right?
01:26:00.720 | And so that is like that intercity trucking
01:26:06.480 | in the era of railroad monopolies, right?
01:26:10.360 | The demand for that is potentially enormous, right?
01:26:14.240 | And so there's such a,
01:26:16.840 | so right now I think a lot of the business plans
01:26:20.400 | for sort of automated trucks, right?
01:26:22.200 | And sort of the way that the journalistic accounts
01:26:24.360 | portray it is like, okay,
01:26:25.400 | if we swap out a human for a computer,
01:26:28.680 | what are the labor costs per mile?
01:26:30.260 | And like, oh, here's the profitability
01:26:31.760 | of self-driving trucks, uh-uh.
01:26:34.160 | Like this is transformative technology.
01:26:36.280 | We're gonna change the way we get stuff.
01:26:39.000 | - So we're gonna actually get a lot more trucks, period,
01:26:41.880 | with like with autonomous trucks,
01:26:43.160 | because they would enable a very different
01:26:45.520 | kind of transportation networks, you think?
01:26:47.520 | - Yeah, and this is where it's like, uh-oh.
01:26:50.200 | Like, yeah, so we really thought
01:26:54.640 | we were gonna be electrifying trucks.
01:26:56.520 | If they're going twice as far,
01:26:59.040 | if they're moving three times as much,
01:27:01.480 | if they're going three times as far, right?
01:27:03.320 | What does that mean for how far we are
01:27:05.560 | behind on batteries, right?
01:27:06.800 | We've got sort of these ideas about like,
01:27:09.120 | man, here's how close we could get to meet this demand.
01:27:12.800 | That demand is gonna radically change, right?
01:27:15.040 | These trucks are, you know.
01:27:16.300 | So then we've got to think about, all right,
01:27:18.500 | if it's not batteries, how are we powering these things?
01:27:22.640 | And how many of them are there gonna be?
01:27:25.320 | Like right now we've got 5 million containers
01:27:28.480 | that move from LA and Long Beach to Chicago on rail.
01:27:33.480 | Rail is three or four times at least more efficient
01:27:38.560 | than trucks in terms of greenhouse gas emissions.
01:27:42.640 | And on that lane, it varies a lot depending on demand,
01:27:45.740 | but maybe rail has a 20% advantage in cost, maybe 25%,
01:27:50.040 | but it's a couple of days slower.
01:27:52.120 | So now you cut the cost of that truck,
01:27:54.600 | transportation per mile by 30%.
01:27:57.480 | Now it's cheaper than rail and it gets the stuff there
01:28:00.640 | five days faster than rail.
01:28:02.560 | How many millions of containers are gonna leave LA
01:28:05.680 | and Long Beach on self-driving trucks and go to Chicago?
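The modal-shift arithmetic in that question can be made concrete. Every number below is an assumption for illustration except the roughly 20-25% rail cost advantage and the roughly 30% automation cost cut mentioned above:

```python
# Illustrative LA/Long Beach -> Chicago lane comparison.
# All specific dollar figures and mileage are assumed, not from the conversation.

LANE_MILES = 2_100                                 # rough highway distance, LA to Chicago

truck_cost_per_mile = 2.00                         # assumed human-driven truckload cost per mile
rail_cost_per_mile = truck_cost_per_mile * 0.78    # rail ~20-25% cheaper, per the discussion

truck_cost = truck_cost_per_mile * LANE_MILES
rail_cost = rail_cost_per_mile * LANE_MILES

# The scenario above: automation cuts truck cost per mile by ~30%.
auto_truck_cost = truck_cost * 0.70

print(f"rail:             ${rail_cost:,.0f} (but days slower)")
print(f"human truck:      ${truck_cost:,.0f}")
print(f"autonomous truck: ${auto_truck_cost:,.0f}")
```

Under these assumptions the autonomous truck undercuts rail on price while still being days faster, which is exactly the condition for containers to shift off the railroad.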
01:28:08.840 | - And it might look very much like a train
01:28:10.920 | if we go with the platooning solution.
01:28:14.080 | You have these rows of like,
01:28:18.560 | imagine like rows of like 10, like dozens of trucks
01:28:21.680 | or like hundreds of trucks, like some absurd situation.
01:28:25.200 | Just going from LA to Chicago,
01:28:28.120 | just this train, but taking up a highway.
01:28:31.760 | I mean, this is probably a good place
01:28:34.800 | to talk about various scenarios.
01:28:38.000 | - But before we get there,
01:28:39.000 | can I just make one interesting observation
01:28:41.760 | that I made as a driver.
01:28:44.840 | When you're in a truck, you're up higher,
01:28:46.600 | so you can see further and you can see the traffic patterns
01:28:50.040 | and cars move in packs.
01:28:54.400 | I'm sure there's academic research on this, right?
01:28:56.520 | But they move in packs.
01:28:57.400 | They kind of bunch up behind a slower car
01:28:59.480 | and then a bunch of them break free.
01:29:01.240 | And this is sort of on almost free-flowing highways.
01:29:03.920 | They kind of move in packs
01:29:05.200 | and you can kind of see them in the truck.
01:29:07.560 | So, rather than platoons,
01:29:09.240 | we might have like hives of trucks, right?
01:29:11.960 | So you have like 20 trucks moving
01:29:13.960 | in some coordinated fashion, right?
01:29:15.960 | And then maybe the self-driving cars are,
01:29:18.480 | 'cause people don't like to be around them
01:29:20.040 | or whatever it is, right?
01:29:21.400 | You might have a pod of 20 self-driving cars
01:29:24.120 | that are moving in a packet behind them.
01:29:26.560 | - This is what if the aliens came down
01:29:28.880 | or were just observing cars,
01:29:30.960 | which is one of the sort of prevalent characteristics
01:29:34.680 | of human civilization is there seems to be these cars
01:29:37.400 | like moving around that would do this kind of analysis
01:29:40.320 | of like, huh, what's the interesting clustering
01:29:42.560 | of situations here?
01:29:43.760 | Especially with autonomous vehicles, I like this.
01:29:48.720 | Okay, so what technologically speaking do you see
01:29:52.920 | are the different scenarios
01:29:55.240 | of increasing automation in trucks?
01:29:58.160 | What are some ideas that you think about?
01:30:00.880 | - For the most part, I have no influence
01:30:04.880 | on sort of what these ideas were.
01:30:06.400 | So what the project was that I did was,
01:30:09.680 | I said, technology is created by people.
01:30:12.560 | They solve for X and they have some conception
01:30:16.160 | of what they wanna do.
01:30:17.680 | And that's where we should start in sort of thinking
01:30:19.960 | about what the impacts might be.
01:30:22.160 | So I went and I talked to everybody I could find
01:30:25.200 | who was thinking about developing a self-driving truck.
01:30:28.640 | And the question was essentially,
01:30:31.280 | what are you trying to build?
01:30:32.800 | Like, what do you envision this thing doing?
01:30:35.720 | It turned out that for a lot of them was an afterthought.
01:30:40.440 | They knew the sort of technological capabilities
01:30:43.800 | that a self-driving vehicle would have.
01:30:45.400 | And those were the problems that they were tackling.
01:30:48.200 | They were engineers and computer scientists and--
01:30:50.280 | - Oh, robotics people, I love you so much.
01:30:53.600 | This is, I could talk forever about this,
01:30:56.720 | but yes, there's a technology problem.
01:30:58.280 | Let's focus on that and we'll figure out
01:30:59.680 | the actual impact on society,
01:31:02.400 | how it's actually going to be applied,
01:31:03.800 | how it's actually going to be integrated from a policy
01:31:06.920 | and from a human perspective,
01:31:08.280 | from a business perspective later.
01:31:09.840 | First, let's solve the technology problem.
01:31:11.560 | That's not how life works, friends, but okay, I'm sorry.
01:31:14.400 | - Yeah, yeah, so I mean, I'm sure you know
01:31:17.040 | the division of labor in these companies, right?
01:31:19.000 | There's sort of a business development side,
01:31:21.320 | and then there's the engineering side, right?
01:31:22.920 | And the engineers are like, oh my God,
01:31:24.440 | what are these business development people,
01:31:26.360 | why are they involved in this process?
01:31:30.120 | So I ended up sort of coming up with a few different ideas
01:31:34.440 | that people seem to be batting around,
01:31:37.000 | and then really tried to zero in
01:31:39.600 | on a layman's understanding of the limitations, right?
01:31:42.720 | And it turns out that's really obvious and quite simple.
01:31:47.520 | Highway driving's a lot simpler, right?
01:31:49.960 | So the plan is simplify the problem, right?
01:31:53.560 | And focus on highways because city driving
01:31:56.200 | is so much more complicated.
01:31:59.840 | So from that, I came up with basically six scenarios,
01:32:04.280 | actually, I came up with five
01:32:06.640 | that the developers were talking about.
01:32:08.640 | And then one that I thought was a good idea
01:32:11.400 | that I had read about, I think in like 2013 or 2014,
01:32:16.840 | which was actually something
01:32:18.280 | that the US military was looking at.
01:32:20.520 | I actually first heard about the idea
01:32:23.160 | of this kind of automation, at least in sketched out form,
01:32:27.640 | in like 2011, I guess it was with Peloton,
01:32:30.800 | which was this sort of early technology entrant
01:32:33.960 | into the trucking industry,
01:32:35.920 | which was working on platooning trucks.
01:32:39.200 | And all they were doing was, you know,
01:32:41.200 | a cooperative adaptive cruise control
01:32:43.080 | as they came to call it.
01:32:45.760 | And we ended up on a panel together.
01:32:48.240 | And it's kind of interesting because I was on that panel
01:32:52.000 | because I was thinking about how we got the best return
01:32:55.680 | on investment for fuel efficient technologies.
01:32:58.380 | And if it's cool, I'll sort of set this up
01:33:01.200 | because it comes into sort of the story
01:33:03.480 | of some of these scenarios.
01:33:05.680 | So when I studied the drivers,
01:33:09.280 | you had this like complete difference in the driving tasks,
01:33:15.160 | like we were talking about before with long haul and city.
01:33:18.880 | And you're not paid in the city, you've got congestion,
01:33:23.120 | the turns are tight, there's lots of pedestrians,
01:33:27.920 | all the things that self-driving trucks don't like,
01:33:29.880 | truckers don't like.
01:33:31.000 | And they're not paid, there's lots of waiting time.
01:33:35.680 | And then in the highway, they get to cruise,
01:33:37.880 | they're getting paid, they have control,
01:33:39.760 | they go at their own pace, they're making money,
01:33:42.000 | they're happy.
01:33:43.180 | Well, it turned out, I guess it was around 2010,
01:33:45.980 | this is still when we were thinking
01:33:47.380 | about regenerative braking,
01:33:49.020 | and hybrid trucks being sort of like the solution.
01:33:51.520 | The problems with them, and the advantages,
01:33:57.660 | also split along what I was thinking of
01:33:59.380 | as kind of the rural-urban divide at that time.
01:34:01.660 | So like you got the regenerative braking,
01:34:04.220 | you can make the truck lighter, you can keep it local.
01:34:08.420 | You don't get any benefit from that hybrid electric
01:34:11.660 | in the rural highway, you want aerodynamics.
01:34:16.660 | There you want low rolling resistance tires
01:34:20.460 | and these super aerodynamic sleek trucks,
01:34:23.140 | where we know with off the shelf technology today,
01:34:26.220 | we could double the fuel economy,
01:34:27.820 | more than double the fuel economy of the typical truck
01:34:30.380 | in that highway segment,
01:34:32.300 | if we segmented the duty cycle.
01:34:34.580 | And so in the urban environment,
01:34:36.580 | you want a clean burning truck,
01:34:37.780 | so you're not giving kids asthma,
01:34:39.460 | you want it lighter, so it's not destroying
01:34:41.700 | those less strong pavements.
01:34:45.140 | You're not, you can make tighter turns,
01:34:47.020 | you don't need a sleeper cab,
01:34:48.340 | 'cause the driver, hopefully is getting home at night.
01:34:51.260 | In the long haul, you want that super aerodynamic stuff.
01:34:54.140 | Now that doesn't get you anything in the city,
01:34:55.740 | and in fact, it causes all kinds of problems,
01:34:57.340 | 'cause you turn too tight,
01:34:58.540 | you crunch up all the aerodynamics
01:35:00.060 | that connect the tractor and the trailer.
01:35:02.900 | So the idea that I had was like,
01:35:05.300 | okay, what if we deliberately segmented it?
01:35:08.100 | Like what if we created these drop lots, outside cities,
01:35:12.500 | where a local city driver who's paid by the hour,
01:35:16.580 | kind of runs these trailers out once they're loaded.
01:35:19.140 | It doesn't sit there and wait while it's being loaded,
01:35:20.800 | they drop off a trailer, they go pick up one that's loaded,
01:35:23.100 | they run it out, when it's loaded, they call them,
01:35:25.140 | and they just run them out there and stage them.
01:35:27.220 | - It's like an Uber driver, but for truckloads.
01:35:30.180 | - Yeah, and we have like intermodal,
01:35:32.060 | we have like, we have basically this would be
01:35:34.060 | the equivalent of like rail to truck intermodal, right?
01:35:37.180 | So you put it on the rail and then,
01:35:38.780 | a trucker picks it up and delivers it, right?
01:35:40.500 | So instead of having the rail,
01:35:42.140 | you'd have these super aerodynamic, hopefully platoons,
01:35:44.900 | or what was at the time was called
01:35:47.380 | long combination vehicles,
01:35:48.780 | which is basically two trailers connected together, right?
01:35:51.020 | 'Cause this is like a huge productivity gain, right?
01:35:54.060 | And then instead of that driver like me,
01:35:56.220 | I would pick up something in upstate New York,
01:35:57.740 | drive to Michigan, drive to Alabama,
01:35:59.940 | drive to Wisconsin, drive to Florida,
01:36:01.820 | I get home every two weeks.
01:36:03.180 | If I'm just running that double trailer,
01:36:08.020 | I might be able to go back and forth
01:36:09.540 | from Chicago to Detroit, right?
01:36:11.700 | Take two trailers there, pick up two trailers going back,
01:36:14.380 | right, and be home every night.
01:36:16.140 | Now the problem with that at the time,
01:36:17.580 | or one of them was, bridge weights.
01:36:20.340 | So, not all bridges can handle
01:36:23.060 | that much weight on them.
01:36:25.340 | They can't handle these doubles, right?
01:36:26.620 | And some places can, some places can't.
01:36:28.540 | And so this platooning idea was happening at the same time.
01:36:31.660 | And we ended up on the same panel
01:36:33.220 | and the founders were like,
01:36:35.180 | "Hey, so what's it like to follow
01:36:37.380 | really close behind another truck?"
01:36:39.020 | Which was kind of the stage that they were at was like,
01:36:41.420 | "What's that experience gonna be like?"
01:36:43.380 | And I was like, "Truckers aren't gonna like it."
01:36:46.340 | I mean, that's just like the cardinal rule
01:36:48.540 | is following distance.
01:36:49.740 | Like that's the one you really shouldn't violate, right?
01:36:52.580 | And when you're out on the road,
01:36:54.060 | like you have that trucker like right on your ass,
01:36:56.940 | people remember that.
01:36:58.180 | They don't remember the 99.9% of truckers
01:37:01.260 | that are not on their ass.
01:37:03.220 | Like they're very careful about that.
01:37:05.340 | - But when the trucks are really close together,
01:37:07.540 | there's benefits in terms of aerodynamics.
01:37:10.100 | So that's the idea.
01:37:11.580 | So like if you want to get some benefits of a platoon,
01:37:16.540 | you want them to be close together,
01:37:17.820 | but you're saying that's very uncomfortable for truckers.
01:37:20.220 | - Yeah, so I mean, I think that ended up at the,
01:37:22.180 | I mean, Peloton I think is sort of winding down
01:37:25.260 | their work on this.
01:37:27.420 | And I think that ended up being still an open question.
01:37:30.980 | Like, and I had a chance to interview a couple of drivers
01:37:33.420 | who at least one, maybe two of which
01:37:35.980 | had actually driven in their platoons.
01:37:38.700 | And I got completely different experiences.
01:37:41.260 | Some of them were like, it's really cool.
01:37:43.340 | You know, I'm like in communication with that other driver.
01:37:46.420 | You know, I can see on a screen what's out,
01:37:49.660 | you know, the front of his truck.
01:37:51.340 | And then some were like, it's too close.
01:37:53.900 | And it might be one of those things that's just,
01:37:55.260 | you know, it takes an adjustment to sort of get there.
01:37:57.940 | So you get the aerodynamic advantage,
01:37:59.380 | which, you know, saves fuel.
01:38:01.900 | There's some problems though, right?
01:38:03.340 | So, you know, you're getting that aerodynamic advantage
01:38:06.940 | because there's a low pressure system
01:38:08.300 | in front of that following truck.
01:38:10.380 | But the engine is designed with higher pressure
01:38:14.460 | feeding that engine, right?
01:38:15.900 | So there's sort of adjustments that you need to make
01:38:18.060 | and, you know, still the benefits are there.
01:38:21.780 | That's one scenario.
01:38:22.900 | And that's just the automation of that
01:38:24.740 | acceleration and braking.
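The fuel arithmetic behind platooning can be sketched: the following truck rides in the leader's low-pressure wake, so both burn less fuel per mile. The saving percentages below are assumptions roughly in the ballpark of two-truck platooning tests, not numbers from the conversation; actual savings vary with following gap, speed, and wind:

```python
# Rough sketch of platoon fuel savings from drafting (all numbers assumed).

BASE_MPG = 6.5          # assumed fuel economy of a solo tractor-trailer
LEAD_SAVING = 0.04      # lead truck: small benefit from reduced wake drag
FOLLOW_SAVING = 0.10    # following truck: rides in the leader's low-pressure wake

def platoon_mpg(saving_fraction: float) -> float:
    # Saving a fraction s of fuel per mile scales miles-per-gallon by 1 / (1 - s).
    return BASE_MPG / (1 - saving_fraction)

lead_mpg = platoon_mpg(LEAD_SAVING)
follow_mpg = platoon_mpg(FOLLOW_SAVING)
avg_saving = (LEAD_SAVING + FOLLOW_SAVING) / 2   # fleet average for a two-truck platoon

print(f"lead truck:      {lead_mpg:.2f} mpg")
print(f"following truck: {follow_mpg:.2f} mpg")
print(f"average saving:  {avg_saving:.0%}")
```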
01:38:26.820 | Starsky, which, you know, probably a lot of your listeners
01:38:31.180 | heard about, was working on another scenario,
01:38:34.740 | which was, you know, to solve that local problem
01:38:38.140 | was gonna do teleoperation, right?
01:38:40.420 | Sort of remote piloting.
01:38:41.820 | I had the chance to, you know,
01:38:44.220 | sort of watch them do that.
01:38:46.260 | It was, you know, they drove a truck in Florida
01:38:49.620 | from one of their offices in San Francisco.
01:38:53.620 | That was really interesting.
01:38:54.820 | And then in case it's not clear, teleoperation
01:38:57.140 | means you're controlling the truck remotely,
01:38:59.460 | like it's a video game.
01:39:00.740 | So you've gotten a chance to witness it.
01:39:04.980 | Does it actually work?
01:39:06.780 | - Yeah, I mean, so it's--
01:39:08.820 | - What are the pros and cons?
01:39:10.140 | - You know, one of the problems with doing research like this
01:39:12.660 | with all these Silicon Valley folks is the NDAs.
01:39:16.500 | - Oh, right.
01:39:17.340 | - So, you know, I don't know what I'm able to say
01:39:20.540 | about sort of watching it, but obviously
01:39:22.940 | there are public statements about sort of
01:39:24.580 | what the challenges are, right?
01:39:25.500 | And it's about the latency and the ability
01:39:29.020 | to sort of respond in real time.
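To see why latency is a hard constraint on remote piloting, a small sketch of how far a truck travels during one network round trip (the speed and latency values are assumed for illustration):

```python
# How far a truck moves before a remote operator's input can take effect.
# Speed and latency values below are illustrative assumptions.

def blind_distance_ft(speed_mph: float, round_trip_latency_ms: float) -> float:
    """Feet traveled during one network round trip at a given speed."""
    feet_per_second = speed_mph * 5280 / 3600
    return feet_per_second * round_trip_latency_ms / 1000

for latency_ms in (50, 200, 500):
    d = blind_distance_ft(65, latency_ms)
    print(f"{latency_ms:>3} ms round trip at 65 mph -> {d:5.1f} ft of travel")
```

At highway speed, even a few hundred milliseconds of network delay means tens of feet traveled before a correction lands, which is why developers mostly talk about teleoperation for low-speed yard and city segments rather than for the highway.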
01:39:31.180 | - There's challenges there.
01:39:32.100 | Let me say one thing.
01:39:33.620 | So I'm talking to the, you know,
01:39:36.380 | I've talked to the Waymo CTO.
01:39:38.420 | I'm in conversations with them.
01:39:39.700 | I'm talking to the head of trucking,
01:39:41.420 | Boris Sofman, next month, actually.
01:39:45.500 | I'm a huge fan of his because he was,
01:39:48.100 | I think the founder of Anki,
01:39:49.260 | which is a toy robotics company.
01:39:52.260 | So I love cute, I love human robot interaction.
01:39:55.620 | And he created one of the most effective
01:39:58.820 | and beautiful toy robots.
01:40:02.100 | Anyway, I keep complaining to them on email privately
01:40:07.300 | that there's way too much marketing in these conversations
01:40:11.940 | and not enough showing off the both the challenge
01:40:14.940 | and the beauty of the engineering efforts.
01:40:16.820 | And that seems to be the case for a lot
01:40:18.860 | of these Silicon Valley tech companies.
01:40:21.260 | They put up this, you're talking about NDAs.
01:40:24.140 | For some reason, rightfully, wrongfully,
01:40:29.620 | because there's been so much hype
01:40:30.980 | and so much money being made,
01:40:33.100 | they don't see the upside in being transparent
01:40:38.100 | and educating the public about how difficult the problem is.
01:40:43.300 | It's much more effective for them to say,
01:40:45.500 | we'll have everything solved.
01:40:46.700 | This will change everything.
01:40:47.860 | This will change society as we know it
01:40:49.420 | and just kind of wave their hands.
01:40:51.100 | As opposed to exploring together
01:40:53.060 | like these different scenarios,
01:40:54.340 | what are the pros and cons?
01:40:55.860 | Why is it really difficult?
01:40:57.300 | What are the gray areas of where it works and doesn't?
01:41:01.780 | What's the role of the human in this picture
01:41:03.860 | of the both the sort of the operators
01:41:06.460 | and the other humans on the road?
01:41:08.020 | All of that, which are fascinating human problems,
01:41:11.540 | fascinating engineering problems
01:41:12.980 | that I wish we could have a conversation about
01:41:15.340 | as opposed to always feeling like it's just marketing talk.
01:41:19.140 | Because a lot of what we're talking about now,
01:41:22.500 | even you with having private conversations under NDA,
01:41:26.700 | you still don't have the full picture
01:41:28.820 | of how difficult this problem is.
01:41:31.260 | One of the big questions I've had, and still have,
01:41:35.020 | is how difficult is driving?
01:41:36.940 | I disagree with Elon Musk and Jim Keller on this point.
01:41:40.980 | I have a sense that driving is really difficult.
01:41:44.380 | The task of driving, just broadly,
01:41:47.660 | this is like philosophy talk.
01:41:49.420 | How much intelligence is required to drive a car?
01:41:54.300 | So from, like, a Jim Keller perspective,
01:41:58.300 | he used to be the head of Autopilot,
01:42:00.340 | the idea is that it's just a collision avoidance problem.
01:42:03.100 | It's like billiard balls.
01:42:04.340 | You have to do some basic perception,
01:42:06.980 | some computer vision,
01:42:08.620 | to convert driving into a game of pool.
01:42:13.100 | And then you just have to get everything into a pocket.
01:42:15.860 | To me, there just seems to be some game theoretic dance
01:42:19.140 | combined with the fact that people's life is at stake.
01:42:21.820 | And then when people die at the hands of a robot,
01:42:24.500 | the reaction is going to be much more complicated.
01:42:26.500 | So all of that, but that's still an open question.
01:42:28.860 | And the cool thing is all of these companies
01:42:31.900 | are struggling with this question
01:42:34.180 | of how difficult is it to solve this problem sufficiently
01:42:37.980 | such that we can build a business on top of it
01:42:39.900 | and have a product that's going to make
01:42:41.700 | a huge amount of money
01:42:42.980 | and compete with the manually driven vehicles.
01:42:46.700 | And so their teleoperation from Starsky's
01:42:49.700 | is really interesting idea.
01:42:50.940 | I mean, there's a few autonomous vehicle companies
01:42:55.300 | that tried to integrate teleoperation in the picture.
01:42:59.020 | Can we reduce some of the costs
01:43:02.220 | while still having reliability,
01:43:04.660 | like catch when the vehicle fails by having teleoperation?
01:43:11.700 | It's an open question.
01:43:12.860 | So that's for you scenario number two
01:43:17.020 | is to use teleoperation as part of the picture.
01:43:20.060 | - Yeah, let me follow up on that question
01:43:22.260 | of how hard driving is,
01:43:23.380 | because this becomes a big question for researchers
01:43:26.540 | who are thinking about labor market impacts, right?
01:43:28.520 | Because we start from a perspective
01:43:31.540 | of what's hard or easy for humans, right?
01:43:35.060 | And so, think about truck driving.
01:43:39.820 | There's been a lot of thinking and debate
01:43:41.940 | in academic research circles
01:43:45.060 | around sort of how you estimate labor impacts, right?
01:43:47.760 | What these models look like.
01:43:49.180 | And a lot of it is about how automatable is a job.
01:43:52.140 | Object recognition, really easy for people, right?
01:43:54.980 | Really hard for computers.
01:43:56.420 | And so there's a whole bunch of things that truck drivers do
01:44:00.580 | that we see as super easy.
01:44:03.860 | And as it would have been characterized 10 years ago,
01:44:07.060 | routine, and it's not routine for a computer, right?
01:44:10.500 | It turns out that something we do naturally
01:44:13.540 | is sort of cutting-edge
01:44:16.500 | computer science, right?
01:44:17.920 | So on the teleoperation question,
01:44:20.140 | I think this is a more interesting one
01:44:23.940 | than people would like to sort of let on, I think publicly.
01:44:28.040 | There are gonna be problems, right?
01:44:32.140 | And this is one of the complexities
01:44:33.660 | of sort of putting these things out in the world.
01:44:35.460 | And if you see the real world of trucking,
01:44:37.860 | you realize, wow, it's rough.
01:44:40.900 | There are dirt lots, there's gravel,
01:44:42.480 | there's salt and ice and cold weather,
01:44:44.700 | and there's equipment that just gets left out
01:44:46.700 | in the middle of nowhere,
01:44:47.680 | and the brakes don't get maintained,
01:44:49.580 | and somebody was supposed to service something
01:44:51.540 | and they didn't, you know?
01:44:53.220 | And so you imagine, okay,
01:44:54.820 | we've got this vehicle that can drive itself,
01:44:57.140 | which is gonna require a whole lot of sensors
01:44:59.020 | to tell it that like the doors are still closed
01:45:02.140 | and the trailer is still hooked up,
01:45:03.620 | and each of the tires has adequate pressure,
01:45:05.740 | and any number of probably hundreds of sensors
01:45:09.100 | that are gonna be sort of relaying information.
01:45:11.860 | And one of them, after 500,000 miles
01:45:15.340 | or whatever, goes out.
01:45:17.060 | Now, do we have some fleet of technicians
01:45:20.300 | sort of continually cruising the highways
01:45:22.460 | and sort of servicing these things as they do what?
01:45:25.140 | Pull themselves off to the side of the road
01:45:27.220 | and say, I've got a sensor fault, I'm pulling over,
01:45:30.100 | or maybe there's some level of safety critical faults,
01:45:32.940 | or whatever it might be.
01:45:36.300 | So, you know, that suggests that there might be a role
01:45:40.660 | for teleoperation, even with self-driving.
01:45:43.780 | And when I push people on it in the conversations,
01:45:47.940 | they all are like, yeah, we kind of have that
01:45:50.300 | on the like bottom of the list,
01:45:52.020 | figure out how to rescue a truck, you know?
01:45:54.620 | Like it's like on the to-do list, right?
01:45:56.820 | After solving the self-driving, you know, question is like,
01:46:00.140 | yeah, what do we do with the problems, right?
01:46:02.620 | I mean, now we could imagine like, all right,
01:46:04.940 | we have some protocol that the truck is not,
01:46:07.860 | you know, realizes the system says not safe for operation,
01:46:11.580 | pull to the side.
01:46:13.180 | Good, you avoid a crash, but now you've got a truck
01:46:15.300 | stranded on the side of the road.
01:46:17.060 | You're gonna send out somebody to like calibrate things
01:46:19.940 | and check out what's going on,
01:46:21.300 | but that sounds like expensive labor,
01:46:23.220 | it sounds like downtime, it sounds like the kind of things
01:46:26.300 | that shippers don't like to happen to their freight,
01:46:28.860 | you know, in a just-in-time world.
01:46:30.980 | And so wouldn't it be great if you could just sort of,
01:46:33.700 | you know, loop your way into the controls of that truck
01:46:37.020 | and say, all right, we've got a sensor out,
01:46:38.980 | says that the tire's bad,
01:46:40.660 | but I can see visually from the camera, looks fine,
01:46:43.340 | I'm gonna drive it to our next depot,
01:46:45.620 | you know, maybe the next Ryder or Penske location, right?
01:46:48.580 | Sort of all these service locations around
01:46:50.500 | and have a technician take a look at it.
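The rescue-and-triage flow described here can be sketched as a toy decision function. This is purely a hypothetical sketch: the fault names, fields, and policy are invented for illustration and don't come from any real autonomous-vehicle stack.

```python
# Hypothetical sketch of the fault-triage logic described above:
# a sensor fault pulls the truck over, then a remote operator decides
# whether it can limp to the nearest depot or needs a road call.
# All names and thresholds here are illustrative only.

from dataclasses import dataclass

@dataclass
class SensorFault:
    sensor: str                # e.g. "tire_pressure_3"
    safety_critical: bool      # brakes, steering, etc.
    confirmed_by_camera: bool  # did the teleoperator visually verify it?

def triage(fault: SensorFault) -> str:
    if fault.safety_critical:
        return "pull_over_and_dispatch_technician"
    if not fault.confirmed_by_camera:
        # Sensor disagrees with what the remote operator sees:
        # likely a failed sensor, so drive it to the next service depot.
        return "teleoperate_to_depot"
    return "pull_over_and_dispatch_technician"

# "Sensor says the tire's bad, but I can see from the camera it looks fine."
print(triage(SensorFault("tire_pressure_3", False, False)))
# teleoperate_to_depot
```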
01:46:52.740 | So teleoperation often gets this, you know,
01:46:57.060 | dismissive, you know, commentary from other folks
01:47:02.060 | working on other scenarios.
01:47:04.020 | But I think it's potentially more relevant
01:47:06.980 | than we hear publicly.
01:47:09.860 | - But it's a hard problem.
01:47:11.260 | You know, for me, I've gotten a chance to interact
01:47:18.100 | with people that take on hard problems and solve them,
01:47:20.300 | and they're rare.
01:47:21.500 | What Tesla has done with their data engine,
01:47:25.600 | so I thought autonomous driving cannot be solved
01:47:28.600 | without collecting a huge amount of data
01:47:31.320 | and organizing it well,
01:47:32.280 | not just collecting, but organizing it.
01:47:34.800 | And exactly what Tesla is doing now
01:47:37.300 | is what I thought it would be,
01:47:38.640 | like I couldn't see car companies doing that,
01:47:40.600 | including Tesla.
01:47:42.200 | And now that they're doing that, it's like, oh, okay.
01:47:44.640 | So it's possible to take on this huge effort seriously.
01:47:48.160 | To me, teleoperation is another huge effort like that.
01:47:51.880 | It's like taking seriously what happens
01:47:55.480 | when it fails.
01:47:56.960 | What's the, in the case of Waymo for the consumer,
01:48:00.840 | like ride sharing, what's the customer experience like?
01:48:04.320 | There's a bunch of videos online now
01:48:06.000 | where people are like, the car fails
01:48:08.480 | and it pulls off to the side,
01:48:09.960 | and you call like customer service,
01:48:11.600 | and you're basically sitting there for a long time,
01:48:13.920 | and there's confusion,
01:48:15.000 | and then there's a rescue that comes and they start to drive.
01:48:17.720 | I mean, just the whole experience is a mess
01:48:19.760 | that has a ripple effect to how you trust
01:48:23.360 | in the entirety of the experience.
01:48:25.400 | But like actually taking on the problem of that failure case
01:48:29.440 | and revolutionizing that experience,
01:48:32.080 | both for trucking and for ride sharing,
01:48:34.400 | that's an amazing opportunity there
01:48:35.920 | because that feels like it would change everything.
01:48:40.080 | If you can reliably know when the failures happen,
01:48:42.740 | which they will, you have a clear plan
01:48:45.120 | that doesn't significantly affect the efficiency
01:48:47.800 | of the whole process, that could be the game changer.
01:48:51.720 | And if teleoperation is part of that,
01:48:53.200 | it could be just like you're saying,
01:48:54.880 | it could be teleoperation,
01:48:56.440 | or it could be like a fleet of rescuers that can come in,
01:49:00.120 | which is a similar idea.
01:49:01.400 | But teleoperation, obviously,
01:49:03.560 | that allows you to just have a network of monitors,
01:49:07.400 | of people monitoring this giant fleet of trucks
01:49:10.760 | and taking over when needed.
01:49:12.760 | And it's a beautiful vision of the future
01:49:14.680 | where there's millions of robots
01:49:18.160 | and only thousands of humans
01:49:20.360 | monitoring those millions of robots.
01:49:22.720 | That seems like a perfect dance
01:49:27.360 | of allowing humans to do what they do best
01:49:29.560 | and allowing robots to do what they do best.
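The "thousands of humans monitoring millions of robots" picture can be put in rough numbers. Every figure below is invented for illustration; the point is only the shape of the arithmetic.

```python
# Back-of-envelope sketch of the monitoring ratio: how many teleoperators
# are busy at any moment for a given fleet? All inputs are made-up numbers.

def operators_needed(fleet_size: int,
                     interventions_per_truck_hour: float,
                     minutes_per_intervention: float) -> float:
    # Workload in operator-hours per hour of fleet operation, which is
    # also the number of operators occupied at any instant on average.
    return fleet_size * interventions_per_truck_hour * (minutes_per_intervention / 60)

# 1,000,000 trucks, one 6-minute remote assist every 100 driving hours:
print(operators_needed(1_000_000, 1 / 100, 6))  # 1000.0
```

Under those assumed rates, a million trucks keep only about a thousand operators busy on average, which is the ratio described above.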
01:49:31.960 | - Yeah, yeah, so I mean, I think there are,
01:49:34.760 | and we just applied for an NSF grant,
01:49:36.520 | we didn't get it, if anybody's watching,
01:49:38.760 | but with some folks from Wisconsin who do teleoperation,
01:49:43.640 | and some of this is used for like rovers
01:49:46.000 | and I mean, really high stakes, difficult problems.
01:49:50.480 | But one of the things we wanted to study
01:49:52.120 | were these mines, these Rio Tinto mines in Australia
01:49:55.520 | where they remotely pilot the trucks.
01:49:59.520 | And there's some autonomy, I guess,
01:50:01.640 | but it's overseen by a remote operator.
01:50:05.180 | And it's near Perth and it's quite remote.
01:50:09.400 | And they retrained the truck drivers
01:50:13.440 | to be the remote operators, right?
01:50:15.820 | There's autonomy in the Port of Rotterdam
01:50:18.320 | and places like that where there are jobs there.
01:50:21.040 | And so there, I think, and maybe we'll get to this later,
01:50:24.040 | but there's a real policy question
01:50:25.440 | about sort of who's gonna lose and what we do about it
01:50:28.800 | and whether or not there are opportunities there
01:50:31.320 | that maybe we need to put our thumb on the scale
01:50:33.800 | a little bit to make sure that there's some give back
01:50:38.660 | to the community that's taking the hit.
01:50:41.740 | So for instance, if there were teleoperation centers,
01:50:45.160 | maybe they go in these communities
01:50:46.820 | that we disproportionately source truck drivers from today.
01:50:50.520 | Now, I mean, what does that mean?
01:50:52.000 | It may not be the cheapest place to do it
01:50:53.600 | if they don't have great connectivity
01:50:55.200 | and it may not be where the upper-level managers wanna be
01:50:58.560 | and places like that, issues like that, right?
01:51:01.100 | So I do think it's an interesting question,
01:51:04.920 | both from sort of a practical scenario situation
01:51:09.800 | of how it's gonna work,
01:51:11.080 | but also from a policy perspective.
01:51:14.000 | - So there's platoons, there's teleoperation,
01:51:17.040 | and this is taking care of some of the highway driving
01:51:20.060 | that we're talking about.
01:53:21.160 | Are there other ideas, scenarios that you have
01:51:27.360 | for autonomous trucks?
01:51:28.840 | - Yeah, so I mean, the most obvious one actually
01:51:31.560 | is just facility to facility, right?
01:51:34.760 | The sort of, it can't go everywhere,
01:51:37.640 | but a lot of logistics facilities
01:51:40.480 | are very close to interstates
01:51:42.120 | and they're on big commercial roads
01:51:45.120 | without bikes and parked cars and all that stuff.
01:51:48.920 | And some of the jobs that I think are really first
01:51:51.740 | on the chopping block are these LTL,
01:51:54.520 | less-than-truckload, what's called line haul, right?
01:51:57.180 | So these are the drivers who go from terminal to terminal
01:51:59.860 | with those full trailers.
01:52:02.340 | And those facilities are often located strategically
01:52:04.980 | to avoid congestion, right?
01:52:07.060 | And to be in big industrial facilities.
01:52:10.100 | So you could imagine that being the first place
01:52:14.180 | you see a Waymo self-driving truck roll out
01:52:17.640 | might be, you know, sort of direct facility to facility
01:52:21.780 | for UPS or FedEx or a less-than-truckload carrier.
01:52:25.660 | - And the idea there is fully driverless.
01:52:27.300 | So potentially not even a driver in the truck.
01:52:30.100 | It's just going from facility to facility,
01:52:32.620 | empty, zero occupancy.
01:52:34.860 | - Yeah, and those, because that labor is expensive,
01:52:37.620 | you know, they don't keep those drivers out overnight.
01:52:39.500 | Those drivers do a run back and forth typically,
01:52:42.460 | or in a team go back and forth in one day.
01:52:46.980 | - So from the people you've spoken with so far,
01:52:49.800 | what's your sense?
01:52:50.640 | How far are we away from, which scenario is closest
01:52:53.880 | and how far away are we from that scenario
01:52:56.920 | of autonomy being a big part of our trucking fleet?
01:53:01.920 | - Most folks are focused on another scenario,
01:53:05.640 | which is the exit to exit, right?
01:53:07.700 | Which looks like that urban truck ports
01:53:09.700 | thing that I laid out earlier.
01:53:12.360 | You know, so you have a human driven truck
01:53:14.240 | that comes out to a drop lot.
01:53:16.900 | It meets up with an autonomous truck.
01:53:19.000 | That truck then, you know, drives it on the interstate
01:53:22.560 | to another lot and then a human driver, you know,
01:53:26.940 | picks it up.
01:53:27.900 | There are a couple variations maybe on that.
01:53:31.140 | So, or let me just run through the last two scenarios.
01:53:36.040 | - Sure.
01:53:36.880 | - The other thing you could do, right, is to say,
01:53:41.300 | all right, I've got a truck that can drive itself.
01:53:43.880 | And I refer to this one as autopilot,
01:53:46.200 | but you know, you have a human drive it out
01:53:49.300 | to the interstate, but rather than have that transaction
01:53:52.620 | where the human driven truck detaches the trailer
01:53:55.900 | and it gets coupled up to a self-driving truck,
01:53:58.760 | they just, that human driver just hops on the interstate
01:54:01.840 | with that truck and goes in back and goes off duty
01:54:05.560 | while the truck drives itself.
01:54:07.200 | And so you have a self-driving truck
01:54:09.120 | that's not driverless, right?
01:54:11.380 | - And just to clarify,
01:54:12.360 | because Tesla uses the term autopilot,
01:54:14.400 | and so do airplanes,
01:54:15.480 | and so everybody uses the word autopilot.
01:54:17.720 | We're referring to essentially full autonomy,
01:54:20.640 | but because it's exit to exit,
01:54:22.120 | the truck driver is on board the truck,
01:54:24.660 | but they're sleeping in the back or whatever.
01:54:26.820 | - Yeah, and this gets to the really weedy policy questions.
01:54:30.720 | Right?
01:54:31.560 | So basically for the Department of Transportation,
01:54:34.160 | for you to be off duty for safety reasons,
01:54:36.360 | you have to be completely relieved of all responsibility.
01:54:39.360 | So that truck has to not encounter a construction site
01:54:44.360 | or inclement weather or whatever it might be
01:54:48.020 | and call to you and say, hey, or I mean, obviously, right?
01:54:51.320 | We're imagining connected vehicles as well, right?
01:54:53.640 | So you're in a self-driving truck, you're in the back
01:54:56.640 | and trucks 20 miles ahead experience some problem, right?
01:55:01.640 | That may require teleoperation or whatever it is, right?
01:55:04.520 | And it signals to your truck,
01:55:05.880 | hey, tell the driver 20 miles ahead,
01:55:08.800 | he's got to hop in the seat.
01:55:10.880 | That would mean that they're on duty according to the way
01:55:13.040 | that the current rules are written.
01:55:14.400 | They have some responsibility.
01:55:15.920 | And part of that is, we need them to get rest, right?
01:55:19.520 | They need to have uninterrupted sleep.
01:55:22.720 | So that's what I call autopilot.
01:55:25.600 | The final scenario is one that I thought was actually
01:55:30.600 | the one scenario that was good for labor,
01:55:32.740 | which I proposed, 'cause I was like,
01:55:37.200 | well, here's an idea that would be like,
01:55:39.400 | actually good for workers.
01:55:40.820 | And just another brief aside here.
01:55:45.660 | The history of trucking over the last 40 years,
01:55:51.360 | there's been a lot of technological change.
01:55:53.160 | So when I learned to drive the truck,
01:55:55.780 | I had to learn to manually shift it like I was describing.
01:55:57.980 | You had to read these fairly complicated,
01:56:00.720 | big sets of laminated maps to figure out
01:56:03.020 | where the truck can go and where it couldn't,
01:56:05.000 | which is a big deal.
01:56:06.280 | When you take these trucks on the wrong road
01:56:07.880 | and you're destroying a bridge
01:56:09.440 | or you're doing a can opener,
01:56:10.540 | which is where you try to drive it under a bridge
01:56:12.360 | that's too low.
01:56:13.200 | You've probably seen that on YouTube.
01:56:14.520 | If not, check it out, truck can opener.
01:56:17.600 | There's some bridges that are famous for it, right?
01:56:20.880 | And there's one I think called the can opener
01:56:23.240 | and you can find on YouTube.
01:56:24.780 | And you had to log those hours manually
01:56:30.260 | and sort of do the math and plan your work routine.
01:56:34.320 | And I would do this every day.
01:56:35.360 | I'd say like, okay, I'm gonna get up at five.
01:56:37.400 | I've got to think about Buffalo and there's traffic there.
01:56:40.080 | So I wanna be through Buffalo by 6.30,
01:56:43.480 | and then that'll put me in Cleveland at 9.30,
01:56:48.160 | which means I'll miss that rush hour,
01:56:51.920 | which is gonna put me in Chicago.
01:56:53.680 | And so you do this.
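The kind of arithmetic a driver used to do by hand each morning can be sketched as a toy script. The cities come from the example just given; the drive times and rush-hour windows are made up for illustration.

```python
# Toy version of the daily trip-planning math described above:
# chain the segment drive times and check each city's arrival time
# against its (assumed) rush-hour window. All numbers are illustrative.

from datetime import datetime, timedelta

segments = [          # (city, drive time from the previous point)
    ("Buffalo",   timedelta(hours=1, minutes=30)),
    ("Cleveland", timedelta(hours=3)),
    ("Chicago",   timedelta(hours=5, minutes=30)),
]
rush_hours = {"Cleveland": (7, 9), "Chicago": (16, 18)}  # (start, end) hour

def plan(depart: datetime) -> datetime:
    t = depart
    for city, leg in segments:
        t += leg
        start, end = rush_hours.get(city, (None, None))
        hit = start is not None and start <= t.hour < end
        print(f"{city}: arrive {t:%H:%M}" + ("  <-- rush hour!" if hit else ""))
    return t

plan(datetime(2021, 1, 1, 5, 0))  # get up and leave at 5:00 am
```

With a 5:00 departure this puts the truck through Buffalo at 6:30 and into Cleveland at 9:30, just clearing the assumed 7-to-9 rush window, which is exactly the juggling act the passage describes.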
01:56:55.000 | And now today, 15 years later,
01:56:59.360 | truck drivers don't have to do any of that.
01:57:02.240 | You don't have to shift the truck.
01:57:03.840 | You don't have to map.
01:57:05.680 | You can figure out the least congested route to go on
01:57:10.080 | and your hours of service are recorded
01:57:12.720 | or a good portion of them are reported automatically.
01:57:17.160 | All of that has been a substantial de-skilling
01:57:19.920 | that has put downward pressure on wages
01:57:23.360 | and allowed companies to kind of speed up,
01:57:25.720 | monitor and direct.
01:57:27.320 | I mean, the key technology that I worked under
01:57:30.880 | is satellite-linked computers.
01:57:32.560 | So before, you could kind of go out and plan your own work
01:57:34.880 | and the boss really couldn't see what you were doing
01:57:36.920 | and push you and say, you've been on break for 10 hours.
01:57:40.120 | Why aren't you moving?
01:57:41.280 | You know, and you might tell them, you know,
01:57:43.920 | 'cause I'm tired, you know, like I didn't sleep well.
01:57:46.000 | I've got to get a couple more hours.
01:57:47.720 | You know, they're only gonna accept that so many times
01:57:50.400 | or at least some of those dispatchers are.
01:57:52.000 | So all this technology has made the job sort of,
01:57:55.760 | you know, de-skilled the job, you know,
01:57:57.800 | hurt drivers in the labor market, made the work worse.
01:58:01.760 | So I think the burden is really on the technologists
01:58:06.760 | who are like, oh, this will make truck driver jobs better
01:58:09.960 | and sort of envision ways that it would.
01:58:11.520 | It's like, the burden's really,
01:58:13.560 | a proof is really on you to sort of really clearly lay out
01:58:16.920 | what that is gonna look like
01:58:18.600 | because 30 or 40 years of history
01:58:21.840 | suggest that technology introduced into labor markets
01:58:25.240 | where workers are really weak and cheap is what wins,
01:58:29.640 | that new technology doesn't help workers.
01:58:31.720 | Or raise their wages.
01:58:33.080 | - So lowers the bar of entry in terms of skill.
01:58:36.640 | - Yeah.
01:58:37.480 | - That's really interesting.
01:58:40.480 | That's tough.
01:58:43.880 | That's tough to know what to do with
01:58:45.560 | because yeah, from a technology perspective,
01:58:47.800 | you wanna make the life of the people
01:58:49.680 | doing the job today easier.
01:58:51.720 | - Is it?
01:58:52.760 | Is that what you want?
01:58:54.040 | - No, but that, like when you think about like what,
01:58:57.200 | exactly, because the reality is you will make
01:59:01.000 | their life potentially a little bit easier,
01:59:04.120 | but that will allow the companies
01:59:06.240 | to then hire people that are less skilled.
01:59:08.800 | It'll get those people that were previously working there
01:59:11.640 | fired or lower wages.
01:59:13.760 | And so the result of this easier
01:59:17.120 | is a lower quality of life.
01:59:19.600 | - Yeah.
01:59:20.440 | - That's dark, actually.
01:59:21.480 | - I know, I'm sorry.
01:59:23.240 | - But you were saying that was for you initially
01:59:25.640 | the hopeful.
01:59:27.160 | - Oh no, so I'll get to that.
01:59:28.920 | But one more thing, 'cause this is not stopping, right?
01:59:31.320 | And this is another interesting question
01:59:33.000 | about the sort of automation.
01:59:34.120 | And I think Uber, right, is an interesting example here,
01:59:38.280 | right, where it's like, okay,
01:59:39.120 | if we had self-driving trucks or self-driving cars, right,
01:59:42.200 | we could automate what used to be taxi service.
01:59:46.160 | There's a whole bunch of stuff
01:59:47.080 | that's already been automated, like the dispatching.
01:59:49.640 | So the dispatchers are already out of work in Rideshare
01:59:53.040 | and the payment is already automated, right?
01:59:54.840 | So you have to automate steps like this.
01:59:57.520 | So you have to have that initial link to dispatch the truck.
02:00:00.840 | You have to have the automated mapping.
02:00:04.160 | So we've sort of done all this incremental automation,
02:00:07.600 | right, that could make the truck completely driverless.
02:00:11.520 | There's some important things happening right now
02:00:14.020 | with the remaining good jobs.
02:00:15.280 | So what you're really paying for
02:00:17.400 | when you get a good truck driver is,
02:00:20.000 | like I said, you get those kind of local skills
02:00:22.620 | of like backing and congested traffic.
02:00:26.320 | Those, it's really impressive to watch
02:00:28.680 | and there's some value on it certainly,
02:00:30.360 | but it's relatively low value
02:00:33.600 | in the actual driving technique, right?
02:00:35.960 | So you bump something, you know, backing into the dock,
02:00:39.120 | it's, you know, it might be a couple thousand dollars
02:00:41.320 | 'cause you ruin a canopy or something over a dock
02:00:43.560 | or tear up a trailer.
02:00:45.400 | What you really want,
02:00:46.480 | those highly skilled conscientious drivers,
02:00:50.520 | and that's really what it is.
02:00:52.080 | And that's what computers are really good at,
02:00:53.940 | is about being conscientious, right,
02:00:55.820 | in the sense of like, they pay attention continually, right?
02:00:59.160 | And how I was describing those long haul segments
02:01:02.240 | where the driver, you know, just keeps out of the situations
02:01:06.600 | that could become problematic.
02:01:08.860 | And just, they don't look at their phone.
02:01:10.800 | I mean, they take the job seriously and they're safe.
02:01:13.360 | And you can give somebody a skills test, right?
02:01:16.000 | In, you know, as a CDL examiner,
02:01:18.360 | you could take them out and say,
02:01:19.200 | all right, I need you to go around these cones
02:01:20.600 | and like drive safely through this school zone.
02:01:24.220 | But what really proves that you're a safe driver
02:01:27.080 | is two years without an accident, right?
02:01:29.640 | Because that means that day after day, hour after hour,
02:01:32.720 | mile after mile, you did the right thing, right?
02:01:36.600 | And not when it was like, oh, some situation's emerging,
02:01:39.360 | but just consistently over time,
02:01:41.360 | kept yourself out of accident situations.
02:01:43.600 | And you can see this with drivers who are, you know,
02:01:45.680 | a million or 2 million safe miles.
02:01:48.000 | The value of those drivers for Walmart
02:01:50.320 | is they don't run over minivans.
02:01:52.780 | The company I work for,
02:01:54.780 | they ran over minivans on a regular basis.
02:01:57.180 | So, you know, when I was trained,
02:01:58.800 | they said we kill 20 people a year.
02:02:00.560 | We send someone to the funeral,
02:02:04.760 | there's a big check involved, don't be that.
02:02:08.400 | You know, we don't wanna go to your funeral
02:02:10.440 | and you don't wanna be the person who caused that funeral.
02:02:15.240 | Okay, so they just write that off.
02:02:18.340 | Okay, that's just part of the business model.
02:02:20.760 | Now, forward collision avoidance
02:02:23.700 | can basically eliminate the vast majority of those accidents.
02:02:30.360 | That's what the value of a really expensive
02:02:33.260 | conscientious driver is based on.
02:02:35.020 | They don't run over minivans.
02:02:37.060 | So as soon as you have that forward collision avoidance,
02:02:40.340 | what's gonna happen to the wages of those drivers?
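The forward-collision-avoidance idea mentioned here boils down to a time-to-collision check: brake automatically when the projected time to impact with the vehicle ahead drops below a threshold. A minimal, simplified sketch; the threshold and numbers are illustrative only, not from any production system.

```python
# Simplified sketch of forward collision avoidance via time-to-collision (TTC).
# Real systems fuse radar/camera data and model braking dynamics; this only
# shows the core arithmetic, with an invented 3-second threshold.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if nothing changes; inf if the gap isn't closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, own_speed: float, lead_speed: float,
                 threshold_s: float = 3.0) -> bool:
    return time_to_collision(gap_m, own_speed - lead_speed) < threshold_s

# 40 m behind a minivan doing 20 m/s while we do 30 m/s: TTC = 4 s, no brake yet.
print(should_brake(40, 30, 20))   # False
# Gap closes to 25 m: TTC = 2.5 s, automatic braking engages.
print(should_brake(25, 30, 20))   # True
```

The system applies this check continuously, every fraction of a second, which is exactly the tireless conscientiousness the passage argues computers are good at.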
02:02:42.840 | - By way of a therapy session,
02:02:45.700 | help me understand,
02:02:48.140 | is collision avoidance,
02:02:51.060 | automated collision avoidance systems,
02:02:55.280 | are they good or bad for society?
02:02:57.840 | - Yeah, I mean, you know, this is, they're good.
02:03:02.200 | - Right? - They're good.
02:03:03.520 | - But what do we do about the pain
02:03:07.600 | of a workforce in the short term
02:03:10.340 | because their wages are gonna go down
02:03:14.060 | because the job starts requiring less and less skill?
02:03:18.020 | Is there a hopeful message here
02:03:20.640 | where other jobs are created?
02:03:22.960 | - So I'm a sociologist, right?
02:03:24.980 | So I'm gonna think about what's the structure behind that
02:03:28.660 | that creates that pain, right?
02:03:30.440 | And it's ownership, right?
02:03:32.840 | You know, we don't call it capitalism for nothing.
02:03:35.600 | You know, what capitalists do is they figure out cheaper,
02:03:39.140 | more efficient ways to do stuff.
02:03:40.740 | And they use technology to do that oftentimes, right?
02:03:43.320 | This is the remarkable history of the last couple centuries
02:03:47.660 | and all the productivity gains is, you know,
02:03:50.660 | people who were in a competitive market saying,
02:03:54.500 | if I have to do it, right?
02:03:56.540 | I don't have a choice 'cause like my competitor over there
02:03:59.300 | is gonna eat my lunch if I'm not on my game.
02:04:02.460 | I don't have a choice.
02:04:04.540 | I've gotta invest in this technology
02:04:06.840 | to, you know, make it more efficient, to make it cheaper.
02:04:10.940 | And what do you look for?
02:04:12.340 | You look for, oftentimes, you look for labor costs, right?
02:04:16.100 | You look for high value labor.
02:04:17.540 | If I can take a hundred and, you know,
02:04:19.820 | a lot of these truck drivers make good money,
02:04:21.140 | a hundred thousand dollars, good benefits,
02:04:22.740 | vacation, you know, retirement.
02:04:25.380 | If I can replace them with a $35,000 worker
02:04:28.500 | when I'm competing with maybe a low wage retail employer
02:04:32.420 | rather than some other more expensive employers
02:04:34.820 | for, you know, skilled blue collar workers,
02:04:37.620 | I'm gonna do that.
02:04:39.100 | And that's just, that's what we do.
02:04:41.780 | And so I think those are the bigger questions
02:04:45.700 | around this technology, right?
02:04:46.900 | Is like, you know, are workers gonna get screwed by this?
02:04:50.220 | Like, yeah, most likely.
02:04:51.660 | Like that's what we do.
02:04:54.140 | - So one of the things you say is, I mean, first of all,
02:04:55.900 | the numbers of workers that will feel this pain
02:04:58.880 | is not perhaps as large as the journalists kind of articulate
02:05:02.800 | but nevertheless, the pain is real.
02:05:05.300 | And I guess my question here is,
02:05:11.540 | do you have an optimistic vision
02:05:12.900 | about the transformative effects
02:05:14.420 | of autonomous trucks on society?
02:05:16.540 | Like if you look 20 years from now
02:05:21.420 | and perhaps see maybe 30 years from now,
02:05:24.540 | perhaps see these autonomous trucks
02:05:26.620 | doing the various parts of the scenarios you listed
02:05:30.140 | and they're just hundreds of thousands of them,
02:05:33.400 | just like veins, like blood flowing through veins
02:05:38.400 | on the interstate system.
02:05:41.380 | What kind of world do you see that's a better world
02:05:46.300 | than today that involves these trucks?
02:05:48.500 | - Yeah, can I defend myself first?
02:05:51.020 | 'Cause I'm reading the comments right now
02:05:53.780 | of people, you know, of the economists who are telling me--
02:05:56.020 | - Dear commenter, dear PhD in economics.
02:05:59.540 | - Yes, yes, dear PhD in economics,
02:06:02.580 | I know that higher skilled jobs are created,
02:06:06.580 | you know, by technological advancement, right?
02:06:09.100 | I mean, there are big questions
02:06:09.940 | about how many of them, right?
02:06:11.940 | So the idea that we would create more expensive labor
02:06:16.940 | positions, right, with a new technology, right?
02:06:20.700 | You better check your business plan
02:06:22.600 | if your idea is to take a bunch of low wage labor
02:06:26.340 | and replace it with the same amount of high wage labor,
02:06:28.740 | right, so there's a question about how many of those jobs.
02:06:31.820 | And there's the really important social
02:06:34.100 | and political question of, are they the same people, right?
02:06:38.220 | And do they live in the same places?
02:06:39.780 | And I think that kind of, you know,
02:06:41.980 | geography is a huge issue here with the impacts, right?
02:06:45.460 | Lots of rural workers.
02:06:47.620 | Interesting politically, lots of red state workers, right?
02:06:50.100 | Lots of blue state, maybe union folks
02:06:52.020 | who are gonna try to slow autonomy
02:06:54.020 | and lots of red state, you know,
02:06:55.980 | representatives in the house maybe who wanna,
02:06:58.340 | you know, stand up for their trucker constituents.
02:07:01.540 | So just to defend myself.
02:07:03.260 | - Yeah, and to elaborate, I think economics as a field
02:07:06.660 | is not good at measuring the landscape
02:07:08.340 | of human pain and suffering.
02:07:10.300 | So, you know, sometimes you can forget in the numbers
02:07:13.860 | that it's real lives that are at stake.
02:07:15.460 | That's what I suppose sociology is better at doing.
02:07:18.540 | So-- - We try sometimes.
02:07:19.940 | Sometimes.
02:07:20.780 | - Well, the problem with, I mean,
02:07:22.180 | I'm somebody who loves psychology and psychiatry
02:07:25.980 | and a little bit, I guess, of sociology.
02:07:28.860 | I realized how little, how tragically flawed the field is,
02:07:33.340 | not because of lack of trying,
02:07:34.820 | but just how difficult the problems are.
02:07:37.220 | That to do really thorough studies
02:07:39.780 | that understand the fundamentals of human behavior
02:07:42.420 | and this, yes, landscape of human suffering,
02:07:45.820 | it's just, it's almost an impossible task without the data.
02:07:48.740 | And we currently don't, you know,
02:07:51.300 | not everybody's richly integrated
02:07:53.580 | to where they're fully connected
02:07:54.940 | and all their information is being like recorded
02:07:58.780 | for sociologists to study.
02:08:00.620 | So you have to make a lot of inferences.
02:08:02.300 | You have to talk to people.
02:08:03.500 | You have to do the interviews as you're doing.
02:08:05.180 | And through that, like really difficult work,
02:08:07.940 | try to understand, like hear the music
02:08:11.700 | that nobody else is hearing.
02:08:13.460 | The music of like what people are feeling,
02:08:15.860 | their hopes, their dreams,
02:08:17.580 | and the crushing of their dreams
02:08:19.820 | due to some kind of economic forces.
02:08:22.300 | - Yeah, I mean, we've just lived that
02:08:24.700 | for four and a half years of probably, you know, elites.
02:08:28.660 | Let me just go out on a limb and say,
02:08:30.820 | not understanding the sort of emotional
02:08:33.740 | and psychological currents of a large portion
02:08:36.740 | of the population, right?
02:08:37.780 | And just being stunned by it and confused, right?
02:08:41.180 | Wasn't confusing for me after having talked to truckers,
02:08:46.140 | again, who trucking is a job of last resort.
02:08:48.820 | These are people who've already lost
02:08:50.500 | that manufacturing job oftentimes,
02:08:52.340 | already lost that construction job to just aging, right?
02:08:57.340 | So what, you know, what can we do, right?
02:08:59.460 | What's sort of the positive vision?
02:09:01.060 | Because like we've got tons of highway deaths,
02:09:04.140 | we've got, and just to, you know, the big picture is,
02:09:09.140 | and this is the opportunity, I guess, for investors,
02:09:12.160 | it's a hugely inefficient system.
02:09:15.660 | So we buy this truck,
02:09:17.360 | there's this low wage worker in it oftentimes,
02:09:19.660 | and again, I'm setting aside those really good
02:09:21.900 | line haul jobs in LTL, those are a different case.
02:09:24.640 | That low wage worker is driving a truck that they might,
02:09:30.540 | the wheels might roll seven to eight hours a day.
02:09:32.480 | That's what the truck is designed to do,
02:09:33.940 | and that's what makes the money for the company.
02:08:36.300 | The other seven, eight hours a day,
02:09:37.980 | the driver's doing other kinds of work
02:09:39.880 | that, you know, is not driving.
02:09:41.940 | And then the rest of the day,
02:09:42.820 | they're basically living out of the truck.
02:09:45.540 | You really can't find a more inefficient use of an asset
02:09:48.980 | than that, right?
02:09:51.140 | Now, a big part of that is we pay for the roads
02:09:53.040 | and we pay for the rest areas and all this other stuff.
02:09:55.660 | So the way that I work and the way that, you know,
02:09:58.780 | I think about these problems is
02:09:59.980 | I try to find analogies, right?
02:10:01.400 | Sort of labor processes and things
02:10:03.180 | that make economic sense, you know,
02:10:05.420 | that seem, you know, in the same area of the economy,
02:10:10.420 | but have some different characteristics for workers, right?
02:10:15.460 | And sort of try to figure out
02:10:17.060 | why does the economics work there, right?
02:10:19.580 | And so if you look at those really good jobs,
02:10:24.580 | the most likely way that you as a passenger car driver
02:10:29.140 | would know that it's one of those drivers
02:10:30.860 | is that there are multiple trailers, right?
02:10:33.100 | So you see these, like, maybe it's three small trailers,
02:10:35.620 | maybe it's two sort of medium-sized trailers.
02:10:37.900 | Some places you might even see
02:10:39.020 | two really big trailers together.
02:10:40.760 | You do that because labor's expensive, right?
02:10:44.540 | And it's highly skilled.
02:10:45.660 | And so you use it efficiently and you say,
02:10:47.980 | all right, you know, rather than having you, you know,
02:10:50.260 | haul that little trailer out of the ports, you know,
02:10:52.340 | that sort of half-size container,
02:10:54.040 | we're gonna wait till we get three
02:10:55.140 | or we're gonna coordinate the movement
02:10:56.580 | so that there are three ready.
02:10:57.820 | You go do what truckers call make a set,
02:11:00.620 | put them together, right, and you go.
02:11:03.160 | That's a massive productivity gain, right?
02:11:06.100 | Because, you know, you're hauling two,
02:11:08.140 | three times as much freight.
02:11:09.880 | So the positive scenario that I threw out in 2018
02:11:13.860 | was why not have a human-driven truck
02:11:18.760 | with a self-driving truck that follows it, right?
02:11:20.700 | Just a drone unit.
02:11:21.960 | And it was, you know, to me,
02:11:26.700 | this seemed as a, you know, non-computer scientist,
02:11:29.540 | it's a sociologist, right?
02:11:31.100 | This made a lot of sense because when I got done talking
02:11:33.540 | to the, you know, the computer scientists
02:11:35.740 | and the engineers, they were like, well, you know,
02:11:37.500 | it's like object recognition,
02:11:38.860 | decision-making algorithm, all this stuff.
02:11:40.660 | It's like, all right, so why don't you leave the human brain
02:11:44.720 | in the lead vehicle, right?
02:11:46.740 | You got all that processing and then all that following.
02:11:50.540 | Now, again, this is sort of me being a layperson.
02:11:54.300 | You know, I said, why don't, you know,
02:11:55.700 | then that following truck, right,
02:11:57.060 | takes direction from the front,
02:11:58.300 | it uses the rear of the trailer as a reference point,
02:12:01.040 | it maintains the lane,
02:12:02.140 | you've got cooperative adaptive cruise control,
02:12:04.660 | and you double the productivity of that driver.
02:12:08.600 | You solve that problem that I hated
02:12:11.540 | in my, you know, urban truck ports thing
02:12:14.220 | about the bridge weight.
02:12:15.620 | 'Cause when you get to the bridges, you know,
02:12:17.380 | the two trucks can just spread out just enough
02:12:20.300 | to make the bridge weight, right?
02:12:21.420 | And you can just program that in,
02:12:22.620 | and, you know, they're 50 feet further apart,
02:12:24.500 | 100 feet further apart.
02:12:26.840 | So interesting sort of, I think, story about this
02:12:32.460 | that leads to kind of, I think, the policy questions.
02:12:35.120 | In, I guess, 2017, Jack Reed and Susan Collins,
02:12:41.660 | and, you know, requested from the Senate,
02:12:44.540 | the Senate requested research
02:12:46.100 | on what the impacts of self-driving trucks would be.
02:12:48.780 | And the first stage of that was for the GAO
02:12:51.740 | to do a report, sort of looking at the lay of the land,
02:12:56.740 | talking to some experts.
02:12:58.240 | And I was working on my 2018 report,
02:13:03.140 | which helped contribute to that GAO report.
02:13:06.220 | And, you know, I had the six scenarios, right?
02:13:08.880 | I'm like, okay, you know, here's what Starsky's doing,
02:13:11.820 | you know, here's what Embark and Uber are doing, you know,
02:13:16.020 | here's what Waymo might be doing.
02:13:18.220 | No, nobody really knows, right?
02:13:20.640 | Here's what Peloton's doing, you know,
02:13:23.340 | here's the autopilot scenario.
02:13:25.300 | And then here's this one that I think
02:13:27.180 | actually could be good for drivers.
02:13:29.500 | So now you've got that driver who's got two,
02:13:32.560 | you know, two times the freight,
02:13:34.460 | their decisions are more important,
02:13:35.700 | they're managing a more complex system, right?
02:13:37.640 | They're probably gonna have to have
02:13:38.620 | some global understanding of how to, you know,
02:13:40.620 | the environments in which it can operate safely.
02:13:42.380 | Right now we're talking upscaling, right?
02:13:46.040 | And so, you know, the GAO, you know,
02:13:50.480 | sort of writes up these different scenarios
02:13:52.300 | and the idea is that it's gonna prepare
02:13:54.400 | for this Department of Transportation,
02:13:56.080 | Department of Labor set of processes to engage stakeholders
02:14:00.680 | and sort of get, you know, get industry perspectives
02:14:05.340 | and then do a study on the labor impacts.
02:14:07.740 | So, you know, that DOT, DOL process starts to happen
02:14:13.080 | and, you know, I get to the workshop
02:14:17.020 | and a friend was sitting at the table next to me
02:14:20.400 | and he holds up the scenarios
02:14:22.760 | that they're gonna have us discuss at this workshop
02:14:25.300 | and he's like, "Hey, these look really familiar," right?
02:14:28.180 | They were the, you know, scenarios from the report
02:14:31.040 | but there were only five instead of six.
02:14:34.040 | - Interesting.
02:14:35.160 | - The sixth scenario, which was the upscaling labor,
02:14:38.200 | good for workers scenario, wasn't discussed.
02:14:42.480 | - So to clarify, that's the integral piece of technology
02:14:45.700 | there is platooning.
02:14:47.440 | - Yeah, I mean, in a sense it's platooning,
02:14:50.040 | but, and in fairness, right, as I pitched that idea
02:14:54.960 | or sort of ran that idea by the computer scientists
02:14:58.680 | and engineers and product managers that I would talk to,
02:15:01.560 | they would say, you know, we thought about that
02:15:05.360 | but that following truck, it's not that simple.
02:15:09.320 | You know, that thing, basically we had to engineer that
02:15:12.640 | to be capable of independent self-driving
02:15:15.960 | because what if there was a cut in
02:15:17.640 | or, you know, any number of scenarios
02:15:19.600 | in which it lost that connection to the lead truck
02:15:23.880 | for whatever reason.
02:15:25.260 | Now, I mean, I don't know--
02:15:26.320 | - Boo hoo, platooning is hard.
02:15:29.720 | There's edge cases, I guarantee the number of edge cases
02:15:32.900 | in platooning is orders of magnitude lower
02:15:35.400 | than the number of edge cases
02:15:37.360 | in the general solo full self-drive.
02:15:40.640 | You do not need to solve the full self-driving problem.
02:15:43.840 | I mean, if you're talking about
02:15:46.280 | probability of dangerous events,
02:15:49.320 | it just seems with platooning,
02:15:50.880 | like you can deal with cut-ins.
02:15:54.760 | - Yeah, so this is, you know, this is beyond,
02:15:56.800 | this is one of the challenge obviously of being a researcher
02:15:59.160 | who, you know, doesn't really have any background
02:16:02.120 | in the technology, right?
02:16:04.480 | So I can dream this up, you know,
02:16:06.280 | I have no idea if it's feasible.
02:16:08.120 | - Well, let me speak, you spoke to the PhDs in economics,
02:16:10.680 | let me speak to the PhDs in computer science.
02:16:12.920 | If you think platooning is as hard
02:16:14.560 | as the full self-driving problem,
02:16:16.200 | we need to talk 'cause I think that's ridiculous.
02:16:20.360 | I think platooning, and in fact,
02:16:22.800 | I think platooning is an interesting idea
02:16:24.480 | for ride sharing as well,
02:16:26.680 | for the general autonomous driving problem,
02:16:28.400 | not just trucking, but obviously trucking
02:16:30.360 | is the big, big benefit
02:16:32.120 | because the number of A to B points in trucking
02:16:35.360 | is much, much lower than the general ride sharing problem.
02:16:38.120 | But anyway, I think it's a great idea,
02:16:40.440 | but you're saying it was removed.
02:16:42.680 | - Yeah, and so you can go, you know,
02:16:45.000 | and listeners could go to these reports,
02:16:47.200 | they're publicly available,
02:16:48.720 | and they explain why in the footnote.
02:16:51.560 | And, you know, they note that there was this other scenario
02:16:54.960 | suggested by at least me,
02:16:56.120 | and I can remember they said someone else did too,
02:16:58.980 | but they said, you know, we didn't include it
02:17:01.920 | because no developers were working on it.
02:17:05.000 | - Interesting.
02:17:05.920 | - Full disclosure, that was the approach
02:17:08.600 | that I took in my research, right?
02:17:11.120 | Which was to go to the developers and say,
02:17:13.400 | what's your vision, right?
02:17:14.920 | What are you trying to develop?
02:17:17.000 | That's what I was trying to do,
02:17:19.080 | and maybe, you know, and then I tried to think
02:17:20.960 | outside the box at the end by adding that one, right?
02:17:23.320 | Like here's one that I have, you know,
02:17:24.480 | people aren't talking about that could be cool.
02:17:26.000 | Now, again, it had been proposed in like 2014
02:17:29.160 | for like fuel convoys.
02:17:31.360 | So you could just have like one super armored lead fuel
02:17:35.040 | truck, right, you know, bringing fuel
02:17:36.960 | to forward operating bases in Afghanistan.
02:17:39.200 | And then you wouldn't need, you know, the super heavy,
02:17:41.960 | you know, you wouldn't have to protect the human life
02:17:43.280 | in the following truck.
02:17:44.120 | - So that's interesting, you're saying like,
02:17:45.440 | when you talk to Waymo,
02:17:46.560 | when you talk to these kinds of companies,
02:17:48.520 | they weren't at least openly saying they're working on this.
02:17:52.100 | So then it doesn't make sense to include it in the list.
02:17:56.100 | - Yeah, and so, but here's the thing, right?
02:17:58.640 | This is the Department of Transportation, right?
02:18:01.040 | And the Department of Labor.
02:18:03.120 | Maybe they could consider some scenarios.
02:18:04.740 | Like maybe we could say, you know,
02:18:06.760 | this technology has got a lot of potential.
02:18:08.960 | Here's what we'd like it to do.
02:18:10.600 | You know, we'd like it to reduce highway deaths,
02:18:12.680 | help us fight climate change, reduce congestion,
02:18:14.880 | you know, all these other things.
02:18:16.720 | But that's not how our policy conversation
02:18:19.080 | around technology is happening.
02:18:20.520 | We're not, and people don't think that we should.
02:18:24.680 | And I think that's the fundamental shift
02:18:26.440 | that we need to have, right?
02:18:27.840 | - I've been involved with this a little bit,
02:18:29.040 | like NHTSA and DOT.
02:18:30.820 | The approach they took is saying,
02:18:33.360 | we don't know what the heck we're doing,
02:18:34.880 | so we're going to just let the innovators do their thing
02:18:38.320 | and not regulate it for a while, just to see.
02:18:41.360 | You don't, you think that's,
02:18:42.780 | you think DOT should provide ideas themselves?
02:18:46.520 | - Well, so this is the great trick
02:18:49.920 | in policy of private actors,
02:18:53.680 | is you get narrow mandates for government agencies, right?
02:18:58.720 | So, you know, the safety case will be handled
02:19:01.960 | by organizations whose mandate is safety.
02:19:04.840 | So the Federal Motor Carrier Safety Administration,
02:19:07.840 | who is, you know, going to be a key player,
02:19:11.240 | I argue in an article that I wrote, you know,
02:19:13.600 | they're going to be a key player
02:19:14.560 | in actually determining which scenario is most profitable
02:19:18.040 | by setting the rules for truck drivers.
02:19:20.200 | Their mandate is safety, right?
02:19:22.440 | Now, they have lots of good people there who want,
02:19:25.120 | you know, who care about truck drivers
02:19:26.520 | and who wish truck drivers' jobs were better,
02:19:29.400 | but they don't have the authority to say,
02:19:32.840 | "Hey, we're going to write this rule
02:19:34.040 | 'cause it's good for truck drivers," right?
02:19:35.680 | And so when you, you know, we need to say,
02:19:40.360 | you know, as a society, we need to not restrict technology,
02:19:44.080 | not stand in the way of things,
02:19:45.680 | we need to harness it towards the goals that matter, right?
02:19:48.920 | Not whatever comes out the end of the pipeline
02:19:52.120 | because it's the easiest thing to develop
02:19:53.880 | or whatever is most profitable for the first actor
02:19:57.120 | or whatever, but, you know, and we do,
02:19:58.760 | the thing is we do that, right?
02:20:00.760 | I mean, like when we sent people to the moon,
02:20:04.280 | you know, we did that,
02:20:06.160 | and there were tremendous benefits
02:20:07.840 | that followed from it, right?
02:20:09.400 | And we do this all the time in, you know,
02:20:11.440 | trying to cure cancer or whatever it is, right?
02:20:13.640 | I mean, we can do this, right?
02:20:17.240 | Now, the interesting sort of epilogue to that story
02:20:21.160 | is, you know, six months or so,
02:20:24.320 | I don't know how long it was, after those meetings
02:20:26.840 | in which that sixth scenario was not considered,
02:20:29.800 | a company called Locomation, you know,
02:20:34.120 | ends up using that, essentially that basic scenario
02:20:38.840 | with a slight variation.
02:20:39.960 | So they leave the human driver in both trucks,
02:20:43.840 | and then that following driver goes off duty,
02:20:46.160 | and then, you know, I've been trying to think
02:20:50.080 | of what the term is, they kind of,
02:20:51.040 | I think of it as like slingshotting,
02:20:52.960 | they sort of, when one runs out of hours,
02:20:54.560 | you know, the one who's off duty goes in front,
02:20:56.320 | and, you know, and so, you know,
02:20:59.360 | if only they had been, you know,
02:21:02.280 | around six months earlier,
02:21:04.480 | it might have been considered by the DOT,
02:21:06.600 | but it just says, you know,
02:21:07.440 | who has the authority to propose
02:21:09.320 | what these visions of the future are?
02:21:10.920 | - Well, some of it is also just the company stepping up
02:21:13.840 | and just doing it, screw the authority,
02:21:16.400 | and showing that it's possible,
02:21:18.120 | and then the authority follows.
02:21:19.760 | So that's why I really love innovators in the space.
02:21:24.760 | The criticism I have, the very sort of real,
02:21:29.520 | I don't know, harsh criticism I have
02:21:31.120 | towards autonomous vehicle companies in the space
02:21:34.480 | is they've gotten culturally,
02:21:37.220 | they've, it's become acceptable somehow
02:21:41.440 | to do demos and videos,
02:21:45.440 | as opposed to the old school American way
02:21:48.840 | of solving problems.
02:21:50.640 | There's a culture in Silicon Valley
02:21:53.300 | where you're talking to VCs
02:21:54.880 | that have lost that kind of love of solving problems.
02:22:01.180 | They kind of like envision,
02:22:03.760 | if the story you told me in your PowerPoint presentation
02:22:07.420 | is true, how many trillions of dollars
02:22:09.440 | might I be able to make?
02:22:10.760 | There's something lost in that conversation
02:22:13.280 | where you're not really taking on, like,
02:22:16.280 | the problem in a real way.
02:22:18.080 | So these autonomous vehicle companies realize
02:22:20.180 | we don't need to, we just need to make
02:22:22.020 | nice PowerPoint presentations
02:22:24.200 | and not actually deliver products
02:22:26.080 | that, like, everybody looks outside and says,
02:22:29.400 | holy shit, this is life-changing.
02:22:32.040 | This is where I have to give props to Waymo
02:22:34.240 | is they put driverless cars on the road
02:22:38.000 | and, like, forget PowerPoint slide presentations,
02:22:41.520 | actual cars on the road.
02:22:42.600 | Then you can criticize, like,
02:22:44.020 | is that actually going to work?
02:22:45.620 | Who knows, but the thing is they have cars on the road.
02:22:48.380 | That's why I have to give props to Tesla.
02:22:49.860 | They have whatever you want to say
02:22:51.820 | about risk and all those kinds of things.
02:22:54.260 | They have cars on the road
02:22:55.740 | that have some level of automation,
02:22:57.420 | and soon they have trucks on the road as well.
02:23:00.340 | And that kind of, that component,
02:23:03.980 | I think, is an important part of the policy conversation
02:23:06.820 | 'cause you start getting data from these companies
02:23:10.360 | that are willing to take the big risks
02:23:12.180 | as opposed to making slide decks.
02:23:14.420 | They're actually putting cars on the road
02:23:16.660 | and, like, real lives are at stake.
02:23:19.380 | They could be lost and they could bankrupt the company
02:23:22.020 | if they make the wrong decisions,
02:23:23.540 | and that's deeply admirable to me.
02:23:25.940 | Speaking of which, I have to ask Waymo Trucks.
02:23:28.980 | I think it's called Waymo Via.
02:23:31.060 | So I'm talking to the head of trucking at Waymo.
02:23:34.700 | I don't know if you've gotten a chance
02:23:35.660 | to interact with him.
02:23:37.420 | What's a good question to ask the guy?
02:23:39.540 | What's a good question of Waymo?
02:23:41.380 | Because they seem to be one of the leaders in the space.
02:23:45.100 | They have the zen-like calm of, like,
02:23:48.300 | being willing to stick with it for the long-term
02:23:51.580 | in order to solve the problem.
02:23:53.380 | - Yeah, and I guess they have that luxury, right?
02:23:55.900 | Which I don't think I,
02:23:58.460 | if I had another life as a researcher,
02:24:01.900 | I would love to just study the business strategies
02:24:05.420 | of startups and Silicon Valley sort of structure.
02:24:10.020 | Would you consider Waymo a startup?
02:24:12.180 | - No. - No.
02:24:13.620 | - No, right?
02:24:14.460 | I mean, it's at least not in the things
02:24:16.260 | that seem to matter in the self-driving space.
02:24:18.940 | So you mentioned the demos.
02:24:20.700 | And I don't have enough data as a sociologist
02:24:24.340 | to really say like, oh, this is why they do what they do.
02:24:27.260 | But my hypothesis is there's a real scarcity of talent
02:24:31.620 | and money for this, and there certainly was a scarcity
02:24:34.980 | of partnerships with OEMs and the big trucking companies,
02:24:39.900 | and there was a race for it, right?
02:24:42.020 | And the way that if you don't have the backing of Alphabet,
02:24:47.020 | you do a demo, right?
02:24:49.820 | And you get a few more good engineers who say,
02:24:52.060 | hey, look, they did that cool thing,
02:24:54.060 | like Anthony Levandowski did with Otto,
02:24:56.700 | and that resulted in the Uber purchase of that program.
02:25:00.140 | So what would I ask?
02:25:03.220 | I mean, I think I would ask a lot of questions,
02:25:06.580 | but I think the markets-- - Well, there's also
02:25:07.780 | on-record and off-record conversations,
02:25:09.820 | which unfortunately, I'm asking for
02:25:12.180 | an on-record conversation.
02:25:14.420 | And that, I don't know if these companies
02:25:18.700 | are willing to have interesting on-record conversations.
02:25:21.820 | - Yeah, I mean, I assume that, like, there are questions
02:25:24.780 | that I don't think you'd have to ask.
02:25:26.180 | Like, I assume they're gonna be actually driverless, right?
02:25:28.620 | They're not gonna like keep the driver in there.
02:25:31.260 | So, I mean, for the industry, I think it would be interesting
02:25:34.900 | to know where they see that first adopter, right?
02:25:39.420 | - Oh, you mean from like the scenarios they laid out,
02:25:42.100 | which one are they going to take on?
02:25:44.140 | - Yeah, I mean, 'cause that's gonna, again,
02:25:46.220 | it's those really expensive good jobs, right?
02:25:48.580 | So those LTL jobs, the like UPS jobs.
02:25:51.460 | Now that's gonna be, that's where labor is too, right?
02:25:54.140 | That's where the Teamsters are.
02:25:55.060 | That's the only place they are left, right?
02:25:57.300 | So that's gonna be the big fight on the hill,
02:26:00.540 | and if labor can muster it, right?
02:26:03.380 | I don't know.
02:26:05.020 | There's a really cool,
02:26:06.580 | one thing I would recommend to you and your listeners,
02:26:10.380 | if you really wanna see some, like a remarkable page
02:26:13.220 | in sort of the history of labor and automation,
02:26:15.900 | there's a report that Harry Bridges,
02:26:18.820 | who was the socialist leader of the Longshoremen
02:26:23.660 | on the West Coast, and just galvanized that union,
02:26:26.340 | and they still control the ports today
02:26:28.400 | because of the sort of vision that he laid down.
02:26:31.900 | In the 1960s, he put out a photo journal report
02:26:35.820 | called "Men and Machines," and basically what it was
02:26:39.260 | was it was an internal education campaign
02:26:42.100 | to convince the membership
02:26:44.060 | that they had to go along with automation.
02:26:46.740 | Machines were coming for their jobs,
02:26:48.300 | and what the photo journal, it's almost like 100 pages
02:26:51.220 | or something like that, is like,
02:26:52.580 | here's how we used to do it.
02:26:54.120 | Some of you old timers remember it.
02:26:56.180 | Like, we used to take the barrels of olive oil,
02:26:58.660 | and we'd stack 'em in the hold, and we'd roll 'em by hand,
02:27:01.540 | and we'd put the timber in,
02:27:02.700 | and we'd stack the crates tight, you know?
02:27:05.660 | And that was the pride of the Longshoremen,
02:27:07.660 | was a tight stow.
02:27:10.020 | And now you all know there are cranes that come down,
02:27:13.340 | and there's no longer any rope slings,
02:27:15.580 | and we're loading bulldozers into the hold
02:27:17.780 | to push the ore up into piles,
02:27:19.660 | and then clamshells are coming down.
02:27:21.900 | And he made this case to them, and he said,
02:27:25.220 | this is why we're signing this agreement,
02:27:27.860 | to basically allow the employer to automate.
02:27:32.860 | And we're gonna lose jobs,
02:27:34.780 | but we're gonna get a share of the benefits.
02:27:37.700 | And so our wages are gonna go up,
02:27:39.340 | we're gonna continue to control the hiring
02:27:41.140 | and training of workers, our numbers are gonna go down,
02:27:44.100 | but basically, that last son of a bitch
02:27:46.540 | who's working at the ports,
02:27:48.100 | is gonna be one really well-paid son of a bitch, you know?
02:27:52.060 | It may just be one standing, but he's gonna love his job.
02:27:55.620 | You should check out that report.
02:27:57.740 | - That's an interesting vision of a future
02:27:59.700 | that probably still holds.
02:28:01.940 | That is, I mean, there is some level
02:28:04.180 | to which you have to embrace the automation.
02:28:07.100 | - Yeah, I mean, and who gets, you know,
02:28:08.580 | it's the benefits, right?
02:28:09.540 | It's like, I mean, think of the public dollars
02:28:12.060 | that went into developing self-driving vehicles
02:28:14.020 | in the early days, right?
02:28:14.900 | Not just the vision of it, right?
02:28:16.260 | Which was a public vision to, you know,
02:28:18.860 | take soldiers out of harm's way.
02:28:20.500 | But, you know, a lot of money.
02:28:24.780 | - And there's some way, if you are a business
02:28:27.340 | that's leveraging that technology,
02:28:29.660 | from a broad, historical, ethical perspective,
02:28:33.820 | you do owe it to the bigger community to pay back,
02:28:38.820 | like for all the investment that was paid
02:28:44.940 | to make that technology a reality.
02:28:47.180 | In some sense, I don't know how to make that right, right?
02:28:50.940 | On one, there's pure capitalism,
02:28:54.300 | and then there's communism,
02:28:56.220 | and I'm not sure how to get that balance right.
02:29:01.220 | - You know, I don't have all the answers in here,
02:29:06.260 | you know, and I wouldn't expect, you know,
02:29:09.100 | individual private companies to kind of kick back, right?
02:29:11.940 | That's, capitalism doesn't allow that, right?
02:29:14.100 | Unless you have a huge monopoly, right?
02:29:15.820 | And then you can, on the backside,
02:29:17.540 | create music halls and libraries and things like that.
02:29:20.300 | But, you know, here's what I think, you know,
02:29:23.500 | the basic obligation is, is, you know, come to the table,
02:29:28.500 | like, and have an honest conversation with the policy makers,
02:29:33.380 | with the truck drivers, you know,
02:29:35.820 | with the communities that are at risk.
02:29:37.940 | Like, at least let's talk about these things, you know,
02:29:41.900 | in a way that doesn't look like
02:29:43.500 | the way lobbying works right now,
02:29:45.340 | where you send a well-paid lobbyist to the Hill
02:29:49.140 | to, you know, convince some representative or senator
02:29:52.580 | to stick a sentence or two in that favors you into the,
02:29:55.300 | like, let's have a real conversation.
02:29:57.780 | - Real human conversation.
02:29:58.620 | - Can we just do that?
02:29:59.460 | - Yeah, don't play games.
02:30:01.420 | Real, real human conversation.
02:30:03.220 | Let me ask you, you mentioned Autopilot,
02:30:06.500 | gotta ask you about Tesla,
02:30:08.420 | this renegade little company that seems to be,
02:30:11.260 | from my perspective, revolutionizing autonomous driving
02:30:13.820 | or semi-autonomous driving,
02:30:15.020 | or at least the problem of perception and control.
02:30:18.540 | They've got a semi on the way.
02:30:22.140 | They got a truck on the way.
02:30:24.300 | What are your thoughts about Tesla semi?
02:30:26.580 | - You know, and I did have
02:30:29.140 | some very preliminary conversations
02:30:31.180 | with policy folks there.
02:30:35.020 | You know, nothing really in the tech
02:30:37.260 | or business side of it too much.
02:30:39.980 | And here's why.
02:30:40.980 | I think because electrification and autonomy
02:30:43.900 | run in opposite directions.
02:30:45.300 | And I just, you know, I don't see the application,
02:30:49.980 | the value in self-driving for the truck
02:30:52.580 | that Tesla's gonna produce in the near term.
02:30:55.500 | You know, they're just,
02:30:56.420 | you're not gonna have the battery.
02:30:58.820 | And now you could have wonderful safety systems
02:31:01.220 | and reinforcing the auto,
02:31:04.420 | self-driving features supporting a skilled driver,
02:31:08.900 | but you're not gonna be able to pull that driver out
02:31:11.140 | for long stretches the way that you are
02:31:12.940 | with driverless trucks.
02:31:14.260 | - So do you think, I mean, the reasons,
02:31:18.500 | so yeah, the electrification
02:31:20.980 | is not obviously coupled with the automation.
02:31:24.700 | They have a very interesting approach
02:31:29.820 | to semi-autonomous pushing towards autonomous driving.
02:31:34.820 | Right, it's very unique.
02:31:36.740 | No LIDAR, now no radar.
02:31:41.220 | It's computer vision alone from a large,
02:31:44.220 | they're collecting huge amounts of data from a large fleet.
02:31:47.060 | It's an interesting, unique approach,
02:31:49.460 | bold and fearless in this direction.
02:31:51.740 | If I were to guess whether this approach would work,
02:31:55.260 | I would say no, at the start.
02:31:58.240 | One, you would need a lot of data
02:32:01.460 | and two, because you have actual cars deployed on the road
02:32:05.100 | using a beta version of this product,
02:32:07.940 | you're going to have a system that's far less safe
02:32:11.980 | and you're going to run into trouble.
02:32:13.340 | It's horrible PR.
02:32:15.100 | Like it just seems like a nightmare,
02:32:17.580 | but it seems to not be the case, at least up to this point.
02:32:20.640 | It seems to be not, you know, on par, if not safer,
02:32:25.640 | and it seems to work really well.
02:32:27.760 | And the human factor somehow manages,
02:32:32.020 | like drivers still pay attention.
02:32:33.820 | Now there's a selection of who is inside
02:32:36.660 | the Tesla Autopilot user base, right?
02:32:39.820 | There could be a self-selection mechanism there,
02:32:42.140 | but however it works,
02:32:43.820 | these things are not running off the road all the time.
02:32:47.300 | So it's very interesting
02:32:49.100 | whether that can sort of creep into the trucking space.
02:32:52.080 | Yes, at first, the long haul problem is not solved.
02:32:57.940 | They need to charge, but maybe you can solve, you know,
02:33:01.260 | a lot of your scenarios involved small distances
02:33:06.260 | and, you know, that last mile aspect,
02:33:10.220 | which is exactly what Tesla is trying to solve
02:33:12.140 | for the regular passenger vehicle space
02:33:17.140 | is the city driving.
02:33:20.420 | It's possible that you have these trucks.
02:33:22.620 | It's almost like, yeah, you solve the last mile delivery
02:33:27.620 | part of some of the scenarios that you mentioned
02:33:31.260 | in autonomous driving space.
02:33:32.780 | Is that, do you think that's from the people
02:33:35.500 | you've spoken with too difficult of a problem?
02:33:37.580 | - The thing that, you know, keeps me so interested
02:33:41.540 | in this space and thinking that it's so important,
02:33:43.580 | you know, is again, that efficiency question,
02:33:46.920 | that safety question,
02:33:48.180 | and the way that these economics can push us potentially,
02:33:52.820 | you know, toward a more efficient system.
02:33:54.780 | So I wanna see those Tesla electric trucks
02:33:57.500 | running out to those truck ports
02:33:59.260 | where you've got those two, you know,
02:34:02.300 | two trucks with a human driver in front, right?
02:34:05.620 | You know, I think that's,
02:34:07.060 | now what's powering those, is it hydrogen?
02:34:09.300 | You know, I mean, I don't, you know,
02:34:11.260 | again, it's very interesting as a researcher
02:34:13.060 | who does not have a background in technology
02:34:14.580 | and doesn't have a horse, you know, in this race.
02:34:18.460 | I mean, you know, for all I know,
02:34:20.620 | self-driving trucks will ultimately be achieved
02:34:22.740 | by some biomechanical sensor that uses echolocation
02:34:26.540 | 'cause we took stem cells of bats.
02:34:28.420 | And, you know, I mean, I don't, you know,
02:34:30.060 | I don't, I am completely unable to assess who's,
02:34:35.060 | you know, who's ahead or who's behind or who makes sense.
02:34:37.780 | But I think one key component there,
02:34:39.900 | and this is what I see with Tesla often,
02:34:42.540 | and it's quite sad to me
02:34:44.460 | that other companies don't do this enough,
02:34:47.380 | is that first principles thinking.
02:34:49.540 | Like, wait, wait, wait, okay.
02:34:51.140 | It's looking at the inefficiencies as opposed to,
02:34:54.100 | I worked with quite a few car companies
02:34:57.020 | and they basically have a lot of meetings.
02:35:00.260 | There's a lot of meetings.
02:35:01.780 | And the discussion is like,
02:35:03.260 | how can we make this cheaper, this cheaper,
02:35:05.140 | this cheaper, this component cheaper,
02:35:06.300 | this cheaper, the cheapification of everything,
02:35:08.700 | just like you said, as opposed to saying,
02:35:11.100 | wait a minute, let's step back.
02:35:13.160 | Let's look at the entirety of the inefficiencies
02:35:15.740 | in the system.
02:35:17.240 | Like, why have we been doing this like this
02:35:19.220 | for the last few decades?
02:35:20.920 | Like, start from scratch.
02:35:22.620 | Can this be 10X, 100X cheaper?
02:35:25.420 | Like, if we not just decrease the cost of one component here,
02:35:30.420 | this component here, or this component here,
02:35:33.620 | but like, let's like redesign everything.
02:35:37.900 | Let's infrastructure, let's have special lanes.
02:35:42.900 | Or in terms of truck ports,
02:35:45.180 | as opposed to having regular human-controlled truck ports,
02:35:47.540 | have some kind of weird like sensors,
02:35:51.700 | like where everything about the truck connecting
02:35:56.020 | at that final destination is automated fully
02:35:58.780 | from the ground up.
02:35:59.780 | You build the facility from the ground up
02:36:01.620 | for the autonomous truck.
02:36:03.440 | All those kinds of sort of questions are platooning.
02:36:06.820 | Let's say, wait a minute, okay.
02:36:09.020 | I know we think platooning is hard,
02:36:11.260 | but can we think through exactly why it's hard
02:36:14.380 | and can we actually solve it?
02:36:15.980 | Like, if we collect a huge amount of data, can we solve it?
02:36:19.120 | And then teleoperation, like, okay, yeah, yeah,
02:36:23.380 | it's difficult to have good signal,
02:36:25.460 | but can we actually, can we have,
02:36:27.740 | can we consider the probability of those edge cases
02:36:31.300 | and what to do in the edge cases
02:36:32.420 | when the teleoperation fails?
02:36:34.240 | Like, how difficult is this?
02:36:35.320 | What are the costs?
02:36:36.320 | How do we actually construct a teleoperation center
02:36:39.600 | full of humans that are able to pay attention
02:36:41.780 | to a large fleet where the average number of vehicles
02:36:44.720 | per human is like 10 or a hundred?
02:36:47.060 | You know, like having that conversation
02:36:50.040 | as opposed to kind of having, you know,
02:36:52.140 | you show up to work and say, all right,
02:36:54.360 | it seems like, you know, because of COVID,
02:36:58.600 | we, you know, are not making as much money.
02:37:00.580 | Can we have a cheaper,
02:37:02.260 | can we give less salary to the trucker?
02:37:04.680 | And can we build, like,
02:37:07.980 | decrease the cost or decrease the frequency
02:37:12.800 | at which we buy new trucks?
02:37:14.600 | And when we do buy new trucks, make them cheaper
02:37:17.160 | by making them crappier, like this kind of discussion.
02:37:20.160 | This is why, to me, it's like Tesla's like rare in this.
02:37:23.280 | And there's some sectors in which innovation
02:37:26.220 | is part of the culture.
02:37:27.600 | In the automotive sector, for some reason,
02:37:29.280 | it's not as much.
02:37:31.220 | This is obviously the problem that Ford and GM
02:37:33.800 | are struggling with.
02:37:34.640 | It's like, they're really good at making cars at scale cheap
02:37:39.360 | and they're like legit good, like Toyota at this.
02:37:42.600 | They're some of the greatest manufacturing people
02:37:44.520 | in the world, right?
02:37:45.360 | - That's incredible.
02:37:46.180 | - But then when it comes to hiring software people,
02:37:48.860 | they're horrible.
02:37:49.700 | So it's culture.
02:37:52.960 | And then it's such a difficult thing
02:37:55.200 | for them to sort of embrace,
02:37:57.080 | but greatness requires that they embrace this,
02:38:01.060 | embrace whatever is required
02:38:03.140 | to remove the inefficiency from the system.
02:38:05.020 | And that may require you to do things very differently
02:38:07.620 | than you've done in the past.
02:38:09.460 | - Yeah, I mean, there are certain things
02:38:12.060 | that the market can do well.
02:38:13.380 | This is how I see the world, right?
02:38:15.520 | And that's the best way to organize
02:38:20.280 | certain kinds of activities,
02:38:21.480 | is the market and private interest.
02:38:24.320 | But I think we go too far in some areas.
02:38:28.860 | Transportation is, if we can't have a public debate
02:38:33.860 | about the roads that we all pay for,
02:38:37.720 | forget about it, private factories
02:38:40.900 | and all these other, healthcare and other places,
02:38:43.120 | it's gonna be way harder there.
02:38:45.360 | Healthcare, I guess, has some direct contact
02:38:49.680 | with the consumer where we're probably gonna have lots
02:38:51.780 | of sort of hands-on public policy
02:38:54.760 | about concerns around patient rights and things like that.
02:38:57.880 | But if we can't figure out how to have
02:39:00.860 | a public policy conversation around how technology
02:39:04.080 | is gonna reform our public roadways
02:39:07.080 | and our transportation system,
02:39:08.820 | we're really leaving way too much to private companies.
02:39:14.040 | It's just, it's not in their,
02:39:17.480 | I get asked this question, like, what should companies do?
02:39:19.640 | And I'm like, just go about doing what you're doing.
02:39:22.440 | I mean, please come to the table and talk about it,
02:39:24.460 | but it's not their role.
02:39:26.440 | I mean, I appreciate Elon's attempts
02:39:30.000 | to have species-level goals,
02:39:34.760 | like, oh, we're gonna go to Mars.
02:39:36.440 | I mean, that's amazing, and that's incredible
02:39:38.960 | that someone can realize that,
02:39:42.960 | have a chance at realizing that vision.
02:39:44.680 | It's amazing, right?
02:39:46.040 | But when it comes to so many areas of our economy,
02:39:50.200 | we can't wait for a hero.
02:39:51.560 | We have to have, and there are way too many
02:39:54.920 | interests involved.
02:39:56.360 | You know, it's who builds the roads, who, you know,
02:39:58.480 | I mean, the money that sloshes around on Capitol Hill
02:40:02.160 | to decide what happens in these infrastructure bills
02:40:05.480 | and the transportation bill is just obscene, right?
02:40:09.120 | - See, I think, this is an interesting view of markets.
02:40:13.000 | Correct me if I'm wrong, let me propose a theory to you,
02:40:17.040 | that progress in the world is made by heroes,
02:40:22.040 | and the markets remove the inefficiencies
02:40:24.080 | from the work the heroes did.
02:40:26.240 | So going to Mars from the perspective of markets
02:40:30.480 | probably has no value.
02:40:32.280 | Maybe you can argue it's good for hiring
02:40:34.120 | to have a vision or something like that,
02:40:35.920 | but like those big projects don't seem
02:40:38.560 | to have an obvious value, but world,
02:40:42.280 | our world progresses by those big leaps.
02:40:46.080 | And then as, after the leaps are taken,
02:40:49.200 | then the markets are very good at removing
02:40:52.000 | sort of inefficiencies, but it just feels like
02:40:54.920 | the autonomous vehicle space
02:40:56.480 | and the autonomous trucking space requires leaps.
02:40:59.840 | It doesn't feel like we can sneak up into a good solution
02:41:04.400 | that is ultimately good for labor,
02:41:06.240 | like for human beings in the system.
02:41:08.480 | It feels like some, like, probably a bad example,
02:41:13.480 | but like a Henry Ford type of character steps in
02:41:16.360 | and say like, we need to do stuff completely differently.
02:41:21.000 | - Yeah, and you said we can't hope for a hero,
02:41:24.840 | but it's like, no, but we can say we need a hero.
02:41:28.440 | We need more heroes.
02:41:29.920 | So if you're a young kid right now listening to this,
02:41:32.160 | we need you to be a hero.
02:41:33.880 | It's not like we need you to start a company
02:41:35.640 | that makes a lot of money, no.
02:41:37.360 | You need to start a company that makes a lot of money
02:41:39.520 | so that you can feed your family as you become a hero
02:41:42.960 | and take huge risks and potentially go bankrupt.
02:41:45.840 | Those risks is how we move society forward, I think,
02:41:49.800 | maybe as a romantic view, I don't know.
02:41:51.960 | - I totally disagree.
02:41:53.080 | - You disagree, goddammit.
02:41:54.560 | - I mean, I--
02:41:55.400 | - And out of the two of us, you're the knowledgeable one.
02:41:58.200 | - No, no, no, I think it's a matter of like,
02:42:02.560 | do we need those heroes?
02:42:03.560 | Absolutely.
02:42:04.640 | I mean, I saw the boosters come down from SpaceX's rockets
02:42:09.640 | and land nearly simultaneously with my kids
02:42:17.960 | after school one day.
02:42:19.800 | And I thought, oh my god, like this is,
02:42:22.840 | like science fiction has been made real.
02:42:25.560 | It's incredible.
02:42:26.560 | And it's a pinnacle of human achievement, right?
02:42:29.720 | It's like, this is what we're capable of.
02:42:32.560 | But we need to have that, those heroes oriented.
02:42:37.080 | We need to allow them, right, to orient toward the right,
02:42:40.940 | toward the goals, right?
02:42:42.920 | We got to, climate change, you know?
02:42:45.720 | I mean, all the heroes out there, right?
02:42:48.880 | I mean, it's time, the clock is ticking.
02:42:52.120 | It's past time.
02:42:53.920 | I've been working on climate change issues
02:42:55.800 | since the mid '90s.
02:42:58.680 | Like, I still remember the first time in 2010
02:43:03.680 | when I got a grant that was completely focused
02:43:08.600 | on adaptation rather than prevention.
02:43:12.080 | And just when it hit me, that like, wow.
02:43:17.080 | - So adaptation versus prevention is like acceptance
02:43:22.200 | that there's going to be catastrophic impact.
02:43:25.440 | We just need, we need to figure out
02:43:27.040 | how do we at least live with that.
02:43:28.680 | - Yeah, and you know, the grant was like,
02:43:30.360 | okay, our agriculture system is gonna move,
02:43:32.680 | our breadbasket is no longer gonna be California.
02:43:34.840 | It's gonna be Illinois.
02:43:36.400 | What does that mean for truck transportation?
02:43:38.520 | - So it's like, so in terms of a big philosophical,
02:43:42.000 | societal level, that's kind of like giving up.
02:43:44.600 | - Yeah.
02:43:45.440 | - In terms of the big heroic actions, yeah.
02:43:47.480 | - You know, failures in human history?
02:43:49.380 | Yeah, that's gonna be, let's hope not the biggest,
02:43:53.780 | but could be.
02:43:55.340 | Do you, so let me say why I disagree, right?
02:43:59.480 | Henry Ford, amazing, right?
02:44:02.540 | To sort of mass produce cars, right?
02:44:04.520 | Daimler, to put that first truck on the road
02:44:07.520 | without the roads, right?
02:44:09.240 | So there's like, we need that innovation.
02:44:11.480 | There's no doubt about it.
02:44:12.520 | And there are roles for that,
02:44:14.240 | but there's big public stuff
02:44:16.560 | that sets the stage that's critical.
02:44:20.240 | And you know, and what it really is,
02:44:22.440 | it's a sociological problem, right?
02:44:25.080 | It's a political problem.
02:44:26.040 | It's a social problem.
02:44:26.880 | We have to say, and we have these screwed up ideas, right?
02:44:29.900 | So we have this politics right now
02:44:31.920 | where like everybody feels like they're getting screwed
02:44:34.320 | and someone undeserving is benefiting.
02:44:38.000 | When in fact, like, at least in the middle, right?
02:44:40.240 | They're huge.
02:44:41.080 | I used to teach this course on rich and poor,
02:44:43.720 | on economic inequality.
02:44:45.120 | And I would go through public housing subsidies
02:44:49.120 | in Philadelphia, section eight subsidies.
02:44:52.660 | And then I would go through my housing subsidies
02:44:57.140 | for my mortgage interest deduction.
02:45:00.320 | And it worked out to basically the average payment
02:45:02.920 | for a section eight housing voucher in my neighborhood.
02:45:06.800 | I'm not a welfare recipient
02:45:08.040 | according to the dominant discourse.
02:45:10.200 | And so we have this completely screwed up sense
02:45:13.200 | of like where our dollars go
02:45:15.120 | and who benefits from the investment.
02:45:17.640 | And we need to, I don't know that we can do it,
02:45:21.880 | but if we're gonna survive,
02:45:24.700 | we need to figure out how to have honest conversations
02:45:28.920 | where private interest is where we need it to be
02:45:32.680 | in fostering innovation and rewarding the people
02:45:36.400 | who do incredible things.
02:45:37.640 | Please, we don't wanna squash that,
02:45:41.120 | but we need to harness that power to solve
02:45:43.480 | what I think are some pretty big existential problems.
02:45:47.520 | - So you think there's a like government level,
02:45:50.640 | national level collaboration required
02:45:53.200 | for infrastructure project?
02:45:54.480 | Like we should really have large moonshot projects
02:46:01.340 | that are funded by our governments.
02:46:04.860 | - At least guided by, I mean,
02:46:06.460 | I think there are ways to finance them
02:46:08.180 | and other things, but we gotta be careful, right?
02:46:10.900 | 'Cause that's where you get all these sort of perverse,
02:46:13.540 | unintended consequences and whatnot.
02:46:15.260 | But if you look at transportation in the United States
02:46:18.420 | and it is the foundation of the manifest destiny,
02:46:23.100 | economic growth, right?
02:46:24.900 | That built the United States into the world superpower
02:46:29.040 | that it became and the industrial power
02:46:30.580 | that it became, it rested on transportation, right?
02:46:33.580 | It was like the Erie Canal, I grew up a few miles
02:46:37.660 | from where they dug the first shovel full
02:46:39.700 | of the Erie Canal and everyone thought it was crazy, right?
02:46:44.100 | But those public infrastructure projects,
02:46:46.660 | the canals, right?
02:46:48.060 | The railroads, yeah, they were privately built,
02:46:50.100 | but they wouldn't have been privately built
02:46:51.800 | without Lincoln funding them essentially
02:46:55.020 | and giving the railroads land in exchange for building them.
02:47:00.860 | The highway system, the Eisenhower,
02:47:03.500 | the payback that the US economy got
02:47:06.740 | from the Dwight D. Eisenhower interstate system
02:47:09.700 | is phenomenal, right?
02:47:12.140 | No private entity was gonna do that,
02:47:14.020 | electrification, dams, water, you know,
02:47:16.740 | we need to do this infrastructure, infrastructure.
02:47:21.580 | - And now more than ever, it's been really upsetting to me
02:47:24.340 | on the COVID front.
02:47:25.820 | There's one of the solutions to COVID,
02:47:29.620 | which seems obvious to me from the very beginning
02:47:32.500 | that nobody's opposed to,
02:47:34.940 | it's one of the only bipartisan things is at-home testing,
02:47:39.820 | rapid at-home testing.
02:47:41.520 | There's no reason why at the government level,
02:47:45.260 | we couldn't manufacture hundreds of millions
02:47:47.100 | of tests a month.
02:47:48.800 | There's no reason starting in May, 2020.
02:47:51.260 | And that gives power to a country that values freedom,
02:47:55.220 | that gives power information to each individual
02:47:57.140 | to know whether they have COVID or not.
02:47:59.260 | So it's possible to manufacture them for under a dollar.
02:48:04.260 | It's like an obvious thing.
02:48:05.820 | It's kind of like the roads.
02:48:07.140 | It's like, everybody's invested.
02:48:08.980 | Let's put countless tests in the hands
02:48:11.580 | of every single American citizen,
02:48:13.500 | maybe every citizen of the world.
02:48:16.580 | The fact that we haven't done that to date,
02:48:19.500 | and there's some regulation stuff with the FDA,
02:48:21.860 | all the kind of dragging of feet,
02:48:24.380 | but there's not actually a good explanation
02:48:26.060 | except our leaders and culturally,
02:48:31.060 | we've lost the sort of, not lost,
02:48:35.960 | but it's a little bit dormant.
02:48:38.020 | The will to do these big projects that better the world.
02:48:42.100 | I still have the hope that when faced
02:48:45.980 | with catastrophic events,
02:48:50.620 | the more dramatic, the more damaging,
02:48:52.820 | the more painful they are,
02:48:53.860 | the higher we will rise to meet those.
02:48:56.740 | And that's where the infrastructure style projects
02:48:58.900 | are really important.
02:48:59.740 | But it's certainly a little bit challenging
02:49:03.860 | to remain an optimist in the times of COVID
02:49:06.620 | because the response of our leaders has not been as great
02:49:10.420 | and as historic as I would have hoped.
02:49:14.300 | I would hope that the actions of leaders
02:49:17.380 | in the past few years in response to COVID
02:49:19.740 | would be ones that are written in the history books.
02:49:23.140 | And we talk about it as we talk about FDR,
02:49:25.820 | but sadly, I don't know.
02:49:27.580 | I think the history books will forget
02:49:29.380 | the actions of our leaders.
02:49:32.320 | Let me just, to wrap up autonomy,
02:49:39.720 | when you look into the future,
02:49:43.940 | are you excited about automation in the space of trucking?
02:49:52.020 | Is it, when you go to bed at night,
02:49:55.560 | do you see a beautiful world in your vision
02:50:01.740 | that involves autonomous trucks?
02:50:03.220 | Like all of the truckers you've become close with,
02:50:07.020 | you've talked to, do you see a better world for them
02:50:10.340 | because of autonomous trucks?
02:50:11.780 | - Damn you, Lex.
02:50:14.140 | You know why?
02:50:15.980 | 'Cause I mean, I wanna be an optimist,
02:50:19.140 | and I wanna think of myself, I guess,
02:50:21.420 | as a glass-half-full kind of person.
02:50:23.780 | But when you ask it like that,
02:50:25.580 | and I think about,
02:50:26.920 | when I look at the challenges to harnessing that for,
02:50:35.620 | just let's take just labor and climate, right?
02:50:40.660 | There are other issues, congestion, et cetera,
02:50:42.500 | infrastructure that are gonna be affected by this,
02:50:45.500 | again, those big transformational issues.
02:50:49.660 | I think it's gonna take the best of us.
02:50:53.300 | Like it's gonna take the best of our policy approaches.
02:50:58.300 | It's gonna take, we need to start investing
02:51:01.100 | in rebuilding those institutions.
02:51:05.180 | I mean, that's what we've seen in the last four years, right?
02:51:07.700 | And the erosion of that was so clear
02:51:11.420 | among these truck drivers.
02:51:12.460 | Like when Trump came in and said,
02:51:17.420 | free trade's good for workers, like, yeah, right.
02:51:20.100 | I grew up in the Rust Belt.
02:51:23.620 | I watched factory after factory close.
02:51:25.720 | All of my ancestors worked at the same factory.
02:51:28.460 | It's still holding on by a thread.
02:51:30.340 | The Democratic Party told blue collar workers for years,
02:51:36.620 | don't worry about free trade, it's not bad for you.
02:51:39.660 | And I know the economists
02:51:40.740 | will probably get in the comment box now.
02:51:42.740 | - We'll look forward to your comments.
02:51:45.780 | - We'll look forward to your comments
02:51:46.760 | about how free trade benefits everybody.
02:51:48.760 | But immigration, you go,
02:51:54.220 | and I think immigration is great.
02:51:56.940 | The United States benefits from it tremendously, right?
02:52:00.020 | But there are costs, right?
02:52:01.900 | Go down to South Philadelphia and find a drywaller
02:52:06.060 | and tell him that immigration hasn't hurt him, right?
02:52:08.900 | Go to these places where there's competition, right?
02:52:13.460 | And yes, we benefit overall,
02:52:15.700 | but we have a system that allows some people
02:52:19.460 | to pay really high costs.
02:52:22.160 | And Trump tapped into that,
02:52:24.000 | there's more than that too, obviously.
02:52:28.700 | And there's lots of really dark stuff
02:52:31.080 | that goes along with it,
02:52:33.000 | the sort of racialization of others and things like that.
02:52:35.720 | But he hit on those core issues
02:52:39.600 | that if you were to go back over my trucking interviews
02:52:42.540 | for 15 years, you would have heard those stories
02:52:44.920 | over and over and over again,
02:52:46.160 | that sense of voicelessness,
02:52:47.840 | that sense of powerlessness,
02:52:49.200 | that sense that there's no difference
02:52:50.960 | between the Democrats and the Republicans
02:52:52.480 | 'cause they're all gonna screw us over.
02:52:54.920 | And that was there,
02:52:56.560 | and you could just ignore it as long as you want
02:52:58.400 | and tell people, don't worry, trade's good for you.
02:53:00.380 | Don't worry, immigration's good for you.
02:53:02.040 | As their communities lose factories,
02:53:04.240 | and I mean, a lot of them were lost to the South
02:53:06.080 | before they were lost to overseas, whatever,
02:53:08.120 | but tapped into that.
02:53:10.400 | And there's a fundamental distrust of,
02:53:13.760 | you look at these Pew polls on
02:53:16.480 | whether people trust the media,
02:53:17.760 | but whether or not they trust higher education,
02:53:20.060 | these institutions that I find magical.
02:53:24.240 | I mean, you look at the vaccine research and stuff,
02:53:28.280 | just brilliant people doing incredible things for humanity.
02:53:33.280 | The idea that we can take these viruses
02:53:38.600 | that used to ravage through the human population
02:53:42.120 | and that we had to be terrified of.
02:53:44.280 | And we've suffered, but we have such power now
02:53:49.280 | to defend ourselves behind these programs.
02:53:54.680 | And to see those, people will be like,
02:53:56.800 | hey, I'm not sure if higher education's
02:53:58.200 | good for the country or not.
02:53:59.480 | It's like, where are we?
02:54:02.200 | So we need to rebuild the faith and trust
02:54:04.200 | in those institutions and have these,
02:54:05.640 | but we need to have honest conversations
02:54:07.440 | before people are gonna buy it.
02:54:10.000 | - Do you have ideas for rebuilding the trust
02:59:12.000 | and giving a voice to the voiceless?
02:54:13.480 | So is the, many of the things we've been talking about
02:54:18.040 | is so sort of deeply integrated.
02:54:21.440 | You think like, this is the trouble I have
02:54:24.360 | with people that work on AI and autonomous vehicles
02:54:27.000 | and so on.
02:54:28.560 | It's not just a technology problem.
02:54:30.960 | It's this human pain problem.
02:54:35.960 | It's the robot essentially silencing
02:54:39.360 | the voice of a human being
02:54:41.200 | because it's lowering their wage,
02:54:43.120 | making them suffer more and giving them no tools
02:54:46.000 | of how to escape that suffering.
02:54:48.840 | Is there something, I mean,
02:54:52.720 | it even gets into the question of meaning.
02:54:55.560 | So if money is one thing,
02:54:57.080 | but it's also what makes us happy in life.
02:55:01.600 | A lot of those truckers,
02:55:06.320 | the set of jobs they've had in their life
02:55:08.320 | were defining to them as human beings.
02:55:10.880 | And so, and the question with automation
02:55:14.760 | is not just how do we have a job
02:55:18.880 | that gives you money to feed your family,
02:55:22.880 | but also a job that gives you meaning,
02:55:24.760 | that gives you pride.
02:55:26.040 | - Yeah.
02:55:26.880 | - And for me, the hope is that AI and automation
02:55:37.160 | will provide other jobs that will be a source of meaning.
02:55:40.880 | But coupled with that hope
02:55:46.240 | is that there will not be too much suffering
02:55:47.960 | in the transition.
02:55:49.400 | And that's not obvious from the people you've spoken with.
02:55:53.480 | - I mean, I think we need to differentiate
02:55:55.600 | between the effects of technology
02:55:57.000 | and the effects of capitalism, right?
02:55:58.560 | And they are, you know,
02:56:00.440 | the fact that workers don't have a lot of power, right?
02:56:05.120 | In the system matters.
02:56:06.720 | You know, we had a system, right?
02:56:08.160 | And that's why I would say, you know,
02:56:09.400 | go to that, you know, Harry Bridges report.
02:56:12.960 | And, you know, those were workers who had a sense of power.
02:56:16.760 | They said, you know what,
02:56:17.600 | we can demand some of the benefits.
02:56:19.360 | Like, yeah, automate our jobs away,
02:56:21.200 | but, you know, kick a little down to us, right?
02:56:24.320 | And we had in the golden era of American industrialism
02:56:28.880 | in post-World War II, that was the contract.
02:56:32.320 | The contract was employers can do what they want
02:56:35.200 | in automation and all these things.
02:56:37.000 | Yeah, sure, there's some union rules
02:56:38.400 | that make things, you know, less efficient in places.
02:56:40.960 | But the key compromise is tie wages to productivity.
02:56:45.440 | That's what we did.
02:56:46.280 | We tied, that's what unions did.
02:56:47.760 | They tied wages to productivity,
02:56:49.600 | kept demand up, right?
02:56:50.640 | It was good for the economy, some economists think, right?
02:56:54.200 | And that's what, you know, we need to,
02:56:57.520 | I think we need to acknowledge that.
02:56:59.840 | We need to acknowledge the fact
02:57:02.720 | that it's not just technology,
02:57:04.560 | it's technology in a social context
02:57:08.920 | in which some people have a lot of power
02:57:10.560 | to determine what happens.
02:57:12.640 | For me, I don't have all the answers,
02:57:14.440 | but I know what my answer is.
02:57:15.960 | And my answer is, and I think I started with this,
02:57:18.720 | you know, I can learn from every single person, you know.
02:57:23.720 | Did I have to talk to the 200th truck driver?
02:57:26.600 | In my opinion, yes,
02:57:30.080 | because I was gonna learn something
02:57:31.720 | from that 200th truck driver.
02:57:33.960 | Now, people with more power might talk to none,
02:57:38.960 | or they might talk to five and say, okay, I got it.
02:57:41.760 | You know, I, people are amazing,
02:57:46.760 | and every one of them has a life experience and concerns
02:57:49.920 | and, you know, can teach us something.
02:57:53.960 | And they're not in the conversation, you know.
02:57:57.200 | And I know this because I'm the expert, you know.
02:58:00.760 | So I get pulled into these conversations
02:58:02.760 | and people wanna know, you know,
02:58:04.280 | what's gonna happen to labor?
02:58:05.760 | You know, it's like, well,
02:58:07.120 | I tried, so I try to be a sounding board,
02:58:10.240 | and I feel a tremendous weight of responsibility,
02:58:14.040 | you know, for that.
02:58:16.280 | But I'm not those workers, you know.
02:58:21.680 | And they may listen to this or, you know,
02:58:24.960 | walk in the door sometime
02:58:26.080 | and be like, that guy's full of shit.
02:58:28.280 | That's not what I think at all, you know.
02:58:31.480 | And they don't get heard over and over and over again.
02:58:34.720 | - But in a small way, you are providing a voice to them.
02:58:36.920 | And that's kind of the,
02:58:38.080 | if at scale we apply that empathy and listening,
02:58:44.000 | then we could provide the voice to the voiceless
02:58:46.120 | through our votes, through our money, through,
02:58:48.000 | I mean, that's one way to make capitalism work
02:58:50.960 | at not making the powerless more powerless,
02:58:55.960 | is by all of us being a community
02:58:58.480 | that listens to the pain of others
02:58:59.920 | and tries to minimize that,
02:59:01.200 | to try to give a voice to the voiceless,
02:59:03.360 | give power to the powerless.
02:59:05.480 | I have to ask you on, by way of advice,
02:59:08.560 | young people, high school students, college students,
02:59:12.120 | entering this world full of automation,
02:59:15.120 | full of these complex labor markets and markets, period,
02:59:22.440 | what would you, what kind of advice
02:59:25.200 | would you give to that person about how to have a career?
02:59:29.000 | How to have a life they can be proud of?
02:59:31.320 | - Yeah, I think, this is such a great question.
02:59:35.360 | I don't, it's okay to quote Steve Jobs, right?
02:59:40.360 | (both laughing)
02:59:44.960 | - Always.
02:59:45.800 | - Yeah, I mean, so, and I just heard this recently,
02:59:49.560 | it was a commencement speech that he gave,
02:59:51.640 | and I can't remember where it was.
02:59:53.120 | And he was talking about,
02:59:54.280 | he had famously dropped out of school,
02:59:56.680 | but continued to take classes, right?
02:59:59.000 | And he took a calligraphy class,
03:00:02.120 | and it influenced the design of the Mac and sort of fonts,
03:00:06.000 | and just was something that he had no sense
03:00:10.160 | of what it was gonna be useful for.
03:00:11.360 | And his lesson was, you can't connect the dots
03:00:15.440 | looking forward.
03:00:16.960 | Looking back, you can see all the pieces
03:00:19.880 | that sort of led you to where you ended up.
03:00:22.480 | And for me, studying truck driving,
03:00:24.880 | like, I mean, I literally went to graduate school
03:00:27.120 | because I was worried about climate change,
03:00:28.880 | and I had a whole other dissertation planned,
03:00:31.280 | and then was driving home,
03:00:32.520 | and I had read about all this management literature,
03:00:35.800 | and sort of how you get workers to work hard
03:00:37.680 | for my qualifying exams,
03:00:38.960 | and then read a popular article
03:00:41.000 | on satellite-linked computers.
03:00:44.000 | And the story in the literature was,
03:00:45.880 | you sort of give workers a sense of autonomy,
03:00:47.120 | and I was like, well, that monitoring
03:00:49.920 | must affect the sense of autonomy.
03:00:51.280 | And it's just this question that I found interesting,
03:00:54.280 | and it never in a million years
03:00:55.440 | that I ever thought I was gonna spend 15 years
03:00:58.800 | of my life studying truck driving.
03:01:01.520 | And it was like, if you were to map out a career path
03:01:07.040 | in academia or research,
03:01:09.080 | you would do none of the things that I did
03:01:14.080 | that many people advised me against,
03:01:15.920 | where you can't go spend a year working as a truck driver.
03:01:19.960 | That's crazy, or you can't spend all this time
03:01:22.960 | trying to write one huge book.
03:01:25.080 | - By the way, if I could just interrupt,
03:01:28.120 | what was the fire that got you to take the leap
03:01:32.600 | and go and work as a truck driver,
03:01:34.600 | and go interview truck drivers?
03:01:36.880 | This is what a lot of people would be incapable of doing,
03:01:39.720 | just took that leap.
03:01:41.320 | What the heck is up with your mind
03:01:43.840 | that allowed you to take that big leap?
03:01:46.080 | - So I think it's probably Tolkien.
03:01:49.760 | (laughing)
03:01:52.760 | As a teenager, I sort of adopted some sense
03:01:57.240 | of needing to heroically go out in the world,
03:02:00.480 | which I've done at various points in my life,
03:02:04.120 | like looking back in absolutely stupid ways,
03:02:06.600 | where I could have completely
03:02:08.880 | ended up dead and traumatized my family,
03:02:11.240 | including I took a couple week trip in the Pacific,
03:02:14.760 | like a solo trip on a kayak.
03:02:16.320 | And basically my kayaking experience up 'til that point
03:02:19.600 | had been on a fairly calm lake
03:02:21.520 | and class one rapids on a river.
03:02:22.360 | - Solo trip on a kayak in the Pacific.
03:02:24.600 | - Yeah, yeah.
03:02:25.440 | So I was working on forestry issues,
03:02:28.600 | and we were starting a campaign
03:02:30.560 | up in really remote British Columbia.
03:02:33.120 | And I was like, okay, if I'm gonna work on this,
03:02:35.320 | I've gotta actually go there myself
03:02:36.760 | and see what this is all about
03:02:38.160 | and see whether it's worth devoting my life right now to.
03:02:42.520 | And just drove up there with this kayak
03:02:45.280 | and put into the Pacific and it was insane.
03:02:50.280 | The tides are huge.
03:02:52.120 | And there was one point in which I was going down a fjord
03:02:56.440 | and two fjords kinda came up and there was a cross channel.
03:03:00.520 | And I had hit the timing completely wrong.
03:03:03.380 | And the tide was sort of rushing up like rivers
03:03:06.440 | in these two fjords.
03:03:08.640 | And then coming through this cross channel
03:03:11.000 | and met and created this giant standing wave
03:03:14.360 | that I had to paddle through.
03:03:16.360 | And now, actually very recently,
03:03:18.480 | I've gone out on whitewater with some people
03:03:20.240 | who know what the hell they're doing.
03:03:21.760 | And I realized just how absolutely stupid
03:03:26.280 | and ill fit I was.
03:03:28.360 | But that's just, I think I've always had that.
03:03:31.160 | - Were you afraid when you had that wave before you?
03:03:33.520 | - That wave scared the shit out of me, yeah.
03:03:35.160 | - Okay, what about taking a leap and becoming a trucker?
03:03:39.120 | - Yeah, there was some nervousness for sure.
03:03:41.600 | And I guess my advantage as an ethnographer
03:03:45.240 | is I grew up in a blue collar environment.
03:03:49.280 | Again, all my ancestors were factory workers.
03:03:52.880 | So I can move through spaces.
03:03:56.240 | I can become comfortable in lots and lots of places,
03:04:03.280 | not everywhere, but along class lines
03:04:05.880 | for sort of white, even white ethnic workers.
03:04:10.760 | I can move in those spaces fairly easily.
03:04:13.120 | I mean, not entirely.
03:04:14.520 | There was one time where I was like, okay,
03:04:17.480 | and I grew up around people who worked on cars.
03:04:19.400 | I'd been to drag races and NASCAR.
03:04:21.200 | And I'd been to Colgate University.
03:04:25.360 | And I think that was probably my initial training
03:04:27.760 | was being this just working class kid
03:04:30.680 | who ends up in this sort of blue blood,
03:04:33.960 | small liberal arts college.
03:04:36.200 | And just feeling like,
03:04:37.640 | both having the entire world opened up to me,
03:04:41.960 | like philosophy and Buddhism
03:04:43.840 | and things that I had never heard of,
03:04:46.000 | and just became totally obsessed with
03:04:48.960 | and just following my interests.
03:04:52.400 | - But also culturally perhaps didn't feel like you fit in.
03:04:55.400 | - Feeling like just a fish out of water.
03:04:57.880 | But at the same time, that sort of drove me
03:05:01.680 | in the sense that it drove an opening of my mind
03:05:04.320 | because I couldn't understand it.
03:05:06.400 | I was like, I didn't know that this world existed.
03:05:09.800 | I don't understand.
03:05:11.840 | And I think maybe that's where I took my real first step
03:05:14.640 | in trying to understand other people,
03:05:17.120 | 'cause they were my friends.
03:05:18.560 | I mean, they were my teammates.
03:05:19.880 | I played lacrosse in college.
03:05:21.240 | So I was close to people who came
03:05:23.640 | from such different backgrounds than I did.
03:05:25.800 | And I just, I was so confused.
03:05:29.320 | And so I think I learned to learn
03:05:31.960 | and then sort of went from there.
03:05:34.160 | - And then developed your fascination with people.
03:05:36.000 | And the funny thing is you went from trucking now
03:05:38.960 | to autonomous trucks.
03:05:40.240 | I mean, this, speaking of not being able
03:05:41.720 | to connect the dots and, you know,
03:05:44.000 | your life in the next 10 years
03:05:46.600 | could take very interesting directions.
03:05:48.920 | This is very difficult to, first of all,
03:05:50.720 | us meeting is a funny little thing,
03:05:53.320 | given the things I'm working on with robots currently.
03:05:57.280 | But, you know, it may not relate to trucks at all.
03:06:00.740 | There's a, at a certain point,
03:06:03.300 | autonomous trucks are just robots.
03:06:06.620 | And then it starts getting into a conversation
03:06:08.660 | about the roles of robots in society.
03:06:11.600 | - Yeah.
03:06:12.440 | - And the roles of humans and robots.
03:06:15.560 | And that interplay is right up your alley.
03:06:18.380 | - Yeah.
03:06:20.100 | - As somebody who deeply cares about humans
03:06:21.940 | and has somehow found themselves studying robots.
03:06:25.060 | - Yeah, no, it's crazy.
03:06:26.180 | I mean, even four or five years ago,
03:06:28.660 | I would, if you had asked me if I was gonna be
03:06:31.060 | studying trucking still, I would have said no.
03:06:33.540 | And so my advice is, I think if I was gonna give advice,
03:06:36.860 | you know, is, you know, you can't connect the dots
03:06:39.900 | looking forward, you just gotta follow what interests you.
03:06:42.940 | You know, and I think we downplay, right,
03:06:47.460 | that when we talk to, you know, kids,
03:06:50.620 | especially, you know, if you have some bright, gifted kid
03:06:52.360 | that gets identified as like, oh, you could go somewhere,
03:06:54.540 | then we're like, we feed them stuff.
03:06:55.980 | You're like, well, learn the piano
03:06:57.060 | and learn another language, right, learn robotics.
03:06:59.580 | And then we tell other kids like, oh, learn a trade,
03:07:03.780 | you know, like figure out what's gonna pay well.
03:07:05.340 | And not that there's anything against trades.
03:07:06.820 | I think everyone should learn like manual skills
03:07:10.060 | to make things.
03:07:10.900 | I think it's incredibly satisfying and wonderful
03:07:13.740 | and we need more of that.
03:07:15.080 | But also, you know, tell, you know, all kids,
03:07:18.220 | it's okay to like take a class in something random
03:07:20.720 | that you don't think you're gonna get
03:07:22.040 | any economic return on.
03:07:23.940 | Well, because maybe you will end up going into a trade,
03:07:26.440 | but that class that you took in studio art
03:07:30.060 | is gonna mean that, you know,
03:07:31.940 | you look at buildings differently, right,
03:07:33.660 | or you end up sort of putting your own stamp
03:07:35.840 | on, you know, woodworking, you know.
03:07:37.860 | I think that's the key is like follow, you know,
03:07:42.380 | it's cheesy 'cause everybody says follow your passion,
03:07:44.560 | but you know, we say that and then we just, you know,
03:07:48.740 | the 90% of what people hear is, you know,
03:07:51.500 | what's the return on investment for that, you know.
03:07:54.180 | It's like, you're a human being.
03:07:55.940 | Like things interest you, music interests you,
03:07:58.400 | literature interests you, video games interest you,
03:08:00.840 | like follow it, you know.
03:08:02.400 | - Go grab a kayak and go into the Pacific.
03:08:04.740 | - Go do something really, no, don't do that.
03:08:06.740 | That was really stupid. - Go do something stupid
03:08:08.900 | and something you'll regret a lot later.
03:08:11.460 | - My poor mother, thank God she didn't know.
03:08:13.780 | - Well, let me ask 'cause for a lot of people,
03:08:16.160 | work, for me, it is quote unquote work
03:08:19.220 | is a source of meaning and at the core
03:08:21.900 | of something we've been talking about with jobs
03:08:25.220 | is meaning.
03:08:27.220 | So the big ridiculous question,
03:08:28.660 | what do you think is the meaning of life?
03:08:31.420 | Do you think work for us humans in modern society
03:08:36.260 | is core to that meaning?
03:08:38.300 | And is that something you think about in your work?
03:08:42.220 | So the deeper question of meaning,
03:08:43.780 | not just financial wellbeing and the quality of life,
03:08:46.700 | but the deeper search for meaning.
03:08:49.120 | - Yeah, the meaning of life is love
03:08:53.700 | and you can find love in your work.
03:08:56.840 | Now, and I don't think everybody can,
03:09:00.300 | there are a lot of jobs out there that just,
03:09:02.900 | you do it for a paycheck.
03:09:04.140 | And I think we do have to be honest about that.
03:09:08.500 | There are a lot of people who don't love their jobs
03:09:11.220 | and we don't have jobs that they're gonna love.
03:09:14.480 | And maybe that's not a sort of realistic,
03:09:18.460 | that's a utopia, right?
03:09:20.540 | But for those of us that have the luxury,
03:09:23.580 | I mean, I think you love what you do,
03:09:27.180 | that people say that.
03:09:30.040 | I think the key for real happiness
03:09:35.040 | is to love what you're trying to achieve.
03:09:37.940 | And maybe you love trying to build a company
03:09:40.400 | and make a lot of money just for the sake of doing that.
03:09:42.840 | But I think the people who are really happy
03:09:45.500 | and have great impacts,
03:09:47.340 | they love what they do because it has an impact on the world
03:09:50.220 | that they think expresses that love, right?
03:09:53.780 | And that could be at a counseling center,
03:09:57.580 | that could be in your community,
03:10:00.500 | that could be sending people to Mars.
03:10:02.760 | - Well, I also think it doesn't necessarily,
03:10:06.140 | the expression of love isn't necessarily
03:10:07.860 | about helping other people directly.
03:10:10.260 | There's something about craftsmanship and skill,
03:10:12.340 | as we've talked about,
03:10:13.820 | that's almost like you're celebrating humanity
03:10:16.660 | by searching for mastery in the task,
03:10:21.660 | in the simple,
03:10:23.220 | like especially tasks that people outside
03:10:25.860 | may see as menial, as not important,
03:10:30.020 | nevertheless, searching for mastery,
03:10:33.180 | for excellence in that task.
03:10:34.540 | There's something deeply human to that,
03:10:36.340 | and also fulfilling,
03:10:38.020 | that just driving a truck and getting damn good at it,
03:10:42.400 | the best who's ever lived driving the truck
03:10:46.460 | and taking pride in that,
03:10:48.380 | that's deeply meaningful.
03:10:50.780 | And also like a real celebration of humanity
03:10:55.340 | and a real show of love, I think, for humanity.
03:10:59.140 | - Yeah.
03:11:00.700 | Yeah, I just had my floors redone
03:11:02.020 | and the guy who did it was an artist.
03:11:04.420 | He sanded these old hundred year old floors
03:11:06.180 | and made them look gorgeous and this is craft.
03:11:08.780 | - That's love right there. - Pride.
03:11:10.420 | Yeah, I mean, he showed us some love.
03:11:13.660 | The product was just like, is enriching our lives.
03:11:17.380 | - Steve, this was an amazing conversation.
03:11:19.500 | We've covered a lot of ground.
03:11:20.980 | Your work, just like you said,
03:11:23.300 | impossible to connect the dots,
03:11:24.820 | but I'm glad you did all the amazing work you did.
03:11:28.340 | You're exploring human nature at the core
03:11:31.340 | of what America is, the blue collar America.
03:11:35.060 | So thank you for your work.
03:11:36.460 | Thank you for the care and the love you put in your work.
03:11:38.820 | And thank you so much
03:11:39.860 | for spending your valuable time with me.
03:11:42.400 | - I appreciate it, Lex.
03:11:43.500 | I'm a big fan, so it's just been great to be on.
03:11:46.020 | - Thanks for listening to this conversation
03:11:49.100 | with Steve Veselli.
03:11:50.660 | To support this podcast,
03:11:52.100 | please check out our sponsors in the description.
03:11:54.820 | And now let me leave you with some words
03:11:56.780 | from Napoleon Hill.
03:11:58.500 | If you cannot do great things,
03:12:00.860 | do small things in a great way.
03:12:03.820 | Thank you for listening and hope to see you next time.
03:12:07.940 | (upbeat music)