
Mark Cuban: Shark Tank, DEI & Wokeism Debate, Elon Musk, Politics & Drugs | Lex Fridman Podcast #422


Chapters

0:00 Introduction
0:54 Entrepreneurship
15:48 Shark Tank
26:13 How Mark made first billion
52:24 Dallas Mavericks
57:49 DEI debate
93:42 Trump vs Biden
96:04 Immigration
105:37 Drugs and Big Pharma
121:38 AI
125:49 Advice for young people


00:00:00.000 | the person who controls the algorithm controls the world.
00:00:03.740 | Right?
00:00:04.580 | And if you are committed to one specific platform
00:00:09.120 | as your singular source of information
00:00:11.560 | or affiliated platforms,
00:00:13.480 | then whoever controls the algorithm
00:00:15.600 | or the programming there controls you.
00:00:17.960 | - The following is a conversation with Mark Cuban,
00:00:23.160 | a multi-billionaire businessman,
00:00:25.000 | investor and star of the series "Shark Tank,"
00:00:27.760 | longtime principal owner of the Dallas Mavericks,
00:00:30.360 | and someone who is unafraid
00:00:33.800 | to get into frequent battles on X.
00:00:36.000 | Most recently over topics of DEI,
00:00:38.440 | wokeism, gender, and identity politics
00:00:41.680 | with the likes of Elon Musk and Jordan Peterson.
00:00:44.960 | This is the Lex Fridman Podcast.
00:00:47.040 | To support it,
00:00:47.880 | please check out our sponsors in the description.
00:00:50.240 | And now, dear friends, here's Mark Cuban.
00:00:53.800 | You've started many businesses,
00:00:57.080 | invested in many businesses,
00:00:58.640 | heard a lot of pitches privately and on "Shark Tank."
00:01:02.420 | So you're the perfect person to ask,
00:01:04.760 | what makes a great entrepreneur?
00:01:07.360 | - Somebody who's curious.
00:01:09.040 | They want to keep on learning
00:01:10.040 | 'cause business is ever changing.
00:01:11.440 | It's never static.
00:01:13.160 | Somebody who's agile because as you learn new things
00:01:15.880 | and the environment around you changes,
00:01:18.660 | you have to be able to adapt and make the changes.
00:01:21.120 | And somebody who can sell
00:01:23.920 | because no business has ever survived without sales.
00:01:27.820 | And as an entrepreneur who's creating a company,
00:01:31.040 | whatever your product or service is,
00:01:32.560 | if that's not the most important thing
00:01:34.280 | that you're just dying and excited to tell people about,
00:01:37.720 | then you're not gonna succeed.
00:01:39.000 | - But it's also a skill thing.
00:01:40.240 | How do you sell?
00:01:41.400 | What do you mean by selling?
00:01:42.280 | - Selling is just helping.
00:01:43.400 | I've always looked at it
00:01:44.320 | as putting myself in the shoes of another person
00:01:48.160 | and asking a simple question.
00:01:49.680 | Can I help this person?
00:01:50.880 | Can my product help?
00:01:51.720 | And from the time I was 12 years old,
00:01:53.320 | selling garbage bags door to door
00:01:54.920 | and just asking a simple question.
00:01:56.760 | Do you use garbage bags?
00:01:58.080 | Do you need garbage bags?
00:01:59.240 | Well, let me save you some time.
00:02:00.760 | I'll bring 'em to your house and drop 'em off. To streaming:
00:02:04.280 | Why do we need streaming when we have TV and radio?
00:02:07.200 | Well, you can't get access to your TV and radio
00:02:09.460 | everywhere you go.
00:02:10.680 | So we kind of break down geographic and physical barriers.
00:02:14.080 | And Cost Plus Drugs:
00:02:16.280 | What's the product that we actually sell?
00:02:18.280 | We sell trust.
00:02:19.360 | In a simplistic approach, we buy drugs and sell drugs,
00:02:23.120 | but we add transparency to it.
00:02:24.960 | And bringing transparency to an industry
00:02:27.720 | is a differentiation and it helps people.
00:02:30.600 | - Trust in an industry that's highly lacking in trust.
00:02:33.920 | - Exactly.
00:02:35.080 | - Okay, so what's the trick to selling garbage bags?
00:02:37.840 | Let's go back there at 12 years old.
00:02:39.720 | I mean, is it just your natural charisma?
00:02:41.640 | I guess a good question to ask, are you born with it
00:02:44.280 | or can you develop it?
00:02:45.600 | - Oh, you can definitely develop it, yeah.
00:02:47.360 | I mean, because selling garbage bags door to door
00:02:49.160 | was easy, right?
00:02:50.000 | It was like, (knocking)
00:02:51.160 | 12 year old Mark going, hi, my name is Mark.
00:02:53.240 | Do you use garbage bags?
00:02:54.880 | You know what the answer is going to be, right?
00:02:56.640 | Can I just drop them off for you once a week,
00:02:58.560 | whenever you need them, you just call
00:02:59.960 | and I'll bring them down, sure.
00:03:01.680 | So that was easy.
00:03:02.880 | - But I'm sure you've been rejected.
00:03:04.740 | - Oh yeah, of course.
00:03:05.580 | Not everybody says yes.
00:03:06.960 | - What was your percentage?
00:03:08.240 | - I don't remember, but it's pretty close to 100%.
00:03:10.640 | - Oh, okay.
00:03:11.480 | So that's why you don't remember.
00:03:12.560 | - Yeah, right.
00:03:13.880 | 'Cause who's gonna say no to a 12 year old kid
00:03:15.600 | who's gonna save them time and money.
00:03:17.500 | But typically my career where I've started companies,
00:03:20.280 | it's to do something that other people aren't doing,
00:03:23.720 | whether it was connecting PCs to local area networks
00:03:27.200 | at MicroSolutions.
00:03:28.780 | And the salesmanship was walking into a company
00:03:32.400 | and just saying, look, talk to me
00:03:34.800 | and I can help you improve your productivity
00:03:36.640 | and your profitability.
00:03:37.860 | Is that important to you?
00:03:39.160 | And the answer is obviously always yes.
00:03:40.720 | And then the question is, can I do the job
00:03:42.260 | and can I do it cost effectively?
00:03:44.280 | And so you didn't have to be a born salesperson
00:03:47.140 | to be able to ask those questions,
00:03:49.080 | but you have to be willing to put in the time
00:03:52.120 | to learn that business.
00:03:53.920 | And that's the hardest part.
00:03:55.440 | - I'm sure there's a skill thing to it too
00:03:57.680 | in like how you solve the puzzle of communicating
00:04:01.320 | with a person and convincing them.
00:04:03.640 | - Yeah, I mean, there's skill from the perspective
00:04:05.520 | that I read like a maniac.
00:04:07.760 | Then like now you can give me an example
00:04:10.440 | of any type of business and it'll take me two seconds
00:04:12.760 | to figure out how they make money
00:04:14.000 | and how I can make them more productive.
00:04:16.000 | And I think that's probably my biggest skill,
00:04:18.840 | being able to just drill down to what the actual need is,
00:04:21.960 | if any, and then from there being able to say,
00:04:25.120 | well, if this is what this company does
00:04:27.560 | and this is what their goal is,
00:04:29.840 | how can I introduce something new
00:04:31.340 | that they haven't seen before?
00:04:32.840 | And is that a business that I can create and make money from?
00:04:35.680 | - So figure out how this kind of business
00:04:37.720 | makes money in the present and then figure out
00:04:39.880 | is there a way to make more money in the future
00:04:41.840 | by introducing a totally new kind of thing.
00:04:43.720 | - Correct.
00:04:45.240 | - And you can just do that with anything.
00:04:46.500 | - Pretty much, yeah.
00:04:48.040 | - And you think you're born with that?
00:04:50.240 | - No, I worked at it because going back
00:04:52.440 | to what I said earlier about curiosity,
00:04:54.560 | you have to be insanely curious
00:04:56.040 | because the world is always changing.
00:04:57.720 | My dad used to say, "We don't live in the world
00:04:59.760 | "we were born into," which is absolutely true.
00:05:02.560 | If you're not a voracious consumer of information,
00:05:06.840 | then you're not gonna be able to keep up
00:05:08.520 | and no matter what your sales skills or ability are,
00:05:11.260 | they're gonna be useless.
00:05:12.760 | - What'd you learn about life from your dad?
00:05:15.080 | You mentioned your dad.
00:05:16.460 | - My dad did upholstery on cars.
00:05:18.400 | Got up, went to work every morning at seven o'clock,
00:05:21.240 | came back five or six, seven o'clock, exhausted.
00:05:24.640 | And I learned to be nice.
00:05:27.440 | I learned to be caring.
00:05:29.000 | I learned to be accepting.
00:05:31.020 | Just qualities that I think he really tried
00:05:35.320 | to pass on to myself and my two younger brothers
00:05:37.720 | were just be a good human and I think,
00:05:41.080 | he didn't have business experience
00:05:42.240 | so as I got into business, he would just say,
00:05:44.520 | "Sorry, Mark, I can't help you.
00:05:46.380 | "I don't understand what you're doing.
00:05:48.120 | "Neither one of my parents had gone to college.
00:05:50.780 | "You've gotta figure it out for yourself."
00:05:52.500 | But he was also very insistent that,
00:05:55.560 | you know, he worked at a company called Regency Products
00:05:57.820 | where they did upholstery on cars
00:05:59.920 | and he would bring me there to sweep the floors.
00:06:01.800 | Not because he wanted me to learn that business,
00:06:03.640 | because he wanted me to learn
00:06:04.840 | how backbreaking that work was.
00:06:06.760 | I mean, he lost an eye in an accident at work,
00:06:09.220 | a staple broke and the only thing he wanted
00:06:12.880 | from my brothers and I was for us
00:06:14.800 | to never have to work like that,
00:06:16.440 | to go to college, to figure it out.
00:06:18.080 | - You said to be nice.
00:06:19.200 | That said, you also said that you,
00:06:21.420 | when you were first starting a business,
00:06:23.180 | you were a bit more of an asshole
00:06:24.640 | than you wish you would have been.
00:06:25.920 | - Absolutely, yeah, yeah.
00:06:27.200 | Because I was more of a yeller.
00:06:29.120 | I was, you know, I didn't have--
00:06:30.840 | - No, really?
00:06:31.680 | - Yeah. (laughing)
00:06:32.880 | You know, what you see on the sidelines,
00:06:34.320 | you know, with me at a Mavs game, maybe a little bit,
00:06:36.240 | but I also didn't have any patience
00:06:38.760 | for somebody I thought wasn't using
00:06:41.360 | my kind of common sense, right?
00:06:43.360 | Because I was always on the go, go, go, go, go,
00:06:47.120 | when I, particularly when I was younger,
00:06:49.040 | just trying to be successful,
00:06:51.040 | trying to get to the point where I had independence.
00:06:53.960 | And I would tell this to people, you know,
00:06:55.500 | either you're speeding up and getting on the train
00:06:57.240 | or, you know, we'll stop and drop you off
00:06:59.440 | at the next station, but let's go where you go.
00:07:02.520 | - Did you have trouble with the hire fast,
00:07:04.120 | fire fast part of running a business?
00:07:06.480 | - Yeah, always, 'cause I hated firing people,
00:07:08.400 | 'cause it meant, one, it was an admission
00:07:09.800 | of a mistake in the hiring, and two,
00:07:12.180 | the salesperson in me always wanted to come out ahead,
00:07:15.880 | and I was always horrible at firing,
00:07:18.360 | but I always partnered with people
00:07:19.560 | who had no problem with it, so I always delegated that.
00:07:23.400 | - Well, that's the tricky thing.
00:07:24.240 | When you're working with somebody
00:07:26.000 | and they're not quite there, and you have to decide,
00:07:29.960 | are they going to step up and grow into the person
00:07:32.040 | that's right, or they're not?
00:07:34.840 | And in that gray area is probably where you have to fire.
00:07:37.920 | - Was hard, yeah, for sure, because, you know,
00:07:40.240 | there is obviously a failure somewhere in the process.
00:07:43.780 | You know, what did we do wrong?
00:07:45.180 | And when I would interview people for jobs,
00:07:49.220 | I think 99% of the people I've ever interviewed
00:07:52.380 | I've wanted to hire, because in my mind it was like,
00:07:55.780 | okay, I can figure out how to make this person work, right?
00:07:58.300 | And then they wouldn't, and then, you know,
00:08:00.340 | people at the company would be like,
00:08:01.500 | Mark, you suck at this, you know?
00:08:03.500 | And so I always delegated the hiring.
00:08:06.140 | - Yeah, I mean, I'm the same.
00:08:07.740 | I see the potential in people.
00:08:08.860 | I see the beauty in people and which is a great way
00:08:11.860 | to live life, but when you're running a company,
00:08:13.700 | it's a different thing.
00:08:14.540 | - It's different, and you got to know what you're good at
00:08:16.020 | and what you're bad at, right?
00:08:17.340 | I was good at, you know, I was a ready, fire, aim guy,
00:08:21.420 | and I always partnered with people who were very anal
00:08:23.900 | and perfectionist, because where I could just go,
00:08:25.980 | go, go, go, go, go, they would keep me inside the baselines.
00:08:29.220 | - They would do the due diligence.
00:08:31.260 | - Yeah, or just, yeah, the detail work,
00:08:33.020 | the dot the I's and the cross the T's.
00:08:34.940 | - What does it take to take that first leap
00:08:37.220 | into starting a business?
00:08:38.580 | - That's the hardest part.
00:08:39.620 | It really depends on your personal circumstances.
00:08:41.860 | Like I got fired.
00:08:42.820 | I mean, I was sleeping on the floor,
00:08:44.220 | six guys in a three-bedroom apartment,
00:08:46.220 | so I couldn't go any lower. (laughs)
00:08:49.000 | So there was no downside. - Started at the bottom.
00:08:50.880 | - Yeah, there was no downside for me starting a business,
00:08:53.020 | and it was just like, you know, I was 25
00:08:55.660 | when we started MicroSolutions, and, you know,
00:08:58.320 | I'd just gotten fired, and it was like, look,
00:09:01.300 | I'm a lousy employee.
00:09:02.900 | I'm gonna just start going to some of my prospects
00:09:06.540 | that I had at my job and ask them to front the money
00:09:10.000 | that I needed to install some software
00:09:12.420 | and found this company, Architectural Lighting,
00:09:14.500 | who put up $500 for me.
00:09:16.420 | That allowed me to buy software and have 50% margins,
00:09:19.620 | and, you know, that's how I started my company.
00:09:22.140 | - But like by way of advice, would you say?
00:09:23.820 | I mean, it's a terrifying thing.
00:09:25.340 | - Yeah, I mean, you've gotta be in a position
00:09:26.900 | where you're confident.
00:09:27.880 | You know, I get emails and get approached by people
00:09:29.900 | all the time: you know, what kind of business should I start?
00:09:32.780 | That tells me you're not ready to start a business, right?
00:09:34.900 | Either you're prepared and you know it or you don't.
00:09:37.140 | You know, in the United States, with the American Dream,
00:09:40.820 | everybody kind of always looks at themselves and say,
00:09:44.900 | okay, you know, I have this idea, right?
00:09:47.420 | And then you go through this process of saying,
00:09:49.740 | okay, you know, you talk to your friends or family,
00:09:52.000 | what do you think?
00:09:52.840 | And then almost always, oh, it's a great idea, right?
00:09:55.000 | Then you go on Google and you say, oh my God,
00:09:57.460 | no one else is doing it, without thinking, you know,
00:09:59.860 | 10 companies have gone out of business
00:10:01.260 | trying the same thing, but okay, it's on Google.
00:10:04.020 | And then people stop, right?
00:10:06.300 | Because that next step means, okay,
00:10:08.940 | I have to change what I'm doing in my life.
00:10:11.500 | And that's not easy for 99% of the people.
00:10:14.140 | Some people look at that as an opportunity
00:10:15.900 | and get excited about it.
00:10:17.220 | Some people get terrified because it's,
00:10:20.100 | okay, maybe I'm comfortable, maybe I have responsibilities.
00:10:24.140 | And so whatever your circumstances are,
00:10:26.440 | if you want to take that next step,
00:10:28.480 | you have to be able to deal with the consequences
00:10:30.900 | of changing your circumstances.
00:10:32.820 | And that's the first thing, you know,
00:10:34.940 | do you save money so you have, you know,
00:10:36.780 | if you have a job, do you have a mortgage?
00:10:38.740 | Do you have a family?
00:10:39.580 | You've got to save money, you can't just walk.
00:10:41.580 | You know, I mean, they've got to eat
00:10:43.140 | and they've got to have shelter.
00:10:44.540 | But on the other side of the coin, if you've got nothing,
00:10:47.260 | it's the perfect time to start a business.
00:10:49.220 | - Yeah, desperation is a good catalyst
00:10:51.020 | for starting a business.
00:10:52.020 | But in many cases, the decision, as you're talking about,
00:10:55.320 | you're gonna have to make is to leave a job
00:10:57.620 | that's providing some degree of comfort already.
00:11:00.580 | So I suppose when you're sleeping on the floor
00:11:03.180 | and there's six guys, it's a little bit easier.
00:11:05.420 | - It's really easy, right?
00:11:06.380 | Particularly when you get fired and you don't have a job,
00:11:08.540 | you know, and you're looking at bartending at night
00:11:10.260 | to try to pay the bills.
00:11:11.400 | And so it wasn't hard for me, but to your point,
00:11:15.900 | it really comes down to preparation.
00:11:18.140 | You know, if it's important enough to you,
00:11:19.900 | you'll save the money, you'll give up, you know,
00:11:22.620 | whatever it is you need to give up to put the money aside.
00:11:26.460 | If you have obligations, you'll put in the work
00:11:30.100 | to learn as much as you can about that industry
00:11:32.660 | so that when you start your business, you're prepared.
00:11:35.460 | And you can always, you know, at night, on weekends,
00:11:38.540 | whenever you find time, lunch, start making the calls
00:11:41.260 | to find out if people will write you a check, you know,
00:11:43.620 | transfer you money to buy whatever it is you're selling.
00:11:46.500 | And by doing those things,
00:11:47.900 | you can put yourself in a position to succeed.
00:11:50.060 | It's where people just think, okay, you know, Geronimo,
00:11:53.380 | I'm leaping off the edge of a cliff
00:11:55.100 | and I'm starting a business.
00:11:56.420 | That's tough.
00:11:57.420 | - But sometimes that's like the way you do it though.
00:11:59.420 | - There's always examples of any situation or scenario.
00:12:01.820 | Right, right.
00:12:02.740 | But I mean-
00:12:03.580 | - Anecdotal evidence for everything.
00:12:04.980 | - Yeah, but if you're going into a new business,
00:12:06.940 | you're gonna have competition
00:12:07.900 | unless you're really, really, really, really, really lucky.
00:12:10.060 | And that competition is not gonna just say, okay,
00:12:12.360 | let Lex or Mark just kick our ass.
00:12:14.420 | And so you've gotta be prepared
00:12:15.820 | to how you're gonna deal with that competition.
00:12:18.580 | - What do you think that is about America
00:12:21.300 | that has so many people who have that dream
00:12:25.140 | and act on that dream of starting a business?
00:12:28.060 | - You know, I think we've just got a culture
00:12:31.980 | of consumption and more, you know?
00:12:35.100 | And to get more, you've got to, you know,
00:12:39.900 | creating a business gives you the greatest potential upside
00:12:42.460 | and the greatest leverage on your time.
00:12:44.420 | But it also creates the most risk.
00:12:47.700 | - So that capitalist machine,
00:12:49.260 | there's a lot of elements.
00:12:50.300 | By contrast, the respect for the law,
00:12:54.700 | like an entrepreneur can trust that
00:12:56.140 | if they pull it off, the law will protect them.
00:12:58.780 | There won't be a government.
00:12:59.620 | - Hopefully that's still the case, yeah.
00:13:01.620 | There's always, yeah, us versus other countries.
00:13:04.460 | Right, right, so us versus other countries.
00:13:06.380 | Like Joe Biden, of all people, said to me,
00:13:08.940 | it was at an entrepreneurship conference
00:13:11.240 | that he had put together when he was vice president.
00:13:14.060 | And we had gone up there, a bunch of us from "Shark Tank"
00:13:17.900 | to talk to young entrepreneurs from around the world.
00:13:20.840 | And he said to me, "Mark, you know,
00:13:22.580 | the one thing that separates,
00:13:23.820 | I've been to every country around the world,
00:13:25.380 | "and the one thing that separates us is entrepreneurship.
00:13:28.740 | "We're the most entrepreneurial country in the world,
00:13:31.300 | "and there's no one else who's even close."
00:13:33.300 | And when you look at the origin of the biggest companies
00:13:38.080 | in the world, for the most part,
00:13:39.860 | there's an American origin story somewhere behind there.
00:13:43.100 | And I think that just gets perpetuated on itself.
00:13:47.220 | We see those Horatio Alger stories.
00:13:49.640 | We see examples of the Jeff Bezos of the world,
00:13:53.260 | the Steve Jobs of the world.
00:13:54.340 | And those are the types of people we want to copy.
00:13:58.620 | - Yeah, we want to be really careful
00:14:00.040 | and try to really figure out what that is,
00:14:02.420 | because we don't want to lose that.
00:14:04.540 | - For sure.
00:14:05.380 | - We want to protect the whatever, you know.
00:14:06.700 | And that's a lot of the discussions
00:14:08.620 | about what's the right way to do government,
00:14:10.500 | big government, small government,
00:14:11.780 | what's the right policies, but also culture,
00:14:14.460 | like who we celebrate.
00:14:15.380 | One of the things that troubles me
00:14:17.420 | is that we don't celebrate enough
00:14:19.500 | the entrepreneurs that take risks
00:14:22.620 | and the entrepreneurs that succeed.
00:14:24.180 | It seems like success,
00:14:25.260 | especially when it comes with wealth,
00:14:27.340 | is immediately matched with distrust and criticism
00:14:31.740 | and all that kind of stuff.
00:14:32.580 | - Yeah, it's changing for sure,
00:14:33.500 | because you can go back just 12 years, right?
00:14:36.980 | Traditional media dominated, let's just say, through 2012.
00:14:41.620 | That was the peak of linear television.
00:14:44.140 | Newspapers weren't as strong,
00:14:45.720 | but they still had some breadth and depth to them.
00:14:48.900 | And then social media comes along,
00:14:50.780 | and everybody gets to play in their own sandbox
00:14:53.740 | and share opinions with people who think just like them.
00:14:57.100 | And it also gives them the opportunity
00:15:00.100 | to amplify those feelings.
00:15:02.820 | And I think that's where celebrating entrepreneurs
00:15:07.140 | really started to subside some.
00:15:09.420 | There were always people who were progressive
00:15:11.060 | that were like, billionaires are bad,
00:15:12.700 | or millionaires are bad, depending on the time period.
00:15:15.580 | But you didn't really see it on an ongoing basis, right?
00:15:19.600 | It wasn't gonna be on the evening news.
00:15:21.180 | It wasn't going to be in the front page of the newspaper.
00:15:24.900 | It was going to be if you read a book
00:15:26.380 | and someone talked about it, or you read a magazine
00:15:28.260 | and there was an article talking about
00:15:31.100 | this progressive movement or that progressive movement,
00:15:32.780 | whatever it may be, or political parties.
00:15:36.820 | But now, all of that is front and center in social media.
00:15:41.820 | - And we're trying to figure it out,
00:15:43.060 | how we deal with the mobs of people
00:15:45.380 | and the virality of it all.
00:15:46.940 | I think we'll find our footing
00:15:49.460 | and start celebrating greatness again.
00:15:51.180 | - Well, I mean, that's the whole reason I do Shark Tank.
00:15:52.860 | - That's true, that show celebrates the entrepreneur.
00:15:56.260 | - It's the only place where every single minute
00:15:58.260 | of every single episode, we celebrate the American dream.
00:16:02.500 | And the reason I do it is we tell the entire country
00:16:05.700 | and it's shown around the world even.
00:16:07.740 | We're amazing advertising for the American dream
00:16:10.380 | in I don't even know how many countries.
00:16:12.620 | But every time somebody walks onto that carpet
00:16:15.260 | from Dubuque, Iowa, or Ketchum, Idaho,
00:16:19.060 | that sends a message to every kid who's watching,
00:16:21.460 | seven, eight, nine, 10, 12 year old kid,
00:16:23.980 | that if they can do it from Ketchum, Idaho, you can do it.
00:16:27.460 | If they can have this idea and get a deal
00:16:29.380 | or even present to the Sharks
00:16:30.700 | and have all of America see it, you can do it.
00:16:33.340 | And that, I mean, I'm proud of that.
00:16:36.020 | The 15 years of that, it's just been insane.
00:16:39.740 | Now, kids walk up to me and go,
00:16:41.620 | "Yeah, I started watching you when I was five or 10
00:16:44.660 | and I started a business
00:16:45.700 | 'cause I learned about it from Shark Tank."
00:16:47.260 | And so, I think it celebrates it
00:16:51.460 | and we convey it and I don't think it's going away,
00:16:54.780 | but there are different battles
00:16:56.460 | we have to fight to support it.
00:16:57.900 | - Yeah, I love even when the business idea
00:16:59.780 | is obviously horrible, just the guts to step up.
00:17:04.780 | - To be there.
00:17:05.620 | - To believe in yourself, to really reach.
00:17:08.260 | I mean, that's what matters.
00:17:09.540 | I mean, 'cause like some of the best business ideas
00:17:12.220 | are probably ones that maybe even you and Shark Tank will laugh at.
00:17:17.180 | - Oh, for sure.
00:17:18.420 | Without question, the good ones,
00:17:20.220 | we're not gonna recognize every good one
00:17:21.780 | and then sometimes we'll just motivate people
00:17:23.540 | to work even harder to get it done
00:17:24.940 | 'cause of what we say to them.
00:17:26.340 | And that's fine too.
00:17:28.020 | There's been great success stories that we said no to.
00:17:30.540 | - What stands out as like a memorable business
00:17:32.580 | you've been pitched on Shark Tank?
00:17:35.300 | What's the best one that stands out in memory?
00:17:37.220 | - There's no best one, right?
00:17:38.340 | They're all different.
00:17:39.580 | They're all best in their own way, I guess.
00:17:41.380 | There's stupid ones and we haven't had any world class
00:17:47.340 | or world changing earth shattering ones, right?
00:17:49.940 | Because those aren't gonna apply to Shark Tank.
00:17:53.660 | They don't need us, right?
00:17:55.660 | So we typically get businesses that need some help
00:17:57.740 | at some level or another.
00:17:59.500 | But there's ones I've passed on that I wish I hadn't, like Spike Ball.
00:18:01.580 | Do you know what Spike Ball is?
00:18:03.020 | So it's just a rebounding net that you can put on the beach
00:18:06.060 | and you have these yellow balls and you play a game of,
00:18:08.940 | it's just a competitive game, but they're killing it.
00:18:10.860 | So if you go to beaches in New York or LA,
00:18:14.500 | you'll see kids playing it all the time.
00:18:16.140 | And it was a fun game that I wish I had done a deal with.
00:18:20.420 | And there's been others.
00:18:21.260 | - And you passed?
00:18:22.100 | - And I passed.
00:18:22.940 | They were getting some traction
00:18:24.340 | and they wanted to create leagues, Spike Ball leagues,
00:18:26.820 | and they wanted me to be the commissioner.
00:18:28.820 | And I don't wanna be a commissioner
00:18:30.180 | of a new Spike Ball league. (laughs)
00:18:32.540 | - So you have to kind of have this gut feeling
00:18:35.200 | of will this scale, will this click with people?
00:18:38.540 | - Of course, yeah.
00:18:39.380 | Can it be protected?
00:18:40.260 | Is it differentiated?
00:18:41.540 | Is it something that makes me think,
00:18:43.040 | why didn't I think of that?
00:18:44.940 | Or is it just a good, solid business
00:18:48.780 | that's gonna pay a return to the founder
00:18:52.020 | and may not be enough of a business
00:18:53.640 | to return to an investor?
00:18:56.100 | - Yeah, and I guess the question you're trying to see,
00:19:00.260 | will this scale, this promise,
00:19:03.500 | will the promise materialize into a big thing?
00:19:06.540 | - Well, see, I don't even care
00:19:07.380 | if it's gonna be a big thing, right?
00:19:09.540 | 'Cause it's all relative to the entrepreneur.
00:19:11.620 | We had a 19-year-old from Pittsburgh, Laney,
00:19:13.820 | who came on with this simple sugar scrub.
00:19:16.380 | And there was nothing outrageously special about it.
00:19:19.700 | I didn't see it becoming a $100 million business.
00:19:22.100 | I thought it could become a two, three, $5 million business
00:19:25.020 | that paid the bills for her.
00:19:26.500 | And that was good enough.
00:19:27.840 | And six months after the show aired, she called me up.
00:19:31.580 | She goes, "Mark, I've got a million dollars in the bank.
00:19:34.720 | "What am I gonna do?"
00:19:36.060 | I'm like, "Enjoy it.
00:19:37.320 | "Put aside money for your taxes and go back to work."
00:19:40.940 | And so it doesn't have to be a huge business.
00:19:42.660 | It's just gotta be one that makes the entrepreneur happy.
00:19:45.860 | - But then there's the valuation piece.
00:19:47.740 | - Right.
00:19:48.700 | - Do a lot of the entrepreneurs overvalue business?
00:19:51.100 | - Yeah, of course.
00:19:52.100 | Yeah, I mean, that's the nature of it, right?
00:19:54.500 | I mean, and that's really where the biggest conflicts
00:19:57.020 | in "Shark Tank" happen.
00:19:58.180 | That's in the valuation.
00:19:59.620 | They think this is the best business ever.
00:20:02.420 | We had one lady, a couple that came on,
00:20:06.480 | and they had this scraper for cat's tongues, right?
00:20:10.500 | - Nice. - Bizarre.
00:20:11.820 | The most bizarre pitch ever.
00:20:14.060 | - I love it.
00:20:14.900 | - You know, and they had this insane valuation,
00:20:17.260 | and it was on because it was corny and fun TV,
00:20:19.620 | not because it was a good business.
00:20:20.940 | - Oh, really?
00:20:21.780 | Okay.
00:20:22.600 | You didn't see the potential.
00:20:23.440 | - No.
00:20:24.280 | Yeah, none.
00:20:25.860 | - There's a lot of cats in the world, Mark.
00:20:27.140 | - Yes, there are.
00:20:28.180 | They'll go do very well without me.
00:20:29.980 | - So how do you determine the value of a business,
00:20:33.580 | whether it's on "Shark Tank" or just in general?
00:20:35.540 | - It's actually really easy, right?
00:20:37.200 | So if you take, just to use an example,
00:20:39.620 | a business that's valued at $1 million,
00:20:42.260 | and I want to buy 10% of that company for $100,000,
00:20:47.260 | then in order for me to get my money back,
00:20:51.020 | they've gotta be able to generate $100,000
00:20:53.780 | in after-tax cash flow that they're able to distribute.
00:20:57.320 | Can they do it or can they not, right?
00:20:59.500 | And if it's a $2 million valuation,
00:21:01.580 | whatever the valuation is,
00:21:03.220 | that's how much after-tax cash they have to generate
00:21:06.660 | to return that money to investors.
00:21:09.020 | Or the other option is, do I see this business
00:21:12.460 | potentially having an exit, right?
00:21:14.300 | Do they have some unique technology
00:21:15.860 | or do they have something specific about them
00:21:19.060 | that some other company would want to acquire?
00:21:21.260 | Then the cash flow isn't as,
00:21:23.440 | I don't want to say important,
00:21:26.060 | but isn't going to guide the valuation.
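
As a rough illustration of the back-of-the-envelope test Cuban describes above, here is a minimal Python sketch. The numbers, the single-investor payback framing, and the function name are illustrative assumptions, not a model of how he actually prices deals; it simply asks how long it takes distributable after-tax cash to hand the investment back.

```python
def years_to_return_investment(valuation, ownership, annual_distributable_cash):
    # Amount the investor puts in: e.g. 10% of a $1,000,000 valuation = $100,000.
    investment = valuation * ownership
    # Simplest reading of the test: how many years of after-tax cash that the
    # company can actually distribute does it take to hand that amount back?
    # (Ignores pro-rata splits, dilution, discounting, and possible exits.)
    return investment / annual_distributable_cash

# $1M valuation, 10% stake, $100k/year of distributable cash -> 1.0 year
print(years_to_return_investment(1_000_000, 0.10, 100_000))
# $2M valuation, same stake and cash flow -> 2.0 years
print(years_to_return_investment(2_000_000, 0.10, 100_000))
```
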
00:21:28.700 | - And how do you know if a company's gonna be acquired?
00:21:31.420 | So it's the technology, like the patents,
00:21:33.020 | but also the team?
00:21:34.140 | - Yeah, it could be any of the above, right?
00:21:35.660 | It could be a super products company
00:21:38.920 | that I think is gonna take off.
00:21:41.980 | - And how do you know if they can generate the money?
00:21:43.980 | You made it sound easy, you know?
00:21:46.340 | - Yeah, I mean, can the person sell, you know?
00:21:49.260 | And if not them, can I do it?
00:21:50.700 | Or someone on my team do it for them?
00:21:52.940 | - So you're looking at the person?
00:21:54.220 | - Yeah, for sure, yeah.
00:21:57.140 | That's where Barbara Corcoran is the best.
00:21:57.140 | She can look at a person and hear them talk for 20 minutes
00:21:59.820 | and know, can that person do the job and do the work?
00:22:02.660 | - Can you tell if they're full of shit or not?
00:22:05.560 | So one of the things with entrepreneurs,
00:22:07.180 | they're kind of, like we said, overvaluing,
00:22:09.340 | so they're maybe overselling themselves,
00:22:11.780 | but also they might be full of shit
00:22:14.860 | in terms of their understanding of the market
00:22:16.580 | or exaggerating what they're thinking to do,
00:22:19.540 | all that kind of stuff.
00:22:20.380 | Can you see through that?
00:22:21.200 | - Yeah, for sure, just by asking questions.
00:22:23.540 | So if they are delusional at some level
00:22:28.540 | or misleading at another level,
00:22:30.660 | I'm gonna call them on it.
00:22:32.700 | So you get people trying to sell supplements
00:22:34.420 | that come on there and it's a cure for cancer
00:22:36.680 | or whatever it may be,
00:22:37.920 | or there's this latest fad that increases your core strength
00:22:42.920 | without doing any exercises, shit like that,
00:22:46.140 | I'm just gonna bounce, I'm gonna pound on them, right?
00:22:48.320 | - See, I still love that.
00:22:49.240 | I still love the trying, just trying.
00:22:51.840 | - You know, give them credit, right?
00:22:52.840 | Because they know all of America is going to see it
00:22:54.720 | and they've deluded themselves to believe
00:22:56.840 | this story so strongly.
00:22:58.680 | - I mean, there's a delusional aspect
00:23:00.120 | to entrepreneurship, right?
00:23:01.600 | Like you just--
00:23:02.880 | - See, that's a great question.
00:23:05.460 | Do you have to be ambitious and set aside reality
00:23:10.480 | at some level to think that you can create a company
00:23:13.000 | that could be worth 10, 100, a billion dollars, right?
00:23:17.040 | Yeah, at some level.
00:23:17.880 | 'Cause you don't know, it's all uncertainty.
00:23:19.820 | But I think if you're delusional, that works against you.
00:23:22.960 | Because everything's grounded in reality.
00:23:26.660 | You've got to execute, you've got to produce.
00:23:30.300 | You can have a vision, right?
00:23:32.060 | And you can say, this is where I want to get to
00:23:33.980 | and that's my mission or this is my driving principle.
00:23:36.940 | But you still got to execute on the business plan
00:23:38.640 | and that's where most people fail.
00:23:40.780 | - Yeah, you have to be kind of two-brained, I guess.
00:23:42.940 | You have to be able to dip into reality
00:23:44.540 | when you're thinking about the specifics of the product,
00:23:47.540 | how to design things, the first principles,
00:23:50.460 | the basics of how to build the thing,
00:23:51.980 | how much it's gonna cost, all of that.
00:23:53.660 | - Yeah, I mean, 'cause if you can't do the basics,
00:23:55.440 | you're not gonna be able to do the bigger things.
00:23:56.760 | And at the same time, you've got to be able...
00:23:58.860 | One of the things that entrepreneurs do
00:24:00.260 | that I always try to remind the ones I work with about
00:24:03.620 | is we all tend to lie to ourselves.
00:24:05.740 | Our product is bigger, faster, cheaper, this or that,
00:24:08.580 | as if that is a finite situation
00:24:13.020 | that's never gonna change, right?
00:24:14.820 | And there's always somebody,
00:24:16.660 | I call them leapfrog businesses.
00:24:18.960 | Whoever's competing against you,
00:24:21.020 | if you do A, B or C, they're gonna try to do C, D and E,
00:24:24.300 | right, and you better be prepared for that to come
00:24:26.580 | because otherwise they're out of business too.
00:24:28.740 | So you're never in a vacuum.
00:24:30.260 | You're always competing against sometimes
00:24:32.500 | an unlimited number of entrepreneurs
00:24:34.020 | that you don't even know exist
00:24:35.420 | who are trying to kick your ass.
00:24:36.980 | - And the tricky part of all this too
00:24:38.660 | is you might need to frequently pivot,
00:24:42.380 | especially in the beginning.
00:24:43.860 | - Hopefully not.
00:24:45.100 | - So you think like in the beginning,
00:24:47.540 | the product you have should be the thing
00:24:50.020 | that carries you a long time.
00:24:51.340 | - Yeah, because I mean,
00:24:52.540 | that's your riskiest point in time, right?
00:24:55.060 | And so if you've done your homework,
00:24:57.740 | which includes going out there
00:24:59.380 | and testing product market fit,
00:25:01.740 | you should have confidence
00:25:03.820 | that you're gonna be able to sell it.
00:25:04.940 | Now, if you didn't do your homework
00:25:07.060 | and you go out there and you sell whatever it is,
00:25:11.100 | and you've raised money or whatever,
00:25:14.580 | just to pivot, you've already shown
00:25:17.420 | that you haven't been able to read the market.
00:25:19.780 | And so it's not that pivots can't work
00:25:22.020 | or never work, they can,
00:25:24.100 | but more often than not, they don't.
00:25:25.540 | You pivot for a reason,
00:25:26.500 | it's because you made a huge mistake.
00:25:28.100 | - Well, I also mean like the micro pivots,
00:25:30.780 | which is like iterative development of the thing.
00:25:32.980 | - Oh, yeah, oh, yeah, that's not a pivot,
00:25:34.300 | yeah, just iterations, yeah.
00:25:36.260 | Entrepreneurship, having any business
00:25:38.220 | is just continuous iteration, continuous.
00:25:41.100 | Your product, your sales pitch, your advertising,
00:25:44.620 | introducing new technology,
00:25:45.860 | how do you use AI or not use AI?
00:25:47.940 | Where do you use it?
00:25:48.780 | What person's the right person?
00:25:50.500 | There's just a million touch points
00:25:53.740 | that you're always reevaluating in real time
00:25:57.140 | that you have to be agile and adapt and change.
00:26:00.580 | - But especially in software,
00:26:02.860 | it feels like business model can evolve really quickly too,
00:26:05.460 | like how are you gonna make money on this?
00:26:06.940 | - Yes, with software for sure,
00:26:08.780 | because anything digital,
00:26:10.820 | because it can change in a millisecond.
00:26:13.060 | - Speaking of which, how did you make your first billion?
00:26:16.540 | - So my partner, Todd Wagner and I
00:26:19.260 | would get together for lunches,
00:26:22.960 | and we were at California Pizza Kitchen
00:26:25.060 | in Preston Hollow in Dallas,
00:26:27.540 | and he was talking about how we could
00:26:32.540 | use this new thing called the internet,
00:26:34.420 | this is late '94, early '95,
00:26:36.900 | to be able to listen to Indiana University basketball games,
00:26:40.020 | 'cause that's where we went to school.
00:26:41.980 | And he was like, look, when we would listen to games,
00:26:45.500 | we would have somebody in Bloomington, Indiana
00:26:47.540 | have a speakerphone next to a radio,
00:26:49.380 | and then we would have a speakerphone in Dallas,
00:26:51.380 | and a six-pack or 12-pack of beer,
00:26:53.520 | and we'd sit around listening to the game,
00:26:55.300 | because there was no other way to listen to it.
00:26:57.460 | So I was like, okay, my first company, MicroSolutions,
00:27:00.260 | I'd written software, done network integration,
00:27:02.980 | and so I was comfortable digging into it.
00:27:05.780 | And so I was like, okay, let's give it a try.
00:27:08.740 | So we started this company called AudioNet,
00:27:11.200 | and effectively became the first
00:27:14.220 | streaming content company on the internet.
00:27:17.220 | And we were like, okay, we're not sure
00:27:20.620 | how we're gonna make this work,
00:27:22.260 | but we were able to make it work,
00:27:23.940 | and we started going to radio stations and TV stations
00:27:26.480 | and music labels and everything,
00:27:29.020 | and evolved AudioNet.com,
00:27:32.060 | which was only audio at the beginning,
00:27:33.820 | to Broadcast.com in 1998, which was audio and video,
00:27:38.820 | and became the largest multimedia site on the internet,
00:27:42.220 | took it public in July of 1998.
00:27:45.700 | It had the largest first-day jump
00:27:47.860 | in the history of the stock market at the time.
00:27:49.900 | And then a year later, we sold it to Yahoo
00:27:52.020 | for $5.7 billion in Yahoo stock,
00:27:55.540 | and I owned right around 30% of the company, give or take.
00:28:00.340 | And so after taxes, that's what got me there.
00:28:02.460 | - Well, there's a lot of questions there.
00:28:03.620 | So the technical challenge of that,
00:28:05.100 | you're making it sound easy, but you wrote code,
00:28:09.660 | but still, in the early days of the internet,
00:28:12.380 | how do you figure out how to create this kind of product
00:28:16.340 | of just audio at first and then video at first?
00:28:18.900 | - A lot of iterations, right, like you talked about.
00:28:22.060 | We started in the second bedroom of my house,
00:28:23.820 | set up a server.
00:28:24.860 | I got an ISDN line, which was a 128K line,
00:28:28.340 | and set up, downloaded Netscape server,
00:28:32.980 | and then started using different file formats
00:28:36.580 | that were progressive loading,
00:28:38.540 | and allowing people to connect to the server
00:28:41.580 | and do a progressive download,
00:28:43.180 | so that the audio, you can listen to the audio
00:28:45.660 | while it was downloading onto your PC.
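
As a rough sketch of the progressive-download idea described here: the server hands the encoded audio back in small chunks, so a client can begin playback while the rest of the file is still arriving. The file name, port, and chunk size are made-up placeholders, and this is only an illustration of the concept, not AudioNet's actual software.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

AUDIO_FILE = "radio_capture.mp3"  # hypothetical encoded radio capture
CHUNK_SIZE = 16 * 1024            # send 16 KB at a time

class ProgressiveDownloadHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond with the audio file as a stream of chunks; a player that
        # buffers can start playing before the transfer is complete.
        self.send_response(200)
        self.send_header("Content-Type", "audio/mpeg")
        self.end_headers()
        with open(AUDIO_FILE, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                self.wfile.write(chunk)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), ProgressiveDownloadHandler).serve_forever()
```
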
00:28:47.380 | - Yeah, was it super choppy?
00:28:48.580 | So you were trying to figure out how to do it.
00:28:49.420 | - Oh, yeah, for sure, for sure.
00:28:50.260 | It would buffer, it was, yeah, it wasn't good,
00:28:52.540 | but it was a start.
00:28:53.380 | - But it was good enough 'cause it's the first kind of--
00:28:55.380 | - Yeah, because there was no other competition, right?
00:28:57.340 | There was nobody else doing it.
00:28:58.700 | And so it was like, okay, I can get access
00:29:00.540 | to this, this, or this.
00:29:01.540 | And then there were some third-party software companies,
00:29:03.860 | Zing and Progressive Networks and others,
00:29:06.780 | that took it a little bit further.
00:29:09.020 | So we partnered with them,
00:29:10.980 | and I started going to local radio stations
00:29:14.180 | where literally we would set up a server right next to it.
00:29:18.740 | I had a $49 radio, the highest FM radio that I could find.
00:29:23.740 | And we take the output of the audio signal from the radio
00:29:27.740 | with these two analog cables, plug it into the server,
00:29:31.300 | encode it, and make it available from audionet.com.
00:29:35.580 | Then I would go on Usenet bulletin boards.
00:29:37.820 | I would go on CompuServe.
00:29:39.300 | I would go on Prodigy.
00:29:40.860 | I would go on AOL.
00:29:42.460 | I'd go wherever I could find bodies.
00:29:44.340 | And I'd say, okay, we've got this radio station,
00:29:47.260 | KLIF in Dallas, it's got Dallas sports
00:29:50.060 | and Dallas news and politics.
00:29:52.980 | And if you're in an office or you're outside of Dallas,
00:29:56.500 | connect to audionet.com.
00:29:58.580 | And now you can listen to these things on demand.
00:30:01.580 | And that's how we started.
00:30:03.060 | And it started with one radio station,
00:30:05.340 | and then it was five, then it was 10,
00:30:06.980 | then it was video content.
00:30:08.380 | Then the laws were different then,
00:30:10.180 | so we could literally go out and buy CDs and host them
00:30:13.820 | and just let people listen to whatever music.
00:30:16.020 | And we went from 10 users a day to 100 to 1,000
00:30:21.060 | to hundreds of thousands to a million
00:30:23.260 | over those next four years.
00:30:25.060 | - How did you find the users?
00:30:26.300 | Is it word of mouth?
00:30:27.140 | - Word of mouth.
00:30:27.980 | - Just word of mouth.
00:30:28.800 | - Didn't spend a penny on advertising.
00:30:29.820 | - So the thing you were focusing on
00:30:31.140 | is getting the radio stations and all that.
00:30:32.500 | - Well, radio and TV, anything, any content at all.
00:30:34.740 | - To pick up the phone, how'd you--
00:30:37.500 | - Wherever I could, everything that was public domain,
00:30:39.820 | I'd go out and buy a video or a cassette,
00:30:42.300 | whatever it was, you know?
00:30:44.220 | And this was before the DMCA,
00:30:45.820 | the Digital Millennium Copyright Act of '98,
00:30:48.980 | whenever it kicked in.
00:30:49.820 | So literally anything that was audio,
00:30:52.460 | we would put online so people could listen to it.
00:30:55.020 | And if you think about somebody at work,
00:30:57.360 | they didn't have a radio most likely,
00:30:58.740 | and if you did, you couldn't get reception.
00:30:59.940 | Definitely didn't have a TV, but you had a PC,
00:31:03.060 | and you had bandwidth available to you,
00:31:05.380 | and the companies weren't up on firewalls
00:31:08.100 | or anything at that point in time.
00:31:09.340 | So our in-office listening during the day
00:31:12.620 | just exploded because whoever's sitting next to you,
00:31:16.020 | what are you listening to, right?
00:31:17.540 | And that was the start of it.
00:31:18.820 | And then in early '98, we started adding video
00:31:23.100 | and just other things,
00:31:24.180 | and we had ended up with thousands of servers.
00:31:26.900 | There was no cloud back then,
00:31:28.980 | and just pulling together all those pieces to make it work.
00:31:31.380 | But where we really made our money
00:31:32.820 | was by taking that network that we had built
00:31:38.260 | and then going to corporations and saying,
00:31:40.320 | look, it's 1996, '97, '98,
00:31:44.900 | and to communicate with your worldwide employees,
00:31:47.980 | what they would do is they would go to an auditorium
00:31:51.780 | that had a satellite uplink,
00:31:53.540 | and then they would have people go to theaters
00:31:56.900 | or ballrooms and hotels that had satellite downlinks,
00:32:00.820 | and they would broadcast
00:32:02.180 | the product introductions, whatever.
00:32:04.020 | And so we said to them,
00:32:05.060 | look, you're paying millions of dollars
00:32:06.380 | to reach all your employees.
00:32:07.540 | when you can just pay us a half a million dollars,
00:32:10.980 | and we'll do it just on their PCs at work.
00:32:13.420 | So we did, when Intel announced the P90 PC,
00:32:17.420 | we charged them $2 million or whatever to do that.
00:32:20.660 | When Motorola announced a new phone or a new product,
00:32:23.140 | we would charge them.
00:32:24.140 | And so we used the consumer side
00:32:25.980 | to do a proof of concept for the network,
00:32:29.500 | and then we would take that knowledge
00:32:32.380 | and go to corporations,
00:32:33.460 | and that's how we made our revenue.
00:32:34.860 | - And there was some selling there with the corporations.
00:32:36.860 | - Yeah, a lot of selling there,
00:32:37.700 | but we were saving them so much money,
00:32:39.380 | and they were technology companies.
00:32:40.740 | They wanted to be perceived as being leading edge,
00:32:43.300 | and so it was win-win.
00:32:44.940 | - How much technical savvy was required?
00:32:47.740 | You said a bunch of servers.
00:32:49.260 | Like at which point do you get more engineers?
00:32:51.220 | How much did you understand and could do yourself?
00:32:54.540 | And then also, once you can't do it all yourself,
00:32:58.860 | how much technical savvy is required
00:33:00.500 | to understand enough to hire the right people
00:33:02.080 | to keep building this innovation?
00:33:03.500 | - I did all the technology,
00:33:04.820 | and then we hired engineer after engineer
00:33:07.100 | after engineer to implement it.
00:33:09.060 | And so-- - Wow.
00:33:10.340 | - Yeah, from putting together a multicast network
00:33:13.500 | to software to just all these different things.
00:33:17.700 | - Was this like a scary thing?
00:33:18.900 | - It was terrifying, right?
00:33:20.680 | Because as we were growing, trying to keep up the scale,
00:33:23.540 | and literally we're buying off-the-shelf PCs,
00:33:26.420 | and then server cards as the technology advanced,
00:33:29.820 | and hard drives, and things would fail,
00:33:32.100 | and we would have to, you know,
00:33:33.980 | we didn't have machine learning back then
00:33:35.460 | to do an analysis of how to distribute server resources.
00:33:40.160 | So, like there was a time when Bill Clinton
00:33:45.160 | and all the Monica Lewinsky stuff happened,
00:33:48.300 | they released the audio of their interviews of him
00:33:53.140 | or something like that, right?
00:33:54.700 | And we literally, I knew at that point in time
00:33:58.220 | when that was released, everybody at work
00:34:00.140 | was gonna wanna listen to it, right?
00:34:01.860 | So, we had to take down servers
00:34:03.500 | that were doing Chicago Cubs baseball, right?
00:34:06.820 | You know, and just make all these on-the-fly decisions
00:34:09.020 | because there was no, we didn't have the tools
00:34:10.780 | to analyze or be predictive.
00:34:12.700 | But yeah, it was all technology-driven and marketing.
00:34:16.380 | - The acquisition by Yahoo, can you tell the story of that?
00:34:20.220 | But also in the broader context of this internet bubble.
00:34:24.300 | This is a fascinating part of human history.
00:34:27.620 | - So, on the acquisition side,
00:34:30.020 | we were the largest media site on the internet
00:34:31.860 | and it wasn't close, there was nobody close.
00:34:34.140 | We were YouTube and relatively speaking,
00:34:36.380 | we would be 10X YouTube relative to the competition
00:34:39.420 | 'cause there was nobody there.
00:34:40.860 | And so, it became obvious to Yahoo, AOL and others
00:34:45.300 | that they needed a multimedia component.
00:34:47.500 | And we had the infrastructure, sales, all that stuff.
00:34:50.820 | And so, Yahoo, when we went public in '98,
00:34:55.820 | or right before I think it was,
00:34:58.140 | they made an investment of like $2 million
00:35:00.780 | which gave us a connection to them.
00:35:02.980 | And then after we went public,
00:35:05.460 | they decided they needed to have multimedia.
00:35:07.820 | And so, in April of '99, we made a deal
00:35:12.180 | and then July of '99 is when it closed.
00:35:16.260 | - And can you explain to me the trickiness
00:35:21.220 | of what you did after that?
00:35:23.020 | - Oh, the collar?
00:35:25.220 | - Yeah.
00:35:26.060 | - So, when we sold to Yahoo,
00:35:27.820 | we sold for $5.7 billion in stock, not cash.
00:35:31.500 | And so, I looked at, after MicroSolutions,
00:35:36.500 | when I sold that, I took that money
00:35:40.540 | and initially, I told my broker,
00:35:42.100 | I wanted to invest like a 60-year-old man
00:35:44.060 | 'cause I wanted to protect it.
00:35:45.900 | But then he started asking me all kinds of questions
00:35:48.380 | about all these technologies that I understood
00:35:51.340 | like networks I had installed.
00:35:53.120 | We had become one of the top 20,
00:35:56.400 | let's say, systems integrators in the country.
00:35:59.480 | At one point in time,
00:36:00.320 | we were the largest IBM token ring installer in the country.
00:36:03.660 | It was crazy, right?
00:36:07.520 | Banyan, a blast from the past.
00:36:07.520 | I mean, so anyway, so these Wall Street bankers
00:36:10.720 | or analysts rather that were the big analysts of the time
00:36:14.300 | would call me up 'cause they would ask my broker,
00:36:17.040 | what does he know about this product, this product?
00:36:18.560 | And I knew 'em all, what was working and not working, right?
00:36:21.240 | And so the ones that work, I say that it's working.
00:36:24.840 | They say something, the stock would go up 20 bucks, right?
00:36:27.200 | So I'm like, well, and my broker's like,
00:36:29.320 | you need to, you know this better than they do.
00:36:31.120 | You need to invest.
00:36:31.940 | So I started buying and selling stocks
00:36:34.160 | and this was in 1990 and was just killing it.
00:36:37.800 | I was making 80, 90, 100% a year over those next four years
00:36:42.440 | to the point where a guy came in
00:36:44.080 | and asked to use my trading history to start a hedge fund,
00:36:49.000 | which we did and I sold within nine months.
00:36:51.360 | It was great, right?
00:36:52.720 | But the point being as it goes forward,
00:36:54.180 | so when we sold to Yahoo,
00:36:56.760 | I already had a lot of experience trading stocks
00:36:59.400 | and I had seen different bubbles come and go,
00:37:02.520 | a bubble for PC manufacturers,
00:37:04.680 | a bubble for networking manufacturers.
00:37:07.200 | They went up, up, up, up, up,
00:37:08.440 | and then they came straight down after the hype
00:37:11.200 | or somebody just leapfrogged.
00:37:14.000 | And so when we sold to Yahoo, I was like,
00:37:17.680 | I've got a B next to my name.
00:37:19.480 | That's all I need or all I want.
00:37:21.160 | I don't wanna be greedy.
00:37:22.240 | And I'd seen this story before
00:37:23.740 | where stocks get really frothy and go straight down.
00:37:27.480 | And I knew that because all of what I had was in stock,
00:37:30.880 | I needed to find a way to collar it and protect it.
00:37:33.400 | So understanding stocks and trading and options
00:37:36.760 | and all that, my broker and I,
00:37:38.640 | we went and shorted an index that had Yahoo in it.
00:37:42.040 | And so the law at the time was you couldn't short
00:37:45.160 | any indexes that had more than 5% of any one stock in it,
00:37:48.800 | like the Yahoo stock.
00:37:51.880 | And so I took pretty much 20 some million dollars,
00:37:55.600 | everything I had at the time, and I shorted the index.
00:37:58.320 | - This is fascinating, by the way,
00:37:59.680 | 'cause it's based on your estimation that this is a bubble.
00:38:02.640 | - Or just my not wanting to be greedy.
00:38:04.840 | - Sure, so the foundation of this kind of thing
00:38:07.120 | is you don't wanna be greedy.
00:38:09.160 | - Yeah, I mean, how much money do I need, right?
00:38:12.080 | Where other people were saying,
00:38:13.120 | oh, I think it can go up higher, higher, higher.
00:38:14.720 | I went on CNBC, and I told 'em what I had done,
00:38:19.720 | and they were like, and Yahoo stock had gone up
00:38:23.360 | significantly from the time I had collared it.
00:38:26.200 | And one of the guys, Joe Kernen, was on there.
00:38:28.200 | Don't you feel stupid now that Yahoo stock has gone up
00:38:31.960 | X percent more?
00:38:32.780 | I'm like, yeah, I feel real stupid sitting on my jet.
00:38:35.080 | (laughing)
00:38:36.480 | - But, so, I mean, there is some fundamental way
00:38:39.960 | in which bubbles are based on this greed.
00:38:42.200 | - Oh, for sure, for sure, yeah.
00:38:43.600 | - And I'd seen it before, right, like I just said.
00:38:45.760 | And so what I did was we put together a collar
00:38:47.940 | where I sold calls and bought puts.
00:38:50.080 | And as it turned out, when the market just cratered,
00:38:53.360 | I was protected.
00:38:54.640 | And over the next two, three years, whatever it was,
00:38:58.360 | it converted to cash, paid my taxes, et cetera.
00:39:01.620 | But it protected me, and as it turns out,
00:39:05.120 | it was called one of the top 10 trades of all time.
00:39:07.640 | And what was even more interesting out of that period,
00:39:11.080 | my broker at that time was at Goldman Sachs,
00:39:12.920 | and I had asked him to see if there was a way
00:39:16.720 | to trade the VIX, right, the volatility index.
00:39:19.800 | And there wasn't, right?
00:39:21.840 | And so one of the people at Goldman
00:39:24.920 | that we were working with to try to create this
00:39:27.440 | actually left Goldman and created indexes
00:39:29.920 | that allowed you to trade the VIX.
00:39:32.720 | - Well, it's not trivial to understand that it's a bubble.
00:39:35.560 | I mean, you're kind of lessening your insight
00:39:39.240 | into all this by saying you just didn't wanna be greedy.
00:39:41.760 | But you still have to see that it's a bubble.
00:39:43.640 | - Yeah, I mean, yeah, obviously,
00:39:44.960 | if I thought it was gonna keep on going up
00:39:46.440 | and there was intrinsic value there,
00:39:48.360 | I would have stayed in it.
00:39:49.320 | But it wasn't so much Yahoo, it was just the entire industry.
00:39:53.320 | Back then, like we're looking at the Magnificent Seven
00:39:56.600 | or whatever it is, stocks now,
00:39:58.040 | and people are asking, is it a bubble?
00:40:00.240 | And when I would get into cabs,
00:40:03.320 | and people would just start talking about internet stocks.
00:40:06.060 | There were people creating companies
00:40:07.560 | with just a website and going public.
00:40:09.680 | You know, that's a bubble, right?
00:40:11.120 | Where there's no intrinsic value at all.
00:40:13.480 | And people aren't even trying to make operating profits.
00:40:16.200 | They're just trying to leverage the frothiness
00:40:17.960 | of the stock market.
00:40:19.200 | That's a bubble.
00:40:20.160 | You don't see that right now.
00:40:21.160 | There's not companies, you don't see hardly,
00:40:23.080 | you don't see any IPOs right now for that matter.
00:40:25.440 | So, you know, I don't think we're in a bubble now,
00:40:27.560 | but back then, yes, I thought we were in a bubble,
00:40:29.760 | but that wasn't really the motivating factor.
00:40:32.000 | - Do you think it's possible we're in a bit
00:40:33.480 | of an AI bubble right now?
00:40:35.880 | - No, because we're not seeing funky AI companies
00:40:39.140 | just go public.
00:40:40.040 | If all of a sudden, we see a rush of companies
00:40:42.600 | who are skins on other people's models,
00:40:45.000 | or just creating models to create models
00:40:48.120 | that are going public, then yeah,
00:40:50.020 | that's probably the start of a bubble.
00:40:51.920 | But that said, my 14-year-old was bragging
00:40:55.800 | about buying NVIDIA, you know, with me
00:40:58.000 | in his Robinhood account.
00:40:59.560 | He tells me the order and I placed it,
00:41:01.140 | and he was like, "Oh yeah, it's going up, up, up."
00:41:03.840 | And I'm like, yeah, we're not quite there yet,
00:41:06.200 | but that's one thing to pay attention to.
00:41:08.680 | - Yeah, we're flirting with it.
00:41:09.680 | - Yeah.
00:41:11.120 | - You said that becoming a billionaire requires luck.
00:41:14.160 | - Yeah. - Can you explain?
00:41:15.400 | - Yeah, I mean, there's no business plan
00:41:17.880 | where you can just start it and say,
00:41:19.280 | yeah, I'm definitely going to be a billionaire.
00:41:21.640 | You can, you know, if I had to start all over,
00:41:24.160 | could I start a company that made me a millionaire?
00:41:25.840 | Yeah, 'cause I know how to sell, and I know technology,
00:41:27.880 | and I've learned enough over the years to do that.
00:41:30.960 | Could I make 10 million?
00:41:31.920 | Probably, 100 million?
00:41:33.080 | I hope so.
00:41:35.360 | But a billion, just something good has got to happen.
00:41:38.820 | You know? - The timing.
00:41:40.360 | - Timing, you know, the internet stock market
00:41:42.240 | was going nuts right when we started, you know?
00:41:44.840 | And that certainly, I couldn't predict or control.
00:41:48.520 | You know, it's like AI right now.
00:41:52.640 | AI's been around a long, long, long, long time.
00:41:55.800 | And the NVIDIA processors, or GPUs rather,
00:42:00.800 | you couldn't predict that now's the time
00:42:02.680 | that they were going to get to that cost-effectiveness
00:42:04.860 | where, you know, you could create models and train them,
00:42:08.840 | and although it's expensive, it's still doable.
00:42:12.040 | You know, we didn't really even,
00:42:13.440 | we had ASICs, right, for custom applications,
00:42:15.880 | and we had CPUs that were leading the way,
00:42:18.640 | but GPUs were more for gaming and then crypto mining.
00:42:22.360 | And now, then all of a sudden,
00:42:23.840 | they were the foundation for AI models.
00:42:27.000 | - So I think luck being essential to becoming a billionaire
00:42:32.200 | is a beautiful way to see life in general.
00:42:34.980 | First of all, I personally think that everything good
00:42:37.660 | that's ever happened to me is because of luck.
00:42:40.020 | I think that's just a good way of being.
00:42:41.700 | It's like you're grateful.
00:42:43.780 | That said, there's some examples of people
00:42:46.180 | that you're like, they seem to have done a lot of,
00:42:49.380 | they seem to have gotten lucky a lot.
00:42:51.500 | You know, we mentioned Jeff Bezos.
00:42:53.180 | It seems like he did a lot of really interesting,
00:42:57.580 | powerful decisions for many years
00:42:59.860 | with Amazon to make it successful.
00:43:01.660 | But he was really able to raise money, right?
00:43:04.640 | A lot of money.
00:43:05.760 | And people were really dismissive of him
00:43:08.080 | because they weren't making, they weren't profitable.
00:43:11.800 | And we were in an environment
00:43:13.960 | where it was possible to raise all that money.
00:43:15.460 | - It was possible to raise that money.
00:43:16.680 | I mean, what about somebody you get sometimes feisty with
00:43:20.720 | on the internet, Elon?
00:43:21.840 | But we couldn't even look at Zuck and Bill Gates
00:43:24.440 | and Warren Buffett.
00:43:25.520 | - Look, Zuck was just trying to get laid, right?
00:43:27.880 | And it took off, and he raised some good-
00:43:29.240 | - Aren't we all?
00:43:30.080 | - Right, it's that level, right?
00:43:30.920 | - That's the foundation of human civilization.
00:43:32.840 | - But yeah, so more power to him, right?
00:43:35.120 | You can't take anything away from him.
00:43:36.960 | But yeah, Snapchat, same thing, took off.
00:43:40.460 | Apps didn't take off in 2007 when the iPhone came out.
00:43:43.200 | Apps took off in 2011, 2012.
00:43:45.820 | And if you were there with the right app at the right time.
00:43:48.360 | And even Facebook, you know, in 2004,
00:43:52.640 | the bubble had burst and, you know,
00:43:55.360 | the price for computers had fallen enough
00:43:58.060 | and kids in school all needed computers or laptops.
00:44:00.640 | If he had tried to do something like that, you know,
00:44:03.880 | five years earlier, I mean, he was too young,
00:44:05.400 | but, you know, five years earlier or five years later,
00:44:08.280 | you know, or Friendster might have been the ultimate,
00:44:11.200 | or MySpace.
00:44:12.480 | - Friendster, I remember Friendster.
00:44:14.240 | - Or MySpace, I had a MySpace account
00:44:15.720 | and that was before Facebook.
00:44:17.880 | - Yeah, the timing's important,
00:44:19.640 | but there's like the details of how the product is built,
00:44:22.360 | the fundamentals of the product, like what-
00:44:24.960 | - But that's what gets you,
00:44:25.800 | when the opportunity is there, right?
00:44:28.080 | That's what allows you to take advantage
00:44:29.560 | of that opportunity and the kismet of it all, right?
00:44:32.440 | You've gotta be, 'cause it wasn't like
00:44:34.000 | any of the people I mentioned,
00:44:35.600 | there weren't others trying the same thing, right?
00:44:38.160 | You had to be able to see it.
00:44:39.680 | You had to be able to visualize it
00:44:41.800 | and put together a plan of some sort,
00:44:43.800 | or at least have a path.
00:44:45.360 | And then you had to execute on it
00:44:47.280 | and do all those things at the same time
00:44:49.760 | and have the money available to you.
00:44:51.600 | Because it wasn't like, whether it was Google or Facebook,
00:44:55.240 | you know, they raised a shitload of money.
00:44:57.040 | It wasn't bootstrapping it that got them there.
00:44:59.220 | - And raising money is not just about sales,
00:45:01.520 | it's about the general feeling
00:45:05.340 | of the people with money at that time.
00:45:07.480 | - And proximity.
00:45:08.400 | - Oh yeah, sure.
00:45:10.400 | - If Zuck wasn't at Harvard
00:45:11.300 | and he was at Miami of Ohio University,
00:45:13.880 | or he was at Richland Community College,
00:45:16.720 | same idea, same person, same execution and nothing.
00:45:20.160 | - I believe in the power of individuals
00:45:24.320 | to find their, to realize their potential
00:45:27.760 | no matter where they come from.
00:45:29.300 | - I agree 100% with that, right?
00:45:30.980 | - But luck is required.
00:45:31.900 | - Yeah, I mean, scale is, the only delta is scale.
00:45:34.500 | We're not all blessed with the access to the tools
00:45:38.900 | that you need to hit that grand slam.
00:45:42.020 | - But then also, a billion is not
00:45:43.340 | the only measure of success, right?
00:45:45.180 | - Absolutely not, right?
00:45:46.980 | Everybody defines the success in their own way.
00:45:49.180 | - How do you define success, Mark Cuban?
00:45:51.900 | - Waking up every day with a smile,
00:45:53.580 | excited about the day.
00:45:56.620 | People always say, well, when you get that kind of money,
00:45:58.680 | does it make you happy?
00:46:00.040 | And my answer always is,
00:46:02.480 | if you are happy when you were broke,
00:46:04.320 | you're gonna be really, really, really happy
00:46:06.120 | when you're rich.
00:46:06.960 | (laughing)
00:46:08.080 | - But you gotta work on being happy
00:46:10.040 | when you're broke, I guess.
00:46:11.100 | - Well, you're just being happy, right?
00:46:12.480 | If you were miserable in your job before,
00:46:15.720 | there's a good chance you're still gonna be miserable
00:46:17.480 | if that's just who you are.
00:46:19.200 | - That's a pretty good definition of success, by the way.
00:46:21.400 | - Thank you.
00:46:22.240 | - How do you reach that success
00:46:24.100 | by way of advice to people?
00:46:26.780 | - You know, we talked about my dad, my parents.
00:46:29.300 | I never looked at my dad and said,
00:46:32.540 | okay, you're not successful.
00:46:34.540 | He busted his ass, and when he came home,
00:46:36.900 | we enjoyed our time together, right?
00:46:43.340 | There was nothing at any point in time
00:46:45.340 | where I felt like, oh, this is miserable,
00:46:47.820 | we're awful, we don't have this, we don't have that.
00:46:51.060 | You know, we celebrated the things we did have,
00:46:53.420 | and never knew about the things we didn't have, you know?
00:46:57.220 | And so I think, you know, you have to be able
00:47:01.300 | to find your way to whatever it is
00:47:04.060 | that puts a smile on your face every day.
00:47:05.700 | Some people can do it, and some people can't.
00:47:07.500 | - It's not always about the smile,
00:47:08.940 | or the smile on the outside,
00:47:10.300 | it could be a smile on the inside.
00:47:11.460 | - Yeah, whatever it is, right?
00:47:12.300 | Whatever makes you feel good.
00:47:13.500 | - The struggle, even the struggle, like with your dad,
00:47:16.420 | the really, really hard work can be a fulfilling experience,
00:47:22.380 | because the struggle leading up to then seeing your kids.
00:47:27.380 | - Exactly right, right?
00:47:28.740 | Because that was my dad's grand slam, right?
00:47:31.700 | Seeing three kids go to college, be successful,
00:47:34.860 | be able to spend time with them.
00:47:37.580 | And that was the other thing, you know,
00:47:39.460 | he really made me realize is the most valuable asset
00:47:42.780 | isn't the money, it's your time.
00:47:45.140 | That's why, you know, from a young age,
00:47:46.580 | I wanted to retire, because I wanted to experience
00:47:49.560 | everything that I possibly could in this life.
00:47:51.820 | And, you know, he got joy from us, I get joy from my kids.
00:47:55.440 | And that's the most special thing you ever can have.
00:48:00.860 | - Beautifully said.
00:48:02.020 | You have made some mistakes in your life.
00:48:05.560 | - Yeah, a lot of them.
00:48:07.380 | - One of the bigger ones on the financial side,
00:48:10.060 | we could say is Uber.
00:48:12.260 | - Yeah, we call that not doing something.
00:48:13.900 | Yeah, it wasn't a mistake, it was just,
00:48:15.820 | I mean, it was a mistake.
00:48:16.900 | (laughing)
00:48:18.060 | - I like how you tried to--
00:48:20.020 | - You know, I always try to look at mistakes,
00:48:21.980 | the things you did that didn't turn out
00:48:23.380 | as opposed to things you didn't do, you know, the negative.
00:48:27.220 | - But can you tell the story of that?
00:48:28.660 | And maybe it's just interesting,
00:48:29.900 | 'cause it is illustrative of like how to know
00:48:33.460 | when a thing is going to be big and not,
00:48:35.740 | and what are the fundamentals of it,
00:48:37.460 | and how to take the risk and not,
00:48:39.180 | and all this kind of stuff.
00:48:40.420 | - So the backstory of that is Bill Gurley came to me
00:48:43.020 | and said, "Mark, there's this guy, Travis,
00:48:46.040 | "that has this company, Red Swoosh,
00:48:48.500 | "which is a peer-to-peer networking company
00:48:50.780 | "that I think you can help."
00:48:53.660 | And so I invested and spent a lot of time with Travis.
00:48:57.820 | And it's funny, 'cause back then, that was like 2006,
00:49:00.720 | I was an investor at Box.net with Aaron Levie,
00:49:05.060 | and oh, there was one other company,
00:49:08.060 | but there were three of 'em where there'd be emails
00:49:10.220 | between, you know, where I'd introduce 'em,
00:49:11.900 | and we'd all talk in these emails,
00:49:13.580 | and they'd all gone to have astronomical success, right?
00:49:18.920 | But so Red Swoosh had its issues, you know,
00:49:22.120 | 'cause I was looking at peer-to-peer
00:49:23.760 | as kind of stealing bandwidth from the internet providers
00:49:27.280 | when bandwidth was a scarce commodity.
00:49:30.040 | And so, you know, what Travis did with that, though,
00:49:33.640 | was great, you know, he convinced gaming companies
00:49:36.160 | who wanted to do downloads of the clients for those games
00:49:39.960 | to use his peer-to-peer on Red Swoosh.
00:49:41.840 | And, you know, he busted his ass,
00:49:44.160 | and I think he sold it for $18 million.
00:49:46.200 | So he did well.
00:49:47.320 | And so it was natural for him to come to me,
00:49:49.680 | and I still have the emails, you know,
00:49:51.560 | and asked me about Uber Cab.
00:49:54.280 | And I thought, okay, this is a great idea.
00:49:56.480 | I really, really like it.
00:49:58.200 | I said, you're gonna, and he showed me his budgets,
00:50:01.360 | and I think they were raising money
00:50:03.320 | at 10 or $15 million or whatever.
00:50:06.280 | And I'm like, your biggest challenge is gonna be
00:50:08.980 | you're gonna have to fight
00:50:09.920 | all the incumbent taxi commissions.
00:50:11.840 | They're gonna wanna put you out of business.
00:50:13.540 | That's gonna be a challenge,
00:50:14.600 | and I think you don't have enough money
00:50:16.640 | designated for marketing to get all that done.
00:50:19.520 | And I said, I'd invest,
00:50:21.080 | but not quite at that valuation, right?
00:50:23.720 | Never came back to me. (laughs)
00:50:26.680 | - Yeah, I mean, there's some lessons there
00:50:28.680 | connected to what you're doing now.
00:50:30.400 | We'll talk about it, Cost Plus Drugs.
00:50:32.020 | It's like looking at an industry
00:50:34.080 | that seems like there's a lot of complexity involved,
00:50:38.800 | but it's like hungry for revolution.
00:50:41.400 | - For sure. - And the cabs are that.
00:50:43.520 | - Yeah, for sure, right?
00:50:45.000 | They were dominated by an insulated few.
00:50:47.960 | They were not very transparent.
00:50:49.320 | You didn't know the intricacies.
00:50:50.840 | They were very politically driven,
00:50:53.800 | and old boy, incestuous network.
00:50:56.140 | And like I told him, Travis,
00:50:59.320 | the best thing about you is you'll run through walls
00:51:01.560 | and break down barriers.
00:51:02.560 | The bad thing about you is you'll run through walls
00:51:05.280 | even if you don't have to.
00:51:07.320 | - Yeah, and there you kind of have to see,
00:51:09.600 | is it possible to raise enough money?
00:51:11.280 | Is it possible to do all this?
00:51:12.680 | Is it possible to break through?
00:51:15.040 | And it's kind of a fascinating success story with Uber is.
00:51:18.880 | - I think he tried to go too big.
00:51:20.360 | He had too big an ambition, which cost him in the end,
00:51:24.240 | not financially and personally,
00:51:25.480 | but just in terms of being able to stick it out with them.
00:51:28.380 | But that's what makes him a great entrepreneur.
00:51:32.000 | - Well, it's a fascinating success story.
00:51:34.000 | You have certain companies like Airbnb
00:51:37.160 | just kind of go into this thing
00:51:39.260 | that we take completely for granted.
00:51:41.440 | - And change it all.
00:51:42.280 | - Just change it all.
00:51:43.120 | - Yeah, yeah, Belinda Johnson,
00:51:44.360 | who worked as our general counsel at Broadcast.com,
00:51:47.360 | was Brian's GC and chief operating officer.
00:51:51.440 | So yeah, they had a smart, smart, smart, smart team.
00:51:55.120 | - And they believed in it.
00:51:56.440 | I mean, it's just, it's a beautiful story
00:51:59.480 | 'cause you're like, all right,
00:52:00.440 | all the things that annoy you about this world,
00:52:02.480 | like they're inefficient and just seem like--
00:52:04.840 | - And see, I probably would have said no,
00:52:06.560 | like a lot of people did to Airbnb
00:52:08.360 | because I'm like, I don't want people
00:52:10.400 | sleeping in my bed.
00:52:11.960 | - I would have too.
00:52:12.800 | I was like, this is not gonna work.
00:52:14.360 | I've done like couch surfing and stuff
00:52:15.840 | and it was always, it didn't seem right.
00:52:18.600 | It didn't seem like you could do this at a large scale.
00:52:20.840 | - Monetize it, yeah, but he did, more power to him.
00:52:24.260 | - In 2000, I think January,
00:52:26.720 | you purchased the majority stake
00:52:28.200 | in the NBA team Dallas Mavericks for 285 million.
00:52:33.200 | So at this point, maybe you can correct me,
00:52:38.160 | but it was one of the worst performing teams
00:52:40.000 | in franchise history.
00:52:41.480 | - True.
00:52:42.480 | - How did you help turn it around?
00:52:46.240 | - I had this big tall guy named Dirk Nowitzki
00:52:48.480 | and I let him be Dirk Nowitzki, right?
00:52:50.640 | And I got out of the way.
00:52:52.680 | But I think more than anything else,
00:52:54.480 | there was the turnaround on the business side
00:52:56.720 | and then there was the turnaround on the basketball side.
00:52:59.880 | And on the basketball side,
00:53:00.920 | I just went in there immediately said,
00:53:02.280 | whatever it takes to win, that's what we're going to do.
00:53:04.720 | Back then, they had three or four coaches
00:53:08.360 | that were responsible for everything.
00:53:10.200 | And I was like, okay, we spend more money
00:53:12.600 | training people on PC software
00:53:15.400 | than we do developing the most important assets
00:53:18.040 | of the business.
00:53:18.880 | So I made the decision to go out there
00:53:20.920 | and hire like 15 different development coaches,
00:53:24.120 | one for each player.
00:53:25.600 | And everybody thought I was just insane,
00:53:28.120 | but it sent the message that we were going to do
00:53:32.240 | whatever it took to win.
00:53:33.840 | And once the guys believe that winning was the goal
00:53:39.800 | as opposed to just making money, attitudes change,
00:53:42.680 | effort went up and the rest is history.
00:53:45.400 | - So the assets of the business here are the--
00:53:47.520 | - The players. - The players.
00:53:48.560 | - Yeah, for sure.
00:53:49.560 | And then on the business side,
00:53:51.160 | the first question I asked myself is,
00:53:54.880 | what business are we in?
00:53:56.160 | And I really didn't know the answer immediately,
00:53:58.720 | but within the first few months,
00:54:00.840 | it was obvious that the entire NBA
00:54:03.880 | thought we were in the business of basketball.
00:54:06.040 | We were not, we were in the experience business.
00:54:08.440 | When you think about sporting events that you've been to,
00:54:11.320 | you don't remember the score,
00:54:12.560 | you don't remember the home runs or the dunks,
00:54:14.440 | you remember who you're with.
00:54:15.760 | And you remember why you went.
00:54:17.080 | Oh, it was my first date with a girl who's now my wife,
00:54:19.480 | or I went with my buddies
00:54:21.200 | and he threw up on the person in front of us.
00:54:23.720 | You know, my dad took me, my aunt, my uncle took me.
00:54:25.800 | Those are the experiences you remember.
00:54:27.800 | And once I conveyed to our people
00:54:31.640 | that this is what we were selling,
00:54:32.880 | that what happened in the arena off the court
00:54:35.840 | was just as important as what happened on the court,
00:54:37.880 | if not more so.
00:54:39.120 | Because if mom or dad are bringing the 10-year-old,
00:54:41.600 | you have to keep them occupied
00:54:43.080 | because they have short attention spans.
00:54:44.520 | And so I would get into fights with the NBA,
00:54:47.680 | put aside the refs,
00:54:48.800 | but getting into fights in the NBA,
00:54:50.560 | I would say NBA, nothing but attorneys, right?
00:54:53.120 | Because they had no marketing skills whatsoever.
00:54:55.240 | And to their credit, they realized that was a problem
00:54:58.200 | and started bringing in better and better
00:54:59.440 | and better marketing people.
00:55:00.560 | - So part of the selling is you're selling the team,
00:55:03.360 | selling the sport, selling the people,
00:55:07.320 | the idea, all of it, like just the--
00:55:09.200 | - Well, yeah, the experience.
00:55:10.040 | So have you ever been to an NBA game?
00:55:12.280 | - Miami Heat.
00:55:13.120 | - Do you remember walking into the arena
00:55:14.960 | and you feel the energy, right?
00:55:17.160 | That's what makes it special.
00:55:18.480 | - Yeah, the energy is everything.
00:55:19.800 | Especially playoff games.
00:55:20.840 | - Right, for sure, right?
00:55:22.000 | And even a regular season game, right?
00:55:23.480 | Even against the worst team,
00:55:25.280 | you know, that's where we get,
00:55:27.440 | because the tickets tend to be a little bit cheaper
00:55:29.640 | on the resale market,
00:55:30.800 | that's where parents will bring their kids.
00:55:32.320 | And so you hear kids screaming the entire game.
00:55:34.880 | And the parents are thrilled to death, right?
00:55:36.880 | They got to do something with your kids.
00:55:38.440 | The kids are thrilled to death
00:55:39.680 | because they got to see basketball, an NBA game,
00:55:42.240 | and scream at the top of their lungs.
00:55:44.240 | And you know, if it turns out to be a close game
00:55:46.400 | and that ball is in the air,
00:55:47.720 | and if it goes in, you know,
00:55:49.800 | everybody's hugging and high-fiving people
00:55:51.400 | you've never seen before in your life.
00:55:53.040 | And if it misses,
00:55:53.880 | you're commiserating with people you've never seen before.
00:55:55.960 | That's such a unique experience that's unique to sports.
00:55:59.200 | And we never sold that.
00:56:00.320 | And that's exactly what we started selling.
00:56:01.760 | - I have to say, like, just going to that game
00:56:03.280 | turned me around on basketball
00:56:04.560 | 'cause I'm more of a football guy.
00:56:05.920 | So basketball wasn't like the main sport.
00:56:08.320 | And I was like, oh, wow, okay.
00:56:09.600 | - It's fun, and it's different, right?
00:56:11.600 | Yeah, the energy in a stadium is completely different
00:56:14.680 | than the energy in an arena.
00:56:16.080 | You know, in the stadium,
00:56:17.960 | particularly if it doesn't have a roof,
00:56:19.240 | it's hard to bottle that energy.
00:56:21.200 | You feel it and you see, like, I'm from Pittsburgh,
00:56:23.360 | so there's the terrible towels
00:56:24.640 | and people screaming defense and everything
00:56:26.800 | at Steelers games.
00:56:27.640 | But in an arena, the energy level is just indescribable.
00:56:32.640 | - So how much of it is the selling the tickets in person,
00:56:35.160 | but also versus what you see on TV?
00:56:39.680 | So when you're owning a team,
00:56:41.320 | do you get any of the cut for what's shown on TV?
00:56:43.960 | - Yeah, yeah, yeah.
00:56:44.920 | So there's a TV deal that's done
00:56:47.080 | with either a local TV broadcaster,
00:56:49.400 | and we get all of that,
00:56:50.800 | or a network broadcaster like ABC, ESPN, TNT, whatever.
00:56:55.800 | And then we get 1/30 of that.
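
As a toy illustration of the split described here, where a team keeps its local broadcast deal in full and a league-wide national deal is divided evenly among the 30 teams; the dollar figures below are made up.

```python
# Toy illustration of the TV revenue split described above (made-up numbers).
local_tv_deal = 40_000_000          # a team keeps its own local broadcast deal in full
national_tv_deal = 2_700_000_000    # a league-wide national deal is split evenly, 1/30 per team

team_tv_revenue = local_tv_deal + national_tv_deal / 30
print(f"Team TV revenue: ${team_tv_revenue:,.0f}")  # $130,000,000
```
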
00:56:58.000 | - So what role does the TV play in, like,
00:57:00.880 | turning a team around? - It keeps fans connected.
00:57:02.760 | Look, when the team is doing really well, it's easy, right?
00:57:05.480 | There's more viewers, everybody's more excited.
00:57:07.720 | And when you're not, there's still gonna be hardcore fans
00:57:11.280 | and general fans and kids that like to watch the game.
00:57:15.800 | - What about, like, the personality
00:57:17.080 | of the people in the stands?
00:57:19.120 | Like, I mean, clearly, you're part of the legend
00:57:23.040 | of the team because you're literally there going wild.
00:57:26.680 | - Yeah, screaming, yeah, the whole game, right?
00:57:28.800 | Yeah, it's funny.
00:57:30.560 | The way I am here is how I am 24 hours a day
00:57:33.400 | unless there's a Mavs game, you know?
00:57:35.440 | And for whatever reason, that's where I let out
00:57:37.480 | all that stress and frustration.
00:57:39.240 | But yeah, I mean, it's not so, the fans,
00:57:42.560 | you know, the sixth man, right?
00:57:44.000 | We need fans to bring that energy.
00:57:45.840 | And amplifying that as much as we can is important.
00:57:49.580 | - You've had a beef recently on Twitter,
00:57:52.760 | on X with Elon over DEI programs.
00:57:56.060 | What to you is the essence of the disagreement there?
00:57:59.720 | - I wouldn't call it a beef, right?
00:58:01.360 | It's just-- - It's a bit of fun.
00:58:03.200 | - Yeah, it's fun for me, right?
00:58:05.040 | I just, you know, it's his platform.
00:58:11.400 | He gets to run it any way he pleases.
00:58:12.800 | He pays for that, right?
00:58:14.280 | And so I have total respect for whatever choices he makes,
00:58:17.680 | even if I don't agree with him.
00:58:19.760 | But because it's his platform,
00:58:22.720 | people are less likely to disagree with him,
00:58:27.920 | particularly somebody who's got a platform themselves.
00:58:32.600 | And so when we start talking about DEI
00:58:35.240 | and it's just de facto racist and this stuff,
00:58:37.960 | stuff that I just think is nonsense,
00:58:40.400 | I have no problem, you know, sharing my opinion.
00:58:43.100 | And, you know, if he disagrees, okay, he can disagree.
00:58:48.920 | I don't care, you know?
00:58:50.160 | And it's fun to engage, but he doesn't really engage.
00:58:54.160 | You know, he just comes back with snark comments,
00:58:56.080 | which is, you know, his choice.
00:58:57.880 | - Yeah, in your comments, well, you do a bit of snark too,
00:59:00.760 | but- - Yeah, a little bit.
00:59:02.820 | - But you're pretty, let's say, rigorous in your response.
00:59:09.040 | So there is some exchange of ideas.
00:59:10.840 | There's some snark, there's some fun,
00:59:12.600 | all that kind of stuff.
00:59:13.440 | And you do voice the opinion
00:59:15.560 | that represents a large number of people, and it's great.
00:59:18.400 | I mean, that's what's, it's really beautiful.
00:59:20.800 | But just lingering on the topic,
00:59:24.640 | what to you is the good and the bad of DEI programs?
00:59:28.280 | - Really simple, right?
00:59:29.320 | D is diversity, and that means you just expand
00:59:32.840 | your pool of potential applicants
00:59:34.480 | to people who you might not otherwise have access to.
00:59:38.160 | You know, to look where you didn't look before,
00:59:40.280 | to look where other people aren't looking
00:59:41.800 | for quality employees.
00:59:43.260 | That's simple.
00:59:45.840 | And the E in equity means when you hire somebody,
00:59:50.480 | you put them in a position to succeed.
00:59:53.000 | The I in inclusion is when you've hired somebody
00:59:57.040 | and they may not be typical, if you will, right?
01:00:00.680 | You show 'em some love and give 'em the support they need
01:00:03.040 | so they can do their job as best they can
01:00:04.760 | and feel comfortable and confident going to work.
01:00:06.520 | It's that simple.
01:00:07.820 | - So that's a beautiful ideal.
01:00:10.360 | When it's implemented, implemented poorly, perhaps,
01:00:13.720 | or in a way that doesn't reach that ideal,
01:00:16.240 | do you see, maybe when it's quota-based,
01:00:20.640 | do you see that it can result in essentially racism
01:00:24.640 | towards Asian people and white people, for example?
01:00:27.640 | - There's a lot to unpack there, right?
01:00:30.260 | So first, you can't do quotas.
01:00:31.680 | They're illegal unless you're,
01:00:33.960 | and I'm not the lawyer on this subject,
01:00:35.240 | but unless you're trying to repair something
01:00:39.140 | that's happened in the past,
01:00:40.160 | like some discrimination that's happened in the past.
01:00:42.640 | So it's not quota-based, and I think that's really
01:00:45.900 | just kind of a straw man that people put out there.
01:00:50.860 | Now, does that mean that there aren't DEI programs
01:00:53.100 | that are implemented poorly?
01:00:53.940 | Of course not.
01:00:54.760 | There are everything that's implemented poorly
01:00:57.260 | in one company to another, right?
01:00:59.180 | Sales, marketing, human resources.
01:01:03.220 | You can pick any element of business
01:01:05.020 | and find companies that implement it poorly.
01:01:07.220 | But that's the beauty of capitalism in a free market,
01:01:11.760 | or mostly free market, where if you make these choices,
01:01:15.300 | and they are the wrong choices,
01:01:17.340 | you're gonna lose your best people.
01:01:19.200 | You're not gonna be able to hire the best people.
01:01:21.420 | You're not gonna execute on your business plans
01:01:23.720 | in the way that we discussed,
01:01:25.340 | regardless of the size of the company.
01:01:27.320 | And it also, I think, depends on
01:01:31.180 | where you're having the discussion.
01:01:33.100 | So when I'm in a different group of people off of X,
01:01:37.020 | the feedback's completely different, right?
01:01:41.340 | But to your question of reverse racism,
01:01:45.820 | yes, it happens, 'cause people are people.
01:01:49.100 | There's no human being that is 100% objective.
01:01:54.100 | And it's also, there's very, very, very few jobs
01:02:00.460 | that can be determined on a purely quantitative basis.
01:02:06.060 | How do you tell one janitor from the other who's the best?
01:02:11.100 | How do you tell one salesperson that you're hiring
01:02:13.540 | versus another you're hiring?
01:02:14.700 | 'Cause they haven't sold your product yet,
01:02:15.940 | so you don't know.
01:02:16.780 | We talked earlier about firing people
01:02:18.340 | 'cause you made mistakes.
01:02:19.780 | And yes, there's discrimination against any group,
01:02:24.780 | white, Asian, black, green, orange, whatever it may be.
01:02:29.900 | But I truly believe that there's far more discrimination
01:02:33.420 | against people of color than there is against people who are white.
01:02:37.060 | And I think it's become a straw man
01:02:40.140 | that reverse discrimination because of DEI
01:02:43.820 | is prevalent or near ubiquitous.
01:02:46.800 | - Well, much of American history was defined
01:02:50.980 | by intense radical racism and sexism.
01:02:55.320 | But in the recent years, there was a correction,
01:03:00.420 | and I think the nature of the criticism
01:03:02.540 | is that there's an overcorrection
01:03:04.580 | where DEI programs at universities, at companies,
01:03:08.980 | are often, when they're not doing their job well,
01:03:11.820 | are often hard to criticize
01:03:14.340 | because when you criticize them within the company
01:03:16.740 | or so on, they have a very strong immune system.
01:03:21.240 | If you criticize a DEI program,
01:03:25.460 | it seems like it's very easy to be called racist.
01:03:27.980 | And if you're called racist or sexist,
01:03:30.860 | that's a sticky label.
01:03:32.740 | So you're getting into the culture of organizations
01:03:36.260 | and leadership within organizations
01:03:38.820 | and accepting any type of criticism, put aside DEI.
01:03:43.820 | When I criticized the referees at the NBA, I got fined.
01:03:49.020 | That was their option.
01:03:51.060 | I knew what I was getting into.
01:03:52.740 | Not that they're completely analogous,
01:03:54.140 | but it's cause and effect.
01:03:56.160 | If I'm in a major company and I'm publicly criticizing
01:04:00.260 | or even internally criticizing a sales plan or a product,
01:04:05.260 | our product sucks, right?
01:04:07.220 | Or like there was a Google engineer that got fired
01:04:09.400 | for saying Google had AGI, right?
01:04:12.300 | And nobody believed they did,
01:04:13.500 | and they knew that created problems.
01:04:15.220 | Wasn't DEI related, but it was saying something publicly
01:04:18.940 | that was, in the CEO's eyes,
01:04:21.340 | to the detriment of the company, right?
01:04:23.140 | So I think those are all analogous.
01:04:25.500 | If you're trying to accomplish something
01:04:26.980 | within an organization because you think there's a problem
01:04:29.900 | and there's people speaking out, saying,
01:04:31.940 | look, we're getting it wrong.
01:04:34.140 | I think I'm a victim of all this.
01:04:36.320 | And the company's, right?
01:04:37.960 | Then, you know, leadership has got to make a decision.
01:04:40.660 | Do they agree or not agree?
01:04:42.860 | Are they right?
01:04:43.700 | Are they wrong?
01:04:44.520 | Is it to the positive?
01:04:46.780 | Is it positive or negative to the company?
01:04:48.460 | And you decide.
01:04:49.540 | So, you know, this conversation
01:04:51.460 | that conservatives are being silenced in organizations now,
01:04:58.180 | I just, I haven't seen it.
01:05:00.740 | You know, I've talked to,
01:05:02.140 | and then the other side of your question,
01:05:03.800 | I think, unpacking it,
01:05:05.540 | is what's driving all this?
01:05:10.540 | Put aside universities, for one, in corporate America.
01:05:14.260 | When I talk to people in corporate America about DEI,
01:05:18.260 | they always start talking about ideology, right?
01:05:24.520 | And like, I've talked to Bill Ackman,
01:05:25.900 | who you've had on, right?
01:05:27.260 | And when I asked him,
01:05:28.940 | well, Bill, you run your own companies.
01:05:30.340 | Who's telling you what to do?
01:05:32.280 | They are.
01:05:33.860 | Who's they?
01:05:34.900 | Well, it's the universities, you know,
01:05:37.060 | the people who have this ideology of DEI.
01:05:40.980 | I'm like, did they force you?
01:05:43.260 | Did they coerce you?
01:05:45.240 | Did you lose control of your company?
01:05:47.600 | No, it's not me.
01:05:48.540 | It happens to other people.
01:05:49.940 | Then I talk to other people, same thing.
01:05:52.520 | So I get, you know, try not to go one-on-one
01:05:55.220 | in Twitter conversations on this topic.
01:05:57.400 | So in the DMs, I'll talk to people
01:05:59.740 | who are really conservative,
01:06:02.020 | and I'll ask the same question.
01:06:04.140 | And I'll be like, well, who's forcing you to do this?
01:06:06.380 | Well, it's the ideology that's everywhere.
01:06:08.020 | You see it, don't, didn't you see the Harvard thing,
01:06:10.340 | you know, in University of North Carolina?
01:06:12.060 | I'm like, I've never had anybody try to push me
01:06:14.300 | in this direction to do this.
01:06:15.780 | This was my business choice.
01:06:17.340 | I'm not trying to tell other people you have to do this.
01:06:20.220 | You make your own business choices.
01:06:22.340 | And so where companies have made their business choices,
01:06:25.300 | and if somebody doesn't feel confident
01:06:26.920 | or comfortable with it,
01:06:28.500 | they may feel they're being discriminated against.
01:06:30.900 | There was something I just read in the Wall Street Journal
01:06:33.080 | where the Wall Street Journal
01:06:34.660 | had a company interview 2 million people, right?
01:06:38.700 | And the difficulty in firing
01:06:40.380 | and how people, when they were fired,
01:06:42.980 | 40% of the people who were fired
01:06:46.340 | felt like it was wrong that they were doing a great job.
01:06:50.220 | Yet, then it talked about the HR person
01:06:53.420 | going through the hassle of trying to explain to this person
01:06:56.780 | through performance reviews
01:06:58.420 | that they weren't doing a good job,
01:07:00.180 | yet the people still thought they were doing a great job
01:07:02.340 | despite being told they're not doing a good job, right?
01:07:04.740 | So I see that as being analogous to all this,
01:07:07.380 | you know, this huffing and puffing
01:07:09.420 | about reverse discrimination
01:07:12.060 | and conservatives not being able to speak up
01:07:14.460 | because 40% of people who have been fired
01:07:18.580 | don't believe they should have been fired.
01:07:20.500 | There's a disconnect somewhere
01:07:22.460 | in how you think you're doing your job.
01:07:24.540 | And if you just feel like I can't speak up because of it,
01:07:29.540 | because you're white
01:07:31.700 | and that doesn't comport well with DEI programs,
01:07:35.340 | a lot of things are gonna happen, right?
01:07:39.740 | Either that's gonna come up in your performance review,
01:07:43.100 | HR or your boss is going to have to address it in some way.
01:07:46.940 | It's gonna get to HR at some level
01:07:48.780 | and then decisions are going to have to be made.
01:07:51.500 | And you can't just fire somebody
01:07:52.780 | because they spoke up, right?
01:07:55.140 | Somebody is gonna have to communicate with you.
01:07:56.860 | And so I think a lot of,
01:07:58.540 | I just don't trust the supposed volume
01:08:03.740 | that people say it's happening at
01:08:06.340 | versus everything I've read and seen.
01:08:07.900 | And when I talk to people in positions of authority
01:08:10.180 | within organizations and ask them who's forcing them
01:08:13.860 | to implement these ideologies,
01:08:15.700 | nobody says yes, that there is somebody.
01:08:19.720 | But on Twitter, it sounds great.
01:08:22.060 | - It is true for conservatives, but in general,
01:08:24.620 | you could sell books, you can get likes
01:08:27.060 | when you talk about this ideology.
01:08:29.060 | And there's a degree to which is this woke ideology
01:08:32.500 | in the room with us right now?
01:08:34.140 | Meaning like it's this boogie monster
01:08:37.580 | that we're all kind of--
01:08:38.820 | - Or is it a positive?
01:08:40.300 | - I guess another way to say that
01:08:42.020 | is they don't highlight a lot of the positive progress
01:08:44.340 | that's been made in the positive version of the word woke
01:08:48.300 | in terms of correcting some of the wrongs done in the past.
01:08:51.820 | But that said, if you ask people in Russia,
01:08:55.740 | a lot of them will say, there's no propaganda here,
01:08:58.860 | there's no censorship and all that kind of stuff.
01:09:01.220 | It's sometimes hard to see when you're in it
01:09:03.980 | that this kind of stuff is happening.
01:09:06.460 | It does seem difficult to criticize DEI programs,
01:09:11.500 | not horribly difficult, not like they're this terrible
01:09:14.600 | monster that infiltrates everything.
01:09:17.060 | But it is difficult and it requires great leadership.
01:09:20.000 | - So where have you criticized it and been condemned?
01:09:23.580 | Academic or-- - Academic, yeah.
01:09:25.900 | - Academic, let's, two different worlds.
01:09:28.260 | - Companies and academic, yeah.
01:09:29.460 | - Two different worlds.
01:09:30.300 | - But I also think it's not,
01:09:31.780 | I really wanna point my finger at the failure of leadership
01:09:36.660 | to basically fire mediocre people,
01:09:41.120 | like people that are not good at their job.
01:09:43.540 | The problem to me is DEI's defense mechanism,
01:09:48.540 | like immune system is so strong
01:09:51.900 | that the shitty people don't get fired.
01:09:55.580 | So the vision, the ideal of DEI is a beautiful ideal.
01:09:59.420 | It's just like--
01:10:01.100 | - Well, maybe it's 'cause I'm an entrepreneur
01:10:02.500 | when I see an ideal that you try to implement it
01:10:05.460 | and support it and get to that point.
01:10:07.580 | But universities and companies
01:10:10.320 | are night and day different, right?
01:10:11.920 | I can see an argument for the ideology in a university.
01:10:14.480 | I can see, you look at the amount of money spent on it.
01:10:18.280 | And so while the goal is right,
01:10:20.760 | the way they implement it in universities,
01:10:25.240 | the way they implement most things
01:10:26.560 | in universities is wrong, right?
01:10:28.400 | There's a reason why tuition has gone up,
01:10:31.440 | a multitude of, or a multiple of inflation
01:10:36.580 | that they're not well-run organizations across the board.
01:10:40.540 | So I'm not gonna argue with that at all.
01:10:42.180 | So when you've seen me argue with DEI,
01:10:44.060 | I haven't waded into DEI in universities at all.
01:10:46.500 | - So it's mostly focused on companies.
01:10:47.820 | - 100%, right, because that's where I exist.
01:10:50.220 | But at the same time, like I read Christopher Rufo's book
01:10:53.020 | where he talks about the genealogy of wokeism and ideology,
01:10:57.060 | but then he gets to the point,
01:10:58.180 | and I hope I'm remembering this right,
01:11:00.100 | where he says that the response to it
01:11:03.740 | is decentralized activism, if you will,
01:11:07.340 | that's not the word he used, to try to counter that DEI.
01:11:12.340 | And that seems to me to be counter
01:11:16.720 | to the whole conservative movement right now, right?
01:11:19.280 | Other than school boards, right?
01:11:21.520 | Where it's centralized and the Republican candidate
01:11:25.840 | is all about centralized power in him.
01:11:28.400 | And to me, that's just a conflict
01:11:32.080 | and a lot of the underpinning of the whole DEI conversation
01:11:36.920 | a lot of which goes through
01:11:39.560 | Christopher Rufo right now.
01:11:41.560 | - Let's continue on a theme
01:11:42.840 | of fun exchanges on the internet.
01:11:45.400 | So Elon tweeted, "The fundamental axiomatic flaw
01:11:49.320 | "of the woke mind virus
01:11:51.320 | "is that the weaker party is always right, in parentheses,
01:11:54.860 | "even if they want you to die."
01:11:57.680 | And you responded at length, but the beginning is,
01:12:01.140 | "The fundamental axiomatic flaw of the anti-woke mind
01:12:04.960 | "is that it allows groups with historical power
01:12:07.360 | "to play the victim by taking anecdotal examples
01:12:10.720 | "and packaging them into conjured conspiratorial ideology
01:12:14.580 | "that threatens to upend the power structures
01:12:17.420 | "they have been depending on."
01:12:19.540 | So-- - He says it all, right?
01:12:22.960 | (laughing)
01:12:24.120 | - Well, there's a tension there.
01:12:26.280 | So, yes, but both can be abused, right?
01:12:31.280 | Both positions of power can be abused.
01:12:35.660 | There's power in DEI and there's shitty people
01:12:40.660 | that can crave power and hold on to power
01:12:42.940 | and sacrifice their ideals.
01:12:45.060 | - Okay, put aside universities, okay?
01:12:47.120 | Damn it.
01:12:48.220 | - Yeah, I mean-- - Because I'm not gonna argue
01:12:49.780 | that universities implement DEI well, right?
01:12:52.300 | And I'm not gonna tell you that they need to be spending
01:12:57.300 | 20 some million dollars a year on DEI positions.
01:13:02.120 | To me, that's insane.
01:13:03.220 | Do I look at the Harvard and North Carolina decision
01:13:09.480 | and say it was a great decision?
01:13:11.140 | No, because I think having a diverse student body
01:13:15.800 | helps make for kids who are better prepared
01:13:20.120 | for the real world.
01:13:21.480 | But I'm not running a university so it's not my choice.
01:13:24.360 | Maybe at some point in the future I will, but not now.
01:13:27.820 | And in terms of the corporate side of it,
01:13:32.820 | who's telling anybody what to do?
01:13:35.780 | - Well, maybe you can give me some help.
01:13:41.480 | - Sure, I'm here to help you, Lex.
01:13:43.440 | - There's an example in the AI world
01:13:47.280 | of a system called Gemini 1.5.
01:13:51.280 | - Yeah, I mean, everybody was black or whatever,
01:13:53.520 | people of color.
01:13:54.360 | - George Washington was black, Nazis were black.
01:13:57.600 | - So why is it when that came out, it was a big uproar,
01:14:00.920 | but when somebody, so who was it?
01:14:04.700 | One of the people who are trying to fuck with me.
01:14:07.040 | I forget which one.
01:14:09.840 | - There's so many.
01:14:10.680 | - Yeah, but he pointed out to Elon that Grok,
01:14:16.360 | Elon's AI was woke when it answered certain questions.
01:14:21.320 | And other people have pointed out other things to Elon
01:14:24.280 | about Grok, whatever, however it's pronounced,
01:14:26.980 | that was leaning left or woke, right?
01:14:31.800 | And Elon's response was, oh, it'll change,
01:14:34.040 | it's a mistake, we're fixing it.
01:14:36.300 | When it happens to Gemini and Google,
01:14:38.720 | it's the end of the world.
01:14:39.920 | Look how woke they are and it's a reflection
01:14:41.600 | of all their culture.
01:14:42.560 | Now Google comes out and says it's a mistake
01:14:44.320 | and then they dox the guy who was the product manager
01:14:48.160 | or whatever of AI of that product who,
01:14:51.600 | and then they go back and look at his old tweets, right?
01:14:54.020 | And show that he's very left-leaning
01:14:56.680 | and very DEI supportive and that's the end of the world.
01:15:01.200 | - It's not the end of the world,
01:15:02.240 | but Google's so much dependent on trust,
01:15:07.240 | that trust that Google search has as objective as possible
01:15:13.400 | a channel into the world of information.
01:15:17.240 | And so that brand is really important.
01:15:18.880 | - Yeah, see, you're over, you're giving them too much power.
01:15:22.640 | And maybe I'm not recognizing the power, right?
01:15:25.360 | So I'll tell you a personal experience.
01:15:27.680 | Up until a month ago maybe,
01:15:33.320 | if you put in keto gummies, Shark Tank keto gummies,
01:15:39.500 | into Google, it would show up with scammy ads,
01:15:44.500 | scam ad after scam ad.
01:15:45.740 | And I would get emails up until a month ago
01:15:48.980 | from elderly people asking me why the gummies weren't working
01:15:53.980 | and why the companies were charging all this money
01:15:58.140 | on a month-by-month basis when they tried to cancel.
01:16:00.460 | And they said it was the number one deal
01:16:03.000 | on Shark Tank of all time, right?
01:16:04.820 | And all Shark, it was a mistake.
01:16:07.860 | - Well, there's fraud, there's mistakes,
01:16:10.380 | but the mistakes--
01:16:12.600 | - No, but why didn't Google fix it, right?
01:16:14.780 | This didn't happen just once over one week,
01:16:16.920 | over two weeks, right?
01:16:18.260 | And because it was hard to fix.
01:16:20.300 | As it turns out, I was working with them
01:16:22.520 | to try to find a fix.
01:16:23.900 | And we would both look at the same page.
01:16:26.460 | And if you were inside of Google
01:16:29.140 | within the google.com domain, it would show one page.
01:16:32.980 | If you were outside of Google, it would show another.
01:16:34.980 | And it took us looking at it at the same time
01:16:37.000 | for anybody to realize it.
01:16:38.680 | Meaning that there's a lot of technology problems
01:16:40.500 | that are hard to fix.
01:16:41.460 | - They're super complex, and we could talk about it forever
01:16:43.860 | with social media.
01:16:45.420 | The criticism towards Google, towards other companies,
01:16:48.000 | when they're based in Silicon Valley,
01:16:49.980 | there could be an ideological drift
01:16:51.780 | into a ideological bubble
01:16:55.740 | out of which the technology is created,
01:16:57.340 | and they could be blind to the obvious bias
01:17:00.380 | that comes inherent to them.
01:17:02.340 | - Yeah, but they've got billions of customers
01:17:03.740 | who are not gonna, so what you're saying is
01:17:05.700 | the free market stops with artificial intelligence,
01:17:08.820 | that people don't pay attention and respond,
01:17:12.000 | that Google doesn't listen to the responses,
01:17:14.460 | that people inside of Google
01:17:15.880 | will ignore their own best financial interest,
01:17:18.760 | and even their own best personal interest,
01:17:20.760 | 'cause they know they're gonna get doxed now
01:17:22.480 | by Elon and others.
01:17:24.680 | And so I just don't see that.
01:17:26.960 | And Elon's not allowed to make those same mistakes,
01:17:29.320 | but-- - No, no, no, no.
01:17:30.160 | - Elon is allowed to make those mistakes,
01:17:32.200 | but Google isn't.
01:17:33.040 | - Oh, no, Elon is 100%, should be criticized
01:17:36.340 | for the ridiculousness of overstatements
01:17:38.820 | that he makes about various products.
01:17:41.540 | He's having a bit of fun, like you are also.
01:17:43.740 | But, and I also believe in the free market,
01:17:46.980 | but it's not always efficient.
01:17:49.660 | There's like a delay.
01:17:50.500 | - Just takes time, yeah, it's fine.
01:17:52.240 | - So which is why Elon is important.
01:17:53.820 | We're calling out, I think,
01:17:55.100 | overstating the criticism of Gemini,
01:17:57.220 | but Elon and others are just--
01:17:58.820 | - Gemini wasn't even
01:17:59.680 | a fully available public product yet.
01:18:03.120 | - It's still a bias that resonates with people--
01:18:06.160 | - That's the way neural networks work, though, right?
01:18:08.440 | That's why there'll be millions of models,
01:18:10.140 | because weights and biases, (laughs)
01:18:13.480 | putting together a neural network.
01:18:15.880 | - But, well, no, so like the black George Washington
01:18:20.880 | is a correction on top of the foundation model
01:18:25.920 | to keep it, quote, unquote, sort of safe.
01:18:28.260 | One of the big criticisms of all of the models, frankly,
01:18:32.440 | probably even Grok, a little bit less so,
01:18:34.880 | is they're like trying to be really conservative
01:18:38.160 | in the sense of trying to be careful
01:18:41.240 | not to say crazy shit.
01:18:42.640 | - Of course.
01:18:43.480 | - Because we don't know how the thing--
01:18:44.320 | - It's brand new, and we know what happens, right?
01:18:46.960 | And they do it on the front end with prompts,
01:18:49.200 | and they try to do it on the back end
01:18:50.640 | with the neural networks that are underneath them, right?
01:18:53.920 | And it doesn't always work,
01:18:55.620 | and that's why there's gonna be millions of models
01:18:57.700 | rather than just four foundational models
01:19:00.560 | or five that everybody uses.
01:19:02.760 | - Well, I guess the main criticism
01:19:04.480 | is you want to have some transparency
01:19:06.560 | of all the teams that are involved
01:19:07.920 | and that this kind of, to the degree
01:19:10.480 | there's a left-leaning ideology within the companies,
01:19:15.280 | it doesn't affect the product.
01:19:16.640 | - But that's the beauty of--
01:19:17.760 | - The free market.
01:19:18.600 | - Yeah, that's where the market corrects it, right?
01:19:20.960 | And not only from the outside,
01:19:23.520 | because everybody you know is going to test it.
01:19:25.920 | Like when YouTube first came out,
01:19:27.760 | well, not first came out, after Google bought them,
01:19:30.380 | there used to be different commands you could give it,
01:19:34.760 | right, there were prompt commands that you could give it,
01:19:38.360 | and you could find all the nasty porn
01:19:40.840 | that got loaded before they kicked it off, right?
01:19:44.860 | And it was just the nastiest shit ever,
01:19:47.400 | and even now to this day,
01:19:49.000 | if there's some horrific, tragic event,
01:19:51.200 | somebody's loading it up, right?
01:19:53.360 | Now, I know that's not direct to your point
01:19:55.560 | of internal influence to the output, right?
01:19:58.800 | But people on the outside are gonna check for that now,
01:20:01.560 | right, it's almost like the new bug contest, right,
01:20:04.080 | to try to find bugs in software.
01:20:06.040 | And then on the inside, if it's all left-leaning
01:20:09.260 | and all you have is left-leaning employees
01:20:11.440 | because most conservatives won't wanna work there then,
01:20:14.920 | again, that's self-correcting as well.
01:20:16.640 | - That's the hope, but it can self-correct
01:20:19.120 | in different kinds of ways.
01:20:20.040 | You can have a different company that competes
01:20:22.680 | and becomes more conservative.
01:20:23.880 | My worry is that it kind of becomes like two different worlds
01:20:27.040 | where there's like-- - It already is.
01:20:28.960 | - No, come on.
01:20:30.560 | - Don't give up.
01:20:31.600 | - Oh, I'm not giving up.
01:20:32.920 | So where does this go is the question, right?
01:20:35.840 | What happens next?
01:20:37.840 | And I mean, going back, I mean,
01:20:39.980 | I've been in so many PC revolutions, right, or evolutions
01:20:44.120 | where porn was the big issue, right?
01:20:48.440 | Now we don't even talk about porn being an issue
01:20:50.240 | even though every post on Twitter now
01:20:54.120 | has link in bio for a porn post, right?
01:20:59.120 | We don't even think that's a negative anymore.
01:21:01.220 | That's just an accepted thing.
01:21:02.840 | And now it's become very much about where your politics are on Twitter.
01:21:07.840 | But again, as you extend that and things grow,
01:21:12.840 | as AI models become more efficient and trainable
01:21:18.600 | on less, for a lot less money,
01:21:20.440 | or even locally on a PC or a phone,
01:21:22.880 | we're all gonna have our own models.
01:21:24.600 | And there's gonna be millions and millions
01:21:27.000 | and millions of models and not just foundational models.
01:21:29.640 | Now maybe they're built some on open source,
01:21:32.120 | maybe it'll be copy-pasta where you can just cut and paste
01:21:36.480 | and create your own model and train it yourself.
01:21:39.720 | Maybe it'll be mixture of experts
01:21:41.560 | where maybe it'll be a meta front end
01:21:43.560 | like we're working on a project
01:21:45.160 | where we take 30 different AI models
01:21:48.120 | and there's just a meta search engine
01:21:50.040 | where it searches all of them
01:21:51.680 | and you can compare all the outputs
01:21:53.480 | and see what you think is the best,
01:21:55.160 | kind of like a search engine, right?
01:21:56.760 | Because you might get, is DEI good, right?
01:22:00.520 | Is the COVID vaccine good, right?
01:22:02.760 | You're gonna get a variety of outputs
01:22:06.880 | and you have to make that decision yourself.
01:22:09.640 | That's what I think is gonna happen with AI as well,
01:22:11.500 | because I think brands, there's no way the Mayo Clinic
01:22:14.680 | and the Harvard Medical School
01:22:17.560 | are just going to contribute all their IP
01:22:19.800 | to ChatGPT or Gemini or whatever.
01:22:23.960 | It's gonna have to be licensed
01:22:25.080 | or they're gonna do their own.
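
A minimal sketch of the kind of meta front end Cuban describes: fan the same prompt out to several models and lay the answers side by side so the user can compare them and decide. The model names and the placeholder callables below are hypothetical stand-ins, not real APIs; in practice each entry would wrap a specific provider's client.

```python
# Hypothetical sketch of a meta front end that sends one prompt to several
# AI models and returns their answers side by side for comparison.
from concurrent.futures import ThreadPoolExecutor

# Placeholder callables standing in for real model clients (hypothetical names).
MODEL_ENDPOINTS = {
    "model_a": lambda prompt: f"[model_a's answer to: {prompt}]",
    "model_b": lambda prompt: f"[model_b's answer to: {prompt}]",
    "model_c": lambda prompt: f"[model_c's answer to: {prompt}]",
}

def ask_all(prompt: str) -> dict:
    """Query every configured model concurrently and collect the answers by model name."""
    with ThreadPoolExecutor(max_workers=len(MODEL_ENDPOINTS)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in MODEL_ENDPOINTS.items()}
        return {name: future.result() for name, future in futures.items()}

if __name__ == "__main__":
    for name, answer in ask_all("Is DEI good?").items():
        print(f"{name}: {answer}")  # the user compares the outputs and decides for themselves
```

The point is only the fan-out-and-compare shape of the idea, not any particular provider or ranking scheme.
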
01:22:26.860 | - Yeah, I mean, that's a very hopeful message,
01:22:29.080 | but that said, human history
01:22:34.080 | doesn't always autocorrect really quickly,
01:22:36.720 | self-correct really quickly.
01:22:38.280 | Sometimes you get into this very painful things.
01:22:41.320 | You have Stalin, you have Hitler.
01:22:45.400 | You can get to places very quickly
01:22:47.400 | where the ideological thing just builds on itself.
01:22:49.920 | - But Twitter is not real world.
01:22:52.920 | You know, there's 27. - Twitter is not real world,
01:22:55.480 | that's true, yes.
01:22:56.720 | But you could still have a nation captured by an ideology.
01:23:00.720 | I think America has been really good
01:23:03.880 | at having these two blue and red
01:23:06.520 | always at tension with each other, dividing the populace.
01:23:10.560 | And in the process of doing that, figuring stuff out.
01:23:14.240 | Like almost like playing devil's advocate,
01:23:16.720 | but like in real life.
01:23:18.280 | - You know, and that's fair and that's right.
01:23:20.560 | You know, as opposed to Pravda
01:23:21.720 | telling you everything you wanna know, right?
01:23:23.360 | And everybody believing it
01:23:24.640 | 'cause there's control of everything, right?
01:23:26.680 | And so going back to what you said earlier,
01:23:28.840 | people in Russia don't think invading Ukraine,
01:23:31.480 | a lot of them see it as a positive, right?
01:23:35.720 | I'm sure you have relatives and friends
01:23:37.360 | who think it's the best thing that ever happened, right?
01:23:40.400 | 'Cause they believe in Putin.
01:23:42.480 | - They're denazifying Ukraine,
01:23:44.280 | they're removing the Nazis from Ukraine.
01:23:45.880 | - Right, right, 'cause that's exactly what Putin said.
01:23:48.280 | And you know, we don't have one uniform media outlet,
01:23:53.280 | that's the difference.
01:23:56.440 | Even though people like to talk about mainstream media
01:23:58.960 | as being the source of a lot of the friction,
01:24:01.000 | there is no such thing as mainstream media anymore.
01:24:03.360 | You know, Fox is the biggest cable news channel
01:24:07.840 | with the biggest audience,
01:24:09.720 | and they call everybody else mainstream media.
01:24:12.280 | You know, it's insane the things that we accept
01:24:14.960 | from our sources of information.
01:24:16.720 | To me, that's the bigger problem.
01:24:18.720 | The bigger problem is trying to figure out
01:24:21.640 | what is free speech
01:24:22.760 | and what is the line of tolerance for free speech.
01:24:25.640 | And at what point does hateful free speech
01:24:28.400 | crowd out other people, right?
01:24:30.880 | Putin's the master of that.
01:24:32.360 | You're going to jail or you're gonna be dead
01:24:34.280 | if you disagree, right?
01:24:35.800 | Now, God help us if we ever get to that point here,
01:24:38.960 | but the person who controls the algorithm
01:24:41.880 | controls the world, right?
01:24:44.080 | And if you are committed to one specific platform
01:24:48.840 | as your singular source of information
01:24:51.280 | or affiliated platforms,
01:24:53.200 | then whoever controls the algorithm or the programming there
01:24:56.720 | controls you in a lot of respects.
01:24:59.040 | And I think that's where our biggest problem has been.
01:25:01.560 | We get people attached to specific platforms
01:25:05.480 | and apps and media outlets,
01:25:09.000 | and they become part of that team
01:25:11.920 | and they identify as such,
01:25:14.280 | and either you're part of the team or you're not.
01:25:16.880 | And that to me is the fundamental problem.
01:25:18.840 | It's not woke ideology because I never felt any pressure
01:25:22.400 | to make the choices that I've chosen,
01:25:25.000 | and, you know, including diversity, equity, and inclusion.
01:25:28.320 | And I've never forced anybody or told anybody to do it.
01:25:31.480 | I just said, here's my experiences.
01:25:33.280 | Whenever I've talked to people
01:25:34.280 | and talk about the woke ideology,
01:25:36.280 | no one ever got forced.
01:25:37.280 | I mean, if you look at Dylan Mulvaney, right?
01:25:41.180 | If there was a way to gauge the number of impressions
01:25:44.800 | that she had, right, and where they sourced from,
01:25:48.560 | I'd be willing to bet any amount of money
01:25:50.080 | that 90% plus of the impressions and discussions
01:25:53.720 | of Dylan Mulvaney were on right-leaning media.
01:25:57.200 | - Several things, actually, let's even go there.
01:25:59.860 | You've gotten into a bit of a beef, again,
01:26:02.080 | a fun one, with Jordan Peterson about this.
01:26:03.760 | - That's a guy whose name I couldn't think of, yeah.
01:26:05.680 | - So the topic there was the gender transition
01:26:08.760 | and Dylan Mulvaney.
01:26:10.400 | Can you explain the nature of the beef?
01:26:12.560 | I mean, it's an interesting claim you're making
01:26:14.600 | that most of the people who are concerned
01:26:18.480 | about this are conservatives.
01:26:20.360 | - Yeah, just the point is that if you looked at impressions,
01:26:24.320 | like when you run an ad,
01:26:25.240 | you're curious about impressions and who sees them, right?
01:26:28.600 | But if you look at the impressions
01:26:29.800 | related to Dylan Mulvaney,
01:26:33.160 | I would, like I just said,
01:26:34.360 | I'd bet 90% or more were in conservative media.
01:26:37.720 | And I don't know how many followers.
01:26:40.240 | She had 250,000 followers or whatever
01:26:42.200 | when the Bud Light ad came out.
01:26:44.020 | And if it weren't for Kid Rock shooting at Dylan Mulvaney
01:26:49.020 | Bud Light cans, she'd be long forgotten.
01:26:53.160 | - Yeah, but most of the people that care about censorship
01:26:56.160 | are gonna be free speech advocates.
01:26:57.520 | So most people that care about Putin suppressing speech
01:27:02.000 | or anybody else suppressing speech
01:27:03.320 | are going to be like libertarians.
01:27:05.520 | So there's probably an explanation to that.
01:27:08.840 | The criticism that Jordan Peterson could provide,
01:27:12.560 | I guess he said that Dylan Mulvaney popularized
01:27:16.560 | the kind of mutilation, right, in his view.
01:27:21.240 | There's a very serious, life-changing process
01:27:26.000 | that a person goes through.
01:27:27.720 | And when that's applied to a child,
01:27:29.960 | it can do a lot of harm to a person
01:27:31.680 | if- - But my point still holds.
01:27:34.320 | I don't know how many kids were following,
01:27:36.720 | and you can look at the followers list.
01:27:38.160 | It's not like it's hidden, right?
01:27:39.660 | Back then, if they had 250,000 followers,
01:27:41.640 | and now we're on TikTok,
01:27:42.840 | where he might get 50 some thousand views or likes, right?
01:27:47.920 | I don't know how many views, but likes.
01:27:50.680 | I've never seen any evidence that Dylan Mulvaney
01:27:54.800 | influenced people to transition their gender.
01:27:58.360 | As he transitioned to her, it was documented on TikTok
01:28:01.960 | over the course of a year.
01:28:03.840 | And again, when you go back and look at the views
01:28:07.300 | on those TikToks, it wasn't like enormous.
01:28:12.300 | - Yeah, but the trends start, right?
01:28:16.520 | It could be, for young kids,
01:28:21.520 | there could be a trend where, especially when you feel
01:28:25.480 | like an outsider, you feel not yourself,
01:28:27.880 | less than yourself, all this kind of stuff
01:28:29.560 | that kids feel, that if it becomes
01:28:33.240 | popular enough as a trend,
01:28:35.560 | you would gender transition without really meaning to.
01:28:40.560 | It's just part of a trend.
01:28:42.840 | That's the worry they have. - Yeah, but that's
01:28:44.120 | a big stretch, right?
01:28:45.680 | To think that all the things that have to happen
01:28:49.800 | before you transition gender, right?
01:28:52.360 | And I'm not saying kids might identify,
01:28:56.360 | find it cool, or in the moment, expedient, if you will,
01:29:01.360 | to dress up as the other gender.
01:29:03.440 | Great, who cares, right?
01:29:05.120 | But to go through the actual physical transition,
01:29:08.440 | I don't remember what the numbers were that I read,
01:29:11.160 | but I do remember that the latest numbers
01:29:14.000 | that came out in terms of transitioning
01:29:16.360 | were from JAMA, which is a medical association,
01:29:19.720 | that said from 2021 to 2022, the numbers
01:29:25.560 | went down, but the bigger point is there are no numbers
01:29:29.520 | for 2023, post-Dylan Mulvaney,
01:29:32.480 | so there's no way to know if the assertion is true,
01:29:36.000 | even marginally true.
01:29:37.280 | Now, you can easily suggest it, right?
01:29:41.000 | But you can say that about any social media influencer,
01:29:44.360 | right, you know, people are, kids are dying because,
01:29:49.240 | you know, I mean, it's just like when people
01:29:51.800 | accuse Trump of potentially influencing people
01:29:55.360 | to, you know, inject bleach into their veins.
01:29:58.240 | You can't, you know, that's a big old leap
01:30:01.880 | to say that because, you know, Trump says it,
01:30:04.200 | that people are gonna start injecting,
01:30:06.680 | and then they find somebody who actually did,
01:30:08.760 | and it's like, oh, it must be true, you know,
01:30:10.800 | this is a trend now.
01:30:12.600 | I just, I'm just not buying it that there aren't
01:30:15.840 | enough roadblocks in the way.
01:30:18.600 | Now, I'm not saying it never happens, right?
01:30:20.760 | And I, and for me, to me, you should have to wait
01:30:24.560 | until you're 18 to actually have any surgery to transition,
01:30:27.960 | and if your parents approve it earlier,
01:30:31.520 | then you can have a conversation with your doctor.
01:30:33.680 | But you're suggesting that everybody in the process
01:30:38.040 | for a minor to transition is corrupt,
01:30:42.480 | that the doctor, the sociologist, the psychologist,
01:30:45.800 | all the people involved, the hospital where the surgery
01:30:48.640 | is happening, the insurance company that's paying for it,
01:30:52.000 | they all have been corrupted by this trend.
01:30:54.160 | I just don't see that.
01:30:55.360 | - Well, not corrupted, but you know, people,
01:30:57.560 | it's back to the DEI thing.
01:31:00.480 | There could be pressure, and we are--
01:31:02.720 | - Pressure to operate, so think about all the people
01:31:05.160 | who have to be complicit to do an operation.
01:31:07.720 | - It's not complicit like evil complicit, it's more--
01:31:09.840 | - No, it is evil complicit, right?
01:31:11.440 | Because somebody, in hospitals right now,
01:31:15.720 | they won't perform abortions because of state law.
01:31:18.880 | In Alabama, they stopped IVF treatment immediately
01:31:22.840 | after that ruling by that judge, right?
01:31:25.800 | The QAnon judge.
01:31:27.280 | To think that they're not gonna pay attention
01:31:30.440 | to the possible consequences of being the hospital
01:31:33.320 | that does transgender surgery, that gives doctors
01:31:37.080 | operating rights there, and not be aware of the risks
01:31:40.920 | associated with it and double-check,
01:31:42.920 | to me, that's just insane.
01:31:44.000 | They're risking their entire business and livelihood
01:31:46.600 | and personal relationships for not checking
01:31:49.800 | that this 14-year-old boy who wants to be a girl
01:31:53.720 | or vice versa is there waiting for surgery.
01:31:57.560 | I just don't see that.
01:31:58.400 | - In America, yes, but if we look at humans in general,
01:32:02.200 | and Jordan Peterson, I think,
01:32:05.880 | unjustly, incorrectly brought up Auschwitz.
01:32:08.840 | - Yeah, that was ridiculous.
01:32:10.520 | - But if we look, to me, World War II
01:32:15.240 | is a very interesting time.
01:32:16.760 | It does reveal a lot about human nature
01:32:19.440 | and that humans are able to commit atrocities
01:32:22.760 | without really speaking up.
01:32:24.400 | The point I wanna make is that when you're
01:32:27.800 | in this situation where everybody around you
01:32:31.920 | is committing an atrocity, you can be sort of
01:32:35.040 | the good German, and human nature is such that you can--
01:32:40.240 | - But that is in a time of war.
01:32:43.560 | - Yeah, but it's still human nature.
01:32:46.480 | It's interesting to remember that.
01:32:47.320 | - It's in a time of war.
01:32:49.120 | When you feel like there's nationalism, patriotism,
01:32:52.080 | everything that comes up.
01:32:53.320 | Russia, right?
01:32:54.160 | The moms of the kids sent to Ukraine who didn't come back
01:32:59.600 | in Russia feel certainly different
01:33:01.040 | than the everyday Russian who's just taking
01:33:04.600 | whatever information that's available
01:33:06.000 | from a unified, controlled media.
01:33:07.920 | - Yeah, but we should remember human nature.
01:33:11.520 | It's interesting.
01:33:12.360 | - I'm not dismissing human nature at all,
01:33:13.720 | but there's a difference.
01:33:15.680 | I think that human nature, self-preservation,
01:33:18.560 | influences those decisions.
01:33:20.000 | There's nothing about self-preservation
01:33:22.120 | involved in DEI, wokeness, transgenderism
01:33:26.740 | to compare it to Auschwitz.
01:33:28.360 | That's insane.
01:33:29.640 | - Yeah, well, that comparison is almost always,
01:33:32.360 | probably always is insane.
01:33:34.080 | Comparison between anything and the Holocaust.
01:33:37.040 | - I agree.
01:33:37.880 | - There's a name for that rule,
01:33:39.080 | but once you bring up Hitler, the conversation ends.
01:33:43.080 | I do appreciate you bringing up Trump
01:33:44.920 | and bleach as an example.
01:33:46.420 | So continuing on fun exchanges between you and Elon,
01:33:51.400 | you said if they were having Biden's last wake
01:33:54.480 | and it was him versus Trump
01:33:56.360 | and he was being given last rights,
01:33:58.600 | I would still vote for Biden.
01:34:00.360 | To which Elon replied, caricaturing you,
01:34:03.920 | if Biden were a flesh-eating zombie
01:34:06.280 | with five seconds to live or upon being reelected,
01:34:10.440 | Earth would plunge into a 1,000 years of darkness,
01:34:13.480 | I would still vote for him.
01:34:15.020 | That's basically quoting you, but in a caricature.
01:34:20.000 | And you responded,
01:34:21.000 | while I have your attention, wanted to say thank you.
01:34:25.480 | Your consultants at Tesla followed up
01:34:27.320 | about using cost-plus drugs,
01:34:30.680 | about which we'll talk about,
01:34:32.000 | to save the company money.
01:34:33.480 | Truly appreciate it.
01:34:35.040 | And in parentheses, my limit is 300 years of darkness.
01:34:39.680 | Very well done, Mark.
01:34:41.720 | What's your intuition,
01:34:42.760 | if we just stick on Biden and Trump for a sec,
01:34:45.760 | what's your intuition why Biden
01:34:47.080 | would make a better president than Trump?
01:34:48.800 | - Look at the basics, right?
01:34:51.040 | If you look at the people he's hired,
01:34:52.880 | there hasn't been any turnover in his cabinet at all.
01:34:58.480 | If you look at the people he's hired
01:35:01.000 | over the course of his career,
01:35:02.960 | or while he was vice president in particular,
01:35:05.440 | there's nobody who's turned on him
01:35:07.240 | and came out and written books
01:35:09.000 | and made public statements
01:35:10.680 | about how he's bad for the country.
01:35:12.800 | Now compare that to Trump.
01:35:14.920 | The people closest to him,
01:35:16.500 | almost all of them turn,
01:35:20.160 | unless there's a financial relationship involved.
01:35:23.200 | And to me, that says everything.
01:35:26.040 | - The dynamics of the team is important to you.
01:35:27.880 | - If you're gonna be the most powerful person in the world,
01:35:30.120 | you better know how to manage and lead.
01:35:32.600 | And that's not to say Biden hasn't made a lot of mistakes.
01:35:37.600 | I mean, immigration, the border, is a horrific mistake.
01:35:41.060 | And hopefully he recognizes that.
01:35:44.560 | And I don't like the fact that he doesn't admit his mistakes
01:35:46.920 | and just say, "Okay, I gotta fix it,"
01:35:48.760 | or, "I made a mistake in Afghanistan,"
01:35:50.000 | whatever it may be, right?
01:35:51.600 | The position of commander in chief and president,
01:35:57.400 | you're gonna make mistakes.
01:35:59.560 | Then I look at the other guy,
01:36:01.360 | never admits a mistake, and the list is long.
01:36:04.600 | - What do you think about the immigration situation?
01:36:06.600 | A lot of conservatives are using that,
01:36:08.840 | sort of the theory is that the reason it's happening
01:36:15.320 | is because they would be able to illegally vote.
01:36:19.640 | - That's insane.
01:36:20.480 | - For Biden.
01:36:21.300 | - Yeah, you can't be an illegal immigrant and vote.
01:36:24.160 | And now, in a lot of states, because of the conservatives,
01:36:26.980 | they've passed laws saying you have to show identification.
01:36:29.280 | When I voted in Texas,
01:36:30.640 | you had to show state identification.
01:36:34.600 | They can't vote.
01:36:35.440 | You can't register as an illegal alien
01:36:37.000 | that I'm aware of to vote.
01:36:39.240 | - Yeah, but of course, that story really worries me,
01:36:42.240 | enables or serves as a catalyst
01:36:46.280 | for questioning the legitimacy of an election.
01:36:49.360 | - I remember going to the debate with Trump in 2016,
01:36:53.640 | and he was debating Clinton,
01:36:55.080 | and one of the things he said was,
01:36:57.680 | "We don't even know if this election
01:36:59.680 | will be legitimate if I lose."
01:37:01.560 | This was in 2016 before he was even elected,
01:37:04.720 | and that was where he was going.
01:37:06.320 | That's just what he does.
01:37:07.800 | He's never admitted a mistake.
01:37:09.640 | The guy's failed a zillion times.
01:37:13.320 | Most people say, "Okay, I learned from them."
01:37:13.320 | I read a book about Roy Cohn,
01:37:16.080 | and Roy Cohn was the ultimate deny, deny, deny,
01:37:19.440 | and that was one of Trump's mentors.
01:37:21.280 | And you can see almost everything Roy Cohn ever did
01:37:25.240 | in the same way that Donald Trump approaches things.
01:37:27.720 | - But given how drastic the immigration situation is,
01:37:30.820 | that story becomes more believable.
01:37:33.400 | - Yeah, of course it does, right?
01:37:34.400 | But the facts are still the facts, right?
01:37:36.600 | And in red states, they're gonna be checking every ID.
01:37:40.680 | They're gonna be making sure that's not the case,
01:37:42.800 | and you can also make the argument,
01:37:44.960 | well, in a blue state, it doesn't matter.
01:37:46.200 | In the swing states, they're still gonna be checking
01:37:49.040 | 'cause they know Trump is gonna sue the shit out of them
01:37:50.960 | when he loses, you know?
01:37:52.640 | And so, again, that's where, you know,
01:37:56.720 | people will take those self-preservation steps
01:37:59.880 | to keep their job and do the right thing.
01:38:01.520 | There's still enough people who believe in this country
01:38:04.720 | and how amazing it is to do the right thing.
01:38:07.440 | And a lot of the premises
01:38:08.560 | of what some conservatives are saying and doing,
01:38:12.360 | the underpinning of it is that their fellow citizens
01:38:15.960 | will not do anything, not some things,
01:38:18.920 | anything that serves the best interest of this country.
01:38:22.720 | And to me, that's just wrong.
01:38:24.480 | You know, that is just misleading and wrong.
01:38:27.120 | - I just worry about, I don't care about Trump or Biden.
01:38:30.960 | I care about democracy.
01:38:33.200 | I just worry.
01:38:34.480 | I worry about the viral nature of the idea
01:38:36.920 | of this illegal immigrants.
01:38:38.600 | - But it's just, it's very functional, right?
01:38:41.800 | Either they get across, there's 1,000 different ways to,
01:38:46.080 | an unlimited number of ways
01:38:47.320 | to enter the United States of America undetected, right?
01:38:50.320 | And the southern border is where it's the easiest and the worst.
01:38:53.080 | And Biden needs to take steps to reduce that.
01:38:55.520 | Remember, when Biden was vice president
01:38:57.560 | and Obama was president,
01:38:59.940 | they called Obama the deporter-in-chief.
01:39:02.920 | He had no problem deporting people.
01:39:05.480 | And I think if I had to guess, and this is just a guess,
01:39:08.800 | that when they looked at the initial statistics
01:39:11.480 | for immigration when Biden took over,
01:39:13.600 | they thought there was room for more immigrants,
01:39:16.200 | not because they would vote,
01:39:17.680 | but you can make a fiscal argument
01:39:20.520 | that in a world where the birth rate is flat to declining,
01:39:25.520 | we need immigrants, right?
01:39:29.040 | And immigrants typically don't have a higher crime rate
01:39:34.040 | or anything than indigenous American citizens.
01:39:37.800 | Indigenous isn't the right word, but American citizens.
01:39:40.500 | And so they made a calculated mistake.
01:39:44.440 | They made a decision that was wrong.
01:39:45.680 | And now they have to fix it
01:39:47.080 | or it's gonna hurt them severely.
01:39:48.320 | But I don't buy what Elon's pushing,
01:39:51.160 | that the whole reason is they are voters
01:39:55.440 | and will become voters.
01:39:57.040 | - And we should say the obvious.
01:40:00.280 | You're a descendant of immigrants.
01:40:01.920 | - Yeah, for sure.
01:40:02.760 | - And the immigrants is what makes this country great
01:40:05.640 | in many parts, the diversity of this nation.
01:40:07.960 | And we should probably keep the people
01:40:09.400 | that have already been in this country for a while
01:40:12.440 | and are killing it, like PhD students and all this.
01:40:16.200 | - That's not what Donald Trump wants, though.
01:40:17.800 | He wants to ship them all out, right?
01:40:20.000 | There's just a whole lot of hyperbole
01:40:21.880 | when it comes to talking about all of these things
01:40:25.080 | we're talking about.
01:40:25.960 | When it's right versus left, my team versus your team,
01:40:30.400 | my tribe versus your tribe,
01:40:32.240 | the only way to stand out is hyperbole.
01:40:34.440 | The hard part, and why I like this conversation,
01:40:36.600 | is how do you distinguish hyperbole versus reality?
01:40:40.240 | And I get where you're going, Lex,
01:40:41.840 | where it's like the smallest spark sometimes
01:40:46.840 | can cause people to change,
01:40:50.760 | and then that spark becomes bigger,
01:40:52.520 | and then it becomes more widespread.
01:40:56.000 | And then all of a sudden, your country has changed.
01:40:58.400 | It's not what you thought it was.
01:40:59.960 | I get that completely, right?
01:41:01.680 | And yes, you always have to be on top of that to make sure.
01:41:05.480 | But a lot of that comes from lack of leadership, right?
01:41:08.120 | And lack of trust.
01:41:09.320 | Because there's nobody who's saying,
01:41:11.840 | "All right, Republicans, that's all hyperbole,
01:41:14.680 | "and you're wrong for that.
01:41:15.840 | "Democrats, you fucked up on immigration, right?
01:41:18.760 | "You fucked up in Afghanistan, right?
01:41:21.080 | "Here's where you made these mistakes, own it."
01:41:23.520 | There's nobody who says,
01:41:24.760 | "We're not gonna just bring in Republicans
01:41:28.360 | "if the Republicans win."
01:41:29.680 | And then, you know, and there's nobody who says,
01:41:31.480 | "We're not gonna just bring in Democrats.
01:41:33.040 | "We're gonna bring in a mix, right?
01:41:34.720 | "We're gonna try to get balance on the Supreme Court."
01:41:37.280 | There's just, there's no leadership that's doing it.
01:41:39.480 | That's the fundamental problem.
01:41:41.200 | It's not about the ideology of woke, it's not the,
01:41:43.760 | no leadership.
01:41:45.160 | - Yeah, leadership, and yeah, there's just,
01:41:48.200 | whatever systems we've created,
01:41:50.080 | it's really frustrating that if you don't like Trump,
01:41:53.520 | it really is Trump derangement syndrome.
01:41:55.840 | Like, he's definitely Hitler.
01:41:58.320 | If you don't like Biden, he's senile lizard person.
01:42:01.720 | - Right.
01:42:02.560 | Everybody gets labeled, right?
01:42:06.040 | Because that works on social media.
01:42:07.840 | That, look, if Elon changed the algorithm
01:42:11.360 | just by taking himself out of it,
01:42:14.920 | seriously, I'm not saying don't post, right?
01:42:16.720 | Post all you want.
01:42:18.080 | But he, you know, if you look at his followers,
01:42:20.880 | they're almost all right-leaning.
01:42:24.000 | If you look at the people he engages with positively,
01:42:26.560 | they're almost all right-leaning.
01:42:28.520 | And if you look at the people
01:42:29.360 | he engages with negatively, like me, right?
01:42:32.440 | I consider myself an independent,
01:42:34.440 | but I lean left on the DEI topic, right?
01:42:37.760 | That influences the algorithm.
01:42:40.640 | And so you see what you see because of what he says.
01:42:44.160 | - Yeah, well, I mean, for sure,
01:42:46.360 | but there could be a lot of influential people on Twitter
01:42:51.320 | that influence the algorithm and all that kind of stuff.
01:42:53.720 | I do feel it's not even about ideology or where you lean.
01:42:57.320 | It's about, like, the algorithm not prioritizing drama.
01:43:02.520 | The attention-grabbing thing
01:43:07.440 | or the lower lizard version of that
01:43:09.920 | where, like, people just want the drama.
01:43:11.800 | They want to check it out.
01:43:13.240 | - When I last read through all the stuff
01:43:15.520 | on their algorithm, right?
01:43:16.480 | Maybe it's changed.
01:43:17.720 | Whoever has the biggest account
01:43:20.320 | and gets engagement on that account
01:43:23.520 | influences what people see the most.
01:43:26.660 | - Yeah, I mean, that's interesting.
01:43:27.600 | I don't know if that's, to the degree that's true,
01:43:29.500 | they've-
01:43:30.340 | - For sure, it's still the case.
01:43:32.440 | - Pretty rigorous description of what's,
01:43:34.660 | of the way the algorithm works.
01:43:37.480 | It's actually kind of fascinating.
01:43:38.680 | There's a clustering of people based on interest.
01:43:41.040 | - Right, but I think they call it
01:43:41.880 | the nearest neighbor approach,
01:43:43.920 | and I think that's what they do.
01:43:44.920 | And so whoever has the biggest account
01:43:47.320 | has the most neighbors who, in turn, have their neighbors
01:43:49.520 | who, in turn, have their neighbors,
01:43:50.880 | and that's how they discern what comes next.
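As an illustration only, and not Twitter's actual code, here is a toy nearest-neighbor style recommender over a follow graph in the spirit of the description above: candidate accounts are scored by how many of the accounts you already follow also follow them, which is one way the biggest accounts end up surfacing the most. All accounts and edges here are made up.

```python
# Toy "nearest neighbor" recommendation over a follow graph: rank candidate
# accounts by how many of your followed accounts (your "neighbors") follow them.
# All data here is made up; this is not Twitter's actual algorithm.

from collections import Counter

FOLLOWS = {
    "you":   {"alice", "bob"},
    "alice": {"bob", "carol", "big_account"},
    "bob":   {"carol", "big_account"},
    "carol": {"big_account"},
}

def recommend(user: str, k: int = 3) -> list:
    """Return the top-k accounts followed by the user's neighbors."""
    already = FOLLOWS[user] | {user}
    counts = Counter(
        candidate
        for neighbor in FOLLOWS[user]
        for candidate in FOLLOWS.get(neighbor, set())
        if candidate not in already
    )
    return counts.most_common(k)

print(recommend("you"))  # carol and big_account score highest here
```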
01:43:53.520 | - But there's a clustering still.
01:43:54.800 | So, like, if you don't give a shit about Elon,
01:43:57.440 | you're not gonna- - And you're not following him.
01:43:58.720 | Yeah, you're not following him.
01:43:59.560 | - You're not gonna have an influence.
01:44:00.640 | He's not gonna have an influence.
01:44:02.320 | When you get a break, just create a Burner account
01:44:04.520 | on Twitter and see who they recommend to you.
01:44:07.120 | - Elon.
01:44:08.160 | - And not just Elon.
01:44:09.600 | I mean, the people that Elon likes.
01:44:11.880 | And I'm saying that's not Elon saying,
01:44:13.920 | add this person, add this person,
01:44:15.160 | and suggest this person, this person, and this person.
01:44:17.640 | I'm saying that's what the algorithm is.
01:44:19.840 | - Yeah, there should be transparency around that, for sure.
01:44:21.680 | - There is.
01:44:22.520 | There is, and that's the whole point, right?
01:44:23.880 | He knows there's transparency, and he knows the impact.
01:44:26.480 | That's why when I say take yourself out of the algorithm,
01:44:29.600 | right, don't include his account,
01:44:31.480 | that changes, I think, the output of the algorithm.
01:44:34.280 | - Well, when he wasn't owning Twitter,
01:44:36.080 | he was one of the biggest accounts,
01:44:37.400 | if not the biggest account already.
01:44:39.080 | - He wasn't, but still, like even like the Kim,
01:44:41.160 | well, the Kim Kardashian accounts, whatever, right?
01:44:43.880 | I don't, it wasn't open source to Elon's credit.
01:44:48.280 | It is now, so I couldn't see it to know, right?
01:44:50.960 | So I didn't get the sense one way or the other
01:44:52.640 | of one element being dominant over the other.
01:44:55.560 | But obviously conservatives felt
01:44:56.880 | that left-leaning was more dominant back then.
01:45:00.000 | - Yeah, I would love to see numbers
01:45:01.240 | on all of this.
01:45:02.200 | - Yeah, that'd be fun.
01:45:04.080 | - DEI, everything like this.
01:45:06.560 | Sometimes anecdotal data really frustrates me.
01:45:09.600 | It frustrates me primarily because of how sexy it is.
01:45:13.600 | People just love--
01:45:14.440 | - That's a great way to describe it, yeah.
01:45:15.600 | - Love a story, and I'm like goddammit,
01:45:17.840 | this is not science, this is--
01:45:20.240 | - It's not even common sense.
01:45:21.960 | - Well, no, I think anecdotal stories
01:45:24.800 | often have a wisdom in them.
01:45:27.120 | - No doubt, right.
01:45:28.160 | There's something to be gained from seeing them.
01:45:30.320 | There's a signal there, but how representative
01:45:33.240 | is that signal of the broader thing?
01:45:35.040 | - There's a whole lot more noise
01:45:36.080 | than signal more often than not.
01:45:37.440 | - All right, so as I mentioned, cost plus drugs.
01:45:40.960 | There's so many questions I can ask here,
01:45:42.440 | but what's the big question?
01:45:45.400 | What's broken about our healthcare system?
01:45:47.680 | - There's no transparency.
01:45:49.480 | And lack of transparency leads to lack of trust.
01:45:52.520 | And when you can't trust the healthcare system
01:45:54.760 | other than maybe your doctor, that's a broken system.
01:45:58.240 | - So what aspect of this system,
01:46:02.400 | this cost plus drugs is trying to solve?
01:46:04.320 | - So the thing we're trying to solve for is trust.
01:46:07.680 | And the way we feel we get there
01:46:09.160 | is through complete transparency.
01:46:11.080 | So when you go to costplusdrugs.com
01:46:13.120 | and you put in the name of the medication,
01:46:14.740 | if it's one of the 2,500 and growing that we carry,
01:46:17.920 | we will first show you our cost, what we actually pay for it,
01:46:21.800 | then we'll show you our 15% markup,
01:46:24.000 | then we'll show the pharmacy fill fee and shipping,
01:46:26.760 | and that's your total price.
01:46:28.560 | And that alone, that transparency alone
01:46:31.920 | is completely revolutionizing
01:46:34.040 | how drugs are priced in America today.
01:46:37.200 | And it's led to research being done
01:46:41.120 | comparing our pricing to CMS and ours being cheaper,
01:46:46.120 | than even what the government is negotiating,
01:46:48.920 | et cetera, et cetera, et cetera.
01:46:50.280 | And so just that transparency alone has had an impact
01:46:55.800 | and saved millions of people
01:46:57.840 | hundreds of millions of dollars or more.
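A minimal sketch of the price breakdown just described: acquisition cost, plus a 15% markup, plus a pharmacy fill fee and shipping. The fill fee and shipping amounts below are assumed placeholder values for illustration, not figures quoted from costplusdrugs.com.

```python
# Sketch of the transparent pricing described above:
# acquisition cost + 15% markup + pharmacy fill fee + shipping = total price.
# The fill fee and shipping figures are illustrative assumptions.

def cost_plus_price(acquisition_cost: float,
                    markup_rate: float = 0.15,
                    fill_fee: float = 5.00,   # assumed placeholder
                    shipping: float = 5.00):  # assumed placeholder
    markup = round(acquisition_cost * markup_rate, 2)
    total = round(acquisition_cost + markup + fill_fee + shipping, 2)
    return {
        "our cost": acquisition_cost,
        "15% markup": markup,
        "pharmacy fill fee": fill_fee,
        "shipping": shipping,
        "total price": total,
    }

for line_item, amount in cost_plus_price(20.00).items():
    print(f"{line_item:>18}: ${amount:.2f}")
```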
01:46:59.440 | - And maybe it results in more transparency
01:47:01.400 | in other parts of the system, too,
01:47:03.080 | seeing the business of it.
01:47:04.320 | But what do the so-called middlemen companies,
01:47:08.400 | so the PBMs--
01:47:09.680 | - The Pharmacy Benefit Managers.
01:47:10.920 | - Thank you.
01:47:12.420 | CVS Caremark, Cigna's Express Scripts,
01:47:15.280 | and UnitedHealth's OptumRx,
01:47:19.080 | they control majority of the market.
01:47:21.400 | What do they do wrong?
01:47:22.440 | - They put profits over everything, right?
01:47:24.440 | And they know in an industry that's completely opaque,
01:47:29.040 | they can pretty much do what they want
01:47:31.080 | and nobody gets to see what they're doing in detail.
01:47:34.240 | And so the first thing when you sign a contract
01:47:39.240 | with one of those big PBMs,
01:47:41.640 | it says you can't disclose any of this.
01:47:44.760 | And the fact that it can't be disclosed
01:47:46.480 | means they could tell Lex's company
01:47:49.380 | that they're getting a great price
01:47:53.280 | and they're only being charged X,
01:47:55.120 | and they can tell Mark's company,
01:47:57.040 | oh, you're getting a great price
01:47:58.240 | and we're charging Mark X plus, right?
01:48:00.520 | But Mark doesn't know any better
01:48:02.360 | 'cause there's no way to know.
01:48:03.360 | - The markup is not transparent.
01:48:05.320 | - The cost isn't transparent, the markup isn't transparent.
01:48:08.160 | And there's different things.
01:48:10.680 | I was just talking to a company
01:48:12.640 | in a presentation a couple days ago
01:48:14.720 | and they took the step to leave the big three PBMs
01:48:19.200 | to go to a rebate-free PBM that was smaller.
01:48:22.520 | And what they said led to the decision.
01:48:25.160 | They had a contract with the PBM
01:48:26.800 | for these things called rebates, right?
01:48:28.600 | Where depending on the volume of medications you buy,
01:48:30.980 | they'll kick back to you a percentage of them.
01:48:34.240 | And as it turns out,
01:48:35.720 | when they compared what was contracted for
01:48:37.800 | to what they actually got,
01:48:39.360 | they were getting underpaid every single year.
01:48:41.720 | They just don't care, right?
01:48:43.960 | They'll take products.
01:48:45.200 | There's a drug called Humira, right?
01:48:47.480 | And it is the number one revenue drug in the country.
01:48:52.340 | And there's also a biosimilar, multiple biosimilars,
01:48:55.120 | but one we carry called Yusimry.
01:48:57.840 | And Humira, the pre-rebate price
01:49:02.720 | is about $8,000 per month.
01:49:04.660 | And after rebates, depending on the size of the company,
01:49:08.080 | it'll be anywhere from three to $6,000 a month.
01:49:11.360 | You can go to get your doctor
01:49:13.000 | to prescribe that biosimilar Yusimry and you pay $594.
01:49:16.960 | But those big three PBMs won't allow their clients
01:49:20.600 | to get Yusimry because they don't get a rebate on Yusimry.
01:49:25.600 | So they'd rather keep a drug on their formulary,
01:49:29.280 | even though their patients would save,
01:49:32.400 | their customers would save a lot of money,
01:49:34.380 | they'd rather keep a drug and exclude another
01:49:37.340 | because they'll make a lot more money.
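A back-of-the-envelope look at the incentive just described. The monthly prices come from the conversation (roughly $8,000 pre-rebate for the brand and $594 for the biosimilar); the rebate share is a made-up assumption for illustration, since the real rebate terms are not disclosed.

```python
# Rough illustration of the rebate incentive described above.
# Brand and biosimilar prices are the figures from the conversation;
# the 40% rebate share is a hypothetical assumption.

BRAND_LIST_PRICE = 8000.0   # brand drug, pre-rebate, per month
BIOSIMILAR_PRICE = 594.0    # biosimilar, per month
ASSUMED_REBATE   = 0.40     # hypothetical rebate share flowing through the PBM

rebate_dollars = BRAND_LIST_PRICE * ASSUMED_REBATE
brand_net_cost = BRAND_LIST_PRICE - rebate_dollars   # what the plan nets out

print(f"Brand net cost to the plan:   ${brand_net_cost:,.0f}/month")
print(f"Biosimilar cost to the plan:  ${BIOSIMILAR_PRICE:,.0f}/month")
print(f"Plan savings if switched:     ${brand_net_cost - BIOSIMILAR_PRICE:,.0f}/month")
print(f"Rebate flow lost if switched: ${rebate_dollars:,.0f}/month")
```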
01:49:39.220 | - So the CVS Caremark spokesperson,
01:49:42.740 | I think responded to you, Phil Blando,
01:49:46.460 | with the usual language that so deeply exhausts me,
01:49:51.280 | but I was wondering if there's any truth to it.
01:49:55.800 | Employers, unions, health plans, and government programs
01:49:58.800 | work with CVS Caremark precisely because we deliver for them
01:50:03.040 | lower drug costs, better health outcomes,
01:50:06.000 | and broad pharmacy access through our true cost,
01:50:10.080 | cost vantage, and choice formulary initiatives.
01:50:14.560 | We are the leading agent of change,
01:50:16.620 | innovation, and transparency in the market.
01:50:18.740 | - That's a whole lot of nothing.
01:50:20.260 | - So they are not transparent.
01:50:22.940 | - No, no.
01:50:23.780 | Call them up, you go to Cost Plus Drugs,
01:50:26.020 | we'll give you our price list of all 2,500 plus drugs.
01:50:29.380 | - The actual cost.
01:50:30.200 | - The actual cost and what we sell it for
01:50:32.540 | because it's just a plus 15%.
01:50:34.340 | Call up any of the big three companies
01:50:36.180 | and ask them for the same thing.
01:50:37.880 | They're gonna laugh at you.
01:50:38.860 | It's so bad, in fact, if you do business with them right now
01:50:41.980 | and you just ask for your claims data,
01:50:44.940 | meaning how many people use Humira that we're paying,
01:50:47.580 | what are we paying for it?
01:50:49.140 | They won't even give it to you
01:50:50.500 | unless you really, really scream and yell at them
01:50:52.460 | and then they'll charge you and take six months to get it.
01:50:54.580 | So like when we moved away from them,
01:50:56.780 | we wanted to get what our claims data was
01:50:58.500 | to understand what we were gonna be facing.
01:51:01.360 | They wouldn't give it to us until like six months later,
01:51:04.420 | I forget the exact amount,
01:51:05.500 | and then they charged us for it as well, for our own data.
01:51:08.860 | - On the CEO front,
01:51:10.020 | you've said that CEOs don't understand healthcare coverage
01:51:13.500 | and it's costing them big.
01:51:14.640 | What's the connection between
01:51:17.500 | all Cost Plus drugs and companies?
01:51:19.700 | - So I can speak for my own companies
01:51:21.660 | and this applies to all companies,
01:51:24.260 | bigger companies that self-insure 'cause we self-insured.
01:51:27.100 | When I finally, when we started Cost Plus,
01:51:30.420 | I finally said, okay, it's time for me to understand
01:51:32.340 | how I'm paying for my healthcare,
01:51:33.900 | for my employees and their families.
01:51:35.820 | And the first thing I looked at was
01:51:37.680 | a lot of these companies use employee benefits consultants.
01:51:40.620 | And turns out I was getting,
01:51:42.460 | I was paying $30 per employee per month,
01:51:46.060 | which was millions of dollars a year.
01:51:48.700 | And they were just sending us to the companies
01:51:52.220 | that paid them the biggest commissions.
01:51:54.300 | I'm like, how fucking dumb am I, right?
01:51:57.940 | So I'm like, okay, we're cutting that.
01:51:59.580 | And then I looked at our medication,
01:52:03.460 | our prescription deal that goes through the PBMs
01:52:06.260 | that we were using.
01:52:07.540 | And that the consultant connected us with.
01:52:09.860 | And I took a list of, this was early on in Cost Plus drugs,
01:52:14.260 | list of the generic drugs that we sold
01:52:16.460 | that cost more than $30
01:52:17.980 | that the Mavericks also had purchased, right?
01:52:20.340 | We were able to get that claims data.
01:52:22.220 | And it turns out we spent $169,000 with that PBM,
01:52:27.220 | one of the big three PBMs.
01:52:29.220 | And it would have cost us buying
01:52:30.460 | from Cost Plus drugs $19,000.
01:52:32.500 | And that's just a simple example.
01:52:36.060 | Then I looked at the insurance side of things, right?
01:52:38.260 | We self-insure, so there weren't premiums per se.
01:52:41.500 | But we were getting charged $17.15 per employee per month
01:52:45.900 | just to use the network that they put together for us,
01:52:49.460 | providers, hospitals, whatever.
01:52:51.420 | And I'm like, all right, are there companies
01:52:53.020 | that won't charge us to put together these networks?
01:52:55.380 | Turns out there's a lot of them.
01:52:57.080 | And those insurance companies and those PBMs
01:53:00.780 | are also responsible for determining what claims,
01:53:04.420 | what to authorize and what to deny, right?
01:53:09.420 | So for a drug, it may be, all right,
01:53:12.000 | this is an expensive drug,
01:53:13.600 | but before they'll say they'll pay for the drug
01:53:17.540 | that your doctor wants to prescribe for you,
01:53:19.380 | you have to try these three other drugs
01:53:20.820 | in what's called step-up therapy, right?
01:53:22.620 | To see if these other cheaper drugs work,
01:53:24.480 | or they're not even necessarily cheaper.
01:53:26.340 | They may be being pushed
01:53:28.980 | because they're getting a higher rebate.
01:53:31.860 | And so I'm like, that's insane.
01:53:33.700 | I want my employees to get the medication
01:53:36.420 | that the doctors say is best.
01:53:39.100 | And so I didn't realize those were the intricacies
01:53:42.940 | of how my health or where my healthcare dollars went.
01:53:46.680 | There's not a single CEO who does
01:53:48.860 | because that's not a core competency that they need.
01:53:51.020 | And the CFOs, that's not their core competency.
01:53:53.820 | And the HR people, they contribute
01:53:56.340 | and they understand it some
01:53:57.460 | because they're dealing with the claims,
01:53:59.000 | but they spend most of their prescription drug-related time
01:54:02.040 | or healthcare-related times
01:54:03.420 | trying to get pre-authorizations approved.
01:54:06.220 | So your kid breaks their arm or you get sick
01:54:10.300 | and you go to the doctor,
01:54:11.920 | and before the doctor will do a surgery or do whatever,
01:54:15.560 | they have to go to the insurance company
01:54:17.020 | and get pre-authorized.
01:54:18.460 | And then they always say no, right?
01:54:20.660 | And then you have to go back
01:54:21.700 | and somebody has to argue for you.
01:54:23.020 | And that just eats up employee time
01:54:24.700 | because I'm sick or my kid's sick
01:54:27.000 | and you're wasting my time.
01:54:28.020 | Eats up HR time.
01:54:29.300 | The CEOs don't know any of this, right?
01:54:31.820 | So what I'm saying is, one,
01:54:33.980 | the smartest thing to do
01:54:34.820 | is to get a healthcare CEO at every company
01:54:36.860 | with over, let's say, 500 employees
01:54:38.940 | that focuses on all these things.
01:54:40.540 | You'd save a shitload of money.
01:54:42.300 | And two, healthcare is your second largest
01:54:45.420 | line-item expense after payroll.
01:54:48.060 | And in some companies,
01:54:49.860 | it's hundreds of millions, billions of dollars, right?
01:54:52.540 | And you don't understand it
01:54:53.820 | and you're letting these guys rip you off.
01:54:55.940 | And it's because these big CEOs don't understand it
01:54:59.120 | and are getting ripped off
01:55:00.740 | that the industry is the way it is
01:55:03.580 | because that allows the opacity to continue.
01:55:07.540 | - That's fascinating.
01:55:08.820 | So most companies outsource, offload
01:55:13.820 | the expertise on the healthcare side
01:55:17.060 | when they really should be internally,
01:55:18.460 | there should be an expert--
01:55:19.300 | - Yes, because it's the wellness of your employees
01:55:21.660 | and their families.
01:55:22.500 | - And it costs a lot of money.
01:55:24.100 | - Yeah, but if your employees aren't healthy
01:55:25.720 | or if they're worried about their kids,
01:55:27.540 | and what is more worrisome and detrimental
01:55:30.820 | to the performance of a company, right?
01:55:34.060 | A DEI program or having to go to HR
01:55:39.060 | and scream and yell and explain,
01:55:41.300 | and your doctor wasting their time doing the same thing
01:55:44.100 | to get authorization for a surgery or a medication.
01:55:48.140 | It's insane.
01:55:48.980 | - What made you decide to step into this cartel-like
01:55:52.460 | situation where so much is opaque?
01:55:56.220 | - So I got a cold email from a Dr. Alex Oshmyansky,
01:55:59.140 | who's my co-founder.
01:55:59.980 | He's a radiologist by trade and a physicist
01:56:02.220 | and a smart motherfucker.
01:56:03.620 | And he had a pharmacy that he wanted
01:56:06.700 | to create a compounding pharmacy
01:56:08.460 | that would manufacture generic drugs
01:56:10.460 | that were in short supply
01:56:11.900 | because it happens all the time that things aren't available.
01:56:15.140 | I'm like, you're thinking too small.
01:56:16.560 | We should do something on a much bigger scale.
01:56:19.020 | And then it was right around the time
01:56:20.540 | they were sending the pharmacy bro,
01:56:21.900 | Martin Shkreli to jail.
01:56:23.740 | And so I was reading up on that
01:56:25.620 | and he increased the price of this drug Daraprim.
01:56:29.100 | I think it was like 7,500% or increased
01:56:31.740 | a low cost drug to $7,500, one of those.
01:56:34.180 | And I'm like, well, if he can just jack up
01:56:35.900 | the price of this drug and charge more
01:56:38.420 | and get away with it,
01:56:39.400 | this has to be an incredibly inefficient market.
01:56:41.800 | And so the question is, why is he able to do it?
01:56:45.820 | And it was immediately apparent
01:56:47.480 | that it was a lack of transparency.
01:56:49.380 | And so can we start a company that is fully transparent
01:56:52.100 | with our costs, our markup and our selling price?
01:56:55.300 | And see if it works.
01:56:57.140 | And so we went for it and it took off immediately.
01:56:59.920 | I mean, you read a press release from a company saying,
01:57:03.820 | they were creating a cost advantage program,
01:57:05.860 | basically pretending to replicate us.
01:57:08.180 | - Yeah.
01:57:09.140 | - We haven't been in business two years.
01:57:11.140 | How insane is that?
01:57:13.500 | - Did you get a lot of pressure?
01:57:14.580 | I mean, I'm sure they're very good at playing games.
01:57:17.460 | So cartel type situations, they protect.
01:57:20.340 | It feels like healthcare,
01:57:21.900 | it's very difficult to get in there.
01:57:23.580 | - Yeah, it does.
01:57:24.420 | - I mean, and the whole industry is an arbitrage,
01:57:26.340 | but we don't work inside the system,
01:57:27.700 | we work outside the system.
01:57:29.300 | And so we don't work with those biggest companies.
01:57:31.140 | The biggest companies with the most dominant control,
01:57:33.900 | it's very insulated and very controlled, like you said.
01:57:36.900 | We work outside them, we won't work with them.
01:57:39.100 | And so because of that,
01:57:40.620 | we don't have access to every medication
01:57:42.660 | because they've told a lot of the big brand manufacturers
01:57:46.240 | that if they work with us,
01:57:47.460 | they'll take them off their formularies
01:57:49.340 | or change the rebate structure
01:57:51.180 | so that they won't be prescribed as much.
01:57:53.020 | - Yeah, it is dark, but we'll get past that, right?
01:57:55.260 | Because there's a downstream impact of all this
01:57:58.060 | in the rebates and the greediness of those big three PBMs.
01:58:02.100 | When you go to a local pharmacy here in Austin, right?
01:58:05.980 | And let's just say you have a friend here, right?
01:58:09.100 | That is on Medicare or Medicare Advantage,
01:58:11.380 | and they go to a local pharmacy
01:58:13.100 | and they get a drug that costs $600.
01:58:16.460 | Well, an insurance company, that's $600.
01:58:20.540 | The pharmacy first buys that drug for probably that price,
01:58:24.180 | minus 5%, so $570.
01:58:26.900 | Then there's probably a copay by the patient,
01:58:29.220 | and that's probably $20.
01:58:30.380 | So now the net investment that the pharmacy,
01:58:33.900 | the local pharmacy has for that brand medication is $550.
01:58:38.720 | Where it gets really fucked up is those big three PBMs,
01:58:44.980 | they're not reimbursing them $550 or more.
01:58:49.260 | They're reimbursing them $500 or less.
01:58:53.700 | And literally those community pharmacies
01:58:56.340 | are eating that loss.
01:58:58.900 | And as a result,
01:58:59.980 | they're going out of business left and right.
01:59:01.780 | And the most insane part of it is,
01:59:04.480 | yes, with corporate employer insurance, that happens,
01:59:08.580 | but it happens more with Medicare Part D
01:59:10.660 | and Medicare Advantage.
01:59:11.940 | It happens all the time with those, almost with every script.
01:59:14.860 | So the government is complicit
01:59:17.340 | in these community pharmacies going out of business.
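The community-pharmacy math just walked through, written out as a small worked example using the figures from the conversation.

```python
# Worked example of the reimbursement squeeze described above, using the
# figures from the conversation: a $600 brand drug, bought at roughly 5% off
# list, a $20 copay, and a PBM reimbursement of about $500.

list_price    = 600.0
acquisition   = list_price * 0.95       # pharmacy buys at ~5% off list -> $570
copay         = 20.0                    # collected from the patient
net_outlay    = acquisition - copay     # pharmacy's own money at risk -> $550
reimbursement = 500.0                   # what the PBM actually pays back

loss = net_outlay - reimbursement
print(f"Acquisition cost:        ${acquisition:.2f}")
print(f"Net outlay after copay:  ${net_outlay:.2f}")
print(f"PBM reimbursement:       ${reimbursement:.2f}")
print(f"Loss on this one fill:   ${loss:.2f}")
```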
01:59:20.340 | So how does that connect to Cost Plus Drugs
01:59:23.940 | and what we're doing and the big brands?
01:59:26.040 | The big brands know that
01:59:27.820 | if all these community pharmacies are going,
01:59:30.420 | tens of thousands of them are going to go out of business
01:59:32.700 | because of the way this pricing is,
01:59:34.600 | they're gonna lose a connection
01:59:36.680 | between their brand medications
01:59:38.820 | and grandma and grandpa and aunt Sally.
01:59:41.260 | And all that business is gonna get transferred
01:59:43.540 | to the big companies.
01:59:44.860 | And they're gonna have even less leverage.
01:59:46.680 | So they're working with us to come up with programs
01:59:49.180 | that are very supportive of independent pharmacies.
01:59:52.160 | And that's gonna allow us to break the cartel
01:59:55.260 | because it's in their best interest
01:59:56.800 | not to allow them to be so vertically integrated
01:59:59.540 | that they destroy the entire community
02:00:01.660 | and independent pharmacy industry.
02:00:03.640 | - Is there other aspects of the healthcare industry
02:00:05.560 | that could use this kind of transparency and revolutionizing?
02:00:10.560 | - Yeah, so what we're gonna do with our own healthcare,
02:00:13.580 | right, we're not gonna be in the business
02:00:15.260 | of selling healthcare or anything like that or operating.
02:00:17.580 | But the things we do for my companies,
02:00:20.740 | we're only gonna do deals with providers,
02:00:23.660 | healthcare providers,
02:00:24.760 | that allow us to be completely transparent.
02:00:27.140 | So that whatever contracts we do,
02:00:28.820 | we're gonna post them all.
02:00:29.700 | Whatever pricing we get, we're gonna post them all.
02:00:31.940 | So that every company who's our size or even bigger
02:00:35.220 | will have a template that they can work on,
02:00:37.560 | which will take it away from
02:00:39.460 | the big three insurance companies and the big three PBMs.
02:00:44.160 | Because now, without that transparency,
02:00:46.500 | they have to use consultants who are getting paid
02:00:48.500 | by those big three, you know, those big companies,
02:00:50.860 | and aren't giving them the best response.
02:00:54.300 | And so now that transparency will overcome that.
02:00:56.740 | - And you're using your, how should I say it,
02:00:59.860 | celebrity, your name, to kind of push this forward.
02:01:02.620 | - Yeah, it's the only company I've ever put my name on.
02:01:04.060 | - It's weird that people aren't getting into the space.
02:01:06.820 | Like, public people, you know, like big,
02:01:09.900 | there's not like a big, you know, you look at tech,
02:01:12.580 | there's like these, like, CEOs are open and public,
02:01:17.080 | and they're pushing the company,
02:01:18.720 | and they're selling everything.
02:01:20.360 | It's like all transparent.
02:01:22.220 | But you don't see that in healthcare.
02:01:25.480 | - No, because it's a big business, and most people,
02:01:28.320 | like if I was 25 trying to start a company,
02:01:30.240 | I'd work in the system.
02:01:31.280 | 'Cause if I can build it up big enough,
02:01:32.760 | they would just buy me.
02:01:34.240 | And I'd make, you know, money and buy a sports team.
02:01:36.520 | But I don't need that money now.
02:01:38.560 | - Let me ask you about AI.
02:01:39.600 | You've gotten a little bit of an argument about open source.
02:01:43.700 | I think you stepped in between Vinod Khosla
02:01:46.520 | and Marc Andreessen.
02:01:47.980 | You think AI should be open sourced?
02:01:50.140 | - Yeah, for sure.
02:01:51.620 | - So like all that discussion we've been having
02:01:53.580 | about like Google and so on, one of the solutions-
02:01:55.460 | - Well, okay, two different things.
02:01:56.580 | Meaning that Meta's doing open source, right?
02:02:00.500 | That's a good choice for them.
02:02:01.540 | I think that's a smart choice, right?
02:02:03.400 | But it's just a business decision for everybody else.
02:02:05.420 | I don't think it should be forced.
02:02:07.100 | - Forced, yes, yeah.
02:02:08.280 | - Right, and even Google's open sourcing some of the models.
02:02:12.120 | - Because they're all, that's a very incestuous industry
02:02:15.120 | where, you know, the people all work together at some level.
02:02:18.700 | They read the same papers.
02:02:20.120 | They go to the same conferences.
02:02:21.800 | You know, it's like the early days of streaming
02:02:23.660 | and the internet where people used
02:02:25.200 | the same technology everywhere.
02:02:26.920 | And now they just try different things
02:02:28.320 | and you get one smart or two, a couple smart people
02:02:30.680 | in one company like Anthropic, right?
02:02:33.320 | And they do things a little bit better
02:02:34.520 | and model efficiency gets better.
02:02:36.720 | So, you know, it's just a business choice
02:02:38.780 | but I don't think it should be forced
02:02:40.140 | but I think it's a smart business decision.
02:02:42.300 | - Open sourcing is a smart business decision.
02:02:43.920 | - Yeah.
02:02:44.820 | - It's a tricky one.
02:02:45.780 | I mean, Google's pioneering that with TensorFlow
02:02:48.660 | in the AI space.
02:02:50.500 | That's a tricky decision to get.
02:02:51.340 | - It really, really is, right?
02:02:52.620 | But go back to historically, you know,
02:02:55.260 | there was digital computing which was a dominant player
02:02:58.700 | and they thought, and IBM to a certain extent,
02:03:02.280 | thought that they wouldn't be subject to a problem
02:03:06.500 | with the PC industry.
02:03:08.020 | And then all of a sudden,
02:03:09.500 | with their mainframes and everything,
02:03:13.060 | they had captive software,
02:03:14.800 | they wouldn't use off the shelf software, right?
02:03:17.380 | So for a Digital Equipment mainframe or an IBM mainframe,
02:03:21.020 | you needed software that was written for it.
02:03:22.780 | There was nothing off the shelf.
02:03:23.940 | And when the PC industry came along,
02:03:26.260 | it was the exact opposite.
02:03:27.720 | There was, you know, MS-DOS and then Windows,
02:03:29.980 | things that were off the shelf that every PC could use.
02:03:32.300 | And that changed how people thought about software.
02:03:34.940 | And I think the same thing will happen here
02:03:36.740 | where it's going to be as models become more efficient
02:03:41.500 | and easier and less expensive to train,
02:03:44.700 | I think there'll be more reasons to open source.
02:03:49.100 | - Yeah, that's the hope.
02:03:50.140 | It creates more competition
02:03:51.300 | and a lot of different diversity of approaches
02:03:54.780 | in how they're implemented, deployed,
02:03:57.340 | what kind of products they create, all of that.
02:03:59.700 | Vinod compared the danger of that to the Manhattan Project.
02:04:03.980 | - Yeah, yeah, I'm not buying that at all.
02:04:06.340 | - You don't see the parallels
02:04:07.180 | between nuclear weapons and AI.
02:04:08.820 | - No, no, I think I'm not an AI fatalist at all, right?
02:04:13.500 | I'm an AI optimist.
02:04:15.580 | And, but it's not to say that there isn't a lot
02:04:19.140 | of scary shit that can happen with it.
02:04:20.940 | - Yeah.
02:04:21.780 | - Militarily, you know, like I said earlier,
02:04:25.060 | I'm a big believer that there's going to be millions
02:04:28.020 | and tens of millions of models
02:04:29.980 | and people will take their expertise
02:04:33.940 | and either get hired for it and contribute
02:04:37.700 | or create their own models and license.
02:04:40.780 | So that, you know, you see now
02:04:42.620 | with this thing called mixture of experts, right?
02:04:44.780 | Where you connect things and people can take their expertise
02:04:49.780 | and we'll be able to take that expertise
02:04:52.860 | and retain it in a way that they want to retain it.
02:04:56.420 | So, you know, I don't think there's gonna be
02:04:58.460 | one medical database.
02:04:59.460 | And I told this to people at a couple of big companies
02:05:01.780 | that were doing healthcare initiatives.
02:05:04.060 | Branding is so important in the healthcare space,
02:05:08.580 | if, you know, for hospitals, you know, the Mayo Clinics,
02:05:11.340 | the MD Andersons, they're huge brands.
02:05:13.780 | And I don't think they're just gonna give up their expertise
02:05:16.220 | to some, you know, main singular model, you know,
02:05:20.540 | and say, okay, you know, whatever expertise we have
02:05:23.780 | is available to you in Gemini or ChatGPT
02:05:26.900 | or, you know, so-and-so's version of Med is open source.
02:05:30.420 | I just don't, there's just, that would be business suicide.
02:05:33.420 | And so I think you're gonna see each of them
02:05:36.580 | have their own models and update them as they go
02:05:39.220 | and license them.
02:05:40.380 | - Yeah, and yeah, make money from the expertise.
02:05:43.100 | - You have to. - Don't give away the expertise.
02:05:43.940 | - You have to. - Yeah, yeah, yeah.
02:05:46.140 | And the expertise evolves and grows
02:05:47.620 | and all that kind of stuff and you want to own that growth.
02:05:50.820 | What advice would you give to young people?
02:05:53.100 | You have an exceptionally successful career.
02:05:55.260 | You came from little, made a lot.
02:05:57.780 | What advice would you give them?
02:05:59.620 | - Love your life, right?
02:06:01.020 | You know, find the things that you can enjoy.
02:06:03.540 | Be curious.
02:06:04.940 | You don't have to have all the answers
02:06:07.100 | when you're 12, 15.
02:06:09.040 | I get emails from 13, 15-year-old kids, right?
02:06:11.700 | - What do I do? - What do I do, right?
02:06:13.620 | You know, I feel like I'm being held back.
02:06:15.740 | I'm like, at 15, you feel like you're being held back?
02:06:19.100 | But just be curious 'cause you don't have to have the answers.
02:06:21.620 | You don't have to know what you're gonna be
02:06:22.700 | when you grow up.
02:06:24.060 | I'm a hardcore believer that everybody has something
02:06:28.980 | that they're really, really, really good at
02:06:31.980 | that could be world-class great,
02:06:33.460 | every single human being on this planet.
02:06:35.500 | And the hard part is just finding what that is
02:06:38.100 | and in some places having resources to enable it.
02:06:41.300 | But be curious so you can find out what it is.
02:06:45.300 | I didn't take a technology class,
02:06:46.580 | well, I took one technology class in college,
02:06:50.040 | Fortran programming, and I cheated on it, right?
02:06:52.740 | I mean, it wasn't until I got a job at Mellon Bank
02:06:55.580 | and I started learning how to program
02:06:56.980 | in this thing called Ramis,
02:06:58.460 | this scripting language,
02:07:00.460 | that I realized, oh, this is interesting to me
02:07:02.140 | and I like it.
02:07:02.980 | And that's what got me a job selling software
02:07:05.620 | and going on from there.
02:07:08.380 | You just don't know what that's going to be
02:07:10.460 | until you go out and experience different things.
02:07:12.500 | So for anybody young out there listening,
02:07:14.740 | enjoy your life, find things to smile about,
02:07:18.540 | be curious, read, watch, expose yourself
02:07:22.860 | to as many different ideas as you can
02:07:25.580 | because something's gonna click at some point.
02:07:27.460 | You may be 15, you may be 25, you may be 55,
02:07:31.220 | but it can happen.
02:07:33.060 | - One thing to mention is sometimes it's difficult
02:07:35.700 | because your parents, the people around you,
02:07:38.580 | might not be supportive or might not be of help
02:07:42.300 | in finding the thing you're good at.
02:07:45.900 | In fact, in my own life, the society was such
02:07:49.460 | that I don't know if it helped much
02:07:52.500 | in finding the thing I was good at.
02:07:54.540 | I'm still not sure what that is.
02:07:56.220 | But I think--
02:07:57.060 | - I think interviewing has done pretty well for you.
02:07:58.580 | - Well, it's not even that,
02:07:59.660 | there was a thing where I saw the beauty in people,
02:08:05.580 | very intensely.
02:08:07.780 | So you can call it empathy, all that kind of stuff.
02:08:10.540 | - I would call it wokeness.
02:08:11.900 | - Super woke.
02:08:14.300 | I guess you could say just super woke.
02:08:16.780 | That's me.
02:08:17.620 | But in the education system I came up in,
02:08:23.140 | it was very much hard mathematics, science, and so on.
02:08:27.100 | And I didn't notice that, whatever that was in me.
02:08:30.700 | But you have to keep the flame going.
02:08:32.780 | You have to try to find your way and see what's useful.
02:08:35.460 | And others around you might not always notice it,
02:08:38.660 | so it might take time.
02:08:39.540 | So it could be lonely.
02:08:41.460 | You really have to find the strength
02:08:43.860 | to believe in yourself.
02:08:44.900 | - Oh, for sure.
02:08:46.100 | And I'll tell you one quick story.
02:08:48.220 | 1992, I went to Moscow State University
02:08:51.820 | to teach kids how to start businesses.
02:08:56.220 | - Wow.
02:08:57.060 | - 'Cause I had sold MicroSolutions and I wanted to travel.
02:09:01.140 | And I took Russian in high school.
02:09:02.940 | My Russian is like not so good.
02:09:05.140 | (laughing)
02:09:07.860 | - Good enough to remember that.
02:09:09.060 | - Yeah, right, yeah.
02:09:10.860 | But it was interesting to me and I bring it up
02:09:12.980 | because they didn't know what the word profit meant, right?
02:09:17.980 | But at the same time, I would go around and meet people
02:09:23.780 | and it was this entrepreneurial,
02:09:27.420 | like right after the Soviet Union fell,
02:09:30.380 | entrepreneurship went through the roof.
02:09:32.020 | I mean, a lot of it was mafia-driven,
02:09:33.940 | but people found that spark
02:09:37.940 | because I think that is natural.
02:09:40.940 | And so you just never know when and how
02:09:43.980 | the circumstances will come together
02:09:46.100 | for you to be able to take advantage.
02:09:48.060 | - That spark is really important to comment on.
02:09:50.180 | In Russia and Ukraine,
02:09:52.240 | I think the system kind of suppresses that spark somehow.
02:09:57.580 | As you said, you saw the natural entrepreneurship,
02:10:00.420 | but the entrepreneurial spirit isn't there
02:10:04.180 | once you grow up in either of the nations I mentioned.
02:10:06.740 | It is there, though.
02:10:07.980 | - No, I believe it, right?
02:10:08.820 | I mean- - Especially in Ukraine.
02:10:09.660 | But there's something about the system
02:10:10.660 | that kind of- - Without question.
02:10:12.420 | - Be reasonable, be secure, be safe.
02:10:14.820 | - There would have been no reason for me to go over
02:10:16.100 | to do what I was doing if it was otherwise.
02:10:18.780 | - But that's the thing
02:10:19.620 | that really can help a country flourish.
02:10:22.540 | - You know, it's gonna be interesting with Ukraine
02:10:23.940 | if they're able to survive this, right?
02:10:25.940 | Because as horrific as it is,
02:10:28.660 | as you saw across Europe after World War II,
02:10:31.020 | the rebuilding creates opportunities.
02:10:35.380 | - Rebuilding creates opportunities,
02:10:37.000 | but first the war has to end, how that ends-
02:10:39.660 | - I don't know either.
02:10:40.500 | - Is a really complex path.
02:10:42.580 | What gives you hope about the future of humanity?
02:10:45.220 | - Just looking in my kids' eyes,
02:10:48.820 | just talking to them and seeing their spirit,
02:10:53.300 | their friends' spirit.
02:10:54.660 | And obviously we're blessed as can be, right?
02:10:56.720 | And it's not the same for every kid,
02:10:58.380 | but I do, I get emails that I respond to,
02:11:01.100 | I don't respond to all of them,
02:11:02.040 | but from 13, 14, 15-year-old kids around the world,
02:11:05.940 | 'cause Shark Tank's shown everywhere,
02:11:08.020 | asking me business questions.
02:11:09.620 | And it's just like, they took the time.
02:11:12.780 | They were that curious and that interested.
02:11:14.940 | And I see it when I talk to schools,
02:11:18.020 | you know, when I go to different groups
02:11:21.300 | that spark in kids' eyes that there's something bigger
02:11:26.660 | and better and exciting out there.
02:11:28.260 | And that's not to say there's not fear,
02:11:30.100 | you know, climate and any other number of things,
02:11:33.860 | but that's the beauty of kids.
02:11:35.900 | And I think Gen Z really embodies that.
02:11:40.900 | And to me, that's just really exciting.
02:11:43.820 | - They dream, they dream big.
02:11:45.700 | They see the opportunity for making the world better.
02:11:48.380 | That's cool.
02:11:49.220 | It's cool to see in young people's eyes that dream.
02:11:53.020 | And the feeling that "I could be the one to do it too,"
02:11:55.580 | which is a super powerful thing.
02:11:56.620 | - It's funny, 'cause when I go talk
02:11:58.340 | to like elementary school kids, right?
02:12:01.020 | One of the things I do, I said, okay, let's look around.
02:12:03.820 | You see that light there?
02:12:05.180 | One day that light didn't exist.
02:12:06.940 | Then somebody had the idea.
02:12:08.460 | Then somebody created a product out of it.
02:12:10.260 | And now your school bought that.
02:12:11.760 | You see that chair?
02:12:13.300 | Chairs didn't always look like that.
02:12:14.700 | Somebody had that idea.
02:12:16.620 | Why not you?
02:12:17.500 | So when they walk out, I make them ask themselves,
02:12:20.760 | why not me?
02:12:21.800 | Why can't I be the one to change the world?
02:12:24.300 | - Thank you for that beautiful, hopeful message.
02:12:26.740 | And thank you for talking today, Mark.
02:12:28.260 | You're fun to follow.
02:12:31.700 | I'm a big fan of yours,
02:12:32.940 | but you're also an important person in this world.
02:12:35.220 | I really appreciate everything you do.
02:12:36.780 | - Well, I appreciate it.
02:12:37.620 | Thanks for saying that, Lex.
02:12:38.540 | Keep on doing what you're doing.
02:12:39.380 | This was great.
02:12:40.220 | I really enjoyed this.
02:12:41.820 | - Thanks for listening to this conversation
02:12:43.340 | with Mark Cuban.
02:12:44.580 | To support this podcast,
02:12:45.660 | please check out our sponsors in the description.
02:12:48.220 | And now let me leave you with some words from Oscar Wilde.
02:12:51.820 | Imagination was given to man
02:12:53.700 | to compensate him for what he is not.
02:12:56.820 | And a sense of humor was provided
02:12:59.500 | to console him for what he is.
02:13:02.900 | Thank you for listening and hope to see you next time.
02:13:06.100 | (upbeat music)