
20 Years of Tech Startup Experiences in One Hour


Chapters

0:27 Jeremy Howard
11:52 Octopus Deploy
12:48 fast.ai
13:21 Kaggle
27:11 Deep Learning
37:00 fast.ai
37:05 Faster AI
43:25 Lack of Investment in Research
53:56 What's Harder Getting an Idea
58:44 What's Next
59:34 How To Market an Early Stage Company


00:00:00.000 | - Hi everybody and welcome to the literally just launched
00:00:05.000 | Queensland AI Hub.
00:00:07.660 | There's the rock in the hoodie.
00:00:10.560 | Queensland AI Hub is in Queensland.
00:00:12.760 | So I actually was only wearing this for the advertising.
00:00:17.640 | I actually don't need it.
00:00:18.840 | All right.
00:00:22.900 | So welcome to sunny Queensland.
00:00:25.820 | My name's Jeremy Howard.
00:00:29.640 | I'm originally from Australia, grew up in Melbourne
00:00:33.880 | and then spent 10 years
00:00:37.360 | over in the San Francisco Bay area.
00:00:40.880 | What I always used to think of as Silicon Valley,
00:00:43.280 | but then I got there, I was staying in San Francisco
00:00:46.600 | when somebody said, let's meet up in Silicon Valley
00:00:48.680 | and an hour and a half later, I still hadn't got there.
00:00:50.960 | And I thought, oh my God, okay,
00:00:52.120 | it's actually quite a long way, especially with the traffic.
00:00:55.920 | So San Francisco Bay area, I was there for about a decade
00:00:59.280 | and returned back here to Australia two months ago
00:01:04.160 | and have made the move from Melbourne to Queensland,
00:01:09.160 | which I'm very, very happy about.
00:01:11.600 | So this is a really lovely place to be.
00:01:15.420 | Having said that, overwhelmingly the reaction
00:01:21.120 | that Rachel, my wife and fast.ai co-founder, and I get
00:01:24.680 | when we tell somebody, when they come up and they'll say,
00:01:27.480 | oh, welcome to Australia, welcome to Queensland.
00:01:32.480 | How long are you here for?
00:01:34.960 | Oh, we've moved here.
00:01:36.460 | You've moved here, why?
00:01:39.600 | And there's this kind of sense of like,
00:01:43.920 | why would anybody wanna move to Australia?
00:01:46.940 | Why would anybody wanna move to Queensland?
00:01:48.360 | You were there, you were in Silicon Valley.
00:01:50.900 | Not really, San Francisco, but what are you doing?
00:01:54.280 | And to be fair, it is a reasonable question
00:01:59.160 | because, you know, this is not exactly
00:02:04.160 | the global hub of AI and AI investment.
00:02:07.800 | In fact, we're way down here in terms of investment in AI
00:02:12.800 | at a massive 0.29% of global investment.
00:02:18.100 | And this data is from Andrew Lai from Boab AI.
00:02:22.360 | Thank you very much to Andrew, who's actually given me
00:02:24.200 | quite a lot of cool data that I'll be sharing.
00:02:27.220 | So yeah, I definitely feel that.
00:02:34.480 | I gotta say it's 0.29% more than when I left,
00:02:37.960 | so that's good.
00:02:38.920 | But I wanna kind of make the argument today
00:02:44.440 | that actually this is a really great place
00:02:47.400 | to start a tech startup and actually a really great place
00:02:50.100 | to do AI research or AI implementations
00:02:55.100 | despite the obvious issues.
00:02:58.620 | So let me tell you about this insight
00:03:07.640 | through the lens of kind of describing my journey,
00:03:12.880 | I guess, to get here.
00:03:15.800 | So my journey, as I said, kind of started in Australia,
00:03:20.800 | right, that's a bit of a thick one, isn't it?
00:03:23.720 | Let's try making that a bit thinner.
00:03:26.400 | Okay, so I started out in Australia
00:03:31.680 | and 25 or so years ago, I thought,
00:03:36.280 | you know, it'd be really cool to start a startup.
00:03:38.700 | I mean, back then I only thought of it as,
00:03:41.120 | start a company, you know, make a company.
00:03:43.280 | And then I thought, well, there's a problem.
00:03:45.720 | Jeremy, you don't know anything about business.
00:03:48.080 | So, you know, initially it was like,
00:03:52.720 | oh, let's do a startup or a company.
00:03:55.060 | And it's like, nah, you don't know anything about business.
00:03:59.700 | You don't know what you're doing.
00:04:01.080 | So let's learn about that.
00:04:03.080 | So I actually went into consulting.
00:04:04.920 | So I thought, okay, let's go to McKinsey and Company.
00:04:10.140 | They know about business, and I'd spend a couple of years there.
00:04:15.400 | And I went to a couple of different consulting firms
00:04:19.980 | along that journey.
00:04:21.360 | And what I discovered along the way
00:04:23.040 | is there's no such thing as business.
00:04:25.680 | There's such a thing as like making things
00:04:28.680 | that people want and then selling it to them.
00:04:30.760 | And that's about the end of it.
00:04:31.960 | So I did certainly learn some valuable skills
00:04:34.600 | from my time in consulting,
00:04:35.960 | particularly the skills around how to influence people,
00:04:39.760 | how to influence organizations,
00:04:42.760 | but the actual explicit feedback I got about my ideas
00:04:46.120 | was on the whole terrible.
00:04:48.520 | For example, I was very proud of myself
00:04:51.300 | when one day I came in to work with a CD-ROM
00:04:54.880 | that I had bought that contained this really cool thing.
00:04:57.680 | Somebody had like got lots of data
00:05:00.280 | about what movies people like.
00:05:03.560 | And it's like, this person likes these movies
00:05:05.360 | and this person likes these movies.
00:05:07.040 | And through some kind of magic I didn't understand,
00:05:09.300 | which I now know is called collaborative filtering,
00:05:11.420 | you could type in some movies you like
00:05:13.640 | and it would tell you other movies you might like.
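
To make the idea concrete, the simplest item-based form of collaborative filtering fits in a few lines. This is only an illustrative sketch on made-up like/dislike data, nothing like the CD-ROM product or a production recommender; the ratings matrix and the recommend helper are invented for the example.

```python
import numpy as np

# Rows are users, columns are movies; 1 means "this person likes this movie".
ratings = np.array([
    [1, 1, 0, 0],   # user 0 likes movies 0 and 1
    [1, 1, 1, 0],   # user 1 likes movies 0, 1 and 2
    [0, 0, 1, 1],   # user 2 likes movies 2 and 3
])

# Cosine similarity between movie columns: movies liked by the same people
# come out similar to each other.
norms = np.linalg.norm(ratings, axis=0)
sim = (ratings.T @ ratings) / np.outer(norms, norms)

def recommend(liked_movie: int, k: int = 2):
    """Return the k movies most similar to one the user already likes."""
    scores = sim[liked_movie].copy()
    scores[liked_movie] = -1.0        # never recommend the same movie back
    return np.argsort(scores)[::-1][:k]

print(recommend(0))   # movies most like movie 0 -> [1 2]
```
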
00:05:16.080 | And so I went in and talked to one of the directors
00:05:19.000 | at the consulting firm and I said,
00:05:21.160 | imagine building a company based on this.
00:05:23.400 | Like you could even have like a website that wasn't static,
00:05:26.120 | but you go to their homepage and it could like tell you
00:05:28.780 | what things you might want to buy
00:05:30.940 | and wouldn't that be awesome?
00:05:32.260 | And the consulting director was like,
00:05:34.440 | you have no idea how companies work.
00:05:37.120 | This isn't a company.
00:05:38.720 | Companies are about competition,
00:05:41.140 | about market forces.
00:05:42.600 | This is nerdy technology.
00:05:44.400 | Similar reaction when somebody was talking about
00:05:49.880 | creating a new web search engine,
00:05:53.080 | which was gonna be just like Yahoo,
00:05:55.400 | but as a Java applet.
00:05:57.140 | And so, and it would also have the power
00:05:59.840 | of these like big brands behind it.
00:06:02.000 | And I kind of said to them,
00:06:03.760 | I don't know, I wondered about like,
00:06:05.140 | what if we instead of having like lots of humans
00:06:09.240 | finding websites and putting them into a hierarchy,
00:06:12.480 | could we use like an algorithm
00:06:14.560 | that would automatically find interesting websites
00:06:17.160 | based on like what you typed in or something?
00:06:19.440 | Similar reaction.
00:06:20.720 | No, no, no, no, you don't understand.
00:06:22.800 | Humans need other humans to help them find things.
00:06:26.000 | You can't like get some computer
00:06:29.180 | to like do this very human job.
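
For contrast, here is roughly what "an algorithm that automatically finds interesting websites based on what you typed in" looks like at its most basic: rank pages by text relevance to the query. A hedged sketch only; the engines that actually won added link analysis on top of text relevance, and the pages, query, and scikit-learn usage here are purely illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import linear_kernel

# A tiny made-up "web" of three pages.
pages = [
    "cricket scores and match reports from around the world",
    "java applet tutorials for building interactive websites",
    "insurance pricing models and risk algorithms",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(pages)

def search(query: str):
    """Rank pages by TF-IDF cosine relevance to the query, best first."""
    q = vectorizer.transform([query])
    scores = linear_kernel(q, doc_vectors).ravel()
    return sorted(zip(scores, pages), reverse=True)

for score, page in search("java tutorials for websites"):
    print(f"{score:.2f}  {page}")
```
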
00:06:31.720 | And so overall, this was kind of my experience
00:06:34.200 | of learning business.
00:06:37.400 | And this is the first piece of advice I have
00:06:39.400 | for potential people doing tech startups here
00:06:42.960 | is don't listen to old people.
00:06:46.640 | 'Cause us old people don't know what we're talking about,
00:06:51.080 | unless it's explicitly about the actual thing
00:06:54.160 | that you wanna do.
00:06:56.280 | And they actually have years of experience in that thing,
00:07:00.000 | doing it in the new way
00:07:01.240 | that you're thinking of doing it.
00:07:02.420 | Because otherwise, all you get is these kind of
00:07:07.040 | biases about business as usual about the status quo.
00:07:10.640 | So somehow, you know, and I mean,
00:07:15.100 | in my 20s, I didn't know that
00:07:19.120 | and I thought there's something wrong with me
00:07:21.960 | that I didn't understand business,
00:07:23.800 | that I didn't understand why these ideas were bad ideas.
00:07:27.160 | So I actually ended up doing consulting for 10 years,
00:07:29.760 | which was eight years longer than I had planned,
00:07:31.800 | still trying to figure out what's wrong with me.
00:07:35.160 | Eventually, I decided to do it anyway.
00:07:37.240 | So that was the end of consulting.
00:07:39.720 | And I thought, okay, I'll start a company.
00:07:42.600 | Now, the problem is that I'd read that statistically speaking,
00:07:47.600 | new small businesses generally fail.
00:07:50.880 | So I actually had a genius move.
00:07:53.100 | I decided to start two new small businesses.
00:07:55.400 | 'Cause I thought, you know,
00:07:57.040 | probabilistically speaking, better chance of success.
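
The joke does hold up arithmetically, for what it's worth: if each venture succeeds independently with probability p, two attempts beat one. The 20% figure below is an invented number purely for illustration.

```python
p = 0.2                      # assumed chance any one startup succeeds
one = p                      # chance of success with one company
two = 1 - (1 - p) ** 2       # chance at least one of two companies succeeds
print(one, two)              # 0.2 vs 0.36
```
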
00:08:00.720 | So I started two companies.
00:08:02.400 | I started Fastmail.
00:08:04.440 | And literally within like a month of each other,
00:08:07.360 | I started Optimal Decisions Group.
00:08:09.220 | Now, let me draw Optimal Decisions Group.
00:08:13.620 | So Fastmail was an interesting startup.
00:08:17.000 | It was basically the first one to provide synchronized email,
00:08:20.240 | where whether you got your email on your phone or on your laptop
00:08:23.580 | or at your workplace, you'd see the same email.
00:08:26.480 | It's something that actually everybody in business
00:08:28.920 | already had 'cause they used MS Exchange
00:08:31.000 | or they used Lotus Notes, but normal people didn't.
00:08:34.160 | And I wanted to have that.
00:08:35.760 | So I built this company and it's still going great.
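
The technical heart of "synchronized email" is IMAP: the mailbox lives on the server, so every client that connects sees exactly the same state. A minimal sketch using Python's standard imaplib; the host and credentials are placeholders, not real account details.

```python
import imaplib

# Any device running this against the same account sees the same mailbox,
# which is the whole point of server-side (IMAP) email.
with imaplib.IMAP4_SSL("imap.example.com") as conn:
    conn.login("user@example.com", "app-password")
    conn.select("INBOX", readonly=True)
    typ, data = conn.search(None, "UNSEEN")
    print(f"unread messages: {len(data[0].split())}")
```
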
00:08:40.760 | And then Optimal Decisions was a insurance pricing,
00:08:45.760 | algorithms company.
00:08:47.520 | So very, very different.
00:08:48.360 | Fastmail sold to millions of customers around the world.
00:08:52.640 | And Optimal Decisions sold to huge insurance companies.
00:08:57.040 | So there's basically only three or four insurance companies
00:08:59.640 | in Australia big enough to use our product.
00:09:01.540 | And then, you know, a couple of dozen in America,
00:09:03.760 | some in South Africa and so forth.
00:09:05.560 | So very different kind of things.
00:09:07.160 | I didn't know anything about, you know,
00:09:11.920 | the Australian startup scene.
00:09:13.920 | So I didn't get any government grants.
00:09:15.920 | I didn't get any funding 'cause like for a consultant,
00:09:18.600 | you don't know about this stuff.
00:09:19.780 | You just build things and sell them to people.
00:09:22.720 | And so these were not Australian startups.
00:09:28.720 | They were startups
00:09:33.880 | that happened to be in Australia.
00:09:35.960 | But like, for example, Fastmail at the time,
00:09:39.440 | this is really weird.
00:09:40.760 | I called up IBM and I ordered servers
00:09:44.200 | and I had them shipped to somewhere in New York
00:09:48.380 | that I'd never been.
00:09:49.220 | And they plugged them in for me.
00:09:50.560 | And so my servers were in there
00:09:52.360 | because like, why wouldn't you do that?
00:09:54.560 | The cost of bandwidth in America
00:09:56.040 | was about a hundred times cheaper than Australia.
00:09:59.160 | And the number of customers I had access to in America
00:10:03.040 | was orders of magnitude higher.
00:10:04.960 | And so it never occurred to me
00:10:07.520 | to have my servers in Australia
00:10:09.200 | because Australia is far away
00:10:11.000 | and it's small and it's expensive.
00:10:12.720 | And kind of similar with ODG, you know,
00:10:15.320 | the focus, I mean, I certainly had some Australian clients
00:10:18.540 | but my focus was on American clients
00:10:21.600 | 'cause there's a lot more big insurance companies in America.
00:10:24.600 | And so this turned out great
00:10:28.400 | because living in Australia,
00:10:31.240 | I didn't quite have a sense of how far away we are
00:10:33.520 | and how much no one gives a shit about us
00:10:36.160 | other than maybe like cricket.
00:10:37.660 | But they don't.
00:10:42.720 | And but the fact that then we were just companies,
00:10:47.000 | not Australian companies, it didn't matter.
00:10:48.680 | It didn't matter we were a long way away.
00:10:50.180 | It didn't matter we were somewhere
00:10:51.400 | with crappy, expensive internet, you know?
00:10:53.520 | It just, you know, we were competing on a global stage
00:10:57.080 | without any constraints caused by our location.
00:11:01.360 | And so that turned out to be great.
00:11:02.520 | We ended up selling Fastmail to Opera,
00:11:05.600 | which is a Norwegian company.
00:11:07.160 | We sold ODG to LexisNexis,
00:11:09.380 | which is ultimately a UK company.
00:11:12.320 | And, you know, that turned out great.
00:11:16.680 | And so the kind of advice, I guess,
00:11:20.760 | that I feel like I got out of that was:
00:11:24.040 | in Australia, don't try to be an Australian company, you know?
00:11:27.600 | Yes, there's lots of agriculture.
00:11:29.760 | Yes, there's lots of mining,
00:11:31.880 | but that is tiny compared to all the world out there.
00:11:35.940 | And furthermore, Australian companies
00:11:37.800 | are very, very hard to sell to.
00:11:39.480 | They're very conservative.
00:11:40.640 | They're very slow moving.
00:11:41.920 | If you create something like fast mail, right?
00:11:44.680 | Where anybody can go on the internet
00:11:46.080 | and give you money for your thing,
00:11:48.600 | that tends to work out great.
00:11:50.240 | So like, for example,
00:11:51.560 | you come across this company called Octopus Deploy,
00:11:54.360 | which was a guy in Queensland who thought,
00:11:56.920 | "Oh, I could create a better kind of
00:11:58.120 | "continuous integration system for .NET."
00:12:01.200 | Created an open source software, chucked it up on GitHub,
00:12:04.560 | made a better version that you could buy
00:12:06.520 | if you wanted like 10 copies of it.
00:12:08.580 | Like it was, again, it's similar idea.
00:12:10.920 | It wasn't an Australian company.
00:12:12.520 | It was a company that happened to be in Australia.
00:12:14.800 | And a few years later, just a few months ago now,
00:12:20.660 | they got, I think it was $185 million of funding.
00:12:25.660 | And none of that funding was from Australian investors.
00:12:28.240 | That was all from American investors.
00:12:29.720 | So it kind of bypassed the whole Australian thing
00:12:34.720 | and just focused on saying like,
00:12:36.000 | "You know what?
00:12:36.840 | "I'm a pretty good .NET developer.
00:12:38.100 | "I pretty much understand quite well deployment.
00:12:41.040 | "You know, why don't I make something
00:12:44.000 | "that anybody can just come along and use?"
00:12:47.160 | And so it's a similar thing now for Rachel and me with fast.ai.
00:12:50.960 | We started fast.ai, which we'll come back to later, in the US.
00:12:55.360 | We're now moving to Australia.
00:12:57.600 | It doesn't matter.
00:12:58.740 | Like no one thinks of fast.ai as being
00:13:00.940 | an American AI company, and we can do it
00:13:03.200 | just as well here as there.
00:13:04.800 | And so, you know, we have access to the global marketplace.
00:13:08.880 | Having said that, on to the next startup,
00:13:17.200 | and some of these I co-founded.
00:13:19.600 | So ODG I co-founded, and obviously the next one,
00:13:22.000 | which is Kaggle, I co-founded.
00:13:26.880 | With Kaggle, we decided to try a different approach
00:13:31.880 | which was to get VC funding.
00:13:33.800 | Now, a similar thing, you know.
00:13:38.120 | I said to Anthony who we're doing this with,
00:13:42.160 | let's not even try to get funding in Australia
00:13:45.840 | because Australia doesn't fund tech startups.
00:13:50.080 | Like it's basically so little
00:13:51.560 | that you could just ignore it, it's tiny.
00:13:53.780 | In fact, the amount of funding
00:13:58.240 | of startups in Australia in a year
00:14:03.180 | is less than the amount of funding of startups
00:14:06.120 | in the US in a day.
00:14:08.240 | So when I say it's different, it's very, very different.
00:14:12.180 | So we went to San Francisco to try and get funding.
00:14:15.920 | And we were pre-revenue.
00:14:20.720 | And honestly, we didn't tell this to the VCs.
00:14:24.600 | We were kind of pre-business model.
00:14:26.740 | We were pretty enamored with the idea,
00:14:31.080 | but didn't quite know how to make money out of it.
00:14:33.240 | And so we thought we were being very bold
00:14:35.360 | by asking for $500,000.
00:14:39.040 | It's like, okay, that's crazy.
00:14:41.880 | But we did, you know.
00:14:45.760 | And I will never forget the time
00:14:48.360 | we went into Andreessen Horowitz
00:14:50.880 | and Mark Andreessen said,
00:14:54.400 | how much money you're looking for?
00:14:56.280 | And we said, $500,000?
00:15:01.000 | And Mark was like, hmm, what would you do
00:15:04.720 | with $5 million?
00:15:07.560 | And we were like, make a better company.
00:15:10.880 | But this was actually a start of a theme in the Bay Area,
00:15:19.920 | which was every time we'd say we want to do X,
00:15:23.320 | people would say like, well, okay, that's great.
00:15:25.800 | What if you could make an even bigger X?
00:15:27.520 | Or like, what if you could make an even better X?
00:15:31.980 | So then the Node Coaster
00:15:34.880 | came to our little co-working space in San Francisco.
00:15:39.880 | And this is the other thing to know.
00:15:41.760 | If you ever go fundraising in the Bay Area,
00:15:44.640 | everybody knows everybody.
00:15:47.480 | And they all know everything about what's going on.
00:15:49.480 | So Vinod was like, oh, I heard Mark Andreessen
00:15:53.000 | is looking at giving you $5 million.
00:15:55.240 | We're like, oh, yes.
00:15:57.880 | What would you do if Khosla Ventures
00:16:01.640 | gave you another $5 million?
00:16:04.480 | And we were like, wow, it just kept pushing.
00:16:08.920 | And it was a very different experience
00:16:12.400 | 'cause I found doing my little startups in Australia,
00:16:17.400 | it was always like, oh, I'm trying to create
00:16:23.320 | an email company that does synchronized email
00:16:27.000 | and I'm trying to sell it on the internet.
00:16:28.520 | And almost everybody would say like, why?
00:16:32.320 | Microsoft already has an email service.
00:16:34.240 | Yahoo already has an email service.
00:16:35.600 | They're bigger than you.
00:16:37.000 | They've got more developers than you.
00:16:39.440 | There's like, honestly, is there any chance that,
00:16:42.640 | obviously there's no chance you can beat them.
00:16:44.400 | So why are you doing this?
00:16:46.240 | Is there something smaller you could do?
00:16:50.760 | Is there something more targeted you could do?
00:16:52.360 | Is there something focused
00:16:53.280 | on the Australian market you could do?
00:16:55.200 | And it was like this with everybody: best friends,
00:16:57.560 | colleagues, acquaintances.
00:16:59.400 | And it's very difficult because you end up
00:17:03.400 | constantly doubting your sanity.
00:17:07.080 | And the truth is, to be a tech founder
00:17:10.680 | requires a whole lot of arrogance.
00:17:15.680 | You need the arrogance to believe
00:17:21.680 | that you can actually build something
00:17:28.040 | that other people are gonna wanna buy
00:17:30.040 | and that then other people who come along
00:17:31.320 | and try to compete with you won't do as well as you
00:17:33.160 | and you'll do better.
00:17:34.000 | You have to have the arrogance to believe you can win.
00:17:36.560 | Which is a lot of arrogance.
00:17:39.440 | But you also need the humility to recognise
00:17:42.920 | that other people come along
00:17:44.240 | and they actually have some better ideas than you.
00:17:46.080 | And so sometimes you should borrow those ideas
00:17:48.080 | or sometimes you should try and find ways to do it better.
00:17:50.120 | So it requires this weird combination
00:17:51.760 | of great humility and great arrogance.
00:17:56.120 | And in Australia, I found people mainly noticed the arrogance.
00:18:01.600 | But yeah, in the Bay Area, everybody was just like,
00:18:05.080 | oh, this is really cool that you're trying to do this thing.
00:18:08.960 | How can we help?
00:18:09.800 | Can we help you make it bigger?
00:18:11.520 | The other thing that I got a lot in Australia
00:18:16.560 | was this kind of sense of like,
00:18:18.040 | why are you trying to create that
00:18:20.000 | when there are already perfectly good things?
00:18:22.280 | It's almost like you're a whinger or a complainer.
00:18:26.840 | It's like, things aren't good enough.
00:18:29.040 | You know, why aren't you okay with what's there?
00:18:32.560 | Whereas there's this nice sense
00:18:33.920 | in the Bay Area of like, oh, it's really cool
00:18:37.960 | that you're trying to do something better.
00:18:39.800 | And so there are some cultural things that I felt
00:18:43.840 | Australia kind of needs to get over
00:18:46.880 | to build a great tech entrepreneur ecosystem.
00:18:50.520 | 'Cause it doesn't have to be Australia-wide,
00:18:52.760 | but you want people in your community
00:18:54.440 | who are cheering you on and who are believing in you.
00:18:58.600 | Anyway, we didn't actually end up
00:19:00.920 | taking money from Andreessen Horowitz.
00:19:03.400 | I can't quite remember.
00:19:04.320 | Oh, that's right, I remember why.
00:19:05.600 | They hadn't done any machine learning investments before.
00:19:09.080 | And so what actually happens with these VCs
00:19:11.480 | is the VCs you speak to don't do any of the tech stuff
00:19:15.200 | themselves, they hand it off to,
00:19:17.720 | mainly to academics, which is something
00:19:20.520 | we don't seem to have a great ecosystem for here either,
00:19:23.200 | is like, you don't see this strong connection
00:19:25.080 | between investors and academics in Australia.
00:19:27.600 | In the US, you know, Vinod would ring up
00:19:30.560 | one of the professors at Stanford or Berkeley
00:19:32.960 | and say, can you please meet with Jeremy and Anthony?
00:19:35.920 | You know, this is what they're building.
00:19:37.080 | Can you check this, this and this?
00:19:38.960 | So with Andreessen Horowitz, I mean, to their credit,
00:19:41.360 | through their DD, they kind of came to the point
00:19:44.240 | where they said, okay, we're just not convinced
00:19:45.560 | about the size of the machine learning marketplace.
00:19:47.720 | We haven't done machine learning before.
00:19:49.120 | We're not comfortable with this.
00:19:50.280 | So they bowed out, and we ended up getting our $5 million
00:19:52.480 | from somebody else.
00:19:53.920 | And one of the really interesting things
00:19:55.480 | in the VC world over there is the whole thing
00:19:58.840 | is so driven by fear of missing out, by FOMO.
00:20:02.440 | So then suddenly people that we hadn't heard from
00:20:06.280 | suddenly started emailing us with like,
00:20:08.840 | can you come here today?
00:20:11.560 | You know, we really wanna see you guys.
00:20:12.960 | We're really excited about what you're doing.
00:20:14.400 | These were people who hadn't replied to emails for weeks.
00:20:18.120 | And I'll never forget one of them.
00:20:20.000 | I'm not gonna say who.
00:20:21.840 | We went down to their office.
00:20:22.920 | We kind of had a promise,
00:20:24.560 | Anthony and I had a promise between ourselves.
00:20:26.720 | We'd never say no, right?
00:20:28.720 | We'd take every opportunity.
00:20:30.000 | We're like, we were sick of talking to VCs.
00:20:31.760 | We're like, okay, we've said, we always say yes.
00:20:35.040 | I'm so glad we did.
00:20:35.880 | Otherwise we would have missed out
00:20:36.720 | on this amazing situation.
00:20:39.960 | The people who said they were dying to see us
00:20:43.760 | left us waiting, I can't remember,
00:20:45.280 | for like half an hour in their giant board room.
00:20:48.440 | And then this guy finally does come in.
00:20:51.280 | He charges in, no introduction.
00:20:54.640 | I hear you're gonna take money
00:20:57.360 | from fucking Mark fucking Andreessen.
00:20:59.480 | Is that right?
00:21:01.760 | And I think Anthony was about to reply
00:21:04.840 | and the guy doesn't let him and he goes,
00:21:06.800 | well, let me tell you something.
00:21:09.360 | If Mark fucking Andreessen was here right now,
00:21:11.440 | I'd throw him out the fucking window.
00:21:13.520 | I'd break his arm.
00:21:14.480 | I'd take him to Stanford hospital.
00:21:16.120 | It's just down the road, you know?
00:21:17.720 | And then I'd fucking break it again.
00:21:21.000 | (audience laughing)
00:21:22.320 | This was his introduction, and Anthony goes,
00:21:24.720 | we're not taking money from Mark Andreessen.
00:21:28.280 | Well, that's fucking all right then,
00:21:30.440 | 'cause I fucking hate Mark fucking Andreessen.
00:21:32.760 | (audience laughing)
00:21:34.680 | It's like, it was so much like this over there.
00:21:39.680 | The place is crazy.
00:21:40.600 | If you've ever seen Silicon Valley, the TV show,
00:21:42.960 | it's all real, but it's crazier than that,
00:21:46.880 | but they couldn't put that in the real thing.
00:21:49.880 | Do you guys remember the hot dog detector in that show?
00:21:53.880 | Did you notice there was a real hot dog detector?
00:21:56.280 | They actually built for it on the app store.
00:21:58.360 | That was built by a fast.ai student, by the way.
00:22:01.240 | He used to come in every week to class
00:22:06.120 | and he'd always ask these weird questions.
00:22:09.960 | He'd be like, I can't tell you what I'm doing,
00:22:12.560 | but let's say somebody was trying to find microphones
00:22:17.440 | and then they got lots of pictures of microphones
00:22:19.680 | and then some of them weren't microphones,
00:22:22.640 | but they looked like microphones.
00:22:24.160 | (audience laughing)
00:22:25.920 | And then eventually, the show comes out
00:22:29.400 | and he's like, okay, that's what I was building.
00:22:31.680 | (laughs)
00:22:32.840 | That was so great.
00:22:34.000 | That was definitely one of our star students.
00:22:36.440 | Anywho, so.
00:22:40.880 | Yeah.
00:22:43.960 | Okay, so with Kaggle, what happened
00:22:49.600 | was I actually didn't expect us to raise any money, honestly.
00:22:54.600 | So I just kind of was humoring Anthony.
00:23:00.760 | He was always the one with gumption, you know?
00:23:03.240 | And I was like, yeah, okay, I'll pitch
00:23:06.840 | and I'll build the financial models
00:23:08.000 | and I'll build the deck, but don't have high expectations.
00:23:10.160 | So then we raised over $10 million
00:23:12.840 | and yeah, Vinod Khosla kind of looked at us
00:23:17.880 | and was like, so when are you guys moving here?
00:23:20.960 | I was like, oh. (laughs)
00:23:23.800 | And obviously at that point, I can't not
00:23:26.760 | 'cause I'd been in every pitch and whatever.
00:23:30.040 | So that's how I moved to San Francisco
00:23:32.480 | and I got to call my mom and was like,
00:23:34.320 | oh, this is what just happened.
00:23:35.920 | So yeah, I mean,
00:23:41.560 | moving to San Francisco was interesting.
00:23:43.840 | It was like, all right, so let's do that.
00:23:47.240 | Australia, US, what is going on with this?
00:23:52.240 | US, there you go.
00:23:55.320 | It was interesting, like I was really starstruck.
00:24:01.120 | I was like, oh, there's Google, you know?
00:24:03.800 | There's Facebook, you know,
00:24:05.000 | meetups would be at Google or Facebook
00:24:07.320 | and I'd be like talking to a Google product manager
00:24:09.480 | and I was definitely like, wow, this is very exciting.
00:24:12.840 | I felt quite starstruck.
00:24:14.880 | But the other thing I really noticed was like,
00:24:16.840 | I was talking to like these legends,
00:24:20.360 | but then I was like, they're actually really normal.
00:24:24.000 | You know, I kind of expected them to be on another level.
00:24:26.440 | I felt like as a little Australian nobody,
00:24:30.640 | I would just be dominated by these people.
00:24:33.320 | But no, I mean, when I compared them
00:24:36.440 | to my mates back in Australia, they weren't all that.
00:24:40.600 | I mean, they were fine, you know?
00:24:42.760 | They were smart enough, they were passionate.
00:24:45.360 | But they weren't on another level at all.
00:24:47.880 | And I kind of realized that actually
00:24:49.920 | the Australian kind of talent pool is just fantastic,
00:24:54.920 | you know?
00:24:56.200 | But there's this huge difference
00:24:57.960 | in opportunity and belief, you know?
00:25:02.960 | Like everybody I spoke to, you know, in San Francisco,
00:25:08.120 | like literally, I was staying in an Airbnb
00:25:10.960 | for the first few months.
00:25:13.000 | The people that ran the Airbnb were like,
00:25:16.600 | "Oh, are you here doing a tech startup?"
00:25:19.480 | 'Cause like everybody there is doing a tech startup.
00:25:20.920 | Yeah, yeah.
00:25:21.760 | Oh yeah, me too.
00:25:24.200 | You know, I'm a photographer.
00:25:25.880 | And I've got this idea that's gonna revolutionize
00:25:28.120 | how photography is done, you know,
00:25:30.600 | in product development settings.
00:25:33.160 | Like everybody you talk to has not just got an idea,
00:25:36.480 | but they want to tell you about it.
00:25:37.760 | They believe it's the best idea.
00:25:39.400 | They believe it's gonna succeed.
00:25:40.800 | Which, I don't get that here, or at least at that time,
00:25:44.280 | when I was kind of in Australia,
00:25:46.320 | I didn't get that nearly as much, you know?
00:25:50.480 | So I think that was a really interesting difference.
00:25:53.600 | And it gave me a lot of confidence in myself
00:25:56.480 | as an Australian to see that like actually,
00:26:00.400 | Aussies are not way behind.
00:26:02.400 | We're actually pretty damn good, you know?
00:26:05.880 | So that was kind of interesting to me.
00:26:09.280 | But there was other differences there.
00:26:13.080 | I guess it's part of this kind of boldness, right?
00:26:16.960 | So I felt like folks there were, on the whole, more bold.
00:26:20.480 | But interestingly, even though they were in the center
00:26:22.920 | of the world's biggest marketplace,
00:26:26.080 | they were still actually more global.
00:26:29.560 | You know, none of them were trying to build
00:26:30.840 | American startups for American audiences,
00:26:33.200 | American companies.
00:26:34.040 | There was always this assumption
00:26:37.200 | that we're gonna chuck stuff up on the internet
00:26:39.360 | and everybody's gonna go and buy it.
00:26:41.120 | And you know, in terms of like who really needs
00:26:44.840 | that attitude, it's us in Australia.
00:26:47.880 | Now one of the really cool things about being at Kaggle
00:26:52.480 | was that I got to see, you know,
00:26:57.480 | I was the chief scientist there as well as the president.
00:27:00.640 | So I actually got to kind of validate
00:27:02.280 | and check out the winning solutions.
00:27:04.120 | And so I was always like really seeing
00:27:06.120 | what are the actual best ways to do things right now.
00:27:09.240 | And around 2012, I started noticing deep learning,
00:27:13.600 | starting to win things or at least do pretty well.
00:27:18.080 | And I had last used neural nets like 20 years earlier.
00:27:22.400 | They kind of put them aside as being like,
00:27:24.520 | probably gonna change the world one day, but not yet.
00:27:27.280 | And then 2012, it's like, oh, I think the day is coming.
00:27:34.200 | And that really became very clear during 2013.
00:27:39.200 | So one of my real concerns was,
00:27:44.360 | which I shared with my wife, Rachel,
00:27:47.680 | was that the people using these neural nets
00:27:51.080 | were like, they were like all the same person.
00:27:53.840 | They were from one of five universities
00:27:56.520 | that were all very exclusive.
00:27:58.840 | They were all white, they were all male,
00:28:01.320 | and they were all solving stupid problems,
00:28:06.000 | like trying to find their cats in their photos or whatever.
00:28:09.800 | I mean, look, okay, it's nice to find your cats
00:28:11.800 | in your photos and people make a lot of money from that.
00:28:14.280 | But where were the people trying to deal
00:28:16.240 | with global water shortages or access to education
00:28:21.120 | or dealing with huge economic inequity?
00:28:26.120 | It wasn't on the radar.
00:28:30.160 | And we knew that that was because
00:28:32.200 | you only get a kind of a diversity of problems solved
00:28:35.360 | if you have a diversity of people solving them.
00:28:37.880 | So we actually started getting pretty concerned about that.
00:28:42.880 | But at the same time, I also felt like
00:28:51.200 | maybe there's some low-hanging fruit.
00:28:53.280 | There's something I could do right now
00:28:55.760 | that would make a really big difference.
00:28:58.560 | So to give you a sense of this,
00:29:01.560 | I wonder if I've got any slides about this thing,
00:29:03.040 | let me have a little look.
00:29:04.240 | So to give you a sense of how I feel
00:29:10.160 | about deep learning now,
00:29:11.440 | and I felt the same way about it then,
00:29:13.520 | is it's a fundamental kind of like,
00:29:18.520 | it's a fundamental technology
00:29:23.120 | that I think is as important as electricity.
00:29:28.000 | And it's literally like electricity and steam engine
00:29:32.440 | kind of said, okay, you don't really need
00:29:34.880 | to generally put human or animal energy inputs in anymore
00:29:39.880 | once it was eventually really sorted.
00:29:41.800 | And kind of deep learning is on the way
00:29:44.360 | to doing the same thing for intellectual inputs.
00:29:46.520 | It's kind of this fast, extraordinary thing.
00:29:49.000 | And there are people who,
00:29:54.000 | there are people who kind of have this sense of like,
00:29:59.000 | oh, neural nets are some hypey, fatty thing.
00:30:06.840 | It's, I don't know, it's just another
00:30:10.960 | in a long line of AI and ML technologies.
00:30:15.200 | I just don't agree with that at all.
00:30:17.240 | Like if you just look at what it can do, right?
00:30:20.680 | So here's an example of DALI,
00:30:22.840 | which is an open AI algorithm.
00:30:24.400 | You type in an illustration of a baby daikon radish
00:30:27.640 | in a tutu walking a dog.
00:30:29.760 | And these are not cherry picked.
00:30:30.960 | These are the first things that it does.
00:30:33.920 | It's not finding these, it's drawing them from scratch
00:30:37.760 | 'cause nobody's asked for that before, right?
00:30:40.920 | You type in an armchair in the shape of an avocado,
00:30:45.400 | it draws these for you.
00:30:48.560 | Like this is not something an SVM does.
00:30:51.360 | This is not something a random forest does.
00:30:53.200 | This is not something a logistic regression does.
00:30:54.960 | This is, you know, to somebody
00:30:58.520 | who doesn't know what's going on,
00:30:59.600 | it just feels magical, you know?
00:31:02.720 | DeepMind created this thing called AlphaFold,
00:31:10.280 | which blew away decades of research in protein folding
00:31:15.280 | from a bunch of people who had basically
00:31:19.680 | never worked on protein folding before.
00:31:21.640 | I mean, the closest, you know,
00:31:25.080 | really close example of this from kind of what I've seen
00:31:29.920 | is early in the days of my medical startup Enlitic,
00:31:33.080 | we were bringing in everybody we could
00:31:36.760 | to tell us from the pathology world,
00:31:38.760 | from the radiology world and so forth
00:31:40.000 | to tell us about their research.
00:31:41.640 | And so we had this guy come in and tell us about his PhD
00:31:44.600 | in histopathology segmentation.
00:31:47.680 | And he spent 45 minutes telling us about his, you know,
00:31:51.480 | new approach involving a graph cut algorithm
00:31:53.880 | and watershed and blah, blah, blah.
00:31:55.440 | And he was getting like new state of the art results
00:31:58.960 | on this particular kind of histopathology segmentation.
00:32:02.080 | And we were like, oh, that sounds pretty cool.
00:32:04.600 | He was like, yeah, I used to think that too yesterday.
00:32:08.040 | But I saw you guys are doing some stuff
00:32:09.960 | with deep learning and I kind of got curious.
00:32:12.400 | So I thought I'd try this with deep learning yesterday
00:32:14.840 | and I ran a model overnight
00:32:17.000 | and it beat my last five years of work.
00:32:19.320 | So now I'm not so sure.
00:32:22.240 | And like, this is like a really common story.
00:32:26.120 | Like every time I try just about anything
00:32:28.080 | with deep learning, I'm like
00:32:30.920 | beating everything I've done before,
00:32:32.240 | beating other people,
00:32:33.120 | what other people have done before.
00:32:34.920 | And the interesting thing about this is
00:32:40.640 | if you haven't done any deep learning yourself,
00:32:44.600 | you might not realize that there really is
00:32:47.480 | kind of just one algorithm.
00:32:50.280 | Like there are very, very few changes
00:32:53.880 | that go between kind of one model and another.
00:32:57.480 | So for example, I looked at the source code
00:33:01.140 | for the AlphaGo Zero model,
00:33:02.960 | which was the thing which absolutely smashed
00:33:06.000 | all previous Go playing approaches.
00:33:08.960 | And the model was almost identical
00:33:11.640 | to the computer vision object recognition models
00:33:15.880 | that I used.
00:33:16.720 | It's basically a bunch of residual layers
00:33:19.920 | with convolutions and ReLUs and batch norms
00:33:22.320 | and stacked up.
00:33:23.520 | And it's just an extraordinarily powerful general approach.
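
To ground that claim, here is the recurring building block as a minimal PyTorch sketch: convolutions, batch norm, and ReLUs wrapped in a skip connection, then stacked into a tower. This is a generic illustration of the pattern, not DeepMind's actual AlphaGo Zero code; swap the input channels and the final "head" and much the same body serves vision or Go.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Conv -> BN -> ReLU -> Conv -> BN, with a residual (skip) connection."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # the skip connection

# Stack the same block over and over: the body of an image classifier and of
# a Go-playing network look almost identical apart from the final head.
tower = nn.Sequential(*[ResBlock(64) for _ in range(5)])
print(tower(torch.randn(1, 64, 19, 19)).shape)   # a 19x19 board-sized input
```
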
00:33:28.520 | And so it's really cool kind of as a researcher
00:33:32.720 | because you can read papers from proteomics
00:33:36.800 | or chemo informatics or natural language
00:33:39.560 | or game playing or whatever.
00:33:41.220 | And like 90% of it you get
00:33:44.960 | because it's just the same stuff
00:33:47.320 | rejigged in a slightly different way.
00:33:49.320 | So that was kind of how I felt
00:33:53.360 | and how I feel about deep learning.
00:33:58.360 | And actually,
00:34:00.520 | I realized that there really was some low hanging fruit
00:34:07.920 | at that time in deep learning
00:34:09.800 | and specifically in medicine.
00:34:11.300 | Literally no one was doing deep learning in medicine.
00:34:17.340 | And it turns out that there's such a shortage globally
00:34:22.440 | of medical specialists, of doctors
00:34:25.280 | that according to the World Economic Forum,
00:34:26.920 | it's gonna take 300 years to fill in the gap
00:34:30.720 | to basically allow the developing world
00:34:33.240 | to have access to the same medical expertise
00:34:35.280 | as the developed world.
00:34:36.600 | And I thought this is totally unacceptable.
00:34:40.200 | I wonder if we could help make doctors more productive
00:34:45.200 | by adding some deep learning stuff to what they're doing.
00:34:50.320 | Let's try and do some kind of proof of concept.
00:34:56.920 | And so we spent four weeks,
00:35:00.000 | me and three other people spent four weeks
00:35:01.640 | just training a model on some lung CT scans.
00:35:06.640 | And again, like literally none of us knew anything
00:35:09.080 | about radiology or whatever.
00:35:10.680 | And we discovered, much to our kind of shock,
00:35:13.240 | that this thing we trained had much lower false negatives
00:35:18.080 | and much lower false positives
00:35:19.600 | at recognizing malignant lung tumors
00:35:22.360 | than a panel of four top Stanford radiologists.
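
For a sense of how little code that kind of proof of concept takes today, here is a hedged sketch using the modern fast.ai API, which postdates the experiment described above. The folder layout and paths are hypothetical and this is not the actual Enlitic pipeline; it just shows the shape of the workflow, down to the confusion matrix where false negatives and false positives show up.

```python
from fastai.vision.all import *

# Hypothetical layout: data/lung_ct/malignant/*.png and data/lung_ct/benign/*.png
path = Path("data/lung_ct")
dls = ImageDataLoaders.from_folder(path, valid_pct=0.2, item_tfms=Resize(224))

learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(3)   # transfer-learn from pretrained ImageNet weights

# The numbers that mattered in the story: false negatives and false positives.
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix()
```
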
00:35:25.060 | So that turned into my next startup,
00:35:28.480 | which was called Enlitic.
00:35:31.420 | And yeah, again, for Enlitic, I went the VC route,
00:35:36.420 | raised over $10 million.
00:35:42.020 | So this time, this was actually started
00:35:45.940 | from the start in the US.
00:35:47.540 | And it was kind of a lot easier 'cause I knew people.
00:35:50.680 | And yeah, I mean,
00:35:58.700 | this was both great and disappointing.
00:36:01.620 | It was great in the sense that I really hoped
00:36:04.380 | that this startup would help put medical deep learning
00:36:07.320 | on the map and it absolutely did.
00:36:09.620 | It got a huge amount of publicity.
00:36:11.500 | And within a couple of years, particularly in radiology,
00:36:16.500 | deep learning was everywhere.
00:36:20.100 | On the other hand, it always felt like
00:36:23.140 | I'm just doing this one little thing,
00:36:26.820 | when there's so many great people around the world
00:36:30.980 | solving important problems and disaster resilience
00:36:34.060 | or access to food or whatever,
00:36:36.220 | and they don't have a way to tap into
00:36:41.100 | this incredibly powerful tool.
00:36:42.700 | And so between this and this kind of concern
00:36:47.140 | about inequality and the kind of exclusivity
00:36:51.280 | and the kind of homogenous group of people
00:36:54.960 | working on deep learning, Rachel and I actually decided
00:36:59.300 | to start something new, which was fast.ai.
00:37:02.340 | And so fast.ai is all about helping everybody
00:37:09.900 | do what Enlidik is doing,
00:37:16.580 | but not having a bunch of deep learning people do it,
00:37:20.760 | but to have disaster resilience built
00:37:23.100 | by disaster resilience people and have ecology stuff
00:37:26.340 | built by ecology people.
00:37:27.960 | Because it's much easier, this is our hypothesis,
00:37:31.140 | it'd be much easier for a domain expert in ecology
00:37:34.580 | to become an effective deep learning practitioner
00:37:36.700 | than from a deep learning practitioner
00:37:38.180 | to actually fully immerse themselves
00:37:40.300 | in the world of ecology to the point
00:37:41.580 | that they would know what problems to solve
00:37:43.020 | and where to get the data from
00:37:44.000 | and what the constraints are
00:37:45.020 | and how to operationalize things
00:37:46.980 | and understand the legal frameworks
00:37:48.500 | and make the connections in the networks, blah, blah, blah.
00:37:52.220 | So at the time we started fast.ai,
00:37:53.940 | this was quite at the extreme end
00:37:58.820 | of kind of ludicrous ideas
00:38:00.420 | because there was just this universal received wisdom,
00:38:02.820 | everybody said, that to do deep learning,
00:38:04.540 | you need a PhD, you probably need a postdoc.
00:38:07.800 | It's something that only a few people in the world
00:38:10.280 | could ever be smart enough to do.
00:38:12.980 | You'd need very, very deep math.
00:38:15.540 | And you need, increasingly you're gonna need
00:38:19.780 | more computers than anybody can afford.
00:38:21.780 | And it was really lots and lots of gatekeeping.
00:38:25.060 | And thankfully it turned out our hypothesis
00:38:27.140 | was actually correct.
00:38:28.680 | And in the intervening years,
00:38:30.680 | we've trained through our courses,
00:38:32.940 | hundreds of thousands of people.
00:38:34.480 | And every few days we get lovely, lovely emails
00:38:40.140 | from people telling us how they've just published a paper
00:38:43.740 | in a top journal, or they've got a new job,
00:38:46.220 | or they've bought deep learning to their startup.
00:38:49.700 | And increasingly they're using also the software
00:38:52.820 | that we're building, the fast.ai library
00:38:54.460 | to do this more quickly and better.
00:38:58.580 | And so that's been really great.
00:39:03.320 | And one of the important things here,
00:39:08.320 | which I guess is something I did learn from consulting
00:39:11.980 | is that the world's smartest people
00:39:14.780 | are not all at universities.
00:39:17.500 | What universities do have the people who
00:39:20.780 | stay in the same place their whole life.
00:39:27.780 | If you're an academic at a university,
00:39:29.340 | you've literally spent your whole life
00:39:30.780 | in educational institutions.
00:39:32.740 | And so these are not generally, not always,
00:39:35.700 | but they're not generally the most bold
00:39:38.980 | and grounded group of people, as you may have noticed.
00:39:42.220 | And in fact, in industry, there's a lot of brilliant people
00:39:45.860 | doing brilliant research.
00:39:47.760 | And so this has been one of the interesting things
00:39:49.380 | in fast.ai is a lot of the really powerful examples
00:39:52.400 | we hear about are actually coming from industry.
00:39:56.280 | Unfortunately, the problem with America is, well, you know.
00:40:03.740 | So we realized we couldn't stay there
00:40:11.780 | and we certainly couldn't bring up our child there,
00:40:15.220 | particularly after 2020 because, you know.
00:40:18.820 | So we tried really hard to get back
00:40:23.120 | and eventually the government here let us in.
00:40:25.440 | And coming back to Australia was just amazing
00:40:31.020 | because having lived here my whole life,
00:40:35.760 | I kind of had this vague sense that Australia
00:40:38.820 | had a really nice culture and kind of this,
00:40:41.300 | like something about going to America
00:40:42.900 | that was a bit off.
00:40:45.380 | But then coming back here, it just really hit me
00:40:49.540 | that like Australia is such a bloody good country.
00:40:53.980 | Like, and the people, like there's this kind of like,
00:40:57.800 | you know, sense of this kind of fair go
00:41:01.900 | and this kind of sense of helping people out
00:41:04.360 | and this kind of informality.
00:41:06.020 | And it's just after spending 10 years in America,
00:41:09.980 | it was just this huge breath of fresh air
00:41:12.220 | to be back here and that fresh air,
00:41:13.940 | you know how when you're really hot
00:41:15.380 | and there's a cool breeze and you've really,
00:41:17.180 | that feels great, it was like that.
00:41:19.500 | You know, it was like, it felt like I'd been stifling
00:41:22.580 | humidity for 10 years and I kind of came back to sanity.
00:41:25.380 | So that was amazing, but at the same time,
00:41:30.140 | I was also shocked by how little have changed here.
00:41:33.260 | Yes, a whole lot of accelerators and incubators
00:41:38.740 | and angel networks had sprung up,
00:41:40.940 | none of which existed when I was here.
00:41:43.780 | But when it actually came to the rubber hitting the road,
00:41:46.140 | I was trying to find people like doing like
00:41:50.140 | really world-class deep learning research
00:41:53.900 | or building startups which had, you know,
00:41:56.340 | huge global impact or venture capitalist investing
00:42:00.940 | in the biggest, boldest ideas.
00:42:03.220 | And I can't really find it, you know.
00:42:07.500 | And actually, Michael Evans was kind enough to
00:42:11.500 | let me share some stuff that he has been working on,
00:42:15.580 | kind of looking at this from a data point of view.
00:42:18.180 | And you can kind of see it in the data, right?
00:42:22.620 | From an investing point of view,
00:42:25.260 | seed and angel investment in Australia is like,
00:42:29.220 | per capita, is like an order of magnitude behind the US.
00:42:33.500 | And this is like, this is where things get going, right?
00:42:37.820 | If you've got 10 times less money per person
00:42:40.820 | going into like getting things going,
00:42:43.740 | that's gonna be really hard for entrepreneurs, right?
00:42:46.460 | Investment activity,
00:42:51.020 | Australia is not even on the chart.
00:42:56.220 | So our investment activity in AI
00:42:58.100 | is averaging around $20 million a year.
00:43:00.780 | And here's something that Michael told me that shocked me.
00:43:03.180 | Last year it decreased by 80%.
00:43:05.420 | Now you might think, oh, fair enough, COVID, guess what?
00:43:07.620 | The rest of the world, it grew by 20%.
00:43:09.900 | So in the rest of the world, investors went like,
00:43:11.900 | oh, this is creating new opportunities.
00:43:14.100 | In Australia, which is like not even hit that much by COVID,
00:43:18.100 | investors just went home.
00:43:19.940 | So this is kind of lack of risk-taking,
00:43:24.260 | that's a real concern.
00:43:26.100 | There's a lack of investment in research.
00:43:28.540 | So, you know, this is the OECD average,
00:43:33.020 | not only are we worse, but we're getting worse, right?
00:43:36.300 | And again, this is the fundamental stuff,
00:43:38.620 | seed investment, angels, research.
00:43:43.220 | So in general, tech, our share of the global value added,
00:43:50.020 | it's the amount of stuff,
00:43:51.820 | value that we're adding to the economy.
00:43:53.820 | This is the Australian tech share of that.
00:43:57.340 | It's plummeting and it's near the very bottom of the OECD.
00:44:01.460 | We're behind Chile, Turkey.
00:44:03.740 | So, and these are like data points
00:44:10.140 | that reflect something that I was already seeing.
00:44:12.980 | So like I kind of caught up with Michael and I was like,
00:44:14.620 | this is something I'm seeing.
00:44:15.700 | Am I mad?
00:44:16.540 | And it's like, no, you're not mad.
00:44:18.260 | I've got the data to show you what you're seeing.
00:44:21.380 | This is actually the one that kind of resonated
00:44:25.220 | the most with me.
00:44:26.220 | In terms of talking with enterprises,
00:44:29.180 | this is a Deloitte study,
00:44:30.860 | talking with big enterprises.
00:44:32.500 | They asked, okay, why are you interested in AI?
00:44:35.860 | Half of Aussie enterprises said,
00:44:38.060 | oh, we want to catch up or keep up.
00:44:41.060 | 22% said, 'cause we want to get ahead.
00:44:45.980 | And this is worse
00:44:47.740 | than every other country that they spoke to.
00:44:51.380 | Aussie customers are so conservative.
00:44:54.580 | I really noticed this,
00:44:57.420 | like if you want to sell to enterprises in Australia,
00:44:59.580 | you have to tell them that their competitors
00:45:01.100 | already bought it.
00:45:01.940 | If you want to say you could use this
00:45:05.100 | to power ahead of your field
00:45:06.980 | and become a global success story, they don't care.
00:45:10.420 | I don't exactly know why this is,
00:45:12.300 | but it's true in the data
00:45:14.220 | and it's kind of absolutely true from all of my experience.
00:45:18.020 | Having said that, in the OECD,
00:45:21.940 | Australia ranks right at the top
00:45:24.940 | in terms of like our use of tech, right?
00:45:28.220 | And this is what I was saying earlier,
00:45:29.660 | like Aussies are awesome.
00:45:31.540 | You know, we're smart, we're technical, you know?
00:45:36.340 | And yet we're nearly at the bottom
00:45:40.380 | in terms of our investment in tech.
00:45:44.660 | So it's kind of this weird thing.
00:45:45.860 | And this is actually why I think Australia
00:45:48.220 | is a great place to build a startup.
00:45:52.500 | The reason I think this is because
00:45:58.300 | if you can get past all this stuff pulling you down,
00:46:02.820 | all this like, why bother?
00:46:05.060 | You'll just get beaten.
00:46:07.220 | Can you take less money than you want?
00:46:10.540 | Blah, blah, blah.
00:46:12.820 | You're in a place where you're surrounded
00:46:14.980 | by brilliant people.
00:46:17.100 | They don't have other cool tech startups
00:46:19.340 | to go to on the whole.
00:46:20.180 | I mean, it's not that there's none, right?
00:46:21.500 | But there's relatively very few, you know?
00:46:24.020 | And so when, one of the things that was fascinating
00:46:27.300 | in San Francisco was that people would say like,
00:46:32.300 | oh, we've got such an edge
00:46:34.780 | because our R&D hub is in Melbourne.
00:46:38.100 | And so we're paying, you know, I think it was like
00:46:40.620 | on average one quarter to one fifth of the salaries
00:46:43.340 | of being paying in San Francisco
00:46:45.100 | and they could actually get people
00:46:46.420 | like straight out of university.
00:46:48.220 | And at Enlitic, to get people straight out of undergrad,
00:46:50.540 | I had to pay them at least 200 grand US, right?
00:46:55.260 | Which by the way, if you're a student
00:46:58.220 | not working on deep learning, right?
00:47:01.060 | This is the technology where like people who understand it
00:47:05.580 | and can wield it well can get paid 200 grand
00:47:08.980 | straight out of undergrad, you know?
00:47:10.380 | So it's not a bad thing to have in your toolbox
00:47:12.780 | even from a job market point of view.
00:47:15.420 | So it's actually, sadly, it's kind of like this hidden gem.
00:47:20.340 | It's like this diamond in the rough.
00:47:22.140 | And so I've often noticed when kind of VCs come and visit
00:47:26.900 | or top researchers come and visit,
00:47:29.420 | they're often really surprised
00:47:31.740 | at how many brilliant people are here.
00:47:33.900 | Because let me tell you, in San Francisco,
00:47:37.180 | even though I'm Australian, I'm looking out for it,
00:47:38.940 | you don't hear about that, you know?
00:47:42.620 | It's like, you know, even looking at like academic papers,
00:47:48.940 | I'd always be like looking out
00:47:50.420 | for really influential academic papers
00:47:53.100 | that helped me with my work in deep learning.
00:47:55.700 | Do they have any Aussie authors?
00:47:57.900 | And invariably, if the answer was yes,
00:48:00.700 | it was because they'd moved to the Bay Area, you know?
00:48:03.820 | And I think that's, yeah, I think that's such a waste.
00:48:08.820 | You know, we have all these brilliant people.
00:48:12.700 | We have this kind of fantastic system.
00:48:16.020 | We've got, you know, technically competent people,
00:48:20.700 | you know, in the workplace.
00:48:22.260 | I think there are big opportunities here,
00:48:25.020 | but I'd say for building a tech startup,
00:48:29.100 | and obviously for me, I particularly think
00:48:30.620 | building an AI startup, you know,
00:48:32.220 | where deep learning is some key component, you know,
00:48:35.180 | not using it would be like being at the start of the steam age
00:48:39.500 | and trying to create a new kind of loom
00:48:41.380 | that doesn't use steam, you know?
00:48:43.540 | It doesn't make any sense to me.
00:48:45.020 | Anyway, so if you create startups here,
00:48:47.860 | it's like, do it
00:48:49.100 | in as un-Australian a way as possible, right?
00:48:54.900 | It's like, you don't have to have Australian investors.
00:48:58.540 | You don't have to have Australian customers.
00:49:00.060 | Like just believe that you can put something up
00:49:02.060 | on the internet that people are gonna buy, you know?
00:49:05.940 | And, you know, don't worry about whether it's mining
00:49:10.180 | or whether it's agricultural,
00:49:11.620 | whether it's something your PhD advisor,
00:49:13.500 | who's never trained a deep learning model,
00:49:15.300 | thinks is interesting or whatever, you know?
00:49:18.020 | To me, that's kind of the secret to how,
00:49:25.020 | you know, we can have some great startups here.
00:49:30.500 | And I will say, as that happens, things will change, right?
00:49:35.100 | And things are already starting to change.
00:49:36.620 | So like something really interesting
00:49:39.220 | is what's happening in Adelaide, right?
00:49:40.780 | So Adelaide has this fantastic AI and machine learning center
00:49:45.780 | and they're doing something which is almost unheard of
00:49:49.180 | in universities, which is that they're forging
00:49:51.860 | really great partnerships with the tech community
00:49:56.860 | to the point where Amazon is now there too, right?
00:50:00.740 | And so Amazon has gone and said, okay,
00:50:02.460 | we're gonna partner with the University of Adelaide.
00:50:06.060 | And so there's now kind of the two centers next door,
00:50:09.820 | very closely related.
00:50:10.980 | And of course, what's now happening,
00:50:12.340 | I can't tell you the details, but I happen to know
00:50:14.780 | lots more big tech companies are now planning
00:50:18.740 | to head to Adelaide as well.
00:50:20.060 | And so you can imagine what's gonna happen, right?
00:50:22.060 | Now, lots of people are gonna like go to those
00:50:24.860 | and then they'll leave and they'll create startups
00:50:26.700 | and then other startups who wanna go there
00:50:28.100 | and then other big companies who wanna go there.
00:50:30.180 | And so, and then of course,
00:50:32.060 | what's gonna happen in all the other capitals,
00:50:33.740 | they'll be like, oh my God,
00:50:34.580 | look what's happening in Adelaide,
00:50:35.700 | we have to do that as well.
00:50:38.100 | And this is very, very different
00:50:40.020 | to how things are currently done.
00:50:41.500 | 'Cause universities here are in many ways
00:50:46.420 | incredibly anti-entrepreneur, anti-tech entrepreneur.
00:50:51.420 | So for example, you know,
00:50:55.260 | a lot of brilliant work gets done out of UQ and QUT.
00:50:58.180 | They're sponsoring this AI hub, that's fantastic.
00:51:00.700 | But if an academic there wants to start a startup,
00:51:06.880 | they have to give QUT 70% to start.
00:51:11.880 | And let me tell you, that's literally impossible.
00:51:15.680 | So there's zero successes,
00:51:17.660 | 'cause no one will invest in a company like that,
00:51:20.580 | and the founder can't even be invested in that company.
00:51:23.260 | Like, and it's not just Queensland,
00:51:24.980 | this is basically every university in Australia.
00:51:29.700 | Adelaide made a huge step of going from 70% to 49%.
00:51:35.640 | Compare this to like Stanford or Berkeley,
00:51:39.680 | where like every academic I know there in engineering
00:51:43.080 | or computer science has four or five startups
00:51:45.080 | that they have a 5% equity stake in.
00:51:47.840 | You know, half of their students go to those startups.
00:51:50.880 | Then those students find interesting research directions
00:51:55.520 | from the work that they're doing,
00:51:57.200 | which they then go back
00:51:58.280 | and then they fund a new group of people at the university.
00:52:01.720 | I mean, if you look at the relationship, for example,
00:52:03.520 | between Stanford and Google,
00:52:05.800 | it's like constant back and forth research,
00:52:09.480 | huge amounts of funding from Google to Stanford,
00:52:12.000 | lots of job opportunities for Stanford people at Google.
00:52:14.840 | The idea that the way you leverage your academic talent
00:52:19.840 | is by forcing them to give you 70% of their company
00:52:23.800 | is absolute insanity and it's totally not working.
00:52:28.120 | And I personally know of many academics in Australia
00:52:31.280 | who have decided not to start startups
00:52:34.000 | because of this reason.
00:52:34.840 | And also because most universities will tell you,
00:52:38.400 | you're not allowed to keep working here
00:52:40.720 | if you're working at a startup,
00:52:42.420 | which of course it should be the opposite.
00:52:44.000 | It should be like, oh, wow,
00:52:45.160 | you're getting industry experience,
00:52:46.600 | you're learning about actual applied problems,
00:52:49.480 | we'll pay you a bonus, you know?
00:52:52.320 | So there's a lot of kind of issues
00:52:55.000 | with how the kind of tech sector is working here
00:53:00.000 | and how entrepreneurialism is working here,
00:53:01.800 | but the most important thing is the kind of raw
00:53:05.040 | foundation that we have,
00:53:07.080 | which I think is one of the best in the world.
00:53:09.600 | And so that's one of the reasons that we came here
00:53:14.600 | is because we wanna help any way we can
00:53:20.080 | change Australia from a diamond in the rough
00:53:24.520 | to a glowing diamond that everybody around the world knows.
00:53:30.360 | So that's what we wanna do, thank you.
00:53:32.280 | (audience clapping)
00:53:35.280 | - That's awesome to get an insight into your experiences
00:53:41.200 | over the last, well, since you started your first startup.
00:53:44.840 | From the beginning when you first started
00:53:49.480 | to when you went to the US and now when you had your first
00:53:54.320 | couple of months back in Australia,
00:53:56.800 | what's harder, getting an idea,
00:54:00.440 | getting money or getting good data to make it all happen?
00:54:05.480 | - I think if getting good data is the thing you find hard,
00:54:10.960 | then you're doing the wrong thing, right?
00:54:13.240 | So the thing you're doing should be something
00:54:16.080 | which you're deeply in that field, right?
00:54:19.120 | So like if you're somebody in the legal industry,
00:54:24.440 | you should be doing a legal startup.
00:54:26.800 | If you're in the HR industry, do an HR startup.
00:54:29.200 | If you're in the medical field, do a medical startup
00:54:31.080 | because then getting data is easy
00:54:33.440 | because you're surrounded by it.
00:54:35.360 | Or your friends work in companies with it,
00:54:37.240 | you personally worked in companies with it.
00:54:39.120 | So I'd say like start working on a problem
00:54:42.360 | that you're deep into.
00:54:45.280 | And then coming up with an idea,
00:54:50.320 | that shouldn't really be hard
00:54:53.400 | because everything's broken.
00:54:56.840 | If you noticed, nothing quite works properly.
00:54:59.560 | Everything's finicky and frustrating and has stupid bits.
00:55:04.080 | So just particularly stuff at your workplace,
00:55:09.080 | do you know all the stuff that takes longer than it should
00:55:12.520 | or problems that have never been solved properly?
00:55:16.760 | So really the key thing is execution
00:55:23.360 | and tenacity.
00:55:24.960 | Like one thing I really noticed with FastMail
00:55:26.960 | was when we started FastMail,
00:55:29.320 | it was actually pretty hard to start an email company
00:55:31.400 | 'cause there was very little open source software around
00:55:35.680 | and very few examples of how to build this kind of thing.
00:55:38.800 | But very quickly there was kind of like all kinds
00:55:41.600 | of open source software appeared.
00:55:42.880 | It became pretty easy and we got new competitors monthly
00:55:46.840 | and they'd stick around for like six months
00:55:51.040 | and then they'd disappear because they'd give up
00:55:53.320 | 'cause it was hard.
00:55:54.920 | And I will say like in most startups
00:55:58.560 | I've been involved in every month,
00:56:01.040 | it feels like there's a problem so dire
00:56:03.640 | that we're definitely gonna die.
00:56:05.200 | But you kind of have to keep going anyway.
00:56:08.600 | So I think it's your execution and tenacity.
00:56:11.600 | - Thank you Jeremy.
00:56:15.480 | The DALL-E model is very impressive.
00:56:18.360 | When I was young it was obvious what a computer model
00:56:21.200 | didn't understand, it couldn't recognize a car for example.
00:56:24.600 | When you look at that model,
00:56:25.960 | it's not clear to me what it does
00:56:27.560 | and doesn't understand anymore.
00:56:29.040 | I wondered if you had a comment about that.
00:56:31.360 | - Only to say I actually don't care
00:56:36.600 | about understanding or not.
00:56:39.320 | Like I'm kind of philosophically interested
00:56:41.000 | and I am a philosophy major,
00:56:42.840 | but as a deep learning practitioner,
00:56:44.720 | all I care about is like what it can do.
00:56:47.800 | So yeah, I mean, it's a fascinating question.
00:56:49.360 | I don't think there's any way to ever answer that.
00:56:52.200 | I actually don't know what you understand.
00:56:54.360 | You could tell me,
00:56:55.480 | but I don't know if you're telling the truth.
00:56:57.280 | You know, it's just a fundamentally impossible question
00:57:00.520 | to answer I think.
00:57:01.360 | But it's not one we need to answer.
00:57:03.200 | We just need to know what can it do, what can't it do.
00:57:07.120 | Any new courses planned for 2021?
00:57:17.120 | Under some vague definition of planned, yes.
00:57:22.120 | We need to do a part two
00:57:24.240 | of our deep learning for coders course.
00:57:26.880 | So that's planned in the sense of like,
00:57:29.760 | yeah, I should write that sometime.
00:57:32.160 | And the other course,
00:57:33.680 | which I'm really excited about is I'm planning to do a course
00:57:36.520 | which is a kind of full stack startup creation course
00:57:40.760 | involving everything from like creating a Linux server
00:57:44.640 | and a system administration of Linux
00:57:46.640 | through to how the domain name system works
00:57:48.520 | through to investment,
00:57:50.080 | through to getting product market fit,
00:57:52.200 | through to collecting data and so forth.
00:57:55.480 | There is a course a bit like that,
00:57:57.960 | that Balaji Srinivasan did on Coursera
00:58:00.400 | called Startup Engineering,
00:58:01.920 | but it's not quite available anymore
00:58:04.640 | 'cause of Coursera and it's also getting a bit dated
00:58:07.120 | and it doesn't really have such an AI thing.
00:58:09.840 | So that's, I don't know if that'll be 2021.
00:58:12.080 | It might be 2022, but there's a couple of courses
00:58:14.480 | I'm looking at.
00:58:16.880 | Okay, so that's that one already.
00:58:21.880 | Are you getting some track days?
00:58:27.040 | Since I had a five year old,
00:58:28.320 | I'm suddenly less interested in motorcycling,
00:58:30.440 | I'm sad to say.
00:58:31.480 | So yes, those courses I described
00:58:34.120 | will probably be in person
00:58:36.000 | at whatever university feels like having us.
00:58:43.880 | So that's what, so yeah, what's next?
00:58:45.600 | I'm gonna keep doing what I'm doing.
00:58:47.400 | But what I wanna do is,
00:58:49.080 | I wanna do fast AI with awesome Australians.
00:58:55.440 | And from a purely selfish point of view,
00:58:57.360 | I'd like this to be like a real global hub of brilliance
00:59:02.360 | because I want people around me to be awesome.
00:59:05.120 | So I would love it if people were flying here
00:59:12.080 | in order to be part of this amazing community.
00:59:14.520 | And I actually think that's totally, totally doable.
00:59:18.320 | Particularly 'cause it's so beautiful here.
00:59:19.760 | Like I think we've got a lot of benefits,
00:59:21.800 | particularly in Queensland.
00:59:23.440 | Like who wouldn't wanna come to Queensland?
00:59:26.200 | Yeah, thank you for getting graph data.
00:59:30.280 | Sure, that was a great question.
00:59:32.600 | What's your recommended way of marketing?
00:59:34.080 | Okay, so how to market an early stage company?
00:59:38.560 | The first thing is make it very, very easy
00:59:42.560 | to use your product and to buy it, right?
00:59:46.240 | So I don't wanna see, like, okay,
00:59:48.920 | so there's gotta be a pricing section, right?
00:59:51.160 | I don't wanna see a section that says like,
00:59:53.480 | email us for sales inquiries.
00:59:55.560 | That's insane.
00:59:56.400 | Like, no, I'm not gonna, who does that, right?
01:00:00.240 | If it says it's $5 a month, it's like, fine,
01:00:02.800 | here's the credit card, right?
01:00:04.200 | I need to be able to use the damn thing.
01:00:07.840 | So like have an open source version
01:00:09.600 | or at least a limited demo or something.
01:00:13.240 | Have screenshots.
01:00:14.680 | Like I wanna be able to go to your site
01:00:17.320 | and immediately know what are you selling?
01:00:20.200 | Is it any good?
01:00:21.800 | What does it look like?
01:00:23.480 | Can I give it a go?
01:00:24.920 | And then pay you for it.
01:00:26.960 | So that's kind of like the first is to avoid anti-marketing,
01:00:30.280 | where you make life difficult for your customers.
01:00:32.760 | And then the best kind of marketing is the media, right?
01:00:37.080 | So like you will get far, far, far more awareness
01:00:40.840 | of what you're doing if you can get something written
01:00:43.960 | about it in Wired or the Washington Post or BBC
01:00:48.240 | than any amount of advertising.
01:00:51.200 | And that is all about personal outreach from you, the CEO,
01:00:57.080 | to journalists who you have carefully researched
01:01:02.760 | and confirmed would definitely be interested
01:01:04.600 | in what you're doing and then telling them about it.
01:01:08.360 | And that actually doesn't happen very often.
01:01:10.680 | Most people go through like PR firms
01:01:13.320 | who journalists can't stand dealing with.
01:01:15.920 | And so like I've basically never paid
01:01:21.560 | for any advertising of any sort.
01:01:24.440 | But if you do a Google news search,
01:01:28.240 | you'll see that we've got a shitload of media, right?
01:01:31.280 | And last year in particular,
01:01:33.320 | I wanted to like go take that to another level
01:01:35.920 | because I co-founded Masks for All globally.
01:01:40.160 | And so I literally wanted every single person in the world
01:01:43.200 | to know they should wear a mask.
01:01:44.920 | And so this is like my media campaign.
01:01:46.720 | So I just wrote to everybody, I talked to everybody
01:01:51.160 | and ended up on everything from Laura Ingraham on Fox News,
01:01:54.880 | through to BBC News and wrote in the Washington Post
01:01:57.760 | and USA Today.
01:01:59.240 | And nowadays, thank God people actually wear masks.
01:02:04.120 | So yeah, media is your magic marketing tool.
01:02:08.600 | Last one?
01:02:11.560 | Okay, last one.
01:02:12.800 | - Thanks so much Jeremy and Rachel
01:02:17.280 | and your team for the Fast AI course.
01:02:19.200 | It's amazing. - Thanks.
01:02:20.960 | - And accessible.
01:02:22.520 | In the era of global warming,
01:02:25.760 | how concerned should we be with the energy usage
01:02:29.000 | of deep learning models?
01:02:31.120 | And yeah, your thoughts or ideas
01:02:32.640 | on how we can master this challenge.
01:02:34.640 | - So it's a great question.
01:02:37.800 | I would, the way I think of it,
01:02:40.280 | and I'm not an expert on this,
01:02:42.560 | but the way I think of it is
01:02:44.000 | from a general resource constraint point of view,
01:02:50.120 | we should not be using more resources
01:02:52.880 | than necessary to solve the problem,
01:02:56.360 | including energy.
01:02:58.320 | Unfortunately, a lot of companies like Google,
01:03:05.280 | to pick one out at random,
01:03:07.240 | have huge research departments
01:03:09.040 | that are very explicitly incented to create research
01:03:12.400 | that shows the results of using huge amounts of energy.
01:03:15.920 | Specifically huge amounts of Google Compute Hours.
01:03:19.680 | And this is very, very effective marketing
01:03:21.720 | because if you can, like journalists love writing
01:03:24.720 | about big engineering solutions.
01:03:27.720 | And they'll always say like,
01:03:29.160 | this used 10,000 TPU hours or whatever.
01:03:33.080 | Now, so the thing is, and this is what we focus on:
01:03:37.040 | the vast majority of problems that we see solved in practice,
01:03:42.880 | useful pragmatic solutions, are solved on a single GPU
01:03:47.880 | in a few hours and you can buy a GPU for a few hundred bucks.
01:03:54.000 | And there's all kinds of resources like this,
01:03:56.840 | such as the resource of just like the amount of education
01:03:59.560 | that you need or the resource of the amount of data
01:04:01.400 | that you need or whatever, but like overall,
01:04:03.640 | people dramatically overestimate the amount of resources
01:04:07.960 | you need to get good results out of deep learning.
01:04:11.880 | This is very explicitly
01:04:13.600 | because that's what a lot of people want you to believe.
01:04:16.720 | They want you to believe
01:04:17.560 | that you have to hire their consulting firm,
01:04:19.840 | that you have to use their compute hours,
01:04:22.000 | that you have to use their special software,
01:04:24.640 | that you have to buy lots of their cards or whatever.
01:04:29.320 | But yeah, overall, there's a massive overemphasis on,
01:04:35.840 | you know, using vast amounts of stuff in deep learning.
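To make that concrete, here is a minimal sketch, not code from the talk, of the kind of pragmatic single-GPU transfer learning being described, using the fastai library; the dataset path and folder layout are hypothetical stand-ins.

```python
# A minimal sketch (not from the talk): fine-tuning an ImageNet-pretrained
# model on a single GPU with fastai. `data/my_images` is a hypothetical
# dataset with one subfolder per class label.
from fastai.vision.all import *

path = Path('data/my_images')                     # hypothetical dataset location
dls = ImageDataLoaders.from_folder(
    path, valid_pct=0.2, seed=42,                 # hold out 20% for validation
    item_tfms=Resize(224),                        # resize every image to 224px
    batch_tfms=aug_transforms(), bs=64)           # standard data augmentation

# Start from a pretrained ResNet and adapt it to the new task.
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(3)                                # a few epochs is often enough
```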
01:04:40.800 | Sure, I have to mention DAWNBench.
01:04:45.200 | So in fact, I have a slide about DAWNBench,
01:04:47.920 | if I remember correctly, 'cause I kind of skipped over it.
01:04:50.240 | Yeah, so this is something that Rachel and I
01:04:52.600 | are passionate about and we went crazy when TPUs came out
01:04:57.600 | because Google was like,
01:05:01.400 | oh, these are these magic special things.
01:05:03.680 | And the media was like, okay, everybody else is screwed now
01:05:06.600 | 'cause they don't have TPUs.
01:05:08.200 | So only Google can now do deep learning.
01:05:12.800 | And so there was a competition at that time
01:05:16.960 | that had come out just shortly after TPUs
01:05:19.080 | got marketed to hell, called DAWNBench,
01:05:22.160 | which was basically who can train ImageNet the fastest.
01:05:25.200 | And at this time, the fastest people were solving it
01:05:28.160 | in about 12 hours.
01:05:30.400 | When I say solve it, that means getting it to an accuracy,
01:05:33.080 | like a top-five accuracy of something percent.
01:05:36.360 | And yeah, not surprisingly, Google put in their pitch
01:05:41.360 | and I think they got like three hours or something.
01:05:47.760 | on a huge TPU pod or whatever.
01:05:51.440 | Intel competed and they of course put in an entry
01:05:54.640 | with 1024 Intel servers operating in parallel.
01:05:58.600 | And we thought, okay, if these guys win, we're so screwed
01:06:03.200 | because it's gonna be like, okay, to be good at this,
01:06:05.600 | you really do need to be Google or Intel.
01:06:07.640 | So some of our students and me spent basically a week
01:06:12.600 | seeing if we could do better, and we won.
01:06:15.960 | And we did it in 18 minutes.
01:06:18.520 | And it was just by using like common sense,
01:06:23.320 | and just like, yeah, just keeping things simple.
01:06:28.440 | And so, you know,
01:06:31.240 | we've done similar things a few times,
01:06:33.160 | because these big tech behemoths are always trying to convince you
01:06:36.200 | that you're not smart enough, that your software
01:06:38.520 | is not good enough, that your computers are not big enough,
01:06:41.240 | but it's always been bullshit so far and it always will be.
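As a rough illustration of the sort of "common sense" simplifications involved: the fast.ai DAWNBench write-ups described simple ideas such as training at a reduced image size first and only finishing at full size (progressive resizing), plus half-precision arithmetic. Below is a sketch of that pattern in fastai, with illustrative paths, sizes, and epoch counts rather than the actual competition configuration.

```python
# A sketch of two simple speed-ups in the spirit of the fast.ai DAWNBench entry:
# mixed precision and progressive resizing. All paths, sizes, and epoch counts
# here are illustrative assumptions, not the competition code.
from fastai.vision.all import *

path = Path('data/imagenette')   # hypothetical stand-in for ImageNet-scale data

def make_dls(size, bs):
    # Rebuild the DataLoaders at the requested image resolution
    return ImageDataLoaders.from_folder(
        path, valid='val', item_tfms=Resize(size),
        batch_tfms=aug_transforms(), bs=bs)

# Phase 1: train quickly at low resolution, in half precision.
learn = vision_learner(make_dls(128, 128), resnet50, metrics=accuracy).to_fp16()
learn.fine_tune(5, 3e-3)

# Phase 2: progressive resizing -- swap in full-size images, train a bit more.
learn.dls = make_dls(224, 64)
learn.fine_tune(3, 1e-3)
```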
01:06:44.680 | - Thank you, Jeremy.
01:06:49.680 | I think we'll call it there.
01:06:51.480 | If anyone else has any further questions,
01:06:53.120 | feel free to try and have a chat to Jeremy,
01:06:56.000 | depending on when he chooses to leave.
01:06:57.880 | I think from everyone here at the Meetup,
01:06:59.880 | we just wanna say thank you for sharing your time.
01:07:02.360 | Rachel as well, we'll hopefully have you down here
01:07:04.160 | in the next few months.
01:07:05.400 | And really looking forward to having you involved
01:07:07.560 | in the local community.
01:07:09.440 | For everyone who is keen to be involved