back to index

Singapore: the AI Engineer Nation — with Minister Josephine Teo


Chapters

0:0 Introductions
0:34 Singapore's National AI Strategy
2:50 Ministry of Digital Development and Information
8:49 Defining a National AI Strategy
14:32 AI Safety and Governance
16:50 AI Adoption in Companies and Government
19:53 Balancing AI Innovation and Safety
22:56 Structuring Government for Rapid Technological Change
27:8 Doing Business with Singapore
32:21 Training and Workforce Development in AI
37:5 Career Transition Help for Post-AI Jobs
40:19 AI Literacy and Coding as a Language
43:28 Sovereign AI and Digital Infrastructure
50:48 Government and AI Workloads
51:2 Favorite AI Use Case in Government
53:52 AI and Elections

Whisper Transcript | Transcript Only Page

00:00:00.000 | (upbeat music)
00:00:02.580 | - Hey, everyone.
00:00:05.600 | Welcome to the Latent Space Podcast.
00:00:07.240 | This is Alessio, partner and CTO
00:00:08.920 | in Residence at Decibel Partners,
00:00:10.360 | and I'm joined by my co-host, Swiss,
00:00:12.080 | founder of Small AI.
00:00:13.280 | - Hey, everyone.
00:00:14.720 | This is a very, very special episode.
00:00:16.040 | We have here Mr. Josephine Teo from Singapore.
00:00:18.800 | Welcome.
00:00:19.640 | - Hi, Sean, and hi, Alessio.
00:00:21.120 | Thank you for having me.
00:00:22.320 | - Of course.
00:00:23.160 | - You are the Minister for Digital Development
00:00:25.580 | and Information and Second Minister for Home Affairs.
00:00:27.720 | And we are meeting here at Raise,
00:00:29.000 | which, effectively, your agency.
00:00:30.840 | Maybe we want to explain a little bit
00:00:32.240 | about what Singapore is doing in AI.
00:00:34.680 | - Well, we've had an AI strategy
00:00:36.960 | at the national level for some years now.
00:00:39.480 | And about two years ago,
00:00:42.480 | when generative AI became so prominent,
00:00:47.480 | we thought it was about time for us
00:00:49.360 | to refresh our national AI strategy.
00:00:52.680 | And it's not unusual on such occasions
00:00:55.680 | for us to consult widely.
00:00:57.480 | We want to talk to people who are familiar with the field.
00:01:01.040 | We want to talk to people who are active as practitioners.
00:01:05.840 | And we also want to talk to people in Singapore
00:01:08.880 | who have an interest in seeing the AI ecosystem develop.
00:01:13.560 | So when we put all these together,
00:01:15.920 | we discovered something else by chance,
00:01:17.880 | and it was really a bonus.
00:01:19.880 | This was the fact that there were already Singaporeans
00:01:22.840 | that were active in the AI space,
00:01:25.320 | particularly in the US, particularly in the Bay Area.
00:01:29.400 | And one of the exciting things for us
00:01:32.720 | was how could we also consult these Singaporeans,
00:01:36.520 | who clearly still have a passion for Singapore.
00:01:39.560 | They do care about what happens back home,
00:01:42.920 | and they want to contribute to it.
00:01:44.600 | So that's how Raise came about.
00:01:46.560 | And Raise actually preceded the publication
00:01:49.800 | of the refresh of our national AI strategy,
00:01:54.000 | which took place in December last year.
00:01:56.440 | So the inputs of the participants from Raise
00:01:59.360 | helped us to sharpen what we thought would be important
00:02:04.320 | in building up the AI ecosystem.
00:02:06.720 | And also with the encouragement of participants at Raise,
00:02:11.720 | primarily Singaporeans who were doing great work in the US,
00:02:15.840 | we decided to raise our ambitions.
00:02:18.880 | - Yeah, that's a good name.
00:02:19.960 | - That's why we say AI for the public good,
00:02:22.400 | recognizing the fact that commercial interest
00:02:24.640 | will certainly drive exciting developments
00:02:28.120 | in the industry space.
00:02:30.400 | But keep in mind, there is a need to make sure
00:02:33.640 | that AI serves the public good.
00:02:35.440 | And we say for Singapore and the world.
00:02:37.400 | So the idea is that experiments
00:02:39.480 | that are carried out in Singapore,
00:02:41.320 | things that are scaled up in Singapore
00:02:43.200 | potentially could have contributions elsewhere in the world.
00:02:47.320 | And so AI for the public good for Singapore and the world,
00:02:49.800 | that's how it came about.
00:02:51.080 | - I was listening to some of your previous interviews
00:02:53.360 | and even the choice of the name development
00:02:56.080 | in the ministry name was very specific.
00:02:58.360 | You mentioned naming is your ethos.
00:03:00.560 | Can you explain maybe a bit about what the ministry does,
00:03:03.120 | which is not simply funding R&D,
00:03:05.280 | but it's also thinking about
00:03:06.560 | how to apply the technologies in industry
00:03:08.360 | and just maybe give people an overview
00:03:10.320 | since there's not really an equivalent in the US.
00:03:13.360 | - Yeah, so when people talk about our smart nation efforts,
00:03:17.880 | it was helpful in articulating a few key pillars.
00:03:22.880 | We talked about one pillar being a vibrant digital economy.
00:03:27.800 | We also talk about a stable digital society
00:03:31.200 | because digital technologies,
00:03:32.840 | the way in which they are used
00:03:34.680 | can sometimes cause divisions in society
00:03:37.680 | or entrench polarization.
00:03:40.360 | They can also have the potential of causing social upheaval.
00:03:44.040 | So when we talked about stable digital society,
00:03:46.800 | that was what we had in mind.
00:03:47.840 | How do you preserve cohesion?
00:03:49.480 | Then we said that in this domain,
00:03:51.880 | government has to be progressive too.
00:03:54.280 | You can't expect the rest of Singapore to digitalize
00:03:57.880 | and yet the government is falling behind.
00:04:00.280 | So a progressive digital government
00:04:02.400 | is another very important pillar.
00:04:04.440 | And underpinning all of this
00:04:06.360 | has to be comprehensive digital security.
00:04:08.560 | There is, of course, cybersecurity,
00:04:10.600 | but there is also how individuals feel safe
00:04:13.680 | in the digital domain,
00:04:15.400 | whether as users on social media
00:04:18.040 | or if they're using devices
00:04:20.520 | and they're using services that are delivered digitally.
00:04:24.440 | So when we talk about these four pillars of a smart nation,
00:04:29.160 | people get it.
00:04:31.080 | And when we then asked ourselves,
00:04:34.840 | what is the appropriate way to think of the ministry?
00:04:39.000 | We used to be known as
00:04:39.920 | the Ministry of Communications and Information.
00:04:43.000 | And we had been doing all this digital stuff
00:04:48.000 | without actually putting it into our name.
00:04:51.320 | So when we eventually decided to rename the ministry,
00:04:55.160 | there were a couple of options to choose from.
00:04:58.000 | We could have gone for digital technologies.
00:05:00.240 | We could have gone for digital advancement.
00:05:02.200 | We could have gone for digital innovation.
00:05:04.600 | But ultimately we decided on digital development
00:05:07.080 | because it wasn't the technologies,
00:05:10.040 | the advancements or the innovation that we cared about.
00:05:13.000 | They are important,
00:05:14.320 | but we're really more interested in their impact to society,
00:05:18.160 | impact to communities.
00:05:19.600 | So how do we shape those developments?
00:05:21.840 | How do we achieve a digital experience that is trustworthy?
00:05:26.840 | How do we make sure that everyone,
00:05:30.440 | not just individuals who are savvy from the get-go
00:05:35.440 | in digital engagements,
00:05:37.840 | how does everyone in society,
00:05:39.640 | regardless of age, regardless of background,
00:05:42.400 | also feel that they have a sense of progression,
00:05:45.720 | that embracing technology brings benefits to them?
00:05:49.160 | And we also believe that if you don't pay attention to it,
00:05:53.480 | then you might not consciously apply the use of technology
00:05:58.440 | to bring people together.
00:06:00.160 | And you may passively just allow society to break apart
00:06:04.640 | without being too...
00:06:05.480 | - Oh my God, that's drastic.
00:06:06.320 | - That sounds very drastic.
00:06:07.160 | That sounds a bit scary.
00:06:08.720 | But we thought that it's important to say
00:06:10.520 | that we do have the objective of bringing people together
00:06:13.880 | with the help of technology.
00:06:15.440 | So that's how we landed on the idea of digital development.
00:06:20.440 | And there's one more dimension,
00:06:22.040 | that one we draw reference from perhaps
00:06:25.160 | the physical developmental aspects of cities.
00:06:28.320 | We say that if you think of yourself as a developer,
00:06:32.240 | all developers have to conceptualize,
00:06:35.760 | all developers have to plan,
00:06:38.240 | developers have to implement.
00:06:40.840 | And in the process of implementation,
00:06:42.600 | you will monitor and things don't go as well
00:06:45.320 | as you'd like them to.
00:06:46.240 | You have to rectify.
00:06:47.320 | Yeah, it sucks, essentially, it is.
00:06:50.320 | But that's what any developer, any good developer must do.
00:06:54.400 | But a best-in-class developer would also have to think
00:06:59.400 | about the higher purpose that you're trying to achieve.
00:07:02.520 | Should also think about who are the partners
00:07:05.000 | that you bring into the picture
00:07:07.320 | and not try to do everything alone.
00:07:09.320 | And I think very importantly,
00:07:11.120 | a best-in-class developer seeks to be a leader
00:07:15.280 | in thought and action.
00:07:17.080 | So we say that if we call ourselves
00:07:19.000 | the Ministry of Digital Development,
00:07:21.280 | how do we also, you know,
00:07:23.360 | whether in thinking of the digital economy,
00:07:25.800 | thinking of the digital society,
00:07:27.200 | digital security or digital government,
00:07:29.360 | embody these values.
00:07:30.600 | These values of being a bridge builder,
00:07:33.320 | being an entity that cares about the longer-term impact
00:07:37.600 | that serves a higher purpose.
00:07:38.960 | So those were the kinds of things
00:07:40.840 | that we brought into the discussions on our own renaming.
00:07:45.840 | And that's quite a good experience for the whole team.
00:07:49.280 | From the outside, I actually was surprised.
00:07:50.720 | I was looking for MCI and I couldn't find it.
00:07:52.520 | (laughs)
00:07:53.360 | Since you renamed it.
00:07:54.200 | - There, there, there, I found it.
00:07:55.040 | - Yeah, exactly.
00:07:55.880 | We have to plug a little logo for the cameras.
00:07:59.160 | I really like that you are now recognizing
00:08:01.000 | the role of the web, digital development, technology.
00:08:04.720 | We never really had it officially.
00:08:06.560 | It used to be Ministry of Information,
00:08:08.080 | Communication and the Arts.
00:08:09.480 | You know, one thing that we're going to touch on
00:08:10.720 | is the growth of Singapore as an engineering hub.
00:08:13.280 | You know, OpenAI is opening an office in Singapore
00:08:15.880 | and how we can grow more AI engineers in Singapore as well.
00:08:18.880 | 'Cause I do think that that is something
00:08:20.600 | that people are interested in,
00:08:21.880 | whether or not it's for their own careers
00:08:23.040 | or to hire out in Singapore.
00:08:24.880 | Maybe it's a good time to get into a National AI Strategy.
00:08:28.160 | You presented it to the PM, now PM, I guess.
00:08:32.240 | I don't know what the process was
00:08:33.400 | because we have a new PM.
00:08:34.680 | Most of our audience is not going to be Singaporeans.
00:08:37.400 | There are going to be more Singaporeans than normal,
00:08:39.640 | but most of our audience are not Singaporeans.
00:08:41.320 | They've never heard of it,
00:08:42.400 | but they all come from countries
00:08:43.920 | which are all trying to figure out
00:08:44.760 | the National AI Strategy.
00:08:46.000 | So how did you go about defining a National AI Strategy?
00:08:49.600 | Well, in some sense,
00:08:52.080 | we went back to the drawing board and said,
00:08:55.000 | what do we want to see AI be able to do in Singapore?
00:08:59.800 | I mean, there are all these exciting developments.
00:09:01.960 | Obviously, we'd like to be part of the action.
00:09:04.520 | It has to be in service of something.
00:09:06.560 | And what we were interested in is just try and find a way
00:09:11.480 | to continuously uplift our people.
00:09:13.760 | Because ultimately, for any national strategy to work,
00:09:18.400 | it must bring benefits to the local communities.
00:09:23.360 | And the local communities can be defined very broadly.
00:09:27.080 | You have citizen communities,
00:09:29.520 | and citizens would like to be able to do better jobs,
00:09:33.040 | and they would like to be able to earn higher wages.
00:09:35.600 | But it's not just citizen communities.
00:09:37.680 | Citizens are themselves sometimes involved in businesses.
00:09:40.760 | So how about the enterprise community?
00:09:42.880 | And in the enterprise community,
00:09:44.400 | in the Singapore landscape, it's really interesting.
00:09:46.640 | Like most other economies, we do have SMEs,
00:09:49.200 | but we also have multinationals that are at the very cutting edge.
00:09:54.080 | Because in order to succeed in Singapore,
00:09:55.880 | they have to be very competitive.
00:09:57.760 | So the question is,
00:09:59.360 | how can they, through the use of technologies and including AI,
00:10:05.520 | offer an even higher value proposition to their customers,
00:10:09.200 | to their owners?
00:10:11.000 | And so we were very interested in seeing enterprise applications of AI.
00:10:16.840 | That, in a way, also relates back to the workforce.
00:10:21.120 | Because for all of the employees of these organisations,
00:10:25.480 | then to see that their employers are implementing AI models,
00:10:29.880 | and they are identifying AI use cases,
00:10:32.680 | is tremendously motivating for the broader workforce to themselves
00:10:37.360 | to want to acquire AI-related skills.
00:10:40.040 | Then not forgetting that for the large body of small and medium enterprises,
00:10:46.200 | it's always going to be a little bit harder
00:10:49.000 | for smaller businesses to access technologies.
00:10:52.960 | So what do we put in place to enable these small businesses
00:10:57.840 | to take advantage of what AI has to offer?
00:11:01.600 | So you have to have a holistic strategy that can fire up many different engines.
00:11:06.360 | So we work across the board to make compute available,
00:11:10.160 | firstly to the research community,
00:11:12.200 | but also taking care to ensure that compute capacity
00:11:16.920 | could be available to companies that are in need of them.
00:11:21.480 | So how do we do that?
00:11:22.480 | That's one question that we have to go get it organised.
00:11:25.160 | Then another very important aspect is making data available.
00:11:29.120 | And I think in this regard,
00:11:30.560 | some of the earlier work that we did was helpful.
00:11:33.080 | We did, from more than a decade ago,
00:11:35.560 | already have privacy laws in place.
00:11:37.960 | We have data protection.
00:11:40.120 | And these laws have also been updated
00:11:42.320 | so as to support businesses with legitimate use cases.
00:11:45.400 | So the clarity and the certainty is there.
00:11:48.640 | And then we've also tried to organise data,
00:11:52.480 | make it more readily available.
00:11:54.640 | Some of it, for example, could be specific to the finance sector,
00:11:59.080 | some specific to the logistics sector.
00:12:01.640 | But then there are also different kinds of data
00:12:04.720 | that lies within government possession.
00:12:07.080 | And we are making it much more readily available
00:12:10.200 | to the private sector.
00:12:11.440 | So that deals with the data part of it.
00:12:14.040 | I think the third and very important part of it is talent.
00:12:17.400 | And we're thinking of talent at different levels.
00:12:20.160 | We're thinking of talent at the uppermost level,
00:12:23.080 | you know, for want of a better term,
00:12:24.440 | we call them AI creators.
00:12:26.080 | We know that they are very highly sought after.
00:12:28.760 | There aren't all that many in the world.
00:12:30.600 | And we want to interest them to do work with Singapore.
00:12:33.800 | Sometimes they will be in Singapore,
00:12:36.360 | but there is a value in them being plugged
00:12:39.160 | into the international networks,
00:12:41.040 | to be plugged into globally leading edge projects
00:12:45.800 | that may or may not be done out of Singapore.
00:12:49.120 | We think that keeping those linkages are very important.
00:12:52.200 | These AI creators have to be supported
00:12:54.120 | by what we generally refer to as AI practitioners.
00:12:57.880 | We're talking about people who do data science.
00:13:00.200 | We're talking about people who do machine learning.
00:13:02.560 | They're engineers, absolutely engineers.
00:13:06.480 | But then you also need the broad swath of AI users,
00:13:11.080 | people who are going to be comfortable using the tools
00:13:14.600 | that are made available to them.
00:13:16.480 | So you may have, for example, a group within a company
00:13:19.480 | that designs AI bots or finds use cases,
00:13:23.720 | but if their colleagues aren't comfortable using them,
00:13:27.280 | then in some sense, the picture is not complete.
00:13:30.040 | So we want to address the talent question
00:13:33.800 | at all of these levels.
00:13:35.640 | In a sense, we are fortunate Singapore is compact enough
00:13:39.720 | for us to be able to get these kinds
00:13:41.640 | of interventions organized.
00:13:43.800 | We already have a robust training infrastructure.
00:13:47.120 | We can rely on that.
00:13:48.520 | People know what funding support is available to them.
00:13:52.720 | Training providers know that if they curate programs
00:13:57.440 | that lead to good employment outcomes,
00:14:00.000 | they are very likely to be able to get support
00:14:02.200 | to offer these programs at subsidized rates.
00:14:05.200 | So in a sense, that ecosystem is able to support
00:14:09.360 | what we hope to see come out of an AI strategy.
00:14:12.840 | So those are just some of the pieces that we put in place.
00:14:15.120 | - Many pieces, 15 items.
00:14:17.000 | - Yeah, okay.
00:14:17.840 | - So for people who are interested, they can look it up,
00:14:19.880 | but I just wanted to get an introduction to people.
00:14:22.320 | Many people don't even know
00:14:23.200 | that we have a very active AI strategy,
00:14:25.200 | and actually it's the second one.
00:14:26.680 | Like there's already been like a five-year plan,
00:14:29.120 | pre-generative AI, which was very foresighted.
00:14:32.320 | - One thing that we also pay attention to
00:14:34.600 | is how can AI be developed and deployed
00:14:38.200 | in a responsible manner, in a way that is trustworthy.
00:14:42.480 | And we want to plug ourselves
00:14:44.640 | into conversations at the forefront.
00:14:47.440 | We have an AI Safety Institute,
00:14:49.400 | and we work together with our colleagues in the US,
00:14:52.560 | as well as in the UK and anywhere else
00:14:54.800 | that has AI Safety Institutes
00:14:56.440 | to try and advance our understanding of this topic.
00:14:59.040 | But I think more importantly is that in the meantime,
00:15:03.040 | we've got to offer the business community,
00:15:06.240 | offer AI developers something practical to work with.
00:15:10.040 | So we've developed testing tools,
00:15:13.440 | by no means perfect, but they're a start.
00:15:16.320 | And then we also said that because AI Verify
00:15:18.760 | was developed for traditional AI, classical AI,
00:15:22.160 | then for generative AI, you need something different,
00:15:24.400 | something that also does red teaming,
00:15:26.160 | something that also does benchmarking.
00:15:28.480 | But actually our interests go beyond that,
00:15:31.080 | beyond AI governance frameworks and practical tools.
00:15:34.480 | We are interested in getting into the research
00:15:37.320 | as to how do you prove that an AI system is really safe?
00:15:42.320 | How do you get into the mathematics of it?
00:15:46.240 | I'm not an expert in this field,
00:15:47.960 | but I think it's not difficult for people to understand
00:15:50.800 | that until you can get to a proof,
00:15:53.200 | then some of the other testing is reassuring,
00:15:56.640 | but to an extent.
00:15:58.000 | - It may be fundamentally unprovable.
00:16:00.040 | - It may well be.
00:16:00.880 | - You might have to be comfortable with that
00:16:02.040 | and go ahead anyway.
00:16:03.000 | - Yes.
00:16:03.840 | - Yeah.
00:16:04.800 | - The simulations especially are really interesting.
00:16:07.480 | I think NTU is going to be one of the first universities
00:16:10.080 | to have these cyber ranges for like a AI red teaming training.
00:16:13.440 | One of our companies does AI red teaming
00:16:15.400 | and their customers are like
00:16:17.040 | some of the biggest foundation model labs and then GovTech.
00:16:19.600 | It's like the only government organization working.
00:16:22.880 | So yeah, Singapore has been at the forefront of this.
00:16:26.000 | I mean, we sat down with the CP of Grab,
00:16:28.440 | Philip Kendall on my trip there,
00:16:30.240 | and they shut down their whole company for a week
00:16:32.480 | to just focus on Gen AI training.
00:16:34.080 | Like literally, if you work at Grab,
00:16:36.360 | you have to do something in Gen AI
00:16:38.520 | and kind of learn and get comfortable with it.
00:16:40.720 | Going back to your point,
00:16:41.560 | I think the interest of the government
00:16:44.040 | easily transpires into the companies.
00:16:45.880 | You know, it's like, this is like a national priority,
00:16:48.600 | so we should all spend time in it.
00:16:50.320 | - Yeah, you're right.
00:16:51.240 | I mean, companies like Grab,
00:16:53.360 | what they are trying to do is to make awareness
00:16:58.360 | as so broad within their organization
00:17:02.640 | and to get to a level of comfort with using Gen AI tools,
00:17:07.160 | which I think is a smart move
00:17:09.760 | because the returns will come later,
00:17:13.000 | but they will surely come.
00:17:14.280 | They're not the only ones doing that.
00:17:15.640 | I'm glad to say some of our leading banks,
00:17:18.120 | even Singapore Airlines,
00:17:19.480 | which may be the airline that you flew into Singapore,
00:17:22.960 | they've got a serious team looking at AI use cases.
00:17:27.600 | And I don't know whether you are aware of it,
00:17:29.360 | they have definitely quite a good number.
00:17:32.080 | I'm not sure that they have talked about it openly
00:17:34.720 | because airline operations are quite complex.
00:17:36.640 | - At least Singapore Airlines offer.
00:17:37.760 | (all laughing)
00:17:39.440 | - No, because airline operations are very complex.
00:17:42.160 | - Yeah. - Yeah.
00:17:43.000 | - There are lots of things that you can optimize.
00:17:44.880 | There are lots of things that you have to comply with.
00:17:47.480 | There are lots of processes that you must follow.
00:17:50.080 | And this kind of context makes it interesting for AI.
00:17:54.840 | You can put it to good use.
00:17:56.480 | And government mustn't be lagging too.
00:17:58.560 | We've always believed that in time to come,
00:18:01.360 | we may well have to put in place guardrails,
00:18:04.480 | but you are able to put in place guardrails better
00:18:08.400 | if you yourself have used the technology.
00:18:10.880 | So that's the approach that we are taking.
00:18:13.200 | Quite early on, we decided to lay out some guidelines
00:18:17.760 | on how GenAI could be used by government offices.
00:18:22.320 | And then we also went about developing tools
00:18:25.360 | that will enable them to practice
00:18:27.640 | and also to try their hand at it.
00:18:30.640 | I think in today's context,
00:18:31.840 | we're quite happy with the fact
00:18:33.320 | that there are enough colleagues within government
00:18:37.360 | that are competent,
00:18:38.480 | that know, in fact, how to generate their own AI bots,
00:18:42.760 | create a systems for their colleagues.
00:18:45.560 | And that's quite an exciting development.
00:18:47.600 | I will mention that as obviously a citizen
00:18:51.960 | and someone keen on developing AI in Singapore,
00:18:54.880 | I do worry that we lead with safety,
00:18:58.960 | lead with public good.
00:19:00.520 | I'm not sure that the Singapore government is aware
00:19:03.600 | that safety sometimes is a bad word in some AI circles
00:19:07.160 | 'cause their word is associated with censorship.
00:19:09.320 | - Or over-regulation.
00:19:10.520 | - Over-regulation.
00:19:11.520 | Nuffing is the Gen Z word for this,
00:19:14.920 | of capabilities in order to be safe.
00:19:16.840 | And actually that pushes what you call AI creators,
00:19:19.360 | some others might call LLM trainers, whatever.
00:19:21.880 | There are trade-offs, you cannot have it all.
00:19:23.720 | You cannot have safe and cutting edge sometimes
00:19:27.200 | because sometimes cutting edge means unsafe.
00:19:29.160 | I don't know what the right answer is,
00:19:30.840 | but I will say that my perception is
00:19:33.000 | a lot of the Bay Area, San Francisco is on the,
00:19:35.840 | let everything be unregulated as possible,
00:19:38.320 | let's explore the frontier.
00:19:40.160 | And Europe's approach is like,
00:19:42.520 | we're gonna have government conferences
00:19:44.160 | on the safety of AI even before creating frontier AI.
00:19:47.520 | And Singapore, I think is like in the middle of that,
00:19:50.400 | there's a risk, maybe not, I saw you shake your head.
00:19:53.240 | - It's a really interesting question.
00:19:55.520 | How do you approach AI development?
00:19:59.080 | Do you say that there are some ethical principles
00:20:02.240 | that should be adhered to?
00:20:03.720 | Do you say that there are certain guidelines
00:20:07.960 | that should inform the developer's thinking?
00:20:11.760 | - And we don't have a law in place just yet.
00:20:15.840 | We've only introduced very recently
00:20:18.360 | a law that has yet to be passed.
00:20:20.640 | This is on AI-generated content,
00:20:23.200 | other synthetic materials
00:20:24.920 | that could be used during an election,
00:20:27.040 | but that's very specific to an election.
00:20:28.720 | - That's a topic we wanted to touch on.
00:20:30.240 | - It's very specific to election.
00:20:32.000 | For the broader base of AI developers
00:20:35.160 | and AI model deployers,
00:20:39.240 | the way in which we have gone about it
00:20:40.880 | is to put in place the principles.
00:20:43.880 | We articulate what good AI governance should look like.
00:20:48.440 | And then we've decided to take it one step further.
00:20:51.000 | We have testing tools, we have frameworks,
00:20:54.240 | and we've also tried to say,
00:20:57.240 | well, if you go about AI development,
00:21:01.240 | what are some of the safety considerations
00:21:04.840 | that you should put in place?
00:21:06.560 | And then we suggest to AI model developers
00:21:10.240 | that they should be transparent.
00:21:12.400 | What are the things they ought to be transparent about?
00:21:14.720 | For example, your data, how is it sourced?
00:21:17.560 | You should also be transparent about the use cases.
00:21:20.840 | What do you intend for it to be used for?
00:21:23.480 | So there are some of these specific guidelines
00:21:25.720 | that we provide.
00:21:27.080 | They are, to a large extent, voluntary in nature.
00:21:30.320 | But on the other hand, we hope that through this process,
00:21:33.640 | there is enough education being done
00:21:36.720 | so that on the receiving end,
00:21:38.640 | those who are impacted by those models
00:21:40.600 | will learn to ask the right questions.
00:21:42.520 | And when they ask the right questions
00:21:44.040 | of the model developers and the deployers,
00:21:46.880 | then that generates a virtual cycle
00:21:49.920 | where good questions are being brought to the surface
00:21:53.880 | and there is a certain sense of responsibility
00:21:57.080 | to address those questions.
00:21:58.760 | I take your point that until you are very clear
00:22:02.200 | about the outcomes you want to achieve,
00:22:04.560 | putting in place regulations could be counterproductive.
00:22:07.600 | And I think we see this in many different sectors.
00:22:10.280 | Well, since AI is often talked about
00:22:12.840 | as general-purpose technology,
00:22:14.800 | yes, of course, in another general-purpose technology,
00:22:18.040 | electricity, in its production,
00:22:19.960 | of course there are regulations around that.
00:22:22.000 | You know, how to keep the workers safe
00:22:23.520 | in a power plant, for example.
00:22:25.680 | But many of the regulations do not attempt
00:22:29.360 | to stifle electricity usage to begin with.
00:22:32.680 | It says that, well, if you use electricity
00:22:34.880 | in this particular manner or in that particular manner,
00:22:38.560 | then here are the rules that you have to follow.
00:22:40.840 | I believe that that could be true of AI too.
00:22:43.800 | It depends on the use cases.
00:22:45.400 | If you use it for elections,
00:22:46.920 | then okay, we will have a set of rules.
00:22:48.960 | But if you're not using it for elections,
00:22:50.480 | then actually in Singapore today, go ahead.
00:22:53.440 | But of course, if you do harmful things,
00:22:55.200 | that's a different story altogether.
00:22:56.800 | - How do you structure a ministry
00:22:59.040 | when the technology moves so quickly?
00:23:01.080 | Even if you think about the moratorium
00:23:03.280 | that Singapore had on data center build-out
00:23:05.080 | that was lifted recently,
00:23:06.400 | obviously, you know, that's a forward-looking thing.
00:23:09.400 | As you think about what you want to put in place for AI
00:23:12.080 | versus what you want to wait out and see,
00:23:13.680 | like, how do you make that decision?
00:23:15.400 | You know, CEOs have to make the same decision.
00:23:17.400 | It's like, should I invest in AI now?
00:23:18.720 | Should I, like, follow and see where it goes?
00:23:20.760 | Like, what's the thought process and who do you work with?
00:23:23.640 | - The fortunate thing for Singapore, I think,
00:23:25.920 | is that we're a single tier of government.
00:23:28.760 | In many other countries, you may have the federal level
00:23:32.840 | and then you have the provincial or state-level governments,
00:23:36.640 | depending on the nomenclature
00:23:38.600 | in that particular jurisdiction.
00:23:40.160 | For us, it's a single tier.
00:23:41.240 | - City-state.
00:23:42.080 | - City-state.
00:23:42.920 | When you're referring to the government,
00:23:44.600 | well, it's the government, no one else.
00:23:46.440 | Okay, is it the federal government
00:23:47.480 | or is it the local government?
00:23:50.080 | So that in itself is greatly facilitative already.
00:23:55.080 | The second thing is that we do have a strong culture
00:23:59.920 | of cooperating across different ministries.
00:24:02.800 | In the digital domain, you absolutely have to,
00:24:05.360 | because it's not just my ministry
00:24:08.520 | that is interested in seeing applications being developed
00:24:12.880 | and percolate throughout our system.
00:24:15.400 | If you are the Ministry of Transport,
00:24:17.200 | you'd be very interested how artificial intelligence,
00:24:20.360 | machine learning can be applied to the rail system
00:24:23.840 | to help it to advance from corrective maintenance,
00:24:26.720 | where you go in and maintain equipment
00:24:29.680 | after they've broken down,
00:24:31.000 | to preventive maintenance,
00:24:32.680 | which is still costly because you can't go around
00:24:35.800 | maintaining everything preventatively.
00:24:38.040 | So how do you prioritize?
00:24:39.680 | And if you use machine learning to prioritize
00:24:43.640 | and move more effectively into predictive maintenance,
00:24:47.440 | then potentially you can have a more reliable rail system
00:24:50.720 | without it costing a lot more.
00:24:52.800 | So Ministry of Transport
00:24:54.600 | would have this set of considerations
00:24:56.560 | and they have to be willing to support innovations
00:24:59.760 | in their particular sector.
00:25:01.600 | In healthcare, there would be equally
00:25:03.480 | a different set of considerations.
00:25:05.280 | How can machine learning,
00:25:07.000 | how can AI algorithms be applied to help physicians?
00:25:11.000 | Not to overtake physicians.
00:25:12.560 | I don't think physicians can be overtaken so easily,
00:25:15.240 | not at all, for the imaginable future.
00:25:18.160 | But can it help them with diagnosis?
00:25:20.880 | Can it help them with treatment plans?
00:25:24.200 | What constitutes an optimized treatment plan
00:25:27.800 | that would take into consideration the patient's
00:25:31.120 | whole set of health indicators?
00:25:34.720 | Then how does a physician look at all these inputs
00:25:37.920 | and still apply judgment?
00:25:39.640 | Those are the areas that we would be very interested in
00:25:42.320 | as MDDI, but equally, I think,
00:25:44.400 | my colleagues in the Ministry of Health.
00:25:46.120 | So the way in which we organize ourselves
00:25:48.480 | must allow for ownership to also be taken by our colleagues,
00:25:52.480 | that they want to push it forward.
00:25:54.200 | We keep ourselves relatively lean.
00:25:56.160 | At the broad level, we may say
00:25:58.440 | there's a group of colleagues who looked at digital economy,
00:26:01.200 | another group that looks at digital society,
00:26:03.840 | another group looks at digital government.
00:26:05.760 | But actually, there are many occasions
00:26:08.280 | where you have to be cross-disciplinary.
00:26:10.840 | Even digital government,
00:26:12.120 | the more you digitalize your service delivery to citizens,
00:26:15.680 | the more you have to think about the security architecture.
00:26:20.280 | The more you have to think about
00:26:21.560 | whether this delivery mechanism is resilient.
00:26:24.720 | And you can't do it in isolation.
00:26:26.320 | You have to then say,
00:26:27.960 | if the standards that we set for ourselves
00:26:29.760 | are totally dislocated with what the industry does,
00:26:32.680 | how hyperscalers go about architecting their security,
00:26:37.240 | then the two are not interoperable.
00:26:39.120 | So a degree of flexibility,
00:26:41.880 | a way of allowing people to take ownership
00:26:44.800 | of the areas that come within their charge,
00:26:47.960 | and very importantly, constantly building bridges,
00:26:51.400 | and also encouraging a culture of not saying
00:26:54.600 | that here's where my job stops.
00:26:57.320 | In a field that is, as you say,
00:26:58.960 | developing as quickly as it does,
00:27:00.920 | you can't rigidly say that beyond this, not my problem.
00:27:05.440 | It is your problem
00:27:06.280 | until you find somebody else to take care of it.
00:27:08.080 | - The thing you raised about healthcare
00:27:10.200 | is something that a lot of people here are interested in.
00:27:12.680 | If someone, let's say a foreign startup or company,
00:27:16.680 | or someone who is a Singaporean founder
00:27:19.160 | wants to do this in the healthcare system,
00:27:22.160 | what should they do?
00:27:23.240 | Who do they reach out to?
00:27:24.600 | It often seems impenetrable,
00:27:26.000 | but I feel like we want to say Singapore's open for business,
00:27:29.200 | but where do they go?
00:27:31.000 | - Well, the good thing about Singapore
00:27:32.400 | is that it's not that difficult eventually
00:27:34.200 | to reach the right person.
00:27:35.640 | But we can also understand
00:27:37.160 | that to someone who is less familiar with Singapore,
00:27:39.840 | you need an entry point.
00:27:41.440 | And fortunately, that entry point
00:27:43.120 | has been very well served by the Economic Development Board.
00:27:46.960 | The Economic Development Board has got colleagues
00:27:49.360 | who are based in, I believe, more than 40 cities,
00:27:53.640 | and they serve as a very useful initial touchpoint.
00:27:58.640 | And then they might provide advice
00:28:00.520 | as to who do you link up with in Singapore?
00:28:03.440 | And it doesn't take more than a few clicks,
00:28:06.920 | in a way, to get to the right...
00:28:09.320 | - I will say I've been dealing with EDB
00:28:10.800 | a little bit for my conference,
00:28:11.960 | and they've been extremely responsive.
00:28:13.640 | And it's been nice to see,
00:28:15.720 | 'cause I never get to see this out of government,
00:28:17.600 | nice to see that as someone
00:28:19.800 | that wants to bring a foreign business into Singapore,
00:28:22.760 | they're kind of rolling out the welcome mat.
00:28:24.600 | - But we also recognise that in newer areas,
00:28:27.280 | there could be question of,
00:28:29.000 | oh, okay, this is something unfamiliar.
00:28:31.240 | The way in which we go about it is to say that,
00:28:34.320 | okay, even if there is no particular group or entity
00:28:39.120 | that champions a topic,
00:28:41.720 | we don't have to immediately turn away that opportunity.
00:28:46.360 | There must be a way for us
00:28:47.840 | to connect to the right group of people.
00:28:50.280 | So that tends to be the approach that we take.
00:28:52.520 | - There's a bit of tension.
00:28:53.440 | The external perception of Singapore,
00:28:55.560 | people are very influenced by still the Michael Faye incident
00:28:58.400 | of like 30 years ago.
00:28:59.720 | And they feel us as conservative.
00:29:02.080 | And I feel like within Singapore,
00:29:03.800 | we know what the OB markers are, quote unquote,
00:29:06.800 | and then we can live within that.
00:29:08.160 | And it's actually,
00:29:09.000 | you can have a lot of experimentation within that.
00:29:10.760 | In fact, I think a lot of Singapore's success in finance
00:29:13.280 | has been due to sort of a liberal acceptance
00:29:16.360 | of what we can do.
00:29:17.240 | I don't have a point apart from which to say,
00:29:19.160 | I hope that people who are looking to explore Singapore
00:29:23.040 | don't have that preconception that we are hard to deal with
00:29:26.520 | because we're very eager, I think, is my perception.
00:29:30.120 | - We need to hop on a plane and get to Singapore
00:29:32.240 | and then we are happy to show them around.
00:29:34.720 | - I'll take this chance to mention that,
00:29:36.040 | so next year, I kind of have been pitching
00:29:37.480 | as the Olympics of Singapore year,
00:29:39.920 | in the sense that ICLR,
00:29:41.200 | one of the big machine learning conferences is coming.
00:29:43.800 | I think your agency had a,
00:29:45.280 | one of your agencies had a part to do with that.
00:29:47.360 | And I'm bringing my own conference as well
00:29:48.840 | to host alongside.
00:29:50.000 | - This is a conference on AI engineers?
00:29:51.720 | - Yes.
00:29:52.560 | - Fantastic.
00:29:53.720 | You'll be very welcome.
00:29:54.840 | - Oh yeah, thanks.
00:29:55.920 | I hope so.
00:29:57.120 | Well, you can't deny me entry.
00:29:58.360 | - Should we have reason to?
00:29:59.600 | (laughing)
00:30:00.440 | - No, no, no.
00:30:02.080 | My general hope is that when conferences like ICLR
00:30:05.200 | happen in Singapore,
00:30:06.120 | that a lot of AI creators
00:30:07.440 | will be coming to Singapore for the first time,
00:30:09.080 | and they'll be able to see like the kind of work
00:30:10.840 | that's been done.
00:30:11.680 | - Yes.
00:30:12.520 | - And that will be on the research side.
00:30:13.440 | And I hope that the engineering side grows as well.
00:30:15.560 | - Yeah.
00:30:16.400 | - We can talk about the talent side if you want.
00:30:18.400 | - Well, it's quite interesting for me
00:30:19.880 | because I was listening to your podcast,
00:30:21.760 | explaining the different dimensions
00:30:23.560 | of what an AI engineer does.
00:30:26.560 | And maybe we haven't called them AI engineers just yet,
00:30:31.160 | but we are seeing very healthy interest
00:30:33.320 | amongst people in companies
00:30:35.680 | that take an enthusiastic approach
00:30:38.360 | to try and see how AI can be helpful to their business.
00:30:42.840 | They seem to me to fit the bill.
00:30:45.720 | They seem to me already,
00:30:47.720 | whether they recognize it or not,
00:30:49.200 | to be the kind of AI engineers that you have in mind.
00:30:51.840 | Meaning that they may not have done a PhD.
00:30:54.960 | They may not have gotten their degrees in computer science.
00:30:59.400 | They may not have themselves used NLP.
00:31:03.560 | They may not be steep in this area,
00:31:06.560 | but they are acquiring the skills very quickly.
00:31:08.600 | They are pivoting.
00:31:09.800 | They have the domain knowledge.
00:31:11.280 | - Correct.
00:31:12.120 | It's not even about the pivoting.
00:31:13.360 | They might just train from the start,
00:31:14.640 | but the point is that they can take a foundation model
00:31:16.520 | that's capable of anything
00:31:17.920 | and actually fashion it into a useful product
00:31:20.320 | at the end of it.
00:31:21.160 | - Yes.
00:31:22.000 | - Which is what we all want.
00:31:22.840 | Everybody downstairs wants that.
00:31:24.920 | Everyone here wants that.
00:31:25.800 | They want useful products,
00:31:26.760 | not just general capable models.
00:31:29.200 | And I see the job title.
00:31:31.200 | There are some people walking around
00:31:32.280 | with their lanyards today,
00:31:33.600 | which is kind of cool.
00:31:34.600 | I think you have a lot of terms,
00:31:35.840 | which are AI creators, AI practitioners.
00:31:38.080 | I want to call out that there was this interesting goal
00:31:40.760 | to increase the triple,
00:31:41.920 | the number of AI practitioners, right?
00:31:43.880 | Which is part of the national AI strategy
00:31:45.280 | from 5,000 to 15,000.
00:31:47.120 | But people don't walk around
00:31:47.960 | with the title AI practitioners.
00:31:49.160 | - Absolutely not.
00:31:50.120 | - So I'm like, no, you have to focus on job title
00:31:52.840 | because job titles get people jobs.
00:31:55.200 | - Yeah, fair enough.
00:31:56.040 | - It is just shorthand for companies to hire
00:31:58.200 | and it's a shorthand for people to skill up
00:32:01.640 | in whatever they need in order to get those jobs.
00:32:04.320 | And I'm a very practical person here.
00:32:06.240 | I think many Singaporeans are,
00:32:08.200 | and that's kind of my pitch on the AI engineer side.
00:32:10.520 | - Well, thank you for that suggestion.
00:32:12.360 | We'll be thinking about how we also help Singaporeans
00:32:16.320 | understand the opportunities to be AI engineers.
00:32:19.760 | How can they get into it?
00:32:21.720 | - A lot of governments are trying to do this, right?
00:32:23.080 | Like train their citizens and offer opportunities.
00:32:25.640 | I have not been in the Singapore workforce my adult career.
00:32:30.640 | So I don't really know what's available
00:32:32.840 | apart from SkillsFuture.
00:32:34.080 | I think that there are a lot of people wanting help
00:32:37.360 | and they go for courses, they get certificates.
00:32:39.960 | I don't know how we get them over the hump
00:32:42.000 | of going into industry and being successful engineers.
00:32:46.320 | And I fear that we're going to create
00:32:48.320 | a whole bunch of certificates that don't mean anything.
00:32:50.120 | I don't know if you have any thoughts or responses on that.
00:32:52.920 | - This idea that you don't want to over-rely
00:32:55.760 | on qualifications and credentials
00:32:59.120 | is also something that has been recognised
00:33:02.360 | in Singapore for some years now.
00:33:04.280 | That even includes your academic qualifications.
00:33:07.080 | So every now and then you do hear people decide
00:33:10.040 | that that's not the path that they're going to take
00:33:11.880 | and they're going to experiment
00:33:13.520 | and they're going to try different ways.
00:33:15.040 | Entrepreneurship could be one of it.
00:33:16.760 | For the broad workforce, what we have discovered
00:33:19.600 | is that the signal from the employer
00:33:21.800 | is usually the most important.
00:33:24.280 | As members of the workforce, they are very responsive
00:33:27.320 | to what employers are telling them.
00:33:29.120 | So in the organisational context,
00:33:31.320 | if, like in the case of Grab,
00:33:33.600 | Alessio was talking about, you know,
00:33:35.320 | them shutting down completely for one week
00:33:37.800 | so that everyone can pick up generative AI skills,
00:33:40.640 | that sends a very strong signal.
00:33:42.400 | So quite a lot of the government funding
00:33:45.000 | will go to the company
00:33:47.480 | and say that it's an initiative you want to undertake.
00:33:50.320 | We recognise that it does take up
00:33:52.360 | some of your company's resources
00:33:54.080 | and we are willing to help with it.
00:33:55.480 | These are what we call company-led training programmes.
00:33:58.400 | But not everyone works for a company
00:34:01.120 | that is progressive.
00:34:02.520 | And if the company is not ready
00:34:04.760 | to introduce an organisation-wide training initiative,
00:34:09.640 | then what does an individual do?
00:34:11.520 | So we have an alternative to offer.
00:34:16.040 | What we've done is to work with
00:34:19.840 | knowledgeable industry practitioners
00:34:23.360 | to identify for specific sectors
00:34:27.440 | the kinds of technology that will disrupt jobs
00:34:31.880 | within the next three to five years.
00:34:34.280 | We're not choosing to look at a very long horizon
00:34:36.800 | because no one really knows
00:34:38.960 | how the future of work will be like
00:34:41.240 | in 15, 35 years,
00:34:44.240 | except in very broad terms.
00:34:46.040 | You can, you can say in very broad terms
00:34:48.600 | that you are going to have shorter learning cycles,
00:34:52.120 | you are going to have skills atrophy
00:34:54.560 | at a much quicker rate.
00:34:55.640 | With those broad things, we can say.
00:34:57.920 | But specifically, the job that I'm doing today,
00:35:00.960 | the task that I have to perform today,
00:35:03.160 | how will I do them differently?
00:35:05.240 | I think in three to five years, you can say.
00:35:07.320 | And you can also be quite specific.
00:35:09.560 | If you're in logistics,
00:35:10.920 | what kinds of technology will change the way you work?
00:35:14.320 | Robotics will be one of them.
00:35:15.680 | Robotics isn't as likely to change jobs
00:35:18.320 | in financial services,
00:35:20.120 | but AI and machine learning will.
00:35:23.240 | So if you identify the timeframe,
00:35:25.840 | and if you identify the specific technologies,
00:35:28.920 | then you go to a specific job role and say,
00:35:31.880 | here's what you're doing today,
00:35:33.640 | and here's what you're going to be doing
00:35:34.920 | in this new timeframe.
00:35:36.480 | Then you have a chance to allow individuals
00:35:39.360 | to take ownership of their learning
00:35:41.000 | and say then, how do I plug it?
00:35:42.800 | So one of the examples I like to give
00:35:45.080 | is that if you look at the accounting profession,
00:35:47.680 | a lot of the routine work will be replaceable.
00:35:51.360 | A lot of the tasks that are currently done by individuals
00:35:55.560 | can be done with a good model backing you.
00:35:59.600 | Now, then what happens to the individual?
00:36:01.600 | They have to be able to use the model.
00:36:03.800 | They have to be able to use the AI tools,
00:36:06.360 | and then they will have to pivot to doing other things.
00:36:09.120 | For example, there will still be a great shortage
00:36:11.520 | of people who are able to do forensics.
00:36:14.360 | And if you want someone to do forensics,
00:36:17.040 | for example, a financial crime has taken place.
00:36:19.840 | Within an organization, there was a discovery
00:36:22.880 | that was fraud.
00:36:23.840 | How did this come about?
00:36:25.360 | That forensics work still needs an application
00:36:28.080 | of human understanding of the problem.
00:36:30.960 | Now, one of the jobs that we found
00:36:33.040 | is that a person with audit experience
00:36:35.160 | is actually quite suitable to do digital forensics
00:36:38.080 | because of their experience in audit.
00:36:41.040 | So then how do we help a person like that pivot?
00:36:43.880 | Good if his employer is interested
00:36:45.760 | to invest in his training,
00:36:47.520 | but we would also like to encourage individuals
00:36:50.640 | to refer to what we call jobs transformation maps
00:36:54.080 | to plan their own career trajectory.
00:36:56.880 | That's exactly what we have done.
00:36:58.480 | I think we have definitely more than a dozen of jobs,
00:37:00.920 | such job transformation maps available.
00:37:03.920 | And they cut across a variety of sectors.
00:37:05.920 | - So it's like open source career change programs.
00:37:08.720 | - Exactly, I think you put it better than I, Sean.
00:37:10.720 | - Oh, no.
00:37:11.560 | - Yeah.
00:37:12.400 | - Yeah, you can count on me for marketing at least.
00:37:13.240 | - Yeah.
00:37:14.400 | So actually one day,
00:37:16.040 | somebody is going to feed this into a model.
00:37:17.960 | - Yeah, I was exactly thinking that.
00:37:19.440 | - Yeah, they have to.
00:37:20.480 | Actually, if they just use REG,
00:37:22.680 | it wouldn't be too difficult, right?
00:37:24.000 | Because that document to add to a database
00:37:27.560 | for the purposes of REG,
00:37:29.000 | they will still all fit into the window.
00:37:31.400 | It's going to be possible.
00:37:32.320 | - This is a planning task.
00:37:33.560 | There's the talk of the week,
00:37:34.680 | a talk of the town this week
00:37:35.920 | because of OpenAI's O1 model.
00:37:38.080 | That is the next frontier after REG
00:37:40.480 | is planning and reasoning.
00:37:42.600 | So the steps need to make sense.
00:37:44.560 | - Yes.
00:37:45.400 | - And that hasn't been,
00:37:46.600 | that is not typically a part of REG.
00:37:48.000 | REG is more recall of facts.
00:37:51.000 | And this is much more about planning
00:37:52.520 | as something that in sequence makes sense
00:37:55.320 | to get to a destination.
00:37:56.280 | - That's right.
00:37:57.120 | - Which could be really interesting.
00:37:58.440 | I would love the auditors
00:37:59.640 | to spell out the reasoning traces
00:38:01.400 | so that the language model guys can go and train on it.
00:38:03.960 | - Yeah.
00:38:04.800 | The planning part,
00:38:06.280 | I was trying to do this a couple of years ago.
00:38:08.560 | That was when I was still in the manpower ministry.
00:38:11.640 | We were talking to, in fact,
00:38:13.920 | some recruitment firms in the U.S.
00:38:16.800 | And it's exactly as you described.
00:38:19.640 | It's a planning process.
00:38:21.880 | To pivot from one career to the next
00:38:24.520 | is very often not a single step.
00:38:26.600 | There might be a path for you to take there.
00:38:29.520 | And if you were able to research
00:38:32.640 | the whole database of people's career paths,
00:38:36.360 | then potentially for every person
00:38:38.560 | that shows up and ask the question,
00:38:40.680 | you can use this database to map a new career path.
00:38:44.520 | - I'm very open about my own career transition
00:38:46.680 | from finance to tech.
00:38:48.000 | That's why I brought Quincy Larson here to RAISE
00:38:50.520 | because he taught me to code.
00:38:52.720 | And I think he can teach Singapore to code.
00:38:54.600 | - Wow, why not?
00:38:55.640 | - If they want to.
00:38:56.480 | (laughing)
00:38:57.840 | - Many do.
00:38:58.680 | - Yeah, many do.
00:39:00.240 | - So they will be complementary.
00:39:02.160 | There will be, there is a planning aspect of it.
00:39:05.320 | But if you wanted to use REG,
00:39:07.360 | it does not have individual personalized career paths
00:39:10.640 | to draw on.
00:39:11.600 | That one has got a frame,
00:39:14.760 | a proposal of how you could go about it.
00:39:17.520 | It could tell you,
00:39:18.920 | maybe from A, you could get to B.
00:39:22.640 | Whereas what you're talking about planning is that,
00:39:24.760 | well, here's how someone else has gotten from A to B
00:39:27.960 | by going through CDE in between.
00:39:30.840 | So they're complementary things.
00:39:33.080 | - You and I talked a little bit this morning
00:39:34.560 | about winning the 30-year war, right?
00:39:36.800 | Like a lot of the plans are very short-term,
00:39:38.440 | very like, how can we get it now?
00:39:39.880 | How can we, like, we got OpenAI to open an office here.
00:39:42.680 | Great, let's go and get Anthropic, Google DeepMind,
00:39:44.400 | all these guys, the AI creators to move to Singapore.
00:39:47.040 | Hopefully we can get there, maybe not.
00:39:48.800 | Maybe, maybe not, right?
00:39:49.640 | It's hard to tell.
00:39:50.680 | The 30-year war, in my mind,
00:39:51.920 | is the kind of scale of operation that we did
00:39:54.920 | that leads me to speak English today.
00:39:56.600 | We, as a government, decided strategically,
00:39:59.040 | English is an important thing.
00:40:00.680 | We'll teach it in schools.
00:40:01.520 | We'll adopt it as the language of business.
00:40:03.920 | And you and I discussed, like,
00:40:05.080 | is there something for code?
00:40:06.400 | Is it that level?
00:40:07.640 | Is it time for that kind of shift
00:40:08.960 | that we've done for English, for Mandarin?
00:40:11.400 | And like, is this the third one
00:40:12.800 | that we speak Python as a second language?
00:40:15.440 | And I want to just get your reactions to this crazy idea.
00:40:19.400 | - Well, it may not be so crazy,
00:40:21.400 | the idea that you need to acquire literacy
00:40:24.960 | in a particular field.
00:40:26.480 | I mean, some years ago,
00:40:28.320 | we decided that computer literacy
00:40:29.960 | was important for everyone to have
00:40:32.240 | and put in place quite a lot of programs
00:40:34.440 | in order to enable people at various stages of learning,
00:40:39.160 | including those who are already adult learners,
00:40:42.240 | to try and acquire these kinds of skills.
00:40:44.640 | So, you know, AI literacy is not a far-fetched idea.
00:40:49.640 | Is it all going to be coding?
00:40:52.120 | Perhaps for some people,
00:40:55.000 | this type of skills will be very relevant.
00:40:58.160 | Is it necessary for everyone?
00:41:00.240 | That's something I think the jury is out.
00:41:04.080 | I don't think that there is a clear conclusion.
00:41:08.000 | We've discussed this also with colleagues
00:41:10.560 | from around the world who are interested
00:41:12.400 | in trying to improve the educational outcomes.
00:41:16.680 | These are professional educators
00:41:19.520 | who are very interested in curriculum.
00:41:21.960 | They're interested in helping the children
00:41:25.040 | become more effective in the future.
00:41:28.360 | And I think as far as we are able to see,
00:41:32.040 | there is no real landing point yet.
00:41:33.880 | Does everyone need to learn coding?
00:41:35.760 | And I think even for some of the participants
00:41:38.320 | that raised today,
00:41:39.520 | they did not necessarily start with a technical background.
00:41:43.320 | Some of them came into it quite late.
00:41:46.320 | This is not to say that we are completely close to the idea.
00:41:50.240 | I think it is something that we will continue to investigate.
00:41:53.120 | And the good thing about Singapore is that
00:41:55.960 | if and when we come to the conclusion
00:41:58.360 | that that's something that has to become
00:42:00.200 | either third language for everyone
00:42:03.080 | or has to become as widespread as mathematics
00:42:07.680 | or some other skill set, digital skills,
00:42:10.840 | or rather reading skills,
00:42:12.400 | then maybe it's something that we have to think about
00:42:15.160 | introducing on a wider scale.
00:42:16.920 | - In July, we were in Singapore.
00:42:18.600 | We hosted the Sovereign AI Summit.
00:42:21.440 | We gave a presentation to a lot of the leaders
00:42:23.480 | from Temasek, GSE, EDVI,
00:42:25.880 | about some of the stuff we've seen in Silicon Valley
00:42:28.360 | and how different countries are building out AI.
00:42:30.920 | Singapore was 15% of NVIDIA's revenue in Q3 of 2024.
00:42:35.920 | So you have a big investment
00:42:37.560 | kind of like in sovereign data infrastructure
00:42:39.560 | and the power grid and all the build-outs there.
00:42:42.000 | Malaysia has been a very active space for that too.
00:42:45.400 | How do you think about the importance
00:42:47.360 | of owning the infrastructure
00:42:49.000 | and understanding where the models are run,
00:42:51.080 | both from the autonomous workforce perspective,
00:42:53.760 | as you enable people to use this,
00:42:55.080 | but also you mentioned the elections.
00:42:56.800 | If you have a model that is being used
00:42:58.160 | to generate election-related content,
00:43:00.080 | you wanna see where it runs,
00:43:01.840 | whether or not it's running in a safe environment.
00:43:03.760 | And obviously there's more on the more geopolitical side
00:43:07.280 | that we will not touch on,
00:43:08.440 | but why was that so important for Singapore to do so early,
00:43:12.200 | to make such a big investment?
00:43:14.120 | And how do you think about,
00:43:15.600 | especially the Saudi Sino-American,
00:43:17.640 | not bloc, but like coalition,
00:43:20.080 | was at an office in Singapore
00:43:21.280 | and you can see Indonesia from a window,
00:43:23.240 | you can see Malaysia from another window.
00:43:24.760 | So everything there is pretty interconnected.
00:43:27.840 | - Yeah, there seems to be a couple of strands
00:43:30.400 | in your question.
00:43:32.000 | There was a strand on digital infrastructure.
00:43:34.560 | And then I believe there was also a strand
00:43:37.000 | in terms of digital governance.
00:43:39.160 | How do you make sure that the environment
00:43:41.400 | continues to be supportive of innovation activities,
00:43:45.000 | but also that you manage the potential harms?
00:43:48.400 | I think there's a key term of sovereign AI as well,
00:43:50.640 | that's kind of going around.
00:43:51.640 | I don't know what level this is at.
00:43:52.480 | - What did you have in mind?
00:43:54.000 | - Yeah, especially as you think about
00:43:55.960 | deploying some of these technologies and using them,
00:43:58.720 | you could deploy them in any data center
00:44:01.280 | in the world in theory,
00:44:02.800 | but as they become a bigger part of your government,
00:44:05.440 | they become a bigger part of like the infrastructure
00:44:08.000 | that kind of like the country runs on,
00:44:09.880 | maybe bringing them closer to you is more important.
00:44:13.040 | You're one of the most advanced country in doing that.
00:44:15.120 | So I'm curious to hear kind of what that planning was,
00:44:17.840 | the decision was going into it.
00:44:19.080 | It's like, this is something important for us to do today
00:44:21.760 | versus kind of waiting later.
00:44:23.680 | And yeah, also we want to touch on the elections thing
00:44:27.040 | that you also mentioned,
00:44:28.200 | but that's kind of like a separate topic.
00:44:31.080 | - He's squeezing two questions in one.
00:44:32.760 | - Right, Alessio, a couple of years ago,
00:44:35.320 | we articulated for the government a cloud first strategy,
00:44:39.040 | which therefore means that we accept that there are benefits
00:44:42.880 | of putting some of our workloads on the cloud.
00:44:45.520 | For one thing, it means that you don't have
00:44:47.720 | all the capacity available to you
00:44:50.960 | on a dedicated basis all the time.
00:44:53.160 | The need for flexibility,
00:44:54.680 | we acknowledge the need to be able to expand more quickly
00:44:57.960 | when the workload needs increase.
00:45:00.800 | But when we say a cloud first strategy,
00:45:03.520 | it also means that there will be certain things
00:45:06.160 | that are perhaps not suitable to put on the cloud.
00:45:09.240 | And for those, you need to have a different set
00:45:12.440 | of infrastructure to support.
00:45:14.120 | So having a hybrid approach,
00:45:16.320 | where some of the workloads,
00:45:18.120 | even for government, can go to the cloud,
00:45:20.360 | and then some of the workloads have to remain on-prem.
00:45:23.040 | I think that is a question of the mix.
00:45:26.280 | To the extent that you are able to identify the systems
00:45:30.120 | that are suitable to go to the cloud,
00:45:32.360 | then the need to have the workloads run
00:45:35.640 | on your on-prem systems is more circumscribed as a result.
00:45:39.680 | And potentially you can devote better resources
00:45:43.800 | to safeguarding this smaller bucket,
00:45:47.280 | rather than to try and spread your resources
00:45:50.640 | to protecting the whole,
00:45:52.160 | because you are also relying on security architecture
00:45:56.040 | of cloud service providers.
00:45:58.200 | So this hybrid approach, I think,
00:45:59.920 | has defined how we think about government workloads.
00:46:04.680 | In some sense, how we will think about AI workloads
00:46:08.280 | is not going to be entirely different.
00:46:10.400 | This is looking at the question
00:46:12.760 | from the government standpoint.
00:46:14.320 | But more broadly, if you think about Singapore as a whole,
00:46:18.760 | equally, not all the AI workloads
00:46:20.800 | can be hosted in Singapore.
00:46:22.720 | The analogy I like to make sometimes is,
00:46:26.360 | if you think about manufacturing,
00:46:28.080 | some of the earlier activities
00:46:30.720 | that were carried out in Singapore,
00:46:32.560 | at some point in time, became not feasible to continue.
00:46:36.240 | And then they have to be redistributed elsewhere.
00:46:38.800 | You're always going to be part of this supply chain.
00:46:42.320 | There is a global supply chain,
00:46:44.000 | there is a regional supply chain.
00:46:46.440 | And if everyone occupies a point in that supply chain
00:46:50.240 | that is optimal for their own circumstances,
00:46:54.080 | that plays to their advantage,
00:46:56.040 | then in fact, the whole system gains.
00:46:58.120 | That's also how we will think of it.
00:46:59.920 | Not all the AI workloads,
00:47:01.520 | no matter how much we expand our data center capacity,
00:47:05.560 | will be possible to host.
00:47:07.440 | Now, the only way we can host all the AI workloads
00:47:09.640 | is if we are totally unambitious.
00:47:11.600 | There's so little AI workload
00:47:13.320 | that you can host everything in Singapore.
00:47:14.560 | That has to be the case, right?
00:47:15.800 | I mean, if there's more AI workloads,
00:47:17.080 | it has to be distributed elsewhere.
00:47:18.840 | Does all of it require the latency,
00:47:21.560 | the very tight latency margins that you can tolerate
00:47:24.560 | and absolutely have to have them in Singapore?
00:47:26.400 | Some of it actually can be distributed.
00:47:28.120 | Well, we'll have to see.
00:47:30.320 | But a reasonable guess would be
00:47:32.360 | that there is always going to be scope for redistribution.
00:47:35.280 | And in that sense,
00:47:36.560 | we look at the whole development in our region
00:47:38.920 | in a positive way.
00:47:40.160 | There is just more scope to be able to host these activities.
00:47:44.120 | For Southeast Asia?
00:47:45.040 | For Southeast Asia.
00:47:46.640 | Could be elsewhere in the world.
00:47:48.280 | And it's generally a helpful thing to happen.
00:47:51.640 | Keep in mind also that
00:47:53.240 | when you look at data center capacity in Singapore,
00:47:56.400 | relative to our GDP, relative to our population,
00:47:59.560 | it's already one of the most dense in the world.
00:48:01.760 | In that regard,
00:48:03.080 | that doesn't mean that we stop expanding the capacity.
00:48:06.080 | We are still trying to open up headroom.
00:48:08.280 | And that means greener data centers.
00:48:10.560 | And there are really two main ways
00:48:12.800 | of making the greener centers become a reality.
00:48:16.000 | One is you use less energy.
00:48:18.240 | One is you use greener energy.
00:48:19.840 | And we are pursuing activities on both fronts.
00:48:22.560 | I think one of the ideas in the Sovereign AI team
00:48:25.720 | is the government also becoming an intelligence provider.
00:48:29.520 | So if you think about the accounting work that you mentioned,
00:48:32.200 | some of these AI models can do some of that work.
00:48:34.480 | In the future,
00:48:35.320 | do you see the government kind of like being able to offer
00:48:38.720 | AI accountants as a service
00:48:40.640 | in the Singaporean infrastructure?
00:48:42.520 | I think that's one of the themes that are very new,
00:48:44.800 | but like, as you have,
00:48:45.880 | most countries have like shrunken population,
00:48:47.840 | like declining workforce.
00:48:49.520 | So there needs to be a way to close the gap
00:48:51.560 | for like productivity growth.
00:48:52.920 | And I think like governments owning
00:48:54.480 | some of these AI infrastructure for workloads
00:48:56.440 | and then re-offering it to local enterprises
00:48:59.520 | and small businesses will be one of the drivers
00:49:02.200 | of this kind of like gap closure.
00:49:03.960 | So yeah, I was just curious to get your thoughts,
00:49:06.320 | but it seems like you're already thinking about
00:49:08.640 | how to scale versus what to put outside of the country.
00:49:12.080 | - But we were, we were thinking about access for startups.
00:49:16.800 | We were concerned about access by the research community.
00:49:20.320 | So we did set aside,
00:49:22.280 | I think a reasonable budget in Singapore
00:49:24.360 | to make available compute capacity
00:49:27.960 | for these two groups in particular.
00:49:29.880 | What we are seeing is a lot of interest
00:49:32.640 | on the part of private providers.
00:49:34.840 | Some are hyperscalers,
00:49:37.040 | but they're not confined to hyperscalers.
00:49:38.800 | There are also data center operators
00:49:40.560 | that are offering to provide compute as a service.
00:49:45.480 | So they would be interested in linking up
00:49:48.080 | with entities that have the demand.
00:49:50.920 | We'll monitor the situation.
00:49:52.720 | In some sense, government ought to compliment
00:49:57.000 | what is available in the private sector.
00:49:59.160 | It's not always the case that government has to step in.
00:50:02.440 | So we'll look at where the needs are.
00:50:04.160 | - Yeah, you told me that this was a change
00:50:05.680 | in the way the government works
00:50:07.760 | in the private sector recently.
00:50:09.200 | - Certainly the idea that we were talking specifically
00:50:12.440 | about training.
00:50:13.320 | We said that with adult education in particular,
00:50:16.240 | it's very often the case that training intermediaries
00:50:19.360 | in the private sector are closer to the needs of industry.
00:50:22.720 | They're more familiar with what the employers want.
00:50:26.040 | The government should not assume
00:50:27.480 | that it needs to be the sole provider.
00:50:31.040 | So yes, our institutes of higher learning,
00:50:33.640 | meaning our polytechnics, our universities,
00:50:35.600 | they also run programs that are helpful to industry,
00:50:38.480 | but they're not the only ones.
00:50:40.400 | So it would have to depend on the situation,
00:50:43.320 | who is in a better position to fulfill those requirements.
00:50:48.160 | - Yeah, excellent.
00:50:49.000 | We do have to wrap up for your other events going on.
00:50:51.720 | There's a lot of programs that the Singapore government
00:50:54.040 | and GovTech in particular does to make use of AI
00:50:57.360 | within the government to serve citizens and for internal use.
00:51:00.160 | I'll show that in the show notes for readers and listeners.
00:51:03.160 | But I was wondering if you personally
00:51:04.400 | have a favorite AI use case that has inspired you
00:51:07.360 | or maybe affected your life or kids' life in some way.
00:51:12.000 | - That's a really good question.
00:51:13.760 | I would say I'm more proud of the fact
00:51:15.680 | that my colleagues are so enthusiastic.
00:51:18.200 | I'm not sure whether you've heard of it.
00:51:19.600 | Internally, we have something called AIBot.
00:51:21.680 | - Yes, your staff actually sent me like three times,
00:51:23.640 | like AIBot, AIBot, AIBot.
00:51:24.480 | - Oh, okay.
00:51:25.320 | - I was like, what is this AIBot?
00:51:26.400 | I've never heard of it,
00:51:27.240 | but apparently it's like the rag system
00:51:28.600 | for the Singapore government.
00:51:29.960 | - Yeah, what happens is that we're encouraging
00:51:32.880 | our colleagues to experiment
00:51:34.640 | and they have access to internal memos
00:51:39.640 | in each ministry or each agency that are treasure trove
00:51:44.240 | of how the agency has thought about a problem.
00:51:46.640 | So for example, if you're the Inland Revenue
00:51:49.280 | and somebody comes to you with an appeal for a tax case,
00:51:52.720 | well, it has been decided on before, many times over.
00:51:56.400 | But to a newer colleague, what is the decision to begin with?
00:51:59.560 | Now, they can input through a rag system,
00:52:03.480 | all the stuff that they have done in the past,
00:52:05.720 | and it can help the newer colleague
00:52:07.600 | figure out the answer much faster.
00:52:09.920 | Doesn't mean that there's no longer a pause
00:52:12.480 | to understand, okay, why is it done this way?
00:52:15.160 | To your point earlier, that the reasoning part of it
00:52:17.400 | also has to come to the fore.
00:52:19.040 | That's potentially one next step that we can take.
00:52:22.000 | But at least there are many bots
00:52:24.000 | that are being developed now
00:52:25.520 | that are helping lots of agencies.
00:52:27.960 | Could be the Inland Revenue, as I mentioned earlier.
00:52:30.200 | It could be the agency that looks after our social security
00:52:33.600 | that has a certain degree of complexity
00:52:36.320 | that if you simply did a search
00:52:38.600 | or if you relied on our previous assistant,
00:52:41.880 | it was an assistant that was not so smart,
00:52:45.200 | if I could put it that way.
00:52:46.160 | It gave a standard answer,
00:52:48.680 | and it wasn't able really to understand your question.
00:52:51.760 | And it was frustrating when after asking A,
00:52:55.600 | you say, "Okay, then how about B?
00:52:58.040 | And then how about C?"
00:52:59.080 | It wasn't able to then take you to the next level.
00:53:02.040 | It just kept spewing out the same answer.
00:53:04.960 | So I think with the AI bots that we've created,
00:53:07.760 | the ability to have a more intelligent answer
00:53:11.240 | to the question has improved a great deal.
00:53:14.040 | But it's still early days yet,
00:53:15.560 | but they represent the kind of advancements
00:53:17.960 | that we'd like to see our colleagues make more of.
00:53:21.040 | - Yeah, Jensen Huang calls this
00:53:22.040 | like preservation of institutional knowledge.
00:53:23.920 | It can actually transfer knowledge much easier.
00:53:26.320 | And I'm also very positive on the impact of this
00:53:28.440 | for an aging population.
00:53:29.520 | You know, we have one of the lowest birth rates in the world
00:53:31.480 | and, you know, making our systems,
00:53:32.960 | our government systems smarter for them,
00:53:34.920 | it is the most motivating thing as an engineer
00:53:36.720 | that I would work on.
00:53:38.080 | - Great.
00:53:38.920 | - Yeah, I'm very excited about that.
00:53:40.360 | Is there anything we should ask you, like open-ended?
00:53:43.840 | - Unless you had another question
00:53:45.240 | that we didn't really finish.
00:53:46.760 | - Oh, just, yeah, I think just the elections piece,
00:53:49.240 | we were super interested.
00:53:50.840 | Singapore's running for elections.
00:53:52.680 | How worried are you?
00:53:53.800 | How worried are you about AI?
00:53:55.880 | And, you know, it's a very topical thing for US as well.
00:53:58.720 | - Well, we have seen it show up elsewhere.
00:54:02.320 | It's not only in the US.
00:54:04.000 | There have been several other elections.
00:54:05.840 | I think in Slovakia, for example,
00:54:08.440 | you know, there was material,
00:54:12.040 | there was content that was put out
00:54:13.680 | that eventually turned out to be false.
00:54:15.800 | And it was very damaging
00:54:17.320 | to the person being portrayed in that content.
00:54:21.240 | So the way we think about it is that political discourse
00:54:26.240 | has to be built on the foundation of facts.
00:54:29.720 | It's very difficult to have honest discourse.
00:54:31.680 | You can be critical of each other.
00:54:33.160 | It doesn't mean that I have to agree with your opinions.
00:54:36.200 | Doesn't mean that only what you say
00:54:38.640 | or what somebody else says is acceptable.
00:54:41.640 | But the discourse has to be based on facts.
00:54:44.480 | So the troubling point about AI-generated content
00:54:48.240 | or other synthetic material
00:54:49.880 | is that it no longer contains facts.
00:54:53.640 | It's made up.
00:54:54.920 | So that in itself is problematic.
00:54:57.360 | So if a person is depicted in a realistic manner
00:55:02.120 | to be saying something that he did not say
00:55:04.360 | or to be doing something that he did not do,
00:55:06.880 | that's very confusing for people
00:55:08.800 | who want to participate in the discourse.
00:55:11.200 | In an election, it could also affect people favorably
00:55:15.840 | or in a prejudicial manner.
00:55:17.880 | And neither of it is right.
00:55:19.760 | So we have to take a decision
00:55:22.560 | that when it comes to an election,
00:55:24.520 | we have to decide on the basis of what actually happened,
00:55:28.160 | what was actually said.
00:55:30.040 | We may not like what was said,
00:55:31.760 | but that was what was actually said.
00:55:33.480 | You can't create something and override it, as it were.
00:55:37.720 | So that was where we were coming from.
00:55:39.200 | It is, in a way, a very specific set of requirements
00:55:43.440 | that we are putting in place,
00:55:45.760 | which is that in an election setting,
00:55:48.960 | we should only be shown saying what we actually said
00:55:53.960 | or doing what we actually did.
00:55:56.320 | And anything else would be an assault on factual accuracy.
00:56:01.320 | And that should not become a norm in our election.
00:56:06.680 | And people should be able to trust what was said
00:56:09.360 | and what they are seeing.
00:56:11.560 | So that's where it's coming from.
00:56:13.600 | - Thank you so much for your time.
00:56:14.440 | You've been extremely generous.
00:56:15.720 | To have a minister as a listener of our little thing,
00:56:17.560 | but hopefully it's useful to you as well.
00:56:19.000 | - Thank you.
00:56:19.840 | - If you're interested in anything, let us know.
00:56:21.240 | - I hope your AI engineer conference
00:56:23.520 | in Singapore is a great success.
00:56:25.080 | - Yeah, well, you can help us.
00:56:26.800 | - Let us know how we can support you.
00:56:27.640 | - Yes.
00:56:28.480 | - Okay.
00:56:29.320 | - Yeah, that's it.
00:56:30.160 | (upbeat music)
00:56:32.760 | (upbeat music)
00:56:35.340 | (upbeat music)
00:56:37.920 | (upbeat music)