
Top Ten Challenges to Reach AGI — Stephen Chin, Andreas Kollegger



00:00:00.000 | All right.
00:00:19.560 | Hey, so great to see everyone here at AI Engineer World's Fair.
00:00:24.140 | Andreas and I have the honor of curating the GraphRag track,
00:00:27.640 | which is happening here.
00:00:28.640 | And I thought the jokes Simon had about bugs were --
00:00:32.100 | Spot on.
00:00:33.100 | Spot on, hilarious.
00:00:34.480 | And that's the reason why we care so much about getting really good data,
00:00:38.740 | like building a solid foundation, and good grounding for models.
00:00:42.480 | And we're going to chat a bit because I think we have a social responsibility.
00:00:48.240 | We're getting so close to AGI as an industry.
00:00:53.540 | We have a social responsibility to kind of see what the boundaries
00:00:57.280 | and the limits of this are.
00:00:59.740 | And as proper computer scientists, the answer is, always look at science fiction for the answer.
00:01:07.380 | Look to the past to see the future.
00:01:08.740 | Exactly.
00:01:09.740 | Okay.
00:01:10.740 | So, play along with us.
00:01:11.740 | What we're going to do is we'll each riff on a sci-fi meme.
00:01:15.740 | Give a big round of applause either if you think it's true or funny,
00:01:20.200 | or if you just like the movie.
00:01:22.200 | All right.
00:01:23.200 | You're up first, ABK.
00:01:24.200 | Okay.
00:01:25.200 | Starting off with Memento.
00:01:25.840 | In Memento, the main character has really bad short-term memory.
00:01:29.200 | He has a specific disease, so he can't remember what happened 15 minutes ago.
00:01:34.200 | This is the essence of prompt engineering.
00:01:37.660 | All right.
00:01:38.660 | Round of applause.
00:01:40.660 | Okay.
00:01:41.660 | Okay.
00:01:42.660 | Okay.
00:01:43.660 | All right.
00:01:44.660 | Skynet.
00:01:45.660 | The mandatory fear-mongering.
00:01:47.660 | Even without evil intent, autonomous systems can make reasonable-seeming decisions that have awful unforeseen
00:01:55.660 | consequences.
00:01:58.660 | Okay.
00:01:59.660 | That's a little better.
00:02:00.660 | All right.
00:02:01.660 | Your turn.
00:02:02.660 | Okay.
00:02:03.660 | The Matrix, of course, for Neo4j.
00:02:05.660 | We love it.
00:02:06.660 | And for now, agents live in a simulation that we're creating for them.
00:02:10.940 | Will we notice when they flip the script and we're living in their simulation?
00:02:18.220 | A little bit.
00:02:19.220 | I think that's the winner so far.
00:02:20.220 | I know.
00:02:21.220 | That's close.
00:02:22.220 | Off to you.
00:02:23.220 | All right.
00:02:24.220 | HAL warned us about trust issues, lack of transparency, misaligned goals, the erosion of human oversight,
00:02:30.340 | and the potential for deception.
00:02:32.600 | Okay.
00:02:35.600 | This is very short.
00:02:40.100 | Are emotions a bug or a feature?
00:02:44.600 | My personal favorite.
00:02:45.600 | I love this one.
00:02:46.600 | Okay.
00:02:47.600 | Okay.
00:02:48.600 | So we got a little monster reference here.
00:02:51.600 | What are the obligations and social responsibilities of the creator?
00:02:56.600 | Should we be kind or threatening?
00:02:59.600 | Costing tokens.
00:03:01.600 | All right.
00:03:02.600 | We'll take that as a flat.
00:03:04.600 | Okay.
00:03:05.600 | Your turn.
00:03:06.600 | I'm the Terminator.
00:03:09.600 | Should we go ahead and just invent time travel now?
00:03:13.600 | All right.
00:03:14.600 | We got a big thumbs up on time travel.
00:03:17.600 | Except time travel.
00:03:18.600 | Okay.
00:03:20.600 | Okay.
00:03:21.600 | A good Star Wars one.
00:03:22.600 | Can AGI truly grasp the nuances of human language and culture, or will it forever misunderstand the meaning of sarcasm and idioms?
00:03:33.600 | And amazing jokes.
00:03:35.600 | Okay.
00:03:36.600 | When AGI arrives and we finally have a globe-spanning multi-agent system with a hive mind, will we be assimilated or will we be pets?
00:03:47.600 | Okay.
00:03:48.600 | Last one.
00:03:49.600 | Just like Deep Thought's famous answer, we might have the tools to build AGI, but do we even know what the right questions are?
00:03:58.600 | All right.
00:04:01.600 | So that one was good as well.
00:04:03.600 | All right.
00:04:04.600 | So come by the GraphRag track.
00:04:05.600 | We're going to reveal which of these ten memes are solved by graphs and graph technology.
00:04:12.600 | And join us in Golden Gate B. Thank you very much.
00:04:15.600 | Thank you, everyone.
00:04:16.600 | Thank you.