
8 New Ways to Use Bing's Upgraded 8 [now 20] Message Limit (ft. pdfs, quizzes, tables, scenarios...)


Chapters

0:00 Intro
0:37 AUTO QUIZ Education Hacking
2:20 COUNTERFACTUALS What if/What Might Have Been
3:02 PDF ANALYSIS Compare, Analyse, Summarise
4:07 MOMENTS BEFORE DISASTER Escape Room Roleplay for Disasters
5:39 DEBATE PHILOSOPHERS Argue with Famous Figures
6:47 TABLES OF COMPARISON The more disparate the better
8:38 CURRENT EVENT COMMENTARY Napoleon Weighs In
9:35 EMOJI EXPRESS Almost Useless but Funny

Whisper Transcript | Transcript Only Page

00:00:00.000 | Bing Chat from Microsoft has just raised the limit on messages per conversation to eight.
00:00:06.760 | Down from unlimited at launch, fair enough, but up from six yesterday. In fact, I actually saw
00:00:12.460 | this change live on my phone last night. But far better than telling you the change,
00:00:16.820 | I want to demonstrate eight completely new ways of using these upgraded conversation limits.
00:00:23.720 | I honestly think you might find each and every one amazing. And if you don't,
00:00:29.000 | you get your money back. I'm just kidding. You get all of this free.
00:00:31.800 | Let's start with an educational hack with proven, peer-reviewed impact. It's practice questions.
00:00:39.580 | Turn anything into a free quiz with Bing Chat. And as an educator with 13 years' experience,
00:00:46.240 | trust me, this does work. The literature bears it out. To give myself and you all a workout,
00:00:52.520 | I've started a multiple choice quiz on transformers. This is the prompt I used:
00:00:58.220 | create a multiple choice quiz on transformers. Give me answers and
00:01:01.940 | explanations and provide another question after each answer. Please begin with the first question.
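If you'd rather script this quiz loop than type each turn into Bing by hand, here is a minimal sketch of the same prompt pattern. It uses the OpenAI Python client purely as a stand-in, since Bing Chat has no public API; the model name, turn budget, and exact wording are illustrative assumptions rather than anything shown in the video.

```python
# Minimal sketch of the quiz-loop prompt pattern. The OpenAI Python client is
# a stand-in here (Bing Chat has no public API); the model name and the
# eight-turn budget are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [{
    "role": "user",
    "content": (
        "Create a multiple choice quiz on transformers. Give me answers and "
        "explanations and provide another question after each answer. "
        "Please begin with the first question."
    ),
}]

for turn in range(8):  # mirror the eight-message conversation limit
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    question = reply.choices[0].message.content
    print(f"\n--- Turn {turn + 1} of 8 ---\n{question}")

    answer = input("Your answer: ")
    # Keep the full history so the model remembers earlier questions and answers.
    messages.append({"role": "assistant", "content": question})
    messages.append({"role": "user", "content": answer})
```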
00:01:07.320 | I've noticed sometimes that if you don't say provide another question after each answer,
00:01:11.780 | it just ends by saying, do you want another question? And obviously that uses up one of
00:01:16.000 | your eight turns. Look at the limit here. One of eight, not of six. So I tried this first one
00:01:21.260 | myself and it was asking, what are transformers? And I picked A. I was happy with that one.
00:01:25.880 | I am only going to do a couple of questions because I can't wait to get
00:01:28.960 | to the rest of my list, but let's continue for a second. Do you know the answer to this one?
00:01:33.760 | What is the name of the pre-trained transformer model that achieved state-of-the-art results on
00:01:37.920 | several NLP tasks in 2018? Now, honestly, I think it's BERT, but I know GPT-1 came out in 2018,
00:01:45.300 | but the original BERT was way more famous. So I think it's going to be A. Notice how I have
00:01:50.020 | the choices on the right here. No option D; it's only three options. Anyway, I'm going to go with A.
00:01:55.240 | Let me know what you would pick. Let's see.
00:01:58.040 | B and suspense building. Nice. Got it. Here is the third question. What is the main difference
00:02:05.980 | between encoder-only and encoder-decoder transformers? I know we're just getting into
00:02:11.820 | the quiz, but I'm going to leave this one for you to try out in the comments or to ask Bing yourself
00:02:15.960 | because I want to get to number two in the list. I'm going to call this one counterfactuals. You
00:02:21.300 | can actually ask Bing to generate what-if scenarios for your favorite show, book, movie, whatever.
00:02:27.700 | I tried this one. I asked, explain how Sauron could have defeated the Fellowship in Lord of the Rings.
00:02:33.340 | And these answers are amazing. I don't want to ruffle any feathers, but some of these I agree
00:02:39.180 | with. Why didn't he send more Nazgul to hunt down Frodo? Why not use the Palantir to spy on Gandalf
00:02:45.720 | and Aragorn? Now imagine this for your favorite book, your favorite TV show or movie. You can
00:02:50.980 | enter the plot. Ask counterfactuals. Ask what-if questions. What might have been? Bing will
00:02:57.240 | understand exactly what you mean. And now you can have eight back and forths. For this next use
00:03:02.620 | case, I used balance mode. I was finishing off my research into GPT-5 and the limitations on data.
00:03:09.380 | So I asked Bing, summarize any novel insights that can be drawn from combining these academic
00:03:15.640 | papers. I didn't paste them in. I just gave the link. Bing can read, understand, analyze, and
00:03:23.120 | summarize PDFs. And not just one: it can combine the
00:03:27.220 | insights of multiple documents, papers, and PDFs. I've read both of these papers, and this answer
00:03:33.600 | is quite excellent. Maybe we will be able to use self-play of large language models and improve
00:03:40.520 | the amount of data that we've got available. Maybe language models don't just need human-authored
00:03:45.180 | data sets. And now that I have this eight-message limit, I can ask things like, any further insights
00:03:50.820 | you can draw from the implications of these papers? And we get this dialogue. Maybe it can
00:03:55.880 | recommend a book on this topic or give me the bullet points from that book. Of course, I'm
00:04:00.420 | focusing on some of the fun use cases, but I think this one in particular could be game-changing for
00:04:05.500 | work purposes. I call this next use case moments before disaster. What you can do is pick any
00:04:11.360 | historical tragedy or big moment in history, and then place yourself just before that occurrence.
00:04:18.140 | I'm visiting Lisbon, which had a massive earthquake in 1755 at 9am. So I placed myself at 8am
00:04:25.640 | on that day. I said, I'm standing on the banks of the beautiful river Tagus in Lisbon. It seems like
00:04:31.640 | a lovely day. Do you have any advice for me? The response was actually incredible. I'm glad you're
00:04:37.120 | enjoying the view. However, I have some bad news for you. No kidding. It gets into character and
00:04:41.940 | then says, the tremor occurred about 9.40 local time. So you have very little time left to escape.
00:04:48.360 | My advice for you is to get away from the river. That's realistic. Find a high and open place where
00:04:53.800 | you can take shelter from falling debris and rising water. The final sentence is quite interesting
00:04:58.820 | from a language model. Pray for your safety and for those who are suffering. Try to fully immerse
00:05:04.280 | yourself. For example, I said, really? To what high ground can I flee? Will any building survive
00:05:09.620 | this tragedy to which I can escape? I am in sandals. Bing listened and named a specific hill that I
00:05:17.040 | might flee to. It gave me maps and said, however, I cannot guarantee that any building will survive
00:05:23.380 | the tragedy. And you may have to run barefoot if your sandals are not suitable for running. I'm
00:05:29.300 | sorry for your situation. I think the pictures really help to bring this scenario to life as
00:05:34.540 | well. So do try out this use case for yourself moments before disaster. The next new thing you
00:05:39.800 | can try is debating famous thinkers or philosophers from history. You can bring them to life so they're
00:05:45.840 | not just stale theories but living entities that you're arguing with. I wanted to try this with
00:05:51.260 | Socrates. So I said, I want to debate
00:05:53.360 | the philosopher Socrates of Athens. So please reply only as he would. I want to debate the
00:05:58.680 | merits of eating meat. Let me start the discussion. What I was testing here was whether Bing would
00:06:03.920 | enter a Socratic dialogue, the method Socrates used to force the person to define their
00:06:10.720 | terms. What do you mean by this? He would ask questions until he got to the root of a misunderstanding.
00:06:15.900 | Would Bing really get into the head of Socrates and give me a worthy debating partner? Well,
00:06:22.220 | it did.
00:06:23.080 | It asked me to define what I meant by morally wrong and unnecessary suffering. It doesn't act
00:06:29.440 | as a generic thinker or philosopher. It is picking Socrates. When I tried defining what I meant by
00:06:35.920 | morally wrong, Bing continues with further clarifying questions, just like Socrates might.
00:06:40.620 | This can be a far more fleshed out experience now that the conversation limit is eight.
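To recreate this kind of persona debate outside of Bing, the usual trick is a system message that pins the character in place for every turn. A rough sketch follows, again assuming the OpenAI Python client as a stand-in; the persona wording and model name are my own illustrative choices, not the exact prompt from the video.

```python
# Sketch of the persona-debate pattern: a system message keeps the model in
# character (here, Socrates) across every turn. Client, model name and the
# persona wording are illustrative assumptions, not Bing's actual behaviour.
from openai import OpenAI

client = OpenAI()

messages = [
    {
        "role": "system",
        "content": (
            "You are the philosopher Socrates of Athens. Reply only as he would: "
            "answer with clarifying questions, press your opponent to define "
            "their terms, and never break character."
        ),
    },
    {
        "role": "user",
        "content": (
            "I want to debate the merits of eating meat. I say eating meat is "
            "morally wrong because it causes unnecessary suffering."
        ),
    },
]

for _ in range(8):  # one exchange per available turn
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    answer = reply.choices[0].message.content
    print("\nSocrates:", answer)
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user", "content": input("\nYou: ")})
```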
00:06:46.040 | I think you're going to find the next use case quite amusing. It's about generating a table
00:06:50.700 | of comparisons. And it could be
00:06:52.840 | on wildly disparate objects, subjects, people, whatever. This is just the first one I thought
00:06:59.140 | of. And you can come up with far better examples in the comments. I said, create a table of 10
00:07:04.860 | comparisons between the Mona Lisa and Colgate toothpaste. And then, now that we
00:07:10.460 | have this eight-message limit, we can expand this table indefinitely. I love, by the
00:07:16.760 | way, how it compares the price of the two objects. Talks about how the Mona Lisa has been restored
00:07:21.740 | and how Colgate has been rebranded. Makes a comparison about what it's believed to be a portrait of
00:07:27.580 | and believed to be named after. But then I asked it for contrasts. The contrasts were great. Talking
00:07:34.540 | about how they had different purposes, were made of different materials, etc. Apparently one is
00:07:39.700 | priceless and one is affordable. But the last column I thought of, and of course I could have
00:07:44.300 | carried this on for several more columns, was now add a column for how they would fare if confronted
00:07:51.700 | with a polar bear.
00:07:52.180 | I actually laughed at this one. This wasn't just a chuckle. I was laughing. So in a polar bear encounter,
00:07:59.420 | apparently the painting would be ignored by the bear as it's not edible or interesting. The toothpaste,
00:08:05.560 | on the other hand, would be sniffed by the polar bear as it has a strong scent, but it would not be
00:08:10.780 | eaten as it's not nutritious. The icon status apparently of both items would be irrelevant to
00:08:16.440 | the polar bear as it does not care about human culture or history. And this one, come on, this is
00:08:21.760 | funny.
00:08:22.120 | The hidden symbols and secrets of the painting would be meaningless to the polar bear as it does
00:08:26.700 | not understand human language or symbolism. It even clarifies that the active ingredients of
00:08:31.320 | the toothpaste would be harmful to the digestive system of the polar bear. Anyway, it is long
00:08:36.140 | since time that I get to the seventh item on the list. And this use case is to get famous figures
00:08:41.880 | from history to commentate on current events. For example, I asked, what would Napoleon think
00:08:48.400 | about the deal between OpenAI and Microsoft? And how would that view
00:08:52.100 | differ from the view of Mahatma Gandhi? Obviously, you can pick any event and any famous person.
00:08:57.840 | The results were actually quite informative. It summarized their views and then said Napoleon
00:09:03.560 | would have agreed with the deal. He would have thought it was a strategic move to dominate the
00:09:07.840 | global market and influence other nations. On the other hand, Gandhi apparently would have had a
00:09:12.640 | negative attitude towards the deal, seeing it as a threat to human dignity and freedom through
00:09:17.520 | artificial intelligence. I don't know what you think, but now we know what these figures think.
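If you want to reuse this "famous figures weigh in" pattern for other events and people, a small template saves retyping. This is just a hypothetical helper; the wording is mine, and you can paste its output straight into Bing Chat or any other chat model.

```python
# A reusable template for the "famous figures weigh in" pattern. The figures,
# event and extra instructions are examples; swap in whatever you like.
def commentary_prompt(event: str, figure_a: str, figure_b: str) -> str:
    return (
        f"What would {figure_a} think about {event}, and how would that view "
        f"differ from the view of {figure_b}? Summarise each perspective in a "
        f"short paragraph and stay true to their known beliefs."
    )

print(commentary_prompt(
    event="the deal between OpenAI and Microsoft",
    figure_a="Napoleon Bonaparte",
    figure_b="Mahatma Gandhi",
))
```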
00:09:22.080 | Of course, I could have used seven more turns to find out what they think on a range of current
00:09:27.460 | events. But now it's time for the most overwhelmingly important use case. No, I'm kind of just kidding.
00:09:32.760 | It's absolutely useless. It's kind of interesting. You can use emojis to summarize things. For
00:09:38.620 | example, current events, movies, shows, whatever. Summarize the movie John Wick 3 in emojis.
00:09:43.760 | And these emojis are pretty accurate to the plot. In Britain, Brexit is always in the news. So I
00:09:52.060 | think some of these emojis sum up how it's actually going. Anyway, if you like the new
00:09:57.900 | eight message limit, let me know in the comments. And if you enjoyed or learned from any of these
00:10:03.020 | eight use cases, do let me know. Have a wonderful day.