So a crazy question, but I work a lot with machine learning, with deep learning. I'm not sure if you touch that world much, but you could think of programming as the task of creating a program, where a program takes some input and produces some output.
So machine learning systems train on data in order to take in input and produce output. But they're messy, fuzzy things, much like we are as children growing up: we take some input, we produce some output, but we're noisy. We mess up a lot. We're definitely not reliable.
Biological systems are a giant mess. So there's a sense in which machine learning is a kind of programming, just fuzzy. It's very, very different from C++, because C++ is, just like you said, extremely reliable, it's efficient, and you can measure it, you can test it in a bunch of different ways.
With biological systems or machine learning systems, you can't say much except, sort of empirically, that 99.8% of the time it seems to work. What do you think about this fuzzy kind of programming? Do you even see it as programming? Is it a totally different kind of world? I think it's a different kind of world, and it is fuzzy.
And in my domain, I don't like fuzziness. That is, people say things like they want everybody to be able to program, but I don't want everybody to program my airplane controls or the car controls. I want that to be done by engineers, by people who are specifically educated and trained for building things, and it is not for everybody.
Similarly, a language like C++ is not for everybody. It is designed to be a sharp and effective tool for professionals, basically, and definitely for people who aim at some kind of precision. You don't have people doing calculations without understanding math, right? Counting on your fingers is not going to cut it if you want to fly to the moon.
And so there are areas where an 84% accuracy rate, a 16% error rate, is perfectly acceptable, and where people will probably get no more than 70%. You said 99.8%. What I have seen is more like 84, and with really a lot of blood, sweat, and tears, you can get up to 92 and a half.
So this is fine if it is, say, pre-screening stuff before a human looks at it. It is not good enough for life-threatening situations. And so there are lots of areas where the fuzziness is perfectly acceptable, and good, and better than humans, cheaper than humans, but it's not the kind of engineering stuff I'm mostly interested in.
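A minimal C++ sketch of that kind of claim, under stated assumptions: the fuzzy_classifier below is an illustrative stand-in for a trained model (not anyone's real system), the only statement we can make about it is its measured accuracy on labeled examples, and different uses accept different accuracy levels; the thresholds are made up for illustration.

```cpp
// Minimal sketch, assuming a stand-in learned component (fuzzy_classifier).
// The point: its behavior is only characterized empirically, by measured
// accuracy, and different applications accept different accuracy levels.
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

// Stand-in for a trained model: an opaque mapping from features to a label.
// (A dummy rule here; in reality this would run a trained network.)
bool fuzzy_classifier(const std::vector<double>& features) {
    double sum = 0;
    for (double f : features) sum += f;
    return sum > 0;
}

// The only guarantee we can state is empirical: the fraction of labeled
// examples the classifier gets right.
double measured_accuracy(
    const std::vector<std::pair<std::vector<double>, bool>>& labeled) {
    if (labeled.empty()) return 0.0;
    std::size_t correct = 0;
    for (const auto& example : labeled)
        if (fuzzy_classifier(example.first) == example.second) ++correct;
    return static_cast<double>(correct) / labeled.size();
}

int main() {
    std::vector<std::pair<std::vector<double>, bool>> test = {
        {{1.0, 2.0}, true}, {{-3.0, 1.0}, false}, {{0.5, -1.0}, true}};
    double acc = measured_accuracy(test);
    std::cout << "measured accuracy: " << acc << '\n';
    // Different uses tolerate different error rates (illustrative thresholds).
    std::cout << "ok for pre-screening: " << (acc >= 0.84) << '\n';
    std::cout << "ok for safety-critical control: " << (acc >= 0.999999) << '\n';
}
```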
I worry a bit about machine learning in the context of cars. You know much more about this than I do. I worry too. But I'm sort of an amateur here. I've read some of the papers, but I've never done it. And the idea that scares me the most is one I have heard, and I don't know how common it is: you have this AI system, machine learning, all of these trained neural nets, and when there's something that's too complicated, they ask the human for help.
But the human is reading a book or asleep, and he has 30 seconds or three seconds to figure out what the problem was that the AI system couldn't handle and do the right thing. This is scary. I mean, how do you do the cutover between the machine and the human?
It's very, very difficult. And for the designer of one of the most reliable, efficient, and powerful programming languages, C++, I can understand why that world is actually unappealing. It is for most engineers. To me, it's extremely appealing because we don't know how to get that interaction right, but I think it's possible.
But it's very, very hard. It is. And I was stating a problem, not a solution; that is impossible. I would much rather never rely on a human. If you're running a nuclear reactor or driving an autonomous vehicle, it's much better to design systems written in C++ that never ask a human for help.
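As a hedged sketch of how that handover problem is often framed, and not anything proposed in the conversation: the automation issues a takeover request with a deadline, and if the human does not confirm in time it falls back to a minimal-risk maneuver instead of relying on the human. Every name here, and the three-second deadline, is an illustrative assumption, not a real autonomous-driving API.

```cpp
// Minimal sketch of a takeover request with a deadline and a safe fallback.
// All types and names are illustrative assumptions.
#include <chrono>
#include <iostream>

enum class Mode { Autonomous, HumanControl, MinimalRiskStop };

struct TakeoverRequest {
    std::chrono::steady_clock::time_point issued;
    std::chrono::seconds deadline;
};

// Decide what to do once the automation has hit a situation it cannot handle.
Mode resolve_takeover(const TakeoverRequest& req, bool human_confirmed,
                      std::chrono::steady_clock::time_point now) {
    if (human_confirmed)
        return Mode::HumanControl;        // driver took over in time
    if (now - req.issued > req.deadline)
        return Mode::MinimalRiskStop;     // deadline passed: don't rely on the human
    return Mode::Autonomous;              // keep asking while the deadline runs
}

int main() {
    using namespace std::chrono;
    TakeoverRequest req{steady_clock::now(), seconds(3)};
    // Simulate a driver who never responds: after the deadline the system
    // must degrade to a safe state on its own.
    Mode m = resolve_takeover(req, /*human_confirmed=*/false,
                              steady_clock::now() + seconds(5));
    std::cout << (m == Mode::MinimalRiskStop ? "minimal-risk stop\n" : "other\n");
}
```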
Let's just get one fact in: all of this AI stuff sits on top of C++. So that's one reason I have to keep a weather eye on what's going on in that field, but I will never become an expert in that area. But it's a good example of how you separate different areas of application, and you have to have different tools, different principles, and they interact.
No major system today is written in one language, and there are good reasons for that.