Once upon a time, we did things ourselves. Then we made software and told it what to do. Now we have software with algorithms that learn what to do on their own. They still need input, but what they learn from that input is much more than what we've programmed into them. Before we even realize it, these programs know more than we do.
CGP Grey makes sense of all this for those of us who don't program computers, much less design software or build artificial intelligence algorithms. Still, making machines smarter and smarter raises more questions than it answers. First off, is that really a wise thing to do? -via Metafilter
No. No. No.
"Algorithm" simply means "process" : follow these steps and you'll get the results you want. It tells someone, or a computer, the steps to follow to accomplish a specific task.
Here's the algorithm for making a sandwich:
1. get your bread
2. get your filling
3. put your filling on top of one piece of bread
4. put the other piece of bread on top of that
That's an algorithm a person could follow. If you were writing it for a computer, you'd need to be more detailed, because computers are very stupid.
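To give a sense of what "more detailed" means, here's a rough sketch of those same four steps written as a small Python function. The function name, the ingredient values, and the error check are all made up for illustration; the only point is that a computer needs every step spelled out explicitly.

```python
def make_sandwich(bread_slices, filling):
    """Build a sandwich from a list of bread slices and a filling.

    Same four steps as above, just stated precisely enough for a machine.
    """
    # A person would notice they're short on bread; a computer has to be told.
    if len(bread_slices) < 2:
        raise ValueError("need at least two slices of bread")

    bottom = bread_slices[0]                 # step 1: get your bread
    top = bread_slices[1]
    sandwich = [bottom, filling, top]        # steps 2-4: filling goes between the slices
    return sandwich


if __name__ == "__main__":
    lunch = make_sandwich(["bread slice", "bread slice"], "cheese")
    print(lunch)  # ['bread slice', 'cheese', 'bread slice']
```

Nothing spooky is happening there: the steps run in order, every time, exactly as written.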
But the word has absolutely nothing to do with artificial intelligence or spooky thinking machines. Sure, you can come up with an algorithm that describes a process for doing some kind of AI stuff, but that's a specific algorithm, a specific process.