- Bubble sort, or in-place insertion sort (or any O(n^2) sort).
- Quick sort or merge sort (or any O(n log n) sort).
- Building and searching a balanced binary tree (or any tree construction and traversal methods).
- Building and using a hash table.
- Finding an optimal hash function for a static hash table.
- Approximate optimization (or root finding) by bisection (binary search).
- Dijkstra’s Algorithm (or any other graph search algorithm).
- LU decomposition (any method).
- Fisher-Yates Shuffle (and/or other set- and permutation-related algorithms).
- Minimum Spanning Trees (e.g. Prim’s Algorithm).
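
As one concrete example from the list above, here is a minimal sketch of the Fisher-Yates shuffle in Python (a sketch for illustration, not a library-grade implementation — in practice you would just call `random.shuffle`):

```python
import random

def fisher_yates_shuffle(items):
    """Return a uniformly random permutation of items in O(n) time."""
    a = list(items)
    # Walk backwards; swap each slot with a random slot at or before it.
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)  # inclusive on both ends
        a[i], a[j] = a[j], a[i]
    return a
```

The key point is that picking `j` uniformly from the *untouched* prefix is what makes every permutation equally likely; the naive "swap each element with any random element" variant is subtly biased.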

I can only think of these, but some of them have several variants. Anyone want to suggest more? I’m willing to edit.

Update: Thanks for all of the constructive comments! Perhaps I should have added a paragraph at the beginning to say: I don’t think they should necessarily know these algorithms by heart, but they should know that these algorithms exist, know how to look them up and implement them, and know how these algorithms can affect the run-time complexity of their code.

Most of the time in modern programming, we use code from libraries, and the software engineer can simply choose among several alternatives. For example, the C++ STL provides many container classes, and the choice of container can have a big impact on code performance. There are hundreds, if not thousands, of libraries available to the modern software engineer, and knowledge of the underlying algorithms can help in making a better choice. In any case, even the code for these well-solved problems needs to be written by *someone*, even if that someone is just writing a library to be used by others.
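
To illustrate how container choice affects performance (using Python’s built-in containers rather than the STL, for brevity): the very same membership test is a linear scan against a list but an average-constant-time hash lookup against a set.

```python
# The same lookups against two containers holding the same data.
haystack_list = list(range(10_000))   # 'in' scans linearly: O(n) per lookup
haystack_set = set(haystack_list)     # 'in' hashes: O(1) average per lookup
needles = range(0, 10_000, 97)

# Both produce identical answers; only the running time differs,
# and for large data that difference dominates everything else.
hits_list = sum(1 for n in needles if n in haystack_list)
hits_set = sum(1 for n in needles if n in haystack_set)
assert hits_list == hits_set
```

Nothing about the calling code changes; knowing the algorithm behind each container is what tells you which one to reach for.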

Another list suggestion:

Algorithms are not something that most programmers can beneficially memorize. I’ll list 10 ideas/concepts/techniques instead.

- Name things after what they are. Variables, methods, functions, modules—the best place to document something is its name.
- Make your code short. Fewer tokens and less complexity mean less room for bugs.
- Put related things textually close.
- Hash tables. Use whenever appropriate.
- Modularization. Break up software into understandable and testable modules.
- Caching. All computer science is an exercise in caching.
- Optimize what matters: in a CPU- or memory-bound program, only a tiny part of the code determines the performance.
- Use C for I/O-bound programs, including networking code, and for very little else.
- Never get married to a language, a framework, a library, or a tool. Don’t be a *<tool name>* programmer; instead, use what is appropriate.
- Optimizing the algorithm matters more than micro-optimizing some inner loop.
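
A tiny illustration of the caching point above, sketched in Python using the standard library’s memoization decorator: the naive recursive Fibonacci is exponential, but caching each result makes it linear.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Without the cache this recursion is exponential; with it,
    # each value of n is computed exactly once.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(90)  # completes instantly thanks to memoization
```

The same idea scales all the way up: CPU caches, page caches, CDNs, and memoized function results are all the same trade of memory for time.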

If I were to list 10 specific algorithms as examples everyone should be familiar with, I’d probably go with these:

- Binary search
- Quicksort
- Backtracking
- Hash table
- Linear congruential generator and Mersenne Twister
- Linear least squares
- Karatsuba algorithm
- Divide and conquer for matrix multiplication
- Dijkstra’s algorithm
- Knuth–Morris–Pratt algorithm
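
As an example of the ideas being more important than the memorization, here is a minimal binary search sketch in Python (a sketch under the usual assumption that the input is already sorted):

```python
def binary_search(sorted_items, target):
    """Return an index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2  # halve the remaining range each step
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

The idea (halving the search space) matters far more than the exact code: off-by-one mistakes in the loop bounds are famously easy to make, which is exactly why understanding the invariant beats memorizing a listing.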

Memorizing them is pointless, but understanding the ideas is very useful.