One of the geekier conversations I used to get drawn into (up there along with arguing about favorite Serenity episodes, lamenting the demise of Omni magazine, and coming to blows over D&D rules interpretations) was over the relative merits of various sorting algorithms. Flowing Data links to some interesting visualizations of sorting algorithms. This one is for quicksort (colors start out random on the left and must be sorted into Roy G Biv order).
In college I did a project on solving traveling-salesman-type problems with an algorithm called simulated annealing. Many approaches to the traveling salesman problem pick a random path, randomly swap pairs of stops on the path, and then stick with whichever alternative gives the better score (in this case the shorter path). Rinse, repeat a zillion times. The problem is that this approach can get stuck in, or converge on, a local optimum that is not as good as the single grand optimum for the problem.
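That swap-and-keep-the-better-path loop can be sketched in a few lines of Python. This is a minimal illustration, not any particular project's code; the city coordinates and parameter values are made up for the example.

```python
import math
import random

def tour_length(tour, pts):
    # Total length of the closed tour visiting pts in the given order.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def hill_climb(pts, iters=20000, seed=0):
    # Start from a random tour; repeatedly swap a random pair of stops
    # and keep the swap only if it shortens the tour.
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    rng.shuffle(tour)
    best = tour_length(tour, pts)
    for _ in range(iters):
        i, j = rng.sample(range(len(pts)), 2)
        tour[i], tour[j] = tour[j], tour[i]
        length = tour_length(tour, pts)
        if length < best:
            best = length          # keep the improvement
        else:
            tour[i], tour[j] = tour[j], tour[i]  # revert the swap
    return tour, best
```

Because it only ever accepts improvements, this loop can settle into exactly the kind of local optimum described above: once no single pair swap makes the tour shorter, it is stuck, even if a better tour exists a few swaps away.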
In simulated annealing, the algorithm is sometimes allowed to jump to a worse (i.e. longer) path, which lets it escape local minima. The size of the backwards jump that is allowed is slowly reduced as the algorithm runs. It is called simulated annealing because this mimics the annealing process in metals, where a metal is heated and then cooled slowly: the initial high temperature lets the metal molecules jump to higher energy states, so that the whole can settle into a more homogeneous structure.
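The standard way to implement that shrinking tolerance for backwards jumps is a "temperature" parameter: a worse path is accepted with probability exp(-delta / t), and t is gradually lowered. Here is a minimal sketch along those lines; the starting temperature, cooling rate, and test cities are illustrative assumptions, not values from the original project.

```python
import math
import random

def tour_length(tour, pts):
    # Total length of the closed tour visiting pts in the given order.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(pts, t_start=2.0, t_end=0.01, cooling=0.999, seed=0):
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    rng.shuffle(tour)
    length = tour_length(tour, pts)
    t = t_start
    while t > t_end:
        i, j = rng.sample(range(len(pts)), 2)
        tour[i], tour[j] = tour[j], tour[i]
        new_length = tour_length(tour, pts)
        delta = new_length - length
        # Always accept improvements; accept a worse (longer) tour with
        # probability exp(-delta / t), which shrinks as t cools toward zero.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            length = new_length
        else:
            tour[i], tour[j] = tour[j], tour[i]  # revert the swap
        t *= cooling  # cool the temperature a little each step
    return tour, length
```

Early on, when t is large, even long backwards jumps are likely to be accepted, so the search can climb out of local minima; by the end, when t is tiny, the loop behaves like plain hill climbing and only improvements survive.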
Anyway, we tried to show how the algorithm proceeded to a solution over time and the visualizations looked a little like this.