The realization that continuous (numerical) methods are helpful, and sometimes crucial, in designing highly efficient algorithms for discrete optimization problems on sparse graphs.

As far as I'm aware, this was first introduced in http://arxiv.org/abs/0803.0988, which gave (among other things) an algorithm for minimum cost flow that runs in [math]\tilde{O}(m^{3/2})[/math] time (where m is the number of edges in the graph). Purely combinatorial algorithms for flows run into the "flow decomposition barrier" at [math]\Omega(n^2)[/math], and to date this remains the only minimum cost flow algorithm that gets below it.
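For concreteness, the continuous primitive at the heart of this line of work is the *electrical flow*: route current through the graph by solving a Laplacian linear system [math]Lv = b[/math] for vertex potentials. The papers above combine fast Laplacian solvers with iterative schemes; the toy NumPy sketch below (my own illustration, not any paper's algorithm) just shows the primitive itself on a small graph with unit resistances.

```python
import numpy as np

# A small undirected graph with unit-resistance edges (an assumed example).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]

# Build the graph Laplacian L = D - A.
L = np.zeros((n, n))
for u, w in edges:
    L[u, u] += 1.0
    L[w, w] += 1.0
    L[u, w] -= 1.0
    L[w, u] -= 1.0

# Demand vector: inject one unit of current at vertex 0, extract it at vertex 3.
b = np.zeros(n)
b[0], b[3] = 1.0, -1.0

# L is singular (constants are in its kernel), so use the pseudoinverse;
# the fast algorithms replace this O(n^3) step with a near-linear-time solver.
v = np.linalg.pinv(L) @ b

# The electrical flow on each edge is the potential difference across it.
flow = {(u, w): v[u] - v[w] for u, w in edges}

# Kirchhoff's law: net flow out of the source equals the injected current.
out_of_source = sum(f for (u, w), f in flow.items() if u == 0) \
              - sum(f for (u, w), f in flow.items() if w == 0)
print(round(out_of_source, 6))  # prints 1.0
```

The electrical flow minimizes energy rather than congestion, so it is not itself a max flow; the algorithms cited here repeatedly reweight resistances (or take interior-point steps) to steer these continuous solutions toward a combinatorial optimum.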

Fast forward five years: developments in this direction have led to algorithms for finding approximate maximum flows (and multicommodity flows) in undirected graphs in [math]O(m^{1 + \epsilon})[/math] time (http://arxiv.org/abs/1304.2338, http://arxiv.org/abs/1304.2077), and an algorithm for maximum bipartite matching that runs in [math]\tilde{O}(m^{10/7})[/math] time (http://arxiv.org/abs/1307.2205). The latter improves on the previous best running time for maximum bipartite matching, the [math]O(m^{3/2})[/math] bound dating back to 1971. At the moment, many in the area believe that within a few years almost all textbook graph optimization algorithms (with the exception of BFS/DFS and Dijkstra's algorithm) will have to be rewritten.

Other brilliant algorithmic developments that come to mind include approximations and applications of graph expansion, pipage rounding, streaming algorithms, and compressive sensing. However, I'd only make a fool of myself by trying to discuss these topics in depth.