TensorFlow 1.0 came out in February. Keras is now part of TensorFlow. You can build and train a convolutional neural network in an afternoon without understanding most of what's happening underneath. The results are good.
This is not a complaint. This is the most important thing that happened in machine learning this year.
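To make the "afternoon" claim concrete, here is a sketch of how little code a working convolutional network takes with the Keras API that now ships inside TensorFlow. The architecture and hyperparameters are illustrative, and synthetic data stands in for a real dataset:

```python
# A small CNN with tf.keras. Layer sizes are illustrative, and random
# arrays stand in for a real image dataset.
import numpy as np
import tensorflow as tf

# Stand-in data: 64 grayscale 28x28 "images" with random labels in 0..9.
x = np.random.rand(64, 28, 28, 1).astype("float32")
y = np.random.randint(0, 10, size=(64,))

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x, y, epochs=1, batch_size=32, verbose=0)
preds = model.predict(x, verbose=0)
print(preds.shape)  # one probability per class, per example
```

That's the whole thing: define layers, compile, fit. The optimizer, the backpropagation, the GPU placement are all underneath, and you don't have to look.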
What boring means here
Boring means that the tooling works reliably enough that you stop thinking about it. Boring means the questions shift from "how do I build a neural network" to "what problem should I apply this to." Boring means it becomes infrastructure rather than research.
Two years ago, getting a deep learning model running required Theano, careful dependency management, and a tolerance for cryptic errors. Today you install TensorFlow, follow a tutorial, and have something working before lunch. The barrier collapsed faster than most people expected.
Why this matters for builders
The technologies that change how software gets built are rarely the ones that stay exciting. The ones that matter most become invisible. TCP/IP is boring. SQL is boring. HTTP is boring. That's why they're everywhere.
Deep learning becoming boring suggests it's on that path. Not for every use case, not without caveats, but the trajectory is clear.
What comes next
The unsolved problem is deployment. Training a model is straightforward now. Getting it into a product in a way that's reliable, fast, and maintainable is still harder than it should be. That's where the interesting engineering work will be in 2018.
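One slice of that deployment story is at least becoming routine: serializing a trained model so a separate serving process can load it without any of the training code. A minimal sketch with Keras (the tiny model and the temp-file path are hypothetical, and this is just the export/reload step, not a full serving setup):

```python
# Save a trained Keras model to disk, then reload it as a serving
# process would. The model and the save path are illustrative.
import os
import tempfile

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

path = os.path.join(tempfile.mkdtemp(), "model.h5")
model.save(path)  # architecture + weights in one file

# "Serving side": reload and predict with no reference to training code.
restored = tf.keras.models.load_model(path)
x = np.random.rand(4, 784).astype("float32")
same = np.allclose(model.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
print(same)
```

The hard parts of deployment (latency, versioning, monitoring, rollback) start after this step, which is exactly the point: the easy piece is solved, and the rest is still engineering.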
Also: the gap between what the models can do on benchmarks and what they can do on real production data is still large. Boring tooling doesn't solve that. The data problem doesn't go away just because the training is easy.
But the fact that we're talking about deployment instead of training is itself progress. A year ago we were still talking about whether the models worked at all.
With gusto, Fatih.