The Google Brain Team – Looking Back on 2017
164 points by amaks 7 years ago | 6 comments
- jacksmith21006 7 years ago: My favorite of these is the Jeff Dean paper on using a NN for database indexes, then doing the processing on TPUs. I'm really looking forward to seeing the difference in power required for a TPU versus a CPU with a traditional approach.
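The paper that comment refers to (Kraska et al., "The Case for Learned Index Structures") replaces a B-tree-style index with a model that predicts a key's position in a sorted array, then does a bounded local search to correct the prediction. A minimal sketch of that idea, assuming a simple linear model in place of the paper's neural nets (class and method names here are illustrative, not from the paper):

```python
import bisect


class LearnedIndex:
    """Toy learned index: a linear model predicts a key's position in a
    sorted array; a local search within the worst-case training error
    guarantees correct lookups."""

    def __init__(self, keys):
        self.keys = sorted(keys)
        n = len(self.keys)
        # Least-squares fit of position ~ a * key + b.
        mean_k = sum(self.keys) / n
        mean_p = (n - 1) / 2
        cov = sum((k - mean_k) * (i - mean_p) for i, k in enumerate(self.keys))
        var = sum((k - mean_k) ** 2 for k in self.keys)
        self.a = cov / var if var else 0.0
        self.b = mean_p - self.a * mean_k
        # Worst prediction error over the training keys bounds the search window.
        self.err = max(abs(self._predict(k) - i) for i, k in enumerate(self.keys))

    def _predict(self, key):
        return int(self.a * key + self.b)

    def lookup(self, key):
        pos = self._predict(key)
        lo = max(0, pos - self.err)
        hi = min(len(self.keys), pos + self.err + 1)
        i = bisect.bisect_left(self.keys, key, lo, hi)
        return i if i < len(self.keys) and self.keys[i] == key else None
```

For key distributions a small model can fit well, the search window stays tiny, which is what makes the approach competitive with tree traversal.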
- pkaye 7 years ago: Seems like the AutoML project, if successful, will result in the loss of at least the lower-end AI jobs.
- riku_iki 7 years ago: Is it open source?
- dspoka 7 years ago: The most promising area here, to me, seems to be AutoML. The promise of the new machine learning was that we would move away from tedious feature engineering and everything would just work and be simple. It may have become simpler, but training and debugging new DL models is still painful, so the focus has shifted to extensive hyperparameter search. AutoML may become the next step in abstraction, where we design single models/algorithms that can build viable networks for many tasks and purposes.
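The "extensive hyperparameter search" this comment describes can be sketched in its simplest form as random search over a configuration space. The function and space below are illustrative assumptions, not anything from AutoML itself, which searches over architectures with learned controllers rather than sampling blindly:

```python
import random


def random_search(train_eval, space, trials=20, seed=0):
    """Minimal random hyperparameter search: sample configs from `space`
    and keep the one `train_eval` scores highest."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        # Draw one value per hyperparameter to form a candidate config.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = train_eval(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score


# Hypothetical usage with a toy objective standing in for "train a model
# and return its validation score".
space = {"lr": [0.1, 0.01, 0.001], "units": [32, 64, 128]}
best, score = random_search(
    lambda cfg: -abs(cfg["lr"] - 0.01) - abs(cfg["units"] - 64) / 64,
    space,
    trials=30,
)
```

Each trial here costs a full training run, which is exactly the expense AutoML-style approaches try to amortize by learning how to generate good candidates instead of sampling them independently.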
- tramGG 7 years ago: I'm really excited by AutoML as well; it seems like one of the few next-level advancements. One of the authors of AutoML is presenting in SF in February at the Decentralized AI Summit. Can't wait.
- sabujp 7 years ago: The best one is "I'm too busy for romance": https://google.github.io/tacotron/publications/tacotron2/ind...