Making The Black Box Transparent: Lessons in Opacity
YOW! Data 2019
Deep Learning is all the rage now. It is powerful, and it is cheap. To proponents of "explainable" machine learning, however, this is not entirely good news: deep learning is essentially a black box that one cannot look into.
To be sure, there are efforts to peek inside the black box and see what it is learning - saliency maps and various visualization tools are useful for understanding what is going on inside deep neural networks. The question, of course, is whether it's worth it.
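As a flavour of what a saliency map is, here is a minimal sketch: a saliency map scores each input feature by the magnitude of the output's gradient with respect to it. The toy linear "model" below is purely illustrative (not from the talk), and the gradient is estimated by finite differences; in practice one would use a real network and automatic differentiation (e.g. torch.autograd).

```python
import numpy as np

# Hypothetical toy "network": a single linear layer f(x) = w.x + b.
# For a linear model the gradient of the output w.r.t. each input
# feature is exactly the corresponding weight, so the saliency map
# |df/dx_i| reduces to |w_i| - handy for sanity-checking the code.
rng = np.random.default_rng(0)
w = rng.normal(size=5)
b = 0.1

def f(x):
    return w @ x + b

def saliency(x, eps=1e-5):
    """Absolute gradient of f at x, via central finite differences."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        up, down = x.copy(), x.copy()
        up[i] += eps
        down[i] -= eps
        grad[i] = (f(up) - f(down)) / (2 * eps)
    return np.abs(grad)

x = rng.normal(size=5)
print(saliency(x))  # for this linear model, equals |w|
```

The same idea scales up: for an image classifier, the gradient of the class score with respect to each pixel yields a heatmap of which pixels the network is "looking at".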
In this talk I shall cover the basics of looking into a deep neural network and share a different approach to the problem.