Advantages
Christoph Molnar
Feature visualizations give unique insight into the workings of neural networks, especially for image recognition. Given the complexity and opacity of neural networks, feature visualization is an important step in analyzing and describing them. Through feature visualization, we have learned that neural networks learn simple edge and texture detectors in the first layers and more abstract part and object detectors in higher layers. Network dissection expands those insights and makes the interpretability of network units measurable.
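The core mechanism behind feature visualization is activation maximization: optimize an input by gradient ascent so that a chosen unit activates strongly. The following is a minimal, hedged sketch of that idea on a toy one-weight-vector "unit" (not a real network); for deep convolutional networks the same loop runs on an image, with extra regularization to keep the result natural-looking.

```python
import numpy as np

# Toy sketch of activation maximization: the "unit" is a single linear
# neuron with weights w, and we maximize its activation w @ x over the
# input x, keeping x on the unit sphere as a crude regularizer.
rng = np.random.default_rng(0)
w = rng.normal(size=64)           # weights of the unit we want to visualize
x = rng.normal(size=64) * 0.01    # start from a near-zero random "image"

lr = 0.1
for _ in range(100):
    grad = w                      # d(w @ x)/dx for a linear unit
    x = x + lr * grad             # gradient ascent step on the input
    x = x / np.linalg.norm(x)     # constrain the input's norm

# The optimized input aligns with the unit's preferred pattern w.
cosine = (w @ x) / (np.linalg.norm(w) * np.linalg.norm(x))
print(round(cosine, 2))
```

For a linear unit the optimum is simply the weight vector itself; the value of the method shows up in deep networks, where the preferred input of a higher-layer unit is not obvious from its weights.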
Feature visualization is a great tool to communicate in a non-technical way how neural networks work.
Feature visualization can be combined with feature attribution methods, which explain which pixels were important for the classification. Combining both methods makes it possible to explain an individual classification together with a local visualization of the learned features involved in that classification. See The Building Blocks of Interpretability on distill.pub.
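To make the attribution side concrete, here is a hedged sketch of a simple gradient-times-input attribution on a toy linear classifier (an assumption for illustration; real attribution methods such as saliency maps or integrated gradients apply the same idea to deep networks):

```python
import numpy as np

# Toy gradient-times-input attribution: the classifier score is w @ x,
# so each pixel's attribution is its gradient times its value, i.e. its
# additive contribution to the score.
rng = np.random.default_rng(1)
w = rng.normal(size=9)        # toy classifier weights (3x3 image, flattened)
x = rng.normal(size=9)        # one input image

score = w @ x
grad = w                      # d(score)/dx for a linear model
attribution = grad * x        # per-pixel contribution to the score

# For a linear model the attributions sum exactly to the score.
print(bool(np.isclose(attribution.sum(), score)))
```

Pairing such a pixel-importance map with feature visualizations of the most activated units is what links "where the network looked" to "what the network saw there".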