Efficient structuring of the latent space for controllable data reconstruction and compression
Graphics and Visual Computing, Volume 7 - December 2022
Explainable neural models have gained a lot of attention in recent years. However, conventional encoder–decoder models do not capture information regarding the importance of the involved latent variables and rely on a heuristic a priori specification of the dimensionality of the latent space, or on its selection based on multiple trainings. In this paper, we focus on the efficient structuring of the latent space of encoder–decoder approaches for explainable data reconstruction and compression. For this purpose, we leverage the concept of Shapley values to determine the contribution of the latent variables to the model’s output and rank them in order of decreasing importance. As a result, truncating the latent dimensions to those that contribute the most to the overall reconstruction allows a trade-off between model compactness (i.e. dimensionality of the latent space) and representational power (i.e. reconstruction quality). In contrast to other recent autoencoder variants that incorporate a PCA-based ordering of the latent variables, our approach does not require time-consuming training processes and does not introduce additional weights. This makes our approach particularly valuable for compact representation and compression. We validate our approach on the examples of representing and compressing images as well as high-dimensional reflectance data.
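The core idea of ranking latent variables by their Shapley values can be illustrated with a small, self-contained sketch. The decoder, value function, and all variable names below are illustrative assumptions, not the paper's actual implementation: a toy linear decoder stands in for the trained model, and a coalition's payoff is taken to be the negative reconstruction error when only the latent dimensions in that coalition are kept (the rest zeroed out).

```python
# Hedged sketch: exact Shapley values for latent dimensions of a toy
# linear decoder. The decoder, payoff definition, and data here are
# illustrative assumptions; the paper's model and value function differ.
from itertools import combinations
from math import factorial

def reconstruct(z, W):
    """Toy linear decoder: x_hat = W @ z (pure-Python matvec)."""
    return [sum(w * zi for w, zi in zip(row, z)) for row in W]

def mse(a, b):
    """Mean squared error between two vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) / len(a)

def value(subset, z, W, x):
    """Coalition payoff: negative reconstruction error when only the
    latent dims in `subset` are kept (all others zeroed)."""
    z_masked = [zi if i in subset else 0.0 for i, zi in enumerate(z)]
    return -mse(reconstruct(z_masked, W), x)

def shapley_values(z, W, x):
    """Exact Shapley value of each latent dimension. Exhaustive over
    all coalitions, so only feasible for small latent sizes; a
    sampling approximation would be used for realistic models."""
    n = len(z)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Standard Shapley weight |S|! (n-|S|-1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (value(set(S) | {i}, z, W, x)
                               - value(set(S), z, W, x))
    return phi

# Rank latent dims by decreasing contribution; truncating to the
# top-m of this ranking trades compactness against reconstruction quality.
W = [[1.0, 0.1], [0.0, 0.1]]   # toy decoder weights
z = [2.0, 0.5]                 # latent code
x = reconstruct(z, W)          # target = full reconstruction
phi = shapley_values(z, W, x)
ranking = sorted(range(len(z)), key=lambda i: -phi[i])
```

By the efficiency property of Shapley values, the contributions sum to the payoff gap between the full coalition and the empty one, which gives a quick sanity check on the implementation.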
BibTeX references
@Article { TWMK22,
author = "Trunz, Elena and Weinmann, Michael and Merzbach, Sebastian and Klein, Reinhard",
title = "Efficient structuring of the latent space for controllable data reconstruction and compression",
journal = "Graphics and Visual Computing",
volume = "7",
month = "dec",
year = "2022",
doi = "10.1016/j.gvc.2022.200059",
key = "trunz-2022-shapley",
url = "http://graphics.tudelft.nl/Publications-new/2022/TWMK22"
}