Abstract
Learning neural fields has been an active topic in deep learning research, with a focus on, among other issues,
finding more compact and easier-to-fit representations. In this paper, we introduce a novel low-rank
representation termed Tensor Train Neural Fields (TT-NF) for learning neural fields on dense regular grids
and efficient methods for sampling from them. Our representation is a TT parameterization of the neural
field, trained with backpropagation to minimize a non-convex objective. We analyze the effect of low-rank
compression on the downstream task quality metrics in two settings. First, we demonstrate the efficiency of
our method in a sandbox task of tensor denoising, which admits comparison with SVD-based schemes designed
to minimize reconstruction error. Furthermore, we apply the proposed approach to Neural Radiance Fields,
where the low-rank structure of the field corresponding to the best quality can be discovered only through
learning.
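To make the core idea concrete: a tensor train (TT) represents a large d-dimensional tensor as a chain of small 3-way cores, so a single entry can be evaluated as a short product of matrix slices without ever materializing the full grid. The sketch below illustrates this parameterization in plain NumPy with hypothetical shapes and function names; it is not the paper's implementation, which additionally trains the cores with backpropagation.

```python
import numpy as np

# A d-dimensional tensor T of shape (n_1, ..., n_d) is represented by
# cores G_k of shape (r_{k-1}, n_k, r_k), with boundary ranks r_0 = r_d = 1.
# Shapes and rank below are illustrative, not from the paper.

def make_tt_cores(shape, rank, seed=0):
    rng = np.random.default_rng(seed)
    ranks = [1] + [rank] * (len(shape) - 1) + [1]
    return [rng.standard_normal((ranks[k], n, ranks[k + 1]))
            for k, n in enumerate(shape)]

def tt_entry(cores, index):
    # Sample one tensor entry: multiply one matrix slice per core.
    v = cores[0][:, index[0], :]              # shape (1, r_1)
    for core, i in zip(cores[1:], index[1:]):
        v = v @ core[:, i, :]
    return v[0, 0]

def tt_full(cores):
    # Reconstruct the dense tensor (only feasible for tiny examples).
    t = cores[0]                              # (1, n_1, r_1)
    for core in cores[1:]:
        t = np.tensordot(t, core, axes=([-1], [0]))
    return t.squeeze(axis=(0, -1))

cores = make_tt_cores((4, 5, 6), rank=3)
full = tt_full(cores)
assert np.isclose(full[1, 2, 3], tt_entry(cores, (1, 2, 3)))
```

The storage cost is the sum of core sizes, roughly d·n·r² parameters instead of n^d for the dense grid, which is what makes fitting neural fields on dense regular grids tractable at low rank.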
Featured in
- IEEE Journal of Selected Topics in Signal Processing 2024
- MBZUAI 2023 Workshop (Seeking Low‑Dimensionality in Deep Neural Networks)
- ICLR 2023 Workshop (Neural Fields)
Paper
Check out the full paper on arXiv
Poster
View the workshop poster
Source code
TT-NF: The official repository of this project
Citation
@article{obukhov2022ttnf,
  author  = {Obukhov, Anton and Usvyatsov, Mikhail and Sakaridis, Christos and Schindler, Konrad and Van Gool, Luc},
  journal = {IEEE Journal of Selected Topics in Signal Processing},
  title   = {TT-NF: Tensor Train Neural Fields},
  year    = {2024},
  pages   = {1-13},
  doi     = {10.1109/JSTSP.2024.3454980}
}