
TT-NF: Tensor Train Neural Fields

Anton Obukhov · Mikhail Usvyatsov · Christos Sakaridis

Konrad Schindler · Luc Van Gool

Abstract

Learning neural fields has been an active topic in deep learning research, focusing, among other issues, on finding more compact and easy-to-fit representations. In this paper, we introduce a novel low-rank representation termed Tensor Train Neural Fields (TT-NF) for learning neural fields on dense regular grids, together with efficient methods for sampling from them. Our representation is a TT parameterization of the neural field, trained with backpropagation to minimize a non-convex objective. We analyze the effect of low-rank compression on downstream task quality metrics in two settings. First, we demonstrate the efficiency of our method on a sandbox task of tensor denoising, which admits comparison with SVD-based schemes designed to minimize reconstruction error. Second, we apply the proposed approach to Neural Radiance Fields, where the low-rank structure of the field corresponding to the best quality can only be discovered through learning.
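
To make the core idea concrete, below is a minimal PyTorch-style sketch of a TT-parameterized field on a dense regular grid, where a value is obtained by contracting one slice of each core per query. The class name TTGridField, the core shapes, the ranks, and the toy fitting loop are illustrative assumptions for exposition; they do not reproduce the repository's API or the sampling schemes described in the paper.

import torch
import torch.nn as nn

class TTGridField(nn.Module):
    """Illustrative TT parameterization of a C-channel field on an N^3 grid.

    Hypothetical core shapes:
      G1: (1, N, r1)   G2: (r1, N, r2)   G3: (r2, N, C)
    The full grid is never materialized; values are contracted per query.
    """

    def __init__(self, n: int, channels: int, ranks=(16, 16)):
        super().__init__()
        r1, r2 = ranks
        scale = 0.1
        self.g1 = nn.Parameter(scale * torch.randn(1, n, r1))
        self.g2 = nn.Parameter(scale * torch.randn(r1, n, r2))
        self.g3 = nn.Parameter(scale * torch.randn(r2, n, channels))

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        """idx: (B, 3) integer voxel indices -> (B, C) field values."""
        i, j, k = idx.unbind(dim=-1)
        a = self.g1[:, i, :].permute(1, 0, 2)   # (B, 1,  r1)
        b = self.g2[:, j, :].permute(1, 0, 2)   # (B, r1, r2)
        c = self.g3[:, k, :].permute(1, 0, 2)   # (B, r2, C)
        return (a @ b @ c).squeeze(1)           # (B, C)


# Toy fitting loop: minimize reconstruction error at randomly sampled voxels,
# a stand-in for the non-convex objective trained with backpropagation.
if __name__ == "__main__":
    n, channels = 64, 4
    field = TTGridField(n, channels)
    target = torch.randn(n, n, n, channels)      # synthetic ground truth
    opt = torch.optim.Adam(field.parameters(), lr=1e-2)
    for step in range(200):
        idx = torch.randint(0, n, (4096, 3))
        pred = field(idx)
        gt = target[idx[:, 0], idx[:, 1], idx[:, 2]]
        loss = (pred - gt).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

The point of the factorization is that the full N×N×N×C grid is never stored: only three small cores are kept and updated by backpropagation, and the ranks (r1, r2) control the compression-versus-quality trade-off studied in the paper.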

Paper

Check out the full paper on arXiv

Source code

TT-NF: the official repository of this project

Subscribe to my Twitter feed to receive updates about this and my other research!

Citation

@misc{obukhov2022ttnf,
  author = {Obukhov, Anton and Usvyatsov, Mikhail and Sakaridis, Christos and Schindler, Konrad and Van Gool, Luc},
  title = {TT-NF: Tensor Train Neural Fields},
  year = {2022},
  doi = {10.48550/ARXIV.2209.15529},
  url = {https://arxiv.org/abs/2209.15529},
  publisher = {arXiv},
  copyright = {Creative Commons Attribution Share Alike 4.0 International}
}