Principal Investigator: Sertac Karaman
We describe a new nonparametric function approximation framework based on a continuous extension of the tensor-train decomposition. The approximation, termed a function-train (FT), results in a tensor-train structure whose cores are univariate functions of each variable. An advantage of the FT over discrete approaches is that it produces an adaptive approximation of tensor fibers that is not tied to any tensorized discretization procedure. Furthermore, the representation of low-rank functions in FT format enables efficient continuous computation: we can add, multiply, integrate, and differentiate functions in polynomial time with respect to dimension. Our approach is in the spirit of other continuous computation packages such as Chebfun, and yields algorithms that require "continuous" matrix factorizations, such as LU and QR decompositions of vector-valued functions. Our contributions include a maxvol algorithm for finding the maximum-volume submatrix of a matrix-valued function, a maxvol cross-approximation algorithm for obtaining skeleton decompositions of vector-valued functions, a cross-approximation algorithm for converting black-box functions into FT format, and a continuous rounding algorithm that re-approximates an FT by another FT of lower rank. We demonstrate the benefits of our approach by integrating high-dimensional and discontinuous functions, and we show how the algorithms perform on a variety of approximation tasks.
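To make the structure concrete, the following is a minimal illustrative sketch (not the authors' implementation) of how an FT is evaluated: each core maps a scalar x_k to a small matrix of univariate function values, and the function value at a point is the product of these matrices. The function names, the example cores, and the chosen ranks are all hypothetical, chosen only to show the matrix-product structure.

```python
import numpy as np

def ft_eval(cores, x):
    """Evaluate a function-train at the point x = (x_1, ..., x_d).

    cores[k] is a callable returning the r_{k-1} x r_k matrix G_k(x_k)
    of univariate function values; the first core has row dimension 1
    and the last core has column dimension 1, so the product is 1 x 1.
    """
    out = np.eye(1)  # 1 x 1 identity to seed the left-to-right product
    for core, xk in zip(cores, x):
        out = out @ core(xk)
    return out.item()  # extract the scalar from the 1 x 1 result

# Hypothetical rank-2 example: f(x, y) = sin(x) * cos(y) + x * y
cores = [
    lambda x: np.array([[np.sin(x), x]]),    # 1 x 2 core in x
    lambda y: np.array([[np.cos(y)], [y]]),  # 2 x 1 core in y
]
value = ft_eval(cores, (0.5, 1.2))
```

Because each variable appears only through its own core, operations such as integration reduce to integrating the univariate entries of each core, which is one way to see why the cost scales polynomially with dimension.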