|
ИПМех РАН |
Numerical data are frequently organized as $d$-dimensional arrays, also called tensors. However, only small values of $d$ are feasible if this data is to be stored in computer memory explicitly. In the case of many dimensions, special representation formats are crucial, and it is natural to try the so-called tensor decompositions. In the recent decade, the known tensor decompositions have been considerably revisited, and two of them have emerged and are now recognized as the most adequate and useful tools for numerical analysis: the Tensor-Train and Hierarchical Tucker decompositions. Both are intrinsically related to low-rank matrices associated with a given tensor. In the talk, we present these decompositions and the role of low-rank matrices in the construction of efficient numerical algorithms. We also consider how these tools facilitate new approaches to solving numerical problems in several application areas, such as drug design, thin optical coatings, coagulation and fragmentation of particles, identification of model parameters, etc.
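The connection between a tensor and its low-rank matrices can be illustrated with the classical TT-SVD procedure: the tensor is repeatedly unfolded into a matrix, truncated by SVD, and the retained singular vectors become the "train cars" (cores). The following Python sketch is a minimal illustration, not the implementation discussed in the talk; the function names `tt_svd` and `tt_to_full` and the threshold parameter `eps` are our own choices.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-dimensional array into Tensor-Train cores.

    Sequentially unfolds the remaining data into a matrix of shape
    (r_{k-1} * n_k, rest), truncates its SVD at singular values below
    `eps`, and keeps the left factor as the k-th TT core. Low rank of
    each unfolding matrix thus translates directly into small TT ranks.
    """
    shape = tensor.shape
    d = len(shape)
    cores = []
    r = 1                     # current left TT rank, r_0 = 1
    C = tensor
    for k in range(d - 1):
        C = C.reshape(r * shape[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        rk = max(1, int(np.sum(s > eps)))      # truncation rank r_k
        cores.append(U[:, :rk].reshape(r, shape[k], rk))
        C = s[:rk, None] * Vt[:rk]             # carry the rest forward
        r = rk
    cores.append(C.reshape(r, shape[-1], 1))   # last core, r_d = 1
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full d-dimensional array."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)
    return full.reshape([c.shape[1] for c in cores])
```

For a tensor that is a sum of two rank-one terms, all TT ranks come out equal to 2, so a $3\times4\times5$ array with 60 entries is stored in cores with only $6+16+10$ entries; for large $d$ this compression is exponential in $d$.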