ITensor/ITensorNetworks.jl

Wish list for `alternating_update`

Algorithms we want to be able to implement with the `alternating_update` and belief propagation (BP) code:

  • TNS addition (direct sum, density matrix, fitting, gauging)
  • TNS contraction (density matrix, fitting, gauging)
  • TNS compression (density matrix, fitting, gauging)
  • TNS eigensolving
  • TNS linear solving
  • TNS evolution with TDVP and gates
  • TNS METTS
  • TCI
  • Quantics optimization
  • Non-quadratic optimization
  • Infinite TNS
  • Subspace expansion
  • Parallelization over regions

An initial goal is to implement these, or make them easy to implement through callback functions or function overloading, and have them work for MPS, TTN, and general tensor networks (at first using the BP approximation, though with an interface for customizing that). See also #112.
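To make the callback idea concrete, here is a minimal sketch of what a callback-driven alternating update loop could look like, using plain Julia arrays as a stand-in for tensor networks. This is an illustrative assumption about the design, not the actual ITensorNetworks.jl interface: the names `alternating_update_sketch`, `local_solver`, and `region_plan` are hypothetical.

```julia
# Hypothetical sketch: a generic sweep that visits regions of a state,
# applies a user-supplied local solver to each region, and writes the
# result back. Different wish-list algorithms would plug in different
# local solvers (an eigensolver step for DMRG-style eigensolving, a
# linear-solver step for linear solving, a projection step for
# fitting/compression, an exponentiation step for TDVP, etc.).
function alternating_update_sketch(local_solver, state::Vector;
                                   nsweeps::Int=1,
                                   region_plan=eachindex(state))
    for _ in 1:nsweeps
        for region in region_plan
            # The callback receives the local data and the region it
            # lives on, and returns the updated local data.
            state[region] = local_solver(state[region], region)
        end
    end
    return state
end

# Toy usage: "normalize each site" as a stand-in local update.
normalize_site(v, _) = v / sqrt(sum(abs2, v))
state = alternating_update_sketch(normalize_site,
                                  [randn(3) for _ in 1:4]; nsweeps=2)
```

The point of this shape is that the sweep logic (region plan, gauging, environment handling) lives in one place, while each algorithm on the list above only supplies its local solver, either as a callback or by overloading a function on a problem type.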

@emstoudenmire @JoeyT1994