GMAMP

Generalized approximate message passing (GAMP) and generalized vector AMP (GVAMP) are Bayes-optimal algorithms widely used for reconstructing unknown signals in generalized linear models (GLMs). However, each has its own limitation: GAMP requires independent and identically distributed (IID) transformation matrices, while GVAMP relies on a high-complexity matrix inverse. In this article, we provide a universal generalized memory AMP (GMAMP) framework that includes the existing orthogonal AMP/VAMP, GVAMP, and MAMP as instances. It opens new directions for addressing GLMs and performs well in ill-conditioned systems at low complexity. The proposed Bayes-optimal GMAMP is one instance that overcomes the IID-matrix limitation of GAMP and avoids the high-complexity matrix inverse of GVAMP. The proposed framework paves the way for applications in compressed sensing, imaging, signal processing, communications, deep learning, and other fields.

Primary language: MATLAB. License: MIT.
