Variational Logistic Regression is too slow!
AlexYurasov opened this issue · 2 comments
I was comparing Variational Logistic Regression and the Relevance Vector Classifier, and even though RVC seems to be the more complicated model, it is much faster to fit than Variational Bayesian Logistic Regression. Are there any implementation problems?
Hi, there are a couple of things here:
- The Relevance Vector Classifier uses the Laplace approximation, which is much faster than the Local Variational Approximation used in Variational Logistic Regression (though it is also less accurate). The main difference (apart from the ARD prior) is that Variational Logistic Regression has to optimize a latent local variational parameter for EACH OBSERVATION, so it is bound to be slow on large datasets; see the bound sketched after this list. As general advice, the Laplace approximation is better when you have a large number of samples, while for smaller datasets the Local Variational Approximation is preferable.
- Still, you are right that VLR is very slow for high-dimensional inputs. I updated the code: instead of using the pseudo-inverse I now use a Cholesky decomposition, which avoids costly dot products and makes the code a bit faster (a rough sketch of the idea follows after this list).
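To show where the per-observation cost comes from, here is the standard Jaakkola–Jordan bound that the Local Variational Approximation is built on (standard textbook notation, not the variable names used in the code):

```latex
% Local variational bound on the logistic sigmoid, with one \xi_n per observation:
\sigma(a) \;\ge\; \sigma(\xi)\,
  \exp\!\Big\{ \tfrac{1}{2}(a-\xi) - \lambda(\xi)\,\big(a^{2}-\xi^{2}\big) \Big\},
\qquad
\lambda(\xi) \;=\; \frac{1}{2\xi}\Big(\sigma(\xi)-\tfrac{1}{2}\Big).

% Each \xi_n has to be re-optimised at every iteration of the fit:
\xi_n^{2} \;=\; \mathbf{x}_n^{\top}\big(\mathbf{S}_N + \mathbf{m}_N\mathbf{m}_N^{\top}\big)\,\mathbf{x}_n .
```

So on top of the usual posterior update there is an extra set of n local parameters to refresh on every iteration, which the Laplace approximation does not have.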
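And here is a minimal sketch of the Cholesky-based update I mean. This is not the actual code in the repository, just a NumPy/SciPy illustration assuming a Gaussian prior N(0, (1/alpha) * I); the names `lam`, `posterior_update` and the parameter `alpha` are made up for the example:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve


def lam(xi):
    # Jaakkola-Jordan coefficient lambda(xi) = tanh(xi / 2) / (4 xi); assumes xi > 0
    return np.tanh(xi / 2.0) / (4.0 * xi)


def posterior_update(X, t, xi, alpha=1.0):
    """One variational update of the Gaussian posterior q(w) = N(m, S)
    for logistic regression with targets t in {0, 1} and prior N(0, (1/alpha) I)."""
    n_features = X.shape[1]
    # Posterior precision: S^-1 = alpha * I + 2 * X^T diag(lambda(xi)) X
    precision = alpha * np.eye(n_features) + 2.0 * (X * lam(xi)[:, None]).T @ X
    c, low = cho_factor(precision)
    # Posterior mean m = S X^T (t - 1/2), obtained by a Cholesky solve
    # instead of explicitly forming S with a pseudo-inverse
    m = cho_solve((c, low), X.T @ (t - 0.5))
    # The covariance is still needed for the xi update, but back-substitution
    # against the triangular factor is cheaper and stabler than pinv (an SVD)
    S = cho_solve((c, low), np.eye(n_features))
    # Per-observation update: xi_n^2 = x_n^T (S + m m^T) x_n
    xi_new = np.sqrt(np.einsum('ij,jk,ik->i', X, S + np.outer(m, m), X))
    return m, S, xi_new
```

Replacing `pinv` with `cho_factor`/`cho_solve` is what removes the large dense dot products; the posterior formulas themselves are unchanged.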
If you think there are other places for improvement, let me know!
Thanks!