JuliaSmoothOptimizers/QRMumps.jl

QRMumps not solving a problem being solved by SuiteSparseQR

Opened this issue · 6 comments

Consider this specific problem:

using SparseArrays, QRMumps

rows = [1, 6, 2, 7, 2, 8, 2, 9, 2, 5, 10, 2, 5, 11, 2, 3, 12, 3, 13, 3, 14, 5, 15, 3, 4, 16, 4, 17, 4, 18]
cols = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 5, 6, 6, 6, 7, 7, 7, 8, 8, 9, 9, 10, 10, 11, 11, 11, 12, 12, 13, 13]
vals = [4.242640687119286, 0.0, 2.8284271247461903, 0.0, 2.8284271247461903, 0.0, 16.970562748477143, 0.0, -0.7067534044254388, 1.4142135623730951, 0.0, 0.7067534045431721, 2.8284271247461903, 0.0, 1.4142135623730951, 1.4142135623730951, 0.0, 1.4142135623730951, 0.0, -2.8284271247461903, 0.0, 15.556349186104047, 0.0, 1.4142135623730951, 1.4142135623730951, 0.0, 1.4142135623730951, 0.0, -7.0710678118654755, 0.0]
A    = sparse(rows, cols, vals)
b = [80.61017305526643, -10.606248341154838, -2.8284271247461903, -23.607675141438072, 52.32590180780452, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

spmat = qrm_spmat_init(A)
d_qrmumps = qrm_least_squares(spmat, b)

d_qr      = SparseArrays.qr(A) \ b

QRMumps is not able to find a solution whereas SuiteSparseQR can:

d_qrmumps = [19.0, NaN, NaN, NaN, NaN, NaN, Inf, NaN, NaN, NaN, NaN, NaN, NaN]
d_qr      = [19.0, 0.0, 0.0, 0.0, 15.00700000133325, 0.0, 0.0, 0.0, 0.0, 1.9993636362424316, -2.0, -14.693147180559947, 0.0]

NB: since all rows of A and b beyond row 5 are empty, this works:

spmat = qrm_spmat_init(A[1:5, :])
d_qrmumps = qrm_least_squares(spmat, b[1:5])
dpo commented

@abuttari The above is admittedly a bit unusual because of the large block of zeros at the bottom of A and b, but it should work. Could you let us know if something is going wrong in the interface, in QRMumps itself, or if the problem is between the chair and the monitor?

that seems to be a rank-deficient matrix, right? Unfortunately qrm only handles full-rank matrices, unlike SPQR.
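For what it's worth, the rank deficiency is easy to confirm from the data above with the stdlib `rank` (a dense check is fine here since the matrix is tiny):

```julia
using SparseArrays, LinearAlgebra

rows = [1, 6, 2, 7, 2, 8, 2, 9, 2, 5, 10, 2, 5, 11, 2, 3, 12, 3, 13, 3, 14, 5, 15, 3, 4, 16, 4, 17, 4, 18]
cols = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 5, 6, 6, 6, 7, 7, 7, 8, 8, 9, 9, 10, 10, 11, 11, 11, 12, 12, 13, 13]
vals = [4.242640687119286, 0.0, 2.8284271247461903, 0.0, 2.8284271247461903, 0.0, 16.970562748477143, 0.0, -0.7067534044254388, 1.4142135623730951, 0.0, 0.7067534045431721, 2.8284271247461903, 0.0, 1.4142135623730951, 1.4142135623730951, 0.0, 1.4142135623730951, 0.0, -2.8284271247461903, 0.0, 15.556349186104047, 0.0, 1.4142135623730951, 1.4142135623730951, 0.0, 1.4142135623730951, 0.0, -7.0710678118654755, 0.0]
A = sparse(rows, cols, vals)

# A is 18×13 but only rows 1-5 carry nonzeros, so rank(A) ≤ 5 < 13.
size(A)          # (18, 13)
rank(Matrix(A))  # 5 — column-rank deficient
```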

dpo commented

Oh, I was not aware of this limitation. Does it have to have full column rank? It seems the full row rank call above works.

what's probably happening is that qrm sees that the system is underdetermined (5 rows, 13 columns) and thus factorizes the transpose of the matrix. In this case the returned solution is the minimum-norm one whereas, I think, SPQR returns just one of the many solutions. Obviously you can do the same with SPQR (i.e., ask it to factorize A' instead of A)
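To illustrate the "factorize the transpose" trick, here is a small self-contained sketch (on a made-up full-row-rank example, not the matrix from this issue): if A' = QR, then A = R'Q', and the minimum-norm solution of Ax = b is x = Q[y; 0] with R'y = b.

```julia
using LinearAlgebra

# Underdetermined full-row-rank system: m = 2 equations, n = 4 unknowns.
A = [1.0 2.0 0.0 1.0;
     0.0 1.0 3.0 2.0]
b = [1.0, 2.0]
m, n = size(A)

# Factorize the transpose: A' = Q*R, so A = R'*Q'.
F = qr(A')                       # dense QR of the 4×2 transpose
y = F.R' \ b                     # triangular solve R'y = b
x = F.Q * vcat(y, zeros(n - m))  # minimum-norm solution x = Q*[y; 0]

# Cross-check against the normal equations of the transpose.
x_ref = A' * ((A * A') \ b)
@assert x ≈ x_ref
@assert A * x ≈ b
```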

dpo commented

Is pivoting difficult to implement or is it a performance killer?

pivoting in QR is harder to implement than in LU, already on dense matrices, because it prevents the use of BLAS3. On sparse matrices it's even worse because it leads to higher fill-in. There were a few attempts a long time ago, but they did not lead to anything usable.

SPQR uses a variant of the Heath method: when a small coefficient is found on the diagonal of R during the factorization, the corresponding column is skipped. This leads to an R factor which is not triangular but somewhat staircase-triangular. In qr_mumps this method is hard to implement because the staircase-triangular structure clashes with the 2D blocking used for parallelization. I have a few ideas but really lack the time to work on them...
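For the curious, here is a minimal dense sketch of the Heath-style column skipping described above (an illustration only, not how SPQR or qr_mumps is actually implemented): when the would-be diagonal entry of R is numerically zero, the column is skipped and the pivot row does not advance, so R comes out staircase-triangular and the number of eliminated columns estimates the rank.

```julia
using LinearAlgebra

function heath_qr!(A::Matrix{Float64}; tol = 1e-10)
    m, n = size(A)
    k = 1                 # current pivot row
    skipped = Int[]       # columns declared numerically dependent
    for j in 1:n
        k > m && break
        x = A[k:m, j]
        alpha = norm(x)
        if alpha < tol    # tiny diagonal entry: skip this column
            push!(skipped, j)
            continue
        end
        v = copy(x)
        v[1] += x[1] >= 0 ? alpha : -alpha
        v ./= norm(v)
        # Apply the Householder reflector I - 2vv' to the trailing block.
        A[k:m, j:n] .-= 2 .* v .* (v' * A[k:m, j:n])
        k += 1
    end
    return k - 1, skipped # rank estimate and skipped columns
end

# Column 2 is twice column 1, so the rank is 3 and column 2 is skipped.
A = [1.0 2.0 0.0 0.0;
     1.0 2.0 1.0 0.0;
     0.0 0.0 1.0 1.0;
     0.0 0.0 0.0 1.0]
r, skipped = heath_qr!(copy(A))
# r == 3, skipped == [2]
```

On the 18×13 matrix from this issue, this scheme would detect rank 5; the remaining columns are the ones a staircase R leaves uneliminated.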