I believe state-of-the-art methods aim for linear time complexity: use the Fast Multipole Method to perform matrix-vector products in O(N) operations rather than O(N^2), and apply that inside an iterative solver such as GMRES, which typically needs only a constant number of iterations because the matrix is just a slight perturbation of the identity matrix.
The results are approximate (but that's always true in floating point), and, if I understand correctly, the quality of the approximation depends on how well the matrix can be split into a local component (with entries decaying in proportion to 1/|i - j|) and a sparse global component.
Like petters writes above: "matrices of the form s^2 I + B, where B(i, j) depends on the distance between points i, j." The GMRES + FMM approach is more general, but the FMM gets less and less Fast as we move away from that structure towards general matrices.
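To make the pattern concrete, here's a minimal matrix-free sketch of that setup: solving (s^2 I + B) x = b with GMRES, where B(i, j) depends on the pairwise distance between points. Everything here (the points, the kernel exp(-d), the value of s^2) is made up for illustration, and the matvec below is a plain O(N^2) dense product; in a real solver that one line is where the O(N) FMM evaluation would go.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 200
points = rng.random((n, 2))  # arbitrary points whose distances define B
s2 = 100.0                   # the s^2 diagonal shift (chosen arbitrarily)

# B(i, j) = exp(-|p_i - p_j|): a distance-dependent kernel, dense here.
d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
B = np.exp(-d)

def matvec(x):
    # y = (s^2 I + B) x. In an FMM-based solver, "B @ x" would be
    # replaced by the O(N) fast multipole evaluation.
    return s2 * x + B @ x

A = LinearOperator((n, n), matvec=matvec)
b = rng.random(n)
x, info = gmres(A, b)  # info == 0 means GMRES converged
```

Note that GMRES only ever touches the matrix through `matvec`, which is exactly why a fast matrix-vector product is all you need for the whole solve.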
This structure makes sense because it reflects physical reality, so it's useful when solving physics problems (: Less so when the matrix encodes logical/business constraints over 0/1 variables, which may not be so local anymore.
Tangentially-related silly trivia: in cryptocurrency circles, a 'hodlr' is someone who is committed to a long position, i.e. who 'holds' the cryptocurrency on the theory that in the long (perhaps very long) term it will be more valuable. The term comes from a misspelling of 'hold' by a drunk poster to bitcointalk [1].
Could someone with the appropriate CS background give a summary for layperson programmers? I understand matrices and transforms, but what does 'solving' them mean?