Yeah, you understand correctly. The thing is, I don't know any mathematical insight that singles out lambda calculus as a basis for computation among other possible bases, or any mathematical insight that singles out Lisp as an approach to writing self-interpreters. In other words, there's no theorem saying Lisp lies at the extreme end of any spectrum.
To many people, Lisp is simply the most advanced language they know, so they view other languages as "Lisp plus some features that you could implement with macros". They might as well view other languages as "assembly plus some features that you could implement with assembly". It's just the blub paradox all over again.
To add insult to injury, many of today's advanced languages aren't even built on Lisp. They throw away the key idea of Lisp (easy equivalence between code and data) in order to achieve other things which are not achievable in Lisp (provability of certain classes of statements about all valid programs). To put it in an exaggerated but not entirely untrue way, these days Lisp seems like a dead end in terms of research. Most of the interesting stuff is coming from ML-like languages instead.
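To make the "code and data" point concrete: in Lisp, a program is literally a nested list, so any ordinary list operation can inspect or rewrite a program before running it. Here's a minimal sketch of that idea in Python (everything here is illustrative, not any real Lisp implementation), representing S-expressions as nested Python lists with a tiny evaluator:

```python
# Sketch of Lisp's code-as-data idea: programs are ordinary nested lists,
# so an evaluator (or a macro) can inspect and rewrite them like any data.

def evaluate(expr, env):
    """Evaluate an S-expression represented as nested Python lists."""
    if isinstance(expr, str):          # a bare string is a variable reference
        return env[expr]
    if not isinstance(expr, list):     # numbers are self-evaluating
        return expr
    op, *args = expr
    if op == "quote":                  # return the code unevaluated, as data
        return args[0]
    if op == "if":
        cond, then, alt = args
        return evaluate(then if evaluate(cond, env) else alt, env)
    if op == "lambda":                 # build a closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)             # otherwise: function application
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

program = ["+", 1, ["*", 2, 3]]        # the program is just a list...
print(evaluate(program, env))          # ...so we can run it: prints 7
program[0] = "*"                       # ...or rewrite it like any other data
print(evaluate(program, env))          # prints 6
```

The last two lines are the whole point: the program was mutated with a plain list assignment, no parser or AST library involved. That's the property ML-family languages give up (code and data live in different worlds, mediated by a type system) in exchange for the static guarantees mentioned above.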