I'd agree if we were living in the 1800s, but we're not. We're living in an age where everything is googleable (including the textbook). You don't have to memorize things to have information.
Even in the 1800s, in a good library, I'd have had more information available in searchable, indexed form than I could quickly read into "working memory" to "process" with.
Having Gray's Anatomy open in front of me, all of it available to read, does not make me able to practice anatomy at an advanced level.
It's similar to data in secondary storage: it still has to be loaded into a process's working memory before it can be worked on. This is especially true of condensed terminology, which exists to compress longer explanations.
Following the anatomy example: if I don't already have the terminology for the different positional views, and each one requires an external lookup first, then even a basic description becomes very difficult to follow; the lookups pile up until it's impossible to follow at all.
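That pile-up can be sketched as a toy model. Every number and term below is invented for illustration, not measured data:

```python
# Toy model: reading a passage where every unfamiliar term forces an
# external lookup. Timings and vocabulary are made-up assumptions.

PASSAGE = ["anterior", "view", "of", "the", "distal", "phalanx"]

READ_COST = 0.5      # seconds to read a word you already know
LOOKUP_COST = 30.0   # seconds to break flow, look a term up, and return

def reading_time(passage, known_terms):
    """Total seconds to get through the passage, paying a lookup
    penalty for every term not already in working memory."""
    total = 0.0
    for word in passage:
        total += READ_COST
        if word not in known_terms:
            total += LOOKUP_COST
    return total

novice = reading_time(PASSAGE, known_terms={"of", "the", "view"})
expert = reading_time(PASSAGE, known_terms=set(PASSAGE))
```

With these guessed numbers the novice spends 93 seconds on a six-word phrase (three lookups) while the expert spends 3; the lookup penalty, not the reading, dominates.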
I can't find the article/paper, but I remember reading something where someone calculated the trade-off, for a specific API, between knowing a percentage of the API method signatures off by heart versus looking each one up in the IDE documentation every time. Knowing them made a very big performance difference in the end.
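A back-of-envelope version of that trade-off is easy to run. The figures below are my own guesses, not numbers from the paper I'm half-remembering:

```python
# Rough trade-off: memorized API signatures vs. looking each one up.
# Every constant here is a guessed assumption, not data.

CALLS_PER_DAY = 200        # API uses while coding in a day
RECALL_SECONDS = 2         # typing a signature you know by heart
LOOKUP_SECONDS = 20        # switching to docs/IDE popup and back

def daily_overhead(fraction_memorized):
    """Seconds per day spent producing API calls, given the share of
    signatures known from memory (the rest are looked up each time)."""
    lookups = CALLS_PER_DAY * (1 - fraction_memorized)
    recalls = CALLS_PER_DAY * fraction_memorized
    return lookups * LOOKUP_SECONDS + recalls * RECALL_SECONDS

# Knowing 90% of the API by heart vs. only 10%:
saved = daily_overhead(0.1) - daily_overhead(0.9)
```

Under these assumptions, going from 10% to 90% memorized saves 2880 seconds, about 48 minutes per working day, and that's before counting the cost of breaking concentration.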
BTW: knowing what you don't NEED to remember, and can just look up when required, is probably part of the skill set that makes you an expert.
Learning is gaining the "wisdom" of the material. If all you're doing is rote memorization, you're missing the forest for the trees.