There are probably a lot of memory utilisation problems with multicore processors in general; I'm not an expert on the hardware details.
Regarding the database fitting in RAM or not: I think the Postgres recommendation is that if you want your DB to actually sit in RAM, your machine needs at least twice that much RAM. Our 1.9 database is about 250GB, and I assume we're using relatively cheap commodity servers; I don't think you can fit 512GB of RAM in one of those. (And yes - this does cause problems with certain queries - but mostly it's okay if rarely-used data slips out of the file cache.)
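As a rough way to gauge how much of the working set is actually being served from memory, Postgres exposes cumulative block counters in the pg_stat_database view (blks_hit for blocks found in shared buffers, blks_read for blocks fetched from the OS). The numbers below are made up for illustration - the arithmetic is the point, not the values:

```python
# Hypothetical counters standing in for pg_stat_database.blks_hit and
# pg_stat_database.blks_read on a real server. The ratio shows what
# fraction of block requests were answered from Postgres's own buffer
# cache rather than going out to the OS (file cache or disk).
blks_hit = 9_500_000   # blocks found already in shared buffers
blks_read = 500_000    # blocks that had to be requested from the OS

hit_ratio = blks_hit / (blks_hit + blks_read)
print(f"buffer cache hit ratio: {hit_ratio:.1%}")
```

A consistently low ratio on the tables a page actually touches is one sign the working set has outgrown RAM, rather than just rarely-used data slipping out of cache.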
Basically we are trying to run some course websites and it doesn't seem like we should have to get a huge box with 'Cray' on the front to achieve that.
Anyhow, it does work okay without a supercomputer - but I think database query count optimisation (reducing the number of queries used for a page) is definitely the best way to improve performance (assuming nothing else is critically wrong). It helps both in places like here - where it basically means we can be a bit more relaxed about it and maybe not have to upgrade a server quite so soon - and for people who run the system on a $20/month hosting account that puts not only their database and their webserver on the same box, but 100 other people's databases and webservers too.
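The classic version of this is collapsing an N+1 query pattern (one query per row shown on the page) into a single query with an IN clause. This sketch isn't from the actual system - it's a generic illustration using Python's stdlib sqlite3 and made-up table/column names:

```python
import sqlite3

# Hypothetical schema: per-user grades displayed on a course page.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grade (userid INTEGER, value INTEGER)")
conn.executemany("INSERT INTO grade VALUES (?, ?)",
                 [(1, 90), (2, 75), (3, 60)])

user_ids = [1, 2, 3]

# Naive approach: one round trip to the database per user (N queries).
naive = {uid: conn.execute(
            "SELECT value FROM grade WHERE userid = ?",
            (uid,)).fetchone()[0]
         for uid in user_ids}

# Batched approach: a single query with an IN clause (1 query).
placeholders = ",".join("?" * len(user_ids))
batched = dict(conn.execute(
    f"SELECT userid, value FROM grade WHERE userid IN ({placeholders})",
    user_ids))

assert naive == batched  # same data, far fewer round trips
print(batched)  # -> {1: 90, 2: 75, 3: 60}
```

The saving isn't really in the database doing less work - it's in eliminating per-query round-trip and parse/plan overhead, which is exactly what hurts most on a shared $20/month box.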
PS Regarding processor cache - as noted, I'm not an expert, but I would think you might get better results here if you put PHP requests on one box and database requests on another, so each machine isn't constantly switching between totally different code bases and datasets.