Our server upgrades are continuing along quite smoothly. Last night we got Squid caching up and running. That means that if the wiki delivers a page to a logged-out user once, the rendered page is saved by Squid until the page changes, saving all of the database lookups and processing needed to turn the WikiText into a full-blown HTML page. This is a big win: it not only makes the large majority of browsing faster, it also makes the site extremely resistant to Slashdotting / Digging / etc., since those visitors are logged-out users all hitting the same pages, which would already be in the cache.
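For the curious, here's a rough sketch of what a Squid reverse-proxy ("accelerator") setup like this looks like. This is Squid 2.6-style syntax, and the backend address and domain are placeholders, not our actual config:

```
# Listen on port 80 and act as a reverse proxy (accelerator)
http_port 80 accel vhost

# On a cache miss, fetch the page from the backend web server
# (10.0.0.2 is a placeholder for the real internal address)
cache_peer 10.0.0.2 parent 80 0 no-query originserver name=web1

# Only accelerate requests for our own site
acl our_site dstdomain .example.org
http_access allow our_site
http_access deny all
```

The wiki software decides what's cacheable: pages for logged-out users go out with cache-friendly headers, while logged-in views are marked uncacheable so personalized pages never get served to the wrong person.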
Currently, only about 30% of our page requests are getting served by Squid, but that's partly because the API has people sending all kinds of weird requests at it (varied spellings, capitalizations, etc.), which rarely hit the cache. Wikipedia serves around 60% of its pages through its Squids, so we have potential for even more savings as the web traffic catches up to the API traffic.
Tonight, I'll be moving on to setting up load-balancing to get our other web server into the party (this is a bit trickier than it sounds, so it might take a while). Then I'll upgrade the new web server with APC, like I did for the first web server yesterday. Once that setup is done, I'm going to be begging for a slashdotting just to see how well the servers fare against that kind of onslaught (we can take it!).
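One nice property of this architecture is that Squid itself can do the load-balancing: you just declare multiple backends and let it alternate between them. A sketch, again with placeholder addresses rather than our real ones:

```
http_port 80 accel vhost

# Two backend web servers; round-robin spreads cache misses across them
cache_peer 10.0.0.2 parent 80 0 no-query originserver round-robin name=web1
cache_peer 10.0.0.3 parent 80 0 no-query originserver round-robin name=web2
```

APC is a separate, per-web-server win: it caches the compiled PHP bytecode so each request skips re-parsing the scripts. It's enabled from php.ini with something along the lines of extension=apc.so plus a shared-memory size setting (e.g. apc.shm_size), though the exact values depend on the build.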