Disgusted with the way Firefox was constantly forgetting my login credentials with WordPress, I suspected the issue might have something to do with the way my caching plugin, Quick Cache, was working. It didn’t handle browser-side header expiration or GZIP compression through its control panel, for example, and I suspected the overly broad rules I’d defined in my .htaccess file had something to do with it. So I figured it was worth trying another plugin. Notes follow.
Though I’ve had bad luck with W3 Total Cache in the past, I figured enough time had passed that I could give it another try. After all, Quick Cache hasn’t been updated since 2011, and W3 Total Cache (henceforth referred to as W3TC) was updated a few months ago. Plus, people rave about W3TC, and big hosts like Media Temple recommend it without reservation. What could go wrong?
Well, the first bit of news is that W3TC is not, as of version 0.9.2.5, exactly “network friendly” as its FAQ implies:
Does this plugin work with WordPress in network mode? Indeed it does.
It does work, but it isn’t designed for efficient administration of a WordPress Network. If you’re not familiar with WordPress networks: a network is a collection of individual blogs running off the same installation of WordPress. It used to be a separate version of WordPress called WordPress MU, but in version 3.0 it was rolled into the main release as Multisite.
My website here at davidseah.com is running in Network mode, which means I can give every sub-blog its own theme and custom behaviors. It seemed like a good idea at the time to switch from single-blog to network mode, giving me a lot of flexibility (theoretically) in starting whole new blogs instead of trying to factor new content into the old hierarchy, which is a big mess. With Network mode, I can update a single WordPress installation and have every blog update. Otherwise, I’d have to update each one individually, which I found to be a pain in the butt. ONE INSTALLATION TO RULE THEM ALL.
Anyway, when you start running a complex WordPress installation, you start to hit limits with your server. I’m running a Media Temple (dv) 4, which is capable of handling maybe 1000-2000 page views per hour without choking. Right now, I get a maximum of 500 page views/hour on a normal day, and the server keeps running without problems (I should note that there is another site on it that gets similar levels of traffic, so I’m seeing around 1000 page views/hour at peak). However, in the event that a page on my website becomes very popular, the server will quickly start to slow down. Currently, it can handle perhaps 20 simultaneous page requests (each consisting of several complicated database transactions) before it starts to run out of working memory and gets really slow (over 10 seconds to return a web page). With a caching plugin, we can store the result of one request and serve it again to the next requester. So long as the requests are exactly the same, we save a lot of server computation and memory usage, which increases both speed AND capacity.
Two Levels of Page Caching
There are a number of ways a WordPress caching plugin does this. One basic way is to have the plugin intercept the output of a certain request, say the URL https://davidseah.com/compact-calendar, and save it to a file somewhere. The computational cost of rendering that page is substantial; it takes WordPress several seconds to generate and deliver it from scratch. Loading a file and returning it, though, is way faster. PHP, the server-side language that WordPress is written in, is capable of doing this, and that’s what W3TC calls basic disk caching. It is the lowest common denominator of page caching: if WordPress works, then this probably will work too.
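To make the idea concrete, here is a bare-bones sketch in PHP of what disk-based page caching boils down to. This is purely illustrative, not W3TC’s actual code; the cache directory, key scheme, and one-hour lifetime are arbitrary choices for the example, and the real plugin’s invalidation logic is far more involved.

    <?php
    // Bare-bones sketch of disk-based page caching (illustrative only, not W3TC's code).
    // The cache directory, key scheme, and one-hour lifetime are arbitrary choices.
    $cacheDir  = __DIR__ . '/cache';
    $cacheFile = $cacheDir . '/' . md5($_SERVER['REQUEST_URI']) . '.html';

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < 3600) {
        // Cache hit: serve the saved page and skip rendering entirely.
        readfile($cacheFile);
        exit;
    }

    // Cache miss: capture whatever WordPress renders and save it for next time.
    ob_start(function ($html) use ($cacheDir, $cacheFile) {
        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0755, true);
        }
        file_put_contents($cacheFile, $html);
        return $html; // still send the page to this visitor
    });
    // ... WordPress loads and renders the page here; the buffer callback runs at shutdown.

Even something this crude turns a multi-second render into a single file read, which is the whole point.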
While this speeds things up tremendously, there’s still a problem: WordPress needs to load and run to deliver the saved page from the cache. This means the webserver needs to initialize PHP each time, load WordPress into it, and then execute the caching code. Each instance of PHP takes up a minimum of 32MB of memory and a bit of time to set up. What would be more awesome is if the webserver (itself a program) could find that cache file itself and return it without loading PHP at all. This is what W3TC calls enhanced disk page caching, and it works by adding a chunk of code to your website’s .htaccess file. The code, contained in a mod_rewrite block, basically checks for the existence of a file in the cache directory that corresponds to the incoming URL. If W3TC has generated that file, the webserver returns it and the request is completed. If the file doesn’t exist, that means the URL hasn’t been visited before, so WordPress is loaded to generate the web page, with W3TC saving the output for later use in the cache directory. Most of the time, PHP is not loaded at all, saving megabytes and megabytes of memory and plenty of time. The tricky part is detecting when to update the cached file, because new posts change the content of the website. For example, the home page of davidseah.com contains the last few blog posts I’ve published. If I add a new post and W3TC doesn’t change the content of the cached file for the home page, then no one will see the update. Plus, caching isn’t desirable when you’re using the admin pages or submitting forms; it screws stuff up. So a lot of the complexity of a WordPress cache plugin comes from reliably detecting these special cases.
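For the curious, the rules dropped into .htaccess look roughly like the following. This is a simplified sketch rather than W3TC’s exact output (the real rules also handle mobile user agents, SSL, trailing slashes, and more), and the cache path shown is an assumption for illustration.

    # Simplified sketch of enhanced disk caching rules (not W3TC's exact output).
    # Skip POSTs, query strings, and logged-in users; otherwise serve the cached
    # file directly if it exists, without ever starting PHP.
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_METHOD} !=POST
        RewriteCond %{QUERY_STRING} =""
        RewriteCond %{HTTP_COOKIE} !(wordpress_logged_in|comment_author) [NC]
        RewriteCond %{DOCUMENT_ROOT}/wp-content/cache/page/%{HTTP_HOST}%{REQUEST_URI}_index.html -f
        RewriteRule .* /wp-content/cache/page/%{HTTP_HOST}%{REQUEST_URI}_index.html [L]
    </IfModule>

If any condition fails, the request falls through to WordPress as usual, and the plugin writes the freshly rendered page into the cache directory for next time.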
In addition to the page caching described above, W3TC can also handle other website optimizations. One of them is browser-side caching, which tells your browser that certain files, say the logo image, change very rarely, so it shouldn’t even bother to re-request them; it can just use the copies it already has. This is done by sending special control headers as part of the response to a page request, tagging each image and file appropriately. There’s also compression, which crunches long text files down into a compact binary representation, often saving 50-80% of the space. Computers these days are faster at uncompressing files than at transmitting them over the Internet, so that’s another way you can save. Still another approach is packing, where a bunch of small files are combined into one large file, sent, and then unpacked. It takes a fixed amount of time to initiate each file transfer, and only a small number of transfers are allowed at the same time, so packing files can save huge amounts of time.
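As a point of reference, the browser-caching part of this comes down to .htaccess directives along these lines. This is a hand-rolled mod_expires sketch, not what W3TC actually generates (its version covers dozens of file types), and the lifetimes are just example values.

    # Hand-rolled sketch of browser-side expiration headers (not W3TC's output).
    # Static assets get long lifetimes; HTML stays fresh so new posts show up.
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png              "access plus 1 month"
        ExpiresByType image/jpeg             "access plus 1 month"
        ExpiresByType text/css               "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
        ExpiresByType text/html              "access plus 0 seconds"
    </IfModule>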
My WordPress Network Experience with W3TC
W3TC, I imagine, works great on a single-blog installation. It has several problems on Network installations.
- You need to activate the plugin on every blog. You can’t activate it for the entire network. This is a pain in the butt.
- Enhanced Disk Mode triggers an error. In truth, it seems to work in my testing if I look at what the browser is actually getting (hint: view source and look at the bottom of the page). I spent a few hours trying to debug this before finally deciding that it was probably a bug in the plugin itself.
There are a few other problems I’ve had with it:
- Minify options break. This might be an issue with the plugin in general, but every time I’ve enabled it, the site has broken in peculiar ways. It has always broken on my particular website for as long as W3TC has been around, and I don’t have the patience to debug it, so I avoid it.
- Object Caching is a no-go with FastCGI. PHP code can be executed by the webserver (Apache, in my case) in a couple of ways: as a module built into Apache, or invoked through an external PHP interpreter. For security reasons, my server runs PHP the latter way, using something called “FastCGI”. For speed, multiple instances of PHP are spawned so they can run simultaneously under their own “user”. When they are done executing their code, they die off to reclaim memory. The problem? There’s no way to share memory between these instances, which makes this kind of PHP acceleration infeasible. If you’re not running PHP as an Apache module, forget about it. This isn’t an issue with W3TC itself; I’m just saying I can’t use it on my setup.
- Database Caching. I haven’t tried this, actually. It gives me the willies because some plugins do scary things to intercept database calls to substitute their own mojo.
There are also a few awesome things that work well:
- Enabling GZIP Compression. This is the compression + packing advantage. W3TC takes care of this for you.
- Enabling Expiration Headers. W3TC handles the complex setup for dozens of files and filetypes, so you don’t have to. Thank the heavens!
And there are a few weird things too:
- Setup requires that you understand that the configuration settings come in two flavors: preview and deploy. The UI is set up in a way that makes you think “preview” would work like preview everywhere else: showing you a one-time view of your page without having to save your settings permanently and perhaps screwing things up. Instead, W3TC uses the word “preview” to mean both the preview mode and the preview action, and the preview action only works in preview mode. Get it? No? Just remember to click DISABLE in General Settings at the very top. This disables PREVIEW MODE, not the CACHING, which you set individually lower on the page.
- Detailed settings are available if you click through to them from the Performance tab. At first I thought these were the same settings you see when you first click on Performance, but they are actually different.
Despite Confusion, It Seems to Work
In summary, some of W3TC’s features are unusable on my setup, and the rest are confusing at times. Although I’m only using Page Caching with HTML Expiration and GZIP Compression, that is doing exactly what I need. The errors and at-times confusing interface are not confidence-building, but after spending time with it, I’d say it is probably doing a more comprehensive job than Quick Cache. In Quick Cache’s favor, it works with WordPress Network without complaining. There’s another cache plugin called WP Super Cache that I used a few years ago, but I dropped it when an update to it seemed to be causing problems. That’s when I switched to Quick Cache.
Followup: I dumped W3TC and am now using Super Cache
After living with W3TC for maybe a month, I nuked it from my system. As I said earlier, it’s probably fine for a large but simple WordPress install with a relatively unchanging structure. My website, though, changes quite a bit: I’m constantly adding bits of code or refreshing images. W3TC does not deal well with this, and visiting every single blog in my network to flush the cache is a pain. The advanced caching features for browser caching, minification, and compression have been more annoying than helpful. I ended up disabling browser caching (a nightmare if you are changing any Javascript, CSS, or underlying WordPress template). While it is feasible to configure W3TC precisely to tune the setup, it is only slightly less fun than using Facebook’s Privacy Controls.
I’m now back to Super Cache. The main problem I have with it is that its cache refresh is unreliable. However, it’s relatively easy to just go into the Content settings and delete the entire cache at once. It doesn’t do browser-based caching, which in my case is a nice bonus. For GZIP compression, I followed the instructions from the WordPress Codex.
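For what it’s worth, the Codex approach amounts to turning on mod_deflate for text-based responses in .htaccess; the result looks something like this generic sketch (not the Codex’s exact snippet, and your host’s available modules may differ):

    # Generic sketch of GZIP compression via mod_deflate (not the Codex's exact
    # snippet). Compress text-based responses; images are already compressed.
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
        AddOutputFilterByType DEFLATE application/javascript application/json
    </IfModule>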
While this setup isn’t the absolute fastest I could get, it’s fast enough. Google’s PageSpeed Insights test gives me an 89/100. Response time is generally under 3 seconds. A good portion of that is due to dynamic scripts (Facebook, for example) that run after the page has rendered its main content, so the effective page-load speed is between 0.5 and 1.5 seconds. It would be nice to combine some of those external CSS and JS files into fewer files, but it’s a relatively minor gain compared to the others.
3 Comments
Hey David. Speaking as a long-time user of W3 Total Cache, I can tell you that it has improved immensely over the last few years. I use Nginx instead of Apache and the W3TC crew have worked very hard to help autogenerate rules to accommodate this.
I’m very happy with how the caching works – although the one thing they still haven’t resolved is how to deal with Minify. It’s always been incredibly buggy for me.
I’m running a multisite setup and a few single instance setups (server separation).
I love super cache.
I recently decided to give w3tc a try since it has slick looking CDN setups and I wanted to geek out on those.
Less than a day later I uninstalled w3tc and put super cache back on (and decided to try cdn another way).
Reasons:
- Caching bot page requests by default, which led to regular visitors getting shown CSS-less pages. I didn’t want to try to figure out where to fix that.
- Minify never seemed to work, and it kept trying to hit my SSL connection (I have SSL set up on the backend with a non-SSL front end), and it wouldn’t stop displaying the error even after disabling minify.
- CDN calls were consistently dropped for random files. Sometimes it just wouldn’t call for the file. This is an issue if you’ve uploaded your wp-includes files.
- The plugin cached its own admin pages to its own detriment. I was often redisplayed an admin page after updating something, only to have the previous setting displayed. Once I hard-refreshed the page, my saved version was then displayed. I doubt beginner users would even put up with it.
- Some other non-code-related issues like UX, options layout, and nag messages.
Thanks a lot for this insightful post! W3TC turned out to be a pain in the ass after I recently switched to a multisite setup. I wholeheartedly agree with your conclusion/follow-up, and I have switched to Super Cache now (and added the GZIP compression via .htaccess according to your advice).