March 14, 2005

Compressing your CSS with PHP

Web designers, bloggers, htmly folk: you can reduce the transfer size of your CSS file by up to 75% using the methods outlined in this article. Results may vary, but it looks like a neat hack.
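
For anyone curious what the trick actually looks like: the usual approach is a small PHP wrapper around the stylesheet, something like the sketch below. The file names and the one-week cache lifetime here are my own illustration, not necessarily what the article uses.

    <?php
    // css.php -- serve styles.css gzipped, with caching headers.
    // ob_gzhandler only compresses when the browser sends
    // Accept-Encoding: gzip (and requires PHP's zlib extension).
    ob_start('ob_gzhandler');
    header('Content-Type: text/css');
    // Cache for a week so repeat visitors skip the download entirely.
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 7 * 24 * 3600) . ' GMT');
    readfile('styles.css');
    ?>

You would then point your stylesheet link tag at css.php instead of styles.css.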

I won't be using this until I get my blog back up & running, which will probably be later this century, but it certainly looks like a good thing to have in your arsenal of techniques.

  • thanks Doris, CSS is the throbbing center of my "professional" life, this is great info!
  • Medusa...what might be the "throbbing center" of your "unprofessional life"...just wondered...... :) /note to self: avoid the use of the term "throbbing center".... check
  • [this is a hack] A much better strategy from an engineering perspective is to use Apache's mod_deflate filter; most modern browsers already understand gzipped responses. It's absolutely nonsensical to conflate the transport and presentation layers using PHP, especially when there are better solutions to the problem (see the config sketch at the end of this thread). PS: 10KB is hardly worth compressing.
  • What if you aren't using Apache or the modules? Anything that can be compressed is worth compressing. IMHO.
  • At Swim Two Birds: What if you aren't using Apache or the modules?
    Then that, in a nutshell, is the problem.
  • PS: 10KB is hardly worth compressing.
    ...plus the fact that linked CSS files don't change (much) and are cached by the browser. So you're talking about a one-time savings of maybe 5KB, and introducing several layers of complexity that can break. Cool hack, but the cost/benefit just isn't worth it for a live site.
  • I'll second fuyu. There are very few good reasons to use anything but Apache, and mod_gzip (1.3.x) or mod_deflate (2.0.x) are the best ways to accomplish this, since you'll get compression on all your text (HTML, etc.). wanderingstan: CSS files are among the most requested files on most web sites I've had access to the logs for, large and small. Browsers don't seem to do a very effective job of caching them.
  • Hey all, some good conversation going on here... @fuyugare: If you read the latest version of the article (and the original, I believe) you will see that I proposed this for folks who do not have access to either of the Apache compression modules. As for CSS file sizes, have a look at this. I used 10KB in the gzip article as it's a nice round number, but work on any site that has several different layout configurations and some other complex CSS going on, and you're talking about numbers much greater than 10KB. In fact at 40KB, you're talking about a savings of roughly 30KB. That's a significant number on a first hit to a site. @rodgerd: I'd be quite interested in hearing any numbers you may have wrt browsers *not* caching CSS.
  • MikeP: If you read the latest version of the article (and the original, I believe) you will see that I proposed this for folks who do not have access to either of the apache compression modules.
    Ah, I did miss that note for some reason. Sorry! I continue to be unconvinced that this is worth the maintenance hassle. (Sure, you can trivially autogenerate the .php from the .css, but that's not what I'm talking about.)
    First of all, 40KB of text is roughly one fourth of Romeo and Juliet, which I can get in 9.46 seconds (the bottleneck is Gutenberg's server, as we're both on Abilene). So we're talking about 2.4 seconds for the hypothetical 40KB CSS, which I would claim is a pretty reasonable load time for a website. And this number is highly pessimal, as the file was served by FTP, which has a huge setup/teardown cost. Over HTTP, from a well-designed site like stopdesign.com, I could have gotten the same data in 0.7 seconds uncompressed (see below).
    Second, let's look at some compression numbers. One of the larger CSS files from stopdesign.com clocks in at 23KB uncompressed, and between 4.2KB (bzip2) and 4.8KB (gzip -9) compressed. This gives a compression ratio of 80%, which is admittedly impressive. However, note that the HTML on the stopdesign.com site itself is about 20KB (compressed: 6KB). Compressing just the CSS and not the HTML gives a total compression ratio of merely 42% instead of the achievable 74%: you send 20 + 4.8 = 24.8KB of the original 43KB, versus 6 + 4.8 = 10.8KB if you compress both.
    Third, the logo image on stopdesign is 39KB. The hypothetical user who wants to wring every last bit of benefit from smaller payloads, but strangely doesn't use Apache (or doesn't have access to its configuration), should figure out how to enable deflate compression on their preferred webserver (or bother their admin).
  • Thank you, fuyugare. This hack is total rice.
  • Hey fuyugare, WRT #1 - fine, but I can notice a difference on a couple of sites that I work on when sending the CSS compressed vs. uncompressed. Certainly it is marginal, but I'm on a 512K line; I can imagine that the sites' users on dial-up (and there are quite a few) would notice it even more. #2 - Whether Doug compresses his HTML or not wasn't the point - I'm sure if he was worried about this stuff he'd compress the HTML as well. #3 - Same as #2. If he was worried about it... #4 - Not always possible, I'm afraid. I agree with you that this isn't for everyone - I started using this for a client whose area is highly competitive and whose competitors' sites tended to be heavy - the extra snap in their site was intended to keep users on site, happy, etc.
  • Don't be discouraged. All data must be compressed. This is the future of information transmission, the future of physics. For that reason, this technique is valuable, even if some consider the gain infinitesimal. I'm sure the first developers of the internal combustion engine were told 'hey, there's no point to this, it's too much effort. I have 2 horses, and they eat grass. I can get to Coventry in three days, this thing will only save me a few hours.'
  • MikeP, you have successfully missed every one of my points. I'll note them again: 1. 40KB is not worth compressing. 2. If you must compress, compress all text. 3. The largest components in a web page, by size, are not textual, and generally cannot be compressed.
    I'll note that I think there might be a really cool trick hidden behind this silly hack. Straight-up compression is a solved problem, but how else can we exploit the idea of marshalling data in an executable format? So far we have seen some ad-hoc applications of this scheme: obfuscating e-mail addresses using Javascript, for example. However, the kind of trick I'm envisioning is along the following lines: take the static parts of a dynamic page and store them in a compressed, cacheable form. Then send only the dynamic portion of the page, but include some DOM-walking Javascript that marries the static and dynamic parts on the client. (This Javascript code can itself be compressed and cached, because it is highly unlikely that the templating scheme for the dynamic page changes too often.)
    Here's another possibility. There are parts of a dynamic page that are already cached: images, CSS, etc. Why not also cache the static markup and text in a dynamic page? One simple way to do this is to just send diffs. This is already done for other content; the MPEG video format, for instance, is essentially a stream of diffs to frames sent earlier.
    A third possibility: with Wiki pages, the part of the wiki that transforms the Wiki syntax into HTML is usually the simplest part of the Wiki software. It can usually be reduced to a collection of regular expressions. Even in a non-Wiki context we have "simplified markup" schemata like Markdown. Raw HTML is generally far more verbose than these formats. Why not move the HTML generation portion to the client? This could be a huge win for sites like Wikipedia, where a large portion of server time is devoted to converting Wiki syntax to HTML. Wikimedia solved it by aggressively caching the results of these conversions, but I wonder if they ignored a much simpler alternative.
  • fuyugare, I did get what you were saying, but really #2 and #3 do not refute anything that I wrote about, and therefore, while good points, are moot to this conversation. WRT #1... well, I get that too, however I can notice a difference at 512K when observing how a page paints (albeit very slight), so someone on dial-up would probably notice an extra 30 kilobytes. I agree that, as you said, this is a hack, that Apache's modules would be the best way, and that 10KB isn't much to worry about. In the end, for you, 30KB isn't worth it either, and with that I disagree. Simple as that ;-)
  • Oops: as you said...
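
For reference, here's roughly what the Apache route recommended above looks like. This is a sketch for Apache 2.0.x with mod_deflate, and it assumes the module is actually loaded on your server (or that your host allows these directives in .htaccess); for Apache 1.3.x you'd use mod_gzip instead.

    # .htaccess sketch: compress all textual responses with mod_deflate
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css text/plain
    </IfModule>

The <IfModule> guard just keeps the directives from breaking the server when the module isn't present; it doesn't install the module for you, which is exactly the situation the PHP hack is aimed at.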