This post originated from an RSS feed registered with Ruby Buzz
by Guy Naor.
Original Post: Serving Compressed Content from Amazon's S3
Feed Title: Famundo - The Dev Blog
Feed URL: http://devblog.famundo.com/xml/rss/feed.xml
Feed Description: A blog describing the development and related technologies involved in creating famundo.com - a family management system written using Ruby on Rails and Postgres
If you have yet to check out Amazon's S3 service, go do that now. I'll wait for you to come back! It's a very simple storage server that is also really, really cheap and high-performance. Backed by Amazon's network, it's also pretty reliable.
I am in the process of moving a lot of the static content of Famundo into the service. But one important requirement is serving JavaScript and CSS files compressed, as we have pretty big files for both.
By default S3 will serve the files as uncompressed text, increasing load time for clients, so a solution for serving compressed files was needed.
When saving files on S3 I can pass in HTTP headers that will be returned when the file is accessed. Using this along with pre-compressed JS and CSS files, we can have S3 serve the files compressed. The limitation here, as opposed to serving directly from a web server, is that there is no content negotiation - the files will always be served compressed. So if the clients accessing your files cannot accept compression, you are out of luck, and have to either serve the files uncompressed or serve them from your own content-negotiating server. In Famundo we target only newer browsers (Firefox, IE 6+ and Safari 1.2+), so pre-compressed files work for us.
Enough words, time for some code. I'm showing it using the Ruby AWS::S3 library, but you can use whichever library/language you want - the principle is the same. What we are doing is compressing the files, then uploading the compressed versions, and - the most important part - assigning the correct HTTP headers.
require 'rubygems'
require 'aws/s3'
require 'stringio'
require 'zlib'

AWS::S3::Base.establish_connection!(
  :access_key_id     => 'YOUR S3 ACCESS KEY',
  :secret_access_key => 'YOUR S3 SECRET KEY'
)

# Gzip the file into an in-memory buffer
strio = StringIO.open('', 'w')
gz = Zlib::GzipWriter.new(strio)
gz.write(File.read('test.css')) # Here is the file on your file system
gz.close

# 'test.css' is the name of the file on the S3 system, and 'the_bucket' is the bucket name.
# Note the use of "Content-Encoding" => 'gzip' - this is what makes it all work.
AWS::S3::S3Object.store('test.css', strio.string, 'the_bucket',
  :access => :public_read,
  "Content-Encoding" => 'gzip')
You can now open it in your browser: http://s3.amazonaws.com/the_bucket/test.css and get it as CSS, though it was transferred compressed.
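If you want to convince yourself the round trip is lossless without hitting S3 at all, you can simulate locally what the browser does on receipt: compress a string exactly the way the upload code does, then decompress it with Zlib::GzipReader. This is just a sanity-check sketch; the CSS string is a made-up placeholder.

```ruby
require 'stringio'
require 'zlib'

css = "body { color: #333; margin: 0; }\n" * 100

# Compress the same way the upload code does
strio = StringIO.open('', 'w')
gz = Zlib::GzipWriter.new(strio)
gz.write(css)
gz.close
compressed = strio.string

# This is what a gzip-capable browser does transparently
restored = Zlib::GzipReader.new(StringIO.new(compressed)).read

puts "original:   #{css.bytesize} bytes"
puts "compressed: #{compressed.bytesize} bytes"
puts "round-trip ok: #{restored == css}"
```

Repetitive text like CSS compresses dramatically, which is exactly why pre-compressing big stylesheets and scripts pays off.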
You could do the same with static HTML files as well. Don't do it with images, as formats like JPEG and PNG are already compressed and there is nothing to gain.
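A quick way to see why gzipping images is pointless: compare how well gzip does on repetitive text versus incompressible bytes. Random data stands in here for already-compressed image data (a hypothetical example, not real image files); gzip can't shrink it and its header overhead can even make it slightly bigger.

```ruby
require 'zlib'

def gzip_size(data)
  Zlib.gzip(data).bytesize
end

text   = "body { margin: 0; padding: 0; }\n" * 200 # repetitive, CSS-like text
random = Random.new(42).bytes(text.bytesize)       # stands in for image data

puts "text:   #{text.bytesize} -> #{gzip_size(text)} bytes"
puts "random: #{random.bytesize} -> #{gzip_size(random)} bytes"
```

The text collapses to a small fraction of its size, while the random bytes stay essentially the same - so for images you would pay the CPU cost of compressing with no bandwidth savings.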
If there is anything else you would like to learn how to do with S3 (virtual serving, metadata manipulation, etc...), let me know.