Ruby Buzz Forum
Serving Compressed Content from Amazon's S3

Guy Naor

Posts: 104
Nickname: familyguy
Registered: Mar, 2006

Guy Naor is one of the founders of famundo.com and a long-time developer.
Serving Compressed Content from Amazon's S3 Posted: Mar 5, 2007 9:56 AM

This post originated from an RSS feed registered with Ruby Buzz by Guy Naor.
Original Post: Serving Compressed Content from Amazon's S3
Feed Title: Famundo - The Dev Blog
Feed URL: http://devblog.famundo.com/xml/rss/feed.xml
Feed Description: A blog describing the development and related technologies involved in creating famundo.com - a family management system written using Ruby on Rails and Postgres

If you have yet to check out Amazon's S3 service, go do that now. I'll wait for you to come back! It's a very simple storage service that is also really cheap and high-performance. Backed by Amazon's network, it's also pretty reliable.

I am in the process of moving a lot of Famundo's static content into the service. One important requirement is serving JavaScript and CSS files compressed, as we have pretty big files of both.

By default S3 serves the files as uncompressed text, increasing load times for clients, so a solution for serving compressed files was needed.

When saving files on S3 I can pass in HTTP headers that will be returned when the file is accessed. Using this along with pre-compressed JS and CSS files, we can have S3 serve the files compressed. The limitation, as opposed to serving them directly from a web server, is that there is no content negotiation: the files will always be served compressed. So if the clients accessing your files cannot accept gzip, you are out of luck and have to either serve them uncompressed or serve them from your own content-negotiating server. At Famundo we target only newer browsers (Firefox, IE 6+ and Safari 1.2+), so pre-compressed files work for us.
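If you do need a fallback for clients without gzip support, one option is to decide on your own server which URL to hand out. Here is a minimal sketch, assuming a Rails controller or helper where the request object is available; the method name and URLs are hypothetical:

# A minimal sketch: hand out the pre-compressed S3 copy only to clients
# that advertise gzip support in their Accept-Encoding header.
def css_url
  accept_encoding = request.env['HTTP_ACCEPT_ENCODING'].to_s
  if accept_encoding.include?('gzip')
    'http://s3.amazonaws.com/the_bucket/test.css'  # pre-compressed copy on S3
  else
    '/stylesheets/test.css'                        # plain copy from our own server
  end
end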

Now, on to the main code. I'm showing it using the Ruby AWS::S3 library, but you can use whichever library and language you want; the principle is the same. We compress the file, upload the compressed data, and - the most important part - assign the correct HTTP headers to it.

require 'rubygems'
require 'aws/s3'
require 'stringio'
require 'zlib'

AWS::S3::Base.establish_connection!(
  :access_key_id     => 'YOUR S3 ACCESS KEY',
  :secret_access_key => 'YOUR S3 SECRET KEY')

# Gzip the local file into an in-memory buffer
strio = StringIO.new
gz = Zlib::GzipWriter.new(strio)
gz.write(File.read('test.css')) # Here is the file on your file system
gz.close

# 'test.css' is the key of the file on S3, and 'the_bucket' is the bucket name.
# Note the "Content-Encoding" => 'gzip' header - this is what makes it all work.
AWS::S3::S3Object.store('test.css', strio.string, 'the_bucket',
  :access            => :public_read,
  "Content-Encoding" => 'gzip')

You can now open http://s3.amazonaws.com/the_bucket/test.css in your browser and get it as CSS, even though it was transferred compressed.
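To check the headers without a browser, a few lines with Ruby's standard Net::HTTP will do (a minimal sketch, using the bucket and key from above):

require 'net/http'
require 'uri'

# Fetch the object and inspect the headers S3 sends back
url = URI.parse('http://s3.amazonaws.com/the_bucket/test.css')
response = Net::HTTP.get_response(url)
puts response['Content-Encoding'] # => "gzip" if the upload worked
puts response['Content-Type']     # the content type stored with the object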

You could do the same with static HTML files as well. Don't do it with images, though, as there is nothing to gain by compressing them.
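For example, the same few lines can pre-compress a whole list of text assets in one go (a sketch building on the connection and requires above; the file names are placeholders):

# Pre-compress and upload several text assets; images would be stored as-is
%w(test.css application.js index.html).each do |name|
  gzipped = StringIO.new
  gz = Zlib::GzipWriter.new(gzipped)
  gz.write(File.read(name))
  gz.close
  AWS::S3::S3Object.store(name, gzipped.string, 'the_bucket',
    :access            => :public_read,
    "Content-Encoding" => 'gzip')
end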

If there is anything else you would like to learn how to do with S3 (virtual serving, metadata manipulation, etc.), let me know.


