The Changelog

Open Source moves fast. Keep up.

Speed up AWS S3 by 2000x with this transparent proxy #

Amazon S3 works pretty well, is cheap, and is not too slow. It is employed as a blob store by so many companies that it’s practically the de facto solution. So anything that speeds up S3 could have a big impact, and that is exactly what MimicDB is trying to do.

By maintaining a transactional record of every API call to S3, MimicDB provides a local, isometric key-value store of data on S3. MimicDB stores everything except the contents of objects locally. Tasks like listing, searching and calculating storage usage on massive amounts of data are now fast and free.

The README says that, on average, tasks like these are 2000x faster using MimicDB. It also reduces the number of API calls to S3, which reduces your bill. If you use S3 heavily, MimicDB looks like it could be an interesting addition to your stack.
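MimicDB itself is a Python library, but the core trick is simple: shadow every S3 write in a local store so that metadata queries never leave the machine. Here is a hypothetical sketch in JavaScript (none of these names are MimicDB's API):

```javascript
// Hypothetical sketch: a local cache that shadows S3 metadata.
// Object bodies stay on S3; only keys and sizes are stored here.
class MetadataCache {
  constructor() {
    this.objects = new Map(); // key -> { size }
  }
  // Record every PUT made against S3.
  recordPut(key, size) {
    this.objects.set(key, { size: size });
  }
  // Listing is answered locally, with no S3 round trip.
  listKeys(prefix) {
    return [...this.objects.keys()].filter(function (k) {
      return k.indexOf(prefix) === 0;
    });
  }
  // Storage usage is a local sum, not a paged LIST over the API.
  totalBytes() {
    var sum = 0;
    this.objects.forEach(function (v) { sum += v.size; });
    return sum;
  }
}

var cache = new MetadataCache();
cache.recordPut('logs/2013-01-01.gz', 1024);
cache.recordPut('logs/2013-01-02.gz', 2048);
cache.recordPut('images/logo.png', 512);
console.log(cache.listKeys('logs/')); // the two log keys, no API call
console.log(cache.totalBytes());      // 3584
```

The real library hooks this bookkeeping into the S3 client itself, so the cache stays in sync transparently.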

s3_multipart: Multi-part file uploads straight to S3 for Rails #

There are some things every web application needs, and some it needs only occasionally. File uploads are closer to the first category than the second; thinking back, most of my apps needed them. Uploading big files, though, is a problem: if you’re building a twelve-factor-style app on something like Heroku, your web workers should have short timeouts, since most of your requests are served quickly. But a large file upload takes a long time. What to do?

Enter s3_multipart. From the README:

The S3 Multipart gem brings direct multipart uploading to S3 to Rails. Data is piped from the client straight to Amazon S3 and a server-side callback is run when the upload is complete.

Multipart uploading allows files to be split into many chunks and uploaded in parallel or succession (or both). This can result in dramatically increased upload speeds for the client and allows for the pausing and resuming of uploads.

Neat, eh? The README has more details on how to use the gem. It’s a bit involved: you need to set up CORS on your S3 bucket, run some generators, and write some JavaScript.

The gem is still young and looking for contributions. This is a tough problem, and having an easy way to solve it is great!
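The chunking the README describes is straightforward to picture. Here is a hypothetical sketch (not s3_multipart's code) of dividing a file's byte range into parts that could each be uploaded independently; the 5 MB floor comes from S3's multipart API, which requires every part except the last to be at least that size:

```javascript
// Hypothetical sketch: split a file of `total` bytes into part ranges
// that could each be uploaded to S3 on its own (in parallel or in order).
// S3's multipart API requires every part except the last to be >= 5 MB.
var MIN_PART_SIZE = 5 * 1024 * 1024;

function partRanges(total, partSize) {
  partSize = partSize || MIN_PART_SIZE;
  var ranges = [];
  for (var start = 0; start < total; start += partSize) {
    ranges.push({ start: start, end: Math.min(start + partSize, total) });
  }
  return ranges;
}

// A 12 MB file becomes three parts: 5 MB, 5 MB, and a 2 MB remainder.
console.log(partRanges(12 * 1024 * 1024).length); // 3
```

Each range maps to one part number in the multipart upload; S3 stitches the parts back together when the upload completes.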

fistface: Sinatra + S3 + Heroku = DIY @font-face web service #

For those looking to set their web sites free from the doldrums of boring web typography with the power of @font-face, many quickly discover that Firefox does not allow cross-domain font embedding. To get around the issue, you’ve got two choices: host the fonts on the same domain as your site, or set the Access-Control-Allow-Origin HTTP header. Since Amazon S3 doesn’t allow you to set this header, your dreams of using it as a poor man’s CDN for your fonts are dashed.

Until now. Thoughtbot has released Fistface, a small Sinatra app that lets you roll your own font service with S3 as your backend.

Assuming you’ve got an S3 account, create a bucket and optionally map a CNAME to it. Next, upload your fonts in the following folder structure:


Then install the fistface gem:

gem install fistface

Next, create a Rack rackup file, config.ru:

require 'rubygems'
require 'bundler'
require 'fistface' # load the FistFace app before running it

run FistFace

Fistface runs nicely on Heroku:

bundle install
git init
git add .
git commit -m "Creating a Fist Face instance"
heroku create
heroku config:add S3_URL=

Now you can embed your fonts with the following <link> tag:

<link href="" rel="stylesheet" type="text/css">

… and CSS:

@font-face {
  font-family: 'Chunk';
  font-weight: normal;
  font-style: normal;
  src: local('☺'), url('') format('truetype');
}

It’s not Typekit, but it’s yours. Need web fonts? Be sure to check out Font Squirrel, FontSpring, or the ever-growing Google Font Directory.

[Source on GitHub]

knox: Amazon S3 library for Node.js #

Some of the best open source software is a byproduct of great commercial software. Such is the case with most everything LearnBoost has given to the Node.js community. Their latest offering is Knox, TJ Holowaychuk’s Amazon S3 client.

Knox was built for Node 0.2.x and offers a Node-like, low-level http.Client API. To get started, install it via npm:

npm install knox

In your Node app, create a Knox client:

var knox = require('knox');

var client = knox.createClient({
    key: '<api-key-here>'
  , secret: '<secret-here>'
  , bucket: 'learnboost'
});

You can now upload something to your S3 bucket via an HTTP PUT:

var fs = require('fs');

fs.readFile('', function(err, buf){
  var req = client.put('/test/', {
      'Content-Length': buf.length
    , 'Content-Type': 'text/plain'
  });
  req.on('response', function(res){
    if (200 == res.statusCode) {
      console.log('saved to %s', req.url);
    }
  });
  req.end(buf); // the request isn't sent until the buffer is written
});

GETing that object again is just as simple:

client.get('/test/').on('response', function(res){
  res.on('data', function(chunk){
    console.log(chunk);
  });
}).end();

Be sure to check the README for more options.

[Source on GitHub]