Say you’ve got a few app servers and you want to serve up some largish files (e.g. PDFs) from your Rails app behind a login screen. You could put them on S3 and redirect the user there with expiring links. However, the URL the user ends up with in their browser would be an S3 URL, and if that link (copied from the address bar) gets shared around, it’ll work for non-authorised people for a little while. Then, once the link expires, the recipients will never get to see your site/product (maybe they’d want to register/subscribe); instead they’ll just get a yucky XML error from the S3 API, and nowhere to go.

Well, where could we put the files? How about using the file system on the app servers?

Two things to solve: how to efficiently ship the files to the app servers, and how to serve them without tying up expensive Ruby processes.

1. Shipping the files to the app servers with Capistrano and Rsync

If you’re using Docker or similar, you might want to bake the files into the images, if they aren’t too huge.
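For example, a single line in the Dockerfile would do it (the paths here are illustrative):

  # Bake the PDFs into the image alongside the app
  COPY pdfs/ /var/www/myapp/pdfs/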

Assuming you’ve got a more classic style of deploy, though...

Welcome, old friends Capistrano and Rsync. With Rsync we can minimise the time and data spent sending files, since it transfers binary diffs. With Capistrano we can run the transfers to all the app servers. Here are the tasks I put together, sketched below; the deploy_pdfs task will even set up the shared directory for us.
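Something like this (a sketch, assuming Capistrano 3 with rsync installed locally and on the servers; the role name is illustrative, and the transfers run one server at a time here for simplicity):

  # lib/capistrano/tasks/deploy_pdfs.rake
  desc 'Rsync the local pdfs directory to the shared path on each app server'
  task :deploy_pdfs do
    # Make sure shared/pdfs exists on every app server first
    on roles(:app) do
      execute :mkdir, '-p', "#{shared_path}/pdfs"
    end

    # Rsync from the local ./pdfs directory; rsync's delta transfer
    # means only the changed parts of each file are re-sent
    pdfs_path = "#{fetch(:deploy_to)}/shared/pdfs/"
    hosts = roles(:app)
    run_locally do
      hosts.each do |host|
        execute :rsync, '-az', 'pdfs/', "#{host.hostname}:#{pdfs_path}"
      end
    end
  end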

We’re sticking the files into the ‘shared/pdfs’ directory created by Capistrano on the app servers. Locally, they sit in a ‘pdfs’ directory in the root of the Rails app. This might seem inconsistent (and it is), but the reason comes down to limitations/security restrictions with X-Sendfile, covered below.

2. Serving the files with X-Sendfile/Apache to let Rails get on with other things

So Rails provides a helpful send_file method. Awesome! Just password protect a controller action and then send_file away! Oh, but wait: that will tie up our expensive/heavy Ruby processes sending files. Fine for dev, but not so great for production. The good news is we can hand this work off to our lightweight Apache/nginx processes. Since I use Apache/Ubuntu, that’s what I’ll cover here, but the nginx setup is similar. Using the X-Sendfile header, our Rails app can tell the web server the path of a file to send to the client.
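To illustrate the mechanism: instead of the file’s bytes, the Rails response carries a header naming a file on disk; Apache (with mod_xsendfile) strips that header and streams the file itself. Roughly (path illustrative):

  HTTP/1.1 200 OK
  Content-Type: application/pdf
  Content-Disposition: attachment; filename="report.pdf"
  X-Sendfile: /var/www/myapp/shared/pdfs/report.pdf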

How to set up Apache & Rails for X-Sendfile

Ok let’s get Apache rolling:

You need to whitelist the path that files can be sent from, and it can’t be a soft link; it needs to be an absolute path on disk. Hence we’re using the ‘shared’ directory Capistrano creates, rather than the soft-linked directory in ‘current’. The X-Sendfile header itself lets you send just a filename without a path (Apache looks for the file in the whitelisted path), but unfortunately we can’t use that, as Rails’ send_file checks that the path exists and raises if it can’t find the file.
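A minimal sketch of the Apache side, assuming mod_xsendfile is installed (on Ubuntu, the libapache2-mod-xsendfile package) and an illustrative deploy path of /var/www/myapp; this goes in the app’s VirtualHost:

  # Allow handing responses off via the X-Sendfile header
  XSendFile on
  # Whitelist the real (non-symlinked) absolute path files may be sent from
  XSendFilePath /var/www/myapp/shared/pdfs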

In your Rails app, add this to production.rb:

  # Hand off to apache to send big files
  config.action_dispatch.x_sendfile_header = 'X-Sendfile'

In development you probably don’t need this, since you won’t be running a server that supports X-Sendfile. Without this config, Rails will just read the file from disk and send it itself.

In a controller somewhere, just use send_file to hand off to Apache. You’ll need to specify the absolute path to the file in the ‘shared’ directory. I’d suggest putting the path to the shared directory in an environment variable or config file (however you usually do this per environment for your app), and then appending the relevant filename to it. Also, remember to validate the requested filename (I use a whitelist of filenames to be sure), so malicious requests can’t get private files sent from elsewhere on disk.
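Here’s a sketch of that controller action (names are illustrative: PDFS_PATH is assumed to be set per environment, require_login is whatever your auth check is, and the whitelist is just an example):

  class DownloadsController < ApplicationController
    before_action :require_login

    # Whitelist of the filenames we're prepared to serve
    ALLOWED_PDFS = %w[brochure.pdf user-guide.pdf].freeze

    def show
      filename = params[:filename]
      return head :not_found unless ALLOWED_PDFS.include?(filename)

      # Absolute path into capistrano's shared directory, e.g.
      # PDFS_PATH=/var/www/myapp/shared/pdfs
      send_file File.join(ENV.fetch('PDFS_PATH'), filename),
                type: 'application/pdf',
                disposition: 'attachment'
    end
  end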