I was recently asked by my friends at 3musketeers for recommendations on how to serve and properly cache images stored in MongoDB to a mobile-oriented AngularJS application through a Grails server, and here is my answer. At some point this should become part of the Grails html5 mobile scaffolding plugin.


The first step is to create a simple, fairly generic endpoint that receives the id of the document as a URL parameter and simply writes the image as a byte array to the response output stream, after setting the Content-Type header to something like image/png or image/jpeg according to the actual type of the image. To make the endpoint more generic, the attribute holding the image in the object may be provided as a parameter too. There are of course other aspects to deal with, like authorization (who is allowed to download which images) and error cases (the object no longer existing, …).
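The shape of that endpoint can be sketched as a plain function, leaving the MongoDB lookup out. `buildImageResponse` and the `attribute` parameter are assumed names, not part of any framework:

```javascript
// Hypothetical sketch of the generic image endpoint logic.
// The actual MongoDB/GORM lookup that produces `doc` is omitted.
function buildImageResponse(doc, attribute) {
  const image = doc && doc[attribute];
  if (!image) {
    return { status: 404 };                         // object or image attribute missing
  }
  return {
    status: 200,
    headers: { 'Content-Type': image.contentType }, // e.g. image/png or image/jpeg
    body: image.bytes                               // the raw byte array
  };
}
```

The same function handles the error case: a missing object or attribute yields a 404 instead of a payload.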

Then in HTTP there are two main ways to cache things: ETag validation, and URL discrimination with long expiration. Both will use the version attribute added by GORM.

ETag validation

In this form URLs to the endpoint would look like GET /images/_id_ where _id_ is the value of the object’s _id attribute.

When a request comes to the endpoint, it first grabs the object from MongoDB using the provided identifier and checks the value of the If-None-Match request header. If the value is identical to the object’s version, it simply returns 304 Not Modified with no payload. If the value is different, it replies with the content and an additional ETag header set to the value of the object’s version. The next time the browser needs the image it will make a request including the If-None-Match header. If the image did not change, the server answers with an empty response, which is fast.
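That conditional flow can be sketched as follows; the function and attribute names are assumptions, and GORM’s version attribute serves as the ETag value:

```javascript
// Hedged sketch of ETag validation: compare the If-None-Match request
// header against the object's GORM version and short-circuit with 304.
function respondWithEtag(obj, ifNoneMatch) {
  const etag = '"' + obj.version + '"';  // ETag values are quoted strings
  if (ifNoneMatch === etag) {
    return { status: 304 };              // Not Modified, empty payload
  }
  return {
    status: 200,
    headers: { 'ETag': etag, 'Content-Type': obj.image.contentType },
    body: obj.image.bytes
  };
}
```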

The drawback of that solution is that the browser needs to make a conditional request every time it needs the resource; even though the response is small, this is not well suited to mobile applications that may be on a high-latency network or offline. To prevent that, an Expires header can be added to the response with a relatively small value, and the browser will not issue requests during this time. But it causes another drawback: for some time browsers may use outdated versions of the image.

Even worse, the src attribute of the image in the DOM will not change when the underlying AngularJS object changes, so if the object is displayed its image will not update immediately.

URL discrimination with long expiration

In this form URLs to the endpoint would look like GET /images/_id_/_version_ where _id_ is the value of the object’s _id attribute and _version_ is the version of the object generated by GORM.

The endpoint can simply ignore the value of _version_ (since there is no way to get the image for an older version of the object), grab the object from MongoDB using the provided identifier and reply with the payload, after setting the Expires header to a date far in the future, the recommended value being one year. Browsers will make no more requests for this version of the object, not waking up the power-consuming radio on a mobile phone. And if the object changes, the URL changes, forcing the browser to make a new request.
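A minimal sketch of that response, assuming the `_version_` path segment was already discarded and the object loaded (the Cache-Control header is not mentioned above, but it is the modern equivalent of Expires and is harmless to send alongside it):

```javascript
// Sketch of the long-expiration response: one year in the future,
// the recommended maximum for far-future expiry.
function longExpiryResponse(obj, now) {
  const oneYearSeconds = 365 * 24 * 60 * 60;
  return {
    status: 200,
    headers: {
      'Content-Type': obj.image.contentType,
      'Expires': new Date(now.getTime() + oneYearSeconds * 1000).toUTCString(),
      'Cache-Control': 'public, max-age=' + oneYearSeconds
    },
    body: obj.image.bytes
  };
}
```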

Besides this no-request-at-all behaviour, another good point is that if the AngularJS model is modified, the binding to the image src is magically updated, triggering an image reload since the URL changed. To make URL computation easier, a simple directive should do the job.
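Such a directive could look like the sketch below. The directive name `cachedImage`, the module name `app`, and the `imageUrl` helper are all assumptions; the Angular registration is guarded so the helper can also run outside a browser:

```javascript
// Hypothetical helper: build the version-discriminated image URL.
function imageUrl(obj) {
  return '/images/' + obj.id + '/' + obj.version;
}

// Sketch of an AngularJS directive that keeps an <img> src bound to
// the versioned URL, reloading the image whenever the version changes.
if (typeof angular !== 'undefined') {
  angular.module('app').directive('cachedImage', function () {
    return {
      restrict: 'A',
      scope: { cachedImage: '=' },   // the domain object carrying id and version
      link: function (scope, element) {
        scope.$watch('cachedImage.version', function () {
          element.attr('src', imageUrl(scope.cachedImage));
        });
      }
    };
  });
}
```

Usage would then be something like `<img cached-image="person">` in a template.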

The need to make a request every time the object changes is not optimal. A variation of this strategy is to add the sha256 sum of the image content as an additional attribute of the object, and use that to discriminate the URL. The browser would then not need to download an image that has not effectively changed, even if the containing object itself changes. To me this strategy would fully make sense in stores like CouchDB, where one would create a simple view indexed by object id exposing this hash. But with the stack required here it means creating an update hook to keep the hash current, and I find such approaches too fragile; I prefer wasting a few bytes on the network.


I think URL discrimination with long expiration is better suited to a mobile device that may go offline, and to the way model properties are bound to the DOM in single-page applications built with AngularJS. The ETag-based solution would only be preferable if computing URLs on the client side were hard, which should not be the case.