I chose to use the easy-thumbnails app with S3, made possible by the django-storages framework. At first glance everything worked fine; thumbnails were created and rendered correctly. However, on a thumbnail-heavy page, I (and my profiling app) noticed significant load times (3s for a page with 7 thumbnails).
Drilling down, it became evident that something was checking the S3 bucket for each thumbnail (an httplib call was made for each one).
TL;DR: If you use the default ImageField, the thumbnail storage object is created by the Thumbnailer framework. Each time the storage object is created, it checks for the bucket's existence, which means a query to S3. Lots of images means lots of queries to S3, each one checking whether the bucket exists.
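The cost pattern can be illustrated with a toy sketch (this is not the real easy-thumbnails or django-storages code; the class and counter are stand-ins for a storage backend whose constructor performs an existence check):

```python
# Toy illustration: a storage whose constructor checks the bucket,
# mimicking what the S3 storage backend does on instantiation.
class FakeS3Storage:
    bucket_checks = 0  # class-level counter standing in for S3 round-trips

    def __init__(self):
        self._check_bucket()

    def _check_bucket(self):
        # In the real backend this would be an HTTP call to S3.
        FakeS3Storage.bucket_checks += 1

# The pattern described above: one new storage object per thumbnail.
for _ in range(7):  # 7 thumbnails on the page
    storage = FakeS3Storage()

print(FakeS3Storage.bucket_checks)  # 7 -> seven S3 round-trips per page
```

Seven constructions, seven network round-trips: the latency adds up linearly with the number of thumbnails on the page.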
The logic goes something like this: in `easy_thumbnails/files.py`, line 52 creates a new `ThumbnailerField` object, which takes a `thumbnail_storage` parameter. That parameter is empty in this call, which later results in the storage object being created, because the `Thumbnailer` class constructor contains:

```python
if not thumbnail_storage:
    thumbnail_storage = get_storage_class(settings.THUMBNAIL_DEFAULT_STORAGE)()
```
I thought of two options:

- either tweak the `Thumbnailer` class constructor, or
- find a way not to re-create the storage object each time.
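The second option amounts to memoizing the storage instance. A hypothetical sketch (the helper name and demo classes are mine, not part of easy-thumbnails):

```python
# Cache storage instances so repeated requests reuse the same object
# instead of rebuilding it (and re-checking the bucket) each time.
_storage_cache = {}

def get_cached_storage(storage_class):
    """Instantiate storage_class once; later calls return the same object."""
    if storage_class not in _storage_cache:
        _storage_cache[storage_class] = storage_class()
    return _storage_cache[storage_class]

# Demo with a stand-in storage that counts constructions.
class CountingStorage:
    constructions = 0

    def __init__(self):
        CountingStorage.constructions += 1  # stands in for the bucket check

a = get_cached_storage(CountingStorage)
b = get_cached_storage(CountingStorage)
print(a is b, CountingStorage.constructions)  # True 1
```

Both calls return the same instance, so the expensive constructor runs only once per process.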
I ended up using something like this in my model:

```python
source_image = ThumbnailerImageField(
    # ... other field options ...
    thumbnail_storage=thumbnail_storage,
)
```

where `thumbnail_storage` is initialised only once, at the beginning of the models.py file:

```python
thumbnail_storage = get_storage_class(settings.THUMBNAIL_DEFAULT_STORAGE)()
```
This way, the storage object is created only once per Django process, and the bucket's existence is also queried only once.
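The effect of the fix can be shown in plain Python (`CountingStorage` and `FakeField` are stand-ins, not Django classes): the storage is built once at module import time and shared by every field that references it.

```python
class CountingStorage:
    bucket_checks = 0

    def __init__(self):
        CountingStorage.bucket_checks += 1  # stands in for the S3 existence check

# Module level: executed once per process, like the line in models.py.
thumbnail_storage = CountingStorage()

class FakeField:
    def __init__(self, storage):
        self.storage = storage  # reuse the shared instance, don't re-create

# Seven thumbnails on the page, but still only one bucket check.
fields = [FakeField(thumbnail_storage) for _ in range(7)]
print(CountingStorage.bucket_checks)  # 1
```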
Now, a page query takes about 120ms on average instead of 3s. Wow. Much speedup.