# Storage backend combining django-pipeline's PipelineMixin, Django's
# CachedFilesMixin (cache-busting hashes), and django-storages' S3 backend.
from django.contrib.staticfiles.storage import CachedFilesMixin
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage


class S3PipelineStorage(PipelineMixin, CachedFilesMixin, S3BotoStorage):
    pass
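
This class is hooked up as the staticfiles storage in settings; the dotted
path below is just a placeholder for wherever the class actually lives:

# settings.py -- "myproject.storage" is a placeholder module path
STATICFILES_STORAGE = 'myproject.storage.S3PipelineStorage'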



PIPELINE_JS = {
    'main.js': {
        'output_filename': 'js/main.min.js',
        'source_filenames': [
            'js/external/underscore.js',
            'js/external/backbone-1.0.0.js',
            'js/external/bootstrap-2.2.0.min.js',
        ],
    },
}

When I first ran the collectstatic command yesterday, it correctly created a 
cache-busting file named "main.min.d25bdd71759d.js".
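
(What I actually run is the plain management command; expressed via 
call_command it would look roughly like this, with verbosity 3 as mentioned 
below:)

from django.core.management import call_command

# Roughly "python manage.py collectstatic --noinput -v 3"
call_command('collectstatic', interactive=False, verbosity=3)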

However, when I run the command now, it fails to overwrite that cached file 
(and update the hash) during the post-processing phase.

It keeps updating "main.min.js", so main.min.js on S3 is current with the 
code in my filesystem. A new cached file, however, is not created: the 
hashed filename keeps the same old hash even though the underlying 
main.min.js has changed.

When I manually delete the cache-busting file on AWS, I get the following 
message from running collectstatic with verbosity set to 3:

Post-processed 'js/main.min.js' as 'js/main.min.d25bdd71759d.js'

So it kept the same hash as the old version, and the hashed file is not up 
to date.

settings.DEBUG is set to False

Why won't the hash update? I assume it is because it is hashing the contents 
of the old version of "js/main.min.js", but how would it access that old 
version if the new version is already up on S3?
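
My rough mental model of the hashing step, paraphrased from what I understand 
of CachedFilesMixin (not the exact Django code), is something like this:

import hashlib

def file_hash(storage, name):
    # As I understand it, CachedFilesMixin opens the file through the
    # storage backend -- here, S3 -- and uses the first 12 hex characters
    # of the MD5 of its contents as the hash suffix.
    md5 = hashlib.md5()
    content = storage.open(name)
    try:
        for chunk in content.chunks():
            md5.update(chunk)
    finally:
        content.close()
    return md5.hexdigest()[:12]

If that is accurate, the suffix should track whatever copy of 
"js/main.min.js" the backend hands back, which is why the stale hash is 
confusing me.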
