[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-06-17 Thread Alex Tsai
Hi all,

I'm having what might be the same issue (can't tell without a stack trace 
from the OP).

I have a similar config to what Brad posted.  The media library is actually 
able to create folders in S3 correctly, so I think my permissions are 
working (the IAM user in Amazon just has access to all S3 permissions at the 
moment).

However, when I upload, there seems to be some sort of submission, but the 
file never actually appears in the S3 bucket.  Boto then dies when trying to 
move the key out of the uploads subfolder with the following stacktrace, 
which is what results in the 500 in my case.

Sorry for the possible threadjack (if it turns out we're not actually 
having the same issue), but I think it might be the same one.  Any help would 
be greatly appreciated, as I've been beating my head against this for a while.

Thanks!


Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/django/core/handlers/base.py", line 112, in get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/views/decorators/csrf.py", line 57, in wrapped_view
    return view_func(*args, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/filebrowser_safe/decorators.py", line 25, in decorator
    return function(request, *args, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/contrib/admin/views/decorators.py", line 17, in _checklogin
    return view_func(request, *args, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/filebrowser_safe/views.py", line 333, in _upload_file
    default_storage.move(smart_text(uploadedfile), smart_text(file_path), allow_overwrite=True)
  File "/app/.heroku/python/lib/python2.7/site-packages/filebrowser_safe/storage.py", line 103, in move
    k = self.bucket.copy_key(new_key_name, self.bucket.name, old_key_name)
  File "/app/.heroku/python/lib/python2.7/site-packages/boto/s3/bucket.py", line 883, in copy_key
    response.reason, body)
S3ResponseError: S3ResponseError: 404 Not Found
<Error>
  <Code>NoSuchKey</Code>
  <Message>The specified key does not exist.</Message>
  <Key>uploads/watermark50.png</Key>
  <RequestId>D75E83E3B666147E</RequestId>
  <HostId>HoJQ9yw2fOLHE2Mkh88tnnSqwau20sEbXfiGbkKKeSKKEIy6UbBCW2QyNpv+pZCh</HostId>
</Error>
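If I'm reading the traceback right, filebrowser_safe's move boils down to a 
boto copy-then-delete, so the 404 just means the source key isn't where the 
move expects it. A rough sketch of my understanding (not the library's exact 
code; the prefix mismatch mentioned in the comment is only my guess):

# Rough sketch of what the failing move appears to do (based on the
# traceback above, not filebrowser_safe's exact source).
def move_key(bucket, old_key_name, new_key_name):
    # copy_key raises S3ResponseError 404 (NoSuchKey) if old_key_name does
    # not exist at exactly that path -- e.g. if the upload was actually
    # saved under "media/uploads/..." (the DEFAULT_S3_PATH prefix) while
    # the move is asked to copy plain "uploads/..." -- which is my guess.
    bucket.copy_key(new_key_name, bucket.name, old_key_name)
    bucket.delete_key(old_key_name)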

The entry in my settings.py is as follows:

AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = os.environ.get('S3_STATIC_BUCKET')
AWS_PRELOAD_METADATA = True  # helps collectstatic do updates

AWS_QUERYSTRING_AUTH = False
AWS_S3_SECURE_URLS = False
AWS_S3_ENCRYPTION = False
from boto.s3.connection import ProtocolIndependentOrdinaryCallingFormat
AWS_S3_CALLING_FORMAT = ProtocolIndependentOrdinaryCallingFormat()

DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
DEFAULT_S3_PATH = "media"
MEDIA_ROOT = ''
MEDIA_URL = ''

STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
STATIC_ROOT = "/static/"
STATIC_URL = 'http://s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
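Next thing I plan to try is listing the bucket right after an upload attempt, 
to see which prefix the file actually lands under (credentials and bucket 
name below are placeholders):

# List keys under both candidate prefixes right after an upload attempt.
# Credentials and bucket name are placeholders, not my real values.
from boto.s3.connection import S3Connection

conn = S3Connection('AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY')
bucket = conn.get_bucket('my-bucket-name')
for prefix in ('uploads/', 'media/uploads/'):
    print("%s -> %s" % (prefix, [k.name for k in bucket.list(prefix=prefix)]))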


On Friday, 6 June 2014 08:26:39 UTC-7, bob hosk wrote:
>
> Hi Brad,
>
> Thanks again for the reply.
>
> I tried again with these settings (see 
> https://github.com/fpghost/mezzdemo/blob/master/settings.py), and
> still the exact same error for me. This is proving stubborn indeed!
>
> I should note I'm working on the local dev server at the moment, so for my 
> domain I just tried a few
> combos of localhost. Not sure if that is the right way.
>
> I'm curious though, is this working for anyone else? After all this is the 
> out of box mezzanine install here,
> so it is very easy to reproduce (rip my requirements.txt and settings.py 
> into mezzdemo dir):
>
> mkdir mezzdemo; cd mezzdemo
> virtualenv --no-site-packages mezzdemoenv
> source mezzdemoenv/bin/activate
> pip install -r requirements.txt
> cd ..; mezzanine-project mezzdemo
> python manage.py createdb
> python manage.py collectstatic
>
> The only thing you need to do is set your own env variables for aws secret 
> key and id, and set own bucket_name in settings.py.
> Then try:
>
> python manage.py runserver
>
> and attempt to upload a picture to the media_lib...
> Does this work for others?



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-06-06 Thread bob hosk
Hi Brad,

Thanks again for the reply.

I tried again with these settings (see 
https://github.com/fpghost/mezzdemo/blob/master/settings.py), and
still the exact same error for me. This is proving stubborn indeed!

I should note I'm working on the local dev server at the moment, so for my 
domain I just tried a few
combos of localhost. Not sure if that is the right way.

I'm curious though, is this working for anyone else? After all, this is the 
out-of-the-box Mezzanine install here, so it is very easy to reproduce (drop 
my requirements.txt and settings.py into the mezzdemo dir):

mkdir mezzdemo; cd mezzdemo
virtualenv --no-site-packages mezzdemoenv
source mezzdemoenv/bin/activate
pip install -r requirements.txt
cd ..; mezzanine-project mezzdemo
python manage.py createdb
python manage.py collectstatic

The only thing you need to do is set your own env variables for the AWS secret 
key and id, and set your own bucket_name in settings.py.
Then try:

python manage.py runserver

and attempt to upload a picture to the media_lib...
Does this work for others?

   




[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-06-02 Thread Brad Bode
I looked at your ALLOWED_HOSTS setting, which can cause the 500 error. Try 
this and tell me if it works:

import logging
import requests

logger = logging.getLogger(__name__)

# Leading-dot entries are Django's subdomain wildcard syntax.
ALLOWED_HOSTS = ['.YOUR_DOMAIN.com', 'www.YOUR_DOMAIN.com',
                 '.compute-1.amazonaws.com']
EC2_PRIVATE_IP = None
try:
    # ELB health checks hit the instance by its private IP, so add it too.
    EC2_PRIVATE_IP = requests.get(
        'http://169.254.169.254/latest/meta-data/local-ipv4',
        timeout=0.01).text
except requests.exceptions.RequestException:
    logger.info("EC2 private IP addition to allowed hosts failed to resolve")
if EC2_PRIVATE_IP:
    ALLOWED_HOSTS.append(EC2_PRIVATE_IP)


And read this:
http://dryan.me/articles/elb-django-allowed-hosts/




[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-28 Thread bob hosk
Still no joy, sadly. I uploaded the bare-bones fresh Mezzanine install, only 
with attempts in the settings to use S3 storage, to git: 
g...@github.com:fpghost/mezzdemo.git, if anyone would be so kind as to pull 
it and see if it works with their bucket. (Don't worry, I'll change the 
Django secret key before going into production.)

With MEDIA_ROOT not defined, or with it set to '', I get the HTTP 500 error; 
with it as '/media/' or 'media' I get a 400 bad request. My understanding is 
that MEDIA_ROOT is the storage point of the media files, so if S3 takes care 
of it then MEDIA_ROOT should be blank or empty, but if not I might have 
guessed '//s3.amazonaws.com/mybucket/media/'. Either way, no joy.

This is not an https site, just the basic demo install.



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-27 Thread Ryan Sadwick
My site is SSL and I haven't had any problems with S3.  It was rough 
initially, but I just continued to work through the problems and found a 
sweet spot.  I find that it is important to understand how S3/Boto work 
instead of pasting someone else's settings.  Here is what I used; I stripped 
out my variables and replaced them with placeholder strings, happy tinkering. 
You should find all the documentation on the Boto site.

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'your_id'
AWS_SECRET_ACCESS_KEY = 'your_key'
AWS_STORAGE_BUCKET_NAME = 'bucketname'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_QUERYSTRING_AUTH = False
AWS_S3_SECURE_URLS = True

AWS_S3_CUSTOM_DOMAIN = 's3.amazonaws.com/bucket_name'
S3_URL = 'https://s3.amazonaws.com/bucket_name'
MEDIA_URL = S3_URL
STATIC_URL = S3_URL

For SSL to work, I had to use AWS_S3_CUSTOM_DOMAIN.  It's not well 
documented and is buried.

For thumbnails, I use sorl-thumbnail.  I found that sorl works great with 
S3, I haven't run into any issues, and implementing it within templates was 
easy.
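Roughly what my template usage looks like (the field and variable names here 
are just illustrative, not from the demo project):

{% load thumbnail %}
{% thumbnail blog_post.featured_image "665x400" crop="center" as im %}
    <img src="{{ im.url }}" width="{{ im.width }}" height="{{ im.height }}">
{% endthumbnail %}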

Also, make sure your images are publicly accessible.  Test access to your 
image bucket from within your browser; if you're getting an error there, 
you'll need to read up on permissions.
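In django-storages terms that mostly comes down to the default ACL; as far as 
I know this is the relevant setting (double-check the name against the 
django-storages version you have installed):

# django-storages: make uploaded objects publicly readable by default
# (verify this setting name against your installed version).
AWS_DEFAULT_ACL = 'public-read'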

Good luck and don't give up :)


On Tuesday, May 27, 2014 11:45:23 AM UTC-4, bob hosk wrote:
>
> Could it be trying to upload to the wrong dir( when in the admin), err was
>
>  "/admin/media-library/upload_file/ HTTP/1.1" 500 1
>
> Shouldn't this be my amazon bucket url followed by something like /media?
>



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-27 Thread bob hosk
Could it be trying to upload to the wrong dir (when in the admin)? The error was

 "/admin/media-library/upload_file/ HTTP/1.1" 500 1

Shouldn't this be my Amazon bucket URL followed by something like /media?



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-27 Thread bob hosk
Hi Brad,

Sorry it's taken me so long to get back to you on this, after your 
fantastic reply. It's been a very hectic week.

I tried exactly the settings you most recently posted, and unfortunately 
I'm still hitting the 500 error when attempting to use the upload featured 
image section of the Mezzanine blog.

The strange thing is everything else works fine (collectstatic can collect, 
and the static files themselves get stored). In my Amazon bucket I see two 
dirs, media/ and static/, so all is good there.

Also, I checked these settings on a plain Django project and there was no 
problem whatsoever uploading media from the admin.

Hmm...



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-21 Thread Brad Bode
If your desire is to store thumbnails on S3, then the easiest solution I 
found is to replace the thumbnail generator that Mezzanine provides. Why? 
As good as Mezzanine is, it is not set up for S3 bucket storage of thumbnails 
and will only generate them with the root media URL.

What do you do? 

Checkout my answer here:
https://groups.google.com/d/msg/mezzanine-users/N_twDWnSXQg/8JEv3VTrFQ8J

That will work. 



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-21 Thread Brad Bode


1) Why do you define
AWS_SECRET_KEY = os.environ['AWS_SECRET_KEY']
AWS_SECRET_ACCESS_KEY = AWS_SECRET_KEY
and not just 
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_KEY']


Legacy testing crap code basically (a poor understanding of settings). I 
updated the settings to remove duplicates and create clarity (see end of this 
post). Helping you helps me clean up my code :)

2) What is the difference between 
AWS_BUCKET_NAME and AWS_STORAGE_BUCKET_NAME here? And why
do you define 'AWS_STORAGE_BUCKET_NAME' twice? (just a typo?)

Same answer as #1. 

3) You define DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
and STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'. Are these the 
locations of your custom storage classes DefaultStorage and StaticStorage? How 
are they related to the PublicS3BotoStorage class from the bug fix? (assuming 
they are)


Yes, these are the custom classes. I highly suggest using 
django-s3-folder-storage, which is a small wrapper around django-storages that 
sets up the custom static and media folders. This way you do not have to write 
your own. Here is a link to the plugin. If you are using pip it's easy to 
install.


https://github.com/jamstooks/django-s3-folder-storage
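For what it's worth, those DefaultStorage/StaticStorage classes are (roughly) 
just thin subclasses of S3BotoStorage pinned to a folder prefix. A simplified 
sketch of the idea, not the package's actual source (see the repo above for 
the real implementation):

# Simplified sketch of the idea behind django-s3-folder-storage's storage
# classes; see the GitHub repo above for the real code.
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class DefaultStorage(S3BotoStorage):
    # Media files live under DEFAULT_S3_PATH (e.g. "media/") in the bucket.
    def __init__(self, *args, **kwargs):
        kwargs.setdefault('location', settings.DEFAULT_S3_PATH)
        super(DefaultStorage, self).__init__(*args, **kwargs)

class StaticStorage(S3BotoStorage):
    # Static files live under STATIC_S3_PATH (e.g. "static/") in the bucket.
    def __init__(self, *args, **kwargs):
        kwargs.setdefault('location', settings.STATIC_S3_PATH)
        super(StaticStorage, self).__init__(*args, **kwargs)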

4) Why are MEDIA_ROOT and MEDIA_URL empty?

Because MEDIA_URL will be prepended to media URLs where we don't need it to 
be. The entire URL will be generated by Storages, so we don't need MEDIA_URL 
prepended. Most templates do something like this to generate a URL:

{{ MEDIA_URL }}{% thumbnail selected_landing_page.featured_image 665 400 %}


That would prepend the MEDIA_URL to a URL Storages has already generated, 
giving you mangled URLs that look like: 

http://someurl/image.pnghttp://anotherurl/some.png


* "PublicS3BotoStorage' object has no attribute 'isdir'"*


This issue is solved by using Django Folder Storage per #3. I tried rolling my 
own, but was met with this error every time. I don't know why because I 
literally copied his code directly and couldn't get it to work. IF you solve 
this issue let me know. I think it might have to do with the order of when the 
app is registered. If you make your custom class a new module and register it 
before your application then it would likely work. I never tried it because 
Django FOlder Storage worked fine.
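My other guess: Mezzanine's media library (filebrowser_safe) calls extra 
methods like isdir/move on the storage class, and a plain S3BotoStorage 
subclass doesn't have them. If filebrowser_safe's S3BotoStorageMixin is where 
those live (check your installed version), something along these lines might 
be all the custom class needs; untested on my end:

# Untested guess: mix filebrowser_safe's S3 helper methods (isdir, isfile,
# move, makedirs, rmtree) into your custom storage so the media library can
# call them.  Assumes filebrowser_safe.storage exposes S3BotoStorageMixin --
# verify against the version you have installed.
from filebrowser_safe.storage import S3BotoStorageMixin
from storages.backends.s3boto import S3BotoStorage

class PublicS3BotoStorage(S3BotoStorageMixin, S3BotoStorage):
    # ...keep whatever overrides your existing PublicS3BotoStorage has...
    pass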


Here are my cleaned up settings:

# .get() so a missing key gives None instead of raising a KeyError,
# which lets the S3_DEPLOYMENT check below actually work.
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
# Enable S3 deployment only if we have the AWS keys
S3_DEPLOYMENT = AWS_ACCESS_KEY_ID is not None
if S3_DEPLOYMENT:
    AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
    AWS_QUERYSTRING_AUTH = False
    AWS_S3_SECURE_URLS = False
    AWS_S3_ENCRYPTION = False
    from boto.s3.connection import ProtocolIndependentOrdinaryCallingFormat
    AWS_S3_CALLING_FORMAT = ProtocolIndependentOrdinaryCallingFormat()

    DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
    DEFAULT_S3_PATH = "media"
    MEDIA_ROOT = ''
    MEDIA_URL = ''

    STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
    STATIC_S3_PATH = "static"
    STATIC_ROOT = "/%s/" % STATIC_S3_PATH
    STATIC_URL = '//s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
    ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
else:
    logging.info("S3 disabled")
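A quick way to sanity-check the media side once this is in place, from 
"python manage.py shell" (the test key name is just an example; delete it 
afterwards):

# Round-trip a tiny file through the configured default storage and print
# the URL it will be served from.
from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

name = default_storage.save('uploads/storage-check.txt', ContentFile(b'hello'))
print(default_storage.url(name))  # with the settings above this should land under media/
default_storage.delete(name)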




[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-21 Thread bob hosk
Hi, 

Many thanks for taking the time to write such a detailed reply.

By thumbnails do you mean the featured images for the blog? If so,
yes, I was planning on storing those, plus other user-uploaded media and 
static files, on S3.

So, using the bugfix you pointed me to, I created the file
mystorages.py (see link) in my project, defining the subclass

PublicS3BotoStorage (where are you using this in your settings.py?)

Some questions:

1) Why do you define
AWS_SECRET_KEY = os.environ['AWS_SECRET_KEY']
AWS_SECRET_ACCESS_KEY = AWS_SECRET_KEY
and not just 
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_KEY']

2) What is the difference between 
AWS_BUCKET_NAME and AWS_STORAGE_BUCKET_NAME here? And why
do you define 'AWS_STORAGE_BUCKET_NAME' twice? (just a typo?)

3) You define DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
and STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'. Are these the 
locations of your custom storage classes DefaultStorage and StaticStorage? How 
are they related to the PublicS3BotoStorage class from the bug fix? (assuming 
they are)

4) Why are MEDIA_ROOT and MEDIA_URL empty?

These considerations led me to try to mimic your settings with:


# Attempt fix from mezzanine users group

#
# AMAZON - AWS
# S3 config
AWS_ACCESS_KEY_ID="myid"
AWS_SECRET_ACCESS_KEY="myaccesskey"

AWS_STORAGE_BUCKET_NAME = 'my_bucket'
AWS_BUCKET_NAME = 'my_bucket'

AWS_QUERYSTRING_AUTH = False
AWS_S3_SECURE_URLS = False
AWS_S3_ENCRYPTION =  False
from boto.s3.connection import ProtocolIndependentOrdinaryCallingFormat
AWS_S3_CALLING_FORMAT = ProtocolIndependentOrdinaryCallingFormat()

DEFAULT_FILE_STORAGE = 'mystorages.PublicS3BotoStorage'
DEFAULT_S3_PATH = "media"
MEDIA_ROOT = ''
MEDIA_URL = ''

STATICFILES_STORAGE = 'mystorages.PublicS3BotoStorage'
STATIC_S3_PATH = "static"
STATIC_ROOT = "/%s/" % STATIC_S3_PATH
STATIC_URL = '//s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'

Notice in particular I use STATICFILES_STORAGE = 
'mystorages.PublicS3BotoStorage',

where mystorages.py is defined as in my link above (i.e. the bug fix 
subclass is defined in there).

The site loads and css/js/img are served from S3, but now when I try to add 
a blog post in the admin I get the error:

 "'PublicS3BotoStorage' object has no attribute 'isdir'"



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-20 Thread Brad Bode
There is one mistake I made in my code. The HTTP included in the static URL 
will mostly cause no problems. However, if you are behind SSL then you may 
have issues. 

Instead of this:
STATIC_URL = 'http://s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME

Use this:
STATIC_URL = '//s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME


Note the HTTP is gone. This makes it protocol independent. I have this 
working behind a load balancer on Amazon Elastic Beanstalk using SSL. 



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-20 Thread Brad Bode
I feel for you...
Getting Django + Mezzanine working on Amazon with S3 is a pain in the ass. 
Truthfully.
I do have it working though after a lot of learning. 

Where do you plan to store your thumbnails? On S3? If you want Thumbnails 
to work on S3 there is some trickery that you need to do. 

As far as getting it working...

First, using django-storages + django-s3-folder-storage will work, provided 
you add some tweaks. It took me A LOT of figuring out due to legacy code and 
existing bugs not being patched in other plugins, particularly boto and 
storages. 

1) Execute on this bug fix (Brad Bode, the most recent poster is me)
https://github.com/boto/boto/issues/1477

2) Execute on this bug fix:
https://bitbucket.org/david/django-storages/issue/181/from-s3-import-callingformat-seems-broke
If you want to use SSL you will need to use 
the ProtocolIndependentOrdinaryCallingFormat instead of 
OrdinaryCallingFormat.
You will also need this if using SSL:
https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
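Behind an Amazon load balancer that terminates SSL, the usual value (per the 
Django docs linked above; make sure your proxy really sets and overwrites 
this header) is:

# Trust the X-Forwarded-Proto header set by the load balancer so Django
# knows the original request came in over HTTPS.  Only safe when the proxy
# always sets/overwrites this header.
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')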
 

3) Here are my working settings:

#
# AMAZON - AWS
# S3 config
AWS_SECRET_KEY = os.environ['AWS_SECRET_KEY']
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = AWS_SECRET_KEY
# Enable S3 deployment only if we have the AWS keys
S3_DEPLOYMENT = AWS_ACCESS_KEY_ID is not None
if S3_DEPLOYMENT:
    AWS_BUCKET_NAME = os.environ['AWS_BUCKET_NAME']
    AWS_STORAGE_BUCKET_NAME = os.environ['AWS_BUCKET_NAME']
    AWS_QUERYSTRING_AUTH = False
    AWS_S3_SECURE_URLS = False
    AWS_S3_ENCRYPTION = False
    from boto.s3.connection import ProtocolIndependentOrdinaryCallingFormat
    AWS_S3_CALLING_FORMAT = ProtocolIndependentOrdinaryCallingFormat()
    AWS_STORAGE_BUCKET_NAME = os.environ['AWS_BUCKET_NAME']

    DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
    DEFAULT_S3_PATH = "media"
    MEDIA_ROOT = ''
    MEDIA_URL = ''

    STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
    STATIC_S3_PATH = "static"
    STATIC_ROOT = "/%s/" % STATIC_S3_PATH
    STATIC_URL = 'http://s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
    ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
else:
    logging.info("S3 disabled")
    pass

I'll try to check back in on this thread. 

On Tuesday, May 20, 2014 2:50:45 PM UTC-7, bob hosk wrote:
>
> Hi
>
> I'm using the demo mezzanine with a settings.py with 
>
> STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
> DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
>
> STATIC_URL = 'https://' + AWS_STORAGE_BUCKET_NAME + '.s3.amazonaws.com/'
> MEDIA_URL = 'https://' + AWS_STORAGE_BUCKET_NAME + '.s3.amazonaws.com/media/'
>
> to store my static/media files on Amazon S3 via the django-storages 
> backend. '/media/' is just a dir
> I created in my Amazon S3 bucket from the website console.
>
> All works relatively well with Debug = False, except when I log in to the 
> admin, create a new blog
> post and try to upload a featured image. My selected file yields an HTTP 
> error, and in the console I see
>  [20/May/2014 22:37:33] "POST /admin/media-library/upload_file/ 
> HTTP/1.1" 500 6672
>
> If I comment out the four variables above from settings.py and turn on 
> Debug = True (so
> Django is back to serving my files on the runserver) there is no problem at 
> all; all media is uploaded
> as expected to the local dir.
>
> I believe my Amazon AWS credentials must be good, as collectstatic has no 
> problem pushing
> all my css/js/imgs to my Amazon bucket, and these seem to be being served 
> ok. It appears to be just
> this media upload from the admin that is failing...
>
> How can I fix?
>



[mezzanine-users] Re: Amazon s3 storage backed, upload image to blog from admin HTTP 500 on demo site.

2014-05-20 Thread bob hosk
My settings.py is here, and I think 
everything else is just the default Mezzanine project install.

local_settings.py is here 

-- 
You received this message because you are subscribed to the Google Groups 
"Mezzanine Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to mezzanine-users+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.