    aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'])))
session.commit()
print('Done creating connections.')

On Thu, Jun 16, 2016 at 11:01 AM Tyrone Hinderson wrote:
Hey Jacob,

Thanks for your quick response. I doubt I can take your approach, because

1. It's imperative that the s3 connection be contained within an
   environment variable
2. My scheduler is deployed on an AWS box which uses an IAM role to
   connect to s3, not a credentials file

e_hook.py:53} INFO - Using connection to:
[bucket].s3-us-east-1.amazonaws.com
[2016-06-15 21:40:26,583] {logging.py:57} ERROR - Could not create an
S3Hook with connection id "S3_LOGS". Please make sure that airflow[s3] is
installed and the S3 connection exists.

It's clear that my connection exists because of the "Using connection to:"
line. However, I fear that my connection URI string is malformed.

Where are you seeing that an S3 connection is required? It will only be
accessed if you tell Airflow to send logs to S3. The config option can also
be null (default) or a google storage location.

The S3 connection is a standard Airflow connection. If you would like it to
use environment variables, you can set an AIRFLOW_CONN_<conn_id> variable
containing a connection URI.
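
(A sketch of what that could look like, assuming the AIRFLOW_CONN_<CONN_ID> convention and 1.7-era module paths. The conn_id and URI shape are illustrative guesses; whether the 1.7-era S3Hook reads credentials from login/password rather than the connection's extra JSON is exactly the ambiguity in this thread. Shown in Python only for illustration; in practice you would export the variable in the scheduler's environment.)

# Hypothetical example: hand Airflow the connection via an environment
# variable instead of the metadata DB. The variable name is AIRFLOW_CONN_
# plus the upper-cased conn_id.
import os

# Illustrative URI shape: conn_type://login:password@host
os.environ['AIRFLOW_CONN_S3_LOGS'] = 's3://my-access-key:my-secret-key@my-log-bucket'

from airflow.hooks.base_hook import BaseHook

# Should emit the same "Using connection to: ..." log line quoted earlier.
conn = BaseHook.get_connection('S3_LOGS')
print(conn.conn_type, conn.login, conn.host)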

We ran into this issue as well. If you set the environment variable to
anything random, it'll get ignored and control will pass through to
.aws/credentials. We used "n/a".

It's kind of annoying that the s3 connection is a) required, and b) poorly
supported as an env var.
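
(A sketch of that workaround; the variable name assumes remote_log_conn_id points at a connection named S3_LOGS, and "n/a" is simply the placeholder the poster used. In practice you would export this in the scheduler's environment rather than set it in Python.)

# Workaround sketch: give Airflow some throwaway value for the connection
# env var so the "S3 connection exists" check passes, and let boto fall back
# to its normal credential chain (.aws/credentials or the instance's IAM role).
import os

os.environ['AIRFLOW_CONN_S3_LOGS'] = 'n/a'   # any throwaway value, per the post above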

I was logging to S3 in 1.7.0, but now I need to create an S3 "Connection"
in airflow (for remote_log_conn_id) to keep doing that in 1.7.1.2. Rather
than set this "S3" connection in the UI, I'd like to set an AIRFLOW_CONN_S3
env variable. What does an airflow-friendly s3 "connection string" look like?
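
(For reference, a sketch of the airflow.cfg side of this. remote_log_conn_id is quoted above; remote_base_log_folder and the [core] placement are my assumptions about the 1.7.1-era option names, and the bucket path is a placeholder.)

[core]
# Ship task logs to this location; empty (the default) keeps logs local only.
remote_base_log_folder = s3://my-log-bucket/airflow/logs
# conn_id of the connection used to reach the bucket. With the env-var
# approach this would correspond to something like AIRFLOW_CONN_S3.
remote_log_conn_id = S3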