[ https://issues.apache.org/jira/browse/AIRFLOW-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15593110#comment-15593110 ]

Chris Riccomini commented on AIRFLOW-583:
-----------------------------------------

Agreed, I think your patch is correct. I want to wait a beat for [~jlowin] and 
[~alexvanboxel] to have a look, since they're heavy Google cloud users, too.

> GoogleCloudStorageToBigQueryOperator not properly decoding download file with 
> schema_object
> -------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-583
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-583
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Giovanni Briggs
>            Priority: Minor
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When trying to use the *GoogleCloudStorageToBigQueryOperator* with the 
> _schema_object_ field, I received the following error:
> {code}
>  File "airflow/contrib/operators/gcs_to_bq.py", line 123, in execute
>     schema_fields = self.schema_fields if self.schema_fields else 
> json.loads(gcs_hook.download(self.bucket, self.schema_object))
>   File "lib/python3.4/json/__init__.py", line 312, in loads
>     s.__class__.__name__))
> TypeError: the JSON object must be str, not 'bytes'
> {code}
> With a little more debugging, it appears that _gcs_hook.download()_ returns a 
> bytes object rather than a string.  This causes *json.loads* to fail, since it 
> requires a str input on Python 3.
> The fix is to decode the bytes object at the point where we call *json.loads*.  
> This solution should work with both Python 2 and 3.
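The fix described above can be sketched roughly as follows. This is an illustrative sketch, not the actual patch: the `load_schema` helper name and the utf-8 encoding are assumptions for the example, and the real change lives inline in gcs_to_bq.py.

{code}
import json

def load_schema(raw):
    # raw is whatever gcs_hook.download() returned; on Python 3 this is
    # assumed to be bytes, so decode it before handing it to json.loads.
    # On Python 2, str already satisfies json.loads, and the isinstance
    # check on bytes (an alias of str there) still decodes safely.
    if isinstance(raw, bytes):
        raw = raw.decode('utf-8')
    return json.loads(raw)
{code}

With this guard, json.loads always receives a text string, so the TypeError from the traceback above no longer occurs.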



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
