[jira] [Commented] (AIRFLOW-5271) EmrCreateJobFlowOperator throwing error in airflow 1.10.4 version

2019-11-04 Thread Nidhi Chourasia (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5271?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16967310#comment-16967310
 ] 

Nidhi Chourasia commented on AIRFLOW-5271:
--

Yes, I followed the steps and it is working now. [~ash]
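For anyone landing here with the same trace: the ValueError in the log below comes from `Connection.extra_dejson` calling `json.loads` on the connection's Extra field, which must be strict JSON. A minimal sketch of the failure and the fix (the `region_name` key/value is illustrative, not from the original report):

```python
import json

# Python-dict-style single quotes are not valid JSON and reproduce the
# "Expecting property name enclosed in double quotes" error from the log:
bad_extra = "{'region_name': 'us-east-1'}"  # illustrative key/value
try:
    json.loads(bad_extra)
except ValueError as exc:
    print("rejected:", exc)

# The same data with double quotes parses cleanly:
good_extra = '{"region_name": "us-east-1"}'
print(json.loads(good_extra))
```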

> EmrCreateJobFlowOperator throwing error in airflow 1.10.4 version
> -
>
> Key: AIRFLOW-5271
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5271
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.10.4
> Environment: Operating System details:
> ubuntu@ip-10-0-1-252:~$ cat /etc/os-release
> NAME="Ubuntu"
> VERSION="18.04.1 LTS (Bionic Beaver)"
> ID=ubuntu
> ID_LIKE=debian
> PRETTY_NAME="Ubuntu 18.04.1 LTS"
> VERSION_ID="18.04"
> HOME_URL="https://www.ubuntu.com/"
> SUPPORT_URL="https://help.ubuntu.com/"
> BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
> PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
> VERSION_CODENAME=bionic
> UBUNTU_CODENAME=bionic
>Reporter: Nidhi Chourasia
>Priority: Blocker
>
> h3. *ERROR LOGS:* 
> {{[2019-08-21 05:39:42,970] \{emr_create_job_flow_operator.py:66} INFO - 
> Creating JobFlow using aws-conn-id: aws_default, emr-conn-id: emr_default
> [2019-08-21 05:39:42,981] \{logging_mixin.py:95} INFO - [2019-08-21 
> 05:39:42,980] \{connection.py:296} ERROR - Expecting 
> property name enclosed in double quotes: line 1 column 2 (char 1)
> Traceback (most recent call last):
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/models/connection.py",
>  line 294, in extra_dejson
> obj = json.loads(self.extra)
>   File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
> return _default_decoder.decode(s)
>   File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
> obj, end = self.raw_decode(s, idx=_w(s, 0).end())
>   File "/usr/lib/python2.7/json/decoder.py", line 380, in raw_decode
> obj, end = self.scan_once(s, idx)
> ValueError: Expecting property name enclosed in double quotes: line 1 column 
> 2 (char 1)
> [2019-08-21 05:39:42,982] \{logging_mixin.py:95} INFO - [2019-08-21 
> 05:39:42,981] \{connection.py:297} ERROR - Failed parsing 
> the json for conn_id aws_default
> [2019-08-21 05:39:43,054] \{taskinstance.py:1047} ERROR - Parameter 
> validation failed:
> Unknown parameter in input: "TerminationProtected", must be one of: Name, 
> LogUri, AdditionalInfo, AmiVersion, ReleaseLabel, Instances, Steps, 
> BootstrapActions, SupportedProducts, NewSupportedProducts, Applications, 
> Configurations, VisibleToAllUsers, JobFlowRole, ServiceRole, Tags, 
> SecurityConfiguration, AutoScalingRole, ScaleDownBehavior, CustomAmiId, 
> EbsRootVolumeSize, RepoUpgradeOnBoot, KerberosAttributes
> Unknown parameter in input: "KeepJobFlowAliveWhenNoSteps", must be one of: 
> Name, LogUri, AdditionalInfo, AmiVersion, ReleaseLabel, Instances, Steps, 
> BootstrapActions, SupportedProducts, NewSupportedProducts, Applications, 
> Configurations, VisibleToAllUsers, JobFlowRole, ServiceRole, Tags, 
> SecurityConfiguration, AutoScalingRole, ScaleDownBehavior, CustomAmiId, 
> EbsRootVolumeSize, RepoUpgradeOnBoot, KerberosAttributes
> Unknown parameter in input: "Ec2SubnetId", must be one of: Name, LogUri, 
> AdditionalInfo, AmiVersion, ReleaseLabel, Instances, Steps, BootstrapActions, 
> SupportedProducts, NewSupportedProducts, Applications, Configurations, 
> VisibleToAllUsers, JobFlowRole, ServiceRole, Tags, SecurityConfiguration, 
> AutoScalingRole, ScaleDownBehavior, CustomAmiId, EbsRootVolumeSize, 
> RepoUpgradeOnBoot, KerberosAttributes
> Unknown parameter in input: "Ec2KeyName", must be one of: Name, LogUri, 
> AdditionalInfo, AmiVersion, ReleaseLabel, Instances, Steps, BootstrapActions, 
> SupportedProducts, NewSupportedProducts, Applications, Configurations, 
> VisibleToAllUsers, JobFlowRole, ServiceRole, Tags, SecurityConfiguration, 
> AutoScalingRole, ScaleDownBehavior, CustomAmiId, EbsRootVolumeSize, 
> RepoUpgradeOnBoot, KerberosAttributes}}
> h3. {{*CORRESPONDING DAG CODE:*}}
> {noformat}
> airflow_test_json = json.load(open(airflow_home + '/test.json'))
> airflow_asset_analytics_creator = EmrCreateJobFlowOperator(
>     task_id='create_asset_analytics_databricks_test',
>     job_flow_overrides=airflow_test_json['Job'],
>     timeout=10,
>     aws_conn_id='aws_default',
>     emr_conn_id='emr_default',
>     dag=dag
> )
> airflow_asset_analytics_sensor = EmrJobFlowSensor(
>     task_id='check_asset_analytics_databricks_stable',
>     job_flow_id="{{ task_instance.xcom_pull('create_asset_analytics_databricks_test', key='return_value') }}",
>     aws_conn_id='aws_default',
>     dag=dag
> )
> {noformat}
> 

[jira] [Commented] (AIRFLOW-5271) EmrCreateJobFlowOperator throwing error in airflow 1.10.4 version

2019-08-21 Thread Ash Berlin-Taylor (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5271?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16912176#comment-16912176
 ] 

Ash Berlin-Taylor commented on AIRFLOW-5271:


Did you recently upgrade from before 1.10.1? See 
https://github.com/apache/airflow/blob/master/UPDATING.md#emrhook-now-passes-all-of-connections-extra-to-createjobflow-api
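Per the linked note, since 1.10.1 the EmrHook forwards everything in the connection's extra to boto3's `run_job_flow`, and that API only accepts instance-level keys such as Ec2KeyName, Ec2SubnetId, KeepJobFlowAliveWhenNoSteps and TerminationProtected inside an "Instances" block, not at the top level — exactly the four "Unknown parameter" errors in the log. A minimal sketch of restructuring such a config (all field values illustrative):

```python
# A flat config of the kind that triggers the "Unknown parameter" errors:
flat = {
    "Name": "asset-analytics",          # illustrative
    "ReleaseLabel": "emr-5.23.0",       # illustrative
    "Ec2KeyName": "my-key",             # illustrative
    "Ec2SubnetId": "subnet-0abc",       # illustrative
    "KeepJobFlowAliveWhenNoSteps": True,
    "TerminationProtected": False,
}

# Keys that run_job_flow accepts only under "Instances":
INSTANCE_KEYS = {
    "Ec2KeyName",
    "Ec2SubnetId",
    "KeepJobFlowAliveWhenNoSteps",
    "TerminationProtected",
}

# Move instance-level keys into a nested "Instances" dict:
job_flow_overrides = {k: v for k, v in flat.items() if k not in INSTANCE_KEYS}
job_flow_overrides["Instances"] = {
    k: v for k, v in flat.items() if k in INSTANCE_KEYS
}
print(job_flow_overrides)
```

The same nesting applies whether the values come from the emr_default connection's extra or from `job_flow_overrides` in the DAG.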

> EmrCreateJobFlowOperator throwing error in airflow 1.10.4 version
> -
>
> Key: AIRFLOW-5271
> Fix For: 1.10.5