[jira] [Commented] (FLINK-13085) unable to run python test locally

2019-07-03 Thread sunjincheng (JIRA)


[ https://issues.apache.org/jira/browse/FLINK-13085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16878266#comment-16878266 ]

sunjincheng commented on FLINK-13085:
--------------------------------------

Hi Bowen Li, just a reminder: before assigning a ticket to someone, please ask
for their opinion first.

> unable to run python test locally
> ---------------------------------
>
> Key: FLINK-13085
> URL: https://issues.apache.org/jira/browse/FLINK-13085
> Project: Flink
> Issue Type: Bug
> Components: API / Python
> Reporter: Bowen Li
> Assignee: sunjincheng
> Priority: Major
> Fix For: 1.9.0
>
>
> Ran ./dev/lint-python.sh and got:
> {code:java}
> === FAILURES ===
> __ ExecutionConfigTests.test_equals_and_hash ___
> self = <pyflink.common.tests.test_execution_config.ExecutionConfigTests testMethod=test_equals_and_hash>
> def test_equals_and_hash(self):
>     config1 = ExecutionEnvironment.get_execution_environment().get_config()
>     config2 = ExecutionEnvironment.get_execution_environment().get_config()
>     self.assertEqual(config1, config2)
>     self.assertEqual(hash(config1), hash(config2))
>     config1.set_parallelism(12)
>     self.assertNotEqual(config1, config2)
> >   self.assertNotEqual(hash(config1), hash(config2))
> E   AssertionError: -1960065877 == -1960065877
> pyflink/common/tests/test_execution_config.py:293: AssertionError
> __ ExecutionEnvironmentTests.test_get_execution_plan ___
> self = <pyflink.dataset.tests.test_execution_environment.ExecutionEnvironmentTests testMethod=test_get_execution_plan>
> def test_get_execution_plan(self):
>     tmp_dir = tempfile.gettempdir()
>     source_path = os.path.join(tmp_dir + '/streaming.csv')
>     tmp_csv = os.path.join(tmp_dir + '/streaming2.csv')
>     field_names = ["a", "b", "c"]
>     field_types = [DataTypes.INT(), DataTypes.STRING(), DataTypes.STRING()]
>     t_env = BatchTableEnvironment.create(self.env)
>     csv_source = CsvTableSource(source_path, field_names, field_types)
>     t_env.register_table_source("Orders", csv_source)
>     t_env.register_table_sink(
>         "Results",
>         CsvTableSink(field_names, field_types, tmp_csv))
> >   t_env.scan("Orders").insert_into("Results")
> pyflink/dataset/tests/test_execution_environment.py:111:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> pyflink/table/table.py:583: in insert_into
>     self._j_table.insertInto(table_path, j_table_path)
> .tox/py27/lib/python2.7/site-packages/py4j/java_gateway.py:1286: in __call__
>     answer, self.gateway_client, self.target_id, self.name)
> pyflink/util/exceptions.py:139: in deco
>     return f(*a, **kw)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> answer = 'xro290'
> gateway_client = 
> target_id = 'o288', name = 'insertInto'
> def get_return_value(answer, gateway_client, target_id=None, name=None):
>     """Converts an answer received from the Java gateway into a Python object.
>     For example, string representation of integers are converted to Python
>     integer, string representation of objects are converted to JavaObject
>     instances, etc.
>     :param answer: the string returned by the Java gateway
>     :param gateway_client: the gateway client used to communicate with the Java
>         Gateway. Only necessary if the answer is a reference (e.g., object,
>         list, map)
>     :param target_id: the name of the object from which the answer comes from
>         (e.g., *object1* in `object1.hello()`). Optional.
>     :param name: the name of the member from which the answer comes from
>         (e.g., *hello* in `object1.hello()`). Optional.
>     """
>     if is_error(answer)[0]:
>         if len(answer) > 1:
>             type = answer[1]
>             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
>             if answer[1] == REFERENCE_TYPE:
>                 raise Py4JJavaError(
>                     "An error occurred while calling {0}{1}{2}.\n".
> >                   format(target_id, ".", name), value)
> E   Py4JJavaError: An error occurred while calling o288.insertInto.
> E   : java.lang.NullPointerException
> E       at org.apache.flink.api.common.io.FileOutputFormat.setWriteMode(FileOutputFormat.java:146)
> E       at org.apache.flink.api.java.DataSet.writeAsText(DataSet.java:1510)
> E       at org.apache.flink.table.sinks.CsvTableSink.emitDataSet(CsvTableSink.scala:76)
> E
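
For readers of the trace above: the first failure (the failing assertNotEqual on the two hashes) is easier to reason about with a standalone illustration. The sketch below is plain Python, not pyflink code; the class and field names are made up. It shows how an object whose hash ignores a field that equality does consider keeps producing identical hash values after mutation, which is exactly the symptom reported by the test.

{code:python}
class Config:
    """Toy stand-in for an execution config; NOT the pyflink implementation."""

    def __init__(self, parallelism=1):
        self.parallelism = parallelism

    def __eq__(self, other):
        return isinstance(other, Config) and self.parallelism == other.parallelism

    def __hash__(self):
        # Deliberately ignores parallelism, mirroring the symptom in the trace:
        # two unequal configs can still produce identical hash values.
        return hash(type(self).__name__)


c1, c2 = Config(), Config()
assert c1 == c2 and hash(c1) == hash(c2)

c1.parallelism = 12
assert c1 != c2                  # equality reflects the change
print(hash(c1) == hash(c2))      # True -> assertNotEqual(hash(c1), hash(c2)) fails
{code}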

[jira] [Commented] (FLINK-13085) unable to run python test locally

2019-07-03 Thread Dian Fu (JIRA)


[ https://issues.apache.org/jira/browse/FLINK-13085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16878255#comment-16878255 ]

Dian Fu commented on FLINK-13085:
---------------------------------

[~phoenixjiangnan] The exception occurs because the interface of the Java
CsvTableSource has changed, so you need to rebuild the Java package with
"mvn clean package -DskipTests" before executing "./dev/lint-python.sh".
