[ https://issues.apache.org/jira/browse/SPARK-39309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-39309:
------------------------------------

    Assignee:     (was: Apache Spark)

> '_SubTest' object has no attribute 'elapsed_time'
> -------------------------------------------------
>
>                 Key: SPARK-39309
>                 URL: https://issues.apache.org/jira/browse/SPARK-39309
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra, Tests
>    Affects Versions: 3.4.0
>            Reporter: Yikun Jiang
>            Priority: Major
>
> {code:java}
> Traceback (most recent call last):
>   File "/usr/lib/python3.9/runpy.py", line 197, in _run_module_as_main
>     return _run_code(code, main_globals, None,
>   File "/usr/lib/python3.9/runpy.py", line 87, in _run_code
>     exec(code, run_globals)
>   File "/__w/spark/spark/python/pyspark/pandas/tests/test_namespace.py", line 585, in <module>
>     unittest.main(testRunner=testRunner, verbosity=2)
>   File "/usr/lib/python3.9/unittest/main.py", line 101, in __init__
>     self.runTests()
>   File "/usr/lib/python3.9/unittest/main.py", line 271, in runTests
>     self.result = testRunner.run(self.test)
>   File "/usr/local/lib/python3.9/dist-packages/xmlrunner/xmlrunner.py", line 421, in run
>     result.printErrors()
>   File "/usr/lib/python3.9/unittest/runner.py", line 110, in printErrors
>     self.printErrorList('FAIL', self.failures)
>   File "/usr/local/lib/python3.9/dist-packages/xmlrunner/xmlrunner.py", line 217, in printErrorList
>     '%s [%.3fs]: %s' % (flavour, test_info.elapsed_time,
> AttributeError: '_SubTest' object has no attribute 'elapsed_time'
> {code}
> [https://github.com/xmlrunner/unittest-xml-reporting/issues/218#issuecomment-665088941]
>
> It looks like we need to switch xmlrunner to an up-to-date release of
> https://pypi.org/project/unittest-xml-reporting/

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org