Yikun commented on code in PR #37117:
URL: https://github.com/apache/spark/pull/37117#discussion_r918757225
########## python/pyspark/ml/util.py: ##########

```diff
@@ -536,10 +536,8 @@ def __get_class(clazz: str) -> Type[RL]:
         """
         parts = clazz.split(".")
         module = ".".join(parts[:-1])
-        m = __import__(module)
-        for comp in parts[1:]:
-            m = getattr(m, comp)
-        return m
+        m = __import__(module, fromlist=[parts[-1]])
+        return getattr(m, parts[-1])
```

Review Comment:
I tried an e2e test to help me understand; it looks like there are no semantic changes:

```python
>>> clazz = "pyspark"
>>> parts = clazz.split(".")
>>> module = ".".join(parts[:-1])
>>> m = __import__(module)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: Empty module name
>>> parts = clazz.split(".")
>>> module = ".".join(parts[:-1])
>>> m = __import__(module, fromlist=[parts[-1]])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: Empty module name
```

```python
>>> clazz = "pyspark.sql"
>>> # previous
>>> parts = clazz.split(".")
>>> module = ".".join(parts[:-1])
>>> m = __import__(module)
>>> for comp in parts[1:]:
...     m = getattr(m, comp)
...
>>> m
<module 'pyspark.sql' from '/Users/yikun/spark/python/pyspark/sql/__init__.py'>
>>> # new
>>> parts = clazz.split(".")
>>> module = ".".join(parts[:-1])
>>> m = __import__(module, fromlist=[parts[-1]])
>>> getattr(m, parts[-1])
<module 'pyspark.sql' from '/Users/yikun/spark/python/pyspark/sql/__init__.py'>
```

~~BTW, the `fromlist` value seems useless, just a placeholder in our case.~~ `fromlist` is what makes `getattr(m, parts[-1])` work, e.g. `m.sql`.

See also: https://stackoverflow.com/questions/2724260/why-does-pythons-import-require-fromlist

At this time I miss @zero323 a bit; maybe he could share some ideas?

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
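To make the `fromlist` semantics concrete without a Spark installation, here is a minimal stdlib-only sketch; `email.mime.text.MIMEText` is just a stand-in for the dotted `"module.Class"` strings that `__get_class` receives, not something from the PR:

```python
# Stand-in for the dotted "module.Class" strings __get_class handles.
clazz = "email.mime.text.MIMEText"
parts = clazz.split(".")
module = ".".join(parts[:-1])  # "email.mime.text"

# Previous approach: __import__ on a dotted name returns the TOP-LEVEL
# package ("email"), so the code had to walk down attribute by attribute.
m = __import__(module)
assert m.__name__ == "email"
for comp in parts[1:]:
    m = getattr(m, comp)  # email -> mime -> text -> MIMEText

# New approach: a non-empty fromlist makes __import__ return the leaf
# module itself, so a single getattr reaches the class.
leaf = __import__(module, fromlist=[parts[-1]])
assert leaf.__name__ == "email.mime.text"
assert getattr(leaf, parts[-1]) is m  # both paths resolve the same object
```

For comparison, `importlib.import_module(module)` also returns the leaf module directly and is the spelling the stdlib docs generally recommend over calling `__import__` by hand; the PR's change is the minimal-diff equivalent.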
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org