OK right, you would have seen a different error otherwise.

Yes, profiles are only a compile-time thing, but they should affect the
effective POM for the artifact. mvn -Pscala-2.13 help:effective-pom shows
scala-parallel-collections as a direct dependency in the POM, as expected (not
inside a profile). However, I see what you see in the .pom in the release repo,
and in my local repo after building: the dependency is just sitting there
inside the profile, as if the profile hadn't been activated.
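
For reference, this is roughly the form help:effective-pom renders, and roughly
what I'd expect the published .pom to contain at the top level (version element
omitted here), rather than the same element wrapped inside the scala-2.13
profile:

<dependency>
  <groupId>org.scala-lang.modules</groupId>
  <artifactId>scala-parallel-collections_2.13</artifactId>
</dependency>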

I'm confused then; that shouldn't be what happens. I'd say maybe there is a
problem with the release script, but it also seems to affect a simple local
build. Can anyone more expert in this see the problem, while I try to debug
further? The binary distro may actually be fine; I'll check. It may also not
matter much for users who generally treat Spark as a compile-time-only
dependency. But I can see it would break exactly your case: something like a
self-contained test job.

On Thu, Aug 26, 2021 at 8:41 PM Stephen Coy <s...@infomedia.com.au> wrote:

> I did indeed.
>
> The generated spark-core_2.13-3.2.0.pom that is created alongside the jar
> file in the local repo contains:
>
> <profile>
>   <id>scala-2.13</id>
>   <dependencies>
>     <dependency>
>       <groupId>org.scala-lang.modules</groupId>
>       <artifactId>scala-parallel-collections_${scala.binary.version}</artifactId>
>     </dependency>
>   </dependencies>
> </profile>
>
> which means this dependency will be missing for unit tests that create
> SparkSessions from library code only, a technique inspired by Spark’s own
> unit tests.
>
> Cheers,
>
> Steve C
>
> On 27 Aug 2021, at 11:33 am, Sean Owen <sro...@gmail.com> wrote:
>
> Did you run ./dev/change-scala-version.sh 2.13? That's required first to
> update the POMs. It works fine for me.
>
> On Thu, Aug 26, 2021 at 8:33 PM Stephen Coy <s...@infomedia.com.au.invalid>
> wrote:
>
>> Hi all,
>>
>> Being adventurous, I have built the RC1 code with:
>>
>> -Pyarn -Phadoop-3.2  -Pyarn -Phadoop-cloud -Phive-thriftserver -Phive-2.3
>> -Pscala-2.13 -Dhadoop.version=3.2.2
>>
>>
>> And then attempted to build my Java-based Spark application.
>>
>> However, I found a number of our unit tests were failing with:
>>
>> java.lang.NoClassDefFoundError: scala/collection/parallel/TaskSupport
>>
>> at org.apache.spark.SparkContext.$anonfun$union$1(SparkContext.scala:1412)
>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>> at org.apache.spark.SparkContext.withScope(SparkContext.scala:789)
>> at org.apache.spark.SparkContext.union(SparkContext.scala:1406)
>> at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:698)
>> at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:184)
>>         …
>>
>>
>> I tracked this down to a missing dependency:
>>
>> <dependency>
>>   <groupId>org.scala-lang.modules</groupId>
>>   <artifactId>scala-parallel-collections_${scala.binary.version}</artifactId>
>> </dependency>
>>
>>
>> which unfortunately appears only inside a profile in the POM files published
>> for the various Spark artifacts.
>>
>> As far as I know, it is not possible for a consuming Maven build to activate
>> profiles defined in a dependency's POM.
>>
>> Therefore I suspect that right now a Scala 2.13 migration is not quite as
>> seamless as we would like.
>>
>> I stress that this is only an issue for developers that write unit tests
>> for their applications, as the Spark runtime environment will always have
>> the necessary dependencies available to it.
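>>
>> As a possible interim workaround (just a sketch, not something I have verified),
>> the dependency could be declared explicitly in the application's own pom.xml;
>> test scope should be enough for the unit-test case:
>>
>> <dependency>
>>   <groupId>org.scala-lang.modules</groupId>
>>   <artifactId>scala-parallel-collections_2.13</artifactId>
>>   <!-- assumption: pick the version that matches what the Spark build uses -->
>>   <version>1.0.3</version>
>>   <scope>test</scope>
>> </dependency>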
>>
>> (You might consider upgrading the
>> org.scala-lang.modules:scala-parallel-collections_2.13 version from 0.2 to
>> 1.0.3 though!)
>>
>> Cheers and thanks for the great work!
>>
>> Steve Coy
>>
>>
>> On 21 Aug 2021, at 3:05 am, Gengliang Wang <ltn...@gmail.com> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version
>> 3.2.0.
>>
>> The vote is open until 11:59pm Pacific time Aug 25 and passes if a
>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.0-rc1 (commit
>> 6bb3523d8e838bd2082fb90d7f3741339245c044):
>> https://github.com/apache/spark/tree/v3.2.0-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1388
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-docs/
>>
>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>
>> This release was built using the release script at the v3.2.0-rc1 tag.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env and install
>> the current RC to see if anything important breaks. In Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.0?
>> ===========================================
>> The current list of open tickets targeted at 3.2.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK
>> and search for "Target Version/s" = 3.2.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else should be retargeted to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is a regression that has not been
>> correctly targeted, please ping me or a committer to help target the issue.
>>
>
>
