Mitch: You are using Scala 2.11 to do this. Have a look at Building Spark
<https://spark.apache.org/docs/latest/building-spark.html> "Spark requires
Scala 2.12/2.13; support for Scala 2.11 was removed in Spark 3.0.0."
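For anyone following along, a minimal build.sbt sketch with versions a current Spark supports, assuming Spark 3.3.x and Scala 2.12 (check the building-spark page for what is actually current):

```scala
// build.sbt -- sketch only; pick the versions your cluster actually runs
scalaVersion := "2.12.17"

// Spark 3.x artifacts are published only for Scala 2.12 and 2.13;
// %% appends the matching binary suffix (_2.12) automatically.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2" % "provided"
```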

søn. 27. feb. 2022 kl. 20:55 skrev Mich Talebzadeh <
mich.talebza...@gmail.com>:

> OK, I decided to give Maven a try.
>
> Downloaded Maven and unzipped it in a WSL-Ubuntu terminal: unzip
> apache-maven-3.8.4-bin.zip
>
> Then set the Windows environment variable MVN_HOME and added the bin
> directory to PATH in Windows. Restarted IntelliJ to pick up the correct path.
>
> Then, on the command line in IntelliJ, run:
>
> *mvn -v*
> Apache Maven 3.8.4 (9b656c72d54e5bacbed989b64718c159fe39b537)
> Maven home: d:\temp\apache-maven-3.8.4
> Java version: 1.8.0_73, vendor: Oracle Corporation, runtime: C:\Program
> Files\Java\jdk1.8.0_73\jre
> Default locale: en_GB, platform encoding: Cp1252
> OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"
>
> In IntelliJ, add Maven support to your project. Follow this link: Add Maven
> support to an existing project
> <https://www.jetbrains.com/help/idea/convert-a-regular-project-into-a-maven-project.html>
>
> A pom.xml file will appear under the project directory.
>
> [image: image.png]
>
> Edit that pom.xml file and add the following:
>
> <?xml version="1.0" encoding="UTF-8"?>
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>
>     <groupId>spark</groupId>
>     <artifactId>MichTest</artifactId>
>     <version>1.0</version>
>
>     <properties>
>         <maven.compiler.source>8</maven.compiler.source>
>         <maven.compiler.target>8</maven.compiler.target>
>     </properties>
>     <dependencies>
>     <dependency>
>         <groupId>org.scala-lang</groupId>
>         <artifactId>scala-library</artifactId>
>         <version>2.11.7</version>
>     </dependency>
>     <dependency>
>         <groupId>org.apache.spark</groupId>
>         <artifactId>spark-core_2.11</artifactId>
>         <version>2.0.0</version>
>     </dependency>
>     <dependency>
>         <groupId>org.apache.spark</groupId>
>         <artifactId>spark-sql_2.11</artifactId>
>         <version>2.0.0</version>
>     </dependency>
>     </dependencies>
> </project>
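> One pitfall with a pom like this: the suffix on the Spark artifact ids
> (_2.10, _2.11, ...) must match the scala-library version, otherwise you hit
> binary incompatibility at runtime. A common pattern, sketched here (the
> property names are a convention, not required by Maven), keeps both in
> properties so they cannot drift:
>
> ```xml
> <properties>
>     <scala.binary.version>2.11</scala.binary.version>
>     <spark.version>2.0.0</spark.version>
> </properties>
> <dependencies>
>     <dependency>
>         <groupId>org.apache.spark</groupId>
>         <!-- resolves to spark-sql_2.11, matching scala-library 2.11.x -->
>         <artifactId>spark-sql_${scala.binary.version}</artifactId>
>         <version>${spark.version}</version>
>     </dependency>
> </dependencies>
> ```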
>
> In IntelliJ, open a Terminal in the project subdirectory where the pom.xml
> you just edited lives.
>
>
>  *mvn clean*
>
> [INFO] Scanning for projects...
> [INFO]
> [INFO] ---------------------------< spark:MichTest >---------------------------
> [INFO] Building MichTest 1.0
> [INFO] --------------------------------[ jar ]---------------------------------
> [INFO]
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ MichTest ---
> [INFO] Deleting D:\temp\intellij\MichTest\target
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time:  4.451 s
> [INFO] Finished at: 2022-02-27T19:37:57Z
> [INFO] ------------------------------------------------------------------------
>
> *mvn compile*
>
> [INFO] Scanning for projects...
> [INFO]
> [INFO] ---------------------------< spark:MichTest >---------------------------
> [INFO] Building MichTest 1.0
> [INFO] --------------------------------[ jar ]---------------------------------
> [INFO]
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ MichTest ---
> [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
> [INFO] Copying 0 resource
> [INFO]
> [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ MichTest ---
> [INFO] Nothing to compile - all classes are up to date
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time:  1.242 s
> [INFO] Finished at: 2022-02-27T19:38:58Z
> [INFO] ------------------------------------------------------------------------
>
> Now create the package
>
>  *mvn package*
>
> [INFO] Scanning for projects...
> [INFO]
> [INFO] ---------------------------< spark:MichTest >---------------------------
> [INFO] Building MichTest 1.0
> [INFO] --------------------------------[ jar ]---------------------------------
> [INFO]
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ MichTest ---
> [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
> [INFO] Copying 0 resource
> [INFO]
> [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ MichTest ---
> [INFO] Nothing to compile - all classes are up to date
> [INFO]
> [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ MichTest ---
> [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory D:\temp\intellij\MichTest\src\test\resources
> [INFO]
> [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ MichTest ---
> [INFO] Nothing to compile - all classes are up to date
> [INFO]
> [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ MichTest ---
> [INFO] No tests to run.
> [INFO]
> [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ MichTest ---
> [INFO] *Building jar: D:\temp\intellij\MichTest\target\scala-2.11\MichTest-1.0.jar*
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time:  1.511 s
> [INFO] Finished at: 2022-02-27T19:40:22Z
> [INFO] ------------------------------------------------------------------------
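> With the jar built, it can be run with spark-submit; a sketch, assuming the
> SparkSessionTest object from the Main.scala shown further down this thread
> as the main class:
>
> ```shell
> # sketch only: submit the packaged jar to a local master
> spark-submit --class org.example.SparkSessionTest --master "local[1]" D:\temp\intellij\MichTest\target\scala-2.11\MichTest-1.0.jar
> ```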
>
> sbt should work from the same directory as well.
>
> I find it easier to package from the command line in the IntelliJ Terminal.
>
> Let me know if this is a wrong approach or if there are better ways of
> doing it.
>
> HTH
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Sun, 27 Feb 2022 at 10:05, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
>> Got curious with this intellij stuff.
>>
>> I recall using sbt rather than Maven, so go to the terminal in your IntelliJ
>> and verify what is installed:
>>
>>  sbt -version
>> sbt version in this project: 1.3.4
>> sbt script version: 1.3.4
>>
>>  scala -version
>>
>> Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL
>>
>> java --version
>> openjdk 11.0.7 2020-04-14
>> OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.7+10)
>> OpenJDK 64-Bit Server VM AdoptOpenJDK (build 11.0.7+10, mixed mode)
>>
>> For now, in the directory where you have Main.scala, create a build.sbt file:
>>
>> // The simplest possible sbt build file is just one line:
>>
>> scalaVersion := "2.11.7"
>> // That is, to create a valid sbt build, all you've got to do is define the
>> // version of Scala you'd like your project to use.
>>
>> // To learn more about multi-project builds, head over to the official sbt
>> // documentation at http://www.scala-sbt.org/documentation.html
>> libraryDependencies += "org.scala-lang.modules" %% 
>> "scala-parser-combinators" % "1.1.2"
>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
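>> A note on %%: it appends the Scala binary suffix to the artifact name, so
>> with scalaVersion 2.11.7 the spark-sql line above resolves to the artifact
>> spark-sql_2.11. Spelled out with a single %, it would be:
>>
>> ```scala
>> // equivalent to the %% form above, with the suffix written explicitly
>> libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
>> ```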
>>
>>
>> This is my Main.scala file, a copy from sparkbyexamples
>> <https://sparkbyexamples.com/spark/spark-setup-run-with-scala-intellij/>
>>
>>
>> package org.example
>>
>> import org.apache.spark.sql.SparkSession
>>
>> object SparkSessionTest extends App {
>>   val spark = SparkSession.builder()
>>     .master("local[1]")
>>     .appName("SparkByExample")
>>     .getOrCreate()
>>
>>   println("First SparkContext:")
>>   println("APP Name :" + spark.sparkContext.appName)
>>   println("Deploy Mode :" + spark.sparkContext.deployMode)
>>   println("Master :" + spark.sparkContext.master)
>>
>>   val sparkSession2 = SparkSession.builder()
>>     .master("local[1]")
>>     .appName("SparkByExample-test")
>>     .getOrCreate()
>>
>>   println("Second SparkContext:")
>>   println("APP Name :" + sparkSession2.sparkContext.appName)
>>   println("Deploy Mode :" + sparkSession2.sparkContext.deployMode)
>>   println("Master :" + sparkSession2.sparkContext.master)
>> }
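>> One thing worth knowing about this example: getOrCreate() returns the
>> already-active session when one exists, so sparkSession2 is the same
>> session as spark and the second appName never takes effect. A sketch of a
>> quick check, to be run inside the same application:
>>
>> ```scala
>> // Both builders hand back the same underlying SparkSession instance.
>> println(spark eq sparkSession2)             // true
>> println(sparkSession2.sparkContext.appName) // "SparkByExample"
>> ```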
>>
>>  Go back to the Terminal, in the directory where you have both build.sbt
>> and Main.scala.
>>
>>
>> *sbt clean*
>>
>> [info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
>> [info] Loading project definition from D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
>> [info] Loading settings for project scala from build.sbt ...
>> [info] Set current project to MichTest (in build file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
>> [success] Total time: 0 s, completed Feb 27, 2022 9:54:10 AM
>>
>> *sbt compile*
>>
>> [info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
>> [info] Loading project definition from D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
>> [info] Loading settings for project scala from build.sbt ...
>> [info] Set current project to MichTest (in build file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
>> [info] Executing in batch mode. For better performance use sbt's shell
>> [warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
>> [info] Compiling 1 Scala source to D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\target\scala-2.11\classes ...
>> [success] Total time: 5 s, completed Feb 27, 2022 9:55:10 AM
>>
>>  *sbt package*
>>
>> [info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
>> [info] Loading project definition from D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
>> [info] Loading settings for project scala from build.sbt ...
>> [info] Set current project to MichTest (in build file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
>> [success] Total time: 1 s, completed Feb 27, 2022 9:56:48 AM
>>
>>  *ls*
>>
>>     Directory: D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala
>>
>> Mode                 LastWriteTime         Length Name
>> ----                 -------------         ------ ----
>> d-----         2/27/2022   7:04 AM                null
>> d-----         2/27/2022   8:33 AM                project
>> d-----         2/27/2022   9:08 AM                spark-warehouse
>> d-----         2/27/2022   9:55 AM                target
>> -a----         2/27/2022   9:17 AM           3511 build.sbt
>>
>> Note that you have a target directory and underneath it scala-2.11 (in my
>> case) and the jar file michtest_2.11-1.0.jar
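>> Strictly speaking, sbt package builds a plain jar (your classes only). If
>> an uber jar with dependencies bundled is needed, the usual route is the
>> sbt-assembly plugin; a sketch, the plugin version being an assumption to
>> check against your sbt release:
>>
>> ```scala
>> // project/plugins.sbt -- sketch; see the sbt-assembly docs for the
>> // version matching your sbt release
>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
>> ```
>>
>> After that, sbt assembly produces the fat jar under target/scala-2.11/.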
>>
>> *ls*
>>
>>     Directory: D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\target\scala-2.11
>>
>> Mode                 LastWriteTime         Length Name
>> ----                 -------------         ------ ----
>> d-----         2/27/2022   9:55 AM                classes
>> d-----         2/27/2022   9:55 AM                update
>> -a----         2/27/2022   9:56 AM           3938 *michtest_2.11-1.0.jar*
>>
>>
>> This is old stuff, but it still shows how to create a jar file with sbt.
>>
>> HTH
>>
>>
>>
>>
>>
>>
>> On Sat, 26 Feb 2022 at 22:48, Sean Owen <sro...@gmail.com> wrote:
>>
>>> I don't think any of that is related, no.
>>> How are your dependencies set up? Manually with IJ, or in a build file
>>> (Maven, Gradle)? Normally you do the latter and dependencies are taken care
>>> of for you, but your app would definitely have to express a dependency on
>>> the Scala libs.
>>>
>>> On Sat, Feb 26, 2022 at 4:25 PM Bitfox <bit...@bitfox.top> wrote:
>>>
>>>> Java SDK installed?
>>>>
>>>> On Sun, Feb 27, 2022 at 5:39 AM Sachit Murarka <connectsac...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hello ,
>>>>>
>>>>> Thanks for replying. I have installed the Scala plugin in IntelliJ first,
>>>>> but it still gives the same error:
>>>>>
>>>>> Cannot find project Scala library 2.12.12 for module SparkSimpleApp
>>>>>
>>>>> Thanks
>>>>> Rajat
>>>>>
>>>>> On Sun, Feb 27, 2022, 00:52 Bitfox <bit...@bitfox.top> wrote:
>>>>>
>>>>>> You need to install Scala first; the current version for Spark is
>>>>>> 2.12.15.
>>>>>> I would suggest installing Scala via sdk (SDKMAN), which works great.
>>>>>>
>>>>>> Thanks
>>>>>>
>>>>>> On Sun, Feb 27, 2022 at 12:10 AM rajat kumar <
>>>>>> kumar.rajat20...@gmail.com> wrote:
>>>>>>
>>>>>>> Hello Users,
>>>>>>>
>>>>>>> I am trying to create a Spark application using Scala (IntelliJ).
>>>>>>> I have installed the Scala plugin in IntelliJ but still get the error below:
>>>>>>>
>>>>>>> Cannot find project Scala library 2.12.12 for module SparkSimpleApp
>>>>>>>
>>>>>>>
>>>>>>> Could anyone please help what I am doing wrong?
>>>>>>>
>>>>>>> Thanks
>>>>>>>
>>>>>>> Rajat
>>>>>>>
>>>>>>

-- 
Bjørn Jørgensen
Vestre Aspehaug 4, 6010 Ålesund
Norge

+47 480 94 297
