Hi Luis,

Right...

I manage all my Spark "things" through Maven; by that I mean I have a pom.xml 
with all the dependencies in it. Here it is:

<project xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>app</groupId>
        <artifactId>main</artifactId>
        <version>0.0.1</version>
        <build>
                <plugins>
                        <plugin>
                                <artifactId>maven-compiler-plugin</artifactId>
                                <version>3.3</version>
                                <configuration>
                                        <source>1.7</source>
                                        <target>1.7</target>
                                </configuration>
                        </plugin>
                </plugins>
        </build>
        <dependencies>
                <dependency>
                        <groupId>mysql</groupId>
                        <artifactId>mysql-connector-java</artifactId>
                        <version>5.1.6</version>
                </dependency>
                <dependency>
                        <groupId>org.hibernate</groupId>
                        <artifactId>hibernate-core</artifactId>
                        <version>5.2.0.Final</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-core_2.10</artifactId>
                        <version>1.6.2</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-sql_2.10</artifactId>
                        <version>1.6.2</version>
                        <scope>provided</scope>
                </dependency>
                <dependency>
                        <groupId>com.databricks</groupId>
                        <artifactId>spark-csv_2.10</artifactId>
                        <version>1.4.0</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.commons</groupId>
                        <artifactId>commons-lang3</artifactId>
                        <version>3.4</version>
                </dependency>
                <dependency>
                        <groupId>joda-time</groupId>
                        <artifactId>joda-time</artifactId>
                        <version>2.9.4</version>
                </dependency>
        </dependencies>
</project>


When I run the application, I run it not through Maven but through Eclipse, as 
a run configuration.

At no point do I see or set a SPARK_HOME. I tried setting it programmatically as 
well, but Spark does not pick it up.

I do not connect to a Spark cluster (yet), I just run on my machine...
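
One thing I am not sure about: with no SPARK_HOME, is src/main/resources the right 
place to put a log4j.properties so that it ends up on the classpath when Eclipse 
launches the app? If so, would something along these lines (console appender at 
ERROR, Spark itself turned down to WARN) be expected to work?

# sketch: everything to the console at ERROR, Spark's own packages down to WARN
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.logger.org.apache.spark=WARN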

I hope it is clear, just started spark'ing recently...

jg




> On Jul 4, 2016, at 6:28 PM, Luis Mateos <luismat...@gmail.com> wrote:
> 
> Hi Jean, 
> 
> What do you mean by "running everything through maven"? Usually, applications 
> are compiled using maven and then launched by using the 
> $SPARK_HOME/bin/spark-submit script. It might be helpful to provide us more 
> details on how you are running your application.
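> 
> For reference, a typical launch looks something like this:
> 
> # main class and jar path below are placeholders
> $SPARK_HOME/bin/spark-submit --class com.example.MyApp --master local[*] target/your-app.jar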
> 
> Regards,
> Luis
> 
> On 4 July 2016 at 16:57, Jean Georges Perrin <j...@jgp.net> wrote:
> Hey Anupam,
> 
> Thanks... but no:
> 
> I tried:
> 
>               SparkConf conf = new SparkConf().setAppName("my app").setMaster("local");
>               JavaSparkContext javaSparkContext = new JavaSparkContext(conf);
>               javaSparkContext.setLogLevel("WARN");
>               SQLContext sqlContext = new SQLContext(javaSparkContext);
> 
> and
> 
>               SparkConf conf = new SparkConf().setAppName("my app").setMaster("local");
>               SparkContext sc = new SparkContext(conf);
>               sc.setLogLevel("WARN");
>               SQLContext sqlContext = new SQLContext(sc);
> 
> and they are still very upset at my console :)...
> 
> 
>> On Jul 4, 2016, at 5:28 PM, Anupam Bhatnagar <anupambhatna...@gmail.com> wrote:
>> 
>> Hi Jean,
>> 
>> How about using sc.setLogLevel("WARN")? You may add this statement after 
>> initializing the Spark Context. 
>> 
>> From the Spark API: "Valid log levels include: ALL, DEBUG, ERROR, FATAL, 
>> INFO, OFF, TRACE, WARN". Here's the link to the SparkContext API:
>> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext
>> 
>> Hope this helps,
>> Anupam
>>   
>> 
>> 
>> 
>> On Mon, Jul 4, 2016 at 2:18 PM, Jean Georges Perrin <j...@jgp.net> wrote:
>> Thanks Mich, but what is SPARK_HOME when you run everything through Maven?
>> 
>>> On Jul 4, 2016, at 5:12 PM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>>> 
>>> check $SPARK_HOME/conf
>>> 
>>> copy file log4j.properties.template to log4j.properties
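>>> 
>>> e.g.:
>>> 
>>> # assuming the default conf directory
>>> cp $SPARK_HOME/conf/log4j.properties.template $SPARK_HOME/conf/log4j.properties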
>>> 
>>> edit log4j.properties and set the log levels to your needs
>>> 
>>> cat log4j.properties
>>> 
>>> # Set everything to be logged to the console
>>> log4j.rootCategory=ERROR, console
>>> log4j.appender.console=org.apache.log4j.ConsoleAppender
>>> log4j.appender.console.target=System.err
>>> log4j.appender.console.layout=org.apache.log4j.PatternLayout
>>> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>>> # Settings to quiet third party logs that are too verbose
>>> log4j.logger.org.spark-project.jetty=WARN
>>> log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
>>> log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
>>> log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
>>> log4j.logger.org.apache.parquet=ERROR
>>> log4j.logger.parquet=ERROR
>>> # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
>>> log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
>>> log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
>>> 
>>> HTH
>>> 
>>> 
>>> 
>>> Dr Mich Talebzadeh
>>>  
>>> 
>>> On 4 July 2016 at 21:56, Jean Georges Perrin <j...@jgp.net> wrote:
>>> Hi,
>>> 
>>> I have installed Apache Spark via Maven.
>>> 
>>> How can I control the volume of logging it displays on my system? I tried 
>>> different locations for a log4j.properties, but none seems to work for me.
>>> 
>>> Thanks for the help...
>>> 
>>> 
>> 
>> 
> 
> 
