It was my mistake; somehow I had set the io.compression.codec property to the
above-mentioned class. The problem is resolved now.
Thanks and Regards,
Sankar S.
On Wednesday, 27 August 2014, 1:23, S Malligarjunan smalligarju...@yahoo.com
wrote:
Hello all,
I have just checked out branch-1.1
and executed the commands below:
./bin/spark-shell --driver-memory 1G
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext.hql(CREATE TABLE IF NOT EXISTS src (key INT,
On Mon, Aug 25, 2014 at 9:57 AM, S Malligarjunan
smalligarju...@yahoo.com.invalid wrote:
Hello All,
I have added a jar from an S3 bucket to the classpath. I have tried the
following options:
1. sc.addJar("s3n://mybucket/lib/myUDF.jar")
2. hiveContext.sparkContext.addJar("s3n://mybucket/lib/myUDF.jar")
3. ./bin/spark-shell --jars s3n://mybucket/lib/myUDF.jar
I am getting a ClassNotFoundException when
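In Spark 1.x, spark-shell did not actually ship jars referenced by remote
s3n:// URLs to the driver classpath (hence the skipped-jar warning later in
this thread). A common workaround is to copy the jar to the local filesystem
first and register it from a local path. A minimal sketch, assuming the local
path and the class name com.example.MyUDF are placeholders, not names from
this thread:

```scala
// Sketch only: paths and the UDF class name below are hypothetical.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// A local path avoids the "skipped external jar" warning that a remote
// s3n:// URL triggers in spark-shell.
sc.addJar("/tmp/myUDF.jar")

// Register the UDF so Hive queries can resolve its class.
hiveContext.hql("CREATE TEMPORARY FUNCTION my_udf AS 'com.example.MyUDF'")
```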
Hello Yin,
An additional note:
With ./bin/spark-shell --jars s3n://mybucket/myudf.jar I got the following
message in the console:
Warning: skipped external jar..
Thanks and Regards,
Sankar S.
S Malligarjunan smalligarju...@yahoo.com wrote:
Hello Yin,
I have tried using sc.addJar
in order to specify the location of data
(i.e. using CREATE EXTERNAL TABLE user1 LOCATION). You can take a look at
this page for reference.
Thanks,
Yin
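A sketch of what Yin's suggestion could look like in the shell; the LOCATION
path below is hypothetical, and the column list is borrowed from the table
definition quoted elsewhere in this thread:

```scala
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// An EXTERNAL table leaves the data where it is; only metadata is created.
// The s3n:// path below is a placeholder, not taken from the original thread.
hiveContext.hql("""CREATE EXTERNAL TABLE user1 (time string, id string,
  u_id string, c_ip string, user_agent string)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n' STORED AS TEXTFILE
  LOCATION 's3n://mybucket/data/user1/'""")
```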
S Malligarjunan smalligarju...@yahoo.com wrote:
Hello Yin,
Forgot to mention one thing: the same query works fine in Hive and Shark.
Thanks and Regards,
Sankar S.
S Malligarjunan smalligarju...@yahoo.com wrote:
Hello Yin,
I have tried the create external table command as well. I get the same error.
Please help me to find
On Thu, Aug 21, 2014 at 11:12 PM, S Malligarjunan
smalligarju...@yahoo.com.invalid wrote:
Hello All,
When I execute the following query:
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
CREATE TABLE user1 (time string, id string, u_id string, c_ip string,
user_agent string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES
TERMINATED BY '\n' STORED AS TEXTFILE
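In spark-shell the DDL above has to be passed to the HiveContext as a string
rather than pasted as bare SQL; a minimal sketch using the hql method from
Spark 1.x (the statement itself is the one from the message above):

```scala
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Submit the DDL as a string; bare SQL pasted into the Scala shell will not
// parse. Triple quotes keep the '\t' and '\n' delimiters literal for Hive.
hiveContext.hql("""CREATE TABLE user1 (time string, id string, u_id string,
  c_ip string, user_agent string)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n' STORED AS TEXTFILE""")
```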
Hello Experts,
I would highly appreciate your input. Please suggest or give me a hint: what
could be the issue here?
Thanks and Regards,
Malligarjunan S.
On Thursday, 17 July 2014, 22:47, S Malligarjunan smalligarju...@yahoo.com
wrote:
Hello Experts,
I am facing a performance problem when I use a UDF function call. Please help
me to tune the query.
Please find the details below:
shark> select count(*) from table1;
OK
151096
Time taken: 7.242 seconds
shark> select count(*) from table2;
OK
938
Time taken: 1.273 seconds