Not sure, but you can create that path on all the workers and put that file in
it.
Thanks
Best Regards
On Thu, Mar 26, 2015 at 1:56 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
Could you try putting that file in HDFS and loading it like:
LOAD DATA INPATH 'hdfs://sigmoid/test/kv1.txt' INTO TABLE src_spark
Thanks
Best Regards
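The suggestion above can be sketched end to end. The HDFS URI and table name are the ones from this thread; the local source path and the spark-sql invocation are assumptions, not something the poster confirmed:

```shell
# Stage the file in HDFS so it is reachable from every node
# (local source path /tmp/kv1.txt is a placeholder):
hadoop fs -mkdir -p hdfs://sigmoid/test
hadoop fs -put /tmp/kv1.txt hdfs://sigmoid/test/kv1.txt

# Then load from HDFS instead of the driver's local filesystem:
spark-sql -e "LOAD DATA INPATH 'hdfs://sigmoid/test/kv1.txt' INTO TABLE src_spark"
```

Note that LOAD DATA INPATH without LOCAL moves the file within HDFS into the table's warehouse directory, so the source file disappears from hdfs://sigmoid/test/ after the load.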
On Thu, Mar 26, 2015 at 2:07 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
I don't think that's correct. LOAD DATA LOCAL should pick input from the local
directory.
On Thu, Mar 26, 2015 at 1:59 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
The Hive command
LOAD DATA LOCAL INPATH
'/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/examples/src/main/resources/kv1.txt'
INTO TABLE src_spark
1. It is LOCAL INPATH. If I push the file to HDFS, how will it work?
2. I can't use sc.addFile, because I want to run Hive (Spark SQL) queries.
On Thu, Mar
When you run it in local mode ^^
Thanks
Best Regards
On Thu, Mar 26, 2015 at 2:06 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
I am now seeing this error.
15/03/25 19:44:03 ERROR yarn.ApplicationMaster: User class threw exception:
FAILED: SemanticException Line 1:23 Invalid path
''examples/src/main/resources/kv1.txt'': No files matching path
Try to give the complete path to the file kv1.txt.
On 26 Mar 2015 11:48, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
Does not work
15/03/26 01:07:05 INFO HiveMetaStore.audit: ugi=dvasthimal
ip=unknown-ip-addr cmd=get_table : db=default tbl=src_spark
15/03/26 01:07:06 ERROR ql.Driver: FAILED: SemanticException Line 1:23
Invalid path
Now it's clear that the workers don't have the file kv1.txt on their local
filesystem. You can try putting it in HDFS and using the URI to that file, or
try adding the file with sc.addFile.
Thanks
Best Regards
On Thu, Mar 26, 2015 at 1:38 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
I am facing the same issue and posted a new thread. Please respond.
On Wed, Jan 14, 2015 at 4:38 AM, Zhan Zhang zzh...@hortonworks.com wrote:
Hi Folks,
I am trying to run HiveContext in yarn-cluster mode, but hit some errors.
Does anybody know what causes the issue?
I use following cmd to build
I solved this by increasing the PermGen memory size in the driver:
-XX:MaxPermSize=512m
Thanks.
Zhan Zhang
On Mar 25, 2015, at 10:54 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
Where and how do I pass this or other JVM arguments?
-XX:MaxPermSize=512m
On Wed, Mar 25, 2015 at 11:36 PM, Zhan Zhang zzh...@hortonworks.com wrote:
You can do it in $SPARK_HOME/conf/spark-defaults.conf:
spark.driver.extraJavaOptions -XX:MaxPermSize=512m
Thanks.
Zhan Zhang
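For reference, the setting above can live in spark-defaults.conf or be passed per job at submit time. A minimal sketch; the application class and jar name are placeholders, not from this thread:

```shell
# Option 1: add to $SPARK_HOME/conf/spark-defaults.conf so it applies
# to every submission from this machine:
#   spark.driver.extraJavaOptions  -XX:MaxPermSize=512m

# Option 2: pass it for a single job on the spark-submit command line:
spark-submit \
  --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=512m" \
  --class com.example.HiveJob hive-job.jar   # class/jar are placeholders
```

In yarn-cluster mode the driver runs inside the YARN application master, so this option must be set at submit time (in spark-defaults.conf or via --conf); setting it programmatically after the driver JVM has started is too late.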
On Mar 25, 2015, at 7:25 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
Hi Folks,
I am trying to run HiveContext in yarn-cluster mode, but hit some errors. Does
anybody know what causes the issue?
I use following cmd to build the distribution:
./make-distribution.sh -Phive -Phive-thriftserver -Pyarn -Phadoop-2.4
15/01/13 17:59:42 INFO