Hi All,
I am trying to access Hive from Spark but am getting this exception:
The root scratch dir: /tmp/hive on HDFS should be writable. Current
permissions are: rw-rw-rw-
Code:
String logFile = "hdfs://hdp23ha/logs"; // Should be some file on
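This error usually means the Hive scratch directory is missing the execute bit (rw-rw-rw- has no x). A common fix, assuming the default /tmp/hive location, is to open up the permissions on both HDFS and the local filesystem:

```shell
# Grant full permissions on the Hive scratch directory (HDFS and local)
hdfs dfs -chmod -R 777 /tmp/hive
chmod -R 777 /tmp/hive
```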
Hi All,
I am very new to Spark MLlib. I am trying to understand and implement Spark
MLlib's LDA algorithm.
The goal is to get the topics present in the given documents, and the terms
within those topics.
I followed the link below:
https://gist.github.com/jkbradley/ab8ae22a8282b2c8ce33
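For reference, a minimal sketch of the MLlib LDA flow, along the lines of that gist (the input path and k value are illustrative; each input line is assumed to be a space-separated term-count vector):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.LDA
import org.apache.spark.mllib.linalg.Vectors

val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
val sc = new SparkContext(conf)

// Each line is a document's term-count vector, e.g. "1 0 3 0 2"
val data = sc.textFile("data/sample_lda_data.txt")
val parsed = data.map(s => Vectors.dense(s.trim.split(' ').map(_.toDouble)))

// LDA expects (documentId, termCountVector) pairs
val corpus = parsed.zipWithIndex.map(_.swap).cache()

val model = new LDA().setK(3).run(corpus)

// topicsMatrix is terms x topics; each column holds a topic's term weights
println(model.topicsMatrix)
```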
This property already exists.
-Original Message-
From: "ashesh_28 [via Apache Spark User List]"
<ml-node+s1001560n26769...@n3.nabble.com>
Sent: 4/13/2016 11:02 AM
To: "Amit Singh Hora" <hora.a...@gmail.com>
Subject: Re: Unable to Access files in Hadoop
I am trying to access a directory in Hadoop from my Spark code on a local
machine. Hadoop is HA-enabled.
val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
val sc = new SparkContext(conf)
val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
println(distFile.count())
but
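For a local client to resolve a logical HA nameservice like "hdpha", the HDFS HA client properties have to be available, either via hdfs-site.xml on the classpath or set programmatically. A sketch, assuming two namenodes whose hostnames below are placeholders:

```scala
// Make the logical nameservice "hdpha" resolvable from a local client.
// Property names come from the HDFS HA docs; hostnames/ports are placeholders.
val hc = sc.hadoopConfiguration
hc.set("fs.defaultFS", "hdfs://hdpha")
hc.set("dfs.nameservices", "hdpha")
hc.set("dfs.ha.namenodes.hdpha", "nn1,nn2")
hc.set("dfs.namenode.rpc-address.hdpha.nn1", "namenode1.example.com:8020")
hc.set("dfs.namenode.rpc-address.hdpha.nn2", "namenode2.example.com:8020")
hc.set("dfs.client.failover.proxy.provider.hdpha",
  "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")
```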
Hi all,
I am using Cloudera's SparkOnHBase to bulk insert into HBase. Please find
the code below:
object test {
  def main(args: Array[String]): Unit = {
    val conf = ConfigFactory.load("connection.conf").getConfig("connection")
    val
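For context, a minimal SparkOnHBase bulk-insert sketch using HBaseContext.bulkPut (table name, column family, and the `sc` variable are assumed; the put.add signature matches the pre-1.0 HBase client of that era):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.util.Bytes
import com.cloudera.spark.hbase.HBaseContext

// Sketch of a SparkOnHBase bulk insert; names are placeholders
val hbaseConf = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, hbaseConf)
val rdd = sc.parallelize(Seq(("row1", "value1"), ("row2", "value2")))

hbaseContext.bulkPut[(String, String)](rdd, "testTable",
  { case (rowKey, value) =>
    val put = new Put(Bytes.toBytes(rowKey))
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
    put
  },
  autoFlush = false)
```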
Hi All,
I am trying to write an RDD as a sequence file into my Hadoop cluster but am
getting connection timeouts again and again. I can ping the Hadoop cluster,
and the directory gets created with the file name I specify, so I believe I
am missing some configuration. Kindly help me.
object
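As a reference point, a minimal sketch of writing an RDD as a sequence file (the HDFS URI is a placeholder). Note that the namenode's RPC port (typically 8020) must actually be reachable from the client, not just the host pingable; ping succeeding while the RPC port is blocked matches the "directory created, then timeout" symptom:

```scala
// Sketch: write a pair RDD as a Hadoop sequence file
val data = sc.parallelize(Seq(("key1", 1), ("key2", 2)))
data.saveAsSequenceFile("hdfs://hdpha/output/seq")
```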
Hi All ,
My Spark job started reporting ZooKeeper errors. After looking at the zkdumps
from the HBase master, I realized that a large number of connections are
being made from the nodes where the Spark workers are running. I believe the
connections are somehow not getting closed, and that is leading to the error.
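A common pattern for this symptom is to open the HBase connection inside foreachPartition and close it in a finally block, so every executor releases its ZooKeeper session. A sketch using the HBase client API of that era (table name and `rdd` are placeholders):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HConnectionManager

rdd.foreachPartition { partition =>
  // One connection per partition, closed deterministically
  val conf = HBaseConfiguration.create()
  val connection = HConnectionManager.createConnection(conf)
  try {
    val table = connection.getTable("testTable")
    partition.foreach { record =>
      // ... build and execute Puts here ...
    }
    table.close()
  } finally {
    connection.close() // otherwise the ZooKeeper connection leaks
  }
}
```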
Hi All,
I am using the code below to stream data from Kafka to HBase. Everything
works fine until I restart the job so that it can restore its state from the
checkpoint directory, but while trying to restore the state it gives me the
error below:
ge 0.0 (TID 0, localhost): java.lang.ClassCastException:
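For checkpoint recovery, Spark Streaming expects the StreamingContext.getOrCreate pattern, with all DStream setup inside the factory function, and the same application jar across runs; restoring a checkpoint written by different (or recompiled anonymous-function) classes is a common cause of ClassCastException on restart. A sketch (checkpoint path and `sparkConf` are placeholders):

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

def createContext(): StreamingContext = {
  val ssc = new StreamingContext(sparkConf, Seconds(2))
  ssc.checkpoint("hdfs://hdpha/checkpoints/kafka-to-hbase")
  // define the Kafka input stream and HBase output here,
  // inside the factory, never outside it
  ssc
}

// Restores from the checkpoint if present, otherwise builds a fresh context
val ssc = StreamingContext.getOrCreate(
  "hdfs://hdpha/checkpoints/kafka-to-hbase", createContext _)
```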
I am running Spark locally to understand how countByValueAndWindow works.
val Array(brokers, topics) = Array("192.XX.X.XX:9092", "test1")
// Create context with 2 second batch interval
val sparkConf = new
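For comparison, a minimal local sketch of countByValueAndWindow: it counts occurrences of each distinct value over a 10-second window, recomputed every 2 seconds (the batch interval). The socket source and checkpoint path are illustrative:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sparkConf = new SparkConf().setAppName("WindowDemo").setMaster("local[2]")
val ssc = new StreamingContext(sparkConf, Seconds(2))
ssc.checkpoint("/tmp/window-checkpoint") // required for windowed operations

val lines = ssc.socketTextStream("localhost", 9999)
val counts = lines.flatMap(_.split(" "))
  .countByValueAndWindow(Seconds(10), Seconds(2))
counts.print()

ssc.start()
ssc.awaitTermination()
```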
Hi All,
I have downloaded the pre-built Spark 1.1.1 for Hadoop 2.3.0, then ran mvn
install for the jar spark-assembly-1.1.1-hadoop2.3.0.jar available in the lib
folder of the Spark download, and added its dependency as follows in my
Java program:
<dependency>
  <groupId>org.apache.spark</groupId>
. The table contains 10 lakh (1,000,000) rows.
How many rows are there in the table ?
nit: Example uses classOf[TableInputFormat] instead of
TableInputFormat.class.
Cheers
On Wed, Aug 6, 2014 at 5:54 AM, Amit Singh Hora [hidden email] wrote:
Hi All,
I am