Hi Folks
This might not sound appropriate, but I want to collect a large dataset (~15 GB)
on the driver, do some processing there, and broadcast the result back to each node.
Is there any option to collect the data off-heap, the way we can store an RDD off-heap,
i.e. collect the data onto the Tachyon FS?
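Not a definitive answer, just a sketch of the knobs that existed in the Spark 1.x era: collect() always materializes the data inside the driver process, so there is no true "collect off-heap". The usual workaround is to keep the data as an RDD persisted off-heap (backed by Tachyon via the external block store) and join or broadcast from there. The config keys follow the Spark 1.5 docs and the host/path values are placeholders.

```python
# Sketch (Spark 1.x era): persist an RDD off-heap in Tachyon instead of
# collecting it to the driver. Host, port, and baseDir are placeholders.
tachyon_conf = {
    "spark.externalBlockStore.url": "tachyon://tachyon-master:19998",
    "spark.externalBlockStore.baseDir": "/spark",
}

# With a real SparkContext `sc` built from this conf you would then do:
#   from pyspark import StorageLevel
#   rdd.persist(StorageLevel.OFF_HEAP)
#
# collect() itself has no off-heap variant (it deserializes into the
# driver process), so a 15 GB collect needs driver memory regardless;
# keeping the data distributed and joining is usually the safer design.
```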
Hi folks
I need to repartition a large dataset (~300 GB), as I can see that some partitions
hold far more data than others (data skew).
I have pairRDDs of the form [(k, v), (k, v), (k, v)].
What is the best way to solve this problem?
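One common remedy for key skew, sketched below under assumed names: "salt" the hot keys so one heavy key is spread over several partitions, do the shuffle-heavy work, then strip the salt. The salt count and the commented RDD calls are illustrative, not a fixed recipe.

```python
import random

NUM_SALTS = 10  # assumption: tune to the observed skew

def add_salt(pair, num_salts=NUM_SALTS):
    """Turn (key, value) into ((key, salt), value) so one hot key is
    spread across num_salts shuffle partitions instead of one."""
    key, value = pair
    return ((key, random.randrange(num_salts)), value)

def remove_salt(pair):
    """Undo the salting after the skew-sensitive step."""
    (key, _salt), value = pair
    return (key, value)

# With a real pairRDD (assumes a SparkContext `sc`):
#   salted = rdd.map(add_salt)
#   ...do the shuffle-heavy work on `salted` (e.g. reduceByKey)...
#   result = salted.map(remove_salt).reduceByKey(combine_partials)
```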
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
hi
I tried to build the latest master branch of Spark with:
build/mvn -DskipTests clean package
Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ... SUCCESS [03:46 min]
[INFO] Spark Project Test Tags ... SUCCESS [01:02 min]
[INFO] Spark Project Laun
> For example, EMR in AWS has a job-submit UI.
>
> spark-submit just calls a REST API; you could build any UI you want on top of
> that...
>
> On Tue, Oct 6, 2015 at 9:37 AM, shahid qadri <shahidashr...@icloud.com> wrote:
Hi Folks
How can I submit my Spark app (Python) to the cluster without using
spark-submit? I actually need to invoke jobs from a UI.
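One route, as a sketch only: the Spark standalone master exposes the REST submission endpoint that spark-submit itself uses in cluster mode (port 6066 by default in Spark 1.3+). The hostnames, file path, and exact payload fields below are assumptions modeled on the 1.x standalone protocol; verify against your deployment before relying on them.

```python
import json

# Placeholder host; the standalone REST server listens on 6066 by default.
SUBMIT_URL = "http://spark-master:6066/v1/submissions/create"

def build_submission(app_resource, app_args):
    """Build a JSON payload for a standalone-mode REST submission.
    Field names follow the Spark 1.x CreateSubmissionRequest shape."""
    return {
        "action": "CreateSubmissionRequest",
        "appResource": app_resource,
        "appArgs": app_args,
        "clientSparkVersion": "1.5.1",
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "sparkProperties": {
            "spark.app.name": "ui-submitted-job",
            "spark.master": "spark://spark-master:7077",
            "spark.submit.deployMode": "cluster",
        },
    }

payload = build_submission("hdfs:///apps/myjob.py", ["--date", "2015-10-06"])
body = json.dumps(payload)
# To actually submit, POST `body` with Content-Type application/json
# using any HTTP client, e.g.:
#   requests.post(SUBMIT_URL, data=body,
#                 headers={"Content-Type": "application/json"})
```

A UI backend can wrap this in a route handler; the response JSON includes a submission id you can poll for status.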
Hi Sparkians
How can we create a custom partitioner in PySpark?
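A sketch of the usual answer: unlike Scala, PySpark has no Partitioner class to subclass; instead `pairRDD.partitionBy(numPartitions, partitionFunc)` accepts a plain function from key to integer (taken modulo the partition count). The partition count, key names, and routing rule below are made up for illustration.

```python
NUM_PARTITIONS = 8  # assumption for illustration

def region_partitioner(key):
    """Route keys to partitions with an explicit rule instead of hash():
    pin known hot keys to dedicated partitions, hash the rest."""
    hot_keys = {"us": 0, "eu": 1}  # hypothetical hot keys
    if key in hot_keys:
        return hot_keys[key]
    return hash(key) % (NUM_PARTITIONS - 2) + 2

# Usage with a real SparkContext `sc`:
#   pairs = sc.parallelize([("us", 1), ("eu", 2), ("in", 3)])
#   partitioned = pairs.partitionBy(NUM_PARTITIONS, region_partitioner)
```

Note that `partitionFunc` must be deterministic per key, or later co-partitioned joins will silently misalign.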
Any resources on this?
> On Aug 25, 2015, at 3:15 PM, shahid qadri wrote:
>
> I would like to implement the sorted neighborhood approach in Spark. What is the
> best way to write that in PySpark?
I would like to implement the sorted neighborhood approach in Spark. What is the
best way to write that in PySpark?
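A sketch of one way to approach it, under assumed names: sorted neighborhood (from record linkage) sorts records by a blocking key and compares only records within a sliding window of size w. The pure-Python windowing below is testable as-is; the commented RDD pipeline is a rough distributed translation (sort, index, replicate each record into overlapping blocks), not an established PySpark API.

```python
WINDOW = 3  # assumption: neighborhood size

def neighborhood_pairs(sorted_records, window=WINDOW):
    """All candidate pairs within `window` positions of each other,
    assuming the records are already sorted by the blocking key."""
    pairs = []
    for i in range(len(sorted_records)):
        for j in range(i + 1, min(i + window, len(sorted_records))):
            pairs.append((sorted_records[i], sorted_records[j]))
    return pairs

# Rough distributed variant (assumes a SparkContext `sc` and a
# blocking_key function); each record is replicated into the WINDOW
# overlapping blocks that contain it, then pairs are generated per block:
#   indexed = rdd.sortBy(blocking_key).zipWithIndex()
#   blocks = indexed.flatMap(
#       lambda ri: [(ri[1] - k, ri[0]) for k in range(WINDOW)])
#   candidates = blocks.groupByKey().flatMap(
#       lambda kv: neighborhood_pairs(sorted(kv[1])))
# (This emits some duplicate pairs across adjacent blocks; a distinct()
# or a canonical pair ordering cleans that up.)
```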
;-)
> Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs
>
> On 15 Feb 2015 at 14:44, Shahid Qadri wrote:
Guys, I'm getting this error:

raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code,
error_message, additional_info)
RequestError: TransportError(400, u'MapperParsingException[failed to parse
[SOURCES.DATE_COMP]]; nested: MapperParsingException[failed to parse date
field [--], tried
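For context, this MapperParsingException means Elasticsearch received the literal value "--" in a field mapped as a date. Two common fixes, sketched with placeholder index/field details: tolerate unparsable values in the mapping (the `ignore_malformed` date option), or normalize placeholder values to None on the client before indexing.

```python
# Sketch: mapping fragment that lets the index skip unparsable dates.
# The "format" value is an assumption; use the actual date format of
# SOURCES.DATE_COMP in your data.
date_comp_mapping = {
    "properties": {
        "SOURCES.DATE_COMP": {
            "type": "date",
            "format": "yyyy-MM-dd",    # assumption
            "ignore_malformed": True,  # skip values like "--"
        }
    }
}

def clean_date(value):
    """Client-side alternative: drop placeholder values before indexing."""
    return None if value in ("--", "", None) else value

# With the official elasticsearch-py client, the mapping is applied via
# es.indices.put_mapping(...) (argument names vary across client versions).
```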