lease.
>
> Best regards,
> Burak
>
>> On Tue, Jul 26, 2016 at 3:51 AM, Julio Antonio Soto de Vicente
>> <ju...@esbet.es> wrote:
>> Hi all,
>>
>> Maybe I am missing something, but... Is there a way to update a package
>> uploaded to spark-packages
Hi all,
Maybe I am missing something, but is there a way to update a package
uploaded to spark-packages.org under the same version?
Given a release called my_package 1.1.2, I would like to re-upload it due to
a build failure, but keep the version number 1.1.2 unchanged...
Thank you.
Try adding the following to spark-env.sh (rename it first if it still has the
.template suffix):
PYSPARK_PYTHON=/path/to/your/bin/python
where bin/python is the Python interpreter of the environment that actually has NumPy installed.
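For instance, assuming Spark lives under /opt/spark and the NumPy-equipped environment under /opt/conda (both paths are placeholders for your own layout), the setup might look like:

```shell
# Copy the template once, then point PySpark at the right interpreter.
cp /opt/spark/conf/spark-env.sh.template /opt/spark/conf/spark-env.sh
echo 'export PYSPARK_PYTHON=/opt/conda/bin/python' >> /opt/spark/conf/spark-env.sh

# Quick sanity check that this interpreter can actually import NumPy:
/opt/conda/bin/python -c 'import numpy; print(numpy.__version__)'
```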
> On 1 Jun 2016, at 20:16, Bhupendra Mishra
> wrote:
Hi,
Same goes for the PolynomialExpansion in org.apache.spark.ml.feature. It would
be nice to cross-validate a degree-1 polynomial expansion (that is, no
expansion at all) against higher-degree expansions. Unfortunately, degree
is forced to be >= 2.
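To illustrate why degree 1 is just the identity, here is a small pure-Python sketch of a bias-free polynomial expansion (not Spark's actual implementation, and Spark's output ordering may differ):

```python
from itertools import combinations_with_replacement
from math import prod

def poly_expand(features, degree):
    """All monomials of degree 1..degree over the input features."""
    out = []
    for d in range(1, degree + 1):
        for combo in combinations_with_replacement(features, d):
            out.append(prod(combo))
    return out

# Degree 1 returns the features unchanged -- exactly the "no expansion" case
# that cross-validation would ideally be allowed to compare against.
print(poly_expand([2.0, 3.0], 1))
print(poly_expand([2.0, 3.0], 2))
```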
--
Julio
> On 2 May 2016, at
Hi,
Indeed, Hive is not able to perform predicate pushdown through an HBase table.
Neither Hive nor Impala can.
Broadly speaking, if you need to query your HBase table through a field other
than the rowkey:
A) Try to "encode" as much info as possible in the rowkey field and use it as
your
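A minimal sketch of that rowkey-encoding idea, with made-up field names (a customer id plus a timestamp), showing how a secondary field can be packed into the rowkey so prefix scans can serve queries on it:

```python
# Hypothetical composite rowkey: "<customer_id>|<reversed_timestamp>".
# Zero-padding keeps lexicographic byte order equal to numeric order,
# and reversing the timestamp makes the newest rows sort first
# within a given customer prefix.
def make_rowkey(customer_id: str, ts_millis: int) -> bytes:
    reversed_ts = 2**63 - 1 - ts_millis
    return f"{customer_id}|{reversed_ts:020d}".encode("utf-8")

key = make_rowkey("cust42", 1_700_000_000_000)
```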
Unfortunately, Koert is right.
I've been in a couple of projects using Spark (banking industry) where CentOS +
Python 2.6 is the toolbox available.
That said, I believe it should not be a concern for Spark. Python 2.6 is old
and busted, which is the exact opposite of the Spark philosophy, IMO.
Hi Amir,
I believe the first step would be to look for a library that implements
the streaming API.
> On 24/11/2015, at 10:32, Amir Rahnama wrote:
>
> I want to end the situation where Python users of Spark need to implement
> the Twitter source for
-dev +user
Hi Sandeep,
Perhaps (flat)mapping values and using an accumulator?
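The idea above can be sketched in pure Python (not Spark code): each incoming batch of text is flat-mapped into words, and a running Counter plays the role of the accumulator/state that is re-published after every batch. The batch contents are made up for illustration:

```python
from collections import Counter

totals = Counter()  # running state shared across batches

def process_batch(lines):
    # Flat-map each line of the batch into words...
    words = (w for line in lines for w in line.split())
    # ...merge the batch counts into the running state...
    totals.update(words)
    # ...and "publish" the overall word count so far.
    return dict(totals)

print(process_batch(["to be or not to be"]))
print(process_batch(["be here now"]))
```

In Spark Streaming proper, stateful operators such as updateStateByKey (or mapWithState) serve the same role of carrying the running counts across batches.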
> On 29/10/2015, at 23:08, Sandeep Giri wrote:
>
> Dear All,
>
> If a continuous stream of text is coming in and you have to keep publishing
> the overall word count so far since