Disclaimer: CDH questions are better handled at cdh-us...@cloudera.org.

But the question I'd like to ask is: why do you need your own Spark
build? What's wrong with CDH's Spark that it doesn't work for you?

On Thu, Jan 8, 2015 at 3:01 PM, freedafeng <freedaf...@yahoo.com> wrote:
> Could anyone share their experience with how to do this?
>
> I have created a cluster and installed CDH 5.3.0 on it with basically core +
> HBase, but Cloudera installed and configured Spark through its parcels
> anyway. I'd like to install our custom Spark build on this cluster and use the
> Hadoop and HBase services there. There could potentially be conflicts if this
> is not done correctly; library conflicts are what I worry about most.
>
> I understand this is a special case, but if you know how to do it, please
> let me know. Thanks.
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/correct-best-way-to-install-custom-spark1-2-on-cdh5-3-0-tp21045.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
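For what it's worth, if a custom build really is needed, the usual starting
point is to compile Spark against the Hadoop version that ships with CDH so
the client libraries line up, and then run it against the cluster's existing
client configs instead of replacing the parcel. A rough sketch, not something
verified on your cluster (the Hadoop version string and paths below are
assumptions; check the installed parcel for the exact values):

    # Build a Spark 1.2 distribution against CDH 5.3.0's Hadoop
    # (2.5.0-cdh5.3.0 is an assumption; confirm against the parcel).
    ./make-distribution.sh --name custom-cdh5.3.0 --tgz \
      -Pyarn -Dhadoop.version=2.5.0-cdh5.3.0 -DskipTests

    # Run the resulting build on YARN by reusing CDH's client configs;
    # /etc/hadoop/conf is the default location for a parcel install.
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    ./bin/spark-submit --master yarn-cluster \
      --class org.apache.spark.examples.SparkPi \
      lib/spark-examples-*.jar 10

    # For HBase access, put the HBase client jars on the classpath,
    # e.g. --driver-class-path "$(hbase classpath)".

Keeping the CDH parcel's Spark untouched and giving the custom build its own
SPARK_HOME is usually enough to avoid the library conflicts you mention.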



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
