Re: How to extract data in parallel from RDBMS tables

2019-04-02 Thread Surendra , Manchikanti
Looking for a generic solution, not for a specific DB or number of tables.

On Fri, Mar 29, 2019 at 5:04 AM Jason Nerothin wrote:
> How many tables? What DB?
>
> On Fri, Mar 29, 2019 at 00:50 Surendra , Manchikanti <surendra.manchika...@gmail.com> wrote:
>> Hi J

Re: How to extract data in parallel from RDBMS tables

2019-03-28 Thread Surendra , Manchikanti
…partitionColumn, lowerBound, and upperBound
>
> https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html
>
> On Wed, Mar 27, 2019 at 23:06 Surendra , Manchikanti <surendra.manchika...@gmail.com> wrote:
>> Hi All,
>>
>> Is there any way to copy all the tables
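The `partitionColumn`/`lowerBound`/`upperBound` options referenced above split a single JDBC read into parallel tasks, one per range of the partition column. A minimal Python sketch of the stride logic (simplified relative to Spark's actual implementation, which for example also routes NULL values into the first partition):

```python
def jdbc_partition_predicates(column, lower_bound, upper_bound, num_partitions):
    """Roughly mimic how Spark turns lowerBound/upperBound/numPartitions
    into per-partition WHERE clauses for a parallel JDBC read.
    Simplified sketch -- not Spark's exact implementation."""
    stride = (upper_bound - lower_bound) // num_partitions
    predicates = []
    for i in range(num_partitions):
        lo = lower_bound + i * stride
        hi = lower_bound + (i + 1) * stride
        if i == 0:
            # first partition is open at the lower end
            predicates.append(f"{column} < {hi}")
        elif i == num_partitions - 1:
            # last partition is open at the upper end
            predicates.append(f"{column} >= {lo}")
        else:
            predicates.append(f"{column} >= {lo} AND {column} < {hi}")
    return predicates

# The corresponding PySpark call (url/table/column values are placeholders):
# spark.read.jdbc(url, "my_table", column="id",
#                 lowerBound=0, upperBound=100, numPartitions=4,
#                 properties={"user": "u", "password": "p"})
```

Each predicate becomes one task, so the read runs with `numPartitions`-way parallelism against the database.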

How to extract data in parallel from RDBMS tables

2019-03-27 Thread Surendra , Manchikanti
Hi All,

Is there any way to copy all the tables in parallel from an RDBMS using Spark? We are looking for functionality similar to Sqoop.

Thanks,
Surendra
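Spark parallelizes *within* a table (via the JDBC partitioning options); to copy many tables at once you can additionally submit one read/write job per table from concurrent driver threads. A hedged sketch, where `copy_table` is a hypothetical helper you would implement around `spark.read.jdbc(...).write`:

```python
from concurrent.futures import ThreadPoolExecutor

def copy_all_tables(tables, copy_table, max_workers=4):
    """Run copy_table for each table name concurrently.

    copy_table is whatever does the real work for one table, e.g. a
    function wrapping spark.read.jdbc(...).write.parquet(...).
    Spark job submission is thread-safe, so concurrent driver threads
    simply schedule independent jobs on the same SparkContext.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order in its results
        return list(pool.map(copy_table, tables))
```

This is roughly what Sqoop's per-table parallelism gives you, expressed with a plain thread pool on the Spark driver.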

Re: Running Spark on Yarn

2016-03-29 Thread Surendra , Manchikanti
Hi Vineeth,

Can you please check the resource (RAM, cores) availability in your local cluster, and adjust the configuration accordingly?

Regards,
Surendra M
--
Surendra Manchikanti

On Tue, Mar 29, 2016 at 1:15 PM, Vineet Mishra <clearmido...@gmail.com> wrote:
> Hi All,
>
> While starting Spark o
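The resource settings in question are passed at submit time. A sketch of the relevant `spark-submit` flags for YARN (the sizes are placeholder values; they must fit within what the cluster's NodeManagers actually offer):

```shell
# Placeholder sizes -- match these to your cluster's available RAM/cores.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 2g \
  --driver-memory 1g \
  my_job.py
```

If a requested executor exceeds the YARN container limits, the application can hang waiting for containers that are never granted.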

Re: DataFrameWriter.save fails job with one executor failure

2016-03-25 Thread Surendra , Manchikanti
Hi Vinoth,

As per the documentation, DirectParquetOutputCommitter is better suited for S3.

https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/DirectParquetOutputCommitter.scala

Regards,
Surendra M
--
Surendra Manchikanti

On Fri, Mar
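In the Spark 1.x line this thread dates from, the committer was selected via a configuration key. A sketch of how it was enabled (note: DirectParquetOutputCommitter was removed in Spark 2.0, so this applies only to 1.x):

```shell
# Spark 1.x only -- this committer class was removed in Spark 2.0.
spark-submit \
  --conf spark.sql.parquet.output.committer.class=org.apache.spark.sql.execution.datasources.parquet.DirectParquetOutputCommitter \
  my_job.py
```

The trade-off is that the direct committer skips the rename-based commit step (slow and non-atomic on S3) at the cost of weaker failure semantics, which is why it fits object stores better than HDFS.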

Re: Problem using saveAsNewAPIHadoopFile API

2016-03-23 Thread Surendra , Manchikanti
Hi Vetal,

You may try MultipleOutputFormat instead of TextOutputFormat in saveAsNewAPIHadoopFile().

Regards,
Surendra M
--
Surendra Manchikanti

On Tue, Mar 22, 2016 at 10:26 AM, vetal king <greenve...@gmail.com> wrote:
> We are using Spark 1.4 for Spark Streaming. Kafka is da

Re: ERROR ArrayBuffer(java.nio.channels.ClosedChannelException

2016-03-20 Thread Surendra , Manchikanti
Hi,

Can you check the Kafka topic replication factor and leader information?

Regards,
Surendra M
--
Surendra Manchikanti

On Thu, Mar 17, 2016 at 7:28 PM, Ascot Moss <ascot.m...@gmail.com> wrote:
> Hi,
>
> I have a SparkStream (with Kafka) job, after running several days, it fai
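The replication factor, partition leaders, and in-sync replicas can be inspected with Kafka's topic tool. A sketch using the 2016-era syntax this thread dates from (topic name and ZooKeeper host are placeholders; newer Kafka versions use `--bootstrap-server` instead of `--zookeeper`):

```shell
# Shows per-partition Leader, Replicas, and Isr for the topic.
kafka-topics.sh --describe --topic my-topic --zookeeper zk-host:2181
```

A partition with no leader (or an empty ISR) is a common cause of ClosedChannelException in long-running consumers.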