Re: writing into oracle database is very slow

2019-04-19 Thread spark receiver
Hi Jiang, I was facing the very same issue. The solution is to write to a file and use an Oracle external table to do the insert. Hope this helps. Dalin

On Thu, Apr 18, 2019 at 11:43 AM Jörn Franke wrote:
> What is the size of the data? How much time does it need on HDFS and how much on
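Dalin's approach can be sketched roughly as follows; the directory, file, and table names here are hypothetical, and the access parameters would need to match whatever delimited format the Spark job actually writes. The idea is that Spark writes flat files to a directory visible to the database, an external table exposes those files, and a single set-based INSERT loads them:

```sql
-- Hypothetical names: data_dir, staging.csv, stg_ext, target_tbl.
-- 1) Directory object pointing at the files Spark wrote:
CREATE OR REPLACE DIRECTORY data_dir AS '/data/spark_out';

-- 2) External table over the delimited files:
CREATE TABLE stg_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('staging.csv')
);

-- 3) Set-based, direct-path load into the real table:
INSERT /*+ APPEND */ INTO target_tbl SELECT * FROM stg_ext;
```

This replaces many slow row-by-row JDBC inserts with one bulk load on the database side.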

Re: Hive to Oracle using Spark - Type(Date) conversion issue

2018-06-06 Thread spark receiver
Use unix time and write the unix time to Oracle as a NUMBER column type, then create a virtual column in the Oracle database for the unix time, like:

oracle_time GENERATED ALWAYS AS (to_date('1970010108','YYYYMMDDHH24') + (1/24/60/60) * unixtime)

On Mar 20, 2018, at 11:08 PM, Gurusamy Thirupathy wrote:
>
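The arithmetic behind that virtual column can be checked outside Oracle. Oracle DATE arithmetic is in days, so `1/24/60/60` converts seconds to days, and the base date `1970010108` looks like the Unix epoch shifted to a UTC+8 timezone (an assumption; the '08' hour is not explained in the thread). A minimal Python sketch of the same conversion:

```python
from datetime import datetime, timedelta

def oracle_virtual_time(unixtime: int) -> datetime:
    # Mirrors: to_date('1970010108','YYYYMMDDHH24') + (1/24/60/60) * unixtime
    # Oracle adds days to a DATE; 1/24/60/60 days == 1 second.
    base = datetime(1970, 1, 1, 8)  # epoch base shifted by +8 hours (assumed TZ offset)
    return base + timedelta(seconds=unixtime)

# Sanity check: the virtual column should equal the UTC timestamp shifted to UTC+8.
u = 1_521_600_000
assert oracle_virtual_time(u) == datetime.utcfromtimestamp(u) + timedelta(hours=8)
```

Storing the raw NUMBER and deriving the DATE avoids the type-conversion mismatch between Hive and Oracle date handling.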

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-13 Thread spark receiver
Hi Panagiotis, wondering whether you solved the problem or not? Because I met the same issue today. I'd appreciate it so much if you could paste the code snippet if it's working. Thanks.

On Apr 6, 2018, at 7:40 AM, Aakash Basu wrote:
> Hi Panagiotis,
>
> I did that, but it still
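For the record, the usual fix for running more than one streaming query in a single application is to start every query first and then block on the shared `spark.streams.awaitAnyTermination()`, rather than calling `awaitTermination()` on the first query (which blocks before the second query ever starts). A minimal PySpark sketch, with hypothetical broker and topic names:

```python
# Hypothetical broker/topics; illustrates the multi-query pattern only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("two-streams").getOrCreate()

df1 = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "topic_a").load())
df2 = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "topic_b").load())

q1 = df1.writeStream.format("console").start()
q2 = df2.writeStream.format("console").start()

# Block on the StreamingQueryManager, not on q1.awaitTermination(),
# so both queries keep running concurrently.
spark.streams.awaitAnyTermination()
```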

Re: Reload some static data during struct streaming

2017-11-13 Thread spark receiver
I need it cached to improve throughput; I only hope it can be refreshed once a day, not every batch.

On Nov 13, 2017, at 4:49 PM, Burak Yavuz <brk...@gmail.com> wrote:
> I think if you don't cache the jdbc table, then it should auto-refresh.
>
> On Mon, Nov 13, 2
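The refresh-once-a-day requirement is essentially a time-to-live wrapped around the cached lookup: reload only when the cached copy is older than the TTL, otherwise serve it from memory. A minimal, Spark-independent Python sketch of that pattern (the class and names are illustrative, not an existing API):

```python
import time

class TtlCache:
    """Reload a value via `loader` only when it is older than `ttl_seconds`."""

    def __init__(self, loader, ttl_seconds, clock=time.monotonic):
        self.loader = loader          # e.g. re-read the JDBC table and cache it
        self.ttl = ttl_seconds
        self.clock = clock
        self._value = None
        self._loaded_at = None

    def get(self):
        now = self.clock()
        if self._loaded_at is None or now - self._loaded_at >= self.ttl:
            self._value = self.loader()
            self._loaded_at = now
        return self._value

# Usage sketch: `loads` counts how often the "table" is actually re-read.
loads = []
cache = TtlCache(loader=lambda: loads.append(1) or len(loads), ttl_seconds=86400)
cache.get()
cache.get()               # second call within the TTL reuses the cached value
assert len(loads) == 1
```

In Spark terms, `loader` would unpersist the old DataFrame, re-read the JDBC source, and cache the fresh copy, giving cached-table throughput with daily freshness.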

Reload some static data during struct streaming

2017-11-13 Thread spark receiver
Hi, I'm using Structured Streaming (Spark 2.2) to receive Kafka messages, and it works great. The thing is, I need to join the Kafka messages with a relatively static table stored in a MySQL database (let's call it metadata here). So is it possible to reload the metadata table after some time interval (like
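A stream-static join of the shape described here looks roughly like the following in PySpark; the connection details, join key, and column names are hypothetical. As the reply in the thread above notes, if the JDBC DataFrame is not cached, the static side is re-read on each micro-batch (fresh but slower), while caching it trades freshness for throughput:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-static-join").getOrCreate()

# Static side: the MySQL "metadata" table (connection details hypothetical).
metadata = (spark.read.format("jdbc")
            .option("url", "jdbc:mysql://db-host:3306/appdb")
            .option("dbtable", "metadata")
            .option("user", "app")
            .option("password", "secret")
            .load())

# Streaming side: Kafka messages, keyed so they can join against metadata.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events").load()
          .selectExpr("CAST(key AS STRING) AS id",
                      "CAST(value AS STRING) AS payload"))

# Stream-static join; without metadata.cache(), the JDBC table is
# re-read every micro-batch.
joined = stream.join(metadata, on="id", how="left")
joined.writeStream.format("console").start().awaitTermination()
```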