Re: [EXTERNAL] Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Sean Owen
That does not appear to be the same input you used in your example. What are the contents of test.csv? On Wed, Jan 4, 2023 at 7:45 AM Saurabh Gulati wrote: > Hi @Sean Owen > Probably the data is incorrect, and the source needs to fix it. > But using Python's csv parser returns the correct

Re: [EXTERNAL] Re: Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Shay Elbaz
If you have found a parser that works, simply read the data as text files, apply the parser manually, and convert to DataFrame (if needed at all), From: Saurabh Gulati Sent: Wednesday, January 4, 2023 3:45 PM To: Sean Owen Cc: Mich Talebzadeh ; User Subject:
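The approach Shay describes — read lines as plain text, parse each line yourself, then build a DataFrame — can be sketched as below. The parsing step uses only Python's stdlib csv module; the Spark part is shown in comments because the column names and file path are illustrative assumptions, not from the thread.

```python
import csv

def parse_csv_line(line):
    """Parse one CSV line with Python's csv module, which tolerates
    delimiters embedded inside quoted fields."""
    # csv.reader accepts any iterable of strings, so a one-element list works.
    return next(csv.reader([line], delimiter=","))

# Hypothetical PySpark usage (names are illustrative):
# rdd = spark.sparkContext.textFile("/tmp/test.csv")
# df = rdd.map(parse_csv_line).toDF(["id", "feedback"])
```

This keeps Spark's distributed read but swaps in the parser that was observed to handle the data correctly.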

Re: [EXTERNAL] Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Saurabh Gulati
Hi @Sean Owen
Probably the data is incorrect, and the source needs to fix it. But using Python's csv parser returns the correct results:

import csv

with open("/tmp/test.csv") as c_file:
    csv_reader = csv.reader(c_file, delimiter=",")
    for row in csv_reader:
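A completed, runnable version of the snippet above. The loop body was cut off in the archive, so the sample data and the collection into `rows` are invented for illustration; only `/tmp/test.csv` and the reader call come from the original message.

```python
import csv
import tempfile

# Invented sample data: the second field contains the delimiter inside quotes.
sample = 'id,feedback\n1,"good, fast"\n2,"slow"\n'

# Write the sample to a temp file so the snippet is self-contained.
with tempfile.NamedTemporaryFile("w+", suffix=".csv", delete=False) as f:
    f.write(sample)
    path = f.name

with open(path) as c_file:
    csv_reader = csv.reader(c_file, delimiter=",")
    rows = [row for row in csv_reader]

print(rows)  # the quoted comma stays inside a single field
```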

Re: Got Error Creating permanent view in Postgresql through Pyspark code

2023-01-04 Thread Stelios Philippou
Vajiha, I believe you might be confusing things. A permanent view in PSQL is a standard database view. A temp view or global temp view is the Spark view that is internal to Spark. Can we get a snippet of the code, please? On Wed, 4 Jan 2023 at 15:10, Vajiha Begum S A wrote: > > I have tried to Create a
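The distinction Stelios draws can be sketched as follows. A Spark temp view exists only inside the Spark session; a permanent Postgres view must be created by sending DDL to the database itself (over JDBC or psycopg2), not through Spark's view API. The view and table names below are illustrative assumptions.

```python
# Spark temp view: visible only to spark.sql(...) in this session.
# df.createOrReplaceTempView("my_view")
#
# Permanent Postgres view: execute CREATE VIEW against the database.
# The helper below just builds the DDL string; run it with psycopg2/JDBC.

def create_view_ddl(view_name: str, select_sql: str) -> str:
    """Build a CREATE OR REPLACE VIEW statement for Postgres."""
    return f"CREATE OR REPLACE VIEW {view_name} AS {select_sql}"

ddl = create_view_ddl("public.my_view",
                      "SELECT id, feedback FROM public.my_table")
# e.g. with psycopg2 (hypothetical connection):
# with conn.cursor() as cur:
#     cur.execute(ddl)
```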

Re: [EXTERNAL] Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Sean Owen
That input is just invalid as CSV for any parser: it ends a quoted column without following it with a column separator. What would the intended parsing be, and how would it work? On Wed, Jan 4, 2023 at 4:30 AM Saurabh Gulati wrote: > > @Sean Owen Also see the example below with quotes > feedback: > >
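Sean's point can be demonstrated with Python's own csv module: given a quoted field followed by extra characters before the delimiter, the lenient default silently glues the trailing characters onto the field, while strict mode rejects the line. The sample line below is invented to show the malformation, not taken from the thread's test.csv.

```python
import csv

# Malformed line: the quoted field closes, but more characters follow
# before the delimiter. This is invalid CSV for any parser.
bad_line = '"abc"def,ghi'

# Lenient mode (the default): trailing characters are appended to the field.
lenient = next(csv.reader([bad_line]))

# Strict mode: the same input raises csv.Error.
try:
    next(csv.reader([bad_line], strict=True))
    strict_error = None
except csv.Error as e:
    strict_error = e
```

So a parser that "accepts" such input is guessing at intent, which is why the thread concludes the source data needs fixing.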

Got Error Creating permanent view in Postgresql through Pyspark code

2023-01-04 Thread Vajiha Begum S A
I have tried to create a permanent view in a PostgreSQL DB through PySpark code, but I have received the below error message. Kindly help me to create a permanent view table in the database. How shall I create a permanent view using PySpark code? Please do reply. *Error Message:* *Exception has

[BUG?] How to handle with special characters or scape them on spark version 3.3.0?

2023-01-04 Thread Vieira, Thiago
Hello everyone, I’ve already raised this question on Stack Overflow, but to be honest I truly believe this is a bug in the new Spark version, so I am also sending this email. Previously I was using Spark version 3.2.1 to read data from an SAP database via the JDBC connector, and I had no issues to perform the

Re: [EXTERNAL] Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Saurabh Gulati
Hey guys, much appreciated — thanks for your quick responses. To answer your questions, @Mich Talebzadeh: we get data from multiple sources, and we don't have any control over what they put in. In this case, the column is supposed to contain some feedback, and it can also

Re: How to set a config for a single query?

2023-01-04 Thread Shay Elbaz
Hi Felipe, I had the same problem - needed to execute multiple jobs/actions multithreaded, with slightly different sql configs per job (mainly spark.sql.shuffle.partitions). I'm not sure if this is the best solution, but I ended up using newSession() per thread. It works well except for the
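Shay's pattern — one `newSession()` per thread so each job gets its own SQL conf — can be sketched as a small helper. `newSession()` shares the underlying SparkContext and cached data with the parent session but keeps an independent SQL conf, so settings like `spark.sql.shuffle.partitions` don't leak between concurrent jobs. The usage snippet is hypothetical; the helper itself is plain Python.

```python
def run_with_conf(spark, conf, action):
    """Run `action` in a fresh Spark session whose SQL confs are set from `conf`.

    Each call gets its own session via newSession(), so per-job settings
    do not affect other threads sharing the same SparkContext.
    """
    session = spark.newSession()
    for key, value in conf.items():
        session.conf.set(key, value)
    return action(session)

# Hypothetical usage, one thread per job (query is illustrative):
# from concurrent.futures import ThreadPoolExecutor
# with ThreadPoolExecutor() as pool:
#     fut = pool.submit(run_with_conf, spark,
#                       {"spark.sql.shuffle.partitions": "8"},
#                       lambda s: s.sql("SELECT ...").collect())
```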

Re: How to set a config for a single query?

2023-01-04 Thread Saurabh Gulati
Hey Felipe, Since you are collecting the dataframes, you might as well run them separately with desired configs and store them in your storage. Regards Saurabh From: Felipe Pessoto Sent: 04 January 2023 01:14 To: user@spark.apache.org Subject: [EXTERNAL] How to

Re: Incorrect csv parsing when delimiter used within the data

2023-01-04 Thread Mich Talebzadeh
What is the point of having *,* as a column value? From a business point of view it does not signify anything IMO