> have any
> constraint for it)
>
> For this kind of run time, how can we usually identify the root cause?
>
>
> On Thu, May 11, 2023 at 9:37 PM Farhan Misarwala <
> farhan.misarw...@gmail.com> wrote:
Hi Karthick,
I think I have seen this before, and it is probably due to an
incompatibility between your Spark and Delta versions, or an
incompatibility between the Delta version you are using now and the one
you used earlier on the existing table you are merging with.
Let me know if
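For what it's worth, here is a minimal sketch of pinning Spark and Delta
to a known-compatible pair when building the session. The versions and app
name below are illustrative assumptions; check the Delta Lake release notes
for the actual compatibility matrix (e.g. delta-core 2.4.x pairs with
Spark 3.4.x):

    # Assumes: pip install pyspark==3.4.* delta-spark==2.4.*
    import pyspark
    from delta import configure_spark_with_delta_pip

    builder = (
        pyspark.sql.SparkSession.builder.appName("delta-version-check")
        # Standard Delta setup: SQL extension plus the Delta catalog.
        .config("spark.sql.extensions",
                "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    )

    # Pulls in the delta-core jars matching the installed delta-spark package,
    # so the driver and the jars cannot silently diverge.
    spark = configure_spark_with_delta_pip(builder).getOrCreate()
    print(spark.version)  # confirm which Spark version the job actually runs

Keeping the jar version tied to the pip package this way is one simple
guard against the mismatch described above.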
> JDBC driver like Progress DataDirect etc. Most of these defects are
> related to the JDBC driver itself!
>
> HTH,
>
> Mich
>
> On Fri, 30 Apr 2021 at 13:49, Farhan Misarwala
> wrote:
>
>> Hi Mich,
>>
>> I have tried this already. I am using the same method
option("table", tableName). \
> save()
> except Exception as e:
> print(f"""{e}, quitting""")
> sys.exit(1)
>
> HTH
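The beginning of the quoted snippet is cut off above, so for context here
is a self-contained sketch of the same try/except write pattern. The JDBC
URL, credentials and table name are placeholder assumptions; note also that
Spark's built-in JDBC source takes "dbtable", while an option named "table"
belongs to connector-specific sources such as the BigQuery connector:

    import sys
    from pyspark.sql import SparkSession

    # Assumes the matching JDBC driver jar is on the classpath,
    # e.g. spark-submit --jars postgresql.jar
    spark = SparkSession.builder.appName("guarded-write").getOrCreate()
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

    try:
        df.write. \
            format("jdbc"). \
            option("url", "jdbc:postgresql://localhost:5432/testdb"). \
            option("dbtable", "public.demo"). \
            option("user", "scott"). \
            option("password", "tiger"). \
            mode("overwrite"). \
            save()
    except Exception as e:
        # Print the error and exit non-zero so the caller sees the failure.
        print(f"""{e}, quitting""")
        sys.exit(1)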
Hi Anbutech,
If I am not mistaken, you are trying to read dataframes from around 150
different paths (in your case, the Kafka topics) to count their records.
You have all these paths stored in a CSV with columns year, month, day
and hour.
Here is what I came up with; I have
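A minimal sketch of one way this could look, assuming the path components
come from a header CSV and the data under each path is Parquet; the file
locations and the base directory below are placeholders:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("count-per-path").getOrCreate()

    # CSV with one row per path, columns: year, month, day, hour.
    parts = spark.read.option("header", True).csv("/tmp/paths.csv").collect()

    # Build the concrete directory for each row and count its records.
    for r in parts:
        path = f"/data/topics/{r['year']}/{r['month']}/{r['day']}/{r['hour']}"
        cnt = spark.read.parquet(path).count()
        print(path, cnt)

Looping and counting one path at a time keeps each read small; if the
counts can be done in one pass, passing the whole list of paths to a single
spark.read.parquet(*paths) call would also work.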