You can replace temp views. Again: what you can't do here is define a temp
view in terms of itself. If you are reusing the same name over and over,
it's easy to end up doing exactly that, so avoid it. You want different
names for different temp views, or else to make sure the new definition
doesn't depend on the previous one, as in the SO post. Does that make the
problem clear?
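
A minimal sketch of the difference, assuming Spark 3.2 in spark-shell (the
names t, t2 and the data are just placeholders):

import spark.implicits._

// Replacing a temp view with an independent DataFrame is still fine:
Seq(1, 2, 3).toDF("id").createOrReplaceTempView("t")
Seq(4, 5, 6).toDF("id").createOrReplaceTempView("t")   // OK

// What 3.2 rejects is redefining the view in terms of itself:
val derived = spark.sql("SELECT id + 1 AS id FROM t")
// derived.createOrReplaceTempView("t")  // would throw AnalysisException: Recursive view `t` detected
derived.createOrReplaceTempView("t2")    // OK: give the derived result a different name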

On Mon, Dec 13, 2021 at 10:43 AM Daniel de Oliveira Mantovani <
daniel.oliveira.mantov...@gmail.com> wrote:

> I didn't post the SO issue; I just found the same exception I'm facing
> with Spark 3.2. The Almaren Framework has a concept of creating temporary
> views with the name "__TABLE__".
>
> For example, if you want to use the SQL dialect on a DataFrame to join a
> table, run an aggregation, or apply a function, instead of creating a
> temporary table yourself you just use the "__TABLE__" alias. You don't
> really care about the name of the table. You may use this "__TABLE__"
> approach in different parts of your code.
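>
> In plain Spark terms (not the Almaren API itself), the idea is roughly
> this sketch, assuming spark-shell; the data and column names are just
> placeholders:
>
> import spark.implicits._
>
> // one part of the code uses the shared alias for its own DataFrame
> Seq(("a", 1), ("b", 2)).toDF("k", "v").createOrReplaceTempView("__TABLE__")
> val doubled = spark.sql("SELECT k, v * 2 AS v FROM __TABLE__")
>
> // a different part of the code reuses the same alias for another DataFrame
> Seq(("c", 3)).toDF("k", "v").createOrReplaceTempView("__TABLE__")
> val filtered = spark.sql("SELECT * FROM __TABLE__ WHERE v > 1")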
>
> Why can't I create or replace temporary views on different DataFrames
> with the same name as before?
>
>
>
> On Mon, Dec 13, 2021 at 1:27 PM Sean Owen <sro...@gmail.com> wrote:
>
>> If the issue is what you posted in SO, I think the stack trace explains
>> it already. You want to avoid this recursive definition, which in general
>> can't work.
>> I think it's simply explicitly disallowed in all cases now, but you
>> should not be depending on this anyway - why can't it just be avoided?
>>
>> On Mon, Dec 13, 2021 at 10:06 AM Daniel de Oliveira Mantovani <
>> daniel.oliveira.mantov...@gmail.com> wrote:
>>
>>> Sean,
>>>
>>> https://github.com/music-of-the-ainur/almaren-framework/tree/spark-3.2
>>>
>>> Just executing "sbt test" will reproduce the error. The same code works
>>> for Spark 2.3.x, 2.4.x and 3.1.x, so why doesn't it work for Spark 3.2?
>>>
>>> Thank you so much
>>>
>>>
>>>
>>> On Mon, Dec 13, 2021 at 12:59 PM Sean Owen <sro...@gmail.com> wrote:
>>>
>>>> ... but the error is not "because that already exists". See your stack
>>>> trace. It's because the definition is recursive. You define temp view
>>>> test1, create a second DF from it, and then redefine test1 as that result.
>>>> test1 depends on test1.
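>>>>
>>>> In code, that sequence is roughly (a sketch assuming spark-shell on
>>>> Spark 3.2; test1 is just the placeholder name from above):
>>>>
>>>> spark.range(3).toDF("id").createOrReplaceTempView("test1")
>>>> // derive a second DataFrame from the view...
>>>> val df2 = spark.sql("SELECT id * 2 AS id FROM test1")
>>>> // ...and redefine the view as that result: test1 now depends on test1,
>>>> // so the analyzer reports "Recursive view `test1` detected"
>>>> df2.createOrReplaceTempView("test1")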
>>>>
>>>> On Mon, Dec 13, 2021 at 9:58 AM Daniel de Oliveira Mantovani <
>>>> daniel.oliveira.mantov...@gmail.com> wrote:
>>>>
>>>>> Sean,
>>>>>
>>>>> The method name "createOrReplaceTempView" is very clear; it doesn't
>>>>> make any sense to throw an exception just because the view already
>>>>> exists. Spark 3.2.x is breaking backward compatibility for no reason.
>>>>>
>>>>>
>>>>> On Mon, Dec 13, 2021 at 12:53 PM Sean Owen <sro...@gmail.com> wrote:
>>>>>
>>>>>> The error looks 'valid' - you define a temp view in terms of its own
>>>>>> previous version, which doesn't quite make sense - somewhere the new
>>>>>> definition depends on the old definition. I think it just correctly
>>>>>> surfaces as an error now.
>>>>>>
>>>>>> On Mon, Dec 13, 2021 at 9:41 AM Daniel de Oliveira Mantovani <
>>>>>> daniel.oliveira.mantov...@gmail.com> wrote:
>>>>>>
>>>>>>> Hello team,
>>>>>>>
>>>>>>> I found this issue while porting my project from Apache
>>>>>>> Spark 3.1.x to 3.2.x.
>>>>>>>
>>>>>>>
>>>>>>> https://stackoverflow.com/questions/69937415/spark-3-2-0-the-different-dataframe-createorreplacetempview-the-same-name-tempvi
>>>>>>>
>>>>>>> Do we have a bug for that in apache-spark, or do I need to create one?
>>>>>>>
>>>>>>> Thank you so much
>>>>>>>
>>>>>>> [info] com.github.music.of.the.ainur.almaren.Test *** ABORTED ***
>>>>>>> [info]   org.apache.spark.sql.AnalysisException: Recursive view
>>>>>>> `__TABLE__` detected (cycle: `__TABLE__` -> `__TABLE__`)
>>>>>>> [info]   at
>>>>>>> org.apache.spark.sql.errors.QueryCompilationErrors$.recursiveViewDetectedError(QueryCompilationErrors.scala:2045)
>>>>>>> [info]   at
>>>>>>> org.apache.spark.sql.execution.command.ViewHelper$.checkCyclicViewReference(views.scala:515)
>>>>>>> [info]   at
>>>>>>> org.apache.spark.sql.execution.command.ViewHelper$.$anonfun$checkCyclicViewReference$2(views.scala:522)
>>>>>>> [info]   at
>>>>>>> org.apache.spark.sql.execution.command.ViewHelper$.$anonfun$checkCyclicViewReference$2$adapted(views.scala:522)
>>>>>>> [info]   at scala.collection.Iterator.foreach(Iterator.scala:941)
>>>>>>> [info]   at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>>>>>> [info]   at
>>>>>>> scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
>>>>>>> [info]   at
>>>>>>> scala.collection.IterableLike.foreach(IterableLike.scala:74)
>>>>>>> [info]   at
>>>>>>> scala.collection.IterableLike.foreach$(IterableLike.scala:73)
>>>>>>> [info]   at
>>>>>>> scala.collection.AbstractIterable.foreach(Iterable.scala:56)
>>>>>>>
>>>>>>> --
>>>>>>>
>>>>>>> --
>>>>>>> Daniel Mantovani
>>>>>>>
>>>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> --
>>>>> Daniel Mantovani
>>>>>
>>>>>
>>>
>>> --
>>>
>>> --
>>> Daniel Mantovani
>>>
>>>
>
> --
>
> --
> Daniel Mantovani
>
>
