To rule out any REPL artefacts, do you see the same thing if you put the code in a script and run the script?
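For example, something along these lines -- just the steps from your message collected into a file (the file name is arbitrary, and this sketch assumes PySpark is importable by the Python that Inline::Python is linked against):

    # spark-test.raku  (hypothetical name; same steps as the quoted REPL session)
    use Inline::Python;

    my \python = Inline::Python.new;
    python.run('from pyspark.sql import SparkSession');

    # :eval so the result of the expression (a PythonObject) is returned instead of Any
    my \spark = python.run('SparkSession.builder.getOrCreate()', :eval);
    say spark;

    # the call that behaves oddly in the REPL
    my \sql = spark.sql('select 1+1');
    say sql;

Then run it with `raku spark-test.raku` and see whether the first spark.sql call still returns an empty array.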
> On 9 Sep 2022, at 20:17, Sean McAfee <eef...@gmail.com> wrote:
>
> Hello--
>
> I recently started playing around with PySpark. It soon occurred to me that
> it would be a lot more fun to work in Raku instead of Python, and I recalled
> that it's supposed to be possible to get handles to Python objects from Raku
> and call methods on them seamlessly, so I tried to make it happen. I got
> pretty far, but now I'm stymied. Here are the steps I can take in the Raku
> interpreter:
>
> > use Inline::Python
> Nil
>
> > my \python = Inline::Python.new
> Inline::Python.new
>
> Self-explanatory.
>
> > python.run('from pyspark.sql import SparkSession')
> (Any)
>
> No errors, that looks promising...
>
> > my \spark = python.run('SparkSession.builder.getOrCreate()', :eval)
> ... spam from initialization of Spark session deleted...
> Inline::Python::PythonObject.new(ptr =>
> NativeCall::Types::Pointer.new(4461193984), python => Inline::Python.new)
>
> Now we're getting somewhere! (I had to source-dive to guess that I needed
> that :eval; without it, an Any is returned.)
>
> > my \sql = spark.sql('select 1+1')
> []
>
> Uh...what? I was expecting to get another Python object back, a DataFrame.
> (I think; I'm much more familiar with the Scala interface to Spark.) Instead
> I have an empty array.
>
> Even more puzzlingly, if I re-run that last statement, I get an error:
> "instance has no attribute 'sql'". If I re-run the statement over and over,
> the response alternates between an empty array and that error.
>
> Does anyone have any insight into what's going on?