Any help is appreciated. I have a Spark batch job, and based on a condition I would like to start another batch job by invoking a .sh file. I just want to know: can we achieve that?
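One common approach is to shell out from the driver once the condition is met, using scala.sys.process from the Scala standard library. A minimal sketch — the script path, the condition, and the count variable below are placeholders, not details from the original post:

```scala
import scala.sys.process._

// Hypothetical example: after the first batch job computes some result on
// the driver, conditionally launch the second job's shell script.
val recordCount: Long = 0L // e.g. the result of df.count() in the real job

if (recordCount > 0) {
  // "!" runs the command and returns its exit code; the script must be
  // executable and reachable from the node where the driver runs.
  val exitCode = Seq("/path/to/second_job.sh").!
  if (exitCode != 0)
    sys.error(s"second job failed with exit code $exitCode")
}
```

Note this runs on the driver, not on executors, so it only works for cluster modes where the driver node can see the script.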
Thanks
Amit
On Fri, Aug 7, 2020 at 3:58 PM Amit Sharma wrote:
> Hi, I want to write a batch job which would call another batch
Note: none of this applies to the Direct streaming approach, only to receiver-based DStreams.
You can think of a receiver as a long-running task that never finishes. Each receiver is submitted to an executor slot somewhere; it then runs indefinitely and internally has a method which passes records
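As a minimal sketch of that model (host and port are placeholders): socketTextStream registers a receiver that permanently occupies one executor slot for the lifetime of the StreamingContext, while the processing of each received batch runs as ordinary tasks on the remaining slots:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// "local[2]" matters here: one slot is consumed by the receiver itself,
// so at least one more slot is needed to process the received batches.
val conf = new SparkConf().setMaster("local[2]").setAppName("ReceiverSketch")
val ssc = new StreamingContext(conf, Seconds(5))

// socketTextStream creates a receiver-based DStream; the receiver runs as a
// long-lived task on an executor and hands records to Spark for batching.
val lines = ssc.socketTextStream("localhost", 9999)
lines.count().print()

ssc.start()
ssc.awaitTermination()
```

This is why a receiver-based job with only one total core appears to "hang": the receiver holds the single slot and no tasks can run.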
Hi,
I'm having some trouble figuring out how receivers fit into the Spark driver-executor structure.
Does every executor get a receiver that blocks as soon as it receives some stream data? Or can multiple streams of data be taken as input by a single executor?
I have stream data coming in at
Hi
I am getting the following error while trying to import the package org.apache.spark.sql.avro.functions._ in the Scala shell:
scala> import org.apache.spark.sql.avro.functions._
:23: error: object functions is not a member of package org.apache.spark.sql.avro
import
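That error usually means spark-avro is not on the classpath: the Avro support lives in the external spark-avro module, which is not bundled with the default Spark distribution, so the shell has to be started with the package added. A sketch of the fix — the version coordinates below are an example and must match your Spark and Scala versions:

```scala
// spark-avro ships separately from core Spark. Start the shell with the
// package on the classpath (example coordinates; adjust to your versions):
//
//   spark-shell --packages org.apache.spark:spark-avro_2.12:3.0.0
//
// With the module loaded, the import resolves (the functions object,
// providing from_avro/to_avro, exists as of Spark 3.x):
import org.apache.spark.sql.avro.functions._
```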