[jira] [Updated] (SPARK-19546) Every mail to u...@spark.apache.org is getting blocked

2017-02-10 Thread Shivam Sharma (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-19546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shivam Sharma updated SPARK-19546:
--
Description: 
Each time I send mail to u...@spark.apache.org, I get an email from 
yahoo-inc saying that "tylerchap...@yahoo-inc.com is no longer with Yahoo! Inc".


  was:
Each time I send mail to u...@spark.apache.org, I get an email from 
yahoo-inc saying that "tylerchap...@yahoo-inc.com is no longer with Yahoo! Inc".

Summary: Every mail to u...@spark.apache.org is getting blocked  (was: 
Every mail to u...@spark.apache.org is blocked)

> Every mail to u...@spark.apache.org is getting blocked
> --
>
> Key: SPARK-19546
> URL: https://issues.apache.org/jira/browse/SPARK-19546
> Project: Spark
>  Issue Type: IT Help
>  Components: Project Infra
>Affects Versions: 2.1.0
>Reporter: Shivam Sharma
>Priority: Minor
>
> Each time I send mail to u...@spark.apache.org, I get an email from 
> yahoo-inc saying that "tylerchap...@yahoo-inc.com is no longer with Yahoo! Inc".



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-19546) Every mail to u...@spark.apache.org is blocked

2017-02-10 Thread Shivam Sharma (JIRA)
Shivam Sharma created SPARK-19546:
-

 Summary: Every mail to u...@spark.apache.org is blocked
 Key: SPARK-19546
 URL: https://issues.apache.org/jira/browse/SPARK-19546
 Project: Spark
  Issue Type: IT Help
  Components: Project Infra
Affects Versions: 2.1.0
Reporter: Shivam Sharma
Priority: Minor


Each time I send mail to u...@spark.apache.org, I get an email from 
yahoo-inc saying that "tylerchap...@yahoo-inc.com is no longer with Yahoo! Inc".






[jira] [Updated] (SPARK-19546) Every mail to u...@spark.apache.org is getting blocked

2017-02-10 Thread Shivam Sharma (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-19546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shivam Sharma updated SPARK-19546:
--
Priority: Major  (was: Minor)

> Every mail to u...@spark.apache.org is getting blocked
> --
>
> Key: SPARK-19546
> URL: https://issues.apache.org/jira/browse/SPARK-19546
> Project: Spark
>  Issue Type: IT Help
>  Components: Project Infra
>Affects Versions: 2.1.0
>Reporter: Shivam Sharma
>
> Each time I send mail to u...@spark.apache.org, I get an email from 
> yahoo-inc saying that "tylerchap...@yahoo-inc.com is no longer with Yahoo! Inc".






[jira] [Commented] (SPARK-29251) intermittent serialization failures in spark

2023-11-20 Thread Shivam Sharma (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29251?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17788101#comment-17788101
 ] 

Shivam Sharma commented on SPARK-29251:
---

I am also hitting the same issue. Did you find a fix?

> intermittent serialization failures in spark
> 
>
> Key: SPARK-29251
> URL: https://issues.apache.org/jira/browse/SPARK-29251
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.4.3
> Environment: We are running Spark 2.4.3 on an AWS EMR cluster. Our 
> cluster consists of one driver instance, 3 core instances, and 3 to 12 task 
> instances; the task group autoscales as needed.
>Reporter: Jerry Vinokurov
>Priority: Major
>  Labels: bulk-closed
>
> We are running an EMR cluster on AWS that processes somewhere around 100 
> batch jobs a day. These jobs are running various SQL commands to transform 
> data and the data volume ranges between a dozen or a few hundred MB to a few 
> GB on the high end for some jobs, and even around ~1 TB for one particularly 
> large one. We use the Kryo serializer and preregister our classes like so:
>  
> {code:java}
> object KryoRegistrar {
>   val classesToRegister: Array[Class[_]] = Array(
>     classOf[MyModel],
>     [etc]
>   )
> }
>
> // elsewhere
> val sparkConf = new SparkConf()
>   .registerKryoClasses(KryoRegistrar.classesToRegister)
> {code}
>  
>  
> Intermittently throughout the cluster's operation we have observed jobs 
> terminating with the following stack trace:
>  
>  
> {noformat}
> org.apache.spark.SparkException: Failed to register classes with Kryo
>   at 
> org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:140)
>   at 
> org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:324)
>   at 
> org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:309)
>   at 
> org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:218)
>   at 
> org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:288)
>   at 
> org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:127)
>   at 
> org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:88)
>   at 
> org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>   at 
> org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
>   at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1489)
>   at 
> org.apache.spark.sql.execution.datasources.csv.CSVFileFormat.buildReader(CSVFileFormat.scala:103)
>   at 
> org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(FileFormat.scala:129)
>   at 
> org.apache.spark.sql.execution.datasources.TextBasedFileFormat.buildReaderWithPartitionValues(FileFormat.scala:165)
>   at 
> org.apache.spark.sql.execution.FileSourceScanExec.inputRDD$lzycompute(DataSourceScanExec.scala:309)
>   at 
> org.apache.spark.sql.execution.FileSourceScanExec.inputRDD(DataSourceScanExec.scala:305)
>   at 
> org.apache.spark.sql.execution.FileSourceScanExec.inputRDDs(DataSourceScanExec.scala:327)
>   at 
> org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:121)
>   at 
> org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
>   at 
> org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:627)
>   at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
>   at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
>   at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:156)
>   at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at 
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
>   at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
>   at 
> org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
>   at 
> org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
>   at 
> org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
>   at 
> org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
>   at 
> org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
>   at 
> 
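[Editorial note: the `registerKryoClasses` call quoted above can equivalently be expressed as submit-time configuration, which is sometimes easier to audit when debugging registration failures. A minimal sketch, assuming a placeholder model class `com.example.MyModel` (the report's actual class names are not shown); `spark.serializer` and `spark.kryo.classesToRegister` are standard Spark properties that `SparkConf.registerKryoClasses` sets under the hood:]

{noformat}
# Sketch: equivalent submit-time Kryo configuration
# (com.example.MyModel is a placeholder, not from the report)
spark-submit \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryo.classesToRegister=com.example.MyModel \
  ...
{noformat}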

[jira] [Commented] (SPARK-21928) ClassNotFoundException for custom Kryo registrator class during serde in netty threads

2023-11-20 Thread Shivam Sharma (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-21928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17788104#comment-17788104
 ] 

Shivam Sharma commented on SPARK-21928:
---

I am hitting this intermittent failure on Spark 2.4.3. Here is the full 
stack trace:
{code:java}
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:65)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 75 in stage 1.0 failed 4 times, most recent failure: Lost task 75.3 in stage 1.0 (TID 171, phx6-kwq.prod.xyz.internal, executor 71): java.io.IOException: org.apache.spark.SparkException: Failed to register classes with Kryo
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1333)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:208)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:89)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
    at org.apache.spark.scheduler.Task.run(Task.scala:121)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Failed to register classes with Kryo
    at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:140)
    at org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:324)
    at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:309)
    at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:218)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:305)
    at org.apache.spark.broadcast.TorrentBroadcast.$anonfun$readBroadcastBlock$3(TorrentBroadcast.scala:235)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.broadcast.TorrentBroadcast.$anonfun$readBroadcastBlock$1(TorrentBroadcast.scala:211)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1326)
    ... 14 more
Caused by: java.lang.ClassNotFoundException: com.xyz.datashack.SparkKryoRegistrar
    at java.lang.ClassLoader.findClass(ClassLoader.java:530)
    at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
    at org.apache.spark.util.ChildFirstURLClassLoader.loadClass(ChildFirstURLClassLoader.java:48)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.serializer.KryoSerializer.$anonfun$newKryo$6(KryoSerializer.scala:135)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
    at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
    at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
    at scala.collection.TraversableLike.map(TraversableLike.scala:237)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
    at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:135)
    ... 22 more
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
    at
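[Editorial note: the root cause in the trace above is a `ClassNotFoundException` for the custom registrator class on an executor's classloader. A common remedy (not confirmed as the fix for this report) is to make sure the jar containing the registrator is actually shipped to executors and named in `spark.kryo.registrator`; a sketch, with the jar path as a placeholder and the class name taken from the trace:]

{noformat}
# Sketch: ship the registrator jar to executors so Kryo can load it
# (/path/to/kryo-registrator.jar is a placeholder)
spark-submit \
  --jars /path/to/kryo-registrator.jar \
  --conf spark.kryo.registrator=com.xyz.datashack.SparkKryoRegistrar \
  ...
{noformat}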