Github user tbertelsen commented on the pull request:

    https://github.com/apache/spark/pull/4534#issuecomment-73972070
  
    It compiles if you include `new`. It just becomes a `null`-filled array.
    ~~~
    scala> new Array[Nothing](1)
    res19: Array[Nothing] = Array(null)
    ~~~
    
    I think you're right about this being fairly artificial. One solution could 
be to simply not accept `Nothing` as the element type of an RDD, and to fail 
fast and predictably, like this:
    
    ~~~
        // Note: `classOf[Nothing]` does not compile; the runtime class behind
        // ClassTag[Nothing] is scala.runtime.Nothing$.
        if (implicitly[ClassTag[T]].runtimeClass == classOf[scala.runtime.Nothing$]) {
          throw new RuntimeException("RDD[Nothing] is not allowed.")
        }
    ~~~
    
    We must be careful, however. Right now `sc.emptyRDD` returns an 
`RDD[Nothing]`, and prohibiting `RDD[Nothing]` could easily have unforeseen 
consequences.
    
    
    If we just want to provide a good error message, we could instead catch 
the `ArrayStoreException` and wrap it in an exception with a more descriptive 
message, perhaps at the place where it is already wrapped in the 
`SparkDriverExecutionException`.

