I have a small snippet of code which relies on argonaut
<http://argonaut.io/> for JSON serialization and which is run from
`PairDStreamFunctions.mapWithState` once a session is completed.

This is the code snippet (not that important):

  // Imports added for completeness; the HTTP client is assumed to be
  // scalaj-http, and asJson comes from argonaut's syntax (it needs an
  // implicit EncodeJson[PageView] in scope -- the one that fails to
  // serialize below).
  import scala.concurrent.Future
  import scala.concurrent.ExecutionContext.Implicits.global
  import scala.util.control.NonFatal
  import scalaj.http.{Http, HttpOptions}
  import argonaut._, Argonaut._

  override def sendMessage(pageView: PageView): Unit = {
    Future {
      LogHolder.logger.info(s"Sending pageview: ${pageView.id} to automation")
      try {
        Http(url)
          .postData(pageView.asJson.toString)
          .option(HttpOptions.connTimeout(timeOutMilliseconds))
          .asString
          .throwError
      }
      catch {
        case NonFatal(e) => LogHolder.logger.error("Failed to send pageview", e)
      }
    }
  }

argonaut relies on a user-supplied implementation of a trait called
`EncodeJson[T]`, which tells argonaut how to serialize the object (its
counterpart, `DecodeJson[T]`, handles deserialization).
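
For reference, such an encoder looks roughly like this -- a minimal
sketch, assuming a hypothetical PageView with id and url string fields:

  implicit val pageViewEncode: EncodeJson[PageView] =
    EncodeJson((p: PageView) =>
      ("id" := p.id) ->: ("url" := p.url) ->: jEmptyObject)

The instance returned by EncodeJson(...) is presumably the anonymous
class (argonaut.EncodeJson$$anon$2) that shows up in the stack trace
below.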

The problem is that the trait `EncodeJson[T]` does not extend
Serializable, so shipping a closure that captures the encoder to the
workers throws a NotSerializableException:

Caused by: java.io.NotSerializableException: argonaut.EncodeJson$$anon$2
Serialization stack:
        - object not serializable (class: argonaut.EncodeJson$$anon$2, value: argonaut.EncodeJson$$anon$2@6415f61e)

This is obvious and understandable: the implicit encoder instance is
captured by the closure, and Spark has to serialize the whole closure
to send it to the executors.

The question I have is: what possible ways are there to work around
this? I'm currently dependent on a third-party library which I can't
control or change to implement Serializable in any way. I've seen this
StackOverflow answer
<http://stackoverflow.com/questions/22592811/task-not-serializable-java-io-notserializableexception-when-calling-function-ou>
but couldn't implement any reasonable workaround.
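
The pattern that answer suggests, as far as I can tell, is to keep the
non-serializable instance out of the closure entirely, e.g. by holding
it in a singleton object so each executor JVM constructs its own copy
instead of deserializing one. A sketch, reusing the hypothetical
PageView encoder from above:

  object PageViewEncoders {
    // Built lazily and locally in each JVM; never shipped with the
    // closure, since referencing a top-level object is a static access
    // rather than a captured field.
    implicit lazy val pageViewEncode: EncodeJson[PageView] =
      EncodeJson((p: PageView) =>
        ("id" := p.id) ->: ("url" := p.url) ->: jEmptyObject)
  }

  // Inside the code that runs on the worker:
  //   import PageViewEncoders._
  //   pageView.asJson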

Anyone have any ideas?
