stevedlawrence commented on code in PR #1515:
URL: https://github.com/apache/daffodil/pull/1515#discussion_r2251991004


##########
daffodil-core/src/main/scala/org/apache/daffodil/runtime1/processors/DataProcessor.scala:
##########
@@ -155,8 +116,7 @@ class DataProcessor(
    * @return the serializable object
    */
   @throws(classOf[java.io.ObjectStreamException])
-  private def writeReplace(): Object =
-    new SerializableDataProcessor(ssrd, tunables, variableMap.copy())
+  private def writeReplace(): Object = new DataProcessor(ssrd, tunables, variableMap.copy())

Review Comment:
   We could change the `save` method to also disable debugging, similar to how we disable validation, e.g.
   
   ```scala
       val dpToSave = this.copy(
         // reset to original variables defined in schema
         variableMap = ssrd.originalVariables,
         validator = NoValidator,
         // don't save any warnings that were generated
         diagnostics = Seq.empty,
         // disable debugger if provided
         areDebugging = false,
         optDebugger = None
       )
   ```
   
   So using `save` will clear the debugger, but manually serializing will not. And if you want to debug using a saved/reloaded parser, you just need to call `withDebugger` after reloading it, similar to what you have to do with validators.
   
   I don't think we need to make our own debuggers serializable, but I could imagine a use case where Spark or some other framework that directly serializes the DataProcessor provides its own serializable debugger, so we shouldn't necessarily exclude that capability.
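   For anyone unfamiliar with the hook the diff above relies on: `writeReplace` lets a class substitute a different object at serialization time. A minimal self-contained sketch (not Daffodil code; the `Settings` class and its field are made up for illustration):

```scala
import java.io._

// Hypothetical class standing in for DataProcessor: writeReplace swaps in
// a cleaned-up copy (debugging disabled) when the object is serialized.
case class Settings(debugging: Boolean) extends Serializable {
  @throws(classOf[ObjectStreamException])
  private def writeReplace(): Object =
    // drop state we don't want persisted, analogous to clearing the debugger
    Settings(debugging = false)
}

object Demo {
  // serialize then deserialize, returning whatever comes back
  def roundTrip[T <: Serializable](obj: T): AnyRef = {
    val bytes = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(bytes)
    out.writeObject(obj)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    in.readObject()
  }

  def main(args: Array[String]): Unit = {
    val restored = roundTrip(Settings(debugging = true))
    println(restored) // prints "Settings(false)"
  }
}
```

   Since the replacement is an instance of the same class (as in the diff, where `writeReplace` now returns a plain `DataProcessor`), Java serialization does not invoke `writeReplace` again on the replacement, so there is no recursion.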



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
