[ https://issues.apache.org/jira/browse/SYSTEMML-1158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15762612#comment-15762612 ]

Felix Schüler commented on SYSTEMML-1158:
-----------------------------------------

There might be something else going on. When I run the same thing in IntelliJ, 
I get an ArrayIndexOutOfBoundsException for a CP instruction that isn't in the 
explain output, so the Spark rangeReIndex instruction is probably recompiled 
into a CP instruction and then something goes wrong?

{code}
java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.sysml.runtime.io.IOUtilFunctions.countNnz(IOUtilFunctions.java:281)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:638)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:570)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
        at org.apache.spark.scheduler.Task.run(Task.scala:86)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
16/12/19 15:16:18 INFO executor.Executor: Finished task 0.0 in stage 7.0 (TID 11). 1488 bytes result sent to driver
16/12/19 15:16:18 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 7.0 (TID 11) in 25 ms on localhost (1/2)
16/12/19 15:16:18 WARN scheduler.TaskSetManager: Lost task 1.0 in stage 7.0 (TID 12, localhost): java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.sysml.runtime.io.IOUtilFunctions.countNnz(IOUtilFunctions.java:281)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:638)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:570)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
        at org.apache.spark.scheduler.Task.run(Task.scala:86)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

16/12/19 15:16:18 ERROR scheduler.TaskSetManager: Task 1 in stage 7.0 failed 1 times; aborting job
16/12/19 15:16:18 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
16/12/19 15:16:18 INFO scheduler.TaskSchedulerImpl: Cancelling stage 7
16/12/19 15:16:18 INFO scheduler.DAGScheduler: ShuffleMapStage 7 (mapPartitionsToPair at RDDConverterUtils.java:185) failed in 0.073 s
16/12/19 15:16:18 INFO scheduler.DAGScheduler: Job 6 failed: collect at SparkExecutionContext.java:777, took 0.090557 s

org.apache.sysml.api.mlcontext.MLContextException: Exception when executing script
        at org.apache.sysml.api.mlcontext.MLContext.execute(MLContext.java:301)
        at org.apache.sysml.api.mlcontext.MLContext.execute(MLContext.java:270)
        at org.apache.sysml.compiler.macros.RewriteMacrosSpec$$anonfun$8$$anon$8.run(RewriteMacrosSpec.scala:175)
        at org.apache.sysml.compiler.macros.RewriteMacrosSpec$$anonfun$8$$anon$8.run(RewriteMacrosSpec.scala)
        at org.apache.sysml.compiler.macros.RewriteMacrosSpec$$anonfun$8.apply$mcV$sp(RewriteMacrosSpec.scala:195)
        at org.apache.sysml.compiler.macros.RewriteMacrosSpec$$anonfun$8.apply(RewriteMacrosSpec.scala:173)
        at org.apache.sysml.compiler.macros.RewriteMacrosSpec$$anonfun$8.apply(RewriteMacrosSpec.scala:173)
        at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
        at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.FreeSpecLike$$anon$1.apply(FreeSpecLike.scala:373)
        at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
        at org.scalatest.FreeSpec.withFixture(FreeSpec.scala:1740)
        at org.scalatest.FreeSpecLike$class.invokeWithFixture$1(FreeSpecLike.scala:370)
        at org.scalatest.FreeSpecLike$$anonfun$runTest$1.apply(FreeSpecLike.scala:382)
        at org.scalatest.FreeSpecLike$$anonfun$runTest$1.apply(FreeSpecLike.scala:382)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.FreeSpecLike$class.runTest(FreeSpecLike.scala:382)
        at org.scalatest.FreeSpec.runTest(FreeSpec.scala:1740)
        at org.scalatest.FreeSpecLike$$anonfun$runTests$1.apply(FreeSpecLike.scala:441)
        at org.scalatest.FreeSpecLike$$anonfun$runTests$1.apply(FreeSpecLike.scala:441)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
        at org.scalatest.FreeSpecLike$class.runTests(FreeSpecLike.scala:441)
        at org.scalatest.FreeSpec.runTests(FreeSpec.scala:1740)
        at org.scalatest.Suite$class.run(Suite.scala:1424)
        at org.scalatest.FreeSpec.org$scalatest$FreeSpecLike$$super$run(FreeSpec.scala:1740)
        at org.scalatest.FreeSpecLike$$anonfun$run$1.apply(FreeSpecLike.scala:486)
        at org.scalatest.FreeSpecLike$$anonfun$run$1.apply(FreeSpecLike.scala:486)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
        at org.scalatest.FreeSpecLike$class.run(FreeSpecLike.scala:486)
        at org.scalatest.FreeSpec.run(FreeSpec.scala:1740)
        at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
        at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
        at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
        at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:51)
        at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:237)
        at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: org.apache.sysml.api.mlcontext.MLContextException: Exception occurred while executing runtime program
        at org.apache.sysml.api.mlcontext.ScriptExecutor.executeRuntimeProgram(ScriptExecutor.java:380)
        at org.apache.sysml.api.mlcontext.ScriptExecutor.execute(ScriptExecutor.java:323)
        at org.apache.sysml.api.mlcontext.MLContext.execute(MLContext.java:293)
        ... 48 more
Caused by: org.apache.sysml.runtime.DMLRuntimeException: org.apache.sysml.runtime.DMLRuntimeException: ERROR: Runtime error in for program block generated from for statement block between lines 6 and 8 -- Error evaluating for program block
        at org.apache.sysml.runtime.controlprogram.Program.execute(Program.java:130)
        at org.apache.sysml.api.mlcontext.ScriptExecutor.executeRuntimeProgram(ScriptExecutor.java:378)
        ... 50 more
Caused by: org.apache.sysml.runtime.DMLRuntimeException: ERROR: Runtime error in for program block generated from for statement block between lines 6 and 8 -- Error evaluating for program block
        at org.apache.sysml.runtime.controlprogram.ForProgramBlock.execute(ForProgramBlock.java:162)
        at org.apache.sysml.runtime.controlprogram.Program.execute(Program.java:123)
        ... 51 more
Caused by: org.apache.sysml.runtime.DMLRuntimeException: ERROR: Runtime error in program block generated from statement block between lines 7 and 7 -- Error evaluating instruction: CP°rangeReIndex°data·MATRIX·DOUBLE°_Var60·SCALAR·INT·false°_Var60·SCALAR·INT·false°1·SCALAR·INT·true°_Var61·SCALAR·INT·false°_mVar62·MATRIX·DOUBLE
        at org.apache.sysml.runtime.controlprogram.ProgramBlock.executeSingleInstruction(ProgramBlock.java:320)
        at org.apache.sysml.runtime.controlprogram.ProgramBlock.executeInstructions(ProgramBlock.java:221)
        at org.apache.sysml.runtime.controlprogram.ProgramBlock.execute(ProgramBlock.java:168)
        at org.apache.sysml.runtime.controlprogram.ForProgramBlock.execute(ForProgramBlock.java:150)
        ... 52 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 7.0 failed 1 times, most recent failure: Lost task 1.0 in stage 7.0 (TID 12, localhost): java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.sysml.runtime.io.IOUtilFunctions.countNnz(IOUtilFunctions.java:281)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:638)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:570)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
        at org.apache.spark.scheduler.Task.run(Task.scala:86)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1441)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1667)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1873)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1886)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1899)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1913)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:912)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
        at org.apache.spark.rdd.RDD.collect(RDD.scala:911)
        at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:360)
        at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
        at org.apache.sysml.runtime.controlprogram.context.SparkExecutionContext.toMatrixBlock(SparkExecutionContext.java:777)
        at org.apache.sysml.runtime.controlprogram.context.SparkExecutionContext.toMatrixBlock(SparkExecutionContext.java:745)
        at org.apache.sysml.runtime.controlprogram.caching.MatrixObject.readBlobFromRDD(MatrixObject.java:486)
        at org.apache.sysml.runtime.controlprogram.caching.MatrixObject.readBlobFromRDD(MatrixObject.java:60)
        at org.apache.sysml.runtime.controlprogram.caching.CacheableData.acquireRead(CacheableData.java:411)
        at org.apache.sysml.runtime.controlprogram.context.ExecutionContext.getMatrixInput(ExecutionContext.java:215)
        at org.apache.sysml.runtime.instructions.cp.MatrixIndexingCPInstruction.processInstruction(MatrixIndexingCPInstruction.java:64)
        at org.apache.sysml.runtime.controlprogram.ProgramBlock.executeSingleInstruction(ProgramBlock.java:290)
        ... 55 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.sysml.runtime.io.IOUtilFunctions.countNnz(IOUtilFunctions.java:281)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:638)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$CSVToBinaryBlockFunction.call(RDDConverterUtils.java:570)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:185)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
        at org.apache.spark.scheduler.Task.run(Task.scala:86)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

{code}
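
The ArrayIndexOutOfBoundsException: 0 in IOUtilFunctions.countNnz points at a 
zero-length token array. As a minimal sketch of one way that can happen 
(hypothetical code, not the actual countNnz implementation): Java's 
String.split drops trailing empty tokens, so a delimiter-only CSV line splits 
into an empty array and the very first index access fails.

{code}
// Hypothetical sketch, NOT the actual IOUtilFunctions.countNnz code.
// String.split with the default limit removes trailing empty tokens,
// so a line consisting only of delimiters yields a zero-length array,
// and any access to index 0 throws ArrayIndexOutOfBoundsException: 0.
object SplitSketch {
  def main(args: Array[String]): Unit = {
    println("1.0,0.0,2.0".split(",").length) // 3 tokens, fine
    val tokens = ",,".split(",")             // all tokens empty -> all dropped -> length 0
    println(tokens.length)                   // 0
    println(tokens(0))                       // java.lang.ArrayIndexOutOfBoundsException: 0
  }
}
{code}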

> MLContext values not found when data is read in DML script
> ----------------------------------------------------------
>
>                 Key: SYSTEMML-1158
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1158
>             Project: SystemML
>          Issue Type: Bug
>          Components: APIs
>    Affects Versions: SystemML 0.11
>         Environment: IBM DSX
>            Reporter: Felix Schüler
>
> When executing a DML script through MLContext that reads data from a file, 
> the resulting variables cannot be found in the MLContext because they are 
> removed from the context. This does not happen when the data is initialized 
> randomly inside the script.
> The following example works fine with MLContext.getTuple:
> {code}
> data = rand(rows=100, cols=10)
> c = ncol(data)
> r = nrow(data)
> stats = rand(rows=1, cols=c)
> for (i in 0:(r - 1)) {
>   stats = (stats + data[i + 1,])
> }
> stats = (stats / r)
> {code}
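> For reference, the driver side looks roughly like this (a sketch, assuming an 
> existing SparkContext sc; the script text is the one above):
> {code}
> import org.apache.sysml.api.mlcontext.{MLContext, Matrix}
> import org.apache.sysml.api.mlcontext.ScriptFactory.dml
>
> val ml = new MLContext(sc) // sc: pre-existing SparkContext (assumption)
> val dmlText =
>   """data = rand(rows=100, cols=10)
>     |c = ncol(data)
>     |r = nrow(data)
>     |stats = rand(rows=1, cols=c)
>     |for (i in 0:(r - 1)) {
>     |  stats = (stats + data[i + 1,])
>     |}
>     |stats = (stats / r)""".stripMargin
>
> val results = ml.execute(dml(dmlText).out("stats"))
> // retrieving the output works in the rand() case:
> val Tuple1(stats) = results.getTuple[Matrix]("stats")
> {code}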
> When I replace the random matrix initialization with 
> {code}read("~/abalone_transformed", format="csv"){code}, I get the following 
> exception:
> {code} 
> Name: org.apache.sysml.api.mlcontext.MLContextException
> Message: Variable 'stats' not found
> StackTrace:   at org.apache.sysml.api.mlcontext.MLResults.getData(MLResults.java:103)
>   at org.apache.sysml.api.mlcontext.MLResults.outputValue(MLResults.java:1996)
>   at org.apache.sysml.api.mlcontext.MLResults.getTuple(MLResults.java:663)
> {code}
> The instructions for the working case:
> {code}
> PROGRAM ( size CP/SP = 28/0 )
> --MAIN PROGRAM
> ----GENERIC (lines 1-4) [recompile=false]
> ------CP createvar _mVar167234 scratch_space//_p664_10.143.133.52//_t0/temp83636 true MATRIX binaryblock 3 3 1000 1000 9 copy
> ------CP rand 3 3 1000 1000 0.0 1.0 1.0 -1 uniform 1.0 48 _mVar167234.MATRIX.DOUBLE
> ------CP createvar _mVar167235 scratch_space//_p664_10.143.133.52//_t0/temp83637 true MATRIX binaryblock 1 3 1000 1000 3 copy
> ------CP rand 1 3 1000 1000 0.0 1.0 1.0 -1 uniform 1.0 48 _mVar167235.MATRIX.DOUBLE
> ------CP assignvar 3.SCALAR.INT.true c.SCALAR.INT
> ------CP assignvar 3.SCALAR.INT.true r.SCALAR.INT
> ------CP cpvar _mVar167234 data
> ------CP cpvar _mVar167235 stats
> ------CP rmvar _mVar167234
> ------CP rmvar _mVar167235
> ----GENERIC (lines 5-7) [recompile=false]
> ----FOR (lines 5-7)
> ------GENERIC (lines 6-6) [recompile=false]
> --------CP + i.SCALAR.INT.false 1.SCALAR.INT.true _Var167236.SCALAR.INT
> --------CP createvar _mVar167237 scratch_space//_p664_10.143.133.52//_t0/temp83638 true MATRIX binaryblock 1 3 1000 1000 -1 copy
> --------CP rangeReIndex data.MATRIX.DOUBLE _Var167236.SCALAR.INT.false _Var167236.SCALAR.INT.false 1.SCALAR.INT.true 3.SCALAR.INT.true _mVar167237.MATRIX.DOUBLE
> --------CP rmvar _Var167236
> --------CP createvar _mVar167238 scratch_space//_p664_10.143.133.52//_t0/temp83639 true MATRIX binaryblock 1 3 1000 1000 -1 copy
> --------CP + stats.MATRIX.DOUBLE _mVar167237.MATRIX.DOUBLE _mVar167238.MATRIX.DOUBLE
> --------CP rmvar _mVar167237
> --------CP rmvar stats
> --------CP cpvar _mVar167238 stats
> --------CP rmvar _mVar167238
> ----GENERIC (lines 8-11) [recompile=false]
> ------CP createvar _mVar167239 scratch_space//_p664_10.143.133.52//_t0/temp83640 true MATRIX binaryblock 1 3 1000 1000 -1 copy
> ------CP / stats.MATRIX.DOUBLE 3.SCALAR.INT.true _mVar167239.MATRIX.DOUBLE
> ------CP rmvar stats
> ------CP cpvar _mVar167239 stats
> ------CP rmvar _mVar167239
> ------CP rmvar r
> ------CP rmvar c
> ------CP rmvar stats
> {code}
> Failing case (note that this is in a notebook reading a file from object 
> storage):
> {code}
> PROGRAM ( size CP/SP = 38/7 )
> --MAIN PROGRAM
> ----GENERIC (lines 1-4) [recompile=true]
> ------CP createvar pREADdata ~/abalone_transformed false MATRIX csv -1 -1 -1 -1 -1 copy false , true 0.0
> ------CP createvar _mVar150511 scratch_space//_p664_10.143.133.52//_t0/temp75273 true MATRIX binaryblock -1 -1 1000 1000 -1 copy
> ------SPARK csvrblk pREADdata.MATRIX.DOUBLE _mVar150511.MATRIX.DOUBLE 1000 1000 false , true 0.0
> ------CP createvar _mVar150512 scratch_space//_p664_10.143.133.52//_t0/temp75274 true MATRIX binaryblock -1 -1 1000 1000 -1 copy
> ------SPARK chkpoint _mVar150511.MATRIX.DOUBLE _mVar150512.MATRIX.DOUBLE MEMORY_AND_DISK
> ------CP rmvar _mVar150511
> ------CP cpvar _mVar150512 data
> ------CP rmvar _mVar150512
> ----GENERIC (lines 1-4) [recompile=true]
> ------CP ncol data.MATRIX.DOUBLE.false _Var150513.SCALAR.INT
> ------CP nrow data.MATRIX.DOUBLE.false _Var150514.SCALAR.INT
> ------CP createvar _mVar150515 scratch_space//_p664_10.143.133.52//_t0/temp75275 true MATRIX binaryblock 1 -1 1000 1000 -1 copy
> ------SPARK rand 1 ¶_Var150513¶ 1000 1000 0.0 1.0 1.0 -1 scratch_space/_p664_10.143.133.52//_t0/ uniform 1.0 _mVar150515.MATRIX.DOUBLE
> ------CP assignvar _Var150513.SCALAR.INT.false c.SCALAR.INT
> ------CP assignvar _Var150514.SCALAR.INT.false r.SCALAR.INT
> ------CP rmvar _Var150513
> ------CP rmvar _Var150514
> ------CP cpvar _mVar150515 stats
> ------CP rmvar _mVar150515
> ----GENERIC (lines 5-7) [recompile=true]
> ------CP createvar _mVar150516 scratch_space//_p664_10.143.133.52//_t0/temp75276 true MATRIX binaryblock -1 -1 1000 1000 -1 copy
> ------SPARK chkpoint data.MATRIX.DOUBLE _mVar150516.MATRIX.DOUBLE MEMORY_AND_DISK
> ------CP rmvar data
> ------CP cpvar _mVar150516 data
> ------CP rmvar _mVar150516
> ----FOR (lines 5-7)
> ------CP - r.SCALAR.INT.false 1.SCALAR.INT.true _Var150517.SCALAR.INT
> ------CP rmvar _Var150517
> ------GENERIC (lines 6-6) [recompile=true]
> --------CP + i.SCALAR.INT.false 1.SCALAR.INT.true _Var150518.SCALAR.INT
> --------CP ncol data.MATRIX.DOUBLE.false _Var150519.SCALAR.INT
> --------CP createvar _mVar150520 scratch_space//_p664_10.143.133.52//_t0/temp75277 true MATRIX binaryblock 1 -1 1000 1000 -1 copy
> --------SPARK rangeReIndex data.MATRIX.DOUBLE _Var150518.SCALAR.INT.false _Var150518.SCALAR.INT.false 1.SCALAR.INT.true _Var150519.SCALAR.INT.false _mVar150520.MATRIX.DOUBLE MULTI_BLOCK
> --------CP rmvar _Var150518
> --------CP rmvar _Var150519
> --------CP createvar _mVar150521 scratch_space//_p664_10.143.133.52//_t0/temp75278 true MATRIX binaryblock 1 -1 1000 1000 -1 copy
> --------SPARK + stats.MATRIX.DOUBLE _mVar150520.MATRIX.DOUBLE _mVar150521.MATRIX.DOUBLE
> --------CP rmvar _mVar150520
> --------CP rmvar stats
> --------CP cpvar _mVar150521 stats
> --------CP rmvar _mVar150521
> ----GENERIC (lines 8-11) [recompile=true]
> ------CP createvar _mVar150522 scratch_space//_p664_10.143.133.52//_t0/temp75279 true MATRIX binaryblock 1 -1 1000 1000 -1 copy
> ------SPARK / stats.MATRIX.DOUBLE r.SCALAR.INT.false _mVar150522.MATRIX.DOUBLE
> ------CP rmvar stats
> ------CP cpvar _mVar150522 stats
> ------CP rmvar _mVar150522
> ------CP rmvar r
> ------CP rmvar c
> ------CP rmvar stats
> {code}
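> The failing variant is the same driver code; only the data assignment in the 
> script changes to the csv read (sketch, reusing dmlText from the example 
> above):
> {code}
> // identical driver; swap the rand() line for the csv read
> val failingText = dmlText.replace(
>   "data = rand(rows=100, cols=10)",
>   """data = read("~/abalone_transformed", format="csv")""")
>
> val failing = ml.execute(dml(failingText).out("stats"))
> // this is where MLContextException: Variable 'stats' not found is thrown
> failing.getTuple[Matrix]("stats")
> {code}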


