[ 
https://issues.apache.org/jira/browse/SYSTEMML-1595?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16003852#comment-16003852
 ] 

Mike Dusenberry commented on SYSTEMML-1595:
-------------------------------------------

Interesting.  I just checked, and MLContext uses plain "text" dummy {{write}} 
statements as well, so that explains why I was seeing the issue originally with 
MLContext.  Should we start adding the block sizes to these persistent writes 
anyway?  Or should we just update the persistent write -> transient write 
rewrite rule to grab the block sizes from the input to the PersistentWrite?
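
The second option could be sketched roughly as follows. This is a hypothetical, self-contained illustration of the idea (the {{Hop}} class and {{rewritePersistentToTransient}} method here are simplified stand-ins, not the actual SystemML compiler API): when rewriting a PersistentWrite to a TransientWrite, pull the block sizes from the write's input hop whenever the write itself has unknown block sizes.

```java
// Hypothetical sketch, NOT the real SystemML Hop API: propagate block
// sizes from the input hop during the PersistentWrite -> TransientWrite
// rewrite, so the resulting TransientWrite does not stay unknown (-1).
class Hop {
    long rowsInBlock;   // -1 denotes an unknown block size
    long colsInBlock;
    Hop input;          // single data input, for simplicity

    Hop(long brlen, long bclen, Hop input) {
        this.rowsInBlock = brlen;
        this.colsInBlock = bclen;
        this.input = input;
    }
}

class RewriteSketch {
    // Rewrite a persistent write into a transient write; if the write's
    // block sizes are unknown, copy them from its input hop.
    static Hop rewritePersistentToTransient(Hop pwrite) {
        Hop twrite = new Hop(pwrite.rowsInBlock, pwrite.colsInBlock, pwrite.input);
        if (twrite.rowsInBlock < 0 && twrite.input != null)
            twrite.rowsInBlock = twrite.input.rowsInBlock;
        if (twrite.colsInBlock < 0 && twrite.input != null)
            twrite.colsInBlock = twrite.input.colsInBlock;
        return twrite;
    }
}
```

With this shape, the MLContext dummy {{write}} statements would not need block sizes themselves, since the rewrite recovers them from the input DAG.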

> Missing Block Sizes For PersistentWrites & TransientWrites
> ----------------------------------------------------------
>
>                 Key: SYSTEMML-1595
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1595
>             Project: SystemML
>          Issue Type: Bug
>            Reporter: Mike Dusenberry
>         Attachments: scenario1.dml
>
>
> In the attached script, the resulting PersistentWrites for {{doutc1_agg}} & 
> {{dWc1_agg}} end up having unknown block sizes, despite the input DAGs for 
> those variables having known block sizes.  Due to this, when we use MLContext 
> and mark those variables as outputs, the PersistentWrites will be rewritten 
> to TransientWrites, and the block sizes will remain unknown.
> To run:
> {code}
> spark-submit $SYSTEMML_HOME/target/SystemML.jar -f scenario1.dml -explain 
> recompile_hops
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)