kaknikhil commented on a change in pull request #425: DL: Add training for multiple models
URL: https://github.com/apache/madlib/pull/425#discussion_r309941439
 
 

 ##########
 File path: src/ports/postgres/modules/deep_learning/madlib_keras.py_in
 ##########
 @@ -455,17 +495,23 @@ def fit_transition(state, dependent_var, independent_var, model_architecture,
         # Once done with all images on a segment, we update weights
         # with the total number of images here instead of the merge function.
         # The merge function only deals with aggregating them.
-        updated_weights = [ total_images * w for w in updated_weights ]
-            # In GPDB, each segment would have a keras session, so clear
-            # them after the last buffer is processed.
-        clear_keras_session()
-
+        if not is_cerebro:
+            updated_weights = [ total_images * w for w in updated_weights ]
+
+        # In GPDB, each segment would have a keras session, so clear
+        # them at the end of the final iteration, and also after every
+        # final buffer for cerebro
+        if is_final or is_cerebro:
+            K.clear_session()
+            sess.close()
+            del SD['segment_model']
+            del SD['sess']
+
+    all_ops_len = len([n.name for n in tf.get_default_graph().as_graph_def().node])
 
 Review comment:
   Why do we need the variable `all_ops_len`? I only see it being written to, never read from.
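
   For context, the scaling-then-merge scheme described by the comments in this transition function can be sketched as below. The function names, the list-of-lists weight layout, and the explicit `finalize` step are illustrative assumptions, not MADlib's actual UDA interface:

   ```python
   # Hypothetical sketch of weighted model averaging across segments:
   # each segment scales its weights by its local image count in the
   # transition step, the merge step only sums, and a final step divides
   # by the grand total of images to recover the weighted average.

   def transition(weights, num_images):
       # Scale every weight value by this segment's image count.
       return [[num_images * x for x in layer] for layer in weights]

   def merge(state_a, state_b):
       # Aggregation only: element-wise sum of already-scaled weights.
       return [[a + b for a, b in zip(la, lb)]
               for la, lb in zip(state_a, state_b)]

   def finalize(state, total_images):
       # Divide by the total image count to finish the weighted average.
       return [[x / total_images for x in layer] for layer in state]

   seg1 = transition([[1.0, 2.0]], 10)   # segment that saw 10 images
   seg2 = transition([[3.0, 4.0]], 30)   # segment that saw 30 images
   avg = finalize(merge(seg1, seg2), 40)
   print(avg)  # [[2.5, 3.5]]
   ```

   This also shows why the cerebro path skips the `total_images * w` scaling: when each model trains on only one segment's data, there is nothing to merge and the weights should be left unscaled.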

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
