[GitHub] spark pull request: [SPARK-15340][SQL]Limit the size of the map us...

2016-05-18 Thread akohli
Github user akohli commented on the pull request: https://github.com/apache/spark/pull/13130#issuecomment-219998908 Sure (I get that), but 1000 is an arbitrary number that may or may not be sufficient. Why not 10? Why not 1?

[GitHub] spark pull request: [SPARK-15340][SQL]Limit the size of the map us...

2016-05-17 Thread akohli
Github user akohli commented on the pull request: https://github.com/apache/spark/pull/13130#issuecomment-219750370 Curious about a couple of things. Firstly, you are using CacheBuilder with a cache size of 1000; is that sufficient? Wouldn't it be better to catch an exception or d
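The exchange above is about bounding an internal map with Guava's CacheBuilder at a fixed maximum size. As a hypothetical illustration of the same size-capping idea (the PR itself uses Guava; this sketch uses only the JDK so it is self-contained), a bounded LRU map can be built on java.util.LinkedHashMap. The class name BoundedMap and the cap value are assumptions for the example, not names from the PR:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of capping a map's size with LRU eviction, approximating
// what Guava's CacheBuilder.maximumSize(n) provides in the PR.
public class BoundedMap<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedMap(int maxEntries) {
        // access-order = true so iteration order tracks recency,
        // which makes removeEldestEntry evict the least recently used key
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the oldest entry once the cap is exceeded
        return size() > maxEntries;
    }
}
```

Whatever cap is chosen (1000, 10, or 1), correctness is preserved because an evicted entry is simply recomputed on the next lookup; the number only tunes the hit rate against memory use, which is presumably why the reviewer is asking how 1000 was picked.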