[ https://issues.apache.org/jira/browse/SPARK-16928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-16928:
------------------------------
    Priority: Minor  (was: Major)

Description:
In both OnHeapColumnVector and OffHeapColumnVector, getInt() is implemented with the following code pattern:

{code}
public int getInt(int rowId) {
  if (dictionary == null) {
    return intData[rowId];
  } else {
    return dictionary.decodeToInt(dictionaryIds.getInt(rowId));
  }
}
{code}

Because dictionaryIds is itself a ColumnVector, this results in a recursive call of getInt(), which breaks JIT inlining. As a result, getInt() will not be inlined.

> Recursive call of ColumnVector::getInt() breaks JIT inlining
> ------------------------------------------------------------
>
>                 Key: SPARK-16928
>                 URL: https://issues.apache.org/jira/browse/SPARK-16928
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0
>            Reporter: Qifan Pu
>            Priority: Minor
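A minimal standalone sketch of one way to address this: split the dictionary branch into a separate method so the hot path of getInt() stays small and non-recursive, which makes it easy for the JIT to inline. The classes below (IntVector, getDictionaryInt) are hypothetical simplifications for illustration, not Spark's actual ColumnVector hierarchy or its actual fix.

```java
// Hypothetical simplified column vector, not Spark's actual classes.
final class IntVector {
    private final int[] intData;          // plain encoding (null if dictionary-encoded)
    private final int[] dictionary;       // null when not dictionary-encoded
    private final IntVector dictionaryIds;

    // Plain, non-dictionary vector.
    IntVector(int[] intData) {
        this.intData = intData;
        this.dictionary = null;
        this.dictionaryIds = null;
    }

    // Dictionary-encoded vector: values = dictionary[dictionaryIds[rowId]].
    IntVector(int[] dictionary, IntVector dictionaryIds) {
        this.intData = null;
        this.dictionary = dictionary;
        this.dictionaryIds = dictionaryIds;
    }

    // Hot path: tiny body with no self-call, a good inlining candidate.
    int getInt(int rowId) {
        if (dictionary == null) {
            return intData[rowId];
        }
        return getDictionaryInt(rowId);   // dictionary path kept out of the hot method
    }

    // Cold path: the recursive decode lives here instead of in getInt() itself.
    private int getDictionaryInt(int rowId) {
        return dictionary[dictionaryIds.getInt(rowId)];
    }

    public static void main(String[] args) {
        IntVector plain = new IntVector(new int[]{10, 20, 30});
        IntVector ids = new IntVector(new int[]{2, 0, 1});
        IntVector encoded = new IntVector(new int[]{100, 200, 300}, ids);
        System.out.println(plain.getInt(1));    // 20
        System.out.println(encoded.getInt(0));  // dictionary[2] = 300
    }
}
```

The idea is standard hot/cold path splitting: HotSpot judges inlining partly by bytecode size and call shape, so keeping the common (non-dictionary) case in a small method with no recursive call gives the compiler a much better chance of inlining it at each call site.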