HeartSaVioR commented on a change in pull request #24046: [MINOR][SQL] Throw
better exception for Encoder with tuple more than 22 elements
URL: https://github.com/apache/spark/pull/24046#discussion_r264081453
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala
##########
@@ -80,7 +80,11 @@ object ExpressionEncoder {
* name/positional binding is preserved.
*/
def tuple(encoders: Seq[ExpressionEncoder[_]]): ExpressionEncoder[_] = {
- // TODO: check if encoders length is more than 22 and throw exception for it.
+ if (encoders.length > 22) {
Review comment:
To me, no strong preference, since either way the error message would be
similar, though the two approaches throw different exceptions (both exceptions
make sense here, but I think `UnsupportedOperationException` represents what's
happening more clearly).
Changing to `require` would reduce the line count and simplify the code; I
couldn't find any other difference. @maropu Are you OK with sticking to the
current approach, or do you have a strong preference for using `require` here?
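For illustration, the two alternatives under discussion could be sketched as
follows. This is a simplified standalone sketch, not Spark's actual code: the
`tuple` signature is reduced to `Seq[String]`, and the error message text is an
assumption, not the one in the PR.

```scala
// Simplified sketch of the two alternatives discussed above.
// Neither object nor message is from Spark; both are illustrative only.
object TupleLimitSketch {
  private val msg =
    "cannot construct a tuple encoder with more than 22 elements"

  // Alternative 1 (as in the PR diff): an explicit check that throws
  // UnsupportedOperationException, which names the limitation directly.
  def tupleWithThrow(encoders: Seq[String]): Unit = {
    if (encoders.length > 22) {
      throw new UnsupportedOperationException(msg)
    }
  }

  // Alternative 2 (the `require` suggestion): shorter, but throws
  // IllegalArgumentException instead, per Predef.require's contract.
  def tupleWithRequire(encoders: Seq[String]): Unit = {
    require(encoders.length <= 22, msg)
  }
}
```

As the sketch shows, the trade-off is purely between brevity (`require`) and
the more descriptive exception type (`UnsupportedOperationException`); the
message a caller sees is otherwise the same.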
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]