OK, thanks for reminding me.

My SQL looks like this (it contains a Chinese word):


SELECT
        'HIGH' AS LEVEL,
        'Firewall uplink bandwidth exception:greater than 10000' AS content,
        `system.process.username`,
        `system.process.memory.rss.bytes`
FROM
        test
WHERE
        `system.process.username` LIKE '%????%'
        AND 
        `system.process.memory.rss.bytes` > 10000



I get an exception when I submit the job to the cluster.


Caused by: org.apache.calcite.runtime.CalciteException: Failed to encode '%????%' in character set 'ISO-8859-1'
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_45]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[na:1.8.0_45]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[na:1.8.0_45]
        at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[na:1.8.0_45]
        at org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:463) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:572) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.util.NlsString.<init>(NlsString.java:81) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.rex.RexBuilder.makeLiteral(RexBuilder.java:864) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.rex.RexBuilder.makeCharLiteral(RexBuilder.java:1051) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.SqlNodeToRexConverterImpl.convertLiteral(SqlNodeToRexConverterImpl.java:117) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.SqlToRelConverter$Blackboard.visit(SqlToRelConverter.java:4408) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.SqlToRelConverter$Blackboard.visit(SqlToRelConverter.java:3787) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql.SqlLiteral.accept(SqlLiteral.java:427) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.SqlToRelConverter$Blackboard.convertExpression(SqlToRelConverter.java:4321) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.StandardConvertletTable.convertExpressionList(StandardConvertletTable.java:968) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.StandardConvertletTable.convertCall(StandardConvertletTable.java:944) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        at org.apache.calcite.sql2rel.StandardConvertletTable.convertCall(StandardConvertletTable.java:928) ~[flink-table_2.11-1.3.1.jar:1.3.1]
        ... 50 common frames omitted



Can anyone tell me how to deal with it? Thanks!
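Not something confirmed in this thread, but since the trace shows Calcite's NlsString falling back to ISO-8859-1 for the string literal, one workaround that is sometimes suggested is to override Calcite's default charset via the "saffron" system properties, passed to the Flink JVMs through `env.java.opts`. The property names come from Calcite's SaffronProperties; the UTF-16LE values below are an assumption and should be verified for your setup:

```yaml
# flink-conf.yaml -- sketch of a possible workaround, not a confirmed fix.
# Passes Calcite's "saffron" charset properties to the task/job manager JVMs
# so string literals are not forced into ISO-8859-1.
env.java.opts: -Dsaffron.default.charset=UTF-16LE -Dsaffron.default.nationalcharset=UTF-16LE -Dsaffron.default.collation.name=UTF-16LE$en_US
```

After changing the configuration, the cluster would need to be restarted for the JVM options to take effect.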


------------------ Original Message ------------------
From: "Nico Kruber" <n...@data-artisans.com>
Date: Tuesday, 25 July 2017, 11:48
To: "user" <user@flink.apache.org>
Cc: "????" <ji_ch...@qq.com>
Subject: Re: How can I set charset for flink sql?



Please, for the sake of making your email searchable, do not post stack traces 
as screenshots but rather text into your email.

On Tuesday, 25 July 2017 12:18:56 CEST ???? wrote:
> My SQL looks like this (it contains a Chinese word).
> 
> I get an exception when I submit the job to the cluster.
> 
> 
> 
> Can anyone tell me how to deal with it? Thanks!
