ABHISHEK KUMAR GUPTA created SPARK-29573: --------------------------------------------
             Summary: Spark should work as PostgreSQL when using + operator
                 Key: SPARK-29573
                 URL: https://issues.apache.org/jira/browse/SPARK-29573
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: ABHISHEK KUMAR GUPTA

Spark and PostgreSQL return different results when the + operator is applied to an integer column and a string column, as shown below.

Spark: returns NULL

0: jdbc:hive2://10.18.19.208:23040/default> select * from emp12;
+-----+---------+
| id  | name    |
+-----+---------+
| 20  | test    |
| 10  | number  |
+-----+---------+
2 rows selected (3.683 seconds)

0: jdbc:hive2://10.18.19.208:23040/default> select id as ID, id+name as address from emp12;
+-----+----------+
| ID  | address  |
+-----+----------+
| 20  | NULL     |
| 10  | NULL     |
+-----+----------+
2 rows selected (0.649 seconds)

0: jdbc:hive2://10.18.19.208:23040/default> select id as ID, id+','+name as address from emp12;
+-----+----------+
| ID  | address  |
+-----+----------+
| 20  | NULL     |
| 10  | NULL     |
+-----+----------+

PostgreSQL: throws an error saying the operator is not supported

create table emp12(id int, name varchar(255));
insert into emp12 values(10, 'number');
insert into emp12 values(20, 'test');

select id as ID, id+','+name as address from emp12;
Output: invalid input syntax for integer: ","

select id as ID, id+name as address from emp12;
Output: 42883: operator does not exist: integer + character varying

Spark should likewise throw an error if the operation is not supported.
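For reference, the NULL results above come from Spark's lenient (non-ANSI) implicit coercion: the string operand is cast to double, non-numeric strings become NULL, and arithmetic on NULL yields NULL. A minimal Python sketch of that semantics (the helper names here are hypothetical illustrations, not Spark APIs):

```python
def try_cast_double(value):
    # Mimic Spark's lenient CAST(value AS DOUBLE): a non-numeric
    # string becomes NULL (None) instead of raising an error.
    try:
        return float(value)
    except (TypeError, ValueError):
        return None

def plus(left, right):
    # Mimic Spark's '+' after implicit coercion: NULL-propagating,
    # so any NULL operand makes the whole result NULL.
    l, r = try_cast_double(left), try_cast_double(right)
    if l is None or r is None:
        return None
    return l + r

rows = [(20, "test"), (10, "number")]
for id_, name in rows:
    # Both rows produce None, matching the all-NULL 'address' column.
    print(id_, plus(id_, name))
```

Under PostgreSQL's semantics (and Spark's later ANSI mode), the cast step would instead raise an error, which is the behavior this issue requests.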
--
This message was sent by Atlassian Jira
(v8.3.4#803005)