Hi guys and gals, I have a Spark 1.2.0 instance running that I connect to over the Thrift interface using beeline. On that instance I can run a command like `show tables like 'tmp*';` and get a list of all tables whose names start with `tmp`. When testing this same command on a server running Spark 1.3.0 or higher, I now get an error message:
```
0: jdbc:hive2://localhost:10001> show tables like 'tmp*';
Error: java.lang.RuntimeException: [1.13] failure: ``in'' expected but identifier like found

show tables like 'tmp*'
            ^ (state=,code=0)
0: jdbc:hive2://localhost:10001>
```

I'm wondering whether wildcard matching was inadvertently removed, or whether there is another way to accomplish the same thing without having to do this filtering on the client side.

Cheers!

Doug

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/show-tables-like-tmp-does-not-work-in-Spark-1-3-0-tp24429.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
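For reference, the client-side fallback I'd rather avoid looks roughly like this: run a plain `show tables`, then match the returned names against the wildcard on the client. A minimal sketch in Python (the table list here is made up for illustration; `fnmatch` happens to understand the same shell-style `tmp*` pattern):

```python
import fnmatch

def filter_tables(table_names, pattern):
    """Return the table names matching a shell-style wildcard pattern like 'tmp*'."""
    return [name for name in table_names if fnmatch.fnmatch(name, pattern)]

# Pretend this came back from a plain `show tables`:
tables = ["tmp_orders", "tmp_users", "sales", "tmp2017"]
print(filter_tables(tables, "tmp*"))  # ['tmp_orders', 'tmp_users', 'tmp2017']
```

This works, but it pulls the full table list over the wire just to throw most of it away, which is why a server-side `like` filter would be preferable.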