[ https://issues.apache.org/jira/browse/SPARK-23812?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17187414#comment-17187414 ]

Yuming Wang commented on SPARK-23812:
-------------------------------------

{{spark.sql()}} does not support {{dfs}}:
{noformat}
scala> spark.sql("dfs -ls /tmp").show
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'dfs' expecting {'(', 'SELECT', 'FROM', 'ADD', 'DESC', 
'WITH', 'VALUES', 'CREATE', 'TABLE', 'INSERT', 'DELETE', 'DESCRIBE', 'EXPLAIN', 
'SHOW', 'USE', 'DROP', 'ALTER', 'MAP', 'SET', 'RESET', 'START', 'COMMIT', 
'ROLLBACK', 'REDUCE', 'REFRESH', 'CLEAR', 'CACHE', 'UNCACHE', 'TRUNCATE', 
'ANALYZE', 'LIST', 'REVOKE', 'GRANT', 'LOCK', 'UNLOCK', 'MSCK', 'EXPORT', 
'IMPORT', 'LOAD'}(line 1, pos 0)

== SQL ==
dfs -ls /tmp
^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:239)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:115)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
  
{noformat}

The {{bin/spark-sql}} CLI uses the Hive API, which is where {{dfs}} commands are handled.
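As a workaround from a plain {{SparkSession}} (just a minimal sketch, not part of this issue's fix; {{/tmp}} is only an example path), the same listing can be done through Hadoop's {{FileSystem}} API instead of a {{dfs}} command:
{noformat}
import org.apache.hadoop.fs.{FileSystem, Path}

// Sketch: list a directory via the Hadoop FileSystem API instead of the
// unsupported `dfs` command. `spark` is the SparkSession available in
// spark-shell, and "/tmp" is only an example path.
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
fs.listStatus(new Path("/tmp")).foreach(status => println(status.getPath))
{noformat}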

> DFS should be removed from unsupportedHiveNativeCommands in SqlBase.g4
> ----------------------------------------------------------------------
>
>                 Key: SPARK-23812
>                 URL: https://issues.apache.org/jira/browse/SPARK-23812
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: wangtao93
>            Priority: Minor
>
> The dfs command is already supported, but SqlBase.g4 still lists it in 
> unsupportedHiveNativeCommands.


