What about:

sqlContext.sql("SELECT * FROM Logs as l where l.timestamp='2012-10-08 16:10:36.0'").collect

It looks like you need to quote the timestamp literal: without the single
quotes, the parser reads 2012-10-08 as arithmetic (2012 minus 10 minus 8)
and then fails when it hits the 16, which matches both of your errors below.
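
If you are building the query string programmatically (as in your second
attempt below), the same fix applies: the value has to end up inside single
quotes in the SQL text. Here's a minimal sketch, assuming a
SimpleDateFormat-based stand-in for your parse/date2timestamp helpers
(which you didn't include), so treat the pattern and names as placeholders:

import java.sql.Timestamp
import java.text.SimpleDateFormat

// Stand-in for parse + date2timestamp: turn the log time string into a
// java.sql.Timestamp. The pattern is an assumption based on your sample value.
val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.S")
val ts = new Timestamp(fmt.parse("2012-10-08 16:10:36.0").getTime)

// Timestamp.toString renders back as "2012-10-08 16:10:36.0"; the single
// quotes around $ts are exactly what your second attempt was missing, which
// is why the parser choked on the bare "16".
val rows = sqlContext.sql(
  s"SELECT * FROM Logs AS l WHERE l.timestamp = '$ts'").collect()

Depending on your Spark version, the quoted string may or may not be coerced
to a timestamp for the comparison; if it isn't, try an explicit cast, e.g.
WHERE l.timestamp = CAST('2012-10-08 16:10:36.0' AS timestamp).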

Thanks
Best Regards

On Sat, Nov 22, 2014 at 12:09 AM, whitebread <ale.panebia...@me.com> wrote:

> Hi all,
>
> I put some log files into sql tables through Spark and my schema looks like
> this:
>
>  |-- timestamp: timestamp (nullable = true)
>  |-- c_ip: string (nullable = true)
>  |-- cs_username: string (nullable = true)
>  |-- s_ip: string (nullable = true)
>  |-- s_port: string (nullable = true)
>  |-- cs_method: string (nullable = true)
>  |-- cs_uri_stem: string (nullable = true)
>  |-- cs_query: string (nullable = true)
>  |-- sc_status: integer (nullable = false)
>  |-- sc_bytes: integer (nullable = false)
>  |-- cs_bytes: integer (nullable = false)
>  |-- time_taken: integer (nullable = false)
>  |-- User_Agent: string (nullable = true)
>  |-- Referrer: string (nullable = true)
>
> As you can see, I created a timestamp field, which I read is supported by
> Spark (Date wouldn't work, as far as I understood). I would love to use it
> for queries like "where timestamp>(2012-10-08 16:10:36.0)", but when I run
> them I keep getting errors.
> I tried the 2 following syntax forms.
> For the second one I parse a string, so I'm sure I'm actually passing it
> in timestamp format.
> I use 2 functions: parse and date2timestamp.
>
> Any hint on how I should handle timestamp values?
>
> Thanks,
>
> Alessandro
>
> 1)
> scala> sqlContext.sql("SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0)").collect
> java.lang.RuntimeException: [1.55] failure: ``)'' expected but 16 found
>
> SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0)
>                                                       ^
>         at scala.sys.package$.error(package.scala:27)
>         at org.apache.spark.sql.catalyst.SqlParser.apply(SqlParser.scala:60)
>         at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:73)
>         at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:260)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:21)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
>         at $iwC$$iwC$$iwC.<init>(<console>:28)
>         at $iwC$$iwC.<init>(<console>:30)
>         at $iwC.<init>(<console>:32)
>         at <init>(<console>:34)
>         at .<init>(<console>:38)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:616)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:624)
>         at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:629)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:954)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> 2)
> sqlContext.sql("SELECT * FROM Logs as l where l.timestamp="+date2timestamp(formatTime3.parse("2012-10-08 16:10:36.0"))).collect
> java.lang.RuntimeException: [1.54] failure: ``UNION'' expected but 16 found
>
> SELECT * FROM Logs as l where l.timestamp=2012-10-08 16:10:36.0
>                                                      ^
>         at scala.sys.package$.error(package.scala:27)
>         at org.apache.spark.sql.catalyst.SqlParser.apply(SqlParser.scala:60)
>         at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:73)
>         at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:260)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
>         at $iwC$$iwC$$iwC.<init>(<console>:30)
>         at $iwC$$iwC.<init>(<console>:32)
>         at $iwC.<init>(<console>:34)
>         at <init>(<console>:36)
>         at .<init>(<console>:40)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:616)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:624)
>         at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:629)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:954)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
