After many attempts, to build a streaming query application over CSV in
Eclipse I used:
As Maven dependencies:
- commons-io 2.4, commons-logging 1.1.3, commons-lang3 3.2, janino 2.7.6,
  eigenbase-properties 1.1.5, avatica 1.8.0, opencsv 2.3, json-simple 1.1

and as external jars:
- calcite-core 1.9.0, calcite-example-csv 1.9.0, calcite-linq4j 1.9.0, after
  building them with the mvn install command.
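
For reference, here is a minimal sketch of the kind of program this setup is
meant to run (only a sketch: the model path below is an example, and the table
name has to match whatever your own model defines):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.Properties;

public class StreamingCsvQuery {
  public static void main(String[] args) throws Exception {
    // Optional with a JDBC 4 driver, but makes the dependency explicit.
    Class.forName("org.apache.calcite.jdbc.Driver");

    // Point Calcite at the stream model file (example path; adjust to
    // wherever your model-stream-table.json actually lives).
    Properties info = new Properties();
    info.setProperty("model", "/home/hduser/Desktop/model-stream-table.json");

    try (Connection connection = DriverManager.getConnection("jdbc:calcite:", info);
         Statement statement = connection.createStatement();
         // Streaming query over the ORDERS table defined in the model.
         ResultSet resultSet = statement.executeQuery("SELECT STREAM * FROM ORDERS")) {
      ResultSetMetaData metaData = resultSet.getMetaData();
      // A streaming query keeps waiting for new rows, so this loop runs
      // until the stream ends or the program is stopped.
      while (resultSet.next()) {
        StringBuilder row = new StringBuilder();
        for (int i = 1; i <= metaData.getColumnCount(); i++) {
          if (i > 1) {
            row.append(", ");
          }
          row.append(resultSet.getObject(i));
        }
        System.out.println(row);
      }
    }
  }
}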

2016-09-06 21:15 GMT+03:00 Γιώργος Θεοδωράκης <giwrgosrth...@gmail.com>:

> I've tried with avatica, avatica-metrics, standalone-server and server, all
> as version 1.8.0 jars from the Maven repository as dependencies (at first
> only avatica and avatica-metrics, then other combinations), together with
> the 1.9.0-SNAPSHOT versions of calcite-core and calcite-example-csv with
> their sources. At first I got a NoClassDefFoundError for eigenbase-properties,
> so I added version 1.1.5 from the Maven repository. Then I got
> java.lang.NoClassDefFoundError: org/codehaus/commons/compiler/CompileException,
> and I am stuck on this exception after many combinations and attempts.
>
> 2016-09-06 18:46 GMT+03:00 Julian Hyde <jhyde.apa...@gmail.com>:
>
>> That still sounds like a version mismatch. Note that avatica releases are
>> separate from calcite these days. Therefore calcite-core-1.9.0-SNAPSHOT,
>> for instance, depends upon avatica-metrics-1.8.0.
>>
>> I think you should use 1.9.0-SNAPSHOT versions of calcite- jars, and
>> 1.8.0 versions of avatica- jars.
>>
>> Julian
>>
>>
>> > On Sep 6, 2016, at 7:45 AM, Γιώργος Θεοδωράκης <giwrgosrth...@gmail.com>
>> wrote:
>> >
>> > I have imported as external jars calcite-example-csv, calcite-core,
>> > avatica, linq4j, avatica-metrics and avatica-standalone with all their
>> > sources and tests, and I get:
>> >
>> > Exception in thread "main" java.lang.AbstractMethodError:
>> > org.apache.calcite.config.CalciteConnectionProperty.valueClass()Ljava/lang/Class;
>> >   at org.apache.calcite.avatica.ConnectionConfigImpl$PropEnv.getEnum(ConnectionConfigImpl.java:228)
>> >
>> > Is there something else I am missing?
>> >
>> > Also, when I run the streaming test from CsvTest.java and add a filter
>> > to the query, I get an error. Is filtering implemented for streaming
>> > queries defined as a plain SQL string, or should I define the query in
>> > another way?
>> >
>> >
>> > 2016-09-05 21:03 GMT+03:00 Julian Hyde <jh...@apache.org>:
>> >
>> >> You have mismatched versions. CANCEL_FLAG will be added to calcite-core
>> >> only in version 1.9. So, if you are using 1.9.0-SNAPSHOT version of
>> >> example/csv you should use a 1.9.0-SNAPSHOT version of other Calcite
>> jars.
>> >> (You can build and install in your local repo using ‘mvn install’.)
>> >>
>> >> 1.9 should be released in a couple of weeks.
>> >>
>> >> Julian
>> >>
>> >>> On Sep 5, 2016, at 3:04 AM, Γιώργος Θεοδωράκης <
>> giwrgosrth...@gmail.com>
>> >> wrote:
>> >>>
>> >>> It has to do with the dependencies. When I run the sample code, with my
>> >>> changes, as a test in
>> >>> ../calcite-master/example/csv/src/test/java/org/apache/calcite/test/
>> >>> I have no errors. However, when I create my own project in Eclipse,
>> >>> import everything from the Maven repositories (the calcite 1.8 versions)
>> >>> and add calcite-example-csv-1.9.0-SNAPSHOT.jar built from the GitHub
>> >>> version as an external jar, the same error occurs:
>> >>>
>> >>> Exception in thread "main" java.lang.NoSuchFieldError: CANCEL_FLAG
>> >>>   at org.apache.calcite.adapter.csv.CsvScannableTable.scan(CsvScannableTable.java:48)
>> >>>   at org.apache.calcite.interpreter.TableScanNode.createScannable(TableScanNode.java:117)
>> >>>   at org.apache.calcite.interpreter.TableScanNode.create(TableScanNode.java:94)
>> >>>   at org.apache.calcite.interpreter.Nodes$CoreCompiler.visit(Nodes.java:68)
>> >>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >>>   at java.lang.reflect.Method.invoke(Method.java:606)
>> >>>   at org.apache.calcite.util.ReflectUtil.invokeVisitorInternal(ReflectUtil.java:257)
>> >>>   at org.apache.calcite.util.ReflectUtil.invokeVisitor(ReflectUtil.java:214)
>> >>>   at org.apache.calcite.util.ReflectUtil$1.invokeVisitor(ReflectUtil.java:471)
>> >>>   at org.apache.calcite.interpreter.Interpreter$Compiler.visit(Interpreter.java:476)
>> >>>   at org.apache.calcite.interpreter.Interpreter$Compiler.visitRoot(Interpreter.java:433)
>> >>>   at org.apache.calcite.interpreter.Interpreter.<init>(Interpreter.java:75)
>> >>>   at Baz.bind(Unknown Source)
>> >>>   at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:327)
>> >>>   at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:282)
>> >>>   at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:553)
>> >>>   at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:544)
>> >>>   at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:187)
>> >>>   at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:65)
>> >>>   at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
>> >>>   at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:605)
>> >>>   at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:599)
>> >>>   at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:613)
>> >>>   at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:139)
>> >>>   at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:208)
>> >>>   at stream_test.CsvTest.checkSql(CsvTest.java:122)
>> >>>   at stream_test.CsvTest.checkSql(CsvTest.java:76)
>> >>>   at stream_test.TestQuery.main(TestQuery.java:13)
>> >>>
>> >>> Is there anything else I have to import or have I done something
>> wrong?
>> >>>
>> >>> 2016-09-05 1:11 GMT+03:00 Julian Hyde <jhyde.apa...@gmail.com>:
>> >>>
>> >>>> It might be case-sensitivity. Try double-quoting the column names in
>> >>>> your query.
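>> >>>>
>> >>>> For example, a sketch using the names from the model quoted below:
>> >>>>
>> >>>>   SELECT STREAM "productId" FROM ORDERS;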
>> >>>>
>> >>>> Julian
>> >>>>
>> >>>>> On Sep 4, 2016, at 09:43, Γιώργος Θεοδωράκης <
>> giwrgosrth...@gmail.com>
>> >>>> wrote:
>> >>>>>
>> >>>>> I have successfully used sqlline to run queries on a streaming table,
>> >>>>> but now I am having trouble implementing it programmatically in Java.
>> >>>>> I have made an attempt, but have not got it running yet (
>> >>>>> http://stackoverflow.com/questions/39318653/create-a-streaming-example-with-calcite-using-csv
>> >>>>> ).
>> >>>>> Can somebody help me with a template, or by finding what's wrong with
>> >>>>> my code?
>> >>>>>
>> >>>>> Thank you in advance,
>> >>>>> George
>> >>>>>
>> >>>>> 2016-09-03 18:14 GMT+03:00 Γιώργος Θεοδωράκης <
>> giwrgosrth...@gmail.com
>> >>> :
>> >>>>>
>> >>>>>> When I tried a query like SELECT STREAM ss.depts.deptno FROM ss.depts
>> >>>>>> WHERE ss.depts.deptno < 30; it gave me a correct answer on SDEPTS.csv
>> >>>>>> in the sales folder, with both my own JSON model and
>> >>>>>> model-stream-table.json. I only had to be more explicit about where to
>> >>>>>> find the tables and the columns, because with the name alone it
>> >>>>>> wouldn't run. I still haven't fixed my sOrders.csv yet, but I suppose
>> >>>>>> it has to do with how I have created it.
>> >>>>>>
>> >>>>>> 2016-09-03 15:39 GMT+03:00 Γιώργος Θεοδωράκης <
>> >> giwrgosrth...@gmail.com
>> >>>>> :
>> >>>>>>
>> >>>>>>> I am trying to create a simple streaming query (like SELECT STREAM *
>> >>>>>>> FROM ORDERS WHERE units > 10). I have created a stream using a socket
>> >>>>>>> that saves the orders to an sOrders.csv file (a rough sketch of such
>> >>>>>>> a writer follows below), and I have changed model-stream-table.json
>> >>>>>>> like this:
>> >>>>>>> {
>> >>>>>>>   version: '1.0',
>> >>>>>>>   defaultSchema: 'CUSTOM_TABLE',
>> >>>>>>>   schemas: [
>> >>>>>>>     {
>> >>>>>>>       name: 'CUSTOM_TABLE',
>> >>>>>>>       tables: [
>> >>>>>>>         {
>> >>>>>>>           name: 'ORDERS',
>> >>>>>>>           type: 'custom',
>> >>>>>>>           factory: 'org.apache.calcite.adapter.csv.CsvStreamTableFactory',
>> >>>>>>>           stream: {
>> >>>>>>>             stream: true
>> >>>>>>>           },
>> >>>>>>>           operand: {
>> >>>>>>>             file: '/home/hduser/Desktop/sOrders.csv',
>> >>>>>>>             flavor: "scannable"
>> >>>>>>>           }
>> >>>>>>>         }
>> >>>>>>>       ]
>> >>>>>>>     }
>> >>>>>>>   ]
>> >>>>>>> }
>> >>>>>>>
>> >>>>>>> (I changed the schema name because, when I had defaultSchema: 'STREAM',
>> >>>>>>> it showed: Error while executing SQL "SELECT STREAM * FROM orders":
>> >>>>>>> From line 1, column 22 to line 1, column 27: Table 'ORDERS' not found
>> >>>>>>> (state=,code=0).)
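>> >>>>>>>
>> >>>>>>> For completeness, a rough sketch of the socket-to-CSV writer mentioned
>> >>>>>>> above (the class name, port and path are only placeholders; my real
>> >>>>>>> code may differ):
>> >>>>>>>
>> >>>>>>> import java.io.BufferedReader;
>> >>>>>>> import java.io.FileWriter;
>> >>>>>>> import java.io.InputStreamReader;
>> >>>>>>> import java.io.PrintWriter;
>> >>>>>>> import java.net.ServerSocket;
>> >>>>>>> import java.net.Socket;
>> >>>>>>>
>> >>>>>>> public class OrdersSocketToCsv {
>> >>>>>>>   public static void main(String[] args) throws Exception {
>> >>>>>>>     // Accept one connection and append each received line as a CSV
>> >>>>>>>     // record to the file the ORDERS stream table reads from.
>> >>>>>>>     try (ServerSocket server = new ServerSocket(9999);
>> >>>>>>>          Socket socket = server.accept();
>> >>>>>>>          BufferedReader in = new BufferedReader(
>> >>>>>>>              new InputStreamReader(socket.getInputStream()));
>> >>>>>>>          PrintWriter out = new PrintWriter(
>> >>>>>>>              new FileWriter("/home/hduser/Desktop/sOrders.csv", true), true)) {
>> >>>>>>>       String line;
>> >>>>>>>       while ((line = in.readLine()) != null) {
>> >>>>>>>         out.println(line); // each incoming line is already a CSV record
>> >>>>>>>       }
>> >>>>>>>     }
>> >>>>>>>   }
>> >>>>>>> }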
>> >>>>>>>
>> >>>>>>> Everything is fine until I try to project or filter a column, when it
>> >>>>>>> shows: Error while executing SQL "SELECT STREAM productId FROM
>> >>>>>>> orders": From line 1, column 15 to line 1, column 23: Column
>> >>>>>>> 'PRODUCTID' not found in any table (state=,code=0). When I type
>> >>>>>>> !columns I get: CUSTOM_TABLE | ORDERS | productId | 4 |
>> >>>>>>> JavaType(class...
>> >>>>>>>
>> >>>>>>> To solve my problem, should I write CsvStreamFilterableTable.java
>> >>>>>>> etc. from scratch, or are these operations already implemented?
>> >>>>>>>
>> >>>>>>> Thank you in advance,
>> >>>>>>> George
>> >>>>>>
>> >>>>
>> >>
>> >>
>>
>>
>
