Re: Query Compilation error with 80+ CASE statements

2019-02-27 Thread Arjun kr
Rahul, You can try setting the system option 'exec.java.compiler.exp_in_method_size' to a lower value than the default of 50, if you haven't tried already, and see if the query succeeds. alter session set `exec.java.compiler.exp_in_method_size` = ; Thanks, Arjun From: Abh
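A minimal sketch of the suggestion above; the value 10 is only an illustration and should be tuned for the failing query:

alter session set `exec.java.compiler.exp_in_method_size` = 10;
-- re-run the failing query; reset afterwards if the change does not help
alter session reset `exec.java.compiler.exp_in_method_size`;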

Re: HDFS storage prefix returning Error: VALIDATION ERROR: null

2019-02-13 Thread Arjun kr
Just wanted to confirm the name node URI. Can you verify that 8020 is your namenode IPC port? Maybe you can run 'hadoop fs -ls hdfs://host18-namenode:8020/tmp' and verify it? From: Abhishek Girish Sent: Tuesday, February 12, 11:37 PM Subject: Re
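If a Hadoop client is available on the node, the configured namenode URI can also be checked directly; a hedged sketch, reusing the host name from the thread:

hdfs getconf -confKey fs.defaultFS
hadoop fs -ls hdfs://host18-namenode:8020/tmp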

Re: DoY - PAM Auth only reading YARN account.

2019-01-25 Thread Arjun kr
Is the DoY user (used to start the DoY client) part of the shadow group, in case you are using PAM with /etc/passwd for authentication? If not, can you try adding it and see if it helps? From the doc: If you use PAM with /etc/passwd for authentication, verify that the users with permission to start the Drill proc
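A hedged illustration of the check and the fix; the user name 'mapruser' is only an example and the group name may differ by distribution:

# confirm current group membership
id mapruser
# add the user to the shadow group
sudo usermod -aG shadow mapruser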

Re: Drill connection using jdbc

2019-01-09 Thread Arjun kr
a.org <http://www.sidra.org/> On 1/9/19, 10:39 AM, "Arjun kr" wrote: Are you missing the drill zk directory in the connection URL? By default, its value is drill. jdbc:drill:zk=<zk-host>:<port>[,<zk-host>:<port>...]/<drill-directory>;[schema=<storage-plugin>] https://drill.apache.org/docs/using-the-jdbc-driver/

Re: Drill connection using jdbc

2019-01-08 Thread Arjun kr
Are you missing the drill zk directory in the connection URL? By default, its value is drill. jdbc:drill:zk=<zk-host>:<port>[,<zk-host>:<port>...]/<drill-directory>;[schema=<storage-plugin>] https://drill.apache.org/docs/using-the-jdbc-driver/
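An illustrative URL following that template; the host names are examples, 'drill' is the default ZooKeeper directory, and 'drillbits1' is the default cluster-id:

jdbc:drill:zk=zk1.example.com:2181,zk2.example.com:2181/drill/drillbits1;schema=dfs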

Re: how to set drill planner parameter in drill-override.conf

2018-11-09 Thread Arjun kr
This option can be set at the system or session level using the ALTER SYSTEM/SESSION SET command. The sys.options table lists the options that you can set at the system or session level. https://drill.apache.org/docs/planning-and-execution-options/ Thanks, Arjun
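A short sketch of both approaches; the option name is only an example of a planner option:

select * from sys.options where name = 'planner.width.max_per_node';
alter session set `planner.width.max_per_node` = 8;
alter system set `planner.width.max_per_node` = 8;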

Re: Join tables from different databases returns empty result

2018-11-06 Thread Arjun kr
Does it return any result if you query the individual tables with the corresponding filter applied in Drill? Thanks, Arjun From: Khurram Faraaz Sent: Wednesday, November 7, 2018 5:34 AM To: user@drill.apache.org Cc: om...@intertrust.com Subject: Re: Join tables

Re: Hbase tables in apache drill not showing up

2018-10-29 Thread Arjun kr
Do you have a Zookeeper service running on the Drillbit nodes? Try the below command from a drillbit node. echo ruok | nc localhost 2181 Please make sure the Zookeeper connection settings in the storage plugin definition are in sync with the settings in hbase-site.xml used by the HBase services. Thanks, Arjun
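For reference, a hedged sketch of an HBase storage plugin definition; the quorum hosts are examples and must match what hbase-site.xml uses:

{
  "type": "hbase",
  "config": {
    "hbase.zookeeper.quorum": "zk1.example.com,zk2.example.com,zk3.example.com",
    "hbase.zookeeper.property.clientPort": "2181"
  },
  "size.calculator.enabled": false,
  "enabled": true
}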

Re: Login Failure investigation

2018-08-07 Thread Arjun kr
Do you see any authentication failure message in the /var/log/secure file? You may also use the pamtester utility to check whether you see similar behavior. # to check the sudo profile for user 'user1' pamtester -v sudo 'user1' 'authenticate' http://pamtester.sourceforge.net

Re: difference between cast as timestamp and to_timestamp

2018-07-03 Thread Arjun kr
The TO_TIMESTAMP function accepts an epoch timestamp in seconds, whereas a cast to TIMESTAMP seems to expect the value in milliseconds. 0: jdbc:drill:> select TO_TIMESTAMP(1530601200049/1000) from (values(1)); ++ | EXPR$0 | ++ | 2018-07-0
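A side-by-side sketch of the two behaviors described in the thread; the sample epoch values correspond to 2018-07-03 07:00 UTC (the displayed value depends on the server timezone), and the cast behavior is assumed to match what is reported above:

-- epoch seconds via TO_TIMESTAMP
select TO_TIMESTAMP(1530601200) from (values(1));
-- epoch milliseconds via cast, per the observation above
select cast(1530601200049 as timestamp) from (values(1));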

Re: How to dynamically add months in Apache drill

2018-07-03 Thread Arjun kr
One way to do this is given below. This requires an interval expression to be passed to the DATE_ADD function. 0: jdbc:drill:> select DATE_ADD(date '2015-05-15', interval '1' month) with_literal, DATE_ADD(date '2015-05-15', cast(concat('P',val,'M') as interval month)) with_column_value from (sel
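A completed sketch of the same idea, since the original query is cut off above; the inline source of the month value ('3') is only an illustration:

select DATE_ADD(date '2015-05-15', interval '1' month) as with_literal,
       DATE_ADD(date '2015-05-15', cast(concat('P', val, 'M') as interval month)) as with_column_value
from (select '3' as val from (values(1)));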

Re: How to Start Drill Service From Linux Non-root user

2018-05-31 Thread Arjun kr
If it's a multi-node drillbit cluster, do you have this path/mount available on all drillbit nodes? From: Surneni Tilak Sent: Thursday, May 31, 7:06 AM Subject: RE: How to Start Drill Service From Linux Non-root user To: user@drill.apache.org I

Re: About Drill Timezone Parse

2018-04-06 Thread Arjun kr
Hi Alican, Please see if the syntax below helps. It looks like the hour is in 24-hour format (HH instead of hh). 0: jdbc:drill:schema=dfs> select TO_TIMESTAMP('Mon June 04 12:09:56 EET 2001', 'E MMM dd HH:mm:ss ZZZ ') from (values(1)); ++ | EXPR$0 | +--

Re: JDBC Driver

2018-04-02 Thread Arjun kr
Hi Ravi, Looking at the AWS documentation, it seems it can be specified in the connection URL. I would suggest trying it with a standalone Java application before trying it with Drill, in case you have not done so already. As per the doc, below are the steps involved. You may try it as given below (I have no

Re: hive connection as generic jdbc

2018-03-16 Thread Arjun kr
nk seems to be wrong. It shows text content being executed instead of JSON. Thanks, Arjun From: Asim Kanungo Sent: Friday, March 16, 2018 9:57 PM To: user@drill.apache.org Subject: Re: hive connection as generic jdbc Hi Kunal/Arjun, Yeah I tried from the view as well b

Re: hive connection as generic jdbc

2018-03-15 Thread Arjun kr
t 0:0 [Error Id: d6a2fdf6-7979-4415-8d08-afbcd3667bde on rs-master.redstack.com:31010] (state=,code=0) Please try from your side, and let me know if you are facing the same issue. On Thu, Mar 15, 2018 at 1:49 AM, Arjun kr wrote: > Hi Asim, > > I was able to connect to Hive 1.2 using

Re: hive connection as generic jdbc

2018-03-14 Thread Arjun kr
-1.1.1.jar Thanks, Arjun From: Arjun kr Sent: Wednesday, March 14, 2018 1:05 PM To: user@drill.apache.org Subject: Re: hive connection as generic jdbc Looks like hive-jdbc-1.1.1-standalone.jar has 'slf4j-log4j' bundled. You may try cloning the below rep

Re: hive connection as generic jdbc

2018-03-14 Thread Arjun kr
re is any other jar file I should use and/or if any changes are required. On Tue, Mar 13, 2018 at 1:22 PM, Arjun kr wrote: > Hi Asim, > > > Can you try using the below jar? It looks like from Hive 1.2 onwards, Hive uses > httpclient version 4.4. The previous versions of Hive use httpclient > v

Re: hive connection as generic jdbc

2018-03-13 Thread Arjun kr
streamx - kafka-connect-s3: Ingest data from Kafka to Object Stores (S3) On Tue, Mar 13, 2018 at 12:30 AM, Arjun kr wrote: > Hi Asim, > > > You may give it a shot by adding this uber jar to the Drill 3rd-party > directory (remove the previously copied jars). For the truststore, try giving &

Re: hive connection as generic jdbc

2018-03-12 Thread Arjun kr
SL enabled system so I have to give the extra details in the URL and it worked. Does that mean it should work when adding it as a generic JDBC source? How is this test related to my issue? Thanks Asim On Mon, Mar 12, 2018 at 10:36 PM, Arjun kr wrote: > Hi Asim, > > > You may try using h

Re: hive connection as generic jdbc

2018-03-12 Thread Arjun kr
Hi Asim, You may try using the Hive uber jar in case you have not tried it. See if the below link helps. https://github.com/timveil/hive-jdbc-uber-jar/releases It would be ideal to test this uber jar with a sample JDBC application before trying it with Drill. java -cp "hive-jdbc-uber-2.6.3.0-235.ja
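Since the test command above is cut off, here is one hedged way to exercise the uber jar outside Drill with a generic JDBC shell such as sqlline; the jar versions, host, database, and credentials are placeholders:

java -cp "hive-jdbc-uber-2.6.3.0-235.jar:sqlline-1.9.0-jar-with-dependencies.jar" \
  sqlline.SqlLine -d org.apache.hive.jdbc.HiveDriver \
  -u "jdbc:hive2://hive-host:10000/default" -n <user> -p <password>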

Re: Setting up drill to query AWS S3 behind a proxy

2018-03-12 Thread Arjun kr
Hi Tyler, The Hadoop-AWS module provides settings for proxy setup. You may try setting these configs in $DRILL_CONF/core-site.xml and restarting the drillbits. I have not tested it though. https://hadoop.apache.org/docs/r2.7.1/hadoop-aws/tools/hadoop-aws/index.html fs.s3a.proxy.host Hostname
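A hedged core-site.xml fragment for the properties referenced above; the proxy host and port values are examples:

<property>
  <name>fs.s3a.proxy.host</name>
  <value>proxy.example.com</value>
</property>
<property>
  <name>fs.s3a.proxy.port</name>
  <value>8080</value>
</property>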

Re: Way to "pivot"

2018-03-06 Thread Arjun kr
If each timestamp has only one set of values for (x, y, z), you can try something like the below. select dt, max(case when source='X' THEN `value` else 0.0 end) as X, max(case when source='Y' THEN `value` else 0.0 end) as Y, max(case when source='Z' THEN `value` else 0.0 end) as Z from group by
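The table name is missing from the snippet above, so here is the same pivot written out; the source table `dfs.tmp.`readings`` is only a placeholder for your table or view:

select dt,
       max(case when source = 'X' then `value` else 0.0 end) as X,
       max(case when source = 'Y' then `value` else 0.0 end) as Y,
       max(case when source = 'Z' then `value` else 0.0 end) as Z
from dfs.tmp.`readings`
group by dt;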

Re: Default value for parameter 'planner.width.max_per_node'

2018-02-22 Thread Arjun kr
-exec/src/main/java/org/apache/drill/exec/server/options/TypeValidators.java#L252> It looks like a non-zero value is not being computed. Setting the value as a system variable should persist and correct this. Can you file a JIRA for this? The fix should

Default value for parameter 'planner.width.max_per_node'

2018-02-21 Thread Arjun kr
Hi Team, Did the default value for the configuration parameter 'planner.width.max_per_node' change between the Drill 1.10 and 1.11 versions? Also, does the default value of 70% of total cores still hold true? -- Drill 1.11 0: jdbc:drill:drillbit=localhost> select * from sys.options where name like '%plan
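For reference, a hedged completion of the truncated comparison query (the LIKE pattern is an assumption):

select * from sys.options where name like '%planner.width%';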

Re: Fixed-width files

2018-02-20 Thread Arjun kr
If you have the Hive storage plugin enabled, you can create a Hive table with the regex SerDe and query it in Drill. -- Table contents $ hadoop fs -cat /tmp/regex_test/* 112123 $ -- Hive DDL with regex '(.{1})(.{2})(.{3})' - column1 of width 1, column2 of width 2 and column3 of width 3 CREATE EX
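The DDL is truncated above; a hedged reconstruction using the standard Hive RegexSerDe, with illustrative table, column, and location names:

CREATE EXTERNAL TABLE regex_test (column1 STRING, column2 STRING, column3 STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.RegexSerDe'
WITH SERDEPROPERTIES ("input.regex" = "(.{1})(.{2})(.{3})")
LOCATION '/tmp/regex_test';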

Re: Code too large

2018-02-14 Thread Arjun kr
Hi Anup, You may try setting the configuration option 'exec.java.compiler.exp_in_method_size' to a lower value than the default of 50 and running the query to see if it helps. If even lowering it to a value of 1 doesn't help, the query details and stack trace may be helpful for analysis, as Khurram mentioned. alter

Re: S3 Connection Issues

2018-02-13 Thread Arjun kr
If you have the 'hadoop-aws-2.9.0.jar' jar in the Drill classpath, replace it with the original aws jar that comes with the tarball. The class 'org/apache/hadoop/fs/GlobalStorageStatistics' is not available in the hadoop-common jar - hadoop-common-2.7.1.jar (it was added in 2.8.0). You can try with the original ta
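A quick way to see which Hadoop jars are actually on the classpath; the path assumes a standard tarball layout and may differ in your install:

find $DRILL_HOME/jars -name 'hadoop-*.jar'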

Re: S3 Connection Issues

2018-02-13 Thread Arjun kr
conds) 0: jdbc:drill:schema=dfs> https://docs.aws.amazon.com/general/latest/gr/rande.html Thanks, Arjun ________ From: Arjun kr Sent: Tuesday, February 13, 2018 8:47 PM To: user@drill.apache.org Subject: Re: S3 Connection Issues Hi Anup, Please see if below steps help. 1) Add

Re: S3 Connection Issues

2018-02-13 Thread Arjun kr
Hi Anup, Please see if the below steps help. 1) Add the below option to $DRILL_HOME/conf/drill-env.sh: export DRILLBIT_JAVA_OPTS="$DRILLBIT_JAVA_OPTS -Dcom.amazonaws.services.s3.enableV4=true" 2) Restart the drillbit service and try querying the S3 region. Thanks, Arjun

Re: Unable to setup hive plugin in Drill 1.11.0

2018-02-13 Thread Arjun kr
Hi Anup, As Sorabh mentioned, you seem to be using Hive 2.1.1 jars in the Drill classpath based on the stack trace. Did you build the Drill package with the Hive version customized to 2.1.1, or did you add Hive 2.1.1 jars to the Drill classpath manually? I can see that Drill 1.12 (latest released), 1.11 and 1.10

Re: PCAP files with Apache Drill and Sergeant R

2018-02-06 Thread Arjun kr
Hi Houssem, You should be able to query it using the DFS plugin and the S3 storage plugin (I have not tried it with the S3 plugin though). You can enable the pcap format in the storage plugin definition as given below. "formats": { ..., "pcap": { "type": "pcap" } } Also, it would be best to use Drill 1.1
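Expanded, the relevant part of the plugin definition and a sample query look roughly like this; the file path is an example:

"formats": {
  "pcap": {
    "type": "pcap"
  }
}

select * from dfs.`/tmp/capture.pcap` limit 10;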

Re: Apache drill validation error, table not found

2018-02-06 Thread Arjun kr
Is the failure due to the ORC file format not being supported by the DFS/S3 plugin? This error may come if you are querying an unsupported format or if you don't have the format defined in the corresponding storage plugin definition. Below is a sample execution for a junk format 'thenga' not defined in the storage plugi

Re: convert epoch time stamp to timestamp

2018-01-14 Thread Arjun kr
Looks like you are passing the epoch timestamp value in milliseconds instead of seconds. You can divide by 1000 or remove the last three digits to see if you get the desired result. # Divide by 1000 SELECT TO_TIMESTAMP(1515545336591/1000) FROM (VALUES(1)); ++ |

Re: Illegal Argument Exception while convert unix date format to drill timestamp

2017-12-14 Thread Arjun kr
+ 1 row selected (0.165 seconds) 0: jdbc:drill:schema=dfs> Thanks, Arjun kr From: Divya Gehlot Sent: Thursday, December 14, 2017 9:12 AM To: user@drill.apache.org Subject: Illegal Argument Exception while convert unix date format to drill

Re: Drill Capacity

2017-11-06 Thread Arjun kr
...@2x-vfla6ltfz.png]<https://www.dropbox.com/sh/5akxrzm078jsabw/AADuD92swH6c9jwijTjkkac_a?dl=0> (shared via Dropbox) Thank you! Yun -Original Message- From: Arjun kr [

Re: Drill Capacity

2017-11-06 Thread Arjun kr
Hi Yun, Are you running in Drill embedded mode? If so, the logs will be available in sqlline.log and drillbit.log will not be populated. You can enable DEBUG logging in logback.xml, run the query and share the log file as Paul suggested. Edit $DRILL_HOME/conf/logback.xml to enable DEBUG level
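A hedged sketch of the logback.xml change; the shipped file contains a similar logger block, so only the level and appender reference should need checking against your copy:

<logger name="org.apache.drill" additivity="false">
  <level value="debug"/>
  <appender-ref ref="FILE"/>
</logger>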

Re: Drill Capacity

2017-11-03 Thread Arjun kr
I have seen a use case where the query fails for a single 12 GB json file having the structure '{ "key":[obj1, obj2, obj3..objn]}'. Here the json file has a key element whose value is an array of json objects 'obj'. There were around 175K objects in this array and each obj is again a complex json object with nest

Re: Drill Capacity

2017-11-03 Thread Arjun kr
Hi Yun, Could you please provide more details on your json data structure for the 400 MB json file? Structure 1: '{ "key":[obj1, obj2, obj3..objn]}' Structure 2: [ {obj1},{obj2},..,{objn}] Structure 3: {obj1} {obj2} .. {objn} Thanks, Arjun From: Yun Liu

Re: Apache Drill connection issue in tableau

2017-11-03 Thread Arjun kr
This property is a startup option (boot option) which can be set in drill-override.conf ($DRILL_HOME/conf). You can specify it as highlighted below. drill.exec: { cluster-id: "", zk.connect: "", rpc.user.timeout: 60 } Restart the Drill service once the changes are made on all drillbit nodes. You
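Written out on separate lines, the drill-override.conf fragment would look like this; the cluster-id and ZooKeeper hosts are placeholders and must match your existing values:

drill.exec: {
  cluster-id: "drillbits1",
  zk.connect: "zk1.example.com:2181,zk2.example.com:2181",
  rpc.user.timeout: 60
}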

Re: Apache drill : Error while creating storage plugin for Oracle DB

2017-11-01 Thread Arjun kr
bdcsce/drill/ apache-drill-1.11.0", zk.connect: "<>:<" } Thanks, Arjun From: Arjun kr Sent: Wednesday, November 1, 2017 11:20 AM To: user@drill.apache.org Subject: Re: Apache drill : Error while creating storage plugin for Oracle DB Hi

Re: Apache drill : Error while creating storage plugin for Oracle DB

2017-10-31 Thread Arjun kr
> > Host: <>:8047 > > Accept: */* > > Content-Type: application/json > > Content-Length: 179 > > > < HTTP/1.1 200 OK > < Content-Type: application/json > < Content-Length: 59 > < Server: Jetty(9.1.5.v20140505) > < > { > &quo

Re: Apache drill : Error while creating storage plugin for Oracle DB

2017-10-31 Thread Arjun kr
g Subject: Re: Apache drill : Error while creating storage plugin for Oracle DB Hi Arjun, No error message is getting logged in drillbit.log once I click on the "create" button. Thanks, Akshay On Tue, Oct 31, 2017 at 11:19 PM, Arjun kr wrote: > > Do you see any specific error message in drillbi

Re: Apache drill : Error while creating storage plugin for Oracle DB

2017-10-31 Thread Arjun kr
jdbc driver (ojdbc7.jar) to all the drill nodes and restarted the drillbits on all nodes. Thanks and Regards, Akshay On Tue, 31 Oct 2017 at 10:30 PM, Arjun kr wrote: > Hi Akshay, > > > Did you copy jdbc driver to all the drill nodes and restarted drillbits ? > > >

Re: Apache drill : Error while creating storage plugin for Oracle DB

2017-10-31 Thread Arjun kr
Hi Akshay, Did you copy the JDBC driver to all the Drill nodes and restart the drillbits? Thanks, Arjun From: Akshay Joshi Sent: Tuesday, October 31, 2017 9:46 PM To: user@drill.apache.org Subject: Apache drill : Error while creating storage plugin for Oracle DB
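For context, a hedged sketch of an Oracle JDBC storage plugin definition as it would be entered in the Web UI; the host, service name, and credentials are placeholders:

{
  "type": "jdbc",
  "driver": "oracle.jdbc.OracleDriver",
  "url": "jdbc:oracle:thin:@//db-host.example.com:1521/ORCLPDB1",
  "username": "drill_user",
  "password": "drill_pass",
  "enabled": true
}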

Re: S3 Connection Issues

2017-10-20 Thread Arjun kr
on S3 or I’m missing some config variable in Drill. — C > On Oct 20, 2017, at 14:12, Arjun kr wrote: > > Hi Charles, > > > Any chance you can test s3 connectivity with other tools like hdfs shell or > hive in case you haven't tried already (and these tools available)?

Re: S3 Connection Issues

2017-10-20 Thread Arjun kr
Hi Charles, Any chance you can test S3 connectivity with other tools like the hdfs shell or hive, in case you haven't tried already (and these tools are available)? This may help to identify if it is a Drill-specific issue. For connecting via the hdfs shell, you may try the below command. hadoop fs -Dfs.s3a.acces
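A hedged completion of the truncated command; the property names are standard hadoop-aws settings, and the key and bucket values are placeholders:

hadoop fs -Dfs.s3a.access.key=<ACCESS_KEY> -Dfs.s3a.secret.key=<SECRET_KEY> -ls s3a://<bucket-name>/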

Re: S3 with mixed files

2017-10-20 Thread Arjun kr
Hi Daniel, This error may occur if you don't have a format defined in the S3 storage plugin that handles the ".log" extension. For example: -- I have the file input.csv and have the csv format defined in the s3 storage plugin. 2 rows selected (1.233 seconds) 0: jdbc:drill:schema=dfs> select * from s3.root.`test-dir/inpu
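If the .log files are really delimited text, one option (a hedged sketch, not the only approach) is to map that extension onto a text format in the S3 plugin definition:

"formats": {
  "csv": {
    "type": "text",
    "extensions": ["csv", "log"],
    "delimiter": ","
  }
}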

Re: Exception while reading parquet data

2017-10-11 Thread Arjun kr
Can you try disabling the async parquet reader to see if the problem gets resolved? alter session set `store.parquet.reader.pagereader.async`=false; Thanks, Arjun From: PROJJWAL SAHA Sent: Wednesday, October 11, 2017 2:20 PM To: user@drill.apache.org Subject: Except

Re: Access to Drill 1.9.0

2017-10-07 Thread Arjun kr
Drill ships with the jar 'log4j-over-slf4j.jar'. Even the Drillbit fails to start with the same stacktrace when slf4j-log4j12.jar is added to the Drill classpath. You may need to look into how log4j-over-slf4j.jar is getting added to your classpath. As per the SLF4J documentation, both these jars cannot be presen
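If your application is built with Maven, one hedged way to locate where the conflicting binding comes from is the dependency tree, filtered to the standard SLF4J coordinates:

mvn dependency:tree -Dincludes=org.slf4j:slf4j-log4j12,org.slf4j:log4j-over-slf4j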

Re: How connect to HIVE with LDAP?

2017-09-27 Thread Arjun kr
Hi, Did you try setting it up with another authentication mechanism enabled - say PAM authentication? This would help to identify if the issue is related to LDAP authentication. In my understanding, Drill connects to the Hive metastore, and HiveServer2 authentication may not be relevant here. BTW, what

Re: error accessing sqlline in distributed mode

2017-09-20 Thread Arjun kr
The connection URL should be jdbc:drill:zk=<zk-host>:<port>[,<zk-host>:<port>...] https://drill.apache.org/docs/using-the-jdbc-driver/ Thanks, Arjun From: Divya Gehlot Sent: Thursday, September 21, 2017 9:18 AM To: user@drill.apache.org Subject: error accessing sqlline in distributed mode

Re: Query Error on PCAP over MapR FS

2017-09-14 Thread Arjun kr
> I don’t understand Drill architecture well... > > > If you copy the file to the local file system root directory, does the query > work? > Yes, the query was successful on the local file system. > > Thank you. > > > > 2017/09/14 16:24, mail from Arjun kr: > > > > T

Re: Query Error on PCAP over MapR FS

2017-09-14 Thread Arjun kr
The stack trace shared before shows FileInputStream being invoked from the PcapRecordReader class. Does it work with HDFS/MapR FS, or does it expect only the local file system? >>> Caused by: java.io.FileNotFoundException: /x.pcap (No such file or >>> directory) >>> at java.io.FileInputStream.open(Nat

Re: Query Error on PCAP over MapR FS

2017-09-13 Thread Arjun kr
I have not used the pcap storage format before. Doesn't it require a specific format defined in the storage plugin (like the psv format given below)? "formats": { "psv": { "type": "text", "extensions": [ "psv" ], "delimiter": "|" }, Thanks, Arjun