ODBC querying issues - Can only see files in Drill Explorer, not with other clients

2017-06-15 Thread Jack Ingoldsby
Hi, I'm using embedded Drill on Windows to connect to S3, but am having issues querying via ODBC. The ODBC connection works (connection string below): CastAnyToVarchar=true;Catalog=s3citibike;Schema=default;HandshakeTimeout=5;QueryTimeout=180;TimestampTZDisplayTimezone=local;NumberOfPrefetchBuffers=5;StringColum
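A quick way to check whether the same objects are reachable over ODBC outside Drill Explorer is to list and query them with fully qualified names from any SQL client. A minimal sketch, assuming the s3citibike plugin from the connection string above; the CSV file name is hypothetical, not from this thread:

    SHOW FILES IN s3citibike.`default`;
    -- File name below is illustrative; substitute one that Drill Explorer shows.
    SELECT * FROM s3citibike.`default`.`201705-citibike-tripdata.csv` LIMIT 10;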

Re: SIGSEGV error - StubRoutines::jlong_disjoint_arraycopy

2017-06-15 Thread Kunal Khatua
I'm skeptical that simply switching to HDFS as the target storage will prevent the segfault. At most it'll help you avoid the need for FTPing (which is a saving anyway, IMHO). You might need to increase the memory allocation for Drill. See if JConsole can reveal whether the heap memory hits a limit.
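Heap for the Drillbit JVM is normally raised via DRILL_HEAP in conf/drill-env.sh. As a related sketch, the per-query memory ceiling can also be raised from a SQL session; the option name is a standard Drill option, but the 4 GB value below is only illustrative:

    -- Value is in bytes (4 GB here); purely illustrative.
    ALTER SYSTEM SET `planner.memory.max_query_memory_per_node` = 4294967296;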

Re: Increasing store.parquet.block-size

2017-06-15 Thread Khurram Faraaz
Thanks Padma. From: Padma Penumarthy Sent: Thursday, June 15, 2017 8:58:44 AM To: user@drill.apache.org Subject: Re: Increasing store.parquet.block-size Sure. I will check and try to fix them as well. Thanks, Padma > On Jun 14, 2017, at 3:12 AM, Khurram Faraaz
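For reference, the option named in the subject can be changed per session. A minimal sketch; the 1 GB value is only illustrative:

    -- Block size is in bytes (1 GB here); value is illustrative.
    ALTER SESSION SET `store.parquet.block-size` = 1073741824;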

RE: SIGSEGV error - StubRoutines::jlong_disjoint_arraycopy

2017-06-15 Thread Lee, David
Yeah. It only crashes on the larger JSON files. Reworking my python script to use hdfs.tmp instead of dfs.tmp now.. -Original Message- From: Kunal Khatua [mailto:kkha...@mapr.com] Sent: Thursday, June 15, 2017 10:52 AM To: user@drill.apache.org Subject: RE: SIGSEGV error - StubRoutines::
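A minimal sketch of the kind of CTAS conversion being moved from dfs.tmp to hdfs.tmp; the table name and source JSON path are hypothetical, and an hdfs storage plugin with a tmp workspace is assumed:

    -- CTAS writes Parquet by default (store.format); names and paths are hypothetical.
    CREATE TABLE hdfs.tmp.`trips_parquet` AS
    SELECT * FROM dfs.`/data/json/trips_2017.json`;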

Re: QUESTION: Drill Configuration to access S3 buckets

2017-06-15 Thread Jack Ingoldsby
Was able to connect to N. Virginia, thanks. But to be able to use Drill as a standard tool, I would need to be able to connect to all regions, of course. On Thu, Jun 15, 2017 at 7:20 AM, Jack Ingoldsby wrote: > Thx for this. Sounds like a combination of AWS/Drill factors. > Are we likely to add

RE: SIGSEGV error - StubRoutines::jlong_disjoint_arraycopy

2017-06-15 Thread Kunal Khatua
Nope.. not seen this before. Can you share more details of the log messages, etc.? The problem might have to do with the JSON files being very large, because the segmentation fault that crashed the JVM (Drillbit) appears to have occurred during the write of the Parquet files. I take it you are

SIGSEGV error - StubRoutines::jlong_disjoint_arraycopy

2017-06-15 Thread Lee, David
Starting last week we began seeing the error below, which terminates the Drill service pid.. My research suggests that it is a space issue with /tmp, but we have plenty of free space.. I'm using dfs.tmp to convert JSON files (2 to 3 gig each) into Parquet.. Anyone encounter this issue before

Re: QUESTION: Drill Configuration to access S3 buckets

2017-06-15 Thread Jack Ingoldsby
Thx for this. Sounds like a combination of AWS/Drill factors. Are we likely to address the Drill side in a subsequent release? On Thu, Jun 15, 2017, 01:39 Uwe L. Korn wrote: > The current Drill releases use the hadoop-io libraries from the 2.7.x > series. Locally I have built against the 3.0.0 a