Thanks for your reply; I was able to fix the issue by setting:
set store.mongo.bson.record.reader = false;
On 12/07/2016 08:28 PM, Chunhui Shi wrote:
The length of a UTF-8 encoded byte array is not guaranteed to be the same as
String.length(). A fix should be in BsonRecordReader.writeString().
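As a quick illustration of why the two lengths can differ (a minimal sketch, not Drill code): Java's String.length() counts UTF-16 code units, while UTF-8 uses more than one byte for any non-ASCII character, so the encoded byte count can exceed the character count.

```python
s = "café"                    # 4 characters
utf8 = s.encode("utf-8")      # 'é' encodes to 2 bytes in UTF-8
print(len(s), len(utf8))      # 4 5
```

Any code that allocates or writes a string buffer using the character count instead of the encoded byte count will truncate or corrupt such values.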
On We
Try this:
set store.mongo.bson.record.reader = false;
On 12/08/2016 12:41 AM, Dana Jin wrote:
Hello guys,
I am having trouble querying MongoDB through Drill, so I’d like to ask for some
help. :)
When I query Drill like:
SELECT _id
FROM ……mongodb……
LIMIT 1;
The result looks like:
[cid:A
Hello
I want to know how and where to configure the HTTPD storage plugin. How does
it work?
Can I convert my IP address to a location/country using the httpd plugin?
--
---
Thanks & Regards
Sanjiv Kumar
Hello guys,
I am having trouble querying MongoDB through Drill, so I’d like to ask for some
help. :)
When I query Drill like:
SELECT _id
FROM ……mongodb……
LIMIT 1;
The result looks like:
[cid:AF8CBBA2-744E-44C9-807A-D752D5461BEF]
The value should be like ‘540f9b7eee6da146f690166c’
Could an
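The expected 24-character hex value suggests the _id is a standard 12-byte BSON ObjectId being displayed as raw binary instead of as its hex string. A minimal sketch of that relationship, using the value quoted in the email:

```python
oid_hex = "540f9b7eee6da146f690166c"   # expected display value from the email
raw = bytes.fromhex(oid_hex)           # a BSON ObjectId is 12 raw bytes
print(len(raw))                        # 12
print(raw.hex() == oid_hex)            # True
```

So the reader needs to hex-encode the 12 ObjectId bytes when materializing the column, rather than passing them through as binary.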
I am not able to reproduce your issue, at least with your one sample record.
Reproduction steps:
(1) from mongodb, display your sample record:
>db.kath.find().pretty();
{
"_id" : ObjectId("58402ad5757d7fede822e641"),
"rule_list" : [
"x",
"(contains:x(con
The length of a UTF-8 encoded byte array is not guaranteed to be the same as
String.length(). A fix should be in BsonRecordReader.writeString().
On Wed, Dec 7, 2016 at 3:11 AM, yousuf wrote:
>
> Hi
>
> I'm currently exploring Apache Drill, running in cluster mode. My
> datasource is MongoDB. My dat
Hi Stefán,
Yes, thanks, I know about the CTAS possibility and it works fine. And much
faster than a direct JSON read.
I'm looking for a way to load batch data from other sources. For
example from Kafka Connect Sink module.
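One common approach for batch-loading JSON (a sketch under assumptions, not something the thread confirms): since Drill can query a directory of JSON files directly, a batch producer can append newline-delimited JSON files into a workspace directory that Drill's dfs plugin points at. The directory, file name, and record shape below are all hypothetical.

```python
import json
import os
import tempfile

records = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]  # hypothetical batch

# Write one newline-delimited JSON file per batch into a Drill-visible directory.
out_dir = tempfile.mkdtemp()                      # stand-in for the workspace path
path = os.path.join(out_dir, "batch-0001.json")
with open(path, "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Drill could then query the directory as a table, e.g.:
#   SELECT * FROM dfs.`/path/to/dir`;
print(sum(1 for _ in open(path)))                 # 2
```

Each new batch lands as a fresh file, so queries over the directory pick up the data with no load step; CTAS into Parquet can then compact it later.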
On Wed, Dec 7, 2016 at 4:33 PM, Stefán Baxter wrote:
> Hi Alexander,
>
>
Alexander -
When I have something like this, especially when the output will be
extremely large, I use CTAS into Parquet files. That said, I think you are
more looking at the ETL process for JSON. So, ignoring the CTAS to Parquet
for now, if you have a bunch of JSON files that will be loaded
incr
Hi Alexander,
Drill allows you to both a) query the data directly in JSON format and b)
convert it to Parquet (have a look at the CTAS function).
Hope that helps,
-Stefán
On Wed, Dec 7, 2016 at 1:08 PM, Alexander Reshetov <
alexander.v.reshe...@gmail.com> wrote:
> Hello,
>
> I want to load batch
(Also Alla, you should tell your webdevs that your company's website
doesn't render properly on a 4k screen ;) )
On Wed, Dec 7, 2016 at 9:07 AM, Tom Barber wrote:
> Not to suggest something silly, but having seen stuff like this 100's of
> times... are you sure you're not over engineering the so
Hello,
I want to load batches of unstructured data into Drill, mostly JSON data.
Is there any batch API or other option to do so?
Thanks.
Hi
I'm currently exploring Apache Drill, running in cluster mode. My
datasource is MongoDB. My datasource table contains 5 million documents. I
can't execute a simple query:
|select body from mongo.twitter.tweets limit 10;|
*Throwing exception*
|QueryFailed:AnErrorOccurredorg.apache.drill.c
Is your data the same on 1.6.0 and 1.9.0, or did your data change by any
chance when/after upgrading from 1.6.0 to 1.9.0 ?
On Wed, Dec 7, 2016 at 12:22 PM, Pratik Khedkar <
pratik.khed...@games24x7.com> wrote:
> Hi Team,
>
> I am getting the below error after upgrading Drill from 1.6 to 1.9.
> Workin
Not to suggest something silly, but having seen stuff like this hundreds of
times... are you sure you're not over-engineering the solution because it's
hipster, not because it's actually the best way to do things? :)
Cheers
Tom
On Wed, Dec 7, 2016 at 7:29 AM, WeiWan wrote:
> Hi Alla,
>
> I’m not the