> >Nicholas Szandor Hakobian
> >Data Scientist
> >Rally Health
> >nicholas.hakob...@rallyhealth.com
> >
> >On Thu, Feb 4, 2016 at 12:57 PM, Dave Nicodemus
> ><dave.nicode...@gmail.com> wrote:
> >> Thanks Nick,
> >>
> it to have a table name alias so it can be referenced in an outer statement.
>
> -Nick
>
> Nicholas Szandor Hakobian
> Data Scientist
> Rally Health
> nicholas.hakob...@rallyhealth.com
>
> On Thu, Feb 4, 2016 at 11:28 AM, Dave Nicodemus
> <dave.nicode...@gmail.
are integer data type
Does anyone know if this is a known issue and whether it's fixed someplace?
Thanks,
Dave
Stack
Caused by: java.lang.NullPointerException: Remote
java.lang.NullPointerException: null
at
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPresent
Of the various ORC
files being used in the query, some are DICTIONARY_V2 and some are DIRECT_V2
encoded, depending on the data for column 584.
We can work around it by disabling hive.vectorized.execution.enabled.
Has anyone else experienced anything similar?
Thanks
-Dave
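For anyone hitting the same thing: the workaround mentioned above is a session-level setting, so it can be applied without changing cluster defaults. A minimal sketch (table name is hypothetical):

```sql
-- Disable vectorized execution for this session only, as a workaround
-- for the vectorization error described above.
SET hive.vectorized.execution.enabled=false;

-- Re-run the affected query; it will use the row-mode execution path.
SELECT COUNT(*) FROM my_orc_table;  -- my_orc_table is a placeholder name
```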
/orc/files/col1=val1/col2=val2';
Thanks - Dave
On Fri, 13 Nov 2015 at 11:59 patcharee <patcharee.thong...@uni.no> wrote:
> Hi,
>
> It works with non-partitioned ORC, but does not work with (2-column)
> partitioned ORC.
>
> Thanks,
> Patcharee
>
>
>
> On
this error
before? Is it fixed in a later version? I've included reproduction steps
below.
Thanks - Dave
*Create a sample text file*
echo a1,b1 part
*Create a textfile table and load the data into it*
CREATE TABLE t1 (a STRING, b STRING)
PARTITIONED BY (c STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED
to do that I've missed?
Regards,
Dave
://metacpan.org/release/Thrift-API-HiveClient2
--
Best wishes,
Dave Cardwell.
http://davecardwell.co.uk/
On 14 May 2013 11:35, Dave Cardwell d...@davecardwell.co.uk wrote:
I wrote a few reporting scripts in Perl that connected to Hive via the
Thrift interface.
Since we upgraded CDH to 4.2.0
() without an argument, the TCLIService
module itself complains that it cannot create a TOpenSessionResp object
because the class is not loaded.
I have attached example code. Can anyone advise me on how to get past this
block?
--
Best wishes,
Dave Cardwell.
http://davecardwell.co.uk
Hi guys,
trying to calculate the dwell time of pages in a weblog. In Oracle we would
use the LEAD analytic function to find the next row for a particular cookie.
What is the best approach for Hive?
Thanks
Dave
Dave Houston
r...@crankyadmin.net
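For the dwell-time question above: Hive has supported LEAD as a windowing function since 0.11, so the Oracle approach carries over directly. A sketch, assuming a hypothetical weblog table with the cookie, page, and a Unix-epoch BIGINT view time:

```sql
-- Dwell time per page view: seconds until the same cookie's next view.
-- weblog(cookie_id, page, view_time) is an assumed schema;
-- view_time is epoch seconds stored as BIGINT.
SELECT
  cookie_id,
  page,
  LEAD(view_time) OVER (PARTITION BY cookie_id ORDER BY view_time)
    - view_time AS dwell_seconds
FROM weblog;
```

The last page view per cookie gets a NULL dwell_seconds, since there is no next row in the partition.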
regexp_extract(event_list, '\d+') = 239;
is what I have at the minute, but it always returns 0 rows loaded to
video_plays_for_sept.
Many thanks
Dave Houston
r...@crankyadmin.net
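A common cause of zero matches with regexp_extract is backslash escaping: in a HiveQL string literal the backslash itself must be escaped, so the regex \d+ has to be written '\\d+'. regexp_extract also takes a group index (0 = whole match) and returns a STRING, so compare against a string. A sketch against an assumed source table:

```sql
-- '\d+' in a HiveQL literal loses its backslash; write '\\d+' instead.
-- Group index 0 extracts the whole match; the result is a STRING.
SELECT *
FROM weblog                                  -- hypothetical source table
WHERE regexp_extract(event_list, '\\d+', 0) = '239';
```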
DELIMITED FIELDS TERMINATED BY '\t'
STORED AS INPUTFORMAT
'com.example.mapreduce.input.TextFileInputFormatIgnoreSubDir'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION '/data/test/users';
Hope this saves someone else the trouble of figuring it out...
-Dave
On Thu, Aug
look at files?
Thanks in advance,
-Dave
(instead of 32), but I still get 32 teeny output files. Setting the
various hive.merge.* options does not seem to have any effect.
Is there something else I should be doing to get the output to be in one
large file instead of 32 small ones?
--
Dave Brondsema
Lead Software Engineer
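For the small-output-files question above, the usual knobs are the hive.merge.* family; a sketch of settings to try, with illustrative byte values:

```sql
-- Merge small output files at the end of map-only and map-reduce jobs.
SET hive.merge.mapfiles=true;
SET hive.merge.mapredfiles=true;
-- Target size for merged files, and the average-size threshold below
-- which Hive launches an extra merge job (values are illustrative).
SET hive.merge.size.per.task=256000000;
SET hive.merge.smallfiles.avgsize=128000000;
```

Note that hive.merge.mapredfiles defaults to false in older releases, which is a frequent reason the merge never fires for map-reduce output.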
(ProcessImpl.java:65)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
... 16 more
2011-03-01 14:46:13,784 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task
--
Dave Brondsema
Software Engineer
Geeknet
www.geek.net
/Hive/LanguageManual/UDF). Possibly use of a
Map type would be best.. not sure.
HTH
Dave Viner
On Tue, Mar 1, 2011 at 4:33 AM, Cam Bazz camb...@gmail.com wrote:
Hello,
Now I would like to count impressions per item. To achieve this, I
made a logger, for instance when the user goes
--
shangan
--
Dave Brondsema
Software Engineer
Geeknet
www.geek.net
jobs = 2. However, after generating the
lots-of-small-files table, Hive says:
Ended Job = job_201011021934_1344
Ended Job = 781771542, job is filtered out (removed at runtime).
Is there a way to force the merge, or am I missing something?
--Leo
--
Dave Brondsema
Software
of getCombineFileInputFormat into
Hadoop18Shims?
On Wed, Nov 10, 2010 at 4:31 PM, yongqiang he heyongqiang...@gmail.com wrote:
I think the problem was solved in hive trunk. You can just try hive trunk.
On Wed, Nov 10, 2010 at 10:05 AM, Dave Brondsema dbronds...@geek.net
wrote:
Hi, has there been any
I copied Hadoop19Shims' implementation of getCombineFileInputFormat
(HIVE-1121) into Hadoop18Shims and it worked, if anyone is interested.
And hopefully we can upgrade our Hadoop version soon :)
On Fri, Nov 12, 2010 at 12:44 PM, Dave Brondsema dbronds...@geek.net wrote:
It seems that I can't