Looks like there are several issues.  First, there is a parse exception,
probably from a Hive statement you didn't include here.

org.apache.hadoop.hive.ql.parse.ParseException: line 1:0 cannot recognize
> input near 'conf' '.' 'set'
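
That error usually means a Java-style conf.set(...) call made its way into
the Hive CLI; HiveQL only accepts the SET command for session configuration.
A minimal sketch (the property shown is just an illustration):

```sql
-- HiveQL session configuration uses SET, not conf.set(...);
-- any Hive/Hadoop property can be set this way for the session.
SET hive.cli.print.header=true;
```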


As for the map-reduce exception from your Hive query, you can get more
information in the task logs, but it looks like the task was killed because
it timed out.  Maybe your machine is low on resources and you need to bump
mapred.task.timeout.

Task attempt_201404092012_0138_m_000000_3 failed to report status for 600
> seconds. Killing!
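
If you do need to raise the timeout, you can set it per-session before
rerunning the query; the value is in milliseconds, and 1200000 (20 minutes)
below is just an example:

```sql
-- Default task timeout is 600000 ms (the 600 seconds in the log);
-- raise it for this session only.
SET mapred.task.timeout=1200000;
```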


Hope that helps,
Szehon


On Wed, May 14, 2014 at 1:21 AM, Audi Lee ( 李坤霖 )
<audi...@taiwanmobile.com> wrote:

>   Hi~
>
> When I run a Hive statement (select * from lab.ec_web_log limit 100), I
> got an error.
>
> What should I do to fix it?
>
> Thanks for your help!
>
>
>
> Lab.ec_web_log create statement:
>
> CREATE external TABLE lab.ec_web_log (
>   host STRING, ipaddress STRING, identd STRING, user STRING, finishtime STRING,
>   requestline STRING, returncode INT, size INT, getstr STRING, retstatus INT,
>   v_P03_1 STRING, v_P04 STRING, v_P06 STRING, v_P08 STRING, v_P09 STRING,
>   v_P10 STRING, v_P11 STRING, v_P12 STRING, v_P13 STRING, v_P14 STRING,
>   v_P15 STRING, v_P16 STRING, v_P17 STRING, v_P18 STRING, v_P19 STRING,
>   v_P20 STRING)
> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe'
> WITH SERDEPROPERTIES (
>   'serialization.format'='org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol',
>   'quote.delim'='("|\\[|\\])',
>   'field.delim'=' ',
>   'serialization.null.format'='-')
> STORED AS TEXTFILE
> LOCATION '/user/audil/weblog/';
>
>
>
> Web log format:
>
> xxx.xxxx.com xxx.xxx.xxx.xxx - - [04/May/2014:23:59:59 +0800] 1 1248214
> "GET
> /buy/index.php?action=product_detail&prod_no=P0000200382387&prod_sort_uid=3304
> HTTP/1.1" 200 30975 "202.39.48.37" "-" "Mozilla/5.0 (Windows NT 6.1; WOW64)
> AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.66 Safari/537.36"
> "-"
>
>
>
> Error List:
>
> 2014-05-14 13:55:07,751 WARN  snappy.LoadSnappy
> (LoadSnappy.java:<clinit>(36)) - Snappy native library is available
>
> 2014-05-14 15:01:24,303 WARN  mapred.JobClient
> (JobClient.java:copyAndConfigureFiles(746)) - Use GenericOptionsParser for
> parsing the arguments. Applications should implement Tool for the same.
>
> 2014-05-14 15:42:09,652 ERROR exec.Task
> (SessionState.java:printError(410)) - Ended Job = job_201404092012_0138
> with errors
>
> 2014-05-14 15:42:09,655 ERROR exec.Task
> (SessionState.java:printError(410)) - Error during job, obtaining debugging
> information...
>
> 2014-05-14 15:42:09,656 ERROR exec.Task
> (SessionState.java:printError(410)) - Job Tracking URL:
> http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201404092012_0138
>
> 2014-05-14 15:42:09,659 ERROR exec.Task
> (SessionState.java:printError(410)) - Examining task ID:
> task_201404092012_0138_m_000002 (and more) from job job_201404092012_0138
>
> 2014-05-14 15:42:09,878 ERROR exec.Task
> (SessionState.java:printError(410)) -
>
> Task with the most failures(4):
> -----
> Task ID:
>   task_201404092012_0138_m_000000
>
> URL:
> http://hdp001-jt:50030/taskdetails.jsp?jobid=job_201404092012_0138&tipid=task_201404092012_0138_m_000000
> -----
> Diagnostic Messages for this Task:
> Task attempt_201404092012_0138_m_000000_3 failed to report status for 600
> seconds. Killing!
>
>
>
> 2014-05-14 15:42:09,900 ERROR ql.Driver
> (SessionState.java:printError(410)) - FAILED: Execution Error, return code
> 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>
> 2014-05-14 15:56:30,759 ERROR ql.Driver
> (SessionState.java:printError(410)) - FAILED: ParseException line 1:0
> cannot recognize input near 'conf' '.' 'set'
>
>
>
> org.apache.hadoop.hive.ql.parse.ParseException: line 1:0 cannot recognize
> input near 'conf' '.' 'set'
>
>
>
>         at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:193)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:418)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:756)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
