HiBench for Spark in CDH error

2017-08-03 Thread Kumar Jayapal
I am not able to run HiBench for Spark on a CDH cluster, and I cannot understand what mistake I am making. Can anyone help? [root@hadoop3-main HiBench]# bin/workloads/micro/wordcount/prepare/prepare.sh bin/workloads/micro/wordcount/prepare /u01/HiBench/bin/workloads/micro/wordcount/prepare

Unable to install CDH 5.7

2016-06-27 Thread Kumar Jayapal
Hi, I am trying to install CDH 5.7 and I get this error. I see that python-psycopg2-2.0.14-2.el6.x86_64.rpm and mod_ssl-2.2.15-53.el6.centos.x86_64.rpm are present in my local repository --> Finished Dependency Resolution Error: Package: cloudera-manager-agent-5.7.0-1.cm570.p0.76.el6.x86_64 (base) R

DBVisualizer configuration with hive in kerberos env

2016-06-22 Thread Kumar Jayapal
Hi All, did anyone integrate DBVisualizer with Hive in a Kerberos (KRB) cluster? I followed the steps given at https://community.hortonworks.com/articles/32586/integrating-dbvisualizer-with-kerberized-hive.html but am still not able to make a connection to Hive. I am using hive 0.13.1+cdh5.3.3+350 2
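A minimal sketch of what the linked article configures, assuming placeholder hostnames and realm (hs2-host.example.com, EXAMPLE.COM): the JDBC URL must carry HiveServer2's own service principal, and the client needs a Kerberos ticket before connecting.

```shell
# Obtain a Kerberos ticket first so the JDBC driver can use GSSAPI
# (user principal is a placeholder).
kinit jay@EXAMPLE.COM

# In DBVisualizer, the connection URL for a kerberized HiveServer2 generally
# takes this shape (host and realm are placeholders):
#   jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/hs2-host.example.com@EXAMPLE.COM
# Note: principal= must be HiveServer2's service principal, not the end user's.
```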

AD with Hadoop....

2016-05-03 Thread Kumar Jayapal
Hello All, when we configure Active Directory with Hadoop, do we have any limitations on user or group naming conventions? How long can they be, and can they contain characters like . or _ #@$%^&*? Can anyone point me to a link where these conventions are defined? Thanks Jay

Unable to connect to Impala shell after updating the cluster to 5.5.1

2016-01-25 Thread Kumar Jayapal
Hi, did anyone have this issue with Impala? I am unable to connect to the Impala shell after updating the cluster to 5.5.1. My cluster uses Kerberos and LDAP for authentication. When I try to connect with the Impala shell, it displays the message "LDAP credentials may not be sent over insecure connections. Enabl
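A hedged sketch of the usual fix: that message means Impala refuses to accept LDAP passwords on a non-TLS channel, so the shell should connect over SSL (hostname and CA path are placeholders).

```shell
# Connect with LDAP auth (-l) over TLS; the daemon must have SSL enabled.
impala-shell -i impalad-host.example.com -l --ssl \
    --ca_cert=/path/to/ca.pem

# For test clusters only (sends credentials in clear text): start impalad with
# --ldap_passwords_in_clear_ok, then connect with
#   impala-shell -i impalad-host.example.com -l --auth_creds_ok_in_clear
```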

Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

2015-12-29 Thread Kumar Jayapal
Hi, when I run this simple Pig script from the Pig editor in Hue, I get a permission-denied error. I can execute queries in Hive as the same user; any idea why? We are using Sentry for authorisation. Here is my Pig script: LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatL
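A hedged sketch, assuming the script is the usual HCatLoader pattern: from the command line, HCatalog-backed Pig scripts need the HCatalog jars and Hive configuration on the classpath, which `-useHCatalog` provides.

```shell
# script.pig contains something like:
#   LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();
pig -useHCatalog script.pig

# With Sentry, a permission-denied from Pig often comes from HDFS-level access
# to the table's warehouse directory, since Pig reads the files directly and
# bypasses the SQL-level grants that Hive queries go through.
```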

How does this work

2015-12-23 Thread Kumar Jayapal
Hi, My environment has Kerbros and Senry for authentication and authorisation. we have the following permission on drwxrwx--- - hive hive */user/hive/warehouse* Now When I login through Hue/Beeline how am able to acccess the data inside this directory. When I dont belong to hive gr

Re: Unable to connect to beeline after kerberos is installed

2015-11-24 Thread Kumar Jayapal
principle > Use "!connect > jdbc:hive2://:/;principal=”. > > > Please make sure it is the hive’s principal, not the user's. > > Restart the affected services. And it should work. > > Regards, > Arpan > On 24 Nov 2015 1:55 am, "Kumar Jayapal" wrote: &g

Unable to connect to beeline after kerberos is installed

2015-11-23 Thread Kumar Jayapal
Unable to connect to beeline after Kerberos is installed; I am getting this error. Let me know if anyone has a resolution for it. *Error: Could not open client transport with JDBC Uri: jdbc:hive2://hiven001.np.mmc.com:1/ : Peer indicated failure: Unsupported mec
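A hedged sketch of the connection that usually resolves this: "Unsupported mechanism" from the peer typically means the client did not negotiate GSSAPI — no valid ticket, or no principal= part in the URL (port and realm below are placeholders; the host is taken from the error above).

```shell
# Get a ticket, then pass HiveServer2's service principal in the JDBC URL.
kinit jay@EXAMPLE.COM
beeline -u "jdbc:hive2://hiven001.np.mmc.com:10000/default;principal=hive/hiven001.np.mmc.com@EXAMPLE.COM"
```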

Hive showing SemanticException [Error 10002]: Line 3:21 Invalid column reference 'mbdate'

2015-10-28 Thread Kumar Jayapal
Hello, can someone please help? When I execute a Hive query with a CASE statement I get this error: "Error while compiling statement: FAILED: SemanticException [Error 10002]: Line 3:21 Invalid column reference 'mbdate'". Here is the query: select a.mbcmpy, a.mbwhse, a.mbdept, a.mbitem, (CASE WHEN t
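A hedged sketch of a common cause and workaround: this error often appears when a column used inside a CASE (or the CASE's alias) is referenced where Hive cannot see it, e.g. in a GROUP BY. Wrapping the CASE in a subquery and grouping on the alias usually compiles (the table name and CASE condition below are placeholders, since the original query is truncated).

```shell
hive -e "
SELECT mbcmpy, mbwhse, mbdept, mbitem, mb_flag, COUNT(*)
FROM (
  SELECT a.mbcmpy, a.mbwhse, a.mbdept, a.mbitem,
         CASE WHEN a.mbdate IS NULL THEN 'N' ELSE 'Y' END AS mb_flag  -- placeholder condition
  FROM some_table a                                                   -- placeholder table
) t
GROUP BY mbcmpy, mbwhse, mbdept, mbitem, mb_flag;
"
```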

hive showing SemanticException [Error 10002]: Line 3:21 Invalid column reference 'mbdate'

2015-10-27 Thread Kumar Jayapal
Hello, can someone please help? When I execute a Hive query with a CASE statement I get this error: "Error while compiling statement: FAILED: SemanticException [Error 10002]: Line 3:21 Invalid column reference 'mbdate'". Here is the query: select a.mbcmpy, a.mbwhse, a.mbdept, a.mbitem, (CASE WHEN t

compress folder in hadoop

2015-08-05 Thread Kumar Jayapal
Hi All, how do I compress a folder in Hadoop? I want to compress a folder that contains old, infrequently used data. How can I do that? When I searched the web I found ways to compress individual files. Can someone help me understand why the files are not in .lzo or .gz format? I am test execut
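A hedged sketch of the two usual options: HDFS has no single "compress this directory" command, so you either archive the folder or rewrite its files compressed (all paths below are placeholders).

```shell
# Option 1: Hadoop archive. Packs many files into one .har, which saves
# namenode objects but does NOT compress the bytes.
hadoop archive -archiveName old.har -p /data/old /data/archived

# Option 2: rewrite the files compressed. For large data this is normally a
# one-off MapReduce/Hive/Pig job with compressed output; for small data it can
# be done locally:
hdfs dfs -get /data/old/part-00000 .
gzip part-00000
hdfs dfs -put part-00000.gz /data/old-gz/
```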

Sqoop export date error

2015-07-27 Thread Kumar Jayapal
Hi, I am trying to export data from HDFS to Oracle; it errors out at a date column. sqoop export -Dmapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" --connect "jdbc:oracle:thin:@lorsastest.kjp.com:1521/testus" --username usertest -P --table "TEST_BI.DATE_DIMENSION" --e
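A hedged sketch of a common fix: date columns exported to Oracle often fail because the text in HDFS does not parse as the JDBC type Sqoop inferred, and forcing the Java type for the column sometimes helps (connection details copied from above; the column name and export dir are placeholders).

```shell
sqoop export \
  --connect "jdbc:oracle:thin:@lorsastest.kjp.com:1521/testus" \
  --username usertest -P \
  --table "TEST_BI.DATE_DIMENSION" \
  --export-dir /path/in/hdfs \
  --map-column-java DATE_COL=java.sql.Timestamp   # DATE_COL is a placeholder

# Also check that the date text in the files matches a parseable format,
# e.g. yyyy-MM-dd HH:mm:ss for timestamps.
```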

How to convert files in HIVE

2015-07-16 Thread Kumar Jayapal
Hi, how can we convert files stored in snappy-compressed Parquet format in Hive to Avro format? Is it possible to do it? Can you please guide me or give me a link that describes the procedure? Thanks Jay
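A hedged sketch: there is no direct file converter in Hive; the usual route is a new table stored as Avro plus an INSERT...SELECT (table names below are placeholders).

```shell
# CTAS with STORED AS AVRO needs a Hive version where the AvroSerDe can derive
# the schema itself (roughly 0.14+); on older versions, create the Avro table
# with an explicit schema first and then INSERT...SELECT into it.
hive -e "
CREATE TABLE mytable_avro STORED AS AVRO AS
SELECT * FROM mytable_parquet;
"
```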

Re: sqoop export query errors out

2015-07-16 Thread Kumar Jayapal
ble TEST_DIMESION is the same schema as TESTUSER, if not try > appending the schema name to the table - .TEST_DIMENSION. > 3. Make sure you are connecting to the right server and service name. > > Thanks, > Ashwin > > > On Thu, Jul 16, 2015 at 5:05 AM, Kumar Jayapal > w

sqoop export query errors out

2015-07-15 Thread Kumar Jayapal
Hi, I see the table in my database, but when I execute an export command I get "table or view does not exist". When I do list-tables I can see the table. Can someone tell me what's wrong? I have DBA privileges. sqoop export --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us" --username " TESTUSER
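A hedged sketch following the advice in the reply above: with Oracle, "table or view does not exist" at export time is often a schema or case issue, so qualify the table with its owner and keep the name upper-case (connection details copied from above; schema, table, and export dir are placeholders).

```shell
sqoop export \
  --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us" \
  --username TESTUSER -P \
  --table "TESTUSER.TEST_DIMENSION" \
  --export-dir /path/in/hdfs
```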

Avro Format output

2015-06-29 Thread Kumar Jayapal
Hello, I have a situation here. I have loaded CSV files into a Hive database in PARQUET format. I want to move this data into a directory in AVRO format; is it possible to do that? I tried the option below but did not succeed. Can someone please help? insert overwrite directory '/tmp/blo' ROW

hive load error

2015-06-29 Thread Kumar Jayapal
Hi, I am trying to insert data into a file from a Parquet table with the following command and am getting this exception: insert overwrite directory '/tmp/blo' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS AVRO select dwhse, dsdat, dreg_num, drn_num, dscnr, dareq, datak, dmsgc from blok whe
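A hedged sketch of a workaround: older Hive releases reject ROW FORMAT/STORED AS clauses on INSERT OVERWRITE DIRECTORY, so an external Avro table pointed at the target directory achieves the same result (column types below are guesses for illustration; only three of the columns from the query above are shown).

```shell
hive -e "
CREATE EXTERNAL TABLE blo_avro (dwhse INT, dsdat STRING, dreg_num INT)  -- placeholder types
STORED AS AVRO
LOCATION '/tmp/blo';

INSERT OVERWRITE TABLE blo_avro
SELECT dwhse, dsdat, dreg_num FROM blok;
"
```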

Fwd: Quest Data Connector for Oracle throws error.

2015-06-25 Thread Kumar Jayapal
Hello All, I have installed the Quest Data Connector for Oracle, but it shows an error while importing data using Sqoop. I am able to import the same data from Oracle when I disable the Quest Data Connector. I have copied the debug logs. I didn't find any blog about resolving this issue. Please let me know

dir permission show ??????

2015-06-09 Thread Kumar Jayapal
Hi All, why does my directory permission show like this? ls -ltr /mtn d? ? ?? ?? drive15-sdq The datanode is showing an error, and I am unable to see device /dev/dsq1. How can I resolve this issue? Please help. 8:47:57.529 PMWARNorg.apache.hadoop.hdfs.server.datanod

Parallel running jobs

2015-06-03 Thread Kumar Jayapal
Hi, I have a question regarding jobs running in YARN. I have SET hive.exec.parallel=true; When I submit this command FROM EMPLOYER_STAGE INSERT OVERWRITE TABLE EMPLOYER PARTITION (FISCAL_YEAR = 2015, FISCAL_PERIOD = 01) SELECT * WHERE FISCAL_YEAR = 2014 AND FISCAL_PERIOD = 08 INSERT OVERWR
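A hedged sketch of how this multi-insert behaves: the FROM...INSERT...INSERT form scans EMPLOYER_STAGE once and fans the rows out to each partition, and hive.exec.parallel=true lets Hive launch independent stages as concurrent jobs (the second branch below is hypothetical, since the original statement is truncated, and SELECT * assumes the non-partition columns line up).

```shell
hive -e "
SET hive.exec.parallel=true;
FROM EMPLOYER_STAGE
INSERT OVERWRITE TABLE EMPLOYER PARTITION (FISCAL_YEAR = 2015, FISCAL_PERIOD = 01)
  SELECT * WHERE FISCAL_YEAR = 2014 AND FISCAL_PERIOD = 08
INSERT OVERWRITE TABLE EMPLOYER PARTITION (FISCAL_YEAR = 2015, FISCAL_PERIOD = 02)  -- hypothetical branch
  SELECT * WHERE FISCAL_YEAR = 2014 AND FISCAL_PERIOD = 09;
"
```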

Re: Unable to drop table in HIVE

2015-05-29 Thread Kumar Jayapal
Hello Ruvi, thanks for your prompt response. I tried adding the jar file to Hive. My cluster is Sentry-enabled, and I get insufficient privileges when I try to add the jar. Thanks Sajid Thanks Jay On Fri, May 29, 2015 at 2:19 PM, P lva wrote: > If you created this table with a custom serde

Unable to drop table in HIVE

2015-05-29 Thread Kumar Jayapal
Hi, *I get this error when I am trying to drop a table in Hive. If any one of you has seen this issue, please help me resolve it.* *2015-05-29 20:29:50,755 INFO org.apache.hadoop.hive.ql.log.PerfLogger: * *2015-05-29 20:29:50,773 INFO org.apache.sentry.binding.hive.conf.HiveAuthzConf: Default

how to use --as-parquetfile in sqoop import

2015-05-27 Thread Kumar Jayapal
Hi, can I use the --as-parquetfile argument while importing data to Hive? I have checked the site https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_basic_usage and I don't see this option mentioned anywhere. Thanks Jay
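A hedged sketch: --as-parquetfile appears in Sqoop 1.4.6 and later (the 1.4.2 docs linked above predate it), and combined with --hive-import it writes the Hive table as Parquet (connection details below are placeholders).

```shell
sqoop import \
  --connect jdbc:mysql://mysql.example.com/sqoop \
  --username sqoop -P \
  --table cities \
  --hive-import \
  --as-parquetfile
```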

OOZIE workflow error

2015-05-13 Thread Kumar Jayapal
My Oozie job is failing with the following error. I looked at the site and tried to change hive-site.xml, but I am still getting the same error and the issue is still not resolved. Has anyone worked on this issue? 2015-05-13 22:29:00,998 WARN org.apache.oozie.action.hadoop.HiveActionExecutor: SERVER[ oozie

[no subject]

2015-05-07 Thread Kumar Jayapal
Can someone please help me? I am running a simple sqoop command to import a table with the split-by option and I am getting this error. Has anyone solved this error before? I searched the site; no resolution so far. sqoop command: sqoop import --connect "jdbc:oracle:thin:@mysql.1521/PR" --username "

Sqoop --split by options

2015-05-06 Thread Kumar Jayapal
Hello All, can I use the split-by option with multiple keys to import the data? Please help and suggest any link. sqoop import \ --connect jdbc:mysql://mysql.example.com/sqoop \ --username sqoop \ --password sqoop \ --query 'SELECT normcities.id, \ countries.country, \ normcities.city \ FROM
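A hedged sketch of the constraint: --split-by takes a single column, so for a composite key the usual workaround is to split on one reasonably uniform column (or a derived expression in the query). Completing the truncated command above with hypothetical join and target-dir details:

```shell
sqoop import \
  --connect jdbc:mysql://mysql.example.com/sqoop \
  --username sqoop -P \
  --query 'SELECT normcities.id, countries.country, normcities.city
           FROM normcities JOIN countries USING (country_id)  -- hypothetical join
           WHERE $CONDITIONS' \
  --split-by normcities.id \
  --target-dir /user/sqoop/cities                             # hypothetical path
```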

Re: how to load data

2015-05-03 Thread Kumar Jayapal
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.access$200(MapOperator.java:127) at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:508) at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:180) ... 8 more Caused by: java.lang Thanks Jay O

Re: how to load data

2015-05-03 Thread Kumar Jayapal
TED BY ',' > and give it a try > > On Fri, May 1, 2015 at 6:33 PM, Kumar Jayapal > wrote: > >> 106,"2003-02-03",20,2,"A","2","2","037" >> 106,"2003-02-03",20,3,"A","2","2"

Re: parquet table

2015-05-02 Thread Kumar Jayapal
t; > > > On May 1, 2015, at 7:32 AM, Asit Parija wrote: > > Hi Kumar , > You can remove the stored as text file part and then try that out by > default it should be able to read the .gz files ( if they are comma > delimited csv files ) . > > > Thanks > Asit &

Re: how to load data

2015-05-01 Thread Kumar Jayapal
106,"2003-02-03",20,2,"A","2","2","037" 106,"2003-02-03",20,3,"A","2","2","037" 106,"2003-02-03",8,2,"A","2","2","037" Thanks Jay On

Re: parquet table

2015-04-30 Thread Kumar Jayapal
> > On Fri, May 1, 2015 at 9:17 AM, Kumar Jayapal > wrote: > >> Created table CREATE TABLE raw (line STRING) PARTITIONED BY >> (FISCAL_YEAR smallint, FISCAL_PERIOD smallint) >> STORED AS TEXTFILE; >> >> and loaded it with data. >> >> LOAD DATA L

Re: how to load data

2015-04-30 Thread Kumar Jayapal
ata on > select. > > 2. run simple query to load data > insert overwrite table > select * from > > On Wed, Apr 29, 2015 at 3:26 PM, Kumar Jayapal > wrote: > >> Hello All, >> >> >> I have this table >> >> >> CREATE T

parquet table

2015-04-30 Thread Kumar Jayapal
Created table: CREATE TABLE raw (line STRING) PARTITIONED BY (FISCAL_YEAR smallint, FISCAL_PERIOD smallint) STORED AS TEXTFILE; and loaded it with data: LOAD DATA LOCAL INPATH '/tmp/weblogs/20090603-access.log.gz' INTO TABLE raw; I have to load it into a parquet table. When I say select * from raw i

Re: How to move back to .gz file from hive to hdfs

2015-04-30 Thread Kumar Jayapal
I did not find it in .Trash. The file was moved into the Hive table; I want to move it back to HDFS. On Thu, Apr 30, 2015 at 2:20 PM, Alexander Pivovarov wrote: > Try to find the file in hdfs trash > On Apr 30, 2015 2:14 PM, "Kumar Jayapal" wrote: >> Hi, >> I loade

How to move back to .gz file from hive to hdfs

2015-04-30 Thread Kumar Jayapal
Hi, I loaded one file into a Hive table; it has a .gz extension, and the file was moved/deleted from HDFS. When I execute a select command I get an error: Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask (state=08S01,code=2) how c
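A hedged sketch of the recovery: LOAD DATA from an HDFS path moves the file into the table's warehouse directory rather than copying it, so it can usually be copied back out intact (database, table, and file names below are placeholders).

```shell
# Find the file under the table's warehouse directory, then copy it back.
hdfs dfs -ls /user/hive/warehouse/mydb.db/mytable/
hdfs dfs -cp /user/hive/warehouse/mydb.db/mytable/myfile.gz /tmp/restored/
```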

Re: how to load data

2015-04-30 Thread Kumar Jayapal
query to load data > insert overwrite table > select * from > > On Wed, Apr 29, 2015 at 3:26 PM, Kumar Jayapal > wrote: > >> Hello All, >> >> >> I have this table >> >> >> CREATE TABLE DBCLOC( >>BLwhse int COMMENT 'DEC

how to load data

2015-04-29 Thread Kumar Jayapal
Hello All, I have this table CREATE TABLE DBCLOC( BLwhse int COMMENT 'DECIMAL(5,0) Whse', BLsdat string COMMENT 'DATE Sales Date', BLreg_num smallint COMMENT 'DECIMAL(3,0) Reg#', BLtrn_num int COMMENT 'DECIMAL(5,0) Trn#', BLscnr string COMMENT 'CHAR(1) Scenario', BLareq strin

Re: YARN Exceptions

2015-04-25 Thread Kumar Jayapal
FO mapreduce.ImportJobBase: Retrieved 0 records. 15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed! thanks Sajid On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh wrote: > What is this error > > User edhdtaesvc not found > > Are you using any user with

YARN Exceptions

2015-04-24 Thread Kumar Jayapal
Hi, I am getting the following error while running a sqoop import script. Can anyone please help in resolving this issue? 15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006 15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in uber mode : false 15/04

Sqoop2 error when I run the jobs through hue.

2015-04-22 Thread Kumar Jayapal
Hi, I am getting this error when I run the job in Sqoop2 from Hue. I see lots of people talking about this error but no proper resolution. Was anyone able to resolve this issue? Any help is appreciated. 2015-04-22 21:36:07,281 ERROR org.apache.sqoop.submission.mapreduce.MapreduceSubm

Exception while import in sqoop

2015-04-17 Thread Kumar Jayapal
Hi, I installed Sqoop2 and am trying to execute a simple import command to check the DB connection. Does anyone have the reason for, and resolution of, this error? sqoop import --connect jdbc:mysql://mysql.mmc.com:3306/hive --username hive --password sdeecer --table ROLES Exception has occurred during

Is this a best practice....

2015-04-14 Thread Kumar Jayapal
Hello all, I want to use Snappy when importing data with Sqoop; is this the best way to do it? sqoop import --connect jdbc:as400://DEV400.mmc.com --username kkms --P --driver com.ibm.as400.access.AS400JDBCDriver --table INATSTDTA.INWCTLP --as-avrodatafile --split-by WCWHS5 --target-dir /user/kkms/a
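A hedged sketch: the command above does not actually request Snappy; Avro data files are snappy-compressed by naming the codec explicitly (the rest of the command follows the original; the target dir is truncated in the original and left as-is).

```shell
sqoop import \
  --connect jdbc:as400://DEV400.mmc.com --username kkms -P \
  --driver com.ibm.as400.access.AS400JDBCDriver \
  --table INATSTDTA.INWCTLP \
  --as-avrodatafile \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --split-by WCWHS5 \
  --target-dir /user/kkms/a   # path truncated in the original message
```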

Re: Hadoop 2.6 issue

2015-04-01 Thread Kumar Jayapal
$which java make sure the paths are valid for your installation (change if using 32bit version): /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/java /usr/lib/jvm/java-6-openjdk-amd64/bin/javac Setup update-alternatives: sudo update-alternatives --install "/usr/bin/java" "java" "/usr/lib/jvm/java-6-o

Fwd: HMS error

2015-04-01 Thread Kumar Jayapal
Hello All, did anyone get this error before? I am working on a database migration task from PostgreSQL to MySQL. Here is what I did: I took the dumps using pg_dump from PostgreSQL and converted them to MySQL using a PHP script. I don't see any error in creating the tables in the MySQL db. I created

Cleanup Procedure in CM

2015-03-19 Thread Kumar Jayapal
Hi, does anyone have a doc or link which describes how to delete the KMS service and its client configuration from a cluster using CM? The site just says "Deleting a service does *not* clean up the associated client configurations that have been deployed in your cluster". Thanks Jay

How to stop Oozie flows using CM

2015-03-16 Thread kumar jayapal
Hello, may I know how to stop Oozie flows in CDH5 using CM? Can you please give me some link to learn more about it? Thanks Jap