Re:

2015-05-07 Thread Kai Voigt
Not sure if that will fully help, but --m is bad syntax; use -m instead. Maybe Sqoop freaks out about that and its syntax parser gets confused.

Kai

> On 08.05.2015 at 07:34, Kumar Jayapal wrote:
>
> Can someone please help me.
>
> I am running the simple sqoop command to import the table wi

Re:

2015-05-07 Thread Hadoop User
Sent from my iPhone

> On May 7, 2015, at 10:51 PM, Rich Haase wrote:
>
> If Sqoop is generating the SQL for your import then you may have hit a bug in
> the way the SQL for Oracle is being generated. I’d recommend emailing the
> Sqoop user mailing list: u...@sqoop.apache.org.
>
>> On May

Re:

2015-05-07 Thread Rich Haase
If Sqoop is generating the SQL for your import then you may have hit a bug in the way the SQL for Oracle is being generated. I’d recommend emailing the Sqoop user mailing list: u...@sqoop.apache.org.

On May 7, 2015, at 11:45 PM, Rich Haase mailto:rha...@pandora.co

Re:

2015-05-07 Thread Rich Haase
I’m not a Sqoop user, but it looks like you have an error in your SQL.

-> Caused by: java.sql.SQLSyntaxErrorException: ORA-00907: missing right parenthesis

On May 7, 2015, at 11:34 PM, Kumar Jayapal mailto:kjayapa...@gmail.com>> wrote:

ORA-00907

Rich Haase | Sr. Software Engineer | Pandora m

[no subject]

2015-05-07 Thread Kumar Jayapal
Can someone please help me. I am running a simple sqoop command to import a table with the split-by option, and I am getting this error. Has anyone solved this error before? I searched the site; no resolution so far.

sqoop command:

sqoop import --connect "jdbc:oracle:thin:@mysql.1521/PR" --username "
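The command above is truncated in the archive, but a typical Oracle import with a split column looks like the sketch below. All names here (host, user, table, split column, target directory) are placeholders, not values from the original post. Two things worth double-checking against the original: the parallelism flag is `-m` (single dash), and the connect string `jdbc:oracle:thin:@mysql.1521/PR` looks like it has a dot where a colon is normally expected between host and port.

```shell
sqoop import \
  --connect "jdbc:oracle:thin:@dbhost.example.com:1521/PR" \
  --username myuser -P \
  --table MYTABLE \
  --split-by ID_COLUMN \
  -m 4 \
  --target-dir /user/hive/warehouse/mytable
```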

Hive JSON Create Table query error: ParseException line 3:14 cannot recognize input near ':' 'string' ',' in column type

2015-05-07 Thread mani kandan
I'm trying to import a JSON file into a Hive table, and trying to execute the query below:

CREATE EXTERNAL TABLE twitter_data( userdata struct , tweetmessage:string, createddate:string) ROW FORMAT SERDE 'org.apache.hcatalog.JsonSerDe';

Upon execution I'm getting the following error: *Error
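The ParseException points at `tweetmessage:string`: in Hive DDL, top-level columns are declared as `name type`, and the `name:type` form is only valid for members *inside* a `struct<...>`. A hedged sketch of the likely intent follows; the struct's member fields are invented for illustration (the original struct body was stripped by the archive), and the fully-qualified SerDe class name may differ by Hive/HCatalog version.

```sql
-- Top-level columns use "name type"; only struct members use "name:type".
-- Struct member names below are placeholders, not from the original post.
CREATE EXTERNAL TABLE twitter_data (
  userdata STRUCT<username:STRING, userid:BIGINT>,
  tweetmessage STRING,
  createddate STRING
)
ROW FORMAT SERDE 'org.apache.hcatalog.data.JsonSerDe';
```

Note also that the post's `'org.apache.hcatalog.JsonSerDe'` is probably missing the `.data` package segment; in newer Hive releases the class moved to `org.apache.hive.hcatalog.data.JsonSerDe`.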

Re: Testing HDFS TDE - "Failed to close inode"/"Illegal key size" error

2015-05-07 Thread Philip Shon
Thanks Chris, that did the trick. I guess that exception in the kms.log file is an unrelated issue, because that exception was still thrown when it worked.

On Thu, May 7, 2015 at 12:21 PM, Chris Nauroth wrote:
> Hi Philip,
>
> I see that you used a key size of 256. This would require installati

Re: Testing HDFS TDE - "Failed to close inode"/"Illegal key size" error

2015-05-07 Thread Chris Nauroth
Hi Philip,

I see that you used a key size of 256. This would require installation of the JCE unlimited strength policy files.

http://www.oracle.com/technetwork/java/javase/downloads/jce-7-download-432124.html

Alternatively, if you're just testing right now and can accept a smaller key size, t
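A quick way to confirm whether the unlimited-strength policy files are active on a given JVM is to ask the JCE directly (class name below is made up; `Cipher.getMaxAllowedKeyLength` is the standard API):

```java
import javax.crypto.Cipher;

public class JcePolicyCheck {
    public static void main(String[] args) throws Exception {
        // Returns Integer.MAX_VALUE when the unlimited-strength policy is installed;
        // a restricted JVM typically reports 128 for AES.
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max allowed AES key length: " + max);
        if (max < 256) {
            System.out.println("256-bit keys will fail with 'Illegal key size'");
        }
    }
}
```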

Testing HDFS TDE - "Failed to close inode"/"Illegal key size" error

2015-05-07 Thread Philip Shon
I am testing out the TDE feature of HDFS, and am receiving the following error when trying to copy a file into the encryption zone.

[hdfs@svr501 ~]$ hdfs dfs -copyFromLocal 201502.txt.gz /secure
copyFromLocal: java.security.InvalidKeyException: Illegal key size
15/05/07 10:59:23 ERROR hdfs.DFSCli
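For anyone reproducing this: Chris's suggestion in the thread (a smaller key size while testing) can be sketched as the commands below. The key name is a placeholder; the point is that a 128-bit key avoids the JCE unlimited-strength requirement that triggers "Illegal key size" with 256-bit keys.

```shell
# Create a 128-bit key (works without the JCE unlimited-strength policy files)
hadoop key create testkey -size 128

# Create the encryption zone backed by that key, then copy in a file
hdfs dfs -mkdir /secure
hdfs crypto -createZone -keyName testkey -path /secure
hdfs dfs -copyFromLocal 201502.txt.gz /secure
```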