My problem is the same as the one described at this link:
http://qnalist.com/questions/6024365/1-99-5-and-unable-to-find-valid-kerberos-ticket

On Thu, Jul 16, 2015 at 11:27 AM, Lee S <[email protected]> wrote:

> I've already added the zookeeper lib to common.loader.
>
> On Thu, Jul 16, 2015 at 11:03 AM, Lee S <[email protected]> wrote:
>
>> Sqoop log attached.
>>
>>
>> On Thu, Jul 16, 2015 at 10:43 AM, Abraham Elmahrek <[email protected]>
>> wrote:
>>
>>> Are there any other errors in your sqoop.log?
>>>
>>> Also, I think you need the zookeeper client jars as well for the security
>>> integration.
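>>>
>>> For example, the classpath entry in server/conf/catalina.properties might
>>> look something like this (the jar paths here are only placeholders for
>>> wherever the hadoop and zookeeper jars actually live on your machine):
>>>
>>> common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,/path/to/hadoop/jars/*.jar,/path/to/zookeeper/lib/*.jar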
>>>
>>> On Wed, Jul 15, 2015 at 7:12 PM, Lee S <[email protected]> wrote:
>>>
>>>> Hi Abe:
>>>>   I finally reinstalled sqoop2 on a hadoop2.3 cluster.
>>>>   When I execute show version --all, the exception below comes out:
>>>>
>>>> Sqoop 1.99.6 source revision 07244c3915975f26f03d9e1edf09ab7d06619bb8
>>>>   Compiled by root on Wed Apr 29 10:40:43 CST 2015
>>>> 0    [main] WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to
>>>> load native-hadoop library for your platform... using builtin-java classes
>>>> where applicable
>>>> Exception has occurred during processing command
>>>> Exception: org.apache.sqoop.common.SqoopException Message:
>>>> CLIENT_0004:Unable to find valid Kerberos ticket cache (kinit)
>>>>
>>>> *And derbyrepo.log as below*
>>>> Loaded from
>>>> file:/root/lcy/sqoop-1.99.6-bin-hadoop200/server/webapps/sqoop/WEB-INF/lib/derby-10.8.2.2.jar
>>>> java.vendor=Oracle Corporation
>>>> java.runtime.version=1.7.0_45-b18
>>>> user.dir=/root
>>>> derby.system.home=null
>>>>
>>>> derby.stream.error.file=/root/lcy/sqoop-1.99.6-bin-hadoop200/logs/derbyrepo.log
>>>> Database Class Loader started - derby.database.classpath=''
>>>> Thu Jul 16 10:13:31 CST 2015 Thread[PurgeThread,5,main] (XID = 634),
>>>> (SESSIONID = 1), (DATABASE =
>>>> /root/lcy/sqoop-1.99.6-bin-hadoop200/repository/db), (DRDAID = null),
>>>> Cleanup action starting
>>>> Thu Jul 16 10:13:31 CST 2015 Thread[PurgeThread,5,main] (XID = 634),
>>>> (SESSIONID = 1), (DATABASE =
>>>> /root/lcy/sqoop-1.99.6-bin-hadoop200/repository/db), (DRDAID = null),
>>>> Failed Statement is: DELETE FROM "SQOOP"."SQ_SUBMISSION" WHERE
>>>> "SQS_UPDATE_DATE" < ? with 1 parameters begin parameter #1: 2015-07-15
>>>> 10:13:31.646 :end parameter
>>>> ERROR 08000: Connection closed by unknown interrupt.
>>>>         at
>>>> org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>>>>         at
>>>> org.apache.derby.iapi.util.InterruptStatus.setInterrupted(Unknown Source)
>>>>         at org.apache.derby.iapi.util.InterruptStatus.throwIf(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.BasicNoPutResultSetImpl.checkCancellationFlag(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.TableScanResultSet.getNextRowCore(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.ProjectRestrictResultSet.getNextRowCore(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.DMLWriteResultSet.getNextRowCore(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.DeleteResultSet.collectAffectedRows(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.DeleteCascadeResultSet.collectAffectedRows(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.execute.DeleteCascadeResultSet.open(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
>>>>         at
>>>> org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
>>>>         at
>>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeStatement(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeUpdate(Unknown
>>>> Source)
>>>>         at
>>>> org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>>>         at
>>>> org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>>>         at
>>>> org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>>>         at
>>>> org.apache.sqoop.repository.common.CommonRepositoryHandler.purgeSubmissions(CommonRepositoryHandler.java:1055)
>>>>         at
>>>> org.apache.sqoop.repository.JdbcRepository$26.doIt(JdbcRepository.java:594)
>>>>         at
>>>> org.apache.sqoop.repository.JdbcRepository.doWithConnection(JdbcRepository.java:92)
>>>>         at
>>>> org.apache.sqoop.repository.JdbcRepository.doWithConnection(JdbcRepository.java:63)
>>>>
>>>> On Thu, Jul 16, 2015 at 9:45 AM, Lee S <[email protected]> wrote:
>>>>
>>>>> Got it. Thanks for your clear answers.
>>>>>
>>>>> On Thu, Jul 16, 2015 at 9:39 AM, Abraham Elmahrek <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Sqoop2 should work against Hadoop 2.3 if it's compiled against Hadoop
>>>>>> 2.6. Those versions are wire compatible. If you run into any issues, post
>>>>>> them here and we can see exactly what's going on.
>>>>>>
>>>>>>
>>>>>> http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/Compatibility.html#Wire_compatibility
>>>>>>
>>>>>> On Wed, Jul 15, 2015 at 6:35 PM, Lee S <[email protected]> wrote:
>>>>>>
>>>>>>> Yep, I compiled it and found that the security classes were not
>>>>>>> available.
>>>>>>> How about skipping the compilation of the security-related module?
>>>>>>> Because I need to use sqoop2 on a hadoop2.3 cluster.
>>>>>>>
>>>>>>> On Thu, Jul 16, 2015 at 9:29 AM, Abraham Elmahrek <[email protected]>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> At the moment, Hadoop 2.6 or newer is needed. The reason is that
>>>>>>>> there's a dependency on some security classes that are only available
>>>>>>>> in later versions of Hadoop.
>>>>>>>>
>>>>>>>> On Wed, Jul 15, 2015 at 6:26 PM, Lee S <[email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hi Abe:
>>>>>>>>> I'll try what you said.
>>>>>>>>> One more question: can sqoop2 be compiled against hadoop2.3?
>>>>>>>>> I've tried it, but it seems that one class cannot be found in
>>>>>>>>> hadoop2.3.
>>>>>>>>>
>>>>>>>>> On Thu, Jul 16, 2015 at 4:18 AM, Abraham Elmahrek <
>>>>>>>>> [email protected]> wrote:
>>>>>>>>>
>>>>>>>>>> So this is a bit strange. There must have been jar conflicts when
>>>>>>>>>> you were using sqoop2 before. I see a couple of immediate solutions:
>>>>>>>>>>
>>>>>>>>>>    1. Start fresh and recreate your jobs
>>>>>>>>>>    2. Dump the data in the old database and load it into a new
>>>>>>>>>>    database
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> For option two, here's a quick overview of how to do that:
>>>>>>>>>>
>>>>>>>>>>    1. Get the DDL with dblook
>>>>>>>>>>    (http://db.apache.org/derby/docs/10.1/tools/rtoolsdblookexamples.html).
>>>>>>>>>>    To get dblook working on my mac, I had to set up the following alias:
>>>>>>>>>>    alias dblook='java -cp
>>>>>>>>>>    /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/db/lib/derbytools.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/db/lib/derby.jar
>>>>>>>>>>    org.apache.derby.tools.dblook'. Then run dblook -d <jdbc conn string>.
>>>>>>>>>>    2. Get a data dump using the SYSCS_UTIL.SYSCS_EXPORT_TABLE system
>>>>>>>>>>    procedure from ij or programmatically
>>>>>>>>>>    (http://db.apache.org/derby/docs/10.7/ref/rrefexportproc.html).
>>>>>>>>>>    3. Load the data using the SYSCS_UTIL.SYSCS_IMPORT_DATA system
>>>>>>>>>>    procedure from ij or programmatically
>>>>>>>>>>    (http://db.apache.org/derby/docs/10.7/ref/rrefimportdataproc.html).
>>>>>>>>>>
>>>>>>>>>> Here's a quick example of what a programmatic solution might look
>>>>>>>>>> like (incomplete):
>>>>>>>>>>
>>>>>>>>>>> import java.sql.Connection;
>>>>>>>>>>> import java.sql.DriverManager;
>>>>>>>>>>> import java.sql.PreparedStatement;
>>>>>>>>>>> import java.sql.ResultSet;
>>>>>>>>>>> import java.sql.SQLException;
>>>>>>>>>>> import java.sql.Statement;
>>>>>>>>>>> import java.util.LinkedList;
>>>>>>>>>>> import java.util.List;
>>>>>>>>>>>
>>>>>>>>>>> public class DerbyDump {
>>>>>>>>>>>   private static final String SCHEMA_NAME = "SQOOP";
>>>>>>>>>>>
>>>>>>>>>>>   // Returns the names of all user tables in the SQOOP schema.
>>>>>>>>>>>   private static List<String> fetchTableNames(String jdbc)
>>>>>>>>>>>       throws SQLException {
>>>>>>>>>>>     Connection conn = DriverManager.getConnection(jdbc);
>>>>>>>>>>>     List<String> tableNames = new LinkedList<String>();
>>>>>>>>>>>     Statement stmt = null;
>>>>>>>>>>>
>>>>>>>>>>>     try {
>>>>>>>>>>>       stmt = conn.createStatement();
>>>>>>>>>>>       ResultSet rs = stmt.executeQuery(
>>>>>>>>>>>           "select t.tablename"
>>>>>>>>>>>           + " from sys.systables t, sys.sysschemas s"
>>>>>>>>>>>           + " where t.schemaid = s.schemaid"
>>>>>>>>>>>           + " and t.tabletype = 'T'"
>>>>>>>>>>>           + " and s.schemaname = '" + SCHEMA_NAME + "'"
>>>>>>>>>>>           + " order by s.schemaname, t.tablename");
>>>>>>>>>>>       while (rs.next()) {
>>>>>>>>>>>         tableNames.add(rs.getString(1));
>>>>>>>>>>>       }
>>>>>>>>>>>     } finally {
>>>>>>>>>>>       if (stmt != null) {
>>>>>>>>>>>         stmt.close();
>>>>>>>>>>>       }
>>>>>>>>>>>       if (conn != null) {
>>>>>>>>>>>         conn.close();
>>>>>>>>>>>       }
>>>>>>>>>>>     }
>>>>>>>>>>>
>>>>>>>>>>>     return tableNames;
>>>>>>>>>>>   }
>>>>>>>>>>>
>>>>>>>>>>>   // Exports one table to a delimited file via SYSCS_EXPORT_TABLE.
>>>>>>>>>>>   private static void dumpTableToFile(String jdbc, String tableName,
>>>>>>>>>>>       String path) throws SQLException {
>>>>>>>>>>>     Connection conn = DriverManager.getConnection(jdbc);
>>>>>>>>>>>     PreparedStatement stmt = null;
>>>>>>>>>>>
>>>>>>>>>>>     try {
>>>>>>>>>>>       stmt = conn.prepareStatement(
>>>>>>>>>>>           "CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE(?, ?, ?, null, null, 'UTF-8')");
>>>>>>>>>>>       stmt.setString(1, SCHEMA_NAME);
>>>>>>>>>>>       stmt.setString(2, tableName);
>>>>>>>>>>>       stmt.setString(3, path);
>>>>>>>>>>>       stmt.execute();
>>>>>>>>>>>     } finally {
>>>>>>>>>>>       if (stmt != null) {
>>>>>>>>>>>         stmt.close();
>>>>>>>>>>>       }
>>>>>>>>>>>       if (conn != null) {
>>>>>>>>>>>         conn.close();
>>>>>>>>>>>       }
>>>>>>>>>>>     }
>>>>>>>>>>>   }
>>>>>>>>>>>
>>>>>>>>>>>   // Loads a previously exported file back into the table via
>>>>>>>>>>>   // SYSCS_IMPORT_DATA (the trailing 0 appends instead of replacing).
>>>>>>>>>>>   private static void loadTableToFile(String jdbc, String tableName,
>>>>>>>>>>>       String path) throws SQLException {
>>>>>>>>>>>     Connection conn = DriverManager.getConnection(jdbc);
>>>>>>>>>>>     PreparedStatement stmt = null;
>>>>>>>>>>>
>>>>>>>>>>>     try {
>>>>>>>>>>>       stmt = conn.prepareStatement(
>>>>>>>>>>>           "CALL SYSCS_UTIL.SYSCS_IMPORT_DATA(?, ?, null, null, ?, null, null, 'UTF-8', 0)");
>>>>>>>>>>>       stmt.setString(1, SCHEMA_NAME);
>>>>>>>>>>>       stmt.setString(2, tableName);
>>>>>>>>>>>       stmt.setString(3, path);
>>>>>>>>>>>       stmt.execute();
>>>>>>>>>>>     } finally {
>>>>>>>>>>>       if (stmt != null) {
>>>>>>>>>>>         stmt.close();
>>>>>>>>>>>       }
>>>>>>>>>>>       if (conn != null) {
>>>>>>>>>>>         conn.close();
>>>>>>>>>>>       }
>>>>>>>>>>>     }
>>>>>>>>>>>   }
>>>>>>>>>>> }
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> The above snippet would need to be compiled against the Derby 10.8
>>>>>>>>>> libraries when dumping data and against 10.8.2.2 when loading data.
>>>>>>>>>>
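>>>>>>>>>> A main method along these lines could be dropped into the class above
>>>>>>>>>> to drive the export half (a rough, untested sketch; the JDBC URL and
>>>>>>>>>> output directory are only placeholders):
>>>>>>>>>>
>>>>>>>>>>>   // Hypothetical driver for the class above; the JDBC URL and the
>>>>>>>>>>>   // output directory are placeholders and must be adjusted.
>>>>>>>>>>>   public static void main(String[] args) throws SQLException {
>>>>>>>>>>>     String jdbc = "jdbc:derby:/path/to/old/repository/db";
>>>>>>>>>>>     String outDir = "/tmp/sqoop-dump";
>>>>>>>>>>>
>>>>>>>>>>>     // Export every table in the SQOOP schema into its own file; a
>>>>>>>>>>>     // similar loop with loadTableToFile() against the new database
>>>>>>>>>>>     // would reload the files afterwards.
>>>>>>>>>>>     for (String tableName : fetchTableNames(jdbc)) {
>>>>>>>>>>>       dumpTableToFile(jdbc, tableName, outDir + "/" + tableName + ".dat");
>>>>>>>>>>>     }
>>>>>>>>>>>   }
>>>>>>>>>>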
>>>>>>>>>> -Abe
>>>>>>>>>>
>>>>>>>>>> On Wed, Jul 15, 2015 at 2:54 AM, Lee S <[email protected]> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi Richard:
>>>>>>>>>>>   I can't run ij and don't know where derby is installed.
>>>>>>>>>>>   I reconfigured sqoop and ran it again; the exceptions I posted
>>>>>>>>>>> earlier are gone,
>>>>>>>>>>>   but I still get the exceptions I sent in the first email.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Wed, Jul 15, 2015 at 10:16 AM, Zhou, Richard <
>>>>>>>>>>> [email protected]> wrote:
>>>>>>>>>>>
>>>>>>>>>>>>  Hi,
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> There is no need to install derby in advance.
>>>>>>>>>>>>
>>>>>>>>>>>> For the sqoop.log,
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> [org.apache.sqoop.repository.derby.DerbyRepositoryHandler.detectRepositoryVersion(DerbyRepositoryHandler.java:196)]
>>>>>>>>>>>> Can't fetch repository structure version.
>>>>>>>>>>>>
>>>>>>>>>>>> Caused by: java.sql.SQLException: Schema 'SQOOP' does not exist
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> This error is expected; it's fine. Since it's your first time
>>>>>>>>>>>> starting a Sqoop server, it will generate the DB automatically
>>>>>>>>>>>> (and also log this error) if it does not exist.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> org.apache.sqoop.common.SqoopException: COMMON_0000:Unable to
>>>>>>>>>>>> run specified query - CREATE TABLE "SQOOP"."SQ_INPUT" ("SQI_ID" 
>>>>>>>>>>>> BIGINT
>>>>>>>>>>>> GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1) 
>>>>>>>>>>>> PRIMARY KEY,
>>>>>>>>>>>> "SQI_NAME" VARCHAR(64), "SQI_FORM" BIGINT, "SQI_INDEX" SMALLINT, 
>>>>>>>>>>>> "SQI_TYPE"
>>>>>>>>>>>> VARCHAR(32), "SQI_STRMASK" BOOLEAN, "SQI_STRLENGTH" SMALLINT,
>>>>>>>>>>>> "SQI_ENUMVALS" VARCHAR(100),CONSTRAINT "SQOOP"."FK_SQI_SQF" 
>>>>>>>>>>>> FOREIGN KEY
>>>>>>>>>>>> ("SQI_FORM") REFERENCES "SQOOP"."SQ_FORM" ("SQF_ID"))
>>>>>>>>>>>>
>>>>>>>>>>>> Caused by: java.sql.SQLSyntaxErrorException: Syntax error:
>>>>>>>>>>>> BOOLEAN.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> This seems to be the root cause. There is a "Syntax error:
>>>>>>>>>>>> BOOLEAN" when running the generated scripts. Would you run this
>>>>>>>>>>>> script using derby's "ij" tool to see whether it runs correctly in
>>>>>>>>>>>> your env?
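>>>>>>>>>>>>
>>>>>>>>>>>> If ij is not handy, a minimal JDBC check along these lines should
>>>>>>>>>>>> show the same thing (only a rough sketch: it tests just the BOOLEAN
>>>>>>>>>>>> column type, and the database path in the JDBC URL as well as the
>>>>>>>>>>>> class name are placeholders):
>>>>>>>>>>>>
>>>>>>>>>>>> import java.sql.Connection;
>>>>>>>>>>>> import java.sql.DriverManager;
>>>>>>>>>>>> import java.sql.Statement;
>>>>>>>>>>>>
>>>>>>>>>>>> public class BooleanCheck {
>>>>>>>>>>>>   public static void main(String[] args) throws Exception {
>>>>>>>>>>>>     // Run this with the same derby jar that sits in WEB-INF/lib on
>>>>>>>>>>>>     // the classpath; the database path below is a placeholder.
>>>>>>>>>>>>     Connection conn = DriverManager.getConnection(
>>>>>>>>>>>>         "jdbc:derby:/tmp/booleancheck;create=true");
>>>>>>>>>>>>     Statement stmt = conn.createStatement();
>>>>>>>>>>>>     // BOOLEAN was only added in Derby 10.7, so an older derby jar
>>>>>>>>>>>>     // somewhere on the classpath fails here with the same syntax error.
>>>>>>>>>>>>     stmt.executeUpdate("CREATE TABLE BOOL_TEST (B BOOLEAN)");
>>>>>>>>>>>>     System.out.println("BOOLEAN is supported by this Derby version");
>>>>>>>>>>>>     stmt.close();
>>>>>>>>>>>>     conn.close();
>>>>>>>>>>>>   }
>>>>>>>>>>>> }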
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Regards
>>>>>>>>>>>>
>>>>>>>>>>>> Richard
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> *From:* Lee S [mailto:[email protected]]
>>>>>>>>>>>> *Sent:* Tuesday, July 14, 2015 10:56 PM
>>>>>>>>>>>> *To:* [email protected]
>>>>>>>>>>>> *Subject:* sqoop2-tool verify with exception: The database was
>>>>>>>>>>>> created by or upgraded by version 10.8
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Hi all:
>>>>>>>>>>>>
>>>>>>>>>>>>  I verified the configuration and got the exception below:
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ERROR XSLAN: Database at
>>>>>>>>>>>> /root/lcy/sqoop-1.99.6-bin-hadoop200/repository/db has an 
>>>>>>>>>>>> incompatible
>>>>>>>>>>>> format with the current version of the software.  The database was 
>>>>>>>>>>>> created
>>>>>>>>>>>> by or upgraded by version 10.8.
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.iapi.error.StandardException.newException(Unknown 
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.store.raw.log.LogToFile.readControlFile(Unknown
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.store.raw.log.LogToFile.boot(Unknown Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown
>>>>>>>>>>>>  Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.store.raw.data.BaseDataFileFactory.bootLogFactory(Unknown
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.store.raw.data.BaseDataFileFactory.setRawStoreFactory(Unknown
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
>>>>>>>>>>>> Source)
>>>>>>>>>>>>
>>>>>>>>>>>>         at
>>>>>>>>>>>> org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown
>>>>>>>>>>>>  Source)
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> It seems that the db in the repository was not created with the
>>>>>>>>>>>> derby version currently in use, but I don't know how to fix it.
>>>>>>>>>>>> I've checked that the derby jar in WEB-INF/lib is 10.8.2.2.
>>>>>>>>>>>>
>>>>>>>>>>>> I'm working with sqoop-1.99.6. Any ideas?
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
