I recently deployed a Cloudera Hadoop cluster. All services and shells work fine (Impala, Hive, Hue, etc.) except Sqoop. The following job used to run without issues:
$ sqoop job -fs hdfs://<namenode>:8020 --create myJob \
-- import --driver com.mysql.jdbc.Driver \
--connection-manager org.apache.sqoop.manager.GenericJdbcManager \
--connect jdbc:mysql://<mysqldb>:3306/<dbname> --username <username> \
-P --table <tablename> --target-dir /raw/ \
--incremental append --check-column <table-pk>
The job is created without errors, but when it runs it fails, complaining that simple authentication is not enabled:
$ sqoop job --exec myJob
16/03/01 11:56:26 WARN security.UserGroupInformation: PriviledgedActionException as:hadoop_admin (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
16/03/01 11:56:26 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2097)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1214)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1210)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1210)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1409)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:269)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:228)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:283)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1403)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2095)
... 29 more
I am using Kerberos with a KDC, configured according to the Cloudera guide: http://www.cloudera.com/documentation/enterprise/latest/topics/cm_sg_sec_troubleshooting.html — my configuration files match those in the guide (with my own realm substituted, of course), and as I said, all other services start and run fine with Kerberos enabled.
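As a sanity check (not something from my original setup steps, just a routine diagnostic), the authentication mode the Hadoop client config actually requests can be queried directly:

```shell
# Ask the Hadoop client which authentication mode its configuration requests.
# "simple" here (instead of "kerberos") would explain the SIMPLE-vs-
# [TOKEN, KERBEROS] mismatch shown in the stack trace above.
hdfs getconf -confKey hadoop.security.authentication
```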
I updated /etc/sqoop2/conf/sqoop.properties with the following lines:
org.apache.sqoop.security.authentication.type=KERBEROS
org.apache.sqoop.security.authentication.handler=org.apache.sqoop.security.authentication.KerberosAuthenticationHandler
org.apache.sqoop.security.authentication.kerberos.principal=sqoop2/<fqdn>@<REALM>
org.apache.sqoop.security.authentication.kerberos.keytab=/etc/sqoop2/conf/sqoop2.keytab
org.apache.sqoop.security.authentication.kerberos.http.principal=HTTP/<fqdn>@<REALM>
org.apache.sqoop.security.authentication.kerberos.http.keytab=/etc/sqoop2/conf/sqoop2.keytab
org.apache.sqoop.security.authentication.enable.doAs=true
org.apache.sqoop.security.authentication.proxyuser.mapr.users=*
To generate sqoop2.keytab, I ran:
kadmin> xst -k /home/kerberos/sqoop2.keytab HTTP/<FQDN>@<REALM>
kadmin> xst -k /home/kerberos/sqoop2.keytab sqoop2/<fqdn>@<REALM>
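The principals actually captured in the keytab can be verified with a routine check (a diagnostic sketch, assuming the principal names from the xst commands above):

```shell
# List the principals and key versions stored in the keytab
klist -kt /home/kerberos/sqoop2.keytab

# Optionally confirm the keytab is usable by authenticating with it
# (principal name assumed to match the xst commands above)
kinit -kt /home/kerberos/sqoop2.keytab sqoop2/<fqdn>@<REALM>
```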
I then moved the keytab file to /etc/sqoop2/conf, chmod'ed it to 400, and chowned it to sqoop2.
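In concrete commands, that move-and-permissions step looks roughly like this (the service user name sqoop2 is an assumption based on the description above):

```shell
mv /home/kerberos/sqoop2.keytab /etc/sqoop2/conf/
chown sqoop2 /etc/sqoop2/conf/sqoop2.keytab   # owner: the sqoop2 service user
chmod 400 /etc/sqoop2/conf/sqoop2.keytab      # readable only by that user
```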
klist shows:
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: hadoop_admin@<REALM>
Valid starting Expires Service principal
03/01/2016 11:28:14 03/01/2016 21:28:14 krbtgt/<REALM>@<REALM>
renew until 03/08/2016 11:28:14
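As a further check (not part of my original steps), one can confirm that this ticket authenticates to the NameNode directly, independent of Sqoop:

```shell
# If this listing succeeds using the hadoop_admin ticket above, Kerberos
# authentication against the NameNode itself works outside of Sqoop
hdfs dfs -ls hdfs://<namenode>:8020/raw/
```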
Am I missing a step?