I get the following error when I try to access the bucket from Java.
Exception in thread "main" java.lang.IllegalAccessError: tried to access method org.apache.hadoop.metrics2.lib.MutableCounterLong.<init>(Lorg/apache/hadoop/metrics2/MetricsInfo;J)V from class org.apache.hadoop.fs.s3a.S3AInstrumentation
at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:164)
at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:186)
at org.apache.hadoop.fs.s3a.S3AInstrumentation.<init>(S3AInstrumentation.java:113)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:199)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at MyProgram.GetHiveTableData(MyProgram.java:710)
at MyProgram$1.run(MyProgram.java:674)
at MyProgram$1.run(MyProgram.java:670)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at MyProgram.GetHiveTableDetails(MyProgram.java:670)
at MyProgram.main(MyProgram.java:398)
The failing line of code is:

FileSystem hdfs = FileSystem.get(new URI(uriStr), configuration);

where uriStr = s3a://sbucketname
The configuration settings for s3a are as follows:
fs.default.name=fs.defaultFS
fs.defaultFS=s3a://bucketName
sPath: XXXXXX
fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
fs.s3a.access.key=XXXXXX
fs.s3a.secret.key=XXXXXXX
fs.s3a.endpoint=XXXXXXX
hadoop.rpc.protection=privacy
dfs.data.transfer.protection=privacy
hadoop.security.authentication=Kerberos
dfs.namenode.kerberos.principal=hdfs/XXXX@XXXX.XXX.XXXXXX.XXX
yarn.resourcemanager.principal=yarn/XXXX@XXXX.XXX.XXXXXX.XXX
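For reference, the s3a-related properties listed above would typically live in core-site.xml (or be set programmatically on the Configuration object before calling FileSystem.get). A minimal sketch of the relevant entries, keeping the placeholder values from the question:

```xml
<!-- Sketch of the s3a entries from the question, as they would appear in core-site.xml.
     XXXXXX values are placeholders, as in the original post. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>s3a://bucketName</value>
  </property>
  <property>
    <name>fs.s3a.impl</name>
    <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>XXXXXX</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>XXXXXX</value>
  </property>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>XXXXXX</value>
  </property>
</configuration>
```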
Is anything missing from the configuration settings? Please advise.
1 Answer
This problem can occur when the AWS SDK version and the Hadoop version are incompatible. You can get more help from the related questions "Spark job reading from S3 on Spark cluster gives IllegalAccessError: tried to access method MutableCounterLong" and "java.lang.NoClassDefFoundError: org/apache/hadoop/fs/StorageStatistics".
When I rolled back hadoop-aws from version 2.8.0 to 2.7.3, the problem was resolved. Per the discussion at https://stackoverflow.com/a/52828978/8025086, it seems appropriate to use aws-java-sdk 1.7.4. I just tested this simple example with PySpark, and it works as well. I'm not a Java person; perhaps someone can give a better explanation.
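The rollback described above amounts to keeping hadoop-aws and the AWS SDK in lockstep. As a sketch, a Maven dependency section matching the versions named in this answer might look like the following (verify the versions against your own cluster's Hadoop release before adopting them):

```xml
<!-- Sketch: hadoop-aws 2.7.3 was built against aws-java-sdk 1.7.4,
     so pinning both avoids mixing incompatible Hadoop/SDK classes
     on the classpath. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>2.7.3</version>
  </dependency>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.7.4</version>
  </dependency>
</dependencies>
```

The IllegalAccessError itself is a symptom of this kind of mismatch: S3AInstrumentation from one Hadoop version tries to call a MutableCounterLong constructor whose visibility changed in another version present on the classpath.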