I am following all the steps described in [https://github.com/aws-samples/aws-glue-samples/blob/master/utilities/Spark_UI/README.md],
but the container fails to start with the following error:
2023-02-22 13:27:51 Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2023-02-22 13:27:51 23/02/22 18:27:51 INFO HistoryServer: Started daemon with process name: 1@a868804c2bab
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SignalUtils: Registering signal handler for TERM
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SignalUtils: Registering signal handler for HUP
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SignalUtils: Registering signal handler for INT
2023-02-22 13:27:51 23/02/22 18:27:51 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SecurityManager: Changing view acls to: root
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SecurityManager: Changing modify acls to: root
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SecurityManager: Changing view acls groups to:
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SecurityManager: Changing modify acls groups to:
2023-02-22 13:27:51 23/02/22 18:27:51 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2023-02-22 13:27:52 23/02/22 18:27:52 INFO FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions:
2023-02-22 13:27:52 23/02/22 18:27:52 WARN FileSystem: S3FileSystem is deprecated and will be removed in future releases. Use NativeS3FileSystem or S3AFileSystem instead.
2023-02-22 13:27:52 Exception in thread "main" java.lang.reflect.InvocationTargetException
2023-02-22 13:27:52 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2023-02-22 13:27:52 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2023-02-22 13:27:52 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2023-02-22 13:27:52 at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2023-02-22 13:27:52 at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:300)
2023-02-22 13:27:52 at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
2023-02-22 13:27:52 Caused by: java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified by setting the fs.s3.awsAccessKeyId and fs.s3.awsSecretAccessKey properties (respectively).
2023-02-22 13:27:52 at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:74)
2023-02-22 13:27:52 at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:94)
2023-02-22 13:27:52 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-22 13:27:52 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2023-02-22 13:27:52 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-22 13:27:52 at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-22 13:27:52 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409)
2023-02-22 13:27:52 at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
2023-02-22 13:27:52 at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
2023-02-22 13:27:52 at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
2023-02-22 13:27:52 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346)
2023-02-22 13:27:52 at com.sun.proxy.$Proxy5.initialize(Unknown Source)
2023-02-22 13:27:52 at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:111)
2023-02-22 13:27:52 at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2812)
2023-02-22 13:27:52 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
2023-02-22 13:27:52 at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2849)
2023-02-22 13:27:52 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2831)
2023-02-22 13:27:52 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
2023-02-22 13:27:52 at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
2023-02-22 13:27:52 at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:116)
2023-02-22 13:27:52 at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:88)
2023-02-22 13:27:52 ... 6 more
1 Answer
wooyq4lh1#
I had this issue because I had set
-Dspark.history.fs.logDirectory=s3://<name> instead of s3a://
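
For reference, a minimal sketch of how the corrected setting might be passed when launching the history-server container, following the pattern in the README linked above. The bucket name in LOG_DIR is a placeholder, not a value from the original post, and the AWS credentials are assumed to already be exported in the environment.

```bash
# Sketch based on the aws-glue-samples Spark UI README; bucket name and
# credentials are placeholders, not values from the original post.
LOG_DIR="s3a://my-glue-spark-ui-logs/"   # note the s3a:// scheme, not s3://

docker run -itd \
  -e SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS \
     -Dspark.history.fs.logDirectory=$LOG_DIR \
     -Dspark.hadoop.fs.s3a.access.key=$AWS_ACCESS_KEY_ID \
     -Dspark.hadoop.fs.s3a.secret.key=$AWS_SECRET_ACCESS_KEY" \
  -p 18080:18080 \
  glue/sparkui:latest \
  "/opt/spark/bin/spark-class org.apache.spark.deploy.history.HistoryServer"
```

With the s3:// scheme, Spark falls back to the deprecated S3FileSystem, which only reads credentials from fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey, hence the IllegalArgumentException in the log above; switching the scheme to s3a:// makes the S3AFileSystem and its credential properties take effect.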