I'm running an application in Eclipse in which Spark reads data from a Kafka producer. I'm on Linux. The application starts and then shuts down. My Scala and Spark versions are compatible, and I have ZooKeeper and Kafka running in the background. Here is my code; there are no visible errors in Eclipse. I've also tried running Eclipse as root, with the same result. Can anyone help?
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kafka.KafkaUtils

object hye {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("KafkaWordCount").setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(2))
    // Receiver-based stream: connect via ZooKeeper and read the "hello" topic with 5 threads
    val lines = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("hello" -> 5))
    lines.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
pom.xml file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>new</groupId>
  <artifactId>new</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.6.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>2.2.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka_2.11</artifactId>
      <version>1.6.3</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
Output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/02/05 16:00:03 INFO SparkContext: Running Spark version 2.2.0
18/02/05 16:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/02/05 16:00:09 WARN Utils: Your hostname, hunny-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3)
18/02/05 16:00:09 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/02/05 16:00:09 INFO SparkContext: Submitted application: KafkaWordCount
18/02/05 16:00:09 INFO SecurityManager: Changing view acls to: root
18/02/05 16:00:09 INFO SecurityManager: Changing modify acls to: root
18/02/05 16:00:09 INFO SecurityManager: Changing view acls groups to:
18/02/05 16:00:09 INFO SecurityManager: Changing modify acls groups to:
18/02/05 16:00:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
18/02/05 16:00:12 INFO Utils: Successfully started service 'sparkDriver' on port 38357.
18/02/05 16:00:12 INFO SparkEnv: Registering MapOutputTracker
18/02/05 16:00:13 INFO SparkEnv: Registering BlockManagerMaster
18/02/05 16:00:13 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/02/05 16:00:13 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/02/05 16:00:13 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-6c907005-62a3-4e7e-91af-bfab176dcc5d
18/02/05 16:00:13 INFO MemoryStore: MemoryStore started with capacity 324.6 MB
18/02/05 16:00:13 INFO SparkEnv: Registering OutputCommitCoordinator
18/02/05 16:00:15 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/02/05 16:00:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040
18/02/05 16:00:16 INFO Executor: Starting executor ID driver on host localhost
18/02/05 16:00:16 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39293.
18/02/05 16:00:16 INFO NettyBlockTransferService: Server created on 10.0.2.15:39293
18/02/05 16:00:16 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/02/05 16:00:16 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.2.15, 39293, None)
18/02/05 16:00:16 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:39293 with 324.6 MB RAM, BlockManagerId(driver, 10.0.2.15, 39293, None)
18/02/05 16:00:16 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.2.15, 39293, None)
18/02/05 16:00:16 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.2.15, 39293, None)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:91)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:66)
at hye$.main(hye.scala:11)
at hye.main(hye.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
18/02/05 16:00:18 INFO SparkContext: Invoking stop() from shutdown hook
18/02/05 16:00:18 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
18/02/05 16:00:18 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/02/05 16:00:18 INFO MemoryStore: MemoryStore cleared
18/02/05 16:00:18 INFO BlockManager: BlockManager stopped
18/02/05 16:00:18 INFO BlockManagerMaster: BlockManagerMaster stopped
18/02/05 16:00:18 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/02/05 16:00:18 INFO SparkContext: Successfully stopped SparkContext
18/02/05 16:00:18 INFO ShutdownHookManager: Shutdown hook called
18/02/05 16:00:18 INFO ShutdownHookManager: Deleting directory /tmp/spark-93a4315c-31c1-41b2-8ff7-cff6b4ffb53c
1 Answer
Your dependencies are incompatible. spark-streaming-kafka_2.11:1.6.3 was built against Spark 1.6, which still exposed the org.apache.spark.Logging trait; that trait was removed from the public API in Spark 2.x, which is exactly what the NoClassDefFoundError above is complaining about. You can change your Kafka dependency as follows.
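The snippet the answer refers to is not included above; a minimal sketch of the likely fix is to replace the 1.6.3 artifact with the Kafka 0.8 connector built for Spark 2.x, which keeps the receiver-based createStream API used in the question:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>

With this artifact, the imports and the KafkaUtils.createStream call in the question compile unchanged. Alternatively, the spark-streaming-kafka-0-10_2.11 connector offers the newer direct API; it talks to the Kafka brokers instead of ZooKeeper, so both the dependency and the code change. A sketch, assuming a broker at localhost:9092 (the question only shows the ZooKeeper address, so the broker address here is an assumption):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    // Consumer configuration for the direct (broker-based) API
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092", // assumed broker address
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "spark-streaming-consumer-group",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    // Subscribe to the "hello" topic; each record carries a key and a value
    val lines = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Array("hello"), kafkaParams)
    )
    lines.map(_.value).print()

Whichever connector you choose, keep its version aligned with your Spark version (2.2.0 here) so that all Spark artifacts share the same binary interface.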