Phoenix "org.apache.phoenix.spark.DefaultSource" error

weylhg0b · posted 2021-06-10 in HBase

I am new to Phoenix and am trying to load an HBase table through Phoenix. When I try to load it, I get the error below:

java.lang.ClassNotFoundException: org.apache.phoenix.spark.DefaultSource

My code:

package com.vas.reports

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.phoenix.spark
import java.sql.DriverManager
import com.google.common.collect.ImmutableMap
import org.apache.hadoop.hbase.filter.FilterBase
import org.apache.phoenix.query.QueryConstants
import org.apache.phoenix.filter.ColumnProjectionFilter
import org.apache.phoenix.hbase.index.util.ImmutableBytesPtr
import org.apache.phoenix.hbase.index.util.VersionUtil
import org.apache.hadoop.hbase.filter.Filter

object PhoenixRead {

  case class Record(NO: Int, NAME: String, DEPT: Int)

  def main(args: Array[String]) {
    val sc = new SparkContext("local", "phoenixsample")
    val sqlcontext = new SQLContext(sc)

    // Count the non-driver executors currently registered.
    val numWorkers = sc.getExecutorStorageStatus
      .map(_.blockManagerId.executorId)
      .filter(_ != "driver")
      .length

    import sqlcontext.implicits._

    val df1 = sc.parallelize(List(
        (2, "Varun", 58),
        (3, "Alice", 45),
        (4, "kumar", 55)))
      .toDF("NO", "NAME", "DEPT")

    df1.show()
    println(numWorkers)
    println("printing df2")

    // This is the call that fails with ClassNotFoundException.
    val df = sqlcontext.load("org.apache.phoenix.spark",
      Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
    df.show()
  }
}

spark-submit:

spark-submit --class com.vas.reports.PhoenixRead --jars /home/hadoop1/phoenix-core-4.4.0-HBase-1.1.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar

Please look into this and advise.

j8ag8udp1#

This happens because you need to add the following library files under HBASE_HOME/lib and SPARK_HOME/lib.

In HBASE_HOME/lib:
phoenix-spark-4.7.0-HBase-1.1.jar
phoenix-4.7.0-HBase-1.1-server.jar

In SPARK_HOME/lib:
phoenix-spark-4.7.0-HBase-1.1.jar
phoenix-4.7.0-HBase-1.1-client.jar
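
For reference, once those jars are on the classpath, the read can also be expressed through the DataFrameReader API instead of the deprecated sqlcontext.load. A minimal sketch, assuming the same "udm_main" table; the object name PhoenixReadSketch and the ZooKeeper quorum zk-host:2181 are placeholders:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object PhoenixReadSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "phoenix-read-sketch")
    val sqlContext = new SQLContext(sc)

    // Same Phoenix data source, addressed via read.format(...).options(...).
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .options(Map(
        "table" -> "udm_main",                     // Phoenix table name
        "zkUrl" -> "zk-host:2181/hbase-unsecure")) // placeholder quorum
      .load()

    df.show()
  }
}

Alternatively, instead of copying the jars into the lib directories, the phoenix-spark and phoenix client jars can usually be passed to spark-submit via --jars alongside the application jar.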
