Converting JSON keys to columns in Spark

q9yhzks0 · published 2021-06-02 in Hadoop

I wrote some code that reads the data and picks the second element from each tuple; the second element happens to be JSON. The code that extracts the JSON:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import com.amazon.traffic.emailautomation.cafe.purchasefilter.util.CodecAwareManifestFileSystem
import com.amazon.traffic.emailautomation.cafe.purchasefilter.util.CodecAwareManifestInputFormat
import amazon.emr.utils.manifest.input.ManifestItemFileSystem
import amazon.emr.utils.manifest.input.ManifestInputFormat

// Register the manifest file system and input format on a copy of the Hadoop configuration
val configuration = new Configuration(sc.hadoopConfiguration)
ManifestItemFileSystem.setImplementation(configuration)
ManifestInputFormat.setInputFormatImpl(configuration, classOf[TextInputFormat])

// Read (offset, line) pairs and keep only the second element, the JSON string
val linesRdd1 = sc.newAPIHadoopFile("location", classOf[ManifestInputFormat[LongWritable, Text]],
    classOf[LongWritable], classOf[Text], configuration)
  .map(tuple2 => tuple2._2.toString)

Here is an example record:

{"data":   {"marketplaceId":7,"customerId":123,"eventTime":1471206800000,"asin":"4567","type":"OWN","region":"NA"},"uploadedDate":1471338703958}

Now I want to create a DataFrame where the JSON keys such as marketplaceId, customerId, etc. become columns and the rows hold their values. I am not sure how to go about this. Can anyone give me some pointers on how to achieve it?


wqsoz72f 1#

You can create a Scala object to marshal/unmarshal JSON by following this guide: https://coderwall.com/p/o--apg/easy-json-un-marshalling-in-scala-with-jackson
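For reference, below is a minimal sketch of such a JsonUtil helper, assuming the jackson-module-scala library is on the classpath; the exact package of ScalaObjectMapper varies by version (older releases, as used in the linked article, place it under experimental):

import com.fasterxml.jackson.databind.{DeserializationFeature, ObjectMapper}
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper

object JsonUtil {
  // Jackson mapper with the Scala module registered, so case classes (de)serialize directly
  val mapper = new ObjectMapper() with ScalaObjectMapper
  mapper.registerModule(DefaultScalaModule)
  // Don't fail on JSON fields that have no matching case-class member
  mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)

  def toJson(value: Any): String = mapper.writeValueAsString(value)

  def readValue[T](json: String)(implicit m: Manifest[T]): T =
    mapper.readValue[T](json)
}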
Then use that object to read the JSON data into case classes in Scala:

import org.apache.spark.{SparkConf, SparkContext}

object stackover {
  // Case classes mirroring the JSON structure
  case class Data(
      marketplaceId: Double,
      customerId: Double,
      eventTime: Double,
      asin: String,
      `type`: String,
      region: String
  )

  case class R00tJsonObject(
      data: Data,
      uploadedDate: Double
  )

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
    conf.setAppName("example")
    conf.setMaster("local[*]")

    val sc = new SparkContext(conf)

    // Each input line is one JSON record; unmarshal it into the case-class tree
    val data = sc.textFile("test1.json")
    val parsed = data.map(row => JsonUtil.readValue[R00tJsonObject](row))

    parsed.map(rec => (rec.data, rec.uploadedDate, rec.data.customerId,
        rec.data.marketplaceId)).collect.foreach(println)
  }
}

Output:

(Data(7.0,123.0,1.4712068E12,4567,OWN,NA),1.471338703958E12,123.0,7.0)
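
Since the original question asks for a DataFrame with the JSON keys as columns, note that Spark can also infer the schema from the JSON strings directly. A minimal sketch, assuming Spark 2.2+ with a SparkSession named spark and the linesRdd1 from the question:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("example").master("local[*]").getOrCreate()
import spark.implicits._

// Infer the schema from the JSON strings, then flatten the nested "data" struct
// so each key becomes a top-level column
val df = spark.read.json(spark.createDataset(linesRdd1))
df.select($"data.marketplaceId", $"data.customerId", $"data.eventTime",
    $"data.asin", $"data.type", $"data.region", $"uploadedDate")
  .show()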
