Assertion failed: unsafe symbol TimeStamp (child of <none>) in runtime reflection universe

von4xj4u · posted on 2021-07-13 · Spark

We use the Scala programming language with Spark as our data-processing tool. We have three different projects:
Project A // this is where the annotation is defined

package Common {
  import scala.annotation.StaticAnnotation

  // Annotation recording the expected date/time format of a field.
  // DateFormat is assumed to be defined elsewhere in project A.
  case class TimeStamp(format: DateFormat) extends StaticAnnotation
}

Project B // a case class that uses the annotation above

case class CommonSchema(
    @Common.TimeStamp(DateFormat.DateTime)
    __datetime: Timestamp
)

Project C // here we use the case class above as a Dataset type

// Map the source Dataset onto the shared schema and re-encode it.
val commonEvents = events.map(c => CommonSchema(
  c.__datetime
))
commonEvents.as[CommonSchema]
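
The map and as[CommonSchema] calls are what trigger Spark's encoder derivation through ScalaReflection (the Encoders.product / ExpressionEncoder frames in the stack trace below). A rough sketch of how this is typically wired up, where RawEvent, EventsJob and the spark session are illustrative names rather than our actual code, and Timestamp is assumed to be java.sql.Timestamp:

import org.apache.spark.sql.{Dataset, SparkSession}

// Illustrative stand-in for whatever element type the source Dataset has.
case class RawEvent(__datetime: java.sql.Timestamp)

object EventsJob {
  // CommonSchema is the project B case class shown above.
  def toCommonSchema(spark: SparkSession, events: Dataset[RawEvent]): Dataset[CommonSchema] = {
    // spark.implicits._ supplies the implicit Encoder[CommonSchema] needed by map and as;
    // deriving that encoder reads CommonSchema's Scala signature at runtime.
    import spark.implicits._

    events
      .map(c => CommonSchema(c.__datetime))
      .as[CommonSchema]
  }
}

In the stack trace, the failure happens while this encoder derivation unpickles CommonSchema's Scala signature and tries to resolve the Common.TimeStamp annotation symbol.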

We hit the following exception in project C:

User class threw exception: java.lang.RuntimeException: error reading Scala signature of commonschema: assertion failed: unsafe symbol TimeStamp (child of <none>) in runtime reflection universe
at scala.reflect.internal.pickling.UnPickler.unpickle(UnPickler.scala:46)
at scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:619)
at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply$mcV$sp(SymbolLoaders.scala:28)
at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply(SymbolLoaders.scala:25)
at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply(SymbolLoaders.scala:25)
at scala.reflect.internal.SymbolTable.slowButSafeEnteringPhaseNotLaterThan(SymbolTable.scala:263)
at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:25)
at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.load(SymbolLoaders.scala:33)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$typeParams$1.apply(SynchronizedSymbols.scala:140)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$typeParams$1.apply(SynchronizedSymbols.scala:133)
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$8.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:168)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.typeParams(SynchronizedSymbols.scala:132)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$8.typeParams(SynchronizedSymbols.scala:168)
at scala.reflect.internal.Types$NoArgsTypeRef.typeParams(Types.scala:1931)
at scala.reflect.internal.Types$NoArgsTypeRef.isHigherKinded(Types.scala:1930)
at scala.reflect.internal.tpe.TypeComparers$class.isSubType2(TypeComparers.scala:377)
at scala.reflect.internal.tpe.TypeComparers$class.isSubType1(TypeComparers.scala:320)
at scala.reflect.internal.tpe.TypeComparers$class.isSubType(TypeComparers.scala:278)
at scala.reflect.internal.SymbolTable.isSubType(SymbolTable.scala:16)
at scala.reflect.internal.Types$Type.$less$colon$less(Types.scala:784)
at scala.reflect.internal.Types$Type.$less$colon$less(Types.scala:260)
at org.apache.spark.sql.catalyst.ScalaReflection$.isSubtype(ScalaReflection.scala:83)
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply$mcZ$sp(ScalaReflection.scala:677)
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply(ScalaReflection.scala:676)
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply(ScalaReflection.scala:676)
at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:926)
at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:49)
at org.apache.spark.sql.catalyst.ScalaReflection$.optionOfProductType(ScalaReflection.scala:675)
at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:51)
at org.apache.spark.sql.Encoders$.product(Encoders.scala:275)

Similar questions are answered in link1 and link2, but we already build an uber jar for project C, so all of the classes should be included in the package. Locally, the unit tests that exercise this code pass, so why do we get this error at runtime?


qlckcl4x1#

This was happening because we use the maven-shade-plugin and minimize the jar. We changed the configuration so that these classes are no longer stripped from the jar, and that resolved the issue.
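
For reference, this is roughly what such a change looks like in the shade plugin configuration: with minimizeJar enabled, the plugin drops classes its static analysis considers unused (annotation classes referenced only through Scala signatures look unused to it), and a filter can force them to be kept. The artifact coordinates and include pattern below are placeholders, not our real ones.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <minimizeJar>true</minimizeJar>
    <filters>
      <!-- Keep project A's classes (e.g. the Common.TimeStamp annotation)
           even though minimizeJar sees no direct references to them.
           Placeholder coordinates. -->
      <filter>
        <artifact>com.example:project-a</artifact>
        <includes>
          <include>Common/**</include>
        </includes>
      </filter>
    </filters>
  </configuration>
</plugin>

Disabling minimizeJar altogether also avoids the problem, at the cost of a larger jar.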
