I am using AWS Glue to read an AWS Aurora DB (MySQL) table and write it to a file in S3. The MySQL table has a date column containing the value "0000-00-00", which MySQL allows. Because of this, my Glue job (PySpark) fails. How can I handle this in the Glue code?
I tried the following, without success:

1. Appending options to the JDBC URL: jdbc:mysql://<host-name>/<db-name>?zeroDateTimeBehavior=convertToNull&autoReconnect=true&characterEncoding=UTF-8&characterSetResults=UTF-8
2. Dropping the date column from the DynamicFrame/DataFrame in the PySpark code, e.g. df.drop(df["date_column"])
3. Removing the date column from the Glue table definition. It looks like all columns of the table are still read anyway.
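For reference, this is roughly how the JDBC URL from attempt 1 can be assembled (the host and database names are placeholders, not values from this post); the zeroDateTimeBehavior=convertToNull option is what tells MySQL Connector/J 5.x to return NULL instead of failing on "0000-00-00" values, and the URL scheme needs the "//" after "jdbc:mysql:":

```python
# Sketch: build a MySQL JDBC URL with zero-date handling options.
# Host/db names below are hypothetical placeholders; the option names
# are standard MySQL Connector/J 5.x connection properties.
from urllib.parse import urlencode

def build_mysql_jdbc_url(host, db):
    params = {
        "zeroDateTimeBehavior": "convertToNull",  # '0000-00-00' -> NULL
        "autoReconnect": "true",
        "characterEncoding": "UTF-8",
        "characterSetResults": "UTF-8",
    }
    return "jdbc:mysql://{}/{}?{}".format(host, db, urlencode(params))

url = build_mysql_jdbc_url("my-aurora-host", "mydb")
```

Note that newer Connector/J 8.x drivers renamed the accepted values of zeroDateTimeBehavior (e.g. CONVERT_TO_NULL), so the right spelling depends on which driver version the Glue job actually loads.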
The error message is below:
Traceback (most recent call last):
File "script_2018-08-03-21-41-06.py", line 107, in <module>
total_record_count=datasourceDF0.count()
File "/mnt/yarn/usercache/root/appcache/application_1533330570684_0005/container_1533330570684_0005_01_000001/pyspark.zip/pyspark/sql/dataframe.py", line 427, in count
File "/mnt/yarn/usercache/root/appcache/application_1533330570684_0005/container_1533330570684_0005_01_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/mnt/yarn/usercache/root/appcache/application_1533330570684_0005/container_1533330570684_0005_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
File "/mnt/yarn/usercache/root/appcache/application_1533330570684_0005/container_1533330570684_0005_01_000001/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o335.count.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 7.0 failed 4 times, most recent failure: Lost task 0.3 in stage 7.0 (TID 21, ip-172-24-120-182.us-west-2.compute.internal, executor 1): java.sql.SQLException: Value '0000-00-00' can not be represented as java.sql.Timestamp
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:996)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:935)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:924)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:870)
at com.mysql.jdbc.ResultSetRow.getNativeTimestamp(ResultSetRow.java:606)
at com.mysql.jdbc.ByteArrayRow.getNativeTimestamp(ByteArrayRow.java:187)
at com.mysql.jdbc.ResultSetImpl.getNativeTimestamp(ResultSetImpl.java:4309)
at com.mysql.jdbc.ResultSetImpl.getTimestampInternal(ResultSetImpl.java:5929)
at com.mysql.jdbc.ResultSetImpl.getTimestamp(ResultSetImpl.java:5609)
1 Answer
Check the type that the crawler assigned to the field in the table schema, and set it to string. That way you no longer get a parsing error. Then select the column with df.selectExpr() and format the data as needed. Some useful Spark SQL expressions: date_format, unix_timestamp, from_unixtime.
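The conversion the answer suggests doing with df.selectExpr() can be sketched in plain Python (no Spark required here): once the column is read as a string, MySQL zero dates are mapped to NULL/None and everything else is parsed as a date. The function name and the truncation to the first 10 characters are choices made for this sketch, not part of the original post:

```python
# Minimal sketch of the string-to-date conversion: MySQL zero dates
# ('0000-00-00', optionally with a time part) become None; valid
# 'YYYY-MM-DD' prefixes become datetime objects.
from datetime import datetime

def parse_mysql_date(value):
    """Return a datetime for a valid 'YYYY-MM-DD' string, or None for
    NULLs, MySQL zero dates, and unparseable values."""
    if value is None or value.startswith("0000-00-00"):
        return None
    try:
        return datetime.strptime(value[:10], "%Y-%m-%d")
    except ValueError:
        return None
```

In Spark SQL itself the same idea can be expressed in one expression, for example something like selectExpr("to_timestamp(nullif(date_column, '0000-00-00')) as date_column"), where nullif replaces the zero date with NULL before the cast.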