[DNM][VL]Support iceberg merge on read test #5568
Triggered via pull request, March 6, 2025 17:27
jinchengchenghh opened #8923
Status: Success
Total duration: 18s
Artifacts: –
labeler.yml (on: pull_request_target)
Label pull requests: 5s
Annotations: 3 errors
VeloxIcebergSuite.iceberg read mor table - merge into:
org/apache/gluten/execution/VeloxIcebergSuite#L1
Job aborted due to stage failure: Task 0 in stage 89.0 failed 1 times, most recent failure: Lost task 0.0 in stage 89.0 (TID 101) (20ada9f93cfa executor driver): java.lang.NullPointerException: Cannot invoke "org.apache.spark.unsafe.types.UTF8String.toString()" because the return value of "org.apache.spark.sql.catalyst.InternalRow.getUTF8String(int)" is null
at org.apache.spark.sql.catalyst.InternalRow.getString(InternalRow.scala:35)
at org.apache.iceberg.spark.source.SparkPositionDeltaWrite$DeleteOnlyDeltaWriter.delete(SparkPositionDeltaWrite.java:491)
at org.apache.iceberg.spark.source.SparkPositionDeltaWrite$DeleteOnlyDeltaWriter.delete(SparkPositionDeltaWrite.java:447)
at org.apache.spark.sql.execution.datasources.v2.DeltaWithMetadataWritingSparkTask.write(WriteToDataSourceV2Exec.scala:563)
at org.apache.spark.sql.execution.datasources.v2.DeltaWithMetadataWritingSparkTask.write(WriteToDataSourceV2Exec.scala:549)
at org.apache.spark.sql.execution.datasources.v2.WritingSparkTask.$anonfun$run$1(WriteToDataSourceV2Exec.scala:471)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1563)
at org.apache.spark.sql.execution.datasources.v2.WritingSparkTask.run(WriteToDataSourceV2Exec.scala:509)
at org.apache.spark.sql.execution.datasources.v2.WritingSparkTask.run$(WriteToDataSourceV2Exec.scala:448)
at org.apache.spark.sql.execution.datasources.v2.DeltaWithMetadataWritingSparkTask.run(WriteToDataSourceV2Exec.scala:549)
at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.$anonfun$writeWithV2$2(WriteToDataSourceV2Exec.scala:411)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
at org.apache.spark.scheduler.Task.run(Task.scala:139)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Driver stacktrace:
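The NullPointerException originates in org.apache.spark.sql.catalyst.InternalRow.getString, which is implemented as getUTF8String(ordinal).toString, so Iceberg's DeleteOnlyDeltaWriter.delete fails as soon as the file-path column of the incoming delete row is null. A minimal Scala sketch of that failure mode (illustrative only; the single-column GenericInternalRow with a null value is an assumption, not the suite's actual test data):

// Minimal sketch: a null UTF8String column reproduces the NPE seen above,
// because InternalRow.getString dereferences the result of getUTF8String.
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.GenericInternalRow

val deleteRow: InternalRow = new GenericInternalRow(Array[Any](null)) // file-path column is null
deleteRow.getString(0) // throws NullPointerException: getUTF8String(0) returned null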