Hi,
I am deploying the PairwiseStitching module on my HPC system and testing the latest version of BigStitcher-Spark on the test dataset (N5+XML, unaligned).
The previous version (0.0.2) of BigStitcher-Spark handles this dataset fine and computes correlations for all overlapping views, but with the latest version (0.1.0) [61c4424] I get only r=0 for all tile pairs.
I also observe this error in the log file for both versions:
2025-03-13 16:31:59,193 [task-result-getter-1] WARN [TaskSetManager]: Lost task 209.0 in stage 0.0 (TID 209) (10.110.100.15 executor 9): java.lang.NullPointerException
at net.preibisch.mvrecon.process.interestpointregistration.pairwise.constellation.grouping.Group.combineOrSplitBy(Group.java:234)
at net.preibisch.mvrecon.process.interestpointregistration.pairwise.constellation.grouping.Group.combineBy(Group.java:172)
at net.preibisch.stitcher.algorithm.GroupedViewAggregator$Action.pickBrightest(GroupedViewAggregator.java:119)
at net.preibisch.stitcher.algorithm.GroupedViewAggregator$Action.aggregate(GroupedViewAggregator.java:102)
at net.preibisch.stitcher.algorithm.GroupedViewAggregator.aggregate(GroupedViewAggregator.java:396)
at net.preibisch.stitcher.algorithm.globalopt.TransformationTools.computeStitching(TransformationTools.java:277)
at net.preibisch.bigstitcher.spark.SparkPairwiseStitching.lambda$call$c9354d04$1(SparkPairwiseStitching.java:202)
at org.apache.spark.api.java.JavaPairRDD$.$anonfun$toScalaFunction$1(JavaPairRDD.scala:1070)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:224)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:302)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1597)
at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1524)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1588)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1389)
at org.apache.spark.storage.BlockManager.getOrElseUpdateRDDBlock(BlockManager.scala:1343)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:379)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
at org.apache.spark.scheduler.Task.run(Task.scala:141)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
I have also attached an image showing the diff between the logs from the old version (0.0.2) and the latest one.
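
To illustrate what I suspect is happening, here is a minimal, self-contained sketch (this is not the mvrecon code; all class and field names in it are hypothetical) of how grouping views by an attribute throws a NullPointerException when one view's attribute resolves to null, which is what the failure at Group.combineOrSplitBy() looks like to me:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal, self-contained sketch -- NOT the mvrecon code, all names below are
// hypothetical -- of how grouping views by an attribute throws a
// NullPointerException when one view is missing that attribute.
public class GroupingNpeSketch {

    // Stand-in for a view description with an optional channel attribute.
    record View(int id, String channel) {}

    public static void main(String[] args) {
        List<View> views = List.of(
                new View(0, "ch0"),
                new View(1, "ch0"),
                new View(2, null)); // attribute unexpectedly missing

        // Collectors.groupingBy throws a NullPointerException as soon as the
        // classifier returns null for view #2 -- the same symptom as the lost
        // task in the Spark log above.
        Map<String, List<View>> byChannel = views.stream()
                .collect(Collectors.groupingBy(View::channel));

        System.out.println(byChannel);
    }
}

If the grouping inside combineOrSplitBy hits a similar null for some view attribute, that could also explain why every tile pair falls back to r=0.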
