[SPARK-55706][SQL][TESTS] Disable DB2 JDBC Driver tests#54505
dongjoon-hyun wants to merge 1 commit into apache:master
Conversation
|
cc @pan3793 and @LuciferYang |
|
@dongjoon-hyun thanks, I rebased #53454 on this one, let's wait for the CI result. I think it should pass. |
|
Thank you. |
|
Oops. I forgot to run |
.../docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/DB2IntegrationSuite.scala
...cker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/DB2IntegrationSuite.scala
|
Since it's too late for me in Cupertino, I'll check the CI result tomorrow morning. I'll ping you next time after the PR passes the CI. Sorry for the trouble. |
|
@dongjoon-hyun thank you, I will keep an eye on this. |
|
thanks, merging to master |
|
It would be good to explain what the side-effect is in the PR description. Neither the JIRA nor this PR has a clear description of it. Although #53920 has the discussion, it would be good to explicitly write it here. |
|
@viirya #53454 (comment) might answer your question |
Yeah, I know. What I mean is that we should make the PR self-descriptive, so that others who don't know the context can understand this change immediately instead of having to look for information across several PRs.
Definitely! Sorry for the inconvenience. |
|
Thank you, @viirya . You are right. I agree with you that the current description (and the commit log) wasn't enough. Let me revise the PR title at least (although it's already committed).

To all reviewers: just for the record, the discussion originally started over two months ago and we tried many approaches to minimize the gap as much as possible. DB2 JDBC was the last hurdle blocking Apache Spark from upgrading our LZ4 library.

Lines 1360 to 1365 in 1d56813

We don't want to give an extra recommendation like using `12.1.3.0_special_74723`, which may contaminate Spark's classpaths. We still have two ongoing JIRA issues.

I'm not sure when (2) will be done because it's a third-party decision. The bottom line is that Apache Spark should be able to proceed without being blocked by a third-party library packaging issue. |
|
Thank you @dongjoon-hyun @pan3793 for making the description better and unblocking https://github.com/apache/spark/pull/53454. |
|
Thank you so much for your consistently thoughtful and sincere reviews, @viirya . I truly appreciate the time and care you put into every review. 😄 |
What changes were proposed in this pull request?
This PR aims to disable DB2 JDBC driver test coverage until the driver removes its side-effect. `ConnectionProviderSuite` is revised to use another connection provider instead of `DB2`.

Why are the changes needed?

To avoid a side-effect on the LZ4 test dependency. We had better focus on the non-test dependency first.
BACKGROUND
The discussion originally started over two months ago and we tried many approaches to minimize the gap as much as possible. DB2 JDBC was the last hurdle blocking Apache Spark from upgrading our LZ4 library.

- `lz4-java` to 1.10.0 #53327 (Dec 4, 2025)
- `lz4-java` to 1.10.1 #53347 (Dec 5, 2025)

`DB2` is simply a test dependency (or test coverage), but it causes

java.lang.NoSuchMethodError: 'net.jpountz.lz4.LZ4BlockInputStream$Builder net.jpountz.lz4.LZ4BlockInputStream.newBuilder()' at org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:156)

when there is a mismatch between Apache Spark's built-in LZ4 library and DB2's embedded LZ4 library. The Apache Spark distribution has no documentation, assumptions, or requirements for JDBC drivers so far. That's the reason why I suggested simply disabling `DB2` test coverage for a while as the alternative (#53920).

spark/pom.xml
Lines 1360 to 1365 in 1d56813
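As a side note for reviewers reproducing the root cause: the conflict only arises when a driver jar ships its own copy of `net.jpountz` classes. Since a jar is just a zip archive, a quick way to check is to scan it for class entries under that package. A minimal sketch using only the Python standard library; the jar filename below is a placeholder, not the actual DB2 driver artifact name:

```python
import zipfile

def embedded_lz4_classes(jar_path):
    """Return lz4-java class entries embedded in a jar (jars are zip archives)."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.startswith("net/jpountz/") and name.endswith(".class")]

if __name__ == "__main__":
    # Placeholder path; point this at the DB2 JDBC driver jar under test.
    hits = embedded_lz4_classes("jcc.jar")
    if hits:
        print(f"driver embeds {len(hits)} lz4-java classes, e.g. {hits[0]}")
```

If the list is non-empty, the driver's bundled LZ4 classes can shadow Spark's own `lz4-java` on the test classpath, producing exactly the `NoSuchMethodError` quoted above.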
We don't want to give an extra recommendation like using `12.1.3.0_special_74723`, which may contaminate Spark's classpaths. IBM may want to give a recommendation from their side.

We still have two ongoing JIRA issues.

I'm not sure when (2) will be done because it's a third-party decision. The bottom line is that Apache Spark should be able to proceed without being blocked by a third-party library packaging issue.
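To illustrate why classpath ordering makes this failure intermittent: when two archives both contain the same class entry, the first one on the classpath wins, so which LZ4 implementation loads depends on jar order. A hedged sketch of that resolution rule, modeling jars as ordered lists of entry names (the jar names are hypothetical):

```python
def first_provider(classpath_jars, class_entry):
    """Return the name of the first jar on the (ordered) classpath that
    contains class_entry, mimicking JVM first-wins class resolution."""
    for jar_name, entries in classpath_jars:
        if class_entry in entries:
            return jar_name
    return None

# Hypothetical classpath: Spark's lz4-java jar vs. a driver embedding its own copy.
cp = [
    ("db2-driver.jar", {"net/jpountz/lz4/LZ4BlockInputStream.class"}),
    ("lz4-java-1.10.1.jar", {"net/jpountz/lz4/LZ4BlockInputStream.class"}),
]
print(first_provider(cp, "net/jpountz/lz4/LZ4BlockInputStream.class"))
```

With the driver jar first, the (older) embedded class is loaded and Spark's call to the newer `newBuilder()` method fails with `NoSuchMethodError`; reversing the order hides the problem, which is why disabling the test coverage is more robust than tuning jar order.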
Does this PR introduce any user-facing change?
No Spark behavior change, because this affects only a test dependency and test coverage.
How was this patch tested?
Pass the CIs.
Was this patch authored or co-authored using generative AI tooling?
No.