6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -8,8 +8,14 @@ All notable changes to this project will be documented in this file.

- Support objectOverrides using `.spec.objectOverrides` on the `SparkConnectServer` and `SparkHistoryServer`.
See [objectOverrides concepts page](https://docs.stackable.tech/home/nightly/concepts/overrides/#object-overrides) for details ([#640]).
- Support for Spark `4.1.1` ([#642]).

### Removed

- Support for Spark `3.5.6` and `4.0.1` ([#642]).

[#640]: https://github.com/stackabletech/spark-k8s-operator/pull/640
[#642]: https://github.com/stackabletech/spark-k8s-operator/pull/642

## [25.11.0] - 2025-11-07

10 changes: 5 additions & 5 deletions docs/modules/spark-k8s/partials/supported-versions.adoc
@@ -3,11 +3,11 @@
// Stackable Platform documentation.
// Please sort the versions in descending order (newest first)

- 4.0.1 (Hadoop 3.4.1, Scala 2.13, Python 3.11, Java 17) (Experimental)
- 3.5.6 (Hadoop 3.3.4, Scala 2.12, Python 3.11, Java 17) (Deprecated)
- 3.5.7 (Hadoop 3.3.4, Scala 2.12, Python 3.11, Java 17) (LTS)
- 4.1.1 (Hadoop 3.4.2, Scala 2.13, Python 3.12, Java 21) (Experimental)
- 3.5.7 (Hadoop 3.4.2, Scala 2.12, Python 3.11, Java 17) (LTS)

Some reasons why Spark 4 is considered experimental (as of September 2025):
Some reasons why Spark 4 is considered experimental (as of January 2026):

- Missing HBase compatibility (See: https://github.com/apache/hbase-connectors/pull/130)
- Executors fail to load logging libs (maybe related: https://issues.apache.org/jira/browse/SPARK-52585)
- No Iceberg Spark runtime release with support for Spark 4.1 available yet.
- No Delta Lake release with support for Spark 4.1 available yet.
12 changes: 10 additions & 2 deletions tests/templates/kuttl/delta-lake/40-spark-app.yaml.j2
@@ -42,9 +42,17 @@ spec:
deps:
requirements:
- importlib-metadata
- delta-spark=={{ test_scenario['values']['delta'] }}
{% if test_scenario['values']['spark-delta-lake'].startswith("4") %}
- delta-spark==4.1.1 # <-- not released yet
{% else %}
- delta-spark==3.3.2
{% endif %}
packages:
- io.delta:delta-spark_2.12:{{ test_scenario['values']['delta'] }}
{% if test_scenario['values']['spark-delta-lake'].startswith("4") %}
- io.delta:delta-spark_2.13:4.1.1 # <-- not released yet
{% else %}
- io.delta:delta-spark_2.12:3.3.2
{% endif %}
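The Jinja2 branches above pin the Delta Lake dependency to whichever release matches the Spark line under test. As a plain-Python sketch of that selection logic (the helper name `spark_delta_requirements` is illustrative, and the scenario value is assumed to be a bare version string such as "3.5.7"):

```python
def spark_delta_requirements(spark_version: str) -> tuple[str, str]:
    """Return the (pip requirement, Maven package) pair for Delta Lake
    matching the given Spark version, mirroring the template's branches."""
    if spark_version.startswith("4"):
        # Spark 4 line builds against Scala 2.13; 4.1.1 is a placeholder
        # here because no matching Delta Lake release exists yet.
        return "delta-spark==4.1.1", "io.delta:delta-spark_2.13:4.1.1"
    # Spark 3 line builds against Scala 2.12.
    return "delta-spark==3.3.2", "io.delta:delta-spark_2.12:3.3.2"
```

Note that the pip requirement and the Maven coordinate must stay in lockstep, which is why the template branches on the Spark major version once and emits both.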
volumes:
- name: script
configMap:
13 changes: 8 additions & 5 deletions tests/templates/kuttl/iceberg/10-deploy-spark-app.yaml.j2
@@ -42,13 +42,16 @@ spec:
deps:
packages:
#
# The iceberg runtime contains the spark and scala versions in the form :
# Need to select the correct iceberg runtime.
#
# <spark major>.<spark minor>_<scala major>.<scala minor>
# For Spark 3: iceberg-spark:spark-runtime-3.5_2.12:1.10.1
# For Spark 4: iceberg-spark:spark-runtime-4.1_2.13:xxx <-- no iceberg release yet
#
# We extract the spark parts from the test scenario value.
#
- org.apache.iceberg:iceberg-spark-runtime-{{ ".".join(test_scenario['values']['spark-iceberg'].split('.')[:2]) }}_2.12:1.8.1
{% if test_scenario['values']['spark-iceberg'].startswith("4") %}
- org.apache.iceberg:iceberg-spark-runtime-{{ ".".join(test_scenario['values']['spark-iceberg'].split('.')[:2]) }}_2.13:xxx
{% else %}
- org.apache.iceberg:iceberg-spark-runtime-{{ ".".join(test_scenario['values']['spark-iceberg'].split('.')[:2]) }}_2.12:1.10.1
{% endif %}
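The comments above describe how the Iceberg runtime coordinate embeds the Spark and Scala versions, and the template extracts `<spark major>.<spark minor>` from the scenario value. A plain-Python sketch of that construction (the function name is illustrative; the scenario value is assumed to be a bare version string like "3.5.7"):

```python
def iceberg_runtime_package(spark_version: str) -> str:
    """Build the Maven coordinate of the Iceberg Spark runtime matching
    the given Spark version, mirroring the Jinja2 expression above."""
    # Keep only "<major>.<minor>", e.g. "3.5.7" -> "3.5".
    spark_mm = ".".join(spark_version.split(".")[:2])
    if spark_version.startswith("4"):
        # Spark 4 builds against Scala 2.13; "xxx" stays a placeholder
        # because no Iceberg release supports Spark 4.1 yet.
        return f"org.apache.iceberg:iceberg-spark-runtime-{spark_mm}_2.13:xxx"
    # Spark 3 builds against Scala 2.12.
    return f"org.apache.iceberg:iceberg-spark-runtime-{spark_mm}_2.12:1.10.1"

print(iceberg_runtime_package("3.5.7"))
# → org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.1
```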
volumes:
- name: script
configMap:
@@ -17,7 +17,7 @@ spec:
mainApplicationFile: local:///stackable/spark/examples/src/main/python/als.py
deps:
requirements:
- numpy==1.24.2
- numpy==2.4.1
driver:
config:
logging:
@@ -34,7 +34,7 @@ spec:
mainApplicationFile: local:///stackable/spark/examples/src/main/python/als.py
deps:
requirements:
- numpy==1.24.2
- numpy==2.4.1
driver:
config:
logging:
Binary file not shown.
Binary file not shown.
17 changes: 9 additions & 8 deletions tests/test-definition.yaml
@@ -5,33 +5,38 @@ dimensions:
- "false"
- name: spark
values:
- 3.5.6
- 3.5.7
- 4.0.1
- 4.1.1
# Alternatively, if you want to use a custom image, append a comma and the full image name to the product version
# as in the example below.
# - 3.5.6,oci.stackable.tech/sandbox/spark-k8s:3.5.6-stackable0.0.0-dev
- name: spark-logging
values:
- 3.5.7
- 4.1.1
- name: spark-hbase-connector
values:
- 3.5.7
# No hbase-connector release with support for Spark 4 yet.
# - 4.1.1
- name: spark-delta-lake
values:
- 3.5.7
# No delta-lake release with support for Spark 4.1 yet.
# - 4.1.1
# - 3.5.6,oci.stackable.tech/sandbox/spark-k8s:3.5.6-stackable0.0.0-dev
- name: spark-iceberg
values:
- 3.5.7
# No iceberg release with support for Spark 4.1 yet.
# - 4.1.1
- name: spark-connect
values:
- 3.5.7
- 4.0.1
- 4.1.1
# - 3.5.6,oci.stackable.tech/sandbox/spark-k8s:3.5.6-stackable0.0.0-dev
- name: hbase
values:
- 2.6.2
- 2.6.3
- name: hdfs-latest
values:
@@ -46,9 +51,6 @@ dimensions:
values:
- "false"
- "true"
- name: delta
values:
- 3.1.0
tests:
- name: smoke
dimensions:
@@ -102,7 +104,6 @@ tests:
- name: delta-lake
dimensions:
- spark-delta-lake
- delta
- openshift
- name: hbase-connector
dimensions: