
Commit 8cfc3a7

Add withDelta() and fix config()
Improvements
2 parents 0b137cf + 163dbd7 commit 8cfc3a7

4 files changed: 14 additions & 6 deletions


README.md

4 additions & 0 deletions

@@ -349,6 +349,10 @@ Even more queries can be found [here](https://colab.research.google.com/github/R
 
 # Latest updates
 
+## Version 0.2.0 alpha 6
+- Fix a bug with the config() call of the builder.
+- Add withDelta() to configure Delta Lake tables and files, for use with the JSONiq Update Facility.
+
 ## Version 0.2.0 alpha 5
 - If the initialization of the Spark session fails, we now check if SPARK_HOME is set and if it may be invalid or pointing to a different Spark version than 4.0, and output a more informative error message.
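The config() bug mentioned in the alpha 6 notes stems from the fact that Python has no method overloading: the previous session builder defined config() twice, so the second definition silently replaced the first. A minimal sketch of that failure mode, using a hypothetical Builder stand-in (not RumbleDB's actual class):

```python
# Sketch of the bug fixed in alpha 6: a second `def` with the same name
# silently shadows the first, so only the last overload survives.
# (This Builder is a hypothetical stand-in, not RumbleDB's implementation.)
class Builder:
    def config(self, key, value):   # overload 1: a (key, value) pair
        return ("pair", key, value)

    def config(self, conf):         # overload 2: silently replaces overload 1
        return ("conf", conf)


b = Builder()
print(b.config("some-conf"))        # only the one-argument form still works

try:
    b.config("spark.app.name", "demo")   # the (key, value) form now breaks
except TypeError:
    print("TypeError: the two-argument overload is gone")
```

The fix in this commit collapses both overloads into a single config() that dispatches on keyword arguments instead.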

pyproject.toml

3 additions & 2 deletions

@@ -4,12 +4,13 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "jsoniq"
-version = "0.2.0a5"
+version = "0.2.0a6"
 description = "Python edition of RumbleDB, a JSONiq engine"
 requires-python = ">=3.11"
 dependencies = [
     "pyspark==4.0",
-    "pandas>=2.2"
+    "pandas>=2.2",
+    "delta-spark==4.0"
 ]
 authors = [
     {name = "Ghislain Fourny", email = "ghislain.fourny@inf.ethz.ch"},
(binary file changed, -3.45 KB; contents not shown)

src/jsoniq/session.py

7 additions & 4 deletions

@@ -111,12 +111,15 @@ def master(self, url):
         self._sparkbuilder = self._sparkbuilder.master(url);
         return self;
 
-    def config(self, key, value):
-        self._sparkbuilder = self._sparkbuilder.config(key, value);
+    def config(self, key=None, value=None, conf=None, *, map=None):
+        self._sparkbuilder = self._sparkbuilder.config(key=key, value=value, conf=conf, map=map)
         return self;
 
-    def config(self, conf):
-        self._sparkbuilder = self._sparkbuilder.config(conf);
+    def withDelta(self):
+        self._sparkbuilder = self._sparkbuilder \
+            .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
+            .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
+            .config("spark.jars.packages", "io.delta:delta-spark_2.13:4.0.0")
         return self;
 
     def __getattr__(self, name):
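The replacement config() forwards all calling forms to PySpark's SparkSession.Builder.config, which in recent PySpark versions accepts a single (key, value) pair, a SparkConf object, or a whole dict via the keyword-only map parameter. A pure-Python sketch of that keyword-based dispatch pattern (the Builder class and its _options store below are hypothetical stand-ins, not the PySpark or RumbleDB implementation):

```python
class Builder:
    """Hypothetical stand-in illustrating keyword-based config dispatch."""

    def __init__(self):
        self._options = {}

    def config(self, key=None, value=None, conf=None, *, map=None):
        # One entry point replaces the two shadowing overloads:
        # dispatch on whichever argument was actually supplied.
        if map is not None:
            self._options.update(map)      # a whole dict at once
        elif conf is not None:
            self._options.update(conf)     # conf assumed dict-like in this sketch
        elif key is not None:
            self._options[key] = value     # a single (key, value) pair
        return self                        # fluent: calls can be chained


b = (Builder()
     .config("spark.app.name", "demo")
     .config(map={"spark.ui.enabled": "false"}))
print(b._options)
```

Returning self from every branch is what makes the fluent chaining used by withDelta() possible.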
