Merged
108 changes: 15 additions & 93 deletions in-progress/administration.adoc

Large diffs are not rendered by default.

131 changes: 17 additions & 114 deletions in-progress/astra-streaming-cli.adoc

Large diffs are not rendered by default.

23 changes: 1 addition & 22 deletions in-progress/change-data-capture.adoc
@@ -15,13 +15,10 @@ astra db create cdc_demo_db -r us-east1 -k cdc_demo_keyspace
----

.Result
[%collapsible]
====
[source,console]
----
Database 'cdc_demo_db' has been created with id 'ce5e563c-3248-4064-b03c-5892433a1347'. It is now active after waiting 422 seconds.
----
====
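The sample run above blocked for 422 seconds before the database went active. When this step is scripted, a small retry wrapper keeps that wait explicit; this is a generic sketch, and the `astra db status` probe in the usage comment is an assumption, not a command taken from this page.

```shell
# Generic retry helper: run a command until it succeeds or the
# attempt budget is exhausted. POSIX sh, no astra-specific logic.
retry() {
  max="$1"; delay="$2"; shift 2
  attempt=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "retry: giving up after $attempt attempts" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    sleep "$delay"
  done
}

# Hypothetical usage (assumes a status probe that exits non-zero
# until the database is ACTIVE):
#   retry 30 30 astra db status cdc_demo_db
```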

== Create a streaming tenant

@@ -33,14 +30,11 @@ astra streaming create cdc-demo-tenant -c gcp -r useast1
----

.Result
[%collapsible]
====
[source,console]
----
https://api.astra.datastax.com/v2/streaming/tenants/cdc-demo-tenant
[OK] Tenant 'cdc-demo-tenant' has being created.
----
====
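Tenant creation is a step that commonly fails on input, since the tenant name ends up in a URL. As a pre-flight sketch only: the naming rules assumed here (lowercase letters, digits, hyphens, leading letter) are an assumption to verify against the Astra Streaming docs, not something this page states.

```shell
# Sketch: reject obviously invalid tenant names before calling the CLI.
# The allowed pattern is an assumption, not taken from this page.
valid_tenant_name() {
  case "$1" in
    [a-z]*) ;;                   # must start with a lowercase letter
    *) return 1 ;;
  esac
  case "$1" in
    *[!a-z0-9-]*) return 1 ;;    # only lowercase letters, digits, hyphens
  esac
  return 0
}
```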

== Create a table in your database

@@ -56,13 +50,10 @@ astra db cqlsh exec cdc_demo_db \
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] Cqlsh is starting, please wait for connection establishment...
----
====

Confirm table creation:
+
@@ -73,8 +64,6 @@ astra db cqlsh exec cdc_demo_db \
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] Cqlsh is starting, please wait for connection establishment...
@@ -84,7 +73,6 @@ astra db cqlsh exec cdc_demo_db \

(0 rows)
----
====

== Create a CDC connection

@@ -101,13 +89,10 @@ astra db create-cdc cdc_demo_db \
////
// TODO: The command is not working as expected. Must investigate and figure out why it is reporting the following error: [ERROR] INVALID_ARGUMENT: Error Code=422(422) Invalid information provided to create DB: 422 Unprocessable Entity: databaseId, keyspace, tableName, and orgId are mandatory fields
.Result
[%collapsible]
====
[source,console]
----

----
====
////

Use `xref:commands:astra-db-list-cdcs.adoc[]` to confirm CDC details for the database and tenant:
@@ -118,8 +103,6 @@ astra db list-cdcs cdc_demo_db
----

.Result
[%collapsible]
====
[source,console]
----
+-----------------------+-------------------+----------------+----------------+--------------------+----------------+----------------+
@@ -128,16 +111,13 @@ astra db list-cdcs cdc_demo_db
| 57a3024f-cdcdemotable | cdc_demo_keyspace | cdc_demo_table | cdc-demo-tenant| pulsar-aws-useast1 | astracdc | Running |
+-----------------------+-------------------+----------------+----------------+--------------------+----------------+----------------+
----
====

[source,shell]
----
astra streaming list-cdc cdc-demo-tenant
----

.Result
[%collapsible]
====
[source,console]
----
+--------------------+----------------+----------------+-------------------+----------------+----------------+
@@ -146,8 +126,7 @@ astra streaming list-cdc cdc-demo-tenant
| pulsar-aws-useast1 | astracdc | cdc_demo_db | cdc_demo_keyspace | cdc_demo_table | running |
+--------------------+----------------+----------------+-------------------+----------------+----------------+
----
====
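The tables that `astra db list-cdcs` and `astra streaming list-cdc` print are meant for people, not scripts. If automation needs just the status column, one option is to scrape the captured output; this sketch assumes the column layout shown in the first table above (status in the eighth `|`-separated field), which may differ in other CLI versions.

```shell
# Pull the Status column for one table out of `astra db list-cdcs`
# output read from stdin. Assumes the column layout shown above.
cdc_status() {
  awk -F'|' -v t="$1" '
    /^\+/ { next }                    # skip the +----+ border rows
    {
      tbl = $4; gsub(/ /, "", tbl)    # trim padding around the table name
      if (tbl == t) { gsub(/ /, "", $8); print $8 }
    }'
}

# Hypothetical usage:
#   astra db list-cdcs cdc_demo_db | cdc_status cdc_demo_table
```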

== Connect a sink

After you enable CDC on your {db-serverless} database, you're ready to xref:astra-streaming:developing:astream-cdc.adoc#connect-a-sink[connect a sink].
34 changes: 2 additions & 32 deletions in-progress/dsbulk.adoc
@@ -23,8 +23,6 @@ astra db dsbulk load **DB_ID** -k **KEYSPACE_NAME** -t **TABLE_NAME** --url **FI
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] Downloading Dsbulk, please wait...
@@ -44,10 +42,9 @@ Checkpoints for the current operation were written to checkpoint.csv.
To resume the current operation, re-run it with the same settings, and add the following command line flag:
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/LOAD_20250123-020734-995267/checkpoint.csv
----
====
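Both load and unload end by printing a `--dsbulk.log.checkpoint.file=...` flag for resuming. A wrapper script can scrape that flag out of a captured log instead of copying it by hand; this sketch assumes the flag wording shown in the sample output stays stable across {dsbulk-short} versions.

```shell
# Extract the last resume flag printed in a captured dsbulk log so a
# wrapper can re-run the same operation from its checkpoint.
resume_flag() {
  grep -o -- '--dsbulk\.log\.checkpoint\.file=[^ ]*' "$1" | tail -n 1
}

# Hypothetical usage:
#   astra db dsbulk load ... 2>&1 | tee load.log
#   astra db dsbulk load ... "$(resume_flag load.log)"
```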

[TIP]
======
====
You can use the `xref:commands:astra-db-cqlsh-exec.adoc[]` command to check that the data imported successfully:

[source,shell,subs="+quotes"]
@@ -56,8 +53,6 @@ astra db cqlsh exec **DB_ID** "SELECT * FROM **KEYSPACE_NAME**.**TABLE_NAME** LI
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] Cqlsh is starting, please wait for connection establishment...
@@ -88,7 +83,6 @@ astra db cqlsh exec **DB_ID** "SELECT * FROM **KEYSPACE_NAME**.**TABLE_NAME** LI
(20 rows)
----
====
======

== Unload data

@@ -100,8 +94,6 @@ astra db dsbulk unload **DB_ID** -k **KEYSPACE_NAME** -t **TABLE_NAME** --url **
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] RUNNING: /Users/USERNAME/.astra/dsbulk-1.11.0/bin/dsbulk unload -u token -p AstraCS:FZm... -b /Users/USERNAME/.astra/scb/scb_91b35105-a5aa-4cd5-a93b-900ac58452ba_us-east1.zip -k dsbulk_demo_keyspace -t cities_by_country -logDir ./logs --log.verbosity normal --schema.allowMissingFields true -maxConcurrentQueries AUTO -delim , -url unloaded_data -header true -encoding UTF-8 -skipRecords 0 -maxErrors 100
@@ -116,7 +108,6 @@ Checkpoints for the current operation were written to checkpoint.csv.
To resume the current operation, re-run it with the same settings, and add the following command line flag:
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/UNLOAD_20250123-021231-557959/checkpoint.csv
----
====

== Count data

@@ -128,8 +119,6 @@ astra db dsbulk count **DB_ID** -k **KEYSPACE_NAME** -t **TABLE_NAME**
----

.Result
[%collapsible]
====
[source,console]
----
[INFO] RUNNING: /Users/USERNAME/.astra/dsbulk-1.11.0/bin/dsbulk count -u token -p AstraCS:FZm... -b /Users/USERNAME/.astra/scb/scb_91b35105-a5aa-4cd5-a93b-900ac58452ba_us-east1.zip -k dsbulk_demo_keyspace -t cities_by_country -logDir ./logs --log.verbosity normal --schema.allowMissingFields true -maxConcurrentQueries AUTO
@@ -145,7 +134,6 @@ To resume the current operation, re-run it with the same settings, and add the f
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/COUNT_20250123-021120-127216/checkpoint.csv
134574
----
====
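A count run like the one above is most useful as a cross-check against the file you loaded. A sketch of that comparison, assuming the CSV has exactly one header row and no embedded newlines inside quoted fields:

```shell
# Compare a dsbulk count result against the number of data rows in
# the source CSV. Assumes one header row, no multi-line fields.
check_count() {
  csv="$1"; reported="$2"
  expected=$(($(wc -l < "$csv") - 1))   # subtract the header row
  if [ "$expected" -ne "$reported" ]; then
    echo "row count mismatch: csv=$expected dsbulk=$reported" >&2
    return 1
  fi
}

# Hypothetical usage:
#   check_count cities.csv "$(astra db dsbulk count ... | tail -n 1)"
```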

== Complete {dsbulk-short} example

@@ -159,13 +147,10 @@ astra db create dsbulk_demo_db -r us-east1 -k dsbulk_demo_keyspace
----
+
.Result
[%collapsible]
====
[source,console]
----
Database 'dsbulk_demo_db' has been created with id '8b8fea68-404e-4f12-9a79-02079060adfa'. It is now active after waiting 433 seconds.
----
====

. Download the xref:ROOT:attachment$cities.csv[cities.csv] file and move it to the directory where you run {product} commands.
+
@@ -190,16 +175,13 @@ astra db cqlsh start dsbulk_demo_db -k dsbulk_demo_keyspace
----
+
.Result
[%collapsible]
====
[source,console]
----
Connected to cndb at 127.0.0.1:9042.
[cqlsh 6.8.0 | Cassandra 4.0.0.6816 | CQL spec 3.4.5 | Native protocol v4]
Use HELP for help.
token@cqlsh:dsbulk_demo_keyspace>
----
====
+
.. Copy and paste the following CQL statement into the `cqlsh` prompt and press kbd:[Enter]:
+
@@ -233,8 +215,6 @@ astra db dsbulk load dsbulk_demo_db -k dsbulk_demo_keyspace -t cities_by_country
----
+
.Result
[%collapsible]
====
[source,console]
----
[INFO] RUNNING: /Users/USERNAME/.astra/dsbulk-1.11.0/bin/dsbulk load -u token -p AstraCS:FZm... -b /Users/USERNAME/.astra/scb/scb_91b35105-a5aa-4cd5-a93b-900ac58452ba_us-east1.zip -k dsbulk_demo_keyspace -t cities_by_country -logDir ./logs --log.verbosity normal --schema.allowMissingFields true -maxConcurrentQueries AUTO -delim , -url cities.csv -header true -encoding UTF-8 -skipRecords 0 -maxErrors 100
@@ -252,7 +232,6 @@ Checkpoints for the current operation were written to checkpoint.csv.
To resume the current operation, re-run it with the same settings, and add the following command line flag:
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/LOAD_20250123-020734-995267/checkpoint.csv
----
====

. Confirm that the data loaded successfully:
+
@@ -262,8 +241,6 @@ astra db cqlsh exec dsbulk_demo_db "select * from dsbulk_demo_keyspace.cities_by
----
+
.Result
[%collapsible]
====
[source,console]
----
[INFO] Cqlsh is starting, please wait for connection establishment...
@@ -293,7 +270,6 @@ astra db cqlsh exec dsbulk_demo_db "select * from dsbulk_demo_keyspace.cities_by

(20 rows)
----
====

. Count the loaded data:
+
@@ -303,8 +279,6 @@ astra db dsbulk count dsbulk_demo_db -k dsbulk_demo_keyspace -t cities_by_countr
----
+
.Result
[%collapsible]
====
[source,console]
----
[INFO] RUNNING: /Users/USERNAME/.astra/dsbulk-1.11.0/bin/dsbulk count -u token -p AstraCS:FZm... -b /Users/USERNAME/.astra/scb/scb_91b35105-a5aa-4cd5-a93b-900ac58452ba_us-east1.zip -k dsbulk_demo_keyspace -t cities_by_country -logDir ./logs --log.verbosity normal --schema.allowMissingFields true -maxConcurrentQueries AUTO
@@ -320,7 +294,6 @@ To resume the current operation, re-run it with the same settings, and add the f
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/COUNT_20250123-021120-127216/checkpoint.csv
134574
----
====

. Unload the data into CSV files:
+
@@ -330,8 +303,6 @@ astra db dsbulk unload dsbulk_demo_db -k dsbulk_demo_keyspace -t cities_by_count
----
+
.Result
[%collapsible]
====
[source,console]
----
[INFO] RUNNING: /Users/USERNAME/.astra/dsbulk-1.11.0/bin/dsbulk unload -u token -p AstraCS:FZm... -b /Users/USERNAME/.astra/scb/scb_91b35105-a5aa-4cd5-a93b-900ac58452ba_us-east1.zip -k dsbulk_demo_keyspace -t cities_by_country -logDir ./logs --log.verbosity normal --schema.allowMissingFields true -maxConcurrentQueries AUTO -delim , -url unloaded_data -header true -encoding UTF-8 -skipRecords 0 -maxErrors 100
@@ -346,6 +317,5 @@ Checkpoints for the current operation were written to checkpoint.csv.
To resume the current operation, re-run it with the same settings, and add the following command line flag:
--dsbulk.log.checkpoint.file=/Users/USERNAME/logs/UNLOAD_20250123-021231-557959/checkpoint.csv
----
====
+
This command unloads row data from the `cities_by_country` table, and stores it as CSV files within a subdirectory named `unloaded_data` in the same directory where you ran the command.
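{dsbulk-short} typically writes the unloaded rows as several CSV shards inside that directory, each with its own header row. To collapse them into one file with a single header, drop every header after the first while concatenating; the `output-*.csv` shard naming below is an assumption, so adjust the glob to what your unload actually produced.

```shell
# Merge the CSV shards produced by `dsbulk unload` into one stream,
# keeping only the first file's header row.
merge_csv() {
  awk 'FNR == 1 && NR != 1 { next } { print }' "$@"
}

# Hypothetical usage:
#   merge_csv unloaded_data/output-*.csv > cities_merged.csv
```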