feat: Command Line SDK update for version 20.2.0 #308

ArnabChatterjee20k wants to merge 1 commit into master from
Conversation
Greptile Summary

This PR bumps the CLI SDK from 20.1.0 to 20.2.0, adding the new bigint attribute/column commands and renaming the `--provider` option.
Confidence Score: 3/5

The new bigint commands will silently corrupt large boundary values, and the `--provider` rename will break existing scripts at runtime. There are two distinct defects in new code: bigint precision loss in lib/commands/services/databases.ts and lib/commands/services/tables-db.ts, and the option rename in lib/commands/services/project.ts.

Important Files Changed
```ts
    .requiredOption(`--key <key>`, `Attribute Key.`)
    .requiredOption(`--required <required>`, `Is attribute required?`, parseBool)
    .option(`--min <min>`, `Minimum value`, parseInteger)
    .option(`--max <max>`, `Maximum value`, parseInteger)
    .option(`--xdefault <xdefault>`, `Default value. Cannot be set when attribute is required.`, parseInteger)
    .option(
        `--array [value]`,
        `Is attribute an array?`,
        (value: string | undefined) =>
            value === undefined ? true : parseBool(value),
    )
    .action(
        actionRunner(
            async ({ databaseId, collectionId, key, required, min, max, xdefault, array }) =>
                parse(await (await getDatabasesClient()).createBigIntAttribute(databaseId, collectionId, key, required, min, max, xdefault, array)),
        ),
    );

const databasesUpdateBigIntAttributeCommand = databases
    .command(`update-big-int-attribute`)
    .description(`Update a bigint attribute. Changing the \`default\` value will not update already existing documents.`)
```
parseInteger loses precision for large bigint values

`parseInteger` calls `parseInt(value, 10)`, which converts the input to a JavaScript number and caps exact precision at `Number.MAX_SAFE_INTEGER` (2^53 − 1 ≈ 9.007 × 10^15). A 64-bit bigint can hold values up to about 9.22 × 10^18, so any `--min`, `--max`, or `--xdefault` value beyond the safe-integer limit is silently truncated before being sent to the API. The same issue appears in `tablesDBCreateBigIntColumnCommand` and `tablesDBUpdateBigIntColumnCommand` in tables-db.ts.
```ts
const tablesDBCreateBigIntColumnCommand = tablesDB
    .command(`create-big-int-column`)
    .description(`Create a bigint column. Optionally, minimum and maximum values can be provided.
`)
    .requiredOption(`--database-id <database-id>`, `Database ID.`)
    .requiredOption(`--table-id <table-id>`, `Table ID.`)
    .requiredOption(`--key <key>`, `Column Key.`)
    .requiredOption(`--required <required>`, `Is column required?`, parseBool)
    .option(`--min <min>`, `Minimum value`, parseInteger)
    .option(`--max <max>`, `Maximum value`, parseInteger)
    .option(`--xdefault <xdefault>`, `Default value. Cannot be set when column is required.`, parseInteger)
    .option(
        `--array [value]`,
        `Is column an array?`,
        (value: string | undefined) =>
            value === undefined ? true : parseBool(value),
    )
    .action(
        actionRunner(
            async ({ databaseId, tableId, key, required, min, max, xdefault, array }) =>
                parse(await (await getTablesDBClient()).createBigIntColumn(databaseId, tableId, key, required, min, max, xdefault, array)),
        ),
    );
```
```ts
const tablesDBUpdateBigIntColumnCommand = tablesDB
    .command(`update-big-int-column`)
    .description(`Update a bigint column. Changing the \`default\` value will not update already existing rows.
`)
    .requiredOption(`--database-id <database-id>`, `Database ID.`)
    .requiredOption(`--table-id <table-id>`, `Table ID.`)
    .requiredOption(`--key <key>`, `Column Key.`)
    .requiredOption(`--required <required>`, `Is column required?`, parseBool)
    .requiredOption(`--xdefault <xdefault>`, `Default value. Cannot be set when column is required.`, parseInteger)
    .option(`--min <min>`, `Minimum value`, parseInteger)
    .option(`--max <max>`, `Maximum value`, parseInteger)
    .option(`--new-key <new-key>`, `New Column Key.`)
    .action(
        actionRunner(
            async ({ databaseId, tableId, key, required, xdefault, min, max, newKey }) =>
                parse(await (await getTablesDBClient()).updateBigIntColumn(databaseId, tableId, key, required, xdefault, min, max, newKey)),
        ),
    );
```
parseInteger precision loss for bigint column values

Same issue as in databases.ts: `--min`, `--max`, and `--xdefault` for `create-big-int-column` and `update-big-int-column` use `parseInteger` (backed by `parseInt`, which is exact only up to 53 bits), so any value beyond `Number.MAX_SAFE_INTEGER` is silently truncated. Valid 64-bit bigint boundary values will be corrupted before the API call is made.
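A minimal sketch of one possible fix, assuming the client can carry the value as a string end to end without re-parsing it as a `Number`. The helper name `parseBigIntOption` and the range constants are illustrative, not part of the SDK:

```typescript
// 64-bit signed integer bounds, computed exactly with bigint arithmetic.
const INT64_MIN = -(2n ** 63n);
const INT64_MAX = 2n ** 63n - 1n;

// Hypothetical drop-in for parseInteger on bigint options: validates the
// input and range without ever converting through a double.
function parseBigIntOption(value: string): string {
    let parsed: bigint;
    try {
        parsed = BigInt(value); // throws SyntaxError on non-integer input
    } catch {
        throw new Error(`"${value}" is not a valid integer`);
    }
    if (parsed < INT64_MIN || parsed > INT64_MAX) {
        throw new Error(`"${value}" is outside the 64-bit integer range`);
    }
    return parsed.toString(); // canonical form (strips leading "+" and zeros)
}
```

Returning the canonical string avoids ever materializing the value as a double; the tradeoff is that the request serializer must pass it through verbatim rather than coercing it with `Number`.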
This PR contains updates to the Command Line SDK for version 20.2.0.

This branch is generated from master to avoid legacy dev-branch history (e.g. removed presences artifacts).