Merged
21 changes: 19 additions & 2 deletions docs/commands.md
@@ -1,7 +1,7 @@
# Cloud SQL Proxy CLI Reference

-**Version:** 0.4.13
-**Generated:** 2025-12-21
+**Version:** 0.4.14
+**Generated:** 2025-12-22

## Overview

@@ -40,6 +40,7 @@ Commands:
paths Show resolved system paths and configuration
locations
upgrade [options] Upgrade cloudsqlctl to the latest version
support Support utilities
help [command] display help for command
```

@@ -328,3 +329,19 @@ Options:
--json Output status in JSON format
-h, --help display help for command
```

### support

```text
Usage: cloudsqlctl support [options] [command]

Support utilities

Options:
-h, --help display help for command

Commands:
bundle [options] Create a support bundle zip with logs, config, doctor,
paths, and status
help [command] display help for command
```
2 changes: 2 additions & 0 deletions src/cli.ts
@@ -21,6 +21,7 @@ import { authCommand } from './commands/auth.js';
import { setupCommand } from './commands/setup.js';
import { pathsCommand } from './commands/paths.js';
import { upgradeCommand } from './commands/upgrade.js';
import { supportCommand } from './commands/support.js';
import { logger } from './core/logger.js';

const program = new Command();
@@ -51,6 +52,7 @@ program.addCommand(authCommand);
program.addCommand(setupCommand);
program.addCommand(pathsCommand);
program.addCommand(upgradeCommand);
program.addCommand(supportCommand);

program.parseAsync(process.argv).catch(err => {
    logger.error('Unhandled error', err);
158 changes: 158 additions & 0 deletions src/commands/support.ts
@@ -0,0 +1,158 @@
import { Command } from 'commander';
import path from 'path';
import fs from 'fs-extra';
import axios from 'axios';
import { logger } from '../core/logger.js';
import { readConfig } from '../core/config.js';
import { isRunning } from '../core/proxy.js';
import { checkGcloudInstalled, getActiveAccount, checkAdc, listInstances } from '../core/gcloud.js';
import { checkEnvironmentDetailed } from '../system/env.js';
import { isServiceInstalled, isServiceRunning } from '../system/service.js';
import { getEnvVar, runPs } from '../system/powershell.js';
import { PATHS, PATHS_REASON, PATHS_SOURCE, ENV_VARS } from '../system/paths.js';

function formatTimestamp(): string {
    return new Date().toISOString().replace(/[:.]/g, '-');
Copilot AI, Dec 22, 2025:

The formatTimestamp function replaces colons and periods with hyphens, which results in timestamps like "2025-12-22T15-30-45-123Z". This format may be confusing as it mixes date separators (hyphens) with time separators (also hyphens). Consider using a more standard filesystem-safe format like "YYYYMMDD-HHMMSS", or keeping underscores for time parts to improve readability (e.g., "20251222_153045").

Suggested change:

```ts
const now = new Date();
const pad = (value: number): string => value.toString().padStart(2, '0');
const year = now.getFullYear();
const month = pad(now.getMonth() + 1);
const day = pad(now.getDate());
const hours = pad(now.getHours());
const minutes = pad(now.getMinutes());
const seconds = pad(now.getSeconds());
// Format: YYYYMMDD-HHMMSS (filesystem-safe and unambiguous)
return `${year}${month}${day}-${hours}${minutes}${seconds}`;
```
}

function buildPathsReport(): string {
    const lines = [
        'CloudSQLCTL Paths',
        `Home: ${PATHS.HOME}`,
        `Bin: ${PATHS.BIN}`,
        `Logs: ${PATHS.LOGS}`,
        `Config: ${PATHS.CONFIG_FILE}`,
        `Proxy: ${PATHS.PROXY_EXE}`,
        `Secrets: ${PATHS.SECRETS}`,
        '',
        `Resolution Source: ${PATHS_SOURCE}`,
        `Reason: ${PATHS_REASON}`,
    ];
    return `${lines.join('\n')}\n`;
}

async function buildStatusReport(): Promise<string> {
    const processRunning = await isRunning();
    const serviceInstalled = await isServiceInstalled();
    const serviceRunning = serviceInstalled ? await isServiceRunning() : false;
    const config = await readConfig();

    const lines = [
        'CloudSQLCTL Status',
        `Service: ${serviceInstalled ? (serviceRunning ? 'RUNNING' : 'STOPPED') : 'NOT INSTALLED'}`,
        `Process: ${processRunning ? 'RUNNING' : 'STOPPED'}`,
        `Instance: ${config.selectedInstance || 'Unknown'}`,
        `Port: ${config.proxyPort || 5432}`,
Review comment:

issue: Using `||` to default the port will treat 0 as falsy, which may be unintended.

If config.proxyPort can validly be 0 (or another falsy value like NaN), this will incorrectly fall back to 5432. If you only want a default when the value is null or undefined, prefer `config.proxyPort ?? 5432`.
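The difference is easy to demonstrate in isolation (a minimal sketch; these helper functions are illustrative stand-ins, not part of the codebase):

```typescript
// `||` falls back for ANY falsy value (0, '', NaN); `??` only for null/undefined.
const portWithOr = (proxyPort?: number): number => proxyPort || 5432;
const portWithNullish = (proxyPort?: number): number => proxyPort ?? 5432;

console.log(portWithOr(0));              // 5432 — an explicit 0 is silently replaced
console.log(portWithNullish(0));         // 0 — preserved
console.log(portWithOr(undefined));      // 5432
console.log(portWithNullish(undefined)); // 5432
```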
    ];

    return `${lines.join('\n')}\n`;
}

async function buildDoctorReport(): Promise<string> {
    const lines: string[] = [];
    lines.push('CloudSQLCTL Diagnostics');

    const gcloudInstalled = await checkGcloudInstalled();
    lines.push(`gcloud: ${gcloudInstalled ? 'OK' : 'FAIL'}`);

    const account = await getActiveAccount();
    lines.push(`gcloud account: ${account || 'none'}`);

    const adc = await checkAdc();
    lines.push(`ADC: ${adc ? 'OK' : 'WARN'}`);

    try {
        await listInstances();
        lines.push('list instances: OK');
    } catch (error) {
        const message = error instanceof Error ? error.message : String(error);
        lines.push(`list instances: FAIL (${message})`);
    }

    const machineEnv = await checkEnvironmentDetailed('Machine');
    if (machineEnv.ok) {
        lines.push('env (machine): OK');
    } else {
        lines.push('env (machine): WARN');
        machineEnv.problems.forEach(p => lines.push(` - ${p}`));
    }

    const userEnv = await checkEnvironmentDetailed('User');
    if (userEnv.ok) {
        lines.push('env (user): OK');
    } else {
        lines.push('env (user): WARN');
        userEnv.problems.forEach(p => lines.push(` - ${p}`));
    }

    const proxyExists = await fs.pathExists(PATHS.PROXY_EXE);
    lines.push(`proxy binary: ${proxyExists ? 'OK' : 'FAIL'}`);

    const serviceInstalled = await isServiceInstalled();
    lines.push(`service installed: ${serviceInstalled ? 'yes' : 'no'}`);

    if (serviceInstalled) {
        const serviceCreds = await getEnvVar(ENV_VARS.GOOGLE_CREDS, 'Machine');
        lines.push(`service creds: ${serviceCreds ? 'set' : 'not set'}`);
    }

    try {
        await axios.get('https://api.github.com', { timeout: 5000 });
Copilot AI, Dec 22, 2025:

The GitHub API check uses a hardcoded timeout of 5000ms. This should be extracted as a constant at the top of the file for easier configuration and consistency, for example GITHUB_API_TIMEOUT_MS = 5000.
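One possible shape for that extraction (a sketch under assumptions: the constant and helper names are illustrative, and it uses the built-in fetch with AbortSignal.timeout rather than the project's axios call so the snippet is dependency-free):

```typescript
// Hoisted to module scope so every network check shares one tunable value.
const GITHUB_API_TIMEOUT_MS = 5000;

// Hypothetical helper; the real code calls axios.get inline.
async function checkGithubReachable(): Promise<boolean> {
    try {
        await fetch('https://api.github.com', {
            signal: AbortSignal.timeout(GITHUB_API_TIMEOUT_MS),
        });
        return true;
    } catch {
        return false;
    }
}
```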
        lines.push('github api: OK');
    } catch {
        lines.push('github api: FAIL');
    }

    return `${lines.join('\n')}\n`;
}
Comment on lines +51 to +107

Copilot AI, Dec 22, 2025:

The buildDoctorReport function duplicates logic from the existing doctor command (src/commands/doctor.ts). This creates a maintenance burden, as changes to diagnostic checks would need to be made in two places. Consider extracting the shared diagnostic logic into a reusable function that both the doctor command and the support bundle can use.
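One way such a shared module could be structured (purely a sketch; the type names and check registry are assumptions, not existing code in this repository):

```typescript
// Hypothetical shared module (e.g. src/core/diagnostics.ts): each check returns
// a uniform result, so doctor and the support bundle can share the check list
// and render the same results differently.
interface DiagnosticResult {
    name: string;
    status: 'OK' | 'WARN' | 'FAIL';
    details?: string[];
}

async function runDiagnostics(
    checks: Array<() => Promise<DiagnosticResult>>
): Promise<DiagnosticResult[]> {
    const results: DiagnosticResult[] = [];
    for (const check of checks) {
        try {
            results.push(await check());
        } catch (error) {
            const message = error instanceof Error ? error.message : String(error);
            results.push({ name: 'check', status: 'FAIL', details: [message] });
        }
    }
    return results;
}

// Shared text rendering, usable for console output and for doctor.txt in the bundle.
function renderAsText(results: DiagnosticResult[]): string {
    return results
        .map(r => [`${r.name}: ${r.status}`, ...(r.details ?? []).map(d => ` - ${d}`)].join('\n'))
        .join('\n');
}
```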

export const supportCommand = new Command('support')
    .description('Support utilities');

supportCommand
    .command('bundle')
    .description('Create a support bundle zip with logs, config, doctor, paths, and status')
    .option('--output <path>', 'Output zip path')
    .option('--keep', 'Keep staging directory after bundling')
    .action(async (options) => {
        try {
            const timestamp = formatTimestamp();
            const stagingDir = path.join(PATHS.TEMP, `support-bundle-${timestamp}`);
            const outputPath = options.output
                ? path.resolve(options.output)
                : path.join(PATHS.TEMP, `cloudsqlctl-support-${timestamp}.zip`);

            await fs.ensureDir(stagingDir);
            await fs.ensureDir(path.dirname(outputPath));

            await fs.writeFile(path.join(stagingDir, 'paths.txt'), buildPathsReport());
            await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
            await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());

            if (await fs.pathExists(PATHS.CONFIG_FILE)) {
                await fs.copy(PATHS.CONFIG_FILE, path.join(stagingDir, 'config.json'));
Comment on lines +132 to +133

🚨 suggestion (security): Copying the raw config into the bundle may leak sensitive information.

config.json can contain secrets (tokens, credentials, project IDs, etc.), so copying it as-is into the bundle is risky. Consider redacting known sensitive fields or generating a minimal, non-sensitive config view for the bundle, and make an explicit decision about which fields are safe to include.

Suggested implementation:

```ts
            await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
            await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());

            if (await fs.pathExists(PATHS.CONFIG_FILE)) {
                const SENSITIVE_KEY_PATTERN = /(token|secret|password|key|credential|project[_-]?id|client[_-]?id)/i;

                const redactConfig = (value: unknown): unknown => {
                    if (Array.isArray(value)) {
                        return value.map(redactConfig);
                    }

                    if (value && typeof value === 'object') {
                        const result: Record<string, unknown> = {};
                        for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
                            if (SENSITIVE_KEY_PATTERN.test(k)) {
                                result[k] = '[REDACTED]';
                            } else {
                                result[k] = redactConfig(v);
                            }
                        }
                        return result;
                    }

                    // Primitive values are left as-is unless their key was flagged above.
                    return value;
                };

                try {
                    const rawConfig = await fs.readFile(PATHS.CONFIG_FILE, 'utf8');
                    const parsedConfig = JSON.parse(rawConfig);
                    const redactedConfig = redactConfig(parsedConfig);

                    await fs.writeFile(
                        path.join(stagingDir, 'config-redacted.json'),
                        JSON.stringify(redactedConfig, null, 2),
                        'utf8'
                    );
                } catch (err) {
                    await fs.writeFile(
                        path.join(stagingDir, 'config-error.txt'),
                        `Failed to process config.json: ${(err as Error).message}`,
                        'utf8'
                    );
                }
            } else {
                await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
            }
```

- If there are other places in the codebase or documentation that reference config.json in the support bundle, update them to point to config-redacted.json and/or describe the new redaction behavior.
- You may want to tune SENSITIVE_KEY_PATTERN based on your actual config schema (e.g., adding/removing field names) to better balance usefulness vs. privacy.
            } else {
Comment on lines +132 to +134

Copilot AI, Dec 22, 2025:

The support bundle may contain sensitive information from the config.json file (such as instance connection names, credentials paths, or other configuration details). Consider adding a warning message to the user when creating the bundle, or sanitizing sensitive fields before including the config in the bundle, to prevent accidental exposure of sensitive data.
                await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
            }

            if (await fs.pathExists(PATHS.LOGS)) {
                await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
Comment on lines +138 to +139

Copilot AI, Dec 22, 2025:

When copying the entire logs directory, there is no size limit check. If the logs directory has grown very large over time, support bundle creation could fail or produce an unexpectedly large zip file. Consider checking the logs directory size and either warning the user or only including recent log files (e.g., the last N MB or last N days of logs).

Suggested change (replacing the two lines above):

```ts
            const MAX_LOG_DIR_BYTES = 50 * 1024 * 1024; // 50 MB
            const walkLogFiles = async (dir: string): Promise<Array<{ path: string; size: number; mtimeMs: number }>> => {
                const entries = await fs.readdir(dir);
                const files: Array<{ path: string; size: number; mtimeMs: number }> = [];
                for (const entry of entries) {
                    const fullPath = path.join(dir, entry);
                    const stat = await fs.stat(fullPath);
                    if (stat.isDirectory()) {
                        const subFiles = await walkLogFiles(fullPath);
                        files.push(...subFiles);
                    } else {
                        files.push({
                            path: fullPath,
                            size: stat.size,
                            mtimeMs: stat.mtimeMs
                        });
                    }
                }
                return files;
            };
            if (await fs.pathExists(PATHS.LOGS)) {
                const logFiles = await walkLogFiles(PATHS.LOGS);
                const totalLogBytes = logFiles.reduce((sum, f) => sum + f.size, 0);
                if (totalLogBytes <= MAX_LOG_DIR_BYTES) {
                    await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
                } else {
                    logger.warn(
                        `Logs directory size (${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB) exceeds ` +
                        `the support bundle limit of ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB. ` +
                        `Including only the most recent logs up to the limit.`
                    );
                    const logsDestRoot = path.join(stagingDir, 'logs');
                    await fs.ensureDir(logsDestRoot);
                    // Copy most recent log files first until we reach the size limit.
                    logFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
                    let copiedBytes = 0;
                    for (const file of logFiles) {
                        if (copiedBytes >= MAX_LOG_DIR_BYTES) {
                            break;
                        }
                        const remaining = MAX_LOG_DIR_BYTES - copiedBytes;
                        if (file.size > remaining && copiedBytes > 0) {
                            // Skip files that would substantially exceed the limit once we already have some logs.
                            continue;
                        }
                        const relativePath = path.relative(PATHS.LOGS, file.path);
                        const destPath = path.join(logsDestRoot, relativePath);
                        await fs.ensureDir(path.dirname(destPath));
                        await fs.copy(file.path, destPath);
                        copiedBytes += file.size;
                    }
                    await fs.writeFile(
                        path.join(stagingDir, 'logs-truncated.txt'),
                        `Logs directory was larger than ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB.\n` +
                        `Included only the most recent logs, totaling approximately ` +
                        `${(copiedBytes / (1024 * 1024)).toFixed(1)} MB out of ` +
                        `${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB.\n`
                    );
                }
            }
```
            } else {
                await fs.writeFile(path.join(stagingDir, 'logs-missing.txt'), 'logs directory not found');
            }

            await runPs('& { Compress-Archive -Path $args[0] -DestinationPath $args[1] -Force }', [
                path.join(stagingDir, '*'),
                outputPath
            ]);

            if (!options.keep) {
                await fs.remove(stagingDir);
            }
Comment on lines +144 to +151

Copilot AI, Dec 22, 2025:

If the PowerShell Compress-Archive command fails, the staging directory may not be cleaned up even without the --keep flag, because the cleanup happens after the compression. Consider wrapping the cleanup in a finally block, or using a try-catch around the compression, to ensure the staging directory is cleaned up on error (unless --keep is specified).
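The cleanup-on-failure pattern the comment describes can be sketched like this (the helper name is an assumption, and it uses node:fs/promises rather than fs-extra so the snippet stands alone):

```typescript
import { rm } from 'node:fs/promises';

// Runs the compression step and guarantees staging cleanup on both the
// success and failure paths, unless the caller asked to keep the directory.
async function compressAndCleanup(
    stagingDir: string,
    compress: () => Promise<void>, // e.g. the Compress-Archive invocation
    keep: boolean
): Promise<void> {
    try {
        await compress();
    } finally {
        if (!keep) {
            await rm(stagingDir, { recursive: true, force: true });
        }
    }
}
```

Note that the finally block still lets the compression error propagate to the surrounding catch, so the existing error logging is unaffected.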

            logger.info(`Support bundle created: ${outputPath}`);
        } catch (error) {
            logger.error('Failed to create support bundle', error);
            process.exit(1);
        }
    });
1 change: 1 addition & 0 deletions tools/generate-docs.mjs
@@ -68,6 +68,7 @@ async function generateDocs() {
'setup',
'paths',
'upgrade',
'support',
];

content += `## Commands\n\n`;