Commit cff22c1: Merge pull request #392 from ligangty/main

Update README

2 parents fa6d106 + b91978a

1 file changed: README.md (25 additions, 13 deletions)
… to configure AWS access credentials.

* AWS configurations. The uploader uses AWS boto3 to access the AWS S3 bucket and follows the standard AWS configuration mechanisms. You can use:

  * AWS configuration files: $HOME/.aws/config and $HOME/.aws/credentials. (For the format, see [AWS config format](https://docs.aws.amazon.com/sdkref/latest/guide/file-format.html))
  * [System environment variables](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html)
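
For illustration, a minimal $HOME/.aws/credentials file follows the standard AWS shared-credentials format (the values below are placeholders, not real keys):

```ini
# $HOME/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>
```

A matching $HOME/.aws/config can set options such as `region` under the same profile name. boto3 picks these files up automatically when no explicit credentials are given.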
* Configurations for uploader. We use $HOME/.charon/charon.yaml to hold these configurations. The configuration file uses YAML format and supports the following options:
  * **targets** (required). Defines target S3 buckets for uploads. Each target can specify:
    * `bucket`: S3 bucket name (required)
    * `prefix`: Path prefix inside the bucket (optional)
    * `registry`: NPM registry URL for NPM targets (optional)
    * `domain`: Domain name for the bucket (optional)
  * **ignore_patterns**. Array of regular expressions used to filter files out of the upload. (Example: `[".*^(redhat).*", ".*snapshot.*"]`). Can also be set via the `CHARON_IGNORE_PATTERNS` environment variable (JSON array format).
  * **aws_profile**. Specifies which AWS profile to use for S3 operations (overrides the default boto3 profile selection).
  * **aws_cf_enable**. Boolean flag to enable AWS CloudFront invalidation support.
  * **manifest_bucket**. S3 bucket name for storing upload manifests.
  * **ignore_signature_suffix**. Defines file suffixes to exclude from signing per package type (maven, npm, etc.).
  * **detach_signature_command**. Command template for generating detached signatures.
  * **radas**. Configuration for RADAS (Red Hat Artifact Distribution and Signing) service integration.

See [config/charon.yaml.sample](config/charon.yaml.sample) for a complete example configuration.
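
As a rough illustration only, a charon.yaml built from the options listed above might look like the sketch below. The exact nesting of `targets` is an assumption here; the authoritative layout is the one in [config/charon.yaml.sample](config/charon.yaml.sample), and all values are placeholders.

```yaml
# Illustrative sketch; see config/charon.yaml.sample for the real schema.
targets:
  example-maven-target:
    - bucket: my-maven-bucket              # required
      prefix: ga                           # optional path prefix in the bucket
      domain: maven.repository.example.com # optional
  example-npm-target:
    - bucket: my-npm-bucket
      registry: https://registry.example.com # optional, NPM targets only

ignore_patterns:
  - ".*^(redhat).*"
  - ".*snapshot.*"

aws_profile: default
aws_cf_enable: false
manifest_bucket: my-manifest-bucket
```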

### charon-upload: upload a repo to S3

```bash
usage: charon upload $archive [$archive*] --product/-p ${prod} --version/-v ${ver} [--root_path] [--ignore_patterns] [--debug] [--contain_signature] [--key]
```

This command will upload the repo in the archive to S3.
It will auto-detect whether the archive is for maven or npm.

**New in 1.3.5**: For Maven archives, this command now supports uploading multiple zip files at once. When multiple Maven zips are provided, they will be merged intelligently, including proper handling of archetype catalog files and duplicate artifact detection.

* For maven type, it will:

  * Scan the archive for all paths and collect them all.
  * Check the existence in S3 for all those paths.
  * Filter out the paths in the archive based on:
    * filter_pattern in flags, or
    * filter_pattern in config.json if no flag
  * Generate/refresh all maven-metadata.xml for all GAs, combined with both S3 and local filtered pom.xml
  * Upload these artifacts to S3 with metadata of the product.
  * If the artifacts already exist in S3, update the metadata of the product by appending the new product.
* NPM type (TBH): We need to know the exact archive structure of the npm repo.
* For both types, after uploading the files, regenerate/refresh the index files for these paths.
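
The filtering step above can be sketched as plain regular-expression matching over the collected paths. This is an illustrative helper, not charon's actual API; the function name and patterns are made up for the example.

```python
import re


def filter_paths(paths, ignore_patterns):
    """Drop every path that matches one of the configured regex patterns."""
    compiled = [re.compile(p) for p in ignore_patterns]
    return [p for p in paths if not any(c.search(p) for c in compiled)]


archive_paths = [
    "org/example/app/1.0/app-1.0.pom",
    "org/example/app/1.0-SNAPSHOT/app-1.0-SNAPSHOT.pom",
    "README.md",
]
# Keep only the release pom; snapshots and README.md match the patterns.
kept = filter_paths(archive_paths, [".*-SNAPSHOT.*", "README.*"])
# kept == ["org/example/app/1.0/app-1.0.pom"]
```

The same logic applies whether the patterns come from the command-line flag, the config file, or the `CHARON_IGNORE_PATTERNS` environment variable.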

8193
### charon-delete: delete repo/paths from S3
8294

8395
```bash
84-
usage: charon delete $tarball|$pathfile --product/-p ${prod}
96+
usage: charon delete $archive|$pathfile --product/-p ${prod}
8597
--version/-v ${ver} [--root_path] [--debug]
8698
```

This command will delete some paths from the repo in S3.

* Scan the archive or read the pathfile for the paths to delete
* Combine the product flag by --product and --version
* Filter out the paths in the archive based on:
  * filter_pattern in flags, or
  * filter_pattern in config.json if no flag
* If the artifacts have other products in the metadata, remove the product of this archive from the metadata but do not delete the artifacts themselves.
* During or after the paths' deletion, regenerate the metadata files and index files for both types.
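
The deletion rule described above amounts to reference counting on product metadata: an artifact is physically removed only when no other product still claims it. A minimal sketch, with hypothetical names not taken from charon's code:

```python
def plan_deletion(artifact_products, product_to_remove):
    """Decide what deleting one product means for an artifact.

    Returns (remaining products, whether the artifact file itself
    should be deleted from S3).
    """
    remaining = [p for p in artifact_products if p != product_to_remove]
    return remaining, len(remaining) == 0


# prod-b still references the artifact, so only metadata is updated.
remaining, delete_file = plan_deletion(["prod-a-1.0", "prod-b-2.0"], "prod-a-1.0")
# remaining == ["prod-b-2.0"], delete_file == False
```

Only when `remaining` comes back empty is the artifact itself removed, after which the metadata and index files are regenerated as described.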
