* AWS configurations. The uploader uses the AWS boto3 library to access AWS S3 buckets and follows the standard AWS configuration conventions. You can use:
  * AWS configuration files: $HOME/.aws/config and $HOME/.aws/credentials. (For the format, see [AWS config format](https://docs.aws.amazon.com/sdkref/latest/guide/file-format.html).)
* Configurations for uploader. We use $HOME/.charon/charon.yaml to hold these configurations. The configuration file uses YAML format and supports the following options:
  * **targets** (required). Defines target S3 buckets for uploads. Each target can specify:
    * `bucket`: S3 bucket name (required)
    * `prefix`: Path prefix inside the bucket (optional)
    * `registry`: NPM registry URL for NPM targets (optional)
    * `domain`: Domain name for the bucket (optional)
  * **ignore_patterns**. Array of regular expressions to filter out files from upload. (Example: `[".*^(redhat).*", ".*snapshot.*"]`.) Can also be set via the `CHARON_IGNORE_PATTERNS` environment variable (JSON array format).
  * **aws_profile**. Specifies which AWS profile to use for S3 operations (overrides the default boto3 profile selection).
  * **aws_cf_enable**. Boolean flag to enable AWS CloudFront invalidation support.
  * **manifest_bucket**. S3 bucket name for storing upload manifests.
  * **ignore_signature_suffix**. Defines file suffixes to exclude from signing per package type (maven, npm, etc.).
  * **detach_signature_command**. Command template for generating detached signatures.
  * **radas**. Configuration for RADAS (Red Hat Artifact Distribution and Signing) service integration.

See [config/charon.yaml.sample](config/charon.yaml.sample) for a complete example configuration.
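As a rough sketch, a $HOME/.charon/charon.yaml built from the options above could look like the following. All target names, bucket names, and domains here are made up for illustration, and the exact nesting under `targets` may differ; treat [config/charon.yaml.sample](config/charon.yaml.sample) as the authoritative reference:

```yaml
# Illustrative sketch only; bucket names, target names, and domains are
# invented. Consult config/charon.yaml.sample for the authoritative layout.
targets:
  example-maven-target:                 # hypothetical target name
    - bucket: example-maven-bucket      # required: S3 bucket name
      prefix: ga                        # optional: path prefix inside the bucket
      domain: maven.example.com         # optional: domain name for the bucket
  example-npm-target:
    - bucket: example-npm-bucket
      registry: npm.example.com         # optional: NPM registry URL, NPM targets only

ignore_patterns:                        # regular expressions; matching files are not uploaded
  - ".*^(redhat).*"
  - ".*snapshot.*"

aws_profile: example-profile            # overrides default boto3 profile selection
aws_cf_enable: false                    # AWS CloudFront invalidation support
manifest_bucket: example-manifest-bucket
```

The `ignore_patterns` list can alternatively be supplied through the `CHARON_IGNORE_PATTERNS` environment variable as a JSON array.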
This command will upload the repo archive to S3.
It will auto-detect whether the archive is for maven or npm.
**New in 1.3.5**: For Maven archives, this command now supports uploading multiple zip files at once. When multiple Maven zips are provided, they will be merged intelligently, including proper handling of archetype catalog files and duplicate artifact detection.
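The multi-zip merge described above can be illustrated with a toy sketch of the duplicate-detection part: collect the file entries of several zip archives and flag any path that appears in more than one. This is not charon's actual merge logic; the function name and behavior are illustrative only.

```python
import zipfile
from collections import Counter

def merge_zip_entries(zip_paths):
    """Collect file entries from several zip archives, flagging duplicates.

    Hypothetical helper: returns the merged, sorted list of file paths and
    the sorted list of paths that occur in more than one archive.
    """
    counts = Counter()
    for zp in zip_paths:
        with zipfile.ZipFile(zp) as zf:
            # Directory entries end with "/" and carry no content; skip them.
            counts.update(n for n in zf.namelist() if not n.endswith("/"))
    merged = sorted(counts)
    duplicates = sorted(p for p, c in counts.items() if c > 1)
    return merged, duplicates
```

A real merge would also have to reconcile the contents of duplicate artifacts and combine archetype catalog files, not just detect the clashes.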
* For maven type, it will:
  * Scan the archive for all paths and collect them all.
  * Check the existence in S3 for all those paths.
  * Filter out the paths in the archive based on:
    * filter_pattern in flags, or
    * filter_pattern in config.json if no flag
  * Generate/refresh all maven-metadata.xml for all GA combined with both S3 and local filtered pom.xml
  * Upload these artifacts to S3 with metadata of the product. If the artifacts already exist in S3, update the metadata of the product by appending the new product.
* NPM type (TBH): We need to know the exact archive structure of the npm repo
* For both types, after uploading the files, regenerate/refresh