Commit 0e1aba0

Merge pull request #55 from steamcmd/rewrite
Rewrite to multi-service
2 parents ed6779a + 2a2312d commit 0e1aba0

22 files changed: 1133 additions & 382 deletions

.dockerignore

Lines changed: 5 additions & 1 deletion

```diff
@@ -1,5 +1,9 @@
 *.pyc
 __pycache__
 .venv
-.deta
 .env
+
+_test.py
+celerybeat-schedule.db
+celerybeat-schedule-*
+celerybeat-schedule
```

.gitignore

Lines changed: 8 additions & 1 deletion

```diff
@@ -1,5 +1,12 @@
+.DS_Store
+.ruff_cache
+
 *.pyc
 __pycache__
 .venv
-.deta
 .env
+
+_test.py
+celerybeat-schedule.db
+celerybeat-schedule-*
+celerybeat-schedule
```

Dockerfile

Lines changed: 1 addition & 1 deletion

```diff
@@ -29,4 +29,4 @@ COPY --chown=$USER:$USER src/ $HOME/
 ##################### INSTALLATION END #####################
 
 # Set default container command
-CMD exec gunicorn main:app --max-requests 3000 --max-requests-jitter 150 --workers $WORKERS --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:$PORT
+CMD exec gunicorn web:app --max-requests 3000 --max-requests-jitter 150 --workers $WORKERS --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:$PORT
```
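The only change here is the app spec: gunicorn is now pointed at `web:app` instead of `main:app`, matching the renamed entry module. Gunicorn resolves a `module:attribute` spec by importing the module and looking up the attribute; a minimal stdlib sketch of that resolution (`load_app` is purely illustrative, not gunicorn's actual loader, which handles many more cases):

```python
# Minimal sketch of how a "module:attribute" app spec like "web:app" is
# resolved. `load_app` is a hypothetical helper for illustration only.
import importlib
import sys
import types

def load_app(spec: str):
    module_name, _, attr = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# Demo with a synthetic module standing in for src/web.py:
fake_web = types.ModuleType("web")
fake_web.app = object()  # stands in for the FastAPI instance
sys.modules["web"] = fake_web

assert load_app("web:app") is fake_web.app
```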

README.md

Lines changed: 29 additions & 65 deletions

````diff
@@ -11,57 +11,13 @@
 
 # SteamCMD API
 
-Read-only API interface for steamcmd app_info. Updates of this code are
-automatically deployed via [Github Actions](https://github.com/steamcmd/api/actions)
-when a new version has been created on Github.
+Read-only API interface for steamcmd app_info. The official API is reachable on
+[api.steamcmd.net](https://api.steamcmd.net) and it's documentation can be found
+on [www.steamcmd.net](https://www.steamcmd.net).
 
 ## Self-hosting
 
-The easiest way to host the API yourself is using the free cloud platform
-[Fly.io](https://fly.io). Install the CLI according to the documentation:
-[https://fly.io/docs/hands-on/install-flyctl/](https://fly.io/docs/hands-on/install-flyctl/).
-
-After installing, authenticate locally with the `flyctl` cli:
-```bash
-fly auth login
-```
-Create the app and redis instances (choose your own names):
-```bash
-fly apps create <app-name>
-fly redis create <redis-name>
-```
-Retrieve the Redis connection URL (you will need this later):
-```bash
-fly redis status <redis-name>
-
-Redis
-  ID             = xxxxxxxxxxxxxxxxxx
-  Name           = api
-  Plan           = Free
-  Primary Region = ams
-  Read Regions   = None
-  Eviction       = Enabled
-  Private URL    = redis://default:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx@fly-api.upstash.io <== Write the password down
-```
-Set the required configuration environment variables:
-```bash
-fly secrets set --app <app-name> \
-  CACHE=True \
-  CACHE_TYPE=redis \
-  CACHE_EXPIRATION=120 \
-  REDIS_HOST="fly-api.upstash.io" \
-  REDIS_PORT=6379 \
-  REDIS_PASSWORD="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
-```
-Finally deploy the API Docker image with the latest code:
-```bash
-fly deploy --app <app-name> --image steamcmd/api:latest -e VERSION=1.0.0
-```
-The version is optional and currently only required for the `/v1/version` endpoint.
-
-## Container
-
-The API can easily be run via a Docker image which contains the API code and the
+The API can easily be run via a container image which contains the API code and the
 `uvicorn` tool to be able to respond to web requests. With every new version of
 the API the Docker images is automatically rebuild and pushed to the Docker Hub:
 ```bash
@@ -73,8 +29,16 @@ docker pull steamcmd/api:1.10.0
 ```bash
 docker run -p 8000:8000 -d steamcmd/api:latest
 ```
-However during development, using Docker Compose is preferred. See the
-[Development](#development) section for information.
+The API consists of 2 services; the **Web** and the **Job** service and the Redis
+cache. The **Job** service and the Redis cache are both optional but are both required
+if you want to run the **Job** service.
+
+Details on how the official API is hosted can be found in the
+[platform](https://github.com/steamcmd/platform) repository. This repository contains
+all the infrastructure as code that is used to deploy the API on a Kubernetes cluster.
+
+See the [Development](#development) section for more information on running
+the API and Job services directly via Python.
 
 ## Configuration
 
@@ -89,7 +53,7 @@ that you will need to set the corresponding cache settings for that type as well
 when using the **redis** type).
 
 All the available options in an `.env` file:
-```
+```shell
 # general
 VERSION=1.0.0
 
@@ -109,34 +73,34 @@ REDIS_URL="redis://YourUsername:YourRedisP@ssword!@your.redis.host.example.com:6
 
 # logging
 LOG_LEVEL=info
-
-# deta
-DETA_BASE_NAME="steamcmd"
-DETA_PROJECT_KEY="YourDet@ProjectKey!"
 ```
 
 ## Development
 
-Run the api locally by installing a web server like uvicorn and running it:
+To develop locally start by creating a Python virtual environment and install the prerequisites:
 ```bash
 python3 -m venv .venv
 source .venv/bin/activate
 pip install -r requirements.txt
-pip install uvicorn
-cd src/
-uvicorn main:app --reload
 ```
 
-The easiest way to spin up a complete development environment is using Docker
-compose. This will build the image locally, mount the correct directory (`src`)
-and set the required environment variables. If you are on windows you should
-store the repository in the WSL filesystem or it will fail. Execute compose up
-in the root:
+Run the Web Service (FastAPI) locally by running the FastAPI development server:
 ```bash
-docker compose up
+source .venv/bin/activate
+cd src/
+fastapi dev web.py
 ```
 Now you can reach the SteamCMD API locally on [http://localhost:8000](http://localhost:8000).
 
+Run the Job Service (Celery) locally by running celery directly:
+```bash
+python3 -m venv .venv
+source .venv/bin/activate
+pip install -r requirements.txt
+cd src/
+celery -A job worker --loglevel=info --concurrency=2 --beat
+```
+
 ### Black
 
 To keep things simple, [Black](https://github.com/python/black) is used for code
````
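The updated README exposes the Web service on port 8000 via `docker run`. As a usage sketch, app info can then be fetched over plain HTTP; the `/v1/info/<app_id>` path matches the public API's documented endpoint, but the helper name and base URL default are assumptions:

```python
# Hypothetical client sketch for a locally running SteamCMD API instance
# (started e.g. with: docker run -p 8000:8000 -d steamcmd/api:latest).
import json
import urllib.request

def get_app_info(app_id: int, base_url: str = "http://localhost:8000") -> dict:
    """Fetch app_info for one Steam app id from a SteamCMD API instance."""
    with urllib.request.urlopen(f"{base_url}/v1/info/{app_id}") as resp:
        return json.loads(resp.read().decode())
```

The response body is whatever JSON the deployed version returns; treat its exact shape as deployment-specific.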

docker-compose.yml

Lines changed: 2 additions & 2 deletions

```diff
@@ -1,7 +1,7 @@
 services:
   web:
     build: .
-    command: "gunicorn main:app --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000 --reload"
+    command: "gunicorn web:app --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000 --reload"
     ports:
       - "8000:8000"
     volumes:
@@ -10,7 +10,7 @@ services:
       PORT: 8000
       WORKERS: 4
       VERSION: 9.9.9
-      CACHE: True
+      CACHE: "True"
       CACHE_TYPE: redis
       CACHE_EXPIRATION: 120
       REDIS_HOST: redis
```
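The quoting change (`CACHE: True` to `CACHE: "True"`) matters because unquoted `True` is parsed as a YAML boolean, while the application only ever sees environment variables, which are always strings; `src/config.py` in this commit validates `CACHE` against the string choices `"True"`/`"False"`. A tiny illustration of that string-only contract:

```python
# Environment variables always reach the process as strings, which is why
# docker-compose.yml quotes CACHE: "True" and config.py compares it against
# the string choices ["True", "False"].
import os

os.environ["CACHE"] = "True"   # what the container sees either way
raw = os.environ["CACHE"]
assert isinstance(raw, str)    # never a bool

cache_enabled = raw == "True"  # an explicit string comparison is required
assert cache_enabled
```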

requirements.txt

Lines changed: 12 additions & 5 deletions

```diff
@@ -1,10 +1,17 @@
-fastapi
-redis
-deta
-
+## general
 semver
-python-dotenv
 logfmter
 
+## web
+fastapi[standard]
+redis
+minio
+
+## steam
 steam[client]
 gevent
+
+## job
+celery
+celery-singleton
+flower
```

src/config.py

Lines changed: 69 additions & 0 deletions

```diff
@@ -0,0 +1,69 @@
+import utils.general
+import utils.helper
+import logging
+from dotenv import load_dotenv
+from logfmter import Logfmter
+
+# fmt: off
+
+# Load values from .env file
+load_dotenv()
+
+# Set variables based on environment
+cache = utils.helper.read_env("CACHE", "False", choices=[ "True", "False" ])
+cache_type = utils.helper.read_env("CACHE_TYPE", "redis", choices=[ "redis" ])
+cache_expiration = utils.helper.read_env("CACHE_EXPIRATION", "120")
+
+redis_url = utils.helper.read_env("REDIS_URL")
+redis_host = utils.helper.read_env("REDIS_HOST", "localhost")
+redis_port = utils.helper.read_env("REDIS_PORT", "6379")
+redis_password = utils.helper.read_env("REDIS_PASSWORD")
+redis_database = utils.helper.read_env("REDIS_DATABASE", "0")
+
+storage_type = utils.helper.read_env("STORAGE_TYPE", "local", choices=[ "local", "object" ])
+storage_directory = utils.helper.read_env("STORAGE_DIRECTORY", "data/", dependency={ "STORAGE_TYPE": "local" })
+storage_object_endpoint = utils.helper.read_env("STORAGE_OBJECT_ENDPOINT", dependency={ "STORAGE_TYPE": "object" })
+storage_object_access_key = utils.helper.read_env("STORAGE_OBJECT_ACCESS_KEY", dependency={ "STORAGE_TYPE": "object" })
+storage_object_secret_key = utils.helper.read_env("STORAGE_OBJECT_SECRET_KEY", dependency={ "STORAGE_TYPE": "object" })
+storage_object_bucket = utils.helper.read_env("STORAGE_OBJECT_BUCKET", dependency={ "STORAGE_TYPE": "object" })
+storage_object_secure = utils.helper.read_env("STORAGE_OBJECT_SECURE", True)
+storage_object_region = utils.helper.read_env("STORAGE_OBJECT_REGION", False)
+
+log_level = utils.helper.read_env("LOG_LEVEL", "info", choices=[ "debug", "info", "warning", "error", "critical" ])
+version = utils.helper.read_env("VERSION", "9.9.9")
+
+# Set general settings
+chunk_size = 10
+
+# Logging configuration
+formatter = Logfmter(keys=["level"], mapping={"level": "levelname"})
+handler = logging.StreamHandler()
+handler.setFormatter(formatter)
+logging.basicConfig(handlers=[handler], level=utils.general.log_level(log_level))
+
+# Set Celery configuration
+timezone = "UTC"
+broker_url = redis_url
+broker_connection_retry_on_startup = True
+beat_schedule = {
+    "check-changelist-every-5-seconds": {
+        "task": "check_changelist",
+        "schedule": 5.0
+    },
+    #"check-missing-apps-every-30-minutes": {
+    #    "task": "check_missing_apps",
+    #    "schedule": 1800.0,
+    #},
+    "check-incorrect-apps-every-30-minutes": {
+        "task": "check_incorrect_apps",
+        "schedule": 1800.0,
+    },
+    "check-deadlocks-every-1-hour": {
+        "task": "check_deadlocks",
+        "schedule": 3600.0,
+    },
+}
+worker_concurrency = 4
+
+# Dynamically import all tasks files
+imports = utils.helper.list_tasks()
```
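The new `config.py` leans on `utils.helper.read_env` for defaults, allowed `choices`, and `dependency`-gated settings, but that helper is not part of this diff. The following is a hypothetical stdlib reconstruction of the behaviour its call sites imply; the project's actual implementation may well differ:

```python
# Hypothetical reconstruction of utils.helper.read_env, inferred only from
# how config.py calls it -- NOT the project's real implementation.
import os

def read_env(name, default=None, choices=None, dependency=None):
    """Read an env var with an optional default, allowed choices, and a
    gating dependency such as {"STORAGE_TYPE": "object"}."""
    if dependency:
        # When the gating variable does not match, the setting is simply
        # not applicable, so skip choice validation entirely.
        for dep_name, dep_value in dependency.items():
            if os.environ.get(dep_name, "") != dep_value:
                return os.environ.get(name, default)
    value = os.environ.get(name, default)
    if choices is not None and value not in choices:
        raise ValueError(f"{name} must be one of {choices}, got {value!r}")
    return value

# Mirrors a call site from config.py above:
os.environ["CACHE"] = "True"
assert read_env("CACHE", "False", choices=["True", "False"]) == "True"
```

Note that `config.py` doubles as the Celery configuration module (`timezone`, `broker_url`, `beat_schedule`, `worker_concurrency` are all standard Celery setting names), so one file drives both the Web and Job services.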
