81 commits
2da4907
Execution backend - revamp
harini-venkataraman Feb 19, 2026
41eeef8
async flow
harini-venkataraman Feb 19, 2026
f66dfb2
Streaming progress to FE
harini-venkataraman Feb 24, 2026
95c6592
Removing multi hop in Prompt studio ide and structure tool
harini-venkataraman Feb 25, 2026
d8cc6cc
Merge origin/main into feat/execution-backend
Deepak-Kesavan Feb 28, 2026
44a2b3f
Merge remote-tracking branch 'origin/main' into feat/execution-backend
Deepak-Kesavan Mar 2, 2026
2f4f2dc
UN-3234 [FIX] Add beta tag to agentic prompt studio navigation item
Deepak-Kesavan Mar 2, 2026
d041201
Added executors for agentic prompt studio
harini-venkataraman Mar 2, 2026
0a0cfb1
Merge branch 'main' of github.com:Zipstack/unstract into feat/executi…
harini-venkataraman Mar 2, 2026
a4e1fd7
Merge branch 'main' of github.com:Zipstack/unstract into feat/executi…
harini-venkataraman Mar 2, 2026
ae77d6a
Added executors for agentic prompt studio
harini-venkataraman Mar 2, 2026
5c22956
Added executors for agentic prompt studio
harini-venkataraman Mar 2, 2026
3cc3213
Removed redundant envs
harini-venkataraman Mar 2, 2026
d0532f8
Removed redundant envs
harini-venkataraman Mar 2, 2026
6173df5
Removed redundant envs
harini-venkataraman Mar 3, 2026
bbe6f58
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 3, 2026
a3dc912
Removed redundant envs
harini-venkataraman Mar 3, 2026
98c8071
Merge branch 'main' of github.com:Zipstack/unstract into feat/executi…
harini-venkataraman Mar 3, 2026
21157ac
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 3, 2026
0216b59
Removed redundant envs
harini-venkataraman Mar 3, 2026
db81b9d
Removed redundant envs
harini-venkataraman Mar 3, 2026
e1da202
Removed redundant envs
harini-venkataraman Mar 3, 2026
d119797
Removed redundant envs
harini-venkataraman Mar 3, 2026
fbadbf8
Removed redundant envs
harini-venkataraman Mar 3, 2026
882296e
Removed redundant envs
harini-venkataraman Mar 4, 2026
6d3bbbf
Removed redundant envs
harini-venkataraman Mar 4, 2026
292460b
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 4, 2026
f35c0e6
Removed redundant envs
harini-venkataraman Mar 4, 2026
9bcb458
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 4, 2026
0cbd10a
adding worker for callbacks
harini-venkataraman Mar 4, 2026
2b1ab1e
adding worker for callbacks
harini-venkataraman Mar 5, 2026
4122f08
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 5, 2026
1ceb352
adding worker for callbacks
harini-venkataraman Mar 5, 2026
d69304d
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 5, 2026
7c1266b
adding worker for callbacks
harini-venkataraman Mar 5, 2026
0b84d9e
adding worker for callbacks
harini-venkataraman Mar 5, 2026
5b0629d
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 5, 2026
98ee4b9
Pluggable apps and plugins to fit the new async prompt execution arch…
harini-venkataraman Mar 6, 2026
2dffcef
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 6, 2026
3b35fb2
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 6, 2026
1ab6031
Pluggable apps and plugins to fit the new async prompt execution arch…
harini-venkataraman Mar 6, 2026
15c3daf
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 6, 2026
7ae1a74
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 6, 2026
fbf9c29
Pluggable apps and plugins to fit the new async prompt execution arch…
harini-venkataraman Mar 9, 2026
ec2f762
Merge branch 'feat/execution-backend' of github.com:Zipstack/unstract…
harini-venkataraman Mar 9, 2026
d6a3c5e
adding worker for callbacks
harini-venkataraman Mar 9, 2026
5c23ab0
adding worker for callbacks
harini-venkataraman Mar 9, 2026
525024f
adding worker for callbacks
harini-venkataraman Mar 9, 2026
a8cbce1
adding worker for callbacks
harini-venkataraman Mar 9, 2026
549f17a
adding worker for callbacks
harini-venkataraman Mar 9, 2026
f9b86a9
adding worker for callbacks
harini-venkataraman Mar 10, 2026
5369e5a
adding worker for callbacks
harini-venkataraman Mar 10, 2026
b5205ff
adding worker for callbacks
harini-venkataraman Mar 10, 2026
9659661
fix: write output files in agentic extraction pipeline
harini-venkataraman Mar 11, 2026
67eef62
UN-3266 fix: replace hardcoded /tmp paths with secure temp dirs in te…
harini-venkataraman Mar 11, 2026
3f4cc7d
Merge branch 'main' into feat/async-prompt-service-v2
harini-venkataraman Mar 11, 2026
a563a35
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 11, 2026
9b422da
Update docs
harini-venkataraman Mar 11, 2026
6a6e8e9
Merge branch 'feat/async-prompt-service-v2' of github.com:Zipstack/un…
harini-venkataraman Mar 11, 2026
817fc1c
UN-3266 fix: remove dead code with undefined names in fetch_response
harini-venkataraman Mar 11, 2026
d9bc50f
Un 3266 fix security hotspot tmp paths (#1851)
harini-venkataraman Mar 11, 2026
b715f64
UN-3266 fix: resolve SonarCloud bugs S2259 and S1244 in PR #1849
harini-venkataraman Mar 11, 2026
e9c23b2
UN-3266 fix: resolve SonarCloud code smells in PR #1849
harini-venkataraman Mar 11, 2026
f59755a
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 11, 2026
4bf9736
UN-3266 fix: wrap long log message in dispatcher.py to fix E501
harini-venkataraman Mar 11, 2026
0531870
UN-3266 fix: resolve remaining SonarCloud S117 naming violations
harini-venkataraman Mar 11, 2026
a2edb23
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 11, 2026
3f86131
UN-3266 fix: resolve remaining SonarCloud code smells in PR #1849
harini-venkataraman Mar 11, 2026
45e61c4
UN-3266 fix: resolve SonarCloud cognitive complexity and code smell v…
harini-venkataraman Mar 11, 2026
6391c6c
UN-3266 fix: remove unused RetrievalStrategy import from _handle_answ…
harini-venkataraman Mar 11, 2026
0af0484
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 11, 2026
807e405
UN-3266 fix: rename UsageHelper params to lowercase (N803)
harini-venkataraman Mar 11, 2026
9bdb3f5
UN-3266 fix: resolve remaining SonarCloud issues from check run 66691…
harini-venkataraman Mar 11, 2026
18eafe9
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 11, 2026
7a01a35
UN-3266 fix: remove unused locals in _handle_answer_prompt (F841)
harini-venkataraman Mar 11, 2026
3e5ce31
Merge branch 'main' into feat/async-prompt-service-v2
harini-venkataraman Mar 12, 2026
e3ca0c6
fix: resolve Biome linting errors in frontend source files
harini-venkataraman Mar 12, 2026
db3d8c2
fix: replace dynamic import of SharePermission with static import in …
harini-venkataraman Mar 12, 2026
a62a9fd
Merge branch 'main' into feat/async-prompt-service-v2
harini-venkataraman Mar 12, 2026
b3a90af
fix: resolve SonarCloud warnings in frontend components
harini-venkataraman Mar 12, 2026
4200ac1
Merge branch 'main' into feat/async-prompt-service-v2
ritwik-g Mar 12, 2026
29 changes: 6 additions & 23 deletions backend/api_v2/api_deployment_views.py
@@ -3,7 +3,6 @@
import uuid
from typing import Any

from configuration.models import Configuration
from django.db.models import F, OuterRef, QuerySet, Subquery
from django.http import HttpResponse
from permissions.permission import IsOwner, IsOwnerOrSharedUserOrSharedToOrg
@@ -211,31 +210,15 @@ def get(
status=status.HTTP_422_UNPROCESSABLE_ENTITY,
)

# Process completed execution
response_status = status.HTTP_422_UNPROCESSABLE_ENTITY
if execution_status_value == CeleryTaskState.COMPLETED.value:
response_status = status.HTTP_200_OK
# Check if highlight data should be removed using configuration registry
api_deployment = deployment_execution_dto.api
organization = api_deployment.organization if api_deployment else None
enable_highlight = False # Safe default if the key is unavailable (e.g., OSS)
# Check if the configuration key exists (Cloud deployment) or use settings (OSS)
from configuration.config_registry import ConfigurationRegistry

if ConfigurationRegistry.is_config_key_available(
"ENABLE_HIGHLIGHT_API_DEPLOYMENT"
):
enable_highlight = Configuration.get_value_by_organization(
config_key="ENABLE_HIGHLIGHT_API_DEPLOYMENT",
organization=organization,
)
if not enable_highlight:
response.remove_result_metadata_keys(["highlight_data"])
response.remove_result_metadata_keys(["extracted_text"])
if not include_metadata:
response.remove_result_metadata_keys()
if not include_metrics:
response.remove_result_metrics()
DeploymentHelper.process_completed_execution(
response=response,
deployment_execution_dto=deployment_execution_dto,
include_metadata=include_metadata,
include_metrics=include_metrics,
)
return Response(
data={
"status": response.execution_status,
185 changes: 182 additions & 3 deletions backend/api_v2/deployment_helper.py
@@ -258,8 +258,11 @@ def execute_workflow(
result.status_api = DeploymentHelper.construct_status_endpoint(
api_endpoint=api.api_endpoint, execution_id=execution_id
)
# Check if highlight data should be removed using configuration registry
# Ensure workflow identification keys are always in item metadata
organization = api.organization if api else None
org_id = str(organization.organization_id) if organization else ""
cls._enrich_result_with_workflow_metadata(result, organization_id=org_id)
# Check if highlight data should be removed using configuration registry
enable_highlight = False # Safe default if the key is unavailable (e.g., OSS)
from configuration.config_registry import ConfigurationRegistry

@@ -273,8 +276,10 @@
if not enable_highlight:
result.remove_result_metadata_keys(["highlight_data"])
result.remove_result_metadata_keys(["extracted_text"])
if not include_metadata:
result.remove_result_metadata_keys()
if include_metadata or include_metrics:
cls._enrich_result_with_usage_metadata(result)
if not include_metadata and not include_metrics:
result.remove_inner_result_metadata()
if not include_metrics:
result.remove_result_metrics()
except Exception as error:
@@ -293,6 +298,144 @@ def execute_workflow(
)
return APIExecutionResponseSerializer(result).data

@staticmethod
def _enrich_item_inner_metadata(
item: dict, file_exec_id: str, usage_helper: Any
) -> None:
"""Inject per-model usage breakdown into item['result']['metadata']."""
inner_result = item.get("result")
if not isinstance(inner_result, dict):
return
metadata = inner_result.get("metadata")
if not isinstance(metadata, dict):
return
usage_by_model = usage_helper.get_usage_by_model(file_exec_id)
if usage_by_model:
metadata.update(usage_by_model)

@staticmethod
def _enrich_item_top_metadata(
item: dict, file_exec_id: str, usage_helper: Any
) -> None:
"""Inject aggregated usage totals into item['metadata']['usage']."""
item_metadata = item.get("metadata")
if not isinstance(item_metadata, dict):
return
aggregated = usage_helper.get_aggregated_token_count(file_exec_id)
if aggregated:
aggregated["file_execution_id"] = file_exec_id
item_metadata["usage"] = aggregated

@staticmethod
def _enrich_result_with_usage_metadata(result: ExecutionResponse) -> None:
"""Enrich each file result's metadata with usage data.

For each file_execution_id:
1. Injects per-model cost arrays (extraction_llm, challenge_llm,
embedding) into item["result"]["metadata"].
2. Injects aggregated usage totals into item["metadata"]["usage"],
matching the legacy response format.
"""
if not isinstance(result.result, list):
return

from usage_v2.helper import UsageHelper

for item in result.result:
if not isinstance(item, dict):
continue
file_exec_id = item.get("file_execution_id")
if not file_exec_id:
continue
DeploymentHelper._enrich_item_inner_metadata(item, file_exec_id, UsageHelper)
DeploymentHelper._enrich_item_top_metadata(item, file_exec_id, UsageHelper)

@staticmethod
def _enrich_item_workflow_metadata(
item: dict,
file_exec_id: str,
fe_lookup: dict,
workflow_execution: Any,
organization_id: str,
tag_names: list[str],
) -> None:
"""Populate workflow identification keys into item['metadata']."""
if not isinstance(item.get("metadata"), dict):
item["metadata"] = {}
metadata = item["metadata"]
fe = fe_lookup.get(str(file_exec_id))
we = fe.workflow_execution if fe else workflow_execution
if fe:
metadata.setdefault("source_name", fe.file_name)
metadata.setdefault("source_hash", fe.file_hash or "")
metadata.setdefault("file_execution_id", str(fe.id))
metadata.setdefault("total_elapsed_time", fe.execution_time)
if we:
metadata.setdefault("workflow_id", str(we.workflow_id))
metadata.setdefault("execution_id", str(we.id))
metadata.setdefault(
"workflow_start_time",
we.created_at.timestamp() if we.created_at else None,
)
metadata.setdefault("organization_id", organization_id)
metadata.setdefault("tags", tag_names)

@staticmethod
def _enrich_result_with_workflow_metadata(
result: ExecutionResponse,
organization_id: str,
) -> None:
"""Ensure workflow identification keys are always present in item metadata.

Uses setdefault() — fills in MISSING keys only, never overwrites
values already present from the workers cache.
"""
if not isinstance(result.result, list):
return

from workflow_manager.file_execution.models import WorkflowFileExecution

# 1. Collect file_execution_ids
file_exec_ids = [
item.get("file_execution_id")
for item in result.result
if isinstance(item, dict) and item.get("file_execution_id")
]
if not file_exec_ids:
return

# 2. Batch query (single JOIN query for all file executions)
fe_lookup = {
str(fe.id): fe
for fe in WorkflowFileExecution.objects.filter(
id__in=file_exec_ids
).select_related("workflow_execution")
}

# 3. Get execution-level data (tags) — one M2M query
workflow_execution = None
tag_names: list[str] = []
if fe_lookup:
first_fe = next(iter(fe_lookup.values()))
workflow_execution = first_fe.workflow_execution
tag_names = list(workflow_execution.tags.values_list("name", flat=True))

# 4. Enrich each item
for item in result.result:
if not isinstance(item, dict):
continue
file_exec_id = item.get("file_execution_id")
if not file_exec_id:
continue
DeploymentHelper._enrich_item_workflow_metadata(
item=item,
file_exec_id=file_exec_id,
fe_lookup=fe_lookup,
workflow_execution=workflow_execution,
organization_id=organization_id,
tag_names=tag_names,
)

@staticmethod
def get_execution_status(execution_id: str) -> ExecutionResponse:
"""Current status of api execution.
@@ -308,6 +451,42 @@ def get_execution_status(execution_id: str) -> ExecutionResponse:
)
return execution_response

@staticmethod
def process_completed_execution(
response: ExecutionResponse,
deployment_execution_dto: Any,
include_metadata: bool,
include_metrics: bool,
) -> None:
"""Enrich and clean up the response for a completed execution."""
from configuration.config_registry import ConfigurationRegistry

api_deployment = deployment_execution_dto.api
organization = api_deployment.organization if api_deployment else None
org_id = str(organization.organization_id) if organization else ""
DeploymentHelper._enrich_result_with_workflow_metadata(
response, organization_id=org_id
)
enable_highlight = False
if ConfigurationRegistry.is_config_key_available(
"ENABLE_HIGHLIGHT_API_DEPLOYMENT"
):
from configuration.models import Configuration

enable_highlight = Configuration.get_value_by_organization(
config_key="ENABLE_HIGHLIGHT_API_DEPLOYMENT",
organization=organization,
)
if not enable_highlight:
response.remove_result_metadata_keys(["highlight_data"])
response.remove_result_metadata_keys(["extracted_text"])
if include_metadata or include_metrics:
DeploymentHelper._enrich_result_with_usage_metadata(response)
if not include_metadata and not include_metrics:
response.remove_inner_result_metadata()
if not include_metrics:
response.remove_result_metrics()

@staticmethod
def fetch_presigned_file(url: str) -> InMemoryUploadedFile:
"""Fetch a file from a presigned URL and convert it to an uploaded file.
105 changes: 105 additions & 0 deletions backend/backend/worker_celery.py
@@ -0,0 +1,105 @@
"""Lightweight Celery app for dispatching tasks to worker-v2 workers.

The Django backend already has a Celery app for internal tasks (beat,
periodic tasks, etc.) whose broker URL is set via CELERY_BROKER_URL.
Workers use the same broker. This module provides a second Celery app
instance that reuses the same broker URL (from Django settings) but
bypasses Celery's env-var-takes-priority behaviour so it can coexist
with the main Django Celery app in the same process.

Problem: Celery reads the ``CELERY_BROKER_URL`` environment variable
with highest priority — overriding constructor args, ``conf.update()``,
and ``config_from_object()``.

Solution: Subclass Celery and override ``connection_for_write`` /
``connection_for_read`` so they always use our explicit broker URL,
bypassing the config resolution chain entirely.
"""

import logging
from urllib.parse import quote_plus

from celery import Celery
from django.conf import settings
from kombu import Queue

logger = logging.getLogger(__name__)

_worker_app: Celery | None = None


class _WorkerDispatchCelery(Celery):
"""Celery subclass that forces an explicit broker URL.

Works around Celery's env-var-takes-priority behaviour where
``CELERY_BROKER_URL`` always overrides per-app configuration.
The connection methods are the actual points where Celery opens
AMQP/Redis connections, so overriding them is both sufficient
and safe.
"""

_explicit_broker: str | None = None

def connection_for_write(self, url=None, *args, **kwargs):
return super().connection_for_write(url or self._explicit_broker, *args, **kwargs)

def connection_for_read(self, url=None, *args, **kwargs):
return super().connection_for_read(url or self._explicit_broker, *args, **kwargs)


def get_worker_celery_app() -> Celery:
"""Get or create a Celery app for dispatching to worker-v2 workers.

The app uses:
- Same broker as the workers (built from CELERY_BROKER_BASE_URL,
CELERY_BROKER_USER, CELERY_BROKER_PASS via Django settings)
- Same PostgreSQL result backend as the Django Celery app

Returns:
Celery app configured for worker-v2 dispatch.
"""
global _worker_app
if _worker_app is not None:
return _worker_app

# Reuse the broker URL already built by Django settings (base.py)
# from CELERY_BROKER_BASE_URL + CELERY_BROKER_USER + CELERY_BROKER_PASS
broker_url = settings.CELERY_BROKER_URL

# Reuse the same PostgreSQL result backend as Django's Celery app
result_backend = (
f"db+postgresql://{settings.DB_USER}:"
f"{quote_plus(settings.DB_PASSWORD)}"
f"@{settings.DB_HOST}:{settings.DB_PORT}/"
f"{settings.CELERY_BACKEND_DB_NAME}"
Comment on lines +70 to +74 (Contributor)

settings.DB_USER is not URL-encoded in the result backend URL

quote_plus is correctly applied to DB_PASSWORD, but DB_USER is interpolated raw. If the database username contains any URL-special characters (e.g. @, :, /), the resulting connection string would be malformed and the Celery result backend would fail to connect. Apply the same quote_plus encoding to settings.DB_USER for consistency and correctness, just as is done for settings.DB_PASSWORD.
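A minimal stdlib sketch of the suggested fix (the helper name `build_result_backend` is illustrative, not part of the PR): URL-encode both credentials with `quote_plus` before interpolating them into the backend URL.

```python
from urllib.parse import quote_plus


def build_result_backend(user: str, password: str, host: str, port: int, db: str) -> str:
    """Build a db+postgresql result backend URL with both credentials URL-encoded."""
    # quote_plus escapes URL-special characters such as '@', ':' and '/'
    return (
        f"db+postgresql://{quote_plus(user)}:{quote_plus(password)}"
        f"@{host}:{port}/{db}"
    )
```

With this helper a username like `svc@prod` is emitted as `svc%40prod` instead of breaking the URL at the stray `@`.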

)

app = _WorkerDispatchCelery(
"worker-dispatch",
set_as_current=False,
fixups=[],
)
# Store the explicit broker URL for use in connection overrides
app._explicit_broker = broker_url

app.conf.update(
result_backend=result_backend,
task_queues=[Queue("executor")],
task_serializer="json",
accept_content=["json"],
result_serializer="json",
result_extended=True,
)
Comment on lines +85 to +92 (Contributor)

Configured queue name "executor" doesn't match the actual dispatch queue

get_worker_celery_app() registers task_queues=[Queue("executor")], but ExecutionDispatcher._get_queue() (in sdk1/execution/dispatcher.py) constructs the actual queue name as celery_executor_{executor_name} — for the legacy executor this becomes "celery_executor_legacy".

The queue declared on the app ("executor") never matches the queue used by send_task, so this task_queues setting has no practical effect. While send_task with an explicit queue parameter bypasses queue routing and the task is delivered correctly, the misconfigured task_queues setting means any queue-routing policies (e.g. prefetch limits, fair scheduling) configured on "executor" will not apply.

Either align the queue name to "celery_executor_legacy" (or the appropriate prefix), or remove the stale task_queues declaration from this app's config if it is intentionally unused.
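One way to address the drift (a sketch assuming the convention `celery_executor_{executor_name}` described in the comment above — verify against `ExecutionDispatcher._get_queue()` before adopting): derive the queue name from a single helper and reuse it for both the declaration and the dispatch call, so the two can never disagree.

```python
def executor_queue_name(executor_name: str) -> str:
    """Queue naming assumed to mirror ExecutionDispatcher._get_queue()."""
    return f"celery_executor_{executor_name}"


# Reuse the same helper at both sites so declaration and dispatch stay aligned:
#   app.conf.task_queues = [Queue(executor_queue_name("legacy"))]
#   app.send_task(task_name, args=..., queue=executor_queue_name("legacy"))
```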


_worker_app = app
# Log broker host only (mask credentials)
safe_broker = broker_url.split("@")[-1] if "@" in broker_url else broker_url
safe_backend = (
result_backend.split("@")[-1] if "@" in result_backend else result_backend
)
logger.info(
"Created worker dispatch Celery app (broker=%s, result_backend=%s)",
safe_broker,
safe_backend,
)
return _worker_app