
Assign sequential api_server ports when proxy_url is unset#4416

Open
lvhan028 wants to merge 2 commits into InternLM:main from lvhan028:dp-wo-proxy

Conversation

@lvhan028
Collaborator

No description provided.

Copilot AI review requested due to automatic review settings March 16, 2026 13:16
Contributor

Copilot AI left a comment


Pull request overview

This PR updates the distributed (dp>1) OpenAI API server launcher to allow running without a proxy server by deterministically assigning per-rank server ports starting from a configurable base port.

Changes:

  • Remove the hard requirement that proxy_url must be set when launching dp-mode servers.
  • Add base_port to launch_server() and use it to generate sequential ports when proxy_url is unset.
  • Wire the CLI --server-port value into launch_server(base_port=...) for dp-mode launches.
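The port-assignment scheme described above can be sketched as follows. This is a minimal illustration of the behavior for the no-proxy case; `assign_ports` is a hypothetical name for exposition, not the actual `launch_server()` internals.

```python
def assign_ports(base_port: int, dp: int) -> list:
    """One deterministic API-server port per dp rank.

    Rank i gets base_port + i, so ports are predictable without a proxy.
    With proxy_url set, servers would instead register with the proxy.
    """
    return [base_port + rank for rank in range(dp)]
```

For example, `assign_ports(23333, 4)` yields `[23333, 23334, 23335, 23336]`, one port per rank starting from the CLI `--server-port` value.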

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.

Reviewed files:

  • lmdeploy/serve/openai/launch_server.py — Adds base_port and switches to deterministic sequential port allocation when proxy_url is None.
  • lmdeploy/cli/serve.py — Passes CLI server_port through as base_port when calling launch_server() in dp-mode.


@lvhan028 lvhan028 changed the title Assign sequential api_server ports by dp_rank when proxy_url is unset Assign sequential api_server ports when proxy_url is unset Mar 16, 2026
@lvhan028 lvhan028 requested a review from Copilot March 19, 2026 13:12
Contributor

Copilot AI left a comment


Pull request overview

This PR updates the OpenAI server launcher to allow running multi-process (DP) API servers without requiring a proxy server, by assigning deterministic sequential ports when proxy_url is unset.

Changes:

  • Extracts port-availability logic into a reusable is_port_available helper.
  • Removes the hard requirement for proxy_url and adds server_port as a base port argument.
  • Uses sequential ports (server_port + i) when proxy_url is None, with an availability check.
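A bind-based availability check along the lines visible in the review fragments can be sketched as below. This assumes the socket approach shown in the diff; the actual helper in launch_server.py may differ in detail.

```python
import socket


def is_port_available(port: int) -> bool:
    """Return True if `port` can be bound on 127.0.0.1.

    A successful bind means no other process currently holds the port.
    """
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            # Allow immediate reuse of ports lingering in TIME_WAIT.
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind(('127.0.0.1', port))
            return True
    except OSError:
        return False
```

Checking each `server_port + i` with such a helper before launch lets the launcher fail fast with a clear error rather than letting a rank's server crash on a port collision.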


Comment on lines +117 to +118

    if not is_port_available(port):
        raise ValueError(f'Port {port} is not available')

Comment on lines 85 to 89

    model_path: str,
    backend_config: Union[PytorchEngineConfig, TurbomindEngineConfig],
    proxy_url: str = None,
    server_port: int = 23333,
    **kwargs):

    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(('127.0.0.1', port))
    return True
    except Exception:


2 participants