Assign sequential api_server ports when proxy_url is unset #4416
lvhan028 wants to merge 2 commits into InternLM:main from
Conversation
Pull request overview
This PR updates the distributed (dp>1) OpenAI API server launcher to allow running without a proxy server by deterministically assigning per-rank server ports starting from a configurable base port.
Changes:
- Remove the hard requirement that `proxy_url` must be set when launching dp-mode servers.
- Add `base_port` to `launch_server()` and use it to generate sequential ports when `proxy_url` is unset.
- Wire the CLI `--server-port` value into `launch_server(base_port=...)` for dp-mode launches.
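The sequential scheme described above can be sketched as follows. This is a minimal illustration, not the launcher's actual code; the function name `assign_server_ports` and the `dp`/`rank` parameters are assumptions for the example.

```python
def assign_server_ports(base_port: int, dp: int, proxy_url: str = None):
    """Illustrative sketch: one deterministic port per dp rank when no proxy is used."""
    if proxy_url is not None:
        # With a proxy server, ports need not be fixed up front: each api_server
        # can register its own address with the proxy.
        return None
    # Without a proxy, rank i listens on base_port + i so clients can find it.
    return [base_port + rank for rank in range(dp)]


print(assign_server_ports(23333, 4))  # -> [23333, 23334, 23335, 23336]
```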
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| lmdeploy/serve/openai/launch_server.py | Adds `base_port` and switches to deterministic sequential port allocation when `proxy_url` is `None`. |
| lmdeploy/cli/serve.py | Passes CLI `server_port` through as `base_port` when calling `launch_server()` in dp-mode. |
Pull request overview
This PR updates the OpenAI server launcher to allow running multi-process (DP) API servers without requiring a proxy server, by assigning deterministic sequential ports when proxy_url is unset.
Changes:
- Extracts port-availability logic into a reusable `is_port_available` helper.
- Removes the hard requirement for `proxy_url` and adds `server_port` as a base port argument.
- Uses sequential ports (`server_port + i`) when `proxy_url` is `None`, with an availability check.
```python
if not is_port_available(port):
    raise ValueError(f'Port {port} is not available')
```
```python
def launch_server(model_path: str,
                  backend_config: Union[PytorchEngineConfig, TurbomindEngineConfig],
                  proxy_url: str = None,
                  server_port: int = 23333,
                  **kwargs):
```
```python
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    try:
        s.bind(('127.0.0.1', port))
        return True
    except Exception:
        return False
```
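To see the bind-based check in action, one can occupy a port with a listening socket and confirm the helper reports it as busy. This is a self-contained sketch; the helper body mirrors the fragment reviewed above, and the `blocker` socket is only for demonstration.

```python
import socket


def is_port_available(port: int) -> bool:
    # A successful bind means no other socket is actively using the port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind(('127.0.0.1', port))
            return True
        except Exception:
            return False


# Occupy an OS-chosen ephemeral port, then probe it.
blocker = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
blocker.bind(('127.0.0.1', 0))
blocker.listen(1)
busy_port = blocker.getsockname()[1]
print(is_port_available(busy_port))  # False while the listener holds it
blocker.close()
```

Note that `SO_REUSEADDR` lets the check succeed for ports lingering in TIME_WAIT, but a port held by an active listener still fails to bind, which is the case the launcher cares about.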
No description provided.