[FLINK-39074][PyFlink] Add AI inference lifecycle management with infer() API #27590
+1,424
−0
What is the purpose of the change
This PR (FLINK-39074) introduces an AI inference framework for PyFlink that makes it straightforward to run machine learning inference on streaming data, with automatic model lifecycle management.
Currently, PyFlink users face significant challenges when implementing AI inference. This PR addresses these pain points with a simple, declarative API.
Brief change log
- `DataStream.infer()` method for easy AI inference
- `ModelLifecycleManager` for model loading and warmup
- `BatchInferenceExecutor` for efficient batch inference

Example Usage
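The sketch below illustrates the lifecycle described in the change log: load a model once per worker, warm it up, then run inference on micro-batches. Only the class and method names (`DataStream.infer()`, `ModelLifecycleManager`, `BatchInferenceExecutor`) come from this PR; all signatures, parameters, and the dummy model here are illustrative assumptions, and the stub runs without a Flink cluster.

```python
# Illustrative sketch only -- signatures are assumptions, not the PR's actual API.
from typing import Any, Callable, Iterable, Iterator, List


class ModelLifecycleManager:
    """Loads a model once per worker and warms it up before serving."""

    def __init__(self, loader: Callable[[], Any], warmup_samples: Iterable[Any] = ()):
        self._loader = loader
        self._warmup_samples = list(warmup_samples)
        self._model = None

    def open(self) -> Any:
        if self._model is None:
            self._model = self._loader()            # load once, not per record
            for sample in self._warmup_samples:     # prime caches / lazy init paths
                self._model(sample)
        return self._model

    def close(self) -> None:
        self._model = None                          # release model resources


class BatchInferenceExecutor:
    """Buffers records and runs the model on micro-batches."""

    def __init__(self, manager: ModelLifecycleManager, batch_size: int = 32):
        self._manager = manager
        self._batch_size = batch_size

    def run(self, records: Iterable[Any]) -> Iterator[Any]:
        model = self._manager.open()
        batch: List[Any] = []
        for record in records:
            batch.append(record)
            if len(batch) == self._batch_size:      # full batch: run inference
                yield from (model(r) for r in batch)
                batch.clear()
        if batch:                                   # flush the final partial batch
            yield from (model(r) for r in batch)


# Usage with a dummy "model" (a doubling function):
manager = ModelLifecycleManager(loader=lambda: (lambda x: x * 2), warmup_samples=[0])
executor = BatchInferenceExecutor(manager, batch_size=2)
results = list(executor.run([1, 2, 3]))  # [2, 4, 6]
```

In a real job, an `infer()`-style operator would presumably wire `open()`/`close()` into the operator lifecycle so the model survives across records instead of being reloaded per element.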
Verifying this change
Does this pull request potentially affect one of the following parts:
Documentation
Additional Context
This is an initial MVP implementation focusing on core functionality, with room for future enhancements.
The implementation follows Flink's design principles and integrates naturally with existing PyFlink APIs.