# RankExtractPlus

RankExtractPlus is a Python package designed to extract and structure ranked information from unstructured text inputs. It leverages the power of large language models (LLMs) to process text and return structured, ranked outputs.
## Features

- Extracts and ranks information from unstructured text
- Uses `llmatch-messages` to ensure structured and consistent outputs
- Supports custom LLMs for flexible integration
- Easy-to-use interface with minimal setup
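To give a sense of what "structured and consistent outputs" means here, the sketch below shows pattern-based extraction of a ranked list from raw LLM text, in the spirit of `llmatch`-style matching. The sample text, regex, and output shape are illustrative assumptions, not the package's actual implementation.

```python
import re

# Illustrative raw LLM output containing a numbered ranking.
raw = """1. Singapore
2. China
3. South Korea"""

# Match lines of the form "<rank>. <name>" and collect them as
# (rank, name) tuples, giving a structured result instead of free text.
pattern = re.compile(r"^\s*(\d+)\.\s+(.+)$", re.MULTILINE)
ranking = [(int(rank), name.strip()) for rank, name in pattern.findall(raw)]

print(ranking)  # [(1, 'Singapore'), (2, 'China'), (3, 'South Korea')]
```

Pattern matching like this is what makes the output deterministic to parse even when the LLM's surrounding prose varies.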
## Installation

To install RankExtractPlus, run:

```bash
pip install rankextractplus
```

## Usage

```python
from rankextractplus import rankextractplus

user_input = "Text about the best countries at math..."
response = rankextractplus(user_input)
print(response)
```

## Using Custom LLMs

You can use any LLM compatible with LangChain. Here are examples with different providers:
### OpenAI

```python
from langchain_openai import ChatOpenAI
from rankextractplus import rankextractplus

llm = ChatOpenAI()
response = rankextractplus(user_input, llm=llm)
print(response)
```

### Anthropic

```python
from langchain_anthropic import ChatAnthropic
from rankextractplus import rankextractplus

llm = ChatAnthropic()
response = rankextractplus(user_input, llm=llm)
print(response)
```

### Google Gemini

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from rankextractplus import rankextractplus

llm = ChatGoogleGenerativeAI()
response = rankextractplus(user_input, llm=llm)
print(response)
```

## API Key

By default, RankExtractPlus uses `ChatLLM7` from `langchain_llm7`. If you want to use a custom API key, you can pass it directly or set it as an environment variable:
```python
import os

from rankextractplus import rankextractplus

# Using an environment variable
os.environ["LLM7_API_KEY"] = "your_api_key"
response = rankextractplus(user_input)

# Or passing it directly
response = rankextractplus(user_input, api_key="your_api_key")
```

## Parameters

- `user_input` (str): The unstructured text input to process.
- `llm` (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key` (Optional[str]): The API key for LLM7. If not provided, the environment variable `LLM7_API_KEY` will be used.
## Rate Limits

The default rate limits for LLM7's free tier are sufficient for most use cases. If you need higher rate limits, you can obtain a free API key by registering at LLM7.
## Contributing

If you encounter any issues or have suggestions, please open an issue on GitHub.
## Author

- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell
## License

This project is licensed under the MIT License. See the LICENSE file for details.