16 changes: 2 additions & 14 deletions tutorials/27_First_RAG_Pipeline.ipynb
@@ -28,18 +28,6 @@
"For this tutorial, you'll use the Wikipedia pages of [Seven Wonders of the Ancient World](https://en.wikipedia.org/wiki/Wonders_of_the_World) as Documents, but you can replace them with any text you want.\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "QXjVlbPiO-qZ"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -361,7 +349,7 @@
"### Initialize a ChatGenerator\n",
"\n",
"\n",
"ChatGenerators are the components that interact with large language models (LLMs). Now, set `OPENAI_API_KEY` environment variable and initialize a [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/OpenAIChatGenerator) that can communicate with OpenAI GPT models. As you initialize, provide a model name:"
"ChatGenerators are the components that interact with large language models (LLMs). Now, set `OPENAI_API_KEY` environment variable and initialize a [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/openaichatgenerator) that can communicate with OpenAI GPT models. As you initialize, provide a model name:"
]
},
{
@@ -626,4 +614,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
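The updated cell in this file tells the reader to set the `OPENAI_API_KEY` environment variable and initialize an `OpenAIChatGenerator` with a model name. A minimal sketch of that step, assuming Haystack 2.x and using `gpt-4o-mini` only as an example model:

```python
import os
from getpass import getpass

from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

# Make the key available to the generator (prompting for it if it isn't set)
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("Enter OpenAI API key: ")

# Initialize the ChatGenerator with a model name ("gpt-4o-mini" is only an example)
chat_generator = OpenAIChatGenerator(model="gpt-4o-mini")

# Quick check outside a pipeline: send one user message and print the reply
result = chat_generator.run(messages=[ChatMessage.from_user("What is retrieval-augmented generation?")])
print(result["replies"][0])
```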
12 changes: 0 additions & 12 deletions tutorials/29_Serializing_Pipelines.ipynb
@@ -30,18 +30,6 @@
"Although it's possible to serialize into other formats too, Haystack supports YAML out of the box to make it easy for humans to make changes without the need to go back and forth with Python code. In this tutorial, we will create a very simple pipeline in Python code, serialize it into YAML, make changes to it, and deserialize it back into a Haystack `Pipeline`."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "9smrsiIqfS7J"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
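The serialization tutorial above revolves around dumping a pipeline to YAML, editing it, and loading it back. A rough sketch of that round trip, assuming the `Pipeline.dumps()`/`Pipeline.loads()` API and an illustrative one-component pipeline:

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder

# A deliberately tiny pipeline, just enough to have something to serialize
pipeline = Pipeline()
pipeline.add_component("builder", PromptBuilder(template="Summarize the following text: {{ text }}"))

# Serialize to a YAML string (pipeline.dump(file_object) writes to a file instead)
yaml_str = pipeline.dumps()
print(yaml_str)

# ...hand-edit the YAML if needed, then rebuild the Pipeline from it
restored = Pipeline.loads(yaml_str)
```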
12 changes: 0 additions & 12 deletions tutorials/30_File_Type_Preprocessing_Index_Pipeline.ipynb
@@ -40,18 +40,6 @@
"Optionally, you can keep going to see how to use these documents in a query pipeline as well."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "rns_B_NGN0Ze"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
14 changes: 1 addition & 13 deletions tutorials/31_Metadata_Filtering.ipynb
@@ -28,18 +28,6 @@
"Although new retrieval techniques are great, sometimes you just know that you want to perform search on a specific group of documents in your document store. This can be anything from all the documents that are related to a specific _user_, or that were published after a certain _date_ and so on. Metadata filtering is very useful in these situations. In this tutorial, we will create a few simple documents containing information about Haystack, where the metadata includes information on what version of Haystack the information relates to. We will then do metadata filtering to make sure we are answering the question based only on information about Haystack 2.0.\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "tM3U5KyegTAE"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -269,4 +257,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
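The metadata-filtering tutorial above answers questions using only documents about Haystack 2.0. A small sketch of filtering at retrieval time with the documented filter syntax; the document contents and version values below are invented for illustration:

```python
from haystack import Document
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever

document_store = InMemoryDocumentStore(bm25_algorithm="BM25Plus")
document_store.write_documents([
    Document(content="Use Pipeline.connect() to link components.", meta={"version": 2.0}),
    Document(content="Use pipeline nodes and the query() method.", meta={"version": 1.15}),
])

retriever = InMemoryBM25Retriever(document_store=document_store)

# Restrict the search to documents whose metadata says version >= 2.0
result = retriever.run(
    query="How do I connect components?",
    filters={"field": "meta.version", "operator": ">=", "value": 2.0},
)
print(result["documents"])
```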
@@ -32,18 +32,6 @@
"In the last section, you'll build a multi-lingual RAG pipeline. The language of a question is detected, and only documents in that language are used to generate the answer. For this section, the [`TextLanguageRouter`](https://docs.haystack.deepset.ai/docs/textlanguagerouter) will come in handy.\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "oBa4Q25cGTr6"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -687,4 +675,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
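This file's intro describes routing a question by its detected language with `TextLanguageRouter`. A hedged, standalone sketch of that component; the constructor arguments and output keys shown are assumptions that may differ between Haystack versions:

```python
from haystack.components.routers import TextLanguageRouter

# Route incoming text by detected language (the `languages` argument is assumed here)
router = TextLanguageRouter(languages=["en", "fr"])

result = router.run(text="Quelle est la meilleure auberge de Lisbonne ?")
# Expected to return the text under the detected-language key, e.g. {"fr": "..."},
# or under an "unmatched" key when no configured language matches.
print(result)
```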
14 changes: 1 addition & 13 deletions tutorials/33_Hybrid_Retrieval.ipynb
@@ -28,18 +28,6 @@
"There are many cases when a simple keyword-based approaches like BM25 performs better than a dense retrieval (for example in a specific domain like healthcare) because a dense model needs to be trained on data. For more details about Hybrid Retrieval, check out [Blog Post: Hybrid Document Retrieval](https://haystack.deepset.ai/blog/hybrid-retrieval)."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ITs3WTT5lXQT"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/setting-the-log-level)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -571,4 +559,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
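The hybrid-retrieval tutorial above combines BM25 keyword search with dense retrieval. A condensed sketch of that idea, joining both result lists with reciprocal rank fusion; the embedding model and exact wiring are illustrative rather than the tutorial's setup:

```python
from haystack import Pipeline
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.joiners import DocumentJoiner
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever, InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()  # assumed to already hold embedded documents

hybrid = Pipeline()
hybrid.add_component("text_embedder", SentenceTransformersTextEmbedder(model="BAAI/bge-small-en-v1.5"))
hybrid.add_component("embedding_retriever", InMemoryEmbeddingRetriever(document_store=document_store))
hybrid.add_component("bm25_retriever", InMemoryBM25Retriever(document_store=document_store))
hybrid.add_component("joiner", DocumentJoiner(join_mode="reciprocal_rank_fusion"))

hybrid.connect("text_embedder.embedding", "embedding_retriever.query_embedding")
hybrid.connect("embedding_retriever.documents", "joiner.documents")
hybrid.connect("bm25_retriever.documents", "joiner.documents")

query = "treatments for sleep apnea"
result = hybrid.run({"text_embedder": {"text": query}, "bm25_retriever": {"query": query}})
print(result["joiner"]["documents"])
```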
14 changes: 1 addition & 13 deletions tutorials/34_Extractive_QA_Pipeline.ipynb
@@ -29,18 +29,6 @@
"To get data into the extractive pipeline, you'll also build an indexing pipeline to ingest the [Wikipedia pages of Seven Wonders of the Ancient World dataset](https://en.wikipedia.org/wiki/Wonders_of_the_World)."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "eF_hnatJUEHq"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -659,4 +647,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
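The extractive-QA tutorial above pulls exact answer spans out of retrieved documents instead of generating free text. A rough sketch of the query side, assuming the document store already holds embedded documents and using example model names:

```python
from haystack import Pipeline
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.readers import ExtractiveReader
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()  # assumed to already hold embedded documents

reader = ExtractiveReader()  # default QA model; a specific model name can be passed instead
reader.warm_up()

qa = Pipeline()
qa.add_component("embedder", SentenceTransformersTextEmbedder(model="sentence-transformers/multi-qa-mpnet-base-dot-v1"))
qa.add_component("retriever", InMemoryEmbeddingRetriever(document_store=document_store))
qa.add_component("reader", reader)
qa.connect("embedder.embedding", "retriever.query_embedding")
qa.connect("retriever.documents", "reader.documents")

query = "Who was Pliny the Elder?"
answers = qa.run({"embedder": {"text": query}, "reader": {"query": query, "top_k": 2}})
print(answers["reader"]["answers"])
```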
14 changes: 1 addition & 13 deletions tutorials/35_Evaluating_RAG_Pipelines.ipynb
@@ -52,18 +52,6 @@
"<iframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/5PrzXaZ0-qk?si=lgBSfHatbV2i59J-\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen></iframe>\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "QXjVlbPiO-qZ"
},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/setting-the-log-level)"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -383,7 +371,7 @@
"\n",
"In this example, we'll be using:\n",
"- [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever) which will get the relevant documents to the query.\n",
"- [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/OpenAIChatGenerator) to generate answers to queries. You can replace `OpenAIChatGenerator` in your pipeline with another `ChatGenerator`. Check out the full list of generators [here](https://docs.haystack.deepset.ai/docs/generators)."
"- [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/openaichatgenerator) to generate answers to queries. You can replace `OpenAIChatGenerator` in your pipeline with another `ChatGenerator`. Check out the full list of generators [here](https://docs.haystack.deepset.ai/docs/generators)."
]
},
{
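The changed cell above lists `InMemoryEmbeddingRetriever` and `OpenAIChatGenerator` as the RAG pipeline's building blocks. A compact sketch of such a pipeline, assuming an already-populated document store and an `OPENAI_API_KEY` in the environment; the prompt template and model name are illustrative:

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.dataclasses import ChatMessage
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()  # assumed to already hold embedded documents

template = [ChatMessage.from_user(
    "Answer the question from the context.\n"
    "{% for doc in documents %}{{ doc.content }}\n{% endfor %}\n"
    "Question: {{ question }}"
)]

rag = Pipeline()
rag.add_component("embedder", SentenceTransformersTextEmbedder())
rag.add_component("retriever", InMemoryEmbeddingRetriever(document_store=document_store))
rag.add_component("prompt_builder", ChatPromptBuilder(template=template, required_variables=["documents", "question"]))
rag.add_component("llm", OpenAIChatGenerator(model="gpt-4o-mini"))

rag.connect("embedder.embedding", "retriever.query_embedding")
rag.connect("retriever.documents", "prompt_builder.documents")
rag.connect("prompt_builder.prompt", "llm.messages")

question = "What does the Rhodes Statue look like?"
out = rag.run({"embedder": {"text": question}, "prompt_builder": {"question": question}})
print(out["llm"]["replies"][0])
```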
@@ -27,7 +27,7 @@
"\n",
"📚 Useful Sources:\n",
"* [OpenAIChatGenerator Docs](https://docs.haystack.deepset.ai/docs/openaichatgenerator)\n",
"* [OpenAIChatGenerator API Reference](https://docs.haystack.deepset.ai/reference/generator-api#openaichatgenerator)\n",
"* [OpenAIChatGenerator API Reference](https://docs.haystack.deepset.ai/reference/generators-api#openaichatgenerator)\n",
"* [🧑‍🍳 Cookbook: Function Calling with OpenAIChatGenerator](https://github.com/deepset-ai/haystack-cookbook/blob/main/notebooks/function_calling_with_OpenAIChatGenerator.ipynb)\n",
"\n",
"[OpenAI's function calling](https://platform.openai.com/docs/guides/function-calling) connects large language models to external tools. By providing a `tools` list with functions and their specifications to the OpenAI API calls, you can easily build chat assistants that can answer questions by calling external APIs or extract structured information from text.\n",
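This file's tutorial covers OpenAI function calling through a `tools` list. A bare-bones sketch of the underlying OpenAI chat-completions call; the `get_current_weather` function and its schema are hypothetical:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Declare one callable function; the model may answer with a tool call instead of text
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",  # hypothetical function
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)
```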
11 changes: 0 additions & 11 deletions tutorials/42_Sentence_Window_Retriever.ipynb
@@ -24,17 +24,6 @@
"`SentenceWindowRetriever(document_store=doc_store, window_size=2)`"
]
},
{
"cell_type": "markdown",
"id": "784caaa2",
"metadata": {},
"source": [
"\n",
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration#enabling-the-gpu-in-colab)\n"
]
},
{
"cell_type": "markdown",
"id": "98c2f9d3",
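The sentence-window tutorial above instantiates `SentenceWindowRetriever(document_store=doc_store, window_size=2)`. A heavily hedged sketch of the intended flow, where a first-stage retriever finds matching sentence splits and the window retriever adds neighboring sentences back; the socket and output names used here are assumptions:

```python
from haystack import Pipeline
from haystack.components.retrievers import SentenceWindowRetriever
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

doc_store = InMemoryDocumentStore()  # assumed to hold sentence-level splits produced by a DocumentSplitter

query_pipeline = Pipeline()
query_pipeline.add_component("bm25_retriever", InMemoryBM25Retriever(document_store=doc_store))
query_pipeline.add_component("window_retriever", SentenceWindowRetriever(document_store=doc_store, window_size=2))
query_pipeline.connect("bm25_retriever.documents", "window_retriever.retrieved_documents")

result = query_pipeline.run({"bm25_retriever": {"query": "phased array radar", "top_k": 1}})
print(result["window_retriever"]["context_windows"])
```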
4 changes: 2 additions & 2 deletions tutorials/44_Creating_Custom_SuperComponents.ipynb
@@ -10,7 +10,7 @@
"\n",
"- **Level**: Intermediate\n",
"- **Time to complete**: 20 minutes\n",
"- **Concepts and Components Used**: [`@super_component`](https://docs.haystack.deepset.ai/docs/supercomponents), [`Pipeline`](https://docs.haystack.deepset.ai/docs/pipeline), [`DocumentJoiner`](https://docs.haystack.deepset.ai/docs/documentjoiner), [`SentenceTransformersTextEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder), [`InMemoryBM25Retriever`](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever), [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever), [`TransformersSimilarityRanker`](https://docs.haystack.deepset.ai/docs/transformerssimilarityranker)\n",
"- **Concepts and Components Used**: [`@super_component`](https://docs.haystack.deepset.ai/docs/supercomponents), [`Pipeline`](https://docs.haystack.deepset.ai/docs/pipelines), [`DocumentJoiner`](https://docs.haystack.deepset.ai/docs/documentjoiner), [`SentenceTransformersTextEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder), [`InMemoryBM25Retriever`](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever), [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever), [`TransformersSimilarityRanker`](https://docs.haystack.deepset.ai/docs/transformerssimilarityranker)\n",
"- **Goal**: After completing this tutorial, you'll have learned how to create custom SuperComponents using the `@super_component` decorator to simplify complex pipelines and make them reusable as components."
]
},
@@ -851,4 +851,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
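This file's header fixes the `Pipeline` docs link for the SuperComponents tutorial, whose goal is wrapping a pipeline with the `@super_component` decorator. The outline below is an assumption-laden sketch of that pattern; the decorator's import path and the convention of assigning the wrapped pipeline to `self.pipeline` are taken from the tutorial's concept list, not verified here:

```python
from haystack import Pipeline, super_component
from haystack.components.joiners import DocumentJoiner
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore


@super_component
class KeywordRetrievalUnit:
    """A small pipeline wrapped so it can be reused as a single component."""

    def __init__(self, document_store: InMemoryDocumentStore):
        pipeline = Pipeline()
        pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
        pipeline.add_component("joiner", DocumentJoiner())
        pipeline.connect("retriever.documents", "joiner.documents")
        self.pipeline = pipeline  # the decorator is expected to expose this pipeline's inputs and outputs


unit = KeywordRetrievalUnit(document_store=InMemoryDocumentStore())
result = unit.run(query="What is hybrid retrieval?")
print(result)
```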
10 changes: 0 additions & 10 deletions tutorials/template.ipynb
@@ -21,16 +21,6 @@
"*Here provide a short description of the tutorial. What does it teach? What's its expected outcome?*"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Preparing the Colab Environment\n",
"\n",
"- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration#enabling-the-gpu-in-colab)\n",
"- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/log-level)"
]
},
{
"cell_type": "markdown",
"metadata": {},