
Infinity ♾️

An agnostic Node.js library for summarizing "infinite" texts using binary tree reduction.

Why Infinity?

LLMs have context window limits. To summarize a book or a 100-page document, you cannot simply send everything at once. Infinity solves this by splitting the text into overlapping chunks, summarizing each chunk, and then merging the summaries pairwise until a single summary remains.
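Conceptually, the reduction works like the sketch below. This is only an illustration of the map/reduce idea, not the library's actual internals; the treeSummarize helper and the inline prompts are hypothetical.

// Conceptual sketch of binary tree reduction (not Infinity's real implementation).
// "Map": summarize each chunk. "Reduce": merge summaries pairwise, level by level.
async function treeSummarize(model, chunks) {
  // Map step: one summary per chunk.
  let level = await Promise.all(
    chunks.map((chunk) => model.generate(`Summarize:\n${chunk}`))
  );

  // Reduce step: each pass halves the number of summaries.
  while (level.length > 1) {
    const next = [];
    for (let i = 0; i < level.length; i += 2) {
      if (i + 1 < level.length) {
        next.push(model.generate(`Merge these summaries:\n${level[i]}\n---\n${level[i + 1]}`));
      } else {
        next.push(level[i]); // Odd summary is carried up to the next level unchanged.
      }
    }
    level = await Promise.all(next);
  }
  return level[0];
}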

Key Features

  • Agnostic: Works with any LLM (OpenAI, OpenRouter, Anthropic, Local models).
  • Efficient: Uses binary tree reduction to process chunks in parallel levels.
  • Resilient: Built-in concurrency control and retry logic.
  • Context-Aware: Supports chunk overlap to preserve meaning at borders.

Installation

npm install

Quick Start

const Infinity = require('infinity');
const OpenRouterAdapter = require('./adapters/openRouterAdapter');

const model = new OpenRouterAdapter('YOUR_API_KEY');
const summarizer = new Infinity(model, { chunkSize: 4000 });

// summarize() is async, so call it from an async context
// (top-level await is not available in CommonJS modules):
(async () => {
  const summary = await summarizer.summarize(veryLongText); // veryLongText: the full text to summarize
  console.log(summary);
})();

Creating Custom Adapters

You can connect any provider by implementing a class that exposes an async generate(prompt) method and returns the model's text output:

class MyAdapter {
  // Receives the prompt built by Infinity and must return the model's text response.
  async generate(prompt) {
    const res = await myApi.call(prompt);
    return res.text;
  }
}
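Once the adapter exposes generate(prompt), it can be passed to Infinity like any other model. Below is a hedged example of a custom adapter calling OpenAI's chat completions API with Node's built-in fetch (Node 18+); the model name and option values are illustrative assumptions, not defaults of this library.

const Infinity = require('infinity');

// Example custom adapter for OpenAI's chat completions endpoint.
class OpenAIAdapter {
  constructor(apiKey) {
    this.apiKey = apiKey;
  }

  async generate(prompt) {
    const res = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${this.apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'gpt-4o-mini', // illustrative model choice
        messages: [{ role: 'user', content: prompt }],
      }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }
}

const summarizer = new Infinity(new OpenAIAdapter(process.env.OPENAI_API_KEY), { chunkSize: 4000 });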

Configuration

  • chunkSize: Maximum number of characters per chunk.
  • chunkOverlap: Number of characters shared between consecutive chunks.
  • concurrency: Maximum number of simultaneous API calls.
  • maxFinalLength: Desired character length of the final summary.
  • mapPrompt: Custom prompt for summarizing each chunk (use the {{content}} placeholder).
  • reducePrompt: Custom prompt for merging two summaries (use the {{textA}} and {{textB}} placeholders).
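All options are passed in the constructor's second argument. A sketch of a full options object follows; the numeric values and prompt wording are illustrative assumptions, not documented defaults.

const summarizer = new Infinity(model, {
  chunkSize: 4000,      // max characters per chunk
  chunkOverlap: 200,    // characters shared between consecutive chunks (assumed value)
  concurrency: 4,       // simultaneous API calls (assumed value)
  maxFinalLength: 1500, // target length of the final summary (assumed value)
  mapPrompt: 'Summarize the following text:\n\n{{content}}',
  reducePrompt: 'Combine these two summaries into one:\n\nA:\n{{textA}}\n\nB:\n{{textB}}',
});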
