EricksonAtHome/FRC7

🚀 FRC (Fast Response Connection) Ecosystem

Welcome to the Fast Response Connection (FRC) ecosystem. FRC is a next-generation distributed AI execution platform and scripting language.

What started as a simple declarative language (.frcl) has evolved through 9 levels of complexity into a full-scale, cloud-native, distributed AI network capable of executing models securely across global regional nodes.

This repository serves as the master monorepo for the entire FRC project. It contains every layer of the architecture: from the lightweight language parser to the production Kubernetes cluster setup.

🧠 What is FRC?

At its core, FRC solves the problem of decentralized AI execution. Instead of building monolithic AI apps, FRC allows developers to write simple, declarative scripts (using the FRCL language) that instruct a network to:

  1. Parse the intent.
  2. Route the task to the nearest global node (EU, US, ASIA).
  3. Queue the task asynchronously using Redis.
  4. Execute the AI model and stream the result back to the client.
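
A client kicks this flow off by submitting an FRCL script. The only FRCL instruction shown in this README is use model models5 (it appears again in the core engine section), so a minimal script is just that one line; any richer syntax would be an assumption:

```
use model models5
```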

📂 Directory Breakdown & Capabilities

Here is a thorough explanation of every component folder in this repository, why it exists, and how to use it.

1. frcl-extension/ (Developer Tooling)

  • What it is: A Visual Studio Code extension.
  • Why it exists: FRC has its own language syntax (.frcl). To make the developer experience seamless, this extension provides native syntax highlighting, intelligent auto-completion, and colorization for .frcl files inside VS Code.
  • What you can do: Package it with vsce package and install the resulting .vsix file into VS Code to get proper FRC code styling.

2. dev-installer/ (The AI Project Brain)

  • What it is: A Python-based orchestration and scaffolding tool.
  • Why it exists: This represents "Level 4" of the system—an autonomous project generator. Instead of manually setting up Next.js or Docker projects, you describe what you want in FRCL, and this Python engine scaffolds the entire project.
  • What you can do: Run python installer.py to auto-generate full-stack applications.
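
The scaffolding idea can be sketched in a few lines; the spec format, names, and file contents below are invented for illustration and are not the real installer.py behavior:

```python
# Illustrative sketch of the scaffolding idea behind dev-installer/ --
# the real installer.py and its FRCL parsing are not reproduced here.
from pathlib import Path

# Hypothetical project spec, as if extracted from an FRCL description.
SPEC = {
    "name": "demo-app",
    "dirs": ["src", "public", "docker"],
    "files": {"README.md": "# demo-app\n", "docker/Dockerfile": "FROM node:20\n"},
}

def scaffold(spec, root="."):
    """Create the directory/file skeleton described by the spec."""
    base = Path(root) / spec["name"]
    for d in spec["dirs"]:
        (base / d).mkdir(parents=True, exist_ok=True)
    for rel, content in spec["files"].items():
        path = base / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(content)
    return base
```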

3. frc-node/ (The Autonomous Meta-System)

  • What it is: A theoretical Python implementation of the advanced concepts (Levels 5 through 9).
  • Why it exists: It explores how FRC functions as a "living" system without a UI. It includes logic for Self-Adaptive Nodes (auto-healing), Global Routing (zero-trust security), and Self-Rewriting architectures where the system writes its own deployment code based on server stress.
  • What you can do: Explore files like self_evolving_node.py to study autonomous system architectures and self-scaling mathematical models.
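
As a toy model of the self-adaptive idea, a node might derive its replica count from observed CPU stress. The thresholds and function below are invented for illustration and are not taken from self_evolving_node.py:

```python
# Toy model of a "self-adaptive node": choose a replica count from
# observed CPU stress, clamped to a safe range.
def desired_replicas(cpu_percent, current, min_r=1, max_r=10):
    """Scale up under stress, scale down when idle, otherwise hold."""
    if cpu_percent > 80:
        target = current * 2        # aggressive scale-up under stress
    elif cpu_percent < 20:
        target = current // 2       # relax when the node is mostly idle
    else:
        target = current            # steady state: no change
    return max(min_r, min(max_r, target))
```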

4. frc-runtime/ & frc-real/ (The Core Engine Basics)

  • What it is: The earliest, foundational JavaScript implementations of the FRC execution engine.
  • Why it exists: To bridge the gap between the .frcl script and the actual machine execution. It contains the raw string parsers that extract instructions like use model models5 and turn them into JSON payloads.
  • What you can do: Run the CLI (node cli/frc.js) to parse raw text files locally.
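
The parsing step can be sketched as follows. The real parser lives in JavaScript inside frc-runtime/; this Python version, and the payload field names it emits, are illustrative only:

```python
# Sketch of the raw-string parsing idea: turn an instruction line such as
# "use model models5" into a JSON payload. Field names are assumptions.
import json

def parse_instruction(line):
    """Parse a line like 'use model models5' into a payload dict."""
    parts = line.strip().split()
    if len(parts) != 3 or parts[0] != "use":
        raise ValueError(f"unrecognized instruction: {line!r}")
    action, kind, target = parts
    return {"action": action, "type": kind, "target": target}

payload = parse_instruction("use model models5")
print(json.dumps(payload))
```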

5. frc-v1/ & frc-v2/ (The Production Backends)

  • What it is: Production-ready backend architectures built on Node.js, Express, and Redis.
  • Why it exists: This translates the theoretical routing into real software engineering.
    • v1 is a simple monolithic API and worker.
    • v2 is the Production System. It introduces x-api-key authentication, an Express API Gateway, and a Redis Job Queue. The gateway pushes tasks to Redis, and multi-node Docker workers pull jobs off the queue asynchronously.
  • What you can do: cd frc-v2 and run docker-compose up to launch a fully distributed job queue and worker cluster on your local machine.
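
A minimal sketch of that gateway/worker handshake, assuming an x-api-key check and Redis-style list semantics. A real deployment uses Redis (LPUSH from the gateway, BRPOP in the workers); collections.deque stands in here so the example runs without a Redis server:

```python
# Sketch of the v2 flow: authenticate, enqueue, and let a worker pull
# the job asynchronously. The key, queue name, and shapes are assumed.
from collections import deque

queue = deque()  # stands in for a Redis list such as "frc:jobs"

def gateway_submit(api_key, task, valid_keys=frozenset({"secret-key"})):
    """Authenticate via x-api-key, then enqueue the task (like LPUSH)."""
    if api_key not in valid_keys:
        return {"status": 401, "error": "invalid api key"}
    queue.appendleft(task)
    return {"status": 202, "queued": True}

def worker_poll():
    """Pull the oldest job off the queue (like BRPOP), or None if empty."""
    return queue.pop() if queue else None
```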

6. frc-cluster/ (The Multi-Region Blueprint)

  • What it is: A simulated global compute network using Docker Compose.
  • Why it exists: It proves that FRC can scale globally. It launches an API Gateway alongside three independent regional nodes (node-eu, node-us, node-asia).
  • What you can do: Send a request to the Gateway with a header x-country: US, and watch the Gateway intelligently proxy the request specifically to the US Node container.
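
The routing decision itself is small; a sketch in Python (the real gateway is Node.js, and the hostnames and fallback region here are assumptions):

```python
# Sketch of header-based region routing: map the x-country header to a
# regional node URL. Hostnames are illustrative, not the real ones.
REGION_NODES = {
    "EU": "http://node-eu:4000",
    "US": "http://node-us:4000",
    "ASIA": "http://node-asia:4000",
}

def route(headers, default="EU"):
    """Pick the node URL for a request based on its x-country header."""
    country = headers.get("x-country", "").upper()
    region = country if country in REGION_NODES else default
    return REGION_NODES[region]
```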

7. frc-k8s/ (The Enterprise Cloud Architecture)

  • What it is: A comprehensive suite of Kubernetes configuration manifests (.yaml).
  • Why it exists: To transition FRC from "local Docker" to a real Cloud-Native platform (like AWS EKS or Google GKE). It includes Deployments, Services, an Ingress router, and Horizontal Pod Autoscalers (HPA).
  • What you can do: Apply this directly to a Kubernetes cluster (kubectl apply -f k8s/) to spin up auto-scaling regional nodes that react to real-time CPU utilization.
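
The HPA piece of that stack looks roughly like the manifest below; the Deployment name, replica bounds, and CPU threshold are assumptions, not values copied from the frc-k8s/ files:

```yaml
# Illustrative HPA for one regional node (names and thresholds assumed).
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: node-us-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: node-us
  minReplicas: 1
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```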

8. frc-netlify-live/ & frc-control-panel/ (The Serverless Edge)

  • What it is: The Global Control Panel built on Netlify Serverless Edge Functions.
  • Why it exists: Because running the heavy AI compute directly on the frontend is inefficient. The Netlify app serves purely as a Smart Router and UI. It uses Netlify's native IP/Geo-headers (x-nf-country) to instantly detect where the user is located, and forwards the payload to the nearest external frc.systems cluster.
  • What you can do: Push this to Netlify to instantly deploy a globally distributed Edge Router with zero server maintenance.

9. arduino_example/ (The Hardware IoT Client)

  • What it is: C++ hardware logic for microcontrollers like the ESP32 and Arduino boards.
  • Why FRC makes IoT better: Traditional hardware requires complex backend logic, heavy MQTT brokers, or tight coupling to a specific cloud provider to process AI logic. FRC abstracts all of this. An Arduino board simply makes a lightweight HTTP POST request containing FRCL instructions, and the FRC cloud network instantly parses it, routes it globally, executes the AI model, and returns a clean JSON response.
  • How developers can use it for testing: Developers can flash arduino_example.ino onto an ESP32, connect it to WiFi, and immediately see live responses from frc.systems in their Serial Monitor. This provides a zero-friction playground to test latency and AI model outputs on real physical devices.
  • Using it in your own apps: You can directly embed this HTTP architecture into smart home sensors, robotics, or industrial monitoring tools. For example, a temperature sensor could send raw data to an FRC node, where an AI model analyzes it and returns an instruction (e.g., "turn_on_cooling") directly to the microcontroller.
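
That sensor-to-instruction loop can be mocked end to end in a few lines. The payload shape, the "turn_on_cooling" response, and the temperature threshold are assumptions, and the cloud call is stubbed so the sketch runs offline:

```python
# Offline mock of the IoT loop: a device builds an FRCL payload, the
# (stubbed) FRC node analyzes it and returns a JSON instruction.
def build_payload(sensor_id, temperature_c):
    """What a microcontroller would POST (as JSON) to an FRC node."""
    return {"device": sensor_id, "script": "use model models5",
            "data": {"temperature_c": temperature_c}}

def fake_frc_node(payload):
    """Stand-in for the cloud: turn raw sensor data into an instruction."""
    temp = payload["data"]["temperature_c"]
    action = "turn_on_cooling" if temp > 30 else "idle"
    return {"instruction": action}

response = fake_frc_node(build_payload("greenhouse-1", 34.5))
print(response["instruction"])
```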

🛠️ Usage (Singularity CLI)

The FRC CLI interfaces with the self-modifying ecosystem.

```shell
# Execute model globally
frc run models5

# Trigger Level 9 Singularity Loop (Self-Rewriting Ecosystem)
frc singularity

# View dynamic ecosystem map
frc nodes
```

🛠️ The FRC Execution Flow

No matter which folder you are looking at, FRC follows the same execution flow:

  1. Client Request: A developer, dashboard, or hardware device sends FRCL script instructions.
  2. Global Router (Netlify / Ingress): Evaluates user location and traffic.
  3. API Gateway (Express): Authenticates API keys and secures the payload.
  4. Broker (Redis): Buffers the tasks to prevent system crashes during high traffic.
  5. Execution Nodes (Docker / K8s): Asynchronously grab the job, run the AI model, and return the result.
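
The five stages above can be compressed into one runnable miniature. Every name and status code here is illustrative, and in real FRC the stages run in separate services rather than one function:

```python
# All five stages in miniature: route, authenticate, buffer, execute.
from collections import deque

def handle_request(script, headers, valid_keys=frozenset({"secret-key"})):
    """Walk one client request (stage 1) through stages 2-5."""
    jobs = deque()
    # 2. Global Router: pick a region from the request's location header.
    region = headers.get("x-country", "EU")
    # 3. API Gateway: reject requests without a valid key.
    if headers.get("x-api-key") not in valid_keys:
        return {"status": 401, "error": "unauthorized"}
    # 4. Broker: buffer the task before execution.
    jobs.appendleft({"script": script, "region": region})
    # 5. Execution Node: pull the job and run the model (stubbed here).
    job = jobs.pop()
    return {"status": 200, "region": job["region"],
            "result": f"executed {job['script']}"}
```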

This repository is the complete blueprint for an enterprise-scale AI operating system.

About

FRC7 (Fast Response Connection) is a platform that uses its own scripting language (FRCL, Fast Response Connection Language) to automate complex AI tasks. Instead of relying on a single central server, FRC distributes the workload across a global network of nodes.
