
# flyto-ai

Natural language to executable automation. flyto-ai turns plain language descriptions into reusable YAML workflows using 412+ pre-built modules from flyto-core.

## Why flyto-ai?

Traditional AI coding tools generate code that is then executed directly, which is non-deterministic and error-prone. flyto-ai takes a different approach:

- The LLM never writes code; it only selects modules and fills in parameters
- Every execution produces a reusable YAML workflow
- Results are deterministic: the same parameters always produce the same results
- Zero-cost replays: save learned blueprints and skip the LLM on reruns
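A generated workflow might look like the following sketch. The module IDs and parameter names here are illustrative only, not taken from flyto-core's actual catalog:

```yaml
# Hypothetical workflow sketch: fetch a page, extract links, save results.
# Module IDs (http.fetch, html.extract, file.write) are made up for illustration.
name: scrape-top-posts
steps:
  - module: http.fetch
    params:
      url: https://news.ycombinator.com
  - module: html.extract
    params:
      selector: ".titleline a"
      limit: 10
  - module: file.write
    params:
      path: posts.json
```

Because each step is a module ID plus fixed parameters, re-running the file reproduces the same pipeline without involving the LLM again.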

## Quick Start

### Installation

```bash
pip install flyto-ai
```

### Interactive Chat

```bash
flyto-ai
```

### One-shot Execution

```bash
flyto-ai chat "scrape the top 10 posts from Hacker News"
```

### YAML-only Mode (no execution)

```bash
flyto-ai chat "scrape example.com" --plan
```

## Architecture

```
User Message
  → LLM (OpenAI / Anthropic / Ollama)
    → Tool Selection (412 flyto-core modules)
      → Module Execution (deterministic)
        → Results + YAML Workflow
          → Blueprint Learning (self-improving)
```
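The blueprint-learning step above is what makes reruns free: once a request has been planned, the saved workflow can be replayed without calling the LLM. A minimal sketch of that cache-then-skip pattern in Python (the `BlueprintStore` class and `call_llm` hook are hypothetical, not flyto-ai's actual API):

```python
import hashlib


class BlueprintStore:
    """Hypothetical cache mapping a request string to a learned workflow."""

    def __init__(self):
        self._blueprints = {}

    def _key(self, request: str) -> str:
        # A stable hash of the request serves as the lookup key.
        return hashlib.sha256(request.encode()).hexdigest()

    def get(self, request: str):
        return self._blueprints.get(self._key(request))

    def save(self, request: str, workflow: dict) -> None:
        self._blueprints[self._key(request)] = workflow


def run(request: str, store: BlueprintStore, call_llm) -> dict:
    """First run: ask the LLM to plan a workflow.
    Reruns: replay the saved blueprint and skip the LLM entirely."""
    workflow = store.get(request)
    if workflow is None:
        workflow = call_llm(request)  # expensive: one LLM planning call
        store.save(request, workflow)
    return workflow  # executed deterministically by the module runtime
```

On the second identical request the store returns the cached workflow, so the LLM is invoked exactly once per distinct request.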

### Key Components

| Component | Purpose |
| --- | --- |
| Agent Core | Chat loop, tool dispatch, safety layers |
| LLM Providers | OpenAI, Anthropic, Ollama with failover |
| Claude Code Agent | AI-driven code fixes with verification |
| Memory | Persistent conversation memory with semantic search |
| Channels | Telegram, Discord, Slack integrations |
| Prompt System | Three-layer prompt architecture |

## Supported LLM Providers

| Provider | Models | Best For |
| --- | --- | --- |
| OpenAI | gpt-4o, gpt-4o-mini | Function calling |
| Anthropic | Claude Sonnet 4.5, Haiku | Reasoning + multi-step |
| Ollama | Qwen, Llama, Mistral | Local inference, no API costs |

Providers can be chained with automatic failover on 429/5xx errors.
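That chaining can be sketched as follows; the `ProviderError` type and the provider call signature are assumptions for illustration, not flyto-ai's actual interfaces:

```python
class ProviderError(Exception):
    """Hypothetical error carrying the HTTP status returned by a provider."""

    def __init__(self, status: int):
        super().__init__(f"provider returned {status}")
        self.status = status


# Statuses worth retrying on the next provider: rate limits and server errors.
RETRYABLE = {429, 500, 502, 503, 504}


def complete(prompt: str, providers) -> str:
    """Try each provider in order; fall through to the next only on 429/5xx."""
    last_error = None
    for call in providers:
        try:
            return call(prompt)
        except ProviderError as err:
            if err.status not in RETRYABLE:
                raise  # a non-retryable error (e.g. 401) is a hard failure
            last_error = err
    raise RuntimeError("all providers failed") from last_error
```

With `providers=[openai_call, anthropic_call, ollama_call]`, a 429 from the first provider transparently falls through to the second, while an authentication error surfaces immediately.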

Released under the Apache 2.0 License.