SIM AI: The Open-Source Platform for Building AI Agent Workflows

Demo: https://www.youtube.com/watch?v=uQJbt3aPctA

The rise of agentic AI systems is changing how we design and deploy intelligent applications. Instead of relying on a single large model, platforms like SIM AI let us orchestrate multiple agents, each with a specialized role, to work together in workflows that are modular, explainable, and powerful.

SIM AI is open-source, fast-growing, and designed for developers who want to self-host AI workflows with complete control over their data.


Why SIM AI is Trending

What makes SIM AI stand out is its ability to bring together agents, memory, knowledge, routing, workflows, and loops in a clean visual interface. You can define agents, watch them collaborate step by step, and view the output from each node in real time.

This modular approach not only makes workflows easier to design but also simplifies debugging and customization.
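To make the idea concrete, here is a toy sketch of a node-based workflow. This is not SIM's actual API; the types and function names below are illustrative only. It models a workflow as a chain of named nodes where each node transforms the previous node's output and records its own result, which is what makes per-node inspection and debugging straightforward:

```typescript
// Illustrative sketch only: SIM's real node types and APIs differ.
// A workflow is a chain of named nodes; each node transforms the
// previous node's output and records its result for inspection.

type AgentNode = {
  name: string;
  run: (input: string) => string;
};

function runWorkflow(nodes: AgentNode[], input: string): Record<string, string> {
  const outputs: Record<string, string> = {};
  let current = input;
  for (const node of nodes) {
    current = node.run(current);   // each node sees the previous node's output
    outputs[node.name] = current;  // per-node output, like a visual inspector pane
  }
  return outputs;
}

// Two toy "agents": a summarizer and a formatter.
const outputs = runWorkflow(
  [
    { name: "summarize", run: (s) => s.slice(0, 10) },
    { name: "format", run: (s) => `Summary: ${s}` },
  ],
  "Agentic AI systems coordinate specialized agents."
);
```

Because every node's output is captured separately, a misbehaving step can be isolated without rerunning the whole pipeline.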


Getting Started with SIM AI

SIM AI provides multiple ways to run the platform locally, giving developers flexibility:

1. NPM Quick Start

If you just want to try it out, a single command is enough:

npx simstudio        

By default, it runs on http://localhost:3000.

2. Docker Compose Setup

For a production-like environment, you can clone the repo and launch SIM with Docker Compose:

git clone https://github.com/simstudioai/sim.git
cd sim
docker compose -f docker-compose.prod.yml up -d        

Access the app at http://localhost:3000.

3. Running with Local Models (Ollama)

Privacy-first? SIM supports running fully offline with Ollama.

  • GPU setup:

docker compose -f docker-compose.ollama.yml --profile setup up -d        

  • CPU setup:

docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d        

4. Developer-Friendly Setup

For developers, SIM offers:

  • VS Code Dev Containers integration
  • Manual setup with Bun runtime + PostgreSQL + pgvector
  • Easy environment configuration via .env
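As a hedged illustration of the .env step: the variable names below are typical for a Postgres + Better Auth + Next.js stack, not taken from SIM's documentation; consult the repository's example environment file for the actual keys.

```
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/sim
BETTER_AUTH_SECRET=replace-with-a-long-random-string
NEXT_PUBLIC_APP_URL=http://localhost:3000
```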

Once everything is ready, you can run the Next.js app and the realtime socket server together with:

bun run dev:full        

Knowledge Integration with RAG

One of SIM’s most powerful features is the ability to add custom knowledge bases.

With the Knowledge RAG pipeline, you can upload your own documents, such as company data, private files, or policy manuals. These become available to your agents at runtime, allowing them to reason with your organization's context instead of relying only on general AI knowledge.

This transforms SIM into a domain-specific assistant that can adapt to your unique business needs.
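Conceptually, retrieval-augmented generation boils down to embedding documents, embedding the query, and handing the closest chunks to the agent. The sketch below is a self-contained toy, not SIM's internal pipeline (which stores learned embeddings in pgvector): the `embed` function here is a hypothetical stand-in that counts keyword occurrences so the example runs without a model.

```typescript
// Toy RAG retrieval sketch. Real systems use learned embeddings and a
// vector store; here a bag-of-words "embedding" keeps it self-contained.

type Doc = { id: string; text: string };

// Hypothetical stand-in for an embedding model: word counts over a small vocab.
function embed(text: string, vocab: string[]): number[] {
  const words = text.toLowerCase().split(/\W+/);
  return vocab.map((v) => words.filter((w) => w === v).length);
}

// Cosine similarity between two vectors (0 when either vector is all zeros).
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const na = Math.sqrt(a.reduce((s, x) => s + x * x, 0));
  const nb = Math.sqrt(b.reduce((s, x) => s + x * x, 0));
  return na && nb ? dot / (na * nb) : 0;
}

// Rank documents by similarity to the query and return the top k.
function retrieve(query: string, docs: Doc[], vocab: string[], k: number): Doc[] {
  const q = embed(query, vocab);
  return [...docs]
    .sort((x, y) => cosine(embed(y.text, vocab), q) - cosine(embed(x.text, vocab), q))
    .slice(0, k);
}

const vocab = ["refund", "policy", "holiday", "schedule"];
const docs: Doc[] = [
  { id: "hr-1", text: "Holiday schedule for 2024" },
  { id: "fin-1", text: "Refund policy: refunds within 30 days" },
];
const top = retrieve("what is the refund policy", docs, vocab, 1);
```

The retrieved chunks would then be injected into the agent's prompt, which is how uploaded company documents end up shaping the agent's answers.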



Tech Stack Behind SIM AI

Under the hood, SIM AI is built with:

  • Next.js + Bun runtime
  • PostgreSQL + pgvector + Drizzle ORM
  • Better Auth for authentication
  • ReactFlow for visual workflows
  • Zustand for state management
  • Tailwind + Shadcn for UI
  • Socket.io for realtime communication
  • Trigger.dev for background jobs
  • Monorepo managed with Turborepo

This modern stack makes it both developer-friendly and scalable.

Open Source & Community

SIM AI is licensed under Apache 2.0 and welcomes contributions. You can check out the GitHub repository for setup guides, examples, and contribution details.


