
OSS · 2026 · NestJS · LangGraph

Ship AI SaaS on NestJS without rebuilding the platform layer.

A modular NestJS + LangGraph AI platform delivered as 14 publishable libraries — orchestrated workflows, real-time token streaming, durable checkpoint/replay, human-in-the-loop gating, and unified vector + graph memory (ChromaDB + Neo4j). The opinionated foundation for shipping AI SaaS on NestJS.


Most NestJS AI projects rebuild the same plumbing — streaming, checkpoints, replay, memory, approval gates — from scratch every time, and inconsistently. This starter ships 14 focused libraries that solve those cross-cutting concerns once: graph-based agent orchestration via decorators, token/event WebSocket streaming, deterministic checkpoint + replay timelines, semantic vector + graph memory fusion, and removable HITL approval gates. Drop them into an Nx workspace and you have an enterprise-grade AI SaaS scaffold on day one.
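The graph-based orchestration at the core of the workflow libraries can be sketched in a few lines. This is a minimal, hypothetical stand-in (the real libraries expose it through NestJS decorators; `StateGraph`, `addNode`, and `addEdge` here are illustrative names, not the published API):

```typescript
// Minimal sketch of graph-based agent orchestration: named nodes
// transform a shared state object, edges decide what runs next.
type NodeFn<S> = (state: S) => S | Promise<S>;

class StateGraph<S> {
  private nodes = new Map<string, NodeFn<S>>();
  private edges = new Map<string, string>();

  addNode(name: string, fn: NodeFn<S>): this {
    this.nodes.set(name, fn);
    return this;
  }

  addEdge(from: string, to: string): this {
    this.edges.set(from, to);
    return this;
  }

  // Run from `start` until a node has no outgoing edge.
  async run(start: string, state: S): Promise<S> {
    let current: string | undefined = start;
    while (current) {
      const fn = this.nodes.get(current);
      if (!fn) throw new Error(`Unknown node: ${current}`);
      state = await fn(state);
      current = this.edges.get(current);
    }
    return state;
  }
}

interface DraftState { prompt: string; draft: string; approved: boolean }

const graph = new StateGraph<DraftState>()
  .addNode('plan', (s) => ({ ...s, draft: `plan for: ${s.prompt}` }))
  .addNode('review', (s) => ({ ...s, approved: s.draft.length > 0 }))
  .addEdge('plan', 'review');

graph
  .run('plan', { prompt: 'landing page', draft: '', approved: false })
  .then((s) => console.log(s.approved)); // prints true
```

In the actual libraries this wiring is declared via decorators on NestJS providers rather than built by hand, which is what lets checkpointing and streaming hook into every node transition.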

The challenge

Designing a NestJS AI platform where streaming, durability, memory fusion, and human-in-the-loop are not bolted on per-project but expressed as composable libraries that can be opted in or removed without rewriting consumer code.
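The "removable without rewriting consumer code" property comes down to coding against an interface and swapping the binding. A minimal sketch of the pattern (names like `ApprovalGate` and `deploy` are illustrative, not the library's actual API):

```typescript
// Consumers depend only on the interface; binding a no-op
// implementation disables HITL with zero changes at call sites.
interface ApprovalGate {
  requestApproval(action: string): Promise<boolean>;
}

class InteractiveApprovalGate implements ApprovalGate {
  // In the real platform this would surface an approval modal
  // to a human over WebSocket; here it just blocks the action.
  async requestApproval(action: string): Promise<boolean> {
    console.log(`awaiting human approval for: ${action}`);
    return false; // pessimistic until a human approves
  }
}

class NoopApprovalGate implements ApprovalGate {
  // Opting out of HITL: every action is auto-approved.
  async requestApproval(): Promise<boolean> {
    return true;
  }
}

// Consumer code is identical either way; only the binding changes.
async function deploy(gate: ApprovalGate): Promise<string> {
  const ok = await gate.requestApproval('deploy to production');
  return ok ? 'deployed' : 'blocked';
}

deploy(new NoopApprovalGate()).then(console.log); // prints "deployed"
```

In a NestJS module this swap is a one-line provider change (`useClass: NoopApprovalGate`), which is the sense in which the HITL library is removable without touching workflow code.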

The outcome

An open-source Nx workspace of 14 NestJS libraries that ships an enterprise AI SaaS scaffold end-to-end — graph orchestration, WebSocket streaming, deterministic replay, vector + graph memory, and removable HITL — with reference apps demonstrating every pillar.

Technical approach

  • LangGraph orchestration — graph-based agent execution with declarative + functional decorator APIs
  • Real-time streaming — token, event, and progress streams over WebSocket via method-level decorators
  • Durable checkpoint & replay — resume after failure with deterministic timeline replay that re-emits original token cadence
  • Memory fusion — semantic vector retrieval (ChromaDB) cascaded into graph-relationship expansion (Neo4j)
  • Human-in-the-Loop gating — approval modals + intervention gates, removable with zero code churn (no-op fallback)
  • Multi-agent coordination — typed role boundaries between architect, executor, reviewer agents
  • OpenRouter default + Ollama + HuggingFace — 100+ models, fully local inference, embedding-only modes
  • Time-travel replay — execution timelines replayable for debugging and exploration
  • Optional modularity — every cross-cutting feature is injectable; opt out without forking
  • Nx monorepo of 14 publishable libraries — core contracts, workflow engine, streaming, checkpointing, multi-agent, HITL, memory fusion, time-travel, monitoring, platform API, persistence adapters
  • Demo apps included — dev-brand-api (NestJS) and dev-brand-ui (Angular + Playwright E2E)
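The memory-fusion cascade above can be sketched with in-memory stand-ins for ChromaDB and Neo4j. The scoring, data shapes, and function names here are illustrative assumptions, not the library's API; the point is the two-stage shape: semantic vector retrieval first, then graph-relationship expansion of the hits:

```typescript
// Sketch of vector → graph memory fusion with in-memory stubs.
interface Doc { id: string; text: string; embedding: number[] }

const cosine = (a: number[], b: number[]): number => {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
};

// Stage 1: semantic vector retrieval (ChromaDB's role).
function vectorSearch(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

// Stage 2: graph-relationship expansion (Neo4j's role) — pull in
// documents linked to the vector hits even if semantically distant.
function graphExpand(
  hits: Doc[],
  edges: Map<string, string[]>,
  docs: Doc[],
): Doc[] {
  const ids = new Set(hits.map((d) => d.id));
  for (const hit of hits) {
    for (const neighbor of edges.get(hit.id) ?? []) ids.add(neighbor);
  }
  return docs.filter((d) => ids.has(d.id));
}

const docs: Doc[] = [
  { id: 'a', text: 'auth flow', embedding: [1, 0] },
  { id: 'b', text: 'token refresh', embedding: [0.9, 0.1] },
  { id: 'c', text: 'billing webhook', embedding: [0, 1] },
];
const edges = new Map([['a', ['c']]]); // auth doc links to billing doc

const fused = graphExpand(vectorSearch([1, 0], docs, 2), edges, docs);
console.log(fused.map((d) => d.id)); // a and b via vectors, c via the graph edge
```

The payoff of the cascade is the last line: a pure vector query for the auth-shaped embedding would never surface the billing document, but the graph edge pulls it in as related context.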

Results at a glance

  • 14 publishable libraries
  • Nx monorepo workspace
  • ChromaDB vector store
  • Neo4j graph store
  • OpenRouter default provider (100+ models)