AI Build System · v1.2.2

Describe the business.
Get the platform.

b2dp is a business-to-data-platform orchestrator for AI coding agents. It turns plain-English product ideas into schemas, backend code, UI contracts, tests, MCP setup, and infrastructure-ready output.

b2dp · setup or generate
$ b2dp setup
skills installed for Codex, Claude, Gemini
MCP servers written to agent config
you › Build a platform for a ride-sharing startup
  your preferred agent now uses b2dp skills + MCP tools directly
$ b2dp generate "Build a platform for a ride-sharing startup" --agent codex
► Spawning offloaded agent run...
  live progress · phase updates · interrupt support
► Analyzing business requirements...
  passengers, drivers, vehicles, rides, payouts, ratings
► Generating backend, data layer, tests, and infrastructure...
  TypeScript · SQL · vitest · docker · UI data models
✓ Platform ready
The pipeline

Two ways to use b2dp

You can either install b2dp into your preferred coding agent and prompt that agent directly, or you can use b2dp generate to offload the whole job to a spawned agent run with live progress output.

01
🧠

Install Once

Run b2dp setup to install the orchestrator skill set, write rules, and configure MCP servers for your chosen agents.

02
📐

Prompt Your Agent

After setup, talk to Codex, Claude, Gemini, VS Code, or Antigravity normally and let that agent invoke b2dp skills and MCP tools.

03
🗄️

Or Offload Generation

Use b2dp generate when you want the CLI itself to spawn an agent run, stream progress logs, and keep generation isolated in its own workspace.

04
🌱

Model The Business

Whichever path you choose, b2dp translates the product idea into entities, workflows, service boundaries, and data relationships.

05
📊

Generate The Stack

The orchestrator can produce schemas, seed data, repositories, analytics queries, tests, UI contracts, and deployment scaffolding.

06
💻

See Progress Clearly

In generate mode, the CLI shows agent logs, phase updates, and interrupt options so long-running builds stay understandable.

Deliverables

Everything needed to start building

The output is shaped for real implementation work: data model, executable setup, code scaffolding, contracts, and operational context.

🏗️

Business Analysis

Structured breakdown of product concepts, workflows, actors, and domain assumptions.

markdown
📄

Schema & Migrations

Tables, indexes, migration-ready SQL, and conventions your team can extend.

sql · ts/py

Provision & Seed

Database creation, execution logs, and realistic sample data for local iteration.

auto-execute

Repos & UI Contracts

Repository scaffolds, API shapes, and frontend-friendly data contracts instead of throwaway mock arrays.

ts · cs · py · java
🔍

Cache & Search

Paths for Redis, Elasticsearch, and adjacent data services when the system needs more than Postgres.

architecture
📊

Analytics Queries

Queries that help teams answer revenue, operations, funnel, and performance questions earlier.

queries.sql

Install b2dp and connect the toolchain

What b2dp needs

Claude Code, Codex, Gemini CLI, Antigravity, or VS Code with MCP support enabled
The b2dp CLI — use b2dp setup to install skills, configure MCP servers, and write the right agent-specific rules
Datafy MCP Server — local or remote data sources, each defined as a source in your dbhub.toml
The b2dp sibling skills (cloud-solution-architect, api-test-generator, frontend-data-consumer, infrastructure-as-code-architect) — installed by the CLI for the full design → test → UI → IaC workflow
Optional MCPs — Context7, Prisma, GitHub, Redis, and search tools enrich the workflow when available

What the ecosystem exposes

(where <id> is the source id from your dbhub.toml)

execute_sql_<id> — Execute DDL or DML queries
execute_admin_sql_<id> — Execute admin queries (e.g. CREATE DATABASE)
search_objects_<id> — Search and list database objects (schemas, tables, columns, indexes)
generate_code — Convert SQL to TypeScript or C# (Prisma, Dapper, EF Core)
github-mcp-server — For repository discovery and CI/CD setup
context7-mcp — Fetches latest best practices and patterns
prisma-mcp-server — For migrations and database exploration
redis_command_<id> — Execute Redis commands
elasticsearch_search_<id> — Execute Elasticsearch search and analytics queries
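
For context, an MCP client invokes these tools through the protocol's standard `tools/call` request. The request shape below follows the MCP specification; the argument name `sql` and the query itself are illustrative assumptions, since Datafy's exact parameter schema isn't documented here:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_sql_pet_market",
    "arguments": {
      "sql": "SELECT id, name FROM pets LIMIT 5"
    }
  }
}
```

Your agent builds these requests for you — this is only what travels over the wire between agent and MCP server.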
Step 1 — Install the CLI and run setup
$ npm install -g @teckedd-code2save/b2dp
$ b2dp setup
# Or let b2dp auto-detect configured agents
$ b2dp setup --yes
# npm package: https://www.npmjs.com/package/@teckedd-code2save/b2dp
Step 2 — Confirm your MCP configuration
# Example JSON MCP entry used by several supported agents
{
  "mcpServers": {
    "datafy": {
      "command": "npx",
      "args": [
        "@teckedd-code2save/datafy@latest",
        "--config",
        "/path/to/your/dbhub.toml",
        "--transport",
        "stdio"
      ]
    }
  }
}
# Codex uses ~/.codex/config.toml and b2dp writes the TOML shape for you.
# IMPORTANT: Restart your AI assistant / editor for changes to take effect.
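
If you do need to write the Codex entry by hand, the TOML equivalent of the JSON above looks roughly like this (the `[mcp_servers]` table name follows current Codex CLI conventions — verify against your installed Codex version):

```toml
# ~/.codex/config.toml — equivalent of the JSON MCP entry above
[mcp_servers.datafy]
command = "npx"
args = [
  "@teckedd-code2save/datafy@latest",
  "--config",
  "/path/to/your/dbhub.toml",
  "--transport",
  "stdio"
]
```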
Step 3 — Create your dbhub.toml
# dbhub.toml — define all your data sources
# PostgreSQL (required)
[[sources]]
id = "pet_market" # → execute_sql_pet_market, search_objects_pet_market
type = "postgres"
host = "localhost"
port = 5432
database = "petty"
user = "postgres"
password = "postgres"
lazy = true
# Redis — session / cache (optional)
[[sources]]
id = "session_storage" # → redis_command_session_storage
type = "redis"
host = "localhost"
port = 6379
database = "0"
lazy = true
description = "Redis for session storage"
# Elasticsearch — logs & analytics (optional)
[[sources]]
id = "logs_and_analytics" # → elasticsearch_search_logs_and_analytics
type = "elasticsearch"
host = "localhost"
port = 9200
lazy = true
description = "Elasticsearch for logs and analytics"
💡 Each [[sources]] id becomes the MCP tool suffix — so "pet_market" automatically exposes execute_sql_pet_market and search_objects_pet_market to the agent.
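
The id-to-tool naming convention can be sketched as follows; `toolsForSource` is a hypothetical helper written here to illustrate the pattern, not part of b2dp or Datafy:

```typescript
// Hypothetical illustration: each [[sources]] id becomes the suffix of the
// MCP tools that Datafy exposes for that source, keyed by source type.
function toolsForSource(
  id: string,
  type: "postgres" | "redis" | "elasticsearch"
): string[] {
  switch (type) {
    case "postgres":
      return [
        `execute_sql_${id}`,
        `execute_admin_sql_${id}`,
        `search_objects_${id}`,
      ];
    case "redis":
      return [`redis_command_${id}`];
    case "elasticsearch":
      return [`elasticsearch_search_${id}`];
  }
}

// "pet_market" yields execute_sql_pet_market, execute_admin_sql_pet_market,
// and search_objects_pet_market, matching the tip above.
console.log(toolsForSource("pet_market", "postgres"));
```

Renaming a source id in dbhub.toml therefore renames the tools your agent sees, so pick stable, descriptive ids up front.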
Option A — prompt your configured agent directly
you › Build a backend for a subscription meal-kit service
# Your preferred agent uses the installed b2dp skills and MCP tools
# to reason, provision, generate code, and verify output.
Option B — offload to b2dp generate
$ b2dp generate "Build a platform for a subscription meal-kit service" --agent codex
# b2dp spawns the selected agent in a fresh workspace,
# streams logs, shows phases, and keeps the run isolated.
Use cases

Who this is actually for

b2dp is strongest when you already know the product shape and want your agents to stop wasting time on repetitive backend setup, wiring, and structural drift.

Indie SaaS builders

Go from a product idea to a credible backend foundation faster, without hand-assembling schema, repos, tests, and infra prompts every single time.

Agencies and consultants

Standardize how client backends get scaffolded so every new engagement starts from a more repeatable architecture and delivery workflow.

Startup engineers

Turn messy requirement docs into structured backend artifacts and reduce drift between product language, schema design, and implementation plans.

Stop rebuilding backend setup from scratch.

Install b2dp, wire your agents once, and let the ecosystem handle the boring part: getting from product intent to a usable backend delivery workflow.