I built a CLI tool intended to standardize local development setup across microservices. The promise: one command, `dev bootstrap`, that discovers services, generates `.env` files, and starts containers via Docker Compose. In demos, it was magical. In real teams, it broke in 40% of setups due to bespoke scripts, Compose version drift, OS differences, and odd edge cases. The MVP automated too much, too early, and eroded trust.
This article explains what I built, why it failed, and how I would rebuild the MVP around a clear compatibility contract and a validator-first workflow that earns trust before automating.
The Context: Diverse Stacks, Fragile Automation
Microservice repos evolve organically. Teams glue together language-specific tools, local caches, custom scripts, and different container setups. A tool that tries to own the entire “bootstrap and run” flow without a shared contract is brittle.
What I Built (MVP Scope)
- Discovery: Scan repos for services via file patterns.
- Env Generation: Infer env keys from `docker-compose.yml` and sample `.env.example` files; produce a unified `.env` (see the sketch after this list).
- Compose Orchestration: Start all services locally with one command.
- Opinionated Defaults: Assume standard port ranges and common service names.
- Metrics: Time to first run, number of successful bootstraps per team.
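To make the fragility concrete, here is a minimal Python sketch of the kind of inference the MVP relied on, assuming conventional file names and a directory-per-service layout; the function name and heuristics are illustrative, not the actual implementation.

```python
import re
from pathlib import Path

# Matches lines like "DATABASE_URL=..." or "DATABASE_URL: ..." (uppercase keys only).
ENV_KEY = re.compile(r"^([A-Z][A-Z0-9_]*)\s*[=:]")

def guess_env_keys(repo_root: str) -> dict[str, set[str]]:
    """Guess env keys per service by scanning compose files and .env.example files.

    This is the fragile part: it assumes conventional file names and that the
    parent directory name is the service name, and it silently finds nothing
    for repos that deviate from those assumptions.
    """
    keys_by_service: dict[str, set[str]] = {}
    for path in Path(repo_root).rglob("*"):
        if path.name not in {"docker-compose.yml", "docker-compose.yaml", ".env.example"}:
            continue
        if not path.is_file():
            continue
        keys = keys_by_service.setdefault(path.parent.name, set())
        for line in path.read_text(encoding="utf-8", errors="ignore").splitlines():
            match = ENV_KEY.match(line.strip().lstrip("- "))
            if match:
                keys.add(match.group(1))
    return keys_by_service
```

Every assumption baked into a scan like this, from the file names to "directory name equals service name", is a place where a real repo can silently diverge.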
Launch and Early Results
- Solo demos worked spectacularly.
- Team pilots revealed fragility: custom scripts, non-standard Compose naming, and OS-specific quirks caused frequent failures.
- Trust dropped quickly; teams reverted to their known scripts.
Why It Failed: Over-Automation Without a Contract
I tried to automate the whole workflow without agreeing on a small, stable contract that teams could satisfy. Without a shared `dev.json` or similar spec, guessing env keys and start commands led to errors. Reliability suffered, and with dev tools, reliability is the MVP.
Root causes:
- Inference Errors: Guessing configurations from heterogeneous repos is error-prone.
- Hidden Assumptions: Opinionated defaults clashed with local reality.
- No Validation Step: Users couldn’t see or fix mismatches before automation ran.
The MVP I Should Have Built: Validate and Guide
Start with a minimal compatibility contract and a validator that helps teams conform incrementally.
- Contract: Each service exposes a `dev.json` containing ports, env keys, and the start command (example after this list).
- Validator CLI: `dev validate` checks conformance, explains gaps, and suggests fixes.
- Linter: Provide a linter for `dev.json` with clear error messages.
- Guided Setup: Generate `.env` from `dev.json` and start one service at a time.
- Telemetry: Track validation pass rate, categories of errors, and time to first successful run.
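As a rough illustration of the contract (the values here are hypothetical), a service's `dev.json` might look like:

```json
{
  "name": "service-a",
  "port": 8081,
  "env": ["DATABASE_URL", "REDIS_URL", "LOG_LEVEL"],
  "start": "npm run dev"
}
```

Small enough to write by hand, explicit enough that nothing has to be guessed.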
How It Would Work (Still MVP)
- Step 1: Teams add `dev.json` to each service with minimal fields.
- Step 2: Run `dev validate`; fix issues based on actionable messages.
- Step 3: Use `dev env` to generate environment files deterministically (sketch below).
- Step 4: Start one service with `dev run service-a`; expand to orchestration only after a high pass rate.
This builds trust by making the tool predictable and by exposing mismatches early.
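For `dev env`, determinism is the point: the output depends only on the contract, never on scanning the repo. A minimal Python sketch, assuming the `dev.json` fields described under "Technical Shape" below:

```python
import json
from pathlib import Path

def generate_env(service_dir: str) -> str:
    """Render a .env file for one service from its dev.json, deterministically.

    Keys come only from the contract and are emitted in sorted order, so the
    output never depends on guesswork or filesystem scan order.
    """
    contract = json.loads((Path(service_dir) / "dev.json").read_text())
    lines = [f"# generated from {contract['name']}/dev.json -- fill in the blanks"]
    lines += [f"{key}=" for key in sorted(contract.get("env", []))]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Hypothetical layout: ./service-a/dev.json exists.
    (Path("service-a") / ".env").write_text(generate_env("service-a"))
```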
Technical Shape
- Schema: `dev.json` with fields `{ name, port, env: [KEY], start: "cmd" }`.
- Validation Engine: JSON Schema + custom checks (port conflicts, missing env keys); sketched below.
- Compose Adapter: Optional; reads from `dev.json` to generate Compose fragments rather than infer from arbitrary files.
- Cross-Platform Tests: Simple checks for OS differences (path separators, shell commands).
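A minimal sketch of the validation engine in Python, assuming the third-party `jsonschema` package for the schema half and a hand-rolled cross-service check for port conflicts; the error wording and structure are illustrative:

```python
import json
from pathlib import Path

from jsonschema import Draft7Validator  # pip install jsonschema

# Minimal schema mirroring { name, port, env: [KEY], start: "cmd" }.
DEV_SCHEMA = {
    "type": "object",
    "required": ["name", "port", "env", "start"],
    "properties": {
        "name": {"type": "string", "minLength": 1},
        "port": {"type": "integer", "minimum": 1, "maximum": 65535},
        "env": {"type": "array", "items": {"type": "string", "pattern": "^[A-Z][A-Z0-9_]*$"}},
        "start": {"type": "string", "minLength": 1},
    },
}

def validate_contracts(service_dirs: list[str]) -> list[str]:
    """Return human-readable problems across a set of services.

    Combines per-file schema validation with cross-service checks (duplicate
    ports), which a plain JSON Schema cannot express on its own.
    """
    problems: list[str] = []
    ports: dict[int, str] = {}
    validator = Draft7Validator(DEV_SCHEMA)
    for service_dir in service_dirs:
        path = Path(service_dir) / "dev.json"
        if not path.exists():
            problems.append(f"{service_dir}: missing dev.json")
            continue
        contract = json.loads(path.read_text())
        for err in validator.iter_errors(contract):
            problems.append(f"{service_dir}: {err.message}")
        port = contract.get("port")
        if isinstance(port, int):
            if port in ports:
                problems.append(f"{service_dir}: port {port} already claimed by {ports[port]}")
            else:
                ports[port] = service_dir
    return problems
```

Cross-service checks like duplicate ports are exactly why the engine is schema plus custom checks rather than schema alone.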
Measuring Trust
- Validation Pass Rate: Percentage of services passing `dev validate`.
- First Successful Run: Time from install to one service running.
- Error Categories: Distribution helps prioritize adapters and docs.
- Rollback Incidents: Track how often teams abandon the tool mid-setup.
Onboarding and Documentation
- Quick Start: Create `dev.json` from a template; run `dev validate`.
- Troubleshooting: Clear guides for common errors with copy-paste fixes.
- Contracts Over Recipes: Emphasize the compatibility contract and why it exists.
Personal Reflections
I wanted the “it just works” moment so much that I skipped the steps that make “it just works” possible: a shared spec and a validator. Dev teams reward predictability over magic; trust is the currency.
Counterfactual Outcomes
With a validator-first MVP:
- Validation pass rate climbs from ~40% to ~80% in two months.
- Time to first successful run drops significantly.
- Teams adopt the tool gradually, and orchestration becomes feasible.
Iteration Path
- Add adapters for common stacks (Node, Python, Go).
- Introduce a `dev doctor` command that diagnoses OS and toolchain issues (sketched after this list).
- Expand the contract only as needed; resist auto-inference beyond the spec.
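A `dev doctor` could start as little more than a checklist; a minimal Python sketch, with the specific checks chosen here as examples rather than a fixed set:

```python
import shutil
import subprocess
import sys

def doctor() -> int:
    """Run a handful of environment checks, print results, and return a non-zero code on failure."""
    checks = {
        "docker on PATH": shutil.which("docker") is not None,
        "docker compose available": _compose_ok(),
        "python >= 3.9": sys.version_info >= (3, 9),
    }
    failures = 0
    for label, ok in checks.items():
        print(f"[{'ok' if ok else 'FAIL'}] {label}")
        failures += 0 if ok else 1
    return 1 if failures else 0

def _compose_ok() -> bool:
    # Treat any non-zero exit (or a missing binary) as "not available".
    try:
        result = subprocess.run(["docker", "compose", "version"], capture_output=True)
        return result.returncode == 0
    except OSError:
        return False

if __name__ == "__main__":
    raise SystemExit(doctor())
```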
Closing Thought
For dev tools, the smallest viable product is a trust-building tool: define a minimal contract, validate it, and guide teams to conformance. Automate only after reliability is demonstrated. Magic is delightful, but trust is what sticks.



