Why This Playbook?

Generative AI is often adopted quickly, but its usefulness in production depends on clear output contracts and safe fallbacks. This playbook shows a practical, mock-first implementation that teams can run locally in minutes.

Repos:

Concept Primer: What Is Generative AI?

Generative AI uses large language models (LLMs) to create new content from prompts, for example drafts, summaries, checklists, and test ideas. In this repository, the model generates structured artifacts from a task + context input.
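To make "structured artifacts from a task + context input" concrete, here is a minimal sketch of what the request and the expected JSON artifact might look like. The field names (task, context, title, items, priority) are illustrative assumptions, not taken from the repo's actual schema:

```javascript
// Hypothetical task + context request sent by the UI.
const request = {
  task: "Generate test cases for the login form",
  context: "Login supports email/password and MFA via TOTP.",
};

// Hypothetical structured artifact the model is prompted to return
// as JSON (rather than free-form prose).
const artifact = {
  title: "Login form test cases",
  items: [
    { id: "TC-1", name: "Valid email/password logs the user in", priority: "high" },
    { id: "TC-2", name: "Invalid TOTP code is rejected", priority: "high" },
  ],
};

console.log(`${artifact.items.length} test cases for: ${request.task}`);
```

The point of a fixed shape like this is that the server can validate it and the UI can render it without guessing at the model's formatting.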

Broader GenAI Use Cases

  • Requirement and user-story drafting.
  • PR/change summaries and release notes.
  • Incident/postmortem first drafts.
  • Knowledge article/support response drafting.
  • Test strategy and checklist generation.

Demo scope in this repo:

  • For repeatability, this demo focuses on test case generation from task + context.

Concept Comparison (GenAI vs Agentic vs RAG)

User Need
   |
   +--> Fast content draft from prompt/context
   |      -> Choose GENERATIVE AI
   |
   +--> Multi-step planning + tool orchestration
   |      -> Choose AGENTIC AI
   |
   +--> Answers grounded in source documents with citations
          -> Choose RAG
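The decision tree above can be expressed as a tiny helper. This is a sketch for illustration only; the predicate names are invented, not part of the repo:

```javascript
// Illustrative chooser mirroring the decision tree above.
function chooseApproach(need) {
  // Answers must be grounded in source documents with citations.
  if (need.groundedAnswersWithCitations) return "RAG";
  // Multi-step planning plus tool orchestration.
  if (need.multiStepToolOrchestration) return "Agentic AI";
  // Default: fast content draft from prompt/context.
  return "Generative AI";
}

console.log(chooseApproach({ groundedAnswersWithCitations: true }));
```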

What It Demonstrates

  • Structured prompt contract with JSON output.
  • Guardrails for parsing/normalizing model output.
  • Mock-first mode for offline demo reliability.
  • Provider mode using OpenAI with no UI changes.
  • Scenario runner for repeatable runs and report artifacts.
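The "guardrails for parsing/normalizing model output" bullet is the part most worth seeing in code. The sketch below is a minimal, assumed version of that idea (the repo's real logic lives in src/services/generator.js and may differ); it strips markdown fences, parses JSON, enforces the fields the UI expects, and falls back safely on garbage output:

```javascript
// Minimal guardrail sketch: parse model output, normalize its shape,
// and fall back to a safe default if parsing fails. Field names are
// illustrative assumptions, not the repo's actual schema.
function parseModelOutput(raw, fallback) {
  try {
    // Some models wrap JSON in markdown code fences; strip them first.
    const cleaned = raw
      .replace(/^```(?:json)?\s*/i, "")
      .replace(/```\s*$/, "");
    const parsed = JSON.parse(cleaned);
    // Normalize: guarantee the fields the UI renders.
    return {
      title: typeof parsed.title === "string" ? parsed.title : fallback.title,
      items: Array.isArray(parsed.items) ? parsed.items : fallback.items,
    };
  } catch {
    // Unparseable output: return the fallback rather than crashing.
    return fallback;
  }
}

const fallback = { title: "Fallback result", items: [] };
console.log(parseModelOutput('```json\n{"title":"T","items":[1]}\n```', fallback));
console.log(parseModelOutput("not json at all", fallback));
```

The design choice worth noting: normalization happens server-side, so the UI never has to defend against a malformed model response.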

Flow

  1. The user submits a task + context from the UI.
  2. The API validates the request and builds a structured prompt.
  3. The model selector chooses the mock or a provider.
  4. The model returns JSON output.
  5. The server validates/parses the response and applies a fallback if needed.
  6. The UI renders the normalized structured result.
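The six steps above can be sketched as one request handler. Everything here (handleGenerate, model.complete, the prompt wording, the fallback shape) is a hypothetical simplification of what the demo app does, not its actual code:

```javascript
// Hypothetical end-to-end flow: validate -> build prompt -> call model
// -> parse with fallback -> return a normalized result for the UI.
async function handleGenerate(body, model) {
  // Step 2: validate the request.
  if (!body.task || !body.context) {
    return { status: 400, error: "task and context are required" };
  }
  // Step 2: build a structured prompt.
  const prompt = `Task: ${body.task}\nContext: ${body.context}\nRespond with JSON only.`;
  // Steps 3-4: the selected model (mock or provider) returns a JSON string.
  const raw = await model.complete(prompt);
  // Step 5: parse, applying a safe fallback on bad output.
  let result;
  try {
    result = JSON.parse(raw);
  } catch {
    result = { title: body.task, items: [] };
  }
  // Step 6: normalized result for the UI to render.
  return { status: 200, result };
}

// Usage with a mock model that always returns valid JSON:
const mock = { complete: async () => '{"title":"Demo","items":[]}' };
handleGenerate({ task: "t", context: "c" }, mock).then((r) => console.log(r.status));
```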

ASCII Diagram

User (UI)
   |
   v
POST /api/generate -> Validate Input -> Build Prompt
                                    |
                                    v
                          Model Selector (Mock/OpenAI)
                                    |
                                    v
                             JSON Completion
                                    |
                                    v
                     Parse/Guard/Normalize Output
                                    |
                                    v
                               UI Response

Provider Support

  • OpenAI is integrated out of the box.
  • Other providers (Gemini, Claude, etc.) can be added via provider adapters in demo-app/src/providers/ and the provider-selection logic in src/services/generator.js.
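One plausible shape for such an adapter and its selection logic is sketched below. The interface (a name plus an async complete() method) and the environment-based selection are assumptions for illustration; the actual adapters in demo-app/src/providers/ may be structured differently:

```javascript
// Illustrative provider adapter: each provider exposes the same
// complete(prompt) interface, so the rest of the app never changes.
const mockProvider = {
  name: "mock",
  async complete(prompt) {
    // Deterministic offline response for demos.
    return JSON.stringify({ title: "Mock result", items: [] });
  },
};

// Hypothetical selection logic keyed on environment configuration:
// use a real provider when its key is set, otherwise stay offline-safe.
function selectProvider(env, providers) {
  if (env.OPENAI_API_KEY && providers.openai) return providers.openai;
  return providers.mock;
}

const chosen = selectProvider({}, { mock: mockProvider });
console.log(chosen.name);
```

Because the UI and server only depend on the shared complete() interface, adding a new provider means adding one adapter file and one branch in the selector, which matches the "no UI changes" claim above.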

Quickstart

git clone https://github.com/amiya-pattnaik/generativeAI-engineering-playbook.git
cd generativeAI-engineering-playbook/demo-app
cp .env.example .env
npm install
npm run dev
# open http://localhost:3000

Use OpenAI:

  • Set OPENAI_API_KEY (and optionally OPENAI_MODEL) in .env.

Run and Evaluate

npm run demo:scenarios
npm run demo:scenario -- scenarios/login-mfa.json

Outputs are written to demo-app/reports/ (JSON + Markdown).

Closing Thought

Generative AI is strongest when treated as an engineering interface with contracts, validation, and clear fallback behavior, not as an unconstrained text box.