
Prompt Engineering

Prompt engineering is the practice of crafting effective text prompts to get desired outputs from LLMs such as ChatGPT, Claude, or Gemini. It includes techniques such as role assignment and few-shot examples.

Definition: The practice of designing, testing, and refining text prompts to elicit accurate, useful, or specific responses from large language models (LLMs) such as ChatGPT, Claude, Gemini, and Perplexity. As of 2026, it is considered both an art and a discipline.

How it works

LLMs are highly sensitive to how a question is phrased. Small changes in wording, structure, supplied context, or included examples can produce dramatically different outputs. Prompt engineering encompasses the techniques that consistently produce better ones.

Example

Weak prompt: 'Write a marketing email.' Better prompt: 'You are a B2B SaaS marketing expert. Write a 150-word cold email to a head of operations at a 200-person manufacturing company introducing facility maintenance software. Tone: professional, direct, no buzzwords. Include a specific value proposition and a clear single call-to-action.' The second prompt provides role, context, length, audience, tone, and structure — producing a much more usable output.
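The components called out above (role, context, length, audience, tone, structure) can be sketched as a small template builder. The function and parameter names here are illustrative, not part of any real library:

```python
def build_prompt(role, task, audience, length_words, tone, constraints=None):
    """Assemble a structured prompt from explicit components,
    one component per line."""
    parts = [
        f"You are {role}.",
        f"Task: {task}",
        f"Audience: {audience}",
        f"Length: about {length_words} words.",
        f"Tone: {tone}.",
    ]
    if constraints:
        parts.append("Constraints: " + "; ".join(constraints))
    return "\n".join(parts)

# Reconstructing the 'better prompt' from the example above:
prompt = build_prompt(
    role="a B2B SaaS marketing expert",
    task="write a cold email introducing facility maintenance software",
    audience="head of operations at a 200-person manufacturing company",
    length_words=150,
    tone="professional, direct, no buzzwords",
    constraints=[
        "include a specific value proposition",
        "one clear call-to-action",
    ],
)
```

Keeping each component a named argument makes it easy to vary one element at a time (say, the tone) and compare outputs, which is the core loop of prompt engineering.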

Comparison + context

Common techniques include:
- Role assignment ('You are a senior software engineer...')
- Few-shot examples (providing 2-3 examples of the desired output)
- Chain-of-thought ('Think step by step...')
- Output structure specification ('Return a JSON array with fields: name, email, score')
- Constraint setting ('Limit to 100 words, no bullet points')
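Few-shot prompting and output structure specification combine naturally: worked examples show the model the exact format to imitate. A minimal sketch, with hypothetical helper and field names:

```python
def few_shot_prompt(instruction, examples, query):
    """Build a few-shot prompt: an instruction, worked input/output
    examples, then the new input left for the model to complete."""
    blocks = [instruction]
    for inp, out in examples:
        blocks.append(f"Input: {inp}\nOutput: {out}")
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# Two examples demonstrating the desired JSON shape:
examples = [
    ("The delivery was two weeks late.",
     '{"sentiment": "negative", "score": 0.2}'),
    ("Support resolved my issue in minutes.",
     '{"sentiment": "positive", "score": 0.9}'),
]
prompt = few_shot_prompt(
    "Classify the sentiment of each review. "
    "Return JSON with fields: sentiment, score.",
    examples,
    "The product works, but setup was confusing.",
)
```

Ending the prompt with a bare "Output:" invites the model to continue in the demonstrated format, which is why few-shot prompts reliably yield parseable structured output.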


See also