Prompt Engineering
Prompt engineering is the practice of crafting effective text prompts to get desired outputs from LLMs like ChatGPT, Claude, or Gemini. It includes techniques such as role assignment and few-shot examples.
LLMs are highly sensitive to how a question is phrased. Small changes in wording, structure, supplied context, or included examples can produce dramatically different outputs. Prompt engineering encompasses the techniques that consistently elicit better ones.
Weak prompt: 'Write a marketing email.' Better prompt: 'You are a B2B SaaS marketing expert. Write a 150-word cold email to a head of operations at a 200-person manufacturing company introducing facility maintenance software. Tone: professional, direct, no buzzwords. Include a specific value proposition and a clear single call-to-action.' The second prompt provides role, context, length, audience, tone, and structure — producing a much more usable output.
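The components in the better prompt above (role, task, audience, tone, constraints) can be assembled programmatically. This is a minimal sketch; the function name and parameters are illustrative, not part of any particular library:

```python
def build_prompt(role, task, audience, tone, constraints):
    """Assemble a structured prompt from named components (hypothetical helper)."""
    lines = [
        f"You are {role}.",
        f"Task: {task}",
        f"Audience: {audience}",
        f"Tone: {tone}",
    ]
    # Each constraint becomes its own explicit line.
    lines += [f"Constraint: {c}" for c in constraints]
    return "\n".join(lines)

prompt = build_prompt(
    role="a B2B SaaS marketing expert",
    task="Write a 150-word cold email introducing facility maintenance software.",
    audience="a head of operations at a 200-person manufacturing company",
    tone="professional, direct, no buzzwords",
    constraints=[
        "Include a specific value proposition.",
        "End with a single clear call-to-action.",
    ],
)
print(prompt)
```

Templating like this keeps the role, audience, tone, and constraints explicit and easy to vary independently when testing which phrasing works best.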
Common techniques include: Role assignment ('You are a senior software engineer...'), Few-shot examples (providing 2-3 examples of desired output), Chain-of-thought ('Think step by step...'), Output structure specification ('Return as a JSON array with fields: name, email, score'), and Constraint setting ('Limit to 100 words, no bullet points').
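Few-shot prompting, one of the techniques listed above, can be sketched as a small helper that prepends labeled input/output examples before the real query. The function name and labels here are illustrative assumptions, not a standard API:

```python
def few_shot_prompt(instruction, examples, query):
    """Build a few-shot prompt: instruction, labeled examples, then the query."""
    parts = [instruction, ""]
    for inp, out in examples:
        # Each example demonstrates the desired input -> output mapping.
        parts += [f"Input: {inp}", f"Output: {out}", ""]
    # End with the real query and a trailing label for the model to complete.
    parts += [f"Input: {query}", "Output:"]
    return "\n".join(parts)

prompt = few_shot_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [
        ("Great product, works perfectly.", "positive"),
        ("Broke after two days.", "negative"),
    ],
    "Shipping was fast and support was helpful.",
)
print(prompt)
```

The trailing "Output:" line invites the model to continue the established pattern, which is what makes the two worked examples steer the format and label set of the response.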