System Prompt Builder

Build system prompts for AI assistants, chatbots, and agents. Define persona, tone, constraints, knowledge domain, and output format to create production-ready system prompts for ChatGPT, Claude, Gemini, and more.

1. Define the Persona
Choose a preset to pre-fill sensible defaults, or define a custom persona from scratch.

This becomes "You are [your text]." at the start of the prompt.
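As a sketch of the assembly step described above (the function name and normalization are assumptions, not the tool's actual code), the persona text is wrapped into a "You are …" opening line:

```python
def persona_line(persona: str) -> str:
    """Turn a persona description into the opening line of a system prompt.

    Trailing periods are stripped first so the output always ends with
    exactly one period.
    """
    return f"You are {persona.strip().rstrip('.')}."

print(persona_line("a senior Python developer who reviews code for security issues."))
# "You are a senior Python developer who reviews code for security issues."
```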

2. Configure Style & Format
Set the tone, response length, and output format for the AI assistant.
3. Knowledge & Constraints
Define what the AI should know about and what it should avoid doing.

Specify the topics and areas of expertise the AI should draw upon.

Define boundaries, restrictions, and things the AI should NOT do.

Any extra behavioral rules or instructions not covered above.

4. Optional Sections
Toggle additional sections to include in the generated system prompt.
Generated System Prompt
Copy this prompt and paste it into your AI platform's system message field.
Professional / Adaptive (match query complexity) / Free-form

Fill in the fields above to generate your system prompt. Start by selecting a preset or defining a custom persona.


Frequently Asked Questions

What is a system prompt?

A system prompt is a set of instructions given to an AI model before any user interaction. It defines the AI's persona, behavior, tone, constraints, and response format. System prompts are used in platforms like ChatGPT (system message), Claude (system prompt), and other LLM APIs to configure how the AI assistant behaves throughout a conversation.
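In chat-style LLM APIs, the system prompt is typically delivered as the first message with the role "system", ahead of any user turns. A minimal sketch of such a request payload (the model name is an illustrative placeholder):

```python
system_prompt = "You are a concise technical support assistant for a SaaS product."

# OpenAI-style chat payload: the system prompt is the first entry in the
# messages array, and every later user message is interpreted in its light.
payload = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How do I reset my password?"},
    ],
}
```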

How long should a system prompt be?

A good system prompt is typically between 200 and 1500 tokens. Short prompts (under 200 tokens) may lack enough guidance, while very long prompts (over 2000 tokens) consume context window space and can lead to the AI ignoring parts of the instructions. Focus on clarity and specificity rather than length.
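Exact token counts depend on the model's tokenizer, but a common rule of thumb for English text is roughly four characters per token. A hedged sketch of that estimate (the heuristic, not a real tokenizer):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English prose.

    Real tokenizers (BPE variants) vary by model; treat this as a sanity
    check for keeping a prompt inside the 200-1500 token range, not as an
    exact count.
    """
    if not text:
        return 0
    return max(1, round(len(text) / 4))

prompt = "You are a helpful assistant." * 50
print(estimate_tokens(prompt))  # ballpark figure only
```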

What makes an effective system prompt?

An effective system prompt clearly defines the AI's role, sets explicit boundaries on what it should and should not do, specifies the desired output format and tone, and includes instructions for handling edge cases like out-of-scope questions. Using structured sections with markdown headers helps the AI parse and follow the instructions consistently.
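The structured-sections idea above can be sketched as a small builder that emits markdown headers for each section (function and section names here are illustrative, not the tool's internals):

```python
def build_system_prompt(persona: str, sections: dict[str, str]) -> str:
    """Assemble a system prompt from a persona line plus markdown sections."""
    parts = [f"You are {persona}."]
    for header, body in sections.items():
        parts.append(f"## {header}\n{body}")
    return "\n\n".join(parts)

prompt = build_system_prompt(
    "a customer-support assistant for an online bookstore",
    {
        "Tone": "Friendly and concise. Avoid jargon.",
        "Constraints": "Do not give legal or medical advice.",
        "Out-of-scope questions": "Politely redirect to bookstore topics.",
    },
)
print(prompt)
```

Markdown headers give the model clear section boundaries, which tends to improve how consistently each rule is followed.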

Can I use system prompts with any AI model?

Yes. System prompts work with all major AI models including OpenAI's GPT-4 and GPT-4o (via the system message), Anthropic's Claude (via the system prompt field), Google's Gemini, Meta's Llama, Mistral, and others. The exact implementation may vary by platform, but the concept of pre-configuring AI behavior is universal across LLMs.
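The per-platform difference mentioned above mostly comes down to where the system prompt lives in the request. A sketch of the two common shapes (model names are placeholders; only the payload structure is the point):

```python
system_prompt = "You are a helpful travel-planning assistant."
user_msg = "Plan a weekend trip to Lisbon."

# OpenAI-style: the system prompt travels inside the messages array
# as the first message, with role "system".
openai_style = {
    "model": "gpt-4o",  # placeholder
    "messages": [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_msg},
    ],
}

# Anthropic-style: the system prompt is a top-level "system" parameter,
# kept separate from the user/assistant messages.
anthropic_style = {
    "model": "claude-example",  # placeholder
    "system": system_prompt,
    "messages": [
        {"role": "user", "content": user_msg},
    ],
}
```

Either way, the same generated prompt text is reused verbatim; only the field it is placed in changes.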

What is the difference between a system prompt and a user prompt?

A system prompt sets the overall behavior, persona, and rules for the AI before any conversation begins. It acts as persistent instructions. A user prompt is the actual question or request sent during a conversation. The system prompt shapes how the AI interprets and responds to every user prompt. Think of the system prompt as the AI's job description and the user prompt as a specific task.
