Welcome to the wild world of AI prompt engineering, where humans and machines engage in a linguistic tango in an attempt to get a highly advanced AI to understand your request. What could go wrong?

What is Prompt Engineering?

Prompt engineering involves crafting queries to elicit useful responses from AI models. It’s a growing field that, at this time, is part science, part art, and part luck. With creativity, logic, and patience, you can master the field of AI communication. Here, I’ll help you learn to strike a balance between vague and overly specific queries so the AI-generated responses you receive match the intent of your query.

Why is Prompt Engineering a challenge?

Prompt engineering can be challenging. It’s not as simple as giving instructions and receiving the desired results. AI models have unique quirks and interpretations. Instructions that seem clear to a human may lead an AI in unexpected directions. Language ambiguity complicates matters: humans excel at grasping context and nuance, while AI struggles in this area. Minor wording changes can significantly alter output. The constant evolution of AI models adds further complexity, as prompts that were once effective may become obsolete with updates.

Balancing specificity is crucial. Vague prompts yield useless results, while overly specific ones defeat the purpose of using AI in the first place.

Techniques Used in Prompt Engineering

While several prompt engineering techniques are currently in use, new ones will likely be created, and existing ones will change as AI models evolve. Here are techniques commonly used by prompt engineers today, including the purpose and an example for each.

Chain-of-Thought Prompting

This prompt engineering technique guides the model through step-by-step reasoning to improve complex problem-solving.

Example: Solve this math problem step-by-step: If a train travels 120 miles in 2 hours, what is its average speed in miles per hour?
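
To make this concrete, here is a minimal sketch (in Python, with no particular model API assumed) of how a chain-of-thought prompt can be assembled before it is sent to a model:

    # Sketch of a chain-of-thought prompt: the instruction explicitly asks for
    # the intermediate reasoning before the final answer.
    question = (
        "If a train travels 120 miles in 2 hours, "
        "what is its average speed in miles per hour?"
    )

    cot_prompt = (
        "Solve the following problem step-by-step. Show each intermediate step, "
        "then state the final answer on its own line.\n\n"
        f"Problem: {question}"
    )

    print(cot_prompt)  # send this string to whichever model API you use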

Few-Shot Learning

With few-shot learning, examples are provided within the prompt to demonstrate the output format or style the user wants the model to produce.

Example: Translate English to French, following these examples: ‘Good morning.’ → ‘Bonjour.’ ‘Thank you.’ → ‘Merci.’ Now translate: ‘Hello. How are you?’
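
As a sketch, example pairs like these can also be assembled into a few-shot prompt programmatically; the pairs and helper below are purely illustrative:

    # A minimal few-shot prompt builder: the (English, French) pairs are the
    # "shots" that show the model the expected format before the real request.
    examples = [
        ("Good morning.", "Bonjour."),
        ("Thank you very much.", "Merci beaucoup."),
    ]

    def build_few_shot_prompt(pairs, query):
        lines = ["Translate English to French."]
        for english, french in pairs:
            lines.append(f"English: {english}\nFrench: {french}")
        lines.append(f"English: {query}\nFrench:")
        return "\n\n".join(lines)

    print(build_few_shot_prompt(examples, "Hello. How are you?"))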

Zero-Shot Prompting

Zero-shot prompting is a technique in which the model is asked to perform tasks without specific examples. With zero-shot prompting, the user is relying on the AI to use its pre-trained knowledge to answer their query.

Example: Write a haiku about artificial intelligence.

Instruction Prompting

Instruction prompting is when the user clearly states the task and desired outcome in natural language.

Example: Summarize the main plot points of Romeo and Juliet in five sentences.

Role-Playing

Much as an actor takes on a part, the user assigns the AI a specific role or persona. The goal of this type of prompt is to elicit a particular type of response from the AI model.

Example: You are a 19th-century detective investigating a mysterious theft. Describe how you would approach the crime scene.
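
Here is a rough sketch of how a role-playing prompt might be expressed in the role/content message format that most chat-style APIs accept; the request call itself is omitted because it varies by provider:

    # Role-playing sketch using the role/content message format common to
    # chat-style APIs; the actual request call is provider-specific.
    messages = [
        {
            "role": "system",
            "content": "You are a 19th-century detective known for meticulous observation.",
        },
        {
            "role": "user",
            "content": "A valuable painting has vanished from a locked study. "
                       "Describe how you would approach the crime scene.",
        },
    ]

    for message in messages:
        print(f"{message['role']}: {message['content']}")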

Context Stuffing

Context stuffing is a prompting technique used to improve the accuracy of an AI model’s response. It involves providing the AI with relevant and detailed background information.

Example: Given that the Earth’s radius is approximately 6,371 km and it orbits the sun at an average distance of 149.6 million km, calculate how long it takes sunlight to reach Earth.
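
As a quick sanity check on that example, the expected answer follows directly from the numbers supplied in the prompt (using the speed of light, roughly 299,792 km/s):

    # Sanity check on the context-stuffing example: the travel time follows
    # directly from the distance given in the prompt and the speed of light.
    SUN_EARTH_KM = 149.6e6           # average Earth-Sun distance from the prompt
    LIGHT_SPEED_KM_S = 299_792.458   # speed of light in km/s

    seconds = SUN_EARTH_KM / LIGHT_SPEED_KM_S
    print(f"Sunlight takes about {seconds:.0f} s (~{seconds / 60:.1f} minutes) to reach Earth.")
    # roughly 499 seconds, or about 8.3 minutes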

Prompt Chaining

Breaking complex tasks into smaller subtasks and using the outputs as inputs for subsequent prompts is called prompt chaining.

Example: First, generate a list of 5 random ingredients. Then, create a recipe using all of these ingredients.
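
A minimal two-step sketch of this chain is shown below; call_model is a hypothetical placeholder for whichever model API you use:

    # Prompt-chaining sketch: the output of the first prompt becomes input to
    # the second. call_model is a hypothetical placeholder for a real API call.
    def call_model(prompt: str) -> str:
        # Replace with a real API call; this echo lets the sketch run as-is.
        return f"[model response to: {prompt}]"

    ingredients = call_model("Generate a list of 5 random ingredients, comma-separated.")
    recipe = call_model(
        f"Create a recipe that uses all of these ingredients: {ingredients}. "
        "Include a title, an ingredient list, and numbered steps."
    )
    print(recipe)

Keeping each step this small also makes it easier to inspect and adjust the intermediate outputs before they feed the next prompt.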

Constrained Prompting

Constrained Prompting is when the user sets specific limitations or rules for the model to follow in its responses.

Example: Write a short story about a time traveler, but don’t use any words containing the letter ‘e.’
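
Because models often slip on hard constraints like this, it can help to verify the output programmatically before accepting it; the checker below is a simple illustrative sketch:

    # Constrained-prompting sketch: check the output against the stated rule
    # before accepting it, since models often slip on hard constraints.
    def violates_constraint(text: str) -> bool:
        # The rule from the example above: no words containing the letter 'e'.
        return "e" in text.lower()

    candidate_story = "A man walks through a glowing arch and finds his past."
    if violates_constraint(candidate_story):
        print("Constraint violated; re-prompt with a reminder of the rule.")
    else:
        print("Constraint satisfied:", candidate_story)

A failed check can feed back into a follow-up prompt that restates the constraint.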

Tools Used in Prompt Engineering

Prompt engineering tools are designed to make writing prompts faster and AI-generated responses more accurate.

The Most Common Prompt Engineering Tools

Prompt Libraries: Collections of Pre-Tested Prompts for Various Tasks

These libraries can be found in various formats including books, websites, apps, and downloadable resources. They’re useful for overcoming creative blocks, structuring thoughts, or simply as starting points for various tasks. Types of prompt libraries include:

    • Writing Prompts: fiction writing prompts, journaling prompts, poetry prompts, descriptive writing exercises
    • Creative Thinking: brainstorming prompts, problem-solving scenarios, what if questions
    • Education: discussion starters for various subjects, critical thinking exercises, research topic suggestions
    • Business and Marketing: product description templates, social media post ideas, email subject line generators
    • Personal Development: self-reflection questions, goal-setting prompts, mindfulness exercises
    • Art and Design: drawing challenges, photography themes, character design prompts
    • AI-Specific: ChatGPT conversation starters, image generation prompts for AI art tools, voice assistant command templates

Prompt Optimization Platforms: Software for Testing and Refining Prompts at Scale

These platforms and related offerings from major AI providers help users craft, test, and refine more effective prompts for language models.

    • PromptBase
    • Anthropic’s Constitutional AI
    • OpenAI’s InstructGPT
    • Cohere’s Prompt Engineering
    • AI21 Labs’ Jurassic-1 Jumbo

Version Control Systems: Tools to Track and Manage Different Prompt Iterations

    • Git-based systems: Some developers adapt Git to manage prompts, using branches and commits to track changes.
    • LangChain’s PromptTemplate: While not a full version control system, it allows for templating and reuse of prompts (see the sketch after this list).
    • Specialized prompt management tools:
      • PromptLayer: Offers versioning, testing, and analytics for prompts.
      • Humanloop: Provides version control and experimentation features for prompts.
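
As a brief illustration of the PromptTemplate entry above, here is a minimal sketch; note that the import path has shifted between LangChain versions, so check your installed release:

    # Sketch of reusable prompt templating with LangChain's PromptTemplate.
    # The import path may differ by version (newer releases use langchain_core.prompts).
    from langchain.prompts import PromptTemplate

    template = PromptTemplate.from_template(
        "Summarize the main plot points of {work} in {sentence_count} sentences."
    )

    prompt = template.format(work="Romeo and Juliet", sentence_count=5)
    print(prompt)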

Model Playgrounds: Interactive Environments for Experimenting with Different Prompts and Models

These playgrounds allow users to experiment with different prompts, adjust parameters, and see how models respond in real-time.

    • OpenAI Playground: Allows testing prompts with various GPT models.
    • Anthropic Claude Playground: This is for experimenting with Claude AI models.
    • Hugging Face Playground: Offers a wide range of open-source language models.
    • Google AI Test Kitchen: Provides access to select Google AI models for experimentation.
    • Cohere Playground: This is for testing Cohere’s language models.
    • GPT-J Playground: Allows testing with the open-source GPT-J model.
    • AI21 Studio: Provides access to Jurassic-1 and other AI21 language models.

Collaborative Platforms: Environments for Teams to Work Together on Prompt Engineering Projects

    • PromptBase: A marketplace for buying and selling prompts for various AI models.
    • Prompthero: A platform for discovering, sharing, and collaborating on AI prompts.
    • Anthropic’s Constitutional AI Platform: While not solely focused on prompts, it allows collaboration on developing AI systems with specific traits and behaviors.
    • OpenAI’s Playground: Enables users to experiment with and share prompts for GPT models.
    • Hugging Face: Offers collaborative tools for working with language models, including prompt engineering.
    • GPT-3 Prompt Engineering Forum: An online community for discussing and sharing prompt engineering techniques.
    • AI Prompt Engineering Discord Servers: Various Discord communities dedicated to collaborative prompt engineering.
    • GitHub repositories: Many open-source projects focused on collaborative prompt engineering exist on GitHub.

Integration Tools: Software for Incorporating Optimized Prompts into Applications and Workflows

    • LangChain: A framework for developing applications powered by language models, offering tools for prompt management and chaining.
    • Promptflow: Microsoft’s tool for building AI applications with prompt engineering and flow orchestration.
    • Prompt Engine: An open-source library for managing and versioning prompts in AI applications.
    • OpenAI Playground: A web interface for experimenting with and refining prompts for OpenAI’s models.
    • Anthropic’s Claude Prompt Design: Tools and guidelines for effectively prompting Anthropic’s Claude model.
    • Hugging Face’s Prompt Engineering Guide: Resources and tools for prompt engineering with various language models.
    • GPT Index: A toolkit for using LLMs with external data sources, including prompt optimization features.
    • FLAML (Fast and Lightweight AutoML): Microsoft’s AutoML library that includes prompt optimization capabilities.
    • Promptable: A prompt engineering IDE for testing and iterating on prompts.
    • PromptLayer: A tool for tracking, managing, and sharing prompts across team members.

10 Best Practices for AI Prompt Engineering

Whether or not you are a prompt engineer, here are ten best practices for eliciting helpful responses from AI models. To clarify the nuances, examples of poorly written and well-written prompts are provided.

1. Be specific and clear

Aim for a balance between vague and overly detailed instructions. Prompts that are too vague send the AI on wild goose chases and force you to restructure your queries, eating up valuable time. Details can be beneficial, but overloading your prompt with excessive nuance can confuse the AI or lead it astray.

Examples:
Unclear AI prompt: Write about cars.
Overly detailed AI prompt: Describe how the growing popularity of the Tesla Model 3 versus the Toyota Camry Hybrid impacted major technological advancements in the industry from 2010 to 2023.
Strong AI prompt: Describe the evolution of electric vehicles from 2010 to 2023, focusing on major technological advancements.

Additionally, refrain from using unnecessary jargon or complex language. Clarity is crucial. The AI does not need to be impressed by an extensive vocabulary; it needs to understand your intent.

Examples:
Jargon-heavy prompt: Write a brief exploring technological disruptors that have emerged over the last three years and how they’ve affected enterprise ecosystems and mission-critical initiatives.
Language-appropriate prompt: Write an article exploring advancements in technology since 2021 and how they relate to changes in what businesses are prioritizing and investing in.

2. Use context and background information

Do not assume the AI has context you have not given it. AI systems, while impressive, lack real-world knowledge beyond their training data. That’s why it is important to provide any necessary background information.

Examples:
Prompt lacking context: Explain the impact of renewable energy on carbon emissions.
Prompt with the proper context: Considering recent climate change reports, explain how renewable energy could impact global carbon emissions by 2030.

While you will want to orient the AI with relevant information, avoid injecting personal biases into the prompt. Bias can inadvertently steer the AI towards a particular viewpoint or outcome. Strive for neutrality unless a biased response is desired.

Examples:
Bias-heavy prompt: Why are monkeys smarter than orangutans?
Bias-neutral prompt: How smart are monkeys compared to orangutans?

Also, be cautious of using prompts that could lead to harmful or unethical outputs. Always consider the potential implications of your requests. An unethical prompt might ask for someone’s personal or private information, make assumptions about a group of individuals, or request that the AI do something like grant a steep discount (e.g., getting a chatbot to sell you a truck for $1).

3. Break complex tasks into smaller steps

If you want a series of outputs that build on one another, define each item as a separate ask. Breaking tasks down into focused, individual requests ensures that the AI doesn’t bundle the answer and that each part receives the attention it deserves.

Examples:
Bundled prompt: Outline how the major events of World War II impacted global politics and how these effects are still felt today.
Focused prompt: First, outline the major events of World War II. Then, analyze their impact on global politics. Finally, discuss how these effects are still felt today.

When asking about unrelated topics, be sure to refresh your session to clear out previous queries. This prevents the AI from injecting aspects of the previous prompt into the new one.

4. Specify the desired format or structure

You can minimize your workload by asking AI to add structure, formatting, and references for you.

Example:
Prompt requesting formatting and citations: Write a 500-word news article about the latest Mars rover mission, including a headline and three subheadings. Provide citations for sources.

5. Include relevant details and constraints

If you want to include or exclude a specific subset of information, be sure to state those requirements in your prompt.

Example:
Prompt with constraints: Create a healthy meal plan for a vegetarian athlete, considering protein intake and calorie requirements for intense training.

6. Use open-ended questions to encourage detailed responses

You can give the AI more latitude in its response by asking questions that leave the answer open to interpretation. Avoid wording questions in a manner that elicits a yes or no response.

Examples:
Closed-ended prompt: Will artificial intelligence transform the healthcare industry over the next decade?
Open-ended prompt: How might artificial intelligence transform the healthcare industry over the next decade?

7. Provide examples or templates when appropriate

An example or template can give us a head start in our work and it can do the same for AI. This includes providing outlines or content similar to what you are requesting.

Example:
Prompting with a template: Write a product review following this structure: Introduction, Pros, Cons, Verdict. Use the ‘iPhone 14 Pro’ as an example.

8. Clarify the intended audience and tone

If you are a marketing professional, you may want to appeal to a specific audience. Clarifying who the audience is and what emotion you want to evoke or what goal you want to accomplish can be very helpful.

Example:
Prompt with audience specifications: Explain quantum computing to a high school student, using simple analogies and avoiding technical jargon.

9. Use action verbs to guide the AI’s response

Asking for data is different from asking for an analysis. Using action verbs in your prompt will direct the AI to perform the appropriate action rather than just supplying information.

Example:
Verb usage in prompts: Analyze, compare, and contrast the economic policies of the current U.S. administration with those of the previous one.

10. Iterate and refine prompts based on initial responses

Avoid becoming complacent with your prompting style. Experiment, iterate, and learn from both successes and failures. The art of prompting is continuously evolving. A fun way to experiment with AI would be to play with the examples above. Take the prompts and ask them in different ways to see how that changes the response.

What is the Future of Prompt Engineering in AI?

Looking ahead, the synergy between prompt engineering and other AI disciplines promises to unlock new frontiers in human-AI collaboration. As we refine our ability to communicate with AI systems, we pave the way for more intuitive, powerful, and responsible applications of this technology. Ultimately, mastering the art and science of prompt engineering will be essential for harnessing AI’s full potential in solving complex real-world problems.

We anticipate prompt engineering to become more nuanced and specialized. Instead of relying on generic prompts, we will develop tailored approaches for different industries and use cases. For example, imagine prompts specifically designed for medical diagnoses or legal document analysis. These types of applications are already being developed and will open additional avenues of exploration. This generative AI demo showcases a few use cases InfoWorks has been working on in this space.

AI assistants will probably become more context-aware, requiring less explicit instruction. Prompts might focus more on setting the right tone or perspective rather than providing step-by-step directions.

Ethical considerations will likely play a larger role as well. Crafting prompts that not only achieve the desired output but also align with moral and legal standards will be imperative. Interestingly, there will likely be a shift towards more collaborative prompt engineering. Instead of one-shot prompts, we could have ongoing dialogues where humans and AI work together to refine the prompt and output.

The prompt engineering field calls for ongoing research to address issues of bias, consistency, and scalability. Being along for this ride is thrilling. A lot will “go right” if we continually learn how to interact with these evolving models.

If you are looking to advance the utilization of AI within your organization, contact us to set up a free AI workshop focused on use cases specifically for your business.

About Lenny Durrough

Lenny Durrough is a tech-savvy customer success leader and trusted advisor with highly developed skills in training/client education, sales enablement, and client services. He is known for building and developing best-in-class, sustainable pre- and post-sales support, customer implementations, customer success initiatives, and continuous improvement of a company’s customer outcomes. A change agent, he embraces new technologies and focuses on continuous improvement to drive new business growth and expand highly effective working relationships.

We look forward to hearing what initiatives you’re working on and how we can help you accelerate success. Let’s talk.