Prompt engineering best practices for LLMs on Amazon Bedrock
Prompt engineering is the process of guiding large language models (LLMs) to produce desired outputs. In this session, get an overview of prompt engineering best practices and learn how to choose the most appropriate formats, phrases, words, and symbols to get the most out of generative AI solutions while improving accuracy and performance. We use Anthropic Claude LLMs as an example of how prompt engineering helps solve complex customer use cases. Dive deep with our customer idealo as they share lessons learned from building product comparisons that go beyond price tags, applying generative AI and these prompt engineering techniques.
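As a rough illustration of the kind of prompt structure the session covers, the sketch below calls an Anthropic Claude model on Amazon Bedrock through boto3's Converse API. The model ID, region, and example product data are assumptions for illustration only, not material from the talk.

```python
# Minimal sketch: a structured prompt sent to Anthropic Claude on Amazon Bedrock.
# Assumptions: boto3 with Bedrock access, us-east-1, and a Claude 3 Sonnet model
# enabled in the account. Swap in whichever Claude model you actually use.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Prompt engineering basics: a system instruction that sets the role, clearly
# delimited input data, and an explicit description of the expected output.
system_prompt = "You are a product analyst. Answer only from the data provided."
user_prompt = (
    "Compare the two products in <products> and return a short verdict, "
    "then a bullet list of differences beyond price.\n"
    "<products>\n"
    "Product A: 27-inch monitor, 144 Hz, USB-C, 3-year warranty, $329\n"
    "Product B: 27-inch monitor, 75 Hz, HDMI only, 1-year warranty, $249\n"
    "</products>"
)

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    system=[{"text": system_prompt}],
    messages=[{"role": "user", "content": [{"text": user_prompt}]}],
    inferenceConfig={"maxTokens": 400, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

The delimiters around the input data and the low temperature are typical choices when you want grounded, reproducible answers rather than creative ones.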
ABOUT AWS
Amazon Web Services (AWS) hosts events, both online and in-person, bringing the cloud computing community together to connect, collaborate, and learn from AWS experts. AWS is the world’s most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—are using AWS to lower costs, become more agile, and innovate faster.
- AWS Summit Stockholm 2024 - Prompt engineering best practices for LLMs on Amazon Bedrock (AIM304)