Advanced Prompts and Custom Lambda Parsers

In this folder, we provide an example of an HR agent that uses the new advanced prompts and custom Lambda parser capabilities of Agents for Amazon Bedrock.

Agents in Amazon Bedrock process a user query through a sequence of steps: pre-processing, orchestration, knowledge base response generation, and post-processing. At each step, a prompt template is the basis for the prompt that is sent to the foundation model (FM). Agents for Amazon Bedrock exposes the four default base prompt templates used for these steps, and you can optionally edit them to customize your agent's behavior at each step of its sequence. This capability is known as advanced prompts.

For example, to create a custom pre-processing prompt, we can provide the custom prompt string in the promptOverrideConfiguration object of the UpdateAgent call:

"promptOverrideConfiguration": { 
    "promptConfigurations": [ 
        { 
            "basePromptTemplate": <custom_prompt:string>,
            "inferenceConfiguration": { 
                "maximumLength": int,
                "stopSequences": [ "string" ],
                "temperature": float,
                "topK": float,
                "topP": float
            },
            "promptCreationMode": "OVERRIDDEN",
            "promptState": "ENABLED",
            "promptType": "PRE_PROCESSING"
        }
    ]
}
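
A minimal sketch of applying this override programmatically with boto3 is shown below. The agent ID, agent name, role ARN, and foundation model are placeholders and must match your existing agent; the stop sequences and inference settings mirror the values used later in this example:

import boto3

bedrock_agent = boto3.client('bedrock-agent')

# Placeholder prompt template -- replace with your custom pre-processing prompt.
custom_pre_prompt = '...your custom pre-processing prompt template...'

response = bedrock_agent.update_agent(
    agentId='AGENT_ID',                                               # placeholder
    agentName='hr-agent',                                             # must match the existing agent
    agentResourceRoleArn='arn:aws:iam::111122223333:role/AgentRole',  # placeholder
    foundationModel='anthropic.claude-v2',                            # assumed model for this agent
    promptOverrideConfiguration={
        'promptConfigurations': [
            {
                'basePromptTemplate': custom_pre_prompt,
                'inferenceConfiguration': {
                    'maximumLength': 2048,
                    'stopSequences': ['</invoke>', '</answer>', '</error>'],
                    'temperature': 0.0,
                    'topK': 250,
                    'topP': 1.0
                },
                'promptCreationMode': 'OVERRIDDEN',
                'promptState': 'ENABLED',
                'promptType': 'PRE_PROCESSING'
            }
        ]
    }
)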

In addition, we can provide custom Lambda parsers to modify the raw output from the LLM at each step of the agent sequence. A custom Lambda parser is often used together with a custom prompt to give you greater control over not only how the user query is processed at that step, but also which parts of the output response are passed on to the next step in the sequence.

To take advantage of a custom Lambda parser, create a Lambda function and reference it when updating the agent with the UpdateAgent call. For our pre-processing example, we can provide the Lambda ARN in the overrideLambda key of the promptOverrideConfiguration object and set parserMode to OVERRIDDEN:

promptOverrideConfiguration={
    'overrideLambda': parser_arn,
    'promptConfigurations': [
        {
            'basePromptTemplate': custom_pre_prompt,
            'inferenceConfiguration': {
                'maximumLength': 2048,
                'stopSequences': [
                    '</invoke>',
                    '</answer>',
                    '</error>'
                ],
                'temperature': 0.0,
                'topK': 250,
                'topP': 1.0
            },
            'promptCreationMode': 'OVERRIDDEN',
            'promptState': 'ENABLED',
            'promptType': 'PRE_PROCESSING',
            'parserMode': 'OVERRIDDEN'
        }
    ]
}
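
Amazon Bedrock also needs permission to invoke the parser Lambda function. A sketch of adding that resource-based policy with boto3 follows; the function name, statement ID, and agent ARN are placeholders:

import boto3

lambda_client = boto3.client('lambda')

# Allow the Bedrock service principal to invoke the parser Lambda.
lambda_client.add_permission(
    FunctionName='pre-processing-parser',                              # placeholder function name
    StatementId='allow-bedrock-agent-invoke',
    Action='lambda:InvokeFunction',
    Principal='bedrock.amazonaws.com',
    SourceArn='arn:aws:bedrock:us-east-1:111122223333:agent/AGENT_ID'  # placeholder agent ARN
)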

The Lambda function provided as the parser needs to respect the structure of the event produced by the agent as input, as well as the response structure the agent expects back from the Lambda. Examples of the input and output structures are shown below.

Lambda input event structure:

{
    "messageVersion": "1.0",
    "agent": {
        "name": "string",
        "id": "string",
        "alias": "string",
        "version": "string"
    },
    "invokeModelRawResponse": "string",
    "promptType": "PRE_PROCESSING",
    "overrideType": "OUTPUT_PARSER"
}

Lambda response structure for pre-processing:

{
    "messageVersion": "1.0",
    "promptType": "PRE_PROCESSING",
    "preProcessingParsedResponse": {
        "isValidInput": "boolean",
        "rationale": "string"
    }
}
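
To show how the two structures fit together, the sketch below is a minimal pre-processing parser. The parsing logic (extracting a <category> tag and treating categories D and E as valid) is an assumption tied to a default-style pre-processing prompt; adapt it to whatever output format your custom prompt instructs the model to produce:

import re

def lambda_handler(event, context):
    # Raw text returned by the foundation model for the pre-processing step.
    raw_response = event['invokeModelRawResponse']

    # Assumed format: the prompt asks the model to wrap its classification
    # in <category> tags, e.g. <category>D</category>.
    match = re.search(r'<category>(.*?)</category>', raw_response, re.DOTALL)
    category = match.group(1).strip() if match else None

    return {
        'messageVersion': '1.0',
        'promptType': 'PRE_PROCESSING',
        'preProcessingParsedResponse': {
            # Treating categories D and E as valid mirrors the default
            # pre-processing behavior; this mapping is an assumption.
            'isValidInput': category in ('D', 'E'),
            'rationale': raw_response
        }
    }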

For examples of the response structures for the other promptTypes, see Parser Lambda function in Agents for Amazon Bedrock in the Amazon Bedrock documentation.