
@aws-amplify/graphql-conversation-transformer

aws-amplify · 39.2k downloads · Apache-2.0 · 1.1.12 · TypeScript support: included

Amplify GraphQL @conversation transformer

readme

GraphQL @conversation Transformer

The @conversation transformer is a powerful tool for quickly and easily creating AI-powered conversation routes within your AWS AppSync API. This transformer leverages the capabilities of large language models to enable dynamic, context-aware conversations in your GraphQL schema.

Directive Definition

The @conversation directive is defined as follows:

directive @conversation(
  aiModel: String!
  systemPrompt: String!
  functionName: String
  tools: [ToolMap]
  inferenceConfiguration: ConversationInferenceConfiguration
) on FIELD_DEFINITION

input ToolMap {
  name: String
  description: String
}

input ConversationInferenceConfiguration {
  maxTokens: Int
  temperature: Float
  topP: Float
}

Usage

To use the @conversation directive, add it to a field in your GraphQL schema. This field should be of type ConversationMessage and should be part of the Mutation type.

type Mutation {
  sendMessage(conversationId: ID!, content: String!): ConversationMessage
    @conversation(aiModel: "anthropic.claude-3-haiku-20240307-v1:0", systemPrompt: "You are a helpful AI assistant.")
}

To find the necessary GraphQL types to use with the @conversation directive, see src/graphql-types/conversation-schema-types.ts.
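The authoritative type definitions live in that file; purely for orientation, here is a hypothetical sketch of roughly the shape a conversation message type might take. Every field name below is an illustrative assumption, not the package's actual definition:

```graphql
# Illustrative only — the real types are defined in
# src/graphql-types/conversation-schema-types.ts.
type ConversationMessage {
  id: ID!
  conversationId: ID!
  role: String      # e.g. "user" or "assistant" (assumed)
  content: String
  createdAt: String
}
```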

Configuration Options

The @conversation directive accepts the following configuration options:

  • aiModel (required): Specifies the AI model to be used for generating responses.
  • systemPrompt (required): Defines the initial prompt that sets the context for the AI model.
  • functionName (optional): Specifies a custom Lambda function to handle the conversation logic.
  • tools (optional): An array of tool configurations that the AI can use during the conversation.
  • inferenceConfiguration (optional): Fine-tunes the AI model's behavior with parameters like maxTokens, temperature, and topP.
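For instance, the optional functionName argument routes conversation handling to your own Lambda instead of the generated default handler. A minimal sketch, where the function name is a made-up placeholder:

```graphql
type Mutation {
  sendMessage(conversationId: ID!, content: String!): ConversationMessage
    @conversation(
      aiModel: "anthropic.claude-3-haiku-20240307-v1:0"
      systemPrompt: "You are a helpful AI assistant."
      functionName: "myCustomConversationHandler" # hypothetical custom Lambda
    )
}
```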

Generated Resources

When you use the @conversation directive, the transformer generates several AWS resources to support the conversation functionality:

  1. DynamoDB Tables:

    • A table for storing conversation sessions
    • A table for storing individual messages
  2. AppSync Resolvers:

    • A pipeline resolver for the conversation mutation
    • A resolver for the assistant's response mutation
    • A subscription resolver for real-time updates
  3. Lambda Function:

    • A default conversation handler (if no custom functionName is provided)
  4. IAM Roles and Policies:

    • Necessary permissions for AppSync to interact with DynamoDB and Lambda
  5. AppSync Data Sources:

    • DynamoDB data sources for conversation and message tables
    • Lambda data source for the conversation handler
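To illustrate how the generated subscription resolver might be consumed, a client could subscribe to assistant responses along these lines. The operation and field names below are assumptions for illustration, not the transformer's actual generated names:

```graphql
# Hypothetical client-side subscription — the real operation
# names depend on what the transformer generates for your schema.
subscription OnAssistantResponse($conversationId: ID!) {
  onAssistantResponse(conversationId: $conversationId) {
    id
    content
  }
}
```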

Examples

Basic Usage

type Mutation {
  chat(conversationId: ID!, message: String!): ConversationMessage
    @conversation(
      aiModel: "anthropic.claude-3-haiku-20240307-v1:0"
      systemPrompt: "You are a friendly AI assistant. Respond to user queries in a helpful and concise manner."
    )
}

Advanced Usage with Tools and Inference Configuration

type Mutation {
  customerSupport(conversationId: ID!, inquiry: String!): ConversationMessage
    @conversation(
      aiModel: "anthropic.claude-3-haiku-20240307-v1:0"
      systemPrompt: "You are a customer support AI. Help users with their product inquiries and issues."
      tools: [
        { name: "getProductInfo", description: "Retrieves detailed information about a product" }
        { name: "checkOrderStatus", description: "Checks the status of a customer's order" }
      ]
      inferenceConfiguration: { maxTokens: 500, temperature: 0.7, topP: 0.9 }
    )
}

Best Practices

  1. Craft Clear System Prompts: The system prompt sets the tone and context for the conversation route. Make it specific and aligned with your use case.

  2. Use Appropriate AI Models: Choose AI models that suit your application's needs in terms of capabilities and response time.

  3. Implement Error Handling: Always handle potential errors in your client-side code.

  4. Monitor and Optimize: Regularly review the performance and costs associated with your conversation route.

Troubleshooting

If you encounter issues while using the @conversation transformer, consider the following:

  1. Check Your Schema: Ensure your GraphQL schema is valid and the @conversation directive is used correctly.

  2. Verify AWS Resources: Check that all required AWS resources have been created successfully.

  3. Review Logs: Examine CloudWatch logs for any errors in your Lambda functions or AppSync resolvers.

  4. Test Incrementally: When adding complex features like custom tools, test each addition incrementally to isolate potential issues.

changelog

Change Log

All notable changes to this project will be documented in this file. See Conventional Commits for commit guidelines.

1.1.12 (2025-06-09)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.11 (2025-04-17)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.10 (2025-04-09)

Reverts

1.1.9 (2025-03-06)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.8 (2025-02-26)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.7 (2025-02-07)

Bug Fixes

1.1.6 (2025-01-30)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.5 (2025-01-16)

Bug Fixes

  • conversation: add padding to stream chunks (#3115) (8ab2284)

1.1.4 (2024-12-23)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

1.1.3 (2024-12-17)

Bug Fixes

  • conversation: handle enums in data tool selection sets (#3045) (7466aaa)

1.1.2 (2024-11-20)

Bug Fixes

1.1.1 (2024-11-20)

Bug Fixes

  • conversation: model list tool query name infinite recursion (#3037) (63aea80)

1.1.0 (2024-11-19)

Bug Fixes

  • ai: forward user-agent with package metadata in generation and conversation requests (#3029) (9a0ac8b)

Features

  • bump conversation and generation transformers to v1 (#3030) (1d9e59e)

0.7.1 (2024-11-14)

Bug Fixes

  • conversation: prefix supporting types with AmplifyAI (#3023) (24198ac)

0.7.0 (2024-11-14)

Bug Fixes

  • plumb outputStorageStrategy through conversation transformer to handler (#3017) (acc8140)

Features

  • conversation: require auth input for conversation directive (#3007) (39cca3f)
  • conversation: update directive input for tool definition for model list queries (#3013) (ee976cc)

0.6.0 (2024-11-08)

Features

  • conversation: propagate errors from lambda to client (#3002) (141512b)
  • conversation: sorting and performant list queries (#2997) (73f2d6b)
  • conversation: support response streaming (#2986) (815d51f)

0.5.0 (2024-10-28)

Bug Fixes

  • conversation: inline template for init slot and correct datasource in assistant response mutation (#2980) (5136377)

Features

  • conversation: enable update mutations on conversation model (#2948) (c90ac7d)

0.4.0 (2024-10-17)

Features

  • constrain custom conversation handler event version (#2958) (083fe17)

0.3.0 (2024-10-10)

Bug Fixes

  • conversation: allow changes to systemPrompt, inferenceConfig, aiModel to be hotswapped (#2923) (713e2b6)
  • conversation: use functionMap for custom handler IFunction reference (#2922) (d8d9eef)

Features

  • conversation: per message items and lambda history retrieval pattern (#2914) (874a30a)

0.2.2 (2024-10-01)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

0.2.1 (2024-09-16)

Note: Version bump only for package @aws-amplify/graphql-conversation-transformer

0.2.0 (2024-09-06)

Features

  • conversation: add conversation transformer (#2827) (cee6aef)