AI provider configuration

You can connect your self-hosted Tines tenant to different AI providers.

If you are connecting to Anthropic or OpenAI (or other providers with compatible API schemas), the configuration is the same as on cloud tenants.

Amazon Bedrock 

Configuring Amazon Bedrock is possible for tenants hosted on AWS infrastructure. Tines makes requests to Bedrock using the assumed role and credentials of the container hosting the Tines instance. Those credentials must therefore be granted the following IAM permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockModelAccessPermissions",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
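As a sketch, the policy above could be attached to the container's IAM role with Terraform. The role and resource names here are illustrative assumptions, not names prescribed by Tines; substitute the role your Tines containers actually assume.

```hcl
# Hypothetical example: grant the Bedrock permissions to the IAM role
# assumed by the container running Tines. Names are illustrative.
resource "aws_iam_policy" "tines_bedrock" {
  name = "tines-bedrock-model-access"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid    = "BedrockModelAccessPermissions"
      Effect = "Allow"
      Action = [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ]
      Resource = "*"
    }]
  })
}

resource "aws_iam_role_policy_attachment" "tines_bedrock" {
  # Assumed name for your container's task/instance role.
  role       = aws_iam_role.tines_task_role.name
  policy_arn = aws_iam_policy.tines_bedrock.arn
}
```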

ℹ️ Info

We recommend enabling the latest Anthropic Claude models for best performance and capabilities.

You can now fully customize which AWS Bedrock models are enabled in Tines, allowing you to select any models available in your AWS region.

🪄 Tip

Story copilot 

To use story copilot effectively, we strongly recommend using the latest flagship model from the major foundation model providers (Anthropic and OpenAI).

Automatic mode 

To use automatic mode, the Tines Command Runner must be configured and enabled.
