AI

It is possible to configure and connect to different AI providers for your self-hosted Tines tenant.

If connecting to Anthropic or OpenAI (or other providers with compatible API schemas), the configuration is the same as on cloud tenants.

Amazon Bedrock 

Configuring Amazon Bedrock is possible for tenants hosted on AWS infrastructure. Tines makes requests to Bedrock using the assumed role/credentials of the container hosting the Tines instance. Because of this, those credentials must be configured with the following IAM permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
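As a quick sanity check before deploying, you can verify that a policy document grants the actions Tines needs. The sketch below is not part of Tines; it is a minimal, simplified illustration of how the policy's `Action` wildcards (such as `bedrock:InvokeModel*`) match the required Bedrock actions, using Python's `fnmatch` as a stand-in for IAM's full wildcard semantics.

```python
import fnmatch
import json

# The IAM policy from above, as attached to the container's role.
policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
""")

# Actions Tines needs; InvokeModelWithResponseStream is covered by the
# "bedrock:InvokeModel*" wildcard.
REQUIRED_ACTIONS = [
    "bedrock:InvokeModel",
    "bedrock:InvokeModelWithResponseStream",
    "bedrock:GetInferenceProfile",
    "bedrock:ListInferenceProfiles",
]

def allows(policy, action):
    """Return True if any Allow statement's action patterns match `action`.

    Simplified: ignores Deny precedence, Resource/Condition matching, and
    uses fnmatch in place of IAM's exact wildcard rules.
    """
    for stmt in policy["Statement"]:
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt["Action"]
        if isinstance(actions, str):
            actions = [actions]
        if any(fnmatch.fnmatchcase(action, pat) for pat in actions):
            return True
    return False

for action in REQUIRED_ACTIONS:
    print(action, "->", "allowed" if allows(policy, action) else "MISSING")
```

Note that this only checks the policy document itself; in practice the role must also be attached to the container's task or instance profile.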

ℹ️ Info

By default, the following models should also be enabled:

  • Anthropic Claude 4 Sonnet

  • Anthropic Claude 3.7 Sonnet

  • Anthropic Claude 3.5 Haiku

  • Anthropic Claude 3 Haiku

You can now fully customize which AWS Bedrock models are enabled in Tines, allowing you to select any models available in your AWS region.

Automatic mode 

To use automatic mode, Run script for self-hosted must be configured and enabled.
