Torq supports Bring Your Own Subscription (BYOS) for AI models. BYOS lets you choose which AI provider and subscription your AI Agents use, so you can leverage your own enterprise agreements or preferred model sources instead of relying only on the Torq-managed subscription.
With BYOS, you can connect providers such as OpenAI, Microsoft Azure OpenAI, Anthropic, GCP Vertex AI, or Amazon Bedrock, giving you flexibility, cost efficiency, and alignment with your organization’s compliance requirements. Key benefits include:
Flexibility: Use your own enterprise agreements and models alongside Torq’s hosted options.
Governance: Maintain compliance and data-residency requirements by connecting to your own provider.
Scalability: Support multiple subscriptions for different use cases, workflows, or teams.
Prerequisites
Before setting up BYOS in Torq, make sure you have an active integration with at least one supported AI provider. You’ll add your own API key or credentials, giving you full control over which models power your AI Agents.
Supported providers:
OpenAI
Microsoft Azure OpenAI
Google Cloud Platform (GCP) – Vertex AI
Anthropic Claude
Amazon Bedrock
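Before adding credentials to a Torq integration, it can help to confirm they work directly against the provider's API. The sketch below is a minimal example using the OpenAI Python SDK; the environment variable name is an assumption, and the same idea applies to the other providers with their own SDKs.

```python
# Minimal sketch: confirm an OpenAI API key is valid before adding it to a
# Torq integration. Assumes the key is exported as OPENAI_API_KEY.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Listing models is a lightweight call that fails fast on an invalid key.
models = client.models.list()
print(f"Key is valid; {len(models.data)} models are visible to this subscription.")
```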
Supported AI models by provider
When using Bring Your Own Subscription (BYOS), Torq supports a curated set of AI models per provider. Model availability depends on the provider and the region selected in your account.
Microsoft Azure OpenAI
When using Azure OpenAI, the following models are supported. Model availability depends on which models are deployed in your Azure OpenAI resource.
GPT-3.5 Turbo
GPT-4
GPT-4 Turbo
GPT-4o Mini
GPT-4o
GPT-5
GPT-5 Mini
GPT-5 Nano
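If you are unsure whether a given model is deployed in your Azure OpenAI resource, a quick way to check is to call the deployment directly. The sketch below uses the AzureOpenAI client from the openai Python package; the endpoint, API version, and deployment name are illustrative placeholders, not values taken from your account.

```python
# Minimal sketch: confirm a specific Azure OpenAI deployment responds before
# selecting it in Torq. Endpoint, api_version, and deployment name are
# illustrative placeholders -- substitute your own values.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # assumed API version
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder
)

response = client.chat.completions.create(
    model="gpt-4o",  # the deployment name in your Azure resource, not the base model id
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```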
Google Vertex AI
Model availability depends on the region and which models are enabled or deployed in your Vertex AI project.
Gemini 2.5 Pro
Gemini 2.5 Flash
Gemini 2.5 Flash Lite
Gemini 2.0 Flash
Gemini 2.0 Flash Lite
Claude 3.7 Sonnet
Claude Sonnet 4
Claude Opus 4
Claude Opus 4.1
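To confirm that a model is enabled for your project in the region you plan to select in Torq, you can send it a small test request. The sketch below uses the Vertex AI Python SDK; the project ID, region, and model ID string are assumptions for illustration, and it expects Application Default Credentials to already be configured.

```python
# Minimal sketch: confirm a Gemini model responds in the region you intend to
# select in Torq. Project ID, region, and model ID are illustrative placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-2.0-flash")  # assumed model ID string
response = model.generate_content("ping")
print(response.text)
```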
Anthropic Claude
When using Anthropic, Torq supports the following Claude models:
Claude 3.7 Sonnet
Claude Sonnet 4
Claude Sonnet 4.5
Claude Opus 4
Claude Opus 4.1
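To see which Claude models your Anthropic API key can access before wiring it into Torq, you can query Anthropic's models endpoint. The sketch below uses the anthropic Python SDK and assumes the key is exported as ANTHROPIC_API_KEY.

```python
# Minimal sketch: list the Claude models visible to your Anthropic API key.
# Assumes ANTHROPIC_API_KEY is set in the environment.
import os

import anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

page = client.models.list()
for model in page.data:
    print(model.id)
```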
Amazon Bedrock
Torq supports the following Anthropic Claude models via Amazon Bedrock. Model availability varies by AWS region, and not all models are available in every region.
Model | Available regions |
Claude Haiku 4.5 | ap-northeast-1, ap-northeast-2, ap-northeast-3, ap-south-1, ap-south-2, ap-southeast-1, ap-southeast-2, ap-southeast-3, ap-southeast-4, ca-central-1, eu-central-1, eu-central-2, eu-north-1, eu-south-1, eu-south-2, eu-west-1, eu-west-2, eu-west-3, me-central-1, sa-east-1, us-east-1, us-east-2, us-west-1, us-west-2 |
Claude Opus 4.5 | ap-northeast-1, ap-northeast-2, ap-northeast-3, ap-south-1, ap-south-2, ap-southeast-1, ap-southeast-2, ap-southeast-3, ap-southeast-4, ca-central-1, eu-central-1, eu-central-2, eu-north-1, eu-south-1, eu-south-2, eu-west-1, eu-west-2, eu-west-3, me-central-1, sa-east-1, us-east-1, us-east-2, us-west-1, us-west-2 |
Claude Sonnet 4.5 | af-south-1, ap-northeast-1, ap-northeast-2, ap-northeast-3, ap-south-1, ap-south-2, ap-southeast-1, ap-southeast-2, ap-southeast-3, ap-southeast-4, ca-central-1, ca-west-1, eu-central-1, eu-central-2, eu-north-1, eu-south-1, eu-south-2, eu-west-1, eu-west-2, eu-west-3, me-south-1, mx-central-1, sa-east-1, us-east-1, us-east-2, us-west-1, us-west-2, us-gov-west-1, us-gov-east-1 |
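Because Bedrock availability differs by region, you can confirm which Anthropic models a specific region exposes before choosing it in Torq. The sketch below uses boto3 and assumes AWS credentials are already configured (environment variables, a named profile, or an IAM role); the region is a placeholder.

```python
# Minimal sketch: check which Anthropic Claude models Amazon Bedrock exposes
# in a given region before selecting that region in Torq.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")  # region of interest

response = bedrock.list_foundation_models(byProvider="Anthropic")
for model in response["modelSummaries"]:
    print(model["modelId"])
```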
Integrate AI providers
Locate an integration: In your workspace, go to Build > Integrations > Steps, then search for the provider you want to connect.
Add the integration: Click the integration to connect it to Torq. For detailed setup steps, refer to the relevant integration documentation.
Confirm availability: Once connected, the provider appears in the Subscription dropdown under the Instructions tab in the Configure Agent dialog or in an AI Task operator.
How to use
AI Agent
Add an AI Agent: Open your workflow and add an AI Agent step.
Open Agent configuration: Click the step and select Configure Agent.
Go to instructions: Select the Instructions tab.
Choose a subscription: In the Subscription dropdown, pick your BYOS provider.
Select a model: In the AI Model dropdown, choose the required AI model.
Select the region: For Google Cloud Platform (Vertex AI) and Amazon Bedrock, choose a region, noting that model availability varies by region.
Save and run: Save the Agent and execute the workflow.
Your Agent will now execute using your selected subscription.
AI Task operator
Add the operator: Open your workflow and add an AI Task operator.
Open the operator configuration: Click the operator to select it and display the Properties tab.
Select a subscription: In the Subscription dropdown, select the required AI provider.
Pick a model: In the Model dropdown, select the AI model you want to process the task.



