# If you want to use another LLM provider, such as Azure or Anthropic, set that provider's variables below. For Anthropic:
# - ENABLE_ANTHROPIC=true
# - LLM_KEY=ANTHROPIC_CLAUDE3.5_SONNET
# - ANTHROPIC_API_KEY=<your_anthropic_key>
# Microsoft Azure OpenAI support:
# If you'd like to use Microsoft Azure OpenAI as the managed LLM service for your Skyvern integration, set the environment variables below.
# In your Microsoft Azure subscription, you will need to provision the Azure OpenAI service and deploy a model before you can use it:
# 1. Log in to the Azure Portal
# 2. Create an Azure Resource Group
# 3. Create an OpenAI resource in the Resource Group (choose a region and pricing tier)
# 4. From the OpenAI resource's Overview page, open the "Azure AI Foundry" portal (click the "Explore Azure AI Foundry Portal" button)
# 5. In Azure AI Foundry, click "Shared Resources" --> "Deployments"
# 6. Click "Deploy Model" --> "Deploy Base Model" --> select a model (use this model's "Deployment Name" as the AZURE_DEPLOYMENT value below)
# - ENABLE_AZURE=true
# - LLM_KEY=AZURE_OPENAI # Leave this value as-is; don't change it
# - AZURE_DEPLOYMENT=<your_azure_deployment> # Use the "Deployment Name" of the model you deployed in the steps above
# - AZURE_API_KEY=<your_azure_api_key> # Copy and paste Key1 or Key2 from the OpenAI resource in the Azure Portal
# - AZURE_API_BASE=<your_azure_api_base> # Copy and paste the "Endpoint" from the OpenAI resource in the Azure Portal (e.g. https://xyzxyzxyz.openai.azure.com/)
# - AZURE_API_VERSION=<your_azure_api_version> # Specify a valid Azure OpenAI data-plane API version (e.g. 2024-08-01-preview). Docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
# Amazon Bedrock Support:
# Amazon Bedrock is a managed service that lets you invoke LLMs, with usage billed through your AWS account.
# To use Amazon Bedrock as the LLM provider for Skyvern, specify the following environment variables.
# 1. In the AWS IAM console, create a new AWS IAM User (name it whatever you want)
# 2. Assign the "AmazonBedrockFullAccess" policy to the user
# 3. Generate an IAM Access Key under the IAM User's Security Credentials tab
# 4. In the Amazon Bedrock console, go to "Model Access"
# 5. Click the "Modify Model Access" button
# 6. Enable "Claude 3.5 Sonnet v2" and save changes
# - ENABLE_BEDROCK=true
# - LLM_KEY=BEDROCK_ANTHROPIC_CLAUDE3.5_SONNET # This is the Claude 3.5 Sonnet "V2" model. Change to BEDROCK_ANTHROPIC_CLAUDE3.5_SONNET_V1 for the non-v2 version.
# - AWS_REGION=us-west-2 # Replace this with a different AWS region, if you desire
# - AWS_ACCESS_KEY_ID=FILL_ME_IN_PLEASE
# - AWS_SECRET_ACCESS_KEY=FILL_ME_IN_PLEASE
# Ollama Support:
# Ollama is a provider for running LLMs locally on your own machine.
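# A minimal sketch of the corresponding configuration (the exact variable names and example values below are assumptions; check the Skyvern docs for your version):
# - ENABLE_OLLAMA=true
# - LLM_KEY=OLLAMA
# - OLLAMA_MODEL=qwen2.5:7b-instruct # replace with a model you have pulled locally with `ollama pull`
# - OLLAMA_SERVER_URL=http://host.docker.internal:11434 # points at the Ollama server running on the host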
# Uncomment the services below for local usage of `vaultwarden` & bitwarden-cli - see more at: https://github.com/dani-garcia/vaultwarden
# Start this container first, then sign up, create a master password, and create an organization.
# Once your account is created, go to SETTINGS/SECURITY/KEYS/API to get the client ID and secret for the CLI & Skyvern integrations.
# vaultwarden:
#   image: vaultwarden/server:latest-alpine
#   container_name: vaultwarden
#   restart: unless-stopped
#   environment:
#     # DOMAIN: "https://vaultwarden.example.com" # required when using a reverse proxy; set this to your domain so Vaultwarden knows it is served over HTTPS (attachments won't work properly otherwise)
#     SIGNUPS_ALLOWED: "true" # set this to "false" after you have created your account so that no strangers can register
#   volumes:
#     - ~/vw-data/:/data/ # the path before the : can be changed
#   ports:
#     - 127.0.0.1:11002:80 # you can replace 11002 with your preferred port
# Bitwarden CLI Server (provides REST API endpoints for Skyvern)
# Once you have the master password and API credentials, set them below; this CLI service will then give Skyvern secure access to Vaultwarden.
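# A minimal sketch of such a service follows. The image name, the host port, and the BW_PASSWORD variable are assumptions for illustration; adjust them to the Bitwarden CLI image and ports you actually use.
# bitwarden:
#   image: bitwarden/cli:latest # assumed image name; substitute the Bitwarden CLI image you use
#   container_name: bitwarden_cli
#   restart: unless-stopped
#   environment:
#     BW_CLIENTID: "user.xxxxxxxx" # client_id from SETTINGS/SECURITY/KEYS/API
#     BW_CLIENTSECRET: "xxxxxxxx" # client_secret from the same page
#     BW_PASSWORD: "your_master_password" # hypothetical variable; a place to keep the master password created during Vaultwarden signup
#   # point the CLI at the local Vaultwarden, log in with the API key, and expose the Vault Management REST API
#   command: >
#     sh -c "bw config server http://vaultwarden:80 &&
#            bw login --apikey &&
#            bw serve --hostname 0.0.0.0 --port 8087"
#   ports:
#     - 127.0.0.1:8002:8087 # host port 8002 is an arbitrary choice; Skyvern would reach the REST API here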