
Availability

Edition: Community & Enterprise
Deployment Type: Self-Managed, Hybrid
This guide assumes you are logged in as a Studio Administrator with full permissions to configure the system.

1. Log in to the AI Studio

After completing the installation process and registering your first user:
  1. Access the UI: Open your web browser and navigate to your configured SITE_URL.
  2. Admin Login: Log in using the administrator account you created during registration.
    Reminder: If you haven’t completed the initial registration yet, go back to your installation guide and follow the “First User Registration” section.

2. Configure Your First LLM

One of the most common initial steps is connecting Tyk AI Studio to an LLM provider.
  1. Navigate to LLM Management: In the admin UI sidebar, select LLM Management > LLM Providers.
  2. Add LLM Configuration: Click the button to add a new LLM Configuration.
  3. Enter Details:
    • Configuration Name: Give it a recognizable name (e.g., OpenAI-GPT-4o).
    • Description: Optionally, add a description for this configuration.
    • Select Vendor: Choose the LLM provider you want to connect (e.g., OpenAI, Anthropic, Azure OpenAI).
    • Select Default Model: Specify the exact model identifier(s) provided by the vendor (e.g., gpt-4o, gpt-4-turbo).
    • Add API Key in the Access Details section:
      Do not paste your API key directly here. Instead, use Secrets Management.
      • If you haven’t already, go to the Secrets section in the admin UI and create a new secret:
        • Variable Name: OPENAI_API_KEY (or similar)
        • Secret Value: Paste your actual OpenAI API key here.
        • Save the secret.
      • Return to the LLM Configuration screen.
      • In the API Key field, enter the secret reference: $SECRET/OPENAI_API_KEY (using the exact Variable Name you created).
    • Other Parameters: Configure any other provider-specific settings (e.g., Base URL for Azure/custom endpoints, default temperature, etc.).
  4. Save: Save the LLM configuration.
This LLM is now available for use within Tyk AI Studio, subject to User/Group permissions. For more details, see the LLM Management documentation.
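The point of the `$SECRET/OPENAI_API_KEY` reference is indirection: the configuration stores a pointer to the secret, never the raw key. The pattern can be sketched in Python (a hypothetical illustration of the mechanism only, not Tyk AI Studio's internal implementation; the `SECRETS` store and `resolve_secret` helper are assumptions):

```python
import re

# Hypothetical secret store; in Tyk AI Studio, secrets are created
# via the Secrets section of the admin UI.
SECRETS = {"OPENAI_API_KEY": "sk-...example..."}

SECRET_REF = re.compile(r"^\$SECRET/(?P<name>[A-Z0-9_]+)$")

def resolve_secret(value: str) -> str:
    """Return the stored secret if `value` is a $SECRET/<NAME>
    reference; otherwise return the value unchanged."""
    match = SECRET_REF.match(value)
    if match:
        return SECRETS[match.group("name")]
    return value

# The LLM configuration holds only the reference, never the raw key.
llm_config = {"vendor": "OpenAI", "api_key": "$SECRET/OPENAI_API_KEY"}
api_key = resolve_secret(llm_config["api_key"])
```

Because the raw key lives only in the secret store, the same configuration can be exported, versioned, or shared without leaking credentials.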

3. Create a Chat Experience

  1. Navigate to the Chats > Chats section in the admin UI.
  2. Click to create a new Chat Experience.
    • Name: Give your chat a descriptive name (e.g., OpenAI GPT-4o).
    • Select LLM Model: In the LLM settings, select the specific model you want to use for this chat (e.g., gpt-4o).
    • Select LLM: Select the LLM configuration you created in the previous step.
    • Select Group: Assign this chat to a specific Group or select “Default” to make it available to all users.
  3. Save the new Chat Experience.
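Taken together, the selections above amount to a small configuration record. A minimal sketch (the field names and the example values are illustrative assumptions, not a confirmed Tyk AI Studio schema):

```python
# Hypothetical shape of a Chat Experience record, for illustration only.
chat_experience = {
    "name": "My GPT-4o Chat",        # descriptive name shown to users
    "llm": "OpenAI-GPT-4o",          # the LLM configuration from step 2
    "model": "gpt-4o",               # specific model within that configuration
    "group": "Default",              # "Default" makes it available to all users
}
```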

4. Use the Chat Interface

  1. Navigate to the Chat tab at the top of the UI.
  2. Select Chat Experience: Choose the experience you just created from the list of available chats.
  3. Interact: Type a question or prompt in the chat box.
  4. Receive Responses: The LLM will process your request and stream the response back to you in the chat window.
You have now successfully configured an LLM and used it to answer a question through the Tyk AI Studio Chat Interface.
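The same interaction can typically be driven programmatically. As a sketch only (the endpoint path, auth header, and payload shape below follow the common OpenAI-compatible convention and are assumptions, not confirmed Tyk AI Studio API details):

```python
import json
from urllib import request

SITE_URL = "https://ai-studio.example.com"  # your configured SITE_URL

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble a chat-completion payload in the common
    OpenAI-compatible shape (an assumption, not a confirmed
    Tyk AI Studio schema)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # stream the response back, as in the chat UI
    }

payload = build_chat_request("gpt-4o", "What is an API gateway?")

# Request construction only; the path and bearer token are placeholders
# and nothing is sent here.
req = request.Request(
    f"{SITE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer <token>",
        "Content-Type": "application/json",
    },
)
```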

Next Steps

With the initial configuration complete, you can now: