
# Quick Start

Get your first Strands Agent running in 5 minutes.

## Prerequisites

  1. **Universal API Account** - Sign up at universalapi.co
  2. **AWS Bedrock Access** - AWS credentials with permission to call Bedrock
  3. **API Keys** - Your `userId` and `secretUniversalKey` from the dashboard

## Step 1: Add AWS Credentials

Before creating an agent, add your AWS credentials to Universal API:

  1. Go to universalapi.co/keys
  2. Click "Add Key" → Select "AWS"
  3. Enter your AWS Access Key ID and Secret Access Key
  4. Select your preferred region (e.g., us-east-1)
  5. Save

!!! note "AWS Bedrock Access"
    Make sure your AWS account has access to Claude or other Bedrock models. You can enable model access in the AWS Bedrock Console.

## Step 2: Create Your First Agent

Create a simple agent that uses Claude 3.7 Sonnet:

```bash
curl -X POST "https://api.universalapi.co/agent/create" \
  -H "Content-Type: application/json" \
  -H "X-Uni-UserId: YOUR_USER_ID" \
  -H "X-Uni-SecretUniversalKey: YOUR_SECRET_KEY" \
  -d '{
    "agentName": "my-first-agent",
    "description": "A simple AI assistant",
    "sourceCode": "from strands import Agent\nfrom strands.models import BedrockModel\n\ndef create_agent():\n    model = BedrockModel(\n        model_id=\"us.anthropic.claude-3-7-sonnet-20250219-v1:0\",\n        region_name=\"us-east-1\"\n    )\n    agent = Agent(model=model)\n    return agent, []"
  }'
```

Response:

```json
{
  "success": true,
  "data": {
    "agentId": "5a8f5c4f-a3ca-44cc-8fe7-afec7a75653d",
    "agentName": "my-first-agent",
    "endpointUrl": "https://api.universalapi.co/agent/5a8f5c4f-a3ca-44cc-8fe7-afec7a75653d/chat"
  }
}
```

Save the `agentId`; you'll need it to chat with your agent.
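If you're scripting the create call, the `agentId` can be read straight out of the JSON response. A minimal sketch in Python (`extract_agent_id` is a hypothetical helper; the response shape matches the example above):

```python
def extract_agent_id(resp_json: dict) -> str:
    """Pull the agentId out of an /agent/create response body."""
    if not resp_json.get("success"):
        raise RuntimeError(f"Agent creation failed: {resp_json}")
    return resp_json["data"]["agentId"]

# The response shape from the example above:
create_response = {
    "success": True,
    "data": {
        "agentId": "5a8f5c4f-a3ca-44cc-8fe7-afec7a75653d",
        "agentName": "my-first-agent",
        "endpointUrl": "https://api.universalapi.co/agent/5a8f5c4f-a3ca-44cc-8fe7-afec7a75653d/chat",
    },
}
agent_id = extract_agent_id(create_response)
```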

## Step 3: Chat with Your Agent

### Option A: Streaming Response

Use the streaming endpoint for real-time responses:

```bash
curl -N "https://stream.api.universalapi.co/agent/YOUR_AGENT_ID/chat" \
  -H "Content-Type: application/json" \
  -H "X-Uni-UserId: YOUR_USER_ID" \
  -H "X-Uni-SecretUniversalKey: YOUR_SECRET_KEY" \
  -d '{"prompt": "Hello! What is 2+2?"}'
```

Response (streamed in real-time):

```
__META__{"conversationId": "625c2112-9eac-4630-bbbc-785a845a182d"}__
Hello! The answer to 2+2 is 4.

This is a simple arithmetic calculation that doesn't require any special tools to solve.
```

### Option B: Buffered Response

Use the standard endpoint for a complete response:

```bash
curl -X POST "https://api.universalapi.co/agent/YOUR_AGENT_ID/chat" \
  -H "Content-Type: application/json" \
  -H "X-Uni-UserId: YOUR_USER_ID" \
  -H "X-Uni-SecretUniversalKey: YOUR_SECRET_KEY" \
  -d '{"prompt": "Hello! What is 2+2?"}'
```

## Step 4: Continue a Conversation

To continue an existing conversation, include the `conversationId`:

```bash
curl -N "https://stream.api.universalapi.co/agent/YOUR_AGENT_ID/chat" \
  -H "Content-Type: application/json" \
  -H "X-Uni-UserId: YOUR_USER_ID" \
  -H "X-Uni-SecretUniversalKey: YOUR_SECRET_KEY" \
  -d '{
    "prompt": "What about 3+3?",
    "conversationId": "625c2112-9eac-4630-bbbc-785a845a182d"
  }'
```

The agent will remember the previous context and respond accordingly.

## Step 5: List Your Agents

View all your agents:

```bash
curl "https://api.universalapi.co/agent/list" \
  -H "X-Uni-UserId: YOUR_USER_ID" \
  -H "X-Uni-SecretUniversalKey: YOUR_SECRET_KEY"
```

## Complete Example Script

Here's a Python script to test your agent:

```python
import os

import requests

# Configuration: read credentials from environment variables
USER_ID = os.environ.get("UNI_USER_ID")
SECRET_KEY = os.environ.get("UNI_SECRET_KEY")
BASE_URL = "https://api.universalapi.co"
STREAM_URL = "https://stream.api.universalapi.co"

headers = {
    "Content-Type": "application/json",
    "X-Uni-UserId": USER_ID,
    "X-Uni-SecretUniversalKey": SECRET_KEY
}

# Agent source code: a minimal Strands agent backed by Claude 3.7 Sonnet
agent_code = '''
from strands import Agent
from strands.models import BedrockModel

def create_agent():
    model = BedrockModel(
        model_id="us.anthropic.claude-3-7-sonnet-20250219-v1:0",
        region_name="us-east-1"
    )
    agent = Agent(model=model)
    return agent, []
'''

# Create an agent
response = requests.post(
    f"{BASE_URL}/agent/create",
    headers=headers,
    json={
        "agentName": "python-test-agent",
        "description": "Test agent created from Python",
        "sourceCode": agent_code
    }
)
response.raise_for_status()
agent_id = response.json()["data"]["agentId"]
print(f"Created agent: {agent_id}")

# Chat with streaming (note the streaming host, not BASE_URL)
response = requests.post(
    f"{STREAM_URL}/agent/{agent_id}/chat",
    headers=headers,
    json={"prompt": "Hello! Tell me a joke."},
    stream=True
)

print("Response:")
for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
    print(chunk, end="", flush=True)
print()
```

## Understanding the Response Format

Streaming responses include special markers:

| Marker | Description |
|--------|-------------|
| `__META__{json}__` | Metadata (conversationId, etc.) |
| `__TOOL__{name}__` | Tool execution indicator |
| `__ERROR__{json}__` | Error information |
Regular text is the agent's response, streamed in real-time.
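When consuming the stream programmatically, marker lines can be separated from regular text. A hedged sketch, assuming each marker occupies its own line and the payload sits between the marker name and the trailing `__`, as in the example output above:

```python
import json
import re

# Matches __META__{...}__, __TOOL__name__, or __ERROR__{...}__ on a line.
MARKER_RE = re.compile(r"^__(META|TOOL|ERROR)__(.*?)__$")

def classify_line(line: str):
    """Return (kind, payload) for a marker line, or ("text", line) otherwise."""
    m = MARKER_RE.match(line.strip())
    if not m:
        return ("text", line)
    kind, body = m.group(1).lower(), m.group(2)
    if kind in ("meta", "error"):
        return (kind, json.loads(body))  # these markers carry a JSON object
    return (kind, body)  # tool markers carry just a name

kind, payload = classify_line(
    '__META__{"conversationId": "625c2112-9eac-4630-bbbc-785a845a182d"}__'
)
```

Anything that doesn't match a marker pattern is treated as the agent's text and can be shown to the user as it arrives.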

## Next Steps
