
Agent Self-Provisioning Guide

How an AI agent goes from zero to persistent EU storage in under 10 seconds, with no human in the loop.

This is the complete lifecycle. An agent with HTTP access can execute all of this autonomously.

1. Register (one-time)

The agent creates its own account. No email verification, no captcha, no credit card.

POST https://api.eustore.dev/v1/auth/register
Content-Type: application/json

{
  "name": "research-agent-7b",
  "email": "agent-7b@mycompany.ai"
}

# Response:
{
  "api_key": "eust_abc123...",
  "customer_id": "cust_...",
  "free_credits": 50,
  "message": "Account created. 50 free credits added (€0.50)."
}
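The registration call above can be scripted with nothing but the Python standard library — a sketch; `requests` or any other HTTP client works equally well:

```python
import json
import urllib.request

API = "https://api.eustore.dev/v1"

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Assemble a JSON POST request against the eustore API."""
    return urllib.request.Request(
        API + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def register(name: str, email: str) -> dict:
    """One-time self-registration; the response carries the api_key needed in step 2."""
    req = build_request("/auth/register", {"name": name, "email": email})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# account = register("research-agent-7b", "agent-7b@mycompany.ai")
# account["api_key"]  # persist this for step 2
```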

2. Authenticate

POST https://api.eustore.dev/v1/auth/token
Content-Type: application/json

{
  "api_key": "eust_abc123..."
}

# Response:
{
  "token": "eyJ...",
  "expires_in": 86400
}
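Since tokens expire after 86,400 seconds (24 hours), a long-running agent should cache the token and refresh it shortly before expiry rather than re-authenticating on every call. A minimal sketch (the 60-second refresh margin is a policy choice, not an API requirement):

```python
import json
import time
import urllib.request

API = "https://api.eustore.dev/v1"
REFRESH_MARGIN = 60  # refresh this many seconds before expiry

def needs_refresh(expires_at: float, now: float, margin: float = REFRESH_MARGIN) -> bool:
    """True when the cached token is missing, expired, or about to expire."""
    return now >= expires_at - margin

class TokenCache:
    def __init__(self, api_key: str):
        self.api_key = api_key
        self._token = ""
        self._expires_at = 0.0  # epoch seconds; 0 forces an initial fetch

    def token(self) -> str:
        if needs_refresh(self._expires_at, time.time()):
            req = urllib.request.Request(
                f"{API}/auth/token",
                data=json.dumps({"api_key": self.api_key}).encode(),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                data = json.load(resp)
            self._token = data["token"]
            self._expires_at = time.time() + data["expires_in"]
        return self._token
```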

3. Create a bucket

The agent provisions its own storage, picking the region closest to its workload.

POST https://api.eustore.dev/v1/storage/buckets
Authorization: Bearer eyJ...
Content-Type: application/json

{
  "name": "agent-7b-memory",
  "region": "eu-central"
}

# Response:
{
  "id": "bucket_...",
  "name": "agent-7b-memory",
  "region": "eu-central",
  "s3_endpoint": "https://fsn1.your-objectstorage.com",
  "created_at": "2026-03-29T07:00:00Z"
}
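Bucket creation is a single authenticated POST. A sketch, with the bearer token coming from step 2:

```python
import json
import urllib.request

API = "https://api.eustore.dev/v1"

def bucket_request(token: str, name: str, region: str = "eu-central") -> urllib.request.Request:
    """Build the authenticated bucket-creation request."""
    return urllib.request.Request(
        f"{API}/storage/buckets",
        data=json.dumps({"name": name, "region": region}).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )

def create_bucket(token: str, name: str, region: str = "eu-central") -> dict:
    """Provision a bucket; the response carries the bucket id and s3_endpoint."""
    with urllib.request.urlopen(bucket_request(token, name, region)) as resp:
        return json.load(resp)
```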

4. Get S3 credentials

Per-bucket S3 credentials. The agent uses these with any S3 client.

GET https://api.eustore.dev/v1/storage/buckets/{bucket_id}/credentials
Authorization: Bearer eyJ...

# Response:
{
  "access_key": "...",
  "secret_key": "...",
  "endpoint": "https://fsn1.your-objectstorage.com",
  "bucket_name": "agent-7b-memory",
  "region": "eu-central"
}
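The `creds` dict used in the boto3 snippet of step 5 can be populated straight from this endpoint. A sketch:

```python
import json
import urllib.request

API = "https://api.eustore.dev/v1"

def credentials_url(bucket_id: str) -> str:
    """URL of the per-bucket credentials endpoint."""
    return f"{API}/storage/buckets/{bucket_id}/credentials"

def get_credentials(token: str, bucket_id: str) -> dict:
    """Fetch per-bucket S3 credentials (authenticated GET)."""
    req = urllib.request.Request(
        credentials_url(bucket_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# creds = get_credentials(bearer_token, bucket["id"])
# boto3 then takes creds["endpoint"], creds["access_key"], creds["secret_key"]
```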

5. Store and retrieve data

Standard S3 — works with boto3, aws-cli, any S3 library.

# Python (boto3)
import boto3, json

# 'creds' is the credentials response from step 4
s3 = boto3.client('s3',
    endpoint_url='https://fsn1.your-objectstorage.com',
    aws_access_key_id=creds['access_key'],
    aws_secret_access_key=creds['secret_key'],
    region_name='eu-central'
)

# Store agent memory
memory = {"session": "2026-03-29", "insights": [...], "decisions": [...]}
s3.put_object(
    Bucket='agent-7b-memory',
    Key='memory/2026-03-29.json',
    Body=json.dumps(memory)
)

# Retrieve across sessions
obj = s3.get_object(Bucket='agent-7b-memory', Key='memory/2026-03-29.json')
previous_memory = json.loads(obj['Body'].read())

6. Monitor costs

The agent checks its own spending. No surprise bills.

GET https://api.eustore.dev/v1/billing/balance
Authorization: Bearer eyJ...

# Response:
{
  "credits_remaining": 42,
  "credits_used": 8,
  "estimated_monthly_cost": 0.80,
  "currency": "EUR"
}
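A self-managing agent typically polls this endpoint and acts when credits fall below a threshold of its own choosing. A sketch — the 10-credit threshold is the agent's policy, not an API value:

```python
import json
import urllib.request

API = "https://api.eustore.dev/v1"
LOW_CREDIT_THRESHOLD = 10  # agent's own policy

def should_top_up(balance: dict, threshold: int = LOW_CREDIT_THRESHOLD) -> bool:
    """Pure policy check against the /billing/balance response."""
    return balance["credits_remaining"] < threshold

def check_balance(token: str) -> dict:
    """Fetch the current credit balance (authenticated GET)."""
    req = urllib.request.Request(
        f"{API}/billing/balance",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```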

7. Top up programmatically

When credits run low, the agent can top up — via crypto (no human approval needed) or Stripe.

# Crypto (instant, no human)
POST https://api.eustore.dev/v1/billing/topup/crypto
Authorization: Bearer eyJ...
Content-Type: application/json

{
  "amount_eur": 10,
  "token": "usdc",
  "chain": "base"
}

# Returns wallet address — agent sends USDC, credits arrive automatically
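Putting the last two steps together: when the balance check reports low credits, the agent requests a top-up and pays the returned wallet address. A sketch — the response field name (`deposit_address` below) is an assumption, so check the actual response schema; note that the request body's `"token"` field names the coin, not the bearer token:

```python
import json
import urllib.request

API = "https://api.eustore.dev/v1"

def topup_payload(amount_eur: int, coin: str = "usdc", chain: str = "base") -> dict:
    """Request body; the API's 'token' field is the coin symbol."""
    return {"amount_eur": amount_eur, "token": coin, "chain": chain}

def request_crypto_topup(token: str, amount_eur: int,
                         coin: str = "usdc", chain: str = "base") -> dict:
    """Request a crypto top-up; the response includes a wallet address to pay."""
    req = urllib.request.Request(
        f"{API}/billing/topup/crypto",
        data=json.dumps(topup_payload(amount_eur, coin, chain)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# result = request_crypto_topup(bearer_token, 10)
# result["deposit_address"]  # assumed field name: send USDC here; credits arrive automatically
```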

Real-world patterns

🤖 Long-term memory

# Agent saves learnings after each session
s3.put_object(
    Bucket='agent-memory',
    Key=f'sessions/{session_id}/summary.json',
    Body=json.dumps({
        "date": "2026-03-29",
        "learnings": ["User prefers concise answers", ...],
        "context_for_next_session": {...}
    })
)

# Next session: agent loads all previous learnings
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='agent-memory', Prefix='sessions/'):
    for obj in page.get('Contents', []):
        # Reconstruct memory from all sessions
        ...
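One simple way to fold those per-session summaries back into a single working memory is to merge them oldest-first (date-stamped keys like the ones above list in chronological order). A pure sketch of the merge step, with hypothetical field handling:

```python
def merge_session_summaries(summaries: list) -> dict:
    """Combine per-session summary dicts, oldest first, into one memory dict.

    Learnings accumulate across sessions; for context, later sessions win.
    """
    memory = {"learnings": [], "context": {}}
    for s in summaries:
        memory["learnings"].extend(s.get("learnings", []))
        memory["context"].update(s.get("context_for_next_session", {}))
    return memory
```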

📊 Multi-agent shared workspace

# Research agent stores findings
s3.put_object(Bucket='team-workspace', Key='research/topic-a/findings.json', Body=...)

# Writer agent reads research and produces output
findings = s3.get_object(Bucket='team-workspace', Key='research/topic-a/findings.json')

# Reviewer agent reads output and stores feedback
s3.put_object(Bucket='team-workspace', Key='reviews/draft-1/feedback.json', Body=...)

🔄 Checkpoint/resume for long tasks

# Agent saves progress periodically
s3.put_object(
    Bucket='agent-work',
    Key=f'jobs/{job_id}/checkpoint.json',
    Body=json.dumps({"step": 47, "state": current_state, "progress": 0.65})
)

# If interrupted, resume from last checkpoint
try:
    cp = s3.get_object(Bucket='agent-work', Key=f'jobs/{job_id}/checkpoint.json')
    state = json.loads(cp['Body'].read())
    resume_from_step(state['step'])
except s3.exceptions.NoSuchKey:
    start_fresh()