Quickstart
Build a multi-tenant worker, test it locally, and send remote commands — no cloud account needed.
In this guide, you'll build an AI worker that runs inside a customer's cloud. Your AI does the reasoning in your cloud; the worker does the actions in theirs — reading files, writing results, querying data — without any of it leaving their network.
╔═ Your Cloud ════════════╗ ╔═ Customer's Cloud ═════════════════╗
║ ║ ║ ║░
║ ┏━━━━━━━━━━━━━━━━┓ ║ tool calls ║ ┏━━━━━━━━━━━━━━━━┓ ║░
║ ┃ AI Agent ┃────╬─────────────▶──╬──┃ AI Worker ┃ ║░
║ ┃ (reasoning) ┃◀───╬────────────────╬──┃ (actions) ┃ ║░
║ ┗━━━━━━━━━━━━━━━━┛ ║ results ║ ┗━━━━━━┯━━━━━━━━━┛ ║░
║ ║ ║ │ ║░
╚═════════════════════════╝ ║ read files, query data, ║░
║ write results, ... ║░
║ ║░
╚════════════════════════════════════╝░
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░
Install
curl -fsSL https://alien.dev/install | sh
Create the project
alien init
Select remote-worker-ts. This creates the project files.
Let's look at the two important files.
alien.ts — what to deploy
This file describes the infrastructure each customer gets:
import * as alien from "@alienplatform/core"
// Private file storage for each customer
// Becomes S3 on AWS, Cloud Storage on GCP, Blob Storage on Azure
const files = new alien.Storage("files").build()
// Your code — deployed as a serverless function in the customer's cloud
// Becomes Lambda on AWS, Cloud Run on GCP, Container Apps on Azure
const worker = new alien.Function("worker")
.code({ type: "source", src: "./", toolchain: { type: "typescript" } })
.commandsEnabled(true)
.ingress("private") // No public URL — only reachable via commands
.link(files)
.permissions("execution")
.build()
export default new alien.Stack("remote-worker")
.add(files, "frozen")
.add(worker, "live")
.permissions({
profiles: {
execution: {
"*": ["storage/data-read", "storage/data-write"],
},
},
})
.build()
src/index.ts — the code that runs in the customer's cloud
The template includes two tools. Here's the core pattern:
import { command, storage } from "@alienplatform/sdk"
// Each tool runs inside the customer's cloud.
// Their files never leave their network — only the result comes back to you.
const tools = {
"read-file": {
description: "Read a file from the customer's private workspace",
execute: async ({ path }) => {
const store = await storage("files")
const { data } = await store.get(path)
return { content: new TextDecoder().decode(data) }
},
},
"write-file": {
description: "Write a file to the customer's private workspace",
execute: async ({ path, content }) => {
const store = await storage("files")
await store.put(path, content)
return { written: true, path }
},
},
}
command("execute-tool", async ({ tool, params }) => {
const handler = tools[tool]
if (!handler) throw new Error(`Unknown tool: ${tool}`)
return handler.execute(params)
})
command("list-tools", async () =>
Object.entries(tools).map(([name, t]) => ({
name,
description: t.description,
}))
)
command() registers handlers, and storage() gives each command access to the customer's private storage.
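Extending the worker means adding entries to the tools map. Here is a minimal, runnable sketch of that dispatch pattern, with an in-memory Map standing in for the real storage("files") binding and a hypothetical list-files tool (neither the mock nor list-files is part of the template):

```typescript
// In-memory stand-in for the customer's private workspace.
// In the real worker, storage("files") provides this.
const mockFiles = new Map<string, string>([
  ["a.txt", "A"],
  ["b.txt", "B"],
]);

type Tool = {
  description: string;
  execute: (params: Record<string, unknown>) => Promise<unknown>;
};

const tools: Record<string, Tool> = {
  // Hypothetical extra tool — illustrative only
  "list-files": {
    description: "List every path in the customer's private workspace",
    execute: async () => ({ paths: [...mockFiles.keys()] }),
  },
};

// Same dispatch shape as the execute-tool command in src/index.ts
async function executeTool(tool: string, params: Record<string, unknown>) {
  const handler = tools[tool];
  if (!handler) throw new Error(`Unknown tool: ${tool}`);
  return handler.execute(params);
}
```

Because every tool goes through the same dispatcher, the execute-tool and list-tools commands never change when you add capabilities — you only grow the map.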
Local development
Start local dev
alien dev
Local Development
Project remote-worker-ts
✔ Build local release
✔ Start local deployment
╭─ default ────────── ● running ───╮
│ worker running (private) │
│ files local filesystem │
╰──────────────────────────────────╯
alien dev release → push changes   alien dev deploy → new deployment   Ctrl+C → stop
Everything runs on your machine. Storage is on the local filesystem. Same APIs as production — no cloud credentials needed.
default is your first deployment — it simulates deploying into a customer's cloud. In production, this would be a real AWS account with a real S3 bucket. Right now, everything runs locally on your machine.
Send a command
Commands let your backend call functions on the worker without any inbound networking. No open ports, no VPN, no VPC peering — the customer's network stays completely closed.
In local dev, you target the default deployment. In production, the exact same command reaches a real customer deployment — from the CLI or from your code via the API.
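The round-trip can be pictured as a plain function call: name a deployment, a command, and JSON params, and get JSON back. Here is a runnable local simulation of that shape — invokeCommand and the handler map are illustrative stand-ins, not the real Alien API surface:

```typescript
// Illustrative stand-ins — not the real Alien client API.
type CommandHandler = (params: Record<string, unknown>) => Promise<unknown>;

// One deployment = one set of registered command handlers
const defaultDeployment = new Map<string, CommandHandler>([
  ["list-tools", async () => [
    { name: "read-file", description: "Read a file from the customer's private workspace" },
    { name: "write-file", description: "Write a file to the customer's private workspace" },
  ]],
]);

// Mirrors the CLI shape: alien dev commands invoke --deployment ... --command ...
async function invokeCommand(
  deployment: Map<string, CommandHandler>,
  command: string,
  params: Record<string, unknown> = {}
) {
  const handler = deployment.get(command);
  if (!handler) throw new Error(`Unknown command: ${command}`);
  return handler(params);
}
```

The key property is directionality: the deployment only ever answers commands, so nothing in the customer's network needs to accept inbound connections.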
Open a second terminal and list the tools the worker exposes:
alien dev commands invoke --deployment default --command list-tools
[
{ "name": "read-file", "description": "Read a file from the customer's private workspace" },
{ "name": "write-file", "description": "Write a file to the customer's private workspace" }
]
Write a file to the customer's storage:
alien dev commands invoke \
--deployment default \
--command execute-tool \
--params '{"tool": "write-file", "params": {"path": "hello.txt", "content": "Hello!"}}'
{ "written": true, "path": "hello.txt" }
Read it back:
alien dev commands invoke \
--deployment default \
--command execute-tool \
--params '{"tool": "read-file", "params": {"path": "hello.txt"}}'
{ "content": "Hello!" }
Simulate multiple customers
You have one customer. Let's add another. In production, each customer has their own AWS account with their own S3 bucket — completely separate from each other. Locally, Alien simulates this with isolated directories:
alien dev deploy --name acme-corp --platforms local
Back in the first terminal, both customers appear:
╭─ default ─────────────────────────── ● running ─╮
│ worker running (private) │
│ files local filesystem │
╰─────────────────────────────────────────────────╯
╭─ acme-corp ───────────────────────── ● running ─╮
│ worker running (private) │
│ files local filesystem │
╰─────────────────────────────────────────────────╯
The isolation is real even locally — files written by default are invisible to acme-corp, just like they would be in separate AWS accounts.
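That isolation boils down to each deployment owning its own storage namespace. A toy model in plain TypeScript, with one Map per tenant standing in for one S3 bucket per customer account (putFile and getFile are illustrative names, not SDK calls):

```typescript
// One Map per tenant stands in for one S3 bucket per customer account.
const tenants = new Map<string, Map<string, string>>([
  ["default", new Map()],
  ["acme-corp", new Map()],
]);

function putFile(tenant: string, path: string, content: string): void {
  const store = tenants.get(tenant);
  if (!store) throw new Error(`Unknown tenant: ${tenant}`);
  store.set(path, content);
}

function getFile(tenant: string, path: string): string | undefined {
  const store = tenants.get(tenant);
  if (!store) throw new Error(`Unknown tenant: ${tenant}`);
  return store.get(path);
}

putFile("default", "hello.txt", "Hello!");
// acme-corp's workspace never sees default's file
const visibleToAcme = getFile("acme-corp", "hello.txt"); // undefined
```

In production the boundary is enforced by separate cloud accounts rather than separate data structures, but the programming model your worker sees is the same.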
Push an update
Change your code — add a tool, fix a bug, anything. Then:
alien dev release
This creates a new local release and updates the tracked deployments to point at it. If you want to verify the new code path locally right away, restart alien dev after the release so the worker process reloads the new build.
Press Ctrl+C to stop.
Next step
You built a multi-tenant worker, tested it locally with zero cloud setup, simulated multiple customers with isolated data, and pushed a live update to all of them at once.
Ready to deploy it into a real AWS account?