Everything you need for local GCP development
Like LocalStack, but purpose-built for Google Cloud.
Zero GCP credentials
Develop and test against real-looking GCP APIs without a Google account, billing, or service account keys.
SDK-compatible
Works with the official google-cloud-* Python SDKs. Point them at Cloudbox and your code runs unchanged.
Instant startup
All services start in under a second. No Docker images to pull, no network calls, no cold-start delays.
Pure Python
FastAPI + uvicorn. Easy to inspect, extend, and debug. DuckDB powers BigQuery and Spanner queries.
Admin UI
Built-in web dashboard at localhost:8888 — browse buckets, topics, secrets, and tasks in real time.
Optional persistence
In-memory by default. Set CLOUDBOX_DATA_DIR to persist state across restarts via atomic JSON writes.
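The atomic-write pattern behind that persistence can be sketched as follows (a minimal illustration of write-to-temp-then-rename, not Cloudbox's actual code):

```python
import json
import os
import tempfile

def atomic_write_json(path: str, state: dict) -> None:
    """Write state to path so readers never observe a half-written file."""
    # Write to a temp file in the same directory, then atomically swap it
    # into place; os.replace is atomic on both POSIX and Windows.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".", suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(state, f)
            f.flush()
            os.fsync(f.fileno())  # make sure bytes hit disk before the rename
        os.replace(tmp, path)
    except BaseException:
        os.unlink(tmp)  # clean up the temp file on any failure
        raise

atomic_write_json("state.json", {"buckets": ["my-bucket"]})
```

A crash mid-write leaves at most a stray `.tmp` file behind; the target path always holds either the old state or the complete new one.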
Ten GCP services, one command
Each service runs as an independent FastAPI server started concurrently from a single entry point.
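That start-everything-concurrently pattern can be sketched with asyncio; here dummy coroutines stand in for the real uvicorn servers, which the actual entry point would await instead:

```python
import asyncio

started: list[str] = []

async def serve(name: str, port: int) -> None:
    # Stand-in for uvicorn.Server(...).serve(); just records the startup
    # and yields control, as a real server loop would.
    started.append(f"{name}:{port}")
    await asyncio.sleep(0)

async def main() -> None:
    # A single entry point launches every service concurrently.
    await asyncio.gather(
        serve("storage", 4443),
        serve("pubsub", 8085),
        serve("firestore", 8080),
    )

asyncio.run(main())
print(len(started))  # 3
```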
Cloud Storage · :4443 · REST
Buckets, objects, simple / multipart / resumable uploads, byte-range downloads, compose, CORS config, retention policies, notifications, lifecycle rules, gsutil-compatible CLI.
Pub/Sub · :8085 gRPC · :8086 REST
Topics, subscriptions, push & pull, dead-letter, message filters, ordering, snapshots, schema validation.
Firestore · :8080 · REST
Documents, collections, structured queries, field projection, composite filters, cursor pagination, aggregation, transactions, field transforms, batchWrite.
Secret Manager · :8090 · REST
Secrets, versioned payloads, enable / disable / destroy, latest resolution, cascading deletes.
Cloud Tasks · :8123 · REST
Queues, HTTP tasks, deferred scheduling, exponential retry backoff, rate limiting (maxDispatchesPerSecond / maxConcurrentDispatches), force-run, auto-dispatch background worker.
BigQuery · :9050 · REST
Datasets, tables, views, schema evolution, SELECT / DML, parameterized queries, INFORMATION_SCHEMA, streaming inserts, async jobs, CTAS — backed by DuckDB.
Spanner · :9010 · REST
Instances, databases, DDL, sessions, mutations, SQL & streaming reads — backed by DuckDB.
Cloud Logging & Monitoring · :9020 · REST
Log entries, severity & timestamp filters, sinks, log-based metrics, log exclusions, Cloud Monitoring time series.
Cloud Scheduler · :8091 · REST
Cron jobs, HTTP targets, pause / resume, force-run, exponential retry backoff, background dispatch worker via croniter.
Cloud KMS · :8092 · REST
Key rings, crypto keys, versioned key material, AES-256-GCM encrypt / decrypt, AAD, key rotation — old ciphertexts remain decryptable after rotation.
Four ways to run it
Pull the image from Docker Hub, run Docker Compose from source, run locally with uv, or install from PyPI.
Docker Hub
Pull and run the published image directly — no clone needed:
docker run -p 4443:4443 -p 8080:8080 \
  -p 8085:8085 -p 8086:8086 \
  -p 8090:8090 -p 8091:8091 \
  -p 8092:8092 -p 8123:8123 \
  -p 9050:9050 -p 9010:9010 \
  -p 9020:9020 -p 8888:8888 \
  omab/cloudbox:latest
Admin UI → http://localhost:8888
Docker Compose
Clone the repo and start all services with one command:
git clone https://github.com/omab/cloudbox
cd cloudbox
docker compose up
Admin UI → http://localhost:8888
Local Python
Requires Python 3.12+ and uv.
git clone https://github.com/omab/cloudbox
cd cloudbox
uv sync
uv run cloudbox
Install from PyPI
Install the package and run it directly:
pip install cloudbox
cloudbox
Use the official Google Cloud SDKs
Point the google-cloud-* libraries at Cloudbox — your application code stays identical.
Cloud Storage
from google.cloud import storage
from google.auth.credentials import AnonymousCredentials
client = storage.Client(
    project="local-project",
    client_options={"api_endpoint": "http://localhost:4443"},
    credentials=AnonymousCredentials(),
)
bucket = client.bucket("my-bucket")
Cloud Pub/Sub
# gRPC — set env var before importing the SDK
import os
os.environ["PUBSUB_EMULATOR_HOST"] = "localhost:8085"
from google.cloud import pubsub_v1
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("local-project", "my-topic")
publisher.create_topic(name=topic_path)
BigQuery
from google.cloud import bigquery
from google.auth.credentials import AnonymousCredentials
client = bigquery.Client(
    project="local-project",
    client_options={"api_endpoint": "http://localhost:9050"},
    credentials=AnonymousCredentials(),
)
for row in client.query("SELECT 1 AS n").result():
    print(row.n)
Secret Manager
from google.cloud import secretmanager
from google.auth.credentials import AnonymousCredentials
client = secretmanager.SecretManagerServiceClient(
    client_options={"api_endpoint": "localhost:8090"},
    credentials=AnonymousCredentials(),
)
secret = client.create_secret(
    request={
        "parent": "projects/local-project",
        "secret_id": "api-key",
        "secret": {},
    }
)
Pre-configured helpers for all services are in sdk_compat/clients.py.
Runnable examples for every service
The examples/ directory contains self-contained scripts that target a live Cloudbox instance. Start Cloudbox, then run any example directly.
uv run cloudbox &
uv run python examples/gcs/upload_download.py
uv run python examples/gcs/byte_range.py
uv run python examples/gcs/cors.py
uv run python examples/gcs/retention.py
uv run python examples/pubsub/publish_subscribe.py
uv run python examples/firestore/queries.py # incl. field projection
uv run python examples/firestore/batch_write.py
uv run python examples/bigquery/tables.py # incl. schema evolution
uv run python examples/bigquery/views.py
uv run python examples/bigquery/parameterized_query.py
uv run python examples/bigquery/information_schema.py
uv run python examples/secretmanager/secrets.py
uv run python examples/tasks/tasks.py
uv run python examples/tasks/rate_limits.py
uv run python examples/scheduler/jobs.py
uv run python examples/kms/encrypt_decrypt.py
uv run python examples/kms/key_rotation.py
uv run python examples/logging/exclusions.py
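The key-rotation guarantee exercised by examples/kms/key_rotation.py (old ciphertexts stay decryptable) comes down to version bookkeeping: each ciphertext records the key version that produced it, and rotation only changes which version encrypts next. Here is a toy sketch of that bookkeeping; Cloudbox itself uses AES-256-GCM, while this example uses a throwaway XOR keystream to stay dependency-free:

```python
import hashlib
import secrets

class ToyKey:
    """Versioned key: rotation adds a version, never deletes one."""

    def __init__(self) -> None:
        self.versions = {1: secrets.token_bytes(32)}
        self.primary = 1

    def rotate(self) -> None:
        # New primary version; old key material is retained for decryption.
        self.primary += 1
        self.versions[self.primary] = secrets.token_bytes(32)

    def _stream(self, version: int, n: int) -> bytes:
        # Throwaway keystream (NOT real crypto): hash(key || counter).
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(
                self.versions[version] + counter.to_bytes(8, "big")
            ).digest()
            counter += 1
        return out[:n]

    def encrypt(self, plaintext: bytes) -> bytes:
        # Prefix the ciphertext with the version that encrypted it.
        v = self.primary
        body = bytes(a ^ b for a, b in zip(plaintext, self._stream(v, len(plaintext))))
        return v.to_bytes(4, "big") + body

    def decrypt(self, blob: bytes) -> bytes:
        # Look up the recorded version, so old ciphertexts still decrypt.
        v = int.from_bytes(blob[:4], "big")
        body = blob[4:]
        return bytes(a ^ b for a, b in zip(body, self._stream(v, len(body))))

key = ToyKey()
old = key.encrypt(b"v1 secret")
key.rotate()
new = key.encrypt(b"v2 secret")
assert key.decrypt(old) == b"v1 secret"  # still decryptable after rotation
assert key.decrypt(new) == b"v2 secret"
```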
Cloud Storage: upload/download · compose · byte-range · CORS config · retention policies
Pub/Sub: publish · subscribe · batch with attributes
Firestore: CRUD · queries · field projection · cursor pagination · transactions · batchWrite
BigQuery: tables · schema evolution · views · parameterized queries · INFORMATION_SCHEMA
Secret Manager: create · versioned payloads · access · disable
Cloud Tasks: queues · enqueue tasks · list · delete · rate limits
Cloud Scheduler: cron jobs · pause · resume · delete
Cloud KMS: encrypt · decrypt · AAD · key rotation · version lifecycle
Cloud Logging: log exclusions · write filtering · CRUD
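The Cloud Tasks rate limits exercised above (maxDispatchesPerSecond, maxConcurrentDispatches) are conventionally enforced with a token bucket. A minimal sketch of the idea, as a simplified model rather than Cloudbox's actual dispatcher:

```python
class TokenBucket:
    """Allow at most `rate` dispatches per second, with bursts up to `burst`."""

    def __init__(self, rate: float, burst: float) -> None:
        self.rate = rate
        self.capacity = burst
        self.tokens = burst
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0  # spend one token per dispatched task
            return True
        return False            # over the limit: leave the task queued

bucket = TokenBucket(rate=2.0, burst=2.0)  # e.g. maxDispatchesPerSecond=2
# Five dispatch attempts in the same instant: only the burst gets through.
results = [bucket.allow(now=0.0) for _ in range(5)]
print(results.count(True))  # 2
```

A background worker would call `allow` before each dispatch and skip the task until the bucket refills, which is how a per-second cap turns into smooth pacing.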
Drop-in replacements for gcloud and gsutil
Two CLI entry points, installed by uv sync, that mirror the familiar gsutil and gcloud command syntax — adapting existing shell scripts is usually a one-word rename.
gsutillocal — Cloud Storage
gsutillocal ls # list buckets
gsutillocal ls gs://my-bucket # list objects
gsutillocal mb gs://my-bucket # create bucket
gsutillocal cp ./file.txt gs://b/f # upload
gsutillocal cp gs://b/f ./file.txt # download
gsutillocal cp gs://b1/o gs://b2/o # copy between buckets
gsutillocal mv gs://b/old gs://b/new # move / rename
gsutillocal rm gs://b/logs/* # wildcard delete
gsutillocal stat gs://b/file.txt # object metadata
gcloudlocal — multi-service CLI
gcloudlocal pubsub topics create my-topic
gcloudlocal pubsub subscriptions create my-sub \
  --topic my-topic
gcloudlocal pubsub topics publish my-topic \
  --message "hello world"
gcloudlocal secrets create api-key
gcloudlocal secrets versions add api-key --data "s3cr3t"
gcloudlocal tasks queues create my-queue
gcloudlocal scheduler jobs list