Files Module

SDK reference for reading, writing, and managing workflow files

The files module reads and writes files from workflows. It supports two storage modes:

  • cloud (default) — S3 storage, scoped per organization
  • local — Local filesystem under the CLI working directory and /tmp/bifrost/

It also exposes three storage locations:

| Location | Cloud (S3 prefix) | Local path |
| --- | --- | --- |
| `workspace` | bucket root | current working directory |
| `temp` | `_tmp/` | `/tmp/bifrost/temp` |
| `uploads` | `uploads/` | `/tmp/bifrost/uploads` |

uploads holds files submitted via form file fields. Paths are sandboxed — reads/writes outside the configured locations raise ValueError.
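To make the sandboxing concrete, here is a minimal sketch of the kind of check it implies: resolve the requested path against the location root and reject anything that escapes it. The function name and logic are illustrative, not the SDK's internals.

```python
from pathlib import Path

def resolve_sandboxed(root: str, user_path: str) -> Path:
    """Illustrative sandbox check: reject paths that escape the location root."""
    base = Path(root).resolve()
    target = (base / user_path).resolve()
    # After resolution, the target must still sit at or under the root.
    if base != target and base not in target.parents:
        raise ValueError(f"path escapes location root: {user_path}")
    return target
```

With a root of `/tmp/bifrost/temp`, a path like `../../etc/passwd` resolves outside the root and raises `ValueError`, matching the behavior described above.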

```python
from bifrost import files
```

| Method | Returns | Description |
| --- | --- | --- |
| `files.read()` | `str` | Read a text file |
| `files.read_bytes()` | `bytes` | Read a binary file |
| `files.write()` | `None` | Write text |
| `files.write_bytes()` | `None` | Write binary data |
| `files.list()` | `list[str]` | List files in a directory |
| `files.delete()` | `None` | Delete a file |
| `files.exists()` | `bool` | Check if a file exists |
| `files.get_signed_url()` | `dict` | Generate a presigned S3 URL for direct upload/download |

All methods accept location and mode keyword arguments (defaults: "workspace", "cloud").
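The (location, mode) pair selects a storage root per the table above. A small sketch of that mapping, purely for orientation (the dict names are illustrative, not the SDK's internals):

```python
# Illustrative mapping of (location, mode) to storage roots,
# mirroring the locations table; not the SDK's actual implementation.
LOCAL_ROOTS = {
    "workspace": ".",                      # CLI working directory
    "temp": "/tmp/bifrost/temp",
    "uploads": "/tmp/bifrost/uploads",
}
CLOUD_PREFIXES = {
    "workspace": "",                       # bucket root
    "temp": "_tmp/",
    "uploads": "uploads/",
}

def storage_root(location: str = "workspace", mode: str = "cloud") -> str:
    table = LOCAL_ROOTS if mode == "local" else CLOUD_PREFIXES
    if location not in table:
        raise ValueError(f"unknown location: {location}")
    return table[location]
```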

```python
async def read(
    path: str,
    location: Literal["workspace", "temp", "uploads"] = "workspace",
    mode: Literal["local", "cloud"] = "cloud",
) -> str
```

Returns the file contents as a string. Raises FileNotFoundError if missing, ValueError if path escapes the location root.

```python
content = await files.read("data/customers.csv")
uploaded = await files.read("form_id/uuid/upload.txt", location="uploads")
```
```python
async def read_bytes(
    path: str,
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> bytes
```

Same as read() but returns raw bytes. Use for binary formats (PDFs, images).

```python
pdf = await files.read_bytes("reports/q4.pdf")
```
```python
async def write(
    path: str,
    content: str,
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> None
```

Write text content. Creates parent directories as needed.

```python
await files.write("output/report.txt", "Report data\n")
```
```python
async def write_bytes(
    path: str,
    content: bytes,
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> None
```

Write binary data.

```python
await files.write_bytes("uploads/logo.png", image_bytes)
```
```python
async def list(
    directory: str = "",
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> list[str]
```

List file and directory names directly under directory (defaults to the location root).

```python
items = await files.list("uploads")
for name in items:
    print(name)
```
```python
async def delete(
    path: str,
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> None
```

Delete a single file. Raises FileNotFoundError if it doesn’t exist.

```python
await files.delete("temp/old_export.csv", location="temp")
```
```python
async def exists(
    path: str,
    location: Location = "workspace",
    mode: Mode = "cloud",
) -> bool
```

Returns True if the file exists at the given path, False otherwise.

```python
if await files.exists("data/customers.csv"):
    customers = await files.read("data/customers.csv")
```

Generate a presigned S3 URL so a client (browser, external service) can upload or download a file directly without proxying through Bifrost. Always cloud-mode and always scoped to the current org.

```python
async def get_signed_url(
    path: str,
    method: Literal["PUT", "GET"] = "PUT",
    content_type: str = "application/octet-stream",
) -> dict
```
| Parameter | Type | Description |
| --- | --- | --- |
| `path` | `str` | File path within the org's S3 namespace |
| `method` | `"PUT" \| "GET"` | `PUT` for uploads, `GET` for downloads |
| `content_type` | `str` | MIME type (used for `PUT` only) |

Returns:

| Key | Type | Description |
| --- | --- | --- |
| `url` | `str` | Presigned URL |
| `path` | `str` | Full scoped S3 path |
| `expires_in` | `int` | Lifetime in seconds |
```python
upload = await files.get_signed_url("incoming/photo.jpg", method="PUT", content_type="image/jpeg")
# Hand `upload["url"]` to the browser; it can PUT directly to S3.
```
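On the client side, consuming the returned dict amounts to issuing an HTTP PUT against `url` with the same `Content-Type` that was passed to `get_signed_url`. A minimal sketch using the standard library (the helper name is hypothetical; any HTTP client works):

```python
import urllib.request

def build_put_request(signed: dict, data: bytes,
                      content_type: str = "application/octet-stream") -> urllib.request.Request:
    """Build a PUT request for a presigned-URL dict like the one
    files.get_signed_url() returns (hypothetical helper, not part of the SDK)."""
    return urllib.request.Request(
        signed["url"],
        data=data,
        method="PUT",
        headers={"Content-Type": content_type},
    )
```

Send the request with `urllib.request.urlopen(...)` before `expires_in` seconds elapse; the `Content-Type` header must match the one the URL was signed with.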