How to Integrate Cloudflare R2 with Astro: Complete Guide
Step-by-step guide to integrating Cloudflare R2 with your Astro website.
Cloudflare R2 is object storage that is S3-compatible but without egress fees. That last part matters a lot. With AWS S3, every byte your users download costs you money. R2 charges nothing for data transfer out. For image-heavy Astro blogs, portfolios, or media sites, this can save hundreds of dollars a month at scale.
R2 works through the S3 API, which means any S3-compatible library works with it. You store files in R2 buckets, serve them through Cloudflare's CDN, and your Astro site references them by URL. No vendor lock-in, no proprietary SDKs required.
Prerequisites
- Node.js 18+
- An Astro project (npm create astro@latest)
- A Cloudflare account with R2 enabled (free tier includes 10GB storage and 10 million reads/month)
- An R2 bucket created in the Cloudflare dashboard
Installation
Install the AWS S3 SDK (works with R2 since it is S3-compatible):
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
Configuration
Create an R2 API token in the Cloudflare dashboard under R2 > Manage R2 API Tokens. You need:
- Access Key ID
- Secret Access Key
- Account ID (found in your Cloudflare dashboard sidebar)
- Bucket name
Add them to your .env:
R2_ACCESS_KEY_ID=your_access_key
R2_SECRET_ACCESS_KEY=your_secret_key
R2_ACCOUNT_ID=your_account_id
R2_BUCKET_NAME=your_bucket_name
R2_PUBLIC_URL=https://pub-xxxx.r2.dev
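Before wiring up the client, it can help to fail fast when any of these variables is missing, since a typo in .env otherwise surfaces as a confusing SDK error. A minimal sketch (missingR2Vars is a hypothetical helper; pass import.meta.env in Astro, or process.env in plain Node):

```typescript
// Sketch: report which required R2 environment variables are missing.
const REQUIRED_R2_VARS = [
  "R2_ACCESS_KEY_ID",
  "R2_SECRET_ACCESS_KEY",
  "R2_ACCOUNT_ID",
  "R2_BUCKET_NAME",
  "R2_PUBLIC_URL",
] as const;

export function missingR2Vars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_R2_VARS.filter((name) => !env[name]);
}
```

Call it once at startup and throw if the returned list is non-empty.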
Create the R2 client:
// src/lib/r2.ts
import { S3Client } from "@aws-sdk/client-s3";
export const r2Client = new S3Client({
region: "auto",
endpoint: `https://${import.meta.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
credentials: {
accessKeyId: import.meta.env.R2_ACCESS_KEY_ID,
secretAccessKey: import.meta.env.R2_SECRET_ACCESS_KEY,
},
});
export const BUCKET_NAME = import.meta.env.R2_BUCKET_NAME;
export const PUBLIC_URL = import.meta.env.R2_PUBLIC_URL;
Enabling Public Access
By default, R2 buckets are private. To serve files publicly:
- Go to your R2 bucket in the Cloudflare dashboard
- Click Settings > Public Access
- Enable the r2.dev subdomain or connect a custom domain
The custom domain approach is better for production. Under your bucket settings, add a custom domain like media.yourdomain.com. Cloudflare handles the SSL and CDN caching automatically.
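Whether you use the r2.dev subdomain or a custom domain, an object's public URL is just the base URL plus its key. A small hedged helper (publicUrl is a hypothetical name) that normalizes stray slashes when joining the two:

```typescript
// Sketch: join a public base URL and an object key into a clean URL.
export function publicUrl(base: string, key: string): string {
  // Strip trailing slashes from the base and leading slashes from the key
  // so the join never produces "//" or drops the separator.
  return `${base.replace(/\/+$/, "")}/${key.replace(/^\/+/, "")}`;
}
```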
Uploading Files
Create a utility function for uploads:
// src/lib/r2.ts (add to existing file)
import { PutObjectCommand, DeleteObjectCommand } from "@aws-sdk/client-s3";
export async function uploadToR2(
file: Buffer,
key: string,
contentType: string
): Promise<string> {
await r2Client.send(
new PutObjectCommand({
Bucket: BUCKET_NAME,
Key: key,
Body: file,
ContentType: contentType,
})
);
return `${PUBLIC_URL}/${key}`;
}
export async function deleteFromR2(key: string): Promise<void> {
await r2Client.send(
new DeleteObjectCommand({
Bucket: BUCKET_NAME,
Key: key,
})
);
}
API Route for Image Uploads
Create an API endpoint in your Astro SSR app for handling uploads:
// src/pages/api/upload.ts
import type { APIRoute } from "astro";
import { uploadToR2 } from "../../lib/r2";
import { randomUUID } from "crypto";
export const POST: APIRoute = async ({ request }) => {
try {
const formData = await request.formData();
const file = formData.get("file") as File;
if (!file) {
return new Response(JSON.stringify({ error: "No file provided" }), {
status: 400,
});
}
const allowedTypes = ["image/jpeg", "image/png", "image/webp", "image/avif"];
if (!allowedTypes.includes(file.type)) {
return new Response(JSON.stringify({ error: "Invalid file type" }), {
status: 400,
});
}
// Derive the extension from the validated MIME type, not the
// user-supplied filename, so keys stay predictable
const extByType: Record<string, string> = {
"image/jpeg": "jpg",
"image/png": "png",
"image/webp": "webp",
"image/avif": "avif",
};
const key = `uploads/${randomUUID()}.${extByType[file.type]}`;
const buffer = Buffer.from(await file.arrayBuffer());
const url = await uploadToR2(buffer, key, file.type);
return new Response(JSON.stringify({ url, key }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
} catch (error) {
console.error("Upload error:", error);
return new Response(JSON.stringify({ error: "Upload failed" }), {
status: 500,
});
}
};
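From the browser, the route above is called with a multipart form POST. A hedged sketch of the client side (uploadViaApi is a hypothetical helper; the fetch function is injected so the flow can be exercised without a running server):

```typescript
// Hypothetical browser-side caller for the /api/upload route.
type FetchLike = (url: string, init?: any) => Promise<any>;

export async function uploadViaApi(
  file: Blob,
  filename: string,
  fetchImpl: FetchLike = (globalThis as any).fetch
): Promise<{ url: string; key: string }> {
  const form = new FormData();
  // The field name "file" must match formData.get("file") in the API route.
  form.append("file", file, filename);

  const res = await fetchImpl("/api/upload", { method: "POST", body: form });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res.json();
}
```

Note that no Content-Type header is set manually: the browser adds the correct multipart boundary itself.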
Using R2 Images in Blog Posts
Reference your R2-hosted images in MDX frontmatter:
---
heroImage: "https://media.yourdomain.com/uploads/hero-image.webp"
heroImageAlt: "A description of the image"
---
Or inline in your MDX content:
![A description of the image](https://media.yourdomain.com/uploads/hero-image.webp)
Presigned URLs for Direct Uploads
For larger files, let the browser upload directly to R2 instead of going through your server:
// src/lib/r2.ts (add to existing file)
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
export async function getPresignedUploadUrl(
key: string,
contentType: string
): Promise<string> {
const command = new PutObjectCommand({
Bucket: BUCKET_NAME,
Key: key,
ContentType: contentType,
});
return getSignedUrl(r2Client, command, { expiresIn: 3600 });
}
The frontend requests a presigned URL from your API, then uploads directly to R2. This keeps large files off your server.
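The two-step flow can be sketched as follows. This assumes a /api/presign route (not shown above) that calls getPresignedUploadUrl and returns the URL as JSON; directUpload is a hypothetical helper, and the fetch function is injected for testability:

```typescript
// Hypothetical browser-side flow: get a presigned URL, then PUT straight to R2.
type FetchLike = (url: string, init?: any) => Promise<any>;

export async function directUpload(
  bytes: Uint8Array,
  contentType: string,
  key: string,
  fetchImpl: FetchLike = (globalThis as any).fetch
): Promise<string> {
  // Step 1: request a presigned PUT URL from our own API.
  const res = await fetchImpl("/api/presign", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ key, contentType }),
  });
  const { uploadUrl } = await res.json();

  // Step 2: upload directly to R2 -- the file bytes never touch our server.
  const put = await fetchImpl(uploadUrl, {
    method: "PUT",
    headers: { "Content-Type": contentType },
    body: bytes,
  });
  if (!put.ok) throw new Error(`Direct upload failed with status ${put.status}`);
  return key;
}
```

The Content-Type on the PUT must match the one the presigned URL was generated for, or R2 rejects the request.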
Production Tips
- Organize with prefixes. Use folder-like prefixes in your keys, such as blog-images/2025/01/hero.webp. R2 does not have real folders, but prefixes keep things organized and make cleanup easier.
- Set cache headers. When uploading, set CacheControl: "public, max-age=31536000, immutable" for static assets. Cloudflare's CDN respects these headers and caches accordingly.
- Use image transforms. If image transformations are enabled for your zone, Cloudflare can resize and optimize images on the fly when they are served through a custom domain. Use the /cdn-cgi/image/width=800,format=webp/<image-url> URL format to transform at the edge.
- Monitor usage. The Cloudflare dashboard shows R2 storage and operation counts. The free tier is generous, but keep an eye on Class A (writes) and Class B (reads) operations.
- Back up your bucket. R2 supports lifecycle rules for expiring objects. For critical media assets, keep a copy elsewhere (for example, sync to a second bucket with rclone) so accidental deletions can be recovered.
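Transform options ride in the URL itself, so a small builder keeps them consistent across the site. A hedged sketch (transformUrl is a hypothetical helper; confirm the option names against Cloudflare's image transformation docs):

```typescript
// Sketch: build a Cloudflare image-transform URL of the form
// https://<zone>/cdn-cgi/image/<options>/<source-image-url>
export function transformUrl(
  zone: string,
  imageUrl: string,
  opts: { width?: number; format?: string; quality?: number }
): string {
  // Serialize only the options that were actually provided, as "key=value"
  // pairs joined with commas, per Cloudflare's URL option syntax.
  const options = Object.entries(opts)
    .filter(([, value]) => value !== undefined)
    .map(([key, value]) => `${key}=${value}`)
    .join(",");
  return `${zone.replace(/\/+$/, "")}/cdn-cgi/image/${options}/${imageUrl}`;
}
```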
Alternatives to Consider
- AWS S3 + CloudFront if you need more advanced features like event notifications and Lambda triggers, and egress costs are not a concern.
- Bunny CDN Storage if you want even simpler pricing with a pull zone CDN included.
- Uploadthing if you want a managed upload service with built-in file validation and do not need raw S3 access.
Wrapping Up
Cloudflare R2 solves the biggest pain point of object storage: unpredictable egress costs. For Astro sites that serve a lot of media, the zero egress pricing combined with Cloudflare's global CDN makes R2 the most cost-effective option. The S3-compatible API means you can switch to or from R2 without rewriting your code. Set up a bucket, point your custom domain at it, and your Astro site has a fast, cheap media backend.