Upload files to R2 using Cloudflare Pages (browser JS file uploader + presigned URL from Functions)?

I’m exploring the possibility of implementing a file uploader on Cloudflare Pages. The basic idea is:

  • Browser JS file uploader
  • Generate a presigned URL from Functions

From my research so far, it seems that 1) the official AWS SDK for JS doesn’t work with Functions / Workers, and 2) we’ll need to use something like aws4fetch and deal with XML?

There’s no XML for PutObject?

You’d use that to generate the presigned URL in Functions and browsers can use the Fetch Web API as normal to upload to that presigned URL.
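
Roughly, the browser side could look something like this (the /presign endpoint and its JSON shape here are just placeholders for whatever your Function exposes):

  // Sketch only: assumes a (hypothetical) /presign Function endpoint that
  // responds with JSON like { "url": "<presigned URL>" }.
  async function uploadFile(file) {
    // 1. Ask the Pages Function for a presigned URL for this file name
    const res = await fetch('/presign?key=' + encodeURIComponent(file.name));
    const { url } = await res.json();

    // 2. PUT the file straight to R2 using that presigned URL
    const upload = await fetch(url, { method: 'PUT', body: file });
    if (!upload.ok) throw new Error('Upload failed: ' + upload.status);
  }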


Sorry… bad example URL… other API calls in the same document involve XML as input & output, e.g., ListBuckets.

Yes. The problem is that all the examples I could find on the internet use language-specific SDKs to generate presigned URLs (e.g., the Python SDK, the Node.js SDK…).

Is there any code snippet that works in Cloudflare Functions / Workers to generate presigned URLs for R2?

aws4fetch was only fixed fairly recently for presigned URLs since there was a bug.

It’s basically just…

import { AwsClient } from 'aws4fetch';

const aws = new AwsClient({
  accessKeyId: env.AWS_ACCESS_KEY_ID,
  secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
  service: 's3',
  region: 'auto',
});

const request = new Request('https://bucket.id.r2.cloudflarestorage.com/file.txt', {
  method: 'PUT',
});

// signQuery: true puts the signature in the query string,
// which is what makes this a presigned URL
const presigned = await aws.sign(request, { aws: { signQuery: true } });

// presigned.url is what you want

Written on mobile; might be syntax errors but you get the gist.
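
Spelled out a bit more, a Pages Function around it might look like this (the route name, query parameter, and env variable names are just placeholders, not anything official):

  // functions/presign.js -- sketch of a Pages Function returning a presigned PUT URL.
  import { AwsClient } from 'aws4fetch';

  export async function onRequestGet({ request, env }) {
    // Object key to sign for; falls back to a fixed name if none is given
    const key = new URL(request.url).searchParams.get('key') || 'file.txt';

    const aws = new AwsClient({
      accessKeyId: env.AWS_ACCESS_KEY_ID,
      secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
      service: 's3',
      region: 'auto',
    });

    // Sign a PUT request with the signature in the query string (presigned URL);
    // bucket URL placeholder as in the snippet above
    const presigned = await aws.sign(
      new Request(`https://bucket.id.r2.cloudflarestorage.com/${key}`, { method: 'PUT' }),
      { aws: { signQuery: true } }
    );

    return new Response(JSON.stringify({ url: presigned.url }), {
      headers: { 'Content-Type': 'application/json' },
    });
  }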


awesome!

Thanks for the tip! Will give it a try

So I ended up using aws-sdk v2 to generate the presigned URL in Functions:

  import * as AWS from 'aws-sdk';

  const accessKeyId = `${env.ACCESS_KEY_ID}`;
  const secretAccessKey = `${env.SECRET_ACCESS_KEY}`;
  const endpoint = `https://${env.ACCOUNT_ID}.r2.cloudflarestorage.com`;

  const s3 = new AWS.S3({
    region: 'auto',
    signatureVersion: 'v4', // R2 requires SigV4
    credentials: new AWS.Credentials(accessKeyId, secretAccessKey),
    endpoint: new AWS.Endpoint(endpoint),
  });

  // Presigned PUT URL, valid for 5 minutes
  const url = s3.getSignedUrl('putObject', {
    Bucket: 'some-bucket',
    Key: 'filename.jpg',
    Expires: 300,
  });

And I’m happy to report that I now face the same CORS issue as many others here 🙂

From my understanding, we still can’t use presigned URLs to upload objects to R2 via browser JS, because of CORS settings. Right?

Nope

Thanks for the response.

We’ll upload files to S3 for now. It should be easy to switch to R2 in the future if CORS is fixed 🙂

I would say that I’m very happy to have made the browser file uploader work with Cloudflare Pages & Functions today. A lot of potential to build incredible applications on this platform!

Which part is broken for you?

My “nope” was the answer to your “right?” - R2 supports PutBucketCors

Ah, I see. Thanks for the tip!

So I’m able to upload a file from browser js to R2 now! Very happy!

I ended up with a one-off ops script (not deployed to production on Pages) to set CORS rules.
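
A rough sketch of what such a script can look like, assuming aws-sdk v2 as in the snippet above (the bucket name and allowed origin are placeholders):

  // One-off script: set CORS rules on an R2 bucket via the S3-compatible API.
  const AWS = require('aws-sdk');

  const s3 = new AWS.S3({
    region: 'auto',
    signatureVersion: 'v4',
    credentials: new AWS.Credentials(process.env.ACCESS_KEY_ID, process.env.SECRET_ACCESS_KEY),
    endpoint: new AWS.Endpoint(`https://${process.env.ACCOUNT_ID}.r2.cloudflarestorage.com`),
  });

  s3.putBucketCors({
    Bucket: 'some-bucket',
    CORSConfiguration: {
      CORSRules: [
        {
          AllowedOrigins: ['https://example.com'], // your Pages site origin
          AllowedMethods: ['GET', 'PUT'],
          AllowedHeaders: ['*'],
          MaxAgeSeconds: 3600,
        },
      ],
    },
  }).promise().then(() => console.log('CORS rules set'));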

I should’ve read the documentation more carefully, or at least searched the docs for “CORS” 🙂

In case someone comes here from a Google search - here are the docs (scroll all the way to the bottom):


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.