Can't list bucket objects in Node.js with @aws-sdk/client-s3

For Workers & Pages, what is the name of the domain?

localhost:3000

What is the error number?

CORS error

What is the error message?

R2 access by S3Client in Node.js frontend blocked by CORS policy: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource.

What is the issue or error you’re encountering?

When trying to list the objects in a bucket with S3.send(new ListObjectsV2Command({ Bucket: bucketId })), I get a CORS error on the preflight request.

What are the steps to reproduce the issue?

The code is the following:

import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

const S3 = new S3Client({
  region: "auto",
  endpoint: `https://${ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: accessKeyId,
    secretAccessKey: secretAccessKey,
  },
});

console.log(
  await S3.send(
    new ListObjectsV2Command({
      Bucket: bucketId,
    })
  )
);

Now, this returns an error:

Access to fetch at 'https://bucketId.userId.r2.cloudflarestorage.com/?list-type=2' from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

This is the CORS configuration of the bucket:

[
  {
    "AllowedOrigins": [
      "*"
    ],
    "AllowedMethods": [
      "GET",
      "PUT",
      "DELETE",
      "HEAD",
      "POST"
    ],
    "AllowedHeaders": [
      "*"
    ]
  }
]

I tried doing the same with Postman: in Authorization I set up AWS Signature with the accessKey, the secretKey, AWS Region set to auto and Service Name set to S3. I got the same result, but if I change the URL to https://userId.eu.r2.cloudflarestorage.com/bucketId/?list-type=2 then it works. The problem is that S3Client always sends the request to the other URL, and I haven’t found a way to change that.
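
For what it’s worth, the @aws-sdk/client-s3 client config does document a forcePathStyle flag that is supposed to make the SDK build path-style URLs like the one that works in Postman. A minimal sketch, reusing the same variables as the snippet above (I haven’t verified that this also clears the preflight error against R2):

import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

// Same client as before, but asking the SDK for path-style URLs
// (https://<account>.r2.cloudflarestorage.com/<bucket>/...) instead of
// virtual-hosted-style (https://<bucket>.<account>.r2.cloudflarestorage.com/...).
const S3 = new S3Client({
  region: "auto",
  endpoint: `https://${ACCOUNT_ID}.r2.cloudflarestorage.com`,
  forcePathStyle: true,
  credentials: {
    accessKeyId: accessKeyId,
    secretAccessKey: secretAccessKey,
  },
});

console.log(await S3.send(new ListObjectsV2Command({ Bucket: bucketId })));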

So I took the axios request generated by Postman and copy-pasted it into my Node.js frontend:

const axios = require('axios');

let config = {
  method: 'get',
  maxBodyLength: Infinity,
  url: 'https://userId.eu.r2.cloudflarestorage.com/bucketId/?list-type=2',
  headers: { 
    'X-Amz-Content-Sha256': 'somethingHere', 
    'X-Amz-Date': '20240708T130444Z', 
    'Authorization': 'AWS4-HMAC-SHA256 Credential=keyAccessId/20240708/auto/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=somethingElseHere'
  }
};

axios.request(config)
.then((response) => {
  console.log(JSON.stringify(response.data));
})
.catch((error) => {
  console.log(error);
});

And this DOES work, with no CORS error. So it seems clear that S3Client is not querying the API endpoint that works, and I haven’t found a way to change that. The problem is that I can’t generate these headers myself in my own code without copy-pasting them from Postman (which of course is not an option for me). So any idea on how to circumvent this issue would be helpful.

Just to be clear, the main issue here seems to be: “S3Client not querying the correct API endpoint”. But there might be another issue as well, which is that the URL the SDK uses does not return the CORS headers correctly, probably because it’s a redirect.
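
One way to check that is to replay the browser’s preflight by hand and compare what the two URL styles answer. A rough sketch using axios (the hostnames are the same placeholders as above):

import axios from 'axios';

// Replays the preflight (OPTIONS) request the browser sends, against both URL
// styles, and prints whether Access-Control-Allow-Origin comes back.
async function checkPreflight(url: string) {
  const res = await axios.request({
    method: 'options',
    url: url,
    headers: {
      'Origin': 'http://localhost:3000',
      'Access-Control-Request-Method': 'GET',
      'Access-Control-Request-Headers': 'authorization,x-amz-content-sha256,x-amz-date',
    },
    validateStatus: () => true, // only the headers matter here, not the status code
  });
  console.log(url, res.status, res.headers['access-control-allow-origin']);
}

await checkPreflight('https://bucketId.userId.r2.cloudflarestorage.com/?list-type=2');
await checkPreflight('https://userId.eu.r2.cloudflarestorage.com/bucketId/?list-type=2');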

I spent all day trying to list the bucket contents with no luck, so I’d appreciate any real solution, such as generating the S3 authorization headers myself correctly in Node.js + TypeScript (which I haven’t been able to do yet; every time, R2 complains that the signature doesn’t match).

Apologies for the formatting, but this is the best I can do in this forum without markdown support.

I managed to work around this issue by generating the AWS signature myself and ditching the @aws-sdk/client-s3 package.

import * as CryptoJS from 'crypto-js';
import axios from 'axios';

// Builds SigV4 headers by hand for a request with an empty body.
// Note: x-amz-security-token is always part of the signed headers here,
// so this variant assumes temporary credentials with a session token.
export function generateAWSHeaders(
  url: string,
  method: string,
  service: string,
  region: string,
  accessKey: string,
  secretKey: string,
  sessionToken: string
) {
  const host = new URL(url).hostname;
  const amzDate = new Date().toISOString().replace(/[:-]/g, '').replace(/\.\d{3}/, '');
  const dateStamp = amzDate.slice(0, 8);
  const canonicalUri = new URL(url).pathname;
  const canonicalQueryString = new URL(url).searchParams.toString();
  const payloadHash = CryptoJS.SHA256('').toString(); // SHA-256 of the empty payload
  const canonicalHeaders = `host:${host}\nx-amz-content-sha256:${payloadHash}\nx-amz-date:${amzDate}\nx-amz-security-token:${sessionToken}\n`;
  const signedHeaders = 'host;x-amz-content-sha256;x-amz-date;x-amz-security-token';
  const canonicalRequest = `${method}\n${canonicalUri}\n${canonicalQueryString}\n${canonicalHeaders}\n${signedHeaders}\n${payloadHash}`;

  const algorithm = 'AWS4-HMAC-SHA256';
  const credentialScope = `${dateStamp}/${region}/${service}/aws4_request`;
  const stringToSign = `${algorithm}\n${amzDate}\n${credentialScope}\n${CryptoJS.SHA256(canonicalRequest).toString()}`;

  const signingKey = getSignatureKey(secretKey, dateStamp, region, service);
  const signature = CryptoJS.HmacSHA256(stringToSign, signingKey).toString();

  const authorizationHeader = `${algorithm} Credential=${accessKey}/${credentialScope}, SignedHeaders=${signedHeaders}, Signature=${signature}`;

  return {
    'X-Amz-Security-Token': sessionToken,
    'X-Amz-Date': amzDate,
    'Authorization': authorizationHeader,
    'X-Amz-Content-Sha256': payloadHash
  };
}
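
// getSignatureKey isn't shown in the original snippet; assuming the standard
// SigV4 signing-key derivation (HMAC chain over date, region, service and the
// literal "aws4_request"), it would look like this:
function getSignatureKey(secretKey: string, dateStamp: string, region: string, service: string) {
  const kDate = CryptoJS.HmacSHA256(dateStamp, 'AWS4' + secretKey);
  const kRegion = CryptoJS.HmacSHA256(region, kDate);
  const kService = CryptoJS.HmacSHA256(service, kRegion);
  return CryptoJS.HmacSHA256('aws4_request', kService);
}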

export function getBucketObjects(bucket: string, accessKeyId: string, secretAccessKey: string, sessionToken: string): Promise<any> {
  const url: string = `https://${import.meta.env.VITE_APP_R2_ACCOUNT_ID}.eu.r2.cloudflarestorage.com/${bucket}/?list-type=2`;
  const headers = generateAWSHeaders(
    url,
    'GET',
    's3',
    'auto',
    accessKeyId,
    secretAccessKey,
    sessionToken //Only when using temporary credentials
  );
  let config = {
    method: 'get',
    maxBodyLength: Infinity,
    url: url,
    headers: headers
  }
  return new Promise((resolve, reject) => {
    axios.request(config)
    .then((response) => {
      resolve(response.data);
    })
    .catch((error) => {
      console.log(error);
      reject(error);
    });
  });
}
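
Called from the frontend it looks roughly like this (the credential variable names are placeholders); note that the resolved value is the raw ListObjectsV2 XML body, which still needs to be parsed:

// Example call with placeholder credentials; the resolved value is the raw
// ListObjectsV2 XML string returned by R2.
const listing = await getBucketObjects(bucketId, accessKeyId, secretAccessKey, sessionToken);
console.log(listing);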

Hopefully this is helpful for someone else. In any case, there’s still a big issue with the bucket not being accessible through the @aws-sdk/client-s3 package.