I am experimenting with Cloudflare R2 for object storage and ran into the limit on uploading files larger than 300 MB from the dashboard. The dashboard prompt says that files larger than 300 MB can be uploaded using the S3-compatible API or Workers.
I am using boto3 in Python. get_object works fine, but put_object fails with
ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
when I try to upload a file to R2. I also tested put_object with a file smaller than 300 MB and it was still denied.
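(For the large files mentioned above: my understanding is that once PutObject is allowed, boto3's upload_file can handle files over the 300 MB dashboard limit by itself, since it switches to a multipart upload past a size threshold. This is only a sketch, with a made-up file name large_file.bin and the same placeholder credentials as in my code further down:)

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client(
    's3',
    endpoint_url=<url>,
    aws_access_key_id=<key_id>,
    aws_secret_access_key=<access_key>,
)

# Split anything above 100 MB into 100 MB parts, so large files
# go up as a multipart upload instead of one huge PUT.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,
    multipart_chunksize=100 * 1024 * 1024,
)

s3.upload_file('large_file.bin', 'test_bucket', 'large_file.bin', Config=config)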
Also, I have added a CORS policy to the bucket, as below:
[
  {
    "AllowedOrigins": [
      "http://localhost:3000"
    ],
    "AllowedMethods": [
      "GET",
      "PUT",
      "POST",
      "DELETE",
      "HEAD"
    ]
  }
]
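(If it matters for reproducing this, the same rules can also be expressed through boto3's put_bucket_cors instead of the dashboard JSON; this is just a sketch of what I believe the equivalent call looks like, not something I have verified against R2:)

import boto3

s3 = boto3.client('s3',
                  endpoint_url=<url>,
                  aws_access_key_id=<key_id>,
                  aws_secret_access_key=<access_key>)

# Mirror of the JSON rules above, applied via the S3 API.
s3.put_bucket_cors(
    Bucket='test_bucket',
    CORSConfiguration={
        'CORSRules': [
            {
                'AllowedOrigins': ['http://localhost:3000'],
                'AllowedMethods': ['GET', 'PUT', 'POST', 'DELETE', 'HEAD'],
            }
        ]
    },
)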
and here is the code I am using (the get_object call works; put_object is the one that fails):
import boto3

s3 = boto3.client(
    's3',
    endpoint_url=<url>,
    aws_access_key_id=<key_id>,
    aws_secret_access_key=<access_key>,
)

bucket = 'test_bucket'
key = 'test.csv'

# Fails with AccessDenied
with open('test.csv', 'rb') as f:
    s3.put_object(Body=f, Bucket=bucket, Key=key)

# s3.get_object(Bucket=bucket, Key=key)  <- this is working
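(For debugging, wrapping the call in a try/except on botocore's ClientError shows the exact error code and message R2 returns; a minimal sketch reusing the s3 client, bucket, and key from above:)

from botocore.exceptions import ClientError

try:
    with open('test.csv', 'rb') as f:
        s3.put_object(Body=f, Bucket=bucket, Key=key)
except ClientError as err:
    # Prints e.g. "AccessDenied: Access Denied"
    print(err.response['Error']['Code'] + ': ' + err.response['Error']['Message'])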