R2 Presigned links not expiring

A group I do some development work with is looking to switch to R2 as we expect it will provide a much more performant and reliable solution than our current provider; however, I seem to have run into a blocking issue.

At the moment, any presigned URL I generate continues to work well after it should have expired. I’m wondering if anyone else has run into a similar issue and, if so, whether you managed to fix it?

Code snippet:

    private function R2Download($url, $endpoint) {
        // $url is the file we wish to download, eg: https://<account-id>.r2.cloudflarestorage.com/<bucket>/<key>
        // $endpoint is the URL without the key, eg: https://<account-id>.r2.cloudflarestorage.com/<bucket>
        $accountId = $this->R2['CF_ACCOUNT_ID'];

        $client = new Aws\S3\S3Client([
            'region' => 'auto',
            'endpoint' => "https://$accountId.r2.cloudflarestorage.com",
            'version' => 'latest',
            'credentials' => new Aws\Credentials\Credentials(
                $this->R2['CF_ACCESS_KEY'], $this->R2['CF_SECRET_KEY']
            )
        ]);

        $cmd = $client->getCommand('GetObject', [
            'Bucket' => $this->R2['CF_R2_BUCKET'],
            'Key' => urldecode(str_replace("$endpoint/", "", $url)) // remove the URL host and bucket from the string, eg: 'path/to/file'
        ]);

        $request = $client->createPresignedRequest($cmd, '+1 minute');
        $url = parse_url((string)$request->getUri());

        // Swap the storage host for our custom domain
        if (strpos($this->R2['CF_R2_CNAME'], "https://") === 0) {
            $cname = parse_url($this->R2['CF_R2_CNAME']);
        } else {
            $cname = parse_url("https://".$this->R2['CF_R2_CNAME']);
        }

        return sprintf("https://%s%s?%s", $cname['host'], $url['path'], $url['query']);
    }
The resulting URL includes a timestamp (X-Amz-Date=20221203T034958Z) and an expiry (X-Amz-Expires=60).
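For anyone following along, those two parameters are all that is needed to work out when the link should stop being honoured. A quick sketch in Python (the URL below is a made-up stand-in with the same parameters, not a real presigned link):

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlparse, parse_qs

# Hypothetical presigned URL carrying the same X-Amz-Date / X-Amz-Expires values
url = ("https://example.r2.cloudflarestorage.com/bucket/path/to/file"
       "?X-Amz-Date=20221203T034958Z&X-Amz-Expires=60&X-Amz-Signature=abc")

qs = parse_qs(urlparse(url).query)
signed_at = datetime.strptime(qs["X-Amz-Date"][0], "%Y%m%dT%H%M%SZ") \
                    .replace(tzinfo=timezone.utc)
expires_at = signed_at + timedelta(seconds=int(qs["X-Amz-Expires"][0]))

print(expires_at.isoformat())  # -> 2022-12-03T03:50:58+00:00
```

So this particular link should have stopped working one minute after signing, at 03:50:58 UTC.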

I can use the URL immediately to access the file. I then ran a series of tests over the next 10 minutes and was still able to start downloads with the same URL. Hoping someone can spot something stupid I’ve done, because this is the one thing stopping us from making the switch, and from the preliminary tests I’ve done, it’s a change we’re very excited about.


How exactly were you testing it? The original Cache-Control of the file is sent to you, so it’s entirely possible your browser is just caching the response.

I created a presigned URL, consumed it, then tried again five minutes later.

I hadn’t considered Cloudflare would be caching a private resource, so I went and put the domain into “Development Mode” which should bypass Cloudflare’s cache entirely. Five minutes later the link still works.

Cloudflare’s cache isn’t relevant here; it doesn’t run on the S3 API.

I’m referring to your browser’s own cache - try the URL on a different device, or clear your browser’s cache (F12 → Network → Disable Cache → refresh the page).
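Another way to take the browser (and any intermediate cache) out of the equation is to fetch the link from a script with caching explicitly disabled. A sketch in Python with a placeholder URL — once the link has genuinely expired, the request should come back as HTTP 403:

```python
import urllib.request
import urllib.error

def fetch_status(url: str) -> int:
    """Request the URL with caching disabled and return the HTTP status code."""
    req = urllib.request.Request(
        url,
        headers={"Cache-Control": "no-cache", "Pragma": "no-cache"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # Expired/invalid signatures surface as HTTP errors (typically 403)
        return e.code

# Usage (placeholder URL — substitute a presigned link that should have expired):
# print(fetch_status("https://files.example.com/path/to/file?X-Amz-Expires=60&..."))
```

If this still returns 200 after the expiry window, the problem is server-side enforcement, not client caching.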


So I tried that, and the file (200MB) still downloaded. I also gave the link to someone else roughly an hour after it should have expired, and they too were able to download the file through it.

I can confirm that this problem still exists. I created a presigned URL that is supposed to expire after 10 minutes. It has been more than 10 days and the link has still not expired.

I have tried different devices and internet connections.

Thanks for this. I was thinking about trying R2 again, but I see it’s still broken.

wow bump?

I’m seeing this as well, pretty consistently. Hard to believe such a gaping security hole has been open this long, though, so I will test further.

In the meantime, are you all still seeing this?