Syncing an R2 bucket with a GitHub repo

For Workers & Pages, what is the name of the domain?

What is the error number?

no error, suggestions on design requested

What is the error message?

no error

What is the issue or error you’re encountering?

Finding the best way to synchronize an R2 bucket with a GitHub repo

What steps have you taken to resolve the issue?

I have a math website, interactablemath.org (still small and starting out), and I want to move its static files to Cloudflare. The static files consist of both static code (JS/CSS) and binaries (MP3s, some images, PDFs, a few tiny MP4s under 60 MB). The static code is in a public GitHub repo and the binaries are in a separate private GitHub repo. What I want is for the R2 buckets to update automatically every time there is a commit to a repo; in other words, to add, update, or delete files in R2 to match. The buckets would not contain the repositories themselves, just a collection of the latest files checked in.

My thought is that I can do this with GitHub Actions: a workflow YAML file using Cloudflare API tokens stored as repo secrets, plus r2-upload-action. Since I plan to serve the R2 buckets publicly from a custom subdomain, I think I don’t need Cloudflare Workers. Is this a good approach, or is there a better way? I’m surprised that no one has published the code for this; once I get it working I will publish mine to save others the effort. Thanks for your insights!
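To make this concrete, here is the rough shape of what I have in mind. This sketch sidesteps r2-upload-action and instead talks to R2’s S3-compatible endpoint with the AWS CLI, because `aws s3 sync --delete` also handles deletions, which a plain upload would not. It is untested, and the bucket name, source directory, and secret names are placeholders I made up:

```yaml
# .github/workflows/sync-r2.yml -- untested sketch; names are placeholders
name: Sync static files to R2

on:
  push:
    branches: [main]

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # R2 speaks the S3 API, so the AWS CLI preinstalled on the runner
      # can mirror a directory, including deletions, with `s3 sync --delete`.
      - name: Sync to R2
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.R2_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.R2_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: auto
        run: |
          aws s3 sync ./static "s3://my-static-bucket" \
            --endpoint-url "https://${{ secrets.R2_ACCOUNT_ID }}.r2.cloudflarestorage.com" \
            --delete
```

The access key pair would come from an R2 API token created in the Cloudflare dashboard, and `auto` is the region R2 expects; depending on the AWS CLI version, the sync flags may need adjusting for R2’s level of S3 compatibility.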

What are the steps to reproduce the issue?

still in the initial stages of determining the best approach

Hello. Are you still encountering this issue? If not, how did you fix it? If so, what have you tried?

It seems the Cloudflare solution for syncing an R2 bucket to GitHub is to clone the repository into the bucket. Behind the scenes, I believe the bucket’s copy of the repo is then updated on every commit to the GitHub repo. However, I only need a few files from the repo, and only their most recent versions. Cloning the entire repo would be a factor of 10 larger than what I need and unnecessary, though I can see that it is easier to implement from Cloudflare’s point of view.
I am instead pursuing a GitHub Actions workflow (a YAML file on the GitHub side) that collects the files changed by a commit and uses r2-upload-action to write the new files into my R2 bucket, roughly along the lines of the sketch below.
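A rough, untested sketch of that workflow: it copies the files touched by the last commit into a staging directory and hands that to r2-upload-action. The secret names and paths are placeholders, the action’s input names are my reading of its README (so double-check them against the action’s docs), and this handles adds and updates but not deletions, which would need a separate step:

```yaml
# .github/workflows/upload-changed.yml -- untested sketch; paths, secret
# names, and action inputs should be checked against the real repo/docs
name: Upload changed files to R2

on:
  push:
    branches: [main]

jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 2   # need the parent commit to diff against

      # Copy only files added/changed/renamed by this commit into a
      # staging directory, preserving their relative paths.
      - name: Collect changed files
        run: |
          mkdir -p staged
          git diff --name-only --diff-filter=ACMR HEAD^ HEAD \
            | while read -r f; do
                mkdir -p "staged/$(dirname "$f")"
                cp "$f" "staged/$f"
              done

      - name: Upload to R2
        uses: ryand56/r2-upload-action@latest
        with:
          r2-account-id: ${{ secrets.R2_ACCOUNT_ID }}
          r2-access-key-id: ${{ secrets.R2_ACCESS_KEY_ID }}
          r2-secret-access-key: ${{ secrets.R2_SECRET_ACCESS_KEY }}
          r2-bucket: ${{ secrets.R2_BUCKET }}
          source-dir: staged
          destination-dir: .
```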
What do you think of this approach?
Am I correct in assuming that once the files are in R2, they will be served from a public custom subdomain without my writing additional Cloudflare Workers? Is this considered standard good practice, or does it have security holes?
Thanks
