Updating this for anyone who needs it. Cloud Storage is indeed the way to go if your media is in Google Drive and the files are above 100 MB.
Expanding on the reply that @renan gave above:
If your files were in a shared folder, as was my case, copy them to your own Drive as original files (not shortcuts) using Colab: https://webapps.stackexchange.com/a/141694
Upgrade your Drive storage first if you need to, so you don't run out of space mid-transfer.
Once they're in your own Drive, move them to a Cloud Storage bucket following this guide: https://medium.com/@philipplies/transferring-data-from-google-drive-to-google-cloud-storage-using-google-colab-96e088a8c041
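The transfer itself boils down to one recursive gsutil copy run from the notebook. A sketch of that step, with placeholder folder and bucket names, assuming you've already authenticated in Colab (e.g. via `google.colab.auth.authenticate_user()`):

```python
# Push the copied files from the mounted Drive folder to a Cloud Storage
# bucket. Bucket and folder names are placeholders.
import subprocess

def gsutil_copy_cmd(local_dir: str, bucket: str) -> list:
    """Build the parallel (-m) recursive (-r) gsutil copy command."""
    return ["gsutil", "-m", "cp", "-r", local_dir, f"gs://{bucket}/"]

def transfer(local_dir: str, bucket: str) -> None:
    # Requires gcloud credentials in the notebook session.
    subprocess.run(gsutil_copy_cmd(local_dir, bucket), check=True)

# transfer("/content/drive/MyDrive/Media", "my-media-bucket")
```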
Set your bucket to uniform bucket-level access so every object inherits the bucket's permissions, then grant the bucket public read access so the objects can be fetched: https://cloud.google.com/storage/docs/access-control/making-data-public#buckets
Each object is then assigned a public URL that Cloudflare Stream's URL uploader recognizes, and I'm sure they'll work programmatically through the API as well.
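For the programmatic route, the public URL follows the standard `storage.googleapis.com` pattern, and Cloudflare Stream can fetch it via its upload-via-link endpoint. A stdlib-only sketch; the account ID and API token are placeholders you'd fill in from your Cloudflare dashboard:

```python
# Build the public URL for a bucket object and hand it to Cloudflare
# Stream's link-upload endpoint. ACCOUNT_ID and API_TOKEN are placeholders.
import json
import urllib.parse
import urllib.request

def public_url(bucket: str, obj: str) -> str:
    """Public URL for an object in a publicly readable bucket."""
    return f"https://storage.googleapis.com/{bucket}/{urllib.parse.quote(obj)}"

def upload_to_stream(account_id: str, api_token: str, video_url: str) -> dict:
    """Ask Stream to copy the video from the given URL."""
    req = urllib.request.Request(
        f"https://api.cloudflare.com/client/v4/accounts/{account_id}/stream/copy",
        data=json.dumps({"url": video_url}).encode(),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# upload_to_stream(ACCOUNT_ID, API_TOKEN, public_url("my-bucket", "clips/video 1.mp4"))
```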
Or use file-specific permissions if your case calls for it. Regardless, remember to revoke the bucket's public access after uploading to Stream.
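The cleanup is the mirror image of the grant: drop the allUsers binding once Stream has finished pulling the files. A sketch, again with a placeholder bucket name:

```python
# Remove the allUsers read grant so the bucket is no longer public.
# Bucket name is a placeholder; requires gcloud credentials.
import subprocess

def revoke_public_cmd(bucket: str) -> list:
    # -d deletes the binding that make-public added.
    return ["gsutil", "iam", "ch", "-d", "allUsers:objectViewer", f"gs://{bucket}"]

def revoke_public(bucket: str) -> None:
    subprocess.run(revoke_public_cmd(bucket), check=True)
```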
This is a million times faster than downloading from Drive and uploading to Stream through the UI.
HTH, and thanks again @renan for pointing me in the right direction.