Connecting to Google Storage

These are the steps that have worked for me to pull a file from Google’s Cloud Storage. There might be an easier way, which I would love to hear about!

What worked
Loading the files via their URL (the same path the browser uses in the Cloud Console) and authorizing with an OAuth access token.

Why can’t I just use the Google Cloud SDK?
Sadly, the SDK has to load its credentials from a JSON file on disk, and there is no way to pass them in directly, which rules that option out in a Worker.

Things I’ve tried but didn’t work

  • The DroneDeploy and Cloudflare examples – I couldn’t find the right things to configure on my account, or enough was stripped out of the examples that they didn’t work for me.
  • Service account authorization without OAuth – the private key in the service account’s JSON file wouldn’t import with crypto.subtle.importKey. It might work if Cloudflare implemented the ability to import a pkcs8 key, which would greatly simplify the steps to connect (see the sketch after this list).
  • Using the JSON API with the auth below – I probably had the URLs wrong. Right now I just need to read the file; when I get to editing files, that might have to change.
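
For what it’s worth, here is roughly what that simpler path would look like if the runtime supported pkcs8 imports. This is a hypothetical sketch, not working code; the pemToArrayBuffer helper and the privatePem variable are my own placeholders:

// HYPOTHETICAL: only works where crypto.subtle.importKey accepts 'pkcs8'
// privatePem would be the private_key field from the service account JSON file
function pemToArrayBuffer(privatePem) {
  const base64 = privatePem
    .replace('-----BEGIN PRIVATE KEY-----', '')
    .replace('-----END PRIVATE KEY-----', '')
    .replace(/\s/g, '')
  const binary = atob(base64)
  const bytes = new Uint8Array(binary.length)
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i)
  return bytes.buffer
}

const key = await crypto.subtle.importKey(
  'pkcs8',
  pemToArrayBuffer(privatePem),
  { name: 'RSASSA-PKCS1-v1_5', hash: { name: 'SHA-256' } },
  false,
  ['sign'],
)

That would skip the P12 and JWK conversion steps entirely.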

The code

This should work once the few placeholder variables below are replaced with real values.

// Build the JWT header for the OAuth request
const GOOGLE_KEY_HEADER = objectToBase64url({
  alg: 'RS256',
  typ: 'JWT',
})

// Determine the issue and expiration date for the claimset
const iat = Math.round(Date.now() / 1000)
// Expires in an hour (that is the max allowed)
const exp = iat + 3600

// Generate the claimset payload
const claimset = objectToBase64url({
  iss: 'YOUR SERVICE ACCOUNT EMAIL from Step 1a or 1b below',
  // Grab the scope from https://cloud.google.com/storage/docs/authentication#oauth-scopes
  scope: 'https://www.googleapis.com/auth/devstorage.read_write',
  aud: 'https://www.googleapis.com/oauth2/v4/token',
  exp,
  iat,
})

// TODO: use the JSON object from Step 4 below.
const jwk = {} 

// Import the key into a CryptoKey object
// The key is non-extractable and can only be used for signing
const key = await crypto.subtle.importKey(
  'jwk',
  {
    ...jwk,
    // Add alg: 'RS256' to it
    alg: 'RS256',
  },
  {
    name: 'RSASSA-PKCS1-v1_5',
    hash: {
      name: 'SHA-256',
    },
  },
  false,
  ['sign'],
)

// Sign the header and claimset 
const rawToken = await crypto.subtle.sign(
  { name: 'RSASSA-PKCS1-v1_5' },
  key,
  new TextEncoder().encode(`${GOOGLE_KEY_HEADER}.${claimset}`),
)

// Convert the signature to Base64URL format
const token = arrayBufferToBase64Url(rawToken)

// Make the OAuth token request
const response = await fetch('https://www.googleapis.com/oauth2/v4/token', {
  method: 'POST',
  headers: new Headers({
    'Content-Type': 'application/json',
  }),
  body: JSON.stringify({
    grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer',
    assertion: `${GOOGLE_KEY_HEADER}.${claimset}.${token}`,
  }),
})

// Grab the JSON from the response
const oauth = await response.json()

// Looks like:
// {
//   access_token:
//     'LONG STRING',
//   expires_in: 3600,
//   token_type: 'Bearer',
// }

const bucketName = 'bucket-name'
const filepath = 'path/to/myfile.json'

const responseFromGoogle = await fetch(
  `https://storage.googleapis.com/${bucketName}/${filepath}`, 
  {
    method: 'GET',
    headers: new Headers({
      Authorization: `${oauth.token_type} ${oauth.access_token}`,
    }),
  })
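
// Read the file contents. (Assuming the object is JSON, as in the example
// path above; use .text() or .arrayBuffer() for other content types.)
const myFile = await responseFromGoogle.json()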

// Go forth and create something awesome

/**
 * Helper methods for getting things to/from base64url and array buffers
 */
function objectToBase64url(payload) {
  return arrayBufferToBase64Url(
    new TextEncoder().encode(JSON.stringify(payload)),
  )
}

function arrayBufferToBase64Url(buffer) {
  return btoa(String.fromCharCode(...new Uint8Array(buffer)))
    .replace(/=/g, '')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
}

Note

Steps to get a JSON Web Key (JWK) for a Google Service Account

Step 1a - Create a service account

  • Sign into the Google Cloud Console
  • Go to IAM & admin > Service accounts

    https://console.cloud.google.com/iam-admin/serviceaccounts

  • Click the create service account button
  • Enter a name
  • Choose a role. Under Storage, I chose Storage Object Creator, which gives the Worker rights to create and edit objects, but not to delete them or change any bucket settings.
  • There is no need to check Furnish a new private key, unless you also want to use this service account with the Google SDKs.
  • Click the Save button
  • Note the email for the new account

Step 1b - Use an existing service account

Step 2 - Create a P12 key

  • Click the edit button
  • Click the create a key button
  • Choose P12 and click the create button

    Google will download the certificate; save it for later. The password notasecret will be needed in a later step.

  • Click close

Step 3 - Convert the P12 key to an RSA Private key

Thanks to Google for providing a P12 to PEM conversion tool
GitHub - googleapis/google-p12-pem: Convert Google .p12 keys to .pem keys.

  • Install the package
    npm install google-p12-pem
    
  • And convert
    ./node_modules/google-p12-pem/build/src/bin/gp12-pem.js PATH/TO/KEY.p12 > private.pem
    

Note: there is probably a better way to run this.

Step 4 - Convert the RSA Private key to JWK

Thanks to Danny Coates for creating a package to do the heavy lifting.
GitHub - dannycoates/pem-jwk

  • Install the package
    npm install pem-jwk
    
  • And convert
    ./node_modules/pem-jwk/bin/pem-jwk PEM/FROM/STEP3/private.pem > private.json
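
The resulting private.json should look something like this (values shortened, obviously not real); it is the object that goes into the jwk constant in the code above:

const jwk = {
  kty: 'RSA',
  n: '...long base64url-encoded modulus...',
  e: 'AQAB',
  d: '...private exponent...',
  p: '...',
  q: '...',
  dp: '...',
  dq: '...',
  qi: '...',
}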
    

This is super useful, thanks for sharing!

Thank you so much for this, it’s super helpful and saved me hours of time. I really appreciate the step-by-step directions.

I had to make a small change to get step 4 to work:

./node_modules/pem-jwk/bin/pem-jwk.js PEM/FROM/STEP3/private.pem > private.json

(adding the .js extension)

This solution helped me solve a big problem, thank you very much.

This is a great article.

One question would be, could we leverage the Cloudflare Cache once we get a response from Google Storage?

As far as I see in this case, the worker will always call the storage bucket on every request. Would be nice if we could cache the response somehow.

@matthieu.lachance Correct me if I’m wrong, but I guess you could set cacheTtl to a few thousand seconds in the fetch settings, as per Cache Using Fetch.
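
Something like this, if I’m reading the docs right (the TTL value and the cacheEverything flag below are just guesses, not tested):

const responseFromGoogle = await fetch(
  `https://storage.googleapis.com/${bucketName}/${filepath}`,
  {
    headers: new Headers({
      Authorization: `${oauth.token_type} ${oauth.access_token}`,
    }),
    // Cloudflare-specific options described in Cache Using Fetch
    cf: {
      cacheTtl: 3600,        // cache the object at the edge for an hour
      cacheEverything: true, // cache it even if Google sends no-cache headers
    },
  },
)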

I confirm this upload method works.

Big thanks to webchad :grinning:

And not only for Google Storage – it works for all Google Cloud services (with some changes).
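As far as I can tell, the only parts that change are the OAuth scope in the claimset and the URL you call with the access token, e.g. (untested guess):

const claimset = objectToBase64url({
  iss: 'YOUR SERVICE ACCOUNT EMAIL',
  // the broad Cloud Platform scope, or a narrower service-specific one
  scope: 'https://www.googleapis.com/auth/cloud-platform',
  aud: 'https://www.googleapis.com/oauth2/v4/token',
  exp,
  iat,
})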

Here is another example with some caching and error handling: weg-li/index.js at cfae0036f035a1e5ff1eb9b19ea4820ad73c8097 · weg-li/weg-li · GitHub
