With the power of GitHub Actions, we can implement a CI/CD process with various tasks quickly and directly. I recently had a CD task in my current project: uploading some files from the repository to Firebase Storage after CI completes. I would like to share how I handled the task using GitHub Actions.
The solution uses Python 3.7.6 on a macOS runner machine.
Prerequisites: a GitHub repository, a Google Firebase/Cloud account, and a cup of coffee ☕️ 😊.
1. Set up Firebase project
In your Firebase console project, let’s get started with the Storage setup if you haven’t done it yet.
Note the bucket name in the Files section; we will need it later. In this article, the default bucket is used. Check out the Firebase documentation for more details about default and custom buckets.
2. Set up a Google Cloud service account and get the credential key
From the Google Cloud console, navigate to the associated project in the All tab and open it.
In the IDENTITY & SECURITY section of the menu, select Identity > Service Accounts. Two default service accounts already exist there, but without credential keys. Choose the Create key action to generate and download the JSON certificate for the credential. Store the JSON file securely, because the key it contains cannot be recovered if lost.
3. Store the secret key file in GitHub's secrets
The upload process, which needs to import the certificate key JSON file, will be executed on GitHub Actions. However, storing sensitive information directly in the repository is not a good idea, even in a private repository. There are several good approaches to protecting private data; one is to use GitHub's encrypted secrets.
The idea is to encode the key file into a base64 string and store it as a GitHub secret, then decode it later in the workflow to retrieve a file that the upload script can use. The script below encodes a file in the same directory into a base64 string and prints it out.
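Here is a minimal sketch of such an encoding script; the script name `encode_key.py` and the key file name passed on the command line are just examples, so use whatever you named the downloaded key file.

```python
import base64
import sys


def encode_file(path: str) -> str:
    """Read a file and return its contents as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")


# Example usage: python encode_key.py service-account-key.json
if __name__ == "__main__" and len(sys.argv) > 1:
    print(encode_file(sys.argv[1]))
```

Run it next to the downloaded key file and copy the printed string; that string becomes the secret's content in the next step.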
Next, create a secret on GitHub and use the base64 string as its content. Note that there are some limitations, such as the secret name format and the number of secrets per organization. Further information can be found here.
4. The upload script
We now have all the necessary elements (the Google Cloud key file and the Google storage bucket name) for the upload script. Notice that the bucket name we use is the part without gs://, as described in the Firebase documentation:
You can specify a default bucket name when initializing the Admin SDK. Then you can retrieve an authenticated reference to this bucket. The bucket name must not contain gs:// or any other protocol prefixes. For example, if the bucket URL displayed in the Firebase Console is gs://bucket-name.appspot.com, pass the string bucket-name.appspot.com to the Admin SDK.
Now, we use this Python script to run the upload task whenever it is triggered. Remember to put this script file in the repository directory that matches the path we declare in the workflow's .yml file, which we will create in the next part.
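A sketch of what the upload script could look like, using the firebase_admin SDK (installed with `pip install firebase-admin`). The key path, bucket name, and file paths below are placeholder assumptions to adjust for your project; the gs:// stripping follows the quoted note above.

```python
def normalize_bucket(name: str) -> str:
    """Strip a gs:// prefix if present; the Admin SDK expects the bare bucket name."""
    return name[len("gs://"):] if name.startswith("gs://") else name


def upload(key_path: str, bucket_name: str, local_path: str, remote_path: str) -> None:
    """Upload one local file to Firebase Storage using a service account key."""
    # Imported here so the helper above stays usable without the SDK installed.
    import firebase_admin
    from firebase_admin import credentials, storage

    cred = credentials.Certificate(key_path)  # the decoded JSON key file
    firebase_admin.initialize_app(cred, {"storageBucket": normalize_bucket(bucket_name)})

    bucket = storage.bucket()                  # the default bucket set above
    blob = bucket.blob(remote_path)            # destination path in the bucket
    blob.upload_from_filename(local_path)      # perform the upload


# Example usage (all names are placeholders):
# upload(
#     "/Users/runner/work/_temp/service-account-key.json",
#     "your-project.appspot.com",
#     "build/output.zip",
#     "releases/output.zip",
# )
```

Either bucket name form works with this helper, since gs:// is stripped before initializing the app.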
5. Github Action workflow
To create a workflow, on the repository site, select the Actions tab and either pick a template for the workflow or skip that to make a new one. Use the .yml script below and replace the corresponding file name and file path for your project.
In the workflow, we use a third-party action named base64-to-file, which exports the base64 string from the secret we added to a file and saves it in /Users/runner/work/_temp on the runner machine. This file is used as the input for Google's credential object in the upload script.
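A sketch of what the workflow .yml could look like. The `timheuer/` publisher prefix for the base64-to-file action, the secret name `FIREBASE_KEY_BASE64`, the trigger branch, and the script path `scripts/upload.py` are assumptions; adjust them to your setup.

```yaml
name: Upload to Firebase Storage

on:
  push:
    branches: [ main ]   # assumed trigger; adjust to your needs

jobs:
  upload:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v2

      - uses: actions/setup-python@v2
        with:
          python-version: 3.7.6

      # Decode the base64 secret back into the credential JSON file.
      # The file is saved under /Users/runner/work/_temp on the runner.
      - name: Get credential file
        uses: timheuer/base64-to-file@v1
        with:
          fileName: service-account-key.json
          encodedString: ${{ secrets.FIREBASE_KEY_BASE64 }}

      - name: Install dependencies
        run: pip install firebase-admin

      # Assumed script path; match it to where the upload script lives.
      - name: Upload to Firebase Storage
        run: python scripts/upload.py
```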
When finished, the workflow file is pushed to the repository and located in .github/workflows.
From now on, whenever the workflow's trigger conditions are met, the credential-retrieval and upload steps will run and upload the specified files to Firebase Storage.
Thank you for reading! 🙏🏻