How to Copy to a Google Storage Bucket with GitHub Actions (GCP)

In this article, I will show you how to copy a file to a Google Storage Bucket using GitHub Actions.

info

These instructions were written and tested on a Mac.

Step 1. Create a project folder

Make a project folder and change to it using these commands:

mkdir -p ~/projects/actions/gh-action-zip-gcp-bucket 
cd ~/projects/actions/gh-action-zip-gcp-bucket

Step 2. Make a GitHub workflows folder

To make a GitHub workflows folder, do the following in the root of the repo:

mkdir -p .github/workflows

Step 3. Make a source folder in the project

To make a source folder, do the following in the repo:

mkdir -p source

Step 4. Create the files

Create empty files to fill in later by doing the following in the root of the project:

touch README.md
touch LICENSE
touch .github/workflows/upload-to-gcp.yml
touch source/bacon.txt

You can name the source file (bacon.txt) anything you want. If you want to follow my example, generate some text from https://baconipsum.com/ and paste it into source/bacon.txt.

Leave the folder name as source, because a later step references that specific folder name.

You can do a search for "top ipsum generators" for alternatives.

Step 5. Edit upload-to-gcp.yml

  • Open .github/workflows/upload-to-gcp.yml in VS Code
  • Paste the following into the file and save it:
name: Zip and Upload to GCP Bucket

env:
  TARGET_ZIP: source.zip
  BUCKET_FOLDER: uploads

on:
  push:
    tags:
      - 'v*'

jobs:
  deploy:

    runs-on: ubuntu-22.04
    timeout-minutes: 10

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Zip the folder
        run: zip -r ./${{ env.TARGET_ZIP }} source/

      # Authenticate to GCP with the service account key
      - name: Authorize GCP
        uses: 'google-github-actions/auth@v2'
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}

      # Install and configure the gcloud CLI
      - name: Set up Cloud SDK
        uses: google-github-actions/setup-gcloud@v2
        with:
          version: '>= 363.0.0'
          project_id: ${{ secrets.GCP_PROJECT_ID }}

      # Configure Docker to use gcloud as a credential helper
      - name: Configure Docker
        run: |-
          gcloud auth configure-docker

      # Upload the zip file to the GCP bucket under two names
      - name: Upload to Google Cloud Storage
        run: |-
          TARGET_ZIP=${{ env.TARGET_ZIP }}
          BUCKET_PATH=${{ secrets.GCP_TARGET_BUCKET }}/${{ env.BUCKET_FOLDER }}
          EXTENSION="${TARGET_ZIP##*.}"
          FILENAME="${TARGET_ZIP%.*}"
          TIMESTAMP=$(date +%Y-%m-%d_%H-%M-%S) # Format: YYYY-MM-DD_HH-MM-SS
          LATEST_FILENAME="${FILENAME}_latest.${EXTENSION}"
          NEW_FILENAME="${FILENAME}_${TIMESTAMP}.${EXTENSION}"
          gsutil cp ./$TARGET_ZIP gs://${BUCKET_PATH}/${LATEST_FILENAME}
          gsutil cp ./$TARGET_ZIP gs://${BUCKET_PATH}/${NEW_FILENAME}

The workflow does the following:

  • Triggers when a tag beginning with v is pushed (v1.0.11 for example)
  • Checks out the files
  • Authorizes using a GCP Service Account key pasted into the repo's Secrets section
  • Sets up gcloud
  • Configures Docker to use gcloud as a credential helper, even though Docker isn't used directly in this example
  • Builds two filenames: one ending in _latest and one with a timestamp in the name (see the shell sketch after this list)
  • Copies the zip file of the source folder to the bucket under the two new file names
  • The latest file will be overwritten with the latest copy
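If the shell parameter expansion in the upload step looks unfamiliar, here is a quick demonstration of how the two filenames are derived:

TARGET_ZIP=source.zip
EXTENSION="${TARGET_ZIP##*.}"    # remove everything through the last dot -> "zip"
FILENAME="${TARGET_ZIP%.*}"      # remove the last dot and extension -> "source"
echo "${FILENAME}_latest.${EXTENSION}"                        # source_latest.zip
echo "${FILENAME}_$(date +%Y-%m-%d_%H-%M-%S).${EXTENSION}"    # e.g. source_2025-01-15_09-30-00.zip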
tip

You can comment out the second gsutil cp line if you don't want to store previous versions.

The drawback to storing previous versions is that they can build up over time. You would need to remove them periodically, or create some automation that keeps only a certain number of previous versions. A full solution is beyond the scope of this article.
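As a starting point for that kind of automation, Cloud Storage lifecycle rules can delete old objects for you. Here is a minimal sketch, assuming an age cutoff of 30 days on the uploads/ prefix is acceptable (both values are placeholders to adjust):

# Write a lifecycle policy that deletes objects under uploads/ after 30 days
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 30, "matchesPrefix": ["uploads/"]}
    }
  ]
}
EOF
# Apply the policy to the bucket (replace YOUR_BUCKET_NAME)
gsutil lifecycle set lifecycle.json gs://YOUR_BUCKET_NAME

Note that the rule applies to the _latest object too, so it will also be deleted if nothing new has been uploaded within that window.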

Step 6. Set up Google Cloud Platform (GCP)

Create a new project (note its project ID)

Create a service account (SA)

Create a new bucket

  • Create a new bucket (in the GCP console under Cloud Storage)
  • Make sure the SA has access to it
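If you prefer the CLI over the console, the following sketch shows roughly equivalent gcloud commands. The names (my-upload-demo, gh-uploader, my-upload-bucket) are placeholders, and your account may require different roles or an attached billing account:

# Create a project (project IDs must be globally unique)
gcloud projects create my-upload-demo

# Create a service account in the project
gcloud iam service-accounts create gh-uploader --project=my-upload-demo

# Create a JSON key for the SA (its contents go into the GCP_SA_KEY secret)
gcloud iam service-accounts keys create key.json \
  --iam-account=gh-uploader@my-upload-demo.iam.gserviceaccount.com

# Create the bucket and grant the SA permission to write objects to it
gcloud storage buckets create gs://my-upload-bucket --project=my-upload-demo
gcloud storage buckets add-iam-policy-binding gs://my-upload-bucket \
  --member=serviceAccount:gh-uploader@my-upload-demo.iam.gserviceaccount.com \
  --role=roles/storage.objectAdmin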

Step 7. Fill in the secret vars

Be sure to set the secret vars in the repo for the following:

  • GCP_SA_KEY
  • GCP_PROJECT_ID
  • GCP_TARGET_BUCKET

For a little extra security, I like to store the project ID and bucket name as secret vars, partly because this example is going into a publicly accessible demo.

At a minimum, put them in vars rather than hard-coding them in the middle of your script.

For a private repo, it would probably be convenient to keep them as environment vars at the top of the file for documentation purposes, so that when you look at the workflow in six months you can easily see where it stores things.

You should always make the SA key a secret.
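If you have the GitHub CLI installed and authenticated, one way to set all three from the repo root (key.json here is whatever you named the service account key file):

gh secret set GCP_SA_KEY < key.json
gh secret set GCP_PROJECT_ID --body "my-upload-demo"
gh secret set GCP_TARGET_BUCKET --body "my-upload-bucket"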

Step 8. Trigger the build

  • Get the current tags:
git tag
  • Increment based on the last tag
  • Trigger the build

Here is an example:

git tag v1.0.1  
git push origin --tags

Step 9. Monitor the workflow actions

  • In the repo, click on the Actions tab
  • The run should turn green to indicate a pass
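You can also follow the run from the terminal with the GitHub CLI, for example:

gh run list --workflow=upload-to-gcp.yml
gh run watch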

Step 10. Check the bucket

Check the bucket for updates and/or new files.

tip

You may need to refresh the browser window.
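You can also verify from the command line, assuming gcloud/gsutil is authenticated locally (replace YOUR_BUCKET_NAME):

gsutil ls -l gs://YOUR_BUCKET_NAME/uploads/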

A note about billing

Remember that GCP may give you credits, but it's not completely free.

Always keep an eye on billing and shut down your experiments when they are not needed to keep costs low.

You can monitor project costs in the GCP console under Billing.

Sample project

Here is a sample project:

Conclusion

In this article you learned how to copy a file to a Google Storage Bucket using GitHub Actions.