Google Cloud Storage for backups


Transferring Data to the Google Cloud

Although GCS serves primarily as a backup and disaster recovery solution in the following example, it makes sense to be familiar with the common tools and procedures for transferring data to Google's object storage. In general, Google offers two important tools for moving data into the cloud. The Cloud Storage Transfer Service [1] moves data from Amazon S3 or another cloud storage system, either from the GUI in the GCP console, via the Google API client library in a supported programming language, or directly with the Storage Transfer Service REST API.
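To give an idea of the REST route, the following is a minimal sketch of a transfer job submitted with curl. The project ID, bucket names, and AWS credentials are placeholders, and the field names should be verified against Google's current API reference before use:

# curl -X POST https://storagetransfer.googleapis.com/v1/transferJobs \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
      "description": "s3-to-gcs-demo",
      "status": "ENABLED",
      "projectId": "my-project",
      "schedule": { "scheduleStartDate": { "year": 2020, "month": 1, "day": 1 } },
      "transferSpec": {
        "awsS3DataSource": {
          "bucketName": "my-s3-bucket",
          "awsAccessKey": { "accessKeyId": "AKIA...", "secretAccessKey": "..." }
        },
        "gcsDataSink": { "bucketName": "my-gcs-bucket" }
      }
    }'

The GUI in the GCP console builds the same kind of job definition interactively.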

The gsutil command-line tool [2], in turn, can be used to transfer data from a local system to GCS. More specifically, with gsutil you can create and delete buckets; upload, download, or delete objects; list buckets and objects; move, copy, and rename objects; and edit object and bucket ACLs.
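As a quick illustration of these operations, the following calls (with a placeholder bucket and object name) create a bucket, upload and list objects, rename an object, and clean up again:

# gsutil mb gs://my-backup-bucket
# gsutil cp backup.tar.gz gs://my-backup-bucket
# gsutil ls gs://my-backup-bucket
# gsutil mv gs://my-backup-bucket/backup.tar.gz gs://my-backup-bucket/archive/backup.tar.gz
# gsutil rm gs://my-backup-bucket/archive/backup.tar.gz
# gsutil rb gs://my-backup-bucket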

Backing Up to Google Cloud

When it comes to backups, countless third-party providers have integrated Amazon S3 connectors into their products (e.g., the S3 Browser, the WordPress backup plugin, the Duplicati open source enterprise backup program, and the Clonezilla imaging tool). Duplicati, however, also comes with a GCS connector (Figure 1).

Figure 1: Duplicati supports numerous backup targets, including GCS.

Even without third-party tools, though, the route to the first Google cloud backup is not difficult. To begin, you need a GCP account (e.g., one that comes with Google Drive or G Suite). Thus armed, you will be able to create a new project (Figure 2) in the GCP console [3] by clicking on Create in the Select a Project dialog.

Figure 2: The basis of all functionalities in the Google Cloud is the project.

Once the project has been created, create the desired bucket in GCS by clicking on Storage | Browser in the Storage section of the navigation menu and then on Create Bucket . Because the storage class for inexpensive archive storage in this example will be Coldline and because I am located in Germany, my choice of target region here is europe-west1 .
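If you prefer the command line, the same bucket can also be created with gsutil once the Cloud SDK described below is installed; the bucket name here matches the one used in the rsync examples later in this article:

# gsutil mb -c coldline -l europe-west1 gs://td-ita-backup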

Now you will be able to upload files of any type to the new bucket. Depending on how you want to organize the data, you can use a meaningful folder structure within the bucket. Note, however, that folders in an object store are only pseudo-folders: Path separators such as the slash (/) are simply part of the object name. It is also possible to zip files or backups before uploading to reduce storage costs and simplify uploading and downloading.
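Both points can be sketched with a couple of hypothetical commands: The archive is packed locally first, and the archives/2020/ "folder" exists only as a prefix in the object name:

# tar czf demo-backup.tar.gz /home/drilling/google-demo
# gsutil cp demo-backup.tar.gz gs://td-ita-backup/archives/2020/demo-backup.tar.gz
# gsutil ls gs://td-ita-backup/archives/2020/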

Automating Backups

Ambitious users can also use the Google Cloud SDK to automate the backup. Here, I'll look at an Ubuntu-based approach, first adding the package source and then importing the Google Cloud public key:

# export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
# echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
# curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

The following command installs the SDK:

# sudo apt-get update && sudo apt-get install google-cloud-sdk

The next step is to run the gcloud init command and follow the instructions to sign in and authenticate the SDK against your desired Google account. After doing so, you can use the gsutil tool, which provides rsync, cp, mv, rm, and many other commands familiar from Linux. The syntax for rsync is as expected:

# gsutil rsync -r <source> <destination>

In other words, you only need to specify the directory to be copied from the local server to Google Cloud Storage, as in:

# gsutil rsync -r /home/drilling/google-demo gs://td-ita-backup

As you might expect, the -r flag forces recursive synchronization of all subdirectories in the source. The rsync command also checks whether a file already exists at the destination and only copies it if it has changed. Objects in the Google Cloud Storage bucket are not removed by rsync -r when they are deleted locally; to propagate deletions as well, you would have to use the -d option, as sketched below.
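A minimal sketch of such a mirror run follows; -d should be handled with care, because it deletes objects at the destination that no longer exist locally, so a dry run with -n first is a good idea:

# gsutil rsync -r -n -d /home/drilling/google-demo gs://td-ita-backup
# gsutil rsync -r -d /home/drilling/google-demo gs://td-ita-backup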

The last step is to automate the backup process. The easiest approach is to set up a cron job by creating a script that includes the gsutil command and making it executable:

# nano gcpbackup.sh

The script itself only needs a shebang line and the gsutil call:

#!/bin/bash
gsutil rsync -r /home/drilling/google-demo gs://td-ita-backup

Then make the script executable:

# chmod +x gcpbackup.sh

Now the script can be called with cron, as required.
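For example, a crontab entry along the following lines (the script and log paths are assumptions based on the example above) would run the backup every night at 2:00am and log the output:

# crontab -e

0 2 * * * /home/drilling/gcpbackup.sh >> /var/log/gcpbackup.log 2>&1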
