Firebase Cloud Function Restore Backups


Shit happened, now we have to deal with it

This is part 5 of a DevOps 101 series with Firebase Cloud Functions.

You can check the working repo here.

  1. Firebase Cloud Functions CI/CD with Cloud Build
  2. Firebase Cloud Functions and Slack notifications
  3. Firebase Cloud Functions Logging events
  4. Firestore Backups
  5. Firebase Disaster Recovery

Alright friends, let's finish this series. All the shit went wrong, we lost our data… have you been there? I have, and all I can tell you is that it's a unique experience: a combination of panic, fear, the urge to go to the bathroom, you name it. Lucky for you, you read this series and you are prepared to get back in the game ASAP.

The good part is that it's really simple; the tricky part is how to implement it. I will show you the foundation and one approach, and you can find what works best for you. I've seen all kinds of cool implementations around.

What are we doing?

  • We need to create a cloud function similar to the export one, but this time to import (duh).
  • We need to find which backup folder is the latest one. In the previous post I decided to name the folders with a date format like YYYY-MM-DD , which makes this case easy: we just need to calculate yesterday's date.
  • Create a PubSub topic restore-backup to trigger the function.
  • Create a PubSub listener for this cloud function.
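The date step above is worth seeing in isolation. This is a standalone sketch (not code from the repo): it computes yesterday's date in the YYYY-MM-DD folder naming convention using plain Date methods, standing in for the dateformat package so it runs on its own.

```typescript
// Sketch: derive yesterday's folder name (YYYY-MM-DD) from a reference date.
// Plain Date methods are used here instead of the dateformat package.
function backupFolderFor(reference: Date): string {
  const yesterday = new Date(reference)
  yesterday.setDate(yesterday.getDate() - 1) // Date handles month/year rollover for us
  const yyyy = yesterday.getFullYear()
  const mm = String(yesterday.getMonth() + 1).padStart(2, '0') // months are 0-indexed
  const dd = String(yesterday.getDate()).padStart(2, '0')
  return `${yyyy}-${mm}-${dd}`
}

// months are 0-indexed, so (2020, 5, 15) is June 15, 2020
console.log(backupFolderFor(new Date(2020, 5, 15))) // → 2020-06-14
```

Note that the rollover cases (first day of a month or year) come for free with setDate, which is why this is safer than string arithmetic on the date parts.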

The Function

This part is VERY similar to the export function, with a few small differences. Let's see:

import { auth } from 'google-auth-library'
import * as dateformat from 'dateformat'

// BUCKET_NAME is the backups bucket we defined in the previous post
export const restoreBackup = async () => {
  const client = await auth.getClient({
    scopes: [
      'https://www.googleapis.com/auth/datastore',
      'https://www.googleapis.com/auth/cloud-platform'
    ]
  })

  // we calculate yesterday's date, which is our backup folder name
  const yesterday = new Date()
  yesterday.setDate(yesterday.getDate() - 1)
  const timestamp = dateformat(yesterday, 'yyyy-mm-dd')
  const path = `${timestamp}`

  const projectId = await auth.getProjectId()
  // we change the action to importDocuments
  const url = `https://firestore.googleapis.com/v1/projects/${projectId}/databases/(default):importDocuments`
  const backup_route = `gs://${BUCKET_NAME}/${path}`
  return client.request({
    url,
    method: 'POST',
    data: {
      inputUriPrefix: backup_route, // this param is different as well
      // collectionIds: [] // if you want to import only certain collections
    }
  }).then(async (res) => {
    console.log(`Backup restored from folder ${backup_route}`)
    // notify slack maybe? check the repo
    return res
  }).catch(async (e) => {
    return Promise.reject({ message: e.message })
  })
}
That's it! We are halfway there. If you need more information, you can check the official docs here.

Now let's subscribe the function to the topic, in your index.ts:

// firebase-functions is already imported in index.ts from the earlier posts
export const automatedRestore = functions.pubsub
  .topic('restore-backup')
  .onPublish(async () => restoreBackup())

Deploy your function ^^

firebase deploy --only functions

PubSub Topic 

Next, in the Google Cloud Console, type PubSub in the top navigation bar; once on the page, click Create Topic.


After creating your topic, you just need to trigger it. For the sake of the sample, please make sure you already have a backup created with yesterday's date (or whatever you are using to calculate your last export).
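One quick way to trigger it manually (an assumption on my side, not the only option) is publishing a message from the gcloud CLI, which fires the subscribed function:

```shell
# Assumes the gcloud CLI is installed and authenticated against your project.
# Any message published to the topic fires the automatedRestore function;
# the message body itself is ignored by our handler.
gcloud pubsub topics publish restore-backup --message "restore"
```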

I've seen really cool implementations of this: some people just trigger the event via the SDK, others implemented a text message solution using Twilio. Your imagination is the limit.

That's it! Easy, right? A few considerations:

  • You should be able to implement this with confidence: run a few tests on your dev environment and then do a few controlled tests on production (YES, ON PRODUCTION). Make sure you backup your data first 🙄 you already know how.
  • Disaster recovery is more than this; here we only care about putting the business back in the game. After that, you should follow whatever process you have in place to find the root cause of the problem.
  • Monitor, Monitor, Monitor.

This post concludes my series on DevOps 101 with Firestore and Cloud Functions. I will create more posts later to teach you other techniques that can boost your productivity when working with Firebase and GCP in general. Anything you want me to talk about? Let me know!

If you like this post, please consider subscribing to my newsletter so you don't miss the next one! I write on a weekly basis. Sharing is appreciated as well.

Check the repo here
