Written by Alistair Sykes
Feb 06, 2018

Firebase Cloud Functions

How do you trigger Firebase Cloud Functions when a file gets uploaded? We recently needed to take regularly updated text files and read them into a database. Here’s how we used Firebase to do it.


Firebase Cloud Functions are a mechanism for running backend code (written in JavaScript) which can be triggered in various ways. In this instance, we wanted to trigger a function whenever a new file arrived, so we opted to use them in combination with Firebase Cloud Storage. Whenever a file gets uploaded, our code is triggered.



Here you can see that we are listening for a change in file storage. We check if it's a file intended for this cloud function. Then we read the file from storage and pull out the data. We then break it up by line and add it into our database.

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const path = require('path');
const gcs = require('@google-cloud/storage')();

admin.initializeApp(functions.config().firebase);

exports.updateObj = functions.storage.object().onChange(event => {
  const file = event.data; // The Storage object.
  const fileBucket = file.bucket; // The Storage bucket that contains the file.
  const filePath = file.name; // File path in the bucket.
  const fileName = path.basename(filePath); // Get the file name.

  // onChange also fires for deletions and moves; ignore those.
  if (file.resourceState === 'not_exists') {
    return null;
  }

  // Only process the file intended for this cloud function.
  if (fileName !== "Obj.txt") {
    return null;
  }

  const db = admin.database();
  const objRef = db.ref("obj");

  return gcs.bucket(fileBucket)
    .file(filePath)
    .download()
    .then(function(data) {
      // download() resolves with a [Buffer] array.
      return data[0].toString("utf-8");
    })
    .then(function(data) {
      const writes = [];
      const dataLines = data.split("\n");
      for (var i = 0; i < dataLines.length; i++) {
        var objLine = dataLines[i];
        if (!isEmpty(objLine)) {
          var objLineFields = objLine.split("|");
          // Write each parsed line into the database. (The exact
          // shape of the write depends on your data; pushing the
          // raw fields is one option.)
          writes.push(objRef.push(objLineFields));
        }
      }
      return Promise.all(writes);
    })
    .catch(function(e) {
      console.error("Failed to import file", e);
    });
});

function isEmpty(str) {
  return (!str || 0 === str.length);
}

This solution hits issues with larger datasets. Firebase Cloud Functions allows at most 540 seconds per function invocation (the default is 60 seconds). As soon as we started using bigger datasets we started hitting those timeouts.

In an attempt to combat the issue, we explored running promises (a JavaScript concept) in parallel. Although parallelism is possible, once we tried to handle errors correctly while staying within those timeouts, it became clear this was never going to work for large datasets. Firebase may change this behaviour in future, but for the moment our advice would be to use this solution only for smaller datasets.
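To illustrate the parallel approach we tried, here is a hedged sketch using `Promise.all` with a stand-in async write (`fakeWrite` is hypothetical; the real code wrote to Realtime Database refs):

```javascript
// fakeWrite simulates an asynchronous database write.
// A null record simulates a failing write.
function fakeWrite(record) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (record === null) {
        reject(new Error("bad record"));
      } else {
        resolve(record);
      }
    }, 10);
  });
}

const records = ["a", "b", "c"];

// Kick off all writes at once and wait for them collectively.
Promise.all(records.map(fakeWrite))
  .then(results => console.log("all writes done:", results))
  .catch(e => console.error("a write failed:", e));
```

Note the catch: `Promise.all` rejects as soon as any one promise rejects, while the remaining writes carry on in the background. That fail-fast behaviour is part of what made correct error handling awkward for us.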
