Use PubSub schedule with Firebase Cloud-Function Emulator

When writing a Firebase cloud function, you can create CRON functions with functions.pubsub.schedule('every 10 minutes'). But they are not supported by the Firebase Emulator.

// Here is my CRON function declaration
export const myCronFunction = functions.pubsub
  .schedule('every 10 minutes')
  .onRun(async () => {
    console.log("this runs every 10 minutes");
    return null;
  });
Which means that when you run

firebase emulators:start --only functions

Well, myCronFunction is never gonna run.

Firebase Shell

It is possible to run a cloud function manually using the firebase shell. This is how you do it:

user@laptop:~$ firebase functions:shell
firebase > myCronFunction()

> this runs every 10 minutes

And I noticed that, inside this shell, you can use setInterval. Therefore, it is very easy to make that function run every 10 minutes. This is good enough to emulate the behaviour of a CRON function.

user@laptop:~$ firebase functions:shell
firebase > setInterval(() => myCronFunction(), 600000)

> this runs every 10 minutes

Finally, I am going to pipe this setInterval line into the shell's input to get a one-line command that is easy to run. Here is the resulting package.json:

  "scripts": {
    "shell": "firebase functions:shell",
    "cronschedules": "echo \"setInterval(() => myCronFunction(), 600000)\" | npm run shell"
  }

Now when I run npm run cronschedules my CRON function runs every 10 minutes.
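One caveat: setInterval waits a full interval before the first call, so the function will not fire until 10 minutes after the shell starts. If you also want an immediate first run, a tiny helper does the trick (my own addition, not part of the original script):

```javascript
// Invoke the function once right away, then keep invoking it every `ms` milliseconds
const runNow = (fn, ms) => {
  fn();
  return setInterval(fn, ms);
};

// Inside the firebase shell, you would use it as:
// runNow(() => myCronFunction(), 600000)
```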

Host a static website with Google Cloud Storage using Firebase Cloud Function as a proxy to secure the access with HTTP basic authentication

That is a fucking long title.

I wanted cheap hosting for a static website. So I thought “Hey, let’s use Google Cloud Storage!” It can easily be used to host a static website.

But my website has to be protected by username/password authentication. HTTP basic authentication is enough for my use case. The thing is: it is super hard to achieve this with Google Cloud Storage… I can restrict access to my bucket with IAM, that’s for sure. But when my users access my static website hosted on GCS, no one is going to include a nice “Authorization” header or whatever. So it is “Access denied”.
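For reference, HTTP basic authentication just sends the credentials base64-encoded in that “Authorization” header. A quick sketch of what the browser produces after a login prompt (admin/secret are made-up credentials):

```javascript
// Base64-encode "user:password" and prefix it with the "Basic " scheme
const header = "Basic " + Buffer.from("admin:secret").toString("base64");
console.log(header); // → "Basic YWRtaW46c2VjcmV0"
```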

So I wanted to create a proxy in front of my bucket. By creating a service account, I can grant READ access on my bucket to any server. But I still want a cheap solution, so I will definitely not run a VM 24/7 just to proxy a rarely used website.

Firebase offers a cheap pay-as-you-go plan and lets us run Cloud Functions with basically any NPM module I want, including Express, which provides proxy middleware.

So I started building a proxy to access my GCS-hosted website, but I never achieved the expected result: the requests kept failing with an error.
There is a GitHub issue about a similar error but damn, there is no solution for it; it looks like Google will just not allow Firebase to access GCS via HTTP.

So after a lot of struggling and failed workarounds, I ended up downloading the requested file locally from GCS and streaming it back as the response.

All that is left to do is create a nice CNAME for my Firebase project.
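One way to put that custom domain in front of the function is Firebase Hosting, which can rewrite all incoming traffic to a Cloud Function. A minimal firebase.json sketch (assuming the function is named componentsProxy, as in the gist; the "public" directory can stay empty):

```json
{
  "hosting": {
    "public": "public",
    "rewrites": [{ "source": "**", "function": "componentsProxy" }]
  }
}
```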

See this gist to access the final code I used.

// Declare a single cloud function "componentsProxy"
const functions = require("firebase-functions");
const { server } = require("./server");
const componentsProxy = functions.https.onRequest(server);
module.exports = { componentsProxy };

{
  "name": "functions",
  "description": "Cloud Functions for Firebase",
  "scripts": {
    "serve": "firebase serve --only functions",
    "shell": "firebase functions:shell",
    "start": "npm run shell",
    "deploy": "firebase deploy --only functions",
    "logs": "firebase functions:log"
  },
  "engines": {
    "node": "8"
  },
  "dependencies": {
    "@google-cloud/storage": "^4.1.1",
    "express": "^4.14.1",
    "express-basic-auth": "^1.2.0",
    "firebase-admin": "^8.6.0",
    "firebase-functions": "^3.3.0",
    "fs-extra": "^8.1.0",
    "google-auth-library": "^5.5.1"
  },
  "devDependencies": {
    "firebase-functions-test": "^0.1.6"
  },
  "private": true
}

/**
 * 1st middleware:
 *   Check for HTTP basic auth.
 *   If not OK, ask the browser for it.
 *   If OK, proceed to the next middleware.
 * 2nd middleware:
 *   Use the google-cloud-storage client to download a local copy of the requested file,
 *   then stream this file as the response.
 */
const express = require("express");
const basicAuth = require("express-basic-auth");
// Imports the Google Cloud client library
const { Storage } = require("@google-cloud/storage");
// Creates a client from a Google service account key; the service account is allowed to access the GCS bucket hosting the files
const storage = new Storage({ keyFilename: "./my_service_key.json" });
const fs = require("fs-extra");

const server = express();

// Basic HTTP authentication
server.use(
  basicAuth({
    users: { username: "Hey this is a strong password or ??? yeah maybe not :-/" },
    challenge: true // trigger most browsers to ask for credentials
  })
);

// Catch all requests
server.use("/**", async (req, res) => {
  const bucketName = "MY_GCS_BUCKET_NAME";
  const srcFilename = req.params[0] || "index.html";
  const destFilename = "/tmp/" + srcFilename;
  console.log("file to fetch", srcFilename);
  const options = {
    destination: destFilename
  };
  // Create local path to file if it does not exist
  fs.outputFileSync(destFilename, "");
  // Downloads the file locally
  await storage
    .bucket(bucketName)
    .file(srcFilename)
    .download(options);
  console.log(`gs://${bucketName}/${srcFilename} downloaded to ${destFilename}.`);
  // Stream the file as a response
  fs.createReadStream(destFilename).pipe(res);
});

module.exports = { server };