Securely deploy your Tensorflow JS model via Google App Script

Nitin Pasumarthy
6 min read · Aug 16, 2020

Background Story

I recently wanted to share my new Tensorflow JS model with my colleagues at the office to get their feedback. Rather than handing them a folder of HTML and JavaScript files, I wanted to deploy it as an internal static site for a better user experience. There are several hosting services that make this easy, but as my model is confidential, it should only be accessible to company employees. I didn’t want to spin up an entire server-side system to take care of authorization; instead, I wanted to achieve this with minimal changes to my static HTML file. My company is subscribed to GSuite services (docs, sheets, drive etc.), so that is what I chose to deploy the site securely. If you wish to publicly deploy your model via Firebase hosting, check out this codelab by Jason Mayes.

After reading this article you’ll be able to deploy your Python Tensorflow model as a Tensorflow JS (tfjs) model that runs directly on users’ devices. The web page is served via Google App Script with permission levels of your choice. Google Drive is used as the model store, which again lets you control who has access to the model files. Security can be important here, as the model, which can be a confidential entity, is downloaded onto the user’s device and can be modified at their will. This article’s main focus is the deployment of an ML model and assumes the reader has a basic understanding of web development and machine learning. In another article I shared how to train a digit recognition model directly in the browser using web workers and touched upon some basics of machine learning along the way. In that article I also shared some pros & cons of deploying models directly on users’ devices.

Please note that this approach only works with Tensorflow Keras models, due to the lack of support for loading GraphDef models via the tf.io.browserFiles API, which we depend on. Now that we know the why and what behind this idea, let’s get to the exciting part…

Solution Overview

  1. Train & export your Tensorflow Keras model
  2. Convert the Python model to JavaScript
  3. Load and serve the model from Google Drive via app-script using tfjs
  4. Deploy it as a secure website

1. Train a simple model

To follow along, you can either create a tf keras model from scratch in seconds, as shown in the notebook below, or use your own. This is a dummy model trained to simply multiply the given input by 3. It is not useful in practice, but it has all the ingredients to get the point across.

A toy Tensorflow Keras model that learns how to multiply a given input by 3

Download the trained 3_table_model.h5 model file from Google Colab (how?) or feel free to use the one I trained directly from here.

2. Python Model → JavaScript Model

The Tensorflow JS team created a wonderful converter which walks us through the process of converting various types of Python models to run in JavaScript. Install their tool and follow the steps mentioned here.

Options I chose in tensorflowjs_wizard to convert our 3-multiplier Python model to JavaScript

You can download the converted 3-multiplier model folder directly from here and follow along.

3. Serve the model with Google App Script

We are finally at the crux of this post. Upload your converted tfjs model folder to Google Drive and set permissions appropriately. This is our first line of defense 👮‍♀️ in terms of security for our model. I set it to be viewable by anyone in my company.

The converted tfjs model folder has a model.json file and one or more .bin files. The model.json file has the metadata about the model and its architecture. The .bin files hold the weights of our model, sharded/split for optimal transfer across the wire.
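For orientation, a converted layers model.json has roughly the following shape (a sketch with values abbreviated; only the keys matter for this post):

// Rough shape of a converted layers-model model.json (values abbreviated for illustration)
const exampleModelJson = {
  format: "layers-model",
  generatedBy: "keras",                   // tooling metadata
  convertedBy: "TensorFlow.js Converter", // tooling metadata
  modelTopology: { /* the Keras architecture serialized as JSON */ },
  weightsManifest: [
    {
      paths: ["group1-shard1of1.bin"],    // the sharded weight file(s)
      weights: [ /* name, shape and dtype of every tensor */ ]
    }
  ]
};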

Create a new App Script project on https://script.google.com/home. The Code.gs file fetches our model files from Drive and also serves the base HTML file, as shown below,

App script code to load our JS model from Drive for serving

In loadModelFromDrive(), we return the model.json file f as a JSON string and the only (in our case) binary file f2 as a byte array. If you have more .bin files, read and transform them in a similar manner.
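If the embedded gist doesn’t load for you, the server side boils down to something like the sketch below; the Drive file IDs are placeholders, and the code in the actual gist may differ slightly:

// Code.gs — a minimal sketch; replace the placeholder IDs with your own Drive file IDs
const MODEL_JSON_FILE_ID = "<drive-file-id-of-model.json>";
const MODEL_WEIGHTS_FILE_ID = "<drive-file-id-of-group1-shard1of1.bin>";

// Serves index.html when the web app URL is opened
function doGet() {
  return HtmlService.createHtmlOutputFromFile("index");
}

// Called from the browser via google.script.run
function loadModelFromDrive() {
  const f = DriveApp.getFileById(MODEL_JSON_FILE_ID);
  const f2 = DriveApp.getFileById(MODEL_WEIGHTS_FILE_ID);
  // model.json as a JSON string, the weights shard as a plain byte array
  return [f.getBlob().getDataAsString(), f2.getBlob().getBytes()];
}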

Create a new HTML file called index.html from the File menu. This is where we call loadModelFromDrive(), run the model, and respond to user events.

JavaScript snippet that loads a tfjs model from Google Drive

The interesting part is how we consume the model files and pass them to tf.loadLayersModel(). The TFJS team provides several helper IO methods to load a model:

  1. Public or authenticated URL
  2. IndexedDB
  3. Local storage
  4. HTML file inputs
  5. Custom IO

As the Drive API provides us with the file contents, we wrap them up as JavaScript Files, cleverly mimicking user HTML file inputs (option 4). As model.json is a JSON string, we can easily create a JS File object out of it by specifying its MIME type,

const modelJson = new File([modelFiles[0]], "model.json", { type: "application/json" })

The weights file, however, needs to be read in as raw bytes and then wrapped as a JS File object. This is because app script’s file object is different from the JS File object, so to create one from the other we have to reduce the result to the common denominator, which is OS-level raw bytes. The loadModelFromDrive() in the app-script above returns a Byte[] array. To consume it, we use JS TypedArrays like so,

const modelWeights = new File([Uint8Array.from(modelFiles[1])], "group1-shard1of1.bin")

These data structures were introduced in JavaScript to deal with binary data such as media (canvas, image, audio, video etc.), but luckily they come in handy for managing our ML model files as well!

Congratulations! Your model is now ready to make predictions using model.predict(). The complete script, which shows the other pieces like the UI, preparing inputs for prediction, showing alerts to the user, etc., is available here.
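Putting the pieces together, the client-side flow looks roughly like the sketch below (the input and output shapes assume the toy 3-multiplier model; the actual script linked above has more going on):

// index.html — a minimal sketch of loading the model from app-script and predicting
function loadAndPredict() {
  google.script.run
    .withSuccessHandler(async (modelFiles) => {
      // Wrap the Drive file contents as JS File objects, mimicking HTML file inputs
      const modelJson = new File([modelFiles[0]], "model.json", { type: "application/json" });
      const modelWeights = new File([Uint8Array.from(modelFiles[1])], "group1-shard1of1.bin");
      const model = await tf.loadLayersModel(tf.io.browserFiles([modelJson, modelWeights]));
      // The toy model multiplies its input by ~3, so 5 should come back as roughly 15
      model.predict(tf.tensor2d([[5]])).print();
    })
    .withFailureHandler((err) => alert(err.message))
    .loadModelFromDrive();
}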

Some optional things to try

A few things you could do to improve the experience:

  • Save the model to the browser’s IndexedDB the first time it is downloaded from Drive. When the user visits your site again, the model can be loaded from their device itself, without having to make a network round trip (see the sketch after this list).
  • Use preact or similar to create a more modern, component-based app
  • Use clasp to manage your app-script code locally in your favorite editor and publish it to git. I published this code at https://github.com/Nithanaroy/tfjs-drive-deploy/tree/master in the same manner.
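For the first suggestion, tfjs already supports IndexedDB as a save/load target, so the caching logic can be as small as the sketch below (the key name and the fetchModelFromDrive() wrapper are made-up placeholders, not part of the original code):

// A minimal sketch of caching the model in the browser's IndexedDB
const LOCAL_MODEL_URL = "indexeddb://three-multiplier"; // made-up key name

async function getModel() {
  try {
    // Repeat visits: load straight from the user's browser, no trip to Drive
    return await tf.loadLayersModel(LOCAL_MODEL_URL);
  } catch (e) {
    // First visit: fetch the files from Drive via app-script (as shown earlier), then cache
    const model = await fetchModelFromDrive(); // hypothetical wrapper around the flow above
    await model.save(LOCAL_MODEL_URL);
    return model;
  }
}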

4. Deploy Securely 🚀🔒

This is the easiest step of all and one of the primary reasons for choosing app-script in the first place. Select the “Deploy as web app” option from the “Publish” menu in the app-script UI and choose the settings that suit your needs. In my case, I enabled access to users from my company.

App script provides basic authorization for free

That’s all, friends. All the files used in each of the steps are available here. And here is the hosted site with our toy model.

The final model in action, running inside the browser and deployed with App Script and Google Drive

What did you build and deploy with tfjs? There is an active community on LinkedIn where developers from around the world share their creative projects with tfjs. Check it out and get inspired!

Big thanks to Jason Mayes for suggesting several alternatives to this approach and to Ping Yu for clarifying how loadLayersModel() works under the hood.
