Using Google Apps Script as a Webhook


Google Apps Script projects can be published as web apps when you’re done with them. This is helpful if you’re building a tool with a simple UI (using HtmlService), but web apps can also be used as webhooks to do work within a user account.

Google provides the option of service accounts, so why use webhooks?

For my project, it was because I needed to send calendar invites to other people. Service accounts are generally the way to handle those kinds of jobs, but this one would have needed domain-wide delegation, which means it could act on behalf of any user in the domain. That wasn’t an option in this case, so a webhook became the next best option.

What is a webhook?

In simple terms, it’s an application that does something when it receives a standard HTTP request. Webhooks generally work on a subscription model: the webhook sits in the middle, listening for an HTTP call from one service, then emits another action in response.

For this example, I have our events management app running in Python on a local server. When there’s an action requiring an update to the calendar, it hits the Apps Script webhook and the script does some work before returning a response to the events app.

The Code

In Google Apps Script, doPost and doGet are functions which respond to POST and GET requests respectively. Because I’m calling the script from the server with a payload for the calendar event, I used doPost to listen for events.

The simplest hook you can set up is a function which listens for a POST request from somewhere and returns a response:

function doPost(e) {
    return ContentService.createTextOutput(JSON.stringify({
        "message": "Yo."
    }));
}

When you deploy the script as a web app, make sure it is set to “Anyone” can access but that the script runs as you. And that’s where the magic is.
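If you manage the project with clasp, the same settings live in the appsscript.json manifest. A sketch of that fragment (the exact access value depends on your deployment; unauthenticated server-to-server calls need the anonymous option):

```json
{
  "webapp": {
    "access": "ANYONE_ANONYMOUS",
    "executeAs": "USER_DEPLOYING"
  }
}
```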

With a traditional service account, it needs access as another user. There’s no way to limit that permission to a single user from the admin console, which is why I couldn’t take that approach with my project. In this case, the webhook is triggered by a request and then runs as the scoped user.

Using this method, we’re able to achieve the same permissions but without giving it access to every other user in the domain.

A Quick Note on Deploying

Update 11/29/2021
Many thanks to Joseph in the comments for sharing how to update a deployment without generating a new URL. This section of the post remains because it is a good example of how not to manage deployments.

With the new Apps Script editor, deploys are a pain. You can’t use the dev endpoint to test the hook (I don’t know why), meaning there’s no way to execute the HEAD version of the script in this instance. So, to test your changes, you need to deploy a new version and that gives a new URL for the endpoint. You’ll have to update your API request in your other codebase…it’s a mess.

If you can swing it, either build in the legacy editor (still using the V8 runtime) or use clasp to manage your deploys. If I could change one thing about this whole project, this would be it.


Securing the Webhook

Now, how to secure something like this? There’s nothing foolproof, obviously, but there are a few advantages to using Apps Script in this way:

  1. The server handles all communication. I’m not taking direct requests from a frontend client, so there’s more control over what is sent.
  2. Google’s deploy URLs are complex, reducing the risk of guessing the direct URL.
  3. Your script can have its own validation (i.e., an API key) before processing requests.
  4. You can parse incoming requests for specific data structures, throwing errors when the received structure doesn’t match what’s expected.
  5. You can implement a pseudo-CSP by checking request headers for the correct domains and throwing errors if they don’t match.

It all depends on how you’re expecting the webhook to be used with that third party.
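As a sketch of points 3 and 4, validation can be pulled into a small helper before any work happens. This is plain JavaScript with names of my own invention (validatePayload, the required field list), not code from the project:

```javascript
// Hypothetical helper: check the shared key, then the payload shape.
function validatePayload(params, accessKey) {
    // Point 3: reject requests that don't carry the shared key.
    if (params.token !== accessKey) {
        return { ok: false, statusCode: 403, message: 'Forbidden' };
    }
    // Point 4: reject payloads that don't match the expected structure.
    const required = ['method', 'userId'];
    for (const field of required) {
        if (typeof params[field] !== 'string') {
            return { ok: false, statusCode: 400, message: 'Malformed payload' };
        }
    }
    return { ok: true };
}
```

doPost can then return early with the error object whenever `ok` is false, keeping the task logic free of guard clauses.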

When a request comes in, it includes an event parameter which holds information for the task. Even though every request is a POST, I listen for different methods in the post body to determine what happens next.

Here’s the same application as above with more detail added:

function doPost(e) {
    const accessKey = 'someBigLongString';
    let result;
    let params = JSON.parse(e.postData.contents);
    let method = params.method;
    let token = params.token;
    let userId = params.userId;
    // Add whatever other params you want

    if(token === accessKey) {
        switch(method) {
            case 'POST':
                result = {
                    // ...
                };
                break;
            case 'PUT':
                // etc...
                break;
        }
    } else {
        result = {
            'status': 'Forbidden',
            'statusCode': 403,
            'message': 'You do not have access to this resource.'
        };
    }
    return ContentService.createTextOutput(JSON.stringify(result));
}

In Practice

The general structure for any webhook is the same:

  • receive a request
  • process the method
  • process the payload
  • perform some task
  • return a response
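The steps above can be sketched as a small dispatcher. This is plain JavaScript and the handler map is my own construction, not code from the project:

```javascript
// Hypothetical dispatcher: handlers maps method names to task functions.
function handleRequest(body, handlers) {
    // process the method from the payload
    const handler = handlers[body.method];
    if (!handler) {
        return { statusCode: 400, message: 'Unknown method: ' + body.method };
    }
    // process the payload, perform the task, and return a response
    return Object.assign({ statusCode: 200 }, handler(body.payload));
}
```

New tasks then become new entries in the handler map rather than new branches in doPost.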

You have the double benefit of hosting the script and user-scoped permissions for individual projects. In the future, it may be worth finding some kind of parsing library for handling incoming requests to cut down on boilerplate code for new projects. But if you’re looking for a way to interact with Google resources from the outside, this is one way that has worked well for me.

Sending Charts from Google Forms

We’re getting ready to launch a digital teaching and learning course for staff that is based on the TPACK framework. A few years ago, we started our 1:1 training by coaching teachers through using SAMR to design lessons which really use the technology in new ways. This time, we’re having teachers do more metacognitive reflection on their own practice with TPACK as a model for thinking through content, pedagogy, and technology’s impact on their practice.

Matt Koehler has collected several surveys which can be used to gauge each component of TPACK. We modified a 2009 instrument written by Koehler (and several others) into a Google Form for teachers to take before and after the program. For our course, we’re only focusing on technology and pedagogy and coaching people through applying their content knowledge.

Our goal was to have teachers complete the form and then receive an email with their survey results. Once the form was submitted, the sheet needed to average each domain, create a chart, generate an email, and fire it off with the chart included in the body of the email.

I started with Martin Hawksey’s posts on email dashboards from Google Sheets using Apps Script. The main issue I ran into is that the radar chart we were hoping to use had to be rendered into the spreadsheet to get all the labels and formatting correct. Even calling the image directly from its URL was problematic. Specifically, if multiple people submitted at once, the charts could get combined in unexpected ways, sending people wrong information. I looked into using Apps Script’s LockService to limit execution, but honestly, I couldn’t spend a whole lot of time on getting this running.

Martin suggested using QuickChart to generate charts through an API endpoint. It’s based on Chart.js, which I’ve used in the past for an action research summary.

This ends up working better because I don’t need to read the sheet twice to generate a chart. I can take the processed data object right from the form responses, hit the API with some config in a JSON string and get an image which is appended into the email.
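The request itself is just the Chart.js config URL-encoded into the query string. A minimal sketch, assuming the quickchart.io /chart endpoint and its `c` parameter:

```javascript
// Build a QuickChart image URL from a Chart.js config object.
function chartUrl(config) {
    const base = 'https://quickchart.io/chart?c=';
    return base + encodeURIComponent(JSON.stringify(config));
}
```

Fetching that URL returns the rendered PNG, so the same string works both as an image source and as a record of exactly what was sent.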

The Code

function processFormValues(row) {

    // The form structure doesn't change, so we can slice the values array directly.
    let obj = {
        "email": row[1],
        "tk": row.slice(2, 8),
        "pk": row.slice(8, 15),
        "tpk": row.slice(15, 21),
        "tpck": [row[row.length-1]],
    };
    return calculate(obj);
}

// calculate the average for the list of scores
function calculate(obj) {
    let avgObj = {};

    for(var key in obj) {
        if(key === "email") {
            avgObj["email"] = obj[key];
        } else {
            var length = obj[key].length;
            // Form responses are strings, so convert each to a number while calculating the average.
            var avg = (obj[key].reduce((prev, curr) => Number(prev) + Number(curr), 0)) / length;
            avgObj[key] = avg.toFixed(2); // limit the float
        }
    }
    return avgObj;
}

function getChart(chartData) {

    // Use QuickChart (Chart.js rendered via API) to avoid race conditions building charts in the sheet.
    const baseUrl = 'https://quickchart.io/chart?c=';
    const params = {
        "type": "horizontalBar",
        "data": {
            "labels": ["tk","pk","tpk","tpck"],
            "datasets": [{
                "label": "Self Assessment",
                "backgroundColor": "rgba(54, 162, 235, 0.5)",
                "borderColor": "rgb(54, 162, 235)",
                "data": [ chartData[1], chartData[2], chartData[3], chartData[4] ]
            }]
        },
        "options": {
            "plugins": {
                "datalabels": {
                    "display": true,
                    "anchor": "end",
                    "align": "end",
                    "color": "#000"
                }
            },
            "elements": {
                "rectangle": {
                    "borderWidth": 2
                }
            },
            "responsive": true,
            "scales": {
                "xAxes": [{
                    "ticks": {
                        "display": true,
                        "beginAtZero": true,
                        "suggestedMax": 6,
                        "stepSize": 1
                    },
                    "gridLines": {
                        "display": true
                    }
                }],
                "yAxes": [{
                    "gridLines": {
                        "display": false
                    }
                }]
            }
        }
    };

    try {
        let blob = UrlFetchApp.fetch(baseUrl + encodeURIComponent(JSON.stringify(params))).getAs('image/png');
        return {'url': baseUrl + encodeURIComponent(JSON.stringify(params)), 'blob': blob};
    } catch(e) {
        return e.message;
    }
}
// Add an installable trigger to run on each form submit
function onFormSubmit(e) {
    const ss = SpreadsheetApp.getActiveSpreadsheet();
    const calcs = ss.getSheetByName('calculations');
    const headers = calcs.getRange(1, 1, 1, calcs.getLastColumn()).getValues()[0];

    try {
        let scores = processFormValues(e.values);

        // Save the averaged data to a sheet in case something goes wrong.
        // The sheet headers match the keys in the scores object.
        var writeRow = [];
        for(var i = 0; i < headers.length; i++) {
            writeRow.push(scores[headers[i]]);
        }

        // Assign to a variable because we'll write to the sheet later.
        let nextRange = calcs.getRange(calcs.getLastRow()+1, 1, 1, writeRow.length).setValues([ writeRow ]);

        // Get the chart
        let chart = getChart(writeRow);

        // if there was good data then send the email
        if(chart) {
            let emailImages = {};
            emailImages['chart'] = chart['blob'];

            let emailBody = `...your email template...`;

            // append the image inline; the cid matches the key in emailImages
            emailBody += `<p align='center'><img src='cid:chart'></p>`;

            MailApp.sendEmail({
                to: writeRow[0],
                subject: 'Your survey results!',
                htmlBody: emailBody,
                inlineImages: emailImages
            });

            // Set the chart URL in a sheet in case we need it again
            calcs.getRange(nextRange.getRow(), 6).setValue(chart['url']);
        }
    } catch(err) {
        calcs.getRange(calcs.getLastRow()+1, 1).setValue(err);
        throw new Error(err); // put it in the console
    }
}
The result

As form submissions come in, the sheet generates the average for each category and then uses those values to send a request to the QuickChart API. The image blob is captured in an object and sent inline to the user who submitted. It runs in 7-8 seconds consistently, with the lag coming from sending the email.

Our email template includes an explainer on what each domain is meant to describe so participants can think through their survey before coming to the live session. We’ll use the same form at the end so they can compare their results as part of the final reflection.