Automation and machine learning can potentially help all of us in marketing. However, right now a lot of these tools are inaccessible to people who can't code, or who can code a bit but aren't really comfortable with it.

What often happens is that there end up being one or two people in the office who are comfortable writing and editing code, and these people produce scripts and notebooks that everyone else runs. The workflow looks a bit like this:

I will show you a simple way to streamline this workflow and remove the steps where people have to run a script and format the output. Instead, they can run the automation directly from within Google Sheets.

The example I will show you is a Sheets custom function that returns the Lighthouse score for a URL, like in this gif:

The method I will show you isn't the only way of doing this, but it illustrates a much more general technique that can be used for many things, including machine learning algorithms.

*There are two parts:*

1. A Google Cloud Run application that does the complicated stuff (in this case, running a Lighthouse test) and responds to HTTP requests.

2. An Apps Script custom function that makes requests to the API you created in step 1 and returns the results into the Google Sheet.

Cloud Run applications

Cloud Run is a Google service that takes a Docker image you provide and makes it available over HTTP. You only pay when an HTTP request is made, so for a service like this that isn't in use 24/7 it is cheap. The actual cost will depend on how much you use it, but I would estimate less than $1 per month to run thousands of tests.

The first thing we need to do is make a Docker image that performs the Lighthouse analysis when we make an HTTP request to it. Fortunately for us, there is some documentation on Github showing how to run a Lighthouse audit programmatically. The linked code saves the analysis to a file rather than returning the response over HTTP, but this is easy to fix by wrapping the whole thing in an Express app like this:

const express = require('express');
const app = express();
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

app.get('/', async (req, res) => {
    // Check that the url query parameter exists
    if (req.query && req.query.url) {
        // Decode the url
        const url = decodeURIComponent(req.query.url);
        const chrome = await chromeLauncher.launch({chromeFlags: ['--headless', '--no-sandbox', '--disable-gpu']});
        const options = {logLevel: 'info', output: 'html', port: chrome.port};
        const runnerResult = await lighthouse(url, options);

        await chrome.kill();

        // Return the Lighthouse result object as JSON
        res.json(runnerResult.lhr);
    } else {
        // Without a url parameter there is nothing to test
        res.status(400).send('Missing url query parameter');
    }
});

const port = process.env.PORT || 8080;
app.listen(port, () => {
  console.log(`Listening on port ${port}`);
});

Save this code as index.js.

Then you will also need a file called package.json, which describes how to install the above application, and a Dockerfile so we can wrap everything up in Docker. All the code files are available on Github.

    "name": "lighthouse-sheets",
    "description": "Backend API for putting Lighthouse scores in Google sheets",
    "version": "1.0.0",
    "author": "Richard Fergie",
    "license": "MIT",
    "main": "index.js",
    "scripts": {
        "start": "node index.js"
    "dependencies": {
        "express": "^4.17.1",
        "lighthouse": "^6.3"
    "devDependencies": {}
# Use the official lightweight Node.js 12 image.
FROM node:12-slim

# Our container needs to have Chrome installed to
# run the Lighthouse tests
RUN apt-get update && apt-get install -y curl gnupg \
  && curl -sSL https://dl.google.com/linux/linux_signing_key.pub | apt-key add - \
  && echo "deb https://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list \
  && apt-get update && apt-get install -y google-chrome-stable --no-install-recommends \
  && apt-get purge --auto-remove -y curl gnupg \
  && rm -rf /var/lib/apt/lists/*

# Create and change to the app directory.
WORKDIR /usr/src/app

# Copy application dependency manifests to the container image.
# A wildcard is used to ensure copying both package.json AND package-lock.json (when available).
# Copying this first prevents re-running npm install on every code change.
COPY package*.json ./

# Install production dependencies.
# If you add a package-lock.json, speed your build by switching to 'npm ci'.
# RUN npm ci --only=production
RUN npm install --only=production

# Copy local code to the container image.
COPY . ./

# Run the web service on container startup.
CMD [ "node", "--unhandled-rejections=strict","index.js" ]

Build the Docker image, and then you can test things locally on your own machine like this:
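Assuming the Dockerfile, index.js, and package.json are in the current directory, the build step would look something like this (the lighthouse-sheets tag is an assumed name chosen to match the run command below):

```shell
# Build the container image and tag it lighthouse-sheets
docker build -t lighthouse-sheets .
```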

First start the image:

docker run -p 8080:8080 lighthouse-sheets

And then test to see if it works:

curl -v "localhost:8080?url=https%3A%2F%2Fwww.example.com"

Or visit localhost:8080?url=https%3A%2F%2Fwww.example.com in your browser (substituting whichever URL you want to test). You should see a lot of JSON.

The next step is to push your image to the Google Container Registry. For me, this is a simple pair of commands (replace <project-id> with your own Google Cloud project ID):

docker tag lighthouse-sheets gcr.io/<project-id>/lighthouse-sheets
docker push gcr.io/<project-id>/lighthouse-sheets

However, you may need to set up Docker authentication first before you can do this (typically by running gcloud auth configure-docker). An alternative is to use Google Cloud Build to create the image; this may work better for you if you can't get the authentication working.

Next you need to create a Cloud Run service with this Docker image.

Open Cloud Run and click “Create service”


Name and adjust settings. You must give your service a name and configure a few other settings:

It is best to pick a region that is close to where most of the audience for your sites live. Checking the site speed for a UK site from Tokyo won't give you the same results as what your audience gets.

In order for you to call this service from Google Sheets, it must allow unauthenticated invocations. If you're worried about locking the service down to prevent other people from using it, you could do this by (for example) checking for an API secret in the HTTP request, or something like that.
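As a rough sketch of that kind of check (the x-api-key header name and the fallback secret here are my own illustrative choices, not part of the service above), you could compare a request header against a shared secret before running the audit:

```javascript
// Sketch: reject requests that don't carry the expected secret.
// The header name and fallback value are illustrative assumptions.
const API_SECRET = process.env.API_SECRET || 'change-me';

function isAuthorized(headers) {
  return headers['x-api-key'] === API_SECRET;
}

// In the Express handler you would check this before launching Chrome, e.g.
// if (!isAuthorized(req.headers)) return res.status(403).send('Forbidden');

console.log(isAuthorized({ 'x-api-key': 'change-me' })); // true
console.log(isAuthorized({}));                           // false
```

This keeps the service "unauthenticated" as far as Cloud Run is concerned while still rejecting casual unwanted traffic.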

Next you must select the container you made earlier. You can type in the name if you remember it, or click "Select" and choose it from the menu.

Then click “Show Advanced Settings” because there is further configuration to do.

You need to increase the memory allocation because Lighthouse tests need more than 256MB to run. I have chosen 1GiB here, but you may need the maximum allowance of 2GiB for some sites.

I have found that reducing the concurrency to 1 improves the reliability of the service. This means Google will automatically start a new container for each HTTP request. The downside is that this costs slightly more money.

Click "Create" and your Cloud Run service will be ready in a few moments.
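If you prefer the command line, the same configuration can also be expressed with gcloud run deploy; the service name, image path, and region here are illustrative values, not taken from the original setup:

```shell
# Deploy the container to Cloud Run with the settings chosen above.
# Replace <project-id> and the region with your own values.
gcloud run deploy lighthouse-sheets \
  --image gcr.io/<project-id>/lighthouse-sheets \
  --region europe-west1 \
  --memory 1Gi \
  --concurrency 1 \
  --allow-unauthenticated
```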

You can give it a quick test using the URL that Cloud Run displays for your service. For example:

curl -v "https://<your-service-url>.a.run.app?url=https%3A%2F%2Fwww.example.com"

Or visit that URL in your browser.

The next step is to write some Apps Script so you can use your new API from within Google Sheets.

Open a new Google Sheet and then open up the Apps Script editor.

This will open a new tab where you can code your Google Sheets custom function.

The key idea here is to use the Apps Script UrlFetchApp service to make the HTTP request to your API. Some basic code to do this looks like this:

function LIGHTHOUSE(url) {
  // Replace this with the URL of your own Cloud Run service
  const BASE_URL = "https://<your-service-url>.a.run.app"
  var request_url = BASE_URL+"?url="+encodeURIComponent(url)
  var response = UrlFetchApp.fetch(request_url)
  var result = JSON.parse(response.getContentText())
  return(result.categories.performance.score * 100)
}

The last line returns the overall performance score into the sheet. You could alter it to return something else. For instance, to get the SEO score use result.categories.seo.score instead.

Or you can return multiple columns of results by returning a list like this:

[result.categories.performance.score, result.categories.seo.score]
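To see why these expressions work, here is a minimal sketch of the relevant part of the JSON the service returns (the score values here are invented for illustration; Lighthouse reports each category score as a number between 0 and 1):

```javascript
// Illustrative fragment of a Lighthouse result; the scores are made up.
const responseText = JSON.stringify({
  categories: {
    performance: { score: 0.75 },
    seo: { score: 0.25 }
  }
});

// This mirrors what the custom function does with the HTTP response body.
const result = JSON.parse(responseText);
console.log(result.categories.performance.score * 100);  // 75
console.log([result.categories.performance.score, result.categories.seo.score]);
```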

Save the file and you will then have a custom function called LIGHTHOUSE available in your Google Sheet. In a cell, you can call it like any built-in function, e.g. =LIGHTHOUSE("https://www.example.com") or =LIGHTHOUSE(A2).

The easiest way to get started with this is to copy my example Google Sheet and then update the code yourself to point at your own API and to return the Lighthouse results you are most interested in.

published by Shantun Parmar