
Using Google Colab for REST API exploration and testing

New York City's Office of Technology Innovation provides a collection of useful APIs that let you access City data. For the past few weeks I have been playing with the APIs looking for useful application ideas, and I've been using Google's Colab product for that exploration. These APIs are free to access and you can sign up here.

If you are new to Colab, Google provides an introduction notebook that covers the basics. If you've used Jupyter with Python you should be good to go. While Colab is frequently used for data science and AI, I think it is also a great platform for building internal tools. For one specific type of user, the domain expert who knows the subject matter but may not know a particular API or tool, Colab is useful as a way of bundling instructions and code so they can be productive.

Minimal code example for connecting to an end point

This is a minimal test script. It connects to an API, holds the API key securely, and lets the user set the parameters sent to the API visually. After execution, the JSON output is displayed. While this is fine, and a step above using curl from the command line, it isn't what I would send to a user.

import requests
import json
import datetime
import urllib.parse
from google.colab import userdata

from_date = "2025-05-27" # @param {type: "date"}
to_date = "2025-06-06" # @param {type: "date"}

url = "https://api.nyc.gov/public/api/GetCalendar?fromdate={}&todate={}"
api_url = url.format(urllib.parse.quote(from_date), urllib.parse.quote(to_date))

subkey = userdata.get('Ocp-Apim-Subscription-Key')
headers = {
    "Ocp-Apim-Subscription-Key": subkey,
}

response = requests.get(api_url, headers=headers)

print(response.text)

This is what it looks like on screen. The left side has the Python code while the UI for setting dates is on the right.

Storing secrets outside of notebooks

Colab lets you store API keys outside of the code and provides a class for accessing them. I highly recommend doing this rather than pasting keys into code. Sharing a Colab notebook does not share the keys, so you will need to work with users to have them install API keys. Once a key is installed, though, it is usable in any of that user's notebooks.


Adding exception handling and extracting out the API call

The first step is to extract a function that calls the API and to create a short-circuit test in case there is no API key set in Secrets. After that, we include error handling and the libraries we will be using below. Normally, this is code you would hide and simply execute as setup.


import requests
import json
import datetime
import urllib.parse
from google.colab import userdata

# fail quick if no API key
assert userdata.get('Ocp-Apim-Subscription-Key')

def call_api(from_date, to_date):
    try:
        url = "https://api.nyc.gov/public/api/GetCalendar?fromdate={}&todate={}"
        api_url = url.format(urllib.parse.quote(from_date), urllib.parse.quote(to_date))
        subkey = userdata.get('Ocp-Apim-Subscription-Key')
        headers = {
            "Ocp-Apim-Subscription-Key": subkey,
        }

        response = requests.get(api_url, headers=headers)
        response.raise_for_status()  # raise HTTPError for 4xx or 5xx responses

        return response

    except requests.exceptions.RequestException as e:
        print(f"Error fetching API: {e}")
    except Exception as e:
        print(f"Error while fetching API: {e}")

def extract_response(response):
    try:
        json_data = json.loads(response.text)
        return json_data
    except json.JSONDecodeError as e:
        print(f"Error decoding JSON: {e}")
 

Using a form for data entry

In the same way you want to avoid putting API keys in code, you should avoid having users edit code to set variables. Colab has a really nice form system. When combined with the hide-code option, you can build a tool where a domain expert can populate a request and execute it without any knowledge of the internals. Avoid having users put API keys in a form: users might email screenshots or paste them into a bug tracker, and they may not be careful about leaking data.
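As a quick illustration of the form syntax: a form field is just a specially formatted comment after an assignment. The field names and choices below are hypothetical, not from the notebook:

```python
# Colab renders "# @param" comments as form widgets in the right margin;
# outside Colab they are ordinary comments and the defaults apply.
borough = "Brooklyn"  # @param ["Brooklyn", "Queens", "Manhattan", "Bronx", "Staten Island"]
max_results = 10  # @param {type: "integer"}
include_schools = True  # @param {type: "boolean"}

print(borough, max_results, include_schools)
```

Because the annotations live in comments, the same cell runs unchanged as plain Python.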

Below is the block that includes the selection of dates and executes the function call_api.

from_date = "2025-05-27" # @param {type: "date"}
to_date = "2025-06-03" # @param {type: "date"}

# These are the two critical calls
response = call_api(from_date, to_date)
json_data = extract_response(response)

Now that we have a JSON response, we can figure out what we want to do with it.

Format output data

Colab has a nice data formatter for printing out tables with interactive features. Data can be sorted, filtered and exported via Markdown. Overall, it's a pretty good feature set for basic reporting.

At this point the user has a way of executing the setup and then alternating between picking dates and using an interactive table. This is fine, assuming all is well.

## Build interactive table
##
import pandas as pd
from google.colab import data_table

# turn on Colab's interactive dataframe display
data_table.enable_dataframe_formatter()

table = []

# flatten the nested days/items structure into one row per item
for day in json_data["days"]:
    for item in day["items"]:
        item["day"] = day["today_id"]
        table.append(item)

df = pd.DataFrame(table)
order = ['day', 'type', 'status', 'details', 'exceptionName']
df = df.reindex(columns=order)

df

You end up with a table like this. While not a full featured spreadsheet, it allows for a fair amount of manipulation of the data.


Building tests

It's nice when things work, but that isn't always the case, and in almost all cases the time spent preparing for failures pays for itself.

Pytest is a great tool for testing and I use it for most of the Python scripts I write. And while it is possible to run pytest in Colab, I normally wouldn't bother. The setup cost exceeds the value I get, so I just use exception handlers for critical code where I need to know the cause of a failure, and simple boolean checks to "test" the results of my code.

In this notebook I have loud errors for the three blockers a user might encounter: no security key, the API not responding, and the API not returning JSON. For cases where everything runs but the chart cannot be populated, I added a section called Test Results Dump.

Testing and reporting errors

The general form for testing software is Given, When, Expect. I don't remember where I learned that from but whenever I am writing tests, I keep those three words in that order in my head. 

If I am given a URL, when I call the URL with these values, I expect the following outcomes. To follow that practice I need to rewrite the code to extract some functions and break it into sections that can be executed individually. The section that calls the API and then runs the tests should be able to be executed multiple times with different inputs. And it should be possible to export the results in a way that could be attached to a bug tracking system.

Normally I would follow the Right-BICEP rules, but I am only going to test the three things that would break my use case, which is a script that automatically adds exceptions to my calendar.

  1. That there is a collection of items under "days" equal in length to the date range between from and to.
  2. That for any day, there are three items: Alternate Side Parking, Collections and Schools.
  3. That if schools are closed, trash pickup is delayed or suspended, or alternate side parking is suspended, there is a non-empty exception for the item.

Dumping test output

Executing the code in the Test Results Dump runs test code that inspects the returned JSON to see if it conforms with expectations. If a field is renamed or other data is added, one or more tests will fail and hopefully point to information that is useful in mitigating the issue.

## Dump Out Test Table
## This is a row by row test dump, run this if you don't get the pretty data report

print(f"from_date: {from_date}")
print(f"to_date: {to_date}")

print(test_not_empty(response.text))
print(test_not_empty(json_data))
print(test_date_range(json_data["days"], from_date, to_date))

# this should be a parameterized test
for day in json_data["days"]:
    print(day)
    for item in day["items"]:
        print(test_item(item))

This results in a blob of text that can be opened in an editor. While it isn't pretty, it can be read.

The bonus round

This is a list of items that may not be needed but that I think are worth mentioning.

Charts

Colab can make use of Matplotlib and Seaborn for charting. While not useful for me in this case, you can quickly create a chart like this.
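As a sketch of the kind of quick chart you can produce, here is a minimal Matplotlib bar chart; the counts are made up for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; in Colab figures display inline automatically
import matplotlib.pyplot as plt

# hypothetical counts of calendar exceptions per service type
types = ["Alternate Side Parking", "Collections", "Schools"]
exceptions = [4, 2, 3]

fig, ax = plt.subplots()
ax.bar(types, exceptions)
ax.set_ylabel("Exception days")
ax.set_title("Exceptions in the selected date range")
fig.savefig("exceptions.png")
```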




Working with Google Sheets

Colab can import and export from Google Sheets. I've used this feature to turn Colab into a report generation tool more than once. I've used it to add rows, columns and new workbooks as a way to keep metrics up to date.

The example I am using requires saving a JSON file with auth information to your Google Drive. If you'd rather not use that approach, I wouldn't blame you, as it's the equivalent of storing SSH keys in your Google Drive.

In those cases, look at the bottom of the data table for this button. It will walk you through creating a live interactive Google Sheet embedded in your notebook.




Using custom runtimes

If you use Google Cloud, you can set up a Compute Engine instance as your back end. This is useful in cases where you have a secure cloud environment and you would like to access its databases, APIs and other resources under the same security.

Running a local copy

Colab allows you to export a notebook to, and import a notebook from, GitHub. You can grab a copy of this example from here

Both Visual Studio Code and IntelliJ have Python plugins that will execute Python notebooks. You'll have to adjust the way you store keys and remove the Google specific libraries but the notebook would run as expected. 
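One way to smooth over the key-storage difference is a small shim. This is a sketch under the assumption that, outside Colab, the key lives in an environment variable of the same name:

```python
import os

def get_secret(name):
    """Read a secret from Colab's Secrets panel, falling back to an
    environment variable when running outside Colab."""
    try:
        from google.colab import userdata  # import only succeeds inside Colab
        return userdata.get(name)
    except ImportError:
        return os.environ[name]

# usage: subkey = get_secret("Ocp-Apim-Subscription-Key")
```

With this in place, the rest of the notebook can call `get_secret` and run unchanged in either environment.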

Hope this is useful.

If you found this useful and want to encourage me to write more, feel free to comment either here or wherever you found this.


