
My Waste My Future (Part 2)

Systems come, systems go, automation is forever

✍️ Jacob Mulquin
📅 25/01/2022

A few weeks ago, I woke up and looked at the Raspberry Pi on my wall. I was surprised to see the LEDs were all dim. Thoughts went through my head as to what could be wrong.

Turns out, the duplicitous site wollongongwaste.net.au has laid its head in a coffin and its remnants have merged into wollongongwaste.com.au. The link now redirects to a booking page. I was saddened for a moment, thinking that all the time spent figuring things out was for nothing, and that my Pi would lose functionality. All that effort for only a few weeks of pay-off.

However, after some investigation the saddened became gladdened. It turns out the new system for presenting calendars to end users is far easier to read.

Enter, Impact Apps

It appears that Remondis are using a SaaS offering by Impact Apps to inform residents of waste management schedules. It's basically a calendar & information brochure app. I don't believe that services can be booked through it. Impact are used by numerous councils in NSW.

The specific URL for Wollongong's calendar is found here. Loading this page in a browser greets the user with a lovely modal offering to search for a suburb. The workflow is still the same as before: a user cannot simply type their address. They must select their suburb, then street, then house number.

For the purposes of this article, I will pretend that I live at The Old Court House, Wollongong.

The API requests are as follows:

You beauty: /api/v1 in the URL and JSON output. This is heavenly compared to before.

Out with the Scraper, In with the Parser

Since the data is now graciously provided as JSON, there's no need to download all that superfluous HTML. With the ability to set the entire year as a query string, the number of times my script needs to hit the web server has been drastically reduced. Technically it only needs to be run once a year, but I will fetch it once a week in order to pick up any special events.
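The post doesn't reproduce the exact request, but the fetch boils down to a single GET with the date range in the query string. The endpoint URL and parameter names below are placeholders of my own, not the documented API, so substitute whatever shows up in the browser's network tab:

import requests

# Placeholder endpoint: the real /api/v1 URL for Wollongong's calendar
# isn't reproduced here.
BASE_URL = "https://example.com/api/v1/collections"

def fetch_content(start_date, end_date):
    """Fetch a whole period of collection events as JSON text."""
    response = requests.get(
        BASE_URL,
        params={"start": start_date, "end": end_date},  # assumed parameter names
        timeout=30,
    )
    response.raise_for_status()
    return response.text

# One request covers the entire year.
fetch = fetch_content("2022-01-01", "2022-12-31")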

From reading this API, I can tell it has been designed specifically to provide data to the JavaScript that powers the Impact Apps calendar; this is evident from attributes like startHalfDay, color, textColor and borderColor. To cut down on the amount of data sent to the client, the API uses the dow and daysOfWeek attributes to define recurring events. Since my red (waste) and green (organic) bins are collected weekly, those events use this recurring logic, while the yellow (recycling) bin is collected fortnightly, so each of its pickups is listed as a separate event, as are special events.

[
  {
    "property": {
      "collection_day": 2,
      "collection_day_2": null,
      "zone": "north",
      "shs": null,
      "bin_bank_id": null,
      "collections": []
    },
    "color": "#EF3340",
    "textColor": "#EF3340",
    "borderColor": "#000",
    "dow": [
      2
    ],
    "daysOfWeek": [
      2
    ],
    "start_date": "2020-12-29",
    "event_type": "waste"
  },
  {
    "startHalfDay": true,
    "color": "#44D62C",
    "textColor": "#44D62C",
    "borderColor": "#000",
    "dow": [
      2
    ],
    "daysOfWeek": [
      2
    ],
    "start_date": "2020-12-29",
    "event_type": "organic"
  },
  {
    "color": "#FEDB00",
    "textColor": "#FEDB00",
    "borderColor": "#000",
    "start": "2020-12-29",
    "event_type": "recycle"
  },
  { "snip": "snip" }
]

The recurring logic needs to be expanded out into concrete dates so that a series of bin_objects can be created. These are then combined with the existing one-off event objects provided by the rest of the API.

To make sure there are no fiddly problems at the start of the year, the script looks a few days before the start date and a few days after the end date.

import json
from datetime import datetime, timedelta

# start_date and end_date are defined elsewhere in the script as
# "YYYY-MM-DD" strings covering the requested period, padded by a few
# days either side of the year boundary.

def transform_content(fetch):
    raw = json.loads(fetch)
    bins = []

    start_date_obj = datetime.strptime(start_date, "%Y-%m-%d")
    end_date_obj = datetime.strptime(end_date, "%Y-%m-%d")
    days_between = (end_date_obj - start_date_obj).days
    weeks = days_between // 7

    for entry in raw:
        if "dow" in entry:
            # Recurring event: expand it into one entry per week.
            # Walk back to the Monday of the start week, then forward
            # to the weekday the API specifies.
            start_date_monday = start_date_obj + timedelta(days=-start_date_obj.weekday())
            start_date_bin = start_date_monday + timedelta(days=entry['dow'][0] - 1)

            for i in range(weeks + 2):
                date_bin = start_date_bin + timedelta(days=i * 7)
                if date_bin > end_date_obj or date_bin < start_date_obj:
                    continue

                bin = bin_object(entry, date_bin)
                bins.append(bin)
        else:
            # One-off event (fortnightly recycling or a special event):
            # the API gives an explicit start date.
            this_date = entry['start']
            this_date_obj = datetime.strptime(this_date, "%Y-%m-%d")

            bin = bin_object(entry, this_date_obj)
            bins.append(bin)

    return bins

This results in a nice set of objects like so:

[
    {
        "date": "2021-12-17",
        "timestamp": "1639659600",
        "colour": "red",
        "type": "waste",
        "name": "waste"
    },
    {
        "date": "2021-12-17",
        "timestamp": "1639659600",
        "colour": "green",
        "type": "organic",
        "name": "organic"
    },
    {
        "date": "2022-01-07",
        "timestamp": "1641474000",
        "colour": "yellow",
        "type": "recycle",
        "name": "recycle"
    },
    {
        "date": "2022-01-17",
        "timestamp": "1642338000",
        "colour": "purple",
        "type": "special",
        "name": "UOW Goodwill Hunting Drop Off"
    }
]

The ordering of the objects is a bit funky, but it doesn't really matter because we aren't displaying these dates to anybody.
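The bin_object helper isn't shown in this post. A minimal sketch that would produce objects in the shape above might look like the following; the colour mapping is inferred from the example output, and the name handling for special events is an assumption on my part:

# Colour per event type, inferred from the example output above.
EVENT_COLOURS = {
    "waste": "red",
    "organic": "green",
    "recycle": "yellow",
    "special": "purple",
}

def bin_object(entry, date_obj):
    """Build a flat bin entry from an API event and a concrete date."""
    event_type = entry.get("event_type", "special")
    return {
        "date": date_obj.strftime("%Y-%m-%d"),
        # Naive datetime interpreted as local (Sydney) time, matching the
        # midnight timestamps shown above.
        "timestamp": str(int(date_obj.timestamp())),
        "colour": EVENT_COLOURS.get(event_type, "purple"),
        "type": event_type,
        # Special events carry their own name; regular bins reuse the type.
        "name": entry.get("name", event_type),
    }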

Now that I have a nice set of objects to work with (with some beautiful timestamps), it's quite easy to loop through the array and grab only the objects that are due for collection in the next six days.

# today_timestamp and today_plus_six_timestamp are epoch timestamps for
# midnight today and midnight six days from now (see below).
upcoming = {}
for bin in bins:

    # Skip anything already collected or more than six days away.
    if int(bin['timestamp']) < int(today_timestamp):
        continue
    if int(bin['timestamp']) > int(today_plus_six_timestamp):
        continue

    if bin['colour'] not in upcoming:
        upcoming[bin['colour']] = []

    upcoming[bin['colour']].append(bin['timestamp'])
    upcoming[bin['colour']].sort()
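The two boundary timestamps aren't defined in the snippet above. One way they could be computed, consistent with the midnight-local-time timestamps used earlier, is the following (my assumption, not necessarily the original code):

from datetime import datetime, timedelta

# Midnight today and midnight six days from now, as epoch timestamps.
today = datetime.now().replace(hour=0, minute=0, second=0, microsecond=0)
today_timestamp = int(today.timestamp())
today_plus_six_timestamp = int((today + timedelta(days=6)).timestamp())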

I append the timestamp to the upcoming array because I eventually want an animated pulse on the day of the event.

The code to light up the LEDs is much the same as it was in the previous article.

Before I run the code, I need to run a command to stop the Kano from doing its CPU monitor animation: kano-speakerleds cpu-monitor stop

I ran the code and uh oh: requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)

Honestly, I'm lazy and this isn't mission critical, so I'll be throwing a verify=False onto the requests.get method. (Learn from my laziness, don't do this)
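In the hypothetical fetch function sketched earlier, that quick-and-dirty fix amounts to a single extra argument (again: disabling certificate verification is an insecure shortcut):

response = requests.get(
    BASE_URL,
    params={"start": start_date, "end": end_date},
    timeout=30,
    verify=False,  # skips certificate verification; will also log a warning
)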

Hoorah, we have over-engineered LEDs once again!

Prognosis and what's next

I'm happy with how it turned out; it's nice to know that I'm getting the same results as before with a fraction of the HTTP requests.

I awoke one morning and the poor Pi was dangling by its USB cord. I'm glad I chose a cord with a sturdy connection!

The Raspberry Pi clinging on for life by its USB cord

So, a thoroughly exorbitant amount of mounting tape later, it is tightly secured back on the tiles.

I would eventually like to add the pulsing animation but I need to figure out how to do it without sleep or similar.
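One approach I'm considering (sketched below, not yet tested on the Pi) is to derive brightness from the current time instead of blocking with sleep, so the main loop stays free to do other work; set_led_brightness here is a stand-in for whatever the real LED call ends up being:

import math
import time

def pulse_brightness(period=2.0):
    """Brightness between 0 and 1 following a sine wave over `period` seconds."""
    phase = (time.time() % period) / period
    return (math.sin(phase * 2 * math.pi) + 1) / 2

# Inside the main loop, no sleep() required:
#     if collection_is_today:
#         set_led_brightness(colour, pulse_brightness())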