I was disappointed when Home Assistant removed the BOM integration. Whilst I don’t fault their reasons, and wish that BOM provided a proper API, it was still an unfortunate outcome. The default weather integration uses data that’s too generalised and thus lacks accuracy. This is typical of all weather applications that don’t source their information from the Bureau directly. Fortunately, part of my problem was solved via HACS and bremor’s bureau_of_meteorology plugin, which provides a weather component that sources its data from BOM. The only thing left was to display a radar image on the dashboard. Initially I simply created a Generic Camera with an image URL, which met my needs, but the lack of a historical timelapse began to wear on me. That was what was missing, and I took it upon myself to remedy it.
Up to this point I would have called myself a skilled Python reader, not so much a writer, but it is just a language and, aside from its nuances, no different from Ruby (my career-building language). The easy win would have been to take the previous integration and extract the useful bits and pieces. But I started thinking about making the solution broader, as I also wanted a timelapse of some traffic camera feeds. These feeds are useful for glancing at traffic along major roads near my home and are updated every minute. The difficult thing about inspecting traffic this way was that each image was a snapshot in time, not a replay. So that was what I aimed to achieve: capture these images one by one and play them back.
My first attempt was to create a Home Assistant Add-on and use the existing integrations to glue it all together. I went down the rabbit hole of researching ffmpeg and ended up building an HTTP proxy service in Go, but was foiled by my lack of understanding of how the Generic Camera integration actually works. At first I thought it captured and stored the image, periodically updating it, but I was wrong. It simply proxies the request, holding the image in memory. This busted my proxy concept, because it meant I would have to register the image somehow and periodically capture it. That felt like double-handling, and thus more work to maintain. Back to the drawing board.
I was going to have to get my hands dirty with Python to make what I wanted happen. I began to dissect several camera components, as well as HA’s frontend, to understand how images and videos were being displayed. This led me to learn about MJPEG, specifically the MJPEG-over-HTTP variety. The basic premise is to keep the HTTP connection open whilst sending individual JPEG images as parts of a multipart/x-mixed-replace response. It’s simple and easy, and Home Assistant already does this. So all I needed was to periodically capture and store images, then iterate and push each image during playback. The MJPEG IP Camera integration was a good reference for the playback portion. It exposed the handle_async_mjpeg_stream callback, which is key to creating an animation on the frontend.
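To illustrate the idea outside of Home Assistant, here’s a minimal sketch of MJPEG over HTTP using aiohttp (the web framework HA itself is built on). The boundary name, the frames directory, and the playback rate are my own illustrative assumptions, not the plugin’s actual code:

```python
import asyncio
from pathlib import Path

from aiohttp import web

BOUNDARY = "frame"  # arbitrary multipart boundary name


def frames():
    # Hypothetical frame source: JPEGs captured earlier into a directory.
    for path in sorted(Path("frames").glob("*.jpg")):
        yield path.read_bytes()


async def mjpeg_handler(request: web.Request) -> web.StreamResponse:
    # Keep the connection open and replace the image, part by part.
    response = web.StreamResponse(
        status=200,
        headers={"Content-Type": f"multipart/x-mixed-replace; boundary={BOUNDARY}"},
    )
    await response.prepare(request)
    for jpeg in frames():
        await response.write(
            f"--{BOUNDARY}\r\nContent-Type: image/jpeg\r\n"
            f"Content-Length: {len(jpeg)}\r\n\r\n".encode()
        )
        await response.write(jpeg + b"\r\n")
        await asyncio.sleep(0.5)  # playback speed: two frames per second
    return response


app = web.Application()
app.router.add_get("/stream", mjpeg_handler)

if __name__ == "__main__":
    web.run_app(app)  # then open http://localhost:8080/stream in a browser
```

Pointing a browser (or an HA Generic Camera) at the stream URL plays the frames back as an animation, which is essentially what handle_async_mjpeg_stream does inside Home Assistant.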
My biggest remaining issue was the periodic capture. I began perusing other components in the HA codebase to see if I could find a solution, and eventually came across async_track_time_interval. This was the final piece of the puzzle I needed to complete my plugin. After some trial and error I began to see things spring to life.
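For context, async_track_time_interval is a Home Assistant helper that invokes a callback on a fixed schedule and returns a function for cancelling it. Below is a hedged sketch of how it can drive a rolling frame buffer; the class, the fetch_image callable, and the frame cap are illustrative assumptions rather than my plugin’s actual internals:

```python
from datetime import datetime, timedelta

from homeassistant.core import HomeAssistant
from homeassistant.helpers.event import async_track_time_interval

CAPTURE_INTERVAL = timedelta(minutes=1)  # matches the camera's update rate
MAX_FRAMES = 60  # keep roughly an hour of minutely frames


class TimelapseBuffer:
    """Illustrative fragment: capture JPEGs on a timer for later playback."""

    def __init__(self, hass: HomeAssistant, fetch_image):
        self.hass = hass
        self._fetch_image = fetch_image  # hypothetical async JPEG fetcher
        self._frames: list[bytes] = []
        # Schedule _capture every CAPTURE_INTERVAL; keep the returned
        # callback so the timer can be cancelled on teardown.
        self._unsub_timer = async_track_time_interval(
            hass, self._capture, CAPTURE_INTERVAL
        )

    async def _capture(self, now: datetime) -> None:
        """Fetch the latest image and append it to the rolling buffer."""
        self._frames.append(await self._fetch_image())
        if len(self._frames) > MAX_FRAMES:
            self._frames.pop(0)  # drop the oldest frame

    def stop(self) -> None:
        """Cancel the scheduled captures."""
        self._unsub_timer()
```

The buffered frames can then be iterated by the MJPEG stream handler shown earlier to produce the timelapse playback.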
The end result is exactly what I wanted, with seamless integration into HA’s frontend and dashboard. If you’re interested, please give my plugin a go. And if you discover an issue, please let me know.