Visualizing AI Garden data using the Google Home Hub

It has been about a year since my last blog post about the AI Garden, the chatbot that helps you take care of your plants. Now I would like to tell you about my latest additions to this project, visualizing garden data using a Google Home Hub smart speaker.

AI Garden Google Home Hub

And in case you missed it: the Google Home Hub is a smart speaker with a built-in display. Unfortunately, it is not yet available in all countries.

Updated architecture

Adding a visual interface, instead of a chat-only conversation, of course gives the interaction a whole new dimension. But before we get to the visualization part, I would like to discuss the changes to the overall setup. To summarize: I am still using the Flower Care Smart Monitor connected to a Raspberry Pi that acts as a data hub, Dialogflow as the chatbot service, and Amazon Web Services (AWS) for the cloud infrastructure.

What has changed is the underlying cloud architecture. Instead of having many different types of cloud components working together, I simplified the setup to only the components needed to keep the latency between AWS Lambda and Dialogflow low, resulting in much faster response times. The components now in use are: AWS DynamoDB, AWS Lambda, and AWS API Gateway.
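The request path is deliberately short: Dialogflow calls an API Gateway endpoint, which invokes a Lambda function that looks up the latest sensor readings. A minimal sketch of such a fulfillment handler is below; the intent name, plant id, and sensor fields are assumptions for illustration, and the DynamoDB lookup is stubbed so the sketch stays self-contained.

```python
import json

def latest_reading(plant_id):
    # In the real function this would come from DynamoDB via boto3,
    # e.g. boto3.resource("dynamodb").Table(...).get_item(...).
    # Stubbed sensor values for this sketch:
    return {"moisture": 41, "temperature": 21.5}

def lambda_handler(event, context):
    """Dialogflow v2 fulfillment webhook behind API Gateway."""
    body = json.loads(event["body"])
    intent = body["queryResult"]["intent"]["displayName"]

    if intent == "get_moisture":  # intent name is an assumption
        reading = latest_reading("basil")
        text = f"Soil moisture is at {reading['moisture']}% right now."
    else:
        text = "Sorry, I don't know that one yet."

    # API Gateway proxy integration expects statusCode + string body
    return {
        "statusCode": 200,
        "body": json.dumps({"fulfillmentText": text}),
    }
```

Everything else (sessions, contexts, the other intents) is handled the same way: one Lambda, one table, one endpoint.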

Visualizing intents

The intents from the original chatbot have stayed mostly the same and are now automatically visualized on the screen of the Google Home Hub. To make them more appealing, I added some visual flair by creating cards and adding pictures. Another nice improvement is the use of emojis, which makes the whole interaction friendlier and more human. Check out the video below to see the new version in action:
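In the webhook, a card boils down to extending the response with an Actions on Google rich-response payload: a simpleResponse for the spoken part plus a basicCard with a picture. The sketch below follows that payload format as I recall it; the title, text, and image URL are placeholders.

```python
def card_response(speech, title, image_url):
    """Build a Dialogflow response that renders a card on screen devices."""
    return {
        "fulfillmentText": speech,
        "payload": {
            "google": {
                "expectUserResponse": True,
                "richResponse": {
                    "items": [
                        # Spoken (and displayed) part of the answer
                        {"simpleResponse": {"textToSpeech": speech}},
                        # The visual card with a picture
                        {
                            "basicCard": {
                                "title": title,
                                "image": {
                                    "url": image_url,
                                    "accessibilityText": title,
                                },
                            }
                        },
                    ]
                },
            }
        },
    }

# Emojis in the text make the reply feel friendlier on screen:
reply = card_response(
    "Your basil is doing great! 🌱",
    "Basil",
    "https://example.com/basil.jpg",  # placeholder image URL
)
```

Devices without a screen simply fall back to the spoken simpleResponse, so the same intent works on a regular Google Home as well.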

Chat to your garden with AI Garden

Showing custom content

One of the challenges was displaying custom content on the Google Home Hub screen. Google has not implemented a clean way to display your own content, because it is not part of the Google Actions SDK functionality. The solution our innovation team came up with was to force a screen onto the Google Home Hub by casting it with a Python script, which essentially turns the Google Home Hub into a glorified web viewer. For example, using this approach we can now show a graph that displays the plant data in real time.

Google Home Hub AI Garden
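Since the cast target is just a web page, the Raspberry Pi can serve the dashboard itself and the Python script only has to point the Home Hub at its URL. Below is a sketch of the serving side using only the standard library; the page content and markup are invented stand-ins for the real-time graph.

```python
import http.server
import threading

PAGE = b"""<!doctype html>
<title>AI Garden</title>
<h1>Soil moisture</h1>
<p id="moisture">41%</p>
"""

class DashboardHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the dashboard page for every path; a real version would
        # render a live graph fed by the sensor readings.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the console quiet

def start_dashboard(port=0):
    """Start the dashboard server in a background thread.

    Returns (server, bound_port); port=0 lets the OS pick a free port.
    """
    server = http.server.ThreadingHTTPServer(("127.0.0.1", port), DashboardHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

The casting script then loads this URL on the device, for example via pychromecast's DashCast controller (`load_url("http://<pi-address>:<port>/")`); the exact controller call is from memory, so treat it as a pointer rather than gospel.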

Future improvements

One of the things I would like to add is a prediction model that gives you insight into when to water the plants. With this, the AI Garden can notify you when the plants need water or the soil needs fertilizer. Of course, this can also be visualized on the screen, so you can see upcoming actions at a glance.

Want to know more, or stay in touch with our innovations? Follow our blog, and if you have specific questions about this case, please get in touch.