Counting Bees with YOLOv8 and Raspberry Pi Zero 2W: A TinyML Approach for Environmental Monitoring (2024)

Ever since I was a kid, I’ve been fascinated by bees. The way they buzz around collecting pollen and making honey is like a whole other world happening right before our eyes. But lately there’s been a real buzzkill: bee populations are declining, and that’s a huge problem for our environment. So, at the Federal University of Itajubá, a group of us got together to see if we could use some cool tech to help out our buzzing buddies.

The Bee-g Picture

Traditionally, keeping tabs on bee populations has been about as fun as watching paint dry. Picture this: someone sitting by a hive for hours, literally counting bees as they come and go. Talk about tedious and, let’s be real, prone to errors. That’s where our project comes in! We’re ditching the old-school methods and embracing the future with TinyML – a way to shrink down powerful machine learning models and run them on tiny devices. Our weapon of choice? The Raspberry Pi Zero 2W, a tiny computer with a whole lotta potential.

Why This Matters

Okay, so why should we even care about counting bees? Well, these little guys are like the unsung heroes of our ecosystem. They’re master pollinators, responsible for a third of the food we eat. No bees, no food – it’s as simple as that. But here’s the kicker: bees are super sensitive to changes in their environment. Pollution, pesticides, climate change, you name it – it affects them. By monitoring bee populations, we’re basically getting an early warning system for the health of our planet. Pretty neat, huh?

What We’ll Be Doing

This isn’t just some pie-in-the-sky idea. We’re gonna get our hands dirty and show you how to build your own bee-counting setup. We’ll walk you through everything: setting up the Raspberry Pi Zero 2W, hooking up a camera, and training a super-smart YOLOv8 model to spot those busy bees coming and going from the hive. We’ll even show you how to analyze the data and become a bona fide bee whisperer (well, sort of).

Building the Ultimate Bee-Counting Setup

Alright, let’s talk hardware. The Raspberry Pi Zero 2W is our tiny hero in this story. This little dude is packed with a surprising amount of power for its size. We’re talking a quad-core processor, wireless connectivity, and even a camera connector, all in a package smaller than a credit card!

Under the Hood

Here’s a breakdown of what makes the Raspberry Pi Zero 2W so awesome for this project:

  • CPU: This bad boy has a 1GHz quad-core 64-bit Arm Cortex-A53 processor. Translation: it’s got some serious processing power for such a small device.
  • RAM: 512MB of SDRAM. Modest by desktop standards, but enough to run our YOLOv8 model once we give it a hand with swap memory (more on that later).
  • Connectivity: It’s got built-in 2.4GHz Wi-Fi and Bluetooth 4.2, making it easy to connect to the internet and other devices.

And there’s more! The Raspberry Pi Zero 2W has a mini HDMI port so you can connect it to a display, a micro USB OTG port for data transfer, and a microSD card slot for storage. It even has a CSI-2 camera connector so we can hook up our trusty bee-watching camera. We’ll get to that in a bit.

Powering Up

To juice up our tiny powerhouse, we’ll be using a standard 5V/2.5A power supply. But here’s the cool part: we’re aiming to eventually power this whole setup using a compact solar station. Imagine that – a completely self-sufficient bee-monitoring system running on sunshine! How cool is that?
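For the curious, here’s a back-of-the-envelope sketch of what that solar station would need to supply. The wattage and sun-hour figures below are illustrative assumptions, not measurements from our setup:

```python
# Back-of-the-envelope sizing for a solar-powered Pi Zero 2W station.
# All wattage and sun-hour figures are illustrative assumptions.

def daily_energy_wh(avg_power_w: float, hours: float = 24.0) -> float:
    """Energy consumed per day, in watt-hours."""
    return avg_power_w * hours

def min_panel_watts(daily_wh: float, sun_hours: float,
                    system_efficiency: float = 0.7) -> float:
    """Smallest panel rating (W) that replaces daily_wh each day, given
    peak-sun hours and combined losses (controller, battery, wiring)."""
    return daily_wh / (sun_hours * system_efficiency)

if __name__ == "__main__":
    pi_watts = 1.5  # assumed average draw for Pi + camera with periodic inference
    need = daily_energy_wh(pi_watts)
    panel = min_panel_watts(need, sun_hours=4.0)
    print(f"~{need:.0f} Wh/day -> panel of at least {panel:.1f} W")
```

Even with these rough numbers, a small panel in the 15 to 20W range plus a modest battery looks comfortably sufficient, which is what makes the idea practical.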

Getting Our Software On

Now, let’s breathe some life into our Raspberry Pi Zero 2W. We’ll be using Raspberry Pi OS Lite (the 64-bit version) because it’s lean, mean, and plays nicely with YOLOv8. Don’t worry; we’ll walk you through the whole process.

Installing the Operating System

First things first, we gotta get our OS up and running. We’ll be using a nifty tool called Raspberry Pi Imager, which you can download for free. It’s super user-friendly and works on macOS, Windows, and Linux. Once you’ve got the imager, follow these steps:

  1. Download the Raspberry Pi OS Lite (64-bit) image.
  2. Connect a microSD card to your computer (at least 8GB is recommended).
  3. Open Raspberry Pi Imager, choose the OS image you downloaded and select your SD card.
  4. Before you flash the image, click on the little gear icon to configure some basic settings. Give your Raspberry Pi a cool hostname (like “beehive-one”), set a strong password, and enable SSH for remote access. Trust me, this will make your life a lot easier later on.
  5. Once you’re all set, hit that “Write” button and grab a snack. It’ll take a few minutes for the imager to do its thing.

Once the flashing is done, safely eject the SD card and pop it into your Raspberry Pi Zero 2W. Connect the power supply, and boom – you’re in business! But hold your horses; we’re just getting started.

Remote Control with SSH

Now, let’s tap into our inner hackers (the good kind, of course). We’ll be using SSH (Secure Shell) to remotely access our Raspberry Pi Zero 2W. This means we can control it from our main computer without needing a separate monitor and keyboard. It’s like magic, but with less smoke and mirrors.

First, you’ll need to find your Raspberry Pi’s IP address. Fire it up and type hostname -I in the terminal (ip addr show wlan0 also works; look for the inet line). The IP address will be right there.

Next, open up your favorite terminal application (I’m a big fan of iTerm2) and enter the following command, replacing “your_username” and “your_pi_ip_address” with your actual values:

ssh your_username@your_pi_ip_address

Hit enter, type in your password, and voila! You’re now connected to your Raspberry Pi Zero 2W. How cool is that?

Prepping for YOLOv8

Before we unleash the power of YOLOv8, we need to do a little housekeeping. First up, let’s make sure our system is up-to-date. In the Raspberry Pi’s terminal, run the following command:

sudo apt-get update

Next, we need to install pip, a handy tool for managing Python packages. Type in:

sudo apt install python3-pip -y
pip install -U pip

One heads-up: recent Raspberry Pi OS releases (Bookworm onwards) mark the system Python as “externally managed,” so pip may refuse to install packages globally. If you hit that error, either work inside a virtual environment (python3 -m venv) or add --break-system-packages to your pip commands.

Now, our Raspberry Pi is starting to look like a real machine-learning machine!

Supercharging with Swap

Here’s the thing about training machine learning models: they can be a bit memory-hungry. And while the Raspberry Pi Zero 2W has a decent amount of RAM, we’re gonna give it an extra boost using something called swap memory. Think of it like giving your computer a temporary memory upgrade.

Open the terminal and type htop to keep an eye on your system resources (if it’s not installed, sudo apt install htop will fix that). We’re going to temporarily disable the existing swap, increase its size, and then reactivate it:

sudo dphys-swapfile swapoff
sudo nano /etc/dphys-swapfile

In the opened file, find the line that says CONF_SWAPSIZE=100 and change it to CONF_SWAPSIZE=2000. Save the file and exit. Now, let’s apply these changes:

sudo dphys-swapfile setup
sudo dphys-swapfile swapon
sudo reboot

Once your Pi reboots, it’ll have a beefier swap space, ready to tackle those memory-intensive tasks. Sweet!
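If you’d rather verify the upgrade in code than eyeball htop, here’s a small sketch that parses /proc/meminfo directly (Linux-only; parse_meminfo is our own helper, not a library function):

```python
# Sanity-check RAM and swap after the reboot by reading /proc/meminfo
# (Linux-only). parse_meminfo is our own helper, not a library function.

def parse_meminfo(text: str) -> dict:
    """Map /proc/meminfo field names to their sizes in kB."""
    info = {}
    for line in text.splitlines():
        key, sep, rest = line.partition(":")
        parts = rest.split()
        if sep and parts and parts[0].isdigit():
            info[key.strip()] = int(parts[0])
    return info

if __name__ == "__main__":
    with open("/proc/meminfo") as f:
        mem = parse_meminfo(f.read())
    print(f"RAM:  {mem['MemTotal'] / 1024:.0f} MB")
    print(f"Swap: {mem['SwapTotal'] / 1024:.0f} MB")  # should now be near 2000 MB
```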

Lights, Camera, Action!

It’s time to give our Raspberry Pi Zero 2W the gift of sight! We’ll be using the Arducam OV5647, a nifty little 5MP camera module that’s perfect for our bee-counting needs.

Carefully connect the camera module to the CSI-2 connector on your Raspberry Pi. Once it’s securely connected, we need to enable the camera in the Raspberry Pi’s configuration. Open up the config.txt file:

sudo nano /boot/firmware/config.txt

Add the following line at the end of the file (and if a camera_auto_detect=1 line is present, change it to camera_auto_detect=0 so our manual overlay takes effect):

dtoverlay=ov5647,cam0

Save the file and reboot your Pi. Now, let’s make sure the camera is recognized:

libcamera-hello --list-cameras

If everything went smoothly, you should see your camera listed. Let’s take it for a spin!

rpicam-jpeg --output test.jpg --width 640 --height 480

Check your Pi’s storage – you should have a beautiful test image. Click!

File Transfer Fun with FileZilla

We’re going to be transferring files between our computer and the Raspberry Pi quite a bit, so let’s set up a smooth workflow. FileZilla, a free file-transfer client, is our friend here; it speaks SFTP, which runs over the same SSH connection we just enabled. Download and install it on your main computer.

Remember that IP address we found earlier? Open FileZilla, enter sftp://your_pi_ip_address as the host along with your username and password, and hit Quickconnect. You should now see your Pi’s file system. Easy peasy!

Unleashing the Power of YOLOv8

Alright, it’s time for the star of the show: YOLOv8! This state-of-the-art object detection algorithm is about to become our bee-counting secret weapon. Let’s get it installed on our Raspberry Pi.

First, let’s install Ultralytics, the awesome library that powers YOLOv8. In your Pi’s terminal, enter the following commands:

sudo apt update
pip install "ultralytics[export]"
sudo reboot

(The quotation marks around ultralytics[export] keep shells like zsh from mangling the square brackets.)

Once your Pi is back online, let’s test out YOLOv8. Create a new directory and run a quick inference test:

mkdir Documents/YOLO && cd Documents/YOLO
yolo predict model='yolov8n.pt' source='https://ultralytics.com/images/bus.jpg'

YOLOv8 will download the specified model (yolov8n in this case) and run inference on the image. You’ll find the results in the runs/detect directory. Pretty cool, right?

Going Turbo with NCNN

To squeeze every ounce of performance out of our Raspberry Pi, we’re going to convert our YOLOv8 model to NCNN format. NCNN is a super-fast inference engine that’s perfect for resource-constrained devices like ours.

Let’s convert the yolov8n model we downloaded earlier:

yolo export model=yolov8n.pt format=ncnn

Now, we have an optimized yolov8n_ncnn_model ready to rock! Let’s give it a test drive:

yolo predict model='./yolov8n_ncnn_model' source='bus.jpg'

You’ll notice that the inference is much faster with the NCNN model. Speed demons, unite!
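If you want to put a number on “much faster” for your own Pi, here’s a tiny, generic timing helper. mean_ms is our own utility, and the commented-out usage assumes the model and img variables we set up in the Python section later on:

```python
# A generic timing helper for comparing the .pt and NCNN models yourself.
# mean_ms is our own utility; `run` can be any zero-argument callable,
# e.g. lambda: model.predict('bus.jpg').
import time

def mean_ms(run, warmup: int = 2, reps: int = 10) -> float:
    """Average wall-clock time of run() in milliseconds over `reps` calls."""
    for _ in range(warmup):  # let lazy initialization and caches settle
        run()
    start = time.perf_counter()
    for _ in range(reps):
        run()
    return (time.perf_counter() - start) / reps * 1000.0

if __name__ == "__main__":
    # Trivial sanity check; on the Pi you'd time the two models instead.
    print(f"{mean_ms(lambda: sum(range(10_000))):.3f} ms")
```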

Demystifying YOLO

So, what makes YOLO so special? Let’s break it down:

YOLO (You Only Look Once)

YOLO is all about speed and efficiency. Unlike traditional object detection methods that involve multiple steps, YOLO processes the entire image in one go. This “single-shot” approach makes it incredibly fast, perfect for real-time applications like our bee counter.
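To make that post-processing concrete, here’s the overlap measure, IoU (intersection-over-union), that YOLO uses when deciding whether two detections are duplicates of the same object; we’ll meet it again as the iou= parameter. This is a from-scratch illustration, not Ultralytics code:

```python
# Illustration only (from scratch, not Ultralytics code): the IoU overlap
# measure behind the iou= threshold used during non-maximum suppression.
# Boxes are (x1, y1, x2, y2) with x2 > x1 and y2 > y1.

def iou(a, b) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # ~0.143: slight overlap, likely two bees
print(iou((0, 0, 2, 2), (0, 0, 2, 2)))  # 1.0: identical boxes, one gets suppressed
```

Detections whose IoU exceeds the threshold are treated as the same object and the lower-confidence box is dropped, which is why the threshold choice matters when bees cluster at the hive entrance.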

Evolution of a Champion

YOLO has come a long way since its inception. With each new version (from YOLOv1 to the latest and greatest), it has gotten faster, more accurate, and more efficient. We’re using YOLOv8, which strikes a great balance between speed and accuracy, making it ideal for our TinyML project.

A Model for Every Occasion

One of the coolest things about YOLO is its versatility. It’s used in a wide range of applications, from self-driving cars to security systems to, you guessed it, counting bees! Its ability to accurately detect objects in real-time makes it a true game-changer.

Playing with YOLO in Python

Time to get our hands dirty with some Python! We’ll start by exploring YOLOv8 in the interactive Python interpreter. Open your Pi’s terminal and type python3 to enter the interpreter. Now, let’s import YOLO and load our NCNN model:

from ultralytics import YOLO
model = YOLO('yolov8n_ncnn_model')

Let’s run inference on our trusty bus image:

img = 'bus.jpg'
result = model.predict(img, save=True, imgsz=640, conf=0.5, iou=0.3)

YOLOv8 will work its magic, and you’ll get a detailed result object containing bounding boxes, confidence scores, and other juicy details. We can even access the inference time:

inference_time = int(result[0].speed['inference'])
print(f"Inference Time: {inference_time} ms")

Wow, that was fast! Now, let’s take it up a notch and create a simple Python script to automate the inference process.

Creating a YOLOv8 Inference Script

Open up your favorite text editor on your Pi (I like Nano) and create a new file called yolov8_tests.py. Copy and paste the following code:

from ultralytics import YOLO

model = YOLO('yolov8n_ncnn_model')
img = 'bus.jpg'
result = model.predict(img, save=False, imgsz=640, conf=0.5, iou=0.3)

inference_time = int(result[0].speed['inference'])
print(f"Inference Time: {inference_time} ms")
print(f'Number of objects: {len(result[0].boxes.cls)}')

Save the file and run the script:

python3 yolov8_tests.py

You should see the inference time and the number of objects detected in the image. Boom! We’ve got ourselves a working YOLOv8 inference script.