To share the images that have a subject of interest, we created a page with a gallery on it. It includes a helpful description of the setup, the robots used, and more. We decided to start simple. In the future, we will need a better way of organising this so the tags can be seen. You can see the page here: Bowie Imagery.
Today we took a look at some of the old imagery data from Bowie, from Summer 2018. There is still work to be done on this, such as going through all the photos and tagging the ones that include a subject of interest. Two robots were collecting imagery data that season. To start, we worked on a dataset from OG Bowie’s groundcam: 09-08-2018_11-09 (2pm field test). This was Sept. 8th, 2018 – we had a Field Test, then the beach closing party at Westboro Beach.
65 images were tagged out of a total of 2299. The items included bark, cigarette butts, feather, glass, leaf, misc, pebbles, plastic, seaweed, stick, stone, twig, and twigs.
Check out the video. (Not sure why the link doesn’t work properly, but the video is also embedded above.)
Now we need to combine the test scripts we’ve been writing with the Bowie API. The Bowie API takes a payload sent over serial from the microcontroller to the Raspberry Pi. It’s lightweight – just two key-value pairs and some categories to help organise it. Here is how the messages are sent. Here is a look at the API, though it has evolved since then.
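The exact wire format isn’t reproduced in this post, so here is a hedged sketch of what decoding a payload on the Pi side could look like, assuming a comma-separated string of `key:value` pairs (the field names and the `:`/`,` delimiters are illustrative, not the actual Bowie API format):

```python
# Hypothetical sketch of decoding a Bowie-style serial payload on the Pi.
# Assumed format: comma-separated key:value pairs, e.g. "left:120,right:118".
def parse_payload(raw: str) -> dict:
    """Split a serial payload into a dict of key -> value strings."""
    pairs = {}
    for chunk in raw.strip().split(","):
        if ":" in chunk:
            key, value = chunk.split(":", 1)
            pairs[key.strip()] = value.strip()
    return pairs
```

For example, `parse_payload("left:120,right:118")` returns `{"left": "120", "right": "118"}`.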
Something we got stuck on was formatting the JSON dictionary properly for the updates. This is an example of what the JSON dictionary could look like. After some fiddling with the braces and quotes, it worked out. The value has to be formatted as a string.
Integration time! Time to run the script on the Raspberry Pi. It was cool to boot up this Raspberry Pi again – it was entirely unchanged from Summer 2019. It will be fun to revisit some of those scripts at some point, but not right now. We installed the Python AWS IoT SDK.
Dun dun dun … and now a huge bug. We weren’t receiving serial data from the Teensy 3.6 on the Raspberry Pi. To debug it, we checked that the function was being called on the Teensy. We made a simple test script that only prints out the data received on the Pi. We checked the port configuration, and Serial was enabled. We rebooted it. The other thing to check was whether the data was coming out of the Teensy at all. We tried connecting it to an FTDI adapter, but the computer’s driver issues were still there, and rebooting was not an option to make it work. So it was off to search endless forum posts for any information or things to try…
The data is being received from Teensy to Pi!
Data on AWS IoT!
The bug ended up being the name of the serial port in the RPi code. Not entirely sure why this would have changed! But we had to change the serial port in the code from /dev/ttyAMA0 to /dev/ttyS0.
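A minimal sketch of that print-only test script, with the corrected port name. This assumes pyserial on the Pi, and the 115200 baud rate is an assumption, not taken from the original post:

```python
def decode_line(raw: bytes) -> str:
    """Decode one line of serial bytes, tolerating any stray bytes."""
    return raw.decode("utf-8", errors="replace").strip()

def main(port: str = "/dev/ttyS0", baud: int = 115200):
    # pyserial is imported here so the helper above can be used standalone.
    import serial
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = decode_line(ser.readline())
            if line:
                print(line)

if __name__ == "__main__":
    main()
```

On this Pi the port had to be `/dev/ttyS0`; if nothing prints, the port name is the first thing to double-check.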
Carrying on from the previous tech log. The example we are using also uses a Lambda function to send a log note. In the future, Lambda functions can be used as “glue” to pretty much anything. Something to explore when there’s more time available is SNS rules. These would let us send an email when a value hits a certain threshold. Here is a cool example of this.
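For a feel of the shape of this, here is a hedged sketch of a log-note Lambda handler. The IoT rule delivers the device payload as the `event`; the `battery_voltage` field and the 11.0 V threshold are made-up placeholders, hinting at what an SNS-style alert could key on:

```python
# Sketch of a Lambda handler that logs the incoming IoT payload and
# flags a (hypothetical) low-battery condition.
def lambda_handler(event, context):
    print("Received from Bowie:", event)
    voltage = float(event.get("battery_voltage", 0))
    low = bool(voltage and voltage < 11.0)  # assumed threshold
    if low:
        print("Battery low!")
    return {"status": "logged", "low_battery": low}
```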
We accomplished a lot of things in this work session:
Look at the data! It’s there
Another look at the data
Just playing around. Doing this doesn’t work
The delta updates are useful, so we don’t get the entire JSON each time
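To illustrate why the delta is handy: the Shadow’s delta document contains only the keys where `desired` differs from `reported`, not the whole JSON, and it arrives on the shadow’s `update/delta` topic. This little helper mimics that behaviour locally (field names are illustrative):

```python
# Mimic an AWS IoT Shadow delta locally: keep only the keys where
# the desired state differs from the reported state.
def shadow_delta(desired: dict, reported: dict) -> dict:
    return {k: v for k, v in desired.items() if reported.get(k) != v}
```

For example, `shadow_delta({"led": "on", "mode": "auto"}, {"led": "off", "mode": "auto"})` yields just `{"led": "on"}`.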
We played with figuring out what Shadow can and cannot do. We looked at the pros and cons of both:
We will be going the Shadow route, because it gives us the capability to use Lambda functions, and the persistent store of the data could be useful in the future. Tech logs #004 and #005 blur together a little bit, because a late night was spent tinkering with this. With sleep deprivation in full swing, we hope some of this is coherent.
Some of this work session was spent finding the documentation. We were trying to find more information about Shadow updates vs vanilla MQTT. All of the AWS IoT SDKs can be found here. If you’re reading “SDK” for the first time, it means software development kit. SDKs usually consist of libraries, example code, and helper functions, so that you can quickly get going on whatever you need to build with that particular service.
It was time to get connected with AWS IoT. This is going to be fun! We are going to connect Bowie’s brain to the cloud. We have done some of this previously, such as connecting Bowie to shiftr.io MQTT. That works great, but in order to fully fulfil our vision of the robots working together in teams, we will need the data to be connected to many different services.
Starting can be intimidating, especially if you aren’t familiar with AWS IoT. Here is the tutorial that we followed step by step. There were some errors to fix along the way, but it makes sense once you start un-puzzling it. Going through it step by step was immensely helpful for understanding. Big thanks to soumilshah1995 on YouTube for making that tutorial.
Getting closer! It’s partially updating, but not quite receiving the data properly
After a few edits, now it is working!
We see the data that we’re sending has been received!
And it’s being called as a Lambda function!
Also receiving a verification through Python
After many hours debugging and tinkering, we made it work. We are seeing data in the AWS cloud! A step closer to getting Bowie connected. Next tech log will make further progress on this.