Pan-tilt camera on a Raspberry Pi

This is part of my series on learning to build an End-to-End Analytics Platform project.

TLDR; We removed the Sense HAT and assembled the Pimoroni pan-tilt hat with a NoIR Pi Camera. We tested the Pimoroni pan-tilt hat library and got the Pi movin’ and shakin’.

What are we trying to achieve though? 🤔

The plan is to be able to use computer vision to detect objects or events, send events based on detection, process the event, then send an event or command back down to the device to perform an action. We’ll start with local development, then explore the cloud in future posts. Yup, the end-to-end analytics platform project is growing in scope, and that’s okay.

Lights, camera, action! 🎥

Now, to detect, capture, and even track objects with a computer vision solution, we need something that can ‘see’ and ‘move’. Pimoroni has a pan-tilt hat to ‘move’, with servos to pan (x-axis) and tilt (y-axis) the mounted camera. We also have a NoIR Pi Camera to help ‘see’. NoIR means No Infrared filter, so the sensor picks up infrared light. Why not the normal one? This one basically has night vision. Case closed.

A Raspberry Pi 4 with a disassembled pan-tilt hat and camera.
Compute, uh, vision? 😁

After removing the SenseHAT we used before (which we can see in the background), we can use the Pimoroni assembly guide to get the new hat set up. One thing we don’t have is the NeoPixel stick (light), which we don’t need anyway.

Interesting point 💡
Pan-Tilt HAT is a two-channel servo driver designed to control a tiny servo-powered Pan/Tilt assembly. - Pimoroni pan-tilt hat Github repo
What's a servo? A servomotor (or servo motor) is a rotary actuator or linear actuator that allows for precise control of angular or linear position, velocity and acceleration. It consists of a suitable motor coupled to a sensor for position feedback. It also requires a relatively sophisticated controller, often a dedicated module designed specifically for use with servomotors - Wikipedia

We’re focused on getting the pan-tilt working, not the camera. We’ll set up the camera when we do the object detection, image capture, etc. The kind maintainers of the Pimoroni pan-tilt hat GitHub repo have graciously bestowed upon us a curl command to install everything we need.

curl https://get.pimoroni.com/pantilthat | bash
A command terminal with informational messages.
Wait… “may explode”? 💥

Because we already enabled I2C using the raspi-config tool, the setup noticed that and printed out ‘I2C Already Enabled’.
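If you still need to enable it, raspi-config has an interactive menu for it, and (if I have the nonint convention right, where 0 means enable) a non-interactive option too:

sudo raspi-config nonint do_i2c 0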

A command terminal with informational messages.
Fully prepared 👍

We’re going to opt for the full install so that we can grab the examples and docs for the future. You know, just in case 😏.

A command terminal with informational messages.
Joyful exclamation ❗

Installation done, let’s turn the key 🗝️ and see if this beauty starts.

Start up Python in the VS Code bash terminal and import the pantilthat library. The documentation has a few methods we can try out. We’ll start simple, using the pan() and tilt() methods and passing in angles within the allowed range.
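A minimal sketch of that session (the angle values are just examples; the library takes degrees in the -90 to 90 range):

import pantilthat

pantilthat.pan(45)   # rotate the pan servo (x-axis) to 45 degrees
pantilthat.tilt(-20) # tip the tilt servo (y-axis) back 20 degrees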

A command terminal running Python 3 interpreter displaying code with informational messages.
One hop this time 🕺

Great! It works. Playing around a little we can see the angle changing.

A Raspberry Pi 4 with an assembled pan-tilt hat and camera.
Hi WALL-E! 🤖

Cue music for interpretive machine dancing through numerous function calls… 🤣

All done! Before we close out, the documentation suggests it’s a good idea to turn off the servo drive signal to save power if we don’t need it to actively point somewhere, using the pantilthat.servo_enable(index, state) function. So we reset the servos to their original position and used that function to disable both servos (sketched below). Now to shut down the Pi and think about the next challenge: getting the camera feed working, then on to object detection and tracking. I’d like to see if we can get the solution working with a Python venv though.
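In Python, that wind-down looks something like this (assuming servo indexes 1 and 2 map to the pan and tilt channels):

import pantilthat

pantilthat.pan(0)                 # reset pan to centre
pantilthat.tilt(0)                # reset tilt to centre
pantilthat.servo_enable(1, False) # turn off the pan servo drive signal
pantilthat.servo_enable(2, False) # turn off the tilt servo drive signal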

sudo halt

For fun 🎈

While perusing the documentation, I noticed a note on displaying the Pi’s pinout in the bash terminal, which I thought was nifty. Give it a try…

pinout
A command terminal with a Raspberry Pi 4 drawing and information displayed in the terminal.
Whaaaat! 🤯 Pi in bash!!

Summary

It’s been a quick post. Wrapping it up, we got our new pan-tilt hat installed, working, and dancing, which brought a smile to my face. There’s heaps still to learn: I2C (the I2C protocol), pinouts, and so much more. Most of it is new to me too.

Quick note 📝
A massive thank you to the many people who put their time and effort into projects, like the Pimoroni pan-tilt hat repo, which make things significantly easier for all of us.

We’ll work on the vision part in the next post: cameras, image capture, and even object detection. Once we have that done, we’re going to start working on connecting the device to the cloud.

Until next time.

🐜

End-to-End Analytics Platform – Bicep What-If deployment

This is part of my series on learning to build an End-to-End Analytics Platform project.

TLDR; After I refactored my code to use modules, I found that Bicep supports ‘What-If’ operations, which explain what the code is going to do before deploying it. In this post I do a short test of that. I found an issue where the What-If output didn’t show the Azure Synapse resource creation, then browsed the Bicep GitHub repo to search for issues related to What-If operations. I didn’t find what I was hoping for, so I logged my first public GitHub issue 😁.

Update: The issue we encountered seems to be related to another preflight improvement which is being worked on, but is a “…bit of a gnarly, low level issue so please be patient 🙂”. I was amazed to see how quickly the Bicep team responded on this.

What happens when I push this button? 🤔

So, after my previous post on factoring in some Bicep best practices for code reuse, I noticed that Bicep supports ‘What-If’ operations.

az deployment sub create --name '<name of deployment>' --location '<location name>' --template-file '<path to bicep file>' --confirm-with-what-if
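If you want the report without being prompted to deploy, there also appears to be a standalone what-if command at the subscription scope:

az deployment sub what-if --name '<name of deployment>' --location '<location name>' --template-file '<path to bicep file>'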

Side note: I had to change the VS Code theme to save us all from the agony of reading lime green on a light grey background.

What’s nice is we get a breakdown of changes that we are about to apply to our environment. I think that is awesome.

Terminal output of a Bicep deployment what-if operation.
I have one question. Explosions? 🧨

Yes, for the eagle-eyed reader, I realised my storage account name is an Azure Region name hahaha 😂

Looking at the terminal output, reading top to bottom, I can see:

  • We are about to deploy at the subscription scope.
  • We are deploying an Azure Data Lake Gen2 Storage Account with a blob container and all its configuration goodness.
  • We are deploying an Azure Synapse… wait a minute…

What was weird was that I didn’t see the Synapse Workspace. I checked the deployment details/output and it was there.

Azure portal deployments screen.
Deployed

I wondered if the reason it didn’t output the Azure Synapse resource during the What-If was because I didn’t define any output variables for it, which I did for the storage account.

Bicep output code.
Putting more out.
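For anyone new to Bicep outputs, the declarations look something like this (the names here are illustrative, not my exact module code):

/*Illustrative outputs for the synapse.bicep module*/
output synapseWorkspaceName string = synapseWorkspace.name
output synapseWorkspaceId string = synapseWorkspace.id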

I updated my variables, added output variables for my synapse.bicep module, then ran the What-If again. Aaaaand…. nothing changed. Considering Bicep is an open-source project on GitHub, we get to search its issues for ‘What-If’ operation problems. Taking the things learnt over the past few posts, we get to create an issue of our own 😁

GitHub issue summary.
de bug 🐛

That’s it. Done. Created our first public issue: what-if operation doesn’t seem to include all bicep defined or created resources · Issue #3682 · Azure/bicep (github.com).

The what-if behaviour doesn’t block us at this stage. The deployment works, so at this point I think we are set for the next section: getting this into a GitHub Actions pipeline.

🐜

End-to-End Analytics Platform – Infrastructure as Code (IaC)

Photo by John Nail from Pexels

This is part of my series on learning to build an End-to-End Analytics Platform project.

TLDR; I set up a GitHub milestone with two issues. Started working with the Bicep language to build Azure resources. It’s basically a language that simplifies building Azure Resource Manager (ARM) templates. I installed the Bicep tooling. Defined resources using things like parameters, modules, and others using an ARM template guide. Used Bicep build to generate an ARM template from the Bicep file. Experimented with Bicep decompile to generate a Bicep file from an ARM template. Created my first gist to share some code. Lastly, used the Azure CLI to deploy the Bicep resources. Also… found a Bicep playground 🤸‍♂️ Just saying…

Prepping for Dev 👨‍💻

We are using the development flow from my previous post. Not enough time? Check out the GitHub Flow.

We need a starting point to build out our end-to-end analytics platform. We are going to attempt to deploy an Azure Synapse Analytics workspace and required services with Bicep templates. This gives us two key capabilities:

  • Data Lake Storage to store our data
  • Pipelines to support orchestration and batch ingestion of data

Let’s get started. We created a new issue, updated the project board, and set up our new branch in GitHub. Pulled the updates locally. Then checked out that new branch.

Getting the hang of issues.

In this post though, I wanted to learn about GitHub Milestones. They make it easy to track a bunch of related issues. They also have convenient progress tracking built in. So I added another issue, then made my way to the milestone page from the Issues tab:

More issues.

Used the ‘New milestone’ button to create a new milestone. Gave it a name and filled in the details.

A milestone. Yep.

After that, jump back to the issue and assign it to the milestone we just created. Notice that the milestone now has a progress bar.

I will walk 500 miles.

Nice. Issues, milestones, branches, in the flow. Time to get to building things.

Building Biceps 💪

To work with Bicep files we need to install the Bicep tools (Azure CLI + Bicep install, VS Code Bicep extension). Once that’s done, we add our first .bicep file 🦾 to the project. Remember to check which branch you are on locally.
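If you go the Azure CLI route, the Bicep tooling can be installed and verified from the CLI itself:

az bicep install
az bicep version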

Flexing our first Bicep file.

Stepping back, according to the documentation, Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure resources. We’re not covering every area of Bicep here; the documentation does a good job of that. There are a few things that we are going to use in this post:

  • Parameters, to make values configurable and reusable
  • Resource declarations, pinned to specific API versions
  • Conditional deployment, using if (…) on a resource
  • Parent/child resources, to build dependency chains

There are a bunch of other capabilities that you can explore, from loops to functions and more. So much goodness, so little time… someday maybe.

Now to start building out our resources. Let’s start with some parameters:

Intellisense integration for Bicep development.

Yes please! Intellisense for the win! I also added a comment which is being kind to my future self. Now let’s add a resource. The intellisense really helps a bunch to expedite development. We can get to the resource, the API version, and more using the Tab key. Another nice thing is using Ctrl+Space to expand more options, properties, and more:

Building a Bicep resource.

We have some basic building blocks for figuring out how to create a resource. Next I expanded the declaration with more resources, parameters, comments, and properties using the template documentation and Azure-Samples.

Note: You could also try creating a Synapse Analytics Workspace using the portal and grabbing the template just before you create it.

Notice that you can reference the parameters in the resource declaration, which helps with code reuse. I added a deployment condition to control which services get deployed. The code is in an intermediate state to showcase that I can use strings or parameters to assign values. I’ll gist show you what I did 😉:

/*Global parameters*/
param resLocation string = resourceGroup().location

/*These control if we deploy the resource or not*/
param deployDataLake bool = true
param deploySynapse bool = true

/*Resource specific parameters – Synapse Analytics*/
param synapsWorkspaceName string = 'fancy-name'
param synapseSqlAdministratorLogin string = 'majestic-username'
param synapseSqlAdministratorLoginPassword string = 'your-complex-password'

/*Create a data lake storage account which we use as the Synapse Analytics default data lake*/
resource datalake 'Microsoft.Storage/storageAccounts@2021-04-01' = if (deployDataLake == true) {
  name: 'fancy-name'
  location: resLocation
  sku: {
    name: 'Standard_LRS'
    tier: 'Standard'
  }
  kind: 'StorageV2'
  properties: {
    isHnsEnabled: true
    supportsHttpsTrafficOnly: true
    accessTier: 'Hot'
    networkAcls: {
      defaultAction: 'Allow'
      bypass: 'AzureServices'
      virtualNetworkRules: []
      ipRules: []
    }
    encryption: {
      services: {
        blob: {
          enabled: true
        }
        file: {
          enabled: true
        }
      }
      keySource: 'Microsoft.Storage'
    }
  }
}

/*
I built this child resource by working my way back through these templates: https://github.com/Azure-Samples/Synapse/tree/main/Manage/DeployWorkspace/storage
It gets a little tricky, but we are building a dependency chain of parent-child resources. e.g. Storage account -> Blob -> Container
*/
resource blobService 'Microsoft.Storage/storageAccounts/blobServices@2021-04-01' = if (deployDataLake == true) {
  parent: datalake
  name: 'default'
  properties: {
    cors: {
      corsRules: []
    }
    deleteRetentionPolicy: {
      enabled: false
    }
  }
}

resource container 'Microsoft.Storage/storageAccounts/blobServices/containers@2021-04-01' = if (deployDataLake == true) {
  parent: blobService
  name: 'workspace'
  properties: {
    publicAccess: 'None'
  }
}

/*Create a Synapse Analytics workspace*/
resource synapseWorkspace 'Microsoft.Synapse/workspaces@2021-04-01-preview' = if (deploySynapse == true) {
  name: synapsWorkspaceName
  location: resLocation
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    defaultDataLakeStorage: {
      /*I used the datalake resource and can use dot notation to reference information about it. This establishes a dependency.*/
      accountUrl: datalake.properties.primaryEndpoints.dfs
      filesystem: container.name
    }
    sqlAdministratorLogin: synapseSqlAdministratorLogin
    sqlAdministratorLoginPassword: synapseSqlAdministratorLoginPassword
  }

  resource workspaceFirewall 'firewallRules@2021-04-01-preview' = {
    name: 'allowAll'
    properties: {
      startIpAddress: '0.0.0.0'
      endIpAddress: '255.255.255.255'
    }
  }
}

It’s a basic Synapse deployment. The goal is to start deploying using Bicep. We can add things like RBAC assignment for storage access, network configurations on the storage firewalls, and others.

To ship it 🚢 we can use the Azure CLI in the VS Code integrated terminal. The deployment is pretty simple. Log into your Azure subscription with the Azure CLI, then set your subscription context.

az login
az account list
az account set --subscription 'your-subscription-name-or-id'

Create a resource group in which to deploy the resources defined in the .bicep file. Bicep can do this too, which we will get to another day.

az group create --resource-group 'your-resource-group' --location 'azure-region'
Success.

Deploy the resources at the resource group level, specifying the Bicep file path as our template file. Once you submit it, the terminal will indicate that the deployment is running. We should see a JSON summary output when it’s done, similar to our resource group deployment.

az deployment group create --resource-group 'your-resource-group' --template-file 'path-to-your-bicep-file'
Deploying robots.

Checking the deployment in the Azure Portal is simple. Navigate to the resource group. On the ‘Overview’ page, there is a ‘Deployments’ section.

Deployments are deploying.

If you keep following the trail, you end up at the deployment detail screen:

It’s working… It’s working! 🚀

We can validate the resources are deployed in the Azure Portal:

Deployed!

Let’s close off one of our issues and see what it does to the milestone:

Milestone achieved ✅

Awesome! To clean up, just delete the resource group 👍

az group delete --resource-group 'your-resource-group'

Interesting finds

A nice Bicep capability for users coming from an ARM background is that you can use Bicep build to generate the ARM template 😉.

az bicep build --file 'path-to-your-bicep-file'
Building ARMs from Biceps lol

If you have ARM templates, you can try out the Bicep decompile functionality to TRY (it’s not perfect, so no guarantees) to convert your ARM templates to Bicep files.

az bicep decompile --file 'path-to-your-arm-template'

Wow! Another lengthy post. Thanks for sticking around. We covered some serious ground, learnt a bunch, and kept building a foundation for our future work. In future posts we can tackle things like modules, advanced resource deployments, and deploying using GitHub Actions, which should be fun.

🐜

End-to-End Analytics Platform – GitHub Setup

TLDR; I set up a GitHub repo for my learning project to build an End-to-End Analytics Platform. Used a GitHub project as reference (Atom). Cloned my repo with VS Code. Closed the loop with the GitHub Flow by creating an issue (GitHub Guides – Mastering Issues), then a branch, changed/committed stuff, pushed changes, created a pull request, and merged the changes. Found an exciting learning tool (GitHub Lab | How it works), project (GitHub Training Kit), and VS Code GitHub integration along the way.

As part of my series on learning to build an End-to-End Analytics Platform project, I want to get started. Here are a few things I want to get set up:

  • README.md – project on a page 📄.
  • .gitignore file – ignores files starting with VS Code using the gitignore extension.
  • docs folder – using this to store documents related to the project.
  • src folder – planning to use this as the root folder for code in the repo.
  • test folder – planning to use this as the root folder for unit tests.
  • actions folder – as far as I can tell at the moment for GitHub action-related files.
  • GitHub Project – using this to track my work issues over time.

How did I figure out that’s how I wanted to set up the repo, you ask? I took my inspiration from the atom/github: Git and GitHub integration for Atom repository. Why? Well, I am starting my GitHub learning through a LinkedIn Learning course, Learning GitHub, and that was the repo showcased in the intro videos.

When I grow up I’ll work on using an approach outlined by Rob Sewell, who has a fabulous beard and some next-level skills: How to fork a GitHub repository and contribute to an open source project – Rob Sewell – SQLDBAWithABeard. For now, no forking the repo. The plan is to use VS Code with my development environment setup to clone the repo locally, set up a new remote/local branch to get into the swing of feature branching, then make the changes and complete a very simple pull request.

Creating and cloning the repo was easy enough using VS Code. Now, I didn’t do everything in VS Code purely because I want to get used to a few things first. So I manually grabbed the clone URL from the GitHub repo.

Screen clip of GitHub clone URL.
Grabbing the repository URL for cloning.

All by the book so far. I opened up the Command Palette with Ctrl+Shift+P, started typing ‘git clone’, and selected the ‘Git: Clone’ operation.

Using VS Code command palette to start Git clone.
Use the VS Code command palette to start Git Clone.

Pasted in the repo URL.

Passing in the repository URL to clone in VS Code.
Paste the URL we copied earlier.

I was prompted to choose a repository location to clone the repo to. I chose to go with a folder that I can use for future repos, something like <your path>\github\<your repo>. Once that’s done, we get prompted to open up the repo.

Open repository pop-up in VS Code.
VS Code picks up I did something.

Great! The repo has been cloned locally and the files are there. The local repository has been initialised and we are on the local main branch (see bottom left).

Enhance. We have our files.

Side note: I set the local repo user config with the user name and email to be used by git for committing to my remote GitHub repo:

git config --local user.name 'example'
git config --local user.email 'example@domain.com'

Before creating feature branches I want to track my work for updating the project structure and the documentation. To do that we create a ‘Project Board’:

Create a new project from the Projects tab.

Give it a name, description, and if you want to use a template, go for it. I opted for the ‘basic kanban’ project template. As I go along learning how all the triggers and things work, this will evolve into something more sophisticated.

First look at the new spiffy project board.

It’s beginning to take shape, nice! Once that is done I need to add a work item to track my work against. To do that we create an ‘Issue’:

Adding a new issue from the project Issues tab.

Fill it with information. Assign it to myself. Then label it with a ‘documentation’ label. Then hit the ‘Submit new issue’ button. Awesome. We now have a new issue that we can discuss, subscribe to for notifications, assign to projects and milestones, and more.

Posted new issue.

Once the issue was logged, I jumped over to the Project Board and added the ‘card’ to the ‘To Do’ swim lane. Now it automatically links up to this project.

Adding a card to the Project Board.

Switching over to the Code tab, we create a branch from the main branch in GitHub so that we can make our changes to that branch.

Creating a new branch from the main branch.

Once the branch is created, we need to sync our local repo with a git pull to get the latest changes, one of which is the addition of the new branch. The next step is to switch to, or ‘check out’, the branch we just created. Open up the Command Palette with Ctrl+Shift+P, start typing ‘git checkout’, and select the ‘Git: Checkout to...’ operation. Then select the feature branch you want to switch to.
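For reference, the plain git equivalent of that sync-and-switch would be something like this ('your-feature-branch' is a placeholder):

git pull
git checkout 'your-feature-branch'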

Switching to the feature branch.

Next up. Make changes.

I added all the files and folders that I mentioned earlier. A noteworthy mention though was adding a .gitignore file using the gitignore extension, which pulls .gitignore templates from the https://github.com/github/gitignore repository. Awesome. I didn’t know there was a repo full of .gitignore templates.

Installing the gitignore extension.

Open up the Command Palette with Ctrl+Shift+P, start typing ‘add gitignore’, and select the ‘Add gitignore’ operation.

Using the Command Palette to add a gitignore file.

Then just choose a template. In my case that’s ‘VisualStudioCode’.

Adding a gitignore file for Visual Studio Code.

Nice! That was pretty easy.

gitignore file added with definition.

Next up, push it up to the remote. Using the menu on the left, I switched to the Source Control view (Ctrl+Shift+G). Added a commit message. Then hit the commit button. In my case I went for the ‘stage all changes and commit them directly’ option.

Committing the changes.

Now notice at the bottom of the screen in the status bar we can see we have a change waiting to be pushed to the remote repository in GitHub.

Awaiting the big push to the cloud.

Just click on that, follow the prompts, and VS Code will push your changes up to the remote repository. In future, with branches, we will follow this up with a pull request.
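The terminal equivalent would be along these lines (the commit message is just a placeholder):

git add --all
git commit --message 'add project skeleton files'
git push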

Once that’s done, we jump back to GitHub and we can see our branch is updated with the files we added. The next step is to finish off the flow by creating a pull request and merging our changes back into the main branch. To start that process we click on the ‘Compare & pull request’ button.

Starting the review and pull request process.

As part of the pull request, we fill in as much information as we can and link any issues that we want to close automatically using keywords. Once we’ve filled everything out and we are happy there are no conflicts, we can click on the ‘Create pull request’ button.

Filling out the pull request and closing issues.

We get a rich amount of history related to the work we have done here. Looking through the details, I see I have no conflicts either. Let the merge begin.

Merging.

The merge is successful. In this case the work I have done on the ‘anthonyfourie-skeleton’ branch is complete. I don’t need that branch anymore so I am going to delete it.

Deleting the branch after a successful merge.

That’s it! Flipping back to the Issues tab we can see our issue has been closed. Whoop there it is!

Issue closed. Work complete.

All in all, I think that was a pretty good start. Looking forward to the next post, where I think I will tackle spinning up some basic infrastructure using infrastructure as code.

🐜

End-to-End Analytics Platform

As part of my plan for Real Projects, I recently came across a few very interesting links for inspiration:

Now, I don’t have much experience in any of those areas. It got me thinking though: what if I used a mash-up of those architectures, technologies, and processes to build and learn about an end-to-end analytics platform? Source to AI/API.

So that’s the plan. I am going to work on building a project that tries to bring together those services in a single project. Along the way, I am thinking about learning not only the technologies but also the processes and methodologies that surround them, such as DevOps/DataOps/MLOps.

As I get my hands dirty learning these things, the code can be found here: data-gauntlet 🐱‍👤