<![CDATA[Monostable Blog]]>http://blog.monostable.co.uk/Ghost 0.8Wed, 27 Sep 2017 17:26:33 GMT60<![CDATA[Growing your code with Arduino]]>Over the past few years I have been using the Arduino ecosystem more and more for prototyping in a professional setting.

I have been working on software for the Braille e-book reader prototypes for Bristol Braille Technology and I have finalised the firmware of the initial Kickstarter batch of the

]]>
http://blog.monostable.co.uk/growing-your-code-with-arduino/8d5a0205-1200-4035-a0f2-5cc474b4e1e6Wed, 09 Aug 2017 21:55:46 GMTOver the past few years I have been using the Arduino ecosystem more and more for prototyping in a professional setting.

I have been working on software for the Braille e-book reader prototypes for Bristol Braille Technology and I have finalised the firmware of the initial Kickstarter batch of the VRGO Chair virtual reality controller.

There was a time when I would dismiss Arduino code as unprofessional and "just for hobby projects" but it has some strong points in its favour that made me abandon these preconceptions long ago:

  • A packaged development environment that even a mechanical engineer can install and use.
  • A vast selection of re-usable code: any particular module that you think may be suitable for your prototype likely has an open source library or example project for Arduino.
  • Abstractions that save time on looking up documentation and writing boilerplate code.
  • The abstractions also help portability. Code that makes use of the Arduino libraries can run on a wide range of processors (from low clock-speed 8-bit AVR and 16-bit MSP430 chips to powerful ARM Cortex-M processors in the form of the Arduino Due and Teensy boards).

There are downsides of course:

  • A packaged development environment that anyone that cares the least bit about code formatting will perceive as torture.
  • You lose some control and performance, and the abstraction layer can become a source of confusion (e.g. bit 2 of port D on your AVR, as it appears in the schematic, becomes pin number 0 in the Arduino world, but the exact mapping depends on your processor).
  • While your code is potentially portable, portability isn't guaranteed. The devil is in the details of which hardware- and processor-specific features your code, or your libraries' code, makes use of.

There are ways to mitigate these issues though and I want to outline some of the coping mechanisms I have been using that have let me use Arduino in a professional setting.

These techniques have, so far, let me extend my code rather than abandon it as we move from prototypes to products, while keeping the flexibility and development speed of the "Arduino way" working to my advantage.

A sensible environment

If you spend a lot of time writing code you should spend a lot of time getting to know your text editor. The one embedded inside the cute Arduino IDE is not a good one and you should abandon it with haste. Vim and Emacs are the grumpy old men, Atom and VSCode are the new kids on the block, and there are plenty of other good editors out there, so why would you use the Arduino IDE to edit code?

There is a little preference that I always tick when I install the Arduino IDE. It's called "Use external editor" and it makes the code area go grey and un-editable, refreshing whenever the file is saved by another program. I simply use the IDE to compile and upload the code and edit the code itself with Vim.

As your code grows even this limited use of the IDE may become annoying, in which case I thoroughly recommend switching the project over to PlatformIO. Ejecting your Arduino project from the IDE to PlatformIO should be straightforward. If you are using the Uno, for instance, running pio init --board uno and moving all the source code into the src/ directory should get you there.
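After ejecting, PlatformIO drives everything from a single configuration file. A minimal platformio.ini for an Uno project might look something like the following (the environment name is just a label you choose):

```ini
[env:uno]
platform = atmelavr
board = uno
framework = arduino
```

From there, pio run builds the project and pio run --target upload flashes the board, all from your terminal of choice.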

Let's make it modular

Often people seem to be unaware that Arduino is just C++. More accurately, you are using C++ and the Arduino C++ framework.

C++ was designed with huge collaborative software projects in mind and some of its features can seem cumbersome when working on small embedded systems and daunting if you have been learning to code by playing with Arduino.

C++ also hands you the class hammer, and too often I come across code that has made a class of everything for no particular reason other than that this seems to be what you are supposed to do in C++. In my experience code like this easily loses sight of the relationships it's trying to express.

Classes are a language feature for expressing something that exists more than once. A class really shines when you want two thousand of something, and many consider a singleton class, a class with a single instance, something to avoid.

In embedded systems you are often building abstractions around resources, say the internal EEPROM, a motor controller, or an LED attached to a particular pin, where the real world dictates that there is only ever one of the thing and it would never make sense to instantiate another.

I don't think singleton classes should be avoided at all costs; classes provide other benefits such as bound methods and encapsulation. But I don't introduce a class to my code unless there is a really good reason to. What I use instead are namespaced modules.

Most higher-level languages have adopted module systems where you can split your code into files and import or require your files as desired. C++ has inherited a two-file system from C: .cpp is the main source file and .h is the header file.

You can declare prototypes, re-stating the types from your function definitions, in your header files so that you can use the functions in your .cpp file in any order you like. I find that the repetition across .h and .cpp becomes cumbersome and slows me down, so I opt for single .h file modules, with namespaces for encapsulation within the modules.

The one module name that has become common in all my projects is pins.h where all the pins used on my Arduino compatible board are mapped out.

// pins.h
#pragma once

namespace Pins {
    enum {
        LED = 13,
        BATTERY_LEVEL = A0,
        // ...
    };
}

As things are added to your main .ino file, simply moving them out of setup and loop can often be a natural start to a new module.

A contrived example:

// example.ino
#include "pins.h"
#include "led.h"
#include "some_other_thing.h"

void setup() {  
   Led::setup();
   SomeOtherThing::setup();
   // ...
}

void loop() {  
   Led::loop();
   SomeOtherThing::loop();
   // ...
}

// led.h
#pragma once

#include <Arduino.h>
#include "pins.h"

namespace Led {
    void setup() {
        pinMode(Pins::LED, OUTPUT);
    }

    void loop() {
        digitalWrite(Pins::LED, HIGH);
        delay(500);
        digitalWrite(Pins::LED, LOW);
        delay(500);
    }
}
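One thing to watch with a contrived module like Led above: delay(500) inside Led::loop stalls every other module's loop as well. A common alternative is to track elapsed time with millis() and return immediately when nothing needs doing. The sketch below stubs millis() with a fake clock so the logic can be checked off-target; on real hardware you would use Arduino's own millis() and call digitalWrite where the state changes (the names here are illustrative):

```cpp
// Fake clock standing in for Arduino's millis() so this runs off-target.
static unsigned long fakeMillis = 0;
static unsigned long millis() { return fakeMillis; }

namespace Led {
    bool state = false;                    // mirrors the pin; on hardware you
                                           // would digitalWrite(Pins::LED, state)
    unsigned long lastToggle = 0;
    const unsigned long interval = 500;

    void loop() {
        // Non-blocking: only act when enough time has passed, never delay().
        if (millis() - lastToggle >= interval) {
            lastToggle = millis();
            state = !state;
        }
    }
}
```

Every module's loop then returns quickly, and the main loop keeps all of them responsive.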

The caveat here is that you need to pay attention to your #include lines and the order your functions are defined in within your .h files. It's not quite like the module systems of higher-level languages, but it's pretty damn close.
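The ordering caveat is easy to handle with an ordinary forward declaration inside the namespace. A small self-contained sketch (the module and function names are made up for illustration):

```cpp
// math_utils.h style single-header module
namespace MathUtils {
    int clamp(int v, int lo, int hi);  // forward declaration: lets the next
                                       // function call clamp() before its body

    // Map 0-100% to a 0-255 PWM duty cycle, clamping out-of-range input.
    int percentToPwm(int percent) {
        return clamp(percent, 0, 100) * 255 / 100;
    }

    int clamp(int v, int lo, int hi) {
        if (v < lo) return lo;
        if (v > hi) return hi;
        return v;
    }
}
```

The declaration up top plays the role a .h prototype would in a traditional two-file split, just without the second file.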

Conclusion

Too often I see Arduino disparaged as "not professional" when it is just a convenient way to compile your C++ code and upload it to a variety of development boards. People often use this as an excuse to re-write everything from scratch, and more often than not that's a really bad idea.

You can adapt your development environment to be more pleasant and adopt development practices that help you write clean and modular code. This lets you grow your Arduino prototypes into production-ready products rather than throwing everything away and starting over.

]]>
<![CDATA[Kitnic.it, 1-click BOM and the CPL]]>This is a guest blog post for Octopart and first appeared on their blog.

Kitnic.it is a site to share electronics projects. If a project is on Kitnic, you can download the Gerbers and put parts into a retailer shopping cart with a single click.

kitnic_screenshot.png

These are early days

]]>
http://blog.monostable.co.uk/kitnic-it-1-click-bom-and-the-cpl/7b3355ee-19b8-4129-a5b9-6f676e72ecb8Wed, 30 Nov 2016 16:38:57 GMTThis is a guest blog post for Octopart and first appeared on their blog.

Kitnic.it is a site to share electronics projects. If a project is on Kitnic, you can download the Gerbers and put parts into a retailer shopping cart with a single click.

kitnic_screenshot.png

These are early days for Kitnic: our submission process currently involves opening a pull-request on GitHub.

In an effort to encourage people to give it a go, I often look over open source hardware projects that I come across to see if I can get them into a state that is ready to be put up on the site. The challenge is almost always in sorting out the bill of materials (BOM). Really, this is the issue that Kitnic is trying to address: there is no standard way to record a BOM.

Whether you are looking at someone else's project or your own work months after its creation, trying to find the right parts can be a tedious exercise. The main culprits are often generic components, like resistors and capacitors, where you care about the basic values and specification but not about the exact manufacturer or retailer part.

The 1-click BOM browser extension that goes along with Kitnic makes the process of adding items to shopping carts a seamless experience. In its first few iterations, this only worked if you specified exact retailer or manufacturer part numbers. In an effort to reduce the tedium of creating BOMs I added a smart, semantic match of surface mount resistors and capacitors to those in the Common Parts Library (CPL). This was only possible thanks to Octopart providing the CPL data in an easily parse-able format with a Creative Commons license. The result is best illustrated with the GIF below and should save many engineers a lot of frustration and time.
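To give a feel for what that semantic matching involves, here is a deliberately simplified sketch in C++ (the real extension handles far more notation and tolerances): parse a resistance value like "10k" into ohms, then look it up by value and package in a small table. The part numbers and data structure are illustrative, not the actual CPL format.

```cpp
#include <string>
#include <vector>

struct CplPart {
    double ohms;
    std::string package;
    std::string mpn;  // manufacturer part number
};

// Turn "10k", "470" or "1M" into a value in ohms (very simplified).
double parseResistance(const std::string& text) {
    double value = std::stod(text);  // reads the leading number
    char suffix = text.back();
    if (suffix == 'k' || suffix == 'K') value *= 1e3;
    if (suffix == 'M') value *= 1e6;
    return value;
}

// Find the first part matching both value and package; empty string if none.
std::string matchPart(const std::string& description, const std::string& package,
                      const std::vector<CplPart>& parts) {
    double ohms = parseResistance(description);
    for (const auto& p : parts)
        if (p.ohms == ohms && p.package == package) return p.mpn;
    return "";
}
```

Normalising free-text values into canonical units like this is what lets two differently-written BOM lines resolve to the same CPL entry.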

demo.gif

Everything mentioned here is free and open source and available on GitHub, so if you spot any issues or have ideas for new features don't hesitate to get in touch, and maybe even dig in and make improvements yourself. We are also currently running a promotion for early adopters of Kitnic: you get free PCB manufacturing for registering your project.

]]>
<![CDATA[Beginner FPGA programming using open source tools #2: Setup]]>This tutorial is for the iCEstick Evaluation Kit which you can get for about $25 from various places.

We want to spend as little time as possible on the setup. If you are already comfortable with Bash and compilation and have your system set up for that you can follow

]]>
http://blog.monostable.co.uk/beginner-fpga-series-2/e2ae2979-16fd-446a-b5f1-3d6ca6f01479Tue, 14 Jun 2016 12:05:00 GMT

This tutorial is for the iCEstick Evaluation Kit which you can get for about $25 from various places.

We want to spend as little time as possible on the setup. If you are already comfortable with Bash and compilation and have your system set up for that you can follow the instructions from the Ice Storm project but below is an easier route using Vagrant.

Vagrant will set up a virtual machine for you, install all the right tools and make sure the USB connection to the iCEstick is passed through. This should work on any machine, be it Linux, Windows or OSX.
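For the curious, the USB passthrough is done with VirtualBox USB device filters in the Vagrantfile. The relevant section looks roughly like the snippet below (illustrative, not the exact contents of the icestorm-vagrant repository; 0x0403/0x6010 is the FTDI FT2232 chip on the iCEstick):

```ruby
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"
  config.vm.provider "virtualbox" do |vb|
    # Enable the USB controller and forward the iCEstick's FTDI chip to the VM
    vb.customize ["modifyvm", :id, "--usb", "on"]
    vb.customize ["usbfilter", "add", "0", "--target", :id,
                  "--name", "iCEstick FTDI",
                  "--vendorid", "0x0403", "--productid", "0x6010"]
  end
end
```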

  1. Install Virtualbox
  2. Install Vagrant
  3. Install Git
  4. Open a terminal (use Git Bash on Windows) and clone the icestorm-vagrant repository.
git clone https://github.com/monostable/icestorm-vagrant  

(The code-block above indicates this is a bash command to be typed into your terminal.)

We now create the Vagrant machine.

cd icestorm-vagrant  
vagrant up  

The above will set up a 64-bit Ubuntu 14.04 virtual machine with all the required tools installed. The compilation steps can take a while so go grab a coffee or something.

After it completes you can login to your newly created virtual machine.

vagrant ssh  

This should say "Welcome to Ubuntu..." and present you with a prompt that looks like this:

vagrant@vagrant-ubuntu-trusty-64:~$  

We are now all on the same page and have the same system with the same tools installed.

Next we will run a simple example through the entire toolchain workflow to end up with some hardware configured on your iCE40 FPGA.

]]>
<![CDATA[Beginner FPGA programming using open source tools #1: Introduction]]>In this blog series I will document my adventure in learning how to program iCE40 FPGAs using only open source tools. I have no prior experience using Verilog or any of the software involved so the aim is a tutorial suitable for beginners.

Field programmable gate arrays (FPGAs) are reconfigurable

]]>
http://blog.monostable.co.uk/beginner-fpga-series-1/a2f89689-9ccb-40b3-bd4f-ae44ac3a3740Mon, 13 Jun 2016 22:38:46 GMTIn this blog series I will document my adventure in learning how to program iCE40 FPGAs using only open source tools. I have no prior experience using Verilog or any of the software involved so the aim is a tutorial suitable for beginners.

Field programmable gate arrays (FPGAs) are reconfigurable hardware. When you program an FPGA you define hardware logic blocks and connections between them allowing for high speed parallel execution of digital logic designs.

FPGA image credit: W.T.Freeman

FPGAs are how new processor designs are prototyped, and they have niche applications where high speed, parallel processing and a large number of IOs (inputs and outputs) are a distinct advantage.

Their major downsides are their cost and the complexity and license restrictions of the vendor specific toolchains. Most of the time you don't want to use an FPGA, but when you do, oh boy are you in for a treat.

Xilinx ISE image credit: Paolo Santinelli

Vendors provide heavy, bloated IDEs that are gigabytes in size, slow, and seem to include everything but the kitchen sink. These are the only way to program their devices. Depending on what licensing fees you pay, they restrict which features you are allowed to use and even which of their devices you are allowed to program.

Even more annoyingly, it takes considerable time investment to learn the workflows and quirks of a toolchain that will tie you to a certain product line of a certain vendor.

The reason vendors get away with this is that they keep a tight grasp on their bitstream formats: the proprietary, often encrypted configuration data that is loaded onto the FPGA to program and reconfigure it.

I last used FPGAs in University and, though I was fascinated by the technology, it was not a pleasant experience. Since then, while mostly programming microcontrollers and embedded Linux applications, I have become accustomed to a work-flow involving fast, single-purpose programs executed from the bash command-line. I use vim, gcc and make in my day to day and I rely on the shell's repeatability (Ctrl-R is your friend!), configurability and scriptability as a productivity boon.

I had pretty much written off FPGA work as "not my thing" until about a year ago, when it was announced that some friendly hackers (Clifford Wolf and Mathias Lasser) had reverse engineered the Lattice iCE40 bitstream, meaning one is now able to program these FPGAs completely from the command-line using only open source tools.

Verilog with Make

Documentation and presentations followed and one of the most interesting revelations was that these efforts could enable on-the-fly reconfiguration of hardware from a software program. This is a new way to do computing that could be a revolutionary solution for some problems.

The easiest way to come to grips with this new way of thinking is to imagine an add-on board for a Raspberry Pi (a shield or hat, if you will) where the software running on the embedded Linux platform could re-configure its attached hardware when needed, to accelerate a certain computation or to access the vast number of inputs and outputs (144 in the case of the iCE40s) available from the FPGA.

Plans for exactly this are already under way. Clifford Wolf himself has been working on the Ico Board (pictured) which is currently in a beta release phase and the CAT board is another notable project attempting to provide something similar.

Ico Board image credit: OnSite Broadcast eU

The excitement around these advancements is tangible, with comparisons to the beginnings of the GNU Compiler Collection being thrown around. It remains to be seen what the true value of an open source reconfigurable-hardware toolchain is and what a community of free-acting hackers will build with it.

This is a venture strictly for fun and not for profit, as the marketability of the skills gained using this particular workflow is currently questionable at best. Join me in exploring this frontier of open source.

Let's set up our tools in part 2.

]]>