Tuesday, February 13, 2018

Mindstorms Robot Swarm

Basic components of a LEGO Mindstorms Robot Swarm

I’ve always been fascinated by the behaviour of flocks and herds when many animals move with seeming coordination. I’ve been lucky to have had the chance to cooperate with Laurens Valk on robot simulation of such behaviour. 


In the video below you can see the project starting to come together. Here’s a list of problems we encountered.


1. Positioning

For the robots to avoid each other and cooperate, it is important that they know where the other robots are in relation to them. For this we used the triangular markers you see on the backs of the robots. A webcam above the field spots them. Each marker is unique, so the camera can read an ID, a position and an orientation for every robot. Converting camera pixels to real-world relative positions proved to be some interesting linear algebra. I will go into that in a full separate post.
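To give an idea of the kind of math involved: once the camera has given every robot an ID, a position and a heading in field coordinates, each robot still has to express its neighbours in its own frame. Here is a minimal sketch of that transform (my own illustration, not the code from the project):
import math

def relative_position(own_x, own_y, own_heading, other_x, other_y):
    # Express another robot's position in this robot's own frame.
    # Positions are in field units (cm), the heading is in radians.
    dx, dy = other_x - own_x, other_y - own_y
    # Rotate the world-frame offset by minus our own heading
    cos_h, sin_h = math.cos(-own_heading), math.sin(-own_heading)
    return (dx * cos_h - dy * sin_h, dx * sin_h + dy * cos_h)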

2. Communication

The next challenge was to actually share positions with all the robots over some wireless networking system. The most important requirement was that the communication had to be real-time. The most reliable solution I could find was a direct UDP socket connection over wifi. In another separate post I will go into this communication challenge too, and how to implement it in Python.

data = gzip.compress(pickle.dumps(robot_data[robot_id]))
try:
    # Broadcast the compressed positions to every robot listening on this port
    result = self.server_socket.sendto(data, ('255.255.255.255', port))
except OSError:
    pass  # drop this update; a fresh one follows a fraction of a second later
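For context, here is roughly how the `server_socket` in the snippet above could be created, together with a matching receiving end on each robot. This is a minimal sketch under my own assumptions (the port number is just an example), not the project's actual code:
import gzip
import pickle
import socket

# Sender: a UDP socket that is allowed to broadcast
server_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

# Receiver on each robot: bind to the same port and unpack one update
receive_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receive_socket.bind(('', 50008))    # example port
packet, sender = receive_socket.recvfrom(4096)
positions = pickle.loads(gzip.decompress(packet))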


3. Movement and pathfinding

Now that each robot knows where it is, it has to decide where to go. The best method here proved to be a set of virtual springs: some springs push the robots away from each other, others push them away from the walls, and yet others attract them to the balls on the table. All the virtual spring forces can be calculated from the robot's position relative to the other robots, the walls and the balls. Mathematically you can then combine all spring forces into one vector, and that is where we want the robot to move. That would be easy with Rotacaster wheels or omniwheels, which can move in any direction at any time. But we wanted to limit ourselves to one basic Mindstorms set. So on to the next challenge.
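To illustrate how such forces can be combined, here is a minimal sketch with made-up numbers and names; the real calculation in the project is more elaborate:
def spring_force(dx, dy, rest_length, stiffness):
    # Force one virtual spring exerts on this robot; (dx, dy) is the offset
    # from the robot to the spring's anchor (a ball, a wall or another robot).
    # A positive result points toward the anchor, a negative result away from it.
    distance = max((dx**2 + dy**2) ** 0.5, 1e-6)
    magnitude = stiffness * (distance - rest_length)
    return (magnitude * dx / distance, magnitude * dy / distance)

# Example: a ball 30 cm ahead attracts, a robot 15 cm to the right repels
springs = [(30, 0, 0, 0.5), (0, 15, 40, 1.0)]
forces = [spring_force(dx, dy, rest, k) for (dx, dy, rest, k) in springs]
move_x = sum(f[0] for f in forces)
move_y = sum(f[1] for f in forces)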

4. Robot driving

The solution for movement was attaching the virtual springs to a point in front of the driving wheels. Imagine you are pulling a sulky horse cart: at any time you can step forward, backward or sideways. Now imagine a swiveling shopping cart wheel in the position where you were standing, and yourself sitting where the cart driver normally sits. By turning the cart's driving wheels, you can pull that swiveling wheel in any direction at any time! So this is how we attached the virtual springs to a two-wheeled robot base: we combine all the spring forces at the swivel point, calculate the movement it needs, and then turn the driving wheels accordingly.
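In code, translating a desired movement of that virtual swivel point into speeds for the two driving wheels could look roughly like this. The offset and track width are made-up example values, not the robot's real dimensions:
def wheel_speeds(vx, vy, offset=5.0, track=12.0):
    # vx is the desired forward speed and vy the desired sideways speed (cm/s)
    # of a point 'offset' cm in front of the wheel axle; 'track' is the
    # distance between the two wheels in cm.
    forward = vx                # forward motion of the point equals the robot's forward speed
    turn_rate = vy / offset     # sideways motion of the point comes from turning
    left = forward - turn_rate * track / 2
    right = forward + turn_rate * track / 2
    return left, right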

5. Picking up balls

Picking up balls was another challenge that was more difficult than expected. We wanted to limit ourselves to one motor for both picking up and releasing the balls at will, while also having some storage inside the robot. The mechanism also needed to cope with some positioning imprecision, so it had to pick up balls that could be off by centimeters from the feeding point. I will dedicate a post to all the failed prototypes as well.

6. Behaviour

The last challenge was to be able to show different behaviours in the robot: driving to the depot with a full belly, seeking balls while avoiding others, signalling an empty battery... For this, the robot program has a 'state machine'. It executes its behaviour seven times per second, but switches to a different behaviour when the situation calls for it. For instance, after picking up five balls it switches to 'drive to depot' mode to release the balls. In every mode, different force, target and spring calculations apply.
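A bare-bones sketch of such a state machine loop; the state names, the threshold and the helper methods on `robot` are illustrative stand-ins, not the project's actual code:
def behaviour_step(robot, state):
    # One tick of the behaviour loop, executed roughly seven times per second
    if state == 'seek balls':
        robot.follow_ball_springs()      # attracted to balls, repelled by robots and walls
        if robot.balls_on_board >= 5:
            state = 'drive to depot'
    elif state == 'drive to depot':
        robot.follow_depot_spring()      # one strong spring pulling toward the depot
        if robot.balls_on_board == 0:
            state = 'seek balls'
    return state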
Which challenge should I detail first? Let me know on facebook, youtube or google plus!
Written with StackEdit.

Saturday, February 11, 2017

The math behind the vertical plotter

Many people are asking about the math behind the rope plotter. I used only second-grade high school math to do it. Here's how it works.

Target motor positions

For all calculations I am using a coordinate system with the top left of the door as (0,0) and with cm as units, so the top right of the door is (90,0). The positive y-axis points down, so I only have to deal with positive numbers. The first goal is to calculate the correct lengths of the ropes L and R in terms of the coordinates (x,y) of the target location for the robot. For this you can use Pythagoras. The left rope is the easiest: there is a right triangle with sides x, y and L, where `L = (x**2 + y**2)**0.5` (in Python the double asterisk means a power). The right rope is also part of a triangle; this one has y, 90-x and R as sides. Therefore `R = ((90-x)**2 + y**2)**0.5`. Now that we have the target lengths, we have to calculate how many degrees each motor has to rotate to reach the desired coordinate. The diameter of the bush is about 0.9 cm. A complete 360-degree rotation results in a change in length of about 0.6 cm * 3.14. That is about 170 degrees of rotation to move one centimeter. I mounted the motors so that forward rotation is upwards. Therefore the motor target for the left motor is L * -170, and the right target is R * -170.
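Put together in Python, the calculation above looks roughly like this (a sketch using the constants from this post):
DOOR_WIDTH = 90      # cm between the two attachment points
DEG_PER_CM = 170     # motor degrees per cm of rope length

def motor_targets(x, y):
    # Rope lengths from Pythagoras, for a target position (x, y) in door coordinates
    left_rope = (x**2 + y**2) ** 0.5
    right_rope = ((DOOR_WIDTH - x)**2 + y**2) ** 0.5
    # Forward rotation winds the rope up, hence the negative sign
    return -DEG_PER_CM * left_rope, -DEG_PER_CM * right_rope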

Navigating the drawing canvas

Now we have a formula to navigate the complete door. But what we really want is to navigate a piece of paper in the middle of the door. I wanted the robot to be usable independently of the space between the attachment points or the size of the paper. Therefore I defined a second coordinate system that has (0,0) in the top left of the paper and is normalised, meaning all coordinates are between 0 and 1. This coordinate system is easy to scale to different plotting surfaces and paper sizes. It works like this: let m be the margin to the left of the paper, n the margin at the top and C the width of the canvas. The door coordinates (x,y) in terms of the normalised coordinates (u,v) then become simply `x = C * u + m` and `y = C * v + n`.
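As a small sketch, with example values for the margins and canvas width rather than the measurements of my door:
M_LEFT = 25     # cm from the left attachment point to the left edge of the paper (example)
M_TOP = 30      # cm from the top of the door to the top of the paper (example)
CANVAS = 40     # cm, width of the drawing canvas (example)

def to_door_coordinates(u, v):
    # Map normalised paper coordinates (0..1) to door coordinates in cm
    return CANVAS * u + M_LEFT, CANVAS * v + M_TOP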

At this point it's simply a matter of reading two files, x.rtf and y.rtf, with normalised coordinates and moving the robot towards each of these coordinates.

Ev3 program

You can also download the Lego Mindstorms Ev3 program to see the details. The Ev3 main program expects 2 rtf files (robot text files), one named x.rtf and the other y.rtf. Both contain target coordinates. x.rtf has the number of coordinates as its first line. Note that you can't open these files with a Rich Text editor!

All the stupid mistakes I made when building 3nsor, the vertical plotter

After long nights of hard work I finally have a vertical plotter that makes recognisable illustrations! In this article I'm sharing all the trouble I ran into. There's also an LDD file, 3nsor.lxf, for this 5th generation.


Mistake 1: the wrong spindle

My first hunch was to use a large spindle, so that the relative increase in diameter with a layer of cord would be minimal. A varying diameter makes the math a lot more complicated. The second spindle looked best because it had the smoothest surface; the others had all kinds of ribs. I made several generations with the second spindle. The problem, though, was that the motor had to push a really big load: as the spindle diameter grows, the lever arm and thus the required torque grow with it. So I had to add two gears to increase the force. Also, when just using the large spindle without the gears, the internal friction of the motor is not enough to keep the robot in place on the vertical surface; it would slowly uncoil and descend. Another problem with that spindle was that it wasn't part of the original 31313 set. So I went looking for a smaller spindle. The simple red bush proved to be the solution, but only after I found the right thread...

Mistake 2: the wrong cord

I tried many different cords. I was looking for something thin and strong, because a thick thread makes the diameter of the spool change a lot as it is rolled up. As I was counting degrees for movement, this would mean a bad distortion of the drawing. The first thread (the brown one) was nice and thin, but broke after 3 drawings. It nearly cost me my brick. The second one, the thick white one, was stronger, but so hard that it dented the soft Lego parts, and so thick that I had to use a big spool to limit the change in diameter. This in turn required a gearbox, which was imprecise and lossy. Only after a year of failed experiments did I get the idea of using dental floss: thin, strong, cheap, and not all too elastic. A little elasticity is OK, as the increased length of the thread due to stretch compensates for the increased diameter on the spool.

Mistake 3: running the program from the command line

During development I was coding on my laptop while the robot was drawing. After some time, though, when the robot was running stably, I closed my laptop and went off to do other things while the robot was drawing. That was a bad idea. Because I was coding on ev3dev, I ran the program from the command line, and Linux terminates programs started from the command line when the terminal session times out. The result was that the program stopped in the middle of a drawing. And when an ev3dev program stops, the motors keep running at the speed they were last set to; they don't stop automatically. This resulted in the robot driving all the way up to the top right corner of the door, making a hard-to-clean sharpie stain, and then being catapulted all the way to the left, onto the floor. I was really lucky that it landed on a relatively thick rug...

Mistake 4: Putting the pen way below the point where the cords meet

I thought gravity would keep the pen in position, but I forgot about momentum. A pen hanging far below the point where the cords meet gave really imprecise and wobbly lines. That was cool for the very first experiment; it even made my beard look better. But ultimately it was a bad idea.


Sunday, August 23, 2015

How to get the Mindstorms Ev3 brick to read coordinates from a file

In order to plot drawings I needed to read coordinates from a file while running a program on the Ev3 brick. This turned out to be a very tricky task that is not well documented. The limitations I already knew about were:

  • Ev3 can only read a whole line per file access
  • There are no text manipulation blocks that allow me to split lines by commas
  • There is no way to detect the end of a file
  • I will have to put x coordinates and y coordinates in separate files

What is the file format used by the Ev3 for writing numbers?

I started by generating a file full of numbers from an Ev3 program so I could reverse engineer the file format. I made a simple program that divides 1 by 2 about 20 times.
Next I opened the file in vi to see what was in there:
Aha! This is very enlightening. My conclusions from this little test:
  • It seems that numbers have 4 decimal places
  • Numbers are separated by ASCII character 13 (carriage return)
  • The file extension is .rtf
  • ...but the content is plain text. It has nothing to do with the Rich Text Format.
TextEdit on Mac will not open the file. Only vi and Sublime Text work for me. On to the next phase.

How to generate number files that the Ev3 can read

Now it's just a matter of writing my list of coordinates to file. In Python that looks like this:
xfile = open('x.rtf', 'w')
yfile = open('y.rtf', 'w')

for x,y in pointlist:
    # Write each number with 4 decimal places, separated by ASCII 13,
    # the same format the Ev3 uses for its own files
    xfile.write("{:.4f}".format(x) + chr(13))
    yfile.write("{:.4f}".format(y) + chr(13))

xfile.close()
yfile.close()


Test if the Ev3 can read the numbers

Finally I wrote a little program that reads from the file, performs some math and plots the result to the screen. And behold, when I compare it to the original text file, it works!
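If you want to double-check such a file from the computer side, a few lines of Python will do (a quick sketch, assuming the file was written as shown above):
# Read the numbers back on the laptop; split() handles the ASCII 13 separators
with open('x.rtf') as xfile:
    values = [float(v) for v in xfile.read().split()]
print(len(values), values[:5])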




Building a Mindstorms plotter with two ropes

Today I finally succeeded in building a Mindstorms robot that plots pictures, suspended by two ropes. Check the video. I ran into so much trouble building this that I'll share what I learned.

These are the problems I ran in to:

  • Making the robot lie flat on the door. Gravity can be a bitch. (coming soon)
  • Selecting rope and pulleys for the robot (coming soon)
  • Generating the coordinates for the portrait (coming soon)
  • Getting the Ev3 to read coordinates from a file
  • Building a PID controller that does not reset the motor sensor (coming soon)
  • The math behind a two-rope plotter (coming soon)
  • Making the pen go up and down (coming soon)
Enjoy! I hope you can avoid my mistakes and build an even better plotter.





Wednesday, December 24, 2014

Modifying the original BrickPi case to fit a Raspberry Pi model B+

The standard acrylic plates that come with a BrickPi do not accommodate a Raspberry Pi model B+. The screws don't fit and the holes are in the wrong locations. You need to do some modding to make it work. But then you have 4 USB ports on your lego creation, and a nice and compact micro SD!

What you'll need

  • A 3mm drill
  • An M3 screw with a small head

Step 1: The extra hole

The first thing to do is to drill a 3mm hole in the corner of the bottom acrylic plate, the one without the BrickPi logo. It doesn't have to be super accurate, as we'll be using a larger hole for the other screw. Just put your B+ on the acrylic plate to mark the hole.

Step 2: Enlarge the holes on your Raspberry Pi

Somehow the new B+ has smaller mounting holes than the old model. Carefully enlarge the holes in the circuit board with a 3mm drill.

Step 3: Mount the RPi on the acrylic plate

In the top left corner you'll need that smaller M3 screw. The heads of the screws that come with the BrickPi are too large and would cover the micro USB port and the black 4R7 thingy, so you can't tighten them.
The bottom right screw goes into a larger hole in the acrylic plate that is meant for a Lego peg, so you have some play there.

Step 4: Slide on the BrickPi and assemble the rest

Here's a completed assembly in an unfinished Lego robot.

Optional: Add a bevel to the holes in the acrylic plates

It's hard to insert Lego pegs into the acrylic plates that come with the BrickPi. Using a large drill bit you can manually add a little bevel so they go in more smoothly. You can also use a slowly turning Dremel tool.

Thursday, December 11, 2014

Writing blog posts on blogger with code snippets made easy

In my Mindstorms hacking projects I write quite a lot of code, and I want to blog about it, but the default editor on Blogger.com doesn’t have a button to mark text as code. You can abuse the blockquote, or add the <code> tag by hand, but it’s quite a hassle. The solution proved to be the brilliant StackEdit web app! It’s so amazing, I’m going to pay the little fee they ask.

With it you can write your blog post using markdown, with code blocks just like GitHub has them in readme.md files. Actually, I got the idea to use markdown for blogging while I was writing a readme. Markdown is so much easier to use than a wysiwyg editor or typing all the HTML tags manually.

Here’s how to use it on your blogger.com blog.

1. Edit the html of your blogger blog

In the Blogger dashboard go to ‘Template’ and click the ‘Edit HTML’ button. Then, just above the closing </head> tag, insert this:

<link href='http://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.4/styles/zenburn.min.css' rel='stylesheet'/>
<script src='http://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.4/highlight.min.js'></script>
<script>hljs.initHighlightingOnLoad();</script>

2. Go to StackEdit.io and write something interesting about code

StackEdit is mostly self-explanatory, so start writing and then click the hash icon in the top left. Choose Publish > Blogger and there you are: a great blog post with minimal typing and layout effort!