Monday, March 16, 2015

Google Breakpad and the post crash experience

Google Breakpad has many components to it, but at the most basic level it lets you capture information at the time a crash occurs and upload it to the net. A really cute part of Breakpad is that the binary doesn't need to contain the debug symbols, and the client machine doesn't need to have them at any location either. When you build version $githash you use a Breakpad tool to copy the debug symbols out into separate files. When the user hits a crash they upload a minidump file to a server of your choosing. You can then combine the symbols extracted at build time with the minidump to generate a backtrace with line number information. So software users don't have to know about gdb or lldb or whatnot, how to make a backtrace, or where to paste it.
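
As a rough sketch of that build time step, assuming Breakpad's stock dump_syms and minidump_stackwalk tools (minidump_stackwalk looks for symbols at <root>/<module>/<id>/<module>.sym, where the id comes from the MODULE line of the symbol file; the paths here are illustrative):

import os
import subprocess

def extract_symbols(binary, symbol_root):
    # dump_syms writes Breakpad's text symbol format to stdout.
    sym_text = subprocess.check_output(["dump_syms", binary]).decode("utf-8")
    # The first line looks like: MODULE <os> <arch> <id> <module-name>
    _, _, _, module_id, module_name = sym_text.splitlines()[0].split()
    # Lay the file out where minidump_stackwalk will look for it.
    sym_dir = os.path.join(symbol_root, module_name, module_id)
    os.makedirs(sym_dir, exist_ok=True)
    with open(os.path.join(sym_dir, module_name + ".sym"), "w") as out:
        out.write(sym_text)

extract_symbols("./fontforge", "./symbols")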



I recently updated FontForge's use of Breakpad to use a small server on localhost to report the bug. The application dmg file for FontForge will soon also include the extracted symbols for the build. By telling Breakpad to use a local server, that server can look up the shipped symbols and generate a human readable backtrace with line number information. Because it's also a web interface and running locally, it can spawn a browser on itself. So instead of the dialog from the OSX crash reporter app telling you that there was a crash, you get a web page telling you the same thing. But the web page can use jQuery/Bootstrap (or $ui tool of choice), ask what the user was doing, and offer many ways to proceed from there depending on how the user wants to report things. The https://gist.github.com/ site can be used to report without any login or user accounts. It's also rather handy as a place to stash larger backtraces that might run to 50-100kb.
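
The upload itself is just a JSON POST to the gist API, with no token needed for an anonymous gist. A minimal Python sketch of the same call the reporting page makes from the browser (the description and filename here are made up):

import requests

def upload_backtrace(backtrace_text):
    payload = {
        "description": "FontForge crash backtrace",  # illustrative
        "public": False,
        "files": {"backtrace.txt": {"content": backtrace_text}},
    }
    # No Authorization header: this creates an anonymous gist.
    r = requests.post("https://api.github.com/gists", json=payload)
    return r.json()["html_url"]  # the link to the freshly made gist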

But once you can upload to gist, you get an HTTP link and other URL links to the new gist. So it makes sense from there to offer to make a new GitHub issue for the user too, and to include the link to the gist page in that new issue so that developers can get at the full backtrace. It turns out that you can do this last part, which requires the user to be logged in to GitHub, by redirecting to github/.../issues/new and passing title and body GET parameters. There is a GitHub API, but to report a new issue with it you would need to do OAuth first, and in the libre world it's not so simple to have a location to store the OAuth token securely for next time around. The GET redirect trick nicely sidesteps that situation.
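
Building that redirect is nearly a one liner, since title and body only need URL encoding. A sketch, with the issue text invented for the example:

from urllib.parse import urlencode

def new_issue_url(title, body):
    # github.com/<owner>/<repo>/issues/new honours title and body
    # GET parameters, prefilling the issue form for the logged in user.
    params = urlencode({"title": title, "body": body})
    return "https://github.com/fontforge/fontforge/issues/new?" + params

print(new_issue_url("Crash while editing a glyph",
                    "Full backtrace: https://gist.github.com/..."))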


For those interested in this, the gist upload and the callback that subsequently makes a GitHub issue are both available. Google Breakpad hands the minidump over to a POST handler, which massages the minidump into the backtrace and spawns a browser on itself. The GET side serves up all the html, css, js, and other assets to the browser, and that served html/js, which I linked to at the start of the paragraph, is where the actual upload/reporting of the backtrace takes place.
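
For the curious, the shape of that little server boils down to something like the sketch below. This is not the FontForge code, just the same idea in miniature, with minidump_stackwalk doing the massaging and the port number picked out of thin air:

import subprocess
import threading
import webbrowser
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = 8765
backtrace = ""

class CrashHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Breakpad's callback POSTs the path of the fresh minidump.
        global backtrace
        length = int(self.headers["Content-Length"])
        dump_path = self.rfile.read(length).decode("utf-8").strip()
        # Resolve it against the symbols shipped alongside the app.
        backtrace = subprocess.check_output(
            ["minidump_stackwalk", dump_path, "./symbols"]).decode("utf-8")
        self.send_response(200)
        self.end_headers()
        # Spawn a browser on ourselves to show the report page.
        threading.Thread(target=webbrowser.open,
                         args=("http://localhost:%d/" % PORT,)).start()

    def do_GET(self):
        # The real server also hands out the js/css assets here.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(("<pre>%s</pre>" % backtrace).encode("utf-8"))

HTTPServer(("localhost", PORT), CrashHandler).serve_forever()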

The only thing left to do is to respond to the backtraces that come in, and everybody gets a more stable FontForge out of the deal. It might be interesting to send reports off to a Socorro server too so that month on month statistics are easily available.

Thursday, January 29, 2015

libferris on osx

So libferris now compiles and installs on OSX, thanks to some handy work on Portfiles and macports doing the heavy lifting. I've put the Portfile into the distribution for many of the repositories: ferrisloki, ferrisstreams, stldb4, ferris, fampp2. I've also moved source control over to github -- https://github.com/monkeyiq/ferris

It's still a bit of a bumpy compile for ferris itself: using clang instead of gcc, using a different stdc++ library, the lack of some API calls on OSX relative to Linux, and the assumptions I'd made that IPC, advising the kernel on IO patterns, memory mapping, and again advising on page access patterns would all be available APIs and constants. I have a patch from the compile which I need to feed back into the main libferris repo, making sure it still works fine on Linux too.

So now I can dig into xml files from the command line on OSX too. I still have to test out the more advanced stuff and the web services. The latter use some of the 'Q' magic dust, qjson, qoauth, qtnetwork et al, so they should be fairly robust after the port.

I should also update the primary file:// handler in libferris to use some of the OSX APIs for file monitoring etc to be a friendlier citizen on that platform. But going from no ferris on OSX to some ferris is a great first move. A bundle would be the ultimate goal: /Applications/Ferris installed in a single drag and drop.

Friday, January 9, 2015

Terry 2.0: The ROS armada begins!

It all started with wanting to use a Kinect or other RGBD (depth sensing) camera to do navigation... and ended up, slowly but surely, with moving from a BeagleBone Black and a custom nodejs script that I created as the heart, to a quad core Atom running ROS and many ROS nodes that I created ;)


The main gain with ROS is the nodes that other people have written. If you want to convert RGBD to a simulated laser scan in order to do 2d navigation, that's already available. If you want to make a map and then use it, that code is already there for you. And so is the visualization for these things. I'm not sure I'd have the time to write a 3d robot viewer from scratch and visualize my cut down 'fake' 2d laser scan data from the Kinect in OpenGL. But with ROS I got the joy of seeing the scan change in real time as Terry sensed me move in front of it.

I now have 3d control of the robot arm happening, including optional sinusoidal encoding of movements. The fun part is that the sinusoidal movement can be enabled or disabled without any code changes: I wrote that part as a JointTrajectory shim, so anything that wants smoother movement just publishes to the shim instead of directly to the servo controller itself. The publish and subscribe parts of the IPC that ROS gives you are very easy to get used to and allow breaking the functionality up into rather small pieces if desired. A sketch of the shim idea follows.
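
This is the general shape in rospy, not Terry's actual code; the topic names and the fixed 10 interpolation steps are made up for the example:

#!/usr/bin/env python
import math
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

STEPS = 10  # interpolated points inserted between each pair of waypoints

def ease(a, b, t):
    # Sinusoidal easing: slow start and slow stop between two positions.
    s = (1.0 - math.cos(t * math.pi)) / 2.0
    return a + (b - a) * s

def on_trajectory(msg):
    out = JointTrajectory(joint_names=msg.joint_names)
    for prev, nxt in zip(msg.points, msg.points[1:]):
        span = (nxt.time_from_start - prev.time_from_start).to_sec()
        for i in range(1, STEPS + 1):
            t = float(i) / STEPS
            p = JointTrajectoryPoint()
            p.positions = [ease(a, b, t)
                           for a, b in zip(prev.positions, nxt.positions)]
            p.time_from_start = prev.time_from_start + rospy.Duration(t * span)
            out.points.append(p)
    pub.publish(out)

rospy.init_node("sinusoidal_shim")
pub = rospy.Publisher("/arm/controller/command", JointTrajectory, queue_size=10)
rospy.Subscriber("/arm/smooth", JointTrajectory, on_trajectory)
rospy.spin()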


The arm is one area that is ROS controlled, but not quite the way I want. It seems that MoveIt is the indicated tool for arm control, but I didn't manage to get that to work as yet; the wizard only produced an arm that would articulate on one joint, so more tinkering is needed in that area. Instead I wrote my own ROS node to control the arm. It's all fairly basic trig to get the gripper to an x,y,z relative to the base of the arm, and an easy carry over to keep the gripper horizontal to the base no matter what position the arm is moved to. The option of MoveIt will be considered again in the future though; it can't hurt to have two codepaths to choose from for arm control.
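
For flavour, the trig is roughly the below. The link lengths and angle conventions are invented for the example, and it assumes the target is within reach:

import math

L1, L2 = 0.15, 0.15   # shoulder-elbow and elbow-wrist lengths, metres

def arm_angles(x, y, z):
    base = math.atan2(y, x)        # yaw the whole arm toward the target
    r = math.hypot(x, y)           # reach in the horizontal plane
    d = math.hypot(r, z)           # straight line distance to the wrist
    # Law of cosines for the two link planar arm, elbow-up solution.
    elbow = math.pi - math.acos((L1**2 + L2**2 - d**2) / (2 * L1 * L2))
    shoulder = math.atan2(z, r) + math.acos((L1**2 + d**2 - L2**2) / (2 * L1 * d))
    # Cancel the forearm's pitch so the gripper stays level with the base.
    wrist = elbow - shoulder
    return base, shoulder, elbow, wrist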

As part of the refresh I updated the pan unit for the camera platform. Previously I used a solid 1/4 inch shaft with the load taken by a bearing and the gearmotor turning the shaft directly from below it. Unfortunately that setup had many drawbacks: no ability to use a slip ring, no torque multiplication, and difficulty using an axle end rotary encoder IC to get real world position feedback. The updated setup uses a 6 rpm gearmotor on an offset, adjustable motor mount driving a 24 tooth brass gear. That mates with an 80 tooth gear which is affixed to a hollow 1/2 inch alloy tube. As you can see at the top of the image, I've fed the tilt servo cable directly into the inside of that tube. There's no slip ring right now, but it is all set up to let the USB cable slip through to the base and enable continuous rotation of the pan subsystem, so the Kinect can sweep radar style. One interesting aside is that you can no longer manually rotate the pan system: the gearmotor, even unpowered, will stop you, and the grub screw will slip before the axle turns.


As shown below, the gearmotor is driven by an Arduino which is itself connected to a SparkFun breakout of the TB6612FNG HBridge IC. This combo is attached using double sided 3M tape to a flat bit of channel, and the channel is then bolted to Terry. I've used this style a few times now and quite like it: a single unit and all its wires can be attached and moved fairly quickly.



At first I thought the Arduino gearmotor control and the web interface would be a bit outside the bounds of ROS. But there is an API for the Arduino which gives the nice publish and subscribe with messages that one would expect on the main ROS platform. A little bit of python glue takes the ttyUSB right out of your view, and you are left with a little extension of the main ROS graph right into the MCU. I suspect my 328 screen multiplexer will be updated to use this ROS message API too. Reimplementing packeting and synchronization at the serial port level becomes a little less exciting after a while, and not having to even think about that with ROS is certainly welcome.
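
With rosserial_python's serial_node.py run against the right /dev/ttyUSB device, the MCU's topics look like any other, and the gearmotor can be driven from anywhere on the graph. The topic name and the Float32 duty cycle below are my guesses at an interface, purely for illustration:

#!/usr/bin/env python
# Assumes: rosrun rosserial_python serial_node.py _port:=/dev/ttyUSB0
import rospy
from std_msgs.msg import Float32

rospy.init_node("pan_poke")
pub = rospy.Publisher("/pan/motor", Float32, queue_size=1)
rospy.sleep(1.0)                  # let the subscriber latch on first
pub.publish(Float32(data=0.25))   # quarter duty cycle, one direction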

Below is the motherboard setup for all this. Unfortunately many of the things I wanted to attach use TTL serial, so I needed a handful of USB to TTL bridges. The IMU uses I2C, so it's another matter of shoving a 328 into the mix to publish ROS messages with the useful information for the rest of the ROS stack on the main machine to use at its will.


The web interface has been resurrected and extended from the old BBB driven Terry. This is the same Bootstrap/jQuery style interface, but now using roslibjs to communicate from the browser to Terry. I'm using WebSockets to talk back, which is what I was doing manually from the BBB, but with ROS that is an implementation choice that gets hidden away; you again get a nice API to do ROS-like things such as publishing and subscribing standard and custom messages.


The javascript code below sends an array of 4 floats back to Terry to tell it where you want the arm (x,y,z,claw) to be located. The 4th number lets you open and close the claw in the same command, and the wrist is held horizontal to the ground for you. Notice that the message is declared to be a Float32MultiArray, which is a standard message type. The msg and topic objects can be reused, so an update is just a prod to the array and another publish call. You can fairly easily publish these messages from the command line too for brute force testing.

var topic_arm_xyz = new ROSLIB.Topic({
   ros  : ros,   // an already connected ROSLIB.Ros instance
   name : '/arm/xyzc',
   messageType : 'std_msgs/Float32MultiArray'
});

// x, y, z relative to the arm base, plus the claw open/close amount.
var msg = new ROSLIB.Message({
  data : [ x, y, z, claw ]
});
topic_arm_xyz.publish( msg );
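
The same prod can be thrown from Python on the robot side for that brute force testing without a browser in the loop; a small sketch against the /arm/xyzc topic above:

#!/usr/bin/env python
import rospy
from std_msgs.msg import Float32MultiArray

rospy.init_node("arm_poke")
pub = rospy.Publisher("/arm/xyzc", Float32MultiArray, queue_size=1)
rospy.sleep(1.0)  # give the connection a moment before publishing
pub.publish(Float32MultiArray(data=[0.10, 0.0, 0.20, 0.5]))  # x, y, z, claw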


The learning curve is a bit sharp for some parts of ROS. Navigation requires many subsystems to be brought up, and at first the robot model was visualized 90 degrees out of phase with reality. Most of the stuff is already there, but you need a robot base controller that is compatible. It is also a trap for new players not to have a simple robot model urdf file; without a model, some parts of the system didn't work for me. I'd have liked to have won with the MoveIt control, and will get back to trying to do just that in the future. I think I'll dig around for shoestring examples; something like a very basic three servo arm built with ice cream sticks and $5 servos would make an excellent MoveIt example for hobby ROS folk. Who knows, maybe that example will appear here in a future post.


Sunday, January 4, 2015

One Small Step for Terry, one giant leap for robotkind.

This evening the BeagleBone Black was finally unbolted from Terry. All the sensors, servos, gearmotors, and PWM partial update moosed down RGB DMD screens are controlled from ROS now.

In the process I replaced the physical pan subsystem with a torque multiplied 6 rpm gearmotor, its 80:24 gearing giving an effective 1.8 rpm at the pan. Unlike the previous setup, the torque is now so high that you cannot manually turn the pan system. It occurred to me that this is also a minor hazard: if you got a finger in the gear mesh it would not lead to a "nice place". The pan also now turns on a hollow 1/2 inch metal tube, so I can add a slip ring to allow the pan and tilt to rotate freely at will, and I can much more easily include an IC based pot to close the pan feedback loop with high precision.

It has been interesting going from a board with so many headers for SPI, I2C, UART and GPIO to a general purpose small desktop motherboard running the show. The upside is that things like SLAM and localization, which would be tedious to reimplement myself, are now available. It might be fun to play with Monte Carlo localization on an mbed based MCU (target point at around 100MHz/100kb of SRAM). But that's for another day.

The next trick is combining the launch files for each subsystem into a Terry-Main.launch that brings it all up.

Pictures and video are extremely likely to follow.

Friday, November 28, 2014

FingerTech Mecanum meets Actobotics

Sparkfun sell some nice little omniwheels made by FingerTech Robotics. These come with a grub screw mount for a 3mm axle. While it is said around the webs that one might drill out the mount to accept up to a 6mm axle, I wanted a more flexible solution for mounting these wheels onto an Actobotics structure. It turns out that the four screw mounts (using what I think are x-40 screws) sit in extremely close locations to the four screws on the Actobotics hub mount. Unfortunately it was a tad hard for me to get hold of longer x-40 screws to attach the hub mount, so I ended up taking the wheel apart and replacing the standoffs with Actobotics ones. The result is shown below:


The image below shows the mecanum wheel taken apart. The three standoffs you see vertical in the image are the originals from the wheel. They are about 1 inch long, so you'll want some 1 inch Actobotics standoffs to replace them with. When you unscrew the original standoffs, the hub mount part (the centre of the red alloy) can fall out and be removed. That lets you screw the Actobotics standoffs on and then, on the other side, use slightly longer bolts to attach the hub mount to the wheel to get the assembly shown above.


Apart from giving a clean 6mm hookup for the stepper motors that I plan to use, this modification allows the 1/4 inch hub mount or other sizes to be substituted in instead. So if you want to switch from 6mm to 6.35mm (1/4 inch) axles, you can easily change your mind just by swapping the Actobotics hub mounts.

Sunday, November 23, 2014

Terry: Updated Top Shelf

The Kinect is now connected much closer to the tilt axis, giving a much better torque to hold ratio from the servo gearbox. I used some self tapping screws to attach the channel to the bottom of the Kinect. Probably not the cleanest solution, but it appears to mount solidly, and then you get to bolt that channel to the rest of the assembly. Looking closer, the Logitech 1080p webcam is mounted offset from the Kinect. This should give an enjoyable time combining the 1080p RGB data with the VGA depth mask from the Kinect into a point cloud.


The camera pan/tilt is now at the front of the top shelf and a robot arm is mounted at the back of the shelf. The temptation is high to move the arm onto a platform that is mounted using threaded rod to the back of Terry. All sorts of fun and games to be had with automated "pick up" and move tasks! Also handy for some folks who no longer enjoy having to pick items up from the ground. The camera pan/tilt can rotate around to see first hand what the arm is doing, so to speak.


The wheel assembly is one area that I'm fairly happy with. The Yumo rotary encoder runs at 1024 P/R and is attached using an 8:1 down ratio, giving an effective "ideal world" 13 bit precision (1024 x 8 = 8192 = 2^13 counts per wheel revolution). Yes, there are Hall effect ICs that give better precision, though they don't look as cool ;) The shaft of the motor is the axle for the wheel. It is handy that the shaft is not right in the centre of the motor, because you can rotate the motor to move the wheel through an arc, and thus adjust the large alloy gear until it nicely mates with the brass gear on the rotary encoder.



Lower down, near the wheels, is a second distance sensor which is good out to around 80cm. Its scan rate is much slower than the Kinect's, however.


Things are getting very interesting now: a BeagleBone Black, many Atmel 328s on board, and an Intel J1900 motherboard to run the SLAM software.

Friday, October 31, 2014

Terry 2.0 includes ROS!

What started as a little tinker around the edges has resulted in many parts of Terry being updated. The Intel J1900 motherboard is now mounted in the middle of the largest square structure, an SSD is mounted (the OCZ black drive at the bottom), yet another battery is mounted (a large external laptop supply), and the Kinect is now on the pan and tilt mechanism along with the 1080p webcam that was previously there. The BeagleBone Black has moved to its own piece of channel, and a breadboard is sunk into the main 2nd top level channel.


I haven't cabled up the J1900 yet. On the SSD is Ubuntu and ROS, including a working DSLAM (strangely, some fun and games getting that to compile and then to not segv right away).

I used 3 Actobotics beams, one IIRC a 7.7 incher and two shorter ones. The long beam actually runs along the right side of the motherboard that you see in the image. The beam is attached with nylon bolts and has a 6.6mm standoff between the motherboard and the beam to avoid any undesired electrical shorts. With the two shorter beams on the left side of the motherboard, it is rather securely attached to Terry now. The little channel you see on the right side, up a little from the bottom, is there for the 7.7 inch beam to attach to (behind the motherboard), and there is a shorter beam on this side to secure the floating end of the channel to the base channel.



The alloy structure at the top of the pan and tilt now has a Kinect attached. I used a wall mount plastic adaptor whose nut traps, with great luck and convenience, lined up with the Actobotics holes. I have offset the channel like you see so that the centre of gravity is closer to directly above the pan and tilt. Perhaps I will have to add some springs to help the tilt servo when it moves the Kinect back too far from the centre point. I am also considering a counter balance weight below the tilt, which would also work to stabilize the Kinect at the position shown.



I was originally planning to put a gripper on the front of Terry. But now I'm thinking about using the relatively clean back channel to attach a threaded rod and stepper motor so that the gripper can have access to the ground and also the table top. Obviously the cameras would have to rotate 180 degrees to be able to see what the gripper was up to. Also, for floor pickups the tilt might have to handle a reasonable downward "look" without being too hard on the servo.

There were also some other tweaks. A 6 volt regulator is now inlined into a servo extension cable, and the regulator itself is bolted to some of the channel. Nice cooling, and it means that the other end of that servo extension can take something like 7-15v while the servo still gets the 6v it wants. That actually runs from the same battery pack as the drive wheels (8xAA).

One thing that might be handy for others who find this post: the BeagleBone Black case from SparkFun attaches to Actobotics channel fairly easily. I used two cheesehead M3 nylocks and had to force them into the enclosure. The nylocks lined up with the Actobotics channel, so the attachment was very simple. You'll want a "3 big hole" or larger bit of channel to attach the enclosure to. I attached it to a 3 holer and then attached that channel to the top of Terry with a few threaded standoffs. That simplifies attach and remove should that ever be desired.

I know I need slip rings for the two USB cameras up top, and for the tilt servo as well. I can't use a USB hub up top because both of the USB devices can fairly well saturate a USB 2.0 bus on their own. I use the hardware encoded mjpeg from the webcam, which helps with bandwidth, but I'm going to give an entire USB 2.0 bus to the Kinect.