oops, i adopted a UR5 robot arm

here it is in its new home!

 

thanks dane for spotting this arm online and davidben for helping with the adoption fees

ur5 life history

the person i bought it from says that it lived its life as an R&D bot making pizzas at a pizza factory, then got auctioned off. he saw it, rented a home depot truck, and went to get it. (more stories, but i should ask permission first). it was part of a bakerbot, which i looked up at https://www.apexmotion.com/baker-bot-cobot

Screenshot from that apexmotion site of ur5s decorating cakes, stamping cookies, etc.

my diary, aka “well that escalated suddenly”

to be honest, i didn’t quite intend for my robot arm collection to escalate so fast.

i think early april i decided to get the SO-ARM101 desktop arms,

and now it’s late april and i have a ur5 arm, bringing my total to five robot arms >__>

from so-arm101

for the so-arm101 i got the “pro” unassembled kit – basically a pile of six 12.4V feetech servos for the follower arm, plus a pile of six 6V feetech servos for the leader arm, some wires, and two motor controllers. i proceeded to print the arms themselves (each print took 12 hours!)

image of x1 carbon plate (thanks erons for x1 adoption fees, A++ tactic to distract from life’s woes) with half-printed parts for an so-arm-101

(The picture above is actually where I cancelled one halfway through. I was cleaning and accidentally tripped over the power supply, shutting off the printer mid-print six hours in. The bambu did an amazing job recovering, but after a few layers there was an obvious xy shift and for the tight tolerances on this print I didn’t think it would be okay)

to a ur5?

and somehow i now have a ur5, which i sorta daydreamed about for years, but the price was quite prohibitive even on ebay.

ur5 with controller and pendant


i think what happened is that i got to play with robot arms recently (see: job apps) and y’know that feeling when you smell food and suddenly realize you’re starving?

that’s how it felt to play around with robot arms again.

…  and so i now own 5 robot arms.

feed me, i’m starving

Initial UR5 thoughts (pictures later)

when delivered, it took five minutes to set up and get working 0:

there is something nice about that i have to admit. just plug the controller into the wall, plug the arm into the controller, turn it on with the pendant, and it just works !!!

it’s nice that it homes slowly. collaborative robots are great.

it’s got a USB drive inside the control box: the whole robot runs off of a usb drive !

there’s an M8 connector near the end effector for power and potentially data. 24V power, i think. this is where the gripper would attach.

the end effector plate is https://millibar.com/product/mtc-345-rs “MTC-345-RS” Low Profile Series Manual Tool Changer – Robot Side (thanks previous owner)

deadline project

with any project like this, it’s important to have a deadline, or else enthusiasm peters out and the robot arm becomes an expensive brick / toy

the goal is to have a demo for the first spring farmer’s market at union sq — the original idea was to collect calligraphy data — however that’s a bit tight (two weeks), so i’m thinking more of a simple interactive demo (eg the robot tracks your palm: as you move your palm around, the robot mirrors your movement)
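as a note-to-self, the mirroring demo is mostly a coordinate-mapping problem once some hand tracker gives you a palm position. a minimal sketch (function name and workspace ranges are all made up by me; the actual hand tracking and UR5 motion commands live elsewhere):

```python
# sketch of the palm-mirroring mapping (hypothetical helper, not UR code):
# take a normalized palm position from a webcam hand tracker and turn it
# into a clamped end-effector target. y (depth) stays fixed for safety.

def palm_to_target(u, v, x_range=(-0.4, 0.4), z_range=(0.2, 0.6)):
    """u, v in [0, 1] from the camera image; returns (x, z) in meters."""
    u = min(max(u, 0.0), 1.0)   # clamp so out-of-frame detections
    v = min(max(v, 0.0), 1.0)   # can never command an out-of-bounds pose
    x = x_range[0] + u * (x_range[1] - x_range[0])
    # image v grows downward, so flip it to get height
    z = z_range[0] + (1.0 - v) * (z_range[1] - z_range[0])
    return x, z
```

the clamping is the point: whatever the tracker outputs, the commanded target stays inside a box i chose.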

The weekend / personal project, socks!

i did get excited about having it do laundry, then i unfortunately realized laundry folding demos all use two robot arms. oops. so aside from the above demo, in terms of what i want it to actually do at home, i think i’ll be backing down to a sock sorting robot.

problem statement (approximately): i have a large pile of mismatched socks (singletons). after each laundry run i pair the fresh socks, but then i need to pair again against the mismatched pile / add to the pile as appropriate, which is annoying because the pile is quite large. it’d be nice to just … not.

to simplify as much as possible (ideally this is a weekend project, or at most a weeklong project, to an MVP demo) i decided to start with the so101 arms. they have very limited reach, so i cut up some kleenex boxes. if you read up online, people talk about radix sort and such, but again: simplify. i put in three pairs of socks (of equal length) in obviously different designs.

software system sketch

The goal will be for it to pull out one sock (that’s the standard “collect teleop data, train” bit) and lay it out. We take a picture using the overhead camera.

We use an off-the-shelf (i hope?) image segmentation model to find the sock within the camera image.

Store the image: Assign the sock_image an id and store the associated image in either a pandas dataframe (+pickle) or sqlite or just as flat file.

We will also need a function to say if two images show a matching pair of socks.
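for three visually distinct pairs, pair_found() might not need anything fancy. a toy sketch comparing coarse color histograms — pure python, with “images” as lists of (r, g, b) tuples; a real version would work on the segmented crops and probably use an off-the-shelf embedding instead:

```python
# toy pair_found() sketch: coarse color histograms + histogram
# intersection. my own stand-in, not anything from lerobot.

def color_histogram(pixels, bins=4):
    """pixels: iterable of (r, g, b) tuples in 0..255.
    Returns a normalized histogram over bins**3 coarse color cells."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    n = 0
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
        n += 1
    return [h / n for h in hist]

def pair_found(pixels_a, pixels_b, threshold=0.5):
    """Histogram intersection is in [0, 1]; 1.0 = identical distribution."""
    ha, hb = color_histogram(pixels_a), color_histogram(pixels_b)
    return sum(min(a, b) for a, b in zip(ha, hb)) >= threshold
```

the threshold would need tuning against real lighting, but with only three obviously-different designs the margin should be large.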

Now for the loop. I mocked out the following setup to deal with the limited reach of the so101 arm. We have four bins: one on the left (I’ll call it pile A), two in the middle (bin 1 and bin 2), and a fourth (pile B). (The bins are just halved kleenex boxes.)

Finally we have a bowl on the right. The bowl can be bigger since we only need to drop things in, not retrieve them.

the so101 follower arm with four boxes and a bowl, one with socks in it, as a mock setup for sorting socks. (note: the big carton on the left side has walls so that when the robot arm drops socks, pushes them around, etc. the socks are less likely to end up on the floor)

We start with pile A, where some human dumped all the socks. The robot arm pulls out a sock and drops it in the center (ideally so the sock is mostly flat) to take a picture.

Put the sock in bin 1 (I might not bother with training for this one, unless I want recovery when/if it fails. Might just tell it the x,y center of the box and have it open the claw over it for now).

Pull out another sock and drop it in the center for inspection.  If pair_found(), put that sock in the bowl, then pick up the matching sock and put that in the bowl too.

Otherwise, put the inspected sock in an empty bin (bin 2). Pull another sock. If it matches a binned sock, do the pair_found() routine again. If at any point we pull a sock, find no match, and have no empty bins, we toss the newly inspected sock into pile B, which we treat as the “discard pile.”

Once the first pile is empty, we have inspected all socks once.

Now we determine if all the socks in the discard (pile B) are singletons — if so, we are done! But if not (aka there are unsorted pairs in there), we do the same process again, except pile B is now our starting pile and pile A is our discard. Repeat until the end state is reached.

(Note: with another bin, when we pull a singleton while pairs are still left, we could put it in a bin-of-singletons. But we don’t have enough space for that, alas.)
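the whole loop, sketched as plain python with the motion bits mocked out. socks are just labels here, “matching” is label equality, and none of this is lerobot code; the no-progress guard covers the edge case where two singletons clog both bins (which the extra bin-of-singletons would fix):

```python
# simulation of the sorting loop; the pick/place/photograph steps are
# stand-ins for the real teleop-trained policies.

def sort_socks(pile_a):
    pile, discard = list(pile_a), []
    bowl, bins = [], []               # bins: at most 2 parked unmatched socks
    while True:
        matched_this_round = 0
        while pile:
            sock = pile.pop()         # pull a sock, lay it flat, photograph
            if sock in bins:          # pair_found() against a parked sock
                bins.remove(sock)
                bowl += [sock, sock]  # both halves of the pair go in the bowl
                matched_this_round += 1
            elif len(bins) < 2:
                bins.append(sock)     # park it in an empty bin
            else:
                discard.append(sock)  # no empty bin: toss into the discard pile
        remaining = bins + discard
        if len(set(remaining)) == len(remaining):
            break                     # everything left is a singleton: done
        if matched_this_round == 0:
            break                     # singletons clogged both bins; give up
        pile, discard = discard, []   # swap piles and rescan
    return bowl, bins + discard       # (paired socks, leftover singletons)
```

running it on three pairs in a scrambled order sorts everything into the bowl in two passes, which matches the pile-swap description above.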

Fin.

time to sleep so i can wake up at 5AM to go to my safety at sea class, which i signed up for back when i thought i’d be sailing this year =/ alas. don’t see much of that in my future now.

back to robot arms!

i got excited about robots again, which is a happy thing for me.* i watched some young ‘un training up an arm to put legos in a bowl in the space of a few hours and was curious, and holy smokes, manipulation has just bounded ahead in the last 2-3 years.

* apparently it’s been about 10 years, obligatory wow i’m old, the staubli arm post is here: https://orangenarwhals.com/2017/07/staubli-arm/

i decided to invest in my future and get a pair of robot arms (it’s great to not be a self-employed grad student anymore), it’s ca. $300 for two robot arms (T__T so expensive) — one a leader, one a follower. the name of the game (for at least a year or so) has been teleoperation data (?? i guess they just decided to throw money and scale this) which seems dumb but has been remarkably effective (see: bitter lesson).

3d print

i even took the STLs and sliced them with tree supports myself *gasp* instead of printing a model from makerworld eheh. See: https://github.com/TheRobotStudio/SO-ARM100#printing-the-parts

assembly thoughts

well i’m a bit lazy to document everything, but here are some thoughts:

CHINAMIXELS — as a friend put it — chinese dynamixels ! they get position feedback out. they’re still expensive ($20 each), but at least i no longer make $20k a year, so eh?

the main issue with the assembly was that the screw holes were too small for the screwdriver to fit through, so i was drilling out the holes with the screwdriver … also i realized a bit late that there are official docs for the seeedstudio version, not just generic ones,

https://wiki.seeedstudio.com/lerobot_so100m_new/

and also that there’s both a 12.4V power supply and a 5V power supply.

also, assigning ids to motors is ideally done before assembly, because it’s a pain to detach and reattach the servo wires after assembly.

also, getting the horns onto the servos does involve some force, and some trickery to get them off again (wiggle side to side with a screwdriver).

also, why didn’t they just put numbers and labels on the 3d printed parts ?? it takes some figuring out which piece goes where. SIGH. i would’ve added those features *shakes head*

alright. anyway, after some fuss i assembled the arms. not sure i got the handle right (the purple one is the “leader arm”), but eh.

so, after some fuss with the motor id stuff, followed by the calibration (the follower does seem smart enough to not go past the bounds you calibrate it to, while still mapping correctly to the leader arm) …
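my mental model of that calibration mapping, as a made-up sketch (definitely not lerobot’s actual code): normalize each leader joint reading within its calibrated range, clamp, then map into the follower’s calibrated range, so the follower tracks the leader but can never exceed its own bounds.

```python
# hypothetical per-joint leader -> follower mapping. raw values here are
# raw servo position counts; the four bounds come from calibration.

def map_joint(raw, leader_min, leader_max, follower_min, follower_max):
    t = (raw - leader_min) / (leader_max - leader_min)  # 0..1 in leader range
    t = min(max(t, 0.0), 1.0)       # clamp: follower stays inside its bounds
    return follower_min + t * (follower_max - follower_min)
```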

teleoperation

i finally got teleop working 😀

keep truckin’ through the readme, set up cameras

alright, so i have a wrist-mount camera, which is like a raspberry pi camera but with a board-to-USB cable instead of a ribbon cable 0: then i printed out the camera adapter

https://github.com/TheRobotStudio/SO-ARM100/tree/main/Optional/SO101_Wrist_Cam_Hex-Nut_Mount_32x32_UVC_Module

(shown here mounted with tape, but actually it just takes 2x M3 screws and then the camera module is screwed in using four of the leftover feetech servo screws)

https://wiki.seeedstudio.com/lerobot_so100m_new/#if-using-a-regular-camera

& collect data: input = teleop motor joints + camera feeds

then after that i got the weird training code semi-working (really need to figure out the keyboard controls; i killed my whole python environment trying to get them working).

i recorded an episode and got episode replay to play it back

[ Note: i have some built-in fear from my first experience with commercial robot arms, which was an industrial staubli machine that moved FAST. it was not a collaborative arm and it would commit violence on its way to a home position without a second thought. ]

there’s tape on the bottom of the button to keep it from sliding around. i put a taped cross on it on the theory that it would make learning faster, but decided the task was too annoying to try to collect an hour of data for.

what’s next?

in the above data collection, i wanted to give it the task of “turn on the light” — but actually that button is really annoying to press with the robot arm / manipulator.

i feel apologetic to my cat for making it push buttons.

so, what’s next? actually collecting data and training.

and after that … i’m getting a ur5 tomorrow huehuehue

Fin.

[ personal note: life has been topsy turvy*, so i’ve been pretty quiet. but i also told myself that once i graduated (my phd) i’d be free to be really honest about things. i’ve been following general advice (how to be professional) but i’m starting to feel like i get better results being myself, because, well, i’m myself. (this includes my derpy enthusiastic side, but also my self-studying-physics-C self that was fearless about whiteboarding math)

*i’m unemployed again *laugh cry* now i have stories aplenty. from 16x A100 GPUs idling doing nothing, to *other stuff* . at least i did manage to kill some of my imposter syndrome through just how bizarre my jobs have been. time to look for an in-person job, which should be easy with robots ]

captain’s log: i found a job! + going to a xmas tree farm

well, trying to kickstart my blogging again, i found a job 🙂 at a small startup (<10 people). i’ve joined the ai crowd. it is good to not have to prep for interviews !

anyway, so here is us getting an xmas tree on dec 6th apparently — wow, i have not blogged in AGES.

here is the tree, complete with plastic baubles, LED lights, origami decorations, and origami tree topper

proof that it is a real tree! so many leaves.

the birth grounds of our tree. all the trees are a flat cost so that people don’t cut down all the baby trees


the tree farm has this machine that wraps the tree, and it makes it huggable !

TODO: insert videos of the baling and auger process.

hurray i blogged!

projects blog (nouyang)