openCV, ubuntu 14.04, and researching robot portraiture techniques for Robot Art Competition 2016

For the robotart.org competition, with a ton of help, I’ve gotten a 6-axis arm to work (it accepts x,y,z coordinates).

I configured openCV on ubuntu like so: http://www.samontab.com/web/2014/06/installing-opencv-2-4-9-in-ubuntu-14-04-lts/

(The guide has a few steps tacked onto the official linux installation instructions to hook everything up properly; they’re explained on the site, and I’ll just make a quick note of them below:

sudo gedit /etc/ld.so.conf.d/opencv.conf
  • add the line: /usr/local/lib
sudo ldconfig
  • then add these two lines to your ~/.bashrc:
  • PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig
  • export PKG_CONFIG_PATH

) (openCV takes up a decent chunk of space!)

I haven’t used C++ before, though I’ve used C. To run some examples, such as this hough transform:

https://github.com/Itseez/opencv/blob/master/samples/cpp/tutorial_code/ImgTrans/HoughLines_Demo.cpp

$ g++ houghdemo.cpp -o app `pkg-config --cflags --libs opencv`

This creates a file called “app” which can be run like so:
$ ./app face4.jpg

Man, canny edge detectors on faces look super creepy and not at all recognizable. Also, I don’t think Hough is the way to go for recognizable faces.
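
For reference, here is roughly the same pipeline through the python bindings (a minimal sketch; all the threshold and hough parameters below are guesses to tune per image):

# rough python equivalent of the canny + hough pipeline above;
# parameter values are guesses, to be tuned per image
import cv2
import numpy as np

img = cv2.imread('face4.jpg', 0)             # 0 = load as grayscale
edges = cv2.Canny(img, 50, 150)              # low/high hysteresis thresholds
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                        minLineLength=30, maxLineGap=10)

out = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
if lines is not None:
    for x1, y1, x2, y2 in lines.reshape(-1, 4):   # flatten across cv2 versions
        cv2.line(out, (x1, y1), (x2, y2), (0, 0, 255), 1)
cv2.imwrite('hough_preview.png', out)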

Screenshot from 2016-01-14 17:16:26

Alright, so what are some alternative approaches?

stroke-based approach?

I started looking into the general problem of “stroke-based approaches” which will probably be useful later. Here is a good SIGGRAPH paper from 2002 I was skimming through, supplementing with youtube videos when I got confused:

http://web.cs.ucdavis.edu/~ma/SIGGRAPH02/course23/notes/S02c23_3.pdf

http://stackoverflow.com/questions/973094/easiest-algorithm-of-voronoi-diagram-to-implement

I found this comparison of lines only versus allowing different stroke widths (still one color) very compelling:

http://www.craftsy.com/blog/2013/05/discover-the-secrets-to-capturing-a-likeness-in-a-portrait/

It would be interesting to get different stroke widths involved. I wonder what the name of the final photoshop filter (the one that reduced the image to simple shapes) is.

Inkscape’s vector tracing of bitmaps is (according to the credits) based on Potrace, created by Peter Selinger:

potrace.sourceforge.net
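
For later reference, potrace takes a bitmap (pbm/bmp) as input and can emit SVG directly:

$ potrace face.pbm -s -o face.svg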

Alright, not quite doing the trick..

portraits

Okay, there’s this sweet 2004 paper “Example-Based Composite Sketching of Human Portraits” that Pranjal dug up.

http://grail.cs.washington.edu/wp-content/uploads/2015/08/chen-2004-ecs.pdf

They essentially had an artist generate a training set, then parameterized each feature of the face (eyes, nose, mouth), with a separate system for the hair. The results are really awesome, but to replicate them I’d have to have someone dig up 10-year-old code and try to get it to run, or else generate my own training set, wade into the math, and code it all by myself.

Given my limited timeframe (I essentially have two more weeks working by myself), I should probably focus on more artistic and less technical implementations. Yes, the robot has six axes, but motion planning is hard, and I should focus on getting an acceptable entry done or risk having no entry at all.

Stippling

An approach that would be slow but would probably capture the likeness better would be to get stippling working. Evil Mad Scientist Labs wrote a stippler in Processing for their egg drawing robot, which outputs an SVG.

http://www.evilmadscientist.com/2012/stipplegen2/

http://wiki.evilmadscientist.com/StippleGen
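
For flavor, the core of StippleGen’s approach (weighted voronoi stippling, from Adrian Secord’s 2002 paper) is compact enough to sketch. This is a rough, slow numpy/scipy version of the idea, not StippleGen’s actual code:

# weighted voronoi stippling sketch: seed dots in dark areas, then use
# lloyd relaxation to move each dot to the darkness-weighted centroid
# of its voronoi cell (cells approximated per-pixel with a kd-tree)
import numpy as np
from scipy.spatial import cKDTree
import cv2

img = cv2.imread('face4.jpg', 0).astype(np.float64)
h, w = img.shape
darkness = 1.0 - img / 255.0        # dots should gather where the image is dark

np.random.seed(0)                   # seed dots by rejection sampling
pts = []
while len(pts) < 2000:
    x, y = np.random.uniform(0, w), np.random.uniform(0, h)
    if np.random.uniform() < darkness[int(y), int(x)]:
        pts.append((x, y))
pts = np.array(pts)

ys, xs = np.mgrid[0:h, 0:w]
pix = np.column_stack([xs.ravel(), ys.ravel()]).astype(np.float64)
wts = darkness.ravel() + 1e-9       # tiny floor so no cell is weightless
for _ in range(20):
    owner = cKDTree(pts).query(pix)[1]    # nearest dot for every pixel
    wsum = np.bincount(owner, weights=wts, minlength=len(pts))
    cx = np.bincount(owner, weights=wts * pix[:, 0], minlength=len(pts))
    cy = np.bincount(owner, weights=wts * pix[:, 1], minlength=len(pts))
    ok = wsum > 0
    pts[ok] = np.column_stack([cx[ok] / wsum[ok], cy[ok] / wsum[ok]])

np.savetxt('stipples.txt', pts)     # x,y per dot, ready to preview or draw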

Presumably the eggbot software converts the svg to gcode at some point.

http://wiki.evilmadscientist.com/Installing_software

If I can’t figure out where their generator is, or if I want to stick to python, there seems to be PyCAM.

Todo

read http://blog.otoro.net/2015/12/28/recurrent-net-dreams-up-fake-chinese-characters-in-vector-format-with-tensorflow/

Calligraphy

First, inspiration: a 250-year-old writer automaton, where you can swap out cams to change the gearing and thus what it writes. Crazy!

okay, so more modern. this robot arm uses the Hershey vector fonts to draw kanji.

https://en.wikipedia.org/wiki/Hershey_fonts
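
The hershey format itself is charmingly hackable: in the .jhf distribution, each glyph line stores coordinates as printable characters offset from ‘R’, with the two-character pair ” R” marking pen-up between strokes. A python sketch of a decoder, as I understand the format:

# decode the coordinate portion of a .jhf hershey glyph line (i.e. the part
# after the glyph number and vertex count); every character is a coordinate
# offset from 'R', and the pair " R" means "pen up"
def decode_hershey(data):
    left = ord(data[0]) - ord('R')       # first pair: left/right extents
    right = ord(data[1]) - ord('R')
    strokes, cur = [], []
    body = data[2:]
    for i in range(0, len(body), 2):
        pair = body[i:i + 2]
        if pair == ' R':                 # pen-up marker: finish this stroke
            if cur:
                strokes.append(cur)
            cur = []
        else:
            cur.append((ord(pair[0]) - ord('R'),
                        ord(pair[1]) - ord('R')))
    if cur:
        strokes.append(cur)
    return left, right, strokes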

Turns out there is an open source SVG font for Chinese (Hershey does traditional chinese characters, not simplified), so now I can write messages to my parents :3 if I can convert the svg into commands for the robot.

https://en.wikipedia.org/wiki/WenQuanYi

Haha, there is even a startup that writes notes for you…

http://www.wired.com/2015/02/meet-bond-robot-creates-handwritten-notes/

Evil mad scientist labs has done some work on the topic.

http://www.evilmadscientist.com/2011/hershey-text-an-inkscape-extension-for-engraving-fonts/

https://github.com/evil-mad/EggBot/

This seems tractable. The key step is SVG to gcode; I should be able to roll my own without too much difficulty, or else use existing libraries.
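
As a sanity check that rolling my own is plausible, here is a minimal python sketch that handles only <polyline> elements; the pen-up/pen-down Z heights are made up, and real SVGs also contain <path> beziers, which is where existing libraries would earn their keep:

# minimal svg -> gcode sketch (polylines only; Z heights are placeholders)
import xml.etree.ElementTree as ET

PEN_UP, PEN_DOWN = 5.0, 0.0

def polyline_to_gcode(points):
    x0, y0 = points[0]
    g = ["G0 Z%.2f" % PEN_UP,            # lift pen
         "G0 X%.2f Y%.2f" % (x0, y0),    # travel to start of stroke
         "G1 Z%.2f" % PEN_DOWN]          # lower pen
    for x, y in points[1:]:
        g.append("G1 X%.2f Y%.2f" % (x, y))
    g.append("G0 Z%.2f" % PEN_UP)
    return g

root = ET.parse('drawing.svg').getroot()
ns = '{http://www.w3.org/2000/svg}'
gcode = []
for pl in root.iter(ns + 'polyline'):
    pts = [tuple(map(float, p.split(',')))
           for p in pl.get('points').split()]
    gcode.extend(polyline_to_gcode(pts))
print('\n'.join(gcode))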

misc. videos

^– hi again disney research

impressive with just derpy hobby servos

and of course, mcqueen

hi 2016 (2 servo drawing robot arm, tripod gait 12 servo hexapod, visit to NASA, quadcopter tuning, etc.)

hm, haven’t updated in a while.

i built a lot of robots with my parents over the winter break. i built a robot arm and refreshed on inverse kinematics; more specifically: make sure your servos are rotating as you expect. IK goes counterclockwise, since angles increase that way, but your servos may increase in a clockwise direction… a simple map(theta, 0, 180, 180, 0) will fix your problem if you catch it.
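
for reference, a python sketch of the 2-link math with that fix (link lengths and the 0-180 servo range are placeholders; the arduino code linked below does the equivalent):

# 2-link planar inverse kinematics, one of the two elbow solutions
import math

L1, L2 = 10.0, 10.0                     # link lengths, same units as x and y

def ik(x, y):
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))      # law of cosines, clamped
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return math.degrees(theta1), math.degrees(theta2)

def to_servo(theta):
    # IK angles increase counterclockwise; a clockwise-turning servo needs
    # the mirror image. python version of map(theta, 0, 180, 180, 0).
    return 180.0 - theta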

2016-01-01

processing takes in x,y coordinates drawn on the screen and spits them out to the arduino over serial, which does the inverse kinematics and spits out the theta values to the servos

https://github.com/NarwhalEdu/CopyCat/blob/master/Code/basicsIK/basicsIK.ino

or for the one where it draws what you draw on the screen, https://gist.github.com/nouyang/b312b9ea5c67baa0c914
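
a python stand-in for the processing side might look like this (assuming pyserial; the port name and the ‘x,y\n’ message format are guesses; check what the .ino actually parses):

# stream x,y targets to the arduino over serial (pyserial).
# port name and message format are assumptions, not from the gist.
import time
import serial

arm = serial.Serial('/dev/ttyACM0', 9600)
time.sleep(2)                          # the arduino resets when the port opens
for x, y in [(5.0, 10.0), (6.0, 10.0), (7.0, 10.0)]:
    arm.write(('%.2f,%.2f\n' % (x, y)).encode())
    time.sleep(0.05)                   # crude pacing so the servos keep up
arm.close()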

also tried to face.

it does not face well, in part because i have derpy three-year-old code

face3

this processing code requires a lot of processing libraries. it thresholds the image, performs canny edge detection, then runs a walking algorithm (scan the image in x and y for black pixels; at each black pixel, see if its neighbors are black as well, then walk along them) to turn the edges into vectors. then it outputs to the robot, but the robot is limited by resolution (arduino servo library) and cheap hobby servo overshoot.
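
the walking part is only a handful of lines; a python sketch of the idea (not the actual processing code):

# scan for an unvisited edge pixel, then greedily follow 8-connected
# neighbors until the chain dies out; each chain becomes one polyline
import numpy as np

def walk_edges(edges):                 # edges: 2d array, nonzero = edge pixel
    h, w = edges.shape
    visited = np.zeros((h, w), dtype=bool)
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    paths = []
    for y in range(h):
        for x in range(w):
            if not edges[y, x] or visited[y, x]:
                continue
            visited[y, x] = True
            path, cy, cx, moved = [(x, y)], y, x, True
            while moved:
                moved = False
                for dy, dx in nbrs:
                    ny, nx = cy + dy, cx + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and edges[ny, nx] and not visited[ny, nx]):
                        visited[ny, nx] = True
                        path.append((nx, ny))
                        cy, cx, moved = ny, nx, True
                        break
            paths.append(path)
    return paths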

below you can see the preview in python. (basic code; I basically copied the output from processing into a text file and added some python code to plot the values)

the idea is to check that the image is within the working envelope of the arm. the IK is fixed with the arm “up”.

face2

faceservo

problem with the walking algorithm: it adds a box around the image. irritating. need to rewrite the code. looking into open cv.

i also rehashed my hexapod project with 12 servos and popsicle sticks

hexapod

basically this https://github.com/nouyang/18-servo-hexapod/blob/master/arduino_may13_2011.pde

but modified to work with the servo configuration on the rectangular robot, and added code to allow you to step through the gait with “j” and “k”: https://gist.github.com/nouyang/d9b6474e3ee412b9b05b
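
the j/k stepping is basically indexing into a list of gait phases; a toy python sketch of the idea (the phase list here is descriptive only; the real servo angles live in the gist):

# step through tripod gait phases with j (back) and k (forward)
import sys

PHASES = [
    "lift tripod A (legs 0, 2, 4)",
    "swing tripod A forward while tripod B strokes back",
    "lower tripod A",
    "lift tripod B (legs 1, 3, 5)",
    "swing tripod B forward while tripod A strokes back",
    "lower tripod B",
]

i = 0
print("phase %d: %s" % (i, PHASES[i]))
for line in sys.stdin:                 # type j or k, then enter
    key = line.strip()
    if key == 'k':
        i = (i + 1) % len(PHASES)
    elif key == 'j':
        i = (i - 1) % len(PHASES)
    print("phase %d: %s" % (i, PHASES[i]))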

need to implement the other gaits. also, theirs moves so smoothly; envious, but they have a lasercutter :3

worked on the quad; now stuck at the calibration stage 😡. because i have not built a quad before, i could not push through this in a day or two, unlike the drawing arm and hexapod.

quad

made from a sad clothes drying rack we took apart

transmittercable

we couldn’t find the original cable for the transmitter, so we connected the ports up with an FTDI -> USB cable as per http://psychoul.com/electronics/how-to-make-your-own-usb-cable-for-hk-t6a-calibration

zero

used http://www.sgr.info/usbradio/download.htm and calibrated my servos to zero… took a while to realize it *can* and *should* read the current values (guess my wires were loose), but then the values became a lot easier to input. used the kk2 screen to fix some controls that were reversed from what the kk2 expected (left = left and not right, etc.). zeroed all the values on the kk2. turns out (minus the flipped controls) I could zero just as well using the trim knobs on the controller itself.

went to visit the NASA space museum in houston. they had a little robot that made and served you froyo. adorable.

nasaicecream

also, some regal looking hexapods in the actual NASA workplace.

nasarobot

at MITERS I got a robot arm working with lots of help from MITERS / London Hackerspace / john from BUILDS. For the robot art competition: http://robotart.org/

i’m now robot art-ing. here i am using Fengrave on a black and white image with appropriate offsets to produce gcode (well, limited to G0 and G1 commands)

fengrave

robotdrawing

face code is still derpy. (the streaks are because the gcode translator i wrote goes to the x,y,z position in a single move, instead of x,y and then z). too many x,y points. draws slowly.
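
the fix, sketched in python: never change z and x,y in the same move (pen heights are placeholders). e.g. move_to(10, 10, 0, 5.0, print) prints the travel move and then the pen-down as separate lines:

# split any move that changes both z and x,y: lift before traveling,
# lower only after arriving
def move_to(x, y, z, cur_z, send):
    if z > cur_z:                  # pen going up: lift first, then travel
        send("G0 Z%.2f" % z)
        send("G0 X%.2f Y%.2f" % (x, y))
    elif z < cur_z:                # pen going down: travel first, then lower
        send("G0 X%.2f Y%.2f" % (x, y))
        send("G1 Z%.2f" % z)
    else:                          # same height: normal drawing move
        send("G1 X%.2f Y%.2f" % (x, y))
    return z                       # caller keeps track of current pen height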

face

michael made a crayon extruder (= metal tube + power resistor) and also a pen mount. crayons = hard to control flow rate. it started making a square, then pooped out a lot of melted crayon. alas.

crayonextrusion

learned a lot of patience dealing with old manuals and 20-year-old operating systems / controllers. the main issue turned out to be a dumb calibration assumption (the robot had arrows; should have ignored them and used the indentations instead).

https://github.com/miters/gdmux (gcode -> V+)

also, i learned about the oscilloscope rs232 decoder! had to invert it to get it working properly (yes, zeros are high in rs232: a logical 0 is a positive voltage, the opposite of TTL serial). scope the ground and tx line. bam, now you can check whether you are actually transmitting all the carriage returns and line feeds you need…
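
relatedly, on the sending side it helps to be explicit about line endings. a pyserial sketch (port, baud, and the command itself are placeholders):

# send CR+LF explicitly instead of trusting whatever print() appends
import serial

port = serial.Serial('/dev/ttyUSB0', 9600)
port.write(b'WHERE\r\n')     # placeholder command; the explicit \r\n is the point
port.close()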

2016-01-13

currently: reading up on image processing. openCV. http://web.cs.ucdavis.edu/~ma/SIGGRAPH02/course23/notes/S02c23_3.pdf

terse update. more details available if questions exist.

many thanks to my parents for being excited and not jaded

DIY Menstrual Cups & Hack4Fem

Background on Menstrual Cups

I’ve been throwing around the idea of customized or instrumented menstrual cups for a while.

Menstrual cups, typically made of medical-grade silicone, are cups that catch the blood flowing out of the cervix and typically only need to be emptied twice a day. They are flexible, so you fold them for insertion and they open up once inside. They exist in collapsible, ball-valve, and other forms.

For why I love them (well, I actually switched to tampons first):

so recently I started regularly using tampons, they are amazinnnnggg

seriously
every month my sleep schedule would get massively disrupted, because I slept uneasily, sometimes even in an upright sitting position, ready to jump awake when blood inevitably started to go everywhere, and also I’d have to wash my sheets and multiple sets of underwear / pants each month, often by hand
or i’d store pads in my bookbag and a month later when I needed them again, they’d be full of resistors and other bookbag filth, since the pad packaging isn’t really watertight

Some of my friends like the concept of menstrual cups, but have various problems with removal and leakage. I speculate that custom cups could help, although more analysis is needed to determine the cause and the variation. Hence this project.

I also really like the idea of a citizen science project measuring variation in periods. Other people claim a Diva cup is supposed to hold half your period’s worth of blood; I definitely bleed at least four cup-fulls during my period, so I wonder if my bleeding is heavy. If lots of people contributed data, we could see whether it varied by ethnicity, age, body weight, etc. It would be a ton of fun!

I was beaten to the punch on instrumented cups, but I think there is still room for improvement (or at least an open-hardware version).

This past Saturday I ran a hackathon for feminism (website here).

2015-11-14

It went really well, and I’ll blog about it in another post shortly — this one’s focused on menstrual cups 🙂

hackathon

pre-hackathon

Before the hackathon, John and I made menstrual cup molds and cast menstrual cups. We modeled it in Solidworks, then intersected it with a rectangular prism and split the prism in half. This created a two-part mold with a cavity in-between the size and shape of the menstrual cup. The Solidworks 2015 files are available here and are CC-BY-SA (c) John Aleman.

model

We then printed these out on a Stratasys printer, melted the wax support material off in the toaster oven, cleaned them off with isopropyl alcohol, and had our molds.

2015-11-12

I first tested it at home with John’s help. We used Smooth-On’s SORTA-Clear Translucent Silicone Rubber, shore 37, since it cures in 4 hours instead of 24 hours. This is food safe and I think body-safe, but I would call Reynolds Advanced Materials to double-check before actually using one of these cups.

Turns out this material was really difficult to work with… it was the consistency of viscous snot. No pictures pre-hackathon, since the silicone is messy. I sure had a difficult time separating the mold halves. And it turned out we hadn’t even completely filled two of the molds!

incomplete-molds

Eventually, after some careful prying by my friend Nadya, we opened the final mold to find a menstrual cup inside!

mcup-mold

mcup

It had a lot of bubbles — would have benefited from 2-3 minutes in a vacuum. Unfortunate, since there doesn’t seem to be a cheap hobbyist vacuum solution. It was also slightly tacky, so either something in the 3d print is inhibiting cure or we didn’t mix well enough.

hackathon hands-on workshop

hackathon-molds

At the hackathon, we used syringes to inject silicone into the molds until it came out the sides, and also just poured it into the bottom half of the mold and squeezed the top half on top to guarantee the mold was filled. We also all got pretty excited about the idea of glittery menstrual cups! 🙂 Sadly I didn’t have glitter on hand.

twocups

At the end of the hackathon we managed to demold one. This one  also had a giant bubble and many little bubbles, and was also tacky after the full four hours of cure time. We mixed this one much better, so I’m starting to think something is inhibiting cure, or perhaps the place we left them was too chilly.

next up

I definitely need a better mold design, and I need to figure out why the silicone is not curing properly. I also need to figure out a way around the bubble issue.

I also wonder how to analyze and determine why menstrual cups fail (when they are hard to retrieve or leak). I also wonder, if fit is part of the problem, how you could easily take a measurement or two to customize menstrual cups to each person.

And of course, I want to build the open science project! 🙂 Maybe instead of having each person build their own cup, I could at least start a form going and collect data…

projects blog (nouyang)