i built a lot of robots with my parents over winter break. i built a robot arm and refreshed on inverse kinematics; more specifically: make sure your servos rotate the way you expect. IK angles increase counterclockwise, but your servos may increase in the clockwise direction… a simple map(theta, 0, 180, 180, 0) will fix the problem if you catch it.
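the arduino map() call just rescales linearly; swapping the output bounds is what reverses the direction. a quick python sketch of the same idea:

```python
def servo_map(theta, in_lo=0, in_hi=180, out_lo=180, out_hi=0):
    """arduino-style map(): linear rescale of theta from the input
    range to the output range. swapping out_lo/out_hi reverses the
    servo's direction, fixing a CCW-vs-CW mismatch."""
    return out_lo + (theta - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)
```

so servo_map(30) gives 150: the IK's counterclockwise angle becomes the equivalent clockwise servo command.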
processing takes in x,y coordinates drawn on the screen and sends them to the arduino over serial; the arduino does the inverse kinematics and writes the theta values to the servos.
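for a two-link planar arm, the IK on the arduino side boils down to the law of cosines. a minimal python sketch of that math (the link lengths l1, l2 and the choice of elbow branch are my assumptions, not the actual firmware):

```python
import math

def ik_2link(x, y, l1, l2):
    """inverse kinematics for a 2-link planar arm.
    returns (shoulder, elbow) angles in radians for one elbow branch;
    which branch is 'up' depends on your mounting."""
    r2 = x * x + y * y
    # law of cosines gives the elbow angle from the target distance
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))        # clamp for float safety
    elbow = -math.acos(c2)              # negate for the other branch
    k1 = l1 + l2 * math.cos(elbow)
    k2 = l2 * math.sin(elbow)
    shoulder = math.atan2(y, x) - math.atan2(k2, k1)
    return shoulder, elbow
```

a sanity check is to push the angles back through forward kinematics and confirm you land on the target point.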
it does not draw faces well, partly because i'm running derpy three-year-old code.
the processing code pulls in a lot of libraries. it thresholds the image, performs canny edge detection, then runs a walking algorithm (scan the image in x and y looking at each black pixel, check whether its neighbors are black too, then walk along them) to turn the edges into vectors. that output goes to the robot, but the robot is limited in resolution (arduino servo library) and the cheap hobby servos overshoot.
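the walking step, sketched in python rather than the actual processing code (black pixels represented as a set of (x, y) tuples):

```python
def walk_edges(black):
    """turn a set of black (edge) pixels into polylines by walking
    neighbor-to-neighbor until a path dead-ends, then scanning for
    the next unvisited pixel."""
    remaining = set(black)
    neighbors = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if (dx, dy) != (0, 0)]
    paths = []
    while remaining:
        p = min(remaining)          # scan order: lowest x, then y
        remaining.discard(p)
        path = [p]
        while True:
            x, y = path[-1]
            nxt = next(((x + dx, y + dy) for dx, dy in neighbors
                        if (x + dx, y + dy) in remaining), None)
            if nxt is None:
                break               # path dead-ended; start a new one
            remaining.discard(nxt)
            path.append(nxt)
        paths.append(path)
    return paths
```

each returned path is one continuous pen stroke for the arm.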
below you can see a preview in python. (basic code: i copied the output from processing into a text file and added some python to plot the values.)
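something like this — the "one x y pair per line, blank line between strokes" format is just my guess at the dump format, not the real one:

```python
def load_paths(text):
    """parse a hypothetical dump: one 'x y' pair per line,
    blank line between strokes."""
    paths, cur = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            if cur:
                paths.append(cur)
                cur = []
            continue
        x, y = map(float, line.split())
        cur.append((x, y))
    if cur:
        paths.append(cur)
    return paths

def preview(paths):
    """plot each stroke; flip y because image coords grow downward."""
    import matplotlib.pyplot as plt
    for path in paths:
        xs, ys = zip(*path)
        plt.plot(xs, ys, "k-")
    plt.gca().set_aspect("equal")
    plt.gca().invert_yaxis()
    plt.show()
```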
next step is to check that the image is within the working envelope of the arm. the IK solution is fixed with the arm “up”.
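for a two-link arm the envelope check is just a distance test: a point is reachable iff its distance from the shoulder lies between |l1 − l2| and l1 + l2. a sketch:

```python
import math

def in_envelope(x, y, l1, l2):
    """reachability test for a 2-link planar arm: the target's
    distance from the shoulder must lie in [|l1 - l2|, l1 + l2]."""
    r = math.hypot(x, y)
    return abs(l1 - l2) <= r <= l1 + l2
```

filtering every point of the image through this before sending it over serial would catch out-of-reach strokes early.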
one problem with the walking algorithm: it adds a box around the image. irritating. i need to rewrite the code; looking into opencv.
used http://www.sgr.info/usbradio/download.htm and calibrated my servos to zero… took a while to realize it *can* and *should* read the current values — i guess my wires were loose — but once it did, the values became a lot easier to input. used the kk2 screen to fix some controls that were reversed from what the kk2 expected (left = left and not right, etc.) and zeroed all the values on the kk2. turns out (aside from the flipped controls) i could have zeroed just as well using the trim knobs on the controller itself.
went to visit the NASA space museum in houston. they had a little robot that made and served you froyo. adorable.
also, some regal-looking hexapods in the actual NASA workplace.
at MITERS I got a robot arm working with lots of help from MITERS / London Hackerspace / john from BUILDS, for the robot art competition: http://robotart.org/
i’m now robot art-ing. here i’m using F-Engrave on a black-and-white image with appropriate offsets to produce gcode (well, limited to G0 and G1 commands).
face code still derp. (the streaks are because the gcode translator i wrote goes straight to the x,y,z position instead of moving in x,y and then z.) also too many x,y points, so it draws slowly.
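the fix is to split every move: travel in x,y at the old pen height, then change z. a sketch of a translator doing that (assumes absolute coordinates and whitespace-separated G0/G1 words, which is roughly what F-Engrave emits; not my actual code):

```python
def split_moves(gcode):
    """translate G0/G1 lines into (x, y, z) targets, moving in x,y
    first and only then in z, so the pen doesn't drag while lowering."""
    x = y = z = 0.0
    moves = []
    for line in gcode.splitlines():
        words = line.split()
        if not words or words[0] not in ("G0", "G1"):
            continue
        nx, ny, nz = x, y, z
        for w in words[1:]:
            if w[0] == "X": nx = float(w[1:])
            elif w[0] == "Y": ny = float(w[1:])
            elif w[0] == "Z": nz = float(w[1:])
        if (nx, ny) != (x, y):
            moves.append((nx, ny, z))   # travel at the old pen height
        if nz != z:
            moves.append((nx, ny, nz))  # then raise/lower the pen
        x, y, z = nx, ny, nz
    return moves
```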
michael made a crayon extruder (= metal tube + power resistor) and also a pen mount. crayons turn out to be hard to flow-rate control: it started drawing a square, then pooped out a lot of melted crayon. alas.
learned a lot of patience dealing with old manuals and 20-year-old operating systems / controllers. the main issue turned out to be a dumb calibration assumption (the robot had arrows; i should have ignored them and used the indentations instead).
also, i learned about the oscilloscope's rs232 decoder! had to invert it to decode properly (in rs232 a zero is the *high* voltage level, opposite of TTL, and the line idles at the "mark" / logical-one level). scope ground, probe the tx line, and bam: now you can check whether you are actually transmitting all the carriage returns and line feeds you need…
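the polarity thing is easy to see if you decode a frame by hand. a toy 8N1 decoder over one-sample-per-bit logic levels (just to illustrate the inversion, not scope firmware):

```python
def decode_8n1(samples, inverted=False):
    """decode one 8N1 byte from a list of logic samples, one per bit
    time: start bit (0), eight data bits LSB-first, stop bit (1).
    a decoder set to the wrong polarity sees every bit flipped,
    hence the `inverted` switch."""
    bits = [b ^ 1 for b in samples] if inverted else list(samples)
    assert bits[0] == 0, "missing start bit"
    assert bits[9] == 1, "missing stop bit"
    value = 0
    for i, b in enumerate(bits[1:9]):   # data is sent LSB first
        value |= b << i
    return value
```

with the wrong polarity the start bit never appears where expected, which is exactly the "nothing decodes until you hit invert" symptom.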