To the men in my life: a letter about the Orlando shootings, gun violence, Trump, and toxic masculinity

To the men in my life:
Can we have a conversation about the toxic masculinity of mass shooters in the United States, and about what role you, personally, want to play in challenging the harmful side of masculine gender roles?

Can you all make it a point to talk to each other about your feelings?

Can you all make it a point to talk to each other about your relationships (how your dates are going, how you’re going about it)?

Can you all make it a point to talk about being good role models for other men, privately (1 on 1), semi-privately (hanging out with other guys), and publicly (online by writing and talking [3], offline by giving talks and holding discussions)?

To all:
Instead of talking about how mass shooters are crazy and we should improve mental health care (which should be f***in’ obvious), let’s also talk about how so many of them had a history of domestic violence in their lives, and how many frequented virulently misogynistic websites long before they got around to shooting people up.

Let’s talk about the UCLA shooter, who shot his estranged wife before driving to UCLA (3 dead):

Let’s talk about how normal he seemed. In his note, he asked the police to feed his cat and gave his address. When the police went to his house, they also found a kill list with his wife Hasti’s name on it.

“Hasti’s body was found dead at 1:25 a.m. Thursday, according to the Hennepin County Medical Examiner’s Office. She had multiple gunshot wounds.”

Let’s talk about the Colorado Planned Parenthood shooter (3 dead) and how, according to his ex-wife in her divorce filing,
“He’d … suddenly explode in anger at home, kicking her and pulling her hair.” [1]

Let’s talk about how the Orlando shooter beat his wife, years before shooting 100+ people at a gay nightclub’s Latin Night during Pride Month (49 dead):
“He beat me. He would just come home and start beating me up because the laundry wasn’t finished or something like that.” [2]

Let’s talk about the UC Santa Barbara shooter (6 dead):
“I’m 22 years old and I’m still a virgin … I don’t know why you girls aren’t attracted to me, but I will punish you all for it. … I don’t know what you don’t see in me. I’m the perfect guy and yet you throw yourselves at these obnoxious men instead of me, the supreme gentleman.”

Let’s talk about the LA Fitness shooter, a 48-year-old systems analyst (4 dead):
“Why do this?? To young girls? Just read below. … Result is I am learning basics by trial and error in my 40s, followed by discuragement. Too embarassed to tell anyone this, at almost 50 one is expected to just know these things. … I was reading several posts on different forums and it seems many teenage girls have sex frequently. One 16 year old does it usually three times a day with her boyfriend. So, err, after a month of that, this little hoe has had more sex than ME in my LIFE, and I am 48.”


Let’s talk about how “women are 11 times more likely to be murdered with guns in the U.S. than in other developed countries.”

Let’s talk about how, “despite impressions from media coverage, mass shootings in which at least four people were murdered with a gun are also typically acts of domestic or family violence: an Everytown analysis of every mass shooting between 2009-15 found that 57 percent were committed by intimate partners or family.”

Let’s talk about how 90%+ of mass murderers are men.

And let’s address it, one conversation at a time, in real life.

Talk to the men in your life, and expect more of them.

====
[1] And finally, finally, can we talk about how popular Trump is, despite (or because of) how eerily similar he sounds to the men above:
“After a painful scalp reduction surgery to remove a bald spot, … Donald held back Ivana’s arms and began to pull out fistfuls of hair from her scalp, as if to mirror the pain he felt from his own operation. He tore off her clothes and unzipped his pants.”

[2] Trump in 1994: “And then I have days where, if I come home — and I don’t want to sound too much like a chauvinist — but when I come home and dinner’s not ready, I go through the roof.”

[3] https://www.youtube.com/results?search_query=feminist

openCV, ubuntu 14.04, and researching robot portraiture techniques for Robot Art Competition 2016

For the robotart.org competition, with a ton of help, I’ve gotten a 6-axis arm to work (it accepts x,y,z coordinates).

I configured openCV on ubuntu like so: http://www.samontab.com/web/2014/06/installing-opencv-2-4-9-in-ubuntu-14-04-lts/

(It has a few steps tacked onto the official linux installation instructions to hook everything up properly; they’re explained on the site, and I’ll just make a quick note of them below. openCV takes up a decent chunk of space!)

sudo gedit /etc/ld.so.conf.d/opencv.conf
  • add the line: /usr/local/lib
sudo ldconfig
  • then, in your shell profile (e.g. ~/.bashrc):
PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig
export PKG_CONFIG_PATH

I haven’t used C++ before, though I’ve used C. To run some examples, such as this hough transform:

https://github.com/Itseez/opencv/blob/master/samples/cpp/tutorial_code/ImgTrans/HoughLines_Demo.cpp

$ g++ houghdemo.cpp -o app `pkg-config --cflags --libs opencv`

This creates a file called “app” which can be run like so:
$ ./app face4.jpg

Man, canny edge detectors on faces look super creepy and not at all recognizable. Also, I don’t think Hough is the way to go for recognizable faces.

[screenshot from 2016-01-14]
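Incidentally, the same pipeline is quick to fiddle with from Python, assuming the Python bindings got installed alongside (a rough sketch of the demo, not a faithful port; face4.jpg is the same test image):

import numpy as np
import cv2

# load as grayscale (flag 0), run canny, then probabilistic hough
img = cv2.imread('face4.jpg', 0)
edges = cv2.Canny(img, 50, 150)          # low/high hysteresis thresholds
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                        minLineLength=30, maxLineGap=10)

# draw the detected segments in red over the edge image
canvas = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
if lines is not None:
    for x1, y1, x2, y2 in lines.reshape(-1, 4):
        cv2.line(canvas, (x1, y1), (x2, y2), (0, 0, 255), 1)
cv2.imwrite('hough_preview.png', canvas)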

Alright, so what are some alternative approaches?

stroke-based approach?

I started looking into the general problem of “stroke-based approaches” which will probably be useful later. Here is a good SIGGRAPH paper from 2002 I was skimming through, supplementing with youtube videos when I got confused:

http://web.cs.ucdavis.edu/~ma/SIGGRAPH02/course23/notes/S02c23_3.pdf

http://stackoverflow.com/questions/973094/easiest-algorithm-of-voronoi-diagram-to-implement

I found this comparison of lines only versus allowing different stroke widths (still one color) very compelling:

http://www.craftsy.com/blog/2013/05/discover-the-secrets-to-capturing-a-likeness-in-a-portrait/

It would be interesting to get different stroke widths involved. I wonder what the name of the final photoshop filter (that reduced the image to simple shapes) is.

Inkscape’s vector tracing of bitmaps is (according to the credits) based on Potrace, created by Peter Selinger:

potrace.sourceforge.net

Alright, not quite doing the trick…

portraits

Okay, there’s this sweet 2004 paper “Example-Based Composite Sketching of Human Portraits” that Pranjal dug up.

http://grail.cs.washington.edu/wp-content/uploads/2015/08/chen-2004-ecs.pdf

They essentially had an artist generate a training set, then parameterized each feature of the face (eyes, nose, mouth), with a separate system for the hair. The results are really awesome, but to replicate them I’d have to have someone dig up 10-year-old code and try to get it to run, or else generate my own training set, wade into the math, and code it all myself.

Given my limited timeframe (I essentially have two more weeks working by myself), I should probably focus on more artistic and less technical implementations. Yes, the robot has six axes, but motion planning is hard, and I should focus on producing an acceptable entry or risk having no entry at all.

Stippling

An approach that would be slow but would probably capture the likeness better would be to get stippling working. Evil Mad Scientist Labs wrote a stippler in Processing for their egg drawing robot, which outputs an SVG.

http://www.evilmadscientist.com/2012/stipplegen2/

http://wiki.evilmadscientist.com/StippleGen
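StippleGen proper does weighted Voronoi stippling; just to see dots on paper quickly, here’s a much dumber rejection-sampling sketch (my own toy, not their algorithm; face4.jpg again as a stand-in):

import random
import cv2

img = cv2.imread('face4.jpg', 0)         # 0 = grayscale
h, w = img.shape
dots = []
while len(dots) < 3000:
    x, y = random.randrange(w), random.randrange(h)
    if random.random() < 1.0 - img[y, x] / 255.0:   # darker = more dots
        dots.append((x, y))

# write the dots out as svg circles, like StippleGen's svg output
with open('stipple.svg', 'w') as f:
    f.write('<svg xmlns="http://www.w3.org/2000/svg" '
            'width="%d" height="%d">\n' % (w, h))
    for x, y in dots:
        f.write('<circle cx="%d" cy="%d" r="1.5"/>\n' % (x, y))
    f.write('</svg>\n')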

Presumably the eggbot software converts the svg to gcode at some point.

http://wiki.evilmadscientist.com/Installing_software

If I can’t figure out where their generator is, or if I want to stick to python, there seems to be PyCAM

Todo

read http://blog.otoro.net/2015/12/28/recurrent-net-dreams-up-fake-chinese-characters-in-vector-format-with-tensorflow/

Calligraphy

First, inspiration: a 250-year-old writer automaton where you can swap out cams to change the gearing and change what it writes. Crazy!

okay, so more modern. this robot arm uses the Hershey vector fonts to draw kanji.

https://en.wikipedia.org/wiki/Hershey_fonts
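The underlying data is charmingly low-tech: each glyph is a string where every coordinate is a printable character offset from ‘R’, and the pair ” R” means pen-up. A sketch of decoding one glyph from the classic .jhf distribution format (my reading of the format docs; worth double-checking against real data):

def decode_hershey(line):
    # line: one .jhf record; the first 8 chars are glyph id + vertex count
    data = line.rstrip('\n')[8:]
    # first pair is the left/right bounds of the glyph
    left, right = ord(data[0]) - ord('R'), ord(data[1]) - ord('R')
    strokes, current = [], []
    for i in range(2, len(data) - 1, 2):
        pair = data[i:i + 2]
        if pair == ' R':                 # pen up: start a new stroke
            if current:
                strokes.append(current)
            current = []
        else:
            current.append((ord(pair[0]) - ord('R'),
                            ord(pair[1]) - ord('R')))
    if current:
        strokes.append(current)
    return strokes                       # list of polylines, ready to draw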

Turns out there is an open source SVG font for Chinese (Hershey does traditional chinese characters, not simplified), so now I can write messages to my parents :3 (if I can convert the svg into commands for the robot).

https://en.wikipedia.org/wiki/WenQuanYi

Haha, there is even a startup that writes notes for you…

http://www.wired.com/2015/02/meet-bond-robot-creates-handwritten-notes/

Evil Mad Scientist Labs has done some work on the topic.

http://www.evilmadscientist.com/2011/hershey-text-an-inkscape-extension-for-engraving-fonts/

https://github.com/evil-mad/EggBot/

This seems tractable. The key step is SVG to gcode; I should be able to roll my own without too much difficulty, or else use existing libraries.
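As a sanity check that rolling my own is tractable, a minimal sketch, assuming the svg paths have already been flattened into polylines (real svgs would need bezier flattening first, e.g. via Inkscape):

def polylines_to_gcode(polylines, z_up=5.0, z_down=0.0, feed=1000):
    # pen-up travel with G0, pen-down drawing with G1
    out = ['G21 ; units = mm', 'G90 ; absolute coordinates']
    for path in polylines:
        (x0, y0), rest = path[0], path[1:]
        out.append('G0 Z%.2f' % z_up)                # lift pen
        out.append('G0 X%.2f Y%.2f' % (x0, y0))      # travel to start
        out.append('G1 Z%.2f F%d' % (z_down, feed))  # pen down
        for x, y in rest:
            out.append('G1 X%.2f Y%.2f' % (x, y))
    out.append('G0 Z%.2f' % z_up)
    return '\n'.join(out)

# e.g. a 20mm square:
print(polylines_to_gcode([[(0, 0), (20, 0), (20, 20), (0, 20), (0, 0)]]))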

misc. videos

[video] hi again disney research

[video] impressive with just derpy hobby servos

[video] and of course, mcqueen

hi 2016 (2 servo drawing robot arm, tripod gait 12 servo hexapod, visit to NASA, quadcopter tuning, etc.)

hm, haven’t updated in a while.

i built a lot of robots with my parents over the winter break. i built a robot arm and refreshed on inverse kinematics; more specifically: make sure your servos are rotating the direction you expect. IK math assumes angles increase counterclockwise, but your servos may increase in a clockwise direction… a simple map(theta, 0, 180, 180, 0) will fix your problem if you catch it.
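for reference, a minimal 2-link planar IK sketch in python (law of cosines; the link lengths are made up), with that direction flip tacked on:

import math

L1, L2 = 80.0, 80.0          # hypothetical link lengths, mm

def ik(x, y):
    # elbow from the law of cosines; clamp for unreachable targets
    c = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    elbow = math.acos(max(-1.0, min(1.0, c)))
    # shoulder = angle to target minus the angle the forearm subtends
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

def flip(theta):
    # same as arduino's map(theta, 0, 180, 180, 0), for clockwise servos
    return 180.0 - theta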

2016-01-01

processing takes in x,y coordinates drawn on the screen and spits them out to the arduino over serial, which does the inverse kinematics and spits out the theta values to the servos

https://github.com/NarwhalEdu/CopyCat/blob/master/Code/basicsIK/basicsIK.ino

or for the one where it draws what you draw on the screen, https://gist.github.com/nouyang/b312b9ea5c67baa0c914
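(if you’d rather drive it from python than processing, a pyserial sketch; this assumes the arduino sketch reads “x,y\n” lines, which may not match the gist’s actual wire format:)

import time
import serial

port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # hypothetical port
time.sleep(2)                 # the arduino resets when the port opens

for x, y in [(100, 50), (110, 60), (120, 70)]:
    port.write(('%d,%d\n' % (x, y)).encode())
    time.sleep(0.05)          # crude pacing so the servos can keep up
port.close()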

also tried to face.

it does not face well, in part because i have derpy three-year-old code

[image: face3]

this processing code uses a lot of processing libraries. it thresholds the image, performs canny edge detection, then runs a walking algorithm (look at each black pixel by scanning the image in x and y, see if its neighbors are black as well, then walk along those pixels) to turn the edges into vectors. then it outputs to the robot, but the robot is limited by resolution (arduino servo library) and cheap-hobby-servo overshoot.
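in python the walking idea looks roughly like this (a sketch, not the processing code; assumes a binary edge image from canny):

import numpy as np

def walk_edges(edges):
    # scan for an unvisited edge pixel, then greedily follow neighbors
    h, w = edges.shape
    visited = np.zeros((h, w), dtype=bool)
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    paths = []
    for y in range(h):
        for x in range(w):
            if not edges[y, x] or visited[y, x]:
                continue
            path, cy, cx = [(x, y)], y, x
            visited[y, x] = True
            while True:
                for dy, dx in nbrs:
                    ny, nx = cy + dy, cx + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and edges[ny, nx] and not visited[ny, nx]):
                        visited[ny, nx] = True
                        path.append((nx, ny))
                        cy, cx = ny, nx
                        break
                else:
                    break        # dead end: this stroke is finished
            if len(path) > 1:
                paths.append(path)
    return paths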

below you can see a preview in python. (basic code; i copied the output from processing into a text file and added some python code to plot the values)

the point of the preview is to check that the image is within the working envelope of the arm. the IK is fixed with the arm “up”.

[image: face2]

[image: faceservo]

problem with the walking algorithm: it adds a box around the image. irritating. need to rewrite the code. looking into openCV.

i also rehashed my hexapod project with 12 servos and popsicle sticks

[image: hexapod]

basically this https://github.com/nouyang/18-servo-hexapod/blob/master/arduino_may13_2011.pde

but modified to work with the servo configuration on the rectangular robot, and added code to allow you to step through the gait with “j” and “k”: https://gist.github.com/nouyang/d9b6474e3ee412b9b05b
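the tripod gait itself, sketched in python (hypothetical state labels; real angles depend on how the servos are mounted. the “j”/“k” stepping just walks this list back and forth):

A, B = (0, 2, 4), (1, 3, 5)      # the two tripods of legs
CYCLE = [
    ('lift', A), ('swing forward', A), ('plant', A), ('push back', A),
    ('lift', B), ('swing forward', B), ('plant', B), ('push back', B),
]
# (a real tripod gait overlaps these: B pushes back while A swings)

def step_gait(i, direction=1):
    # "k" -> direction=1, "j" -> direction=-1
    i = (i + direction) % len(CYCLE)
    print(CYCLE[i])
    return i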

need to implement the other gaits; also, this moves so smoothly, envious, but they have lasercutter :3

worked on a quad. now stuck at the calibration stage 😡. because i have not built a quad before, i could not push through it in a day or two, unlike the drawing arm and hexapod.

[image: quad]

made from a sad clothes drying rack we took apart

[image: transmittercable]

we couldn’t find the original cable for the transmitter, so we connected the ports up with an FTDI -> USB cable, as per http://psychoul.com/electronics/how-to-make-your-own-usb-cable-for-hk-t6a-calibration

[image: zero]

used http://www.sgr.info/usbradio/download.htm and calibrated my servos to zero… took a while to realize it *can* and *should* read the current values (guess my wires were loose), but then the values became a lot easier to input. used the kk2 screen to fix some controls that were reversed from what the kk2 expected (left = left and not right, etc.), and zeroed all the values on the kk2. turns out (minus the flipped controls) I could zero just as well using the trim knobs on the controller itself.

went to visit the NASA space museum in houston. they had a little robot that made and served you froyo. adorable.

[image: nasaicecream]

also, some regal looking hexapods in the actual NASA workplace.

 

[image: nasarobot]

at MITERS I got a robot arm working, with lots of help from MITERS / London Hackerspace / john from BUILDS, for the robot art competition: http://robotart.org/

i’m now robot art-ing. here is the result of using Fengrave on a black and white image, with appropriate offsets, to produce gcode (well, limited to G0 and G1 commands):

[image: fengrave]

[image: robotdrawing]

face code still derp. (the streaks are because the gcode translator i wrote goes to the x,y,z position in one move instead of x,y first and then z). also too many x,y points, so it draws slowly.

[image: face]
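the fix for the streaks is to split the move: travel in x,y with the pen up, then drop z (a hypothetical helper, mirroring what my translator should do):

def goto(x, y, z_down, z_up=5.0):
    # lifting before the x,y travel is what kills the streaks
    yield 'G0 Z%.2f' % z_up
    yield 'G0 X%.2f Y%.2f' % (x, y)
    yield 'G1 Z%.2f' % z_down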

michael made a crayon extruder (= metal tube + power resistor) and also a pen mount. crayons = hard to control flow rate. it started making a square, then pooped out a lot of melted crayon. alas.

[image: crayonextrusion]

learned a lot of patience dealing with old manuals and 20-year-old operating systems / controllers. the main issue turned out to be a dumb calibration assumption (the robot had arrows; should have ignored them and used the indentations instead).

https://github.com/miters/gdmux (gcode -> V+)

also, i learned about the oscilloscope rs232 decoder! had to invert it to get it working properly (yes, a zero really is the high voltage level in rs232). scope the ground and the tx line. bam, now you can check whether you are actually transmitting all the carriage returns and line feeds you need…

2016-01-13

currently: reading up on image processing. openCV. http://web.cs.ucdavis.edu/~ma/SIGGRAPH02/course23/notes/S02c23_3.pdf

terse update. more details available if questions exist.

many thanks to my parents for being excited and not jaded
