All posts by nouyang

Keeping my Thinkpad battery ship-shape on Ubuntu (19.10)

Recently, I felt like my laptop battery (Lenovo Thinkpad) was draining a lot faster than before. I’m not sure if something triggered a particular spike, but I looked into it and found it’s really easy for me to adopt better battery practices on Ubuntu.

First, to find out just how sad my battery is, I used the command

upower -i $(upower -e | grep '/battery')
  native-path:          BAT0
  vendor:               LGC
  model:                01AV457
  serial:               969
  power supply:         yes
  updated:              Wed 07 Aug 2019 01:56:41 PM ADT (40 seconds ago)
  has history:          yes
  has statistics:       yes
  battery
    present:             yes
    rechargeable:        yes
    state:               discharging
    warning-level:       none
    energy:              28.85 Wh
    energy-empty:        0 Wh
    energy-full:         40.02 Wh
    energy-full-design:  56 Wh
    energy-rate:         7.468 W
    voltage:             15.958 V
    time to empty:       3.9 hours
    percentage:          72%
    capacity:            70.8929%
    technology:          lithium-polymer
    icon-name:          'battery-full-symbolic'
  History (charge):
    1565197001	72.000	discharging
  History (rate):
    1565197001	7.468	discharging

My battery was designed to hold 56 Wh, but two years in a "full" charge is only 40 Wh. Nearly a (16/56) ≈ 29% decrease!

Whew. That means if I had 6.5 hours of battery life before, now I get more like 4.6 hours.
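
Double-checking that math with the numbers from the upower output above (a quick back-of-the-envelope in Python):

energy_full_design = 56.0   # Wh, what the battery was designed to hold
energy_full = 40.02         # Wh, what a "full" charge holds now
capacity = energy_full / energy_full_design
print(f"{capacity:.1%} of design capacity")         # ~71.5%, close to upower's "capacity" field
print(f"{6.5 * capacity:.1f} hours vs. 6.5 hours")  # ~4.6 hours of runtime left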

So, I looked it up online. The key things to do are:

1) Change the threshold for starting/stopping battery charging, to stop the battery from charging to 100%
2) Keep the laptop plugged in more, and
3) Keep the laptop cool

Guess I need to figure out a way to use my laptop in bed without setting it on the blankets. Or, get a proper desk and table.

Here’s the deets.

From https://forums.lenovo.com/t5/Welcome-FAQs-Knowledge-Base/How-can-I-increase-battery-life-ThinkPad/ta-p/244800:

For maximum lifespan if you rarely use the battery, set Custom charge thresholds to start charging at 40% capacity and stop at 50%, and keep the ThinkPad cool. The thresholds can be adjusted in the Battery Maintenance settings of the Lenovo Power Manager.

If you do use your battery somewhat frequently, set the start threshold at say 85% and stop at 90%. This will still give a good lifespan benefit over keeping it charged to 100%.

The simplest way to optimize for battery lifespan is to select Automatic in the Power Manager Battery Maintenance settings, and let it manage your battery charge thresholds for you.

And from https://forums.lenovo.com/t5/Lenovo-IdeaPad-1xx-3xx-5xx-7xx/maintaining-battery-capacity-on-ideapad-530s-laptop/td-p/4282652:

About battery discharge, it is better not to allow the battery to reach full discharge before recharging it. Best care for a Li battery is keeping it between 20-80%. Avoid deep discharge and overcharge. Battery drain should only be done to calibrate wrong battery recognition or if there is a battery problem.

Now, how to do this in Ubuntu? Turns out it's really easy for ThinkPads.
https://linrunner.de/en/tlp/docs/tlp-linux-advanced-power-management.html

Note: TLP and ThinkPad-related packages below are available via the official Ubuntu repository. Nevertheless it is recommended to use the PPA to stay with the latest TLP version.

sudo add-apt-repository ppa:linrunner/tlp
sudo apt-get update 
sudo apt-get install tlp tlp-rdw 
sudo apt-get install tp-smapi-dkms acpi-call-dkms
rui@chaiX1YG2:~$ sudo tlp setcharge 70 90 BAT0
Setting temporary charge thresholds for BAT0:
  stop  = 90
  start = 70
rui@chaiX1YG2:~$ sudo tlp start
TLP started in battery mode

Or, for a permanent change, edit the file "/etc/default/tlp":

rui@chaiX1YG2:~$ sudo vi /etc/default/tlp

And uncomment the lines (around line 273)

START_CHARGE_THRESH_BAT0=75                                                   
STOP_CHARGE_THRESH_BAT0=80

setting the thresholds to whatever you think is best.

Shortly before I head out, I’ll probably run
sudo tlp setcharge 70 80 BAT0
since 3 to 4 hours of battery life is plenty for me. But I do want to be conservative, since my power button no longer works: if the laptop completely dies, I have to unscrew the back cover, briefly unplug the battery and the CMOS battery, then plug everything back in, screw the cover back on, and plug in the power charger in order to start it again. I really only have myself to blame, not the ThinkPad — I'm sure it has something to do with shorting 24V / 3 amps through the USB port when I was working on motor controller things for my final project in underactuated robotics. Yea… that wasn't great, and I'm really impressed the laptop survived and the USB port even still works!


Bash timer script (for pomodoro / general use)

Whew, every so often I feel very self-conscious about my blog, especially the previous very messy half-notes post, so here is a nice clean code snippet to refresh the palate:

[embedded media: "1 minute timer"]

 

#!/bin/bash
# script to create timer in terminal
# Jason Atwood
# 2013/6/22

# Edited: Nouyang, 2019/08/05
# Added bell sound, speech of "your time is up", & popup notification
# Added bigger font option, window resizing
# And changed colors: red background by default, flashing green for time's up 
# Usage: `./terminalTimer.sh 25` & set terminal to "always on top" and "always on visible workspace"

# start up 
#echo "starting timer script ..." 
#sleep 1 # seconds

# get input from user
if [[ $1 ]]; then
    DURATION=$1
else
    read -p "Timer for how many minutes? (Default 25) " -e DURATION 
    #read -p "Timer for how many minutes? " -i 25 -e DURATION 
    if [[ -z $DURATION ]]; then		# default to 25 if nothing was entered
        DURATION=25
    fi
fi
DURATION=$(( $DURATION*60 )) # convert minutes to seconds

# get start time
START=$(date +%s)
			
setterm -term linux -back red -fore white # use setterm to change background color
printf '\e[8;4;1t'

# infinite loop
while true; do

	# do math	
	NOW=$(date +%s)				# get time now in seconds
	DIF=$(( $NOW-$START ))			# compute diff in seconds
	ELAPSE=$(( $DURATION-$DIF ))		# compute elapsed time in seconds
	MINS=$(( $ELAPSE/60 ))			# convert to minutes... (dumps remainder from division)
	SECS=$(( $ELAPSE - ($MINS*60) )) 	# ... and seconds

	# conditional
	if [ "$ELAPSE" -le 0 ]			# if time has expired (use -le so we can't skip past 0:00)
	then 					# blink screen
        clear;
        #zenity --info --text "$(date);$(pwd)"
        notify-send "Time's up! $((DURATION/60)) minutes"
        #spd-say ""
        #spd-say "Your time is up! $((DURATION/60)) minutes"
        echo "Your time is up! $((DURATION/60)) minutes" | festival --tts

        play "./Computer_Magic.wav"
		for i in `seq 1 180`;    		# for i = 1:180 (i.e. 180 seconds)
		do
			clear					# flash on
			setterm -term linux -back green -fore white # use setterm to change background color
			echo "00:00                             		" # extra tabs for visibiltiy

			sleep 0.5

			clear					# flash off
			setterm -term linux -default		# clear setterm changes from above 
			echo "00:00" 				# (i.e. go back to white text on black background)
			sleep 0.5	
		done  					# end for loop 
		break					# end script

	else 					# else, time is not expired
        OUTPUT=$(clear; echo "$MINS:$SECS" | toilet -f future --filter crop ) # display time
        #OUTPUT=$(clear; echo "$MINS:$SECS" | toilet -f mono12 --filter crop ) # display time
        echo "$OUTPUT"
		sleep 1 # sleep 1 second
	fi					# end if
done	# end while loop	

To run, make the script executable and pass the number of minutes:

$ chmod +x terminalTimer.sh
$ ./terminalTimer.sh 25 # in minutes

(The script assumes notify-send, festival, play (from sox), and toilet are installed.)

What it looks like while running:

[embedded media: "3 minute timer"]

Replicating the Visual Pushing and Grasping Paper Pt 1: Calibration

Quick note: This post is more of a build log of how I got the calibration and grasping parts of the codebase working, rather than a replication of the results, as I am not working with the pushing part.

 

I’m working on a summer project [1] building on a project by some Princeton folks called TossingBot. The idea is nice: combine a simple physics model with a network that learns a residual to add (or subtract) onto the single control parameter (thrown velocity), in order to toss arbitrary objects into a desired bin.

Anyway, although the code for the TossingBot paper is not available yet, the same authors released a nice, well-commented and well-documented code repository for their earlier paper, the visual pushing and grasping (VPG) paper. (I guess it seemed like they completed part of it during a Google internship, so I feel better that I'm being paid far less and cannot spend much time on releasing quality code.)

And I actually got it to work! Wow, replicable work. Okay, so I didn't get it to work in full — but I do have a vastly simplified version of their code working on my UR5, with a D415 camera and a different gripper, using their pre-trained model out of the box! It outputs grasp predictions, and the UR5 moves to different locations where there actually are objects, picks them up, and then drops them.

I had to solve a few issues to get to this point, so I'll outline them here and explain in more detail later (hopefully — again, time is short). Perhaps the most broadly applicable is my understanding of their calibration code.

Relevant links:
https://tossingbot.cs.princeton.edu/
http://vpg.cs.princeton.edu/
https://github.com/andyzeng/visual-pushing-grasping

What I did the last 10 days:

SOFTWARE

  1. Installed Ubuntu 18.04.1 on the lab computer
  2. Installed ROS — This is actually not needed for the VPG code, which has removed ROS as a dependency

Re: ROS, I also learned a hard lesson — check out the right branch for your ROS packages, e.g. Kinetic (Kame) vs. Melodic. Otherwise you'll get a ton of errors.

GRIPPER

I used a different gripper than the one used in the paper, so I needed to rewrite portions of the code.

1. Attached the Robotiq gripper to the robot arm, and got it functional.
1a. Required low-profile screws of a short length (8 mm) that I couldn't find in the lab at first.

1b. Got it working directly with the teach pendant.
1c. There is a serial-to-USB converter, which for me happened to be inside the UR5 control box. I unplugged that and plugged it into my desktop (presumably, you could control the gripper directly from the UR5 interface when it's plugged into the UR5 USB ports).
1d. Got it working with ROS. To be honest, this was a majoorrr pain. I kept getting all sorts of weird errors.
Now, instead, I talk to it directly in Python, bypassing ROS entirely. The Robotiq manuals give a clear command example.

Relevant links:
https://blog.robotiq.com/controlling-the-robotiq-2f-gripper-with-modbus-commands-in-python
(mostly, just something like `ser.write(“\x09\x03\x07\xD0\x00\x03\x04\x0E”)` )
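
For what it's worth, here's roughly what talking to the gripper directly looks like with pyserial. This is a minimal sketch: the port name /dev/ttyUSB0 is just where the converter showed up for me (yours may differ), and the serial settings and the status-request bytes are taken from the Robotiq documentation and the blog post above.

import serial

# Modbus RTU over the Robotiq serial-to-USB converter
ser = serial.Serial(port="/dev/ttyUSB0", baudrate=115200,
                    bytesize=8, parity="N", stopbits=1, timeout=1)

# "read gripper status" example request (slave 0x09, read 3 registers starting at 0x07D0)
ser.write(b"\x09\x03\x07\xD0\x00\x03\x04\x0E")
response = ser.read(11)   # address + function code + byte count + 6 data bytes + 2-byte CRC
print(response.hex())

ser.close()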

REALSENSE

Due to using a different version of Ubuntu, I had to do a bit of experimenting to install the RealSense drivers (which are from Intel, and separate from the VPG codebase).

First off, I had a Realsense D435, and opted to buy a D415 as in the paper, since the D415 is better for static scenarios where precision is more important. And it does seem to perform a lot better on the tabletop by default.

 

 

1. Attempted to install realsense-viewer on my Ubuntu 19.10 install. Apparently the deb install only works with a much older version of the Linux kernel — so I started patching things and compiling from source. I did things like patching the patches, since the patches were for 18.04.2 and not… 19.10… I did get it working, but my main lesson was to install 18.04.1 on the UR5 desktop.

Relevant links:

  • Debug log http://orangenarwhals.com/hexblog/2019/06/11/realsense/
  • Start here https://github.com/IntelRealSense/librealsense/blob/development/doc/distribution_linux.md
  • Fail, start to compile from source https://github.com/IntelRealSense/librealsense/blob/development/doc/installation.md
  • See patch script files https://raw.githubusercontent.com/leggedrobotics/librealsense/7183d63720277669aaa540fba94b145c03d864cf/scripts/patch-ubuntu-kernel-4.18.sh

I did also switch from the D435 to the D415 out of a desire to change as little as possible from their setup. (Also, on the Intel website I read that the D435 is better for detecting motion and D415 better for static setups).

UR5

1. Plugged it in.
1b. Major lesson: the pendant shows coordinates, but the ones under VIEW are different from the ones reported to / sent from Python. You have to use the dropdown to select BASE.
Additionally, there are two ways to specify a configuration, and they cannot be directly mixed and matched: a joint configuration is the angle of each of the 6 joints, while a pose is Cartesian coordinates (which the UR5 presumably feeds to a built-in IK solver and path planner), but note that the tool orientation in a pose is given in axis-angle form, not as a rotation of each joint!!! This was super confusing to debug. There's a quick python-urx sketch of the two representations below.
2. Learned to use the ROS ur_modern_driver and got test_move.py working; ignore the other package. Eventually I did not use this, since the codebase does not need ROS.
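
Here's the kind of python-urx sketch I mean (the IP address is made up; substitute your robot's):

import urx

rob = urx.Robot("192.168.1.100")   # your UR5's IP address

print(rob.getj())   # joint configuration: the 6 joint angles, in radians
print(rob.getl())   # TCP pose: [x, y, z, rx, ry, rz], meters plus an axis-angle rotation

# The two move commands mirror the two representations and can't be mixed:
# rob.movej(joint_angles, acc=0.1, vel=0.1)
# rob.movel(tcp_pose, acc=0.1, vel=0.1)

rob.close()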

VPG code

The calibration program outputs the pose of the camera, with which we can transform (shear, rotate, etc.) the acquired depth image into a "bird's-eye" depth image view.
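
To make that concrete, here's a rough numpy sketch of the idea (not the repo's actual heightmap code, just the gist; the intrinsics and variable names are made up):

import numpy as np

def birds_eye_heightmap(depth, K, cam_pose, workspace_limits, resolution=0.002):
    # depth: HxW depth image in meters; K: 3x3 camera intrinsics
    # cam_pose: 4x4 camera-to-robot-base transform (what calibrate.py estimates)
    # workspace_limits: [[xmin, xmax], [ymin, ymax], [zmin, zmax]] in the robot frame
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # back-project every pixel into the camera frame
    x = (u - K[0, 2]) * depth / K[0, 0]
    y = (v - K[1, 2]) * depth / K[1, 1]
    pts_cam = np.stack([x, y, depth, np.ones_like(depth)], axis=-1).reshape(-1, 4)
    # move the points into the robot base frame using the calibrated camera pose
    pts = (cam_pose @ pts_cam.T).T[:, :3]

    (xmin, xmax), (ymin, ymax), (zmin, zmax) = workspace_limits
    cols = int(np.round((xmax - xmin) / resolution))
    rows = int(np.round((ymax - ymin) / resolution))
    heightmap = np.zeros((rows, cols))

    # keep only points inside the workspace, then take the max height per grid cell
    keep = ((pts[:, 0] >= xmin) & (pts[:, 0] < xmax) &
            (pts[:, 1] >= ymin) & (pts[:, 1] < ymax) &
            (pts[:, 2] >= zmin) & (pts[:, 2] < zmax))
    pts = pts[keep]
    ci = np.clip(((pts[:, 0] - xmin) / resolution).astype(int), 0, cols - 1)
    ri = np.clip(((pts[:, 1] - ymin) / resolution).astype(int), 0, rows - 1)
    np.maximum.at(heightmap, (ri, ci), pts[:, 2] - zmin)
    return heightmap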

I learned: use python-urx for debugging (due to an upgrade of the UR5 firmware itself from Universal Robots, the robot communication code in VPG is a bit flaky). The calibrate.py parameters specify the checkerboard offset from the "tool center", which is defined by the UR5 (by default, the middle of the outward face of the last joint). I documented my work in this GitHub issue. Use the teach pendant to set workspace limits. Use foam to offset the z height from the table for safety purposes.

Calibration — as copied from github issue —

I’m not sure this is correct, but:

  1. Using the pendant, the workspace limits are the X, Y, Z displayed under the "TCP" box (the pendant displays mm; the code is in meters),
    e.g.
[[0.4, 0.75], [-0.25, 0.15], [-0.2 + 0.4, -0.1 + 0.4]]  [1]
[min x, max x], [min y, max y], [min z, max z]
  2. The checkerboard offset is also just experimentally measured. I'm least certain on this part, but I think it is what the tool would need to do to move to the checkerboard center. So if it needs to move +20 cm in X and -0.01 cm in Z to get to the center of the checkerboard, that's the offset. Presumably the tool center = the middle area of the gripper fingers.

EDIT: Wow not sure what I was thinking, but it’s to the “tool center” of the robot (what is reported on the pendant / over TCP from the UR). And as to the sign of the offset — it’s really checkerboard_pos = tool_pos + offset, so define the offset appropriately. Well, that’s my current belief based on inspecting the code, but maybe I will update the belief tomorrow, who knows. end edit

The readme implies this calibration isn't so important if you're using the Intel D415 RealSense. For what it's worth, the format of the files is (ignore the actual values):

EDIT: Yup, changed my mind. The calibration actually provides the pose of the camera relative to the robot frame. In this way, the image from the camera, which may be looking at the workspace from the side or at an angle, can be morphed/transformed so that the image looks like it came from a perfectly "bird's-eye" camera. end edit

Also, for starting out, a blank file named camera_depth_scale.txt will suffice to get past the errors that otherwise prevent the code from running.

real/camera_depth_scale.txt
1.012695312500000000e+00
real/camera_pose.txt
9.968040993643140224e-01 -1.695732684590832429e-02 -7.806431039047095899e-02 6.748152280106306522e-01
5.533242197034894325e-03 -9.602075096454146808e-01 2.792327374276499796e-01 -3.416026459607500732e-01
-7.969297786685919371e-02 -2.787722860809356273e-01 -9.570449528584960008e-01 6.668261082482905833e-01
0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.000000000000000000e+00
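
For example, reading those files back and mapping a point from the camera frame into the robot base frame looks roughly like this (the point itself is made up):

import numpy as np

cam_pose = np.loadtxt("real/camera_pose.txt")            # 4x4 camera-to-robot-base transform
depth_scale = np.loadtxt("real/camera_depth_scale.txt")  # scalar depth correction factor

pt_cam = np.array([0.05, -0.10, 0.60])                   # some point in the camera frame, in meters
pt_robot = (cam_pose @ np.append(pt_cam, 1.0))[:3]       # the same point in the robot base frame
print(pt_robot)
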
  3. Any 4×4 checkerboard will work. I used some online checkerboard generator and then printed it out, e.g.:
    [image: 4×4 checkerboard]
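
If you'd rather generate one locally instead of using an online generator, a quick numpy + PIL sketch like this works too (the square size in pixels is whatever prints at the physical scale you need):

import numpy as np
from PIL import Image

squares = 4        # 4x4 checkerboard, as used above
square_px = 120    # pixels per square; pick whatever prints at the right physical size
board = np.indices((squares, squares)).sum(axis=0) % 2          # alternating 0/1 pattern
img = np.kron(board, np.ones((square_px, square_px))) * 255     # blow each cell up to square_px
Image.fromarray(img.astype(np.uint8)).save("checkerboard_4x4.png")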

[1] Note that it's possible the pendant display somehow differs from the actual TCP values — my z-values were 0.07 on the pendant, corresponding to 0.47 in Python; to debug, you can use examples/simple.py from https://github.com/SintefManufacturing/python-urx

And more rambling thoughts:
I mostly fussed around with the calibrate.py script for a long time, an entire 1-2 days wasted because I didn't realize the pendant coordinates were off by 40 cm on the z axis, so, combined with the joint config vs. pose specification issue, I was confused about why the robot was constantly trying to go through the table. I suspected it was something like the z-axis issue, but it was really using this library to get the pose out
https://github.com/SintefManufacturing/python-urx/
(such a great library!) that helped me figure it out.

Additionally, I wasn't certain how the tool offset worked until I opened the code. I thought it was measured to wherever on the gripper I wanted the centerpoint to be, but no, it's measured to what the UR5 thinks is the centerpoint of its tool, which is the point whose coordinates it reports.

I’m currently still having some z-depth issues, so I’m working through the (very detailed!) parameters given in the paper to see what is going on with that.

USB extension cable — USB 3.0 is quite strange. I spent a long time figuring out that my extension cable looks like a USB 3 cable (blue ends, extra pins) but was behaving as a USB 2.0 extension cable… ordered some off of amazon that did the trick (also lsusb -t was very helpful).

Home position —

It seems that

Here’s a video of what it’s doing for now (I’ll rehost onto youtube for longevity when I get the chance)
https://photos.app.goo.gl/r6vtjPbjLJECjzPd6
And a more exciting dynamic maneuver

And pictures

 

Yesterday, when it was kinda working

Hey look, I selected BASE. T__T

Calibration in progress. With some limits to the movel command, punctuated by “I guess it’s safe *shrug*”:

And a blurry picture of my lab. Had to crop out my robot a bit to avoid faces.

Until next time, folks. Hopefully I’ll have a working demo of something of my own soon. Right now, I’m just running a mutilated version of someone else’s code. But I’m happy to be working with actual robots again.

Conclusion

Okay, that was all a bit rambly. But if anyone has questions, feel free to ask away.

 

Footnotes:

As to my motivation, I’m working on a small summer research project, which I will detail if I end up getting it working in full.

The idea is heavily based on the TossingBot paper, as I liked the idea of combining a physics baseline with learning the error (the residuals).

My requirements were:
1. can be finished in 3 months, starting from scratch
2. has a cool demo (for, say, a 10-year-old Maker Faire attendee) — so probably something dynamic, movement-wise
3. is research-worthy, since my qualification trials are at the end of the summer.

I think I’ll struggle most with the last point, but I’m hoping that in the process of working toward my goal, I’ll think of something that could be tweaked or improved.