Today’s hack

Let’s say you’re automating a git workflow for a variety of good and bad reasons. Commits are fine, since you can just do:

git commit -am 'a message'

and it goes through non-interactively. Now let’s say you have a merge and you try:

git merge

It pops open an editor window to let you type a message. There’s no merge -m option to let you provide a message. If you look closely, though, you’ll see that the editor window already has a message filled in saying it’s a merge. This means that if the editor simply exits without touching the file, git still sees a message, decides all is OK and proceeds. So all you have to do is provide an “editor” which does nothing and exits successfully every time. That program is true (usually /bin/true), so you can simply do:

EDITOR=true git merge

for a non-interactive merge.
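
For example, dropped into a scripted update it might look something like this (the remote and branch names are placeholders, not anything from a real workflow):

#Hypothetical scripted update: fetch and merge without any prompts
git fetch origin
EDITOR=true git merge origin/somebranch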

I don’t know whether to be proud or ashamed, but it works.



Small hack of the day

Two things:

  1. Syntax highlighting is nice.
  2. Wouldn’t it be nice to be able to copy/paste it?

The main problem is that many terminals, for no especially good reason, don’t allow copy/paste of formatted text. There are three tools which help:

  1. xclip -t text/html: this eats standard input and allows pasting it as HTML, so it can include formatting and color and so on. By default, xclip does plain text, so you have to specify that it’s HTML.
  2. aha: this takes ANSI formatted text (i.e. the formatting scheme used by terminals) and turns it into HTML.
  3. unbuffer: many tools will only write color output to a terminal, not a pipe. Through the magic of PTYs (pseudoterminals), this program fools other programs into thinking they’re running on a terminal when they’re not (see the demonstration just below).
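
A quick way to see what unbuffer is doing (ls here is just a stand-in for any tool which only colors its output on a terminal):

ls --color=auto | cat            #no color: ls sees a pipe, not a terminal
unbuffer ls --color=auto | cat   #color codes come through: ls has been fooled by the PTY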

Now all you have to do is pipe them together. So, take a silly C++ program like this:

void silly;

And compile it (saved as, say, silly.cc) like this:

unbuffer g++-7 -std=c++1z silly.cc | aha | xclip -t text/html

Then a simple paste into a box which accepts HTML (e.g. email) gives:

error: variable or field 'silly' declared void
 void silly;
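
The same pipeline works for anything which only colors its output on a terminal; for instance (the pattern and file are arbitrary, just for illustration):

unbuffer grep --color=auto root /etc/passwd | aha | xclip -t text/html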


Overwrite a file but only if it exists (in BASH)

Imagine you have a line in a script:

cat /some/image.img > /dev/mmcblk0

which dumps a disk image onto an SD card. I have such a line as part of a setup script. Actually, a more realistic one is:

pv /some/image.img | sudo bash -c 'cat >/dev/mmcblk0'

which displays a progress bar and doesn’t require the whole script to be run as root. Either way, there’s a problem: if the SD card doesn’t happen to be in when that line runs, it will create /dev/mmcblk0 as an ordinary file. Then all subsequent writes will go really fast (at the speed of the main disk), and you will get confused and sad when none of the changes are reflected on the SD card. You might even reboot, which will magically fix the problem (/dev gets nuked and recreated). That happened to me 😦

The weird, out-of-place dd tool offers a nice fix:

pv /some/image.img | sudo dd conv=nocreat of=/dev/mmcblk0

You can specify a pseudo-conversion (conv=nocreat) which tells dd not to create the file if it doesn’t already exist. It also serves the same purpose as the “sudo tee” idiom, but without dumping everything to stdout.
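
For comparison, the usual “sudo tee” idiom would look something like this, with the redirect needed purely to stop tee echoing the whole image to the terminal (and note that tee, unlike dd conv=nocreat, will happily create the file if it’s missing):

pv /some/image.img | sudo tee /dev/mmcblk0 > /dev/null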

A simple hack, but a useful one. You can do similar things too, like appending, but only if the file exists. The incantation is:

dd conv=nocreat,notrunc oflag=append of=/file/to/append/to

That is: don’t create the file, don’t truncate it on opening, and append. If you allow it to truncate, it will truncate and then append, which is entirely equivalent to overwriting; nonetheless, you can still specify append without notrunc.
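
As a usage sketch (the target path here is made up), appending a line to a root-owned file only if it already exists might look like:

echo 'some entry' | sudo dd conv=nocreat,notrunc oflag=append of=/etc/some.conf

dd will still print its usual transfer statistics to stderr; with GNU dd you can add status=none to silence them.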

Warn or exit on failure in a shell script

Make has a handy feature whereby, when a rule fails, it stops whatever it’s doing. Often, though, you simply want a linear list of commands run in sequence (i.e. a shell script) with the same feature.

You can more or less hack that feature with BASH using the DEBUG trap. The trap executes a hook before every command is run, so you can use it to test the result of the previous command. That of course leaves the last command dangling, so you can put the same hook on the EXIT trap which runs after the last command finishes.

Here’s the snippet and example which warns (rather than exits) on failure:

function test_failure(){
  #Save the exit code, since anything else will trash it.
  v=$?
  if [ $v != 0 ]
  then
    echo -e Line $LINE command "\e[31m$COM\e[0m" failed with code $v
  fi
  #This trap runs before a command, so we use it to
  #test the previous command run. So, save the details for
  #next time.
  COM=$BASH_COMMAND
  LINE=$BASH_LINENO
}

#Set up the traps
trap test_failure EXIT
trap test_failure DEBUG

#Some misc stuff to test.
echo hello
sleep 2 ; bash -c 'exit 3'

echo world

false

echo what > /this/is/not/writable

echo the end

Running it produces:

$ bash errors.bash
hello
Line 21 command bash -c 'exit 3' failed with code 3
world
Line 25 command false failed with code 1
errors.bash: line 27: /this/is/not/writable: No such file or directory
Line 27 command echo what > /this/is/not/writable failed with code 1
the end
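
The title promises “warn or exit”; to get the exit behavior instead, one variant (a sketch along the same lines, not from the original script) is to bail out inside the same trap, removing the EXIT trap first so the failure isn’t reported twice:

function exit_on_failure(){
  v=$?
  if [ $v != 0 ]
  then
    echo -e Line $LINE command "\e[31m$COM\e[0m" failed with code $v
    #Remove the EXIT trap so this failure isn't reported a second time.
    trap - EXIT
    exit $v
  fi
  COM=$BASH_COMMAND
  LINE=$BASH_LINENO
}

trap exit_on_failure EXIT
trap exit_on_failure DEBUG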