Title cards for videos with Common Lisp

Xach posted recently about Fighting blog spam with Common Lisp as a short example of using lisp to solve everyday programming problems.  Here’s one I made last weekend.

The Problem:

My wife belly dances, and we frequently do some light video editing before posting things to youtube.  One of the annoying chores is making title cards, usually white text on a black background that we put at the beginning and end of each video to state time, place, etc, and then fade in/out.  An example:

Text should be large enough to fill the screen, centered, and all the same size.  For a while I made these in Gimp, then made them in HTML and took screenshots, and finally said screw it and wrote a small helper program.  I don’t know how to use Gimp effectively, and jiggling font sizes in HTML is a pain.  I’m sure there are better solutions, but it was faster (and more fun) for me to write some lisp.

The Solution:

To make title cards, I wrote a little program called “Titler”.

Like Xach’s project, I started with (quickproject:make-project "~/src/lisp/titler/" :depends-on '(vecto iterate))

From there I banged away at it for a little while, found some .ttf files on my system, and got the basic generation done using vecto’s string drawing functions. The only tricky bit was determining the optimal font size. Vecto provides a string-bounding-box function that gives you pixel dimensions for a given string at a given font size. I made a function that uses Newton’s method to iteratively try different font sizes until it finds one that fits on the title card and takes up more than 75% of the width. I’m almost positive there are corner cases where my implementation won’t converge on a solution, but it works pretty well for now.
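The search loop is only a few lines in any language; here’s a sketch of the idea in Python.  Everything here is illustrative: `toy_measure` stands in for vecto’s string-bounding-box, and the 0.875 target is my own assumption (the midpoint of the 75%–100% window), not a value from the original code.

```python
def fit_font_size(measure, text, card_width, lower_bound=0.75):
    """Iteratively adjust the font size until the rendered text is
    narrower than the card but wider than lower_bound * card_width."""
    size = 12.0
    for _ in range(100):                 # bail out if we fail to converge
        width = measure(text, size)
        if lower_bound * card_width <= width <= card_width:
            return size
        # scale the size by how far off the width is (Newton-ish update,
        # aiming at the middle of the acceptable window)
        size *= (0.875 * card_width) / width
    return size

# toy metric: pretend each character is 0.6 * size pixels wide
toy_measure = lambda text, size: 0.6 * size * len(text)
size = fit_font_size(toy_measure, "Sun Center", 640)
```

Because rendered width grows roughly linearly with font size, a single multiplicative correction usually lands inside the window in one or two iterations; pathological fonts or strings are where a loop like this can bounce around without converging.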

For the next video I can make easy titles:

(make-title "May 26, 2011
Sun Center
Gainesville, FL" 640 480)

Code is at, and you can see the results (and my wife fire dancing) at

coroutines in common lisp with bordeaux-threads

Turns out threads are a lot easier without beer and after a good night’s sleep.  Following up on last night’s defeat (see coroutines in common lisp), I re-read the documentation this morning and got my locks sorted out.

I now use one lock and two condition variables (CV).  From the bordeaux-threads API docs:

A condition variable provides a mechanism for threads to put themselves to sleep while waiting for the state of something to change, then to be subsequently woken by another thread which has changed the state.

I thought of these CVs like events in Java/C#/Javascript.  Telling one thread to CONDITION-WAIT on a CV is kinda like telling it to listen to that event, and having another thread CONDITION-NOTIFY on a CV is kinda like firing the event.  It took me a long time to understand the importance of CONDITION-WAIT atomically releasing a lock, then reacquiring it before continuing execution in that thread.  That mechanism let me coordinate some sequential execution between the threads, eliminating the race conditions that beat me last night.
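The same handshake can be sketched with Python’s threading.Condition, the analogue of a bordeaux-threads lock plus condition variable.  The key property is the one described above: wait() releases the lock atomically and has it back before the waiter resumes.

```python
import threading

cv = threading.Condition()          # one lock + one condition variable
ready = False                       # the state the waiter is watching
results = []

def waiter():
    with cv:                        # acquire the lock
        while not ready:            # guard against spurious wakeups
            cv.wait()               # releases lock, sleeps, reacquires
        results.append("woke with lock held")

t = threading.Thread(target=waiter)
t.start()
with cv:
    ready = True                    # change the state...
    cv.notify()                     # ...then wake the waiting thread
t.join()
```

Because the waiter holds the lock both when it checks `ready` and when it wakes, there is no window where the notifier can flip the state unseen, which is exactly the race that bites you with bare locks.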

I also added the ability to send a value into the coroutine by setting the return value of yield.

I used one CV to tell the coroutine it should run to the next yield, and another CV for the coroutine to tell the caller that a value was ready for it.  I had a few let bindings for my shared memory, closing variables into both the coroutine and caller functions.  The coroutine doesn’t spawn a new thread until the first time it’s funcalled.  I have a somewhat poor mechanism for determining if the coroutine is done; you specify a sigil value and the coroutine yields that as the final value (kind of like eof-value in stream reading functions).  I tried to use thread-alive-p, but ran into race conditions.  I have a few ideas for how to improve that.

Here’s the latest make-coroutine macro and test function:

(defmacro make-coroutine ((&key (coroutine-done-value :done))
                          &body body)
  (alexandria:with-gensyms ((yield-cv "there is a value ready for pickup")
                            (run-cv "coroutine should run")
                            (lock "lock")
                            (val "shared memory")
                            (yield-result "return value of yield in the coroutine")
                            (thrfn "thread function body"))
    `(let* ((,yield-cv (bordeaux-threads:make-condition-variable
                        :name "yield"))
            (,run-cv (bordeaux-threads:make-condition-variable
                      :name "run"))
            (,lock (bordeaux-threads:make-lock "coroutine lock"))
            ,val ,yield-result
            (,thrfn (lambda ()
                      (flet ((yield (&optional n)
                               (setf ,val n)
                               ;;signal that a value is ready for pickup
                               (bordeaux-threads:condition-notify ,yield-cv)
                               ;;wait for a chance to run
                               (bordeaux-threads:condition-wait ,run-cv ,lock)
                               ;;return whatever the caller sent in
                               ,yield-result))
                        (bordeaux-threads:acquire-lock ,lock)
                        ,@body
                        (yield ,coroutine-done-value)
                        (bordeaux-threads:release-lock ,lock)))))

       ;;function to pull values from the coroutine
       (let ((alive-p T) thr)
         (lambda (&key (send nil send-suppliedp))
           (when alive-p
             (bordeaux-threads:with-lock-held (,lock)
               (if thr
                   (bordeaux-threads:condition-notify ,run-cv)
                   (setf thr (bordeaux-threads:make-thread
                              ,thrfn :name "coroutine")))
               ;;wait for the coroutine to yield a value
               (bordeaux-threads:condition-wait ,yield-cv ,lock)

               (setf ,yield-result
                     (if send-suppliedp send ,val))

               (when (eql ,coroutine-done-value ,val)
                 (setf alive-p nil)
                 ;;let the coroutine thread run to completion
                 (bordeaux-threads:condition-notify ,run-cv))))
           ,val)))))

(defun coroutine-test ()
  (let ((cor (make-coroutine (:coroutine-done-value :done)
               (yield 1)
               (yield)
               (yield 4)))
        (cor2 (make-coroutine ()
                (yield (yield (yield 4))))))
    (assert (eql 1 (funcall cor)))
    (assert (null (funcall cor)))
    (assert (eql 4 (funcall cor)))
    (assert (eql :done (funcall cor)))
    (assert (eql :done (funcall cor)))

    (assert (eql 4 (funcall cor2)))
    (assert (eql 4 (funcall cor2 :send 6)))
    (assert (eql 6 (funcall cor2)))
    (assert (eql :done (funcall cor2)))))

I’ll probably play with it more tonight, maybe put together a stand-alone repo / library.

coroutines in common lisp

After spending a while in python land, I wanted to have “yield” in lisp.  After a month or so of stewing, I decided to dive in tonight.  My first stab uses threads, not continuations, to accomplish this.  I made that choice partially because I find the arnesi library intimidating (arnesi has continuations in there somewhere), and partially because I wanted more practice with threads.

I ended up futzing with bordeaux-threads for a few hours, and eventually punted and used a library that had already solved these problems.  My basic test function was:

In retrospect, this may have been a bit pathological.  Virtually no time was spent anywhere, and so everything was happening pretty much at once.

My basic threading approach was to make a reader/writer pair:

  1. run the coroutine (writer) in a thread, and lexically bind yield using a flet such that calling yield set shared memory (with the appropriate locks)
  2. build a lambda (reader) that, when funcalled, waits for the thread to have a value ready, pulls it from shared memory and returns it (with the appropriate locks)

The “with the appropriate locks” bit killed me.  I spent a lot of time in deadlock, and had race conditions everywhere.  I ran into these issues:

  • race condition during startup where the writer thread would start too slowly, missing the notify from the reader to give me a value, and then get stuck waiting for the reader to notify
  • race condition at the end of the coroutine, where the writer thread wouldn’t die fast enough, and the reader would get stuck waiting for the writer to notify
  • many cases where I wanted to CONDITION-WAIT in one thread before I CONDITION-NOTIFY in another, but kept getting it backward.  Adding more layers of locks/condition variables seemed to just defer the problem to another level.

My initial bordeaux-threads version worked great if I ran it from the REPL (with 1+ second pauses for me to run the commands), but the race conditions screwed me when I put it all together.

After a few hours (and a few beers) of debugging, I decided to look at how chanl did it, which rapidly degraded into a chanl-based implementation.  This, of course, took 10m to write and worked great:

For reference, my last broken bordeaux-threads version was:

Fun stuff! Good to know I suck at threads, maybe I’ll take another try with less beer later. At least now I can browse simpy source with less envy.

working with R, postgresql + SSL, and MSSQL

I’ve been able to take a break from my regularly scheduled duties and spend some time working with R.  This is a short log of what I did to get it working.

The main things I’m looking to do are regression modelling from a large dataset I have in postgresql and various stats calculations on some business data I have in SQL Server.  Today I got to the stage in my R learning where I wanted to hook up the databases.

My setup:

  • R version 2.12.0 on windows 7
  • postgresql 8.4.5 on ubuntu server, requiring SSL
  • MS SQL Server 2005 on Windows 2003

R connects to databases via RJDBC, which (surprise) uses JDBC.  You need to download JDBC drivers for each server, and then can load those up inside R.

  1. Install RJDBC
    1. Open R
    2. Packages -> Install package(s)
    3. pick a mirror near you
    4. select RJDBC
  2. install JDBC driver for MSSQL
    1. I used jtds: (there is also a Microsoft-provided driver I didn’t hear about until I was done)
    2. download and unzip
    3. note the path to the jtds jar file (hereafter referred to as $JTDS) and the jar filename
    4. open, which has some magic strings JDBC wants
    5. optional – copy $JTDS/(x64|x86)/SSO/ntlmauth.dll into your %PATH% if you want to use windows authentication with SQL Server
  3. install JDBC driver for Postgresql
    1. Download from
    2. note the path to the jar file (hereafter referred to as $PG) and the jar file name
    3. open, which has some magic strings JDBC wants

Then, to connect with MSSQL:

> library(RJDBC)
> mssql <- JDBC("net.sourceforge.jtds.jdbc.Driver", "$JTDS/jtds-1.2.5.jar", "`")
> testdb <- dbConnect(mssql, "jdbc:jtds:sqlserver://host/dbname")
> typeof(dbGetQuery(testdb, "SELECT whathaveyou FROM whither"))
[1] "list"

And you’re off and running with your results in a list, and can do whatever you like.

Now for postgresql+ssl:

> pgsql <- JDBC("org.postgresql.Driver", "$PG/postgresql-9.0-801.jdbc3.jar", "`")
> testdb <- dbConnect(pgsql, "jdbc:postgresql://host/dbname?ssl=true", password="password")
> typeof(dbGetQuery(testdb, "SELECT whathaveyou FROM whither"))
[1] "list"

The connection here has a lot more options, and depends highly on your server’s pg_hba.conf.  It took a little while to figure out the “?ssl=true” bit.  Luckily you get pretty descriptive error messages if you can’t connect, and the PostgreSQL JDBC docs are pretty good.

Now to re-learn everything I once knew about regression modeling!

making SQL Server backups using python and pyodbc

I have a set of python scripts to help me manage a few SQL Servers at work, and one of the things I do is take database backups using BACKUP DATABASE and BACKUP LOG.  I’ve been using pymssql to connect, but today tried switching to pyodbc.  pymssql seems to be having momentum problems, so I figured I’d try the alternative and see how well it works.

I ran into two issues:

  • pymssql supports “%s” as a parameter placeholders, and pyodbc wants “?”
  • BACKUP and RESTORE operations on SQL Server don’t run like normal queries

The first was trivial to fix, the second took some digging.  If you try to run a BACKUP via pyodbc, the cursor.execute() call starts and finishes with no error, but the backup doesn’t get made.  With help from CubicWeb‘s post MS SQL Server Backuping gotcha, I learned that BACKUP and RESTOREs over ODBC trigger some kind of asynchronous / multiple result set mode.  To get around this, you can poll for file size (as Cubicweb did), but that gets ugly when making a backup on a remote server.

In a backup, I think each “X percent processed.” message is considered a different result set to ODBC. Instead of polling the file size, you can call cursor.nextset in a loop to get all the “percent processed” sets:
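Here’s a sketch of that loop in Python.  The connection details are hypothetical, but the shape is the important part: keep calling nextset until the driver says there are no more result sets, because only then has the BACKUP actually finished.  (BACKUP also can’t run inside a transaction, hence the autocommit note.)

```python
def drain(cursor):
    """Consume every result set a BACKUP/RESTORE statement produces,
    returning how many sets were seen."""
    count = 1                        # the set execute() already opened
    while cursor.nextset():          # True while more result sets remain
        count += 1
    return count

# with a real connection it would look something like:
#   cnxn = pyodbc.connect(conn_str, autocommit=True)  # no transaction
#   cursor = cnxn.cursor()
#   cursor.execute("BACKUP DATABASE mydb TO DISK = ?", path)
#   drain(cursor)                    # block until the backup completes
```

Each “X percent processed.” message shows up as one of these sets, so on a big database the loop simply spins until the server is done.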

After adding that while loop, backups of small and medium sized databases worked like a charm.

Installing VS 2008 and SQL 2008 Express on Windows 7

A new decade means time for a fresh windows install at work.  I ran into some trouble with windows 7, visual studio 2008, and SQL 2008 Express.  Here’s how I resolved them.  Contrary to most things I found on the web, I’m not using betas or release candidates.

First off, installing SQL 2008 Express.  I only wanted the management tools, and this was a little hard to come by.  I downloaded various EXE files from MSDN, but none of them worked (they would error out, bring up a seemingly unrelated installer, or exhibit other confusing behavior that may have led you here).  Here’s what worked for me:

  1. Be sure any previous installation attempts have been purged via Add / Remove Programs
  2. Go to the “other install options” page for SQL express:
  3. Click the “Management Tools” install button (for me that’s:
  4. Install the “Microsoft Web Platform Installer” (MWPI) if it asks you to
  5. Should be straightforward from here on

The funny thing here is that the MWPI seems to download an installer that looks a lot like the Microsoft® SQL Server® 2008 Management Studio Express installer that didn’t work for me.

Next up, Visual Studio 2008 (VS2008).  My company has an MSDN subscription, so we downloaded an ISO (named en_visual_studio_2008_professional_x86_x64wow_dvd_X14-26326.iso) and I used freeware MagicISO to mount it, then ran “setup.exe”.  The install failed on the “Microsoft Visual Studio Web Authoring Component” (MVSWAC).  Here’s what worked for me:

  1. Be sure any previous installation attempts have been purged via Add / Remove Programs
  2. Download WebDesignerCore.EXE from microsoft
  3. Run it
  4. Install VS2008 from disc/iso as normal.

Digging into the ISO using 7zip, the problem is /WCU/WebDesignerCore/WebDesignerCore.EXE is corrupt.  To get VS2008 to install cleanly, first we need to install MVSWAC, at which point the VS2008 installer will happily skip past the corrupt file.  I ran across several blog/forum posts with horror stories about VS2008 installing SQL2005, and needing to uninstall half the planet to get things working right.

As always, be sure to hit up windows update, and change your update settings so you get fixes for VS2008 and SQL2008.

This is a test of the wordpress android app. I am likely too lazy to delete this test later.

Is programming all marshmallows and toothpicks, or is it just web apps?

I’ve been doing some maintenance programming for a few days solid (rare for me to get to program that much), and I again find myself amazed that any software works at all.  I’ve only been programming seriously for about a decade (mostly web apps), but it feels like I’m building rickety crap on top of other people’s horrible hacks.

The bar for quality software seems so abysmally low.  When coding around some bizarre behavior I’m seeing out of the .NET framework, I know I’m introducing weird brittle bits.  It feels wrong, but I don’t see any other option.  And this is new code, written for the latest released version of a very popular system!  It seems like everyone else is doing the same thing in every programming environment I’ve seen.

My best guess is I’m working at maybe the 1000th layer of abstraction over the bare metal, and that sounds low.  That’s a lot of cruft, hacks, bugs, security holes, late-night fixes, bad compromises and coffee.

Maybe my sense of “clean code” is just OCD?  Sometimes I wonder if writing good code is just a waste of time.  Is shoddy copy/paste winning the evolutionary battle for the software base that will drive humanity for the next millennium?

more heat-maps using vecto and ch-image

This is a follow-up to my post last year about simplistic heat-maps using Vecto. To recap, I’m trying to make heat maps for google maps overlays.

Here’s how it works in a nutshell:

  1. From javascript I pass to the server the lat/lng region currently shown on the google map, and what size heat map to generate, in pixels.
  2. lisp pulls weights from my database within the given lat/lng region
  3. lisp iterates over the db results, mapping lat/lng to x/y coordinates for the final heat map image
  4. lisp uses the list of mapped (x y weight) to draw the heat map in png
  5. javascript throws the png on top of the google map
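Step 3 above is just a pair of linear maps; here’s a Python sketch, assuming a simple equirectangular treatment of the visible region (the region tuple and coordinates in the example are made up for illustration):

```python
def latlng_to_xy(lat, lng, region, width, height):
    """region is (south, west, north, east); returns integer pixel x, y
    with (0, 0) at the top-left of the heat-map image."""
    south, west, north, east = region
    x = (lng - west) / (east - west) * (width - 1)
    y = (north - lat) / (north - south) * (height - 1)  # y grows downward
    return round(x), round(y)

# the south-west corner of the region maps to the bottom-left pixel
latlng_to_xy(29.6, -82.4, (29.6, -82.4, 29.7, -82.24), 640, 480)
```

The flip in the y formula is the usual gotcha: latitude grows northward while image rows grow downward.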

I tried a few things based upon the comments I got back from the helpful lisp community.

  • used zpng to get direct pixel access, and calculated each pixel’s color using a weighted average of nearby points using distance.  This didn’t produce good images, and was pretty slow.
  • used zpng to get direct pixel access, and calculated each pixel’s color using the gravity formula against nearby points.  This didn’t produce good images, and was very slow.

I did some more research and learned about the Generic Mapping Tools and bicubic interpolation. The GMT is a set of C programs, similar to the Imagemagick suite.  GMT showed one way to draw heat maps in the Image Presentations tutorial.  It spoke of gridded data sets, and that gave me one more vecto-based idea: split the desired heat-map into a grid and color each square in the grid based upon an average of the weights mapped in that square.  This is a neat effect, but not what I was going for:

This is reasonably fast, taking about 1 second on my dev server.  To quickly find what weights belong in which grid square, I make a spatial index of all the weights, using an r-tree from the spatial-trees library.
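The bucketing itself is simple; here’s a Python sketch of the grid-average step.  (The post uses an r-tree from spatial-trees; for a fixed grid, plain dict buckets do the same job, and the r-tree mostly pays off when you also need range or nearest-neighbor queries.)

```python
from collections import defaultdict

def grid_average(points, cell):
    """points: (x, y, weight) triples in pixel space; cell: grid square
    size in pixels.  Returns {(col, row): mean weight in that square}."""
    buckets = defaultdict(list)
    for x, y, w in points:
        buckets[(int(x // cell), int(y // cell))].append(w)
    return {sq: sum(ws) / len(ws) for sq, ws in buckets.items()}

grid_average([(3, 4, 1.0), (5, 2, 3.0), (30, 40, 8.0)], 8)
# squares (0, 0) and (3, 5) get averages 2.0 and 8.0
```

Each square is then filled with a color picked from the average, which is what produces the blocky look above.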

The next method I tried was to use interpolation to get a smooth look.  I found Cyrus Harmon’s ch-image library supports image interpolation, and got to it.  As Patrick Stein noted elsewhere, ch-image isn’t easy to install.  It’s not asdf-installable, and the project page doesn’t list all its dependencies.  For future reference, here’s what I think I needed to install it:

(asdf-install:install "")
(asdf-install:install "")
(asdf-install:install "")
(asdf-install:install "")
(asdf-install:install "")
(asdf-install:install "")

Armed with ch-image, now the drawing process becomes:

  1. draw a small image, coloring pixels based upon weights
  2. enlarge the small image with interpolation
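Step 2 is standard bilinear interpolation; here’s a toy grayscale sketch in Python (ch-image’s transform machinery is more general, this just shows the four-neighbour weighted average that produces the smooth look):

```python
def bilinear_scale(src, out_w, out_h):
    """src: 2-D list of grayscale values; returns an out_h x out_w list."""
    in_h, in_w = len(src), len(src[0])
    out = []
    for j in range(out_h):
        # fractional position of this output row in the source image
        y = j * (in_h - 1) / (out_h - 1)
        y0 = int(y); y1 = min(y0 + 1, in_h - 1); fy = y - y0
        row = []
        for i in range(out_w):
            x = i * (in_w - 1) / (out_w - 1)
            x0 = int(x); x1 = min(x0 + 1, in_w - 1); fx = x - x0
            # weighted average of the four surrounding source pixels
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

bilinear_scale([[0, 10], [10, 20]], 3, 3)
# centre pixel comes out as the average of all four corners: 10.0
```

The discontinuities mentioned below come from this scheme only ever looking at a 2x2 neighbourhood; bicubic uses 4x4 and smooths them out.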

The first step is very similar to the code I wrote to make the grid version above.  Instead of drawing a rectangle, I draw a pixel using ch-image’s pixel access functions.  This was a little weird because ch-image’s coordinate system has 0,0 at the top left of the image.  I’m still not sure how to best choose the size of this smaller image, but ultimately it should depend on my data.  For now I just have it hard-coded to be 20x smaller than the desired size:

Yep, that’s pretty small.  Applying a transform to scale it up to the desired size using bilinear interpolation yields:

It looks pretty good and takes about a half-second to draw.  If you click into the larger version, you can see some discontinuities in there, which is a well-known result of bilinear interpolation.  However, based upon other graphics I’ve seen, what I really want is bicubic interpolation.  Luckily, ch-image has this built in:

Oops, maybe not so luckily.  I can certainly see the kind of look I’m after in all the garbled stuff, but ch-image is freaking out somewhere in there.

Bilinear it is!  Here’s a screenshot of the overlay in place on the map:

It’s pretty fast, and looks pretty nice, and is fairly close to the look I wanted.  I probably still have some off-by-one errors somewhere, and need to check the git repos for the ch-* libs to see if there might be newer versions than the tarballs I installed.  I still count this as great progress for 5 hours of coding and research.  Huzzah for the much-maligned lisp libraries!

latest postgres docs bookmarklet

When using google to find things in the excellent Postgresql documentation, I often end up on pages showing old postgres versions.  For example, googling for “postgresql create index”, the first hit is for the postgresql 8.2 docs, and I’m running 8.4 now.  My co-workers made a greasemonkey script to automatically redirect to the current version, and I adapted that into a bookmarklet.

Drag this link into your address bar to use it in your browser:


When you find yourself on an old postgres docs page, click the bookmarklet to redirect to the latest version of that page.  This should work as long as the postgres folks keep their URL naming scheme.
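The rewrite the bookmarklet does amounts to swapping the version segment of the docs URL for “current” (the actual bookmarklet is javascript; this Python sketch just shows the idea):

```python
import re

def to_current(url):
    """Rewrite a versioned postgresql.org docs URL to the current docs;
    leave any other URL untouched."""
    return re.sub(r"postgresql\.org/docs/[\d.]+/",
                  "postgresql.org/docs/current/", url)

to_current("http://www.postgresql.org/docs/8.2/static/sql-createindex.html")
# -> 'http://www.postgresql.org/docs/current/static/sql-createindex.html'
```

Since the version only ever appears as the path segment right after /docs/, a single substitution covers every docs page.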