Time-Robbing Python Errors

Ran into a problem while attempting to get Python provisioned automatically on Windows. I could install Python as an administrator, but when I switched into a Limited User Account and tried to use pip or virtualenv, I’d get nothing but obscure failures.

The key phrase that kept popping up was:

“WindowsError: [Error 183] Cannot create a file when that file already exists: ‘C:\\Documents and Settings\\Administrator\\.distlib’”

as distlib kept trying to put things into the admin account by default.

To make sense of the error, I had to dig into the closest code to the WindowsError that was thrown, which looked like this:
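The snippet itself didn’t survive in this post, but the logic in question looked roughly like the following. This is a paraphrase from memory of distlib’s cache-directory lookup, not the exact source:

```python
import os

def get_cache_base(suffix='.distlib'):
    """Paraphrase of how distlib picks its cache directory (sketch,
    not the exact distlib source)."""
    if os.name == 'nt' and 'LOCALAPPDATA' in os.environ:
        # On Windows, %LOCALAPPDATA% is consulted first
        base = os.path.expandvars('$LOCALAPPDATA')
    else:
        # Fallback: the home directory. Under the Limited User Account
        # described above, this is the call that resolved to the
        # Administrator's profile.
        base = os.path.expanduser('~')
    return os.path.join(base, suffix)
```

Note that neither branch checks whether the resulting directory is actually writable by the current user, which is why a profile misconfiguration sends everything to the admin’s folder.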

Lo and behold, setting the LOCALAPPDATA environment variable to some writeable folder to which the Limited user has access fixes the issue. But it strikes me as a place where the os.path.expandvars() call should probably check the APPDATA environment variable too.
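In practice the workaround amounts to pointing LOCALAPPDATA at a writable location before pip or virtualenv ever runs. The path below is a hypothetical example, not the one from the original setup:

```python
import os

# Hypothetical folder the Limited user can write to:
os.environ['LOCALAPPDATA'] = r'C:\LimitedUser\AppData\Local'

# Any distlib-based tooling launched from this process (pip,
# virtualenv) will now create its .distlib cache under the writable
# folder instead of the Administrator profile.
```

Setting the variable in the user's environment (System Properties, or a `set` in the provisioning script) has the same effect for tools launched from a console.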

And, for whatever odd reason, it seems like the os.path.expanduser() call uses the user credentials associated with the Python installation or something, because at no point, while running this code as a Limited User, do I specify that I want the code to act as the Administrator. It’s a bit odd.

In any case, it works, but the solution is less than obvious.

Adding Python Twisted module to a virtualenv on Windows

I’m trying to keep my Python system-level install as clean as possible on Windows, but also trying to get Buildbot set up in a virtualenv. Unfortunately, some of the Buildbot dependencies want to build native extensions, and the most important — Twisted — just fails.

This is all I have in my system-level site-packages:
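The listing itself is missing from the post; presumably it was just the bootstrap tooling, something along these lines (illustrative versions, not the author’s actual output):

```
pip==1.1
setuptools==0.6c11
virtualenv==1.8.2
```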

If I create a new virtualenv with --no-site-packages, then try to install the buildbot-slave package, I get the dreaded "Unable to find vcvarsall.bat" when Twisted is being installed:
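The full traceback didn’t survive, but this is the classic distutils failure on a machine without Visual Studio installed; the build of Twisted’s C extensions dies with output ending in something like:

```
error: Unable to find vcvarsall.bat
```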

Unfortunately, if you download and attempt to install the Twisted redistributable package from their website directly into a virtualenv, the installer package will register itself with the system, even if you select the “Install just for me” option. In other words, you can’t just have the package install itself portably into multiple virtualenvs:

  • Select “Install just for me”
  • Set “Python 2.7 from registry” to “Entire feature will be unavailable”
  • Set “Python from another location” to “Will be installed on local hard drive”
  • Set “Provide an alternate Python location” to the location of your virtualenv "c:\virtualenv"

The User Account Control (UAC) dialog will pop up, and after the install, you’ll be stuck with a single install of Twisted which can only be Repaired or Removed:


This situation kind of sucks. Unfortunately, the .exe version of the installer also won’t let you choose where to install its contents. It just gets stuck on the dialog where it autodetects your existing Python 2.7 installation, but won’t let you choose a different site-packages folder as a file destination. It only goes for the global install.


The solution is to download the .exe version of the installer, unpack it using something like 7-zip, and then copy the files from the PLATLIB folder to your virtualenv’s Lib\site-packages folder and the files from the SCRIPTS folder to your virtualenv’s Scripts folder.
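That copy step can be scripted. The sketch below assumes Python 3.8+ for the dirs_exist_ok flag (the original post predates it and presumably used Explorer or xcopy); PLATLIB and SCRIPTS are the folder names 7-zip extracts from the installer:

```python
import os
import shutil

def copy_payload(unpacked_dir, venv_dir):
    """Copy an unpacked bdist_wininst payload into a virtualenv (sketch)."""
    # PLATLIB holds the importable packages; merge them into the
    # virtualenv's site-packages (which already exists, hence
    # dirs_exist_ok).
    shutil.copytree(os.path.join(unpacked_dir, 'PLATLIB'),
                    os.path.join(venv_dir, 'Lib', 'site-packages'),
                    dirs_exist_ok=True)
    # SCRIPTS holds console entry points, if the installer shipped any.
    scripts = os.path.join(unpacked_dir, 'SCRIPTS')
    if os.path.isdir(scripts):
        shutil.copytree(scripts,
                        os.path.join(venv_dir, 'Scripts'),
                        dirs_exist_ok=True)
```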

Then, the next time you run pip freeze, you should see something like:
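The original output is gone; it would have included Twisted and its zope.interface dependency, along the lines of (hypothetical version numbers):

```
Twisted==12.1.0
zope.interface==4.0.1
```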

Then you can install the rest of the Buildbot packages:
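The command itself is missing from the post; with Twisted already in place, it would have been an ordinary pip install of the slave package (buildbot-slave was the PyPI name in that era, shown here for illustration):

```
pip install buildbot-slave
```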

Detailed Error Emails For Django In Production Mode

Sometimes when you’re trying to figure out an issue in a Django production environment, the default exception tracebacks just don’t cut it. There’s not enough scope information for you to figure out what parameters or variable values caused something to go wrong, or even for whom it went wrong.

It’s frustrating as a developer, because you have to infer what went wrong from a near-empty stacktrace.

In order to be able to produce more detailed error reports for Django when running on the production server, I did a bit of searching and found a few examples like this one, but rewriting a piece of core functionality seemed a bit weird to me. If the underlying function changes significantly, the rewrite won’t be able to keep up.

So I came up with something different, a mixin function redirection that adds the extra step I want (emailing me a detailed report) and then calls the original handler to perform the default behavior:
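The original snippet is lost, but the shape of the trick survives in the description. Below is that shape on a stand-in class: in the real code the wrapped method was Django’s uncaught-exception handler, and the extra step built a DEBUG-style page (e.g. with django.views.debug.ExceptionReporter) and mailed it via mail_admins. All names here are illustrative:

```python
class Handler(object):
    """Stand-in for Django's BaseHandler."""
    def handle_uncaught_exception(self, request, exc):
        return 'generic 500 page for %s' % request

# Keep a reference to the original behavior before redirecting it.
_original = Handler.handle_uncaught_exception

def send_report(body):
    # In the real code this was a call to django.core.mail.mail_admins().
    print(body)

def handle_uncaught_exception(self, request, exc):
    # Extra step first: build and email the detailed report...
    send_report('detailed report: %r raised during %s' % (exc, request))
    # ...then delegate to the untouched original for the default behavior.
    return _original(self, request, exc)

# Redirect the method; Django's own source stays unmodified.
Handler.handle_uncaught_exception = handle_uncaught_exception
```

Because the wrapper ends by calling the saved original, the default behavior is preserved even if the underlying implementation changes between Django releases.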

Note that by using this code, you do end up with two emails: the usual generic error report and the highly detailed one containing what you’d normally see when hitting an error while developing the site with settings.DEBUG == True. These emails are sent within milliseconds of one another. The ultimate benefit is that none of the original code of the Django base classes is touched, which I think is a good idea.

Another thing to keep in mind is that you probably want to put all of your OAuth secrets and deployment-specific values in a file other than settings.py, because the values in settings get spilled into the detailed report that is emailed.

One final note is that I am continuously amazed by Python. The fact that first-class functions and dynamic attributes let you hack functionality in, in ways the original software designers didn’t foresee, is fantastic. It really lets you get around problems that would require more tedious solutions in other languages.

Python Parametrized Unit Tests

I’ve been testing some image downloading code on Tandem Exchange, trying to make sure that we properly download a profile image for new users when they sign in using one of our social network logins. As I was writing my unit tests, I found myself doing a bit of copy and paste between the class definitions, because I wanted multiple test cases to check the same behaviors with different inputs. Taking this as a sure sign that I was doing something inefficiently, I started looking for ways to parametrize the test cases.

Google pointed me towards one way to do it, though it seemed a bit more work than necessary and involved some fiddling with classes at runtime. Python supports this, of course, but it seemed a bit messy.

The simpler way, which doesn’t offer quite as much flexibility but involves less complexity (and less fiddling with the class at runtime), was to use Python’s mixin facility to compose unit test classes with the instance parameters I wanted.

So let’s say I expect the same conditions to hold true after I download and process any type of image:

  1. I want the processed image to be stored somewhere on disk.
  2. I want the processed image to be converted to JPEG format, in truecolor mode, and scaled to 256 x 256 pixels.
  3. I want to retrieve the processed image from the web address where I’ve published it, and make sure it is identical to the image data I’ve stored on disk (round trip test).

Here’s what that code might look like:
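The example itself didn’t survive, so here is a reconstruction along the lines the surrounding text describes. The class names StandardTestsMixin and GoodAvatar come from the discussion below; process_image and the fixture filenames are stand-ins for the real download pipeline:

```python
import unittest

def process_image(name):
    """Stand-in for the real download/convert/store pipeline."""
    return {'stored': True, 'format': 'JPEG', 'size': (256, 256)}

class StandardTestsMixin(object):
    """Checks shared by every input image; subclasses pick the image."""
    def test_stored_on_disk(self):
        self.assertTrue(process_image(self.image_name)['stored'])

    def test_converted_to_jpeg(self):
        self.assertEqual(process_image(self.image_name)['format'], 'JPEG')

    def test_scaled_to_256x256(self):
        self.assertEqual(process_image(self.image_name)['size'], (256, 256))

class GoodAvatar(StandardTestsMixin, unittest.TestCase):
    image_name = 'good_avatar.png'   # hypothetical fixture

class AnimatedGif(StandardTestsMixin, unittest.TestCase):
    image_name = 'animated.gif'      # hypothetical fixture
```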

So what ends up happening is that the composed classes simply specify which image they want the test functions to run against, and the rest of the test functions run as usual against that input parameter.

One thing readers might notice is the seemingly backwards class inheritance. It turns out (you learn something every day!) that Python treats class inheritance declarations from right to left, meaning that in the above examples, unittest.TestCase is the root of the inheritance chain. Another way to look at it: GoodAvatar instances will first search StandardTestsMixin, then unittest.TestCase, for inherited methods.

Python Code Coverage and cron

Every now and then, it’s useful to get a sense of assurance about the code you’re writing. In fact, it might be a primary goal of your organization to have functional code. Who knows?

Although I began development of Tandem Exchange following a test-first development process, the pace of change was too rapid. It’s not that I didn’t appreciate the value of testing. At the very beginning, I did implement a large number of tests. It’s just that those tests were written against soon-to-be-obsolete code and I didn’t have the time to develop new functionality and write unit tests simultaneously. Before the prototyping phase had ended, I learned the hard way that it didn’t really make sense to write many of those tests, when such a huge fraction of early functional code ended up in the dustbin.

Once things settled down, I started to leverage the Python code-coverage module alongside newly-written unit tests, made simple by using the nose test runner, which is a fantastic tool for test auto-discovery.

I then added the nose test runs to the development-site crontab, to generate coverage and unit test statistics on a regular basis:

@daily  /usr/local/bin/python2.7 /path/to/nosetests -v --with-coverage \
        --cover-package=exchange --cover-erase \
        --cover-html --cover-html-dir=/path/to/webdir/coverage --cover-branches \
        exchange.search_tests exchange.models_tests

All you have to do is specify a handful of extra options on the nosetests command line; it’s practically a freebie. Especially useful are the --cover-html and --cover-html-dir options, which tell nosetests to place the HTML coverage reports in a specific directory.

In our case, I created a directory on the webhost, where I can log in and check the report results, which look something like:
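The screenshot of the HTML report is gone, but the accompanying console summary from a coverage run has this general shape (hypothetical module names and numbers):

```
Name                     Stmts   Miss  Cover
--------------------------------------------
exchange.models            412     57    86%
exchange.search            198     31    84%
--------------------------------------------
TOTAL                      610     88    86%
```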


The coverage reports show which Python statements (lines) have been exercised by the unit tests that have been run. Green lines have been run at least once, red lines have not been run, and yellow lines indicate that not all outcomes of a branch have been taken (if you have an “if” statement, you have to exercise both its True and False outcomes; this is the branch coverage that the --cover-branches flag measures). Note, however, that a coverage report does not prove that a piece of code behaves the way you expect, only that it has been run. The unit tests are exclusively responsible for proving behavior.

In any case, I’ve already isolated two issues through the unit tests and am now assured that they will never come back. And as the percentage of statements covered by unit tests continues to increase, I’m sure any remaining issues will shake out. Which is the whole point, isn’t it?