Three

I am very, very bad at estimating how long software will take to write.

When I was very young, my mother was concerned that I never laughed or smiled, and having forgotten to pre-load my positronic net with the "humor" module, she realized she would have to do some work from scratch. I am told that the original transcript went something like this.
Mom: Do you know how humor works?
Me: No.
Mom: I am going to tell you a joke, then. It is one of the first jokes that my brother used to tell.
Me: Okay.
Mom: How many balls of string does it take to get to the moon?
Now, my mother actually kept balls of yarn in various places around the house, and I had seen the moon, so this didn't strike me as very funny. I thought about how big balls of yarn were, how surprisingly long they were when unrolled, and how slowly they got smaller. Then I attempted to mentally estimate the distance to the moon, in terms of how quickly the balls unrolled, how quickly they got smaller, pictures in books of the relationship between the moon and the earth, and how far away other things I had seen were. I don't remember the rest of the conversation, but I distinctly remember the mental image that I built during this process, as it has stayed with me during the years. It looked like this:
Moon, Earth, and balls of Yarn
and so I replied, without a trace of irony,
Me: Three.
My mother thought this was hilarious, so my initial understanding of humor was that I should run up to everyone I met and say: "howmanyballsofstringdoesittaketogettothemoon?doyougiveupyet?THREE!HAHAHAHAHAHAHAHAHAHA". Reading doc/fun/Twisted.Quotes in the Twisted distribution can show you how little it's progressed since then.

When I estimate programming tasks, I still have a sensation similar to the one I had when I was 2 years old, building that little picture in my head. Then, I grossly underestimated because I had no mapping between astronomical distances and inches; I didn't know what units distance was measured in among the stars. Now, I grossly underestimate because I don't know what unit you can measure programming effort in. It's not "hours", because I can't reason about those - one does not do a uniform amount of work within one hour on a program, especially since several hours are spent thinking. I know various ways to measure finished programs, and I know of various ways to measure programs by specifying them to death - but neither of these gives me an accurate estimate when I want it, which is to say, before work has begun and a great deal of resources have been invested. In my experience it is harder, and takes longer, to accurately estimate (in hours) how long a program will take than to just write it in the first place; and even if you do go through that process, you can't estimate how long the estimation will take (and the estimation process cheats, by stealing work from the programming process so that it is shorter).

All this thinking doesn't do anything to make the need for good estimates go away, though. So how do you tell how big, or how hard, a program is, without first writing the program several times and getting lots of different people to do it? And when you do know how hard it is, what units do you express it in?

blogging in the park

As many of you know, I just returned from vacation. My first day back was kind of crappy, so let's just pretend that didn't happen. The second was pretty good though! I've had relatively few frustrations with technology, and I've managed to start getting back in the groove work-wise.

My summer vacation was relaxing, but it wasn't fun. I didn't really go anywhere, but I caught up on sleep (LOTS of sleep) and I didn't stress out about anything. I didn't even work on my hobby projects very much, and I didn't do more than talk about Imagination for an hour or so. I did go to New York to see my father, my family and some friends, but I got back here in time to avoid the RNC. That was what I did with my summer vacation and I ate hibachi and tenth was there too. The end.

Evil Epilogue:

Today I finally got software suspend to work on my Linux laptop. So, of course, what does any good hacker do when he has done something like this? Take it for a test drive!

What better way to test-drive this than to go warwalking through my neighborhood. I was lucky enough to find not just an open access point - not just a high-bandwidth connection - but a linksys device with the password still set to the default and adminnable through wifi! The connection appeared to be idle, so I helped myself to a heaping helping of bittorrent ports and began to make merry in the park at midnight.

So now I'm writing some specs, playing some nethack, blogging, downloading, and generally having a good time in the park at midnight. Some late-night joggers have given me some really weird looks.

The best part of this is that I am actually looking at the message "Hello glyph, the elven Wizard, welcome back to NetHack! You are lucky! Full moon tonight." while there is actually a visible full moon directly in front of me.

And I'm thinking... wouldn't it be cool if I had a backpack full of solar-powered computers with wifi cards and repeaters that I could just sprinkle around the country... building a redundant, distributed filesharing overlay network on accidental connections... are there any such devices available for the average consumer-level evil genius?

I can't believe I'm doing this.

Blame my sister.

Enzyme
You are an enzyme. You are powerful, dark,
variable, and can change many things at your
whim...even when they're not supposed to be
changed. Bad you. You can be dangerous or
wonderful; it's your choice.

Which Biological Molecule Are You?
brought to you by Quizilla

today, linux is awesome

A brief list of things I can do with my new laptop in Linux:
  • Connect USB hardware (keyboard, mouse, camera)

  • Display to multiple screens (internal flatscreen and desktop flatscreen at a different resolution) simultaneously

  • Connect to wireless networks, using a card which is not supported by Linux, by loading the Windows NT driver for my card

  • Connect to wired networks.

  • Watch movies.

  • Play OpenGL games with good hardware acceleration. (e.g., Neverwinter Nights)

  • Play music.

  • Use voice-over-IP, courtesy of Divmod and Shtoom

  • Get accurate reports of remaining battery life.

  • Conserve battery power when I'm not using the processor's full capacity.


Last time I had a laptop, getting these things working would have been a 6-month project, minimum. This time things are working in less than a week, even given a really stupid mistake that hosed my installation completely and forced me to re-do all the configuration work. It really blows my mind. The only thing that I really haven't gotten working that I have in Windows so far is software suspend, or "hibernate", and I know exactly what I need to do (but haven't yet because of the annoyance of a kernel compile).

I hope it works when I try it next, because if it does, I'll have a portable, dual-boot, dual-display machine that can swap from Windows to Linux in a matter of seconds.

Linux still has a little ways to go in the "automatically detecting your hardware" department, but at least things work more or less the way they say they should these days, if you spend a moment with the appropriate documentation. I'm really pleased.

Python's Secret Macro Mechanism

JP commented on the @decorator syntax recently checked in to a Python alpha. I agree with him. Moreover, despite the fact that Guido has pronounced such agreement futile, I agree with Jim Fulton, and I think that this should be a library issue, not a language issue. (Does anyone else think it's peculiar that the authors of two of the most popular chunks of free Python software, who have a very limited history of agreeing with each other, agree on disagreeing with Python's designer about language features like this?)

The thing that's most disappointing to me in this is that I had finally come around to accepting the community's rationale against macros: that it was too dangerous to allow arbitrary changes to Python's syntax, because it would splinter Python into confusing local dialects and make the resulting language harder to learn and harder to document.

The need for macros doesn't go away, though, and technical necessity becomes distorted by social pressures. I believe that Python could benefit from a lot of different syntax additions, most notably block syntax and continuation syntax, to facilitate programming in the asynchronous model that Twisted uses. I can't add these myself, though, because Python doesn't provide the necessary hooks, and I don't have the patience to agitate on python-dev until my pet feature is added, unlike some people.

I haven't run the numbers, but I imagine that something like one out of every ten thousand methods I write is decorated in some way. Closer to one in five is an asynchronous callback. Why is the extra convenience of one-character syntax necessary for a task this uncommon? Apparently because other people in completely different application domains find it more common than I do, and those people are more willing to spill gallons of virtual ink complaining about the inconvenience and unreadability of writing "decorate(hello)" vs. "@hello".
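To be clear about what all the ink is being spilled over: the new syntax is pure sugar for re-binding a name after the function is defined. This sketch (with a made-up `hello` decorator, not any real library's) shows the two equivalent spellings:

```python
def hello(f):
    """A trivial decorator: tag the function it wraps and return it."""
    f.greeted = True
    return f

# The new one-character syntax:
@hello
def newStyle():
    pass

# ...is exactly equivalent to the spelling that has always worked:
def oldStyle():
    pass
oldStyle = hello(oldStyle)
```

Both functions end up with the same `greeted` attribute; the only difference is where the decoration appears on the page.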

I think it is a bigger deal that I have to write ".addCallback(lambda x: doSomething())" rather than "wait; doSomething()", but I don't have the fortitude to convince Guido that "something is better than nothing" in this case. I have been hoping for years that "something" would arrive in the language.
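For anyone who hasn't written Twisted code, this is the shape of the callback style I mean. The `Deferred` here is a minimal stand-in I've written for illustration, not Twisted's actual implementation, and `fetchValue` is a hypothetical asynchronous operation:

```python
class Deferred:
    """A minimal stand-in for Twisted's Deferred, just enough to show
    the addCallback chaining style (no errbacks, no chaining of Deferreds)."""
    def __init__(self):
        self._callbacks = []
        self._called = False
        self._result = None

    def addCallback(self, f):
        if self._called:
            # Result already arrived; run the callback immediately.
            self._result = f(self._result)
        else:
            self._callbacks.append(f)
        return self

    def callback(self, result):
        # Fire the Deferred, running any callbacks queued so far.
        self._called = True
        self._result = result
        for f in self._callbacks:
            self._result = f(self._result)

def fetchValue():
    # Pretend this kicks off some asynchronous work that eventually
    # produces 42; here it fires synchronously for simplicity.
    d = Deferred()
    d.callback(42)
    return d

results = []
# The style I'm complaining about: every "next step" is a lambda or
# named function handed to addCallback, instead of straight-line code.
fetchValue().addCallback(lambda x: results.append(x * 2))
```

The hypothetical "wait; doSomething()" spelling would let that last line read as ordinary sequential code, which is exactly the kind of thing a real macro or block syntax could provide.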

These kinds of ad-hoc language modifications strike me as more dangerous than a real, powerful macro system, because they don't just pollute the language for some people; they pollute it for everyone at once, on account of some people's needs. You have to hack Guido's brain, not the interpreter, to implement your macros, which is both more time-consuming and more detrimental to the community as a whole. Macros increase complexity locally and utility locally; a static language decreases complexity globally and utility globally; but every time someone successfully pushes through a feature like this, they increase complexity globally but utility only locally. It's the worst of both worlds.

Of course I'm exaggerating. This feature in particular is not very worrisome, but the trend it indicates is.

The strange thing is that I really don't understand Python's design when things like this get bolted on to the side. Every time I think I've finally got the "pythonic" vibe something weird like this happens. I suppose that's what makes this feature so bad to me. I don't know why I am disagreeing with Guido, because I don't understand the aesthetic he's using to make this decision. From what I can tell just using the language and reading python-dev, it's more or less random.

I hope I'm wrong. I was wrong about the new object model (except __class__ mutation!), and I'm grateful for that every day. Still, even after reading a lot of the ranting, it seems to me that the @decorator syntax is being added because it's easy, not because it's important.