And Now For Something Completely Different

I’ve switched to a new publishing platform. You no longer need to inform state security of your interest in this content.

It seems that the constant reminders of all the wonderful new features that my previous web publishing host had in store for me finally motivated me to set up a somewhat more do-it-yourself publishing arrangement, as behooves someone of my particular talents and skills.


I’m using Pelican now, and from what I’ve seen of it so far, it’s a promising tool. I particularly like their approach to publishing multiple content types; for content that I’m writing new, I can use markdown, but for content that I want to process automatically, it accepts HTML just fine, which means that the conversion process from my previous host has been relatively painless. (Relatively painless, that is. Remember to calibrate pain on the Extract-Transform-Load scale, which starts at 8 or so.)
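
(For the curious: a new post here is now just a Markdown file with a small metadata block at the top, along these lines - the values below are made up for illustration, not this site's actual configuration - while imported legacy posts are plain .html files dropped into the same content directory.)

    Title: An Example Post
    Date: 2012-01-01 10:00
    Category: meta
    Tags: blogging, pelican
    Slug: an-example-post

    The body of the post goes here, written in Markdown.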

One interesting consequence of this change is that you may now access this blog securely if that’s the sort of thing you are interested in.

Another change is that I've made the conscious decision to eliminate the comments section. While over the years I have been lucky to receive many interesting and engaging comments on my posts, I think that the best feedback I've gotten has always been from people writing in their own spaces. So I've removed the temptation of the comment box, mainly to keep the motivation to write something thoughtful from being burned away in a quick riposte.

If you would like to comment on something I’ve said here, now or in the future, just send me an email and I’ll be happy to either post your comments inline, in a new post, or ideally, link to your own published response elsewhere.

Or, of course, ignore it entirely, as is my right as the lord of this particular digital fiefdom.

Sadly, this change makes some of the older posts imported to this blog read slightly oddly, as they invite you to “comment below”, but nobody's been doing that for some time, so I think that overall this change is likely to promote both a better quality of discussion and a better use of everyone's time. The full indictment of how You Kids Today ruined the Internet with Your Horrible Comments, and Back In My Day Things Were Better And So Forth will have to wait for a future article. (Or you could go read any New York Times article with the word “millennials” in the title, as it will say basically the same thing.)

I hope you enjoy the new format.

On "On Women In Tech"

I just read the inimitable Lea Verou's "On Women In Tech", and I have a few thoughts.

(Hello, Internet.  Please don't leave ten thousand horrible comments on this post, or on hers.  Please, just this once.)

First, I should say that Lea's essay declines to name any specific efforts. While that is nice in that it avoids name-calling, it does make it a bit more challenging to construct a specific argument. So, I apologize for the over-general nature of some of my arguments here, but perhaps it's for the best, so we don't get involved in mud-slinging over particular personalities or particular groups. However, I do have to say that some groups are, of course, more effective than others, and not everything I say applies uniformly; but I believe what I say here is generally true.

I really wanted to like Lea's piece.  As someone who is (trying to be, at least) actively involved with outreach efforts, I feel like sometimes those efforts can have a very self-congratulatory vibe, and they could use a bit of honest criticism from within.  This is especially important because much of the "criticism" of feminism comes from … well, let's say, "unsympathetic sources", since I'm sure that if I'm more specific than that I'll activate a particularly unpleasant sort of internet hate machine.

As Lea herself acknowledges, there is a real problem with a lack of women in the software industry. Women are still hugely underrepresented. Women-in-technology groups propose a variety of different solutions to this problem. As someone attacking a solution to a real problem, Lea faces, I think, a greater burden of proof than someone arguing in favor of such a solution; and her criticism falls short in a very important way: a lack of data.

Generally speaking, women-in-technology groups gather all kinds of statistics on both the problem and the efficacy of various solutions. And make no mistake: many of the proposed solutions that Lea doesn't like – women-only or women-preferred events and groups, increasing the visibility of women in leadership roles, and both banning sexualized content from conference presentations and publicly communicating that it has been banned – do work. Whenever I've spoken with my friends who fit (to greater or lesser degrees) into the stereotype that Lea paints of a women-in-technology activist, I've found that they know what they're talking about. Moreover, in the context of Python groups of various stripes, they have the statistics to prove that the things they are trying to do work. It's important to keep in mind that each of these activities addresses a different audience; not all women are the same, and different sub-groups need different things to get involved.

To Lea, and to other women who have read her post and immediately identify with it, one thing that you may want to consider is that these efforts are not about you. As a woman already excelling in the field of software technology, attracting your interest and participation is not as important to these groups as attracting new people: if you want to increase participation, you must, as a simple fact of arithmetic, get those who are not already participating to participate.  This may mean women in the field who just don't participate in the community for some reason, or it may mean getting women into the field who aren't in it at all.  That means those women won't be like you in some important characteristics; for example, in aggregate, women do identify with their gender more strongly than men do.

Despite Lea's distaste for such efforts, I first came to follow her on twitter explicitly because of a "women in technology" effort. I followed her after making a conscious decision to diversify my (unfortunately still mostly male, unfortunately still overwhelmingly white) twitter stream. As a result of that decision, I found a list of prominent women in web technology, where I came across her twitter handle.

Am I continuing to be her follower today just because she's a woman?  No, of course not.  She's an entertaining character with a lot of very interesting stuff to say about a variety of technology - browser front-end issues, mostly centering on CSS - that I don't know a lot about.  As a result of her tweets I've read several of her presentations and I'm much better informed as a result.  This knowledge has been useful to me both professionally and personally.  (And, maybe, if she doesn't completely hate this post, it will have benefited Lea as well.)

This is just one example of a general pattern - if, when we notice a glaring lack of diversity, we make an effort to seek out and include members of an underrepresented group, we frequently find that they have something just as interesting to contribute as the existing over-represented group, if not more so. By simple virtue of being different on one axis, we often find that they are different on other axes as well, and therefore have a more interesting point of view to contribute.

(I feel the need to stress that this is not to say that "all women are X, and therefore if we get more women in technology we will get more much-needed X".  That's reductive and probably factually inaccurate for any given X you might select.  The point is that if you pay attention to lots of people you haven't previously been paying attention to, you will, almost by definition, learn new things.)

The point is, I did not follow Lea because I had a "quota".  She did not claim a spot on my twitter stream that was previously occupied by a better-qualified man.  I just made an effort to follow more women and discovered that there were many interesting people I'd somehow missed out on.  And this worked out quite well for me as now I follow more interesting people.

Wherever institutional groups - conferences, for example - make efforts like this, asking an underrepresented group to participate more, or giving a second look to applications from that underrepresented group, the accusation of "quotas" always tends to follow quickly. Usually it comes from members of the over-represented group who didn't make the cut, complaining that they're being excluded despite being "more qualified", but the fact that Lea is herself a woman doesn't make her claim about "quotas" any more true.

Generally, the attempt to include female speakers in a conference program is not enforced via a "quota", unspoken or not; there are usually more than enough qualified female speakers, who are, for one reason or another, either (A) not applying, or (B) being rejected by a flawed "objective" selection process.  The fact is that it is very, very hard to tell whether a speaker will give a good talk or not, even subjectively, in advance.  There is basically no objective metric, so we can't say that we currently have a pure meritocracy, since we can't even agree on what merit is, for conference speakers.

Additionally, myriad cognitive biases influence our judgment. For example, in an often-repeated story, blind auditions radically (by 50%) improve the chances that a woman will be selected by an orchestra. Is this because all conductors are knuckle-dragging misogynists? Well, okay, some are, but for the most part that's probably not the reason: it's just that we – both genders, in many cultures – are primed, since childhood, to regard women as less capable, and repeated scientific analysis has shown that that bias creeps into our thinking in all kinds of contexts.

(An aside to the dudes in the audience: if you want to be an ally to feminists, the thing to do is never to say "well I'm not a sexist, I would never let gender influence my judgment".  It's to say "I know my judgment might be compromised in ways I can't control, so I'm going to take steps to ensure that doesn't negatively affect anyone".  A good general rule for privileged classes of all types.)

However, even if women were, for some reason, less qualified, and an actual quota were needed to get women on stage at technical conferences, it would still be worth it.  Let me reiterate first though: the women who apply for talks are not generally less qualified, and such quotas are generally not necessary.  One reason that you really don't need a quota for this particular role is that experience is not really a factor in giving a good talk; in fact, it can work against you.

My two favorite talks from a recent technical conference I attended (sorry, no links, since I'm not going to subject the speakers to any fallout from this article, if there is any, but I'm happy to give you a link if you get in touch) were both by women who were relatively new to the community, about being relatively new to the community. Of course many members of the old boys' club gave great talks too (myself included, or so I'd like to think), but there are also always a few talks we've heard before, and always at least a few where the speaker just reads the bullets on their slides about some boring thing like the open source YAML parser that they wrote, while standing motionless and staring down at the podium.

The reason that even quotas would still be worth it, if they were necessary, is that, for the next generation of potential hackers, it is very important that women in roles of technical leadership be seen as normal.  Children very quickly pick up on social cues; this is when they are establishing all those pernicious cognitive biases that I mentioned previously.  Lea herself points this out, saying "If you don’t meet many technical women, your brain tends to pick up the pattern".  In fact the entire final section of her article is about "starting early", and addressing young girls rather than adult women.  It's worth noting that stereotypical "women in technology" programs are, of course, already doing this; it's not an either-or proposition.  But in order to convince young women and girls that this is a normal, sensible thing to do, they need to be able to visualize themselves being successful, and that means there need to be visible, successful older women in the industry.

Finally, let's examine a claim that Lea makes:

"It’s not our industry that has a sexism problem, our society has a sexism problem."

While our society clearly does have a sexism problem, it's also possible that our industry has an especially severe sexism problem. So, does it? Let's go to the numbers; the United States Department of Labor's Bureau of Labor Statistics maintains a handy table, "Employed persons by occupation, sex, and age".

According to that table, the total number of employed men in the USA aged 20 and over is 73,403,000, and the total number of employed women aged 20 and over is 64,640,000. That means 53% of the workforce is male: a bias of three percentage points (disregarding, for the moment, issues of pay equity).

Now let's look at "Computer and mathematical occupations". The number of men in those professions, aged 20 and over, is 2,834,000. The number of women in that same group of occupations, in that same age range, is 972,000. That means that 74% of this workforce is men: a bias of 24 percentage points, or eight times the bias for general employment. So: yes, our industry, in particular, has a sexism problem. But perhaps comparing against general employment is unfair; let's look instead at management, a famously unfavorable field for women, which sports 9,823,000 men and 6,167,000 women aged 20 and over, or 61% men: an 11-point bias. In other words, computer technology is over twice as hostile to women in the USA, statistically speaking, as the historically unfair field of management.
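
(If you'd like to check my arithmetic, it's nothing more than a bit of division; here is the whole computation, using the figures quoted above, as a few lines of Python.)

    # Back-of-the-envelope check of the percentages above.
    # All figures are in thousands, from the BLS table cited earlier.
    def male_share(men, women):
        return 100.0 * men / (men + women)

    print(male_share(73403, 64640))  # all employment, 20 and over: ~53%
    print(male_share(2834, 972))     # computer and mathematical:   ~74%
    print(male_share(9823, 6167))    # management:                  ~61%
    # The "bias" in each case is just the share minus the 50% of parity.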

In closing, I'd like to say that, despite some obviously substantial disagreement, I think that Lea has some good insight into the unintentional negative consequences of some of the strategies that outreach programs for women in technology employ. I hope that in the future, she'll return to the topic without resorting to the tired clichés about quotas, political correctness, "it's society, it's not our industry", and falsely equating the consequences of male and female sexual objectification. There are still many good unanswered questions that she raises, like:

  • How can women avoid being seen as unduly concerned with things like language and propriety, while still drawing a firm line around unacceptable behavior?
  • How can conference organizers ensure that they're not overzealously including talks from women that aren't up to their usual quality standards?
  • What should we do about the perception that women are being unfairly selected when, in fact, this is almost never the case?
  • How can we get the message out to men who are unused to seeing women at their professional events that loudly apologizing only to the only woman in the room for using profanity is probably worse than using profanity in the first place?
  • How can the women-in-technology movement avoid alienating women who are interested in technology for technology's sake, and explicitly because they don't want to be involved with annoying social stuff like being an activist?  What's a good protocol for identifying that way without being seen to repudiate feminism as a whole?
  • How can we get naive but well-meaning men to stop treating all women as ambassadors for their gender, with all the baggage that implies?
I hope that in the future, Lea, and others who agree with her, will take some time to dig deeper into the realities of the women-in-tech movement, and perhaps work with those already in said movement to provide better answers to these questions.  It seems to me that there's more agreement than disagreement here, and that much of what she dislikes is a caricature of the movement, not its essence.

The Twisted Way

One of the things that confuses me most about Twisted is the fact that so many people seem to be confused by things about Twisted.

Much has been written, some of it by me, some of it by other brilliant members of the community, attempting to explain Twisted in lots of detail so that you can use it and understand it and control it to do your bidding. But today, I'd like to try something different: instead of trying to help you figure out how to use Twisted, I will try to help you understand what Twisted is - to aid you in meditating upon its essence, and in understanding how it is a metaphor for software and, if you are truly enlightened, for all life.

Let us contemplate the Twisted Way.

Image Credit: Ian Sane

In the beginning was the Tao.
All things issue from it; all things return to it.
- Tao Te Ching

All information systems are metaphors for the world.  All programs are, at least in small part, systems.  Therefore, every program contains within it a world of thought.  Those thoughts must be developed before they can be shared; therefore, one must have an interesting program before one thinks to do any interesting I/O.

It is less fashionable these days to speak of "object-oriented modeling" than it once was, but mostly because object-oriented design is now so pervasive that no-one needs convincing any more.  Nevertheless, that is what almost all of us do.  When an archetypical programmer in this new millennium sets out to create a program, they will typically begin by creating a Class, and then endowing that Class with Behavior; then, by creating an Instance of that Class, and bestowing interesting Data upon that Instance.

Such an Instance receives (as input) method calls from some other object, and produces (as output) method calls on some other object.  It is a system unto itself, simulating some aspect of human endeavor, computing useful results and dispensing them, all in an abstract vacuum.

But the programmer who produced this artifact desires it to interact with the world: to produce an effect, and therefore to accept Input and dispense Output.

It is at this point that the programmer encounters Twisted.

When you look for it, there is nothing to see.
When you listen for it, there is nothing to hear.
When you use it, it is inexhaustible.

Except that, in fact, nobody ever encounters Twisted this way.  If this is where – and how – you encounter Twisted, then you will likely have great success with it.  But everyone tends to encounter Twisted, like one encounters almost every other piece of infrastructure, in medias res.  Method calls are flying around all over the place in some huge inscrutable system and you just have to bang through the tutorial to figure it all out right now, and it looks super weird.

Over the years, so many questions I've answered about Twisted seem to reduce to: "how do I even get this thing to do anything"?

This is Twisted's great mystery: it does nothing. By itself, it is the world's largest NOP. Its job, purely and simply, is to connect your object to the world. You tell Twisted: listen for connections on this port; when one is made, do this. Make this request, and when it has a response, do that. Listen for email over SMTP; when a message arrives, do the other.

Without your direction, reactor.run will just ... wait.
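
To make that concrete, here is roughly the smallest set of directions you can give it - a sketch of an echo server, with an arbitrary port number:

    from twisted.internet import protocol, reactor

    class Echo(protocol.Protocol):
        def dataReceived(self, data):
            # Our one direction to Twisted: when bytes arrive, send them back.
            self.transport.write(data)

    class EchoFactory(protocol.Factory):
        def buildProtocol(self, addr):
            return Echo()

    reactor.listenTCP(8000, EchoFactory())  # listen for connections on this port,
    reactor.run()                           # then do nothing at all until one arrives.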

The source of most confusion with Twisted, I believe, is that few objects are designed in this idiom.  When we seek to create a program, we feel we must start interacting with it immediately, before it even knows what it is supposed to do.  The seductions of blocking I/O are many and varied.  Any function which appears to merely compute a result can simply be changed to cheat and get its answer by asking some other system instead, with its callers none the wiser.  Even for those of us who know better, these little cheats accumulate and make the program brittle and slow, and force it to be spun out into a thread, or the cold, sparse desert of its own separate process, so it may tediously plod along, waiting for the response to its each and every query.

Thus, desire (for immediate I/O) leads to suffering (of the maintenance programmer).

Return is the movement of the Tao.
Yielding is the way of the Tao.

It doesn't need to be that way, though.  When you create an object, it is best to create it as independently as possible; to test it in isolation; to discretely separate its every interaction with the outside world so that they may be carefully controlled, monitored, intercepted and inspected, one iteration at a time.

All your object needs to do is to define its units of work as atomic, individual functions that it wishes to perform; then, return to its caller and allow it to proceed.
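
A little sketch of what that might look like, with names invented purely for illustration:

    class Thermostat(object):
        """A piece of the problem domain: it reacts to one reading at a time,
        and never asks anything else for data."""

        def __init__(self, target):
            self.target = target
            self.heating = False

        def temperatureChanged(self, reading):
            # One atomic unit of work; whoever calls this decides when.
            self.heating = reading < self.target
            return self.heating

    # In production, a protocol or a timer feeds readings in as they arrive;
    # in a unit test, you simply call temperatureChanged() yourself.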

The Master does their job and then stops.
They understand that the universe is forever out of control.

When you design your objects by contemplating their purpose and making them have a consistent, nicely separated internal model of what they're supposed to represent, Twisted seems less like a straitjacket, contorting your program into some awkward shape. Instead, it becomes a comfortable jacket that your object might slip on to protect itself from the vicissitudes of whatever events may assault it from the network, whether they be the soft staccato of DNS, the confused warbling of SIP or the terrifying roar of IMAP.

Better yet, your object can be a model of some problem domain, and will therefore have a dedicated partner; a different object, a wrapper, whose entire purpose is to translate from a lexicon of network-based events, timers, and Deferred callbacks, into a language that is more directly applicable to your problem domain.  After all, each request or response from the network means something to your application, otherwise it would not have been made; the process of explicitly enumerating all those meanings and recording and documenting them in a module dedicated to that purpose is a very useful exercise.
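
A tiny, hypothetical sketch of such a pairing - a line-oriented protocol whose only job is translation, and a domain object that knows nothing of the network:

    from twisted.protocols import basic

    class Auction(object):
        """The problem domain: it knows about bids, not about TCP."""
        def __init__(self):
            self.highestBid = 0

        def placeBid(self, amount):
            accepted = amount > self.highestBid
            if accepted:
                self.highestBid = amount
            return accepted

    class AuctionProtocol(basic.LineReceiver):
        """The wrapper: it translates lines from the network into the
        vocabulary of the Auction, and the answers back into lines."""
        def __init__(self, auction):
            self.auction = auction

        def lineReceived(self, line):
            verb, amount = line.split()  # e.g. a line like b"BID 150"
            if verb == b"BID":
                accepted = self.auction.placeBid(int(amount))
                self.sendLine(b"ACCEPTED" if accepted else b"REJECTED")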

When your object has such unity of purpose and clarity of function, then Twisted can help manage its stream of events even if the events are not actually coming from a network; abstractions like deferreds, producers, consumers, cooperators and inline callbacks can be used to manipulate timers, keystrokes, and button clicks just as easily as network traffic.

True words aren't eloquent; eloquent words aren't true.
Sages don't need to prove their point;
those who need to prove their point aren't wise.

So, if you are setting out to learn to use Twisted, approach it in this manner: it is not something that will, itself, give your object's inner thoughts structure, purpose and meaning.  It is merely a wrapper; an interstitial layer between your logic and some other system.  The methods it calls upon you might be coming from anywhere.  And indeed, they should be coming from at least one other place: your unit tests.

(With apologies to Geoffrey James and Laozi.)

A Tired Hobgoblin

Alternate (Boring) Title: Why the Twisted coding standard is better than PEP8 (although you still shouldn't care)

People often ask me why Twisted's coding standard – camel case instead of underscores for method names, epytext instead of ReST for docstrings, underscores for prefixes – is "weird" and doesn't, for example, follow PEP 8.

First off, I should say that the Twisted standard actually follows quite a bit of PEP 8. PEP 8 is a long document with many rules, and the Twisted standard is compatible with it in large part - for example, with pretty much all of the recommendations in the section on pointless whitespace.
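
Concretely, both standards prefer the first pair of lines in this snippet to the second:

    d = {"eggs": 2}          # whitespace both standards are happy with
    print(d["eggs"])

    d = { "eggs" : 2 }       # the pointless variety both recommend against
    print(d[ "eggs" ])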

Also, the primary reason that Twisted differs at all from the standard practice in the Python community is that the "standard practice" was almost all developed after Twisted had put its practices in place. PEP 8 was created on July 5, 2001; at that point, Twisted had already existed for some time, and had officially checked in its first coding standard just a smidge over two months earlier, on May 2, 2001.

That's where my usual explanation ends.  If you're making a new Python project today, unless it is intended specifically as an extension for Twisted, you should ignore the relative merits of these coding standards and go with PEP 8, because the benefits of consistency generally outweigh any particular benefits of one coding standard or another.  Within Twisted, as PEP 8 itself says, "consistency within a project is even more important", so we're not going to change everything around just for broader consistency, but if we were starting again we might.

But.

There seems to be a sticking point around the camelCase method names.

After ten years of fielding complaints about how weird and gross and ugly it is – rather than just how inconsistent it is – to put method names in camel case, I feel that it is time to speak out in defense of the elegance of this particular feature of our coding standard.  I believe that this reaction is based on Python programmers' ancestral memory of Java programs, and it is as irrational as people disliking Python's blocks-by-indentation because COBOL made a much more horrible use of significant whitespace.

For starters, camelCase harkens back to a long and venerable tradition.  Did you camelCase haters-because-of-Java ever ask yourselves why Java uses that convention?  It's because it's copied from the very first object-oriented language.  If you like consistency, then Twisted is consistent with 34 years of object-oriented programming history.

Next, camelCase is easier to type.  For each word-separator, you have only to press "shift" and the next letter, rather than shift, minus, release shift, next letter.  Especially given the inconvenient placement of minus on US keyboards, this has probably saved me enough time that it's added up to at least six minutes in the last ten years.  (Or, a little under one-tenth the time it took to write this article.)

Method names in mixedCase are also more consistent with CapitalizedWord class names.  If you have to scan 'xX' as a word boundary in one case, why learn two ways to do it?

Also, we can visually distinguish acronyms in more contexts in method names.  Consider the following method names:
  • frog_blast_the_vent_core
  • frogBLASTTheVentCore
I believe that the identification of the acronym improves readability. frog_blast_the_vent_core is just nonsense, but frogBLASTTheVentCore makes it clear that you are doing sequence alignment on frog DNA to try to identify variations in core mammalian respiration functions.

Finally, and this is the one that I think is actually bordering on being important enough to think about, Twisted's coding standard sports one additional feature that actually makes it more expressive than underscore_separated method names.  You see, just because the convention is to separate words in method names with capitalization, that doesn't mean we broke the underscore key on our keyboards.  The underscore is used for something else: dispatch prefixes.

Ironically, since the first letter of a method must be lower case according to our coding standard, this conflicts a little bit with the previous point I made, but it's still a very useful feature.

The portion of a method name before an underscore indicates what type of method it is.  So, for example:
  • irc_JOIN - the "irc_" prefix on an IRC client or server object indicates that it handles the "JOIN" message in the IRC protocol
  • render_GET - the "render_" prefix on an HTTP resource indicates that this method is processing the GET HTTP method.
  • remote_loginAnonymous - the "remote_" prefix on a Perspective Broker Referenceable object indicates that this is the implementation of the PB method 'loginAnonymous'
  • test_addDSAIdentityNoComment - the "test_" prefix on a trial TestCase indicates that this is a test method that should be run automatically. (Although for historical reasons and PyUnit compatibility the code only actually looks at the "test" part.)
The final method name there is a good indication of the additional expressiveness of this naming convention.  The underscores-only version – test_add_dsa_identity_no_comment – depends on context.  Is this an application function that is testing whether we can add a ... dissah? ... identity with no comment?  Or a unit test?  Whereas the Twisted version is unambiguous: it's a test case for adding a D.S.A. identity with no comment.  It would be very odd, if not a violation of the coding standard, to name a method that way outside of a test suite.
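
To see why the prefix is more than decoration, here is a sketch of how such a dispatch might be wired up; the names are invented for illustration, and Twisted's real implementations are more elaborate than this:

    class CommandDispatcher(object):
        """Routes incoming commands to methods named cmd_<COMMAND>."""

        prefix = "cmd_"

        def dispatch(self, command, *args):
            handler = getattr(self, self.prefix + command.upper(), None)
            if handler is None:
                return self.unknownCommand(command)
            return handler(*args)

        def unknownCommand(self, command):
            raise NotImplementedError("no handler for %r" % (command,))

    class Greeter(CommandDispatcher):
        def cmd_HELLO(self, name):
            # Only methods carrying the prefix are reachable from incoming
            # input; helpers without it can never be invoked by accident.
            return "hello, %s" % (name,)

    # Greeter().dispatch("hello", "world")  ->  'hello, world'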

Hopefully this will be the last I'll say on the subject.  Again, if you're starting a new Python project, you should really just go ahead and use PEP 8, this battle was lost a very long time ago and I didn't even really mind losing it back then.  Just please, stop telling me how ugly and bad this style is.  It works very nicely for me.

The Lexicology of Personal Development

These days, everybody talks about geeks.  Geek chic, the "age of the geek"; even the New York Times op-ed page has been talking about the rise of "geeks" for years.  Bowing to popular usage, even I use the word as it's currently being bandied about.  But I think that the real success story is that of nerds.

A pernicious habit I've noticed in the last decade of the growth of geek culture is that it has developed a sort of cargo-cult of meritocracy. Within the self-identified "geek" community, there's a social hierarchy based on all kinds of ridiculous pop-culture fetishism. Who knows the most Monty Python non sequiturs? Who knows the most obscure Deep Space Nine trivia? This is hardly a new thing – William Shatner famously complained about it on Saturday Night Live in 1986 – but the Internet has been accelerating the phenomenon tremendously. People who had a difficult time in their teens find each other as adults through some fan-club interest group, and then they make fast friends who had similar social problems. Soon, since that shared interest is how they know all their friends, they spend all their time in the totally fruitless pursuit of more junk related to some frivolous obsession. That can be okay, almost healthy even, if the focus of this accumulation is a productive hobby. However, if it's just a pop-culture franchise (Harry Potter, Star Trek, World of Darkness), what was originally a liberating new social landscape can rapidly turn into a suffocating, stale dead-end for personal development.

So I always feel a twinge when I identify myself as a "geek".  I usually prefer to say that I am - or at least aspire to be - a nerd.

A nerd is someone who is socially awkward because they are more thoughtful, introspective, intelligent or knowledgeable than their peers. They notice things that others don't, and it makes interaction difficult. This is especially obvious in younger nerds, who are a little above their age group in intelligence, but not quite intelligent enough to know when to keep their mouths shut to avoid ostracism. But even for those who have learned to keep a lid on their less-popular observations, it's tough to constantly censor yourself, and it makes interaction with your peers less enjoyable.

A geek is someone who is socially awkward because they are obsessed with topics that the mundanes among us just don't care about that much. They collect things, whether it's knowledge, games, books, toys, or technology.  Faced with a popular science fiction movie, a nerd might want to do the math to see whether the special effects are physically plausible, but a geek will just watch it a dozen times to memorize all the lines.

A dork is socially awkward simply because they aren't all that pleasant to be around. Nerds and geeks have trouble interacting with others because they're lost in their own little worlds of intellectual curiosity or obsession; dorks are awkward because, let's face it, maybe they're a little stupid, a little mean, and just not that interesting. A dork is unsympathetic.

By way of a little research for this post, I discovered that I'm apparently not the only one who has this impression of the definitions, and even Paul Graham seems to agree with me on word choice.  Still: from here on out, these are the correct definitions of the words, thank you very much.

Maybe you've heard these definitions before, and this is all old news. And yes, these are words for the sort of tedious taxonomy of people that fictional teenagers in high-school movies engage in. It's obviously not karmically healthy to start labeling people "nerd", "dork", and "geek" and then writing them off as such. So, you might ask, why do I bring it up?

Because you, like me, are almost certainly a nerd, a geek, and a dork.  And, as you might have inferred from my definitions above, nerds are better than geeks, and dorks are worse than both.

First, consider your inner nerd.  It's good to be intellectually curious, to stretch your cognitive abilities in new and interesting ways, to learn things about how systems work.  Physical systems, social systems, technological systems: it's always good to know more.  It's even good to be curious to the point of awkwardness, especially if you're a kid who is concerned about awkwardness; don't worry about it, it'll make you more interesting later.  It's good to foster any habits which are a little nerdy.

Second, your inner geek.  It's okay to enjoy things, even to obsess about them a little bit, but I think that our culture is really starting to overdo this.  Geeks are presented in popular media as equally, almost infinitely, obsessed with Star Wars, calculus, Star Trek, computer security, and terrible food (cheese whiz, sugary soda brands, etc).  No real people actually have time for all this stuff.  At some point, you have to choose whether you're going to memorize Maxwell's or Kosinski's equations.

One way that you can keep your inner geek in check is to always ask yourself the question: am I watching this movie / playing this game / reading this book because I actually enjoy it and I think it's worthwhile, or am I just trying to make myself conform to some image of myself as someone who knows absolutely everything about this one little cultural niche?

There are people who will treat being a fan of something that someone else created as morally equivalent to (or, in a sense, even better than) creating something yourself, and those people are not doing you any favors. Do not pay attention to them.

Of course, there's some overlap. People who like playing with systems in real life enjoy the fluffier, more lightweight intellectual challenges of playing with the rules of fictional universes, especially the ones from speculative fiction. When I was a kid, I went to a couple of Star Trek conventions, and let me tell you, there were some legit nerds there: astrophysicists, rocket scientists, and experimental chemists, all excitedly talking about how they were inspired to pursue their careers by fiction of various kinds.

So go ahead, take a break, and geek out. Just don't tell yourself that it's anything other than for fun.

Finally, your inner dork.

As you're enthusiastically cultivating your nerdiness and carefully managing your geekiness, you will be accumulating a little bit of dorkiness as you go: at some point you have to decide whether to fulfill some minor social obligation or to spend that time learning a new thing (or re-watching your favorite movie). You have to decide whether to restrain yourself so you can listen to your friend talk about a rough day at their job, or to start spouting facts about the progress of the repairs on the Large Hadron Collider.

Sometimes, on balance, it's acceptable to be a little bit inconsiderate in the pursuit of something more important. People worth being friends with will see that and understand. Heck, practically every movie plot these days puts at least one awkward and abrasive nerd in a sympathetic and even heroic position. But be careful: once you decide that social graces are your lowest priority, it's a hop, skip, and a jump from being a lovable but absent-minded genius to being a blathering blowhard who just will not shut up about some tedious Riemannian manifold crap that nobody cares about, even when they've just been told that somebody died.

The goal of the nerd or the geek, after all, is not to be awkward; it's easy to forget sometimes that that is an unintentional and unpleasant side effect of the good parts of those attributes.  Being a dork is just bad.  After all, if you're so smart, why aren't you nice?