Mac Python Distribution Post Updated for Catalina and Notarization
Notarize your Python apps for macOS Catalina.
I previously wrote a post about shipping a PyGame app to users on macOS. It’s now substantially updated for the new notarization requirements in Catalina. I hope it’s useful to somebody!
Toward a “Kernel Python”
The life changing magic of a minimal standard library.
Prompted by Amber Brown’s presentation at the Python Language Summit last month, Christian Heimes has followed up on his own earlier work on slimming down the Python standard library, and created a proper Python Enhancement Proposal, PEP 594, for removing obviously obsolete and unmaintained detritus from the standard library.
PEP 594 is great news for Python, and in particular for the maintainers of its standard library, who can now address a reduced surface area. A brief trip through the PEP’s rogues’ gallery of modules to deprecate or remove1 is illuminating. The Python standard library contains plenty of useful modules, but it also hides a veritable necropolis of code, a towering monument to obsolescence, threatening to topple over on its maintainers at any point.
However, I believe the PEP may be approaching the problem from the wrong direction. Currently, the standard library is maintained in tandem with, and by the maintainers of, the CPython runtime. Large portions of it are included simply in the hope that they might be useful to somebody. In the aforementioned PEP, you can see this logic at work in the defense of the colorsys module: why not remove it? “The module is useful to convert CSS colors between coordinate systems. [It] does not impose maintenance overhead on core development.”
There was a time when Internet access was scarce, and maybe it was helpful to pre-load Python with lots of stuff, so that it all came pre-packaged with the Python binaries on the CD-ROM you had when you first started learning.
Today, however, the modules you need to convert colors between coordinate systems are only a pip install away. A bigger standard library is just more to download before you can get started.
Why Didn’t You Review My PR?
So let’s examine that claim: does a tiny module like colorsys “impose maintenance overhead on core development”?
The core maintainers have enough going on just trying to maintain the huge and ancient C codebase that is CPython itself. As Mariatta put it in her North Bay Python keynote, the most common question that core developers get is “Why haven’t you looked at my PR?” And the answer? It’s easier to not look at PRs when you don’t care about them. This from a talk about what it means to be a core developer!
One might ask whether Twisted has the same problem. Twisted is a big collection of loosely-connected modules too; a sort of standard library for networking. Are clients and servers for SSH, IMAP, HTTP, TLS, et al. all a bit much to try to cram into one package?
I’m compelled to reply: yes. Twisted is monolithic because it dates back to a similar historical period as CPython, where installing stuff was really complicated. So I am both sympathetic and empathetic towards CPython’s plight.
At some point, each sub-project within Twisted should ideally become a separate project with its own repository, CI, website, and of course its own more focused maintainers. We’ve been slowly splitting out projects already, where we can find a natural boundary. Some things that started in Twisted, like constantly and incremental, have been split out; deferred and filepath are in the process of getting that treatment as well. Other projects absorbed into the org continue to live separately, like klein and treq. As we figure out how to reduce the overhead of setting up and maintaining the CI and release infrastructure for each of them, we’ll do more of this.
But is our monolithic nature the most pressing problem, or even a serious problem, for the project? Let’s quantify it.
As of this writing, Twisted has 5 outstanding un-reviewed pull requests in our review queue. The median time a ticket spends in review is roughly four and a half days.2 The oldest ticket in our queue dates from April 22, which means it’s been less than 2 months since our oldest un-reviewed PR was submitted.
It’s always a struggle to find enough maintainers and enough time to respond to pull requests. Subjectively, it does sometimes feel like “Why won’t you review my pull request?” is a question we do still get all too often. We aren’t always doing this well, but all in all, we’re managing; the queue hovers between 0 at its lowest and 25 or so during a bad month.
By comparison to those numbers, how is core CPython doing?
Looking at CPython’s keyword-based review queue, we can see that there are 429 tickets currently awaiting review. The oldest PR awaiting review hasn’t been touched since February 2, 2018, which makes it almost 500 days old.
How many are interpreter issues and how many are stdlib issues? Clearly review latency is a problem, but would removing the stdlib even help?
For a quick and highly unscientific estimate, I scanned the first (oldest) page of PRs in the query above. By my subjective assessment, on this page of 25 PRs, 14 were about the standard library, 10 were about the core language or interpreter code; one was a minor documentation issue that didn’t really apply to either. If I can hazard a very rough estimate based on this proportion, somewhere around half of the unreviewed PRs might be in standard library code.
So the first reason the CPython core team needs to stop maintaining the standard library is that they literally don’t have the capacity to maintain the standard library. Or to put it differently: they aren’t maintaining it, and what remains is to admit that and start splitting it out.
It’s true that none of the open PRs on CPython are in colorsys.3 It does not, in fact, impose maintenance overhead on core development. Core development imposes maintenance overhead on it. If I wanted to update the colorsys module to be more modern - perhaps to have a Color object rather than a collection of free functions, perhaps to support integer color models - I’d likely have to wait 500 days, or more, for a review.
As a result, code in the standard library is harder to change, which means its users are less motivated to contribute to it. CPython’s unusually infrequent releases also slow down the development of library code and decrease the usefulness of feedback from users. It’s no accident that almost all of the modules in the standard library have actively maintained alternatives outside of it: it’s not a failure on the part of the stdlib’s maintainers. The whole process is set up to produce stagnation in all but the most frequently used parts of the stdlib, and that’s exactly what it does.
New Environments, New Requirements
Perhaps even more important is that bundling CPython together with the definition of the standard library privileges CPython itself, and the use-cases that it supports, above every other implementation of the language.
Podcast after podcast after podcast after keynote tells us that in order to keep succeeding and expanding, Python needs to grow into new areas: particularly web frontends, but also mobile clients, embedded systems, and console games.
These environments require one or both of:
- a completely different runtime, such as Brython, or MicroPython
- a modified, stripped down version of the standard library, which elides most of it.
In all of these cases, determining which modules have been removed from the standard library is a sticking point. They have to be discovered by a process of trial and error; notably, a process completely different from the standard process for determining dependencies within a Python application. There’s no install_requires declaration you can put in your setup.py that indicates that your library uses a stdlib module that your target Python runtime might leave out due to space constraints.
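For illustration only - nothing like this exists today - here is roughly what such a declaration could look like if stdlib modules were split out and published as ordinary packages; the package names below are purely hypothetical:

```python
# Hypothetical sketch: if colorsys and wave were ordinary packages on a
# package index, a library could declare them like any other dependency,
# and a stripped-down runtime would know exactly what to install.
from setuptools import setup

setup(
    name="mylib",
    install_requires=[
        "colorsys",  # hypothetical package, not assumed to be in the runtime
        "wave",      # hypothetical package
    ],
)
```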
You can have this problem even if all you ever use is the standard python on your Linux installation. Even server- and desktop-class Linux distributions have the same need for a more minimal core Python package, and so they already chop up the standard library somewhat arbitrarily. This can break the expectations of many Python codebases, and result in bugs where even pip install won’t work.
Take It All Out
How about the suggestion that we should do only a little a day? Although it sounds convincing, don’t be fooled. The reason you never seem to finish is precisely because you tidy a little at a time. [...] The ultimate secret of success is this: If you tidy up in one shot, rather than little by little, you can dramatically change your mind-set.
Kondō, Marie. “The Life-Changing Magic of Tidying Up” (p. 15-16)
While incremental slimming of the standard library is a step in the right direction, incremental change can only get us so far. As Marie Kondō says, when you really want to tidy up, the first step is to take everything out so that you can really see everything, and put back only what you need.
It’s time to thank those modules which do not spark joy and send them on their way.
We need a “kernel” version of Python that contains only the most absolutely minimal library, so that all implementations can agree on a core baseline that gives you a “python”, and applications, even those that want to run on web browsers or microcontrollers, can simply state their additional requirements in terms of requirements.txt.
Now, there are some business environments where adding things to your requirements.txt is a fraught, bureaucratic process, and in those places, a large standard library might seem appealing. But “standard library” is a purely arbitrary boundary that the procurement processes in such places have drawn, and an equally arbitrary line may be easily drawn around a binary distribution. So it may indeed be useful for some CPython binary distributions - perhaps even the official ones - to still ship with a broader selection of modules from PyPI. Even for the average user, in order to use it for development, at the very least, you’d need enough stdlib stuff that pip can bootstrap itself, to install the other modules you need!
It’s already the case, today, that pip is distributed with Python, but isn’t maintained in the CPython repository. What the default Python binary installer ships with is already a separate question from what is developed in the CPython repo, or what ships in the individual source tarball for the interpreter.
In order to use Linux, you need bootable media with a huge array of additional programs. That doesn’t mean the Linux kernel itself is in one giant repository, where the hundreds of applications you need for a functioning Linux server are all maintained by one team. The Linux kernel project is immensely valuable, but functioning operating systems which use it are built from the combination of the Linux kernel and a wide variety of separately maintained libraries and programs.
Conclusion
The “batteries included” philosophy was a great fit for the time when it was created: a booster rocket to sneak Python into the imagination of the programming public. As the open source and Python packaging ecosystems have matured, however, this strategy has not aged well, and like any booster, we must let it fall back to earth, lest it drag us back down with it.
New Python runtimes, new deployment targets, and new developer audiences all present tremendous opportunities for the Python community to soar ever higher.
But to do it, we need a newer, leaner, unburdened “kernel” Python. We need to dump the whole standard library out on the floor, adding back only the smallest bits that we need, so that we can tell what is truly necessary and what’s just nice to have.
I hope I’ve convinced at least a few of you that we need a kernel Python.
Now: who wants to write the PEP?
Acknowledgments
Thanks to Jean-Paul Calderone, Donald Stufft, Alex Gaynor, Amber Brown, Ian Cordasco, Jonathan Lange, Augie Fackler, Hynek Schlawack, Pete Fein, Mark Williams, Tom Most, Jeremy Thurgood, and Aaron Gallagher for feedback and corrections on earlier drafts of this post. Any errors of course remain my own.
1. sunau, xdrlib, and chunk are my personal favorites. ↩
2. Yeah, yeah, you got me, the mean is 102 days. ↩
3. Well, as it turns out, one is on colorsys, but it’s a documentation fix that Alex Gaynor filed after reviewing a draft of this post, so I don’t think it really counts. ↩
Tips And Tricks for Shipping a PyGame App on the Mac
A quick and dirty guide to getting that little PyGame hack you did up and running on someone else’s Mac.
I have written a tool you can actually use rather than copying and pasting shell-script snippets, which you can read about in a new post here. I've done my best to update the accuracy of the information below as well, particularly with respect to which Python you want and why, but it is a much older post and I could easily have missed something.
I’ve written and spoken at some length about shipping software in the abstract. Sometimes I’ve even had the occasional concrete tidbit, but that advice wasn’t really complete.
In honor of Eevee’s delightful Games Made Quick???, I’d like to help you package your games even quicker than you made them.
Who is this for?
About ten years ago I made a prototype of a little PyGame thing which I wanted to share with a few friends. Building said prototype was quick and fun, and very different from the usual sort of work I do. But then, the project got just big enough that I started to wonder if it would be possible to share the result, and thus began the long winter of my discontent with packaging tools.
I might be the only one, but... I don’t think so. The history of PyWeek, for example, looks to be a history of games distributed as Github repositories, or, at best, apps which don’t launch. It seems like people who participate in game jams with Unity push a button and publish their games to Steam; people who participate in game jams with Python wander away once the build toolchain defeats them.
So: perhaps you’re also a Python programmer, and you’ve built something with PyGame, and you want to put it on your website so your friends can download it. Perhaps many or most of your friends and family are Mac users. Perhaps you tried to make a thing with py2app once, and got nothing but inscrutable tracebacks or corrupt app bundles for your trouble.
If so, read on and enjoy.
What changed?
If things didn’t work for me when I first tried to do this, what’s different now?
- the packaging ecosystem in general is far less buggy, and py2app’s dependencies, like setuptools, have become far more reliable as well. Many thanks to Donald Stufft and the whole PyPA for that.
- Binary wheels exist, and the community has been getting better and better at building self-contained wheels which include any necessary C libraries, relieving the burden on application authors to figure out gnarly C toolchain issues.
- The PyGame project now ships just such wheels for a variety of Python versions on Mac, Windows, and Linux, which removes a whole huge pile of complexity both in generally understanding the C toolchain and specifically understanding the SDL build process.
- py2app has been actively maintained and many bugs have been fixed - many thanks to Ronald Oussoren et al. for that.
- I finally broke down and gave Apple a hundred dollars so I can produce an app that normal humans might actually be able to run.
There are still weird little corner cases you have to work around - hence this post - but mostly this is the story of how years of effort by the Python packaging community have resulted in tools that are pretty close to working out of the box now.
Step 0: Development Setup
You will also want to use a virtual environment for development. Finally: pip install all your requirements into your virtualenv, including PyGame itself.
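If you haven’t set one up before, a minimal sketch of this step might look like the following; the directory name and the inclusion of py2app (which the later build step uses) are my assumptions:

```console
# Create and activate a virtualenv, then install the project's requirements.
# "venv" is an arbitrary directory name; add your own game's dependencies too.
python3 -m venv venv
. venv/bin/activate
pip install pygame py2app
```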
Step 1: Make an icon
All good apps need an icon, right?
When I was young, one would open up ResEdit (or, as the years went by, Resorcerer, MPW, CodeWarrior, Project Builder, Icon Composer, or Xcode) and create a new ICON resource (then a cicn resource, then a .tiff file, then a .icns file). Nowadays there’s some weird opaque stuff with xcassets files and Contents.json and “Copy Bundle Resources” in the default Swift and Objective C project templates, and honestly I can’t be bothered to keep track of what’s going on with this nonsense any more.
Luckily the OS ships with the macOS-specific “scriptable image processing system”, which can helpfully convert an icon for you. Make yourself a 512x512 PNG file in your favorite image editor (with an alpha channel!) that you want to use as your icon, then run it something like this:
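What follows is a sketch of that invocation; the file names are placeholders for your own.

```console
# Convert a 512x512 PNG (with an alpha channel) into an .icns icon file.
sips -s format icns icon-512.png --out icon.icns
```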
Run it somewhere in your build process, to produce an icon in the appropriate format.
There’s also one additional wrinkle with PyGame: once you’ve launched the game, PyGame helpfully assigns the cute, but ugly, default PyGame icon to your running process. To avoid this, you’ll need a couple of lines somewhere in your initialization code, somewhere before pygame.display.init (or, for that matter, pygame.display.<anything>).
Obviously this is pretty Mac-specific so you probably want this under some kind of platform-detection conditional, perhaps this one.
Step 2: Include All The Dang Files, I Don’t Care About Performance
Unfortunately py2app still tries really hard to jam all your code into a .zip file, which breaks the world in various hilarious ways. Your app will probably have some resources you want to load, as will PyGame itself.
Supposedly, packages=["your_package"] in your setup.py should address this, and py2app comes with a “pygame” recipe, but neither of these things worked for me. Instead, I convinced py2app to splat out all the files by using the not-quite-public “recipe” plugin API.
This is definitely somewhat less efficient than py2app’s default of stuffing the code into a single zip file, but, as a counterpoint to that: it actually works.
Step 3: Build it
Hopefully, at this point you can do python setup.py py2app
and get a shiny
new app bundle in dist/$NAME.app
. We haven’t had to go through the hell of
quarantine
yet, so it should launch at this point. If it doesn’t, sorry :-(.
You can often debug more obvious fail-to-launch issues by running the executable from the command line: ./dist/$NAME.app/Contents/MacOS/$NAME. Although this will run in a slightly different environment than double clicking (it will have all your shell’s env vars, for example, so if your app needs an env var to work it might mysteriously work there), it will also print out any tracebacks to your terminal, where they’ll be slightly easier to find than in Console.app.
Once your app at least runs locally, it’s time to...
Step 4: Code sign it
All the tutorials that I’ve found on how to do this involve doing Xcode project goop where it’s not clear what’s happening underneath. But despite the fact that the introductory docs aren’t quite there, the underlying model for codesigning stuff is totally common across GUI and command-line cases. However, actually getting your cert requires Xcode, an apple ID, and a credit card.
After paying your hundred dollars, go into Xcode, go to Accounts, hit “+”, “Apple ID”, then log in. Then, in your shiny new account, go to “Manage Certificates”, hit the little “+”, and (assuming, like me, you want to put something up on your own website, and not submit to the Mac App Store) choose “Developer ID Application”. You probably think you want “Mac App Distribution” because you want to distribute a Mac app! But you don’t.
Next, before you do anything else, make sure you have backups of your certificate and private key. You really don’t want to lose the private key associated with that cert.
Now quit Xcode; you’re done with the GUI.
You will need to know the identifier of your signing key though, which should be output from the command:
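A sketch of that step; security ships with macOS, so no extra tooling is needed:

```console
# List the code-signing identities in your keychain; the
# "Developer ID Application: ..." entry is the one you want.
security find-identity -v -p codesigning
```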
You probably want to put that in your build script, since you want to sign with the same identity every time. Further commands here will assume you’ve copied one of the lines of results from that command and done export IDENTITY="..." with it.
Step 4a: Become Aware Of New Annoying Requirements
Update for macOS Catalina: In Catalina, Apple has added a new code-signing requirement; even for apps distributed outside of the app store, they still have to be submitted to and approved by Apple.
In order to be notarized, you will need to codesign not only your app itself, but to also:
- add the hardened-runtime exception entitlements that allow Python to work, and
- directly sign every shared library that is part of your app bundle.
So the actual code-signing step is now a little more complicated.
Step 4b: Write An Entitlements Plist That Allows Python To Work
One of the features that notarization is intended to strongly encourage1 is the “hardened runtime”, a feature of macOS which opts in to stricter run-time behavior designed to stop malware. One thing that the hardened runtime does is to disable writable, executable memory, which is used by JITs, FFIs ... and malware.
Unfortunately, both Python’s built-in ctypes module and various popular bits of 3rd-party stuff that uses cffi, including pyOpenSSL, require writable, executable memory to work. Furthermore, py2app actually imports ctypes during its bootstrapping phase, so you can’t even get your own code to start running to perform any workarounds unless this is enabled. So this applies to anyone who wants to ship Python this way, not just to projects that use ctypes directly.
To make this long, sad story significantly shorter and happier, you can create an entitlements property list that enables the magical property which allows this to work. It looks like this:
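A sketch of such a plist; the allow-unsigned-executable-memory entitlement is the one that matters here:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
          "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
  <dict>
    <!-- Allow the writable, executable memory that ctypes/cffi need. -->
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>
  </dict>
</plist>
```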
Subsequent steps assume that you’ve put this into a file called entitleme.plist in your project root.
Step 4c: SIGN ALL THE THINGS
Notarization also requires that all the executable files in your bundle, not
just the main executable, are properly code-signed before submitting. So
you’ll need to first run the codesign
command across all your shared
libraries, something like this:
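A sketch of that loop, assuming the $NAME, $IDENTITY, and entitleme.plist conventions from the previous steps:

```console
# Sign every shared library inside the app bundle with the hardened
# runtime and the entitlements file from the previous step.
find "dist/$NAME.app" -name '*.so' -or -name '*.dylib' | \
    while read -r library; do
        codesign --sign "$IDENTITY" \
                 --entitlements entitleme.plist \
                 --force \
                 --options runtime \
                 "$library";
    done
```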
Then finally, sign the bundle itself.
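Again, a sketch under the same assumptions:

```console
# Sign the bundle itself with the same identity, entitlements, and
# hardened-runtime option.
codesign --sign "$IDENTITY" \
         --entitlements entitleme.plist \
         --force \
         --options runtime \
         "dist/$NAME.app"
```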
Now, your app is code-signed.
Step 5: Archive it
The right way to do this is probably to use dmgbuild or something like it, but what I promised here was quick and dirty, not beautiful and best practices.
You have to make a Zip archive that preserves symbolic links. There are a couple of options for this:
- open dist/, then in the Finder window that comes up, right click on the app and “compress” it
- cd dist; zip -yr $NAME.app.zip $NAME.app
Most importantly, if you use the zip command line tool, you must use the -y option. Without it, your downloadable app bundle will be somewhat mysteriously broken even though the one before you zipped it will be fine.
Step 6: Actually The Rest Of Step 4: Request Notarization
Notarization is a 2-step process, which is somewhat resistant to full automation. You submit to Apple, then they email you the results of doing the notarization, then, if that email indicates that your notarization succeeded, you can “staple” the successful result to your bundle.
The thing you notarize is an archive, which is why you need to do step 5 first. Then, you need to do this:
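A sketch of the submission, using the altool interface that was current at the time; the email address is a placeholder for your own Apple ID, and the tool will prompt for the account password:

```console
# Submit the zipped, signed bundle for notarization.
xcrun altool --notarize-app \
      --primary-bundle-id "YOUR_BUNDLE_ID" \
      --username "you@example.com" \
      --file "dist/$NAME.app.zip"
```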
Be sure that YOUR_BUNDLE_ID matches the CFBundleIdentifier you told py2app about before, so that the tool can find your app bundle inside the archive. You’ll also need to type in the iCloud password for your Developer ID account here.2
Step 6a: Wait A Minute
Anxiously check your email for an hour or so. Hope you don’t get any errors.
Step 6b: Finish Notarizing It, Finally!
Once Apple has a record of the app’s notarization, their tooling will recognize it, so you don’t need any information from the confirmation email or the previous command; just make sure that you are running this on the exact same .app directory you just built and archived, and not a version that differs in any way.
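Something like this:

```console
# Attach ("staple") the notarization ticket to the app bundle.
xcrun stapler staple "dist/$NAME.app"
```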
Finally, you will want to archive it again:
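For example, the same zip invocation as before (don’t forget -y):

```console
# Re-archive the stapled bundle for distribution.
cd dist && zip -yr "$NAME.app.zip" "$NAME.app"
```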
Step 7: Download it
Ideally, at this point, everything should be working. But to make sure that code-signing and archiving and notarizing and re-archiving went correctly, you should have either a pristine virtual machine with no dev tools and no Python installed, or a non-programmer friend’s machine that can serve the same purpose. They probably need a relatively recent macOS - my own experience has shown that apps made using the above technique will definitely work on High Sierra (and later) and will definitely break on Yosemite (and earlier); they probably start working at some OS version between those.
There’s no tooling that I know of that can clearly tell you whether your mac app depends on some detail of your local machine. Even for your dependencies, there’s no auditwheel for macOS. Updated 2019-06-27: It turns out there is an auditwheel-like thing for macOS: delocate! In fact, it predated and inspired auditwheel! Thanks to Nathaniel Smith for the update (which he provided in, uh, January of 2018 and I’ve only just now gotten around to updating...).
Nevertheless, it’s always a good idea to check your final app build on a fresh computer before you announce it.
Coda
If you were expecting to get to the end and download my cool game, sorry to disappoint! It really is a half-broken prototype that is in no way ready for public consumption, and given my current load of personal and professional responsibilities, you definitely shouldn’t expect anything from me in this area any time soon, or, you know, ever.
But, from years of experience, I know that it’s nearly impossible to summon any motivation to work on small projects like this without the knowledge that the end result will be usable in some way, so I hope that this helps someone else set up their Python game-dev pipeline.
I’d really like to turn this into a 3-part series, with a part for Linux (perhaps using flatpak? is that a good thing?) and a part for Windows. However, given my aforementioned time constraints, I don’t think I’m going to have the time or energy to do that research, so if you’ve got the appropriate knowledge, I’d love to host a guest post on this blog, or even just a link to yours.
If this post helped you, if you have questions or corrections, or if you’d like to write the Linux or Windows version of this post, let me know.
1. The hardened runtime was originally required when notarization was introduced. Apparently this broke too much software and now the requirement is relaxed until January 2020. But it’s probably best to treat it as if it is required, since the requirement is almost certainly coming back, and may in fact be back by the time you’re reading this. ↩
2. You can pass it via the --password option but there are all kinds of security issues with that so I wouldn’t recommend it. ↩
Careful With That PyPI
PyPI credentials are important. Here are some tips for securing them a little better.
Too Many Secrets
A wise man once said, “you shouldn’t use ENV variables for secret data”. In large part, he was right, for all the reasons he gives (and you should read them). Filesystem locations are usually a better operating system interface to communicate secrets than environment variables; fewer things can intercept an open() than can read your process’s command-line or calling environment.
One might say that files are “more secure” than environment variables. To his credit, Diogo doesn’t, for good reason: one shouldn’t refer to the superiority of such a mechanism as being “more secure” in general, but rather, as better for a specific reason in some specific circumstance.
Supplying your PyPI password to tools you run on your personal machine is a very different case than providing a cryptographic key to a containerized application in a remote datacenter. In this case, based on the constraints of the software presently available, I believe an environment variable provides better security, if you use it correctly.
Popping A Shell By Any Other Name
If you upload packages to the Python Package Index, and people use those packages, your PyPI password is an extremely high-privilege credential: effectively, it grants a time-delayed arbitrary code execution privilege on all of the systems where anyone might pip install your packages.
Unfortunately, the suggested mechanism to manage this crucial, potentially world-destroying credential is to just stick it in an unencrypted file.
The authors of this documentation know this is a problem; the authors of the tooling know too (and, given that these tools are all open source and we all could have fixed them to be better about this, we should all feel bad).
Leaving the secret lying around on the filesystem is a form of ambient authority; a permission you always have, but only sometimes want. One of the worst things about this is that you can easily forget it’s there if you don’t use these credentials very often.
The keyring is a much better place, but even it can be a slightly scary place to put such a thing, because it’s still easy to put it into a state where some random command could upload a PyPI release without prompting you. PyPI is forever, so we want to measure twice and cut once.
Luckily, even more secure places exist: password managers. If you use https://1password.com or https://www.lastpass.com, both offer command-line interfaces that integrate nicely with PyPI. If you use 1password, you’ll really want https://stedolan.github.io/jq/ (apt-get install jq, brew install jq) to slice & dice its command-line output.
The way that I manage my PyPI credentials is that I never put them on my filesystem, or even into my keyring; instead, I leave them in my password manager, and very briefly toss them into the tools that need them via an environment variable.
First, I have the following shell function, to prevent any mistakes:
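A sketch of the idea: shadow the twine command itself, so that a bare twine invocation can never hit the default index by accident.

```console
# Refuse to run bare "twine"; uploads have to go through dev.twine or
# prod.twine, defined below.
twine () {
    echo "nope! use dev.twine or prod.twine";
    return 1;
}
```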
For dev.twine, I configure twine to talk only to my local DevPI instance:
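A sketch of such a function; the DevPI URL and credentials are placeholders for whatever your local instance uses:

```console
# dev.twine: run twine, but pointed exclusively at a local DevPI server.
dev.twine () {
    TWINE_REPOSITORY_URL="http://127.0.0.1:3141/root/dev/" \
    TWINE_USERNAME="root" \
    TWINE_PASSWORD="" \
        command twine "$@";
}
```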
This way I can debug Twine, my setup.py, and various test-upload things without ever needing real credentials at all.
But, OK. Eventually, I need to actually get the credentials and do the thing. How does that work?
1Password
1password’s command line is a little tricky to log in to (you have to eval its output, it’s not just a command), so here’s a handy shell function that will do it.
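Something along these lines; the function name and the “my” account shorthand are placeholders:

```console
# Log in to the 1Password CLI.  "op signin" prints shell commands that
# export a session token, so its output has to be eval'd.
op_login () {
    eval "$(op signin my)";
}
```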
Then, I have this little helper for slicing out a particular field from the OP JSON structure:
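A sketch of that helper, assuming the JSON layout that the v1 op CLI uses for items (fields with a “designation” of username or password); the function name is a placeholder:

```console
# Print the value of the field whose designation is $1 (e.g. "username"
# or "password") from an "op get item" JSON document on stdin.
op_field () {
    jq -r ".details.fields[] | select(.designation == \"$1\") | .value";
}
```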
And finally, I use this to grab the item I want (named, memorably enough, “PyPI”) and invoke Twine:
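A sketch of the production upload function, building on the helpers above; “PyPI” is whatever you named the item in 1Password:

```console
# prod.twine: fetch the PyPI item from 1Password and expose its username
# and password to twine only for the duration of this one command.
prod.twine () {
    local item;
    item="$(op get item PyPI)" || return 1;
    TWINE_USERNAME="$(op_field username <<<"$item")" \
    TWINE_PASSWORD="$(op_field password <<<"$item")" \
        command twine "$@";
}
```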
LastPass
For lastpass, you can just log in (for all shells; it’s a little less secure) via lpass login; if you’ve logged in before you often don’t even have to do that, and it will just prompt you when running commands that require you to be logged in; so we don’t need the preamble that 1password’s command line did.
Its version of prod.twine looks quite similar, but its plaintext output obviates the need for jq:
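A sketch, again assuming the credential item is named “PyPI”:

```console
# prod.twine, LastPass edition: lpass can print individual fields
# directly, so no JSON wrangling is required.
prod.twine () {
    TWINE_USERNAME="$(lpass show PyPI --username)" \
    TWINE_PASSWORD="$(lpass show PyPI --password)" \
        command twine "$@";
}
```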
In Conclusion
“Keep secrets out of your environment” is generally a good idea, and you should always do it when you can. But, better a moment in your process environment than an eternity on your filesystem. Environment-based configuration can be a very useful stopgap for limiting the lifetimes of credentials when your tools don’t support more sophisticated approaches to secret storage.1
Post Script
If you are interested in secure secret storage, my micro-project secretly might be of interest. Right now it doesn’t do a whole lot; it’s just a small wrapper around the excellent keyring module and the pinentry / pinentry-mac password prompt tools. secretly presents an interface both for prompting users for their credentials without requiring the command-line or env vars, and for saving them away in keychain, for tools that need to pull in an API key and don’t want to make the user manually edit a config file first.
1. Really, PyPI should have API keys that last for some short amount of time, that automatically expire so you don’t have to freak out if you gave somebody a 5-year-old laptop and forgot to wipe it first. But again, if I wanted that so bad, I should have implemented it myself... ↩
The Sororicide Antipattern
Don’t murder your parents or your siblings to get their attributes.
“Composition is better than inheritance.” This is a true statement. “Inheritance is bad.” Also true. I’m a well-known compositional extremist. There’s a great talk you can watch if I haven’t talked your ear off about it already.
Which is why I was extremely surprised in a recent conversation when my interlocutor said that while inheritance might be bad, composition is worse. Once I understood what they meant by “composition”, I was even more surprised to find that I agreed with this assertion.
Although inheritance is bad, it’s very important to understand why. In a high-level language like Python, with first-class runtime datatypes (i.e.: user defined classes that are objects), the computational difference between what we call “composition” and what we call “inheritance” is a matter of where we put a pointer: is it on a type or on an instance? The important distinction has to do with human factors.
First, a brief parable about real-life inheritance.
You find yourself in conversation with an indolent heiress-in-waiting. She complains of her boredom whiling away the time until the dowager countess finally leaves her her fortune.
“Inheritance is bad”, you opine. “It’s better to make your own way in life”.
“By George, you’re right!” she exclaims. You weren’t expecting such an enthusiastic reversal.
“Well,”, you sputter, “glad to see you are turning over a new leaf”.
She crosses the room to open a sturdy mahogany armoire, and draws forth a belt holstering a pistol and a menacing-looking sabre.
“Auntie has only the dwindling remnants of a legacy fortune. The real money has always been with my sister’s manufacturing concern. Why passively wait for Auntie to die, when I can murder my dear sister now, and take what is rightfully mine!”
Cinching the belt around her waist, she strides from the room animated and full of purpose, no longer indolent or in-waiting, but you feel less than satisfied with your advice.
It is, after all, important to understand what the problem with inheritance is.
The primary reason inheritance is bad is confusion between namespaces.
The most important role of code organization (division of code into files, modules, packages, subroutines, data structures, etc) is division of responsibility. In other words, Conway’s Law isn’t just an unfortunate accident of budgeting, but a fundamental property of software design.
For example, if we have a function called multiply(a, b) - its presence in our codebase suggests that if someone were to want to multiply two numbers together, it is multiply’s responsibility to know how to do so. If there’s a problem with multiplication, it’s the maintainers of multiply who need to go fix it. And, with this responsibility comes authority over a specific scope within the code. So if we were to look at an implementation of multiply:
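A minimal sketch of such an implementation (the exact body hardly matters; what matters is the local name product):

```python
def multiply(a, b):
    product = a * b
    return product
```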
The maintainers of multiply get to decide what product means in the context of their function. It’s possible, in Python, for some other function to reach into multiply with frame objects and mangle the meaning of product between its assignment and return, but it’s generally understood that it’s none of your business what product is, and if you touch it, all bets are off about the correctness of multiply. More importantly, if the maintainers of multiply wanted to bind other names, or change around existing names, like so, in a subsequent version:
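For instance, a sketch of a later version that introduces and renames internal bindings, none of which a caller should ever notice:

```python
def multiply(a, b):
    # Internal names are free to change between releases.
    factors = (a, b)
    result = factors[0] * factors[1]
    return result
```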
It is the maintainer of multiply’s job, not the caller of multiply, to make those decisions.
The same programmer may, at different times, be both a caller and a maintainer of multiply. However, they have to know which hat they’re wearing at any given time, so that they can know which stuff they’re still responsible for when they hand over multiply to be maintained by a different team.
It’s important to be able to forget about the internals of the local variables in the functions you call. Otherwise, abstractions give us no power: if you have to know the internals of everything you’re using, you can never build much beyond what’s already there, because you’ll be spending all your time trying to understand all the layers below it.
Classes complicate this process of forgetting somewhat. Properties of class instances “stick out”, and are visible to the callers. This can be powerful - and can be a great way to represent shared data structures - but this is exactly why we have the ._ convention in Python: if something starts with an underscore, and it’s not in a namespace you own, you shouldn’t mess with it. So: other._foo is not for you to touch, unless you’re maintaining type(other). self._foo is where you should put your own private state.
So if we have a class like this:
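For example, a minimal sketch of such a class:

```python
class A(object):
    def __init__(self):
        # A's own private state.
        self._note = "a note"
```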
we all know that A()._note is off-limits.
But then what happens here?
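A sketch of a subclass that innocently wants some private state of its own:

```python
class B(A):
    def __init__(self):
        super().__init__()
        # B's "own" private state... in a namespace it shares with A.
        self._note = "private state for B()"
```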
B()._note is also off limits for everyone but B, except... as it turns out, B doesn’t really own the namespace of self here, so it’s clashing with what A wants _note to mean. Even if, right now, we were to change it to _note2, the maintainer of A could, in any future release of A, add a new _note2 variable which conflicts with something B is using. A’s maintainers (rightfully) think they own self, B’s maintainers (reasonably) think that they do. This could continue all the way until we get to _note7, at which point it would explode violently.
So that’s why Inheritance is bad. It’s a bad way for two layers of a system to communicate because it leaves each layer nowhere to put its internal state that the other doesn’t need to know about. So what could be worse?
Let’s say we’ve convinced our junior programmer who wrote A that inheritance is a bad interface, and they should instead use the panacea that cures all inherited ills, composition. Great! Let’s just write a B that composes in an A in a nice clean way, instead of doing any gross inheritance:
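A sketch of the kind of “composition” in question - automatically copying everything from the composed object onto self:

```python
class Bprime(object):
    def __init__(self, a):
        # "Compose" by copying every attribute of a onto self.
        for attribute in dir(a):
            setattr(self, attribute, getattr(a, attribute))
```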
Uh oh. Looks like composition is worse than inheritance.
Let’s enumerate some of the issues with this “solution” to the problem of inheritance:
- How do we know what attributes Bprime has?
- How do we even know what type a is?
- How is anyone ever going to grep for relevant methods in this code and have them come up in the right place?
We briefly reclaimed self for Bprime by removing the inheritance from A, but what Bprime does in __init__ to replace it is much worse. At least with normal, “vertical” inheritance, IDEs and code inspection tools can have some idea where your parents are and what methods they declare. We have to look aside to know what’s there, but at least it’s clear from the code’s structure where exactly we have to look aside to.
When faced with a class like Bprime, though, what does one do? It’s just shredding apart some apparently totally unrelated object; there’s nearly no way for tooling to inspect this code to the point that it knows where self.<something> comes from in a method defined on Bprime.
The goal of replacing inheritance with composition is to make it clear and easy to understand what code owns each attribute on self. Sometimes that clarity comes at the expense of a few extra keystrokes; an __init__ that copies over a few specific attributes, or a method that does nothing but forward a message, like def something(self): return self.other.something().
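A sketch of that explicit style, reusing the hypothetical A from above and a something method like the forwarding example:

```python
class Bprime(object):
    def __init__(self, a):
        # Explicit composition: the collaborator lives on one clearly
        # named attribute, and self remains entirely Bprime's own.
        self._a = a
        self._note = "private state for Bprime()"

    def something(self):
        # Forward only the messages Bprime actually wants to expose.
        return self._a.something()
```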
Automatic composition is just lateral inheritance. Magically auto-proxying all methods1, or auto-copying all attributes, saves a few keystrokes at the time some new code is created at the expense of hours of debugging when it is being maintained. If readability counts, we should never privilege the writer over the reader.
1. It is left as an exercise for the reader why proxyForInterface is still a reasonably okay idea even in the face of this criticism.2 ↩
2. Although ironically it probably shouldn’t use inheritance as its interface. ↩