I have a couple of inactive Pinax sites at different versions (0.9.x and 0.7.x). Both started out with two features in common:
1: Although the version of Pinax used was the most recent one to ship a (non-empty) social starter project, both versions were already relatively old; and
2: Installation hinged on tracking down some extremely rare and difficult-to-obtain packages, and reminded me of installation and system administration on SGI IRIX.
In looking at this, there seems to be a theme of increasing fragility. Someone described systems integration as “Your computer doesn’t work because of a problem on a machine you haven’t heard of.”
Dependencies seem ipso facto to constitute single points of failure, and the trend is to have more and more of them.
Any suggestions about how to cope with this, beyond creating a virtualenv while you still can? I’m looking at building a self-contained project with its own “roll your own” apps, not because I think this is desirable or Pythonic in itself, but to quarantine most or all single points of failure within my own code, which ideally should be working and deployable after downloading Python, Django (if needed), and my project alone.
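By “creating a virtualenv while you still can” I mean something like the following: pin the exact versions of everything the project currently uses, so that the set of single points of failure is at least frozen rather than drifting. A minimal sketch, assuming pip and virtualenv are available; the requirements.txt and pinax-env names are just illustrative:

    # Record the exact versions of every installed dependency.
    pip freeze > requirements.txt

    # Later, rebuild an equivalent environment from that record.
    virtualenv pinax-env
    pinax-env/bin/pip install -r requirements.txt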
This is in fact not because I think I’m smarter than every single one of the developers of 100 or more individual Django packages; it’s because I think my code is better (or at least something I can provide completely) than the dumbest or most ephemeral of those 100-plus packages. I don’t want a Leiningen vs. the Ants web of dependencies, each one a point of failure.
This probably reads more like a blog entry than a question, but it describes a difficulty I’ve been unable to avoid. With enough duct tape, I might, for instance, keep a gallery of virtualenvs and ensure that every version of every dependency is available in both source and installed form.
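For the “every version available in source form” part, something along these lines is roughly what I have in mind; this is only a sketch, assuming a pip recent enough to have the download subcommand, and the ./vendor directory name is just illustrative:

    # Download source archives for every pinned dependency into a local directory.
    pip download -d ./vendor -r requirements.txt

    # Reinstall later without touching PyPI or any other remote host.
    pip install --no-index --find-links=./vendor -r requirements.txt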
What options should I consider if I want to minimize adding (possibly ephemeral) dependencies? Are there ways to estimate how ephemeral a given dependency is?
Thanks