Want a practical complexity heuristic?

Update: 7 May 2016

What I didn’t explain is that the hard part is figuring out what the parts are and how they fit together. But there’s a good example in my upselling solution blog post of how I figured out what, imho, was blocking growth at Causeway, with the help of expert ‘sales hacker’ Richard Harris. I guess the exec team at Causeway have found their own way to a solution, with the business transitioning to SaaS.

There you go – click on the pic for the three-tweet answer, thanks.

Beware: this is not ‘top level’ thinking. This is a heuristic.

PS: I came up with all this a day after staring into the sky whilst waiting for the morning minibus to Sony in Weybridge. After tweeting about a strange line in the sky, I stumbled by chance on the origin of the phrase ‘Occam’s Razor’, which is relevant to the design of heuristics: “One should not increase, beyond what is necessary, the number of entities required to explain anything.”

The answer to my question – ‘Ockham Stack’ (see Q & A below with @CoxeyLoxey) – is named after the village in Surrey where William of Ockham, the guy Occam’s Razor is named after, came from. So I hope that didn’t increase, beyond what’s necessary, the number of entities required to explain it!

Systemantics and online communities

OK, it’s a long list, but it’s pretty useful when thinking about designing online communities, for example! It’s from John Gall. So, as a planning tool, how about thinking about where your approach might fit into these – good or bad!

1. The Primal Scenario or Basic Datum of Experience: Systems in general work poorly or not at all. (Complicated systems seldom exceed five percent efficiency.)
2. The Fundamental Theorem: New systems generate new problems.
3. The Law of Conservation of Anergy [sic]: The total amount of anergy in the universe is constant. (“Anergy” = ‘human energy’)
4. Laws of Growth: Systems tend to grow, and as they grow, they encroach.
5. The Generalized Uncertainty Principle: Systems display antics. (Complicated systems produce unexpected outcomes. The total behavior of large systems cannot be predicted.)
6. Le Chatelier’s Principle: Complex systems tend to oppose their own proper function. As systems grow in complexity, they tend to oppose their stated function.
7. Functionary’s Falsity: People in systems do not actually do what the system says they are doing.
8. The Operational Fallacy: The system itself does not actually do what it says it is doing.
9. The Fundamental Law of Administrative Workings (F.L.A.W.): Things are what they are reported to be. The real world is what it is reported to be. (That is, the system takes as given that things are as reported, regardless of the true state of affairs.)
10. Systems attract systems-people. (For every human system, there is a type of person adapted to thrive on it or in it.) [e.g. watch out for contributors who dominate your community – see the sketch after this list]
11. The bigger the system, the narrower and more specialized the interface with individuals.
12. A complex system cannot be “made” to work. It either works or it doesn’t.
13. A simple system, designed from scratch, sometimes works.
14. Some complex systems actually work.
15. A complex system that works is invariably found to have evolved from a simple system that works.
16. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
17. The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
18. The Newtonian Law of Systems Inertia: A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
19. Systems develop goals of their own the instant they come into being.
20. Intrasystem [sic] goals come first.
21. The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
22. A complex system can fail in an infinite number of ways. (If anything can go wrong, it will.) (See Murphy’s law.)
23. The mode of failure of a complex system cannot ordinarily be predicted from its structure.
24. The crucial variables are discovered by accident.
25. The larger the system, the greater the probability of unexpected failure.
26. “Success” or “Function” in any system may be failure in the larger or smaller systems to which the system is connected.
27. The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by failing to fail safe.
28. Complex systems tend to produce complex responses (not solutions) to problems.
29. Great advances are not produced by systems designed to produce great advances.
30. The Vector Theory of Systems: Systems run better when designed to run downhill.
31. Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)
32. As systems grow in size, they tend to lose basic functions.
33. The larger the system, the less the variety in the product.
34. Control of a system is exercised by the element with the greatest variety of behavioral responses.
35. Colossal systems foster colossal errors.
36. Choose your systems with care.
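
On that note about dominant contributors (law 10): here’s a minimal sketch of the kind of check you could run against your own community data. It’s purely illustrative – the posts structure, the ‘author’ field, the top_n parameter and the 50% threshold are all my own made-up assumptions, not anything from Gall.

```python
# Illustrative sketch only: a rough check for law 10 ("systems attract
# systems-people") in an online community. The data layout, field names
# and threshold below are hypothetical.
from collections import Counter

def top_contributor_share(posts, top_n=5):
    """Fraction of all posts written by the top_n most active contributors."""
    if not posts:
        return 0.0
    counts = Counter(post["author"] for post in posts)
    top_total = sum(count for _, count in counts.most_common(top_n))
    return top_total / len(posts)

# Example: flag the community for a closer look if a handful of people
# account for more than half of all activity (an arbitrary cut-off).
posts = [{"author": "alice"}, {"author": "alice"}, {"author": "bob"},
         {"author": "carol"}, {"author": "alice"}, {"author": "dave"}]
if top_contributor_share(posts, top_n=2) > 0.5:
    print("A few contributors dominate this community - worth a look (law 10).")
```

The point isn’t the exact numbers – it’s that a simple, loose check like this (law 31: loose systems last longer) tells you whether your community is quietly becoming a system run by its systems-people.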