There you go, click on the pic for the three-tweet answer, thanks.
Beware: this is not ‘top level’ thinking. This is a heuristic.
PS: I came up with all this a day after staring into the sky whilst waiting for the morning minibus to Sony in Weybridge. After tweeting about a strange line in the sky, I by chance stumbled on the origin of the phrase ‘Occam’s Razor’, which is relevant to the design of heuristics: “One should not increase, beyond what is necessary, the number of entities required to explain anything.”
The answer to my question – ‘Ockham Stack’ (see Q & A below with @CoxeyLoxey) – is named after the village in Surrey where William of Ockham, the guy who coined the phrase Occam’s Razor, came from. So I hope that didn’t increase, beyond what’s necessary, the number of entities required to explain it!
OK, it’s a long list, but it’s pretty useful when thinking about designing online communities, for example! It’s from John Gall. So, as a planning tool, how about thinking about where your approach might fit into these. Good or bad!
1. The Primal Scenario or Basic Datum of Experience: Systems in general work poorly or not at all. (Complicated systems seldom exceed five percent efficiency.)
2. The Fundamental Theorem: New systems generate new problems.
3. The Law of Conservation of Anergy [sic]: The total amount of anergy in the universe is constant. (“Anergy” = ‘human energy’)
4. Laws of Growth: Systems tend to grow, and as they grow, they encroach.
5. The Generalized Uncertainty Principle: Systems display antics. (Complicated systems produce unexpected outcomes. The total behavior of large systems cannot be predicted.)
6. Le Chatelier’s Principle: Complex systems tend to oppose their own proper function. As systems grow in complexity, they tend to oppose their stated function.
7. Functionary’s Falsity: People in systems do not actually do what the system says they are doing.
8. The Operational Fallacy: The system itself does not actually do what it says it is doing.
9. The Fundamental Law of Administrative Workings (F.L.A.W.): Things are what they are reported to be. The real world is what it is reported to be. (That is, the system takes as given that things are as reported, regardless of the true state of affairs.)
10. Systems attract systems-people. (For every human system, there is a type of person adapted to thrive on it or in it.) [eg: watch out for contributors who dominate your community]
11. The bigger the system, the narrower and more specialized the interface with individuals.
12. A complex system cannot be “made” to work. It either works or it doesn’t.
13. A simple system, designed from scratch, sometimes works.
14. Some complex systems actually work.
15. A complex system that works is invariably found to have evolved from a simple system that works.
16. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
17. The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
18. The Newtonian Law of Systems Inertia: A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
19. Systems develop goals of their own the instant they come into being.
20. Intrasystem [sic] goals come first.
21. The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
22. A complex system can fail in an infinite number of ways. (If anything can go wrong, it will.) (See Murphy’s law.)
23. The mode of failure of a complex system cannot ordinarily be predicted from its structure.
24. The crucial variables are discovered by accident.
25. The larger the system, the greater the probability of unexpected failure.
26. “Success” or “Function” in any system may be failure in the larger or smaller systems to which the system is connected.
27. The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by failing to fail safe.
28. Complex systems tend to produce complex responses (not solutions) to problems.
29. Great advances are not produced by systems designed to produce great advances.
30. The Vector Theory of Systems: Systems run better when designed to run downhill.
31. Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)
32. As systems grow in size, they tend to lose basic functions.
33. The larger the system, the less the variety in the product.
34. Control of a system is exercised by the element with the greatest variety of behavioral responses.
35. Colossal systems foster colossal errors.
36. Choose your systems with care.
There’s been a great discussion about configuring forum discussions on e-mint recently (‘Discussions boards navigation/IA’), with one post from Ian Dickson sparking my interest on another level. He concluded his reply on the subject with the following ‘PS’: “it’s easier to simplify something that is overengineered than it is to complexify (?) something simple unless you write off the older content.”
Photo by Stuart Glendinning Hall
Why is this of interest? Because it points to the value of a complexity approach in unifying the organic nature of an online community and its structure. Sure, you might say, I can see why a bunch of people online can be seen in organic terms, but structure? But here’s the point. A motor engine is not complex, it’s complicated, because it’s based not on organic principles but mechanical ones. It also has none of the properties of self-regulation which an organic system has, hence the need for a control mechanism. But the structure of an online community can be designed along organic lines, based on simple parts which are assembled to form a complex whole. Seen in this complexity light, then, in ideal terms it really is as straightforward to simplify the complex as it is to make the simple complex. But what I suspect Ian means by ‘overengineered’ is in fact ‘complicated’, and in that sense I agree with him, as it is very difficult to make the mechanically complicated simple. So that’s why it’s important to build it on complex lines in the first place. And with the bonus that it fits with the way you approach management of the community itself, along organic lines, encouraging self-regulation rather than seeking control as a way of unleashing the power of the community. Hey, it’s just theory, but thanks again to Deirdre’s original post and Ian’s reply for the inspiration!
For no reason except I was trying to help Shirley figure out the week ahead I started thinking about the value of conversations, specifically using online communities to full effect, as part of a practically-minded ‘complexity’ approach to business. Then I did a Google search on complexity and conversations and came up with Dr Patricia Shaw’s book on the subject, with the following customer recommendation which is a useful starting point for further thought:
At last, recognition that real change doesn’t happen purely because of top-down management diktats, but is embodied by real people having real conversations that are not structured by clear objectives, goals and processes. Inherently scary for all those who rely on management as a control process in their organisations and change as a corporately-guided process, this instead looks at the informal organisation and how creating spaces for conversations between like-minded change agents can be the most effective.
This veers slightly too far into complexity and informal processes only for me – I believe that a balance is required between formal change and informal conversations, but this is still an important broadening of the discussion on corporate change.
It occurred to me that a basic effort/time-saving principle of agile project management is that you don’t try to plan everything up front, because you don’t need to!
The fact that, for example, getting a site live by a certain date requires a change to the URL pointing means you only need to worry about it when you get close to the event, as each event is connected in reality to another (‘one thing leads to another’, to borrow the folk wisdom). Call it ‘managing complexity’ if you like.
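Just as an illustration (this sketch is mine, not from the original discussion, and the task names like `point_urls` are hypothetical), the just-in-time idea can be modelled in Python as a chain of tasks where each one is only planned when the work actually reaches it, rather than all up front:

```python
# A minimal sketch of just-in-time planning: tasks sit in a dependency
# chain, and each task is only "planned" (detailed) at the moment it is
# reached, i.e. when its predecessor has finished.

from dataclasses import dataclass


@dataclass
class Task:
    name: str
    planned: bool = False
    done: bool = False


def run_chain(chain):
    """Work through the chain, planning each task only when reached."""
    log = []
    for task in chain:
        task.planned = True              # plan at the last responsible moment
        log.append(f"planning {task.name}")
        task.done = True                 # stand-in for doing the actual work
        log.append(f"done {task.name}")
    return log


# Hypothetical go-live sequence: the URL change is only planned once
# content sign-off is complete, close to the launch date.
chain = [Task("build_site"), Task("content_signoff"), Task("point_urls")]
log = run_chain(chain)
```

The point of the sketch is that no task after the current one carries any planning detail yet; each decision is deferred until the event it depends on has happened.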
AIG’s complexity blamed for fall: “The complexity and international spread of AIG’s operations impeded regulatory oversight of the derivatives unit that helped bring down the insurer, according to former executives, analysts and regulators.” (FT report)
Hmm, complexity? Maybe they should have invested in a product such as Datawatch’s Monarch software, which is designed to transform report output files and other data sources into live data on your PC. It is currently used by around 500,000 professionals worldwide and is now on its 9th version.