When I was teaching maths (a long time ago!) I tried to instil an appreciation of well-built, coherent proofs.
Then one day a student asked a question about a theorem: "Clever! But how am I supposed to discover such a proof myself?"
I realised I had to teach that a proof was only the final stage of a complex quest. During that search one had to mix deductive stages ("X implies Y") with inductive stages where the mind wandered in search of clues, paths, dead ends, magical inspirations ...
Can this design process be taught so as to make it more efficient?
And does the use of such methods guarantee a result?
Transposing this to information technology can lead to very pessimistic conclusions. When designing software:
- you are not sure your hypotheses are 100% valid, you are not sure that you are aware of all of them, and they may change during the life cycle!
- you are not sure that the final stage (the software product) is coherent, reliable, or even meets expectations.
- the overall process by which we create this product is blurry: many methods compete for our attention. Some are as "hard" as possible (they could be likened to deductive processes); others are more oriented towards behavioural patterns for the programming team.
In this blog I will use a gross approximation: I will describe "soft" methods (methods not based on proofs) as "fuzzy". Being rational beings, we need rules, even if these are rules of thumb or if their theoretical foundations are unsteady.
Those methods are useful but have a disturbing property: their relevance must be continuously re-evaluated. This is paradoxical: one adopts a method to establish a process that should yield further economies of thought ... and then one must constantly review that very process. So where is the economy?
In software engineering, the paucity of meta-method practices is dangerous.
I have been working in a corporation using "extreme waterfall" methods: for every step there is an extensive list of prerequisite documents and a list of mandatory documents to be supplied for the next step. Without context for my judgement I have no clue whether this is justified, but my question is: has anyone evaluated the overhead generated by the procedure itself?
At the other end of the development spectrum, "agile" methods take an empirical approach, recognising that problems cannot be fully understood up front. Some of their rules state that distracting influences not directly focused on the current goal should be eliminated. This rule also has its limits: in some cases it is extremely important, and profitable, to let someone wander off. So without an evaluation of what it costs to violate a rule, we miss something!
So, roughly, every method should carry its own rationale, criticisms and evaluation practices. This is obvious but frequently overlooked.
Let's have a look at programming design patterns and how those patterns should constantly be re-evaluated.
At this stage it is worth recalling that the notion of "design pattern" was inspired by architecture (Christopher Alexander), though IMHO the first precise description I know of dates back to Clausewitz and dealt with military strategy.
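As a concrete illustration of a pattern whose relevance has been re-evaluated, consider the Singleton: long a textbook default for "exactly one instance", it is now often criticised because the hidden global state it creates hurts testability. The sketch below is my own hypothetical example (the class names `ClassicClock`, `Stamper`, `FixedClock` are invented for illustration), contrasting the classic form with the re-evaluated alternative of passing the dependency in:

```java
// Classic Singleton: the class itself enforces a single instance.
// Every caller reaches for the same hidden global state.
class ClassicClock {
    private static final ClassicClock INSTANCE = new ClassicClock();
    private ClassicClock() {}
    public static ClassicClock getInstance() { return INSTANCE; }
    public long now() { return System.currentTimeMillis(); }
}

// Re-evaluated form: "single instance" becomes a wiring decision,
// not a class property. Collaborators receive the dependency, so
// tests can substitute a controlled implementation.
interface Clock {
    long now();
}

class SystemClock implements Clock {
    public long now() { return System.currentTimeMillis(); }
}

// A trivial test double, impossible with the classic Singleton.
class FixedClock implements Clock {
    private final long t;
    FixedClock(long t) { this.t = t; }
    public long now() { return t; }
}

class Stamper {
    private final Clock clock;
    Stamper(Clock clock) { this.clock = clock; }
    String stamp(String msg) { return clock.now() + ": " + msg; }
}

public class Main {
    public static void main(String[] args) {
        // Production wiring: one SystemClock, by choice, not by decree.
        Stamper prod = new Stamper(new SystemClock());
        // Test wiring: deterministic time.
        Stamper test = new Stamper(new FixedClock(42L));
        System.out.println(test.stamp("hello")); // prints "42: hello"
    }
}
```

The point is not that Singleton is "wrong", but exactly the meta-method argument above: the pattern came with a rationale (controlled instantiation) and, once its costs were evaluated against modern testing practice, the recommendation shifted.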
(Those of you who are not familiar with Java programming may skip the following entries of the blog.)