There are a lot of ideas in software engineering that are considered truth until someone pinpoints the fallacies. Peter Deutsch first came up with his Eight Fallacies of Distributed Computing to debunk misconceptions about distributed computing:
Essentially everyone, when they first build a distributed application, makes the following eight assumptions. All prove to be false in the long run and all cause big trouble and painful learning experiences.
The network is reliable
Latency is zero
Bandwidth is infinite
The network is secure
Topology doesn't change
There is one administrator
Transport cost is zero
The network is homogeneous
I happen to have a list of my own. These are the 10 Fallacies of Software Analysis and Design:
You can trust everyone
Universal agreement is possible
A perfect model of the real world can be created
Change can be avoided
Time doesn't exist
Everyone can respond immediately
Concurrency can be abstracted away
Side effects and non-linearity should be removed
Systems can be proven to be correct
Implementation details can be hidden
You can trust everyone. Security is obviously a big problem in the computing world today; you cannot simply trust external systems blindly. Even though many have ignored these concerns, everyone agrees that it is a problem. The part that's a fallacy is trust of a more subtle variety: it is a fallacy to assume that if another system declares something as true, it is actually true. This is more general than conventional security, and it is a pervasive problem on the Internet. For example, there are firms that provide a service called Search Optimization, which artificially tries to improve a search ranking by planting false information all over the Internet to fool search engines and ultimately raise the ranking. In short, you cannot trust that other parties will give you accurate information.
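The alternative to blind trust is independent verification: accept a claim only when you can check it yourself. Here is a minimal sketch of that idea using Python's standard `hmac` module; the shared key and the `accept_if_verified` helper are illustrative inventions, not part of any particular system.

```python
import hmac
import hashlib

# Hypothetical shared secret with a partner system. Under the fallacy,
# we would accept `payload` simply because the sender declares it valid.
SECRET = b"example-shared-key"

def accept_if_verified(payload: bytes, claimed_digest: str) -> bool:
    """Accept external data only if its claim can be independently checked."""
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison: even the check itself shouldn't leak information.
    return hmac.compare_digest(expected, claimed_digest)

good = hmac.new(SECRET, b"ranking=42", hashlib.sha256).hexdigest()
print(accept_if_verified(b"ranking=42", good))    # True: claim checks out
print(accept_if_verified(b"ranking=9999", good))  # False: forged claim rejected
```

The point is not the cryptography, but the posture: the declaration alone is never enough.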
Universal agreement is possible. Achieving universal agreement is a fallacy because a common classification scheme is not only difficult to arrive at, it also changes over time. In addition, change happens on a per-participant basis, so either all participants are in absolute lock-step agreement or agreement can only exist in a relative sense. Finally, how can one achieve agreement when others may have conflicting agendas?
A perfect model of the real world can be created. Why is there this hidden belief among data modelers that a domain model can be created that will withstand the test of time? That is, one that can cover any new requirement that could possibly be imagined? The problem with this thinking is that it reinforces the idea that change can be avoided. Furthermore, it assumes that your modeling constructs are powerful enough to capture any new concept.
Change can be avoided. Why is it that so many systems are designed without the slightest consideration for change? Could it be that it's just too difficult to do, or are we simply lazy, leaving it to the next guy to worry about? The root cause of this problem may lie in the fact that conventional software development does not employ metrics to measure agility.
Time doesn't exist. That time doesn't exist is obviously a fallacy; unfortunately, mathematics tends to remove time from the equation, simply because it's much easier to work with. This mathematical tradition carries over into the computing world. Functional programming languages are an example: they have extremely good analytic properties, but they are deficient pragmatically because they are cumbersome when dealing with state. However, you simply can't ignore state, because state is about time, and anyone with a brain would notice that time indeed exists.
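The awkwardness is easy to see even in a toy example. In a purely functional style, state is never mutated; each step returns a new value, so the succession of values is the only record of time. The `deposit` function below is an invented illustration, not from any library:

```python
# Purely functional style forces state (and hence time) to be passed
# around explicitly: each step returns a *new* value instead of mutating.

def deposit(balance: int, amount: int) -> int:
    return balance + amount  # no mutation: the old balance still "exists"

# The sequence of states *is* the timeline. Reorder the steps and the
# intermediate answers change, so time cannot really be abstracted away.
s0 = 0
s1 = deposit(s0, 100)
s2 = deposit(s1, -30)
print([s0, s1, s2])  # [0, 100, 70]
```

Every state must be threaded by hand through the computation; that bookkeeping is exactly the pragmatic cost the paragraph above describes.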
Everyone can respond immediately. Why is it that systems are designed as though other systems are always available and can respond immediately? It's as if every system had unfailing communication, infinite resources and the ability to process any request instantaneously. The fact is, communications do fail, resources are limited and requests can sometimes involve other parties that can't act as quickly.
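Designing for the opposite assumption means bounding every wait and planning for retries. A minimal sketch, with an invented `RemoteUnavailable` error and a simulated flaky partner standing in for a real remote call:

```python
import time

class RemoteUnavailable(Exception):
    """Stand-in for whatever 'the other side didn't answer' looks like."""

def call_with_retry(remote_call, attempts=3, base_delay_s=0.05):
    """Assume failure, bound the wait, and back off between retries."""
    for attempt in range(attempts):
        try:
            return remote_call()
        except RemoteUnavailable:
            time.sleep(base_delay_s * (2 ** attempt))  # exponential back-off
    raise RemoteUnavailable("gave up after %d attempts" % attempts)

# Simulated flaky partner: fails twice, then answers.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RemoteUnavailable()
    return "ok"

print(call_with_retry(flaky))  # "ok", after two silent failures
```

Note that `attempts` and the back-off schedule are policy decisions; the point is that the caller has a policy at all, instead of assuming an immediate answer.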
Concurrency can be abstracted away. The thought here is that concurrency is a technical issue and can be handled at that level of discourse. Unfortunately, concurrency can have business consequences. Resources are not all abstract, and resources cannot be reserved without business consequences. Coordination is not an implementation detail: not everyone can wait forever, and not everyone can react instantaneously.
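A toy example of coordination surfacing as a business rule: two buyers race for the last unit of stock. How long a buyer may block waiting for the reservation (the lock timeout below) is a question for the business, not something the plumbing can decide on its own. The scenario and names are invented for illustration:

```python
import threading

# Toy inventory with one unit left; two buyers race for it.
stock_lock = threading.Lock()
stock = {"units": 1}
results = {}

def buy(name: str):
    # Bounded wait: refusing to block forever is itself a business rule.
    if stock_lock.acquire(timeout=1.0):
        try:
            if stock["units"] > 0:
                stock["units"] -= 1
                results[name] = "reserved"
            else:
                results[name] = "sold out"
        finally:
            stock_lock.release()
    else:
        results[name] = "timed out"

threads = [threading.Thread(target=buy, args=(n,)) for n in ("alice", "bob")]
for t in threads: t.start()
for t in threads: t.join()
print(results)  # exactly one buyer gets "reserved"; the other is told "sold out"
```

Whichever thread loses the race must be told something, and "sold out" versus "timed out" versus "wait-listed" is a domain decision that no amount of concurrency machinery can make for you.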
Side effects and non-linearity should be removed. This again is a reflection of our mathematical heritage. In mathematics, side effects are extremely difficult to handle, and non-linear systems are force-fitted into linear models to enable analysis. Unfortunately, the world isn't as orderly as we would like. Feedback is a natural occurrence, and you can't ignore it.
Systems can be proven to be correct. Again, this is an artifact of our mathematical roots. Wouldn't it be nice if we could prove that our systems are correct without actually running them through a battery of tests? Well, that's the idea here; unfortunately, there are a couple of problems. The first is: how can you prove that your proof is correct? The second is: if you can't take away time, side effects and non-linearity, then the chances that you can arrive at an analytic solution go down pretty quickly.
Implementation details can be hidden. This fallacy has its roots in our habit of creating abstractions, that is, hiding details and concentrating only on the essentials. Unfortunately, abstractions do leak; not all implementation aspects can truly be hidden. The real world is governed by time and space, and when an implementation is linked to it, no amount of abstraction can hide the fact that the real world exists.
These fallacies have their roots in our mathematical traditions, which assume a world of perfect determinism. Unfortunately, this static mindset is prevalent in software engineering. It's important to realize that the world undergoes continuous change, with feedback and side effects that create non-linearities, with time that's relativistic, and with humans who are unpredictable. Finally, if it's not obvious to you, your software implementation actually interfaces with this reality.
I didn't understand what you were trying to get at in this section at all. What time? Development time? Run time? Communication time? Time to respond? I'm not even sure what other question to ask. Any clarifications would be appreciated.
Your musings on side effects and abstraction remind me of Tolstoy's ruminations on the purpose of bees:
A bee settling on a flower has stung a child. And the child is afraid of bees and declares that bees exist to sting people. A poet admires the bee sucking from the chalice of a flower and says it exists to suck the fragrance of flowers. A beekeeper, seeing the bee collect pollen from flowers and carry it to the hive, says that it exists to gather honey. Another beekeeper who has studied the life of the hive more closely says that the bee gathers pollen dust to feed the young bees and rear a queen, and that it exists to perpetuate its race. A botanist notices that the bee flying with the pollen of a male flower to a pistil fertilizes the latter, and sees in this the purpose of the bee's existence. Another, observing the migration of plants, notices that the bee helps in this work, and may say that in this lies the purpose of the bee. But the ultimate purpose of the bee is not exhausted by the first, the second, or any of the processes the human mind can discern. The higher the human intellect rises in the discovery of these purposes, the more obvious it becomes, that the ultimate purpose is beyond our comprehension.
When I read that, it reminded me of the process of writing software to model something. As you proceed and understand more about the thing you are trying to model, it constantly changes, and you discover that it has so many appearances, facets, side effects, interactions and nuances that it escapes encapsulation.
We try to avoid side effects in software, but that is a denial of reality, where it seems that most things are side effects. Of course, it is also necessary to ignore as much as we can and simplify, for practical purposes, since time does exist (or does it?).
> Time doesn't exist.
>
> I didn't understand what you were trying to get at in this section at all. What time? Development time? Run time? Communication time? Time to respond? I'm not even sure what other question to ask. Any clarifications would be appreciated.
No, surprisingly, none of the above. What I'm referring to is the absence of time in many abstractions or models of reality. On second thought, this list should have been called "10 Fallacies of Software Abstraction".
11. If you only make enough documents and diagrams and use the right tools, the software will be perfect.
12. Soon there will be a tool where I (the architect/designer) can just draw the diagrams and the completed application will appear, so I can get rid of all those programmers who always mess up my perfect designs.
13. (Stems from 12.) The architect is perfect; the programmers only cause problems by incorrectly implementing the design.
14. The salesman's initial ball-park estimate of cost and time, based on a (reported) two-minute phone discussion with the customer, will form the two fixed goal-posts against which the project will be measured.
I remember a conversation, years ago, with the guy who did the Ethernet level communications for a lab data acquisition device that my then-employer sold. I'd come from working on Ethernet kit, in assembler, but was doing the user interface, so I had nothing much to do with his code. However, curious, I asked him how they handled retransmissions. He said "don't be f*cking stupid, packets don't vanish between the box and the server."
There was no retransmission. They relied on a spike-free, defect-free, 100% reliable piece of wire... in a laboratory full of bits of noisy electromechanical equipment and corrosive substances.
Still stays with me, that combination of ignorance and arrogance.
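For anyone curious what the missing piece would have looked like: stop-and-wait retransmission is only a few lines. The `LossyWire` class below is a made-up simulation of that noisy lab network (it drops the first two sends), not the actual device code:

```python
# A toy stop-and-wait sender over a simulated lossy link.

class LossyWire:
    def __init__(self, drops: int):
        self.drops = drops  # how many sends to swallow before delivering

    def send(self, packet: bytes) -> bool:
        """Return True if an ack came back, False if the packet vanished."""
        if self.drops > 0:
            self.drops -= 1
            return False
        return True

def send_reliably(wire: LossyWire, packet: bytes, max_tries: int = 10) -> int:
    """Retransmit until acknowledged; return the number of tries used."""
    for attempt in range(1, max_tries + 1):
        if wire.send(packet):
            return attempt
    raise IOError("link really is down")

tries = send_reliably(LossyWire(drops=2), b"sample reading")
print(tries)  # 3: two losses, then a delivery
```

Real stacks add sequence numbers and ack timeouts on top, but even this skeleton would have survived the spikes that the bare wire did not.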