Summary
There are a lot of ideas in software engineering that are considered truth until someone pinpoints the fallacies. Peter Deutsch first came up with his Eight Fallacies of Distributed Computing to debunk misconceptions about distributed computing. I happen to have a list of my own. Here are 10 Fallacies of Software Analysis and Design.
There are a lot of ideas in software engineering that are considered truth until someone pinpoints the fallacies. Peter Deutsch first came up with his Eight Fallacies of Distributed Computing to debunk misconceptions about distributed computing:
Essentially everyone, when they first build a distributed application, makes the following eight assumptions. All prove to be false in the long run and all cause big trouble and painful learning experiences.
- The network is reliable
- Latency is zero
- Bandwidth is infinite
- The network is secure
- Topology doesn't change
- There is one administrator
- Transport cost is zero
- The network is homogeneous
I happen to have a list of my own. These are the 10 Fallacies of Software Analysis and Design:
- You can trust everyone
- Universal agreement is possible
- A perfect model of the real world can be created
- Change can be avoided
- Time doesn't exist
- Everyone can respond immediately
- Concurrency can be abstracted away
- Side effects and non-linearity should be removed
- Systems can be proven to be correct
- Implementation details can be hidden
You can trust everyone. Security is obviously a big problem in the computing world today; you cannot simply blindly trust external systems. Even though many have ignored these concerns, everyone agrees that it is a problem. The part that's a fallacy is trust of a more subtle variety: that is, it's a fallacy to assume that if another system declares something as true, it actually is true. This is more general than conventional security, and it's a pervasive problem on the Internet. For example, there are firms that provide a service called Search Optimization, which artificially tries to improve a search ranking by planting false information all over the Internet to fool search engines. In short, you cannot trust that other parties will give you accurate information.
Universal agreement is possible. Achieving universal agreement is a fallacy because a common classification scheme is not only difficult to arrive at, but also changes over time. In addition, change happens on a per-participant basis, so either all participants stay in absolute lock-step or agreement can only exist in a relative sense. Finally, how can one achieve agreement when others may have conflicting agendas?
A perfect model of the real world can be created. Why do data modelers hold this hidden belief that a domain model can be created that will withstand the test of time, that is, one that can cover any new requirement that could possibly be imagined? The problem with this thinking is that it reinforces the idea that change can be avoided. Furthermore, it assumes that your modeling constructs are powerful enough to capture any new concept.
Change can be avoided. Why is it that so many systems are designed without a bit of consideration for change? Could it be that it's just too difficult to do, or are we simply lazy and leaving it for the next person to worry about? The root cause may be that conventional software development does not employ metrics to measure agility.
Time doesn't exist. That time doesn't exist is obviously a fallacy; unfortunately, mathematics tends to remove time from the equation, simply because the result is much easier to work with. This mathematical tradition carries over into the computing world. Functional programming languages are an example: they have extremely good analytic properties but are deficient pragmatically, because they are cumbersome when dealing with state. You simply can't ignore state, because state is about time, and anyone paying attention will notice that time indeed exists.
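A rough Java sketch of what I mean (the Account class and names here are made up for illustration): in a purely value-oriented style, every change of state has to be threaded through explicitly as a new value, so the passage of time reappears as a chain of versions rather than disappearing.

```java
// Hypothetical example: an account balance modeled without mutation.
public final class Account {
    private final long balanceCents;

    public Account(long balanceCents) {
        this.balanceCents = balanceCents;
    }

    // No mutation: a deposit produces a brand-new Account value.
    public Account deposit(long amountCents) {
        return new Account(balanceCents + amountCents);
    }

    public long balance() {
        return balanceCents;
    }

    public static void main(String[] args) {
        // State over time has to be captured as a sequence of values...
        Account monday = new Account(0);
        Account tuesday = monday.deposit(100_00);
        Account wednesday = tuesday.deposit(50_00);
        // ...which is analytically clean, but the chain of versions is
        // exactly the "time" the timeless model tried to abstract away.
        System.out.println(wednesday.balance()); // 15000
    }
}
```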
Everyone can respond immediately. Why are systems designed as if other systems are always available and can respond immediately? It's as if every system had unfailing communication and infinite resources and could process any request instantaneously. The fact is, communications do fail, resources are limited, and requests can sometimes involve other parties that can't act as quickly.
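A minimal sketch of designing with this in mind (the endpoint URL and class name below are made up): bound how long you are willing to wait for the other party, and have a plan for when no answer arrives.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Sketch: treat the remote party as something that may be slow or
// unavailable, rather than assuming an immediate response.
public class InventoryCheck {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(2))   // don't wait forever to connect
                .build();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://partner.example.com/inventory/42")) // made-up endpoint
                .timeout(Duration.ofSeconds(3))          // bound the wait for a reply
                .GET()
                .build();

        try {
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Partner replied: " + response.body());
        } catch (Exception e) {
            // The other party didn't respond in time, or not at all:
            // fall back instead of hanging the whole system on their schedule.
            System.out.println("No timely answer, using a cached value instead");
        }
    }
}
```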
Concurrency can be abstracted away. The thought here is that concurrency is a technical issue and can be handled at that level of discourse. Unfortunately, concurrency can have business consequences. Resources are not all abstract, and they cannot be reserved without business consequences. Coordination is not an implementation detail: not everyone can wait forever, and not everyone can react instantaneously.
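A small illustrative sketch (my own made-up example, not a prescription): reserving a shared resource such as the last seat on a flight. How long a customer may wait, and what happens when the seat can't be obtained, are business decisions, not technical ones.

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Sketch: coordination surfaces as business outcomes, not just locks.
public class SeatReservation {
    private final ReentrantLock lock = new ReentrantLock();
    private String reservedBy = null;

    public boolean reserve(String customer) throws InterruptedException {
        // "Not everyone can wait forever": bound the wait explicitly.
        if (!lock.tryLock(500, TimeUnit.MILLISECONDS)) {
            // Couldn't even look at the seat in time; the business must decide
            // what happens next: a waitlist, another flight, an apology...
            return false;
        }
        try {
            if (reservedBy != null) {
                return false;        // already taken: again a business outcome
            }
            reservedBy = customer;   // holding the seat has business consequences
            return true;
        } finally {
            lock.unlock();
        }
    }
}
```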
Side effects and non-linearity should be removed. This again is a reflection of our mathematical heritage. In mathematics, side effects are extremely difficult to handle, and non-linear systems are force-fitted into linear models to enable analysis. Unfortunately, the world isn't as orderly as we would like. Feedback is a natural occurrence, and you can't ignore it.
Systems can be proven to be correct. Again, this is an artifact of our mathematical roots. Wouldn't it be nice if we could prove that our systems are correct without actually running them through a battery of tests? Well, that's the idea here; unfortunately, there are a couple of problems. The first is, how can you prove that your proof is correct? The second is, if you can't take away time, side effects, and non-linearity, then the chances of arriving at an analytic solution drop pretty quickly.
Implementation details can be hidden. This fallacy has its roots in our habit of creating abstractions, that is, hiding details and concentrating only on the essentials. Unfortunately, abstractions leak; not all implementation aspects can truly be hidden. The real world is governed by time and space, and when an implementation is bound to them, no amount of abstraction can hide the fact that the real world exists.
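A small sketch of such a leak (my own example): two collections behind the very same List interface, where the real-world cost of an indexed scan leaks the implementation straight through the abstraction.

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

// The List abstraction hides the implementation, but time does not.
public class LeakyList {
    static long timeIndexedScan(List<Integer> list) {
        long start = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < list.size(); i++) {
            sum += list.get(i);  // O(1) per call for ArrayList, O(n) for LinkedList
        }
        System.out.println("(checksum " + sum + ")"); // keep the loop honest
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        List<Integer> array = new ArrayList<>();
        List<Integer> linked = new LinkedList<>();
        for (int i = 0; i < 50_000; i++) {
            array.add(i);
            linked.add(i);
        }
        // Same abstraction, same calling code -- wildly different behavior in time.
        System.out.println("ArrayList:  " + timeIndexedScan(array) + " ns");
        System.out.println("LinkedList: " + timeIndexedScan(linked) + " ns");
    }
}
```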
These fallacies have their roots in our mathematical traditions, which describe a world of perfect determinism. Unfortunately, this static mindset is prevalent in software engineering. It's important to realize that the world undergoes continuous change, with feedback and side effects that create non-linearities, with time that's relativistic, and with humans that are unpredictable. Finally, if it's not obvious to you, your software implementation actually interfaces with this reality.
Carlos E. Perez has been an object-oriented practitioner for over a decade. He holds a Bachelor's Degree in Physics and a Master's Degree in Computer Science from the University of Massachusetts. He has polished his craft while working in IBM's Internet Division and IBM's TJ Watson Research Center in Hawthorne, New York. He now works for a startup 1/100,000th the size of his former employer. He writes about topics covering emerging aspect and object oriented paradigms, loosely coupled architecture, open source projects and Java evangelism.