The Artima Developer Community

Weblogs Forum
Software Engineering and the Art of Design

27 replies on 2 pages. Most recent reply: Apr 9, 2009 5:20 PM by Mohammad Malkawi

Les Stroud

Posts: 7
Nickname: lstroud
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 23, 2003 1:36 PM
It seems to me that there is a fundamental point that you guys are overlooking in your definitions: Engineering is the application of well-understood scientific "models" to real-world problems. The important corollary is that engineering education is, by extension, the practice of teaching engineers these scientific models, along with methods for validating that the models are applied correctly.

I think the confusion, for those of us with engineering backgrounds, comes from the lack of mathematics involved. To a degree, I miss all of those neat tidy little formulas and rules of thumb. However, what we forget is that those formulas and models were developed over a very extended period of time around a fairly static subject. In fact, if you look at core software engineering you will find a fairly mature science for how to prove that software works. Unfortunately, these models rarely translate to the real world. The reason is that the science is changing so rapidly; there are new discoveries and new approaches formulated on a daily basis. I imagine that there was a time when this was the case in civil, mechanical, and electrical engineering. I can imagine that electrical engineering was largely an art in the 50s and 60s. So, once computer science becomes more stable, I suspect we will start to see an engineering discipline emerge. Until then, we have to do our best to develop ways to improve our software and keep up with the changes. Certainly, in light of this, it is very difficult to accomplish anything terribly meaningful in software engineering education other than teaching students how to learn effectively so that they can keep up, mostly because any model that we teach is likely to be obsolete before they graduate.
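As a toy illustration of the "fairly mature science for how to prove that software works," here is a sketch (mine, not from the thread) of a division routine with its precondition, loop invariant, and postcondition checked at runtime; real formal verification would discharge these obligations statically rather than with assertions:

```python
# Hoare-style correctness conditions for integer division, checked at runtime.
def divide(a, b):
    """Compute (q, r) such that a == q*b + r and 0 <= r < b."""
    assert a >= 0 and b > 0          # precondition
    q, r = 0, a
    while r >= b:
        assert a == q * b + r        # loop invariant: holds on every iteration
        q, r = q + 1, r - b
    assert a == q * b + r and 0 <= r < b  # postcondition
    return q, r

print(divide(17, 5))  # (3, 2)
```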

The other thing to remember is that we as a profession have done little to provide a better understanding of the benefits of software engineering to the corporate world (the money well). Most businesses would prefer to get the software done sooner and bank on catching bugs during testing, instead of taking the time to engineer a product and ensuring that the code was right initially. In their minds it is more cost effective (and it may be, in non-life-threatening cases). In fact, I would argue that some of the outsourcing of IT jobs from US companies to Indian consulting firms is due to this attitude. They see the quality of the software as unimportant, since they can merely catch the bugs in testing and have the overseas company fix the problem. In fact, they are likely to outsource to these companies at a fixed price, thus creating further economic incentive to treat software development as commodity labor. In general, I think that this is a cyclic trend that will come back around. However, I do think that the catalyst for this cycle is highly related to the quality of the software and software engineers that are hired onto these large IT projects. The point is that until we find a way to improve the state of software engineering and the quality of the software that we produce, this cycle will continue.

To sum up this rambling, I suppose the questions are:

- whether or not computer science will ever stabilize long enough to develop a model for it
- and whether it will really matter, since the only important stat to a business is ROI (a stat for which positive results can be achieved by pulling a number of non-computer-science-related business handles).

LES

Isaac Gouy

Posts: 527
Nickname: igouy
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 23, 2003 3:25 PM
outsourcing of IT jobs... see the quality of the software as unimportant

Perhaps not; Infosys (and Wipro) seem to consider quality a selling point: 1999 ISO9000 recertification, Level 4 for the Banking Business Unit, SEI CMM Level 5 assessed.


Edsger Dijkstra had some ideas on the rejection of formal program design in the US:
"- The ongoing process of becoming more and more an amathematical society is more an American speciality than anything else.
- The idea of a formal design discipline is often rejected on account of vague cultural/philosophical condemnations such as "stifling creativity"; this is more pronounced in the Anglo-Saxon world where a romantic vision of "the humanities" in fact idealizes technical incompetence. Another aspect of the same trait is the cult of iterative design.
- Industry suffers from the managerial dogma that for the sake of stability and continuity, the company should be independent of the competence of individual employees."

Computing Science: Achievements and Challenges. 1999.
http://www.cs.utexas.edu/users/EWD/ewd12xx/EWD1284.PDF

Alex Peake

Posts: 13
Nickname: alexpeake
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 23, 2003 10:14 PM
There is far more engineering possible in software than perhaps most people are knowledgeable enough to use (RTFM as we engineers say).

Let's look at a comparison of building a bridge vs. building a piece of software (a similar argument would hold for a cell phone, ...).

Bridge: There are myriad choices of shape, materials and substructure. (Some) known elements are the "usage spec", strengths of materials and performance of substructures under stress. There are great opportunities for creative design.

Software: There are myriad choices of User Interaction, data structures and algorithms. (Some) known elements are the "usage spec", strength of data structures and performance of algorithms under stress. There are great opportunities for creative design.


I can imagine the electrical engineering was largely an art in the 50s and 60s
You imagine wrong. The science was different, but nonetheless still science. For example, small signal equations related to triodes and pentodes, and later individual transistors. Printed circuit boards did not yet operate at frequencies that required us to consider them waveguides. And so on.

Most businesses would prefer to get the software done sooner and bank on catching bugs during testing instead of taking the time to engineer a product
Let's leap to your summation, which answers this with a resounding NO!


Your summation
whether or not computer science will ever stabilize long enough to develop a model for it
There are already innumerable models (state, functions, higher order functions, lambda calculus, concurrency, modules, types, relational algebra, scoping, threads, eager and lazy evaluation, garbage collection, partial evaluation, ...). Is our science evolving? Why yes! But so is bridge building, electronic circuits and just about any branch of engineering - and just as rapidly too.
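Two of the models listed above can be sketched in a few lines of Python as an illustration (the names `compose` and `naturals` are mine, not from the thread):

```python
from itertools import islice

# Higher-order function: compose takes two functions and returns a new one.
def compose(f, g):
    return lambda x: f(g(x))

inc = lambda n: n + 1
double = lambda n: n * 2
inc_then_double = compose(double, inc)
print(inc_then_double(3))  # 8, i.e. (3 + 1) * 2

# Lazy evaluation: a generator produces values only when they are demanded,
# so an "infinite" sequence is perfectly usable.
def naturals():
    n = 0
    while True:
        yield n
        n += 1

print(list(islice(naturals(), 5)))  # [0, 1, 2, 3, 4]
```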

and whether it will really matter since the only important stat to a business is ROI
Correctness - the program does the job
Maintainability - someone can evolve the program as needs change
Productivity - the design/solution was created in a reasonable time
This is what ROI comes from (maximize output/input), and with available engineering, you can produce software that meets these criteria. Where it fails is where those responsible do not use (often through lack of education) the set of engineering fundamentals that does exist.

Les Stroud

Posts: 7
Nickname: lstroud
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 12:08 PM
Let me reply to a couple of these points. However, I will state up front that I do not feel that I actually have an answer for the question: is software art or science?


******
In response to the software / bridge analogy: I made the very same argument on my blog (www.mindmeld.ws) several months ago. Unfortunately, my blog posts were lost in a server crash. However, Darren Hobbs makes a very good retort to my arguments in his article entitled <a href="http://www.darrenhobbs.com/archives/2003_05.html">Software, Buildings, Bridges & Testing</a>.

******
<i>I can imagine the electrical engineering was largely an art in the 50s and 60s</i>
<b>You imagine wrong. The science was different, but nonetheless still science. For example, small signal equations related to triodes and pentodes, and later individual transistors. Printed circuit boards did not yet operate at frequencies that required us to consider them waveguides. And so on.</b>
Here I disagree with you. When these components and models that you mention were new, the application of them was far more akin to trial and error than current-day EE is. There are endless examples of theories and models that turned out to be wrong. While there were more concrete mathematical models to consider, forming them into a functioning radio (think Tesla and the Eiffel tower) required an inventor, not an engineer.

*****
<i>Most businesses would prefer to get the software done sooner and bank on catching bugs during testing instead of taking the time to engineer a product</i>
<b>Let's leap to your summation, which answers this with a resounding NO!</b>
Well, I'm not sure what your argument is, but I have yet to be at a company that was willing to spend the money for formal verification. In fact, I think that this is only prevalent in the defense industry, and mainly because they have to spend the money or they lose it. Interestingly, I have spoken with some friends in the defense industry who are migrating away from the engineering approach toward a more IT-like approach in order to save cash.

*****
<i>whether or not computer science will ever stabilize long enough to develop a model for it</i>
<b>There are already innumerable models (state, functions, higher order functions, lambda calculus, concurrency, modules, types, relational algebra, scoping, threads, eager and lazy evaluation, garbage collection, partial evaluation, ...). Is our science evolving? Why yes! But so is bridge building, electronic circuits and just about any branch of engineering - and just as rapidly too.</b>
With building a bridge, the concepts have remained largely stable. Generally speaking, what we all learned in statics and dynamics class is not changing. A moment of inertia is a moment of inertia. The tensile strength required to support 10,000 pounds for a given material is constant. In bridge building the foundations of the science are extraordinarily stable (and have been since Newton); only the material properties have changed. These bridge builders / civil engineers have created a set of standards (plug points -- variables in equations) where they can plug in the variations and evaluate the results against a known standard. So, while there are things changing in bridge building, they have isolated those changes using equations.

In software that has not, to this point, occurred. In fact, fairly recent and popular things like virtual machines and AOP have forced a reevaluation of the equations themselves. For instance, Smalltalk and Java made OO popular. Well, that popularity made traditional metrics like function points and LOC useless. They forced a reevaluation of how you actually write code, with many people moving away from the big bang processes (invented in the 60s) to agile processes and test driven methodologies. This was caused by a foundation level change. The "physics" of software changed, and the model that we used to develop it changed (independent of the underlying electronics ... or at least Sun would like you to believe it was independent :)). My point is that every ten years or so, there is a tectonic shift in software. This shift could be caused by tools (like VB), it could be a change in languages used for development (like Java), or it could be in the underlying platform (like the Internet). In any case, the science has to change with it. While the fundamentals of the underlying theory may have achieved some constancy (like state machines), the perspective that they are approached from, the way that they are used, and the way that they are constructed fundamentally changes. So, while you can model those concepts (create a scientific model), it is difficult to translate that to engineering, since engineering requires not only the model to be constant, but the application of the model to be constant (constant in the sense that an equation is constant).

******

Sorry this was so long. Frankly, I would love to find an engineering approach to software. I miss it. However, in recent years, I have found, empirically, the craftsman approach to be more predictable and to produce better software than any of my previous attempts at quantification.





*******
One more response:
This gets a little off topic, but I feel I should respond anyway.

<i><b>"Perhaps not, InfoSys (and Wipro) seem to consider quality a selling point: 1999 ISO9000 Recertification, Level 4 for Banking Business Unit, SEI CMM Level 5 assessed"</b></i> CMM and ISO9000 do not ensure software quality. The funny thing about CMM and ISO is that while there are agencies that set the standard, and agencies that are paid to certify (by the company that wants to be certified), there are no enforcers that ensure that the certifications match the standards. With that said, the standards do mean something. In my opinion, though, they mean very little for software quality.

My little CMM rant:

Having actually implemented a CMM and ISO9000 program, I can tell you that in practice they have very little to do with software quality. In general, they are about organizational coherence and survivability. Essentially, they force an organization to create and document repeatable processes. This has many advantages that I won't go into here. However, it does not have the advantage of improving software quality. In fact, if the development process is bad, not only does it not help to properly engineer the software, but it makes it hard to change poor engineering practices that are ingrained in the documented, repeatable processes. Simply put, it makes repeatable bad processes as often as it makes repeatable good ones.

I have actually had two experiences with projects being outsourced overseas to "CMM5" shops. In both cases, the results were horrid. In the first case it was a matter of getting what you ask for. This particular CMM5 shop was very good at documentation. However, this meant that in order to get them to write the first line of code, we had to document the bejesus out of the requirement. In fact, I would say we spent more time and resources writing requirements than they did developing and testing. You would think that it might not be a bad thing to make your requirements explicit. In this case, though, it was a three-month project (regurgitating some data out of an existing db) which overran its deadlines by more than double. Additionally, the delivered software required a significant amount of rework to actually function. This was mostly due to misinterpretation of the language that the requirements were written in (English).

Now, I thought that this was an isolated incident, until my second experience. They, too, were level 5 (or claimed to be). Admittedly, they didn't keep up the level 5 charade for long. However, I have to say I have never seen a more disorganized cluster #**! in my life. They put 100 inexperienced college grads on a project that should have taken 10 guys. These people not only got whole requirements wrong, but they also made simple mistakes (like treating database booleans as integers and then wondering why their non-zero values were always 1 :)). To make matters worse, they would catch exceptions and drop them. I mean, it was truly very, very poor quality work.
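The boolean-as-integer mistake above can be sketched hypothetically; `store_boolean` below is an invented stand-in for what a boolean column does when handed a raw integer:

```python
# Hypothetical model of a database BOOLEAN column: anything non-zero
# collapses to 1 on storage, so a count "smuggled" through it is lost.
def store_boolean(value):
    return 1 if value else 0

# Trying to keep an integer count in a boolean column:
for count in (0, 1, 7, 42):
    print(count, "->", store_boolean(count))
# 0 stays 0, but every non-zero count comes back as 1 -- the data is gone.
```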

With that said, I am sure there are some very good companies out there that are CMM level 5. My experience is clearly from a small sampling. However, none of that speaks to the point that I was actually attempting to get across (and may not have done well). The fact that companies are willing to outsource overseas means that they no longer feel that they need to be in direct control. This either means that they think the quality produced outside their company will be equal or better to that which could be produced inside their company, or it means that they know there will be issues, so they might as well save money during implementation and then spend money to fix it, since they know that they will have to fix it later either way. Simply put, they are accustomed to a lack of quality, so they are willing to consider lower cost "economy" options and take on the accompanying risk.

This is similar to a person who buys a cheap steak knife. They know that they will have to replace it in a couple of years. They know that they could have bought the knife that lasts a lifetime and cuts tin cans. However, they would rather spend more money in the long run to save money in the short term. This is because they are used to things breaking and then replacing them. They accept that low standard of quality as the price they pay for buying a less expensive product. In a way, it's a financing plan.
****

Les Stroud

Posts: 7
Nickname: lstroud
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 12:09 PM
Also, apparently I have no idea how to use html within a message on this forum. :)

Alex Peake

Posts: 13
Nickname: alexpeake
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 1:27 PM
However Darren Hobbs makes a very good retort ... First, the cost of testing. Unit tests are essentially free
Writing the tests, people's hours running, cost of capital tied up while running, fixing, ...???


The other major difference is in the economic ... Construction (i.e. compilation) is basically free.
You mean development -- writing what you want the machine to do, writing code for the compiler to assemble, ... -- is not construction?


...Tesla and the Eiffel tower...
Your stated period was the 50s and 60s. I was studying physics and electronics at that time. I remember the science quite well. And let us not confuse foundational knowledge with creativity, for without the foundations, there is unlikely to be much creativity.


...Well, I'm not sure what your argument is...I have yet to be at a company that was willing to spend the money for formal verification...
Any company should focus on Return On Investment. Formal verification is merely a sub-goal, which may or may not be useful. What they (companies) care about is
Correctness - the program does the job
Maintainability - someone can evolve the program as needs change
Productivity - the design/solution was created in a reasonable time


With building a bridge, the concepts have remained largely stable.
Well let's look at recent changes. The Bay Bridge (San Francisco) needs to be rebuilt now that we better understand how earthquakes affect bridges and how to design bridges to better withstand earthquakes -- a recent understanding and a change in thinking about what you call known standards. Materials (better steels) continue to evolve. Better paints to protect from corrosion. None of these came from the equations, but from evolving science.


<i>In fact, fairly recent and popular things like virtual machines...</i>
There is nothing recent about virtual machines. To (re) quote Alan Kay "Most of current practice today was invented in the 60s".

...traditional metrics like function points and LOC useless...
Always were sub goals. See previous note on Return On Investment.

They forced a reevaluation of how you actual write code with many people moving away from the big bang processes (invented in the 60s) to agile processes and test driven methodologies. This was caused by a foundation level change.
Let's not confuse fad methodologies and languages with fundamentals of computer science.


...engineering requires not only the model to be constant, but the application of the model to be constant (constant in the sense that an equation is constant).
Now that is a really quaint idea. This would say that in the light of carbon fiber, microprocessors, liquid chromatography, optical fiber, fuel cells, carbon nanotubes, quantum memory, ... all engineering is no longer engineering?


Also, apparently I have no idea how to use html within a message on this forum. :)

When replying, look to the right at the section entitled "Formatting Your Post"

Isaac Gouy

Posts: 527
Nickname: igouy
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 2:43 PM
companies are willing to outsource overseas means that...
Simply put, they are accustomed to a lack of quality.

Exactly.

Les Stroud

Posts: 7
Nickname: lstroud
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 2:59 PM

Unit tests are essentially free
Writing the tests, people's hours running, cost of capital tied up while running, fixing, ...???

The other major difference is in the economic ... Construction (i.e. compilation) is basically free.
You mean development, writing what you want the machine to do, writing code for the compiler how to assemble, ... are not construction?

Remember, I was actually arguing with Darren on this. So, I agree. Frankly, I wish I still had my posts...it might be more clear if you could actually see both sides of the debate. :)


...Tesla and the Eiffel tower...
Your stated period was the 50s and 60s. I was studying physics and electronics at that time. I remember the science quite well. And let us not confuse foundational knowledge with creativity, for without the foundations, there is unlikely to be much creativity.

Agreed. Maybe my timing was a little off. However, I was thinking (and I was not doing electronics at the time) that the transition from vacuum tubes to transistors was not entirely well understood, and that quantum mechanics was fairly new. So I was concluding that while there were a few engineers who may have known how to use them, transistors were more guesswork to the majority of engineers at the time. Essentially, I suppose, if you wanted to build a transistor radio you found someone who knew how and got them to help you, instead of running down to the bookstore and reading up on transistor / radio / antenna theory. I could be totally wrong, since I was not around then. However, the point is the same: there has been a point of immaturity in every engineering discipline where good engineers were craftsmen.


...Well, I'm not sure what your argument is...I have yet to be at a company that was willing to spend the money for formal verification...
Any company should focus on Return On Investment. Formal verification is merely a sub-goal, which may or may not be useful. What they (companies) care about is
Correctness - the program does the job
Maintainability - someone can evolve the program as needs change
Productivity - the design/solution was created in a reasonable time

I see your point now. I agree with what you are saying. My point is simply that in IT (something I define as technology that is used in a business, but is not the primary line of business) businesses take a fundamentally lazy approach. They will almost always throw correctness and maintainability down to the second tier in exchange for higher productivity. I'm not saying that they want to cost themselves money. However, I believe they will take the least painful route from a short-term perspective.


With building a bridge, the concepts have remained largely stable.
Well let's look at recent changes. The Bay Bridge (San Francisco) needs to be rebuilt now that we better understand how earthquakes affect bridges and how to design bridges to better withstand earthquakes -- a recent understanding and a change in thinking about what you call known standards. Materials (better steels) continue to evolve. Better paints to protect from corrosion. None of these came from the equations, but from evolving science.

I'll give you that things have changed in bridge building, but the fundamental concepts of how you build it, the shape and structure of it, and the forces it must withstand are static. A new understanding of earthquake forces and new paints to improve corrosion resistance amount to maintenance, not a radical departure in civil engineering. Like I said, the materials are evolving, our understanding of the world around the bridge is evolving, and as a result the established standards evolve. The key here is evolve. There are no revolutions in bridge building (there may be in materials engineering), but civil engineering has reached the point where it is evolving and, as such, has a stability to the underlying models.

In fact, fairly recent and popular things like virtual machines...
There is nothing recent about virtual machines. To (re) quote Alan Kay "Most of current practice today was invented in the 60s".

I am well aware that VMs were done in the 60s. People often hear me say that we have done nothing fundamentally new in the last 20 years. However, I would claim that while those concepts have been around for a long time, the implementation and use of them are new. As with anything new, that implementation sparks new ideas as more people interact with it. So the fundamental science may have some age, but the results of that science, its application, and its revolutionary effects are still young.


...traditional metrics like function points and LOC useless...
Always were sub goals. See previous note on Return On Investment.

Agreed... but they are one of the "equations" that has fallen by the wayside.


They forced a reevaluation of how you actual write code with many people moving away from the big bang processes (invented in the 60s) to agile processes and test driven methodologies. This was caused by a foundation level change.
Let's not confuse fad methodologies and languages with fundamentals of computer science.

I agree with you about the fundamentals of computer science. I am talking about the application of those fundamentals to "real world" problems -- otherwise known as engineering. In concept, I agree with your insult toward agile methodologies. However, I feel that the rapid and widespread adoption of these methodologies represents a failure of software engineering. Essentially, everybody is abandoning engineering techniques because they are too slow, ineffective, and provide little return. People are actually punting to the position where they believe that testing everything for every possible success and failure scenario is good practice. It seems to me that if this is the case, then software engineering, as a discipline, needs to revisit itself and find ways to ensure quality and improve productivity at the same time.
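For readers unfamiliar with the test-driven style being debated, here is a minimal sketch; `parse_price` and its two tests are invented for illustration, not anyone's actual code:

```python
import unittest

def parse_price(text):
    """Parse a price string like '$12.50' into integer cents."""
    if not text.startswith("$"):
        raise ValueError("price must start with '$'")
    return round(float(text[1:]) * 100)

class ParsePriceTest(unittest.TestCase):
    # One success scenario and one failure scenario, written alongside
    # (or before) the code they exercise.
    def test_parses_dollars_and_cents(self):
        self.assertEqual(parse_price("$12.50"), 1250)

    def test_rejects_missing_dollar_sign(self):
        with self.assertRaises(ValueError):
            parse_price("12.50")

# Run the suite programmatically so the sketch is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParsePriceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```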

...engineering requires not only the model to be constant, but the application of the model to be constant (constant in the sense that an equation is constant).
Now that is a really quaint idea. This would say that in the light of carbon fiber, microprocessors, liquid chromatography, optical fiber, fuel cells, carbon nanotubes, quantum memory, ... all engineering is no longer engineering?

I would argue that a couple of the things you mention, primarily carbon nanotubes and quantum memory, are still well in the field of scientists and not engineers. Many of the other things you mention, I believe, have relatively constant models. In fact, they are constant enough that these materials are mass manufactured today. Sure, there is still an aspect of science in improving these materials. However, most of them are in the realm of engineers finding novel ways to apply them to new or old problems. They are past the discovery stage. So, I stand by my statement.

Matt Gerrans

Posts: 1152
Nickname: matt
Registered: Feb, 2002

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 4:42 PM
LES wrote:
> ...electrical engineering was largely an art in the 50s and 60s...


You mean 1850s and 1860s, right?

Les Stroud

Posts: 7
Nickname: lstroud
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 24, 2003 8:34 PM
Well I meant as EE applies to electronics. However, you are correct that if you take into account power and communications engineering, 1850 would be more accurate. :)

Naren Chawla

Posts: 1
Nickname: naren
Registered: Feb, 2002

Re: Software Engineering and the Art of Design Posted: Jul 25, 2003 4:21 AM
If I provide the executive summary for this article, it poses subtly related questions:

1. Is formal computer science education required to succeed as a software developer?

2. Is development of good-quality software systems more of a craft, or is it systematic application of well-defined processes -- in plain English, is it art or science?

Well, I think it all depends on the target software system being built.

For most business applications (Order Management, Inventory Control Systems, ...) the software developers need more vocational training in tools as opposed to theoretical understanding of computer science. They can live without taking "Operating Systems", "Data Structures", or "Compilers" coursework. Also, it is imperative to have well-defined processes in this situation, since the lack of them will lead to disparity between the business requirements and the actual functionality.

On the other hand, developers who write system software (frameworks, tools, OSes, protocols, etc.) will, generally speaking, benefit from learning formal hard science. There will always be exceptions like Jim Waldo, but most people will not have the maturity to think deeply about system design issues without doing "hard" time in graduate school. Also, since system software design relies heavily on the creativity of developers, it certainly helps not to put too much emphasis on processes.

Isaac Gouy

Posts: 527
Nickname: igouy
Registered: Jul, 2003

Re: Software Engineering and the Art of Design Posted: Jul 25, 2003 1:32 PM
poses subtly related questions
and in two ways: are they true of the current situation, and are they true in principle?

depends on the target software system
Rather than using the traditional systems vs applications distinction, wouldn't it be more useful to acknowledge that there are different roles in software development? Designing application level components requires different skills than using those components. (Even then, how would someone be able to choose between components without understanding algorithmic complexity and data structures?)
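The point about algorithmic complexity and component choice can be sketched concretely: the same membership question against a Python list (a linear scan) versus a set (a hash lookup), an illustrative example of my own:

```python
import timeit

# Same data, two component choices: list membership is O(n), set is O(1).
items = list(range(100_000))
as_list = items
as_set = set(items)

needle = 99_999  # worst case for the list: it must scan everything

t_list = timeit.timeit(lambda: needle in as_list, number=100)
t_set = timeit.timeit(lambda: needle in as_set, number=100)
print(f"list lookup: {t_list:.4f}s  set lookup: {t_set:.4f}s")
# The set wins by orders of magnitude, even though both "components"
# answer exactly the same question.
```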

Mohammad Malkawi

Posts: 1
Nickname: mmalkawi
Registered: Apr, 2009

Re: Software Engineering and the Art of Design Posted: Apr 9, 2009 5:20 PM
I fully agree with the views of Jim.
Most importantly, the art of SW becomes clearer when we talk about non-functional requirements such as performance, reliability, availability, scalability and maintainability. Addressing these requirements is more of an art than anything else.

Dr. Mohammad Malkawi



Copyright © 1996-2014 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use - Advertise with Us