
Extreme decomposition questionable?



In the May/June 1998 issue of IEEE Software, Les Hatton has an
article on p. 46 entitled "Does OO Sync with How We Think?".
The first page - in large font - contains the sentence "I assert that
any paradigm that is capable of decomposing a system into large numbers
of small components - as frequently occurs in both OO and conventional
systems - is fundamentally wrong."

This point is supported later by a U-shaped plot of defects
per KLOC (thousand lines of code) versus average component complexity.
The shape of the plot, with defect density rising at both very low and
very high complexity (no numbers on the scales), implies that one can
decompose too far.

Hatton states this shows that defects tend to accumulate in the
smallest and largest components. He also seems to speculate that 
extreme decomposition can result in enough components to overflow our
limited-capacity short-term memory - and notes that this may be the
point where programmers will admit that they need help from formal
methods.

----------

Perhaps it is not that defects accumulate within the small components
themselves, but that they arise from the difficulty of gluing the
large number of small components together - which is exactly where
CSP/occam methods shine. If such methods were used, the low-complexity
side of the curve might not rise.
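
To make the "gluing" point concrete, here is a minimal sketch - in Go
rather than occam, since Go's channels descend from CSP and its tools
are easier to come by today - of small single-purpose components
composed purely through channel plumbing. The pipeline and the
component names (generate, square, sum) are invented for illustration
and are not from Hatton's article or any occam code.

package main

import "fmt"

// Each "component" is a small, single-purpose process that
// communicates only over channels, in the CSP/occam spirit.
// (Hypothetical example - not from the article.)

// generate emits the integers 1..n, then closes its output.
func generate(n int, out chan<- int) {
	for i := 1; i <= n; i++ {
		out <- i
	}
	close(out)
}

// square reads values, squares them, and passes them on.
func square(in <-chan int, out chan<- int) {
	for v := range in {
		out <- v * v
	}
	close(out)
}

// sum folds the incoming values into a single total.
func sum(in <-chan int, result chan<- int) {
	total := 0
	for v := range in {
		total += v
	}
	result <- total
}

func main() {
	// The "glue" is just channel plumbing: every connection between
	// components is explicit, rather than hidden shared state.
	nums := make(chan int)
	squares := make(chan int)
	result := make(chan int)

	go generate(5, nums)
	go square(nums, squares)
	go sum(squares, result)

	fmt.Println(<-result) // prints 55 (1+4+9+16+25)
}

Each component stays trivially small, while the interconnect is the
explicit part of the program - exactly the part that CSP-style
reasoning (and occam's compile-time checks) is designed to get a
grip on.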

------

The article also contains considerable interesting quantitative
assessment of some of the claims of the OO community. A four-year
comparison on a large project (at Hatton's firm - Oakwood Computing
in Surrey) showed about 25% more faults/KLOC in the C++ OO version
than in the C version, and the faults took about three times as long
to fix in C++. Components involved in inheritance were about six
times more likely to have defects than those that were not - even
though only single inheritance was used (this was on a different
project from another group). Objects using polymorphism seem to be
more difficult to handle.

Other points: Hatton quotes a study by Leach showing that few users
felt they achieved more than 20% reusability - although Hatton's
project reached about 40%. (He also notes that 20 years ago he had
> 90% reusability in FORTRAN programs...not a surprise to some of
us old FORTRAN programmers...)

-- 
-----------------------------------------------------------------
Dyke Stiles
dyke@xxxxxxxxxxxxxxxxxx

Real-Time and Parallel Computing Group   http://multi.ece.usu.edu
Department of Electrical and Computer Engineering
Utah State University
Logan Utah 84322-4120
Voice: (435) 797-2806; FAX: (435)797-3054; Telex: 378-9426
==================================================================