
RE: Is OO a deliberate fraud?



Well, I don't think OO is a deliberate fraud... Instead, I prefer Peter
Welch's description that computer science has taken a wrong turn, driven
in part by the commercialisation of languages.

I recall my first encounter with Smalltalk, with its messages being sent
between objects. There was an enticing hint of system-managed
parallelism, but in reality 'message passing' in many OO languages is
just the calling of subroutines. I like to say that OO languages have an
impedance mismatch when it comes to processes. I still haven't found a
clean way to combine OO abstraction with process abstraction. You can
read examples of this confusion today:

http://blogs.msdn.com/rickbrew/archive/2005/01/17/354657.aspx

This demonstrates how a recent language (C#) falls in a hole because it
lacks in-built process constructs: the programmer is forced into system
territory to render a parallel idea in their code! It's a pity Rick
Brewster hasn't heard of CSP.
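
For anyone on the java-threads list who wants the distinction spelled
out, here is a minimal sketch in plain Java 5 (the names are mine, and
a SynchronousQueue stands in for a CSP rendezvous channel, since I'm
not assuming JCSP is to hand). The first call is OO 'message passing',
i.e. a subroutine call on the caller's stack; the second is two
genuinely independent flows of control that meet only at the channel:

    import java.util.concurrent.SynchronousQueue;

    public class ChannelSketch {
        // OO "message passing": really just a subroutine call.
        static class Logger {
            void log(String msg) { System.out.println("log: " + msg); }
        }

        public static void main(String[] args)
                throws InterruptedException {
            new Logger().log("hello");  // runs on the caller's stack

            // Process style: two threads of control that synchronise
            // only at the channel; a SynchronousQueue blocks until both
            // sides meet, like an unbuffered occam channel.
            final SynchronousQueue<String> channel =
                new SynchronousQueue<String>();

            Thread producer = new Thread(new Runnable() {
                public void run() {
                    try { channel.put("hello"); }
                    catch (InterruptedException e) { }
                }
            });
            Thread consumer = new Thread(new Runnable() {
                public void run() {
                    try { System.out.println("recv: " + channel.take()); }
                    catch (InterruptedException e) { }
                }
            });

            producer.start(); consumer.start();
            producer.join(); consumer.join();
        }
    }

Even this is clumsy compared with occam's PAR and channels, which is
rather the point: the parallel idea has to be rendered in thread
plumbing rather than in the language itself.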

Web development has again hidden the need for parallel thinking because
each session (i.e. process) is managed by the web server, with state
being held in a shared database. This is a specific class of application
and it's good that we have an abstraction that deals with this. But as a
result, the programmer doesn't have to think about parallelism as a
design tool.

Computer science ideas move in and out of favour, and I hope CSP will
return to focus as multicore CPUs become mainstream. You'll need
something sensible to program those 64-core chips and I don't see a
suitable language out there today.

I think we should be careful to separate the bloat discussion from the
design discussion. I don't accept that OO is the hidden weapon of a
Wintel conspiracy (didn't Smalltalk come out of Xerox PARC? Doesn't Tony
Hoare work for Microsoft Research?). While I love the elegance of
ultra-compact code like QNX, I don't yearn to return to the days of
static compilation and small binary footprints. I remember some horrible
frustrations with occam back in 1990, entirely to do with its static
nature.

Personally I love the fact that my commodity workstation allows me to
manipulate thousands of photographs and edit digital video with ease.
Bigger memory spaces and more GHz allow new things to be enjoyed.
Computer science today isn't all bad.

My $0.02 worth (probably worth less than that)

-----Original Message-----
From: owner-occam-com@xxxxxxxxxx [mailto:owner-occam-com@xxxxxxxxxx] On
Behalf Of Eric Verhulst
Sent: Thursday, 8 June 2006 4:32 AM
To: tjoccam@xxxxxxxxxxx; 'Ruth Ivimey-Cook'; 'Jim Sack'
Cc: 'P.H.Welch'; java-threads@xxxxxxxxxx; occam-com@xxxxxxxxxx;
j.kerridge@xxxxxxxxxxxx
Subject: RE: Is OO a deliberate fraud?


I couldn't agree more. And it is horrifying to know that this is the
stuff they teach in computer science courses all too often.

I prefer the term "Process Oriented Programming" over "Object Oriented
Programming", or more generally (as it applies to systems engineering)
"Entities and Interactions". Both have formalisms (CSP and Comm-Unity)
to back them up. OO's most advanced state of the practice is UML, a
monster of a graphical notation (but they call it a language).

Who remembers the QNX demo floppy (1.44 Mbytes, for the young amongst
the readers)? It was self-booting and message-passing based, yet it
showed a GUI and a browser, and you could actually connect to the net
and browse. I am proud to say that our latest OpenComRTOS provides a
minimal preemptive RTOS (with send and receive services) in just 850
bytes, even when written in C. The distributed version is less than
2 Kbytes.

I recently tested an open-source tool written in OO Java. Besides being
very slow, it very quickly complained about a lack of memory, even
though I had 1 Gbyte of RAM.

OO is the hidden Wintel conspiracy. It justifies why we need a 3 GHz
Pentium-XX, more memory and more disk space to keep the industry going.
Did you know that Intel primarily invests in start-ups that develop
resource hungry software?

How to create a self-sustaining economy is not a question for those who
look for working solutions. I don't know if this is the ultimate
satisfaction (as it has some drawbacks), but I currently work with
target CPUs that have 2 KB of RAM and 32 KB of flash. These things often
go into safety-critical automotive applications. Forget about OO in this
world. We try formal modeling whenever we can.

Cheers,

Eric


----------------------  FROM : --------------------------
   Eric.Verhulst@xxxxxxxxxxxxxxxxxxxxxx
   Skype me at: ericverhulstskype
   Mob. +32 477 608339
   Systematic Systems Development Methodologies
   Trustworthy Embedded Components
   http://www.OpenLicenseSociety.org
-----------------------------------------------------------
" "Concept" is a vague concept", L. Wittgenstein 


-----Original Message-----
From: owner-occam-com@xxxxxxxxxx [mailto:owner-occam-com@xxxxxxxxxx] On
Behalf Of tjoccam@xxxxxxxxxxx
Sent: Wednesday, June 07, 2006 8:10 PM
To: Ruth Ivimey-Cook; Jim Sack
Cc: 'P.H.Welch'; java-threads@xxxxxxxxxx; occam-com@xxxxxxxxxx;
j.kerridge@xxxxxxxxxxxx
Subject: Is OO a deliberate fraud?

Ruth, Jim, and all,

This is an indirect response to Ruth Ivimey-Cook's "Re: CPA 2006 - Call
for Papers", in which she laments a dismal lack of response. I think
it's the death throes of science being choked out by fake science, and I
think I've identified the culprit.

I'm posting this to both occam and OO-based supporters, to be fair and
to allow serious answers to my points. Merrill R. Chapman, in his tech
history ("In Search of Stupidity", Apress / Springer-Verlag, New York,
2003), quotes, as the 1992-1993-era OO definition at Borland, the
following excerpt from "What is Object-Oriented Software" by Terry
Montlick (www.softwaredesign.com), given here in full:

> An object is a 'black box' which receives and sends messages.
> A black box actually contains code (sequences of computer
> instructions) and data (information which the instructions operate
> on). Traditionally, code and data have been kept apart. For example,
> in the C language, units of code are called functions, while units of 
> data are called structures.
> Functions and structures are not formally connected in C.
> A C function can operate on more than one type of structure and more 
> than one function can operate on the same structure.
>
> Not so for object-oriented software! In o-o (object-oriented) 
> programming, code and data are merged into a single indivisible 
> thing---an object. This has some big advantages, as you'll see in a 
> moment. But first, here is why SDC developed the 'black box' metaphor 
> for an object. A primary rule of object-oriented programming is that 
> as the user of an object, you should never need to peek inside the 
> box!

ALL YOU OCCAM AND CSP FOLKS... DOES THIS SOUND FAMILIAR? It's stolen
from the definition of a process, and fits real OO (inheritance,
polymorphism, method calls) as well as a shoe fits an ear. Were they
really saying that in 1993? Because then the whole thing was fraud from
day one---describing one thing (the right thing) while playing a
completely different game with, yes, structures (objects) and functions
(methods).
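
To make the sleight of hand concrete, here is a hedged sketch in Java
(the names are mine, not Montlick's) of what a method call actually
desugars to: a function operating on a structure, executed
synchronously on the caller's own thread of control.

    // What the OO surface syntax says: "send the bump message to the
    // object".
    class Counter {
        private int n;
        void bump() { n++; }
    }

    // What it actually is: a function operating on a structure, run on
    // the caller's own stack. No process receives a message; nothing
    // here runs on its own.
    class CounterRecord { int n; }
    class CounterOps {
        static void bump(CounterRecord self) { self.n++; }
    }

A process, by contrast, has its own thread of control and synchronises
with the outside world only when a communication is offered; that is
the black box the quoted definition is actually describing.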

Processes offer the black box of freedom from side effects, while OO
offers the black box of ignorance. Inheritance, polymorphism, and
especially encapsulation say that you are supposed to treat the
pushbutton for uploading a file the same as the pushbutton for
shutting down a nuclear reactor. Don't look inside the box; pretend they
are the same.
And if two black boxes A and B both upload files, which "impenetrable"
black box contains the shared file system and network drivers that they
CALL? This is the emperor's new clothes!
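
To put the A-and-B question in code, here is a minimal Java sketch (all
names invented, not taken from any real system) of two "sealed" black
boxes that nevertheless share mutable state through a third object they
both call into:

    // A shared resource hiding behind both "black boxes".
    class FileSystem {
        private int openHandles;               // shared mutable state
        synchronized int open(String path) { return ++openHandles; }
    }

    // A and B each look encapsulated, but every call to upload()
    // reaches the same FileSystem underneath.
    class UploaderA {
        private final FileSystem fs;
        UploaderA(FileSystem fs) { this.fs = fs; }
        void upload(String path) { fs.open(path); }
    }
    class UploaderB {
        private final FileSystem fs;
        UploaderB(FileSystem fs) { this.fs = fs; }
        void upload(String path) { fs.open(path); }
    }

Encapsulation hides the sharing from the reader of A and B, but it does
not remove it: any interleaving of the two still meets inside
FileSystem. A process design would make that meeting point an explicit
channel rather than a buried call.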

Example: I just finished examining US Patent Application 20030182503 (go
to uspto.gov > eBusiness... Patents File Search View > Search Patents
and Published Applications). It intends to set up independent tasks,
but in [0070] it says "the group_write I/O task 352 calls (step
354) an IO task from the disk object 225a..." That implies multiple
stack nestings and out-of-black-box side effects. That's the only
example of metaphor run amok that I can deal with this week.

This admitted metaphor (image dissimilar to reality) generates
ever-larger languages and OSs, which is proof it is bad science. The fact
that it never works without being tinkered with is further proof. OO
just grabs whatever paradigm description sounds good and applies it to
itself. It's as if the Renaissance epicycle people neutralized Kepler by
saying epicycles were ellipses. It's as fraudulent as the old practice
of big companies announcing a product to kill a smaller competitor, and
then not bothering to produce.

We can't coexist with this monster; it's killing all good science. Have
you noticed life is like a Poul Anderson novel where science is dying
and all that remains is huge, slavish technology-by-rote?

We need to start again from scratch, with static non-virtual
assembly-language design, and build all serious design in a
higher-level language free of
OO and other infinite metaphor. Once we control the harness, they can
use OO if they want for what it is good for: manipulating graphic
widgets in a GUI.

Larry Dickson