Proving OO to be the most effective application development paradigm

1.      Background

It has long been intuitively clear that the object-oriented 3-tier approach is close to the complexity minimum, and therefore requires the least work to implement a given functionality.

This was in fact the most important reason to create OOP and OOA (see Grady Booch, Object-Oriented Design, 2nd edition, pp. 16-23, and Peter Coad, Object-Oriented Analysis, chapter 1.3 "Analysis Methods", pp. 18-36).

2.      Complexity model from Stuart Kauffman

Originally trained as a medical doctor, Stuart Kauffman has worked primarily as a theoretical biologist studying the origin of life and molecular organization. Some thirty-five years ago he developed the Kauffman models: random networks exhibiting a kind of self-organization that he terms "order for free." He borrowed the model from physics, where it is called the spin-glass model. The following is a quotation from Kauffman's book At Home in the Universe (ISBN 0-19-511130-3):

http://www.amazon.com/At-Home-Universe-Self-Organization-Complexity/dp/0195111303/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1223150067&sr=1-1

The NK model I will introduce is a kind of genetic version of a physicist’s spin-glass model. The virtues of the NK model are that it specifically shows how different features of the genotype will result in landscapes with different degrees of ruggedness, and it allows us to study a family of landscapes in a controlled way.

I will briefly sum up Kauffman's concepts, findings, and conclusions. The idea of the proof is that there is a fruitful area between chaos and order where development by random search is both possible and probable. This area is governed by how large a portion of the otherwise independent features is tightly or loosely coupled.

We then create a scale from bad to good and assign each feature a random value on that scale. This is the starting point. Next we run an evolution scenario by randomly introducing mutations into the features, and we interpret the total of these values as the fitness of the organism in its ecosystem.

Kauffman talks about the ruggedness of the fitness landscape. In the genetic interpretation of N/K, N is the total number of genes in an individual and K is the number of genes each gene is closely coupled to. Varying K from 0 to N reveals that when K is small relative to N the population has the best chance to evolve and adapt over time, while when K is close to N there is practically no chance to develop. He then extends this pattern to structures he calls "patches", and the results repeat themselves, but this time the closely attached subset is defined topologically.

Tuning ruggedness with K

(See Stuart Kauffman, At Home in the Universe, pp. 164-180.)
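
To make this concrete, here is a minimal sketch of an NK landscape in Python. The coupling scheme and the function names are my own illustration, not Kauffman's code; the point is only to show that raising K multiplies the number of local peaks, i.e. makes the landscape more rugged.

import itertools
import random

def make_nk_landscape(n, k, rng):
    """An NK fitness function: each of the n genes contributes a value
    that depends on its own allele and the alleles of k other genes."""
    # For each gene, pick the k other genes it is coupled to.
    neighbours = [rng.sample([j for j in range(n) if j != i], k)
                  for i in range(n)]
    tables = [{} for _ in range(n)]  # lazily filled random contributions

    def fitness(genome):
        total = 0.0
        for i in range(n):
            key = (genome[i],) + tuple(genome[j] for j in neighbours[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / n

    return fitness

def count_local_peaks(n, fitness):
    """Count genomes that no single mutation can improve."""
    peaks = 0
    for genome in itertools.product((0, 1), repeat=n):
        f = fitness(genome)
        if all(fitness(genome[:i] + (1 - genome[i],) + genome[i + 1:]) <= f
               for i in range(n)):
            peaks += 1
    return peaks

rng = random.Random(1)
for k in (0, 2, 5, 9):
    print(k, count_local_peaks(10, make_nk_landscape(10, k, rng)))
    # K = 0 gives a single smooth peak; K near N-1 gives many rugged peaks.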

The conclusion is that adaptation, which can also be called learning, is an interplay between structure and basic dynamics. The combinatorial exploration is bounded by the organisation, or structure, and this is what enables the dynamic search to continue. This applies to patching as well.

3.     Complexity in modelling behaviour

Now I will apply his interpretation and conclusions to the application development domain.

The two approaches differ from each other in the way behaviour is modelled. Functional decomposition sees functionality as nothing but time-sliced actions, whereas OO explains it as the collaborative functions of individual objects. The next diagram illustrates this comparison.


Figure 1. Functional decomposition and object collaboration

A first remark: I quite often come across the claim that OO analysis is more abstract than functional decomposition. This is false; in fact, for the following comparison the level of abstraction must be the same in both. This level of abstraction is reflected in the number of attributes in the system.

What separates these approaches is the amount of organization, or structure. In the object view, the first layer of structure is given when the domain is divided into classes. This distributes activity into the objects in a way that maximizes locality: the object itself is the only one that sees its attributes, so all manipulation of them is strictly isolated within the object. The next layer of functionality in this model is the collaboration of a given set of objects.
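
As a small illustration (the class and method names here are my own, not from any particular system), this locality can be written down directly:

class Account:
    """All manipulation of _balance is isolated inside this class."""
    def __init__(self, balance=0):
        self._balance = balance  # visible only to Account's own methods

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    def deposit(self, amount):
        self._balance += amount

def transfer(source, target, amount):
    """The next layer: behaviour as collaboration between objects.
    The collaboration never touches _balance directly."""
    source.withdraw(amount)
    target.deposit(amount)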

The application of Kauffman's N/K landscapes starts by mapping a gene to its OO counterpart: the attribute, the atomic base element. When we consider the difference between the two approaches in terms of N/K, we can say that N is the number of attributes in the system.

a.  The value of K in functional decomposition is N-1, since there is no structural restriction on using any attribute in the system.

b.  A simple estimate of the value of K in object collaboration is N/C, where C is the number of classes in the system; N/C is the average number of attributes in a class.

Now consider the two approaches to behavioural abstraction: functional decomposition and OO collaboration. In our interpretation the first division into closely related subsets of attributes comes from class encapsulation, so the number K is the average number of attributes within a class. If we have 100 classes in the domain layer, then K = N/100. Classes encapsulate much of the behaviour, and the coherence between the attributes inside a class is much higher than with attributes in other classes.

So if N = 800, then for functional decomposition K = 799, while for object collaboration it is near 8. This has a tremendous impact on the ruggedness of the landscape (see above).
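
As a back-of-the-envelope computation (the numbers are the ones assumed above):

N = 800  # attributes in the system
C = 100  # classes in the domain layer

k_functional = N - 1  # any function may touch any attribute
k_object = N / C      # under encapsulation: average attributes per class

print(k_functional)   # 799
print(k_object)       # 8.0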

The second layer of organizational management of complexity is what Kauffman calls patching. The OO equivalent is a consequence of the topology of the class structure.


Figure 2. A few (3) example patches shown in an airline class diagram

Objects form clusters with other objects from directly associated classes. These object clusters form the patches, and the coherence between objects inside a cluster is much higher than between objects outside it. As the illustration shows, we can regard these patches as overlapping or fuzzy, or we can simply partition the class diagram.
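
Here is a sketch of the simplest, overlapping reading of a patch (the airline class names are my own stand-ins for the ones in Figure 2):

# A class diagram reduced to its association graph.
associations = {
    "Flight":    ["Aircraft", "Route", "Booking"],
    "Aircraft":  ["Flight"],
    "Route":     ["Flight", "Airport"],
    "Airport":   ["Route"],
    "Booking":   ["Flight", "Passenger"],
    "Passenger": ["Booking"],
}

def patch_of(cls):
    """A patch: a class together with its directly associated classes.
    Patches built this way overlap, as in the illustration."""
    return {cls, *associations[cls]}

for cls in ("Flight", "Booking", "Airport"):
    print(cls, "->", sorted(patch_of(cls)))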

4.     Abstraction

Let's return to abstraction. First, a definition:

Abstraction is the process or result of generalization by reducing the information content of a concept or an observable phenomenon, typically in order to retain only information which is relevant for a particular purpose.

So abstraction equals simplification, and it is done to address a certain viewpoint. The downside is that we lose part of the information; this loss is the price of the clarification. Abstraction also has a direction: you can think of it as a vector in an n-dimensional reality. Once we have chosen the direction, abstraction is linear.

When we set out to model a company's operations, the model often includes several abstraction directions that share the same starting point. This starting point is normally a ball in the space spanned by the set of vectors, and the whole model is the union over these spanning vectors.

Along each vector, abstraction is linear. So each class sits at some level of abstraction and has semantic content. The level of abstraction is always inversely related to the semantic value. The following diagram describes this relationship.


Figure 4. Dependency between the level of abstraction of a class and the semantic value of a class

Here the most effective representation lies somewhere near the centre. The same result can be deduced from Kauffman's patching: from his book we know that evolutionary power is related to the size of the patch, which indicates that the complexity minimum is reached between the extremes. This is quite intuitive as well. If our classes are very abstract, then there are very few of them, and the size and complexity within each class are very high. At the other extreme we have very small classes with high semantic value, but the number of such classes exceeds the limit of a manageable number of elements.
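
This trade-off can be caricatured with a toy cost model. Both cost terms below are my own assumptions, chosen only to exhibit the shape of the argument: attributes inside a class interact roughly pairwise, and the classes themselves must also be coordinated pairwise.

N = 800  # attributes, as above

def total_complexity(c):
    within = c * (N / c) ** 2  # = N*N/c: huge classes are internally complex
    between = c ** 2           # many classes cost coordination
    return within + between

best = min(range(1, N + 1), key=total_complexity)
print(best)  # the optimum falls between the two extremes (here: 68)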

5.     Three loosely connected aspects, three logical layers of implementation

The third layer of patching is achieved by separating three aspects early on: application, domain, and persistence. In the logical 3-layer implementation, application, domain, and persistence are separated, and the coupling to the domain layer is asymmetric from both sides. This way the domain lives in complete isolation and knows nothing of the rest of the world.


Figure 5. 3-layer architecture and isolated domain layer
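
A minimal sketch of the asymmetric coupling, with each layer shown as a section of one file (the module and class names are illustrative, not prescriptive):

# --- domain layer: imports nothing from the other layers ---
class Account:
    def __init__(self, number, balance=0):
        self.number = number
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

# --- persistence layer: depends on the domain, never the reverse ---
class AccountRepository:
    def __init__(self):
        self._store = {}

    def save(self, account):
        self._store[account.number] = account.balance

    def load(self, number):
        return Account(number, self._store[number])

# --- application layer: orchestrates domain objects via the repository ---
def deposit_use_case(repo, number, amount):
    account = repo.load(number)
    account.deposit(amount)
    repo.save(account)

repo = AccountRepository()
repo.save(Account("FI-123", 100))
deposit_use_case(repo, "FI-123", 50)
print(repo.load("FI-123").balance)  # 150; Account never saw the other layers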

The static structure (classes and their associations) and the dynamics (methods, distribution of responsibilities, and collaboration) give us a narrow isthmus between chaos and strict order. This is the same phase-change area where, for instance, life on Earth has created such vast complexity.

It is extremely important to remember that this isthmus is narrow and that we don't really have much freedom to move in either direction.

6.     Final conclusions

In section 5 we found a narrow isthmus. Now this abstraction-semantics axis actually cuts our isthmus, and we are left with a tiny island in the sea of complexity, which gives us the minimum. At the same time we have exhausted all our resources. This means we have reached the highest peak in our ruggedness landscape of effectiveness. I read this to mean that our theoretical knowledge in the search for application development solutions on deterministic Turing machines has come to fulfilment; to its end.

On one hand, the fundamental restriction of current systems is of course the determinism of their behaviour; on the other hand, this has been and remains our aim in the first place. We all truly want a given transaction from one bank account to another to always result in the same end state.

When we extend our aim to nondeterministic Turing machines, we come across learning. This is equivalent to Kauffman's self-organizing sets. The difference between the previous and the current self translates to learning, and this is the source of the non-determinism. From our systems' perspective this means a fundamental change in our method structure: all methods have to become mutable, and their mutation is learning.

From the hardware point of view, the technical solution will be something like our brain: neural networks. Theoretical research has been going on for a while already, but it seems we don't yet know quite enough about the subject to be able to build hardware that performs the desired function.
