Domain model of retail business line

Here is my next business line model: retail. The variation between individual retail businesses is huge, which is why this model is perhaps somewhat more abstract than some of the others.

In retail the key to the business is the material flow. In contemporary retail the trend is close to JIT (Just In Time) production: the point is to avoid any unnecessary storing of goods. The implication of this is, of course, a highly organized supply flow, so emphasis on the delivery mechanism is a must. In production it is actually even more critical than in retail on average.

This demand is reflected in the domain model as well. The deepest essential event flow goes like this:

  1. Provide customers with a comprehensive catalog of product descriptions.
  2. The customer can then order based on this catalog. When the order is received, it is immediately replicated to the provider of the product.
  3. In the worst case this will trigger a request to production. The customer is then kept updated on the process.
  4. In many cases, and at least in the more classical scenario, the retail company keeps and manages a stock of goods, maintaining reorder limits and quantities.
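The event flow above can be sketched in code. This is a minimal illustration only; the class and method names (`Stock`, `Retailer`, `order`) are my own assumptions, not part of the model:

```python
# Illustrative sketch of the retail event flow; all names are assumptions.

class Stock:
    """Classical scenario: the retailer keeps a stock with a reorder limit."""
    def __init__(self, quantity, reorder_limit):
        self.quantity = quantity
        self.reorder_limit = reorder_limit

    def take(self, amount):
        self.quantity -= amount
        # True means the stock fell below its limit and a reorder is needed
        return self.quantity < self.reorder_limit

class Retailer:
    def __init__(self, catalog, stock):
        self.catalog = catalog   # product descriptions offered to customers
        self.stock = stock       # per-product stock

    def order(self, product, amount):
        if product not in self.catalog:
            raise ValueError("unknown product")
        # A purchase event between customer and retailer...
        needs_reorder = self.stock[product].take(amount)
        # ...may immediately trigger a second purchase event between
        # retailer and provider (in the worst case: a production request).
        return "reorder sent to provider" if needs_reorder else "served from stock"

retailer = Retailer({"book"}, {"book": Stock(quantity=5, reorder_limit=3)})
print(retailer.order("book", 3))  # → reorder sent to provider
```

The same `order` service covers both ends of the chain: the customer-facing purchase and the replication toward the provider.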

My model here includes all the aspects above. The Purchase event-object can thus be either an event between the producer and the retail company or between the retailer and the end customer.

I have, of course, also included the work-events and the worker role in the model.

Here is the object collaboration of a classical event chain, where the product is first ordered, received, and placed on a shelf in the stock (or in the shop), and finally purchased by a customer.


Confusions in Object Paradigm

One stubborn confusion, or fuzziness, around OO is the separation between the paradigm and OO programming languages. With this post I want to emphasize the fact that the paradigm is a separate issue from the programming languages. On the other hand, the fact is that the analytical paradigm was born through the OO programming languages.

This is why I created an “almost” axiomatic definition for the abstract OO modeling.

Axiomatic definition of Object Model

Object Axiom

We call an object anything unique, past or present, in the universe. An object is an abstraction of some corresponding reality (it is the reflection of something real in our mind). There are at least two big subcategories of this uniqueness: tangible things and events. Uniqueness includes, and thus implies, a lifespan. This means every object is born and dies. In the case of an event, the birth is the start time of the event and the death is the end time. Examples of events are, for instance, a lecture or a car accident. Every object must have a structure (at least one attribute) and at least one behavior (in languages, a method). So the universe consists of objects and only objects.
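As a small illustration (my own sketch, not part of the axiom itself), the lecture example can be written as an object with structure, behavior, and a lifespan:

```python
# Illustrative sketch: an event as an object with structure and behavior.

class Lecture:
    """An event object: unique, with a lifespan from birth (start) to death (end)."""
    def __init__(self, topic, start, end):   # birth of the event
        self.topic = topic                   # structure: at least one attribute
        self.start = start
        self.end = end                       # death of the event

    def duration(self):                      # behavior: at least one service
        return self.end - self.start

oo_lecture = Lecture("Object Axiom", start=10, end=12)
print(oo_lecture.duration())  # → 2
```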

Object Structure Axiom

An object is described by its attributes and services. Attributes give an object physical structure and dimensions. An attribute can be primitive, composed, or an object reference.

Attribute Axiom

Attributes can be atomic (a synonym for primitive) or composed. Composed attributes look like objects: they can have their own attributes and services, but they cannot have identity, because they are not objects. An example of a composed attribute is the Date type.
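Python's built-in `date` type happens to behave exactly like such a composed attribute, so it serves as a concrete illustration: it has structure and services, but two equal values are interchangeable, with no identity of their own:

```python
from datetime import date

# Two Date values built separately are equal: a composed attribute
# is compared by value, not by identity.
a = date(2010, 5, 1)
b = date(2010, 5, 1)

print(a == b)       # → True (same value, identity is irrelevant)
print(a.weekday())  # composed attributes can still offer services
```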

Service Axiom

Objects have behavior. The behavior consists of a set of services that the object can perform. Services are named. Services are actions that act upon the object's own attributes. A service is the only way to change the state of an object, and it can return an object or an attribute. Every object has a set of services, at least one.
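A minimal sketch of this axiom (the `Account` example is my own illustration): state lives behind the object boundary, and only a named service touches it:

```python
# Illustrative sketch: a service is the only way to change an object's state.

class Account:
    def __init__(self, balance):
        self._balance = balance      # state, touched only by services

    def deposit(self, amount):       # named service acting on own attributes
        self._balance += amount
        return self._balance         # a service may return an attribute

acc = Account(100)
print(acc.deposit(50))  # → 150
```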

Relation Axiom

Objects can have associations (references) to other singular objects or to object groups. This means that the corresponding objects see each other and thus can send messages to each other. Another type of relationship is the generalization-specialization relation. When two objects are in this kind of relationship, they can share common behavior (i.e. services).
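Both relation types can be sketched in a few lines (the `Shelf`/`Product`/`Book` names are my own illustration): an association to an object group, and a specialization sharing its generalization's behavior:

```python
# Illustrative sketch: association (a reference to an object group) and
# generalization-specialization (shared behavior).

class Product:                       # generalization: common behavior
    def __init__(self, name, price):
        self.name = name
        self.price = price

    def describe(self):
        return f"{self.name}: {self.price}"

class Book(Product):                 # specialization shares describe()
    pass

class Shelf:
    def __init__(self):
        self.products = []           # association to an object group

    def add(self, product):
        self.products.append(product)

    def listing(self):
        # the shelf "sees" its products and can send them messages
        return [p.describe() for p in self.products]

shelf = Shelf()
shelf.add(Book("Object Solutions", 40))
print(shelf.listing())  # → ['Object Solutions: 40']
```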

Collaboration Axiom

Objects can form collaborative sets, where the execution of a task can be distributed to the objects in the set. Collaborations are always materialized by sending and receiving messages. These messages initiate a service within the receiving object. A collaboration diagram describes such a collaboration.
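A sketch of such a collaborative set (the order-total example is my own illustration): the task of computing a total is distributed over the objects, and each message initiates a service in its receiver:

```python
# Illustrative sketch: a task distributed over a collaborating set of
# objects, materialized by messages (method calls).

class Item:
    def __init__(self, price):
        self.price = price

class OrderLine:
    def __init__(self, item, quantity):
        self.item = item
        self.quantity = quantity

    def total(self):                 # service initiated by Order's message
        return self.item.price * self.quantity

class Order:
    def __init__(self, lines):
        self.lines = lines

    def total(self):
        # the Order delegates: each message initiates a service
        # within the receiving OrderLine
        return sum(line.total() for line in self.lines)

order = Order([OrderLine(Item(10), 2), OrderLine(Item(5), 3)])
print(order.total())  # → 35
```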

Class Axiom

A class is a defined abstraction of a set of similar objects. The class is the biggest common denominator of the corresponding set. Objects that are classified into the same class share the same attribute types and the same services, but the real objects normally also have other attributes and services. In this way a class is a simplification (abstraction) of those real objects. A class diagram describes the static structure of the domain. It has symbols for classes, attribute types, and services. In the diagram, association and specialization-generalization are shown as lines.

Model Axiom

An object model gives a simplification of reality. It is a straightforward, easy-to-understand presentation of the corresponding reality and a chosen aspect. An object model consists of a class model and a few of the most significant collaboration models. These models can be illustrated by class and collaboration diagrams.

Paradigms, and the Object Paradigm in particular, with a quote from Grady Booch

The concept of a paradigm is a tricky one. To characterize it in some way, we can describe a paradigm as the outermost explanatory framework or mechanism for modeling a complex thing (see the Wikipedia article on paradigms).

A paradigm offers building blocks to create a model of a complex part of reality. These building blocks are in a sense fundamental to all models (you could call them explanations as well) that are based on that paradigm.

Paradigms for the same aspect are mutually exclusive. Let's take a concrete example. If you want to create a 3-dimensional model of houses, you can use as your building material, for example, Lego bricks, matchboxes, or cardboard. With each of these materials you can achieve good results, but the working techniques differ, and of course there are differences in the end results too. These three materials – the building blocks – represent the paradigms here. The first (and, I consider, most important) result of this consideration is that it is impossible to mix paradigms. Either you work with Legos or with matchboxes, but there is no way of mixing these materials. This is universally true of different paradigms explaining the same aspect of the same part of reality.

The Wikipedia article also mentions both paradigm shift and paradigm paralysis. It seems that the paradigms in use are changed from time to time to more accurate or more explanatory ones. For instance, people had, from the days of ancient Greece, a very commonly adopted paradigm of the solar system: they believed it to be geocentric. As time passed and people learned more about the universe, they realized that their understanding of the solar system was wrong and they had to make a paradigm shift to the heliocentric one. This shift proved to be a difficult one, at least for some people (at least around the Catholic Church), and it ended in a paradigm paralysis. The reasons behind this were various, and so were the explanations and excuses.

Now let's consider the object paradigm. The paradigm shift happened from the procedural paradigm. The key difference between the paradigms here is in how they describe the model's behavior. The older paradigm simply slices the behavior in time and ends up with a time flow of events, whereas the object paradigm describes the behavior as a collaboration of named objects and their separate responsibilities. These ways of describing behavior are mutually exclusive. If you explain the behavior as a flow of events, there is no point in doing it all over again with collaborating objects and their distribution of work. If this were done, one of the models would be completely redundant and unnecessary, or even confusing!

I think that most business application development today is done with OO languages, while most of the design is done in a procedural way. My question is: how far are we in the paradigm shift, and how bad is the paradigm paralysis with the procedural approach? The fact is that application development with the object paradigm is 2 to 10 times as effective as with the previous one! In the next post I will elaborate on this, starting with a quote – part of the epilogue of Grady Booch's Object Solutions book.

  • Software development is difficult
  • Object-oriented technology helps

Is object-oriented technology mature enough upon which to build industrial-strength systems? Absolutely. Does this technology scale? Indeed. Is it the sole technology worth considering? No way. Is there some better technology we should be using in the future? Possibly, but I am clueless as to what that might be.

It is dangerous to make predictions, especially in a discipline that changes so rapidly, but one thing I can say with confidence is that I have seen the future, and it is object-oriented.

This was written around 1995. At that time the OO boom was at its height, and we all thought that the transition from procedural would be gradual, smooth, and straightforward. It proved to be more difficult, and there were a lot of forces against it. There were (and some still are) the mighty stakeholders of the procedural legacy. The new concept proved to be more difficult for many people than we thought. Anyhow, everything here holds today! Of course we have learned something in 15 years and we can do things more efficiently now, but the basis is here, solid, and it holds! The software that I am working with can and should all be implemented with OO.

Business area models: Airline business

Here is my second business area abstract domain model. The target is the route airline business. I worked at Finnair for about ten years, so I know the business and the model very well.

The target scope has been narrowed a bit from a general airline: this model concentrates on a typical flag carrier's base operation, route flights. All these transportation core models contain a pattern, a pair of route design and implementation aspects: the Route and RouteFlight classes for the design, and the Flight class for the real event.

The largest business cycle – or pulse – is a period or season, like spring, summer, or autumn. These periods are planned and coordinated as one unit. My example is Finnair, and its route can be, for example, Helsinki – London. Once this is decided, the airline has to decide the weekly operating days and the plane type out of the fleet. After that, the airline has to negotiate the gate time slots. When this is done, the plan for a period's route operation is ready.

The model's class RouteFlight is a description object of a single repeating flight. The real flight events can then be generated from these objects for a given period.
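The description/event pair can be sketched like this. Only the `RouteFlight` and `Flight` class names come from the model; the generation logic and parameters are my own illustrative assumptions:

```python
from datetime import date, timedelta

# Illustrative sketch of the RouteFlight (description) / Flight (event) pattern.

class Flight:
    """The real event, generated from its description object."""
    def __init__(self, route_flight, day):
        self.route_flight = route_flight
        self.day = day

class RouteFlight:
    """Description object of a single repeating flight."""
    def __init__(self, route, weekday, plane_type):
        self.route = route            # e.g. "Helsinki - London"
        self.weekday = weekday        # 0 = Monday ... 6 = Sunday
        self.plane_type = plane_type

    def generate_flights(self, period_start, period_end):
        """Generate the real Flight events for a given period."""
        flights, day = [], period_start
        while day <= period_end:
            if day.weekday() == self.weekday:
                flights.append(Flight(self, day))
            day += timedelta(days=1)
        return flights

rf = RouteFlight("Helsinki - London", weekday=0, plane_type="A320")
season = rf.generate_flights(date(2010, 3, 1), date(2010, 3, 31))
print(len(season))  # → 5 (the Mondays of March 2010)
```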

The basic collaborations are very straightforward. Here is one of them: buying a ticket and reserving a seat:

These diagrams can be found in StarUML format at: RouteFlightCompany.uml

Just download it and open it with StarUML.

OO domain models for several industries

I am about to start publishing OO domain models of some industries. The first model that I will publish is within healthcare and could be named Health Centre. The model covers a group of doctors holding receptions. The model's dynamics start from a patient's need for a doctor's reception and end at a diagnosis, a treatment plan, and possible prescriptions for medical treatment.

When I publish this kind of model, it will contain the following parts:

  1. a class diagram of the domain
  2. one or more collaboration diagrams describing the most essential business logic
  3. a PowerPoint animation for each collaboration diagram, to ease the reading of them
  4. The abstraction level of each model is high; for instance, the attributes in the classes are only a small portion of those required when implementing any application that uses the domain.
  5. During a possible implementation, the abstract analysis model should be incrementally expanded to include all the necessary attributes. In most cases very few new classes are needed in this phase.

In this post I am not yet publishing the Health Care (or any other real) model. Here I only give the form of the publication and the reasoning behind it.

I will publish here a very small toy (or tutorial) model called BookShop to give you an example of everything that I described above.

The business that we look at here is a small bookshop in the centre of a city. The following is a deliberately highly abstract model, to emphasize the difference between the reality and the model. In the shop, the books on the shelves actually represent a publication rather than a copy, so our model doesn't have these copies as objects at all.

In this way one can think of the copies as representing a catalog, which makes it rather easy to shift from a real shop to an internet shop. The model's first event is called Walk. This is the event where a person enters the shop and starts to walk around. During the walk the person picks up interesting books. This set of books forms the candidates for purchase.
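The Walk event can be sketched as follows. Only the Walk event and the publication-not-copy idea come from the model; the method names are my own illustrative assumptions:

```python
# Illustrative sketch of the BookShop's Walk event collecting purchase candidates.

class Publication:
    """A book on the shelf represents a publication, not a copy."""
    def __init__(self, title):
        self.title = title

class Walk:
    """Event: a person enters the shop and starts to walk around."""
    def __init__(self, person):
        self.person = person
        self.candidates = []          # books picked up during the walk

    def pick_up(self, publication):
        self.candidates.append(publication)

    def purchase(self):
        # at the end of the walk the candidates become the purchase
        return [p.title for p in self.candidates]

walk = Walk("visitor")
walk.pick_up(Publication("Object Solutions"))
walk.pick_up(Publication("At Home in the Universe"))
print(walk.purchase())  # → ['Object Solutions', 'At Home in the Universe']
```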

I use Peter Coad's color coding, which makes the diagram a lot easier to read. The colors are: Entity classes green, Role classes yellow, Description classes blue, and Event classes pink.

Here is a collaboration diagram of what I described above.

If you need a model of a specific domain, send me a proposal and let's see whether I can do it for you.

Proving OO to be the most effective application development paradigm

1.      Background

It has been intuitively clear for a long time that the object-oriented 3-tier approach is close to the complexity minimum and therefore requires the least work to implement a given functionality.

This was actually the most important reason to create OOP and OOA (see: Grady Booch, Object-Oriented Design, 2nd edition, pp. 16-23, and Peter Coad, Object-Oriented Analysis, chapter 1.3 "Analysis Methods", pp. 18-36).

2.      Complexity model from Stuart Kauffman

Originally a medical doctor, Dr. Kauffman's primary work has been as a theoretical biologist studying the origin of life and molecular organization. Thirty-five years ago he developed the Kauffman models, which are random networks exhibiting a kind of self-organization that he terms "order for free." He borrowed his model from a physics model called the spin-glass model. The following is a quotation from Kauffman's book At Home in the Universe (ISBN 0-19-511130-3):

The NK model I will introduce is a kind of genetic version of a physicist’s spin-glass model. The virtues of the NK model are that it specifically shows how different features of the genotype will result in landscapes with different degrees of ruggedness, and it allows us to study a family of landscapes in a controlled way.

I will briefly sum up Kauffman's concepts, findings, and conclusions. The idea of the proof is that there is a fruitful area between chaos and order where development using random search is both possible and probable. This is ruled by how large a part of the independent features is tightly or loosely associated.

Then we create a scale from bad to good and assign each feature a random value. This is the starting point. Now we start our evolution scenario by randomly introducing mutations within the features. We then interpret the total change as the fitness of the organism in the ecosystem.

Kauffman talks about the ruggedness of the fitness landscape. The interpretation of N/K in genetics is that N is the total number of genes in an individual and K is the number of genes that are closely related. Varying K from 0 to N reveals that when K is relatively small compared to N, the group has the best possibilities to evolve and adapt over time, and when K is close to N there are practically no chances to develop. He then extends this pattern to things he calls "patches", and the results repeat themselves, but this time the closely attached subset is topologically defined.

Tuning ruggedness with K

(See Stuart Kauffman, At Home in the Universe, pages 164-180.)

The conclusion is that adaptation, which can also be called learning, is an interplay between the structure and the basic dynamics. The combinatorial exploration is bounded by the organization, or structure, which enables the dynamic search to continue. This applies to patching as well.

3.     Complexity in modelling behaviour

Now I am applying his interpretation and conclusions into the application development domain.

The two approaches differ from each other in the way the behavior is modeled. Here is an illustration of them both. Functional decomposition sees the functionality as just time-sliced actions, whereas OO explains it as collaborative functions of individual objects. The next diagram illustrates this comparison.


Figure 1. Functional decomposition and object collaboration

First remark: I pretty often come across the claim that OO analysis is more abstract than functional decomposition. This is false. Actually, for the following comparison the level of abstraction must be the same. This level of abstraction is reflected by the number of attributes in the system.

The thing that separates these approaches is the amount of organization, or structure. In the object view the first layer of structure is given when the domain is divided into classes. This leads to distributing activity into the objects, in a way that maximizes locality. First, this means that the object itself is the only one to see its attributes, so all manipulation of them is strictly isolated within the object. The next layer of functionality in this model is the collaborations of given sets of objects.

Applying Kauffman's N/K landscapes starts by mapping genes, in the OO case, to attributes – the atomic base elements. When we consider this difference in terms of N/K, we can say that N is the number of attributes in the system.

a.  The value of K in functional decomposition is N-1, since there is no structural restriction on using any attribute in the system.

b.  A simple estimate of the value of K in object collaboration is N/C, where C is the number of classes in the system. N/C is the average number of attributes in a class.

Now consider the two approaches to behavioral abstraction: functional decomposition and OO collaboration. In our interpretation the first division into closely related subsets of attributes comes from class encapsulation, so in this case K is the average number of attributes within a class. If we have 100 classes in the domain layer, then K = N/100. Classes encapsulate much of the behavior within each class, and the coherence between the attributes inside a class is much higher than with attributes in other classes.

So if N = 800, then for functional decomposition K = 799, and for object collaboration it is near 8. This has a tremendous impact on the ruggedness of the landscape (see above).
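The arithmetic behind these two estimates is trivial and can be checked directly (a sketch of the estimates above; the function names are my own):

```python
# K estimates for the two approaches, following the text's assumptions.

def k_functional(n):
    # no structural restriction: any attribute may be used anywhere,
    # so each attribute is coupled to all N-1 others
    return n - 1

def k_object(n, classes):
    # attributes are coupled mainly within their own class,
    # so K is roughly the average class size N/C
    return n / classes

n = 800
print(k_functional(n))   # → 799
print(k_object(n, 100))  # → 8.0
```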

The second layer of organizational management of complexity is what Kauffman calls patching. The equivalent in OO is a consequence of the class structure's topology.


Figure 2.  A few (3) example patches shown in the airline class diagram

Objects form clusters with other objects from directly associated classes. These object clusters form the patches, and the coherence is much higher between objects inside a cluster than outside it. As you can see from my illustration, we can think of these patches as overlapping, or fuzzy, or we can do a simple partitioning of the class diagram.

4.     Abstraction

Let's return to abstraction. First, a definition:

Abstraction is the process or result of generalization by reducing the information content of a concept or an observable phenomenon, typically in order to retain only information which is relevant for a particular purpose.

So abstraction equals simplification, and it is done to address a certain viewpoint. The downside is that we lose part of the information; this loss is the price of the clarification. On the one hand, abstraction has a direction – you can think of it as a vector in n-dimensional reality. Once we have chosen the direction, abstraction is linear.

When we model a company's operations, the model often includes several abstraction directions that share the same starting point. This starting point is normally a ball over the dimensions spanned by the set of vectors, and the whole model is the union over these spanning vectors.

Along each vector, abstraction is linear. So each class is at some level of abstraction and has semantic content. The level of abstraction is always inversely related to the semantic value. The following diagram describes this relationship.


Figure 4. Dependency between the level of abstraction of a class and the semantic value of a class.

Here the most effective representation lies somewhere near the center. The same result can be deduced from Kauffman's patching: from his book we know that the evolutionary power is related to the size of the patch. This indicates that the complexity minimum is achieved in the middle, between the extremes. This is quite obvious too. If our classes are very abstract, then there are very few classes, and the size and complexity within such a class is very high. At the other extreme we have very small classes with high semantic value, but the number of these classes exceeds the limit of a manageable number of elements.

5.     Three loosely connected aspects: three logical layers of implementation

The third layer of patching is achieved by separating, early on, the three aspects: application, domain, and persistence. In the logical 3-layer implementation, application, domain, and persistence are separated, and the coupling to the domain layer is asymmetric from both sides. This way the domain lives in complete isolation and doesn't know anything about the rest of the world.


Figure 5.             3-layer architecture and isolated domain layer
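One way to sketch this asymmetric coupling in code (the class names are my own illustration; in a real project these would be separate modules or packages): the application and persistence layers depend on the domain, never the reverse.

```python
# Illustrative sketch: the domain layer in complete isolation.

# --- domain layer: knows nothing about application or persistence ---
class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

# --- persistence layer: depends on the domain, not vice versa ---
class AccountStore:
    def __init__(self):
        self._rows = {}

    def save(self, key, account):
        self._rows[key] = account.balance   # persistence reads the domain

    def load(self, key):
        return Account(self._rows[key])     # and rebuilds domain objects

# --- application layer: orchestrates domain + persistence ---
store = AccountStore()
acc = Account()
acc.deposit(100)
store.save("a1", acc)
print(store.load("a1").balance)  # → 100
```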

The static structure (classes and their associations) and the dynamics (methods, distribution of responsibilities, and collaboration) give us a narrow isthmus between chaos and strict order. This is the same area of phase change where, for instance, life on earth has created such vast complexity.

It is extremely important to remember that this isthmus is narrow and that we don't really have much freedom to move in either direction.

6.     Final conclusions

In section 3 we found a narrow isthmus. Now the abstraction-semantics axis actually cuts our isthmus, and we are left with a tiny island in the sea of complexity, which gives us the minimum. At the same time we have exhausted all our resources. This means that we have reached the highest peak in our ruggedness landscape of effectiveness. I read this to mean that our theoretical knowledge in the search for solutions in application development on deterministic Turing machines has come to fulfillment; to its end.

On the one hand, the fundamental restriction of current systems is of course the determinism of behavior; on the other hand, this has been, and is, our aim in the first place. We all truly want a given transaction from one bank account to another to always result in the same end state.

When we extend our aim to nondeterministic Turing machines, we come across learning. This is equivalent to Kauffman's self-organizing sets. The difference between the previous self and the current self translates to learning, and this is the cause of the non-determinism. From our systems' perspective this means a fundamental change in our method structure: all methods have to become mutable, and their mutation is learning.

The technical solution, from the hardware point of view, will be something like our brain: neural nets. Theoretical research has been going on for a while already, but it seems that we don't yet know quite enough about the subject to be able to copy the hardware to perform the desired function.

Why are use cases evil?

The previous post about databases and OO, with a headline saying "…also evil", actually referred to this text – so their order is wrong. Anyhow, here it is now for you.

Well, this is not the whole truth. First, this concerns not only use cases but all procedural abstractions of behavior. These come under many names in addition to use cases: activity models, process models, rule models, etc. Another fact is that they are not intrinsically evil. Actually, they are just one way of abstracting behavior. What makes them evil in the context of modeling for application development is that they yield a ten times worse solution.

All the way from the early days of OOA (the late 80's and early 90's) it was crystal clear that the procedural and object descriptions of behavior are strikingly different. There is no right or wrong here; they are just the same thing from completely different viewpoints. They are orthogonal abstractions of the same thing.

The most significant thing that created the object paradigm was just this difference. This is a paradigm issue, and thus the difference has a strongly antagonistic nature. This means that you must choose: there is no way of mixing these together.

The big step in a deeper understanding of simulation modeling was that object collaboration yields a model a magnitude simpler than the corresponding procedural one. What is the cause of this difference? The root of the explanation comes from theoretical biologist Stuart Kauffman. It is a question of how to manage complexity. His finding was that the most complex organisms (and organisms are always both structure and behavior at the same time) are in the middle ground, or gray area, between chaos and rigid order. The conclusion was that you need both ingredients to accomplish this.

In the case of our models (and also our programs), "wild" or "free" behavior (in programming, a method or equivalent) represents the flexible, chaotic side, while the class definition, the class boundary with behavioral encapsulation, and object collaborations represent the order ingredient. This mixture is actually what collects the jackpot: this way we are able to model simulations 10 times more complex than in the realm of pure chaos. The fundamental aim of avoiding repetition of the same behavior in the model over and over again is achieved by the class definition – or concept – which ultimately limits an object's possible behaviors. In other words, the number of tricks that a single object can do is limited, and this of course is reflected in the implementation so that the number of methods (if designed right) is limited. The object model is thus the only way (at least that I know of) to achieve this.

We can (and actually have, several times) come across a subset of reality that we would like to reflect in an application but which is too complex for us. There is of course a limit to what we can do about this. The real limitation at the bottom of it all is our computer. Currently it is an implementation of a deterministic Turing machine, and we know the limitations of that from mathematics.

When we want to exceed these limitations and create learning applications, we must go to nondeterministic Turing machines. The first difficulty here is that we don't have the hardware yet. When we humans start to understand enough of the brain's structure and behavior, we can copy it to a machine. Another consequence of this is of course the fact that these systems cannot be programmed: they must be taught! The final, inevitable consequence is that these applications can make mistakes, like humans!