UML Modeler - Software Analysis and Design Blog

Everything about software analysis and design - done right or wrong


Nothing stands alone. Or, as they say, "everything is connected".

When it comes to building software for your business, this is certainly the case: today's business software is built on models, and among models, data models are especially important.

And when it comes to data, almost all software is built on a "relational database" model, in which the things that describe your business are connected (did we mention "relational"?).
It is not as technical as it sounds - and what is behind it is essential to delivering better software for your business.

So firstly, let's talk about something with a strange name: entity-relationship models. Note: it is not about having an affair, though. :)

An entity-relationship model describes exactly that: "relationships" - what is connected to what, and how. Again, unfortunately, we will be talking about pieces of data and their relationships (and not about men and women).


Using the Entity-Relationship model
  • can give you insight into your business,
  • can help you define the right requirements for your shiny new software, and
  • can even help you improve your processes by providing a common basis for both business and tech people to think with.
With an ER model you will see your business from another angle - together with the tech guys.


An ER model can be very detailed when you need it to be, but it can also stay extremely high level at the start.

An Entity-Relationship model has three levels, defined as:
  • Conceptual model
  • Logical model
  • Physical model
Each of these levels gives you different information while at the same time moving you along the road from idea to "real stuff" (= working software).

We will cover all three levels and how / where they can help you in your everyday business.


The Conceptual model is the simplest of the three. All it describes is the set of Entities and the relationships between these Entities.

  • Customer owns zero or more Credit Card(s)
  • Credit Card belongs to one Customer
  • Order includes one or more Product(s)
  • Product is part of zero or more Order(s)
  • Invoice covers one or more Order(s)
  • Order is covered by zero or more Invoice(s)
Or with a graphical representation:

In this chart the Customer owns multiple Credit Cards, not only one - hence the "crow's foot" (or, as we like to call it, the chicken leg). But the Customer may own zero Credit Cards as well - hence the small circle next to the crow's foot.
One Invoice can cover multiple Orders - and the other way around. The same goes for Orders and Products: an Order can include multiple Products, and one Product can belong to multiple Orders.

A side-note: As you can see, you definitely want to use a diagram. It is simpler to understand than the textual descriptions.
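(And if your tech guys happen to prefer code to diagrams: the same conceptual model can even be written down as plain data. Below is a minimal Python sketch of this idea - the encoding and the names are purely our own illustration, not any standard notation.)

    # An illustrative encoding of the conceptual model above.
    # Each relationship reads: (entity, verb, cardinality, related entity),
    # where "0..*" means "zero or more" and "1..*" means "one or more".
    CONCEPTUAL_MODEL = [
        ("Customer",    "owns",          "0..*", "Credit Card"),
        ("Credit Card", "belongs to",    "1",    "Customer"),
        ("Order",       "includes",      "1..*", "Product"),
        ("Product",     "is part of",    "0..*", "Order"),
        ("Invoice",     "covers",        "1..*", "Order"),
        ("Order",       "is covered by", "0..*", "Invoice"),
    ]

    # Even this much lets you sanity-check the model, e.g. list every
    # entity that is mentioned at least once:
    entities = {e for left, _, _, right in CONCEPTUAL_MODEL for e in (left, right)}
    print(sorted(entities))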

And why is the Conceptual model, with such limited detail, valuable?
It gives you the chance to discover exactly how your business works (or, in technical lingo, "what entities and relationships your business works with") without having to worry, at this stage, about anything else.

You can focus simply on the "what and who are involved, and how are they related?" question. And why is that important? It ensures / provides:
  • A common language
  • Confidence that nothing has been left out
  • A common model - the same base understanding between you and your "software guys"
  • Clarity about what is "in" and what is "out" of the software to be delivered.
Since the software (and the underlying database) will need to support the business, a good Conceptual model gives you a solid foundation when you go deeper into the details and start to work on your Logical model.

It also gives you the chance to go really wide and explore seemingly tenuous connections without wasting too much time.

Your Conceptual model can end up having more entities and relationships than you would ever put into a database or a piece of software, and that is quite all right.

You can always remove the ones that you definitely don't need, and nothing says that your Logical and Physical models need to include everything from the Conceptual model.

Look at the Conceptual model as the perfect place to explore widely without any commitment. 
Include every entity you think can have even minimal impact. 
You never know what insight you will gain after looking at all the new entities and relations. 
This is the model that gives you a high-level view of what influences your business.

To give you an example:
When looking at your business - say, a shoe store - you definitely have an Entity called Customer.
Quite independently of your business, the Customer is also related to another Entity called Traffic.
Now, as it happens, there are two venues near me: the Arsenal football stadium and Finsbury Park.
They both host events - an Entity with a very concrete connection to Traffic.
Usually a negative one.
Are these events connected to your business? Not directly - but they can still have an impact on it. Maybe getting hold of the event calendars for the upcoming year would be a good idea.

So here comes a business decision to be made: what to include and what to exclude?
Should you include these Entities and Relationships in the Logical or Physical models?
No, you should not - unless you really, really see the business use of them at this stage, of course.
Let us highlight that these are not technical decisions. They should be made entirely by you.


With the Logical model we take a big step towards implementation. Starting from the Conceptual model, we figure out how to build up the data model.

This is where we start to do a little more serious modelling. In technical lingo, what happens here is: defining attributes for the entities, defining primary and foreign keys, resolving many-to-many relationships, and normalising.

The Logical model can deliver value for you in several ways:
  • It is good for more in-depth investigation and for exploring issues.
  • It is detailed enough to run query tests to spot possible performance bottlenecks by simulating query paths.
  • It is technology independent: you can find the best data structure first and then pick the right database technology to implement it.
  • And yes, you can use the Logical model even if you are in the NoSQL database camp - the fundamental facts about your data don't depend on the technical implementation.
For a business-related problem, a Logical model allows you to create a filtered view of the entities, their attributes and relationships. This view will contain only what you need in your solution - and if it doesn't contain everything you need, then you know what to change. And you won't have to navigate the technical constraints of the Physical model, which should be irrelevant while you are still thinking about the business.

The Logical model does not (and need not) reflect the Conceptual model perfectly: entities can be removed, split or merged, or even converted into a simple attribute of another Entity.

For example, while Colour can be an Entity in the Conceptual model, we can change it to a simple attribute in the Logical model. Maybe it's not that important. Maybe we just want to store a simple, unrestricted name for a colour, like 'red' or 'blue' or 'asdfasdf' (wait, that was just my keyboard getting stuck - it is not a colour!).
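To make this entity-versus-attribute difference concrete, here is a tiny Python sketch (our own illustration, using dataclasses purely as notation) of the two options:

    from dataclasses import dataclass

    # Option A: Colour is its own Entity - values are managed in one place.
    @dataclass
    class Colour:
        name: str           # e.g. "red", validated against a controlled list

    @dataclass
    class ProductWithColourEntity:
        name: str
        colour: Colour      # a relationship to the Colour Entity

    # Option B: Colour demoted to a simple, unrestricted attribute.
    @dataclass
    class ProductWithColourAttribute:
        name: str
        colour: str         # free text: "red", "blue"... or even "asdfasdf"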

You can create Logical models based on only part of the Conceptual model. Especially if the goal is to explore some business problem and find a solution to it, there is no need to go and include everything.

Let's say you need to work out how to measure and track customer retention.

You will not need to include Product, Invoice, Order, Supplier, etc. in your Logical model.
You will not need to define the Primary Keys and all the Attributes, or normalise everything.

But you will need to define the attributes for the Entities involved in the customer's visits.
There is no need for a Phone Number attribute, but there is definitely a need for the Date of Last Visit.
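A minimal sketch of such a trimmed-down Logical model, again using Python dataclasses purely as notation (the exact attribute names are our assumptions for the example):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Customer:
        customer_id: int            # Primary Key
        name: str
        date_of_last_visit: date    # exactly what retention tracking needs
        # Note: no phone_number - it is irrelevant for this question.

    @dataclass
    class Visit:
        visit_id: int               # Primary Key
        customer_id: int            # Foreign Key -> Customer
        visit_date: date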

Then some additional technical steps follow: normalising, resolving many-to-many relationships, defining Primary Keys, and so on. We leave the detailed explanation of these more technical steps out, but the essence of all of them is to make sure that your model makes sense from a technology perspective as well (e.g. that it is "clean enough", with no redundancies).
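Just to give a taste of one of these steps: "resolving a many-to-many relationship" usually means inserting a junction Entity. Here is a hedged Python sketch using the Order-Product relationship from earlier (the OrderLine name and its attributes are our own assumptions):

    from dataclasses import dataclass

    # Order <-> Product is many-to-many, which a relational database cannot
    # store directly. The classic fix: a junction Entity in between, so that
    # only one-to-many relationships remain:
    #   Order 1..* OrderLine, and Product 1..* OrderLine

    @dataclass
    class Order:
        order_id: int       # Primary Key

    @dataclass
    class Product:
        product_id: int     # Primary Key
        name: str

    @dataclass
    class OrderLine:
        order_id: int       # Foreign Key -> Order
        product_id: int     # Foreign Key -> Product
        quantity: int       # extra attributes live naturally here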


The Physical model is the most implementation-specific level of the three. This one is mainly for the hard-core developer.

Taking a Logical model and turning it into a Physical model involves:
  • Converting Entities into database tables
  • Defining physical characteristics (data types, sizes) for the attributes
  • Denormalising where needed - it may sound crazy, but it is the opposite of what the tech people did in the previous step
  • Defining views - pre-prepared queries the tech people create for things you look at often (e.g. if you often ask how much 1+1 is, they will prepare the answer "2" for you, ready to be queried)
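To make the above tangible, here is a small sketch using Python's built-in sqlite3 module. The table, column and view names are our own assumptions, and a real Physical model would of course target your actual database technology:

    import sqlite3

    conn = sqlite3.connect(":memory:")   # a throwaway database for the sketch
    conn.executescript("""
        -- Entities converted into tables, with physical data types:
        CREATE TABLE customer (
            customer_id INTEGER PRIMARY KEY,
            name        TEXT NOT NULL
        );
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
            total       REAL NOT NULL
        );
        INSERT INTO customer VALUES (1, 'Alice');
        INSERT INTO orders VALUES (1, 1, 9.99), (2, 1, 20.01);

        -- A view: a pre-prepared query for a question that is asked often.
        CREATE VIEW customer_totals AS
            SELECT c.name, SUM(o.total) AS lifetime_value
            FROM customer c JOIN orders o ON o.customer_id = c.customer_id
            GROUP BY c.customer_id;
    """)
    print(conn.execute("SELECT * FROM customer_totals").fetchall())
    # -> [('Alice', 30.0)]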

The same Logical model can be represented by several Physical models based on:
  • the technology used - the result can be quite different depending on the relational database you use (MySQL 4, MySQL 5, Oracle 11g, etc.), or it may even target a non-relational database (they are in fashion nowadays - sounds brilliantly exciting, right!?)
  • different requirements - e.g. which queries need to provide answers fast
  • systems consuming the data - your legacy systems may impose some additional constraints

In our experience there are two big challenges with Physical models:

1. The technical constraints and limitations start to creep in and impose themselves on the Logical model and the business.

While it is true that technical limitations sometimes mean that users need to find a 'workaround', this should not impact the models at higher abstraction levels.

So in the majority of cases you should stick to your own ideas and not let the developers change them.

In other words: the Conceptual model influences the Logical model, and the Logical model influences the Physical model - but do not let the influence flow the other way.

Well - there are some exceptions, but do not give in easily.

2. Keeping your Physical model up to date.

You have to live with the fact that (almost) every change to your business will modify the Physical model.
There are tools that make this quite simple and easy at the model level.

The challenge is to modify the model in a way that does not cause errors in existing software, while making sure that the changes do reach all the other layers of your software (and, if you have multiple systems in place, the other software in your business).
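One common trick (shown below as an illustrative sqlite3 sketch in Python, not a universal recipe) is to make changes backwards compatible: for example, add new columns as optional ones with a default value, so existing queries keep working while the other layers catch up.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO customer VALUES (1, 'Alice')")

    # Backwards-compatible change: a new, optional column with a default.
    # Old queries such as SELECT name FROM customer keep working unchanged.
    conn.execute("ALTER TABLE customer ADD COLUMN loyalty_tier TEXT DEFAULT 'none'")

    print(conn.execute("SELECT name, loyalty_tier FROM customer").fetchall())
    # -> [('Alice', 'none')]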


I hope you are now convinced that the Entity-Relationship model is useful for much more than just describing your database tables and relationships.

It can be used throughout the whole project cycle and also outside of the scope of a software project.

It can give you new insights into your business, and it can help you and your team develop the right solution both in the software world and in the business.

History of Software Modeling and the Real Job of a Software Analyst and Designer / Developer

Feeling that you are not needed anymore? Think it over, please. Or just read this article.

In this article we briefly review the history of software modeling and the related software project management methods, and we draw a conclusion that will make you, the reader, feel better.

Advancements in the software world in recent decades have been so vast that the industry has drifted from scientific interest alone into a multi-billion dollar business.

Software developers are in high demand as more companies rely on technology and software. Software development has had to adapt in a fast-changing world where the requirements and demands of the consumer are overtaking business aspects and the needs of other stakeholders.


The methodologies software developers use have also changed considerably. From the traditional Waterfall method, through incremental and iterative development, to Agile (with Scrum, XP, etc.) and related techniques, methods have evolved a lot - and their development hasn't stopped there, as we're seeing the emergence of various hybrid methodologies combining the benefits of several previous methods.


Still, some tools have proven to be efficient even in this increasingly complex environment. Regardless of the decade, methodology applied, industry or details of the project, UML modelling is still a priceless tool for creating new software quickly and efficiently in an industry where time is limited, requirements are complex, and quality is respected above all.

History lesson – Looking back into the early days of software development

Early software development focused on building machine intelligence; this shifted around 1960, when LISP was created. And in 1962 the first graphical video game, 'Spacewar!', was developed - games like this led, through the arcade snake games of the 70's, to Nokia's famous 'Snake', and before long mobile phones were more than just a novelty.


In the 1980's, technology drifted from the hands of academics and the elite into the palms of the people. The first mobile telephones appeared, Nintendo created the Game Boy, and the traditional desktop application developer was suddenly involved in the embedded device market. Then a variety of proprietary platforms emerged - and developers are still actively creating applications for them. Most platforms have associated developer programs that keep the developer communities small and under contractual agreements about what they can and cannot do and say. These programs are often mandatory, and developers must pay for them. Some programmers responded by going it alone: now a programmer can make a fortune overnight by creating mobile apps. In a market where the consumer wants everything and more, a good app idea can fill someone's pockets.


Over the past 10 years the need for embedded software has increased in response to the explosion of product complexity in a developing market. Embedding the software into hardware systems makes for a more functional and cost-effective device. Embedded systems are used in navigation tools like the global positioning system (GPS), automated teller machines (ATMs), digital video cameras, mobile phones, aerospace and telecom applications, and are practically omnipresent in our everyday life. All of this peaked in the IoT - the Internet of Things - representing another significant wave of code production.


As more and more code and ever more complex systems were being developed in the early 90's, UML was created, building on previously existing modeling notations. UML is the most commonly used modeling language today, and it continues to adapt and develop.


The evolution of software management and modelling

UML did not appear out of nothing. It developed hand in hand with software management methods and with how developers were thinking at the time. The original incentive to come up with something to model with was to avoid failures. Who said that fear and greed are not primary motivators?


The early days

In the 1960's and 1970's there were attempts at creating modeling languages that would formally ensure that all requirements are met and that errors cannot occur.

There were many attempts, but unfortunately no completely usable, feasible and reliable such modeling language was ever identified. So even today we have to live with software errors and unsatisfied users.

These modeling languages did not use graphical notation but were mainly textual.

Flowcharts did already exist in this period, though, used mainly to model business process flows.


Structured methods / Waterfall / MITP and PRINCE2

The Waterfall method (first described back in 1970) dominated the 1980's. It favoured long-term planning, with sometimes complex management processes. As a reaction, developers tried to create alternatives that would lead to more efficient processes and less bureaucracy. As the stepping stone between these Waterfall practices and today's popular Agile methodology, incremental and iterative development (IID) emerged.


Managing the Implementation of the Total Project (MITP) - a serious name - was IBM's own methodology for managing projects. Then PRINCE2 was born, made for waterfall-style management.


PRINCE2 walked hand in hand with SSADM (Structured Systems Analysis and Design Method), born in 1981 - a structured method, as the name says. SSADM included the following elements ("stages"):

·       Stage 0 – Feasibility study

·       Stage 1 – Investigation of the current environment

·       Stage 2 – Business system options

·       Stage 3 – Requirements specification

·       Stage 4 – Technical system options

·       Stage 5 – Logical design

·       Stage 6 – Physical design

As one can see, it clearly followed the waterfall approach with respect to the software lifecycle.

It was invented in a world of procedural programming.


Its main factors and tools were:

·       Supported by many CASE tools (including Select) from the 1990's

·       Applies the traditional Systems Development Life Cycle and has clearly defined stages

·       Provides development staff with detailed guidelines, requiring, for example, the completion of pre-printed documents

·       Data driven; based on the assumption that systems have an underlying data structure that changes little over time - however, later versions of SSADM placed increasing emphasis on the user

·       Thorough quality assurance: deliverables at every stage are reviewed in structured walkthroughs and signed off by users - this can be beneficial in the contractor-user relationship

·       Separates the logical view of the system from the physical – this is a valid approach even today as the physical implementation can change

·       Provides 3 main views of the system, which can be cross-checked against one another:

·       the Data Flow Diagram - this shows how data flows between data storage elements (entities), e.g. which activities change which pieces of stored data

·       the Entity Relationship Diagram (invented in 1976 by Chen) - this shows the relationships between data elements, and it forms the basis for generating the relational data model, and

·       the Entity Life History - this shows individual data elements and how their statuses change over the life of the system (e.g. for an Employee: Hired, Contracted, Suspended, Terminated)

·       Besides these, it provides additional tools to analyse and model the business:

·       Flowcharts

·       Requirements modeling

·       Feasibility study (business, technical)



Structured methods for rapid application development

For smaller projects with heavy user involvement and the need to produce results fast, another method arose: RAD (rapid application development).

Its main factors and tools were:

·       The name speaks for itself – need for RAD driven by rapidly changing business needs

·       RAD can already be viewed as an example of the spiral model for systems development. The spiral model acknowledges the stages that form the SDLC, but builds iteration, prototyping and extensive user involvement into each stage.

·       Prototyping may cover the whole system or only part of it, to tease out particular problem areas

·       Early versions of prototype may be paper-based

·       Important users are identified and involved in workshops at early stages of development

·       It already had "features" of Agile: e.g. gathering all the project stakeholders together in one forum in order to reach mutually acceptable decisions; resolving issues so that the design can move forward; having the right people (users and those in authority) present; commitment to the meeting as a forum for critical decision making; and requiring that an executive sponsor and an experienced facilitator both be present.

·       Design developed using diagramming aids such as

·       Data Flow Diagrams and

·       Entity Relationship Diagrams

·       Shorter cycles (than in the case of Waterfall): it prioritises functionality that is strictly necessary, so that development is achievable in a 90-day life cycle.



Object orientation and even faster delivery

Another change in the history of programming was the shift from procedural to object-oriented modeling and languages, starting in the late 1970's.


With object orientation, objects became the units in which data is stored and functions ("procedures" in the older days) are "packaged" together with the data. As a result, data cannot be directly accessed or changed in these systems; instead, objects provide the means to change data through their own methods (= functions = procedures).

E.g. an invoice can have methods to tell you its customer, invoice date and total amount. Besides these, it can have a method to add a new invoice line.
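A minimal Python sketch of this idea (the class and method names are ours, chosen just for illustration):

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class InvoiceLine:
        description: str
        amount: float

    @dataclass
    class Invoice:
        customer: str
        invoice_date: date
        _lines: list = field(default_factory=list)   # data hidden behind methods

        def add_line(self, description: str, amount: float) -> None:
            # The only sanctioned way to change the invoice's data.
            self._lines.append(InvoiceLine(description, amount))

        def total_amount(self) -> float:
            return sum(line.amount for line in self._lines)

    inv = Invoice("Alice", date(2014, 1, 1))
    inv.add_line("Shoes", 59.90)
    print(inv.customer, inv.invoice_date, inv.total_amount())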


This approach, and the need to further involve the user, resulted in newer modeling methods.



UML / Agile (Scrum etc.)

Object orientation-based modeling became mature by the early 1990’s.

Grady Booch, Ivar Jacobson and James Rumbaugh each invented object-oriented notations / modeling methods to describe the systems to be developed.

They contributed to the software development world we know today with the following:

·       Rumbaugh, 1991 - OMT: provides notation for static and dynamic software modeling (classes and behaviours)

·       Jacobson, 1992 - OOSE: describes a visual notation for use cases (requirements modeling); it is a "use case driven approach" to software engineering

·       Booch, 1993 - the Booch Method: combines different models for the logical, physical, static and dynamic aspects of a system


Then, in 1994, Rumbaugh joined Booch at Rational, and the merging of their methodologies was announced at OOPSLA; in 1995 Jacobson joined Rational as well. In 1996 the "three amigos" renamed the Unified Method to the Unified Modeling Language.


In 1997 Rational proposed UML as a standard notation to the Object Management Group (OMG), and UML 1.1 was adopted. In 2003 the OMG published UML 1.5, then in 2004 UML 2.0, and in 2013 UML 2.5 (beta), the newest version of the standard.


The OMG is a group of companies that collaborate on the development of UML. The OMG is still alive and well, regularly publishing other standards and methods beyond UML.


And about project management methods that followed the shift to object orientation…

The need for even fewer project failures led to IID. IID's shorter development cycles and incremental building method (meaning that software was built piece by piece, always expanding the functionality of the previous version) allowed for more frequent and efficient rounds of client feedback and change management. As developers realized the benefits of shorter iterations, the basic disciplines of IID were distilled into the methodology that is today known as Agile. Numerous Agile software development methodologies (e.g. Scrum, XP) are used today.


Due to Agile’s short timeframes, time to market is shorter: as a consequence, the risk associated with development is lower, while costs can also be minimized. Implementing changes often and fast results in an end product that more closely reflects the needs of the client.


This concept of quick output led to the creation of Scrum, which focuses on project management, eliminating excess and encouraging productivity. Led by a Scrum Master, the team of developers adopts an empirical approach - accepting that the problem cannot be fully understood or defined up front, and focusing instead on maximizing the team's ability to respond to emerging challenges.


…and the software development method that took everything to the extreme is XP. The main goal of XP is to lower the cost of changes in software requirements. XP gives teams flexibility and is carried out at a sustainable pace (40 hours per week), which is revolutionary for software developers who are used to working well into the early hours and, well… usually not being paid overtime. This methodology considers not only the productivity of the process but also the well-being of the development team - something the tech world and stakeholders can sometimes forget.

Conclusion: the importance of development itself and developers – and the real objective of a modeling language

What all these methodologies have in common is a simple idea that was still pretty slow to take hold in the industry: when given the chance, good developers will use their creativity to come up with ingenious solutions. Modelling and development methodologies have evolved to give more freedom to those working on the software, so that the end product employs solutions that actually drive business objectives. If you look at the history of software development, it is easy to trace software failures back to overworked, underpaid developers trying to adhere to various bureaucratic methods that only slowed them down.


The real finding, if you look at history, is that software modeling and management methods did not evolve towards formal modeling languages becoming the norm - almost the contrary: graphical languages and modeling are the norm today.

How come?


The reason is simple, and we hinted at it above: by "given the chance" we mean that understanding what the real problem is, and solving that real problem, is the single most important job of every software analyst and designer.


This explains why graphical methods developed and spread: they all help the developer do the most important thing. Everything else is secondary.