UML Modeler - Software Analysis and Design Blog

Everything about software analysis and design - done right or wrong

History of Software Modeling and the Real Job of a Software Analyst and Designer / Developer

Feeling that you are not needed anymore? Think it over, please. Or just read this article.

In this article we briefly review the history of software modeling and the related software project management methods, and we draw a conclusion that will make you, the reader, feel better.

Advancements in the software world in recent decades have been so vast that the industry has drifted from scientific interest alone into a multi-billion-dollar business.

Software developers are in high demand as more companies rely on technology and software. Software development has had to adapt to a fast-changing world where the requirements and demands of consumers are overtaking business aspects and the needs of other stakeholders.


The methodologies software developers use have also changed considerably. From the traditional Waterfall method, through incremental and iterative development, to Agile (with Scrum, XP, etc.) and related techniques, methods have evolved a lot – and their development hasn’t stopped there, as we’re seeing the emergence of various hybrid methodologies combining the benefits of several previous methods.


Still, some tools have proven to be efficient even in this increasingly complex environment. Regardless of the decade, the methodology applied, the industry or the details of the project, UML modeling is still a priceless tool for creating new software quickly and efficiently in an industry where time is limited, requirements are complex, and quality is respected above all.

History lesson – Looking back into the early days of software development

Early software development focused on building machine intelligence; this began to shift around 1958, when LISP was created. In 1962 the first graphical video game, ‘Spacewar!’, was developed – a lineage that eventually led to Nokia’s famous ‘Snake’ in the late 1990’s, by which time mobile phones were more than just a novelty.


In the 1980’s, technology drifted from the hands of academics and the elite into the palms of the people. First came mobile telephones (and later WAP), Nintendo created the Game Boy, and the traditional desktop application developer was suddenly involved in the embedded device market. Then a variety of proprietary platforms emerged – and developers are still actively creating applications for them. Most platforms have associated developer programs that keep the developer communities small and under contractual agreements on what they can and cannot do and say. These programs are often mandatory, and developers must pay for them. Some programmers responded by going it alone: today a programmer can make a fortune overnight by creating mobile apps. In a market where the consumer wants everything and more, a good app idea can line someone’s pockets.


Over the past 10 years the need for embedded software has increased in response to the explosion of product complexity in a developing market. Embedding the software into hardware systems makes for a more functional and cost-effective device. Embedded systems are used in navigation tools like the global positioning system (GPS), automated teller machines (ATMs), digital video cameras, mobile phones, aerospace and telecom applications, and are practically omnipresent in our everyday life. All this culminated in the IoT – the Internet of Things – representing another significant wave of code production.


As more and more code and increasingly complex systems were developed, UML emerged in the early 1990’s, building on previously created modeling notations. It is the most commonly used modeling language today, and it continues to adapt and develop.


The evolution of software management and modelling

UML did not appear from nothing. It developed hand in hand with software management methods and with how developers were thinking at the time. The original incentive to model software at all was to avoid failures. Who said that fear and greed are not primary motivators?


The early days

In the 1960’s and 1970’s there were attempts at creating modeling languages that would formally ensure that all requirements are met and that errors cannot occur.

Despite the many attempts, no completely usable, feasible and reliable modeling language of this kind was ever identified, unfortunately. So even today we have to live with software errors and unsatisfied users.

These modeling languages were mainly textual rather than graphical.

Flowcharts already existed in this period, though, used mainly to model business process flows.


Structured methods / Waterfall / MITP and PRINCE2

The Waterfall method – formally described as early as 1970 – dominated the 1980’s. It favored long-term planning, with sometimes complex management processes. As a reaction, developers tried to create alternatives that would lead to more efficient processes and less bureaucracy. As the stepping stone between these Waterfall practices and today’s popular Agile methodology, incremental and iterative development (IID) emerged.


Managing the Implementation of the Total Project (MITP) – a serious name – was IBM’s own methodology for managing projects. Then PRINCE2 was born, made for waterfall-style management.


PRINCE2 walked hand in hand with SSADM (Structured Systems Analysis and Design Method), a structured method born in 1981. SSADM included the following elements (“stages”):

·       Stage 0 – Feasibility study

·       Stage 1 – Investigation of the current environment

·       Stage 2 – Business system options

·       Stage 3 – Requirements specification

·       Stage 4 – Technical system options

·       Stage 5 – Logical design

·       Stage 6 – Physical design

As one can see, it clearly followed the waterfall approach with respect to the software lifecycle.

It was invented in a world of procedural programming.


Its main factors and tools were:

·       Its use was supported by many CASE tools (including Select) from the 1990’s

·       Applies the traditional Systems Development Life Cycle and has clearly defined stages

·       Provides development staff with detailed guidelines, requiring, for example, the completion of pre-printed documents

·       Data driven; based on the assumption that systems have an underlying data structure that changes little over time – however, later versions of SSADM have placed increasing emphasis on the user

·       Thorough quality assurance: deliverables at every stage reviewed in structured walkthroughs and signed off by users – can be beneficial in contractor-user relationship

·       Separates the logical view of the system from the physical – this is a valid approach even today as the physical implementation can change

·       Provides 3 main views of the system; which can be cross-checked one against the other:

·       the Data Flow Diagram – this shows how data flows between processes, external entities and data stores, e.g. which activities change which piece of stored data

·       the Entity Relationship Diagram (invented in 1976 by Chen) – this shows the relationships between data elements and forms the basis for generating the relational data model

·       the Entity Life History – this shows individual entities and how their statuses change over the life of the system (e.g. for an Employee: Hiring, Contracted, Suspended, Terminated)

·       Besides these, it provides additional tools to analyse and model the business:

·       Flowcharts

·       Requirements modeling

·       Feasibility study (business, technical)
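The Entity Life History view described above can be sketched as a simple state machine. The sketch below is purely illustrative: the statuses come from the Employee example, but the transition table and function name are our own assumptions, not part of SSADM.

```python
# Illustrative Entity Life History for an Employee entity.
# The allowed-transition table below is an assumption for the sketch,
# not a rule taken from SSADM itself.
TRANSITIONS = {
    None: {"Hiring"},                         # an entity starts with Hiring
    "Hiring": {"Contracted"},
    "Contracted": {"Suspended", "Terminated"},
    "Suspended": {"Contracted", "Terminated"},
    "Terminated": set(),                      # terminal status
}

def advance(current, new):
    """Move the entity to a new status, enforcing the life-history rules."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new

state = None
for status in ["Hiring", "Contracted", "Suspended", "Terminated"]:
    state = advance(state, status)
print(state)  # Terminated
```

Cross-checking such a life history against the Entity Relationship Diagram and the Data Flow Diagram is exactly the kind of consistency check SSADM encouraged.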



Structured methods for rapid application development

For smaller projects with heavy user involvement and the need to produce results fast, another method arose: RAD, rapid application development.

Its main factors and tools were:

·       The name speaks for itself – need for RAD driven by rapidly changing business needs

·       RAD can already be viewed as an example of the spiral model for systems development.  The spiral model acknowledges the stages that form the SDLC but builds into each stage iteration, prototyping and extensive user involvement.

·       Prototyping may be of whole system or part of the system to tease out particular problem areas

·       Early versions of prototype may be paper-based

·       Important users are identified and involved in workshops at early stages of development

·       Already has “features” of Agile: e.g. gathering all the project stakeholders together in one forum in order to reach mutually acceptable decisions; issues are resolved so that the design can move forward; the right people (users and those in authority) are present; there is commitment to the meeting as a forum for critical decision making; and an executive sponsor and an experienced facilitator must both be present.

·       Design developed using diagramming aids such as

·       Data Flow Diagrams and

·       Entity Relationship Diagrams

·       Shorter cycles (than in the case of Waterfall): prioritises the functionality that is strictly necessary so that development is achievable in a 90-day life cycle.



Object orientation and even faster delivery

Another change in the history of programming was the shift from procedural to object-oriented modeling and languages starting in the late 1970’s.


In object orientation, objects became the units in which data is stored and functions (“procedures” in the older days) are packaged together with that data. Data cannot be directly accessed or changed in these systems; instead, objects provide the means to change data through their own methods (= functions = procedures).

E.g. an invoice can have methods to return its customer, invoice date and total amount. Besides, it can have a method to add a new invoice line.
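The invoice example above can be sketched in code. This is a minimal illustration of encapsulation; the class and method names are our own, not from any particular system.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class InvoiceLine:
    description: str
    amount: float

@dataclass
class Invoice:
    # Data and the functions operating on it are packaged together;
    # callers reach the line items only through the methods below.
    customer: str
    invoice_date: date
    _lines: list = field(default_factory=list)

    def add_line(self, description: str, amount: float) -> None:
        """Add a new invoice line instead of mutating internal data directly."""
        self._lines.append(InvoiceLine(description, amount))

    def total_amount(self) -> float:
        """Derived from the internal state; no external access to _lines needed."""
        return sum(line.amount for line in self._lines)

inv = Invoice("ACME Ltd.", date(2024, 1, 15))
inv.add_line("Consulting", 1000.0)
inv.add_line("Support", 250.0)
print(inv.customer, inv.total_amount())  # ACME Ltd. 1250.0
```

The point is that the total is never stored or edited from outside – it is always computed by the object itself, which is exactly the data-hiding idea described above.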


This approach, and the need to further involve the user, resulted in newer modeling methods.



UML / Agile (Scrum etc.)

Object orientation-based modeling became mature by the early 1990’s.

Grady Booch, Ivar Jacobson and James Rumbaugh each invented object-oriented notations / modeling methods to describe the systems to be developed.

They contributed to the software development world we know today with the following:

·       Rumbaugh, 1991: OMT: provides notation for static and dynamic software modeling (classes and behaviors)

·       Jacobson 1992: OOSE: Describes visual notation for use cases (requirements modeling); it is a “use case driven approach” to software engineering

·       Booch 1993: The Booch Method: Combines different models for logical, physical, static and dynamic aspects of a system


Then in 1994 Rumbaugh joined Booch at Rational, and they announced the merging of their methodologies at OOPSLA; in 1995 Jacobson joined Rational as well. In 1996 the “three amigos” renamed the Unified Method to the Unified Modeling Language.


In 1997 Rational proposed UML as a standard notation to the Object Management Group, and UML 1.1 was adopted. In 2003 the OMG published UML 1.5; in 2004 it published UML 2.0, and in 2013 UML 2.5 (beta), then the newest version of the standard.


The OMG is a consortium of companies that collaborate on the development of UML. It is still alive and well, regularly publishing several other standards beyond UML.


And a word about the project management methods that followed the shift to object orientation…

The need for ever fewer project failures led to IID. IID’s shorter development cycles and incremental building method (meaning that software was built piece by piece, always expanding the functionality of the previous version) allowed for more frequent and efficient rounds of client feedback and change management. As developers realized the benefits of shorter iterations, the basic disciplines of IID were distilled into the methodology that is today known as Agile. Numerous Agile software development methodologies (e.g. Scrum, XP) are used today.


Due to Agile’s short timeframes, time to market is shorter: as a consequence, the risk associated with development is lower, while costs can also be minimized. Implementing changes often and fast results in an end product that more closely reflects the needs of the client.


This concept of quick output led to the creation of Scrum, which focuses on project management eliminating excess and encouraging productivity. Led by a Scrum Master, the team of developers adopts an empirical approach – accepting that the problem cannot be fully understood or defined, focusing instead on maximizing the team's ability to respond in an agile manner to emerging challenges, increasing the overall efficiency of Scrum teams.


…and the software development method that took everything to the extremes is XP. The main goal of XP is to lower the cost of change in software requirements. XP provides teams with flexibility and is carried out at a sustainable pace (40 hours per week) which is revolutionary for software developers who are used to working well into the early hours and, well…usually not being paid overtime. This methodology not only considers the productivity of the process, it also considers its development team, which is something the tech world and stakeholders can sometimes forget.

Conclusion: the importance of development itself and developers – and the real objective of a modeling language

What all these methodologies have in common is a simple idea that was still pretty slow to take hold in the industry: when given the chance, good developers will use their creativity to come up with ingenious solutions. Modeling and development methodologies have evolved to provide more freedom to those working on the software, ensuring that the end product employs solutions that help drive business objectives. If you look at the history of software development, it is easy to trace software failures to overworked, underpaid developers trying to adhere to bureaucratic methods that slow them down.


The real finding is that, looking at history, software modeling and management methods did not evolve toward formal modeling languages becoming the norm – almost the contrary: graphical languages and modeling are the norm today.

How come?


The reason is simple, and we hinted at it above: by “given the chance” we mean that understanding the real problem, and solving it, is the most important job of every software analyst and designer.


This explains why graphical methods developed and spread: they all help the developer do the most important thing. Everything else is secondary.