Six Sigma encourages us to examine a situation and, through analysis, data, and good old-fashioned thought, discern a root cause.  The idea is this: go after the root cause, and the data should reflect a change for the better in how we manage the problem.

So let’s look at the problem of IT Systems Complexity.

I define IT Systems Complexity as the following:  A condition measured by the number of applications or independently configured software modules in an organization relative to the size of the organization, complexity of its competitive space, and the amount of market change typical in its industry.

Clearly the definition attempts to describe the results of a formula involving variables, including the organization’s size, how many markets it competes in, how complex those markets are, and the speed by which they change.

Simply understanding this definition gives you insight into the things that breed IT complexity.  For example, to respond to very rapid change in a competitive landscape, you need to allow your IT organizations substantial flexibility.  Large amounts of corporate oversight would reduce that agility, but some oversight is needed to create a simple environment.  So clearly, the company needs to find the right balance between IT oversight and agility, with the expectation that simplifying the portfolio will speed up the delivery of new programs and capabilities elsewhere in the IT process.

To put a little more meat into the definition:

In general, I’d classify IT Systems Complexity on a scale of one to ten, with one being low complexity and ten being a nearly unmanageable amount.
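To make that scale concrete, here is a minimal Python sketch of one way such a rating could be computed. The definition names the input variables but no formula, so the simple averaging, the clamping, and the "applications per point" constant below are all invented assumptions for illustration, not anything from the scorecard itself.

```python
# Hypothetical illustration only: the definition gives the variables
# (size, market complexity, rate of change, application count) but no
# formula, so this sketch invents one plausible way to combine them.

def org_complexity(size, market_complexity, rate_of_change):
    """Combine the organizational factors (each rated 1-10) into one score."""
    # Simple average; a real scorecard would weight these empirically.
    return (size + market_complexity + rate_of_change) / 3.0

def it_systems_complexity(app_count, size, market_complexity, rate_of_change):
    """Rate IT Systems Complexity on a 1-10 scale.

    The same application count scores lower in a large, fast-moving,
    multi-market company than in a small, stable, single-market one,
    because complexity is measured *relative* to the organization.
    """
    earned = org_complexity(size, market_complexity, rate_of_change)
    # Assume roughly 100 applications per point of 'earned' complexity
    # (an arbitrary constant chosen only to make the scale work out).
    raw = app_count / (earned * 100)
    return max(1.0, min(10.0, raw * 10))

# A large but stable manufacturer with a modest application portfolio
# lands low on the scale; a sprawling portfolio pushes the score up.
steel = it_systems_complexity(app_count=120, size=7,
                              market_complexity=5, rate_of_change=2)
```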

So, for example, a company that smelts steel may have substantial revenue and headcount, giving it a ‘large’ size.  It may compete in the market for manufacturing raw materials (one market) in sixty countries (where the manufacturers are), which would be medium market complexity.  On the other hand, it may choose to deal in only one country, say China, thus substantially reducing its market complexity.  Lastly, the amount of market change would relate to the number of innovative features introduced into the space by competitors every year.  In the steel-raw-materials industry, I’d imagine this ‘change factor’ to be fairly low.  The customers will need the metallurgical attributes of the raw materials to be relatively consistent to predict the quality of the manufactured parts, although the level of purity may be increasing if specific products are needed.  Hopefully this organization’s IT complexity doesn’t exceed about a 4, depending on the number of countries it sells into.

On the other end of the spectrum, a company that creates software may have a different profile altogether.  Microsoft used to be a much smaller company; with the revenues it has today, it is a pretty big place.

Microsoft competes in products for office worker productivity, vertical market spaces like accounting, ERP, and sales force automation, database products, operating systems, collaboration products, games, computer accessories (mice and keyboards), media devices, and a long list of others.  In each marketplace, there are unique competitive forces and unique products going up against the product in question.

Now add another layer of complexity: the fact that these software packages have to be customized to different languages and social norms and sold in different countries around the world.  This adds complexity in legal management, a distributed sales and service force, and corporate/tax implications that can drive a great deal of complexity into the IT landscape.

Add to that the fact that, in most of these marketplaces, the number of competitors is quite substantial and the speed at which they innovate is mind-boggling.  Keeping track of the innovations introduced against the list of products that compete with any product from Microsoft would be a never-ending stream of data.

In other words, it is not purely the application count that determines if IT Systems Complexity is a problem.  It is the ratio of system complexity to organizational complexity that determines if you have a problem, or just a cost of doing business.
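That ratio test can be sketched in a few lines. The 1.0 threshold and the two labels below are assumptions of mine; the point the article makes is only that the comparison, not the absolute application count, decides whether you have a problem.

```python
# Hypothetical sketch of the ratio test described above.  Both inputs are
# 1-10 ratings; the threshold of 1.0 is an invented cutoff for illustration.

def complexity_verdict(system_complexity, organizational_complexity):
    """Classify IT complexity relative to what the business justifies."""
    ratio = system_complexity / organizational_complexity
    if ratio <= 1.0:
        # The business complexity 'earns' this much IT complexity.
        return "cost of doing business"
    # More system complexity than the organization can justify.
    return "problem"

# A software giant (org 9, IT 8) is merely paying a cost of doing business,
# while a simpler company with a sprawling portfolio (org 4, IT 7) has a
# problem, despite having less absolute IT complexity.
```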

An organization like Microsoft ‘needs’ a complex IT structure to handle some of this business complexity.

On the other hand, it probably doesn’t need as much as it has.  We have literally thousands of unique software packages in production, the vast majority of which were written in-house.  We need to be able to answer the question “what number is the right number?” and, even more importantly, “in what areas should innovation and complexity have free rein in order to encourage competitiveness, and in what areas should control and oversight take top billing in order to reduce cost and process overhead?”

The best thing to do, from a complexity standpoint, is to create a scorecard of these measures.  (Inside MS, we have created a CIO scorecard using Microsoft Office Business Scorecard Manager.)  We need the numbers to measure success.  Otherwise, it would be difficult to know whether any of the changes we are making are having any effect at all on going from ‘where we are’ to ‘where we want to be.’

By Nick Malik

Former CIO and present Strategic Architect, Nick Malik is a Seattle-based business and technology advisor with over 30 years of professional experience in management, systems, and technology. He is the co-author, with Dr. Brian Cameron, of the influential paper "Perspectives on Enterprise Architecture," which effectively defined modern Enterprise Architecture practices, and he is a frequent speaker at public gatherings on Enterprise Architecture and related topics. He co-authored a book on visual storytelling with Martin Sykes and Mark West titled "Stories That Move Mountains".
