
How the Program Management Office Views Enterprise Architecture…

March 10th, 2010 | Enterprise Architecture

There’s an interesting analysis available through the PMO Executive Board on “Project Interdependencies.”  In the problem statement, the author correctly observes:

As the volume and size of projects grow, the old problem of managing project and program interdependencies is becoming more acute: three quarters of PMOs consider “managing interdependencies” to be one of their most critical program management challenges.

That first statement is fair enough.

Unfortunately, the next concept is somewhat troubling to me.  According to the author, the solution to this problem is one of process:

“… the ability to track and communicate project interdependencies boils down to two essential managerial disciplines: good risk management and clarity of communication with senior executives.”

In other words, we can track and communicate interdependencies better if we improve our ability to manage risk and communicate interdependencies.  Is that argument convincing to you?  It seems pretty circular to me.

Digging a little deeper

Let’s take a look at the “project interdependency problem” for a minute.  If projects had no interdependencies, there would not be a problem.  However, if one project must complete before another one can deliver value, then there is an interdependency. 

This is a problem because we may not know which project is the “critical path” project, so we may not do a good job of accounting for delays or cost overruns in the “source” project that can ripple across many other projects.  In that aspect, understanding the interdependencies is a CRITICAL problem for the Program Management Office. 

But let’s dig a little deeper.  What causes projects to be interdependent upon one another?  According to the same analysis, the factors to consider are:

  1. data,
  2. artifacts or deliverables,
  3. technical functionality,
  4. infrastructure capabilities,
  5. milestones, and
  6. end-user commonality.

Think through the list above.  How many are NOT architectural?  You could argue that end-user commonality is not architectural, if you exclude business architecture from the discussion.  But once you examine the kinds of models developed in an Enterprise Architecture context, 100% of the types of interdependencies charted across these projects are visible in, or predicted by, the architecture of the systems.
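Seen through that architectural lens, interdependencies can be computed rather than merely tracked.  Here is a minimal sketch (all project and element names are invented for illustration) in which each project is mapped to the architectural elements it touches, and interdependencies fall out of the overlap:

```python
# Hypothetical mapping of projects to the architectural elements they touch
# (data entities, services, infrastructure). All names are invented examples.
from itertools import combinations

project_touches = {
    "CRM Upgrade":     {"CustomerData", "AuthService", "SharedHosting"},
    "Billing Rewrite": {"CustomerData", "PaymentService"},
    "Portal Refresh":  {"AuthService", "SharedHosting"},
}

def interdependencies(touches):
    """Return pairs of projects that share at least one architectural element."""
    deps = {}
    for (p1, e1), (p2, e2) in combinations(touches.items(), 2):
        shared = e1 & e2
        if shared:
            deps[(p1, p2)] = shared
    return deps

for pair, shared in interdependencies(project_touches).items():
    print(pair, "->", sorted(shared))
```

The point of the sketch is that the PMO never has to ask projects to self-report these links: they are derivable from the architecture models.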

Improving the advice

I am not saying that a PMO shouldn’t be concerned with interdependency management.  After all, it is not the job of Enterprise Architecture to track, communicate, and drive the flow of the project portfolio. 

What I am saying is that the advice needs to be extended.  The second sentence above, and the entire presentation that follows, needs to recognize the clear dependency that the PMO takes, or should take, on the Enterprise Architecture team to identify, review, minimize, and prioritize systemic and project interdependencies. 

In other words, the second sentence above would become:

“… the ability to track and communicate project interdependencies boils down to three essential managerial disciplines: timely development and delivery of Enterprise Architecture, good risk management, and clarity of communication with senior executives.”

The cost of “SOA-fication”

February 15th, 2010 | Enterprise Architecture

No, Virginia, there is no SOA Santa Claus.  SOA is not free.

That said, if I’m changing a system to meet new needs, and I’m substantially refactoring a section of the code to deliver to those needs, SOA doesn’t have to be wildly expensive either.

The myth of “expensive SOA” is just that: a myth.  If you have an existing system, and you are refactoring it anyway, it may make sense to spend a bit more money for “SOA-fication” (the process of turning a “closed system” into a system based on Service Oriented Architecture principles; if you can’t get all the way to “based on SOA,” I’m happy with “plays well with SOA”).

Of course, that cost is not trivial to estimate.  This is where I’d like to ask the community for input.  What factors have you seen drive the additional cost of “SOA-fication” of an application?

Here’s what I’ve observed…

  • “SOA Security” (authentication and authorization) – to ensure that there are no additional risks to the enterprise through the availability of a previously inaccessible system feature.
  • Design and Design Review – to ensure that the services being developed are truly worth investing in.  That means that the services exhibit good behavior for enterprise SOA services wherever possible (non-chatty, independent, transactional, reliable, traceable, etc.).
  • Testing – to ensure that the interface is stable, meets the stated business needs, and can realize the stated design goals.
  • Monitoring – to ensure that the service can be discovered, tested, and validated and, depending on the service, that it correctly and reliably participates in Business Activity Monitoring and workflow scenarios.


Gentle Reader: Do these “dev cost drivers” coincide with your experience in turning legacy applications into SOA applications?  What “pitfalls” come to mind that are not listed?

(I’m interested in actual experiences.  Please share.  There are no wrong answers here… just good people helping one another.)

Business Architecture — includes process architecture?

February 9th, 2010 | Enterprise Architecture

Debasish Mishra, a colleague of mine, posted recently that we should let Business Architects out of the “Business Process Optimization Prison.”  (link)  He raises some good points.  Chief among them is whether process optimization should be the sole focus of the business architect.  Quote:

… a business architect who is narrowly focused on business process, organizations, and roles may have something interesting to tell the business sponsor but risks not being able to tell a complete story.

I completely agree, but I urge care when reading his post.  Debasish is not suggesting that business architects don’t care about business process.  He is simply saying that business architects have to care about many other things as well. 

Using your business architect for business process optimization is a bit like buying a Maserati race car to drive to the grocery store.  It can be done, but it’s not really practical, and certainly not cost effective. 

Sure, many business architects are quite good at process optimization.  By way of comparison, I’m good at coding, but I don’t code anymore. 

Similarly, a good business architect may have, in his or her past, been responsible for process modeling, measurement, and optimization.  But it’s no longer a focus of the work.  I’d go so far as to say that process optimization is not a necessary prerequisite skill for being an effective business architect.

So use your business architect to do the work of a business architect: to model the business, and the business drivers, and to develop feasible well-aligned roadmaps for achieving goals.  That is where the real value of business architecture lies.

When feasibility of integration is a measure of capability…

January 29th, 2010 | Enterprise Architecture

One of the jobs of an enterprise architect is to evaluate the business capabilities of an area of the business and determine whether those capabilities are strong, capable, or insufficient.  But what do you do when two areas of the business overlap?

In some cases, as in mergers, or even consolidation of functions, you will find that two areas of the business have a need for a set of capabilities, and both have developed them independently.  This could be a good thing.  Lots of opportunity for independent movement.  On the other hand, silos in the business can be a problem for customer care among other things.  Silos also add complexity, in systems, data, and business process. 

So let’s say we have two parts of a business that have both developed the ability to manage inventory for the retail channels.  Two sets of warehouses.  Two sets of inventory management systems.  Two sets of supply chain relationships.  The warehouses are in the same geographical area.  So you bring together the inventories into a single building, but you still have two sets of systems. 

In this case, the capabilities associated with the processes, people, technology, and data may not be a simple comparison.  You may have strengths in both systems, and weaknesses in both, and yet it may be quite difficult to merge them.

In this case, the evaluation of business capabilities becomes more nuanced.  Instead of saying “contoso has strong capabilities in vendor management and shipment tracking,” you have to look ahead to the next step recommendations… which processes and systems will be used to manage the inventory of both businesses?  In that context, the ability of the existing systems to integrate, support multiple businesses, and adapt to change, becomes much more important than before.

Now, we are no longer comparing apples to oranges.  Now it is Fuji apples vs. Braeburn apples.  More measures are needed.

Modeling User Experience Scenarios

November 10th, 2009 | Enterprise Architecture

I’m working on modeling some requirements for a document management system.  I’m a big fan of using models to represent every element, from goals and strategies through to business processes.  From there, I model use cases and requirements and on down to system components that fulfill those requirements.  Just call me a traceability hound.

I find that the effort to develop requirements in this way is, if anything, substantially less than the traditional “text first” method, and I can always output text documents for those times when people need their Word Document.

User Experience Scenarios are not, however, an easy fit with our current modeling languages.  We have models for business processes (which are excellent for illustrating activities in the scope of roles), interaction diagrams (which are excellent for illustrating component lifecycles, timing, and information flow), and activity diagrams (which are excellent for illustrating the activity of a single module).

I am not completely comfortable illustrating a user scenario in any of these three visual languages.   They really capture the wrong information, and fail to illustrate the things I care about.

For a user scenario, I want to know what persona is involved, what motivation that persona has for interacting with my business service, and what information they will have before the scenario starts.  I also want to know what constraints they will expect of the scenario: what is their level of commitment to my business service?  How frequently will they be interacting, and what is their understanding of the underlying information?  While I can annotate one of the UML models above with this information, I cannot truly model this information in the UML diagrams described.

I’m using the following diagramming “language” to describe the connection between people, motivations, scenarios and use cases.  The model attempts to show the people involved in creating and using architectural standards. 

Using a UML-derived diagramming language, I can document the scenario (labeled “<<Scenario>>” in the diagram) as a sub-activity diagram.  That activity takes on a sense of reusability, since it can be viewed from the perspective of any involved actor while keeping the actor itself outside the scenario.  This allows a single scenario to apply to different people (Object Oriented Encapsulation, as applied to workflow).

Scenario model

On a different view (a different diagram), I can illustrate each of those scenarios with links to the constraints that I mentioned above (frequency, information objects, level of commitment, etc.). 

The advantage of creating a model of this kind is obvious to me, because I’m a modeler.  But for many folks, the benefits may not be clear.  Let’s put it this way: if you change any one object or line, there is the potential for impacting other objects.  With a model of this kind, you can SEE those potential impacts before they happen.  By developing a model, the architect develops clarity of thought, and in doing so, reduces mistakes in the design.
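As a rough illustration of the structure described above, here is a sketch in code rather than in diagrams.  The field names and example values are my own assumptions, not any standard notation; the point is that personas link to scenarios through motivations, and each scenario carries its constraints as attributes:

```python
# Sketch of the scenario model: personas reference reusable scenarios.
# All names and field choices are illustrative assumptions, not a standard.
from dataclasses import dataclass, field

@dataclass
class Scenario:
    name: str
    frequency: str          # how often the persona engages
    commitment: str         # level of commitment to the business service
    information_held: list  # what the persona knows before the scenario starts

@dataclass
class Persona:
    name: str
    motivation: str
    scenarios: list = field(default_factory=list)

review = Scenario("Review architectural standard", "quarterly", "high",
                  ["draft standard", "review checklist"])
architect = Persona("Standards Reviewer", "keep standards current", [review])

# Because the actor sits outside the scenario, the same Scenario object can
# be attached to a second persona unchanged -- the reuse noted above.
author = Persona("Standards Author", "publish a usable standard", [review])
```

Changing one Scenario object here visibly affects every Persona that references it, which is exactly the impact-tracing benefit the model provides.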

I’m curious if others find this kind of model interesting.

Make IT appear as simple as possible, but not simpler

May 22nd, 2009 | Enterprise Architecture

Sometimes I hear a complaint from an IT architect who wants to have direct conversations with “the business” or “the customer” but, for some reason (usually bureaucratic), they cannot.  There is a team of analysts or project managers that they are supposed to talk to. 

The original objective of having “layers” of people is to make IT appear simple.  We all agree that business constituents can become confused if they are dealing with a long list of people from IT, each of whom has different concerns.  In the worst-case scenario, a business user reaches out to an analyst to report a software error, and the ‘problem’ gets handed off from person to person, adding time and confusing the user.

Many companies favor the “Single Point of Contact” approach. For each business unit, there is a single point of contact for all projects.  There may still be one more point of contact for “support” related concerns.  But that is all.  This hides some of the complexity from the business customer, but adds a layer between IT and the customer.


So where does the IT architect fit?  Does it depend on the type of architect?  Does the enterprise architect need to have direct conversations with business stakeholders? 

What about solution or platform architects?  Should they be talking “directly to the business?” 

It would seem obvious that business architects should, but how do business architects relate to business analysts?  There’s still the support side as well.  Does each application have its own support contact?  What happens when one application has the right data, and the next one over has the wrong data… who should the customer call?

So we have a problem.  That much is clear.  How to solve it?

I’d like to consider introducing a concept into the conversation: interdisciplinary teams.

The notion of an interdisciplinary team is not widely used in computing, but there are many examples in science.  Used widely in research, medicine, and public policy, interdisciplinary teams provide a way for specialists in many fields to work together to solve a problem.  Any problem can be addressed from many viewpoints, using an understanding that emerges from the unique combination of talent and responsibilities.

Many of the processes for collecting and describing requirements, including the well-understood “Joint Application Development” or JAD process, incorporate the same basic ideas, but do so in a less structured manner and only for a single “problem” (understanding requirements). 

What I’d like to see done is to use the concept more consistently.  For Information Technology, and for consulting, this is quite doable.  Instead of having a single person represent IT to the business, have a team of people.  They meet the business on a monthly basis, and the concerns of each of the people can be brought to the monthly meeting.  All of this is coordinated by a single “IT Engagement Manager” or “IT Relationship Owner.”  However, unlike the bureaucratic processes we see in some companies, there are a few rules that apply.

The interdisciplinary team will have predefined roles.  The list of roles cannot be reduced by either the IT engagement manager or business stakeholder.  One person can fill more than one role.  However, the IT engagement manager does not assign IT staff to those roles.  That is up to IT leadership to do.

This kind of interdisciplinary structure can allow a more direct flow of information, communication, and shared commitments than is possible with the “single point of contact” model.  At the same time, the business stakeholders don’t get randomized by multiple requests for the same information or by the miscommunication that comes from collecting different information at different times in different contexts to apply to the same problem.

In many ways, using a single point of contact is an attempt to make the relationship between IT and the business simple.  It is too simple… to the point of ineffectiveness.  I believe that a broader approach is often a better one.

The Process of Strategic Planning

May 7th, 2009 | Enterprise Architecture

I’m a process guy.  I’m not a big fan of the claims of process management software, but I’m a huge fan of developing and using process models to organize the activities of people, and then to drive the requirements for software from those models.

So when I was asked to look into the processes for Strategic Planning (one of the three business functions of enterprise architecture), I took a process oriented approach.  I looked at each of the different activities that have been suggested or planned or were being performed in strategic planning.  I created a chart of inputs and outputs and linked it up so that the output of one activity is the input to another.  Normal stuff. 
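The input/output chart described above can even be derived mechanically.  Here is a small sketch (the activity names are illustrative only, not the actual planning activities) that links activities wherever one activity’s output is another’s input:

```python
# Sketch of an input/output chart: each activity declares what it consumes
# and produces; the links between activities are derived, not hand-drawn.
# Activity and artifact names are invented for illustration.
activities = {
    "Model the business": {"in": [], "out": ["business model"]},
    "Identify goals":     {"in": ["business model"], "out": ["business goals"]},
    "Draft strategies":   {"in": ["business goals"], "out": ["strategies"]},
    "Build roadmap":      {"in": ["strategies", "business model"], "out": ["roadmap"]},
}

def links(acts):
    """Yield (upstream, downstream) pairs where an output feeds an input."""
    producers = {o: name for name, a in acts.items() for o in a["out"]}
    return sorted({(producers[i], name)
                   for name, a in acts.items()
                   for i in a["in"] if i in producers})

for up, down in links(activities):
    print(f"{up} -> {down}")
```

Even in this toy version, the reliance of every downstream activity on “Model the business” is visible immediately, which is the observation the rest of this post makes.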

Without going into the details of the result, I would like to share the huge reliance that all of the activities have on a basic understanding of what the business is and does.  It became obvious when we began to trace back the core element of strategic planning: the business goal.

Strategies chart a path to a business goal.  Both the OMG Business Motivation Model and my Enterprise Business Motivation Model say the same thing.  A strategy is a statement of “how” a business can reach a goal.  But beneath this statement, we have to recognize something even more fundamental: an enterprise can have more than one business.

Let’s say that your company is a clothing retailer.  If you are successful at all, you probably have one or more “lines” of clothing that are made for you, and sold only through your stores.  That is differentiation.  You also may have clothing that is made by a well-known manufacturer and is sold widely, including in the stores of your competitors.  (This applies to other products, not just clothing, but I’ll stick with the scenario for now.)

How many businesses are you in?  If we want to look at the strategies of your business, it matters.  This is because the strategies that may make the “custom made clothing” business successful may actually work against the “name brand retailing” business.  A strategy to promote the in-house brands may hurt relationships, or drain resources, or drive down prices of the name brand products. 

So before you can even write down a strategy, you have to write down the number of businesses that you are in, and then, when you write down the strategy, you write it in context.  You say “this is a strategy for making Business 2 meet its goals.”  You may not know whether that strategy hurts Business 1.  Depending on the organization of the business, and your responsibilities within it, you may not actually care. 

But the business architect must care.  The business architect, in this example, must be able to say “you are in two businesses, and they sometimes compete for resources, customers, and market-share.”

So as you look at your strategic planning processes, don’t forget to take the time to chart out how many businesses you are dealing with.  It is such an important part of the “first step” to strategic planning.

Collecting requirements from business processes

March 13th, 2009 | Enterprise Architecture

Ah, the sweet sounds of success. 

I got the opportunity, this week, to collect a list of requirements for a strategic planning tool (a COTS product) that we will license and use within Microsoft IT.  The fact that I got to collect requirements is not particularly cool.  What is cool is this: I made a point of using a business process model to collect them in an agile process that took less than a week to run, start to finish.

Every day, I try to find ways to make Enterprise Architecture relevant to stakeholders.  Every day, I look for reasons to trace success in our normal IT duties back to the efforts of the EA team.  In this case, it was the simple demonstration of how the requirements for a system were directly derived from the needs of a business process.

This method, for those not familiar with it, involves ensuring that a process model exists for the business in each of the areas where a particular capability needs to be developed or improved.  Now, the existence of a process model does NOT mean that the process model is detailed to the task level.  That is simply not necessary, especially when specifying requirements for a COTS tool. 

The advantage of this method is this: our requirements are far richer, far more complete, and developed far more quickly than if we had simply employed ‘traditional’ use case analysis to derive them.  We didn’t start with task-level technical functions (like "user logon" or "collect application metadata") and work up to describe the user interface steps needed to use them.  We started with the business objectives and methods (like "manage portfolio" and "quarterly funding cycle"), and quickly found the scenarios that we needed to detail.  This method is far faster, and far more resilient, than traditional use case analysis. 

The process model that I developed for this use is high level, but it covers all of the functions of IT management and Strategic Planning where we are expecting to use a tool.  The requirements are gathered, and the RFP has been released.  Once the product is selected, however, we will need a detailed model, so we will be spending some time, in the near future, to refine that high-level model and better understand the detailed processes that various constituencies will engage in.  (I’m being agile here: I don’t develop something before I need it, and I develop only what I need to perform the task at hand.)

That detailed process work begins next week. 

For now… success is demonstrating that deriving requirements from a process map works, and works well.

Oh, and we can manage it entirely in a repository-based modeling environment. 

EA works.  Q.E.D.

Update to root cause analysis for poor software requirements

February 28th, 2009 | Enterprise Architecture

Just a quick note.  After reading through some of the feedback on my recent post on “the root causes of poor software requirements,” I had to agree with some of the respondents: I had forgotten a branch in the analysis. 

So, for the past week, I’ve been stealing a few minutes here and there to review the analysis and add updates.  I just posted an updated analysis, adding a new top-level branch (at the end) and adding a sub-branch about half-way through.

Feel free to check it out. 

Now, to take this to the next step: anyone can figure out the steps and costs involved with improving your requirements-gathering practices.  Here are the steps.

a) Steal the list right from the post.

b) Load it into an Excel spreadsheet.  Add the following columns: Score, Mitigation Type, Mitigation Tactic.

c) For each item, add a score, from 0-5.  See the weighted ranking below.

Score 0 if this cause does not happen in your environment
Score 1 if this cause occurs but is easily overcome in your environment
Score 2 if this cause occurs but it doesn’t occur frequently or drive substantial quality problems
Score 3 if this cause occurs occasionally, and the business analyst or program manager finds the flaws
Score 4 if this cause occurs frequently, and/or the software developers or testers find the flaws
Score 5 if this cause occurs frequently and the flaws are found in deployment or production

d) For each item that ranked 3,4, or 5, write a statement for how you think you should mitigate it in the Mitigation Tactic column.  Feel free to brainstorm many possible ways, but then select the way that you believe is most feasible.

e) For each mitigation, add a value in the Mitigation Type column.  Use one of the types below or create your own list.

Use type ‘Training’ if your mitigation is focused on improving the skills of a participant
Use type ‘Automation’ if your mitigation is focused on moving information efficiently or using workflow tools
Use type ‘Data’ if your mitigation is focused on improving the information collected or making it available
Use type ‘Process’ if your mitigation is focused on analyzing workflow, allocating responsibility, or ensuring accountability
Use type ‘Staffing’ if your mitigation is focused on adding staff to a role or creating a role that doesn’t already exist

f) Sort your analysis by type and score.  For each group of mitigations by type, you have a workstream (or sub-project).  That effort can be estimated and an approximate cost established.  For the ‘Data’ and ‘Automation’ groups, you have a list of objectives that can be used to select software for requirements management.

g) Pull together a project proposal from your cost estimates and propose a project to your management that outlines the steps you intend to take, the problems you intend to solve, the amount of money that you believe it will cost, and (here is the hard part) the value of the quality gains that you expect to reap.
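For those who prefer code to spreadsheets, steps (b) through (f) can be sketched like this.  The causes, scores, and mitigations below are invented examples, not the actual list from the post:

```python
# Sketch of the scoring workflow: filter causes scoring 3-5, then group the
# mitigations by type into workstreams. All row contents are invented examples.
from itertools import groupby

rows = [
    # (cause, score 0-5, mitigation type, mitigation tactic)
    ("Analysts lack business knowledge", 4, "Training", "Domain onboarding course"),
    ("Requirements reviews start late",  3, "Process",  "Time-boxed review checkpoints"),
    ("Same concepts re-explained",       5, "Data",     "Shared business glossary"),
    ("No requirements tooling",          2, "Automation", "n/a"),  # below threshold
]

# Step (d): only causes scoring 3, 4, or 5 need a mitigation tactic.
actionable = [r for r in rows if r[1] >= 3]

# Step (f): sort by type then descending score, and group into workstreams.
actionable.sort(key=lambda r: (r[2], -r[1]))
workstreams = {t: [r[0] for r in grp]
               for t, grp in groupby(actionable, key=lambda r: r[2])}
print(workstreams)
```

Each key in `workstreams` is a candidate sub-project whose effort can then be estimated, per step (g).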

Good Luck!

Understanding the root causes of poor software requirements

February 18th, 2009 | Enterprise Architecture

If I had a nickel for every time I’ve heard a developer complain about poor quality requirements, I’d… well… have a lot of nickels.  Let’s look, for a moment, at the root causes of poor requirements and business rules.  While I consider this to be a business problem, and not a technology problem per se, I’ll use root cause analysis (discussed in a prior post) to organize the analysis.

The problem, as perceived by developers, can be quoted this way:

Problem Statement: The requirements for software, as delivered by typical business analysts, are not sufficiently clear, insightful, or well understood to develop software systems that meet the needs of business users. 

There is an assumption here.  One that I won’t challenge (today) but one that probably should be qualified:

Assumption: Improving the quality of software requirements will have a net positive effect on the quality, reliability, applicability, usability, and value of custom software as perceived by the business users who use it.

Interesting assumption, that one.  Perhaps fodder for a later blog post.

So, how do we help software developers deliver timely valuable code without driving the business users crazy with technical details that they’d rather not bother with or understand?  Let’s look at the causes of our problem. 

What follows is a (long) root cause analysis of "poor" software requirements.  This can be used to understand the problem, which is the first step to addressing it.

How to read this analysis:

Each entry is seen as a "cause" of the problem.  Indented below each are the answers to the question "why" or "what is the cause of this cause?"  As you read this list, insert the word "because" at the start of every sentence.

The text may look a little odd because of the indenting.  An analysis like this looks good on an Ishikawa diagram, but that’s pretty tough to do on a blog.  The ultimate goal is to ask why five times, or to get to five levels deep.  The numbering does not go from 1. to 1.1 to 1.1.1 as the levels increase.  That is due to a limitation of my blogging tool.  I also don’t go to "five levels deep" in many areas when there is no value in doing so.

Note: A SME is a Subject Matter Expert.  An Analyst will gather requirements from a SME, but may then need to review those requirements with other business stakeholders.

Problem: The requirements for software, as delivered by typical business analysts, are not sufficiently clear, insightful, or well understood to develop software systems that meet the needs of business users. 

  1. Business stakeholders may not believe that it is important to invest time in creating clear, unambiguous requirements.
    1. Business stakeholders perceive "requirements collection" as a time consuming activity
      1. When an analyst starts to collect requirements, the analyst requires a LOT of hand-holding.
        1. Analysts may not routinely understand key aspects of the business.
          1. The business may change in ways that the analyst is not aware.
          2. The analyst may not remember key aspects of the business that they may have seen in the past.
          3. The analyst may not have made an effort to learn or understand the business.
          4. The analyst may not be able to get sufficient face-to-face time with business SMEs due to geography, language, time zones, or security policy.
        2. Business SMEs may find themselves explaining the same things, over and over, to different analysts.
          1. Analysts need deeper understanding than is available from orientation courses.
          2. Analysts are unable to document the business with sufficient clarity to benefit other analysts.
          3. Analysts may not be motivated to create documentation for other analysts.
          4. Turnover in the analysis team may be high, causing the need to "re-explain" key concepts.
        3. Analysts often ask questions where the answers have not been decided yet, especially for new business programs or innovative new products. 
        4. Business SMEs may not trust Analysts in meetings with high-ranking executive SMEs
          1. Analysts may suggest a change that will cause a business leader to lose power or prestige.
          2. Analysts may question a leader’s pet project or assumption.
          3. Analysts may suggest a change to business that, if performed, would drive a great deal of effort and cost on the part of a person who does not want to do that work.
      2. When an analyst collects requirements, s/he takes a long time to write things down.
        1. The steps involved in documenting requirements may seem arcane or unclear to the business stakeholder.
        2. Business stakeholders may underestimate the amount of time it takes to write good requirements.
        3. Business stakeholders may not be aware of the contributions (and delays) of other business stakeholders in the requirements gathering process.
        4. The analyst may be using his or her time in an inefficient or ineffective way.
        5. The analyst may be working on many projects at one time, may be working part time, may be playing many roles, or may have some other reason why they cannot dedicate full time to documenting requirements.
      3. When a business stakeholder reviews the requirements, it takes a long time to validate them.
        1. Requirements are written in a "funny way" that takes getting used to.
        2. The business stakeholder may delay the start of the review process
          1. Requirements documents may be quite lengthy, and the business stakeholder may feel the need to try to tackle the document in one sitting. 
          2. The business stakeholder may benefit from delaying or sabotaging the analysis effort.
          3. The business stakeholders’ immediate superior may request a delay in the start of the review effort.
        3. Business stakeholders may not place a high priority on validating requirements, which slows the process down.
          1. Placing a high priority on reviewing requirements may cause friction for a business stakeholder that is involved in "competing" activities.
          2. The review process may be perceived as an ineffective use of the stakeholder’s time.
        4. Business stakeholders may delegate the effort of validating requirements to a person that is not qualified to make decisions, thereby delaying the validation process.
    2. Time spent in developing requirements does not seem to improve business operations in other ways. 
      1. No other part of the business gets value when a business SME spends time talking to an analyst.
        1. Business SMEs may feel that providing requirements does not add value to their other responsibilities. 
        2. Business SMEs are often interrupted in their daily activities to provide requirements.
        3. Business SMEs may feel unprepared to provide useful information.
        4. Business SMEs may fear reprisals if they provide a bad requirement, so it may be politically risky to share a requirement.
      2. No other part of the business gets value out of reading the written software requirements.
        1. The detail required by software people is not interesting to others.
        2. Requirements are written in long terse sentences that will put an insomniac to sleep.
        3. Collected software requirements are not useful for driving other business efforts.
    3. The business stakeholder cannot tell when to invest time or resources in producing good requirements and when the investment is wasted.
      1. The connection between poor requirements and defects is poorly understood.
        1. Business stakeholders may not believe that a defect is caused by a poor requirement.
        2. Developers have a difficult time determining if a defect will be easy to fix.  It can take days, or weeks, just to figure out how hard a fix will be. 
        3. Developers may argue that a defect that is caused by a poor requirement is not a defect, but rather a change request, forcing the issue to be addressed in a different way altogether.
        4. When presented with a defect, a business stakeholder is rarely able to see the requirement that led to it. 
        5. Business stakeholders do not understand how to identify which kinds of requirements need the most attention in order to have the best impact on defect rates.
      2. IT professionals do a poor job of guiding business stakeholders to focus on the requirements that will most impact the cost and quality of a solution.
        1. Business analysts often do not understand which requirements will "drive costs" and which will not.
          1. Simple cost analysis tools do not exist.
          2. Simple formulas, methods, or procedures do not exist for this kind of outcome.
          3. Where tools, methods, and processes do exist, few analysts know about them.
        2. Business analysts present information in a manner that does not highlight "cost driving" requirements.
          1. The most common tools for collaborating on requirements are Excel and Word, which do not have features specific to requirements gathering.
          2. Most business stakeholders have no way of easily reshuffling, highlighting, sorting, or prioritizing requirements, so they cannot use their own concerns to select the requirements to review.
  2. Business analysts may not add sufficient value in the collection and analysis process to drive down software defects.
    1. Many business analysts may not be aware of how to add value to business requirements.
      1. The business analyst may be poorly trained or may not have sufficient experience.
        1. The organization may not have invested in training and support for the business analysis role.
        2. The analyst may be new in his or her role.
        3. The analyst may not have sufficient basic skills in business or computing to be able to perform the role at all.
      2. Effective training and practices may not be available to the analyst.
        1. Long term studies on specific best practices in business analysis are lacking.
        2. Shared training and leadership in the business analysis profession is just getting started.   
    2. Many business analysis efforts are understaffed, have poor tools, or insufficient time to add substantial value.
      1. The project manager who planned the analysis may not understand the amount of effort needed to collect proper requirements.
      2. It may take longer than estimated, or be more difficult than expected, to collect requirements, causing a budget squeeze on the amount of time and money that can be spent on them.
      3. The project manager who planned the analysis effort may benefit in other ways by intentionally undercutting the time, resources, or tools for analysis.
    3. Many business analysis efforts do not have sufficient access to key decision makers to add the right amount of value.
      1. Interaction with business analysts may be delegated to people who cannot make decisions.
      2. Business analysts may be seen as outsiders, causing the organization to keep them at bay.
      3. Key decision makers may have prior bad experiences with IT professionals, creating a culture of "Avoid IT."
    4. Business stakeholders may oppose the effort of business analysts to add value to the requirements.
      1. Adding value to the requirements may increase the cost to the business.
        1. The business may need to commit more resources to validate the requirements if the analyst adds value to them.
        2. The analyst may need to access more stakeholders and SMEs, for a longer period of time, to add value to the requirements. 
        3. The analyst may need to use data gathering means like surveys or travel that incur costs in order to add value to the requirements.
      2. Business stakeholders may oppose the efforts of an "outsider" to analyze or document their business beyond basic software requirements gathering.
        1. An existing resource within the business may be responsible for business analysis, business modeling, strategic development, or process improvement who would perceive additional analysis efforts as "overlapping" or wasteful.
        2. Business leaders may fear the political consequences of allowing an outsider to see, and document, how their business operates.
      3. The business stakeholders may not trust the interpretation of requirements written by an analyst.
        1. The analyst may use different words than the business does.  The business stakeholder may not trust or understand those terms.
          1. The analyst may have chosen to use different words than the business stakeholder to improve consistency between various requirements, between stakeholders of the project, or with other projects in the same line of business.
          2. Analysts may elect (or be forced) to use standardized terminology that the business stakeholder is not familiar with.
        2. The analyst may draw conclusions about the business model or business processes that the business stakeholder does not agree with.
          1. The analyst may lack the authority to draw the cited conclusion.
          2. The analyst may be recommending a change in the business that the business stakeholder believes will not work, or will work against current company policy or strategy.
          3. The analyst may lack the sufficient business acumen to explain their conclusions well.
          4. The analyst may lack the sufficient situational awareness to recognize the political or organizational implications of delivering the conclusion.
          5. The analyst may be using a different underlying paradigm of business or economics than the business stakeholder.
        3. The analyst may elect (or be required) to use a requirements gathering methodology that the business stakeholder does not like or agree with, leading to a lack of trust in the conclusions.
          1. The methodology may ask questions that the business stakeholder is not prepared to answer or that the business stakeholder believes will produce an incorrect conclusion.
          2. The methodology may assume that the business stakeholder is more knowledgeable, empowered, or self aware than he really is, producing conclusions that the business stakeholder cannot agree with (or, in some cases, understand).
          3. The methodology may involve people (internal or external) that have a stake in the failure of the particular business stakeholder, or that part of the business.
      4. Business stakeholders may not want to see contradictions in their business highlighted in text by a business analyst.
        1. The business stakeholder may benefit from those contradictions for their continued success.  For example: A business stakeholder may want to protect a business program funding model that allows "shadow" projects to be funded, even if the system being described calls for accountability, visibility, openness, and a fair funding model.
        2. The business stakeholder may have been responsible for removing a problem in the past, and they may have declared the problem "solved."  Any conclusion by an analyst that suggests that the problem still exists will work against their credibility with their own manager.
    5. Business stakeholders may benefit from structure within their business that they do not want an analyst to call attention to.
        1. The business stakeholder may believe that their success rests on controlling a particular business capability, even if doing so skirts company policy.  For example, a business leader may have created a marketing team or IT team within their own business, even when the executives have issued directives to remove these independent units and consolidate to a central function.
        2. The business stakeholder may have a responsibility that other parts of the business do not want him or her to have.  For example, the business stakeholder may be chartered to complete a project that their business unit is not, technically, supposed to be working on. 
    5. The business analyst may be unable to properly reconcile conflicting requirements from multiple SMEs (updated)
      1. The business analyst may not know that the requirements conflict.
      2. The business analyst may view the reconciliation process as "not my problem"
      3. The business analyst may lack the skills to negotiate a reconciled set of requirements
      4. The business analyst may reconcile the requirements improperly, resulting in requirements that are consistent, accurate, and wrong
  3. Software developers may not invest sufficient time and effort in understanding the requirements.
    1. Requirements may be difficult for developers to read and understand.
      1. Developers may be daunted by a large requirements document.  (see related cause 3.5 below).
        1. The format, delivery method, or structure of the requirements document may force the developers to consume it in paper form, HTML form, or some format that the developer finds non-navigable.
        2. Security or access restrictions on the document may make it difficult for the developers to consume and remember a document of the size delivered.
      2. Developers may not understand the text of the requirements document.
        1. In some cases, the developers may not natively speak the language that the document is written in, reducing comprehension.  (example: developers in China reading a requirements document written in German).
        2. The requirements document may have been translated from one language to another, causing a loss in readability.
        3. The analyst may be writing the requirements in a language that they are not sufficiently skilled to write in, creating grammar and logic errors that reduce readability.  (example: an analyst from the USA who speaks French poorly, writing in French for a French IT group to use).
        4. The analyst may have written the text in a logically inconsistent manner.  In other words, errors in thinking or structuring the information may come through, making the business requirements difficult to understand.
        5. The writing style, grammar, punctuation, formatting, or layout used by the analyst may be difficult for the developers to accept or consume.  For example, the developers may prefer a particular numbering style or indention rules that the analyst did not follow.  Culture may play a role in the consumability of the requirements document.
      3. Developers may not understand the cultural context of the requirements.
        1. The software system may be intended for users in other parts of the world than the developers are familiar with.  As a result, statements in the document may be perplexing or vague when taken out of the context of the culture in which the software will be used. (example, an IT division in the USA building a system marketed to Japanese accountants).
        2. The software system may be intended for users in other parts of the enterprise itself where business cultural rules drive specific business policies.  If a developer is not familiar with those business cultural rules, even if the users live in the same country, the requirements themselves may be perplexing or vague.  (example: an IT division that provides services to an airplane manufacturer that builds both military and civilian aircraft).
      4. The business terms and concepts in the requirements document may be unfamiliar, inconsistent, non-standard, or poorly defined.
        1. The document may not contain sufficient information about what the terms are, what they mean, or how one term relates to another.  (Example: a requirements document may describe the need for the CCC division to access a particular report, without either defining what the acronym means or making it clear that "account managers" may work for both the CCC division and the LJM division).
        2. The developers may have used some of the same terms in a different way, requiring them to unlearn their prior understanding in order to adopt the unique definitions in the requirements document.
        3. The terms and concepts may be inconsistently described or applied.  Example: one part of the document may describe requirements for an "administrator" while another part of the document may describe an "account manager" when the business intended to describe only one role, or the business process calls for only one role.
        4. The terms and concepts may have more than one meaning and it is not clear which meaning is being implied.  For example, the requirements may refer to "services" provided to customers and "service levels" applied to system reliability, without differentiating the use of the term "service."
      5. The business context of the requirements may be unfamiliar to the developers.
        1. The developers may not understand how the business itself makes money, how the partners make money, or how the products/services of the business are positioned in the marketplace.
        2. The developers may not understand what externally generated rules or regulations are influencing the business decisions that show up in the requirements.
        3. The developers may not understand the business strategies being implemented or how those strategies are supposed to produce results for the company.  For example, if a business leader decides that their products should be introduced at a particular price point in order to beat a competitor, that may lead to the need to keep the costs of a particular customer service process very low, driving specific software requirements.  Without traceability to the business strategy, the developer may not understand the need to automate the entire process to the exclusion of a small number of potential customers.
      6. The business processes described in the requirements may be incomplete, poorly described, or inconsistent.
        1. The business processes themselves are chaotic, inconsistent, ad hoc, or flawed.
        2. The business may not have a consistent set of roles and responsibilities defined.
        3. The analyst was unable to get a consistent explanation of the business process.
          1. Business may not have experts that understand the business processes from end to end.
          2. Business SMEs may not be available to the analyst to explain or validate the business process models.
          3. The business stakeholders may benefit from keeping the business process hidden or obscured.
        4. The business analyst may not have sufficient skill or training to produce effective business process models.
        5. The business analyst may not have had sufficient time or resources to describe business processes.
        6. The inclusion of business processes in a requirements document may not have been adopted as a practice within the particular enterprise.
      7. Developers may not understand the diagrams used in the requirements document.
        1. The developers may not be familiar with the diagramming notation used by the analyst.
          1. The analyst may not have used a standard diagramming language like BPMN or UML to capture information.
          2. The analyst may have used a standard notation in a novice or inconsistent manner.
          3. The developers may not have sufficient skill or training to read the diagramming notation.
        2. The developers may not find the diagrams to be useful for understanding software requirements.
          1. The diagrams may illustrate information that is not important to the developer who is consuming it.
          2. The diagram may illustrate information from a viewpoint that the developer does not understand.
          3. The diagrams may include objects of different types that do not make sense to combine.
          4. The diagrams may introduce concepts that are not explained in the text.
          5. The diagrams may actively contradict the accompanying text or may not be explained at all in the accompanying text.
        3. Important content within the diagrams may be obscured, missing, or difficult to find.
          1. Analysts may have made poor use of color, losing content when printed in black and white, or when consumed by color-blind developers.
          2. Analysts may have included too much or too little text on the diagrams.
          3. Analysts may have mixed many concerns onto a single diagram, making the diagram needlessly complex.
          4. Analysts may use a metaphor in the diagrams that the developers are not familiar with, or find confusing or offensive.
    2. Requirements are boring and tedious to understand
      1. Analysts may adopt a "special language" for describing the requirements that feels unnatural or fails to illuminate the requirements.
        1. The "special language" may require sentences to be written in a restricted subset of the language.  Flaws in the methodology may produce some "requirements statements" that are convoluted or difficult to understand using that restricted subset. 
        2. The software developer may not be familiar with the need for accuracy in requirements, causing them to reject "special languages" as unnecessary overhead, reducing the effectiveness of communication.
        3. Analysts may have adopted a "special language" in an inconsistent or novice manner, making it difficult for the intent of the requirement to be understood because the developer stumbles on flaws in the writing.
      2. Requirements are usually written in terse, humorless prose with few diagrams.
        1. Embedded jokes may offend a stakeholder.
        2. Light-hearted or pleasantly written requirements text may be perceived by stakeholders as "not taking the problem seriously."
        3. Analysts may fear that developers will dismiss a requirement as "nonsense" if it is pleasantly written.
        4. Analysts may lack the skills to write requirements in a more engaging manner.
        5. Analysts may lack the skills to create useful diagrams to illustrate the requirements.
        6. Existing diagramming methods may not have been adopted by the organization.
      3. Requirements documents may place emphasis on sections of text that do not illuminate the problem.
        1. Requirements may be written in a manner that reflects the political desires of the business decision makers that need to approve it, causing emphasis to be placed on elements that the business leader considers valuable, rather than elements that developers would consider valuable.
        2. Requirements may be written in a manner that intentionally obscures flaws or defects in the business processes or business structure.  Obscuring these details may require that specific business rules or descriptions are written in a vague or incomplete manner.
    3. It may be difficult to describe how requirements specifically translate into design
      1. The design process may be seen as a creative process that is inspired by, but not tied to, requirements.
      2. The developers may find it difficult to tie specific aspects of the design to requirements.
        1. The developers may use reference architectures or patterns that have features that exceed the requirements.
        2. The developers may not understand which feature of a reference architecture specifically suits a requirement.
        3. Processes, methods, and formulas for deriving design from requirements may not be commonly accepted or widely followed within the enterprise.
      3. Standards of the organization may introduce requirements that must be met, but which the developer feels that they cannot justify to the business stakeholder.
      4. Developers may feel that the analyst has missed a requirement and therefore may add a design element to meet the "shadow requirement" without documenting the requirement itself.
    4. It is difficult and time-consuming to ensure that a design will meet requirements.
      1. The tools in use by the organization make it difficult or impossible to tie specific requirements to specific aspects of the design.
      2. Organizations may not have adopted standard methods or practices for connecting requirements to design.
      3. Developers may not be rewarded for spending time connecting requirements to aspects of the design.
        1. Tracing design to requirements may not be an activity expected within the organization.
        2. Developers may not view the activity as valuable.
        3. Project managers and project leaders may not be aware of the effort or value of the activity.
        4. Business stakeholders may not be aware of the value of demonstrating traceability in design.
        5. Project leaders may view the activity as "more expensive than beneficial."
      4. More than one design may have been considered, driving up the cost of traceability for each design considered.
      5. Developers may have made intentional tradeoffs that allow some requirements to go unmet.
      6. Requirements may be changing during the design process, making it frustrating for the developer to tie requirements to design.
      7. Iterative design processes allow some requirements to be left out of consideration until a later time.  At the end of iterative design timeframe, the project team may not have funds to deal with requirements that remain unmet by any design candidates.
    5. Requirements may contain details that any particular software developer is not interested in
      1. The organization’s requirements management tools may not allow the software developer to select the requirements that are salient for him to understand.
      2. The organization may have adopted the practice of "write down everything" in an effort to improve requirements, triggering information overload.
    6. The project manager may not have allocated sufficient time or resources for developers to understand requirements
      1. The project manager may not understand the process or the difficulty involved with understanding requirements.
      2. The budget for the project may not allow for sufficient time spent on understanding requirements.
      3. The organization may not culturally value the effort of understanding requirements.
      4. The developers may be using inefficient means to understand requirements.
      5. The tools and techniques in use by the enterprise may make it more difficult for the developers to understand the requirements.
    7. Software developers may resist reading or understanding a particular requirement or set of requirements
      1. Since requirements change, developers may feel that it is a waste of time to invest in understanding them.
      2. Developers may want to substitute their own judgement for the judgement of the business stakeholders and analysts 
        1. Some developers may recognize common problems that the analyst did not describe, or may wish to bring insight from other experiences.
        2. If a developer does not understand a requirement, it is often simpler to substitute a different requirement (or ignore it) than to invest in understanding it.
        3. The developer may not trust the analyst, or the business stakeholder, to correctly describe the problem.
      3. A developer may not see the value in investing time and effort in understanding a requirement.
        1. Software developers may not have been trained on the implications of poorly understood requirements.
        2. The organization may inadvertently reward developers for not spending time to read requirements.
          1. Software developers may not be measured on the number of defects they produce.
          2. Many organizations place an emphasis on the test team, not the development team, to ensure that a system meets requirements.
  4. The analyst may not have acquired sufficient information to create a complete list of requirements. (updated)
    1. The analyst may not have searched sufficient sources of information
      1. The analyst may not have spoken with the right people
        1. The most knowledgeable SME exists in the business, but the analyst does not interview him or her
          1. He or she has changed jobs or been promoted, and the new manager does not view their contribution as "part of their job"
          2. He or she has lost favor, so the analyst is encouraged not to speak with him or her
          3. The SME does not announce his knowledge or experience to the analyst
          4. The business does not know who the most knowledgeable SME is, and does not direct the analyst to him or her
        2. The most knowledgeable SME no longer exists in the business
          1. The SME has retired
          2. The SME has been laid off to cut costs
          3. The SME has been fired
          4. The SME was a contractor and their contract has ended
          5. The SME was a consultant and their project has completed
        3. The analyst may not be able to communicate effectively with the most knowledgeable SME
          1. The SME may speak a different language than the analyst
          2. The SME may work in a different location than the analyst
      2. The analyst may not have collected requirements from existing software
        1. Existing requirements may not be documented
        2. Existing design, constraints, and business rules may not be documented for analysts to use
      3. The analyst may not have collected requirements from external regulations
        1. The analyst may lack awareness of the regulations that apply
        2. The analyst may not have the necessary expertise to collect requirements from regulations
        3. The analyst may fail to get the business to share regulatory requirements
          1. The business may be relying on external experts that are not available.
          2. The business may be relying on external experts that are not sufficiently skilled.
          3. The business may not have analyzed the regulations for their requirements
          4. The business’ analysis of regulatory requirements may be flawed or incomplete
    2. Some of the information may not be available to the business
      1. The business may be taking on a new venture
      2. The business may be merging with another business that is sufficiently different from its existing business model.
      3. The business may be insourcing an existing function
      4. The business may be split off of an existing business (for example: IBM split off Lexmark as an independent printer business).
      5. The business may not know what existing requirements must change
        1. Existing requirements may not be searchable
        2. Existing requirements may not be documented
        3. Existing requirements may not be understood by anyone
        4. No one in the business is accountable for analyzing existing requirements for change
    3. The analyst may collect conflicting requirements from different SMEs (see related section 2.5)
      1. SMEs may have differing personal opinions of how to solve specific problems
      2. SMEs may have different philosophies and/or business models in mind for the future of their business.
      3. Business leaders may have provided conflicting guidance to business SMEs.
      4. SMEs may have personal or political reasons for suggesting conflicting requirements
    4. The business users may not know how to share requirements correctly
      1. The business may not have trained their SMEs on "how to work with IT"
      2. The business users may not have consistent terms or processes
      3. The business users may not be aware of IT services
      4. The business users may not be aware of the kinds of features that software can provide
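Several of the causes above (tying design to requirements, demonstrating traceability, selecting the requirements a reader cares about) come down to the fact that requirements are rarely kept as structured, queryable data.  As a minimal, purely hypothetical sketch, a traceability check that flags requirements no design element claims to cover might look like this:

```python
# Hypothetical sketch only: the data model and names are illustrative,
# not drawn from any real requirements management tool.

requirements = {
    "R1": "Export monthly report",
    "R2": "Retain audit history",
    "R3": "Role-based access",
}

# Each design element lists the requirement ids it claims to satisfy.
design_elements = {
    "ReportService": ["R1"],
    "AuthModule": ["R3"],
}

def untraced(requirements, design_elements):
    """Return requirement ids that no design element covers."""
    covered = {r for reqs in design_elements.values() for r in reqs}
    return sorted(set(requirements) - covered)

print(untraced(requirements, design_elements))  # ['R2']
```

Even a check this simple would surface "shadow requirements" and unmet requirements long before test, assuming the links are recorded at all, which is exactly the activity the causes above suggest is rarely rewarded.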



Whew.  That list is long, and probably missing a few things, but it is a fairly good attempt at describing the root causes of "poor software requirements."  (There are 184 listed ’causes’ of poor software requirements in the list above, up from 149 before the updates, only counting the leaf levels of the analysis).
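As an aside, a count like that is easy to automate once the outline is treated as data.  A toy sketch (the outline fragment below is a made-up miniature, not the real list) that counts only leaf-level entries:

```python
# Hypothetical sketch: a nested outline as (text, children) pairs,
# counting only the leaf levels, the way the figure above is counted.

outline = [
    ("Requirements are costly", [
        ("Review takes a long time", [
            ("Documents are lengthy", []),
            ("Priorities compete", []),
        ]),
    ]),
    ("Analysts may not add value", [
        ("Training is lacking", []),
    ]),
]

def count_leaves(nodes):
    """Count entries that have no children."""
    total = 0
    for _text, children in nodes:
        total += count_leaves(children) if children else 1
    return total

print(count_leaves(outline))  # 3
```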

Some takeaways from this effort:

  1. Skills and training show up many times.  Select or develop standards and then train staff members to use them.
  2. Creating useful requirements that get used… easy to say, hard to pull off, especially with so many potential roadblocks. 
  3. Good, inexpensive, well-thought-out requirements management tools can fix many of these problems.
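To make the third takeaway concrete: the core of such a tool is simply requirements as records that a stakeholder can filter and re-sort, rather than prose locked inside Word or Excel.  A minimal sketch, with entirely hypothetical field names:

```python
# Hypothetical sketch of requirements as structured records.
# Field names ("priority", "cost_driver") are illustrative assumptions.

requirements = [
    {"id": "R1", "text": "Export monthly report", "priority": 2, "cost_driver": False},
    {"id": "R2", "text": "Retain audit history",  "priority": 1, "cost_driver": True},
    {"id": "R3", "text": "Single sign-on",        "priority": 3, "cost_driver": True},
]

# A stakeholder's review view: cost-driving requirements only,
# highest priority (lowest number) first.
review_order = sorted(
    (r for r in requirements if r["cost_driver"]),
    key=lambda r: r["priority"],
)
print([r["id"] for r in review_order])  # ['R2', 'R3']
```

Nothing here is beyond a spreadsheet macro, but making this the default representation is what lets a reviewer "reshuffle, highlight, sort, or prioritize" requirements instead of reading a document front to back.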