Announcing: The Case Study Series

Do you struggle to understand the business justification for your projects? Find it difficult, if not impossible, to convince stakeholders to dedicate time to exploring the problem before jumping into “solution mode”? Have a hard time demonstrating critical thinking, initiative, and the ability to influence without authority in job interviews or performance reviews?

This new series will feature six case studies tackling the top challenges identified by Adriana Beal while coaching business analysts over the past two years. Lessons are based on concrete, real-world examples designed to help you understand, remember, and apply knowledge to succeed in your business analysis career. All case studies conclude with notes that help you review and memorize what you’ve learned and extrapolate the lesson to other real-life scenarios.

To download a sample and learn more, visit The Case Study Series page.

Are you using the right unit of analysis in your software projects?

One of the mistakes I see repeated over and over by product managers, product owners, and business analysts is using features as their “unit of analysis”.

It doesn’t help that books and online resources reinforce this notion all the time.

If the focus of your work is to define and prioritize features, chances are you’re missing huge opportunities to address important and underserved needs for your internal or external customers.

To increase the value you deliver to direct or indirect users of a software application, change your “unit of analysis” from feature to problem statement.

Here’s an example from a content management tool used by marketers to store and publish content to blogs and social media.

The product team is looking for opportunities to improve the quality of the product with the goal of increasing user adoption. It plots an “importance and satisfaction” graph and identifies an opportunity to improve the widget used to upload images and videos to the content management tool, saving users a few clicks and making the upload process faster and more delightful.
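(If you’re curious how an “importance and satisfaction” graph like this is typically built, here is a minimal sketch in Python. The feature names, ratings, and the “opportunity score” heuristic are illustrative assumptions, not data from this case study.)

```python
# Minimal, hypothetical sketch of an "importance and satisfaction" analysis.
# Assumes users rated each existing feature on a 1-10 scale in a survey;
# all names and numbers below are made up for illustration.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Common heuristic: important but underserved features rank highest."""
    return importance + max(importance - satisfaction, 0.0)

# Average survey ratings per feature (hypothetical)
ratings = {
    "upload images and video": {"importance": 7.4, "satisfaction": 4.2},
    "publish to blogs":        {"importance": 8.8, "satisfaction": 8.1},
    "publish to social media": {"importance": 8.5, "satisfaction": 7.6},
}

# Rank features from most to least underserved
for name, r in sorted(ratings.items(),
                      key=lambda kv: opportunity_score(**kv[1]),
                      reverse=True):
    print(f"{name:24s} opportunity = {opportunity_score(**r):.1f}")

# Caveat (and the point of this article): a feature-level survey can only
# rank the features you thought to ask about; it won't surface barriers
# that never made it onto the list.
```

In this hypothetical data the upload widget surfaces as the top “opportunity”, which is exactly the trap: the graph can only rank what you already decided to measure.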

  • Analysis at feature level: A usability test is performed for the new upload widget. All goes well, and the feature is prioritized for the next release.
  • Analysis at problem statement level: By performing “problem interviews”, the product manager identifies an obstacle that is preventing more users from adopting the tool: marketers can’t see the number of times a piece of content has been published, and prefer to use their own spreadsheet to keep track of when/where content was published so they can avoid content fatigue. Until this barrier to product adoption is overcome, there’s little point in continuing to improve the service dimension, as a faster and more delightful method to upload images is not going to convince non-users to adopt the content management tool.

When you shift from feature to problem statement as your unit of analysis, you develop a more holistic view of value and increase the odds of delivering a product that people want to use. One useful way to arrive at a valid problem statement is to use the following framework:

Service: The work the application does or helps users do.

Barriers to success: The circumstances or obstacles that prevent the service from being successfully delivered or the application from being adopted by its intended audience.

Desired outcomes: The measures of performance that customers use to judge the value of the service and that are inherent to its execution.

Here’s the framework applied to our case study:

Application: Content management tool

Who is it for: Marketers publishing content to blogs and social media

  • Service: Store text, images, and video, and publish content to blogs and social media.
  • Barrier to success: Lack of visibility into how many times a piece of content has already been used, and when, causes marketers to prefer keeping their own spreadsheet to track their content and avoid the risk of content fatigue.
  • Desired outcome: Minimize effort to publish content to blogs and social media while preventing content fatigue.

Problem statement (in the format of a key question): How can we remove the barriers keeping marketers from using our content management tool in order to increase user adoption?

Given that the new file upload widget would merely improve the service dimension without supporting the achievement of the desired outcome, it would not be prioritized until the barrier to success was removed. A capability that users may not even have requested yet (a counter of how many times a piece of content has already been used, and in which channels) would be given higher priority because of its greater potential to produce the desired outcomes for the customer and the business.

This kind of analysis can be applied to all sorts of software initiatives, from small enhancements to entirely new products. By shifting attention away from prioritizing individual features and concentrating instead on solving a problem end-to-end for a group of users, with the customers’ and the business’s desired outcomes in mind, product teams can pave the way for the evolution of their product toward much higher levels of customer satisfaction and value delivery.

 

You may also like:

Stop prioritizing features

 

Going back to basics

A message posted in the discussion forum of the Business Analysis Leadership group got me thinking. We were talking about requirements reviews as a good practice to improve the quality of the requirements delivered by a business analyst, and a manager wrote:

I have seen some companies where the manager tried to implement a requirements review done with the other BAs. Because the other BAs don’t know much about the project, and nothing about the business needs, processes and requirements, this kind of review was soon abandoned because it was not productive at all.

It’s curious how people will find all sorts of excuses to give up on an approach that has been proven to produce a positive impact. The book Software Requirements by Karl Wiegers, among others, provides good advice on how to set the stage for effective requirements reviews. Among his tips:

Give reviewers context for the document and perhaps for the project if they are not all working on the same project. Seek out reviewers who can provide a useful perspective based on their knowledge. For example, you might know a coworker who has a good eye for finding major requirements gaps even without being intimately familiar with the project.

In my experience, it’s not hard to find people within the organization capable of highlighting issues with a set of requirements (or user story acceptance criteria). Members of the BA, QA, development, and support teams are typically good choices. Even when they’re not very familiar with the specific project or business process, by reading a project overview and reviewing the process flow they should have enough context to make a positive contribution during the requirements review process.

But this is only one of many examples I see in the BA community of people finding excuses not to apply proven techniques to their business analysis work. Here are others:

“Oh, but it’s easy for you to do that [take a step back to focus on the problem space before jumping into “solution mode”] because you shifted to product management from business analysis. It’s much easier for you to impose resistance when stakeholders are impatient to get their project started before the problem is fully understood.”

(Hmm… No, I’ve used the same approach while working as a senior business analyst at a large tech company, and it succeeded even when I had been explicitly told by IT management not to talk directly to our business stakeholders.)

“This book doesn’t provide any solid material for reuse – its full of theoretical approaches which will never work on the job practical approach. The methods and the approaches are good for schools or colleges which teach BA for first timers.” 

(Really? Before I made the recommendation in reply to a request for a good reference book on BA activities, I had worked on two successful large projects that used the exact same techniques it describes. Clearly this person was looking for a “silver bullet” that doesn’t exist, while refusing to try techniques that do work, but only if you put in significant effort to prepare, educate executives and teams, etc.)

To avoid falling into the same pitfall, here are a few things to ask yourself:

  • Am I looking for excuses not to do the hard work? For example, not defining quantitative success criteria for my project because there’s no time, or no one available to provide context?
  • Did the approach I used fail because it’s not applicable to my situation, or because I’ve not applied it the right way? Scott Sehlhorst has written a great article that illustrates this situation. His post is about product roadmaps, but it could be applied to any useful BA practice: “Drawing a stupid relationship diagram is bad, therefore, don’t draw a relationship diagram!”
  • Did I use the premortem technique to avert failure? When you’re attempting a new approach, it pays off to pretend that success or failure has already occurred so you can identify the conditions required to win, instead of waiting until the end of a project to find out what went wrong.

 

Image by Mario Klingemann