20 Questions to Ask Before Designing Your Next Dashboard

Here are 20 questions you should ask before designing a dashboard, plus 5 questions you should ask after. They are organized into a template that builds consensus and keeps communication with stakeholders flowing throughout the design process.

The key to designing an actionable dashboard is getting a good start.

However, before we talk about where to start, let’s talk about where NOT to start.

Do not start with the tool.
Do not start with “let me see what I can do in Power BI or Tableau or Qlik or Excel”.

Do not start with the data.
Do not start with “let me see what data I can find on xyz”.

When you start with either the tool or the data, you lock yourself into a frame of mind and play host to a number of biases. Wikipedia lists nearly 200 types of cognitive bias. Broadly speaking, these biases are the mind's defenses against four problems: information overload, lack of meaning, the need to act fast, and deciding what to remember for later.

If you start with an ocean of data, your mind will inevitably lock on to the bits that seem relevant. Add the competitive pressure to act fast and you introduce more bias. If things do not add up, your mind forces logic onto them to make sense of it all. An approach that starts with the data or the tool therefore ends up being far from evidence-based or data-informed. So, where SHOULD you start?

Start with THE Question!

The template below is a communication, consensus-building and learning tool. As you work your way through it, it should reflect the thought processes and expectations of both the analyst and the stakeholders. It is essentially a list of questions to figure out THE question, THE purpose of your dashboard, so that when it is time to pick a tool or gather the data, your mind's defense system triggers the fewest possible biases.

Why use this template?

  • to create a consistent approach to problem solving with data
  • to maintain a big picture view of the project
  • to increase stakeholder involvement
  • to quickly identify priorities, issues, options and milestones

The template has 8 sections:

  1. Who
  2. Background
  3. Current State
  4. Target State
  5. Gap Analysis
  6. Function
  7. Implementation
  8. Follow-up

Finding the Starting Line (Template for Actionable Dashboards)
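As a rough sketch, the eight sections above could be captured as a simple, shared checklist that the analyst and stakeholders fill in together during working sessions. The section names come from the list above; everything else here (function names, the example question and answer) is illustrative, not part of the original template.

```python
# A minimal sketch of the template as a structured checklist.
# Section names come from the article; the example entries are
# placeholders to be filled in during stakeholder sessions.

TEMPLATE_SECTIONS = [
    "Who",
    "Background",
    "Current State",
    "Target State",
    "Gap Analysis",
    "Function",
    "Implementation",
    "Follow-up",
]

def new_template():
    """Return an empty template: one answer list per section."""
    return {section: [] for section in TEMPLATE_SECTIONS}

def record_answer(template, section, question, answer):
    """Attach a (question, answer) pair to a section."""
    if section not in template:
        raise KeyError(f"Unknown section: {section}")
    template[section].append({"question": question, "answer": answer})
    return template

# Example usage
t = new_template()
record_answer(t, "Who", "Who will create the dashboard?", "Analyst: J. Doe")
print(len(t["Who"]))  # 1
```

Keeping the answers in one structure like this makes it easy to see at a glance which sections still have open questions before design work starts.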

As promised, here are the 20 questions that fit into those sections:

1. Who

Questions you should ask to identify the stakeholders:

  • Who will create the dashboard?  (Analyst)
  • Who will validate the data?  (Subject Matter Expert)
  • Who will maintain the dashboard?  (Owner)
    • Handle access requests? Troubleshoot? Answer questions?
  • Who will use this dashboard? (Viewer)

2. Background

This section keeps you focused when you are faced with an ocean of data and visualizations. Ask these questions to mine the full background so that everyone involved understands the context and importance of this dashboard. The more time you spend specifying the question and why it is important, the more invested your end users will be in the process and the resulting dashboard.

Questions you should ask to determine why this dashboard is needed:

  • What is the ONE question you want to answer about the Area of Focus?
  • Why is this an important question to answer?
  • What will change once this dashboard is in place?
  • What actions/decisions will the answer inform?
  • How does it align with the big picture goals of the team/organization?

3. Current State

This section gives everyone involved a concise and precise view of the current state. This is also a prompt to discover other analysts who might be a good resource and learn from their experiences. It can help identify the most efficient way forward.

Questions you should ask to determine what exists:

  • How are you currently answering these questions?
  • How are decisions currently made? What is the current process?
  • What assumptions are being made? Are they valid, consistent?
  • Are there related tools/reports/data sources that other groups may be using?

4. Target State

This section allows you to think about the problem without limitations, perceived and actual.

Questions you should ask to determine what the dream is:

  • If you could have access to ANY data you want, what would it be?
  • If you could directly track/measure anything, what would it be?
  • What did you always want to know but couldn’t find out?

5. Gap Analysis

Questions you should ask to isolate the pain points:

  • What have been blockers in the past?
  • What are the potential new blockers?
  • What is currently missing? hard to get? incorrect?
  • Which of the existing reports/dashboards is the closest to the target state? What changes might be made to existing reports?

6. Function

By now you should know the function of your dashboard. This will dictate the type of visuals, level of interaction, and amount of detail to include. Ideally, a dashboard serves one purpose at a given time.

  • What is the function of your dashboard: View, explore or monitor?
What the user wants to do, and the decision level it supports:

  • View a high-level summary. Decision level: "Which route shall I take?"
  • Explore data for trends, build prediction models, identify benchmarks. Decision level: changing lanes in time.
  • Monitor activities in a focus area to identify problems/deviations. Decision level: braking in time for a crossing cyclist.

Dashboard functions
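Since the chosen function dictates the type of visuals, level of interaction, and amount of detail, it can help to write that dependency down explicitly. The sketch below maps the three functions named above to rough design choices; the specific interaction levels and detail descriptions are illustrative assumptions, not prescriptions from the template.

```python
# Sketch: mapping each dashboard function to rough design choices.
# The function names (view/explore/monitor) come from the article;
# the design values are illustrative assumptions.

DESIGN_BY_FUNCTION = {
    "view":    {"interaction": "low",    "detail": "summary KPIs"},
    "explore": {"interaction": "high",   "detail": "drill-down, filters"},
    "monitor": {"interaction": "medium", "detail": "thresholds, alerts"},
}

def design_hints(function):
    """Look up suggested interaction level and detail for a function."""
    try:
        return DESIGN_BY_FUNCTION[function.lower()]
    except KeyError:
        raise ValueError(f"Unknown dashboard function: {function}") from None

print(design_hints("Explore")["interaction"])  # high
```

If a stakeholder asks for all three functions in one dashboard, that is usually a sign the project should be split, since a dashboard ideally serves one purpose at a time.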

7. Implementation

In this section, pin down some dates for the stages of the analytics lifecycle.

Analytics Lifecycle

As you go through these stages, you will discover new information that may support or challenge your findings from previous sections of the template. Make sure you jump back and update those. Remember, this is a communication tool, so keep it iterative. The messier, the better.

8. Follow-up

A good dashboard should spark more questions, so every dashboard project should stay iterative as those questions evolve. Here are 5 bonus questions you should ask to review usage and ROI after you have designed your dashboard.

  • What is the actual number of views for this dashboard? How does it compare to the expected usage?
  • If this dashboard is not doing what the viewers expected it to do, is it a training issue? Design issue? Data issue?
  • Are the initial assumptions still valid?
  • Which decisions/actions did the dashboard inform?
  • What are the new questions that need to be answered?