Analytics Dashboard Design
Summary
Led the UX design of the Enterprise Balanced Scorecard - an analytics dashboard giving an at-a-glance view of the overall health of the UC Davis Health system.
The Challenge
Design an enterprise performance dashboard that showcases UCDH's system-level performance across multiple metrics, while accommodating users with varying needs, employees at different levels of the organization, and different levels of access for viewing specific metrics.
Core Team
Representatives from each of the core organizations within UC Davis Health, including clinical staff, finance, operations, HR, diversity, research, and education.
UX designer (myself)
BI analyst/Tableau developer
Data architect
My Role
My responsibilities included the following UX activities - user research, requirements gathering, stakeholder interviews, personas, whiteboarding, information architecture/sitemaps, wireframing, and prototyping - as well as creating design specifications for BI analysts/developers.
Dashboard Design Process
Led the UX design and data visualization for roughly 45 analytics dashboards for different organizations at UC Davis Health and Sutter Health, using a playbook adapted from a collaboration with Deloitte.
The Discovery and Design phase encompassed the majority of my UX work. Throughout this phase, I ensured that the dashboards were aligned with user needs, met stakeholder expectations, and effectively addressed business objectives.
During the Front End/UI Development phase, I provided detailed design specifications to our Tableau developers and collaborated with them as they implemented the views and layouts according to our requirements.
User Research
Stakeholder Interviews: I began by conducting interviews with key stakeholders, including business owners and core team members. This helped me understand their specific needs, goals, and pain points with existing analytics solutions.
Requirements Gathering: Requirements were gathered in group sessions, which allowed me to capture comprehensive requirements and insights from all relevant representatives. Because a number of representatives were involved, I incorporated an affinity-mapping exercise to stimulate conversation and generate ideas.
Additional User Research Activities: I also analyzed data on existing dashboards and reports to understand usage patterns and conducted heuristic evaluations to identify areas for improvement.
Listening is key in this phase, as is reading between the lines.
Understanding the users/personas and the content each wants to see was also essential.
At this stage, users often have to be pushed to prioritize the metrics and data they want to include.
Personas
User Needs and Personas Identified
Identified the following personas and their basic requirements during the user research and requirements gathering process:
C-suite executives - a summary or high-level overview of all metrics for each category
Supervisors/managers - the primary metrics for each category
Individual contributors/data analysts - the primary metrics as well as leading metrics for each category
Whiteboarding
Once we understood the users of the dashboards, the business questions they wanted answered, and the metrics they wanted to see, the sketch to the right was created on a whiteboard to give core team members and stakeholders a rough visual representation of the dashboard to react to. The sketch helped the core team see the categories of metrics that had been chosen, a potential layout for the dashboards, and potential visualizations of the metrics. Sketching the visualizations helped the core team give feedback on which metrics they wanted placed close together for comparison, and which they wanted to see change over time.
Review/Validate with Users:
The captured requirements and whiteboard sketches were reviewed with stakeholders and core team members to ensure the key metrics had been captured accurately before moving on to designs.
Information Architecture/Sitemaps
The following sitemap was created based on the required metrics and the personas identified during user research, in order to identify the different levels of pages needed.
Level 1 view - provides executives with a high-level overview of key metrics, allowing them to quickly grasp the overall performance.
Level 2 view - tailored for directors and managers, providing more detailed insights to help them effectively manage performance at the department or individual group level.
Level 3 view - shows more detail for each category, created for individual contributors and data analysts.
This is also where personas are aligned with the sitemap and the different levels of the dashboard to determine the detail needed at each level. Sitemaps also help me understand the navigation users need to move seamlessly between views.
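As a sketch, the persona-to-level alignment described above can be expressed as a simple mapping. The level numbers and persona groups come from the sitemap and personas in this case study; the exact structure and helper function are illustrative, not an artifact of the actual project.

```python
# Illustrative sketch: aligning the personas identified during research
# with the three sitemap levels of the dashboard.
DASHBOARD_LEVELS = {
    1: {"personas": ["C-suite executives"],
        "detail": "high-level overview / composite of key metrics"},
    2: {"personas": ["Supervisors", "Managers", "Directors"],
        "detail": "primary metrics per category, department-level view"},
    3: {"personas": ["Individual contributors", "Data analysts"],
        "detail": "primary and leading metrics with full detail"},
}

def levels_for(persona: str) -> list[int]:
    """Return the dashboard levels targeted at a given persona."""
    return [level for level, cfg in DASHBOARD_LEVELS.items()
            if persona in cfg["personas"]]
```

A mapping like this makes it easy to check, for any persona, which views must exist and how deep the navigation needs to go.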
Review/Validate with Users:
When designs were reviewed with the core team, we determined that a visual showing some type of composite score for each category was needed for executives.
Wireframes 1
Created wireframes based on the initial whiteboard sketch shown above.
I typically use Balsamiq to create wireframes, and sometimes Figma or Sketch.
Created multiple versions of the primary view with different visualizations for each metric, to share different ideas with the core team and give them options to react to.
Wireframes 2
Based on the feedback received from the core team on the initial wireframes, created a revised version with the following:
To meet the needs of C-level executives, I created a high-level, easy-to-view composite of the key performance measures for each category. For directors, supervisors, and managers, I provided additional metrics and detail below the summary, with interactive features that let users click through for more in-depth information.
Changed the layout of the individual metrics below the composite score to better fit the space. All detailed panels below the summary are collapsible, enabling users to focus on the most relevant categories, thereby enhancing the dashboard's usability and efficiency.
Review/Validate with Users:
The core team liked this option, as did the executives of each of their organizations.
Review/Validation with Users:
Once wireframes with the data visualizations are created, a review session is conducted with stakeholders to gather feedback. I typically provide multiple design options at this stage (unless the dashboard needs are very well understood), which lets users weigh in on the best way to visualize the data for them.
Data Visualizations
Once all the requirements were understood for each page or view, the next step was to visualize the different metrics based on the input gathered from stakeholders and the core group. This involved considering several factors:
Determine if stakeholders need to see trends over time for specific metrics. Line charts or area charts are commonly used for visualizing trends.
Identify if stakeholders need to compare certain metrics side by side. Bar charts, column charts, or scatter plots can be effective for comparing metrics.
Determine which metrics are related and should be shown together. Grouping related metrics together in the same visualization helps stakeholders make connections and draw insights.
Assess if stakeholders need to see how variables are distributed over time to identify outliers and trends. Histograms or box plots can be used to visualize the distribution of data over time.
By considering these factors and selecting appropriate visualization techniques, the dashboard can effectively communicate key insights to stakeholders and support informed decision-making.
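The rules of thumb above can be sketched as a small selection helper. The mapping below simply encodes the bullets in the text (trends, comparisons, related metrics, distributions); the category names and the fallback are my own illustrative choices, not a definitive chart taxonomy.

```python
# Illustrative sketch of the chart-selection rules of thumb above.
# Each analytical need maps to the candidate chart types named in the text.
CHART_CHOICES = {
    "trend_over_time": ["line chart", "area chart"],
    "comparison":      ["bar chart", "column chart", "scatter plot"],
    "related_metrics": ["grouped visualization (shared view)"],
    "distribution":    ["histogram", "box plot"],
}

def suggest_charts(analytical_need: str) -> list[str]:
    """Suggest candidate chart types for a stakeholder's analytical need."""
    # Fall back to a plain table when the need doesn't match a known pattern.
    return CHART_CHOICES.get(analytical_need, ["table (fallback)"])
```

Encoding the decision this way also makes the rationale reviewable: stakeholders can see why a metric was drawn as a line chart rather than a bar chart.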
High Fidelity Prototypes
High Fidelity Prototypes were created based on the wireframes to show an additional level of detail for users to review. These contained more detail such as chart and axis labels, colors, and fonts.
Note: High Fidelity Prototypes were created for this project since it was a high profile project and required an additional level of review by stakeholders. We wanted to make sure all the details were in place before starting development.
Review/Validation with Users: Once these prototypes have been created, an additional review session is held to gather feedback and stakeholders are asked to sign off on the final design.
User Testing: I frequently conduct user testing with wireframes or prototypes to ensure the design, navigation, and interactions meet user expectations.
Design Specifications
Once final designs have been signed off on by stakeholders, design specifications for the specific project are created to hand off to the development team to build the dashboard in Tableau.
Specifications include details such as container sizes (in pixels), font colors and sizes, color specifications, etc.
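For illustration, a single design-spec entry of the kind handed to the Tableau developers might look like the following. Every pixel size, font, and hex color below is hypothetical, invented for this sketch; the actual specification values are not shown in this case study.

```python
# Hypothetical example of a design-spec entry for the development team.
# All sizes, fonts, and colors are made up for illustration only.
DESIGN_SPEC = {
    "summary_container": {"width_px": 1200, "height_px": 220},
    "metric_tile":       {"width_px": 280,  "height_px": 160},
    "fonts": {
        "title":      {"family": "Benton Sans", "size_pt": 14},
        "axis_label": {"family": "Benton Sans", "size_pt": 9},
    },
    "colors": {
        "on_target":  "#2E7D32",   # green
        "at_risk":    "#F9A825",   # amber
        "off_target": "#C62828",   # red
    },
}
```

Capturing the spec in a structured form like this keeps the handoff unambiguous: developers can read exact container dimensions and status colors rather than inferring them from a mockup.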
Final Product - Tableau Dashboard
Once the dashboard is built in Tableau by the development team, it is tested by our team, and then users are given access to review and test it. In addition to testing the overall design and functionality, my focus is on optimizing the interaction and navigation.
During the last two phases of our design process - QA & User Acceptance Testing, and Go Live and Support - I ensure the product meets the needs of the users identified up front, that the dashboards adhere to the design specifications, and that any usability issues are identified. Furthermore, I provide training and documentation to users, address feedback from testers, and ensure a smooth transition to the live environment.
Review/Validation with Users: Follow-up feedback sessions are conducted at 3- or 6-month intervals to gather feedback from users and either identify bugs that need to be fixed immediately or collect enhancements for the next revision.
Additional Tableau Dashboards
Additional analytics dashboards I led the design for.