Improving Usability with Data Visualization
Redesigning the SonarQube Dashboard
Background
SonarQube is a popular platform used by development teams to continuously inspect code quality and security.
The dashboard is a crucial part of SonarQube, providing an overview of a project's code quality. However, the previous dashboard generated a steady stream of help tickets and forum questions and contributed to high bounce rates.
My Role
Proposal, User Research, Information Architecture, Wireframes, Prototyping, Usability Testing, Visual Design
The Team
UX Designer, Product Manager, Content Strategist, Engineering Manager, Developers, DevOps, Customer Success, Marketing
Convincing Stakeholders of the Value of Redesigning the Dashboard
Focusing on user-centered design can significantly enhance user experience, drive better engagement, and improve overall productivity.
Together with the Product Manager, I worked on a redesign proposal that laid out why the dashboard should be a priority and how it aligns with business goals. Investing in a user-centered redesign of the SonarQube dashboard is not just a design choice; it is a strategic move toward higher user satisfaction, stronger engagement, and better retention.
Goal
The primary goal of the redesign is to create a more intuitive, interactive, and actionable dashboard that allows users to quickly understand the status of their codebase, prioritize tasks, and take immediate action.
The design should cater to new users who need guidance and experienced users who seek quick insights.
Conducting User Research
User interviews help us understand specific pain points and usability challenges. Because of strict time and resource constraints, we could not interview external clients, so I conducted one-on-one interviews with internal users instead.
By analyzing the feedback from these interviews, we aimed to identify the root causes of the issues and inform the development of a new, more effective dashboard.
Key Insights
Metrics are confusing
The dashboard is difficult to read
It is challenging to extract valuable insights
It is unclear how to take action
It is unclear why a quality gate fails
A well-designed dashboard should give clear, actionable insights and help users quickly understand the current status and take action.
Running a Design Workshop
To address the dashboard redesign effectively, I facilitated a design workshop with the team to foster collaboration, idea generation, and alignment.
Workshop Objectives:
Identify User Pain Points: Present the findings from user research so the team understands the pain points, and gather the team's own observations.
Define Redesign Goals: Establish clear goals that align with business objectives.
Generate Ideas and Solutions: Brainstorm and prototype solutions.
Prioritize Features and Enhancements: Determine the most impactful changes to focus on during the redesign process.
Create an Action Plan: Develop a roadmap for the redesign project, including key milestones, responsibilities, and timelines.
Issues and Pain Points
Unclear Metrics: The current dashboard crams two metrics onto a single line, and the ratings and metrics lack explanations, leading to confusion.
Unclear Prioritization: It is unclear which issues are most critical and need urgent attention.
Lack of Actionability: No contextual help or next steps are provided.
Underutilized Space: Screen space is not used effectively to display useful data and information.
Lack of Interactivity: The static display cannot provide deeper insights or show trends over time.
The current dashboard
Design Approach
After the workshop, we prioritized the problems and defined the design scope according to the timeline.
Improve Information Hierarchy
Emphasize critical data and de-emphasize less important data.
Enhance Data Clarity
Review the metrics and ratings to make sure each has a clear definition, and add tooltips to help new users understand them.
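As a small illustration of the tooltip idea, here is a minimal sketch in TypeScript that pairs each metric with a one-line definition a tooltip could show on hover. The `metricTooltips` map and its wording are hypothetical examples, not SonarQube's actual copy.

```typescript
// Hypothetical tooltip copy keyed by metric name (illustrative wording only).
const metricTooltips: Record<string, string> = {
  Security: "Issues that could be exploited to compromise your application.",
  Reliability: "Issues that could cause bugs or unexpected behavior at runtime.",
  Maintainability: "Issues that make the code harder to read and change.",
  Coverage: "The percentage of code executed by your unit tests.",
};

// Returns the definition a tooltip should render on hover, if one exists.
function tooltipFor(metric: string): string | undefined {
  return metricTooltips[metric];
}
```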
Provide Actionable Insights
Offer specific suggestions to improve code quality directly from the dashboard.
Prioritize Metrics
How many metrics and ratings are there? What do they stand for? Which issues are the most critical? Which do users care about most?
I brought these questions to the team. With the PMs, developers, and the Content Strategist, we reviewed the current criteria, held several rounds of discussion, and clarified and prioritized the metrics. The Content Strategist rewrote the documentation.
Before:
Bugs vs. Reliability, Vulnerabilities vs. Security, Security Hotspots vs. Security Review, Debt, Code Smells, Maintainability, Coverage vs. Unit Tests, Duplications vs. Duplicated Blocks
After:
Bugs, vulnerabilities, and security hotspots are grouped together as issues.
Remove Debt and Code Smells because they are estimates.
Prioritize ratings: Security > Reliability > Maintainability (see the sketch below for the resulting grouping).
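To make the regrouping concrete, here is a minimal sketch of how the prioritized structure could be modeled in TypeScript. The type and field names are illustrative assumptions, not SonarQube's actual data model.

```typescript
// Hypothetical model of the regrouped metrics; names do not mirror SonarQube's real API.
type IssueType = "bug" | "vulnerability" | "security_hotspot";
type Rating = "A" | "B" | "C" | "D" | "E";

interface IssueSummary {
  type: IssueType; // bugs, vulnerabilities, and security hotspots are all "issues"
  count: number;
}

interface DashboardSection {
  name: "Security" | "Reliability" | "Maintainability";
  rating: Rating;
  issues: IssueSummary[];
}

// Sections are listed in priority order: Security > Reliability > Maintainability.
const sections: DashboardSection[] = [
  { name: "Security", rating: "A", issues: [{ type: "vulnerability", count: 2 }, { type: "security_hotspot", count: 5 }] },
  { name: "Reliability", rating: "B", issues: [{ type: "bug", count: 7 }] },
  { name: "Maintainability", rating: "A", issues: [] },
];
```

Keeping the grouping and its order in one place makes the priority explicit and easy to reuse across the dashboard.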
Visualize Hierarchies
How can users quickly understand and act on critical information? How to make good use of the space?
Keeping these questions in mind, I explored several design solutions within the SonarQube visual style, such as adopting a card-based layout for better organization and making the cards interactive for deeper engagement. I removed confusing icons to enhance clarity, incorporated issue severity levels for better prioritization, and added clear calls to action. This approach lets users quickly investigate and address issues by severity, providing a more intuitive and efficient experience.
At the time, the SonarCloud team was working on a brand-new visual style, which we adopted for the redesigned dashboard.
Track Trends Over Time
How has my code quality changed over time? Is SonarQube helping me?
During user research, we discovered that users care a lot about quality trends. We brought these questions to the team, developed new ideas, and ultimately decided on an activity graph. However, we left the implementation for a future release because introducing a new feature requires heavy back-end work.
New Feature: Activity Graph
The "Activity" section provides a historical overview of project analysis with a precise date and time stamp. It is straightforward and allows filtering (e.g., "Issues") to focus on specific types of activities.
Usability Issues
When a quality gate fails, users cannot tell whether it failed on new code or on overall code, or why it failed.
I explored several iterations to show why the quality gate failed. For example, one iteration directly expanded details under the quality gate status card.
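To ground that iteration, here is a minimal sketch of the information the expanded card would need to surface. The types and the `explainFailure` helper are assumptions for illustration, not SonarQube's real quality gate API.

```typescript
// Hypothetical shape of a quality gate result, scoped to new vs. overall code.
type CodeScope = "new code" | "overall code";

interface FailedCondition {
  scope: CodeScope;    // where the failure happened
  metric: string;      // e.g. "Coverage"
  threshold: string;   // e.g. ">= 80%"
  actualValue: string; // e.g. "64%"
}

interface QualityGateStatus {
  passed: boolean;
  failedConditions: FailedCondition[];
}

// Builds the explanation shown when the status card is expanded.
function explainFailure(status: QualityGateStatus): string[] {
  if (status.passed) return ["Quality gate passed."];
  return status.failedConditions.map(
    (c) => `Failed on ${c.scope}: ${c.metric} is ${c.actualValue}, required ${c.threshold}.`
  );
}
```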
Edge Cases
We also worked through edge cases, for example:
The quality gate not yet computed, no historical data, dashboard performance issues with large data sets, and so on.
Key Changes in the Redesigned Dashboard
Clean and Minimalist Layout: The new layout improves readability. "Quality Gate" is displayed at the top, so the main data can use the full width.
Prioritized Critical Issues: Key metrics are grouped into three distinct sections: "Security," "Reliability," and "Maintainability." Unnecessary metrics were removed, leaving only concise, meaningful ones.
Quick Actions: High, Medium, and Low severity labels help users immediately understand issues and take appropriate action.
Present Trends: The "Activity" section provides a historical overview of project analyses with precise date and time stamps.
Improved User Guidance: Integrates guidance elements, like tooltips, to help users understand what each metric means.
Visual Feedback with Alerts and Notifications: The dashboard includes easy-to-read, less intrusive, and informative alerts.
Consistent Color Coding: A consistent color palette lets users assess the health of their code at a glance (see the sketch after this list).
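As an illustration of the color-coding principle, the sketch below maps ratings and severities onto one reusable palette. The token names and hex values are assumptions, not SonarQube's actual design tokens.

```typescript
// Hypothetical color tokens shared by every metric, chart, and alert.
const ratingColors: Record<"A" | "B" | "C" | "D" | "E", string> = {
  A: "#2e7d32", // healthy
  B: "#7cb342",
  C: "#fbc02d",
  D: "#fb8c00",
  E: "#c62828", // critical
};

const severityColors: Record<"High" | "Medium" | "Low", string> = {
  High: "#c62828",
  Medium: "#fb8c00",
  Low: "#fbc02d",
};

// Every widget pulls from the same maps, so a given color always means the same thing.
function colorForRating(rating: keyof typeof ratingColors): string {
  return ratingColors[rating];
}
```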
Usability Test & Success Metrics
The redesigned dashboard was tested with a sample group of SonarQube users, including developers, DevOps engineers, and project managers. The test showed increased overall user satisfaction. The redesign achieved significant improvements in the following areas:
User Engagement: Thanks to the improved visual hierarchy and layout, users reported reduced time spent searching for critical information.
Reduced Learning Curve: New users got up to speed faster thanks to the contextual help and clear metrics, and reported less confusion.
Actionable Insights: The actionable data presentation increased the number of users taking immediate action after identifying issues.
Issue Resolution Time: Teams reported a reduced average time to resolve issues, as the new design made it easier to identify, prioritize, and address code quality concerns.
Suggestions for Further Iterations
Improve the overall UI and consistency across the whole software.
Implement Accessibility requirements.
Introduce more graphs or icons to show trends.
Add more direct actions and customizable elements.
Run more user research and usability tests.
Conclusion
The new SonarQube dashboard significantly improves clarity, usability, and interactivity. Its use of visual hierarchy, actionable insights, and a clean layout aligns well with the redesign goals.
Learnings
Key aspects of data visualization in B2B projects
In B2B projects, effective data visualization is crucial for making informed decisions and communicating insights clearly.
Clarity: Ensure that visualizations are clear and easy to understand. Avoid clutter and present data straightforwardly.
Relevance: Tailor visualizations to the specific needs and interests of the audience. Highlight the most relevant metrics and insights for stakeholders.
Accuracy: Ensure that the data is accurate and up-to-date. Misleading or incorrect visualizations can lead to poor decision-making.
Simplicity: Use simple and intuitive charts and graphs. Overly complex visualizations can confuse the audience and obscure key insights.
Consistency: Maintain a consistent style and format across all visualizations. This helps in maintaining professionalism and makes comparisons easier.
Interactivity: Incorporate interactive elements like filters and drill-downs where appropriate. This allows users to explore the data more deeply and gain personalized insights.
Customization: Adapt visualizations to the specific needs of different stakeholders, such as executives, managers, or analysts.
Actionability: Focus on visualizations that drive action. Provide insights that lead to actionable recommendations or decisions.