Optimizing performance tracking of government policies
The Socrata Performance Management suite of tools is used by many government agencies to track the progress of investment and policy decisions against strategic goals. One way information on the Socrata platform reaches citizens is through public-facing, customer-created webpages that tell a story through data, combining content cards, data visualizations, data tables, photos or videos, and text. Online performance dashboards on the platform give agencies an open, accessible format for sharing how their programs are progressing.
What is performance management? Performance management is the process of structuring data to measure progress against a government agency's goals; this process drives investment and policy decisions to achieve specific outcomes at every level of government, whether city, county, state, or federal.
As the lead product designer on this product area, my team and I were tasked with optimizing the content card to better fit our customers' needs within our redesigned Performance Suite, which now enabled advanced collaboration across agencies' data programs. These content cards serve as quick snapshots on dashboards, sharing progress on performance measures with citizens and the general public. They contain important, at-a-glance information about the performance measure: the calculated data value, the measure's units, the title, targets, and status.
Several constraints accompanied this work, chief among them the publishing tool's legacy breakpoints.
At the time of this project, I had a good sense that the publishing tool's breakpoints were a mess. As I combed through the CSS with an engineer, we discovered five breakpoints in total (three official, two questionable), none of them typical industry breakpoints; they were pegged to the dimensions of older mobile devices.
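To give a sense of the problem, here is a hypothetical illustration of the kind of device-pegged breakpoints we found; the pixel values and comments are invented for this sketch, not pulled from the Socrata codebase:

```css
/* Hypothetical illustration: widths anchored to specific older
   handsets rather than to the content being laid out.
   These values are invented for this sketch. */
@media (max-width: 320px) { /* "official": original iPhone, portrait */ }
@media (max-width: 480px) { /* "official": original iPhone, landscape */ }
@media (max-width: 768px) { /* "official": early iPad, portrait */ }
@media (max-width: 543px) { /* "questionable": no clear device or content rationale */ }
@media (max-width: 801px) { /* "questionable": one pixel off a common tablet width */ }
```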
I also performed an audit of the entire Socrata platform (beyond my product team's area) and a quick survey of industry standards. Across different pieces of the platform, breakpoints were either inconsistent or missing entirely.
As the lead for establishing the design pattern system at Socrata, I paired with a peer designer to weigh these older breakpoints against the current needs of the platform and the product team's future direction. Together we determined the best path forward across areas of the platform.
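For illustration, a consolidated, content-driven breakpoint scale might look like the minimal sketch below; the tier names and pixel values are assumptions, not the actual standard we shipped:

```css
/* Minimal sketch of a consolidated breakpoint scale, documented once
   and reused platform-wide. Tiers and values are illustrative only. */
@media (min-width: 576px)  { /* small: large phones and up */ }
@media (min-width: 768px)  { /* medium: tablets and up */ }
@media (min-width: 1024px) { /* large: laptops and desktops */ }
@media (min-width: 1440px) { /* x-large: wide desktops */ }
```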
Based on consistent customer feedback and observed behavior over time, we knew the candidate pieces of content for these performance measure cards included the calculated data value, the measure's units, the title, targets, and status.
As I explored the old design with these pieces of content, I realized it would be important to determine how much information was enough to be useful to our customers. Wireframing three content design concepts and soliciting feedback from existing customers would help us move forward (see below for the low-fidelity wireframes… and yes, that is Comic Sans).
To determine which pieces of content resonated most with customers, I pulled together a qualitative questionnaire. From past research, we knew customers tended to say that all the information was important, so instead of asking in the abstract, we gave them three non-interactive, rough content card wireframes in a dashboard configuration as fodder for feedback. We asked 11 key people from our performance customers what information was working for them, what wasn't, what was missing, and why. We also had participants rank the types of content found on these dashboard examples in order of preference.
Once the content pieces for the card were determined, I paired heavily with a lead engineer to design and develop the tool's new 6x1 content block. We also cleaned up the responsive breakpoints based on the new standard I had spearheaded, refactoring code and fixing bugs along the way. Core to this challenge was ensuring that most pieces of content could flex within the cards at each of the new breakpoints. We prototyped directly in code, then carefully quality-tested every possible configuration through to implementation.
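As a rough sketch of the approach (not the production code), a 6x1 block can be expressed as a six-column grid that collapses at the standardized breakpoints; the class names and values below are assumptions for illustration:

```css
/* Hypothetical sketch of a 6x1 content block: six card slots in one
   row on large screens, collapsing at the standardized breakpoints.
   Class names and values are illustrative, not production CSS. */
.content-block {
  display: grid;
  grid-template-columns: repeat(6, 1fr); /* 6x1 on wide screens */
  gap: 16px;
}
@media (max-width: 1024px) {
  .content-block { grid-template-columns: repeat(3, 1fr); } /* 3x2 */
}
@media (max-width: 768px) {
  .content-block { grid-template-columns: repeat(2, 1fr); } /* 2x3 */
}
@media (max-width: 576px) {
  .content-block { grid-template-columns: 1fr; } /* single column */
}
/* Long titles truncate so the data value, units, targets, and status
   stay legible at every width. */
.measure-card .card-title {
  overflow: hidden;
  text-overflow: ellipsis;
  white-space: nowrap;
}
```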
Shortly after these changes rolled out, several customers began creating performance measures at scale, with counts numbering in the hundreds. They took advantage of the new card features to build dashboards and initiative-specific webpages that shared progress on their performance measures.
In the end, our customers wanted their end users to avoid misinterpreting the meaning of specific data, and that clarity translates into better outcomes for the constituents these agencies serve.
Live example: Pierce County, WA