Card Content

Optimize performance tracking of government policies

Product
User Experience
UI Design

The Socrata Performance Management suite of tools is used by many government agencies to track the progress of investment and policy decisions against strategic goals. One way information on the Socrata platform is conveyed to citizens is through public-facing, customer-created webpages that tell a story through data, including content cards, data visualizations, data tables, photos or videos, and text. Online performance dashboards on the Socrata platform give agencies an open, accessible way to share how their programs are tracking against those goals.

What is performance management? Performance management is the process of structuring data so it can be measured against a government agency’s goals; this process drives overall investment and policy decisions to achieve specific outcomes at every level of government, whether city, county, state, or federal.

Performance dashboard with content cards

Context

As the lead product designer on this product area, I worked with my team to optimize the content card to better fit our customers’ needs within our redesigned Performance Suite, which now enabled advanced collaboration across agencies’ data programs. These content cards serve as quick snapshots on dashboards, sharing progress on performance measures with citizens and the general public. They contain important, at-a-glance information about the performance measure: the calculated data value, the unit of measure, the title, targets, and status.

Performance reports at the City of Austin in magazine boxes in a bookshelf.
Printed performance reports at the City of Austin.

Challenges

Some of the constraints that accompanied this work included:

  • A history of challenging customer needs, moving at a pace often misaligned with typical agile software development.
  • The cards were embedded in a lightweight, CMS-like publishing tool called Perspectives, built on an outdated Angular tech stack within a larger React ecosystem.
  • A business need to add a smaller-width content block that could accommodate multiple cards in a dashboard format; this was required for feature parity, allowing customers to migrate to the newly revamped Performance Management product.
  • Broken responsive breakpoints based on outdated device widths.
  • Multiple layers of users to consider: the general public or citizens (consumers), plus multiple content creators such as analysts, program managers, and leaders.
6 personas of the Performance Management suite consumers.
Personas developed from prior research of the Performance Management tools.

Responsive breakpoints

Going into this project, I had a good sense that the publishing tool’s breakpoints were a mess. As I combed through the CSS with an engineer, we discovered that there were actually 5 breakpoints (3 official, 2 questionable), none of which matched typical standards and all of which were based on older mobile devices.

Table showing the inconsistent breakpoints of the platform.
Table of inconsistent breakpoint standards.

I also performed an audit of the entire Socrata platform (outside of my product team’s area) and a quick deep dive into industry standards. Breakpoints were either inconsistent or missing across different pieces of the platform.

As the lead for establishing the design pattern system at Socrata, I paired with a peer designer to evaluate these older breakpoints against the current needs of the platform and the product team’s overall future direction. We worked together to determine the best course of action across areas of the platform.

  • As a first step, it was imperative to move all existing platform pages with breakpoints to a consistent standard of defined breakpoints. Once this was done, a fluid, device-agnostic grid could be implemented at the component level across the platform to improve scalability and efficiency (see the sketch after the table below).
  • I created an internal wiki that captured where we were at that point in time, especially since appetite for this breakpoint work across development and design was not high. External challenges included other teams not being ready for, or receptive to, the amount of effort required to implement this standardization across the entire platform. Building support for it across the organization would take a lot of time.
Table showing new breakpoint standards.
New breakpoint standards.
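
To ground the idea of a consistent standard, here is a minimal sketch of how defined breakpoints might be captured as shared tokens with a media-query helper. The names and pixel values below are illustrative placeholders, not the actual Socrata standard.

```typescript
// Illustrative breakpoint tokens; the real values came from the audited standard.
export const breakpoints = {
  small: 480,   // placeholder width in px
  medium: 768,  // placeholder width in px
  large: 1024,  // placeholder width in px
} as const;

export type BreakpointName = keyof typeof breakpoints;

// Build a min-width media query from a named token so every
// component on the platform references the same standard.
export function mediaQuery(name: BreakpointName): string {
  return `@media (min-width: ${breakpoints[name]}px)`;
}
```

Centralizing tokens like this is one way a fluid, device-agnostic grid can be applied at the component level without each page redefining its own widths.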

Concept

Based on consistent customer feedback and observed behavior over time, we knew that candidates for pieces of content on these performance measure cards included the following (a rough data-model sketch follows the list):

  1. Performance measure title
  2. Calculated data value
  3. Unit label for data value
  4. Status color and text label
  5. Date range (new)
  6. Target numbers detail (new)
  7. Link to performance measure
  8. “Ended” flag when a measure ends at a specified date
  9. Trendline of data visualization (new)

As I explored the old design with the above pieces of content, I realized it would be important to determine how much information was enough for our customers. Wireframing 3 content design concepts and soliciting feedback from existing customers would help us move forward (see below for the low-fidelity wireframes…and yes, that is Comic Sans).

Quick feedback

To determine which pieces of content resonated most with customers, I pulled together a qualitative questionnaire. From past research, we knew customers tended to tell us that all the information was important, so we gave them 3 non-interactive, rough content card wireframes in a dashboard configuration as fodder for feedback. We asked 11 key people from our performance customers what information was working for them, what wasn’t working, what was missing, and why. We also had participants rank the types of content found on these dashboard examples in relative order of preference.

Insights

  • For months, we had been told by very vocal customers that the trendline would be a popular component of this card, and yet it was ranked quite low. While trendlines were interesting, a lot of context about the data itself was missing. Other contributing factors included the simple visual look of the wireframes, an openness to misinterpretation without looking at the detailed data visualization, and the trendline being one item too many on the content card. The trendline was removed from consideration.
  • Targets were unexpectedly ranked lower in preference overall, though when they were missing from the design, customers noticed and voiced their concern. The target was helpful and crucial for the display of the performance measure, but lacked clarity (“[Target] [#] [Date or quarter]”). Should it display if the target date has passed/ended? Should it be adjusted depending on the breakpoint and the corresponding card size?

Bar chart of customer ranking of card content types
Customer ranking of card content types.

  • As expected, the 3 main pieces of information on a performance tile (title, big number, and unit label) were ranked as the top 3 choices. These would continue to be a high priority for our customers’ requirements.
  • The character length of the card titles was not meeting the needs of our customers. We needed to lengthen it while setting a character limit, since we knew customers had a tendency to write lengthy descriptions of the performance measure in this field.
  • Displaying all information as desired on every breakpoint wasn't possible.
  • There was user confusion around the role of the content card: does it make sense to convey as much information as possible on the card, or to invite the user to explore the underlying data? The product needed to encourage “click to learn more” behavior by linking to the measure page itself, with its calculated dataset, background and context, and detailed information. The “view measure” link was crucial to accomplishing this.
  • One interesting caveat of these results was that participants experienced these options as webpage visitors rather than as creators, the role they were more accustomed to (see personas above). Seeing these concepts in “view mode” put them in the shoes of their own customers, which likely shaped the feedback from an interesting angle. With more time for testing, it would have been interesting to further explore these multiple layers of perspective.


Design

Once the content pieces for the content card were determined, I pair-designed heavily with a lead engineer, focusing on developing the tool’s new 6x1 content block. We also cleaned up the responsive breakpoints based on the new standard I had spearheaded, in addition to refactoring code and fixing bugs. Core to this challenge was ensuring that most pieces of content could flex within the cards at the new breakpoints. We prototyped directly in code, then carefully quality-tested all the possible configurations through to implementation.
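
As a simplified sketch of the kind of flexing logic involved (the cutoffs and visibility rules here are assumptions for illustration, not the shipped implementation):

```typescript
// Illustrative only: decide which optional card pieces render at a given
// viewport width so the card flexes instead of overflowing when narrow.
const MEDIUM = 768;  // placeholder breakpoint, px
const LARGE = 1024;  // placeholder breakpoint, px

function visibleCardPieces(viewportWidth: number) {
  return {
    title: true,
    value: true,
    unitLabel: true,
    status: true,
    target: viewportWidth >= MEDIUM,    // hide target detail on the narrowest cards
    dateRange: viewportWidth >= LARGE,  // show the full date range only on wide layouts
    viewMeasureLink: true,              // always invite "click to learn more"
  };
}
```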

Prototype of new content cards in the layout of the desktop breakpoint.
In-progress prototype of desktop breakpoint w/ new 6x1 content block

Impact

Shortly after these changes rolled out, several customers began generating large numbers of performance measures. Customers were taking advantage of the new features on these cards, with measure counts numbering in the hundreds, to create dashboards and initiative-specific webpages that shared progress on their performance measures.

Old and new content cards, side by side.
Card - old and new

In the end, our customers wanted their end users to avoid misinterpreting the meaning of specific data, and this leads to positive outcomes for the constituents these agencies serve.

Live example: Pierce County, WA

Scrolling animation of a communications dashboard with the new content cards.
Communications dashboard, Pierce County.

Scrolling animation of a Health and Human Services dashboard with the new content cards.
Health and Human Services dashboard, Pierce County.