ELT Connector Dashboard

The End Product
I delivered an MVP design for a dashboard that effectively communicates critical data points—with associated scenarios, visualizations, and detailed interaction specifications.

My Role
Product Designer

The Problem
Data Engineers and Architects need a dashboard that surfaces useful—and often critical and time-sensitive—information about data pipelines: exporting, loading, and transforming data.

Keywords
UX, UI, Data Pipelines, ELT, ETL, ChatGPT, AI, prototyping, user interviews, wireframing, visual design

1. ELT/ETL Research

Key questions answered:

  1. What are the most important data points that personas need to see in the dashboard, especially potential issues and pain points?

  2. Within the ELT/ETL process, for the specific domain of the company, which key metrics and data points are the most visible and pertinent?

  3. Prioritization: If you could choose just three key metrics, which would they be?

These questions home in on what’s important, and where to surface key information within the final experience and UI.

While I did not limit the dashboard to three key metrics, there is inherent value in prioritizing data points, since they align most closely with my users’ needs and pain points.

2. Themes and Feasibility

Key insights surfaced:

  1. The “top 3” question surfaced themes under which related, more minor data points could also be grouped.

  2. Some key metrics are not really within the company’s scope. For instance, data quality is better left to data warehouse–focused organizations (e.g., Snowflake, GCP) and to the analytics tools that can better analyze and visualize it (e.g., Power BI, Tableau).

3. Main Dashboard Metrics

Key Metrics/Data Points on Dashboard:

  1. Error Rate, Data Transfer Rate, and Data Latency—all closely related and tied to the health of the export and transform processes (but especially export).

  2. Data Volume, Duplication Rate, and Data Availability—all roughly supporting the first 3 data points.

  3. Ongoing Processes—useful to see on the dashboard, especially when there’s an issue. In the example scenario I created, a too-high Average Data Latency value explains why a current data sync process is running slower than usual.
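The latency scenario above can be sketched as a simple threshold check. This is a minimal illustration only—the metric names, threshold value, and sample numbers are hypothetical, not from the actual project:

```python
from dataclasses import dataclass

# Hypothetical alert threshold for illustration; a real value
# would come from the research and the company's SLAs.
LATENCY_THRESHOLD_MS = 500

@dataclass
class PipelineMetrics:
    error_rate: float          # fraction of failed records
    transfer_rate_mbps: float  # data transfer rate
    avg_latency_ms: float      # average data latency

def latency_alert(m: PipelineMetrics) -> bool:
    """Flag when average latency exceeds the threshold,
    e.g. to explain why a current sync is running slowly."""
    return m.avg_latency_ms > LATENCY_THRESHOLD_MS

# Sample values (invented) that would trigger the dashboard warning
metrics = PipelineMetrics(error_rate=0.02,
                          transfer_rate_mbps=120.0,
                          avg_latency_ms=740.0)
print(latency_alert(metrics))  # True: surface the latency warning
```

In the dashboard design, an alert like this would appear next to the Ongoing Processes panel, linking the elevated latency metric to the slow sync.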

4. Key Result and Learnings

  1. I delivered a successful MVP dashboard design within two weeks.

  2. A surprising highlight was how accurately the research identified pain points and appropriate chart types for those data points—despite relying on AI rather than interviews with real data scientists (the key persona).

  3. AI has come a long way and will become a valuable new tool augmenting the UX process and research findings.
