
Dashboard Design

Dashboard design transforms raw data into visual interfaces that enable decision-making. This task covers the complete process from understanding audience needs through deployment, producing dashboards that load within acceptable timeframes and answer the questions users bring to them.

Prerequisites

| Requirement | Detail |
| --- | --- |
| Access | BI platform with dashboard creation permissions; read access to source datasets |
| Permissions | Dataset access for all metrics to be displayed; publish permissions for target workspace |
| Data availability | Source data populated and refreshed on schedule; semantic layer or data model defined |
| Information | List of intended users with roles; business questions the dashboard must answer; refresh frequency requirements |
| Skills | Familiarity with BI platform interface; understanding of source data structure |
| Time | 8-16 hours for initial design through deployment; additional time for iteration |

Before beginning, confirm that the underlying data exists and refreshes reliably. A dashboard built on incomplete or stale data creates confusion rather than insight. Verify data freshness by checking the most recent records in each source table:

-- Check data freshness for each source table
SELECT
    'beneficiary_registrations' AS table_name,
    MAX(updated_at) AS latest_record,
    COUNT(*) AS total_records
FROM beneficiary_registrations
UNION ALL
SELECT
    'programme_activities',
    MAX(updated_at),
    COUNT(*)
FROM programme_activities
UNION ALL
SELECT
    'outcome_measurements',
    MAX(updated_at),
    COUNT(*)
FROM outcome_measurements;

Expected output shows records within your refresh window (daily data should show yesterday or today; weekly data should show within 7 days).
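As a sketch of automating this check, the following Python flags tables whose newest record falls outside the refresh window. It uses an in-memory `sqlite3` database as a stand-in for the warehouse; the table and column names follow the query above, and the fixed "now" is only there to make the example reproducible:

```python
import sqlite3
from datetime import datetime, timedelta

def stale_tables(conn, tables, max_age_days):
    """Return tables whose newest record is older than the refresh window."""
    stale = []
    now = datetime(2024, 11, 16)  # fixed "now" for a reproducible example
    for table in tables:
        row = conn.execute(f"SELECT MAX(updated_at) FROM {table}").fetchone()
        latest = datetime.fromisoformat(row[0]) if row[0] else None
        if latest is None or now - latest > timedelta(days=max_age_days):
            stale.append(table)
    return stale

# Demo with an in-memory database standing in for the warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE beneficiary_registrations (updated_at TEXT)")
conn.execute("INSERT INTO beneficiary_registrations VALUES ('2024-11-15')")
conn.execute("CREATE TABLE programme_activities (updated_at TEXT)")
conn.execute("INSERT INTO programme_activities VALUES ('2024-11-01')")

print(stale_tables(conn, ["beneficiary_registrations",
                          "programme_activities"], 1))
# ['programme_activities'] for a 1-day window
```

A check like this can run after each scheduled refresh and alert when a source goes stale, rather than waiting for users to notice.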

Gather the business questions explicitly before designing. Vague requirements produce unfocused dashboards. Document at minimum: who will use this dashboard, what decisions they make with it, what questions they ask repeatedly, and how frequently they need updated information.

Procedure

  1. Analyse the audience and their information needs

    Identify the primary users and classify them by how they will interact with the dashboard. Operational users check dashboards multiple times daily to monitor ongoing activities and respond to exceptions. Analytical users explore data periodically to understand patterns and investigate issues. Strategic users review high-level metrics weekly or monthly to assess progress against goals.

    For each user group, document:

    • The 3-5 questions they ask most frequently
    • The decisions those questions inform
    • The timeframe of data they need (today, this week, this quarter, trend over time)
    • Their technical comfort level with filters and interactivity

    A programme manager tracking cash distributions might ask: How many distributions completed today? Which distribution points are behind schedule? What is our completion rate against target? These questions suggest an operational dashboard refreshed hourly with emphasis on current status and exceptions.

    A country director reviewing programme health asks different questions: Are we on track for quarterly targets? How do completion rates compare across regions? What is the trend in beneficiary satisfaction? These questions suggest a strategic dashboard refreshed daily or weekly with emphasis on aggregates and trends.

  2. Select metrics and define calculations

Choose metrics that directly answer the documented questions. Resist including metrics merely because they are available; every metric competes for attention. A dashboard with 30 metrics communicates nothing effectively.

    For each metric, specify:

    • Exact calculation including filters, aggregations, and time periods
    • Data source table and columns
    • Refresh frequency and expected latency
    • Target or benchmark for comparison
    • Owner responsible for the metric definition

    Document calculations precisely to prevent ambiguity:

Metric: Distribution Completion Rate
Calculation: (Distributions with status='completed') / (Total planned distributions) × 100
Filter: Current reporting period only
Source: distributions table, filtered by planned_date within period
Refresh: Hourly during operating hours
Target: 95% by end of distribution cycle
Owner: M&E Manager

Validate metric calculations against known values before building. Query the source data directly and compare results with existing reports or manual calculations. Discrepancies discovered after deployment erode trust in the entire dashboard.
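One way to validate is to re-implement the metric definition independently and compare it against a manually calculated reference figure. A minimal Python sketch, using hypothetical records in place of real source rows:

```python
def completion_rate(distributions):
    """Distribution Completion Rate as specified above: completed / planned x 100."""
    total = len(distributions)
    if total == 0:
        return None  # no planned distributions in the period
    completed = sum(1 for d in distributions if d["status"] == "completed")
    return round(completed * 100.0 / total, 1)

# Hypothetical sample standing in for one reporting period of source rows:
# 142 completed out of 150 planned
sample = [{"status": "completed"}] * 142 + [{"status": "planned"}] * 8
assert completion_rate(sample) == 94.7  # agrees with the manual reference
```

If the independent calculation, the dashboard value, and the manual reference all agree, the definition is unambiguous enough to build against.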

  3. Design the layout structure

    Arrange metrics in a hierarchy that matches how users consume information. Users scan dashboards in predictable patterns: left to right, top to bottom, with attention concentrated in the upper left quadrant. Place the most critical metrics where attention naturally falls.

    Apply the inverted pyramid structure: summary metrics at top, supporting detail below, granular data at bottom or accessible through drill-down. A user glancing at the dashboard for 5 seconds should grasp overall status from the top section alone.

+----------------------------------------------------------------+
| DASHBOARD HEADER                                               |
| Title | Date Range Selector | Last Refreshed: 2024-11-16 09:15 |
+----------------------------------------------------------------+
|                                                                |
|  +---------------------------+   +---------------------------+ |
|  | PRIMARY KPI 1             |   | PRIMARY KPI 2             | |
|  | 1,247 / 1,500             |   | 83.1%                     | |
|  | Distributions Today       |   | Completion Rate           | |
|  +---------------------------+   +---------------------------+ |
|                                                                |
|  +---------------------------+   +---------------------------+ |
|  | SECONDARY KPI 1           |   | SECONDARY KPI 2           | |
|  | 12                        |   | 4.2 / 5.0                 | |
|  | Sites Behind Schedule     |   | Avg Satisfaction          | |
|  +---------------------------+   +---------------------------+ |
|                                                                |
+----------------------------------------------------------------+
|                                                                |
|  +----------------------------+  +---------------------------+ |
|  |                            |  |                           | |
|  | TREND CHART                |  | COMPARISON CHART          | |
|  | Daily completions          |  | By region                 | |
|  | over past 14 days          |  |                           | |
|  |                            |  |                           | |
|  +----------------------------+  +---------------------------+ |
|                                                                |
+----------------------------------------------------------------+
|                                                                |
|  +----------------------------------------------------------+  |
|  | DETAIL TABLE                                             |  |
|  | Site   | Planned | Completed | Rate  | Status            |  |
|  | ---------------------------------------------            |  |
|  | Site A | 150     | 142       | 94.7% | On Track          |  |
|  | Site B | 200     | 156       | 78.0% | Behind            |  |
|  +----------------------------------------------------------+  |
|                                                                |
+----------------------------------------------------------------+

Figure 1: Inverted pyramid layout with KPIs at top, trends in middle, detail at bottom

Group related metrics visually. White space between groups signals conceptual separation. Metrics within a group should answer related questions or represent different views of the same phenomenon.

Allocate space proportionally to importance, not data volume. A single KPI that answers the primary question deserves prominent placement even though it contains less data than a detailed table.

  4. Design interactivity and filtering

    Filters enable users to focus on relevant subsets without creating separate dashboards for every combination. Position global filters (those affecting all visuals) at the top of the dashboard where users encounter them first. Position local filters near the visuals they affect.

    Standard filter patterns for programme dashboards:

    • Time period: Default to current reporting period with ability to select historical periods
    • Geography: Country, region, site hierarchy with multi-select
    • Programme: Programme or project filter when dashboard spans multiple
    • Status: Active, completed, cancelled for filtering by record state

    Configure filter defaults to show the most commonly needed view. Users should see useful data immediately without interaction. A distribution monitoring dashboard defaults to today’s date and all active sites; users filter to specific sites only when investigating.

    Implement drill-down paths for users who need to investigate detail. A regional completion rate that seems low should link to site-level detail, which should link to individual distribution records. Each drill level answers the natural follow-up question: “Why is this number what it is?”

Regional Summary       Site Detail            Record Detail
+----------------+     +----------------+     +-------------------+
| Region | Rate  |     | Site   | Rate  |     | ID    | Status    |
|--------|-------| --> |--------|-------| --> | Date  | Recipient |
| North  | 94%   |     | Site A | 98%   |     | Items | Notes     |
| South  | 76%   |     | Site B | 71%   |     |       |           |
+----------------+     +----------------+     +-------------------+

Figure 2: Drill-down path from summary through detail

Limit interactivity to what users need. Every filter adds cognitive load and potential for confusion. A dashboard with 12 filters overwhelms users who need only 3.
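The aggregation behind such a drill path can be sketched in Python with illustrative records (a real dashboard would push this grouping down to the query layer; the field names here are hypothetical):

```python
from collections import defaultdict

def rollup(records, level):
    """Aggregate completion rate (%) at a drill level: 'region' or 'site'."""
    planned = defaultdict(int)
    completed = defaultdict(int)
    for r in records:
        key = r[level]
        planned[key] += 1
        completed[key] += r["status"] == "completed"  # True counts as 1
    return {k: round(completed[k] * 100 / planned[k]) for k in planned}

# Illustrative record-level data; each dict is one distribution
records = (
    [{"region": "North", "site": "Site A", "status": "completed"}] * 49
    + [{"region": "North", "site": "Site A", "status": "planned"}] * 1
    + [{"region": "South", "site": "Site B", "status": "completed"}] * 38
    + [{"region": "South", "site": "Site B", "status": "planned"}] * 12
)
print(rollup(records, "region"))  # {'North': 98, 'South': 76}
print(rollup(records, "site"))    # {'Site A': 98, 'Site B': 76}
```

The same record set answers every drill level, which is why a single well-modelled dataset can feed the whole summary-to-detail path.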

  5. Build the dashboard in the BI platform

    Create visuals systematically, starting with the primary KPIs and working down the hierarchy. Building top-to-bottom ensures you allocate space to priority items first rather than squeezing them into remaining space.

    For each visual:

    • Select chart type appropriate to the data relationship (refer to Data Visualisation Standards for selection guidance)
    • Configure data source and apply metric calculation
    • Set appropriate aggregation level
    • Apply formatting: titles, axis labels, number formats, colours
    • Test with realistic data volumes

    Configure KPI cards with comparison to target or previous period:

KPI Card Configuration:
- Value: SUM(completed_distributions)
- Target: 1,500
- Comparison: vs. same day last week
- Conditional formatting:
  - Green: >= 95% of target
  - Yellow: >= 80% and < 95% of target
  - Red: < 80% of target
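The banding logic above can be sketched as a small Python function (treating the thresholds as half-open intervals so every value, e.g. 94.5%, falls into exactly one band):

```python
def kpi_band(value, target):
    """Map a KPI value to the traffic-light bands in the card config above."""
    ratio = value / target
    if ratio >= 0.95:
        return "green"
    if ratio >= 0.80:
        return "yellow"
    return "red"

assert kpi_band(1_450, 1_500) == "green"   # 96.7% of target
assert kpi_band(1_247, 1_500) == "yellow"  # 83.1% of target
assert kpi_band(1_100, 1_500) == "red"     # 73.3% of target
```

Encoding the thresholds once, rather than per visual, keeps the colour logic consistent across every card that uses the same metric.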

Build trend charts with consistent time granularity. Daily data displayed over 14 days produces readable trends; daily data over 365 days produces noise. Match granularity to the decision timeframe: operational dashboards show hours or days; strategic dashboards show weeks or months.

Configure tooltips to provide context without requiring drill-down. A bar showing regional completion rate should display, on hover: region name, absolute numbers (completed and planned), percentage, and comparison to target.

  6. Optimise performance

    Dashboard load time directly affects adoption. Users abandon dashboards that take more than 10 seconds to load. Test performance with realistic data volumes and concurrent users before deployment.

    Measure current performance by timing full dashboard load:

    • Open browser developer tools (F12)
    • Navigate to Network tab
    • Clear cache and reload dashboard
    • Record time until all queries complete
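Query time is only part of total load time (rendering and network add more), but the data-side share can be measured directly. A Python sketch using an in-memory `sqlite3` database as a stand-in for the platform's query engine:

```python
import sqlite3
import time

def timed_query(conn, sql):
    """Run one dashboard query and report its elapsed wall-clock time."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    return rows, time.perf_counter() - start

# In-memory stand-in for the distributions table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE distributions (status TEXT)")
conn.executemany("INSERT INTO distributions VALUES (?)",
                 [("completed",)] * 900 + [("planned",)] * 100)

rows, elapsed = timed_query(
    conn, "SELECT status, COUNT(*) FROM distributions GROUP BY status")
print(rows, f"{elapsed:.4f}s")
```

Timing each visual's query separately shows which ones dominate the load; those are the first candidates for the aggregation and caching techniques below.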

    Target load times by dashboard type:

    | Dashboard type | Target load time | Maximum acceptable |
    | --- | --- | --- |
    | Operational | Under 3 seconds | 5 seconds |
    | Analytical | Under 5 seconds | 10 seconds |
    | Strategic | Under 8 seconds | 15 seconds |

    Common optimisation techniques:

    Aggregation: Pre-aggregate data in the semantic layer rather than calculating in the dashboard. A visual showing monthly totals should query a monthly aggregate table, not sum daily records at display time.

-- Create monthly aggregate for dashboard performance
CREATE TABLE monthly_distribution_summary AS
SELECT
    DATE_TRUNC('month', distribution_date) AS month,
    region_id,
    programme_id,
    COUNT(*) AS total_distributions,
    SUM(CASE WHEN status = 'completed' THEN 1 ELSE 0 END) AS completed,
    SUM(beneficiary_count) AS total_beneficiaries
FROM distributions
GROUP BY 1, 2, 3;

-- Refresh the aggregate on schedule: run as part of the nightly ETL,
-- or create it as a materialized view where the platform supports one.

Query reduction: Consolidate visuals that query the same data. Ten visuals each querying the same table generate ten queries; a single query feeding multiple visuals performs better.

Caching: Enable query caching for data that does not change frequently. A dashboard showing yesterday’s data does not need to re-query the database for every user.
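Where the platform lacks built-in query caching, the same idea can be approximated in an intermediate service. A minimal time-to-live cache sketch in Python (names and the example payload are hypothetical):

```python
import time

class TTLCache:
    """Minimal query-result cache: serve repeat requests without re-querying."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}

    def get(self, key, compute):
        entry = self.store.get(key)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]                       # cache hit: reuse result
        value = compute()                         # cache miss: run the query
        self.store[key] = (time.monotonic(), value)
        return value

calls = []
def expensive_query():
    """Stand-in for a database query; records how often it actually runs."""
    calls.append(1)
    return {"completed": 1247, "planned": 1500}

cache = TTLCache(ttl_seconds=3600)
cache.get("kpi:today", expensive_query)
cache.get("kpi:today", expensive_query)   # served from cache
print(len(calls))                          # the underlying query ran once
```

The TTL should match the refresh schedule: caching yesterday's data for an hour costs nothing in freshness, since the underlying data only changes at the next refresh anyway.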

Filter optimisation: Apply filters at the data source level rather than filtering displayed results. A filter that restricts query results to one region is faster than querying all regions and hiding unwanted data.

  7. Test with representative users

    Before deployment, test with users from each identified audience group. Provide no instructions beyond the dashboard’s purpose; observe where users hesitate, misinterpret, or cannot find information.

    Structured testing approach:

    • Present the dashboard and state its purpose
    • Ask the user to answer each of the business questions the dashboard addresses
    • Note which visuals they examine, in what order
    • Record questions they ask or confusion they express
    • Time how long each question takes to answer

    A well-designed dashboard enables users to answer documented business questions within 30 seconds each. Questions taking longer indicate unclear layout, missing information, or confusing visualisations.

    Collect specific feedback:

    • Can you find [specific metric]?
    • What does [specific visual] tell you?
    • What would you do if [metric] showed an unexpected value?
    • Is anything missing that you expected to see?
    • Is anything present that you do not need?

    Document findings and prioritise changes by impact on usability.

  8. Deploy and communicate

    Publish the dashboard to the appropriate workspace with access controls matching the intended audience. Restrict access to users who need the information; broader access than necessary creates noise when users encounter dashboards irrelevant to their work.

    Configure refresh schedule to match user expectations documented in requirements. Operational dashboards may require hourly refresh; strategic dashboards may refresh nightly or weekly. More frequent refresh than necessary consumes compute resources without user benefit.

    Create brief documentation for users:

    • Purpose: what questions this dashboard answers
    • Audience: who should use this dashboard
    • Data freshness: when data updates and what lag to expect
    • Key definitions: how primary metrics are calculated
    • Known limitations: what the dashboard does not show
    • Contact: who to reach with questions or issues

    Announce availability through appropriate channels. Include a screenshot showing the dashboard layout so users can assess relevance before opening.

  9. Establish iteration cycle

    Dashboards require ongoing refinement as user needs evolve and issues emerge. Schedule a review 2-4 weeks after deployment to assess adoption and gather feedback.

    Track usage metrics if the BI platform provides them:

    • Number of unique users per week
    • Average views per user
    • Most and least viewed visuals
    • Common filter selections
    • Peak usage times
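If the platform only exposes a raw view log, these metrics can be derived from it. A Python sketch over hypothetical (user, timestamp) events:

```python
from collections import Counter

def adoption_summary(view_events, intended_users):
    """Summarise usage from (user, timestamp) view events for one period."""
    viewers = Counter(user for user, _ in view_events)
    active = set(viewers)
    return {
        "unique_users": len(active),
        "avg_views_per_user": (
            round(sum(viewers.values()) / len(active), 1) if active else 0
        ),
        # intended users who never opened the dashboard: follow up with them
        "never_accessed": sorted(set(intended_users) - active),
    }

# Hypothetical one-week view log: (username, timestamp)
events = [("amina", 1), ("amina", 2), ("joseph", 1), ("amina", 3)]
print(adoption_summary(events, ["amina", "joseph", "lucia"]))
# {'unique_users': 2, 'avg_views_per_user': 2.0, 'never_accessed': ['lucia']}
```

The `never_accessed` list directly supports the follow-up described below: contacting intended users who have not opened the dashboard.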

    Low adoption indicates poor promotion, the wrong audience, or a design that does not meet needs. Investigate by contacting intended users who have not accessed the dashboard.

    Iterate based on evidence rather than requests alone. A user requesting an additional metric may have a legitimate need or may not understand that existing metrics already answer their question. Discuss the underlying question before adding complexity.

Mobile and offline considerations

Field users accessing dashboards on mobile devices or intermittent connections require adapted design. Mobile screens display approximately one-quarter the content of desktop screens; a dashboard designed for desktop becomes unusable on mobile without intentional adaptation.

For mobile access, create a separate mobile-optimised view rather than relying on responsive scaling. The mobile view includes only the highest-priority metrics, typically 4-6 KPIs with minimal interactivity. Users needing full functionality access the desktop version when connectivity allows.

+--------------------+
|   MOBILE HEADER    |
|   Distribution     |
|   Monitoring       |
+--------------------+
|                    |
|       1,247        |
|   Distributions    |
|       Today        |
|                    |
+--------------------+
|                    |
|       83.1%        |
|    Completion      |
|       Rate         |
|                    |
+--------------------+
|                    |
|     12 Sites       |
|      Behind        |
|     Schedule       |
|                    |
+--------------------+
|   [View Details]   |
+--------------------+

Figure 3: Mobile dashboard showing stacked single-metric cards

For offline access, some BI platforms support offline snapshots that users can download when connected and view later without connectivity. Configure critical dashboards for offline availability where the platform supports it. Offline snapshots are static; communicate clearly that offline data reflects the snapshot time, not current state.

Verification

Confirm successful deployment by checking each aspect:

Data accuracy: Compare dashboard values against source queries for the same filters and time period. Run validation queries and compare results:

-- Validation query for completion rate KPI
SELECT
    COUNT(CASE WHEN status = 'completed' THEN 1 END) AS completed,
    COUNT(*) AS total,
    ROUND(COUNT(CASE WHEN status = 'completed' THEN 1 END) * 100.0
          / COUNT(*), 1) AS rate
FROM distributions
WHERE distribution_date = CURRENT_DATE;
-- The dashboard should show identical values

Performance: Measure load time from user perspective, not just query completion. Clear browser cache and load dashboard; time should meet targets established in step 6.

Access control: Verify that intended users can access the dashboard and unintended users cannot. Test with accounts from each access group.

Refresh: Confirm data refresh occurs on schedule. After a scheduled refresh should have completed, check that the “last refreshed” timestamp has updated.

Filters: Test each filter individually and in combination. Verify that filters affect all intended visuals and produce expected results.

Mobile: If mobile access is required, test on actual mobile devices (not just browser simulation). Verify readability and touch targets are accessible.

Troubleshooting

| Symptom | Cause | Resolution |
| --- | --- | --- |
| Dashboard loads slowly (over 10 seconds) | Queries scanning large tables without aggregation | Create pre-aggregated tables; apply filters at query level |
| Dashboard loads slowly | Too many visuals generating separate queries | Consolidate visuals using same data source; reduce total visual count |
| Dashboard loads slowly | Complex calculations in visual layer | Move calculations to semantic layer or pre-compute in ETL |
| Values differ from source system | Filter mismatch between dashboard and comparison query | Verify all filters match, including time zone handling |
| Values differ from source system | Calculation logic differs from source | Document and reconcile calculation definitions; align with source |
| Values differ from source system | Data refresh timing difference | Compare at consistent points; document expected lag |
| Filters not affecting all visuals | Visuals not connected to filter | Check filter configuration; ensure all visuals reference filtered dataset |
| Users report metrics missing | Metric visible only after scroll or interaction | Reorganise layout to show critical metrics without interaction |
| Users report metrics missing | Access control restricting underlying data | Verify user has access to source datasets |
| Refresh failing on schedule | Credential expiration for data source | Update service account credentials; configure credential rotation alerts |
| Refresh failing on schedule | Source system unavailable during refresh window | Adjust refresh schedule to avoid source maintenance windows |
| Mobile display unreadable | Desktop layout scaling to small screen | Create dedicated mobile view with reduced content |
| Export producing blank or truncated output | Export timeout on large dashboards | Reduce data volume for export; export individual visuals |

Variant: Humanitarian operational dashboards

Dashboards supporting humanitarian response have additional considerations beyond standard design.

Multi-language support: Users may need the same dashboard in multiple languages. Where the BI platform supports it, create translated versions of dashboard text (titles, labels, metric names). Where translation is not platform-native, maintain parallel dashboards with consistent layout.

Offline resilience: Field teams in humanitarian contexts frequently lose connectivity. Design dashboards to remain useful with data up to 48 hours old; include prominent “last updated” timestamps so users understand data currency.
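A hypothetical helper for that banner, flagging data older than the 48-hour limit (the function name and message format are illustrative, not platform features):

```python
from datetime import datetime, timedelta

def currency_banner(snapshot_time, now, max_age_hours=48):
    """Label data currency for a 'last updated' banner; hypothetical helper."""
    age = now - snapshot_time
    hours = int(age.total_seconds() // 3600)
    if age > timedelta(hours=max_age_hours):
        return f"STALE: data is {hours}h old (limit {max_age_hours}h)"
    return f"Data as of {snapshot_time:%Y-%m-%d %H:%M} ({hours}h ago)"

now = datetime(2024, 11, 16, 9, 15)
print(currency_banner(datetime(2024, 11, 15, 22, 0), now))
# Data as of 2024-11-15 22:00 (11h ago)
print(currency_banner(datetime(2024, 11, 13, 9, 0), now))
# STALE: data is 72h old (limit 48h)
```

Surfacing staleness explicitly, rather than leaving users to notice it, prevents decisions being made on data that silently aged past its useful window.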

Cluster reporting alignment: Dashboards feeding into cluster reporting (3W, 4W, 5W) must align calculations with cluster definitions. See Humanitarian Reporting Standards for standard indicator definitions and ensure dashboard metrics match.

Security classification: Dashboards displaying beneficiary-level data, location data, or protection information require appropriate access controls. Apply data classification from Data Classification and restrict access accordingly.

See also