Mobile Clinic IT Setup
A mobile clinic IT setup provides self-contained technology infrastructure that travels with a mobile health unit, operates independently of fixed facilities, and maintains data integrity across locations with varying connectivity and power availability. This pattern addresses the fundamental tension between clinical data requirements and the unpredictable infrastructure of mobile operations, where a vehicle or temporary structure must function as a complete healthcare delivery point within minutes of arrival at a new location.
Problem context
Mobile health operations face infrastructure challenges that fixed facilities solve through permanent installations. A vehicle-based clinic arriving at a remote community has no guarantee of grid power, cellular coverage, or internet connectivity, yet clinical staff require functioning devices, access to patient records, and the ability to capture consultation data from the moment doors open. The traditional approach of depending on local infrastructure fails when that infrastructure does not exist or proves unreliable.
Three forces shape this pattern’s requirements. First, clinical workflows cannot wait for IT troubleshooting. When a mobile clinic opens at 08:00 and closes at 16:00, every minute spent resolving connectivity or power issues directly reduces patient consultations. The IT infrastructure must reach operational state within 15 minutes of vehicle arrival. Second, patient data carries legal and ethical obligations that persist regardless of location. A consultation record captured in a tent beside a dirt road requires the same protection as one captured in a hospital. Third, mobile operations accumulate data across multiple sites that must eventually synchronise with central systems, creating reconciliation challenges when the same patient appears at different mobile clinic locations.
This pattern applies when healthcare or similar services deploy from vehicles, temporary structures, or rotating locations where setup and teardown occur regularly. It does not apply to permanent field offices with stable infrastructure, emergency response deployments where speed outweighs optimisation, or operations with reliable grid power and cellular connectivity at all locations. Fixed field sites use the procedures in Field Site IT Establishment, while emergency deployments follow Field Office Rapid Deployment.
Solution
The mobile clinic IT pattern creates a self-contained technology stack that travels as a unit, maintains its own power and connectivity, operates fully offline when necessary, and synchronises data when connections become available. The architecture separates into four layers: power, connectivity, compute, and data, each designed for independence from external infrastructure while enabling integration when that infrastructure exists.
+-------------------------------------------------------------------------+
|                      MOBILE CLINIC IT ARCHITECTURE                      |
+-------------------------------------------------------------------------+
|                                                                         |
|  +-------------------------------+   +-------------------------------+  |
|  |          POWER LAYER          |   |      CONNECTIVITY LAYER       |  |
|  |                               |   |                               |  |
|  |  +----------+  +----------+   |   |  +----------+  +----------+   |  |
|  |  | Vehicle  |  |  Solar   |   |   |  | Cellular |  | Satellite|   |  |
|  |  |  12/24V  |  |  Panel   |   |   |  | Bonding  |  |  Backup  |   |  |
|  |  +----+-----+  +----+-----+   |   |  +----+-----+  +----+-----+   |  |
|  |       |             |         |   |       |             |         |  |
|  |       v             v         |   |       v             v         |  |
|  |  +---------------------+      |   |  +---------------------+      |  |
|  |  |    Battery Bank     |      |   |  |   Router/Failover   |      |  |
|  |  |   (LiFePO4 200Ah)   |      |   |  |     Controller      |      |  |
|  |  +----------+----------+      |   |  +----------+----------+      |  |
|  |             |                 |   |             |                 |  |
|  +-------------+-----------------+   +-------------+-----------------+  |
|                |                                   |                    |
|                v                                   v                    |
|  +-------------------------------------------------------------------+ |
|  |                           COMPUTE LAYER                           | |
|  |                                                                   | |
|  |  +--------------+    +--------------+    +--------------+         | |
|  |  | Local Server |    |   Tablet 1   |    |   Tablet 2   |         | |
|  |  | (sync/cache) |    |  (clinical)  |    |  (clinical)  |         | |
|  |  +------+-------+    +------+-------+    +------+-------+         | |
|  |         |                   |                   |                 | |
|  |         +-------------------+-------------------+                 | |
|  |                             |                                     | |
|  +-----------------------------+-------------------------------------+ |
|                                |                                       |
|                                v                                       |
|  +-------------------------------------------------------------------+ |
|  |                            DATA LAYER                             | |
|  |                                                                   | |
|  |  +----------------+   +---------------+   +-------------------+   | |
|  |  | Offline Cache  |   |  Sync Queue   |   |     Encrypted     |   | |
|  |  | (local SQLite) |   | (pending ops) |   |      Storage      |   | |
|  |  +----------------+   +---------------+   +-------------------+   | |
|  |                                                                   | |
|  +-------------------------------------------------------------------+ |
|                                                                         |
+-------------------------------------------------------------------------+

The power layer provides electrical independence through a combination of vehicle power, battery storage, and optional solar charging.
A battery bank using lithium iron phosphate (LiFePO4) chemistry stores 200Ah at 12V (2.4kWh), sufficient for 8 hours of operation with tablets, a local server, and networking equipment drawing approximately 250W average load. Vehicle alternator charging replenishes the battery during transit, while a 200W solar panel provides supplementary charging during stationary operation. An inverter converts 12V DC to 240V AC for devices requiring mains power, though the design favours 12V and USB-powered equipment to maximise battery efficiency.
The connectivity layer establishes network access through multiple paths with automatic failover. A cellular bonding router aggregates multiple SIM cards from different carriers, combining their bandwidth and providing redundancy when individual networks fail. In areas with adequate cellular coverage, bonded 4G connections deliver 20-50 Mbps aggregate throughput. When cellular becomes unavailable or insufficient, a satellite terminal provides backup connectivity. Modern LEO satellite services offer 50-200 Mbps with 20-40ms latency, though GEO satellite remains an option with 600-800ms latency where LEO coverage does not exist.
The compute layer runs applications and manages data on devices that function independently of network availability. A local server, typically a ruggedised mini-PC or single-board computer, hosts cached reference data, queues synchronisation operations, and provides local services to clinical tablets. Clinical staff interact with tablet devices running applications configured for offline operation, with the local server acting as an intermediary between tablets and cloud systems.
The data layer ensures patient information persists locally when offline and synchronises correctly when connectivity returns. Each tablet maintains an offline cache in local SQLite databases, storing both reference data (medication lists, diagnosis codes, patient demographics for expected visits) and transaction data (consultations, observations, prescriptions). A sync queue tracks pending operations, enabling the system to replay changes against the central system in correct order when connectivity resumes. Encrypted storage protects all data at rest using AES-256 encryption with keys derived from device credentials.
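The offline cache and sync queue described above can be sketched in a few lines of SQLite. This is a minimal illustration, assuming a simplified schema; the table names, column names, and helper functions are assumptions for this sketch, not taken from any specific clinical system.

```python
import json
import sqlite3
import time

def open_cache(path=":memory:"):
    """Create the tablet-side offline cache: reference data plus a sync queue."""
    db = sqlite3.connect(path)
    db.executescript("""
        CREATE TABLE IF NOT EXISTS patients (
            patient_id   TEXT PRIMARY KEY,
            demographics TEXT              -- cached reference data (JSON)
        );
        CREATE TABLE IF NOT EXISTS sync_queue (
            seq       INTEGER PRIMARY KEY AUTOINCREMENT,
            ts        REAL NOT NULL,       -- capture timestamp
            op        TEXT NOT NULL,       -- e.g. 'create_consultation'
            entity_id TEXT NOT NULL,
            payload   TEXT NOT NULL        -- JSON payload to replay upstream
        );
    """)
    return db

def record_offline(db, op, entity_id, payload):
    """Write a transaction locally and queue it for later synchronisation."""
    db.execute(
        "INSERT INTO sync_queue (ts, op, entity_id, payload) VALUES (?, ?, ?, ?)",
        (time.time(), op, entity_id, json.dumps(payload)),
    )
    db.commit()

db = open_cache()
record_offline(db, "create_consultation", "PAT-0042", {"complaint": "cough"})
pending = db.execute("SELECT op, entity_id FROM sync_queue ORDER BY seq").fetchall()
```

Because every write lands in the queue in capture order, the tablet can keep working with no connectivity at all and replay the queue later without losing ordering.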
Implementation
Implementation proceeds through equipment selection, configuration, integration testing, and operational procedures. Each phase builds on the previous, with testing validating assumptions before deployment.
Equipment selection
Power system sizing starts from load calculation. A typical mobile clinic runs two clinical tablets (10W each), one registration tablet (10W), a local server (25W), a cellular router (15W), a satellite terminal (40W when active, 10W standby), and miscellaneous charging (20W), totalling approximately 130W continuous with peaks to 250W during satellite transmission. Eight hours of operation at 150W average requires 1.2kWh. A 200Ah LiFePO4 battery at 12V provides 2.4kWh, allowing 50% depth of discharge while maintaining reserve capacity. This configuration costs approximately £800-1200 for the battery bank, £200-400 for the charge controller, and £150-300 for the inverter.
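The sizing arithmetic above can be captured in a small function: energy is average load times operating hours, and usable battery capacity is limited by the 50% depth-of-discharge rule. This is a sketch of that calculation only; the function name and parameters are illustrative.

```python
def required_battery_ah(load_w, hours, voltage=12.0, max_dod=0.5):
    """Minimum battery capacity in amp-hours for the given duty cycle.

    load_w  -- average continuous load in watts (buffer already included)
    hours   -- operating hours per day
    max_dod -- maximum depth of discharge (0.5 = discharge to 50% only)
    """
    energy_wh = load_w * hours          # daily energy consumption
    return energy_wh / voltage / max_dod

# The worked example from the text: 150W average over an 8-hour clinic day
# at 12V with a 50% depth-of-discharge limit.
ah = required_battery_ah(150, 8)   # 1,200 Wh / 12V / 0.5 = 200 Ah
```

The depth-of-discharge divisor is what turns a 1.2kWh daily requirement into a 2.4kWh (200Ah at 12V) battery specification: half the bank is deliberately held in reserve.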
+------------------------------------------------------------------------+
|                           POWER SYSTEM DESIGN                          |
+------------------------------------------------------------------------+
|                                                                        |
|  LOAD CALCULATION                 BATTERY SIZING                       |
|  +-----------------------+        +---------------------------+       |
|  | Component     | Watts |        | Required capacity         |       |
|  +-----------------------+        |  = Load x Hours           |       |
|  | Tablets (3)   |    30 |        |  = 150W x 8h              |       |
|  | Server        |    25 |        |  = 1,200 Wh               |       |
|  | Router        |    15 |        |                           |       |
|  | Satellite     |    40 |        | Battery spec              |       |
|  | Charging      |    20 |        |  = 1,200 / 12V / 0.5 DoD  |       |
|  | Buffer        |    20 |        |  = 200 Ah minimum         |       |
|  +-----------------------+        | Select: 200Ah LiFePO4     |       |
|  | Total         |   150 |        +---------------------------+       |
|  +-----------------------+                                            |
|                                                                        |
|  CHARGING SOURCES                                                      |
|  +-----------------------------------------+                           |
|  | Source           | Rate | Daily Ah      |                           |
|  +-----------------------------------------+                           |
|  | Vehicle (2h)     | 40A  |  80 Ah        |                           |
|  | Solar (6h peak)  | 12A  |  72 Ah        |                           |
|  | Combined         |      | 152 Ah        |                           |
|  +-----------------------------------------+                           |
|  | Daily use at 150W = 100 Ah              |                           |
|  | Surplus margin = 52 Ah (weather buffer) |                           |
|  +-----------------------------------------+                           |
|                                                                        |
+------------------------------------------------------------------------+

Connectivity equipment selection depends on operating locations. For operations within cellular coverage, a Peplink MAX Transit Duo or similar multi-SIM router provides carrier bonding and failover for £600-900. Satellite backup adds £400-2,500 for equipment depending on service (Starlink at the lower end, traditional VSAT at the higher end) plus monthly service fees of £75-500. The cellular router connects to a compact WiFi access point creating a local network for tablets and the server.
The local server runs on a fanless industrial mini-PC with solid-state storage, consuming under 25W while providing sufficient processing for local database operations and synchronisation management. An Intel NUC-class device with 16GB RAM and 512GB SSD costs £400-700 and survives the vibration and temperature variation of vehicle operation when properly mounted.
Tablets for clinical use require capacitive touchscreens usable with medical gloves, displays readable in bright sunlight, IP65 or better ingress protection, and battery life exceeding 8 hours. Consumer tablets rarely meet all requirements; ruggedised devices from Samsung (Galaxy Tab Active series), Zebra, or Panasonic (Toughbook/Toughpad) cost £400-800 each but survive operational conditions that destroy consumer equipment within months.
Configuration
Server configuration establishes the local hub that tablets communicate with during operation. The server runs a lightweight Linux distribution (Ubuntu Server or Debian) with services for local DNS, DHCP, time synchronisation, and application-specific caching. A reverse proxy (nginx or Caddy) provides local endpoints that clinical applications connect to, routing requests to cached data when offline and to cloud services when online.
upstream cloud_backend {
    server api.clinicalsystem.example.org:443;
}

server {
    listen 80;
    server_name clinic-local.internal;

    # Health check endpoint - always local
    location /health {
        return 200 'OK';
    }

    # Patient lookup - try cache first
    location /api/patients {
        try_files /cache$uri @cloud;
    }

    # Consultation submission - queue if offline
    location /api/consultations {
        proxy_pass http://127.0.0.1:8080/queue;
    }

    # Fallback to cloud
    location @cloud {
        proxy_pass https://cloud_backend;
        proxy_connect_timeout 5s;
        error_page 502 503 504 = /offline-fallback;
    }

    location /offline-fallback {
        return 503 '{"status":"offline","cached":true}';
    }
}

Tablet configuration focuses on offline capability and data protection. Device management through MDM (Intune, Jamf, or open source alternatives like Fleet) enforces encryption, screen lock policies, and application restrictions. Clinical applications install in kiosk or managed mode, preventing staff from installing additional software or accessing device settings. Each tablet caches reference data during the pre-shift synchronisation window, downloading patient lists for expected visits, current medication formularies, and diagnosis code sets.
Connectivity failover configuration on the cellular router establishes priority order and failover triggers:
# Peplink configuration excerpt (conceptual)
wan_priority:
  - interface: cellular_slot_1
    carrier: "Primary Carrier"
    priority: 1
    health_check: ping 8.8.8.8 interval 30s threshold 3

  - interface: cellular_slot_2
    carrier: "Secondary Carrier"
    priority: 2
    health_check: ping 8.8.8.8 interval 30s threshold 3

  - interface: satellite
    priority: 3
    activation: on_demand
    health_check: ping satellite_gateway interval 60s threshold 2

bonding:
  enabled: true
  interfaces: [cellular_slot_1, cellular_slot_2]
  algorithm: weighted_round_robin

failover:
  trigger: all_primary_down OR latency > 2000ms
  target: satellite
  timeout: 10s

Data flow design
The data flow pattern handles three scenarios: connected operation, degraded operation, and offline operation. In connected operation, tablets communicate through the local server to cloud systems in real-time, with the local server caching responses for resilience. In degraded operation (high latency or limited bandwidth), tablets prioritise local cache reads and queue writes for background synchronisation. In offline operation, tablets function entirely from local data, queuing all write operations for later synchronisation.
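The three operating modes can be expressed as a simple classifier over measured link quality. The thresholds (200ms latency, 1 Mbps bandwidth) come from the text; the function and state names are assumptions for this sketch.

```python
def connectivity_state(latency_ms, bandwidth_mbps):
    """Map measured link quality to one of the three operating modes."""
    if latency_ms is None or bandwidth_mbps is None:
        return "offline"     # no measurable link: tablets use local data only
    if latency_ms < 200 and bandwidth_mbps > 1.0:
        return "connected"   # real-time reads/writes through the local server
    return "degraded"        # cached reads, queued writes, batch upload

# Bonded 4G on a good day vs. a marginal GEO satellite link:
connectivity_state(45, 25)    # -> "connected"
connectivity_state(700, 0.5)  # -> "degraded"
```

Keeping the mode decision in one place lets the local server switch the tablets between real-time, queued, and fully offline behaviour without each application re-implementing the thresholds.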
+------------------------------------------------------------------------+
|                   DATA FLOW BY CONNECTIVITY STATE                      |
+------------------------------------------------------------------------+
|                                                                        |
|  CONNECTED (latency < 200ms, bandwidth > 1 Mbps)                       |
|  +----------+     +----------+     +----------+     +----------+       |
|  |  Tablet  |---->|  Local   |---->|  Cloud   |---->| Central  |       |
|  |          |     |  Server  |     | Gateway  |     | Database |       |
|  +----------+     +----+-----+     +----------+     +----------+       |
|                        |                                               |
|                        v                                               |
|                   Cache update                                         |
|                                                                        |
|  DEGRADED (latency > 200ms OR bandwidth < 1 Mbps)                      |
|  +----------+     +----------+     +----------+                        |
|  |  Tablet  |---->|  Local   |---->|  Sync    |                        |
|  |  (reads  |     |  Server  |     |  Queue   |                        |
|  |  cached) |     |  (writes |     |  (batch  |                        |
|  +----------+     |  queued) |     |  upload) |                        |
|       ^           +----------+     +----+-----+                        |
|       |                                 |                              |
|  Read from cache           Background sync when                        |
|                            bandwidth allows                            |
|                                                                        |
|  OFFLINE (no connectivity)                                             |
|  +----------+     +----------+     +----------+                        |
|  |  Tablet  |<--->|  Local   |     |  Sync    |                        |
|  |  (full   |     |  Server  |     |  Queue   |                        |
|  |  offline)|     |  (offline|     |  (grows) |                        |
|  +----------+     |  cache)  |     +----------+                        |
|                   +----------+                                         |
|                                                                        |
|  All operations use local data                                         |
+------------------------------------------------------------------------+

The sync queue implements an operation log pattern, recording each data modification as an immutable entry with timestamp, operation type, entity identifier, and payload. When connectivity resumes, the synchronisation service replays operations in timestamp order against the cloud API, handling conflicts through last-write-wins with conflict logging for manual review. A 100-patient clinic day generates approximately 500-800 sync queue entries (registrations, consultations, prescriptions, referrals), requiring roughly 2-5 MB of queue storage.
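The replay-with-last-write-wins logic can be illustrated in a few lines. This is a minimal sketch under simplifying assumptions: the remote store is modelled as a dictionary rather than a real cloud API, and the operation shape mirrors the log entries described in the text.

```python
def replay_queue(ops, remote_state):
    """Replay queued operations oldest-first against remote_state.

    Last-write-wins by timestamp; entries that lose to a newer remote
    version are returned for manual conflict review rather than applied.
    """
    conflicts = []
    for op in sorted(ops, key=lambda o: o["ts"]):
        current = remote_state.get(op["entity_id"])
        if current is not None and current["ts"] > op["ts"]:
            conflicts.append(op)      # a newer version already exists remotely
            continue
        remote_state[op["entity_id"]] = {"ts": op["ts"], "payload": op["payload"]}
    return conflicts

# Two queued mobile-clinic edits arrive after a more recent central edit:
ops = [
    {"ts": 10, "entity_id": "PAT-1", "payload": "mobile consultation"},
    {"ts": 5,  "entity_id": "PAT-1", "payload": "mobile registration"},
]
remote = {"PAT-1": {"ts": 20, "payload": "central edit"}}
lost = replay_queue(ops, remote)   # both queued ops lose and are logged
```

Replaying in timestamp order rather than arrival order is what keeps two tablets at the same site, or two sites visited by the same patient, from interleaving their edits incorrectly.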
Security implementation
Patient data protection requires encryption at rest, encryption in transit, access control, and audit logging regardless of connectivity state. Tablets encrypt their local SQLite databases using SQLCipher with a key derived from user authentication credentials combined with a device-specific secret. The local server encrypts its cache partition using LUKS with a key stored in TPM where available or entered at boot where not.
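Deriving a database key from user credentials combined with a device-specific secret can be sketched with PBKDF2 from the standard library. This is a conceptual illustration only, not a vetted production scheme; the function name and parameter choices are assumptions, and a real deployment would follow the key-management guidance of the chosen encryption product.

```python
import hashlib

def derive_db_key(passphrase: str, device_secret: bytes, iterations: int = 600_000):
    """Return a 32-byte (256-bit) key binding the user credential to one device.

    The device secret serves as the salt, so the same passphrase yields a
    different key on every device -- a stolen database cannot be opened
    elsewhere even with the correct passphrase.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode("utf-8"), device_secret, iterations, dklen=32
    )

key = derive_db_key("clinician-passphrase", b"per-device-provisioned-secret")
```

The derived key is fed to the database layer (SQLCipher on the tablets) rather than stored; it exists only while the user is authenticated.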
Transit encryption uses TLS 1.3 for all communications, including tablet-to-server traffic over the local WiFi network. The local server presents a certificate for clinic-local.internal, installed in tablet trust stores during provisioning. This prevents man-in-the-middle attacks even on the local network segment.
Access control requires clinical staff to authenticate before accessing patient data. Authentication checks against locally cached credentials when offline, with credential cache expiring after 7 days without connectivity to force re-authentication. Role-based access control limits functionality by staff type: clinical officers access consultation functions, registration clerks access demographic functions, and administrators access synchronisation and configuration functions.
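The 7-day offline credential window reduces to a single timestamp comparison. The sketch below assumes a cached-credential record with a `last_online_auth` field; the field and function names are illustrative.

```python
import time

SEVEN_DAYS = 7 * 24 * 3600  # offline validity window in seconds

def offline_login_allowed(cached_entry, now=None):
    """True while the cached credential is within its 7-day offline window.

    After the window expires the tablet must reach the central identity
    service before the user can authenticate again.
    """
    now = now if now is not None else time.time()
    return (now - cached_entry["last_online_auth"]) <= SEVEN_DAYS
```

The check runs entirely on-device, so authentication keeps working through a week of zero connectivity while still forcing eventual revalidation of staff who may have left the programme.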
Audit logging captures all data access and modification events, storing logs locally until synchronisation uploads them to central systems. Log entries include timestamp, user identifier, action type, affected records (by identifier, not content), and outcome. Local log storage retains 30 days of entries, sufficient for any reasonable offline period.
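The entry shape described above (timestamp, user, action, record identifiers rather than content, outcome) can be sketched as a small builder that produces JSON lines for local append-only storage. The helper and field names are assumptions for this illustration.

```python
import json
import time

def audit_entry(user_id, action, record_ids, outcome, ts=None):
    """Build one append-only audit record for local storage until sync.

    Only record identifiers are logged, never clinical content, so the
    audit trail itself carries minimal patient data.
    """
    return {
        "ts": ts if ts is not None else time.time(),
        "user": user_id,
        "action": action,
        "records": list(record_ids),   # identifiers only, no content
        "outcome": outcome,
    }

# One line per event, appended to the local log until synchronisation:
line = json.dumps(audit_entry("user-7", "read", ["PAT-0042"], "success"))
```

Serialising each event as an independent JSON line keeps the local log append-only and trivially uploadable in batches when connectivity returns.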
Consequences
This pattern provides predictable operation across varying conditions, enabling clinical staff to deliver services without IT dependency at each location. Setup time reduces from hours of troubleshooting to minutes of power connection and system verification. Data integrity improves through structured offline handling rather than ad-hoc workarounds when connectivity fails.
The trade-offs are significant. Equipment costs increase substantially compared to relying on local infrastructure. A complete mobile clinic IT kit costs £3,000-6,000 for equipment plus ongoing connectivity costs of £100-500 monthly for cellular and satellite services. Operational complexity increases as staff must understand power management, recognise connectivity states, and respond appropriately to sync status indicators. Technical support becomes more challenging when issues occur at remote locations beyond easy reach.
Synchronisation introduces conflict potential that does not exist with real-time connected systems. When the same patient visits two mobile clinics before either synchronises, their records diverge. The conflict resolution mechanism handles most cases automatically, but edge cases require manual review. Organisations using this pattern need processes for conflict review and resolution, typically 1-2 hours weekly for a busy mobile clinic programme.
Battery dependency creates operational discipline requirements. Staff must connect vehicle charging during transit, position solar panels when stationary, and monitor battery state. A clinic that depletes its battery mid-day has no fallback. The 200Ah specification provides margin for missed charging or cloudy days, but extended poor conditions eventually exhaust reserves.
Variants
Vehicle-based variant
The vehicle-based configuration mounts all IT equipment permanently in a converted vehicle (van, truck, or bus), using vehicle power systems as the primary power source. The vehicle alternator charges the battery bank during transit, and an auxiliary battery isolator prevents clinic loads from draining the starter battery. This variant suits programmes with dedicated vehicles, predictable routes, and daily return to base for charging.
Equipment mounts to vehicle structure using vibration-dampening brackets. The local server and networking equipment occupy a lockable cabinet accessible from inside the vehicle. Tablets charge from vehicle USB ports during transit. External antenna mounts on the vehicle roof provide optimal cellular and satellite reception.
Vehicle integration adds £500-1,500 for mounting hardware, cable routing, and auxiliary electrical installation. The benefit is elimination of setup/teardown for IT equipment, reducing site arrival time to under 5 minutes from vehicle stop to first patient registration.
Tent-based variant
The tent-based configuration packages IT equipment in transportable cases that deploy within temporary structures. This variant suits programmes using aircraft, boats, or porters where vehicle access is impossible, or where multiple temporary sites operate simultaneously from shared equipment pools.
Equipment organises into three cases: power case (battery, charge controller, inverter), connectivity case (router, satellite terminal, cables), and compute case (server, tablets, accessories). Each case weighs under 23kg to meet airline checked baggage limits. Setup requires 20-30 minutes for a trained operator to unpack, connect, and verify systems.
Solar charging becomes primary rather than supplementary, as vehicle charging is unavailable. The solar panel specification increases to 300-400W with a portable mounting frame, adding £300-500 and 15kg to the equipment manifest. Battery capacity may increase to 300Ah for multi-day operations without solar charging opportunity.
Multi-site rotation variant
The multi-site rotation configuration operates across a fixed set of locations visited on regular schedules, enabling location-specific optimisation. Each site maintains a power cabinet with battery and charging infrastructure, while the mobile team transports only tablets and networking equipment.
Site cabinets contain a 200Ah battery, charge controller, and solar panel connection, remaining in place between visits. This configuration suits weekly or fortnightly rotation schedules where transporting batteries on each visit wastes capacity and increases wear.
Pre-positioned equipment adds £1,500-2,500 per site but reduces transported weight by 25kg and eliminates battery condition concerns from frequent transport. Site cabinets require periodic maintenance visits (quarterly) to verify battery health and solar panel condition.
Anti-patterns
Infrastructure dependence
Designing for expected infrastructure rather than worst-case infrastructure produces fragile systems that fail unpredictably. A mobile clinic configuration that requires cellular connectivity to function provides no value when cellular coverage proves unavailable at site locations. The symptom is clinical staff unable to work when an expected resource fails, followed by accumulating workarounds that compromise data integrity.
The pattern requires full functionality at zero connectivity as the baseline design, with connectivity enhancing rather than enabling operations. Every feature must answer the question “what happens when offline?” with a functional response, not an error message.
Consumer equipment
Consumer electronics optimised for domestic use fail rapidly in mobile clinic conditions. A consumer tablet survives perhaps 6 months of daily vehicle transport, temperature extremes, dust exposure, and handling by multiple users with varying care levels. Consumer WiFi routers without vibration tolerance develop loose connections. Consumer power inverters lack the robustness for continuous mobile operation.
The capital cost difference between consumer and ruggedised equipment (often 2-3x) recovers within the first year through reduced replacement, reduced downtime, and reduced support burden. A £300 consumer tablet replaced three times costs more than a £700 ruggedised tablet that lasts the project duration.
Synchronisation afterthought
Adding offline capability to a system designed for continuous connectivity produces architectural complications and data integrity risks. Applications must design for offline operation from initial architecture, with synchronisation as a core capability rather than an added feature. Retrofitted offline support typically manifests as inconsistent behaviour, unexpected conflicts, and data loss under edge conditions that testing did not anticipate.
The sync queue and conflict resolution mechanisms require careful design and thorough testing across connectivity transition scenarios. Cutting corners on synchronisation design creates problems that surface months later when accumulated edge cases overwhelm manual resolution capacity.
Insufficient power margin
Sizing power systems to exactly match expected load leaves no margin for increased consumption, degraded battery capacity, or reduced charging opportunity. A battery bank that provides exactly 8 hours of operation at design load provides 6 hours after a year of capacity degradation, 5 hours on a cloudy day that reduces solar charging, and 4 hours when an unexpected load appears.
The 50% depth-of-discharge limit in the pattern provides the necessary margin. Designs that discharge batteries to 20% remaining capacity on a normal operating day have no reserve for abnormal days.
Single point of failure
Connectivity designs with one cellular SIM, power designs with one charging source, or compute designs with one server create single points of failure that eventually cause complete service interruption. Mobile operations experience higher failure rates than fixed installations due to environmental stress and handling damage.
The pattern’s redundancy through cellular bonding, multiple charging sources, and tablet independence from server availability addresses common failure modes. Each critical function must have a fallback that maintains operation at reduced capability rather than failing completely.
See also
- Field Site IT Establishment for permanent field office infrastructure
- Solar and Off-Grid Power for detailed power system design
- Offline System Configuration for offline application setup procedures
- Data Synchronisation Setup for synchronisation configuration
- Intermittent Connectivity Patterns for connectivity-aware application design
- Case Management Systems for clinical system options
- Protection Data Principles for patient data handling principles
- Satellite Connectivity for satellite service options