🏠 Mobile App · IoT

Smart Home

🏠 Unified device control 📱 iOS · Dark UI ⚡ Room-first navigation

A mobile IoT control app that replaces fragmented device apps with a single, room-first interface — atmospheric, fast, and always in context.

📱 Mobile App 🏠 Smart Home 🔌 IoT 🔬 UX Research 🌑 Dark UI ⚛️ React Native
Duration
3 months
Year
2024
Role
UX Developer
Platform
iOS · Mobile
Room grid
Smart Home bedroom control

One app.
Every room.

Smart Home is a mobile application for unified IoT device control — managing lighting, air quality, and smart devices across every room from a single, intuitive interface.

The project emerged from a real frustration: the average connected home requires 3–5 separate apps to control its devices. One for lights, one for the thermostat, one for the air purifier, another for the vacuum. Every "quick" adjustment became a multi-app juggling act.

The design challenge was to replace this fragmented experience with a room-first navigation model — one that mirrors how people actually think about their home. Not "which device?" but "which room?"

The visual direction pairs atmospheric photography with a warm dark UI system, making controlling your home feel as natural as being in it.

Project Brief
My Role
UX Research · UI Design · Dev
Team
Solo
Duration
3 months
Platform
iOS · Mobile
Tools
Figma · React Native · Expo

What I found

Three friction points that defined the design direction — uncovered through user interviews and competitive analysis of 8 existing smart home apps.

INSIGHT 01
📱

App Fatigue

The average smart home owner switches between 3–5 separate apps to manage their devices. Each context switch costs ~20 seconds of navigation overhead — making "quick" adjustments anything but quick.

INSIGHT 02
🗺️

No Spatial Context

Existing apps organize devices alphabetically or by brand — not by physical location. Users reported confusion about which "Light #4" belonged to which room, leading to trial-and-error to find the right control.

INSIGHT 03
🚧

Onboarding Friction

Smart device setup required 8–14 steps, with technical terms like "WPA2 passphrase" surfacing during first run. Most users stopped at their second device, leaving the bulk of hardware disconnected.

"How might we make controlling your home's atmosphere feel as natural as walking from room to room?"

Root Cause

The core problem isn't technology — it's a mental model mismatch. Users think in places ("the bedroom is too warm"), but apps force them to think in devices ("Xiaomi DEM-F600, channel 2"). Bridging this gap required a complete restructure of the navigation hierarchy.

Why it works
the way it does

Every major design choice traces back to a specific friction point found in research.

01
Navigation Architecture

Room-First, Not Device-First

Problem solved
Users couldn't find devices because apps used device-centric navigation — requiring them to remember product codes instead of locations.
Choice made
Entry point is a room grid (Bedroom, Kitchen, Living Room). Devices only appear after selecting a room.
Why this choice
People's mental model of home is spatial. "I want to control the bedroom" is a natural thought — "find device BRK-042" is not.
02
Visual Wayfinding

Ambient Photography as Interface

Problem solved
Text labels and icons alone created ambiguity — users second-guessed which control panel belonged to which room.
Choice made
Real atmospheric room photographs as full-bleed backgrounds throughout room control screens and navigation grid.
Why this choice
Photos activate spatial memory faster than text. A user recognises their bedroom in under 0.2s — no reading required. Photography becomes functional, not decorative.
03
Information Architecture

Progressive Disclosure Controls

Problem solved
Showing all device parameters at once caused cognitive overload and slowed down simple tasks like checking room humidity.
Choice made
Each room shows 2 key metrics (humidity %, air quality) + on/off toggles. Fine-control sliders appear below — accessible but never prominent.
Why this choice
80% of interactions are quick status checks or simple on/off. Detail is available for the 20% who need it — never taxing the 80% who don't.

The interface

The House screen is the core navigation hub — a room grid that puts spatial orientation first, with every detail one tap away.

Smart Home house overview with room grid
  • 1
    WiFi Network Indicator
    Always-visible connection status — users instantly know the app is live and connected to their local network.
  • 2
    Atmospheric Room Header
    Minimal top chrome so room content takes center stage. House label and profile shortcut anchor navigation.
  • 3
    Room Photo Grid
    Each room card uses real atmospheric photography. One tap enters full room control. Photos remove all ambiguity about which room is which.
  • 4
    Contextual Bottom Navigation
    Home · Search · Grid · Profile — four core actions always reachable with a thumb, regardless of current screen.

Results that matter

Numbers from task analysis and usability testing — honest metrics reflecting what was actually built and measured.

1
app replacing 3–5 separate device apps in the unified user flow
8
complete screens covering the full user journey from login to room control
~70%
fewer taps to reach any device control vs multi-app workflow (task analysis)
"Finally, one place for everything. I checked the bedroom humidity before I even sat down — didn't have to think about which app."
— User feedback, usability testing session

How it's built

The technical decisions that made the design vision possible.

Stack

React Native Expo React Navigation AsyncStorage Figma

React Native with Expo for rapid cross-platform development. The dark UI system — all colors, radii, and spacing — was prototyped in Figma first and mirrored 1:1 in code using a shared design token file.
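
A minimal sketch of what such a shared token file might look like — the names and values here are illustrative, not the project's actual palette:

```typescript
// theme.ts — single source of truth, mirrored from the Figma styles.
// All names and values are illustrative placeholders.
export const tokens = {
  color: {
    background: "#141210", // warm dark base
    surface: "#1E1B18",
    accent: "#E8A34D",
    textPrimary: "#F5F1EA",
  },
  radius: { card: 16, control: 12 },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 },
} as const;

export type Tokens = typeof tokens;
```

Because every component reads from this one object, a Figma change becomes a one-line code change, and `as const` lets TypeScript catch typos in token names at compile time.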

Architecture

Room-based state model: each room holds its device states as a single object. All controls sync through a central state layer — no prop drilling, no stale UI between screens.

const roomState = {
  bedroom: {
    humidity: 76,
    purifier: 33,
    lights: { main: true, floor: false }
  }
};
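
The update path can be sketched as a pure function over that state shape — a simplified stand-in for whatever central store (Context, Redux, etc.) the app actually uses:

```typescript
// Room-based state model: each room owns its device state, and every
// update flows through one pure function, so all screens read from the
// same object and no screen holds a stale copy.
type Lights = Record<string, boolean>;
type RoomState = { humidity: number; purifier: number; lights: Lights };
type HomeState = Record<string, RoomState>;

const initial: HomeState = {
  bedroom: { humidity: 76, purifier: 33, lights: { main: true, floor: false } },
};

// Immutable update: returns a new state object, which is what lets a
// central store notify every subscribed screen of the change.
function toggleLight(state: HomeState, room: string, light: string): HomeState {
  const current = state[room];
  return {
    ...state,
    [room]: {
      ...current,
      lights: { ...current.lights, [light]: !current.lights[light] },
    },
  };
}
```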

Device Discovery

Devices are detected via local WiFi scanning (mDNS). The Search screen auto-discovers supported hardware on the network — no QR codes, no manual IP entry.

  • Auto-detection on network join
  • Manual fallback for unrecognised devices
  • Devices assigned to rooms on first connect
  • Discovery state persisted via AsyncStorage
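
The persistence step above can be sketched as follows. AsyncStorage is mocked with an in-memory Map so the logic runs anywhere; the real app would import `@react-native-async-storage/async-storage`, whose `getItem`/`setItem` have the same async string-based signature:

```typescript
// Sketch of persisting discovered devices between app launches.
// The Device shape and "discovered" key are illustrative assumptions.
type Device = { id: string; name: string; room: string | null };

const storage = new Map<string, string>(); // in-memory stand-in for AsyncStorage
const setItem = async (key: string, value: string) => { storage.set(key, value); };
const getItem = async (key: string) => storage.get(key) ?? null;

// Save the discovered-device list so the Search screen survives restarts.
async function saveDiscovered(devices: Device[]): Promise<void> {
  await setItem("discovered", JSON.stringify(devices));
}

// Load it back on launch; an empty list means a fresh install.
async function loadDiscovered(): Promise<Device[]> {
  const raw = await getItem("discovered");
  return raw ? (JSON.parse(raw) as Device[]) : [];
}
```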

What I'd do differently

Key takeaways

  1. Mental models matter more than feature lists. I almost built a comprehensive device dashboard before user research revealed that people think in rooms, not devices. The research sprint paid back 10× in rework avoided.
  2. Photography is interface, not decoration. The atmospheric room backgrounds weren't visual filler — they were the primary navigation cue. Treating photography as a functional element opened up the entire design direction.
  3. Onboarding deserves its own design sprint. The login and device search flow was designed last and it shows. Most friction in testing occurred there. Next time, I'd start with onboarding before designing any feature screens.

What's next for Smart Home

Voice control — trigger common room presets with natural language commands
Energy consumption tracking per room — daily summaries with anomaly alerts
Automation rules — time-based triggers ("When I go to bed, dim everything to 10%")
Multi-home support — for users managing more than one property