GET IN TOUCH

cyrilkups95@gmail.com

Braille Technology

Year

2026

Tech & Technique

Swift, SwiftUI, Core Haptics, Apple Intelligence, MVVM, Combine, XCTest

Description

Braille Technology is an AI-powered tactile interface that translates structured digital content into patterned haptic output. It replaces visual screens with a braille-first interaction layer where every gesture, notification, and reply is driven entirely by touch, with no vision or hearing required. Built for deafblind users. Designed for everyone.

Key Features:
  • 🤲 Tactile Braille Reading: Drag across braille dots to read — each character fires a unique haptic burst based on its 6-dot bitmask pattern
  • ⌨️ 6-Dot Braille Keyboard: Compose and send replies entirely through touch with haptic success/failure feedback
  • 🧠 AI Semantic Compression: On-device Apple Intelligence distills messages into tactile summaries preserving intent, urgency, and emotional tone
  • 🚨 Fraud Alert Room: Dedicated response flow — Freeze Card / Call Bank / Ignore — operable without sight or sound
  • 📵 Offline-First & Private: All processing on-device; no raw message content leaves the phone
  • Apple Watch Layer: Wrist-level urgency alerts with stress-aware escalation (roadmap)
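
The per-character burst behind tactile reading can be sketched as a pure mapping from a 6-dot bitmask to a pulse schedule. This is a hypothetical reduction, not the app's actual mapper: the dot numbering, timing gap, and intensity ramp below are assumptions, and in the real app each pulse would be rendered as a Core Haptics transient event.

```swift
// Hypothetical sketch: map a 6-dot braille bitmask to a haptic pulse schedule.
// Dot i (0-based) corresponds to bit i; timing and intensity values are assumed.
struct HapticPulse: Equatable {
    let time: Double       // seconds from burst start
    let intensity: Double  // 0.0 ... 1.0
}

func burstSchedule(for bitmask: UInt8,
                   gap: Double = 0.08,
                   baseIntensity: Double = 0.5,
                   intensityStep: Double = 0.08) -> [HapticPulse] {
    var pulses: [HapticPulse] = []
    var pulseIndex = 0
    for dot in 0..<6 where bitmask & (1 << dot) != 0 {
        // Each raised dot contributes one transient pulse; higher dot
        // positions fire slightly stronger so every letter feels distinct.
        pulses.append(HapticPulse(time: Double(pulseIndex) * gap,
                                  intensity: baseIntensity + Double(dot) * intensityStep))
        pulseIndex += 1
    }
    return pulses
}
```

For example, braille "b" (dots 1 and 2, bitmask 0b000011) yields two pulses 80 ms apart with rising intensity; "a" (dot 1 only) yields a single pulse, so the two letters are distinguishable by feel alone.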

Technical Highlights:
  • 310+ Deterministic Unit Tests: Full state machine coverage across 26 test files using SpyHapticService to assert exact haptic event sequences
  • Generic State Machine: SenseLayerState<HapticService> enables test injection without any UI dependency
  • Per-Character Tactile Fingerprints: Each braille cell drives a unique burst — pulse count, timing gaps, and per-dot-position intensity vary so every letter feels distinct
  • Core Haptics Integration: CHHapticEngine for precise patterned haptic rendering; fallback to UIImpactFeedbackGenerator on older devices
  • Dependency Injection Throughout: SendService, Scheduler, HapticService, and MessageRepository are all protocols with mock implementations
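
The spy-based testing approach above can be illustrated with a reduced sketch. The `SenseLayerState` here is a hypothetical simplification (two states, one gesture, string event names), but it shows the pattern the write-up describes: the state machine is generic over a `HapticService` protocol, so a test can inject a spy and assert the exact haptic sequence without touching any UI.

```swift
// Reduced sketch of a state machine generic over its haptic service.
// States, gestures, and event names are illustrative assumptions.
protocol HapticService {
    func play(_ event: String)
}

final class SenseLayerState<H: HapticService> {
    enum State { case home, reading }
    private(set) var state: State = .home
    private let haptics: H

    init(haptics: H) { self.haptics = haptics }

    func handleDoubleTap() {
        switch state {
        case .home:
            state = .reading
            haptics.play("enterReading")
        case .reading:
            state = .home
            haptics.play("returnHome")
        }
    }
}

// Test double: records every haptic event so tests can assert exact sequences.
final class SpyHapticService: HapticService {
    private(set) var events: [String] = []
    func play(_ event: String) { events.append(event) }
}
```

An XCTest against this shape simply drives gestures and asserts that `spy.events` matches the expected sequence, which is what makes the haptic behavior deterministic and fully verifiable off-device.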

My Role

Founder • Architect • iOS Engineer

Vision & Product:
  • Defined the post-visual interaction paradigm and core UX principles (gesture-only, zero audio reliance, on-device privacy)
  • Designed the complete deafblind user journey from fraud alert detection → action in under 30 seconds
System Architecture:
  • Designed a deterministic state machine generic over HapticService for full testability
  • Built the haptic language specification: per-character tactile fingerprints, urgency-weighted alert signatures, and navigation boundary bumps
  • Architected the AI compression pipeline: semantic summarization, urgency scoring, and tone detection — all on-device
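
The shape of that compression pipeline can be sketched as follows. The real pipeline uses on-device Apple Intelligence; this rule-based stand-in is purely illustrative of the output contract (summary + urgency score + tone) that the downstream haptic layer would consume. All names and thresholds are assumptions.

```swift
// Hypothetical stand-in for the on-device compression pipeline. The real app
// uses Apple Intelligence; this keyword-based version only illustrates the
// shape of the result (summary + urgency + tone) consumed by the haptic layer.
struct TactileSummary {
    let text: String
    let urgency: Double   // 0.0 routine ... 1.0 critical
    let tone: String
}

func compress(_ message: String) -> TactileSummary {
    let lower = message.lowercased()
    let urgentKeywords = ["fraud", "unauthorized", "immediately", "suspended"]
    let hits = urgentKeywords.filter { lower.contains($0) }.count
    let urgency = min(1.0, Double(hits) * 0.4)
    // Truncate to a short tactile-friendly summary (the real pipeline is semantic).
    let summary = String(message.prefix(40))
    return TactileSummary(text: summary,
                          urgency: urgency,
                          tone: urgency > 0.5 ? "alarmed" : "neutral")
}
```

The point of the structure is that urgency is computed before anything reaches the user, so a fraud alert can select a sharper haptic signature than a routine message.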
Engineering:
  • Implemented BrailleCellMapper, TactileEngine, CompressionService, DraftStore, and SendService
  • Built 310+ unit and integration tests using SpyHapticService and TestScheduler
  • Developed the 6-dot braille keyboard with commit/space/delete/send gesture flows
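
The keyboard's commit/space/delete flow can be reduced to a small composer: dot taps toggle bits in a pending bitmask, and commit resolves the bitmask to a character. This sketch is hypothetical; the dot-to-letter table covers only a few letters, and the real mapper covers the full braille alphabet.

```swift
// Hypothetical reduction of the 6-dot keyboard flow: toggle dots into a
// bitmask, commit the pending cell as a letter, and edit the draft.
struct BrailleComposer {
    private(set) var pendingDots: UInt8 = 0
    private(set) var draft: String = ""

    // Partial braille table (bitmask -> letter); illustrative only.
    private static let table: [UInt8: Character] = [
        0b000001: "a", 0b000011: "b", 0b001001: "c",
    ]

    mutating func toggle(dot: Int) {        // dot is 0-based (dot 1 == index 0)
        pendingDots ^= 1 << dot
    }
    mutating func commit() {                // resolve pending cell to a letter
        if let ch = Self.table[pendingDots] { draft.append(ch) }
        pendingDots = 0
    }
    mutating func space() { draft.append(" ") }
    mutating func delete() { if !draft.isEmpty { draft.removeLast() } }
}
```

In the app, each of these mutations would also fire success/failure haptic feedback so the user knows the composer's state without sight or sound.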
Testing & Quality:
  • Verified all state transitions, haptic sequences, draft persistence, and urgent message queuing
  • Established contribution standards: spy assertions required for all state machine changes

Case Study Impact

Problem: Deafblind users have no independent path to time-critical information. When a fraud alert arrives, they cannot see a banner, hear a ringtone, or scan a screen. They must depend on another person, surrendering autonomy at the exact moment it matters most. Today's accessibility tools (VoiceOver, external braille displays) translate content into another channel; they do not reduce the overload.

Solution: A software-only tactile operating layer where every notification, message, and reply flows through a structured haptic language. AI compresses meaning before it reaches the user. The state machine ensures every interaction is gesture-driven and predictable. No vision required. No sound required. No hardware attachment required.

Key Implementation Details:
  • MultipeerConnectivity-style session model replaced by CHHapticEngine + gesture recognizer stack
  • Urgency scoring surfaces fraud alerts with a sharper, faster haptic signature than routine messages
  • 60-second inactivity timer auto-saves drafts and returns to home — preventing lost work
  • Fraud response room hardcoded as a first-class state to minimize time-to-action under stress
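
The inactivity timer can be sketched with the injected-scheduler pattern the write-up mentions (`Scheduler` and `TestScheduler` are named above, but their interfaces are not shown, so the shapes below are assumptions). Injecting the clock is what makes a 60-second behavior testable in milliseconds.

```swift
// Sketch of the 60-second inactivity auto-save using an injected scheduler.
// The Scheduler/TestScheduler shapes are assumptions based on the write-up.
protocol Scheduler {
    var now: Double { get }   // seconds
}

final class TestScheduler: Scheduler {
    var now: Double = 0
    func advance(by seconds: Double) { now += seconds }
}

final class DraftAutoSaver {
    private let scheduler: Scheduler
    private let timeout: Double
    private var lastActivity: Double
    private(set) var savedDrafts: [String] = []

    init(scheduler: Scheduler, timeout: Double = 60) {
        self.scheduler = scheduler
        self.timeout = timeout
        self.lastActivity = scheduler.now
    }

    func recordActivity() { lastActivity = scheduler.now }

    // Called on a tick; saves the draft and reports "return home" when idle.
    func checkIdle(draft: String) -> Bool {
        guard scheduler.now - lastActivity >= timeout else { return false }
        if !draft.isEmpty { savedDrafts.append(draft) }
        lastActivity = scheduler.now
        return true
    }
}
```

A test advances the fake clock past the timeout and asserts that the draft was saved and the "return home" signal fired, with no real waiting involved.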

Outcome:
  • ✅ Detection → comprehension → decision → action → confirmation in under 30 seconds
  • ✅ 5 gestures to read and reply (vs. 15+ with VoiceOver + keyboard)
  • ✅ 310+ passing tests proving correctness of every state transition and haptic sequence
  • ✅ Full two-way tactile communication: read, compose, and send without any sighted assistance
  • ✅ On-device privacy architecture — no raw message content transmitted externally