Lanchester R&D // Tactical Exploration Lab
Operational Intelligence
Hospitality // AI // Real-Time UX

IRU-Assistant

AI companion surfacing guest context for hospitality staff.

IRU-Assistant Diagnostic
IMG_REF // IRU-ASSISTANT

Problem Defined

"Guest preferences are buried in silos, causing reactive service."

01

Strategic Context

Staff lack access to immediate guest intelligence at the point of service.

02

Competitive Imbalance

The gap between guest expectations and the service staff can actually deliver degrades loyalty.

03

System Hypothesis

Real-time intelligence at the interaction point enables proactive service.

04

Process Architecture

How the system was designed, tested, and refined.

01

DEFINE

Objective

Surface guest context for hospitality staff without manual search.

What We Did
  • Shadowed hotel staff
  • Audited PMS data silos
  • Identified preference gaps
What Failed
  • Initial focus was on data collection rather than staff utility
What We Learned
  • Intelligence is useless if not delivered at the point of interaction
What We Adjusted
  • Shifted focus to real-time service nudges and context delivery
02

MAP

Objective

Map guest preference data to interaction touchpoints.

What We Did
  • Mapped PMS data flow to staff mobile devices
  • Identified decision points during guest arrival
What Failed
  • Maps didn't account for high-tempo breakfast/check-out peaks
What We Learned
  • Context delivery must be tiered by urgency
What We Adjusted
  • Created priority-based intelligence delivery logic
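The priority-based delivery logic above can be sketched as follows. This is a minimal illustration, not the production rules: the tier names, the peak-load suppression behavior, and the data shapes are all assumptions.

```python
from dataclasses import dataclass

# Assumed urgency tiers; the real system's tiers are not documented here.
TIER_IMMEDIATE = 0   # deliver now (e.g. guest at the front desk)
TIER_SOON = 1        # deliver within the current service window
TIER_BACKGROUND = 2  # deliver when staff load is low

@dataclass
class ContextItem:
    guest: str
    note: str
    tier: int

def deliver(queue: list[ContextItem], peak_load: bool) -> list[ContextItem]:
    """During high-tempo peaks (breakfast/check-out), suppress
    background-tier context so staff only see urgent items."""
    cutoff = TIER_SOON if peak_load else TIER_BACKGROUND
    return sorted(
        (item for item in queue if item.tier <= cutoff),
        key=lambda item: item.tier,
    )

queue = [
    ContextItem("Guest A", "prefers oat milk", TIER_BACKGROUND),
    ContextItem("Guest B", "arriving now, late checkout agreed", TIER_IMMEDIATE),
]
print([i.guest for i in deliver(queue, peak_load=True)])   # only Guest B
print([i.guest for i in deliver(queue, peak_load=False)])  # both, urgent first
```

The design point is that urgency filtering happens at delivery time, so the same queue serves both calm and peak periods.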
03

VALIDATE

Objective

Test preference-based service interventions.

What We Did
  • Tested prototype with service teams
  • Measured staff confidence after briefings
What Failed
  • Staff found long-form bios distracting during service
What We Learned
  • Information must be converted into actions, not just data
What We Adjusted
  • Switched from "Guest Bios" to "Suggested Nudges"
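The "Guest Bios" to "Suggested Nudges" shift can be illustrated with a small sketch: instead of a long-form profile, each preference maps to one actionable line. The category names and templates below are assumptions for illustration.

```python
# Hypothetical nudge templates; the real system's categories and
# wording are not documented here.
NUDGE_TEMPLATES = {
    "beverage": "Offer {value} on arrival",
    "room": "Pre-set room: {value}",
    "dietary": "Flag to kitchen: {value}",
}

def to_nudges(preferences: dict[str, str]) -> list[str]:
    """Convert raw preference data into short, glanceable actions."""
    return [
        NUDGE_TEMPLATES[kind].format(value=value)
        for kind, value in preferences.items()
        if kind in NUDGE_TEMPLATES  # unmapped data is dropped, not shown
    ]

print(to_nudges({"beverage": "espresso", "dietary": "no shellfish"}))
```

Anything without an action template is dropped rather than shown, which is the core of the learning above: data that cannot become an action is distraction.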
04

EXECUTE

Objective

Build the contextual intelligence interface.

What We Built
  • Preference modeling engine
  • PMS integration layer
  • Nudge UI
What Failed
  • Over-engineered the historical data parsing early on
What We Learned
  • Current context > Deep history for immediate service quality
What We Adjusted
  • Prioritized immediate visit data and active preferences
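The "current context > deep history" adjustment can be sketched as a recency weighting: signals from the active visit get full weight, while historical signals decay. The decay curve and the 90-day half-life are illustrative assumptions, not measured values.

```python
def signal_weight(days_since_observed: float, active_visit: bool) -> float:
    """Weight a preference signal for the briefing.

    Active-visit signals dominate; historical signals decay
    exponentially (assumed half-life: 90 days).
    """
    if active_visit:
        return 1.0
    half_life = 90.0
    return 0.5 ** (days_since_observed / half_life)

print(signal_weight(0, active_visit=True))     # 1.0
print(signal_weight(90, active_visit=False))   # 0.5
print(signal_weight(360, active_visit=False))  # 0.0625
```

Ranking signals by this weight keeps the briefing anchored to the stay in progress without discarding history entirely.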
05

MEASURE

Objective

Measure guest NPS and staff operational confidence.

Metrics Tracked
  • Guest NPS increase
  • Staff response time
  • Preference fulfillment rate
What Failed
  • Early data was too anecdotal to confirm a system-level shift
What We Learned
  • Briefing adoption is the best proxy for system trust
What We Adjusted
  • Introduced automated adoption tracking for briefings
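Automated adoption tracking can be reduced to one ratio: briefings opened over briefings delivered. The event shape and the rate definition below are assumptions, sketched to show how adoption becomes a measurable proxy for trust.

```python
from collections import Counter

def adoption_rate(events: list[tuple[str, str]]) -> float:
    """events: (briefing_id, action) pairs, where action is
    'delivered' or 'opened'. Returns opened / delivered."""
    counts = Counter(action for _, action in events)
    delivered = counts["delivered"]
    return counts["opened"] / delivered if delivered else 0.0

events = [
    ("b1", "delivered"), ("b1", "opened"),
    ("b2", "delivered"),
    ("b3", "delivered"), ("b3", "opened"),
]
print(adoption_rate(events))  # 2 opened of 3 delivered
```

Tracking this per shift, rather than per guest, keeps the metric operational and avoids the anecdotal sampling problem noted above.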

Rule Application

How doctrine was operationalized.

Intellectual Rigor
01_INT
Applied By
  • Mapping PMS hierarchy before integration
  • Defining clear success metrics
Evidence

18% increase in NPS achieved through structured briefings

Tactical Execution
02_TAC
Applied By
  • Shipping lean briefing interface first
  • Integrating with existing hardware
Evidence

System operational on existing staff tablets in 3 weeks

Human Calibration
03_HUM
Applied By
  • Reducing cognitive load for front-line staff
  • Ensuring glanceable data delivery
Evidence

Briefings reduced to <5 seconds of staff attention

Machine Leverage
04_AI
Applied By
  • AI synthesis of disparate guest data
  • Automated nudge generation
Evidence

AI identifies high-value preference patterns without manual filtering

05

Product Architecture

Preference modeling, PMS integration, contextual UI.

IRU-Assistant Architecture
System Schematic // V-01
06

AI Leverage

Real-time synthesis for service nudges.

07

Outcomes & Learnings

Delivered personalized service without manual briefing.