Varun Innovates
Public XR + AI lab
01 Harmony framework

Adaptive XR + AI for human-centered 3D interaction.

Harmony is a research framework for interfaces that perceive context, preserve spatial continuity, and adapt with transparency so technology can support people without taking control away from them.

Why now

Spatial interfaces need continuity, not just more features.

Most digital systems still assume static screens and fixed layouts, even when the work is spatial, collaborative, and changing in real time. Harmony starts from that gap.

Context is lost

XR experiences often reset between devices, users, sessions, and task phases. That makes the interface feel unaware of what the person is trying to do.

Layouts stay fixed

Interface complexity rarely changes with cognitive load, attention, or task state. The user carries the burden of adaptation.

Trust needs explanation

AI-mediated interfaces need readable reasons for what changed, why it changed, and how the user can override it.

Research vision

Human judgment stays central.

Harmony investigates how AI and XR can co-adapt with people by interpreting spatial, behavioral, and collaborative cues without replacing human agency.

Human-centered first

Design for well-being, accessibility, meaningful control, and calm interaction.

Context as signal

Use environmental and behavioral cues to inform adaptation without turning every signal into surveillance.

Collaboration by design

Support multi-user sensemaking, shared state, and human-AI co-creation.

Edge-aware responsiveness

Keep adaptation fast and privacy-conscious by making careful choices about local and cloud processing.
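One way to picture this local-versus-cloud choice is a small routing rule that keeps privacy-sensitive or latency-critical signals on-device. This is a minimal sketch under assumed criteria; the names, thresholds, and fields are illustrative, not part of Harmony.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    privacy_sensitive: bool  # e.g. gaze or body pose
    latency_budget_ms: int   # how quickly adaptation must react

def choose_processing_site(signal: Signal) -> str:
    """Route a signal to local or cloud processing.

    Privacy-sensitive or latency-critical signals stay on-device;
    everything else may use cloud resources.
    """
    if signal.privacy_sensitive or signal.latency_budget_ms < 50:
        return "local"
    return "cloud"

print(choose_processing_site(Signal("gaze", True, 16)))        # local
print(choose_processing_site(Signal("task_log", False, 500)))  # cloud
```

The assumed 50 ms cutoff stands in for a per-frame responsiveness budget; a real system would tune this per device and task.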

Responsible AI

Make adaptation explainable, opt-in, auditable, and reversible.

Research rigor

Evaluate usefulness, cognitive load, user trust, and long-term fit instead of only measuring novelty.

Framework

Three layers working together.

Harmony is a conceptual framework for adaptive XR experiences. It connects perception, mediation, and collaboration into one research direction.

Perception of context

Signals about space, task, and behavior are summarized into context descriptors: just enough awareness to support adaptation.

Adaptive mediation

Interface complexity, layout, and guidance tune to moment-to-moment needs so the experience can reduce friction.

Collaboration orchestration

Shared attention and state are aligned across people, devices, and AI assistance for smoother multi-user workflows.

01 Signals

Gaze, pose, hands, task phase, spatial state.

02 Context model

Compact descriptors for the current situation.

03 Spatial memory

Continuity across sessions and shared spaces.

04 Adaptive response

Layout, complexity, guidance, and feedback.

05 User control

Explanation, override, tuning, and auditability.
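The stages above can be sketched as a minimal data flow: raw signals are reduced to a compact context descriptor, which drives a reversible adaptation decision. All class, field, and function names here are illustrative assumptions, not Harmony's actual API.

```python
from dataclasses import dataclass

@dataclass
class ContextDescriptor:
    """Compact summary of the current situation (stage 02)."""
    task_phase: str
    focus_target: str
    load_estimate: float  # 0.0 (idle) .. 1.0 (overloaded)

@dataclass
class Adaptation:
    """A proposed interface change (stage 04), always reversible."""
    action: str
    reason: str
    user_overridden: bool = False  # stage 05: the user can veto

def summarize(signals: dict) -> ContextDescriptor:
    """Stage 01 -> 02: reduce raw signals to a descriptor."""
    return ContextDescriptor(
        task_phase=signals.get("task_phase", "unknown"),
        focus_target=signals.get("gaze_target", "none"),
        load_estimate=min(1.0, signals.get("interaction_rate", 0) / 10),
    )

def adapt(ctx: ContextDescriptor) -> Adaptation:
    """Stage 04: simplify the layout when estimated load is high."""
    if ctx.load_estimate > 0.7:
        return Adaptation("simplify_layout", "high estimated cognitive load")
    return Adaptation("keep_layout", "load within normal range")

ctx = summarize({"task_phase": "assembly", "gaze_target": "part_12",
                 "interaction_rate": 9})
print(adapt(ctx).action)  # simplify_layout
```

Spatial memory (stage 03) would persist descriptors like these across sessions; it is omitted here to keep the sketch small.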

Research axes

The questions guiding the work.

Each axis is intentionally practical: it connects a research question to an evaluation path and a system behavior.

Current explorations

Prototype questions, not product claims.

Current work scopes pilot studies and low-friction prototypes around adaptation mechanisms that can be evaluated in real scenarios.

Adaptive layout for spatial tasks

UI elements scale, relocate, or simplify based on task phase and user focus.

Attention-safe guidance

Subtle cues for wayfinding and feedback without unnecessary interruption.

Co-presence and shared focus

Lightweight mechanisms to align group attention in collaborative XR.

Explainable adaptation

Micro-disclosures that show what changed, why it changed, and how to adjust it.
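A micro-disclosure can be thought of as a small user-readable record with exactly those three parts: what changed, why, and how to adjust it. The structure and wording below are a hypothetical sketch, not Harmony's actual schema.

```python
from dataclasses import dataclass

@dataclass
class MicroDisclosure:
    """A short, user-readable record of one adaptation."""
    what_changed: str
    why: str
    how_to_adjust: str

def disclose(action: str, trigger: str) -> MicroDisclosure:
    # Assumed phrasing; a real system would localize and tune this copy.
    return MicroDisclosure(
        what_changed=action,
        why=f"triggered by {trigger}",
        how_to_adjust="open the adaptation panel to undo or tune this rule",
    )

d = disclose("hid secondary toolbar", "sustained focus on the main model")
print(f"{d.what_changed} ({d.why})")
```

Keeping disclosures structured rather than free-text is what makes adaptations auditable and reversible, not just explained.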

Where it helps

Domains where context matters.

Harmony is most relevant when the interface needs to understand task state, shared attention, user ability, or environmental context.

Immersive learning

Adaptive simulations and skill training that respond to learner progress.

Accessibility

Context-aware support that adapts to individual needs and preferences.

Assistive workflows

Guidance for field service, AEC, healthcare training, and operational procedures.

Creative collaboration

Multi-user spatial sketching, review, and synchronized attention.

Safety-critical training

Progressive disclosure to manage cognitive load in high-stakes scenarios.

Immersive analytics

Spatial sensemaking where data, environment, and human attention interact.

Roadmap

A phased research path.

The roadmap keeps the work grounded: synthesize, prototype, evaluate, and share only what is responsible to share.

Now - 6 months: Foundation

Literature synthesis, study design, low-fidelity prototypes, and evaluation planning.

6 - 12 months: Systemization

Iterative prototypes, evaluation methods, architecture notes, and early publications.

12 - 24 months: Ecosystem

Toolkit patterns, reference implementations, and carefully scoped community pilots.

24+ months: Scale

Open materials, partnerships, and longitudinal studies where appropriate.

Ethics first: adaptation must be opt-in, reversible, and user-tunable. Transparency and well-being metrics guide every phase.

Background

Grounded in XR research and delivery.

Harmony sits inside Varun Siddaraju's broader body of XR, spatial AI, and applied systems work across research, publications, prototypes, and enterprise delivery.

50+

XR projects and prototypes across research and applied work.

2

Public books spanning mixed reality engineering and systems thinking.

3

Peer-reviewed publications plus research receipts across XR and spatial systems.

Ecosystem

Part of Varun Innovates Lab.

This is the owned, indexable Harmony home inside the lab ecosystem.

Get involved

For research collaboration and serious applied work.

Harmony is a research-driven framework. Collaboration works best when it connects HCI, XR systems, responsible AI, spatial computing, or applied evaluation.