Te Wā
The Space Between Worlds
Concept Overview
Te Wā is an immersive, multi-sensory installation that explores the relationship between time (wā) and space (wā), sound and silence, presence and memory.
It is both an interval and an environment — a living, responsive system where light, sound, and digital form converge through AR+IQ.
Each participant steps into a shared time-space, co-creating a moment that only exists when they are together.
Philosophical Foundation
In te ao Māori, wā is not fixed — it flows, expands, and folds.
It is the pulse between past (mua) and future (muri), embodied in the present moment.
In this sense, Te Wā is an invitation to experience te ao hurihuri — the ever-turning world — through an interactive field of light and sound.
Ko te wā te manawa o te ao — time is the breath of the world.
This concept aligns with iSPARX™’s kaupapa:
Tino Rangatiratanga: Creative sovereignty in defining our digital space.
Whai Oranga o te Taiao: Designing technology that sustains ecological and emotional balance.
Mana: Ensuring integrity in experience and representation.
Ngākau Pono: Grounding innovation in authenticity.
Spatial & Experiential Design
The Installation:
A minimal physical set — LED curtains, reflective surfaces, and dichroic filters in deep reds and blues (181).
The environment is choreographed like a breathing organism: light waves, spatial sound, and AR objects move in relation to people’s presence.
Audio System:
Binaural and Atmos layers delivered via Apple AirPods, mapped to each participant’s position.
A sub-heavy house system anchors the group in a collective pulse.
The AI engine modulates intensity, creating waves of tension and stillness.
Lighting & Projection:
Intelligent fixtures with dichroic wheels create transitions of te ao and te pō — light and darkness.
LED curtains act as portals; projection surfaces become membranes between digital and physical layers.
Light behaves like a whakaaro (thought) — responsive, emotive, and evolving.
AR Layer:
Participants see animated forms emerging from reflective planes — perhaps ancestral patterns or digital whakapapa nodes.
Using the AR+IQ multi-user synchronisation system, these forms react to collective rhythm and proximity.
The shared AR field becomes a living metaphor for whanaungatanga — connection through time and space.
Narrative Arc (Optional)
1. Te Ao Mārama — Emergence
The space begins in darkness. Subsonic vibrations rise as subtle light filters through — users’ AR views reveal drifting forms.
2. Te Ao Hurihuri — Flow
As people move, the soundscape and light synchronise. The AI reads the group rhythm — intensity, movement, energy.
Reflections shimmer. The wā begins to breathe.
3. Te Kore — Silence / Stillness
The system draws the energy inward — all audio drops to a low pulse, lighting shifts to soft red.
Each participant’s AR view glows faintly, reflecting their collective imprint in space.
4. Te Wā — Union
Everything fuses — sound, light, projection, and AR converge into one continuous resonant field.
A brief stillness, then fade.
Te Wā
Concept: “181 — Immersive Spatial Installation”
An Interactive, Multi-User Audio-Visual Experience Powered by AR+IQ
A multi-user interactive installation combining:
Binaural/Atmos headphone mix via Apple AirPods (personal immersion)
Spatial house mix with sub-heavy low frequencies (shared physical experience)
Choreographed lighting (dichroic colour wheels, deep blues and reds)
AR+IQ synchronised animation — responsive to user position, movement, or interaction
This creates a layered sensory environment: private (in-ear) + collective (onsite) + visual (AR and light).
1. Core Experience
Objective:
Create a shared environment where spatial audio, light, and augmented reality converge — an intelligent space that responds to audience presence, sound, and motion.
Environment:
A contained room or stage with reflective and translucent set pieces, LED curtains, and minimal physical props. The space operates as a responsive system: sound, light, and visuals are synchronised across users via AR+IQ's Unity SDK and cloud infrastructure.
2. Visual and Lighting System
Lighting Design:
Dichroic colour wheels in deep reds and blues (181) — referencing optical interference patterns.
Intelligent lighting scenes choreographed using DMX/ArtNet integrated with AR+IQ’s real-time event system.
Lights respond to AI cues (user density, gestures, and AR triggers); a minimal cue-to-DMX sketch follows below.
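As an illustration of that cue path, the Python sketch below maps crowd density to a dimmer level and emits it as a raw Art-Net ArtDmx frame. The room capacity, universe/channel patch, and broadcast address are assumptions for illustration; AR+IQ's actual event system and fixture patch are not shown.

import socket
import struct

ARTNET_PORT = 6454

def build_artdmx(universe: int, dmx: bytes, sequence: int = 0) -> bytes:
    """Pack a single Art-Net ArtDmx frame (protocol version 14)."""
    return (b"Art-Net\x00"
            + struct.pack("<H", 0x5000)          # OpOutput / ArtDmx
            + struct.pack(">H", 14)              # protocol version
            + struct.pack("BB", sequence, 0)     # sequence, physical port
            + struct.pack("<H", universe)        # 15-bit port address
            + struct.pack(">H", len(dmx))        # data length
            + dmx)

def density_to_dimmer(user_count: int, capacity: int = 40) -> int:
    """Map user density (0..capacity) onto a DMX dimmer value (0..255)."""
    return min(255, max(0, int(255 * user_count / capacity)))

# Hypothetical cue: 18 people in the space drive channel 1 of universe 0.
dmx = bytearray(512)
dmx[0] = density_to_dimmer(18)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(build_artdmx(0, bytes(dmx)), ("255.255.255.255", ARTNET_PORT))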
Projection and LED:
LED curtains used as translucent portals — projection mapped from Unity scene data.
Reflective set pieces amplify depth and play with perception.
Visuals driven by Unity’s Shader Graph and synced via AR+IQ’s “Event Module”.
3. Multi-Layered Audio Design
Headphones:
Apple AirPods (Spatial Audio) deliver binaural and Atmos mixes.
Each user hears a personalised adaptive layer based on position and interaction (AI-assisted modulation).
House System:
A sub-heavy environmental layer (LFE-driven) immerses the collective audience.
Controlled through AR+IQ’s AI interaction engine, it syncs with user telemetry and environmental triggers.
Audio Architecture:
Core Mix: Atmos environment (7.1.4) mixed down to binaural for AirPods.
Reactive Layer: Sound objects mapped to AR anchors (Unity spatial audio API).
Dynamic Layer: AI learns user movement and proximity, altering the sonic space in real time (a distance-based sketch follows).
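A minimal Python sketch of that dynamic layer: per-user gain and a low-pass cutoff derived from proximity to a sound anchor. The anchor radius, rolloff curve, and cutoff range are illustrative assumptions, not AR+IQ internals.

import math
from dataclasses import dataclass

@dataclass
class SoundAnchor:
    x: float
    y: float
    radius: float = 6.0  # metres at which the object fades to silence

def proximity_mix(user_pos: tuple, anchor: SoundAnchor) -> dict:
    """Suggest gain (0..1) and low-pass cutoff for one user/anchor pair."""
    d = math.dist(user_pos, (anchor.x, anchor.y))
    gain = max(0.0, 1.0 - d / anchor.radius)   # linear rolloff toward the edge
    cutoff_hz = 800 + 19_200 * gain            # duller timbre when far away
    return {"gain": round(gain, 3), "cutoff_hz": int(cutoff_hz)}

# Example: a user 2 m from an anchored sound object.
print(proximity_mix((2.0, 0.0), SoundAnchor(0.0, 0.0)))
# -> {'gain': 0.667, 'cutoff_hz': 13600}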
4. AR Layer (Unity SDK + AI Integration)
AR Interaction:
Multi-user synchronised AR elements (Unity Multipeer Connectivity + Photon Realtime).
Objects (particles, avatars, or story elements) respond to sound frequencies, user gaze, or gestures.
Shared experience supported by AI state management to maintain narrative consistency (see the state sketch below).
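A minimal sketch of that state management, assuming a host-authoritative model with a monotonic version counter; the transport (Photon Realtime or Multipeer Connectivity) is abstracted away and all names here are illustrative.

from enum import Enum

class Phase(Enum):
    TE_AO_MARAMA = 1    # emergence
    TE_AO_HURIHURI = 2  # flow
    TE_KORE = 3         # stillness
    TE_WA = 4           # union

class NarrativeState:
    def __init__(self) -> None:
        self.phase = Phase.TE_AO_MARAMA
        self.version = 0  # monotonic counter: highest version wins

    def advance(self):
        """Host-side transition; returns (version, phase) to broadcast."""
        if self.phase is not Phase.TE_WA:
            self.phase = Phase(self.phase.value + 1)
            self.version += 1
        return self.version, self.phase

    def apply_remote(self, version: int, phase: Phase) -> None:
        """Peer-side merge: accept only strictly newer state, so late
        joiners and dropped packets never fork the narrative."""
        if version > self.version:
            self.version, self.phase = version, phase

host, peer = NarrativeState(), NarrativeState()
peer.apply_remote(*host.advance())  # the peer follows the host into "flow"
assert peer.phase is Phase.TE_AO_HURIHURI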
Example Sequence:
Users enter → calibration via AirPods.
Lighting fades → low-frequency pulse introduces AR objects emerging through fog.
Motion sensors trigger AI-guided choreography across light and sound layers.
Co-creation moment: users collectively influence the visuals or soundscape via interaction (a cue-list sketch of this sequence follows).
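One way to express that sequence is as a cue list: time-ordered events dispatched to the light, sound, and AR layers. The timings and action names below are illustrative placeholders.

import heapq

cues = [
    (0.0,  "audio", "low_frequency_pulse"),
    (5.0,  "ar",    "spawn_objects_through_fog"),
    (12.0, "light", "fade_to_te_po"),
    (20.0, "ai",    "enable_cocreation"),
]

def run(cue_list) -> None:
    """Pop cues in time order; a real build would schedule against a clock
    and hand each action to its layer's controller."""
    heap = list(cue_list)
    heapq.heapify(heap)
    while heap:
        t, layer, action = heapq.heappop(heap)
        print(f"t={t:>5.1f}s  {layer:<5} -> {action}")

run(cues)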
5. Technical Framework
Core components referenced throughout this concept:
Unity SDK (AR+IQ) with Shader Graph visuals and the AR+IQ Event Module
Multi-user networking via Unity Multipeer Connectivity and Photon Realtime
Apple AirPods Spatial Audio for binaural/Atmos delivery
DMX/ArtNet intelligent lighting driven by AR+IQ's real-time event system
AR+IQ AI interaction engine, cloud infrastructure, and CMS for telemetry and scene memory
6. Interaction Flow (Based on AR+IQ Userflows)
Phases:
Arrival: Users connect via AirPods; calibration in AR.
Activation: AR+IQ initiates adaptive audio sync; lighting transitions.
Immersion: Real-time co-creation via gestures and spatial movement.
Reflection: AI agent captures the group pattern and renders a final "scene memory" (see the sketch after this list).
Exit: Ambient fade to silence — data uploaded to CMS for analysis.
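A minimal sketch of the "scene memory" step: per-user telemetry folded into a coarse occupancy grid that can be rendered or uploaded to the CMS. The grid size and telemetry shape are assumptions for illustration.

from collections import Counter

def scene_memory(positions, cell: float = 0.5) -> Counter:
    """Quantise (x, y) positions in metres into cell-sized bins."""
    grid = Counter()
    for x, y in positions:
        grid[(int(x // cell), int(y // cell))] += 1
    return grid

# One frame of hypothetical telemetry from four participants.
memory = scene_memory([(0.2, 0.1), (0.3, 0.4), (2.1, 1.9), (2.2, 2.0)])
print(memory.most_common(2))  # the densest cells become the group's imprint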
7. Potential Deployment Contexts
Cultural installations (e.g., Pātaka, Te Papa, or Sydney Contemporary)
Retail brand experience rooms
Educational XR learning environments
Event or festival activation spaces
8. Ethical and Cultural Foundation
Aligned with kaupapa Māori principles from the AR+concierge™ research:
Tino Rangatiratanga: Data sovereignty in user telemetry
Whai Oranga o te Taiao: Sustainable lighting and low-power compute cycles
Mana: Audience integrity and privacy
Ngākau Pono: Culturally respectful co-design of AR narratives
wā
(noun)
1. Time / Season / Duration — a measure or passage of time; a particular phase or moment.
He wā pai tēnei mō te whakatā.
→ This is a good time to rest.
I te wā o Matariki, ka tīmata anō te tau hou Māori.
→ During the time of Matariki, the Māori new year begins.
Synonyms: tāima (loanword from “time”), tau (year, season), kaupeka (season, term), takiwā (period, era), houanga (season).
2. Space / Area / Place / Region — a defined portion of space; an area or extent.
Ko tēnei wā o te marae he wā tapu.
→ This area of the marae is sacred.
I roto i te wā o te ao mārama.
→ Within the space of the world of light.
Synonyms: pae (horizon, region), whaitua (district, zone), rohe (region, boundary), takiwā (district, area).
Conceptually
wā often transcends the linear separation between time and space — it refers to a moment and a place where something significant occurs. In many kaupapa Māori frameworks, wā can represent a temporal-spatial event, an intersection where people, environment, and purpose align.
This is especially fitting for AR+IQ and our immersive installation ideas — we’re effectively creating a “wā”:
a living interval of light, sound, and presence — a shared time-space where technology meets tikanga, and where people can feel, hear, and move together.
AR+IQ by iSPARX™
AR+IQ is a modular augmented reality platform that combines spatial computing and artificial intelligence to create immersive, intelligent, and location-aware experiences. It adapts to different sectors through custom verticals designed for Retail, Art & Culture, and Entertainment.
Entertainment
From live music to public installations, AR+IQ brings digital layers to events. It connects audiences and performers through interactive effects, spatial storytelling, and shared AR experiences that extend beyond the stage or screen.
Art & Culture
AR+IQ supports artists, museums, and cultural institutions with tools for immersive storytelling. It anchors 3D content, sound, and narrative in real spaces, allowing audiences to explore culture through location-aware experiences.
Retail
AR+IQ transforms shopping environments into interactive spaces. Digital concierges, product visualisation, and gamified loyalty tools enhance customer engagement and provide real-time insights into behaviour and sales.
www.sweetorange.app
SweetOrange - the fan-facing mobile application that transforms how music lovers discover, experience, and engage with live events.
Personalised Event Discovery
AI-powered recommendations based on listening history, location, and preferences to help fans find their perfect gig.
Seamless Ticketing (iNVITE™)
Direct, secure ticket purchases with mobile QR codes and exclusive pre-sales and offers.
AR+IQ Enhanced Experiences
Augmented reality overlays at venues providing interactive content, artist information, and immersive experiences.
Fan-Artist Engagement
Direct messaging with artists, tipping, and social sharing to foster deeper connections.
Matariki.App™
XR/AR/AI/spatial | LOCATION | STREAMING | 360° | 3D
The Matariki.App™ is an event index for discovering and retrieving the details of events in the regional schedule for Matariki & Puanga, Waitangi Day, and throughout the year.
The Matariki.App™ is designed to enhance the event experience for users and provide an effective platform for delivering immersive media in a real-time 3D Matariki environment.
*powered by iSPARX™
resonate.land – Geolocated Binaural Audio Experiences
resonate.land is an immersive Augmented Reality (AR+) experience that layers persistent, geolocated binaural audio onto real-world locations. Designed for places of cultural, historical, and artistic significance, this experience allows users to discover hidden audio artefacts through an interactive mApp, seamlessly blending sound, space, and storytelling.
Using the AR+IQ platform *powered by iSPARX™, resonate.land delivers an intelligent and interactive multi-user audio experience:
✔ Geolocated Discovery – Receive a notification when within 100 m of an artefact and navigate via the mApp (a distance-check sketch follows this list).
✔ Augmented Reality Activation – Locate and interact with digital markers using your device’s camera.
✔ Binaural Audio Immersion – Tap the AR+ marker to unlock an immersive, spatialised sound experience.
✔ Persistent Multi-User Engagement – Designed for shared interactions in real-world environments.
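The 100 m trigger reduces to a great-circle distance check; here is a minimal Python sketch using the haversine formula. The artefact coordinates and the notification hook are placeholders, not resonate.land data.

import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Distance in metres between two WGS84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def near_artefact(user, artefact, radius_m: float = 100.0) -> bool:
    """True when the user is inside the artefact's notification radius."""
    return haversine_m(*user, *artefact) <= radius_m

# Hypothetical artefact on Wellington's waterfront.
print(near_artefact((-41.2906, 174.7798), (-41.2901, 174.7790)))  # True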
Why AR+IQ?
Built on AR+IQ, resonate.land benefits from:
Real-time Spatial Computing – Delivers location-based experiences with precision.
AI-Powered Content Delivery – Provides dynamic, personalised interactions.
Multi-User Synchronisation – Supports collective engagement with spatial audio.
Cross-Platform Compatibility – Works across mobile devices with seamless AR integration.
Ethical & Inclusive Design – Ensures accessibility and cultural sensitivity.
Step into a world where sound meets space - resonate.land brings places to life through the power of AR+ binaural audio.
NVLP ATMOS Studio & HA-HA (GMA) Studio
– Pioneering Immersive Audio & Digital Arts
NVLP ATMOS Studio | Sydney
Led by Ant Smif and Josh Wermit (notable for recent work with Thom Yorke), NVLP ATMOS Studio in Sydney is a cutting-edge spatial audio lab specialising in Dolby ATMOS production and immersive sound design. Known for work in music, film, and interactive media, NVLP pushes the boundaries of 3D soundscapes, crafting deeply immersive sonic experiences for contemporary and experimental artists.
HA-HA Artist PTY (GMA) Studio | Perth
Situated in Perth, Australia, HA-HA Studio is a leading-edge remote facility dedicated to animation, AR (web & app), Dolby ATMOS spatial audio applications, digital arts commissioning, and immersive storytelling. Under the stewardship of Ethan Wimsett (The Guerrilla Media Agency), HA-HA Studio facilitates digital commissions for the Arts, supporting emerging and established artists in crafting groundbreaking interactive and multi-sensory works.
Design & Product / Creative Direction & Concept Development
Design and product execution are driven by:
Lead Developer James Norling has led code for all iSPARX™ innovation, development & product. Obsessive work ethic: "it's got to be fun & interesting!"
Executive Producer Joff Rae (ARTIVIST/iSPARX™) integrates immersive media, digital storytelling, and kaupapa Māori principles to create culturally resonant experiences.