The Vendor is required to provide a content developer for a complete multisensory extended reality (XR) system.
- The content developer should provide a platform that supports immersive, interactive, and research-driven experiences integrating visual, auditory, olfactory, and other sensory modalities.
1. Interactions and user experiences:
• Multiple interaction pathways.
• The system must support hands-free and low-mobility use, including options such as eye tracking, voice input, switch access, proximity or environmental sensors, and other adaptive interfaces appropriate for diverse physical, sensory, and cognitive considerations.
• Interaction design that does not require fine motor control, sustained physical effort, or the use of handheld controllers as a primary mode of engagement.
• Multimodal engagement through integrated sound, visual, haptic, and olfactory outputs to support non-text-based experience and communication.
• Integration with augmentative and alternative communication (AAC) technologies to support non-verbal and multimodal interaction.
• Support for individual user sessions conducted alongside researchers, with interfaces that allow observation and interaction without interrupting or overriding user control.
• XR-compatible wearable systems designed for accessibility, including mobility considerations, comfort, adjustability, and compatibility with assistive devices (e.g., wheelchairs, head supports, mounts).
• AI-enabled wearable technologies, such as smart glasses, to support real-time contextual interaction and capture of first-person user perspective and environmental data, without requiring manual input.
• Systems that allow users to experience and modify simulated environments through dynamic adjustment of sensory and environmental variables, including sound, lighting, spatial configuration, haptic feedback, and scent.
2. Computing and control:
• Computer and control station (workstation or cluster) capable of running the XR applications, managing multisensory devices, and processing and storing collected data in real time.
• The system must include an accessible, centralized control interface.
• The interface must support real-time scenario management, including adjustment of sensory outputs and environmental conditions (e.g., sound, visuals, spatial configuration, haptics, and scent).
• The control system must allow researchers to monitor user interaction and system performance in real time, without disrupting or overriding the user experience.
• The system must support data capture across sensory conditions, with export in standard formats suitable for research and analysis.
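As an illustration of the kind of export the data-capture requirement envisions, the sketch below serializes per-condition event records to CSV, a standard format widely accepted by research analysis tools. The field names, condition labels, and values are hypothetical examples, not part of this specification.

```python
import csv
import io

# Illustrative event records captured across sensory conditions
# (all field names and values are hypothetical).
events = [
    {"timestamp_ms": 0,   "condition": "baseline",   "channel": "audio",     "intensity": 0.5},
    {"timestamp_ms": 250, "condition": "scent_on",   "channel": "olfactory", "intensity": 0.8},
    {"timestamp_ms": 500, "condition": "haptics_on", "channel": "haptic",    "intensity": 0.6},
]

def export_csv(records):
    """Serialize captured records to CSV for downstream analysis."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["timestamp_ms", "condition", "channel", "intensity"]
    )
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

A vendor's actual schema would be driven by the research team's analysis pipeline; the point is only that each record ties a sensory output to a condition and a timestamp.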
3. Multisensory components:
• The system must support integrated and synchronized multisensory outputs within XR environments, including audio, visual, haptic feedback, and olfactory delivery systems.
• Outputs must be individually controllable and adjustable in real time to support variation in the intensity, timing, and combination of sensory outputs.
• The system must enable synchronization of sensory outputs with simulated environmental conditions and user interaction.
• The sensory outputs must be designed with accessibility in mind, including options to reduce, substitute, or customize sensory channels based on user needs and preferences.
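The bullets above describe outputs that can be adjusted, reduced, or substituted per user. A minimal sketch of that control model is below, assuming hypothetical channel names and a simple 0.0-1.0 intensity scale; none of these identifiers come from the specification.

```python
from dataclasses import dataclass, field

@dataclass
class SensoryChannel:
    """One independently controllable output (names are illustrative)."""
    name: str
    intensity: float = 1.0  # 0.0 (off) to 1.0 (full)
    enabled: bool = True

    def adjust(self, intensity):
        # Clamp so real-time adjustments stay within the valid range.
        self.intensity = max(0.0, min(1.0, intensity))

@dataclass
class SensoryMixer:
    """Holds all channels; supports reducing or substituting a channel."""
    channels: dict = field(default_factory=dict)

    def add(self, channel):
        self.channels[channel.name] = channel

    def substitute(self, source, target):
        """Disable one channel and carry its intensity over to another."""
        src, dst = self.channels[source], self.channels[target]
        src.enabled = False
        dst.adjust(max(dst.intensity, src.intensity))

mixer = SensoryMixer()
mixer.add(SensoryChannel("audio", intensity=0.7))
mixer.add(SensoryChannel("haptic", intensity=0.4))
mixer.substitute("audio", "haptic")  # e.g., a user preference to avoid sound
```

The substitution method illustrates the "reduce, substitute, or customize sensory channels" requirement: muting one modality while strengthening another, under user or researcher control.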
4. Content and software:
• The proposed system must include software development and content-creation capabilities to support the creation, control, and modification of immersive multisensory XR environments.
• The system must include two interactive XR scenarios that recreate real-world environments relevant to public and service-based contexts, such as public transit and healthcare settings.
• The system must support real-time control and sensory manipulation within each scenario, including:
o Visual components (environment)
o Auditory components (including sound type, volume, and timing)
o Olfactory outputs (including scent type, intensity, and timing)
o Haptic feedback (including tactile cues)
• Scenarios must be interactive and responsive to user input, including movement and actions.
• Control interface: The system must include an accessible interface that allows researchers and staff to modify scenario parameters, adjust sensory outputs, and manage environmental conditions in real time.
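To make the control-interface requirement concrete, the sketch below models one scenario's real-time-adjustable parameters as a nested configuration, with a helper that applies a single parameter change. The scenario name, parameter keys, and values are illustrative assumptions only.

```python
import json

# Hypothetical parameter set for a simulated transit environment;
# keys and values are illustrative, not part of the specification.
scenario = {
    "name": "public_transit",
    "visual": {"lighting": "daylight"},
    "auditory": {"sound_type": "station_ambience", "volume": 0.6},
    "olfactory": {"scent_type": "neutral", "intensity": 0.2},
    "haptic": {"tactile_cues": ["seat_vibration"]},
}

def update_parameter(config, path, value):
    """Apply a real-time adjustment to one nested scenario parameter."""
    *parents, leaf = path
    node = config
    for key in parents:
        node = node[key]
    node[leaf] = value
    return config

# A researcher lowers the ambient volume mid-session:
update_parameter(scenario, ["auditory", "volume"], 0.3)
```

Serializing such a structure (e.g., with `json.dumps`) also supports the earlier requirement that scenario and capture data export in standard formats.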
- Budget: $125,000.00 to $190,000.00