SLAB is a software-based, real-time virtual acoustic environment rendering system being developed as a tool for the study of spatial hearing. SLAB is designed for the personal computer to take advantage of the low-cost PC platform while providing a flexible, maintainable, and extensible architecture that enables the rapid development of experiments. The software provides an API (Application Programming Interface) for specifying the acoustic scene as well as an extensible architecture for exploring multiple rendering strategies.
The SLAB Render API supports a number of parameters including sound source specification (device, file, and signal generator plug-ins), source gain, source location, source trajectory, listener position, listener HRTF (Head-Related Transfer Function) database, surface location, surface material type, render plug-in specification, scripting, and low-level signal processing parameters.
Potential applications include psychoacoustic research, spatial auditory display prototypes, virtual reality for simulation and training, augmented reality for improved situational awareness, enhanced communication systems, education, music composition, sound effects, and entertainment. For these applications and others, SLAB provides a low-cost system for dynamic synthesis of virtual audio over headphones without the need for special-purpose signal-processing hardware.