On October 10, a research team led by Professor Yang-Hann Kim and Jung-Woo Choi announced the invention of the Sound Ball system.

This system is the first of its kind in that sound can be localized at a given point in space. This means that when it is applied to a 3D TV, the audience can experience a truly 3D environment both visually and aurally. For instance, concentrating an instrument's sound at a specific location can give the impression of being in a concert hall. In addition, because multiple sounds can be controlled freely, the Sound Ball system can be used not only privately but also professionally, for example by broadcasting companies to edit sound. The system builds on Professor Kim's earlier technique, acoustic contrast, which was presented to the Acoustical Society of America in 2002.

Various methods of producing 3D sound are available today. The problem, however, is that the perceived 3D sound varies with the listener and his or her environment. The solution follows the logic of a balance knob: rather than computing a single theoretically ideal sound, it puts priority on finding the sound that the listener actually wants. In other words, the listener can personally adjust the sound through trial and error until the desired result is achieved.
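The article does not describe the adjustment interface in detail, but the balance-knob idea can be sketched as a simple trial-and-error loop: the listener nudges the position of the sound and listens again until satisfied. The Python sketch below is only an illustration; the position variable and the render_at placeholder are assumptions, not part of the actual system.

```python
# Minimal sketch (not the authors' code): trial-and-error placement of a sound,
# in the spirit of turning a balance knob until it sounds right.

def render_at(position):
    # Placeholder: a real system would re-focus the loudspeaker array here.
    print(f"Rendering sound focused at x={position[0]:.1f}, "
          f"y={position[1]:.1f}, z={position[2]:.1f} (m)")

position = [0.0, 2.0, 1.0]      # initial guess for the focal point (metres)
axes = {"x": 0, "y": 1, "z": 2}

while True:
    render_at(position)
    cmd = input("Adjust (e.g. 'x +0.5'), or 'ok' when satisfied: ").split()
    if cmd and cmd[0] == "ok":
        break                   # the listener judges the result by ear
    if len(cmd) == 2 and cmd[0] in axes:
        position[axes[cmd[0]]] += float(cmd[1])
```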

The secret behind this innovative system lies in the use of multiple speakers, the so-called loudspeaker array. When more than one loudspeaker emits sound waves, interference occurs at specific points. By controlling these points, the sound can be focused at particular places. This idea of interference was originally derived from Huygens' principle and from the Kirchhoff-Helmholtz integral equation.
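The article does not spell out the focusing algorithm, but a standard way to see how controlled interference concentrates sound is delay-and-sum focusing: each speaker is driven with a phase (delay) chosen so that all wavefronts arrive at the focal point in step. The Python sketch below is an illustration under that assumption, not the authors' acoustic-contrast method; the array geometry, frequency, and listening points are made up for the example.

```python
import numpy as np

# Illustrative delay-and-sum focusing with a small linear loudspeaker array.
c = 343.0                     # speed of sound (m/s)
f = 1000.0                    # tone frequency (Hz)
k = 2 * np.pi * f / c         # wavenumber

# 24 speakers along the x-axis, 10 cm apart (spacing is an assumption).
speakers = np.array([[x, 0.0, 0.0] for x in np.arange(-1.15, 1.16, 0.10)])

focus = np.array([0.0, 2.0, 0.0])       # point where sound should concentrate
elsewhere = np.array([1.5, 2.0, 0.0])   # a comparison point off the focus

def level(listen_point, focal_point):
    """Magnitude of the summed speaker contributions at listen_point when each
    speaker is phased so that all waves add constructively at focal_point."""
    d_focus = np.linalg.norm(speakers - focal_point, axis=1)
    d_listen = np.linalg.norm(speakers - listen_point, axis=1)
    # Compensate each speaker's phase by its distance to the focus; 1/r spreading.
    contributions = np.exp(1j * k * (d_listen - d_focus)) / d_listen
    return np.abs(contributions.sum())

print("level at the focus :", level(focus, focus))
print("level off the focus:", level(elsewhere, focus))
# The first value is markedly larger: constructive interference at the focal point.
```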

The Sound Ball algorithm was implemented in a real audio system called the Spatial Equalizer. The system consists of 24 speakers in a linear array, complemented by 50 additional speakers in a spherical array, and uses the Open Sound Control (OSC) interface protocol. Consequently, control tools such as smartphones and host PCs can exchange data with it even at relatively long distances.
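OSC is a lightweight protocol typically carried over UDP: each message contains an address pattern, a type-tag string, and binary-packed arguments, which is what lets a smartphone or host PC steer the array over an ordinary network. As a hedged illustration, the sketch below builds and sends such a message by hand; the address "/soundball/position", the target IP, and the port are invented for the example and are not the Spatial Equalizer's actual settings.

```python
import socket
import struct

def osc_string(text: str) -> bytes:
    """Encode a string as OSC 1.0 requires: NUL-terminated, padded to 4 bytes."""
    data = text.encode("ascii") + b"\x00"
    return data + b"\x00" * ((4 - len(data) % 4) % 4)

def osc_message(address: str, *values: float) -> bytes:
    """Build a minimal OSC message carrying only float32 arguments."""
    msg = osc_string(address)                      # address pattern
    msg += osc_string("," + "f" * len(values))     # type-tag string, e.g. ",fff"
    for value in values:
        msg += struct.pack(">f", value)            # big-endian float32 argument
    return msg

# Hypothetical command: place the sound ball at (x, y, z) = (0.0, 2.0, 1.0) metres.
packet = osc_message("/soundball/position", 0.0, 2.0, 1.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("192.168.0.10", 9000))        # host PC driving the array (example values)
```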

Professor Kim and Professor Choi have been granted a patent for this invention, and their research paper was published in IEEE Transactions on Audio, Speech, and Language Processing.
