NUST Institutional Repository

A Point Cloud Based HRTF Technique for Analysing the Effects of 3D Sounds

Show simple item record

dc.contributor.author Muhammad Usman
dc.date.accessioned 2021-01-26T11:21:31Z
dc.date.available 2021-01-26T11:21:31Z
dc.date.issued 2017
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/21845
dc.description Supervisor DR. KHURRAM KAMAL en_US
dc.description.abstract 3D multimedia technologies such as virtual and augmented reality are a major focus nowadays due to advancements in multimedia technology. These systems must produce 3D sound effects in order to accurately simulate the real world. This thesis proposes a novel system for realizing 3D sounds in accordance with the head position for virtual reality and gaming applications. The technique employs an Inertial Measurement Unit (IMU) mounted on a headphone and a depth sensor to calculate the azimuth and elevation of the head. An Extended Kalman Filter fuses the head-tracking data from the IMU with the Kinect measurements. The estimated head position is fed to the CIPIC Head Related Transfer Function (HRTF) database to produce 3D sound effects. Results show a promising future for the proposed technique in 3D sound realization. Realistic sound generation is an integral part of Virtual Reality (VR) and Augmented Reality (AR). Conventionally, sound systems are designed so that sound realization is independent of the listener's direction. Normally this is not critical, but in VR and AR, sound localization becomes compulsory [1-2]. The listener should be able to distinctly perceive the exact location and orientation of any sound source inside the simulated environment. Hence, sound generation must account for all six degrees of freedom, taking into account the three positions and three orientations of the receiver with respect to the source. Although modifying a sound based on relative position is important for large simulated spaces, in small, confined areas the relative orientation takes precedence. This work therefore tackles orientation-based modification of the source sound, referred to here as realistic 3D sound generation. Commercially available virtual reality products, such as the Samsung Gear and Oculus Rift, have spawned much new research.
One of the major problems in VR systems is generating 3D sounds according to the listener's head position in a 360-degree view of a simulation and orientation with respect to the simulated environment. The challenge is to generate realistic sounds in the virtual world. en_US
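The abstract's final stage, producing 3D sound from an estimated head orientation via an HRTF database, amounts to convolving the mono source with a left/right head-related impulse response (HRIR) pair selected for the current azimuth and elevation. A minimal sketch of that rendering step, using NumPy and toy HRIRs standing in for a CIPIC database lookup (the function name and arrays are illustrative, not from the thesis):

```python
import numpy as np

def render_3d_sound(mono, hrir_left, hrir_right):
    """Spatialize a mono signal by convolving it with the left/right
    HRIRs chosen for the listener's current (azimuth, elevation).
    Returns a (2, N) stereo array: row 0 = left ear, row 1 = right ear."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)

# Toy example: a short mono burst and HRIRs for a source nearer the left ear.
mono = np.array([1.0, 0.5, 0.25])
hrir_l = np.array([0.9, 0.1])   # louder, earlier at the left ear
hrir_r = np.array([0.3, 0.05])  # attenuated at the right ear
stereo = render_3d_sound(mono, hrir_l, hrir_r)
```

In a full system the HRIR pair would be re-selected (or interpolated) from the database each time the fused IMU/Kinect estimate of head azimuth and elevation changes.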
dc.publisher CEME, National University of Sciences and Technology, Islamabad en_US
dc.subject A Point Cloud Based HRTF Technique for Analysing the Effects of 3D Sounds en_US
dc.title A Point Cloud Based HRTF Technique for Analysing the Effects of 3D Sounds en_US
dc.type Thesis en_US



This item appears in the following Collection(s)

  • MS [205]

