Realistic lighting is important to improve immersion and make mixed reality applications seem more plausible. To blend AR objects convincingly into a real scene, the lighting of the environment must be estimated. The existing illumination frameworks in Google’s ARCore (Google’s Augmented Reality Software Development Kit) and Apple’s ARKit (Apple’s Augmented Reality Software Development Kit) are computationally expensive and have very slow refresh rates, which makes them poorly suited to dynamic environments and low-end mobile devices. More recently, illumination estimation frameworks such as GLEAM and Xihe have aimed to provide better illumination with faster refresh rates. GLEAM is an illumination estimation framework that understands the real scene by collecting pixel data from a reflective spherical light probe. GLEAM uses this data to form environment cubemaps, which are then mapped onto a reflection probe to generate illumination for AR objects.
From a single viewpoint, only one half of the light probe can be observed at a time, so a single camera never captures complete information about the environment. This motivates multi-viewpoint estimation for better performance. This thesis analyzes the multi-viewpoint capabilities of AR illumination frameworks that use physical light probes to understand the environment. The current work adds networking to GLEAM using the TCP and UDP protocols. It also documents how processor load is shared across the networked devices and how that benefits GLEAM’s performance on mobile devices. Enhancements using multi-threading have also been made to the existing GLEAM model to improve its performance.
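The sketch below illustrates, in Python, one plausible shape for the multi-viewpoint exchange described above: each device serializes the cubemap faces it can observe and shares them over UDP, and peers merge the newest faces into a combined cubemap. The packet layout, port number, face resolution, and newest-wins merge rule are all illustrative assumptions, not the thesis's actual design.

```python
# Hypothetical sketch of multi-viewpoint cubemap sharing over UDP.
# Packet layout, port, face size, and merge rule are illustrative
# assumptions, not GLEAM's actual protocol.
import socket
import struct
import numpy as np

PORT = 9099          # assumed port
FACE_SIZE = 16       # assumed 16x16 RGB cubemap faces for brevity

def pack_face(viewpoint_id: int, face_index: int, face: np.ndarray) -> bytes:
    """Serialize one cubemap face: 2-byte header + raw RGB bytes."""
    assert face.shape == (FACE_SIZE, FACE_SIZE, 3) and face.dtype == np.uint8
    return struct.pack("!BB", viewpoint_id, face_index) + face.tobytes()

def unpack_face(packet: bytes):
    viewpoint_id, face_index = struct.unpack("!BB", packet[:2])
    face = np.frombuffer(packet[2:], dtype=np.uint8).reshape(FACE_SIZE, FACE_SIZE, 3)
    return viewpoint_id, face_index, face

def broadcast_face(sock, viewpoint_id, face_index, face,
                   addr=("255.255.255.255", PORT)):
    # UDP keeps latency low; a lost face is simply refreshed by the
    # next probe sample rather than retransmitted.
    sock.sendto(pack_face(viewpoint_id, face_index, face), addr)

def merge(cubemap_faces: dict, face_index: int, face: np.ndarray):
    """Naive merge rule: the newest sample wins for each face."""
    cubemap_faces[face_index] = face

if __name__ == "__main__":
    # One device samples the hemisphere of the probe it can see...
    visible = np.random.randint(0, 255, (FACE_SIZE, FACE_SIZE, 3), dtype=np.uint8)
    packet = pack_face(viewpoint_id=0, face_index=2, face=visible)
    # ...broadcast_face() would ship it; a peer then merges it locally.
    vp, idx, face = unpack_face(packet)
    cubemap = {}
    merge(cubemap, idx, face)
    print(f"merged face {idx} from viewpoint {vp}: {face.shape}")
```

Each 16x16 RGB face fits a single 770-byte datagram, which is why a fire-and-forget protocol like UDP is a natural fit here; TCP would only be needed for control traffic that must arrive in order.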
Computer-based auditory training programs (CBATPs) are used as an at-home aural rehabilitation solution for individuals with hearing impairment, most commonly recipients of cochlear implants or hearing aids. However, recent advancements in spatial audio and immersive gameplay have not yet been incorporated into these programs. Isle Aliquo, a virtual-reality CBATP, reformats traditional rehabilitation exercises into virtual 3D space. The program explores how outcomes on the aural exercises of detection, discrimination, direction, and identification can be improved by incorporating directional spatial audio, and how the experience can be made more engaging to improve adherence to training routines. Fundamentals of professional aural rehabilitation and current CBATP design inform the structure of the exercise modules found in Isle Aliquo.
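Isle Aliquo's audio pipeline is not reproduced here; purely to make "directional spatial audio" concrete at the signal level, the sketch below pans a mono training cue between stereo channels with an equal-power pan law. The function name and the pan law are illustrative assumptions; a production VR CBATP would use HRTF-based spatialization rather than simple panning.

```python
# Illustrative equal-power stereo panning for a direction exercise.
# This only shows the basic idea of encoding azimuth into channel
# gains; it is not Isle Aliquo's actual spatializer.
import numpy as np

def pan_by_azimuth(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Map azimuth in [-90, 90] degrees (left to right) onto stereo gains."""
    theta = np.clip(azimuth_deg, -90.0, 90.0)
    # Equal-power law: total energy stays constant as the source moves.
    angle = (theta + 90.0) / 180.0 * (np.pi / 2.0)
    left = np.cos(angle) * mono
    right = np.sin(angle) * mono
    return np.stack([left, right], axis=1)

# Example: a 440 Hz cue placed 45 degrees to the listener's right.
sr = 44100
t = np.linspace(0, 1.0, sr, endpoint=False)
cue = 0.5 * np.sin(2 * np.pi * 440 * t)
stereo = pan_by_azimuth(cue, azimuth_deg=45.0)
print(stereo.shape)  # (44100, 2)
```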
Video playback is currently the primary method coaches and athletes use in sports training to give feedback on an athlete’s form and timing. Athletes commonly record themselves with a phone or camera when practicing a sports movement, such as shooting a basketball, and then send the video to their coach for feedback on how to improve. In this work, we present Augmented Coach, an augmented reality tool that lets coaches give spatiotemporal feedback through a 3-dimensional point cloud of the athlete. The system allows coaches to view a pre-recorded video of their athlete in point cloud form and provides the tools to step frame by frame, both analyzing the athlete’s form and correcting it. The result is a fundamentally new kind of interactive video player, in which the coach can remotely view the athlete in 3-dimensional form and create annotations to help improve their form. We then conduct a user study with subject matter experts to evaluate the usability and capabilities of our system. The results indicate that Augmented Coach successfully supplements in-person coaching, since it allows coaches to break down the recording in 3-dimensional space and provide feedback spatiotemporally. The results also indicate that Augmented Coach can serve as a complete coaching solution in a remote setting. This technology will become increasingly relevant as coaches look for new ways to improve their feedback methods, especially remotely.
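Augmented Coach's implementation is not reproduced here; as a minimal sketch of the interactive point cloud player concept, the structures below step a playback cursor frame by frame and attach spatial annotations to the currently shown frame. All names and data shapes are illustrative assumptions.

```python
# Minimal sketch of a frame-by-frame point cloud player with
# per-frame spatial annotations. Data structures are illustrative
# assumptions, not Augmented Coach's actual implementation.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Annotation:
    position: tuple          # 3D anchor point of the coach's note
    text: str

@dataclass
class PointCloudRecording:
    frames: list             # each frame: (N, 3) array of XYZ points
    annotations: dict = field(default_factory=dict)  # frame index -> [Annotation]
    cursor: int = 0

    def step(self, delta: int = 1) -> np.ndarray:
        """Move the playback cursor, clamped to the recording bounds."""
        self.cursor = max(0, min(len(self.frames) - 1, self.cursor + delta))
        return self.frames[self.cursor]

    def annotate(self, position: tuple, text: str):
        """Attach a spatial note to the frame currently shown."""
        self.annotations.setdefault(self.cursor, []).append(
            Annotation(position, text))

# Example: a coach scrubs forward and marks the elbow position.
recording = PointCloudRecording(
    frames=[np.random.rand(1024, 3) for _ in range(30)])
recording.step(+12)                       # scrub forward 12 frames
recording.annotate((0.3, 1.4, 0.2), "keep the elbow under the ball")
print(recording.cursor, recording.annotations[12][0].text)
```

Keying annotations to frame indices is what makes the feedback spatiotemporal: each note lives both at a 3D position and at a moment in the recorded movement.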
Augmented Reality (AR), especially on mobile devices, enables applications that can help chemistry students learn anything from basic to advanced concepts. In chemistry specifically, the 3D representation of molecules and chemical structures is vitally important to students, yet when printed in 2D, as in textbooks and lecture notes, those 3D concepts can be quite hard to understand. ARsome Chemistry is an app that uses AR to display simple and complex molecules in 3D and actively teach these concepts through quizzes and other features. The app uses image-target recognition so that students can hand-draw or print line-angle structures or chemical formulas of molecules, then scan those targets to get a 3D representation of the molecule. Students can use the touch screen to zoom, rotate, and highlight different portions of the molecule to better understand its 3D structure. ARsome Chemistry can also quiz students on drawing line-angle structures: a student shows their drawing to the camera, and the app uses image recognition to check their work. The result is an accessible, cost-effective study aid that gives students on-demand, interactive, 3D representations of complex molecules.
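The recognizer itself is supplied by the underlying AR SDK and is not shown; the sketch below only illustrates the lookup-and-check logic an app like this might layer on top of a recognized target label. The molecule table, asset paths, and function names are assumptions for illustration.

```python
# Illustrative lookup from a recognized image-target label to a 3D
# molecule asset, plus a quiz check. The recognizer (the AR SDK's
# image-target matching) is abstracted away; all names and the asset
# table are hypothetical.
from typing import Optional

MOLECULE_ASSETS = {
    "C6H6": "models/benzene.obj",        # hypothetical asset paths
    "H2O": "models/water.obj",
    "CH4": "models/methane.obj",
}

def model_for_target(recognized_label: str) -> Optional[str]:
    """Map a recognized formula or line-angle target to its 3D model."""
    return MOLECULE_ASSETS.get(recognized_label.strip().upper())

def check_quiz_answer(expected: str, recognized_label: str) -> bool:
    """Quiz mode: does the student's hand-drawn structure, as classified
    by the recognizer, match the molecule they were asked to draw?"""
    return recognized_label.strip().upper() == expected.strip().upper()

print(model_for_target("c6h6"))           # models/benzene.obj
print(check_quiz_answer("CH4", "ch4 "))   # True
```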
Spatial audio can be especially useful for directing human attention. However, delivering spatial audio through speakers, rather than headphones that deliver audio directly to the ears, produces crosstalk: sound from each of the two speakers reaches the opposite ear, inhibiting the spatialized effect. A research team at Meteor Studio has developed an algorithm called Xblock that solves this problem with a crosstalk cancellation technique. This thesis project expands the existing Xblock IoT system with a way to test the directional accuracy of sounds generated with spatial audio. More specifically, the objective is to determine whether using Xblock with smart speakers can provide generalized audio localization, i.e., the ability to detect the general direction a sound is coming from. The project also extends Xblock with voice commands: users can speak the name of a lost item using the phrase “Find [item]”, and the IoT system will use spatial audio to guide them to it.
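Xblock's actual filters are not described in this abstract, so the sketch below shows only the textbook form of crosstalk cancellation: per frequency bin, invert the 2x2 speaker-to-ear transfer matrix so each ear receives its intended binaural channel. The toy transfer matrix, regularization constant, and function names are assumptions, not Xblock's implementation.

```python
# Generic per-bin crosstalk cancellation sketch: invert the 2x2
# speaker-to-ear transfer matrix so each ear receives only its
# intended binaural channel. Transfer functions here are toy values.
import numpy as np

def cancel_crosstalk(binaural: np.ndarray, H: np.ndarray, eps: float = 1e-6):
    """binaural: (n_samples, 2) desired ear signals.
    H: (n_bins, 2, 2) speaker-to-ear transfer matrix per FFT bin,
       so that H[k] @ speaker_spectra[k] = ear_spectra[k].
    Returns (n_samples, 2) speaker feeds."""
    n = binaural.shape[0]
    D = np.fft.rfft(binaural, axis=0)             # desired ear spectra
    # Regularized inverse keeps ill-conditioned bins from exploding.
    Hinv = np.linalg.inv(H + eps * np.eye(2))
    S = np.einsum("kij,kj->ki", Hinv, D)          # speaker spectra
    return np.fft.irfft(S, n=n, axis=0)

# Toy example: ipsilateral gain 1.0, contralateral leakage 0.35,
# identical in every bin (a real room response varies with frequency).
n = 1024
bins_ = n // 2 + 1
H = np.tile(np.array([[1.0, 0.35], [0.35, 1.0]]), (bins_, 1, 1)).astype(complex)
binaural = np.random.randn(n, 2)
speaker_feed = cancel_crosstalk(binaural, H)
# Simulate propagation through the room and check what the ears receive.
ears = np.fft.irfft(np.einsum("kij,kj->ki",
                              H, np.fft.rfft(speaker_feed, axis=0)), n=n, axis=0)
print(np.allclose(ears, binaural, atol=1e-4))  # True: leakage cancelled
```

Under this formulation, testing directional accuracy amounts to checking how closely the signals arriving at the ears match the intended binaural rendering for each target direction.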