
Research Repository


All Outputs (3)

NeFT-Net: N-window extended frequency transformer for rhythmic motion prediction (2025)
Journal Article
Ademola, A., Sinclair, D., Koniaris, B., Hannah, S., & Mitchell, K. (2025). NeFT-Net: N-window extended frequency transformer for rhythmic motion prediction. Computers and Graphics, 129, Article 104244. https://doi.org/10.1016/j.cag.2025.104244

Advancements in prediction of human motion sequences are critical for enabling online virtual reality (VR) users to dance and move in ways that accurately mirror real-world actions, delivering a more immersive and connected experience. However, laten…

Audio Occlusion Experiment Data (2025)
Data
McSeveney, S., Tamariz, M., McGregor, I., Koniaris, B., & Mitchell, K. (2025). Audio Occlusion Experiment Data. [Data]

This dataset comprises anonymous user study participant responses from an audio occlusion experiment, investigating the presence response to body occlusions of sound sources placed in the direct path between the participant and the audio driver speaker.

DeFT-Net: Dual-Window Extended Frequency Transformer for Rhythmic Motion Prediction (2024)
Presentation / Conference Contribution
Ademola, A., Sinclair, D., Koniaris, B., Hannah, S., & Mitchell, K. (2024, September). DeFT-Net: Dual-Window Extended Frequency Transformer for Rhythmic Motion Prediction. Presented at EG UK Computer Graphics & Visual Computing (2024), London, UK.

Enabling online virtual reality (VR) users to dance and move in a way that mirrors the real world necessitates improvements in the accuracy of predicting human motion sequences, paving the way for an immersive and connected experience. However, the drawba…