Research Repository

Dr Babis Koniaris' Outputs (27)

DeFT-Net: Dual-Window Extended Frequency Transformer for Rhythmic Motion Prediction (2024)
Presentation / Conference Contribution
Ademola, A., Sinclair, D., Koniaris, B., Hannah, S., & Mitchell, K. (2024, September). DeFT-Net: Dual-Window Extended Frequency Transformer for Rhythmic Motion Prediction. Presented at EG UK Computer Graphics & Visual Computing (2024), London, UK

Enabling online virtual reality (VR) users to dance and move in a way that mirrors the real world necessitates improvements in the accuracy of predicting human motion sequences, paving the way for an immersive and connected experience. However, the drawba...

Auditory Occlusion Based on the Human Body in the Direct Sound Path: Measured and Perceivable Effects (2024)
Presentation / Conference Contribution
McSeveney, S., Tamariz, M., McGregor, I., Koniaris, B., & Mitchell, K. (2024, September). Auditory Occlusion Based on the Human Body in the Direct Sound Path: Measured and Perceivable Effects. Presented at Audio Mostly 2024 - Explorations in Sonic Cultures, Milan, Italy

Audio plays a key role in the sense of immersion and presence in VR, as it correlates to improved enjoyment of content. We share results of a perception study on the ability of listeners to recognise auditory occlusion due to the presence of a human...

DanceMark: An open telemetry framework for latency sensitive real-time networked immersive experiences (2024)
Presentation / Conference Contribution
Koniaris, B., Sinclair, D., & Mitchell, K. (2024, March). DanceMark: An open telemetry framework for latency sensitive real-time networked immersive experiences. Presented at IEEE VR Workshop on Open Access Tools and Libraries for Virtual Reality, Orlando, FL

DanceMark is an open telemetry framework designed for latency-sensitive real-time networked immersive experiences, focusing on online dancing in virtual reality within the DanceGraph platform. The goal is to minimize end-to-end latency and enhance us...

Design Considerations of Voice Articulated Generative AI Virtual Reality Dance Environments (2024)
Presentation / Conference Contribution
Casas, L., Mitchell, K., Tamariz, M., Hannah, S., Sinclair, D., Koniaris, B., & Kennedy, J. (2024, May). Design Considerations of Voice Articulated Generative AI Virtual Reality Dance Environments. Presented at SIGCHI GenAI in UGC Workshop, Honolulu, Hawaii

We consider practical and social considerations of collaborating verbally with colleagues and friends, not confined by physical distance, but through seamless networked telepresence to interactively create shared virtual dance environments. In respon...

DanceGraph: A Complementary Architecture for Synchronous Dancing Online (2023)
Presentation / Conference Contribution
Sinclair, D., Ademola, A. V., Koniaris, B., & Mitchell, K. (2023, May). DanceGraph: A Complementary Architecture for Synchronous Dancing Online. Paper presented at 36th International Computer Animation & Social Agents (CASA) 2023, Limassol, Cyprus

DanceGraph is an architecture for synchronized online dancing overcoming the latency of networked body pose sharing. We break down this challenge by developing a real-time bandwidth-efficient architecture to minimize lag and reduce the timeframe of...

A brain atlas of synapse protein lifetime across the mouse lifespan (2022)
Journal Article
Bulovaite, E., Qiu, Z., Kratschke, M., Zgraj, A., Fricker, D. G., Tuck, E. J., …Grant, S. G. (2022). A brain atlas of synapse protein lifetime across the mouse lifespan. Neuron, 110(24), 4057-4073. https://doi.org/10.1016/j.neuron.2022.09.009

The lifetime of proteins in synapses is important for their signaling, maintenance, and remodeling, and for memory duration. We quantified the lifetime of endogenous PSD95, an abundant postsynaptic protein in excitatory synapses, at single-synapse re...

Embodied online dance learning objectives of CAROUSEL + (2021)
Presentation / Conference Contribution
Mitchell, K., Koniaris, B., Tamariz, M., Kennedy, J., Cheema, N., Mekler, E., Van Der Linden, P., Herrmann, E., Hämäläinen, P., McGregor, I., Slusallek, P., & Mac Williams, C. (2021, March). Embodied online dance learning objectives of CAROUSEL +. Presented at 2021 IEEE VR 6th Annual Workshop on K-12+ Embodied Learning through Virtual and Augmented Reality (KELVAR), Lisbon, Portugal

This is a position paper concerning the embodied dance learning objectives of the CAROUSEL+ project, which aims to impact how online immersive technologies influence multiuser interaction and communication with a focus on dancing and learning danc...

A brain-wide atlas of synapses across the mouse lifespan (2020)
Journal Article
Cizeron, M., Qiu, Z., Koniaris, B., Gokhale, R., Komiyama, N. H., Fransén, E., & Grant, S. G. (2020). A brain-wide atlas of synapses across the mouse lifespan. Science, 369(6501). https://doi.org/10.1126/science.aba3163

Synapses connect neurons together to form the circuits of the brain and their molecular composition controls innate and learned behavior. We have analyzed the molecular and morphological diversity of five billion excitatory synapses at single-synapse...

Depth codec for real-time, high-quality light field reconstruction (2019)
Patent
Mitchell, K., Koniaris, C., Kosek, M., & Sinclair, D. (2019). Depth codec for real-time, high-quality light field reconstruction. US20190313080A1

Systems, methods, and articles of manufacture are disclosed that enable the compression of depth data and real-time reconstruction of high-quality light fields. In one aspect, spatial compression and decompression of depth images is divided into the...

Memory Allocation For Seamless Media Content Presentation (2019)
Patent
Mitchell, K., Koniaris, C., & Chitalu, F. (2019). Memory Allocation For Seamless Media Content Presentation. US20190096028

A system for performing memory allocation for seamless media content presentation includes a computing platform having a CPU, a GPU having a GPU memory, and a main memory storing a memory allocation software code. The CPU executes the memory allocati...

Real-time rendering with compressed animated light fields (2018)
Patent
Mitchell, K., Koniaris, C., Kosek, M., & Sinclair, D. (2018). Real-time rendering with compressed animated light fields. US20180322691

Systems, methods, and articles of manufacture for real-time rendering using compressed animated light fields are disclosed. One embodiment provides a pipeline, from offline rendering of an animated scene from sparse optimized viewpoints to real-time...

Architecture of the Mouse Brain Synaptome (2018)
Journal Article
Zhu, F., Cizeron, M., Qiu, Z., Benavides-Piccione, R., Kopanitsa, M. V., Skene, N. G., …Grant, S. G. (2018). Architecture of the Mouse Brain Synaptome. Neuron, 99(4), 781-799.e10. https://doi.org/10.1016/j.neuron.2018.07.007

Synapses are found in vast numbers in the brain and contain complex proteomes. We developed genetic labeling and imaging methods to examine synaptic proteins in individual excitatory synapses across all regions of the mouse brain. Synapse catalogs we...

GPU-accelerated depth codec for real-time, high-quality light field reconstruction (2018)
Journal Article
Koniaris, B., Kosek, M., Sinclair, D., & Mitchell, K. (2018). GPU-accelerated depth codec for real-time, high-quality light field reconstruction. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 1(1), 1-15. https://doi.org/10.1145/3203193

Pre-calculated depth information is essential for efficient light field video rendering, due to the prohibitive cost of depth estimation from color when real-time performance is desired. Standard state-of-the-art video codecs fail to satisfy such per...

System and method of presenting views of a virtual space (2018)
Patent
Mitchell, K., Koniaris, C., Iglesias-Guitian, J., Moon, B., & Smolikowski, E. (2018). System and method of presenting views of a virtual space. US20180114343

Views of a virtual space may be presented based on predicted colors of individual pixels of individual frame images that depict the views of the virtual space. Predictive models may be assigned to individual pixels that predict individual pixel color...

Compressed Animated Light Fields with Real-time View-dependent Reconstruction (2018)
Journal Article
Koniaris, C., Kosek, M., Sinclair, D., & Mitchell, K. (2019). Compressed Animated Light Fields with Real-time View-dependent Reconstruction. IEEE Transactions on Visualization and Computer Graphics, 25(4), 1666-1680. https://doi.org/10.1109/tvcg.2018.2818156

We propose an end-to-end solution for presenting movie quality animated graphics to the user while still allowing the sense of presence afforded by free viewpoint head motion. By transforming offline rendered movie content into a novel immersive repr...

Method for Efficient CPU-GPU Streaming for Walkthrough of Full Motion Lightfield Video (2017)
Presentation / Conference Contribution
Chitalu, F. M., Koniaris, B., & Mitchell, K. (2017, December). Method for Efficient CPU-GPU Streaming for Walkthrough of Full Motion Lightfield Video. Presented at 14th European Conference on Visual Media Production (CVMP 2017), London, United Kingdom

Lightfield video, as a high-dimensional function, is very demanding in terms of storage. As such, lightfield video data, even in a compressed form, do not typically fit in GPU or main memory unless the capture area, resolution or duration is sufficie...

IRIDiuM+: deep media storytelling with non-linear light field video (2017)
Presentation / Conference Contribution
Kosek, M., Koniaris, B., Sinclair, D., Markova, D., Rothnie, F., Smoot, L., & Mitchell, K. (2017, July). IRIDiuM+: deep media storytelling with non-linear light field video. Presented at the ACM SIGGRAPH 2017 VR Village (SIGGRAPH '17), Los Angeles, California

We present immersive storytelling in VR enhanced with non-linear sequenced sound, touch and light. Our Deep Media (Rose 2012) aim is to allow for guests to physically enter rendered movies with novel non-linear storytelling capability. With the ab...

Real-time rendering with compressed animated light fields (2017)
Presentation / Conference Contribution
Koniaris, B., Kosek, M., Sinclair, D., & Mitchell, K. (2017, May). Real-time rendering with compressed animated light fields. Presented at 43rd Graphics Interface Conference

We propose an end-to-end solution for presenting movie quality animated graphics to the user while still allowing the sense of presence afforded by free viewpoint head motion. By transforming offline rendered movie content into a novel immersive repr...

Pixel history linear models for real-time temporal filtering (2016)
Journal Article
Iglesias-Guitian, J. A., Moon, B., Koniaris, C., Smolikowski, E., & Mitchell, K. (2016). Pixel history linear models for real-time temporal filtering. Computer Graphics Forum, 35(7), 363-372. https://doi.org/10.1111/cgf.13033

We propose a new real-time temporal filtering and antialiasing (AA) method for rasterization graphics pipelines. Our method is based on Pixel History Linear Models (PHLM), a new concept for modeling the history of pixel shading values over time using...

IRIDiuM: immersive rendered interactive deep media (2016)
Presentation / Conference Contribution
Koniaris, B., Israr, A., Mitchell, K., Huerta, I., Kosek, M., Darragh, K., …Moon, B. (2016). IRIDiuM: immersive rendered interactive deep media. https://doi.org/10.1145/2929490.2929496

Compelling virtual reality experiences require high quality imagery as well as head motion with six degrees of freedom. Most existing systems limit the motion of the viewer (prerecorded fixed position 360 video panoramas), or are limited in realism,...

Stereohaptics: a haptic interaction toolkit for tangible virtual experiences (2016)
Presentation / Conference Contribution
Israr, A., Zhao, S., McIntosh, K., Schwemler, Z., Fritz, A., Mars, J., Bedford, J., Frisson, C., Huerta, I., Kosek, M., Koniaris, B., & Mitchell, K. (2016, July). Stereohaptics: a haptic interaction toolkit for tangible virtual experiences. Presented at ACM SIGGRAPH 2016 Studio on - SIGGRAPH '16, Anaheim, CA, US

With a recent rise in the availability of affordable head mounted gear sets, various sensory stimulations (e.g., visual, auditory and haptics) are integrated to provide seamlessly embodied virtual experience in areas such as education, entertainment,...

User, metric, and computational evaluation of foveated rendering methods (2016)
Presentation / Conference Contribution
Swafford, N. T., Iglesias-Guitian, J. A., Koniaris, C., Moon, B., Cosker, D., & Mitchell, K. (2016, July). User, metric, and computational evaluation of foveated rendering methods. Presented at the ACM Symposium on Applied Perception (SAP '16)

Perceptually lossless foveated rendering methods exploit human perception by selectively rendering at different quality levels based on eye gaze (at a lower computational cost) while still maintaining the user's perception of a full quality render. W...

Simulation and skinning of heterogeneous texture detail deformation (2016)
Patent
Koniaris, C., Mitchell, K., & Cosker, D. (2016). Simulation and skinning of heterogeneous texture detail deformation. US2016133040

A method is disclosed for reducing distortions introduced by deformation of a surface with an existing parameterization. In an exemplary embodiment, the method comprises receiving a rest pose mesh comprising a plurality of faces, a rigidity map corre...

Real-time variable rigidity texture mapping (2015)
Presentation / Conference Contribution
Koniaris, C., Mitchell, K., & Cosker, D. (2015, November). Real-time variable rigidity texture mapping. Presented at the 12th European Conference on Visual Media Production (CVMP '15)

Parameterisation of models is typically generated for a single pose, the rest pose. When a model deforms, its parameterisation characteristics change, leading to distortions in the appearance of texture-mapped mesostructure. Such distortions are unde...

Guided ecological simulation for artistic editing of plant distributions in natural scenes (2015)
Journal Article
Bradbury, G. A., Subr, K., Koniaris, C., Mitchell, K., & Weyrich, T. (2015). Guided ecological simulation for artistic editing of plant distributions in natural scenes. The Journal of Computer Graphics Techniques, 4(4), 28-53

In this paper we present a novel approach to author vegetation cover of large natural scenes. Unlike stochastic scatter-instancing tools for plant placement (such as multi-class blue noise generators), we use a simulation based on ecological processe...

Content aware texture mapping on deformable surfaces (2014)
Patent
Koniaris, C., Cosker, D., Yang, X., Mitchell, K., & Matthews, I. (2014). Content aware texture mapping on deformable surfaces. US2014267306

A method is disclosed for reducing distortions introduced by deformation of a surface with an existing parameterization. In one embodiment, the distortions are reduced over a user-specified convex region in texture space ensuring optimization is loca...

Survey of texture mapping techniques for representing and rendering volumetric mesostructure (2014)
Journal Article
Koniaris, B., Cosker, D., Yang, X., & Mitchell, K. (2014). Survey of texture mapping techniques for representing and rendering volumetric mesostructure. The Journal of Computer Graphics Techniques, 3(2), 18-60

Representation and rendering of volumetric mesostructure using texture mapping can potentially allow the display of highly detailed, animated surfaces at a low performance cost. Given the need for consistently more detailed and dynamic worlds rendere...