XR-IT

5+

October 2024 - present

Systems Engineer

Unreal Engine, Unity


Overview

XR-IT (eXtended Realities Intraverse Toolkit) is a system developed at DAE (Design Academy Eindhoven) that makes it easy to bring geographically dispersed locations together into a shared virtual or mixed reality world using high-end, film-grade technology.

It consists of a backend running in Node.js that controls and communicates with software such as motion capture and video streaming tools, and game engine plugins for Unreal Engine and Unity that take in live data and co-locate users/actors across multiple real-life and virtual spaces.

XR-IT was developed as a research project with investment from the European Media and Immersion Lab (EMIL) as part of the EU's Horizon Europe programme.

As a proof of concept and a test of the technology, we filmed a short film with one actor in Eindhoven and one actor in Helsinki, all working and filming together with a crew of about 20 people, including a director, camera operators, and technicians.

XR-IT Backend Connections Graph


My Contribution

I was solely responsible for developing the Unreal Engine and Unity plugins, working with a Node.js developer on communication with our backend, and with the rest of the team to design and implement a usable plugin that non-technical people can operate.

During our test production, I was on set to make sure the tech ran smoothly, improving things on the fly and noting down areas to improve in the future.

After the production, as of writing this, I'm still working part time with our Node.js developer to improve the systems we've built based on lessons from the production, write documentation, and prepare XR-IT for a public release. I also help run and extend XR-IT for student projects that use it for cross-spatial collaboration.

XR-IT Unity Plugin UI


The Tech

My main focus is the Unreal Engine plugin, as that's what we used in our test film production.

The plugin is written largely in C++. It communicates with our Node.js backend over WebSockets, and interfaces in a modular, optional way with other plugins (for example, motion capture from Xsens MVN and OptiTrack) and with native Unreal Engine plugins that handle media output streaming. It synchronizes state such as character calibration, the selected characters, and the opened scene. It also controls VR to keep it in sync with motion capture, and supports custom data streams that go from engine to engine.

There's also a large (extendable and customizable) part of the plugin in Blueprints, which interfaces with systems such as animation retargeting and custom user classes in Unreal.

I made a UI window that lets the user control the plugin: swapping the playable character models, calibrating the virtual-space character to a real-world origin point, and more.

The Unity plugin offers essentially the same functionality, except for media streaming, which is not part of the Unity package.

XR-IT Unreal Engine Plugin UI
