Marinel Tinnirello

Creative Technologist

Game Designer / Developer

XR Specialist

Full Stack Engineer

Transdisciplinary Researcher

(Re)Live

  • Roles: VFX Artist, Programmer
  • Competition: MIT Reality Hack 2024
  • Categories: XR
See Demo

What is (Re)Live?

(Re)Live is a VR experience that captures what it was like to be at MIT Reality Hack 2024. It acts as a virtual, immersive archive: we interviewed numerous participants and displayed their reaction videos within 3D scans of the buildings where the event took place.

Responsibilities

  • Build VFX to highlight what the user is interacting with, as well as audio-reactive particles that play while recording a memory.
  • Set up a system for importing participant data so it can be displayed and interacted with.
  • Smooth out animations and transitions between scenes.

Creative Process

  • Create a virtual archive within the constraints of external factors.
  • Convey a feeling of "presence" by memorializing participants and recording users' audio.
  • Build a system for entering participants' data alongside their scans that can be directly interacted with.
  • Record audio at runtime and save it into the project for others to experience.
  • Two days to work on the project.
  • Fit the theme of "Presence".
  • Split time between this project and work (I had a product launch for my full-time role the following Tuesday).

1) Create highlight & audio-reactive VFX.

2) Design a system that lets designers match participant data to scans and also allows for interaction.

Thought Process

We wanted some diegetic indication for when a user hovers over an object before interacting with it, as well as for when we were recording. To handle this, I decided to create some basic Niagara particles. In truth, I'm not much of a VFX artist; it's an area I've been wanting to get into more, so I took the opportunity to play with it during the hackathon, since I only wanted to work as a supporting programmer.

Niagara allows audio spectrum information to be polled and edited, which was news to me and made this set of particles pretty simple to knock out. They were placed over the hands whenever we started recording, giving a whimsical effect.
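As a rough illustration of the hookup (the effect itself and our actual wiring were authored in Blueprints and the Niagara editor, so the function and asset names below are assumptions, not the project's code), attaching such a system to a hand component might look like this:

    // Hedged sketch: spawn an audio-reactive Niagara system on a hand component
    // when recording starts. The system asset is assumed to sample Niagara's
    // audio spectrum data interface; names here are illustrative only.
    #include "NiagaraFunctionLibrary.h"
    #include "NiagaraComponent.h"
    #include "NiagaraSystem.h"

    UNiagaraComponent* SpawnRecordingSparkles(UNiagaraSystem* SparklesSystem,
                                              USceneComponent* HandComponent)
    {
        // Attach the particles to the hand so they follow the controller while recording.
        return UNiagaraFunctionLibrary::SpawnSystemAttached(
            SparklesSystem,
            HandComponent,
            NAME_None,
            FVector::ZeroVector,
            FRotator::ZeroRotator,
            EAttachLocation::SnapToTarget,
            /*bAutoDestroy=*/ false);
    }

Deactivating the returned component when recording stops cleans the effect up.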

Originally, I wanted to create a beam particle as our highlighting mechanic, since the user scans produced by Luma AI are essentially particles themselves. The effect I wanted to pull off was proving a bit too difficult (or at least too time-consuming, given my personal constraints on top of the project's), so I swapped over to a simple ribboned halo ring effect. This actually ended up working out better: it's much more clearly defined what's being highlighted, and it feels less immersion-breaking given that the participants' scans read as 2D.

The last piece I created for this project was the visuals for the memories: an orb with a simple glow effect, surrounded by three concentric, rotating rings.

Results

The effects gave the player a very clear and distinctive indication of what was happening, while lending a fantastical visual quality to the experience.

Thought Process

Crafting a designer-friendly system that allowed the dynamic addition of data and interaction for these Luma AI scans was going to be simple... or so I thought.

I originally figured I could add a couple of components with exposed variables, adjust them in the construction script, and handle any interaction via raycasts. With the test objects this proved true and took mere minutes to set up. It was not the case with the parent class made to consolidate any information we wanted to add to the Luma AI-generated scan classes. The short answer is that Luma AI is set up in such a way that it doesn't want to be played with without making some serious edits to the plugin itself.
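For context, the interaction trace was the standard controller-forward raycast; a minimal sketch (illustrative only, the reach distance and names are assumptions) is below. This is what hit the test objects fine and returned nothing against the Luma AI scan classes:

    // Hedged sketch of a controller-forward interaction raycast against the
    // Visibility channel; the 1000-unit reach is an assumed value.
    #include "Engine/World.h"
    #include "Engine/HitResult.h"
    #include "CollisionQueryParams.h"

    bool TraceForInteractable(UWorld* World, const FVector& ControllerLocation,
                              const FVector& ControllerForward, FHitResult& OutHit)
    {
        const FVector TraceEnd = ControllerLocation + ControllerForward * 1000.f;
        FCollisionQueryParams Params;
        return World->LineTraceSingleByChannel(OutHit, ControllerLocation, TraceEnd,
                                               ECC_Visibility, Params);
    }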

We ran into the issue when I noticed that absolutely no raycasts were hitting, even against a simple primitive shape added inside the blueprint. After racking my brain for a bit, I came up with a somewhat ugly workaround in the form of child actor blueprints. For anything we wanted to add onto the Luma AI classes (or rather, the nested parent class I made), I created a couple of classes (one for our highlight and one for our name cards) with invisible meshes to interact with, added as child actors. Unfortunately, that meant the parameters were not exposed to the outliner, so designers had to edit them inside the blueprint.
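A rough C++ equivalent of that workaround, purely to illustrate the idea (in the project this was done with child actor components inside the blueprint, and the class and function names below are hypothetical):

    // Hedged sketch: wrap the Luma AI scan's parent actor with child actors that
    // carry invisible, collidable meshes so raycasts have something to hit.
    // AddInteractionChildActors and both child classes are illustrative names.
    #include "Components/ChildActorComponent.h"
    #include "GameFramework/Actor.h"
    #include "Templates/SubclassOf.h"

    void AddInteractionChildActors(AActor* ScanParent,
                                   TSubclassOf<AActor> HighlightClass,
                                   TSubclassOf<AActor> NameCardClass)
    {
        auto AddChild = [ScanParent](TSubclassOf<AActor> ChildClass, FName Name)
        {
            UChildActorComponent* Child = NewObject<UChildActorComponent>(ScanParent, Name);
            Child->SetChildActorClass(ChildClass);
            Child->SetupAttachment(ScanParent->GetRootComponent());
            Child->RegisterComponent(); // spawns the child actor and its invisible mesh
        };

        AddChild(HighlightClass, TEXT("HighlightChild"));
        AddChild(NameCardClass, TEXT("NameCardChild"));
    }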

Results

I was shocked that Luma AI put us in such a precarious situation over something as simple as a raycast, blocking virtually everything in the process. We had already run into issues earlier with how to cull the Gaussian effects, but we were able to solve those as well.

I feel bad I couldn't be as present as I would've liked, as my job takes precedence over a fun hackathon. That said, I enjoyed not being the main programmer and taking a bit of a backseat to do more art, and I'd like to play with more VFX down the line. As of when the hackathon wrapped, though, I don't think Gaussian splats are quite there yet, and Luma AI, while interesting, is a very precarious plugin to work with. I look forward to seeing how these technologies improve.

Teammates

  • Christian Yang: Programmer
  • Yihong Xu: Programmer, Project Manager
  • Zoe Zhou: Interviewer, Designer
  • Mandy Liu: Interviewer, Designer

Links

Here are links to the project: