Selected Projects


 

Gen-3 Alpha Video-to-Video

I worked on a new implementation of our video-to-video product at Runway, built on the latest Gen-3 Alpha model. This included training the model and running experiments to figure out the best form of conditioning, so that video-to-video transfer works properly and temporal consistency stays adequate. I’m particularly excited about this model because of the control it offers and because of how it relates to computer graphics rendering.


CLIP-based NNFM Stylization for 3D Assets (EG 2023)

Shailesh Mishra joined NVIDIA as a summer intern under my supervision in the Applied Deep Learning Research team, and we chose to work on 3D asset stylization. We extended “neural neighbor feature matching” to work with CLIP-ResNet50’s feature maps and to allow using multiple style images during optimization. We also enable artistic control through a color palette loss that can guide the optimization towards a chosen set of color swatches.
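
The matching objective at the heart of this kind of stylization can be sketched in a few lines. The snippet below is a simplified NumPy illustration of the general idea, not our actual implementation; the function names and the cosine-distance formulation are stand-ins.

```python
import numpy as np

def nnfm_loss(render_feats, style_feats):
    """Nearest-neighbor feature matching (sketch): for every rendered
    feature vector, find the closest style feature under cosine distance
    and average those distances. Shapes: (N, D) and (M, D)."""
    r = render_feats / np.linalg.norm(render_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    cos_sim = r @ s.T                        # (N, M) cosine similarities
    return np.mean(1.0 - cos_sim.max(axis=1))

def palette_loss(colors, palette):
    """Color palette loss (sketch): pull each rendered color towards its
    nearest swatch in the given palette. Shapes: (N, 3) and (K, 3)."""
    d = np.linalg.norm(colors[:, None, :] - palette[None, :, :], axis=-1)
    return np.mean(d.min(axis=1))
```

In this formulation, supporting multiple style images amounts to stacking their feature maps into a single `style_feats` array before matching.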


NeRF-Tex (Computer Graphics Forum 2022)

Hendrik Baatz’s project started out as his Master’s thesis, which I helped supervise, and later continued during an internship at NVIDIA Research. The approach uses neural fields to apply “mesoscale” materials, such as fur and hair, onto 3D objects. We train the fields on artistically controllable patches of fur; during inference, these patches are instanced onto the surfaces of objects to give them a specific appearance. The project was first presented at EGSR 2021 and later published in CGF 2022.


Neural Scene Graph Rendering (SIGGRAPH 2021)

A project that I started near the end of my internship with NVIDIA and continued after starting my permanent position with the Applied Deep Learning Research team. The paper presents a new neural scene representation, inspired by traditional scene graphs, that is both controllable and scalable, and that also generalizes well. We focused on simpler scenes to keep the problem tractable at this stage, but the hope is that such neural representations will eventually enable feeding whole graphics scenes to neural networks.
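
The scene-graph flavor of the representation can be illustrated with a toy structure: transforms stay explicit and editable, as in a classical scene graph, while appearance lives in learned latent codes at the leaves. This is only an illustrative sketch of the concept, not the architecture from the paper.

```python
import numpy as np

class Node:
    """Toy scene-graph node: an explicit transform plus an optional
    learned latent code describing the object at this node."""
    def __init__(self, latent, transform, children=()):
        self.latent = latent          # learned appearance/geometry code
        self.transform = transform    # explicit 4x4 model matrix
        self.children = list(children)

def flatten(node, parent=np.eye(4)):
    """Collect (world_transform, latent) pairs, e.g. as input for a
    neural renderer. Group nodes (latent=None) contribute no leaf."""
    world = parent @ node.transform
    leaves = [(world, node.latent)] if node.latent is not None else []
    for child in node.children:
        leaves += flatten(child, world)
    return leaves
```

Editing the scene then means editing transforms or swapping latents, which is where the controllability comes from.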


Compositional Neural Scene Representations for Shading Inference (SIGGRAPH 2020)

This work started out as my Master’s thesis and eventually expanded into an internship at NVIDIA in Zurich. We explored how learned neural scene representations could augment traditional graphics pipelines, with the main focus on new methods for improving the interpretability of those representations. Supervised by Jan Novák, Fabrice Rousselle, and Marios Papas.

Houdini Demo Reel

1. Sandstorm - TDU Rent-a-Mentor Project: I worked on a sandstorm simulation for my TDU rent-a-mentor course. The project uses a lot of microsolvers, mostly for masking turbulence, dissipation, and other forces. The plants are simulated using wires, tiny debris is simulated with particles, and all extra elements are advected using a low-resolution version of the simulation.

2. Iceberg Waterfall - Personal Project: I simulated around 80 million FLIP water particles for the main waterfall and around 30 million for the ocean, in separate simulations. On top of that, I also simulated air particles, which cause most of the interesting movement in the FLIP sim. I also modified the FLIP solver's droplet detection method: it wouldn't recognize the water droplets because there were always air particles around the water particles, and sometimes particles would get stuck in the air.

3. Force Field - FXPHD HOU212 Course Project: I wanted a force field that forms from particles thrown from the character's hand; the idea is that when the character stomps on the rune, the spell is activated, and after the spell is over the rune disappears. The project's focus was on designing your own force field effect from scratch. I did everything except the character animation.

4. Hevisaurus - Explosion: I did the smoke, debris, and spark simulations in Houdini. There's also an additional layer of spark stock footage composited on top.

5. Hevisaurus - Smell Effect: I created a volume deformation tool in Houdini that was used in several shots with similar effects.

6. Hevisaurus - Fire Breathing: I created a Houdini tool that controlled the fire-breathing effect. An artist creates a curve for the particles to follow (via volume advection), and the particles are used as fuel for the fire.