Pixar's Kitchen Set asset is a well-known scene in the USD community, used across many tutorials and benchmarks. My goal for this post is to give a high-level introduction to how I rendered that scene with my own renderer, MrRay, using Hydra and Blender.
My previous journey with ray-tracing 📚
For the past few years, I have had a fascination with ray-tracing. I developed my first ever ray-tracer back in 2018, mostly in my university library. It was exhilarating seeing a program I wrote render a simple sphere; however, since the code sucked (my object-oriented knowledge was beginner-level at the time) and since I wrote it all in Java, the project never got further than producing simple geometry.
Fast forward to 2020: I gave ray-tracing another shot, this time using the “Ray Tracing in One Weekend” series as my starting point. I started afresh, and in a few weeks I had gone through the whole three-book series (which I highly recommend). For my 20th birthday, the one thing I asked for was the book “Physically Based Rendering”. I spent countless days reading it, and even more days implementing the cool things I read about. A few months later, the motivation died out and the project was laid to rest. But it didn’t lie dormant forever.
One of the things I wanted to implement next was support for a “scene description format” - a format that describes the objects/lights/materials within a scene. At the time, a scene had to be described entirely within MrRay's code, which meant that any change to the scene required recompiling the app. That can be quite a slow process, especially when building large environments for some cool renders - so removing the need for recompilation was ideal.
Hail Hydra 🙏
Early on in my VFX career I came across Hydra, a rendering framework created by Pixar to help bridge the gap between 3D applications and renderers. Before Hydra, each renderer would typically have its own scene description format used to describe all of the objects within a world. For each application like Blender/Maya/Katana that you wanted to support, you would have to write a translator to convert between that application's scene representation and the renderer's. Say I wanted to use my ray-tracer in Blender and Maya - that would mean developing two separate translators. With Hydra, on the other hand, I only need to write one. Hydra acts as an intermediary that digital content creation (DCC) applications pass information into through a "scene delegate", and that renderers read information out of by implementing a "render delegate". Since most of the common DCCs now implement a scene delegate, any render delegate can render scenes in those applications.
With all of this in mind, I skipped the step of creating my own scene description format and chose to implement a Hydra render delegate for my ray-tracer instead. This wasn’t a simple task - it took a couple of months to implement - but in the end I got there.
There are a few good open-source Hydra implementations that I drew inspiration from; the USD source code itself ships with two: `HdEmbree` and `HdStorm` (the `Hd` prefix referring to Hydra). The first couple of weeks were spent familiarizing myself with the structure and inner workings of Hydra, with my first goal being to produce pixels inside of `usdview` (a USD imaging tool). This by far took the most time, since I had to refactor a lot of MrRay’s core code so that a render could be started, and pixels grabbed, from within Hydra. Once I got pixels, however, the motivation picked up and I was able to get mesh and camera information from `usdview` across to MrRay rather quickly.
Throwing this all into a Blender 🍅
With the aptly named `HdMrRay` plugin built, it was then time to try producing images out of a DCC application. Blender felt like the most natural choice, since I started my 3D journey in Blender, and its Hydra support was added very recently (so recently, in fact, that I had to build Blender from source just to test it out). With a quick copy/paste of the `HdStorm` Blender addon, modified to point to `HdMrRay` instead, I had finally added my renderer to Blender. Not that it worked perfectly the first time - there were a few bugs in my Hydra implementation to iron out. Blender, at the time of developing, also didn’t support the `displayColor` primvar, which gives objects their color, so I had to patch that in as well.
Once I had the plugin in Blender, it was then time to import Pixar's Kitchen Set asset and test out a render! The image below shows what a render looks like at 150 samples per pixel. As you can probably see, the image is still quite noisy, and the render time of 13 minutes isn't absolutely mind-blowing - but nonetheless, I was still blown away by the capabilities of my little renderer.
Wrapping up 🌯
This was a fun little project that has transformed MrRay into something that can render scenes at a much larger scale. Being able to design something in Blender and then render it directly with my own tech is super neat; it definitely beats spending ages writing scenes in C++. There are some limitations with my current Hydra implementation, such as there being no shader support, but I am pretty proud of what I have so far and wanted to share it with the community.
Source code 👨‍💻
All of the code for MrRay is available over on my GitHub. Show your support by dropping a ⭐!
🔗 MrRay core + Hydra plugin
https://github.com/lijenicol/MrRay
🔗 MrRay Blender plugin
https://github.com/lijenicol/MrRayBlenderAddon
Referenced topics ✍️
🔗 More information on Hydra
http://tinyurl.com/mr3xj7x7
🔗 Kitchen set asset
http://tinyurl.com/bddwadkh
🔗 Ray Tracing in One Weekend
http://tinyurl.com/4mkcrpvv