Friday, June 30, 2017

Unity versus Unreal

This topic is as divisive as the US 2016 presidential election, so I'll tread carefully.

As a middleware maker, it makes no sense to play favorites. We do our best to keep our integrations of Voxel Farm on par so we reach as many users as possible. As an individual, I see no problem stating I prefer Unreal, but that is only because it is an all-C++ environment. It is not a rational preference.

This post, however, is not about how I feel. It is rather about the state of the two engines and how much they facilitate procedural generation and working with voxel data. I think many of the issues we have encountered over the past few years are common if you are doing a similar type of work with these engines. Hopefully, our story can help.

Let's start with the visuals. Both Unity and Unreal are capable of rendering beautiful scenes. Both can also render at very high frame rates, even for fairly complex content; this has likely been the lion's share of their R&D for years now. Unity has one crucial advantage over Unreal: it natively supports texture arrays. Unreal almost supports them; in fact, we managed to make them work in a custom branch of UE4 with little effort. However, this is not possible with the out-of-the-box Unreal distribution, and that is a dealbreaker if your middleware is to be used as a plugin, as is our case.

Texture Arrays in Unity allow precise filtering and high detail

Texture arrays make a big difference if you need complex materials where many different types of surfaces must be splatted in a single draw call. When an engine lacks texture array support, you must fall back to 2D atlasing. This raises a whole host of issues, like having to pick mip levels yourself and wasting precious memory on padding textures to avoid bleeding. When you hit this low point, you begin to seriously question your career choices.

If your application uses procedural generation, the contents of the scene are likely not known at design time. This is at odds with how these engines have evolved to work, and if your application allows users to change the world, it only gets worse. For the most part, both engines expect you to manage new chunks of content in the main thread, which, if left unattended, can cause severe spikes in your framerate.

There are multiple aspects involved in maintaining a dynamic world. First, you must update the meshes you use to render the world. This is fairly quick in both engines, but it does not come free. Then, you must create collision models for the new geometry. Here Unreal does better. Since you have closer access to the PhysX implementation, you can submit a much simpler version of the content. In Unity, you may be stuck using the same geometry for both rendering and collision. (EDIT: I was wrong about this, see the comments section.) From reading their latest update, I see this motivated the Card Life developers to ditch PhysX collisions altogether.

Card Life, made in Unity, features a hi-res voxel world

Voxel Farm allows players to cut arbitrary chunks of the world, which then become subject to physics. Unity was able to take fragments of any complexity and properly simulate physics for them. Unreal, on the other hand, would model each fragment as a cube. Apparently, PhysX is not able to compute convex hulls on its own, so for any object subject to physics, you must supply a simplified model. Unity appears to create these on the fly. For Unreal, we had to plug in a separate convex hull generation algorithm. Only then could we get the ball rolling, literally.

When it comes to AI and pathfinding, both engines appear to use Recast, a third-party navigation mesh library. Recast uses voxels under the hood (go voxels!) but this aspect is not exposed by its interface. For a voxel system like ours, it is a bit awkward to be submitting meshes to Recast, which are then voxelized again and ultimately contoured back into navigation meshes. But this is not bad, just messy. There is one key difference here between Unreal and Unity. Unreal will not let you change the scope of the nav-mesh solution at runtime. That means you cannot have the nav-mesh scope follow the player across a large open world. It is unfortunate, since this is a tiny correction if you can modify the source code; but again, for a plugin like Voxel Farm, that is not an option.

Dynamic nav-mesh in UE4

This brings me to the last issue in this post, which is the fact that Unreal is open source while Unity is closed. As a plugin developer, I find myself surprised to think a closed-source system may be more hospitable to plugin development. Here is my rationale: so far, the open-source model has been great for discovering why a given feature will not work in the official distribution; you can clearly see the brick wall you are about to hit. For application developers, open source works better because you can always fork the engine code and remove the brick wall. The problem is that this takes the pressure off, and the brick wall stays there longer. In Unity, both application and middleware developers must use the same version of the engine, which I believe creates an incentive for a more complete interface.

I'm sure there is more to add to this topic. There are some key aspects we still need to cover for both engines, like multiplayer. If you find any of our issues to be unjustified, I would love to be proven wrong, for the betterment of our little engine. Just let me know by dropping a comment.