June 6, 2021

Writing Bevy Retro's Renderer

The renderer for Bevy Retro is the first renderer we've ever written and it was a great learning experience. We went through several different iterations of the rendering strategy, everything from CPU rendering to a custom WebGL 1 compatible Luminance backend. In this post I'm going to walk through what we tried and where we ended up.

Why Make a New Renderer?

It's important to answer the question of why we made our own Renderer. After all, not only does Bevy already come with a renderer, but it also has a community rendering plugin, bevy_webgl2, that allows running Bevy games in the browser. What motivated us to write another one?

Pixel Perfectness

The first reason that we wrote our own renderer was because we wanted to write pixel-perfect, 2D games, and we wanted it to be as easy as possible. In out-of-the-box Bevy, when creating sprites, everything uses floating point positions in 3D space. This isn't bad most of the time, but we wanted to make sure that when displaying the game, we never had an instance where the pixels of a sprite were not perfectly aligned with all the others.

We also didn't really need rotations or scale ( though we might add those later ), and the Z position didn't have to represent a 3D position, it just had to represent a 2D layer number.

In order to make this easier to represent in code, we decided to replace Bevy's Transform struct with our own Position struct, which used integer coordinates that referred to full pixels. This makes it impossible to say "this sprite is half-way lined up with that sprite", making it harder to break the authentic retro look.

While it is definitely possible to accomplish a pixel-perfect look in Bevy's out-of-the-box renderer, I had also run into some graphics difficulties when implementing the map renderer that made me feel like there might be a more consistent way to do pixel-perfect rendering.

Simplicity

Along with replacing Bevy's Transform with our own Position struct, we also replaced Bevy's components for sprites, textures, the camera, etc. with the goal of making things simpler.

In out-of-the-box Bevy, when creating a sprite you need to both load the sprite Texture and create a ColorMaterial asset for it. While that makes sense in the context of Bevy's more general-purpose renderer, we get a small but helpful simplification in Bevy Retro by not requiring the ColorMaterial:

Bevy Retro:

fn setup(
    mut commands: Commands,
    asset_server: Res<AssetServer>,
) {
    commands.spawn_bundle(SpriteBundle {
        image: asset_server.load("my_sprite.png"),
        ..Default::default()
    });
}

Normal Bevy:

fn setup(
    mut commands: Commands,
    asset_server: Res<AssetServer>,
    mut materials: ResMut<Assets<ColorMaterial>>,
) {
    let texture_handle = asset_server.load("branding/icon.png");
    commands.spawn_bundle(SpriteBundle {
        material: materials.add(texture_handle.into()),
        ..Default::default()
    });
}

By implementing our own rendering types specific to 2D rendering we can give the developer less to think about and make the code more readable.

Supporting Lower-end Hardware

Another bonus of writing our own renderer is that we could make our games run on older computers. Having grown up with a handful of old computers, and definitely nothing that had Vulkan support, it felt good to be minimal about the graphics requirements.

Especially when we were building a game with simple pixel art anyway, it was nice to make the renderer as portable as possible and know that it could probably run fine even on a Raspberry Pi.

Ease Of Implementation

Writing the renderer also seemed feasible enough to accomplish. Building a simple 2D renderer isn't super difficult. The initial prototype took us 4 days to finish, proving that we could wrangle enough rendering to at least get something on the screen for both desktop and browser. After that, it took a couple more weeks before we finally settled on a strategy that worked and was performant enough, but we felt that the time spent on it was worth the value that it provided.

iOS Safari Support

Finally, the last reason to use our own renderer we hadn't even realized until after we started: iOS web support. Unfortunately, the web browser on iOS lags well behind other browsers in implementing new web standards. While practically all of the other browsers have supported WebGL 2 for a while now, Safari on iOS still only supports WebGL 1. This means that if we wanted our web builds of the game to work on iPhones, we would have to make our renderer work with WebGL 1, and even bevy_webgl2 wasn't going to do that for us.

Writing the Renderer

Because we wanted to target the web along with both new and old desktop computers using the same renderer, we decided to use OpenGL/WebGL for rendering.

Our first attempt was a combination of CPU and OpenGL rendering.

CPU Rendering

Since I had never used OpenGL for an actual project before, my initial implementation attempt was to do all of the scene compositing on the CPU: using the Rust image crate, I would stack all of the sprites on top of each-other every frame, then upload the result to the GPU as a texture and render it on a single quad that filled the screen.

This technique was super easy to get working. I hardly had to get past the "rendering a triangle" section of the OpenGL tutorial to figure out how to render it, and, by using the glow library I could target web and desktop! It was so easy!

Unfortunately, it was also really slow.

Creating a new image, stacking a bunch of sprites on top of each-other using the CPU, and uploading that image to the GPU over and over every frame is not very efficient. It worked fine for a few sprites on a powerful computer, but when running in the browser it was too slow to be usable.

That meant I had to learn how to write a decent GPU renderer.

Rendering With OpenGL

So I went back to LearnOpenGL.com and started reading again and did a lot of looking around. I learned how you put multiple sprites on the screen using the GPU and even how to efficiently batch sprite rendering and optimize it using atlases and texture arrays. I thought I was all set, I just had to write it.

Then after trying to write it for a while I decided that using the raw OpenGL API was really annoying and error prone. I thought that after I understood the concepts and how everything worked it would be simple enough, but I was wrong. Even though I understood it well enough, it was just super easy to accidentally mess one little thing up, and if you did, you were likely to get a blank screen with no info about what went wrong.

After struggling just to get the most basic renderer working I decided I was going to have to create my own wrapper around the OpenGL API just to make it usable.

Then I remembered Luminance.

Rendering With Luminance

Luminance is a Rust ( previously Haskell ) project that presents a much nicer graphics API built on other graphics APIs, such as OpenGL or WebGL, depending on which backend you select. It uses Rust's strong type system to make it much harder to mess up your rendering than it is with OpenGL. It helps force you into patterns of use where you don't have to manually remember to do all of the different steps at exactly the right time. And you can use it without needing any unsafe Rust like you have to when using raw OpenGL.

After trying out Luminance, I really liked it. It didn't make rendering any harder than it had to be, it just made sense.

So I started trying to integrate it into Bevy. I had to create a custom windowing integration for Luminance so that it could integrate with the window created by Bevy, and made the luminance_surfman crate to make that work. That was, thankfully, really easy.

While I never got to batched sprite rendering, which would still be a welcome performance improvement, I was able to get the Luminance renderer working great, and even without batched rendering we can still get up to 6,000 sprites flying around at 60 frames-per-second in the browser, and more than double that on desktop.

That was good enough for our games for now and we could always improve the performance more later.

Custom Luminance Backend

Finally our rendering was working on desktop and web and without performance issues! For desktop we were using the OpenGL backend for Luminance and on web we were using the WebGL 2 backend. It was working great!

That was when we realized that iOS didn't support WebGL 2.

We really didn't want to have to tell people that you couldn't play our web game on iPhones without having to manually install a different browser. It just felt so cheap to have to say that. Yes we had good reasons and it was pretty much Apple's fault that it was so hard to do, but still, that wasn't how users were going to see it.

We decided that if we really couldn't make web support work on iPhones, that would be fine, but we were going to give it a try and attempt to make our renderer work using WebGL 1. The catch was that Luminance didn't support WebGL 1.

The cool thing about Luminance is that you can make your own backends for it. If Luminance didn't support WebGL 1, we could make our own backend for it. We just weren't sure how much work that would be.

After looking around a bit I realized that the glow library actually supported WebGL 1. Glow was the library I had been using earlier when trying to use raw OpenGL, and the neat part about it was that it gives you one API that works for OpenGL, OpenGL ES, WebGL 1, and WebGL 2. That meant that, in theory, we could make our own Luminance backend using glow, support all of those APIs at once, and be able to run on iOS Safari.

Because I had hardly even used Luminance, let alone made a backend for it before, I took the existing Luminance WebGL 2 backend and forked it. I went through and converted all of the WebGL 2 calls into glow calls, fixed some issues due to API differences, opened a pull request for glow to add some stuff that we needed, and Voilà! After some trial-and-error, it worked!

We had to rewrite our shaders in GLSL ES 1.0, one of the least-featured shading languages out there, which was annoying, but it was working! After 4 different attempts at implementing a renderer, it was doing what we needed it to.

Summary

We're very happy with the results of our renderer so far. There are likely improvements still to be made and more to learn, but we think it's a good base to build on.