Baking That Pi

Posted by Richard Kain, 10 August 2017 · 3547 views

I went a little overboard and worked fairly late last night on my Raspberry Pi testbed. This time, my primary objective was testing standard controller input. I was using a wired Logitech F310 game controller in X-Input mode, with the analog mode toggle de-selected. With this plugged into the Raspberry Pi, I was able to get all of the standard face controls mapped and reporting properly on the Pi. I was not able to retrieve the data for the triggers, which surprised me slightly. The shoulder buttons worked fine, but the triggers weren't picking up. Sadly, there's not a lot of documentation on this use case online, so a lot of it was trial and error. The mapping for controllers is also not the same on Android as it is on Windows, so I had to create additional Input Axes specifically for the Android build.


After I got the controller working as desired, I moved on to a little basic rendering. I started off by dialing the resolution back down to 800x600, which introduces a few issues. The monitor I'm running this on is a Dell 24-inch PC monitor with a native resolution of 1920x1080 (full 1080p). But running the software on a Pi at that resolution hampers the framerate, dipping below 30 fps. For some types of games that might be acceptable, but I would rather sacrifice pixel density to bump the framerate up. There are a limited number of input resolutions that my monitor will accept, though, and around the scaling window I was looking for, 800x600 was the best fit.


For those of you familiar with this type of work, the next issue I had to tackle should be obvious. 800x600 is a 4:3 aspect ratio, while 1920x1080 is 16:9. So the 800x600 signal I was feeding into my monitor was getting stretched horizontally (and by a healthy margin, no less). For testing purposes, I took a standard 16x16 sprite from a sprite sheet I keep handy and scaled it up to fill most of the screen. When I exported the project, sure enough, the pixels (now quite obvious thanks to the scaling) were very clearly stretched horizontally. This would not do.
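The size of that distortion is easy to quantify: it's just the ratio between the two aspect ratios. A quick sketch of the math:

```python
# How much does a 4:3 signal get stretched when a 16:9 panel scales it
# to fill the whole screen? It's the ratio of the two aspect ratios.

def horizontal_stretch(src_w, src_h, dst_w, dst_h):
    """Factor by which each pixel widens when a src-aspect signal
    is scaled to fill a dst-aspect display."""
    return (dst_w / dst_h) / (src_w / src_h)

stretch = horizontal_stretch(800, 600, 1920, 1080)
print(round(stretch, 4))  # 1.3333
```

So every pixel ends up a full third wider than it should be, which is why the stretching was so obvious on a scaled-up 16x16 sprite.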


Fortunately, when it comes to aspect ratios and displays, this isn't my first rodeo. And it gave me an opportunity to test out another feature I had been meaning to try. I created a second camera in my scene, removed some of the unneeded components from it, and set its viewport to match a 1066x600 resolution. Then I shifted the sprite away from the initial camera and instead placed it in front of the newly created secondary camera. I created a RenderTexture in my Assets folder and set it to 1066x600 as well. I attached the RenderTexture to the secondary camera, so that it would render whatever it's pointing at into the RenderTexture.
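Where does 1066 come from? It's the width of a 16:9 frame at the output's 600-pixel height, which is what lets the monitor's stretch cancel out later. The arithmetic, sketched out:

```python
# The off-screen camera renders at the display's 16:9 aspect ratio,
# but at the 800x600 output's height: width = height * (16 / 9).

def widescreen_width(height, aspect_w=16, aspect_h=9):
    """Width of a 16:9 frame at the given pixel height (truncated)."""
    return int(height * aspect_w / aspect_h)

print(widescreen_width(600))  # 1066
```

(600 × 16/9 is actually 1066.67, so 1066 is the nearest whole pixel count — close enough that the error is invisible.)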


Now I had a solution for rendering a texture from my scene with a 16:9 aspect ratio and the correct vertical resolution. But I still needed to get this texture in front of the main camera and ensure that it would be stretched to the bounds of the camera. So I dropped a quad into my scene, placed right in front of the main camera. I applied a material to it using the RenderTexture, with an unlit shader. Then I dropped a little script on it that I cooked up. This script queried the orthographic size property of the main camera and used that data to proportionally scale the quad. If you set the math up right, it will ensure that the quad is always scaled up to the exact bounds of the screen, no matter what size or orientation the screen is.
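For anyone wanting to reproduce that script, here's a sketch of the math it has to perform (the function name is mine, not from my actual script). In Unity, an orthographic camera's `orthographicSize` is half the view height in world units, so a unit quad needs a scale of (2 × size × aspect, 2 × size) to exactly cover the view:

```python
# The scale a 1x1 quad needs in order to exactly fill an orthographic
# camera's view. orthographicSize is half the view HEIGHT in world units;
# the width follows from the screen's aspect ratio.

def quad_scale(orthographic_size, screen_w, screen_h):
    """(x_scale, y_scale) for a unit quad filling the camera bounds."""
    height = 2.0 * orthographic_size
    width = height * (screen_w / screen_h)
    return (width, height)

# Unity's default orthographic size of 5, on an 800x600 screen:
print(quad_scale(5.0, 800, 600))  # roughly (13.33, 10.0)
```

Because the scale is derived from the live camera and screen values, the quad tracks any resolution or orientation change automatically, which is the "no matter what size or orientation" part.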


And that did the trick. When you render a wider texture and squeeze it into a narrower space, you get "taller" pixels. When those taller pixels then get stretched back out to a wider aspect ratio, they appear perfectly square. This approach worked well and ran fine on the Pi. I was able to get those extra frames I was hoping for; the software was ticking along nicely at close to 60 fps, even though I was using a RenderTexture. And if I should need even more performance, I can continue using this same approach while decreasing the base resolution further, or just reducing the resolution of the RenderTexture (or adjusting how frequently the RenderTexture updates).
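You can sanity-check that the squeeze and the stretch really do cancel. A texture pixel is first narrowed by 800/1066 when the 16:9 texture is drawn into the 800x600 frame, then widened by 4/3 when the monitor stretches the 4:3 signal out to 16:9:

```python
# Net pixel aspect after the squeeze-then-stretch round trip.
squeeze = 800 / 1066          # 16:9 texture drawn into the 800x600 frame
stretch = (16 / 9) / (4 / 3)  # monitor stretching 4:3 signal out to 16:9

print(round(squeeze * stretch, 4))  # 1.0006
```

The result isn't exactly 1.0 only because 1066 is a rounded width; an error of less than a tenth of a percent is far below anything the eye can pick up.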


Also, a little tip I ran across. When building your project for Android TV, you're going to want to change the rendering path in the Android-specific player settings. By default, it's set to "Forward," but the Raspberry Pi's GPU does not seem to like that approach. Switch it over to "Deferred" and you should start seeing your colors render properly.
