Converting my accelerometer capability test from UWP to Godot 3.
Header Image: Shutterstock Image (recoloured).
Having successfully ported my touch manipulation capability test from UWP to Godot 3 (the subject of my last post), it was time to prove the capability of another sensor in Godot: the accelerometer. I haven’t posted about this test before (even on my YouTube channel), so let me set the scene for you.
The game I’m developing will be controlled by tilting a smartphone or tablet in the real world. Even in their cheapest form, these devices tend to have an accelerometer sensor. Some expensive ones have additional sensors such as gyroscopes and magnetometers which can also be used to determine or refine how the device is tilted in space. Through prior research and testing I discovered I can get the degree of control I need for the game I’m building using the accelerometer alone. This was revealed initially by a capability test (the remake of which is the subject of this post), and confirmed later when I developed playable parts of the game on the game engine I was previously building myself (see my first post).
Imagine the screen you’re reading this on is a device with an accelerometer. The accelerometer defines the device's movement through real-world space using a 3D Cartesian coordinate system. It defines an x, y, and z axis through the screen as follows:
- the x axis passes across the face of the screen from left to right
- the y axis passes across the face of the screen from top to bottom
- the z axis passes through the screen from back (behind) to front
If you’re wondering, the origin of the axes (the point where all three axes intersect) is the top-left corner of the screen. However, that doesn’t matter when we are talking about readings from the accelerometer.
The accelerometer measures the acceleration of the device in real world space on these three axes at the time a reading is taken. Readings from the accelerometer are split into x, y, and z values. The accelerometer sensor is constantly measuring so it can provide values any time you take a reading.
The key thing to understand is that gravity is always affecting these readings. Unless you’re lucky enough to be holding this device in outer space or on another world, the force of Earth’s gravity is constantly accelerating the device towards the centre of the Earth. If the device is placed on a stationary table and you read the accelerometer, you’ll find an acceleration of 9.8 metres per second squared (m/s²) towards the centre of the Earth reflected in the results. That’s the acceleration gravity is imparting to the device, and the direction in which it acts.
This fact can be used to determine the orientation of the device in the real world, and therefore how much the device is tilting. This amount of tilt is what is used to control the game.
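As an aside, a gravity-dominated reading can be converted into tilt angles with basic trigonometry. A minimal GDScript sketch, where the function name and the axis/sign conventions are my own assumptions rather than the game's actual code:

```gdscript
# Turn a gravity-dominated accelerometer reading into tilt angles.
# Axis and sign conventions here are illustrative assumptions.
func tilt_from_gravity(accel):
    var roll = rad2deg(atan2(accel.x, accel.z))   # tilt about the y axis
    var pitch = rad2deg(atan2(accel.y, accel.z))  # tilt about the x axis
    return Vector2(roll, pitch)
```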
For the purpose of this capability test, the screen is a 2D surface on which I want to show the 3D reading from the accelerometer sensor. I chose to represent the x and y component of each reading as a line from the centre of the screen pointing in the same x and y direction, with a length reflecting the strength of the reading in that direction. I also chose to draw a circle around the screen centre to indicate where a reading of 9.8 m/s² falls, so that if the screen was perfectly stationary and upright (where z == 0) the line would terminate at the circle. This lets me demonstrate rotation around the device’s z axis.
For showing the z value of the reading I needed to be creative. I chose to represent the z component by changing the fill colour of the circle. When the z reading is zero, the circle is transparent. When the screen is facing the centre of the Earth, the circle is solid green. When the screen is facing away from the centre of the Earth, the circle is solid red. As the screen tilts towards and away from the centre of the Earth, the colour gradually changes between these three colour values.
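The drawing described above can be sketched in GDScript roughly like this. The node structure, the 100-pixel radius, and which z sign faces the Earth are my illustrations, not the test's actual code:

```gdscript
extends Node2D
# Sketch of the display: a circle whose fill shows z, and a line
# from the centre showing the x/y component of the reading.

const GRAVITY = 9.8  # one g, in m/s^2

var reading = Vector3.ZERO  # latest accelerometer reading

func _process(delta):
    update()  # request a redraw every frame (Godot 3 CanvasItem)

func _draw():
    var centre = get_viewport_rect().size / 2
    var radius = 100.0  # pixels at which a 9.8 m/s^2 reading terminates
    # Fill fades from transparent (z == 0) towards solid green or red;
    # which sign means "facing the Earth" is an assumption here.
    var t = clamp(abs(reading.z) / GRAVITY, 0.0, 1.0)
    var fill = Color(0, 1, 0, t) if reading.z > 0.0 else Color(1, 0, 0, t)
    draw_circle(centre, radius, fill)
    # Line showing the x/y component, scaled so one g reaches the circle.
    var tip = centre + Vector2(reading.x, reading.y) * (radius / GRAVITY)
    draw_line(centre, tip, Color.white, 2.0)
```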
The end result is that it becomes easy to see which direction the centre of the Earth is on the device as you tilt it through space. It’s quite difficult to explain in words, so let’s jump ahead and see a video of this in action after this capability test has been implemented.
In UWP the accelerometer sensor is exposed in the Windows.Devices.Sensors namespace. You define the interval at which readings will be taken, and an event handler that will receive each reading. In the previous implementation of this capability test I wrapped this sensor object in a service that stored the values as readings came in, and the application simply read from this service when it was ready to take a reading.
In Godot 3, the accelerometer sensor is exposed as a function, get_accelerometer(), on the Input singleton, which returns a Vector3 containing the x, y, and z values. At the time I was coding, the Godot help system described the functionality of this method as:
If the device has an accelerometer, this will return the movement.
I knew my Surface Pro (where I do all my coding) had an accelerometer from when I wrote my UWP version of the capability test. When I read the sensor in Godot however, I was only getting zeros. Confused and unable to figure it out myself, I turned to Godot Discord for help.
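For reference, reading the sensor in GDScript is as simple as this minimal sketch, which is how I first saw the zeroed readings:

```gdscript
extends Node

func _process(delta):
    # Vector3 of acceleration in m/s^2 on the x, y, and z axes.
    # When run from the Godot 3 editor this is always (0, 0, 0).
    var accel = Input.get_accelerometer()
    print(accel)
```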
Some friends there did some digging and found that the get_accelerometer() method only returns values when you export your code and run it on a device for which an implementation has been written. It’s fortunate that both UWP and Android (my primary targets) have this implementation, but being forced to export the code to test it every time is far from ideal. Unless I came up with a plan, I wouldn’t have a tight development loop and progress on my entire game would be painstakingly slow.
Making it better
The documentation totally misled me here, but there are a couple of things I can do to make it better. The first is to submit a pull request on the documentation to write clearer text and prevent confusion. The second is to request that the Godot development team (or a community contributor) implement accelerometer support within the editor (where the host device supports it). I plan to take action on both.
I wanted to simulate moving the device in real space while in the editor, but I knew get_accelerometer() would only return zeroes when running there. However, I could make an accelerometer sensor node (a facade) that exposed readings from get_accelerometer() when it was returning values, and create touch controls to manually update the values when get_accelerometer() wasn’t returning values. Then, as long as I only consumed the facade, my application would work in both scenarios.
I designed a touchscreen vertical slider for the left edge of the screen whose value controlled the x and y readings on the sensor facade. Dragging up and down the slider would change its value between 0 and 360, which I would transform into a 2D direction in x and y using sine and cosine trigonometric functions. I allowed the slider to cycle through this range several times down the screen, and put a marker at every zero point.
Similarly, I designed a touchscreen horizontal slider for the bottom edge of the screen that changes its value between -180 and 180, and used it to control the z reading on the sensor facade.
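The slider-to-reading transform can be sketched like this. The function name and the exact mapping of the z slider are my assumptions; the sine/cosine mapping of the vertical slider is as described above:

```gdscript
# Turn the two slider values (in degrees) into a simulated reading.
const GRAVITY = 9.8  # one g, in m/s^2

func reading_from_sliders(xy_degrees, z_degrees):
    # Vertical slider: 0..360 degrees becomes an x/y direction at one g.
    var angle = deg2rad(xy_degrees)
    var x = cos(angle) * GRAVITY
    var y = sin(angle) * GRAVITY
    # Horizontal slider: -180..180 degrees becomes a z component,
    # reaching plus/minus one g at the extremes (my chosen mapping).
    var z = sin(deg2rad(z_degrees / 2.0)) * GRAVITY
    return Vector3(x, y, z)
```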
Using what I had learned about touch input from my manipulations test, I quickly knocked out the TouchSliderHorizontal nodes. I created an AccelerometerSensor node (the facade) that would read from get_accelerometer() a few times when it started up; if it didn’t get a non-zero reading, it would determine that the accelerometer sensor wasn’t present. It set a boolean flag to indicate whether the sensor was available, and stopped reading the sensor if it determined it wasn’t there. This flag was also used to disable the touch sliders when the sensor was available (so the sliders wouldn’t appear if they weren’t required).
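A sketch of that facade in GDScript. The member names, the probe length, and the slider entry point are my illustrations of the approach, not the test's actual code:

```gdscript
extends Node
# AccelerometerSensor facade: real readings when the hardware exists,
# slider-driven readings otherwise.

var sensor_available = false
var reading = Vector3.ZERO
var _samples_left = 30  # startup probe: check readings for a few frames

func _process(delta):
    if _samples_left > 0:
        _samples_left -= 1
        if Input.get_accelerometer() != Vector3.ZERO:
            sensor_available = true
    if sensor_available:
        reading = Input.get_accelerometer()

# Called by the touch sliders when no real sensor is present.
func set_simulated_reading(value):
    if not sensor_available:
        reading = value
```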
Now it was time to put all this into a scene. I built a SensorDisplay node to draw the line and the circle, using the readings from the AccelerometerSensor facade. A script on the root node connected up all the necessary signals to marshal readings from the touch sliders to the facade, and to disable them if the sensor was present. I even threw in a DebugLabel to show the state of the sensor facade.
Here’s a video of it all hooked up and running from the Godot editor. You can see the touch sliders in action, how they modify the readings of the accelerometer facade, and thus drive the display.
I know from experience that raw readings from an accelerometer sensor are noisy and not suitable for use as input in most cases. The way around this is to filter the readings to smooth out the values.
There are many strategies that can be employed to smooth the data, and choosing the one that works for you takes some experimentation. From previous efforts I’ve found a simple low-pass filter is adequate for my needs. Such a filter combines the previous filtered value with the current reading using some simple mathematics to create a new value. This is remarkably easy to do, and given a good damping coefficient (again found through trial and error) the readings become smoother and more useful. For a single axis, the filter boils down to something like:
var x = 0.0

func filter_reading(reading_x):
    # 0.1 is the damping coefficient, found through trial and error.
    x = lerp(x, reading_x, 0.1)
    return x
GDScript vs C#
It’s worth comparing the implementation of this filter with the implementation from my previous capability test in C# on UWP. In the previous test I separated the accelerometer facade and the smoothing function, then replicated values in a view model and synchronised them with the sensor’s own ReadingChanged event and the INotifyPropertyChanged interface. The implementation was spread across multiple files, and while technically and architecturally acceptable, the simplicity of the GDScript implementation gives me comfort and confidence in moving ahead on the Godot platform.
You’ve already seen the end result on Android in the video earlier in this post. While exporting to my phone (another device) may sound more complex, it’s actually the easier export path. Once set up, the Android export is as simple as clicking a button in Godot and it’s all done for you.
Exporting to UWP is more difficult, as Godot doesn’t handle bumping the version number or fully signing the exported file out of the box. It does sign with a developer key automatically on export, which is something; however, to run the result on Windows you also need to sign the resulting file with a certificate to create a trusted executable. I currently do the non-Godot steps on the command line, and I plan to take some time to automate this with a script in the future.
Despite the clunky export process, once you have the executable it works just as well as the Android version.
Although it wasn’t as issue-free a journey as I was expecting, the result is pretty satisfying. I was again impressed with the ease with which the Godot solution came together, and hopefully accelerometer support when running from the editor can be added to the platform in the future to make the developer experience even better.
Coding this capability test in Godot was all too easy, and the challenges I encountered were relatively simple to resolve. I was helped greatly by my experience of having built this test before; even so, it’s encouraging to know that if you know what you want to build, then building it in Godot is a pleasurable and productive experience.