Game Engine Code Quality & KPI Tracking
KPI = key performance indicators
It's no secret the game engine has evolved into something of a chore to work with. "Spaghetti code", "jenga code", "big ball of mud", take your pick; TripleA is an old code base : )
To improve this, let's try to quantify and track. How can we report on code base metrics? I'm curious to hear thoughts on how we could do this well. IMO we could go as basic as someone running a report once every 2 months and posting the key numbers somewhere. The question is: how do we track progress?
Next question: what should we track right out of the gate?
@LaFayette I think easily testable code tends to be less spaghetti-like, simply because the tests are relatively new.
I actually quite like the sunburst visualisation codecov offers, which lets you see all the TripleA code and the associated test coverage.
Of course coverage isn't a good metric for that, more like a very rough estimate.
But I believe that if we try to write tests for classes with 0 coverage, that'll force us to do the necessary refactoring to make it much more readable.
@roiex I generally agree :)
My point is a bit different though: it's one thing to be able to visualize a certain metric at a point in time, it's another to maintain a data set over time so we know how we're trending.
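To make that concrete, here's a minimal sketch (in Python, with hypothetical metric and file names) of how a recurring report could append a dated row to a CSV, so we accumulate a trend instead of isolated snapshots. The actual values would come from whatever tools we settle on; the point is only the append-over-time shape of the data:

```python
import csv
from datetime import date
from pathlib import Path

# Hypothetical column names; real values would come from the coverage/complexity tools.
FIELDS = ["date", "total_loc", "branch_coverage", "max_cyclomatic_complexity"]

def append_snapshot(csv_path, metrics):
    """Append one dated row of metrics so the file accumulates a trend over time."""
    path = Path(csv_path)
    is_new = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), **metrics})

# Example: one bi-monthly snapshot (values are made up).
append_snapshot("kpi_history.csv", {
    "total_loc": 250_000,
    "branch_coverage": 0.23,
    "max_cyclomatic_complexity": 41,
})
```

A file like this could live in a repo or a gist, which would also answer part of the "where do we publish" question.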
One example of a KPI could be a snapshot image of the codecov sunburst visualisation. Some more examples to get the ideas flowing:
- total lines of code
- branch coverage
- average and max cyclomatic complexity
- longest method line count
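Of those, branch coverage already comes out of the coverage tooling, and static analysis tools such as Checkstyle or PMD can report cyclomatic complexity. Total lines of code is simple enough that the report script could compute it directly; a rough sketch (Python, hypothetical paths; it counts every non-blank line and doesn't try to exclude comments):

```python
from pathlib import Path

def total_loc(root, suffix=".java"):
    """Count non-blank lines across all source files under root.

    A deliberately crude stand-in for a dedicated tool like cloc:
    blank lines are skipped, but comment lines still count.
    """
    total = 0
    for path in Path(root).rglob(f"*{suffix}"):
        with path.open(encoding="utf-8", errors="ignore") as f:
            total += sum(1 for line in f if line.strip())
    return total
```

Usage would be something like `total_loc("game-core/src/main/java")`; the exact directory is just an illustration.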
Once we have a list of metrics and a way to collect them, there's also the question of where we'll publish these numbers and how often.