Map to Engine Compatibility Problems
-
@redrum Cost benefit analysis is a bitch with so many uncertain variables.
But I will say this...
-
Map makers tend to be more adaptive to change than users (broad brushing for sure).
-
The negative side effects of change can be mitigated if there are the proper resources available for people to acclimatize themselves to new processes and procedures.
-
As has been demonstrated in the past... the forum and "monkey see, monkey do" have always been the benchmark for learning and adapting to new ideas and concepts. As long as there is a detailed, functioning example (much like POS2)... and people to ask on the forum/chat... obstacles can be overcome.
The key is really that it works right... all the time... before it is launched and changes the community. And more than 1 or 2 people should be fluent in the new methods and procedures.
-
-
@Cernel YAML has the benefit of being very readable and producing smaller files than XML.
A good example of this is the official YAML website: the site itself is valid YAML, and you can read its structure almost as normal text.
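For instance, a made-up map snippet in YAML reads almost like plain text (this is illustrative only, not an actual TripleA map format):

```yaml
# illustrative only -- not an actual TripleA map format
map:
  name: Example Map        # a key-value pair with a string value
  version: 2               # ...and one with a number value
  units:                   # a list of nested key-value sets
    - name: infantry
      attack: 1
    - name: tank
      attack: 3
```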
Probably the biggest difference from XML is that while XML has tags with attributes that can be nested in each other, YAML consists solely of lists of elements (where a list is itself an element and can therefore be nested) and sets of key-value pairs, where each value can be another key-value set, a list, a string, a number, or a boolean.
-
@roiex said in Map to Engine Compatibility Problems:
Supporting both options defeats the purpose of redesigning the system IMO, so that's not an option.
I would respectfully disagree on this. We can very easily partition the existing game parsing code and keep it working. Meanwhile, adjustments can be done behind version flags or in whole new forks.
@RoiEX @redrum I'm really trying to convey and convince you that mass updates are not feasible. The evidence is against them; they were painful. Another data point: Roger is still upgrading maps from 1.8 to 1.9!!! https://forums.triplea-game.org/topic/120/conversions-to-version-1-9
-
@hepps said in Map to Engine Compatibility Problems:
Map makers tend to be more adaptive to change than users (broad brushing for sure).
I've great respect for the level of expertise of map makers; frankly, map making requires way too much of it, and it should be much easier to churn out maps.
The subtle problem is that 95% of maps created are not active. We have 140 map repositories currently (a couple of those are admin/script repos, so around 137~138 maps!)
When a few maps want a new tag that breaks the DTD, it's really quite the design flaw that we have to coordinate an update of all 138 maps at the same time, get Roger involved, and interfere with people being able to upgrade the engine and start new games with updated maps.
-
@LaFayette If you believe legacy compatibility is a concern, I'd suggest going for an on-the-fly converter that would convert maps to a potential new format when needed.
This way we'd have the freedom to rework internal parsing code freely without having to be backwards compatible, but we could easily convert official maps to make our lives easier.
-
@LaFayette To clarify, a new format would have map variations in mind and clearly separate resources and logic to allow us to update maps individually.
-
@RoiEX an on-the-fly converter is intriguing for a few uses.
I think it makes sense to use that to generate new, updated versions of maps. But, on another level, if we have such a converter, why not just ship it with the game engine? Would it be feasible, for example, to create one so that 1.9 engines could read 1.8 maps again?
-
@cernel said in Map to Engine Compatibility Problems:
Would a yaml standard allow faster loading and lower RAM requirement of heavy xml?
It is a CPU requirement for heavy XML : )
That is a problem.
YAML could help, as it can be more expressive than XML (and easier to read too). If we do it right, we can drastically cut down on looping while reading files. Personally, I think YAML would dovetail with a possible future native save-game format, so one file could hold more information. I tried an experiment of converting a map to YAML; it does make it a bit cleaner, but does not radically cut down on complexity for a simple example: https://github.com/DanVanAtta/triplea/commit/5e1ae21c7e16ae952a9ad9cbb95ce75155c44937#diff-0dff504360f8239d76828118a02ebadd (that example is the Test.xml file converted to YAML; for comparison, the original XML version is: https://github.com/triplea-game/triplea/blob/master/game-core/src/test/resources/Test.xml)
-
@LaFayette I was talking about shipping a converter with the engine, but I wanted to point out the converting mechanism: the original map file is never read by the game engine, but is converted into an actual newer file format (with the old map file deleted, perhaps?).
About YAML saving: ideally we'd make use of YAML's built-in data structures, which will save a lot of space. That would mean, for example, no attachments, but instead properties that can be set explicitly or fall back to a default value, making a lot of details implicit rather than explicit.
Example:
units:
  - name: test
    water: true
etc.
-
@RoiEX an in-game converter, I agree, would help. I would look to the open-closed principle in designing it, and would likely merge the converter and parsing functionality. So instead of having one pass that is "convert to latest across N versions over all properties" and then a second that is "read latest map file", I would sub-divide map parsing by topic: when we read units, that module would be version-aware while parsing and would up-convert to the latest as necessary.
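A minimal sketch of what such a version-aware topic module might look like (the class name, attribute names, and the version cut-over are all invented for illustration; this is not actual TripleA code):

```java
import java.util.Map;

// Hypothetical sketch: a per-topic parser that knows about map versions and
// up-converts old data on the fly, rather than a separate bulk-conversion pass.
public class UnitsParser {
  /** Reads the 'is water unit' flag, accepting the old or new attribute name. */
  public static boolean isWater(Map<String, String> attributes, int mapVersion) {
    // pre-version-2 maps (hypothetically) used 'isSea'; newer maps use 'water'
    final String key = mapVersion < 2 ? "isSea" : "water";
    return Boolean.parseBoolean(attributes.getOrDefault(key, "false"));
  }

  public static void main(String[] args) {
    System.out.println(isWater(Map.of("isSea", "true"), 1)); // true (old format)
    System.out.println(isWater(Map.of("water", "true"), 2)); // true (new format)
  }
}
```

The point of the sketch is only the shape: each topic module owns its own up-conversion, so adding support for a new map version extends one module instead of forcing a bulk rewrite of every map.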
-
I really want to get back to the original topic, to clarify the problems we have.
Problem A) old engines with newer maps; old engine just fails when reading new map
Problem B) newer engines with older maps for save games; often the compatibility is eventually broken and save games are lost
- Save games live for months
- old lobby lives for weeks. This means when we retire old lobby, we just about always kill save games. To boot, players will create new games in 'old lobby', which makes the problem that much worse.
- map downloader does not distinguish between old engine and new, so when a new map is finally downloaded it can kill the save game. To make matters worse, players who know this will be unable to start new games on the latest map since they need to preserve an older version.
Problem C) On an incompatible release, map admins have to update all maps, regardless of the benefit or strict necessity; we have to do it just to keep the maps alive. Given we did at one point have a working parser, and not all the maps are even played (80/20 rule; most are inactive and just map-admin maintained), this does not seem valuable; we're wasting our time here.
Problem D) Map parsing is frozen in a catch-22 with a legacy design that does not scale well: all changes must be done in bulk, and complex changes can't be done in bulk, so we can't do complex changes.
Bulk updates so far have mostly not gone well, and we've mostly tried relatively straightforward renames. The way maps are organized, one per repo, is not intended for bulk updates.
Some suggested answers:
I really want to focus more on the problems presented; I don't think we even all agree they are problems, let alone major problems. So even if these suggested answers are not quite right, just saying "no" to change is also a non-solution; we've got some major problems!
1. Immutable Maps (write-once, read-only map files)
- This is basically the practice of creating new maps instead of updating them formalized.
- This lets players start new games but still have old maps to finish up saves
2. Externalize game engine min version to map metadata
- downloader will know which map versions to not download
- peer to peer game compatibility logic would be cleaner
3. Add a game engine max version, externalized to map metadata
- allows flexibility on upgrade; we don't have to get all maps upgraded right away, all at once, to do an engine upgrade.
- allows map versions to be sunsetted; presumably we would convert all maps, but at least we have the option to prune maps which we do not want to convert to a more recent spec.
4. In-game conversion logic to extend the compatibility window between major map XML versions for as long as possible.
- if we do this right, we could forever support all of the existing maps, and still move forward to a new map XML that is more powerful, has more features, and is simpler (easier to create, understand, modify, and maintain)
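The version gate behind suggestions 2 and 3 could be sketched roughly as follows (the field names and integer versioning scheme are assumptions for illustration, not an actual design):

```java
// Hypothetical sketch of the min/max engine-version check from suggestions 2+3.
// The versioning scheme here is an assumption, not the real TripleA metadata.
public class MapCompatibility {
  /** True when the engine version falls inside the map's declared range. */
  public static boolean isCompatible(int engineVersion, int minVersion, int maxVersion) {
    return engineVersion >= minVersion && engineVersion <= maxVersion;
  }

  public static void main(String[] args) {
    System.out.println(isCompatible(2, 2, 3)); // true: engine 2 can load a 2..3 map
    System.out.println(isCompatible(1, 2, 3)); // false: the downloader would skip this map
  }
}
```

With the range externalized to map metadata, both the downloader and peer-to-peer compatibility checks could run this test without parsing the map itself.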
-
@lafayette I'd be interested in how you map the problems you identified to each suggestion. Thoughts on suggestions:
-
1. This idea would have to be fleshed out a lot to see if it's viable. Map makers and players still want to see versions of the same map, especially when the map is newer, so it would mostly have to be under the covers IMO.
-
2. So there is already an engine version in the map XML. Its use could be expanded so that, say, the downloader reads it to avoid older engines downloading new maps. But you'd also have to ensure map makers are actually specifying the min version XML field properly.
-
3. So if we add back in shipping past jar versions of TripleA, then this could be useful; but without those, most players don't want to have multiple engine versions sitting around, so any maps not upgraded are effectively dead. Also, developers would most likely have to manage the max version, since map makers wouldn't know what to set it to.
-
4. Would need a lot more detail on how this would work to determine if it's feasible and worthwhile.
-
-
To clarify, I'm mostly worried about an automatic converter messing up very complex and big conditions / triggers XML code, and about whether all current triggers / conditions capabilities will still be available (like a trigger checking for a condition that checks for other conditions and other stuff).
Personally, the XML works just fine, so I don't feel a huge need to move to something else. But if you have, like, 100,000 or more lines of code, then it starts getting really heavy on the RAM (and on the CPU, mostly if you are doing recursive stuff, like having a lot of activations). So the only thing I can think of that I may want is something that allows enormous XMLs to run with lower RAM and CPU requirements, or to be read and loaded faster. Aside from this, I'd stick with what we have now, since it works fine enough, and I don't see compatibility as something to worry about that much.
I think it also depends a lot on the kind of followers a map has. If you break NWO savegames, I can see people getting angry, while I wouldn't expect reactions from TWW players.
-
And, anyways, breaking savegames right now is, I think, nothing to worry about at all, as long as you take care not to do it for a game in the middle of a ToC. The only current ladders are for games that you can easily finish.
It may become something to worry about only if you have any projects to (re)make a WaW ladder, or something like that.
-
To make a clear example, a question may be: would YAML allow "War of the Relics" to be loaded faster?
Would it also make it faster to play, or is that just dependent on the savegame, or other things, and so not impacted at all by moving from XML to YAML?
Would moving from XML to YAML make savegame sizes smaller too, and allow loading savegames faster, or is that unaffected?
-
The suggestions are not necessarily 1:1 or a complete solution to all the problems. It is an interesting question of how they map, here is that mapping:
1+3 (immutable maps + max version): helps B, solves C (D is no longer impossible to solve once C is solved)
2+3 (min+max version): solves A
1+4 (immutable+in-game conversion): can likely solve D
4 (in-game conversion): helps, if not solves, B+C (maps from a save game can be loaded with a newer engine, and if the engine can read older maps, then we don't have to do C, updating all maps)
-
@cernel said in Map to Engine Compatibility Problems:
I'm mostly worried about an automatic converter messing up very complex and big conditions / triggers xml codes, and about all current triggers / conditions capabilities still being available (like a trigger checking for a condition that checks for other conditions and other stuff).
In this suggestion, from me, the converter would not be XML -> XML. The map parser of today is effectively a converter from XML -> java objects (POJOs). The problem we have is that the java objects created are 100% specific to the XML; there is no leeway to remap them. To make it worse, we use those java objects everywhere in the system, so we're coupled to the XML text all throughout the game code. Worse still, we save those objects to disk on people's hard drives, where they go on to live for a very long time. It's an ugly problem, but we've actually done some pretty good work to unwire the POJOs from the rest of the game engine, so we basically have something like:
XML -> single java converter object -> POJOs
So an in-game converter would basically be: "If you see the word 'infantry', map it to 'army'".
XML -> adapter(java) -> single java converter object -> POJOs
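As a toy sketch of that adapter step (the class and the rename table are hypothetical, not actual TripleA code), the 'infantry' -> 'army' mapping could be as simple as a lookup applied before the converter object builds POJOs:

```java
import java.util.Map;

// Hypothetical sketch of an up-conversion adapter step, not actual TripleA code.
// It applies legacy-name renames (e.g. 'infantry' -> 'army') before the regular
// single java converter object turns the data into POJOs.
public class UnitNameAdapter {
  // made-up rename table for illustration
  private static final Map<String, String> RENAMES = Map.of("infantry", "army");

  /** Returns the current name for a possibly-legacy unit name. */
  public static String upConvert(String name) {
    return RENAMES.getOrDefault(name, name);
  }

  public static void main(String[] args) {
    System.out.println(upConvert("infantry")); // army
    System.out.println(upConvert("tank"));     // tank (unchanged)
  }
}
```

Because the adapter sits in front of the existing converter, the POJOs and the rest of the engine never see the legacy names at all.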
Personally, the xml works just fine, so I don't feel a huge need to move to something else
yes and no. There is a lot of unnecessary complexity in the spec. Note, for example, how certain relationships are redundantly defined. The XML spec file for how a game is set up should not be thousands of lines long, let alone 10k or 30k. All we are doing is specifying map and unit positions, and game rules. The fact that it can grow so large makes it inaccessible; we want map making to be really easy. I'm not necessarily saying we need to move to YAML, but the current spec is overly complex for what it is; it can be simplified.
but if you have, like, 100,000 or more lines of coding, then it starts getting really heavy on the RAM (and on the CPU mostly if you are doing recursive stuff, like having a lot of activations), so the only thing I can think of I may be wanting is something that allows enormous xml to run with less RAM and CPU requirements, or being read and loaded faster.
There is the efficiency question of parsing large XMLs, but fundamentally we should not have such large files. We've already lost when any map file is over 1000 lines long.
Aside from this, I'd stick with what we have now, since it works fine enough, and I don't see compatibility as something to worry about that much.
@Cernel is that only because we do not break compatibility very often? Do you think the map XML is approachable for new people? We ideally want new map makers to join the group; IMO it should take 30 minutes to an hour to create your first map, and maybe 10 minutes max to update an existing map.
I think it also depends a lot on the kind of followers a map has. If you break NWO savegames, I can see people getting angry, while I wouldn't expect reactions from TWW players, instead.
Hehe, when dealing with the engine, often the answer can be "all", so we'd be talking about all savegames for every player. It's a bit of a running myth/theme that we lose n% of all players on every non-compatible release. The splintering of the community and trashing of save games is very upsetting.
-
If you want to make a set of triggers that applies to all territories of a big game with many players, you can easily have that set alone be some tens of thousands of lines of XML code. I don't see anything bad about that, just that when you have too many such triggers you start hitting the RAM or getting slowed down.
The same goes for action code, like making a serious political-actions system in a game with many players and many relationships.
As long as nearly unlimited triggering will be still possible, and what we have can be converted, or whatever, that's fine.
Currently, the XML is simple enough if you stay basic, and everyone can pick their own level of complexity (including not using triggers at all), so, since I don't quite see where this discussion is going, I'd just stay with what we have now.
-
@redrum said in Map to Engine Compatibility Problems:
- This idea would have to be fleshed out a lot to see if it's viable. Map makers and players still want to see versions of the same map especially when the map is newer so it would mostly have to be under the covers IMO.
There is a question of how to link such maps. We already have a database of maps, it is a yaml file. To get to the immutable suggestion, every version would be a new entry. We'd convert the version and download link to a list. In this way we have an under-cover implementation for the most part. A good chunk of the benefit is from the assumptions we can make about any given download location. If we know a map version, and they are immutable, and we know the download link for a given version, we can always download that XML. Today with updates to maps, we lose that capability with the overwrite of new data, and we lose the information for the old location. I suspect there are more simplifications once things are immutable by convention. Basically, we would be optimizing for the 'copy/paste' and update strategy for maps : )
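Under this suggestion, an entry in that map database might grow a list of immutable versions, each with its own permanent download link (the structure, field names, and URLs below are invented for illustration, not the actual schema):

```yaml
# hypothetical immutable-map entry; field names and URLs are invented
- mapName: example_map
  versions:
    - version: 1
      url: https://example.com/maps/example_map/1/example_map.zip
    - version: 2
      url: https://example.com/maps/example_map/2/example_map.zip
```

Because each version's link never changes once published, the downloader can always fetch any listed version, which is the assumption the immutable-maps suggestion is built on.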
- So there is already an engine version in the map XML. Its use could be expanded so that say the downloader reads it to avoid older engines downloading new maps. But you'd also have to ensure map makers are actually specifying the min version XML field properly.
Indeed, but:
- usability problem as you mention
- introduces a bulk-update problem; we'd have to bulk update engine versions
- tracking engine versions in the map XML: I think the two concepts should not be fundamentally related at all. I'd like to see the game engine depend on map parsing, not the other way round in a circular dependency.
- we'd have to parse every map XML to show the 'download maps' window, which is likely to be too slow
- it's not known where a game XML will be, and there are multiple. We'd have to scan the entire file directory of a GitHub repo to find the XMLs, and we'll find more than one.
It would help if games and maps were 1:1; then we would know the map XML location and maybe could parse the actual file on the fly. That would be nice, as we could put other items like the download description and map title all in the same file.
So if we add back in including past jar versions of TripleA then this could be useful but without those most players don't want to have multiple engine versions sitting around so any maps not upgraded are effectively dead. Also developers would most likely have to manage the max version since map makers wouldn't know what to set it to.
Good point, I've considered the same. But:
- old jar was not a complete/good solution. It did not work for the 1.8 to 1.9 upgrade. We also had to do work to maintain it; it made path loading very wacky and was hard to keep working.
- non-upgraded maps are already effectively dead, it's something of an existing problem
- I suspect developers will need to manage even the min version. The max version likely would be easy; it'll match the next version that is not going to be compatible. We'd have to discuss what kind of mapping, if any, there is to the game engine version. So I'd envision this being set to 'version 2' everywhere; when we have a new incompatible engine, we would add a bunch of maps with min version '2' and max version '3'.
Would need a lot more detail on how this would work to determine if its feasible and worthwhile.
Should be feasible; we can introduce some logic to up-convert objects. I'd personally lean on the open-closed principle when designing it. 'Worthwhile' should be pretty obvious, I would hope. In my experience as a player for a decade plus, I've heard a number of endless tantrums from players before, during, and after upgrades. I would tend to agree from experience that destroying a save game is one of the worst things that can happen. If we can lessen the need for non-compatible releases, we get a pretty good win-win where save games are not trashed and developers can do their work with less effort, which means more work done per unit of effort.
Fixing save-game compatibility is, I think, the best thing we can do for this project. At this point, with the code and maps out of SVN and individual owners converted to teams, it is probably the highest-impact thing we can do now.