The end goal would be for this structure to eventually become, for example, an array of ints.
Why an array of ints?
With an int taking 32 bits, we figured out we can represent cards easily by imposing some gameplay limits that are currently set in stone, for example:
- Max cards in the game: 255 → so 8 bits
- Max card level: 12
- and so on…
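Just to illustrate the idea, here is a minimal packing/unpacking sketch using plain multiplication and modulo (so it works on any Lua runtime); the field widths, 8 bits for the card id and 4 bits for the level, are assumptions for the example, not our final layout:

-- Illustrative only; the real layout and limits may differ.
local CARD_ID_BITS = 8     -- max 255 cards in the game
local CARD_LEVEL_BITS = 4  -- enough for a max level of 12 (assumed width)

local function pack_card(card_id, card_level)
  -- card_id occupies the lowest 8 bits, card_level the next 4 bits
  return card_id + card_level * 2 ^ CARD_ID_BITS
end

local function unpack_card(packed)
  local card_id = packed % 2 ^ CARD_ID_BITS
  local card_level = math.floor(packed / 2 ^ CARD_ID_BITS) % 2 ^ CARD_LEVEL_BITS
  return card_id, card_level
end

-- pack_card(42, 3) == 810; unpack_card(810) --> 42, 3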
In our game a player can draw multiple cards during one turn, and at first we were sending an RPC for every single card draw, which was a bit overkill. So instead, where we could, we converted the data being sent to array-based sending, for example:
On the client side:
We have a method down the line that looks something like this:
_RPC_On_Cards_Drawn(params object[] cardsData)
On the server side:

local cards_packed = {}
for i = 1, #cards_drawn do
  if cards_drawn[i] ~= nil then
    local packed_num_str = game_helpers.get_binary_card_representation_str(cards_drawn[i])
    if packed_num_str ~= "" then
      local raw_num = utils_bin.bin2dec(packed_num_str)
      if raw_num ~= -1 then
        table.insert(cards_packed, raw_num)
      end
    end
  end
end
-- param and message_key are set up earlier (not shown in this snippet)
param[message_key] = cards_packed
local message = nk.json_encode(param)
dispatcher.broadcast_message(opcode_handler.opcode_name_to_id["_RPC_On_Cards_Drawn"], message, nil)
So the message that gets sent is pretty much just an array of ints.
In your example we currently had that case with opcodes being called; that is why for every opcode we have a packet structure/definition describing how to read it, and the incoming data needs to map onto it, otherwise it gets discarded.
But until now we did not have one for the player data structure.
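For context, that per-opcode mapping can be as simple as a table of expected fields plus a check before handling; a rough sketch (the opcode names and fields here are made up for illustration, not our real definitions):

-- Rough sketch; opcode names and fields are illustrative only.
local packet_definitions = {
  _RPC_On_Cards_Drawn = { "cards" },                      -- array of packed card ints
  _RPC_On_Turn_Started = { "turn_number", "player_id" },  -- hypothetical opcode
}

-- A decoded message is only handled if it maps onto its definition,
-- otherwise it gets discarded.
local function matches_definition(opcode_name, decoded)
  local fields = packet_definitions[opcode_name]
  if fields == nil then return false end
  for _, field in ipairs(fields) do
    if decoded[field] == nil then return false end
  end
  return true
end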
For now we have come up with two possible solutions, but we need a bit more info before deciding which one to proceed with.
- Introduce a new (game_client) version column to the storage system tables. This would only be so we could better filter/access the collection state/structure before parsing it completely.
- Add a "version" key to each value inside a collection, indicating which structure it uses, and let the data be controlled and migrated properly through JSON Schemas.
The idea is to:
- Define schemas for each collection and version.
- In, let's say, an Upgrade Manager, define a list of all possible user collection names, e.g. ["cards", "rewards", "quests", "deck"].
- In a before hook, check which version each user collection has and, if it needs updating, start the upgrade process. Alternatively, this step could also happen after authentication and before our internal loading process and data fetching.
- When the Upgrade Manager receives an outdated data structure, it looks up its version, checks how far it needs to go (i.e. how many updates it would need to apply) and then looks for migration schemas or functions. Not sure if the Nakama runtime allows dynamic function calling through string concatenation; need to test (see the sketch right below).
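For what it's worth, Lua can sidestep string-concatenation dispatch entirely by keeping the per-version steps in a plain table and looking them up by collection and source version. A minimal sketch, assuming the upgrade_manager / migrate names used in the loop further down:

-- Minimal sketch; module and function names are assumptions, not real code.
local upgrade_manager = {}

-- migrations[collection][from_version] upgrades that collection by one version.
upgrade_manager.migrations = {
  cards = {
    [1] = function(data) --[[ v1 -> v2 transform ]] data.version = 2; return data end,
    [2] = function(data) --[[ v2 -> v3 transform ]] data.version = 3; return data end,
    -- ...and so on up to the current required version
  },
}

function upgrade_manager.migrate(from_version, next_version, collection, data)
  local steps = upgrade_manager.migrations[collection]
  local step = steps and steps[from_version]
  if step == nil then
    error(("no migration for '%s' from v%d to v%d"):format(collection, from_version, next_version))
  end
  return step(data)
end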
An example might look something like this:
Outdated: {"version":"1", "cards":[{"cardId":"Rocket", "cardLevel":1}, {"cardId":"ReRoll", "cardLevel":2}]}
Current Server/Client required version: 5
It would then run through a for loop going from 1 to 5 and look for migration functions or scripts.
local old_data = current_data
local collection = "cards"
local from_version = start_version

for next_version = from_version + 1, required_version do
  old_data = upgrade_manager.migrate(from_version, next_version, collection, old_data)
  from_version = next_version
end

-- Save the upgraded data back to storage
And the migrate function itself could, based on the collection and the from/next versions, know what is required. Thus:
version 1 => [cardLevel, cardExp]
version 2 => the transform function would:
→ require that the version 1 data has the structure defined by the version 1 schema
→ do the upgrade steps manually, by reading the data that is currently there and applying the code
→ so it then becomes very easy to know how to migrate it and whether the values are correct.
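As a purely illustrative sketch of one such step for the "cards" collection, assuming the shape from the outdated example above (the cardExp default is a made-up example of a version 2 change, not our real schema):

-- Purely illustrative v1 -> v2 step for the "cards" collection; the actual
-- transform depends on what changed between the two schemas.
local function migrate_cards_v1_to_v2(data)
  -- Require that the incoming data matches the version 1 schema.
  assert(tonumber(data.version) == 1, "expected cards data at version 1")
  assert(type(data.cards) == "table", "expected a cards array")

  for _, card in ipairs(data.cards) do
    assert(type(card.cardId) == "string", "cardId must be a string")
    assert(type(card.cardLevel) == "number", "cardLevel must be a number")
    -- Apply whatever changed between version 1 and version 2 here,
    -- e.g. filling in a field that version 2 expects (assumed example):
    card.cardExp = card.cardExp or 0
  end

  -- Stamp the new version so the next step (or the save) knows where it is.
  data.version = 2
  return data
end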