Analytic events to Snowflake

Are there any examples of using the Nakama events system to push analytic events to Snowflake? I know the events system would allow for such a thing, but I’d like to hear about anybody’s experience doing it.

Hi Josh - you can use Nakama’s event system to push events to any third-party service, including Snowflake, which is the most popular such service among our customers. Within Satori we also have first-party support for pushing events to Snowflake, as well as BigQuery and Redshift.
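
For reference, event handlers in Nakama’s Go runtime are registered inside `InitModule`. A minimal sketch (the forwarding destination is deliberately left as a placeholder):

```go
package main

import (
	"context"
	"database/sql"

	"github.com/heroiclabs/nakama-common/api"
	"github.com/heroiclabs/nakama-common/runtime"
)

// InitModule is the entry point Nakama calls when loading the Go plugin.
func InitModule(ctx context.Context, logger runtime.Logger, db *sql.DB, nk runtime.NakamaModule, initializer runtime.Initializer) error {
	// Events emitted from server code via nk.Event() are delivered to this handler.
	if err := initializer.RegisterEvent(func(ctx context.Context, logger runtime.Logger, evt *api.Event) {
		logger.Info("analytics event: name=%s properties=%v", evt.Name, evt.Properties)
		// Hand the event off to your own forwarding code here (HTTP call, queue, etc.).
	}); err != nil {
		return err
	}
	return nil
}
```

Custom events are then emitted from your own server code with something like `nk.Event(ctx, &api.Event{Name: "item_purchased", Properties: map[string]string{"item_id": "sword_01"}})`, and the handler above receives them.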

Do you happen to have any examples of that kind of implementation? Do the best implementations you have seen do something like write each event to a file locally and then periodically upload those files to a Snowflake stage?

That, or are you seeing good implementations where people relay events to their own Kafka cluster, and from there use the Snowflake Kafka connector to get events into tables?

We’ve seen some customers use the event system as a buffer and write to the Snowflake API directly. However, we don’t recommend this: the event system is memory-backed, has no retry behaviour, and offers only at-most-once delivery. It’s designed mostly for low-throughput event capture (eg for connecting different game features together, like an event bus). It can also be used if you have something local that can accept writes much faster (eg Satori) than a heavy/slow API like Snowflake.
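
To illustrate that buffering pattern, a non-blocking handoff from the event handler to a background worker might look like the sketch below. The channel size, batch size, and drop-on-full behaviour are illustrative choices that deliberately mirror the at-most-once semantics described above.

```go
package main

import (
	"context"

	"github.com/heroiclabs/nakama-common/api"
	"github.com/heroiclabs/nakama-common/runtime"
)

// Buffered channel that decouples the event handler from a slower downstream consumer.
var eventQueue = make(chan *api.Event, 4096)

// eventHandler never blocks: if the buffer is full the event is dropped,
// which mirrors the at-most-once delivery guarantee described above.
func eventHandler(ctx context.Context, logger runtime.Logger, evt *api.Event) {
	select {
	case eventQueue <- evt:
	default:
		logger.Warn("event buffer full, dropping event %q", evt.Name)
	}
}

// startForwarder drains the buffer in batches and hands them to a fast local sink.
// A real implementation would also flush on a timer, not only on batch size.
func startForwarder(logger runtime.Logger, forward func(batch []*api.Event) error) {
	go func() {
		batch := make([]*api.Event, 0, 100)
		for evt := range eventQueue {
			batch = append(batch, evt)
			if len(batch) == cap(batch) {
				if err := forward(batch); err != nil {
					logger.Error("forward failed, batch dropped: %v", err)
				}
				batch = batch[:0]
			}
		}
	}()
}
```

You’d register `eventHandler` with `initializer.RegisterEvent` and pass whatever fast local sink you have as the `forward` function.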

If you do have a lot of event data to ingest, either connect a fast queue like Kafka or RabbitMQ to Nakama (sorry, no code to share) or use Satori for event ingestion and connect that to Snowflake.
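
For anyone who wants a starting point on the Kafka route, here is a rough sketch of the forwarding side, assuming the segmentio/kafka-go client, a local broker, and a hypothetical `game-events` topic; the Snowflake Kafka connector would then be configured separately to sink that topic into a table.

```go
package main

import (
	"context"
	"encoding/json"
	"time"

	"github.com/heroiclabs/nakama-common/api"
	"github.com/heroiclabs/nakama-common/runtime"
	"github.com/segmentio/kafka-go"
)

// Writer pointed at an assumed local broker and a hypothetical "game-events" topic.
var writer = &kafka.Writer{
	Addr:     kafka.TCP("localhost:9092"),
	Topic:    "game-events",
	Balancer: &kafka.LeastBytes{},
}

// forwardToKafka encodes a Nakama event as JSON and publishes it keyed by event name.
// It matches the RegisterEvent handler signature, though in practice you would call it
// from a background worker (see the buffering sketch above) rather than inline.
func forwardToKafka(ctx context.Context, logger runtime.Logger, evt *api.Event) {
	ts := time.Now()
	if evt.Timestamp != nil {
		ts = evt.Timestamp.AsTime()
	}
	payload, err := json.Marshal(map[string]interface{}{
		"name":       evt.Name,
		"properties": evt.Properties,
		"timestamp":  ts,
	})
	if err != nil {
		logger.Error("failed to encode event: %v", err)
		return
	}
	if err := writer.WriteMessages(ctx, kafka.Message{
		Key:   []byte(evt.Name),
		Value: payload,
	}); err != nil {
		logger.Error("failed to publish event to Kafka: %v", err)
	}
}
```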