Error when sending multiple requests to the server from multiple clients

I was testing my server by logging in and joining matches with 150 clients at the same time. My server is running on AWS ECS, with a Cloudflare proxy in front of it. Error on the console:

{
    "level": "error",
    "ts": "2022-07-05T08:05:30.125Z",
    "caller": "server/core_storage.go:429",
    "msg": "Could not read storage objects.",
    "error": "failed to connect to `host=localhost user=root database=nakama`: dial error (dial tcp 127.0.0.1:26257: socket: too many open files)"
}

After getting this error I can’t log in for about 5 minutes.
Is there any config I can set to prevent this from happening?

Hi, from the single line of your stacktrace it seems you are running the database on the same node as the Nakama server, which is not an ideal setup for scale testing. As for the error itself, you are probably not configuring the ulimit correctly, but since this is your custom infrastructure, to which we don’t have access, it’s difficult to give more precise guidance here.
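As a quick sketch of what checking and raising the limit looks like (the numbers here are illustrative, and on AWS ECS the limit is normally set per container via the task definition’s `ulimits` field rather than in a shell):

```shell
# Show the current soft limit on open file descriptors for this shell:
ulimit -n

# Raise the soft limit up to the hard limit for processes started
# from this shell (raising the hard limit itself requires root):
ulimit -Sn "$(ulimit -Hn)"

# Confirm the new limit:
ulimit -n
```

On ECS the equivalent is a `ulimits` entry with `"name": "nofile"` in the container definition, so the limit is applied consistently across container restarts.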

Thanks for pointing out the ulimit, that was helpful.
Question: what is the ideal setup for scale? One database node and multiple Nakama nodes?

Hello,
for scale testing a good start is to have separate instances for 1) Nakama (which is memory- and CPU-intensive) and 2) the database (which may be storage-IOPS-intensive).

You can have a look at the benchmarks we’ve done and the setup we’ve used here: Nakama: Benchmarks | Heroic Labs Documentation
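As a concrete sketch of that split: run the database on its own instance and point Nakama at it over the network via the `database.address` setting. The hostname `db.internal.example` below is a placeholder for your database instance’s address; this is a config fragment, not a definitive deployment recipe.

```shell
# On the Nakama instance, connect to a remote CockroachDB/Postgres
# instead of a co-located one on localhost:
nakama --database.address "root@db.internal.example:26257" \
       --name nakama-node-1
```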

Best,
Michal

I will follow these instructions.
Thanks, Michal

Could you please give more detail on this? How can I achieve it? I have created a load balancer with two instances as a backend, started the Nakama server on both VMs, and connected through the load balancer, but it gives me this error:

InvalidHttpResponseCodeException: 401 Unauthorized
Nakama.Ninja.WebSockets.WebSocketClientFactory.ThrowIfInvalidResponseCode (System.String responseHeader) (at :0)

What HTTP header rule (response/request header rule) should I use?

Where can I change this?

I’m not sure what service you are using to deploy,
but for changing the connection protocol it’s mostly on the load balancer listener and on the target group.
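For WebSocket traffic specifically, the load balancer has to pass the HTTP upgrade handshake through to Nakama. I don’t have access to your OCI setup, but as a generic illustration, this is what the equivalent rule looks like in an nginx reverse proxy (hostname and port are placeholders); the OCI listener/backend-set settings would need to allow the same headers through:

```nginx
# Forward WebSocket upgrade requests to the Nakama socket port.
location / {
    proxy_pass http://nakama-backend:7350;
    proxy_http_version 1.1;                  # HTTP/1.1 is required for the upgrade
    proxy_set_header Upgrade $http_upgrade;  # pass the client's Upgrade header
    proxy_set_header Connection "upgrade";   # keep the connection upgradable
    proxy_set_header Host $host;
}
```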

I’m using OCI (Oracle Cloud Infrastructure).