StackExchange.Redis.RedisTimeoutException from VariantService and ProductService

I get these errors from Redis while running the Elasticsearch indexing. Any idea what it can be? Thank you!
Full message:
StackExchange.Redis.RedisTimeoutException
HResult=0x80131505
Message=Timeout performing EVAL (5000ms), next: EVAL, inst: 2, qu: 0, qs: 2, aw: False, rs: CompletePendingMessage, ws: Idle, in: 0, in-pipe: 231, out-pipe: 928, serverEndpoint: 127.0.0.1:6379, mgr: 7 of 10 available, clientName: CJ-LM-W3SVC-3-ROOT, IOCP: (Busy=8,Free=992,Min=8,Max=1000), WORKER: (Busy=18,Free=32749,Min=8,Max=32767), v: 2.0.601.3402 (Please take a look at this article for some common client-side issues that can cause timeouts: https://stackexchange.github.io/StackExchange.Redis/Timeouts)
Source=
StackTrace:


Litium version: 7.4
Package versions:

It’s hard to pinpoint the root cause of the timeout, but one thing that is certain is that the application cannot reach the Redis server and execute the command within the configured time.
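As a side note (this is my assumption about your setup, not anything Litium-specific): the 5000 ms in the message is StackExchange.Redis’s `syncTimeout` option. If you can reach the Redis connection string in your configuration, raising it can buy some headroom while you investigate, for example:

```
127.0.0.1:6379,syncTimeout=10000,connectTimeout=5000,abortConnect=false
```

`syncTimeout`, `connectTimeout` and `abortConnect` are standard StackExchange.Redis configuration-string options; where that string lives depends on how your application is configured.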

Can you verify that the Redis server is up and running and that you can connect with a Redis client such as redis-cli or FastoRedis?

Do you have steps to reproduce the error?

Thanks for the quick reply, Patric. Yes, Redis is running; I can verify it with both Docker and Redily.
If I turn off Redis, I get a connection error at build time.

It’s hard to pinpoint, but I see this error thrown in the logs from time to time.
I get it when I do a full rebuild of the Elasticsearch product index.

Can this be related to https://docs.litium.com/support/bugs/bug_details?id=49447?
According to https://stackexchange.github.io/StackExchange.Redis/Timeouts, thread problems can cause the timeout error.
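For illustration, the thread-pool starvation effect that article describes can be sketched like this (a Python stand-in for the .NET thread pool, not Litium code): when blocking work occupies every pool thread, even a trivial task sits in the queue long enough to blow a timeout budget.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# A pool with only 2 workers stands in for a starved thread pool
# (compare WORKER Busy=18 vs Min=8 in the error text).
pool = ThreadPoolExecutor(max_workers=2)

def blocking_task():
    time.sleep(0.5)  # simulates sync-over-async or other blocking work

def quick_task():
    return "pong"

for _ in range(2):
    pool.submit(blocking_task)  # fill every worker with blocking work

start = time.monotonic()
result = pool.submit(quick_task).result()  # queued behind the blockers
elapsed = time.monotonic() - start
pool.shutdown(wait=True)

print(f"{result!r} completed after {elapsed:.2f}s")
```

The quick task itself costs almost nothing, yet it completes only after a blocker releases a worker; scale that queue delay up and a 5000 ms Redis deadline is easy to miss.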

Sounds strange to get a Redis connection error at build time… building the solution should not connect to Redis at all. Or do you mean the startup directly after the solution is built?

How is the performance on your computer? The Elasticsearch rebuild will use CPU both to maintain the search index and to populate the data, and heavy CPU usage on the machine will affect both Redis and Elasticsearch performance. Do you get the same error if you connect to remote instances of Redis and Elasticsearch?

The sync-over-async issue that was found has not caused this error log, but it does cause heavy creation of new threads in the application. In the customer application where it was found, the thread count grew to more than 30 000 over the time frame of a nightly import job that changed around 20 000 variants. In the end, the application deadlocked itself and stopped responding to HTTP requests.
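The deadlock pattern described above can be sketched in Python for illustration (the customer code was .NET; this is just the analogous sync-over-async mistake): a thread blocks synchronously waiting for async work that can only run on that same thread, so the work never completes. Here the hang is cut short with a timeout so the script terminates.

```python
import asyncio
import concurrent.futures

async def inner():
    return 42

async def outer(loop):
    # Sync-over-async: schedule inner() on this loop, then block this
    # very loop thread waiting for its result. inner() can never run,
    # because the thread that would run it is stuck in fut.result().
    fut = asyncio.run_coroutine_threadsafe(inner(), loop)
    try:
        return fut.result(timeout=0.5)  # deadlock, cut short by the timeout
    except concurrent.futures.TimeoutError:
        return "deadlocked"

loop = asyncio.new_event_loop()
result = loop.run_until_complete(outer(loop))
loop.close()
print(result)
```

In .NET the same shape (blocking on a `Task` from a context its continuation needs) hangs silently instead of timing out, which matches the “not responding on HTTP requests” symptom.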

Sorry for the confusion, Patric, I meant at runtime, not build time. :slight_smile:
If Redis is not running, I get the same kind of error as when I can’t connect to the database.

My computer should be OK:
16 GB RAM, CPU: i7-8565U @ 1.99 GHz
No unusual spikes in memory usage.

I get this error in the Redis log from time to time (can it be related?):

1:M 19 Mar 2020 10:57:30.088 # Possible SECURITY ATTACK detected. It looks like somebody is sending POST or Host: commands to Redis. This is likely due to an attacker attempting to use Cross Protocol Scripting to compromise your Redis instance. Connection aborted.

Yes, it can be related, but I’m not sure, and I have not seen this error myself. We have made some improvements in the Redis connection; can you try the existing pre-release and see if that solves your issue?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.