Redis is a single-threaded program.
A Redis Cluster is made up of multiple Redis nodes that share all the incoming work.
Redis Cluster has only 16384 hash slots for keys. When Redis receives a key to store, it hashes the key to a slot, and the slot determines which node holds the key. So Redis Cluster effectively acts as a load balancer across the nodes inside your cluster.
Each Redis node is responsible for a subset of the hash slots. 16384 slots are enough for a Redis Cluster of fewer than 1000 nodes.
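For reference, the cluster spec computes the slot as CRC16(key) mod 16384. A minimal sketch in Ruby, assuming the digest-crc gem (Redis uses the XModem variant of CRC16); the key name is illustrative:
require "digest/crc16_xmodem"

# hash the key and map it onto one of the 16384 slots
slot = Digest::CRC16XModem.checksum("user:1000") % 16384
The cluster then maps each slot to the node responsible for it.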
Redis can use other CPUs for async operations such as snapshots (done in a forked child process) or UNLINK (memory reclaimed in a background thread).
Redis pipelining improves performance by issuing multiple commands at once without waiting for the response to each command.
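A minimal sketch with the redis-rb client (key names are illustrative); in redis-rb 5 the block yields a pipeline object, and all queued commands go out in one round trip:
$redis.pipelined do |pipeline|
  pipeline.set("page:1", "cached body", ex: 60)
  pipeline.incr("page:1:views")
  pipeline.get("page:1")
end
# => ["OK", 1, "cached body"], all three replies come back together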
Basic commands with the redis-rb client:
$redis.set(key, value, ex: 60.seconds) # set with a 60-second TTL
$redis.get(key)
$redis.mget(key1, key2) # fetch multiple keys in one call
$redis.setex(key, 60.seconds, value) # SET and EXPIRE in one command
To avoid race conditions, you can pass the nx option, which only sets the key when the key does Not eXist:
$redis.set(key, value, ex: 60.seconds, nx: true)
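For example, assuming the key is absent at the start (return values are those of the redis-rb client):
$redis.set(key, "first", nx: true)  # => true, key was absent, so it is set
$redis.set(key, "second", nx: true) # => false, key already exists, value unchanged
$redis.get(key)                     # => "first"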
$redis.del(key)
The time complexity is O(N): if the key holds 100 million items, Redis deletes them one at a time. Redis would raise Redis::CannotConnectError if your DEL command timed out. If you issue a DEL and it gets stuck, you can't kill the command because Redis is single-threaded. You also cannot reboot in order to stop it. You can provision another Redis to replace the one that is stuck. To properly delete a large set of data in Redis, use UNLINK: UNLINK key1 key2 ...
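UNLINK removes the keys from the keyspace right away and reclaims the memory in a background thread, so the main thread is not blocked. With the redis-rb client:
$redis.unlink(key1, key2) # => 2, the number of keys unlinked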
Return how many elements are in the set stored at key:
$redis.scard(key)
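For example (illustrative values; in redis-rb, SADD accepts an array of members):
$redis.sadd(key, ["a", "b", "c"])
$redis.scard(key) # => 3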
Metrics worth monitoring on your Redis:
- CPU/Engine CPU utilization.
- Connections.
- Cache hit rate.
- Replication lag.
- Get commands.
- Set commands.
- Memory usage.
- Swap usage.
- Evictions.
Open the Redis client of your Heroku app:
heroku redis:cli -a yourapp --confirm yourapp
Use the INFO command to see the status of your Redis:
> info
Some fields you should pay attention to:
used_memory:3976008
used_memory_human:3.79M
keyspace_hits:2270
keyspace_misses:1229
You can also get these sections separately with info memory or info stats.
Calculate your cache hit rate:
keyspace_hits / (keyspace_hits + keyspace_misses)
Aim for 90%.
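With the sample output above, that is 2270 / (2270 + 1229) ≈ 0.65, i.e. only a 65% hit rate. A small sketch to compute it with the redis-rb client:
stats = $redis.info("stats")
hits = stats["keyspace_hits"].to_f
misses = stats["keyspace_misses"].to_f
hit_rate = hits / (hits + misses) # => ~0.65 for the numbers above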
For all items output by the INFO command, please refer to here.
You can also compress what gets saved into Redis to save memory!
config.cache_store = :redis_cache_store, {
  url: ENV.fetch("REDIS_URL"),
  namespace: "cache",
  expires_in: 604800, # 7 days
  compress: true,
  compress_threshold: 64.kilobytes, # compress entries larger than 64KB
}
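Compression is transparent to callers: writes above the threshold are compressed, and reads decompress automatically. A hypothetical usage (cache key and payload are illustrative):
# payloads over 64KB are compressed before being written to Redis
Rails.cache.write("reports:2024", large_report_body)
Rails.cache.read("reports:2024") # => decompressed automatically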
- GET
- MGET
- RPOPLPUSH
- SETEX
- LREM
- MULTI SADD LPUSH
- EVALSHA
- MULTI EXPIRE HGET
- SET
- PUBLISH
- DEL and UNLINK
- HMSET EXPIRE
- INCR EXPIRE
- MULTI INCR EXPIRE
- SCAN
- TIME
- MULTI SET INCR
- HGET
- MULTI SREM SCARD