Performance and Caching v6.0.2
Where file-based configuration is used, performance and caching parameters are configurable through the cache section of the file.
Each Quantum Origin entropy generation cycle produces a fixed-size block of entropy; the block size depends on the use case and circumstance.
In some situations, many small entropy requests, each far smaller than a single block, are required. To avoid wasting cycles generating entropy that goes unused, Quantum Origin can cache entropy between requests. This is called synchronous caching: any excess entropy from a previous generation call is used to serve the next call.
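For example, if a generation cycle produces a 3973-byte block (the large block size referenced below) and an application makes 32-byte requests, a synchronous cache can serve roughly 124 requests from a single block rather than generating, and mostly discarding, a fresh block for every request; the 32-byte request size here is purely illustrative.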
In other situations, immediate responsiveness is critical. For this, background/multithreaded caching can be used: it starts one or more threads that generate entropy independently of direct requests, building a cache pool that can be read on demand with no further generation.
Tip
Where high-performance, multi-core generation is required, use multithread-caching.
The standard cache mechanisms can be significantly extended for almost any use case.
| Option | Description |
|---|---|
| none | Do not cache between requests. |
| synchronous caching | Cache excess entropy between requests; no new thread is used. |
| background caching (single thread) | Start a new thread to generate cached entropy in the background; excess entropy is also cached. |
| multithread-caching | Start multiple new threads to generate cached entropy in the background; excess entropy is also cached. |
If caching is enabled, the size of the cache must be provided.
A particularly useful cache size for synchronous caching is 3973 bytes, the size of a large block; this ensures no generated entropy is discarded.
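As an illustrative sketch of that guidance, a synchronous cache sized to one large block might look like the configuration below. The type value caching is an assumed placeholder for the synchronous option (only none and multithread-caching are confirmed by the examples at the end of this section), and the prefill and refill_at values are arbitrary.

cache:
  type: caching      # assumed placeholder for the synchronous caching option
  size: 3973         # one large block, so no generated entropy is discarded
  prefill: 3973      # fill the cache fully before the first request is accepted
  refill_at: 512     # illustrative threshold below which a synchronous refill is triggered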
| Option | Description |
|---|---|
| size | Total capacity of the cache in bytes. |
| prefill | The number of bytes to generate at startup, before any randomness call is accepted. The cache is filled to this level synchronously on initialisation, even when a background/multithreaded cache is used. |
| refill_at | The level below which a refill of the cache is automatically triggered. Only used for synchronous caching; a background/multithreaded cache always attempts to keep the cache filled to capacity. |
| thread count | The number of threads to use in multithreaded extraction. It is recommended that this is kept fairly low. |
The following are examples of cache configs, using no caching and multithreaded (background) caching respectively.
cache:
  type: none

cache:
  type: multithread-caching
  size: 4096
  prefill: 128
  refill_at: 1024
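In the multithreaded example, the cache can hold up to 4096 bytes; 128 bytes are generated synchronously at startup before any randomness call is accepted, and the background threads then keep the cache topped up towards its full capacity. As noted in the options table, the refill_at threshold only takes effect for synchronous caching.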