cache

Performs operations against a cache resource for each message, allowing you to store or retrieve data within message payloads.

cache:
  cache: ""
  operator: set
  key: ""
  value: ""

This processor will interpolate functions within the key and value fields individually for each message. This allows you to specify dynamic keys and values based on the contents of the message payloads and metadata. You can find a list of supported functions in the interpolation documentation.
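For example, a minimal sketch of a set operation with a dynamic key and value (the cache resource name objects and the user fields are assumptions for illustration):

cache:
  cache: objects # hypothetical cache resource, defined under resources.caches
  operator: set
  key: "${!json_field:user.id}"
  value: "${!json_field:user.name}"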

Operators

set

Set a key in the cache to a value. If the key already exists the contents are overwritten.

add

Set a key in the cache to a value. If the key already exists the action fails with a 'key already exists' error, which can be detected with processor error handling.

get

Retrieve the contents of a cached key and replace the original message payload with the result. If the key does not exist the action fails with an error, which can be detected with processor error handling.
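As a sketch, the following retrieves a document from a hypothetical objects cache and replaces each message payload with the cached content (payloads are assumed to carry a document_id field):

- cache:
    cache: objects # hypothetical cache resource
    operator: get
    key: "${!json_field:document_id}"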

delete

Delete a key and its contents from the cache. If the key does not exist the action is a no-op and will not fail with an error.
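For instance, a sketch of removing an entry once it is no longer needed (the objects cache and the session.id field are assumptions):

- cache:
    cache: objects # hypothetical cache resource
    operator: delete
    key: "${!json_field:session.id}"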

Fields

cache

string The cache resource to target with this processor.
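Cache resources are defined separately in the config and referenced here by name. As a sketch, assuming an in-memory cache (other cache types, such as redis or memcached, are configured similarly under the same section):

resources:
  caches:
    objects: # the name referenced by the processor's cache field
      memory: {}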

operator

string The operation to perform with the cache.

Options are: set, add, get, delete.

key

string A key to use with the cache.

This field supports interpolation functions.

value

string A value to use with the cache (when applicable).

This field supports interpolation functions.

parts

array An optional array of message indexes of a batch that the processor should apply to. If left empty all messages are processed. This field is only applicable when batching messages at the input level.

Indexes can be negative, and if so the part will be selected from the end counting backwards starting from -1.
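As a sketch, the following caches only the final message of each batch, again assuming an objects cache resource (the content function interpolates the raw message payload):

cache:
  cache: objects # hypothetical cache resource
  operator: set
  key: "last_message"
  value: "${!content}"
  parts: [-1] # only apply to the last message of the batch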

Examples

The cache processor can be used in combination with other processors in order to solve a variety of data stream problems.

Deduplication

Deduplication can be performed using the add operator with a key extracted from the message payload. Since add fails when the key already exists, duplicates can be removed with a filter_parts processor using a processor_failed condition:

- cache:
    cache: TODO
    operator: add
    key: "${!json_field:message.id}"
    value: "storeme"
- filter_parts:
    type: processor_failed
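The lifetime of the deduplication window is determined by the cache resource itself; with a cache type that supports TTLs a key eventually expires, after which a repeated id is treated as new.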

Hydration

It's possible to enrich payloads with content previously stored in a cache by using the process_map processor:

- process_map:
    processors:
      - cache:
          cache: TODO
          operator: get
          key: "${!json_field:message.document_id}"
    postmap:
      message.document: .
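In this sketch the child cache processor replaces its working copy of the payload with the cached document, and the postmap then writes that result (referenced as the root, .) into the original message at the path message.document, leaving the rest of the payload intact.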