Variable Insertion (Interpolation) is a way to dynamically modify a string expression or object at runtime.

Usage Engine uses curly brackets to indicate that a Variable Insertion will take place. Variable Insertion is supported in some Functions' configuration fields and also within Script and Script Aggregator Functions.

Within the curly brackets, you can access variables that belong to different scopes and use them to generate dynamically changed output at runtime. Depending on what you want to achieve or manipulate, the output from Variable Insertion can be a string (if prefixed by '$') or an object (if prefixed by '@').

Let's have a look at a simple example:


Info
titleExample

Consider a scenario where you want to send the same message to different customers and insert the customer name in the message at runtime.


Code Block
titleScript code block
payload.customer_name = "Johnny";

payload.full_message = `Hello ${payload.customer_name}. You have received a new mail.`;

await push(payload);


And this is what the Variable Insertion in the Email Notification Function looks like:
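For illustration, a message field in the Email Notification Function that supports Variable Insertion could reference the script output like this (the field name shown here is an assumption, not the actual configuration layout):

No Format
Message: ${payload.full_message}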


When you run this code, the intended recipient (Johnny in this case) will see an email with the following content:

No Format
Hello Johnny. You have received a new mail.


...

  • payload
  • meta
  • deploy
  • sharedStore

Refer to the table below to learn how to use these scopes, with a few examples. You can also perform nested Variable Insertion using the same or different scopes.

...

Info
title$ or @

While using '$' or '@', note the following:

  • ${scope} will result in a string. ${scope} behaves like a JavaScript template literal with the following exceptions: 

    Note
    iconfalse
    • Returns 'undefined' instead of throwing an error when accessing non-existing paths
    • Returns the full object (payload) for ${payload}
    • Returns a JSON representation of objects instead of [object Object]


  • @{scope} will result in an object

For example:
If you have a nested data structure in payload.foo and you want to keep the structure rather than transform it into a string, you must use @{payload.foo} instead of ${payload.foo}.
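As an illustration, here is how the two prefixes behave for a hypothetical payload.foo holding a nested object (the values are made up for this sketch):

Code Block
// Hypothetical payload, for illustration only:
// payload.foo = { bar: 1, baz: [2, 3] }

${payload.foo}      ---> '{"bar":1,"baz":[2,3]}'  (a JSON string)
@{payload.foo}      ---> { bar: 1, baz: [2, 3] }  (the object itself)
${payload.missing}  ---> undefined  (no error is thrown for a non-existing path)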


Variable Insertion with Scopes 

...

You can use Variable Insertion with the following scopes/variables/objects only:

Scope | Explanation | Example

payload

You can perform Variable Insertion on any data from the current record.
The payload scope covers any content that is sent from one node to the next.

Payload-type Variable Insertion can be used to address the data flowing through the stream.

Note
titleNote!

The Payload scope is not accessible for Collector Functions.



Code Block
// Payload = {
//   allFruits: {
//     banana: 1,
//     apple: 2,
//     orange: 3
//   },
//   myFruit: 'apple'
// }

To access 'apple', you can use Variable Insertion like: ${payload.myFruit}

or, if you want to use nested Variable Insertion, you can use square brackets '[]':

${payload.allFruits[${payload.myFruit}]} ---> The result will be 2


meta

Meta covers additional information that gives further context to the data (metadata). 

Meta-type Variable Insertion is designated to process the accompanying metadata.

You can use Variable Insertion with any data from the current record's metadata. You can use meta with a Processor or a Forwarder Function.

Note
titleNote!

The Meta scope is not accessible for Collector Functions.


For example, if you want to set the filename used by an Amazon S3 Forwarder Function from a Script Function:

Code Block
// Set the filename in the metadata so that the Forwarder can pick it up
meta.fileName = "myFile";
log.info(meta.fileName);

await push(payload);

Then do Variable Insertion on the filename:
${meta.fileName}${payload.value}

Once you run the stream, you will find the files written by the S3 Forwarder in the following format:
myFile<count of the file>_<timestamp>.csv

deploy 



You can use Variable Insertion with the replicaNumber and lastReplicaNumber properties only.


When running a stream with multiple replicas, the first instance is identified as the first replica. This is represented in the deploy scope as deploy.replicaNumber = 1.

Each running replica instance will have a different replica identification number.

The lastReplicaNumber variable will be the same in all replicas, identifying how many replicas are running in total. For example, a stream configured to run 3 replicas would have lastReplicaNumber set to 3 in all running instances.

For more information about replicas, see Performance and scalability.

For example, to access the second instance of a replica with a total of three instances running:

replicaNumber will be 2

lastReplicaNumber will be 3

You can use Variable Insertion like:

Code Block
${deploy.replicaNumber} ----> 2

// or, nested Variable Insertion:

${deploy[${deploy.lastReplicaNumber}+${deploy.replicaNumber}]}


sharedStore



Data that is currently stored in the shared store.

The sharedStore property allows you to define your own variables and use them across different streams.

The sharedStore scope is loaded at the start of the stream and its content does not change during execution. Any changes you make to the sharedStore variable will take effect in the subsequent execution only.

Refer to Script for more information.


Code Block
sharedStore:
  1 = 5.csv
  2 = 10.csv
  anotherProperty = {
    first: {
      second: "third"
    }
  }


${sharedStore[1]} ----> 5.csv
${sharedStore[${deploy.replicaNumber}]} ----> 5.csv for replica 1

// SharedStore = { file1: 'my-first-file.csv', file2: 'my-second-file.csv' }

${sharedStore[file${deploy.replicaNumber}]}
---> 'my-first-file.csv' for replica 1
---> 'my-second-file.csv' for replica 2

...



Examples

...

Expand
titleS3 Bucket Dynamic Naming

Variable Insertion can be used conveniently to designate dynamic names for files that are read from S3 buckets, based on, for example, the time and date of the collection. Data collected from the buckets can come with metadata called "collectionTime".

The filenames are defined by specifying:

Filename: Collected${meta.collectionTime}

As a result, the system will generate file names during execution that utilize this data. Data collected at "2022-07-13T07:26:44.729Z" will be written to a file called Collected2022-07-13T07:26:44.729Z.
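For comparison, the same dynamic name could also be built inside a Script Function with an ordinary JavaScript template literal and then picked up by the Forwarder as ${meta.fileName}. This is only a sketch, assuming that the Collector sets meta.collectionTime:

Code Block
// Sketch: build the dynamic file name in script instead of in a configuration field.
// Assumes meta.collectionTime is set by the Collector, e.g. "2022-07-13T07:26:44.729Z".
meta.fileName = `Collected${meta.collectionTime}`;
log.info(meta.fileName);

await push(payload);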

...