
Cisco Webex Experience Management Invitations Module Architecture Document

1. Introduction

In an economy where customer experience trumps price and product, the Cisco Webex Experience Management platform (referred to as Experience Management hereafter in this document) helps organizations exceed customer expectations and deliver business outcomes through its three pillars of customer experience.

One of the key steps in measuring customer experience is reaching out to customers over various channels such as email, SMS, and web intercept to solicit feedback. Among all the survey distribution channels, email and SMS are two of the most popular. Global delivery invitation management enables a personalized experience for receiving survey invitations across various channels on the journey, using workflows that can be configured and reused across chosen transmission channels such as SMS and email, while requiring no PII in the Experience Management platform.

2. Key Benefits and Features

Features and Benefits

Feature: Omni-channel survey invitation dispatch
Benefit: Send invitations across multiple journey channels, giving end customers the choice of how they interact with your business.

Feature: Centralized delivery policy management via delivery templates
Benefit: Manage organizational corporate communication policies for invitation dispatch, such as time-of-day communication windows, channels for communication, and campaign workflow rules for follow-up messages.

Feature: Global data centers and data residency (PII and outbound delivery processed in sovereign territories)
Benefit: Businesses operating in countries with government or industry regulations that require PII to be processed in-region can run a compliant solution.

Feature: Single-instance hosted PII processing (no PII required by Experience Management; only hashed data needs to be transmitted)
Benefit: Run a completely secure, zero-PII solution with an AWS/Azure cloud instance that processes PII and dispatches invitations.

Feature: Extensible architecture that supports ETL processing with serverless AWS Lambda or Azure Functions for a scalable pipeline
Benefit: Flexibility to customize and set up a big-data processing pipeline with custom business logic based on the unique needs of your business.

Feature: Content template upload and preview
Benefit: Upload and test personalized content templates with easy substitutions that deliver improved response rates.

Feature: Schedule-less delivery pipeline
Benefit: Just like FedEx shipments, delivery batches are teed up; based on your delivery policy, they are queued and sent off in flight.

Feature: Real-time notifications about bad data during ingress
Benefit: Quickly take action to address bad-data issues at the source system of your data pipeline.

Feature: EOD downloadable invitation reports with progress/success/failure status
Benefit: Periodically review dispatch performance to continuously tune content templates, dispatch policies, and survey logic to optimize response rates.

3. High-Level Workflow

The following diagram shows at a high level how CCX/CCE/WCC consumes the Experience Management Invitations feature to send surveys to customers over email and SMS.

delivery-Policy-screen-shot/invitations-delivery-architecture/Overall-Architecture.png


The “Dispatch Request API” is the entry point for Cisco contact center products or 3rd-party systems such as CRMs to consume the Experience Management Invitations feature. The various elements of the infrastructure provisioned in the AWS/Azure cloud to host the “cloud hosted module” of the Invitations feature are covered in the above diagram.

The Cisco Contact Center Express and Cisco Contact Center Enterprise suites of contact center products have already integrated the Experience Management Invitations feature. To consume the Invitations reference implementation as-is, no development effort is required: simply provision the infrastructure for the "cloud hosted module" of the Invitations feature, deploy the module on that infrastructure, and configure Invitations end to end to enable "Cross Channel surveys" in CCX/CCE/WCC.

The following diagram zooms in and provides a high-level view of the workflow of the Invitations feature itself.

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step2new.png


The Cloud hosted module of the Invitations solution is a single-tenant module, hosted on AWS or Microsoft Azure, that interfaces with the multi-tenant SaaS Experience Management platform. Everything on the left-hand side of the above diagram encapsulates the Cloud hosted module of the Invitations solution, and everything on the right-hand side depicts the multi-tenant SaaS modules. Both work together to form the Experience Management Invitations feature.

The various components that are part of the Cloud hosted module of the Invitations feature are as given below:

While the API can be deployed on a Linux server, the Dispatcher component is designed to run on serverless compute (AWS Lambda or Azure Functions). Hence, the cloud-hosted module has to be deployed on cloud infrastructure on either AWS or Microsoft Azure.

4. Detailed Architecture

The upcoming sections capture each component in more detail, covering the request/response patterns and the data flow between the various components and the database, as well as how each component functions independently.

4.1 Dispatch Request

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step3.png


4.1.1 Dispatch Request Description

The above diagram depicts the detailed architecture of the single-tenant Invitations module.

The Dispatch Request API should be deployed and exposed to other Cisco products, such as Cisco Contact Center Express or Cisco Contact Center Enterprise, or to 3rd-party systems such as CRM and ERP systems. These systems can make an API request with the details of all the records to be processed in order to send emails or SMS to end customers.

An overview of the sequence of events within this component is given below. This is how the component functions out of the box; no additional development activity is required to enable any of it:

4.1.2 API Details

4.1.3 Error Scenarios

4.2 PII (Personally Identifiable Information) Details

Most businesses collecting feedback from their customers care about the sensitivity of any PII data that is collected and stored along with the customer’s feedback. Additionally, there are concerns around handling PII data in the context of running a compliant business.

In such cases, the ability to restrict PII data storage, or to store it in ways that do not expose this data to certain users such as vendors, partners, or external support teams, is useful. The Dispatch API provides the flexibility to hash PII records at the source and pass only the hashed information to Experience Management for survey token creation. Experience Management is designed to run with zero PII and anonymous data sets.

While processing a request, the Dispatch API identifies the records that are marked as PII and chosen to be hashed in the Experience Management portal, and creates a hashed equivalent of each. The responses to prefill questions that are tagged as PII with the PII handling type set to Hashing in Experience Management need to be hashed.

The option to mark a prefill as PII and to choose the hashing algorithm resides in the Experience Management portal. The currently supported hashing algorithms are listed below:

Algorithm  Hashed Bits Length  Hexadecimal String Length  Final Output (always lowercase)
sha256     256                 64                         sha256:{hashed output}
sha384     384                 96                         sha384:{hashed output}
sha512     512                 128                        sha512:{hashed output}
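To illustrate the table above, the hashed output format can be sketched as follows. This is a minimal Python sketch only; the actual Dispatch API is not written in Python, and the function name here is hypothetical.

```python
import hashlib

# Supported algorithms and their hex-digest lengths (64/96/128), per the table above.
SUPPORTED = {"sha256": hashlib.sha256, "sha384": hashlib.sha384, "sha512": hashlib.sha512}

def hash_prefill(value, algorithm="sha512"):
    """Hash a PII prefill value and format it as '<algorithm>:<lowercase hex digest>'.
    Unresolvable algorithms fall back to sha512, mirroring the default described
    in this section."""
    if algorithm not in SUPPORTED:
        algorithm = "sha512"
    digest = SUPPORTED[algorithm](value.encode("utf-8")).hexdigest()  # hexdigest() is lowercase
    return f"{algorithm}:{digest}"
```

For example, `hash_prefill("john@example.com", "sha256")` yields a string of the form `sha256:` followed by 64 lowercase hex characters.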

In the rare scenario where the Dispatch API cannot find the required hashing algorithm using the Experience Management APIs but the prefills are tagged to be hashed, the default hashing algorithm sha512 will be used. The flow of PII for a token journey goes as below:

4.3 Caching Mechanism

The Dispatch API caches certain details from Experience Management for a certain time to keep processing fast and avoid load on the Experience Management APIs. It uses MemoryCache internally, which caches the information with an assigned expiry in seconds. The following information is cached in the MemoryCache within the Dispatch API.

  1. Authentication Token - Cached for 15 minutes for authenticated tokens. Any Dispatch request that comes with the same authentication token, if available in the MemoryCache, will not be validated again.
  2. List of dispatches - Cached for an hour. All subsequent requests containing the same dispatchID reuse the cached response until it expires. Once expired, a fresh response is retrieved using the Experience Management API and cached again for an hour.
  3. Settings, Delivery Policy, Active Questions, Questionnaire Responses - Cached for an hour. Same behavior as point (2) above.
  4. Account Configuration - Cached for an hour. Details are retrieved using a DB call.

Due to this caching mechanism, note that the information used by the Dispatch API is not real-time and can lag by up to 1 hour. If any changes are made in the Experience Management portal, they will not reflect in the Dispatch API until the old cached response expires and the details are fetched again over the API. Likewise, any changes made in the ACM front-end can take up to 1 hour to reflect in the Dispatch API. For example, if a new Dispatch ID is created in Experience Management and configured in the ACM front-end with vendor details, a request containing this Dispatch ID will be accepted by the Dispatch API only once the cache is refreshed.
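The expiry behavior described above can be sketched as a tiny TTL cache. This is an illustrative Python sketch only; the actual component uses the .NET MemoryCache.

```python
import time

class TtlCache:
    """Minimal in-memory cache with per-entry expiry, sketching the
    MemoryCache usage described above (illustration only)."""
    def __init__(self):
        self._store = {}

    def set(self, key, value, ttl_seconds):
        # e.g. 900 seconds for an authentication token, 3600 for dispatch details
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: caller must re-fetch over the API
            return None
        return value
```

A `get` that returns nothing signals the caller to fetch fresh details from Experience Management and cache them again, which is exactly the refresh cycle that causes the up-to-1-hour lag described above.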

4.4 Extensibility options for In-memory queue

The Dispatch Request makes use of an in-memory queue to batch the bulk API payloads. An in-memory queue is considered adequate for the CEM use case, since the worst-case loss if this component restarts is 1 minute (or the configured interval) worth of invites, which is negligible. (Compared to financial transactions, the loss of 1 minute worth of survey data does not significantly impact the overall experience metric that an organization is tracking, or the overall outcome.)

However, it can be extended to use persistent queues such as Azure Storage Queue or AWS SQS. The advantage of using a persistent queue is zero loss of records. To achieve this, follow these steps:

SharedSettings.AvailableQueues.Add("exampleQueue", new ExampleQueueImplementation());
GET {BaseURL}/api/config/extendedproperties -> Gives the latest configuration available
POST {BaseURL}/api/config/extendedproperties

4.5 Sampling

Default sampling is not included in the current release. We will look to add this in the future.

4.5.1 Extensibility options in Sampling

SharedSettings.AvailableSamplers.Add("exampleSampler", new ExampleSamplingLogic());
GET {BaseURL}/api/config/extendedproperties -> Gives the latest configuration available
POST {BaseURL}/api/config/extendedproperties

4.6 Cross Channel Token Creation

The single-instance Invitations module (Dispatch Request) batches the requests for token creation and runs a background process every 1 minute (configurable) to request token creation using the Experience Management bulk token API. Once the survey tokens are successfully created, the Dispatch Request API component updates the token details in the EventLog collection. This information is then referred to by the Dispatcher module while dispatching the survey invites.

On the Experience Management end, once the survey tokens are created, they get attached to a DP (delivery policy). The DP runs every 5 minutes and checks for tokens to be processed within the configured business hours. The DP also checks for unsubscription or throttling (survey fatigue rules) and eliminates the tokens that fall into these categories. It also attaches the email/SMS content template configured for the Dispatch for which the token was created. In the multi-language case, the language preference is attached to the token as one of the prefills, based on which the email/SMS content template is processed in the respective language.

Once all these criteria are successfully met, DP will create a survey record object and push to Azure or AWS queue. From here, the Dispatcher module will be triggered to further process and dispatch the records to end customers using the Email/SMS vendors.
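The delivery-policy pass described above can be sketched as follows. This is illustrative Python only; the field names and the business-hours representation are assumptions for illustration, not the actual Experience Management data model.

```python
def process_tokens(tokens, now_hour, unsubscribed, fatigued, template):
    """Sketch of one delivery-policy run: keep tokens inside the configured
    business-hours window, drop unsubscribed or fatigued recipients, and
    attach the content template before queueing for the Dispatcher."""
    queued = []
    for token in tokens:
        start, end = token["business_hours"]
        if not (start <= now_hour < end):
            continue  # outside business hours: picked up by a later 5-minute run
        if token["email"] in unsubscribed or token["email"] in fatigued:
            continue  # eliminated by unsubscribe or survey-fatigue rules
        token["template"] = template  # content template configured for this Dispatch
        queued.append(token)          # would be pushed to the Azure/AWS queue
    return queued
```

Tokens that survive all checks are the survey record objects pushed to the Azure or AWS queue, from which the Dispatcher module takes over.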

4.7 Unsubscribe

The unsubscribe component resides within the Experience Management product. When the Cloud hosted components interface with the Experience Management delivery policy module to create survey tokens, the delivery policy module refers to the “unsubscribe list” it maintains and discards any created token that has a matching unsubscribe entry. Thus, a discarded token never reaches the dispatch queue, and emails will not be delivered to unsubscribed customers.

Though the core unsubscribe list is maintained within the Experience Management product, we offer an extensibility option within the partner hosted module. This extensibility hook sits before the cross-channel token creation step. It offers partners the flexibility to refer to an API, a 3rd-party database maintained by the client, or any other global database to identify records in the dispatch request that should be removed. All such records identified by referencing the 3rd-party system will not even be sent for cross-channel token creation, thereby suppressing the email/SMS for those customers.
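The hook described above amounts to a filter over the dispatch-request records, which can be sketched as follows. This is illustrative Python only; the actual extensibility point is the checker registered via SharedSettings.AvailableUnsubscribeCheckers, and the callback here stands in for a 3rd-party API or database lookup.

```python
def suppress_unsubscribed(records, is_unsubscribed):
    """Drop dispatch-request records whose recipient appears in an external
    suppression source, before cross-channel token creation.
    `is_unsubscribed` is a hypothetical lookup callback (e.g. a 3rd-party
    API or database query); the 'email' field name is illustrative."""
    return [r for r in records if not is_unsubscribed(r["email"])]
```

Records removed here never reach token creation, so no email/SMS is generated for those customers.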

4.7.1 Extensibility options in Unsubscribe

SharedSettings.AvailableUnsubscribeCheckers.Add("exampleUnsubscribe", new ExampleUnSubscribeCheck());
GET {BaseURL}/api/config/extendedproperties -> Gives the latest configuration available
POST {BaseURL}/api/config/extendedproperties
{
    "BatchingQueue": "exampleQueue",
    "Sampler": "wxm",
    "Unsubscriber": "wxm" // Pass the same key added in the "AvailableUnsubscribeCheckers" dictionary
}

4.8 Mongo DB

For the end-to-end functioning of the Invitations Delivery Module, a variety of data is stored in various MongoDB collections and is inserted, read, and updated by the different components of the Invitations Feature Module. The following section details each collection and its usage:

4.9 Dispatcher

This component is responsible for the last-mile dispatch of survey invitations to 3rd-party email/SMS systems for delivery to the intended recipients. The Dispatcher supports both Single-Send and Bulk-Send vendors. A Single-Send vendor accepts only 1 invitation per service request, while a Bulk-Send vendor accepts more than 1 invitation (usually upwards of 1K invitations) per service request. The following is the list of vendors whose reference implementations have been provided in the source code:

  1. CustomSMTP: A Single-Send Email Vendor
  2. MessageBird: A Single-Send SMS Vendor
  3. SparkPost: A Bulk-Send Email Vendor

Additionally, through the Dispatcher’s extensibility options (covered later in this section), you have the flexibility to tie up with other vendors of your choice.

To directly start utilizing the CustomSMTP/MessageBird/SparkPost option(s), simply provide your SMTP-Server/MessageBird-Account/SparkPost-Account details in the Account Configuration Management.

4.9.1 Functioning

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step5.png

The starting point for this component is the Azure Queue Storage or AWS SQS queue into which Experience Management enqueues all the messages. The following is the lifecycle of these messages:

  1. Every message is dequeued one-by-one from the queue using an Azure Function/AWS Lambda (a serverless compute, referred to as the Queue-Trigger from here on) that internally utilizes a queue-message-based event trigger to achieve this functionality.

  2. These event triggers wake up the Queue-Trigger, which then performs the required null checks, validations, and hash lookups on the dequeued message before ultimately converting it into an invitation. An invitation is a personalized email or SMS that contains a unique survey link soliciting user feedback.

  3. Each invitation created has a vendor tied to it, whose details are present in the AccountConfiguration-Collection. The Queue-Trigger therefore reads the respective vendor’s details from the AccountConfiguration-Collection.

  4. Now, based on the type of vendor fetched for the given invitation, one of the following occurs:

    • Single-Send Vendor: If the vendor is of this type, then the invitation is immediately dispatched to the vendor for end-delivery to the intended recipient. Once the vendor accepts the invitation, all logs are pushed to the EventLog-Collection, after which the Queue-Trigger goes back to sleep.

    • Bulk-Send Vendor: If the vendor is of this type, then the invitation is stored into the BulkMessage-Collection, for the sole purpose of batching, which ultimately enables a one-shot bulk dispatch of all the batched invitations to the desired Bulk-Send Vendor. Once the database accepts the invitation, all logs are pushed to the EventLog-collection, after which the Queue-Trigger goes back to sleep.
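The branching in step 4 can be sketched as follows. This is illustrative Python only; the actual Queue-Trigger is C# serverless code, and all names here are hypothetical stand-ins for the collections and vendor objects described above.

```python
class Vendor:
    """Minimal stand-in for a vendor entry from the AccountConfiguration-Collection."""
    def __init__(self, name, is_bulk_send):
        self.name = name
        self.is_bulk_send = is_bulk_send

def route_invitation(invitation, vendor, bulk_collection, sent, event_log):
    """Sketch of the Queue-Trigger branching: Bulk-Send invitations are stored
    for batching, Single-Send invitations are dispatched immediately, and a
    log entry is written either way."""
    if vendor.is_bulk_send:
        bulk_collection.append(invitation)  # batched in the BulkMessage-Collection
    else:
        sent.append(invitation)             # immediate dispatch to the vendor
    event_log.append((invitation["id"], vendor.name))  # EventLog-Collection entry
```

Either branch ends with logging, after which the Queue-Trigger goes back to sleep.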

Therefore, the Dispatcher supports the following two kinds of Dispatch-Mechanisms:

4.9.2 Source Code

Links to the two serverless computes’ source code are presented below:
Azure:

AWS:


4.9.3 Scalability and Extensibility

The Dispatcher has been made highly customizable, where you can choose to horizontally and/or vertically scale the serverless computes to handle a variety of dispatch loads. In addition to this, you can choose to integrate with your choice of Single-Send/Bulk-Send Vendor(s) as well.

  1. Scalability of the Queue-Trigger Serverless Compute -

    • Vertical Scaling:

      • Azure: A single Queue-Trigger Azure Function can vertically scale to process up to 48 messages concurrently (with the batch-size set to 32). The value of the batch-size, which decides the max concurrency limit, can be changed in the host.json file of the provided Azure Queue-Trigger source code. Official Reference.
      Source Code's Batch-Size = 32
      • AWS: A single Queue-Trigger AWS Lambda can vertically scale to process up to 10 messages concurrently. This value of Batch-Size is set in the 10th step of your AWS Lambda deployment. Official Reference
      Source Code's Batch-Size = 10
    • Horizontal Scaling: As we utilize a serverless architecture, the Queue-Trigger Azure Function/AWS Lambda can be naturally horizontally scaled by deploying more than 1 instance of it.

  2. Scalability of the Time-Trigger Serverless Compute -

    • Vertical Scaling: A single Time-Trigger Azure Function/AWS Lambda can be vertically scaled in two ways. Firstly, one can modify the number of Bulk-Send invitations that are processed by the Time-Trigger at once. This can be done by changing the Bulk-Read-Size property of the Time-Trigger. Secondly, one can modify the wake-up frequency of the Time-Trigger. This can be done by changing the CRON expression of the time based event-trigger.

      • Azure: The Bulk-Read-Size property and the CRON expression both can be changed in the trigger.cs file of the provided Azure Time-Trigger source code. To change the Bulk-Read-Size property, alter the respective DispatchHandler's constructor argument from 10000 to a value of your choice. To change the CRON expression, modify the RunAsync function’s TimeTrigger parameter from 0 */5 * * * * to a value of your choice.
      • AWS: The Bulk-Read-Size property can be changed in the trigger.cs file of the provided AWS Time-Trigger source code, whereas the CRON expression is set in the 12th step of your AWS Lambda deployment. To change the Bulk-Read-Size property, alter the respective DispatchHandler's constructor argument from 10000 to a value of your choice.
      Source code's Bulk-Read-Size = 10000, CRON expression = every 5 minutes
    • Horizontal Scaling: As we utilize a serverless architecture, the Time-Trigger Azure Function/AWS Lambda can be naturally horizontally scaled by deploying more than 1 instance of it, where each compute wakes up at a different instant to read from the BulkMessage-Collection. Every time it wakes up, it reads and updates the status of a batch of Bulk-Send invitations, which allows other Time-Trigger instances to pick up another batch of invitations from the same collection while the already running Time-Triggers are still processing their batch. After a bulk dispatch is made, a Time-Trigger then deletes its batch of Bulk-Send invitations from the BulkMessage-Collection.

      However, as a Time-Trigger's [Bulk-Find-and-Update] operation, which is responsible for reading and updating the status of a batch of invitations, isn't atomic, no two [Bulk-Find-and-Update] operations should overlap. Therefore, a new Time-Trigger should only wake up when the [Bulk-Find-and-Update] operations of all the currently running Time-Triggers have been completed.
  3. Vendor Extensibility - One can integrate with a vendor of choice by implementing either the ISingleDispatchVendor or the IBulkDispatchVendor interface provided in the source code. To integrate with a new Single-Send Vendor (reference implementations: CustomSMTP and MessageBird), an object of type ISingleDispatchVendor is required to be added to your Queue-Trigger serverless compute’s runtime. This is done via the Trigger.cs file that can be found in the provided Azure/AWS Queue-Trigger source code, where you would use the DispatchHandler's Additional-Dispatch-Creator-Strategies constructor argument. To integrate with a new Bulk-Send Vendor (reference implementation: SparkPost), an object of type IBulkDispatchVendor is required to be added to your Time-Trigger serverless compute’s runtime. This is also done via the Trigger.cs file that can be found in the provided Azure/AWS Time-Trigger source code, where you would again use the DispatchHandler's Additional-Dispatch-Creator-Strategies constructor argument.

A single Time-Trigger can only serve one Bulk-Vendor and to configure the Time-Trigger's Bulk-Vendor, one needs to provide the Bulk-Vendor-Name to the Time-Trigger's runtime using the DispatchHandler's "Bulk-Vendor-Name" constructor argument. This is done via the Trigger.cs file of the provided Azure/AWS Time-Trigger source code. Therefore, for one to have two different Bulk-Vendors, one needs to have two different Time-Triggers deployed.
  4. Logging Extensibility - One can provide the desired level of logging (1=Failure; 5=Debug) right from the Trigger.cs file of the provided Azure/AWS Queue-Trigger source code and Azure/AWS Time-Trigger source code. To change the Log-Level property, alter the respective DispatchHandler's constructor argument from 5 to a value of your choice.
Source code's Log-Level = 5

4.9.4 Caching

To improve the performance of the Dispatcher component, the Account-Configuration is cached when the first ever message trigger is fired. These details are then shared amongst all subsequent message trigger invocations by creating a singleton which is made to persist in the serverless compute’s memory for the entire duration of its lifetime. Therefore, any change made in the Account-Configuration via the Account Configuration Management component, warrants a restart of all the existing Queue-Trigger(s). As the Time-Trigger(s) don’t have a dependency on the Account-Configuration, one doesn’t need to restart them.

4.9.5 Dispatch Failure

In the highly unlikely scenario of a dispatch failure, which could occur due to misconfiguration, 3rd-party vendor issues, or an internal exception, the invitation(s) will be lost. This is due to the lightweight nature of this component. However, the Notifications component will make you aware of this in real time or at EOD, while the extensive logging helps you debug the root cause and stop it from recurring.

4.9.6 Dead Letter Queues and Messages

In Azure, if there is an internal exception while dequeueing a message from the Queue Storage, the message will result in an error; however, the message’s dequeue count still increases by one. Such errors are very rare and might occur due to an exception in the PaaS management of the Queue/Azure Function by Azure. Nonetheless, through the MaxDequeueCount set in host.json, the message will be dequeued another 4 times before it moves to a dead letter queue. Azure creates the dead letter queue automatically. For example, if your queue name is “invitations”, these errored messages end up in a queue called “invitations-poison” after they have been attempted for dequeueing the set number of times, which is the MaxDequeueCount property in the host.json file. This queue is not provisioned manually and is automatically created in the same storage account on the first occurrence of a dead letter message.

In the case of AWS, we need to create a dead letter queue ourselves. Here too, messages will throw an error if there is some exception in the PaaS management of the SQS/Lambda by AWS. Additionally, if the Lambda itself gets throttled, any messages that were not accepted by the Lambda instances on dequeueing will also result in an error. You can refer to the official documentation, which captures this issue. However, as with Azure, these messages are attempted to be dequeued a set number of times, which is the Maximum Receives property of the SQS queue that is created as the Lambda’s event source. You set your own choice of value at this deployment step.

Dead letter messages should be a very rare occurrence, even in the case of the AWS Lambda getting throttled; we have set a high visibility timeout for SQS messages to avoid this. Nonetheless, it is advisable to check monthly whether you are seeing any dead letter messages in the Azure poison or AWS dead letter queue.

4.10 Initiator

While this component is optional for CCX/CCE/WCC clients, it is mandatory for stand-alone WXM clients.

This component is responsible for converting a meaningful .csv or .xlsx file, provided by the client, into a valid Dispatch Request API call that initiates the entire Invitation-Delivery workflow beginning at the Dispatch Request component. The meaningful .csv or .xlsx files, from here on referred to as target files, contain the information that allows the component to form the required HTTP request body of the Dispatch Request API. This is done according to the API’s requirements that have been detailed out here.

The structure of the target files is as follows:

  1. Sample .csv Target File:
Name, Email Id, Mobile Number, Agent Email Id, Location
Kaxxl, kxx4@gxxxx.com, 9xxxxxxxx4, xyz@gxxxx.com, Delhi
Kaxxa, kxx4@cxxxx.com, 9xxxxxxxx1, asd@yxxxx.com, Bangalore
  2. Sample .xlsx Target File: delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step10.png

Essentially, each row in the target file apart from the first contains the answer part of the prefilled data that is sent to the Dispatch Request component, and therefore each row translates into a unique invitation. The first row of the target file, also known as the header row, contains column names that each individually represent the question part of the prefilled data. The mapping of these column names to valid XM Question Ids is configured and stored in XM.
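The header-row mapping described above can be sketched as follows. This is illustrative Python only; the `questionId`/`input`/`preFill` field names are assumptions for illustration, not the actual Dispatch Request API schema, and the column-to-question mapping stands in for the one configured in XM.

```python
import csv
import io

def target_file_to_records(csv_text, column_to_question_id):
    """Sketch: each data row of a .csv target file becomes one invitation
    record, whose prefills pair an XM Question Id (from the configured
    mapping) with that row's answer value."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        prefills = [{"questionId": column_to_question_id[col], "input": value}
                    for col, value in row.items() if col in column_to_question_id]
        records.append({"preFill": prefills})
    return records
```

A two-row target file would thus produce two records, i.e. two unique invitations, in the resulting request body.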

4.10.1 Functioning

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step10.png
The starting point for this component is always an AWS S3 bucket to which the clients upload their target files. Once target files are added to the S3 bucket, a serverless compute (an AWS Lambda or an Azure Function) is triggered using S3 Events, which then consumes the newly uploaded target file.

For this component to work, each Dispatch created by the client in XM has to have its own unique folder/directory in the S3 Bucket into which clients will upload the target file along with a mandatory config.json file. The config.json file is uploaded only once and is reused across all the subsequent target file uploads. The config.json is required to have the following structure:

{
    "DispatchId":"5xxxxxxxxxxxxxxxxxxa3624",
    "DispatchReqApi":"https://xxxxxx.xxxxxxx.xxx:8xxx"
}

Here the DispatchId represents the unique identifier of the Dispatch that was created in XM, and the DispatchReqApi represents the URL where the Dispatch Request component has been deployed.

Once the trigger is invoked, the serverless compute downloads both the target file and its corresponding config file from the same folder of the configured S3 bucket. Immediately after the target file has been downloaded, a request is made to S3 to archive only the target file. As a result of this operation, the target file ends up in a sub-folder called Archive inside the Dispatch folder. This is the same Dispatch folder into which the target file was initially uploaded.

After performing null checks and important validations on the corresponding Dispatch, the target file downloaded into the serverless compute’s memory is converted into its equivalent Dispatch Request API call. After an HTTP response is received from the Dispatch Request component for the HTTP request made by this component, all logs are pushed to the database before the compute goes back to sleep. This marks the end of the Initiator component’s workflow.

4.10.2 Source Code

Links to the serverless compute’s source code are presented below:

5. Account configuration management

A web-based application with a user interface is provided to create a new dispatch configuration. The User Guide covers the various features of this front-end application and how they can be configured.

As explained under the “Dispatcher” section, one may configure email or SMS vendor details to send emails or SMS to customers. We have provided out-of-the-box reference implementations for Custom SMTP, SparkPost, and MessageBird. However, partners have the option to add new email or SMS vendors based on the needs of their clients. To enable the Account Configuration Management front end to offer and configure these new vendors, the following can be done.

5.1 Extensibility instructions to add new vendor

A new vendor can be added to the portal manually. Follow the steps below.

Create New Vendor Modal Form

In the file config.html, find the commented section “Add new vendor in a modal here”. Add a new form with a pop-up modal id. Add the required fields as per the API requirements, as shown in the example.

<div class="form__group">
    <label for="vendorAPIKey" class="form__label">Api Key</label>
    <input type="text" id="vendorAPIKey" name="ApiKey" value="" class="form__field" required
    placeholder="Enter Api Key">
</div>

Add Cancel and Save buttons within this form, as in the other vendor forms.

Add a selector input mapping

Map your input fields with the id assigned to them in the form. This has to be done to convert the data to JSON while saving/editing. Ensure the input elements have a name attribute that matches the key of the JSON data that needs to be saved. The above example corresponds to:

data = { ApiKey: '****' }

Create Read Only View

Once the modal form is designed with the required form fields, we now create the read-only view. Add the new vendor name to the dropdown: find the block commented as “Add new Email Vendor name for dropdown selection” and add a new entry for your vendor to appear in the selection. Similarly, add the SMS vendor name to the SMS vendor dropdown.

Now add the block to show the values under New Vendor Read Only View . Add the data to be displayed as shown in the example.

<h4> URL </h4>
<span id="newVendorURL"></span>

Load data on selection

Now that the read-only form is ready, add a function to load the vendor-specific data from the backend, as is done for Sparkpost.

Under the function onEmailSelectChange() in main.js, add a condition to check whether the selected value matches the newly added vendor id in the dropdown. If so, call a function to get the data from the backend. This new function fetches the data via an AJAX API call and sets it on the read-only form elements created in the previous step. Refer to getSparkPostData for help.
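The added branch can be sketched as a lookup from the selected dropdown value to a loader function (vendorDataLoader and the id "newVendorId" are illustrative names; getSparkPostData is the existing reference implementation):

```javascript
// Illustrative sketch of the selection logic added to onEmailSelectChange().
// Returns the loader registered for the selected dropdown value, or null
// when nothing needs to be fetched from the backend.
function vendorDataLoader(selectedVendorId, loaders) {
    return loaders[selectedVendorId] || null;
}

var loaders = {
    sparkpost: function () { /* existing: AJAX call, then populate the read-only view */ },
    newVendorId: function () { /* new vendor: AJAX call, then populate the read-only view */ }
};

// Inside onEmailSelectChange():
//   var load = vendorDataLoader(selectElement.value, loaders);
//   if (load) load();
```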

Open the modal pop up

Adding the vendorName to the dropdown automatically wires the edit button. On clicking the edit button, we need to show the modal pop-up for the user to edit. In config-file.html, add the new vendor related code as below.

Add the new vendor modal with right ID

Get the cancel button id from the form that we just added. Create an onclick handler to clear the form elements within the form.

Add the callback to open the form pop up like shown below.

if (document.getElementById('getVendorSms').value == "newVendorModalId") {
    getNewVendorData();
    $("body").css({"overflow": "hidden"});
    newVendorModal.style.display = "block";
}

Save data from the form

Add a callback to the Save button within the new modal form. This function should validate the data in the form, collect it, serialize it to JSON, and make an API call to the backend. Utility functions like emailFormat, required, and serialize will come in handy for these calls. Use the vendorEmailUpdateAPI() function for reference, and ensure the data is posted to the backend in the required format.
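A sketch of the save callback's data handling, assuming a simple required-field check (buildVendorPayload and its field objects are hypothetical; mirror vendorEmailUpdateAPI() for the exact request format):

```javascript
// Validate the collected inputs, then serialize them for the backend call.
// The `required` flag here stands in for the module's `required` utility.
function buildVendorPayload(fields) {
    var errors = [];
    fields.forEach(function (f) {
        if (f.required && !f.value) {
            errors.push(f.name + " is required");
        }
    });
    if (errors.length) {
        return { ok: false, errors: errors };
    }
    var data = {};
    fields.forEach(function (f) { data[f.name] = f.value; });
    return { ok: true, body: JSON.stringify(data) };
}

// On Save: collect the inputs, and POST result.body to the backend only
// when result.ok is true; otherwise surface result.errors to the user.
```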

6. Logging and Notifications

6.1 Logging

Overview of logging architecture and system components used for logging

Logging is a critical part of the architecture as it contributes heavily to serviceability. For the components in the partner hosted module, all notable events are captured as MongoDB documents and pushed to a MongoDB collection. On the Experience Management end, all events are captured in Event Hub.

There are 2 broad categories of events:

The second category of events is further classified into one of the following log levels: Debug (D), Warning (W), Info (I), Error (E), or Fatal (F):

6.1.1 Dispatch Request Logs

Scenarios Result Level Email to group level Log Message/Email Body
Account Configuration not set-up API request will be rejected E Account Dispatch API cannot process the incoming request because the Dispatches are not yet configured in ACM front-end. Please configure this by logging into Account Configuration Module.
Authentication failed API request will be rejected E Account Authentication Bearer token in the incoming request header is invalid. Please ensure you are using correct user credentials or a valid Authentication Bearer token.
Max Record size Exceeded API request will be rejected E Account The API cannot process more than 18,000 records in a single request. Please split the batch into multiple requests and try again. Total records received is -
max dispatch supported exceeded API request will be rejected E Account The API cannot process more than 15 dispatches in a single request. Please ensure dispatch requests are spread out over time. Total dispatches received is -
Sampling type is not configured Sampling won't happen on the payload W Account Sampling type is not configured, hence all records in the payload are processed. Please ensure sampling type is configured in ACM backend using the “extendedproperties” API.
WXM API Failed for dispatch/questions/DP/Settings/Questionnaire API request will be rejected E Account Dispatch, Delivery Policy, Questionnaire, Active Questions or Settings not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
failed to get dispatch from memory cache or WXM platform Complete batch will fail E Account Dispatch not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
failed to get Delivery Plan from memory cache or WXM platform Complete batch will fail E Account Delivery Policy not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
failed to get Active Questions from memory cache or WXM platform Complete batch will fail E Account Active Questions not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
failed to get Settings from memory cache or WXM platform Complete batch will fail E Account Settings not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
failed to get Survey Questionnaires from memory cache or WXM platform Complete batch will fail E Account Survey Questionnaire not found. Please ensure the dispatch configured on partner hosted side is available in Experience Management.
Exception getting Dispatch/DeliveryPlan/Question from cache, and if not found, from WXM Complete batch will fail F Account Getting API details from MemoryCache failed + { Exception }
Unknown exception within a Batch Request in controller Complete batch will fail F Account Exception in DispatchRequest Controller + {Exception}
Unknown exception while validating the Batch Request Complete batch will fail F Account Exception in ProcessInvitation in DispatchRequest Controller + {Exception}
No hashing algo configured Hashing algo not found from WXM API, so switching to default D Dispatch Algorithm for hashing of PII data is either missing or configured incorrectly. Default hashing algorithm SHA512 is being used for now as fallback. Please check and use a valid algorithm in Experience Management.
Hash Algo configured Valid Hash Algo found D Dispatch Hashing algorithm for PII is configured as: .
Exception while validating each dispatch One to many dispatches can fail F Account Exception in CheckDispatchID + {Exception}
Dispatch id passed is not a valid one created in WXM A dispatch will be ignored from the batch E Account Dispatch passed in the API request is not valid. Please ensure a valid dispatch is configured in Account Configuration Module and same is passed in the API request as well.
A dispatch request is passed which is not configured in the partner hosted end through SPA All records in the dispatch will be rejected E Account Dispatch configuration is missing in Account Configuration Module. Please ensure a valid dispatch is configured in Account Configuration Module and same is passed in the API request as well.
Dispatch is paused A dispatch will be ignored from the batch and not processed E Dispatch Dispatch configured to be used to send invites is paused. Invites will not go out unless this is resolved. Please sign in to Experience Management and un-pause the Dispatch. Also note, any changes in Experience Management may take up to an hour to reflect in Dispatch Request API.
Dp is paused A dispatch will be ignored from the batch E Dispatch Delivery Policy configured under Dispatch to be used to send invites is paused. Invites will not go out unless this is resolved. Please sign in to Experience Management and un-pause the Delivery Policy. Also note, any changes in Experience Management may take up to an hour to reflect in Dispatch Request API.
UniqueId question missing in DeliveryPlan A dispatch will be ignored from the batch E Dispatch Unique User Identifier (UUID) configured in Delivery Policy used in Dispatch is not available in the questionnaire. This will impact fatigue rules and invites may go out to customers for multiple surveys. Please ensure the UUID is added as a pre-fill question in the questionnaire.
No valid dispatch request in the batch found Batch failed E Account No valid dispatch in the batch found. Please setup a valid dispatch configuration in Account Configuration Module.
Unknown Exception in a Dispatch while validation A dispatch out of batch will fail F Account Exception in a Dispatch in CheckDispatchData + {Exception}
Unknown Exception in overall Dispatch while validation Batch failed F Account Exception in CheckDispatchData + {Exception}
Exception validating DP and getting valid channels Batch failed F Account Exception in GetChannelFromDP + {Exception}
Invalid or unsupported channels configured in dispatch A dispatch will be ignored from the batch E Dispatch An error occurred while using the Delivery Policy configured under Dispatch. Please make sure the Delivery Policy is configured correctly with supported channels.
records added to internal queue for bulk import Success D Dispatch {prefill Responses?.Count} records accepted for further validation
All records are rejected due to invalid channel or data A dispatch will be ignored from the batch E Dispatch All the records received in the API requests are rejected. Please ensure the channels and UUID are configured correctly in the Delivery policy in Experience Management. Also ensure the Email or SMS values are sent in correct format.
Invalid prefill for a record Prefill will be skipped and token will be created W Dispatch Some of the prefills with following question IDs are ignored while processing the records. This may be due to missing question in the questionnaire. Please verify and reconfigure this in the Experience Management. <Prefill QID 01> <Prefill QID 02>
Record Rejected due to Invalid or missing Email or Mobile number Tokens for the record will not be created F Report Invites were not sent to the following records from the data file: <UUID 01> <UUID 02>…
Record rejected due to Common Identifier missing Tokens for the record will not be created F Report Invites were not sent for some records with missing email or mobile number. Please check the data file for missing data for more details.
Record is Suppressed Tokens for the record will not be created I Report Invites were not sent to the following records from the data file: <UUID 01> <UUID 02>…
Record is suppressed due to sampling rules For suppressed records, tokens won’t be created I Report Invites were not sent to the following records from the data file as per fatigue rules. Below is a list of records for which invites were not sent. <UUID 01> <UUID 02>…
Bulk request failed due to Inactive or Invalid Dispatch No tokens will be created and hence nothing will be sent. All records in a dispatch will fail E Account Invalid or inactive Dispatch requested. Please ensure the Dispatch is configured correctly in Webex Experience Management and is not paused
Bulk request failed due to Inactive or Invalid DeliveryPlan No tokens will be created and hence nothing will be sent. All records in a dispatch will fail E Account Invalid or inactive DeliveryPlan. Please ensure the DeliveryPlan is configured correctly in Webex Experience Management and is not paused
Bulk request failed due to Invalid token Template No tokens will be created and hence nothing will be sent. All records in a dispatch will fail E Account Invalid Token Template attached to Dispatch. Please ensure the Token Template is configured correctly and added to Dispatch.
Bulk request failed due to Invalid content template No tokens will be created and hence nothing will be sent. All records in a dispatch will fail E Account Invalid Message Template configured in Dispatch. Please ensure Message Template is configured correctly and added to valid channels against each Message.
Bulk request failed to WXM No tokens will be created and hence nothing will be sent. All records in a dispatch will fail E Account Request to send invites failed due to an error. Please check the configuration for Delivery Policy, Token Template, Message Templates and Dispatch in Experience Management.
Queue for batching is missing Whole batch will fail E Dispatch Queue to batch the records for bulk token creation is not found. Please verify and correct the batching queue type using “extendedProperties” API in Account Configuration Management.
x number of record for a batch id is pushed to Queue for bulk token generation Success D Dispatch {x} records added to queue {Queuename} for bulk token generation
Dispatch Completely accepted for processing Success D Dispatch Accepted for processing {count} records
few records from the dispatch is accepted for processing Success D Dispatch Accepted for processing {Count}, Rejected {Count}
Multiple status returned Success D Account Multiple dispatch status returned in response
Error while making the API request to WXM Batch/Token creation failed E Account $"StatusCode: {responseMessage.StatusCode}" + $"ResponseMessage: {responseMessage.ToString()} Url: {url}"
Unknown Exception while making API request to WXM Batch/Token creation failed F Account HTTP Send failed for HTTPWrapper sendAsync + {Exception}
Auth token not generated for Bulk Token API in Background task Token creation failed E Account Authentication Bearer token not generated. Please ensure you are using correct user credentials.
DB update of bulk token Token creation and updated in DB D Account Update to DB completed for bulk token response of size: {Count}
Exception while generating bulk token creation Token creation failed F Account Bulk Token API failed due to unknown exception + {Exception}

6.1.2 Dispatcher Logs

Scenarios Result Level Email to group level Log Message/Email Body
Queue-Trigger has dequeued the invitation Invitation is under processing D Dispatch Dequeued
Queue-Trigger has validated the invitation against Null-Checks Invitation is under processing D Dispatch Validated (Additional Token Parameters: ____)
Queue-Trigger has invalidated the invitation due to failure against Null-Checks Invitation won’t be dispatched E Dispatch Invalidated as Token ID, Batch ID or Dispatch ID is not available
Queue-Trigger has failed to identify invitation’s channel as both email Id and mobile number are attached with the invitation Invitation won’t be dispatched E Dispatch Channel couldn’t be inferred as both email ID and mobile number are available
Queue-Trigger has failed to identify invitation’s channel as both email Id and mobile number aren’t attached with the invitation Invitation won’t be dispatched E Dispatch Channel couldn’t be inferred as both email ID and mobile number are not available
Queue-Trigger has correctly identified invitation’s channel as Email Invitation is under processing D Dispatch Channel inferred as Email
Queue-Trigger has correctly identified invitation’s channel as SMS Invitation is under processing D Dispatch Channel inferred as SMS
Queue-Trigger has failed to find the invitation’s corresponding EventLog-Object from the EventLog collection Invitation won’t be dispatched E Dispatch Corresponding Event Log was not found in the database
Queue-Trigger has successfully found the invitation’s corresponding EventLog-Object from the EventLog collection Invitation is under processing D Dispatch Corresponding Event log was found (id: _____)
Queue-Trigger has prepared the invitation’s Hash-Look-Up Dictionary for performing the PII replacements Invitation is under processing D Dispatch Corresponding Hash Look-Up Dictionary has been configured
Queue-Trigger couldn’t find the invitation’s dispatch details within the AccountConfiguration collection Invitation won’t be dispatched E Dispatch Corresponding Dispatch was not found in the Account Configuration Module
Queue-Trigger has found the dispatch’s vendor name within the AccountConfiguration collection Invitation is under processing D Dispatch Corresponding Dispatch’s Vendor Name has been found (name: ______)
Queue-Trigger couldn’t find the dispatch’s vendor name within the AccountConfiguration collection Invitation won’t be dispatched E Dispatch Corresponding Dispatch’s Vendor Name is missing from the Account Configuration Module
Queue-Trigger has found the vendor details within the AccountConfiguration collection Invitation is under processing D Dispatch Corresponding Vendor Details are available (vendor details: ______)
Queue-Trigger couldn’t find the vendor details within the AccountConfiguration collection Invitation won’t be dispatched E Dispatch Corresponding Vendor Details are missing from the Account Configuration Module
Queue-Trigger has found the vendor type to be Bulk-Send Invitation is under processing D Dispatch Corresponding Vendor is of type Bulk-Send. Invitation will now be inserted into the database.
Queue-Trigger has found the vendor type to be Single-Send Invitation is under processing D Dispatch Corresponding Vendor is of type Single-Send. Invitation will now be prepared for dispatch.
Queue-Trigger/Time-Trigger has found the vendor implementation object in its runtime Invitation is under processing D Dispatch Corresponding Vendor implementation object was found in the serverless compute’s memory (vendor details: ______)
Queue-Trigger/Time-Trigger has failed to find the vendor implementation object in its runtime Invitation won’t be dispatched E Dispatch Corresponding Vendor implementation object was not found in the serverless compute’s memory
Queue-Trigger/Time-Trigger has successfully dispatched the invitation to the required vendor for end-delivery Invitation has been successfully dispatched I Dispatch Successfully Dispatched (via: _______)
Queue-Trigger/Time-Trigger has failed to dispatch the invitation to the required vendor for end-delivery Invitation has failed to dispatch E Dispatch Failed at Dispatch (via: _______)
Queue-Trigger/Time-Trigger has run into an Internal-Exception Invitation has failed to dispatch F Dispatch Internal Exception + Details about the exception
Time-Trigger has read the Bulk-Send invitation from the database for processing Invitation is under processing D Dispatch Read from database into memory (Bulk-Send Vendor: _______)
Time-Trigger has started Bulk-Send invitations will be processed, given there is a presence of such invitations in the BulkMessage collection D Account Time Trigger Serverless Compute has now started
Time-Trigger has ended Bulk-Send invitations have been processed, given there was a presence of such invitations in the BulkMessage collection D Account Time Trigger Serverless Compute has now ended (Invitations Processed: _______)
Time-Trigger started later than intended - W Account Time Trigger Serverless Compute is running late

6.1.3 Initiator Logs

Scenarios Result Level Email to group level Log Message/Email Body
A target file is uploaded to the S3 Bucket Request to initiate a dispatch is accepted by the Initiator D Account A request to initiate a dispatch was received from the S3Bucket
The target file is uploaded to the S3 Bucket but into a directory Request to initiate the dispatch is aborted E Account The dispatch cannot be initiated as the uploaded file wasn’t uploaded into a directory as required
The config.json cannot be retrieved from the S3 Bucket Request to initiate the dispatch is aborted E Account The required config.json couldn’t be retrieved from the S3Bucket. Reason => _______
The config.json retrieved is empty Request to initiate the dispatch is aborted E Account The retrieved config.json was Empty
The config.json retrieved is not empty Request to initiate the dispatch is under processing D Dispatch The required config.json was successfully retrieved from the S3Bucket
The target file cannot be retrieved from the S3 Bucket Request to initiate the dispatch is aborted E Dispatch The uploaded file couldn’t be retrieved from the S3Bucket. Reason => _______
The target file retrieved is empty Request to initiate the dispatch is aborted E Dispatch The retrieved uploaded file was Empty
The target file retrieved is not empty Request to initiate the dispatch is under processing D Dispatch The uploaded file was successfully retrieved from the S3Bucket
The config.json retrieved is configured incorrectly Request to initiate the dispatch is aborted E Dispatch The required config.json hasn’t been configured correctly. Please check again!
The config.json retrieved is configured correctly Request to initiate the dispatch is under processing D Dispatch The required config.json has been configured correctly
Incorrect account credentials are stored in the DB Request to initiate the dispatch is aborted E Dispatch The login into WXM failed. As a result, a Bearer-Token couldn’t be fetched
Correct account credentials are stored in the DB Request to initiate the dispatch is under processing D Dispatch The login into WXM succeeded. As a result, a Bearer-Token was fetched
The config.json contains an unknown/incorrect dispatch-id Request to initiate the dispatch is aborted E Dispatch The config.json specifies an unknown Dispatch-Id
The config.json contains a valid dispatch-id corresponding to a Not-Live dispatch Request to initiate the dispatch is aborted E Dispatch The config.json specifies a Not-Live Dispatch
The config.json contains a valid dispatch-id corresponding to a dispatch that has no prefill-questions Request to initiate the dispatch is aborted E Dispatch The config.json specifies a Dispatch with no associated Questions
The config.json contains a valid dispatch-id corresponding to a dispatch that is Live and has prefill-questions Request to initiate the dispatch is under processing D Dispatch The required Dispatch details, along with its Questions, were successfully fetched from XM
None of the target file headers can be mapped to a WXM Question Request to initiate the dispatch is aborted E Dispatch The uploaded file cannot be processed as none of the headers could be mapped to a Question-Id that belongs to the corresponding Dispatch. Available Headers in XM for the corresponding Dispatch: _______
The target file contains duplicate headers Request to initiate the dispatch is aborted E Dispatch The uploaded file cannot be processed as it has the following duplicate headers: _______
The target file contains no duplicate headers and 1 or more headers are mapped to a WXM Question Request to initiate the dispatch is under processing I Dispatch The uploaded file’s headers are ready for processing. Details => _______ . Available Headers in XM for the corresponding Dispatch: _______
The target file has no valid answer rows (in case of a .csv file) or has no answer rows Request to initiate the dispatch is aborted E Dispatch The uploaded file cannot be processed as none of the rows are correctly configured.
The target file has 1 or more (valid) rows Request to initiate the dispatch is under processing I Dispatch The uploaded file’s rows are ready for processing. Details => _______
The dispatch API server does not return a 2xx response Request to initiate the dispatch is unsuccessful E Dispatch The HTTP Response received
The dispatch API server does return a 2xx response Request to initiate the dispatch is successful I Dispatch The HTTP Response received

6.1.4 How to access Logs using API

We have exposed APIs that can be used to pull the logs from both the partner hosted side and the Experience Management side.

  1. Prerequisites to access EventLog API
    • Experience Management Bearer token should be passed in the request header for authorization
    • Deployment of Web API to get the Base URL
    • Method: POST
    • API URL: Base URL + /api/EventLog
    • Authorization: retrieve a bearer token using the Experience Management LoginToken API and pass it under the “Authorization” header as a bearer token
Not all request parameters are mandatory at once; use them alone or in combination based on the use case. For example, to get the log for a whole request, use the “BatchId” parameter. If the request contained multiple dispatches, use “DispatchId” as well to filter by both BatchId and DispatchId. To search for a specific record, use the “Token” or “Target” field. Use these combinations to narrow down the data. If a whole day's data is required, use the “Created” option.
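As a sketch of those combinations (the helper is illustrative; the parameter names are the ones described above):

```javascript
// Build an EventLog request body containing only the filters the caller
// sets, so results can be narrowed by any combination of parameters.
function eventLogFilter(opts) {
    var body = {};
    ["BatchId", "DispatchId", "Token", "Target", "Created"].forEach(function (k) {
        if (opts[k]) body[k] = opts[k];
    });
    return body;
}

// Whole request:           eventLogFilter({ BatchId: "69afacb7-..." })
// One dispatch in a batch: eventLogFilter({ BatchId: "69afacb7-...", DispatchId: "5e56a098..." })
// POST the result to Base URL + /api/EventLog with the bearer token in the
// Authorization header.
```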
[
    {
        "id": "5e9990b2ddc510a0d4d3cf2c",
        "tokenId": null,
        "deliveryWorkFlowId": "5e569ba530bb351664da3622",
        "dispatchId": "5e56a09830bb351664da3624",
        "batchId": "69afacb7-8cdb-4c1e-ada5-e01c67e27243",
        "user": null,
        "location": "Invitations",
        "created": "2020-04-17T11:19:14.644Z",
        "updated": "0001-01-01T00:00:00Z",
        "events": [
            {
                "timeStamp": "2020-04-17T11:19:14.644Z",
                "channel": 2,
                "action": 1,
                "message": null,
                "targetId": null,
                "eventStatus": null,
                "logMessage": {
                    "message": "Failed due to invalid Email or mobile number",
                    "level": null,
                    "exception": null
                }
            }
        ],
        "target": " 919953973304 ",
        "targetHashed": "sha512:a285024929f2ae46c1abd6014cad64c68f1a4b4190290e8a2262665d84f8d97ea93fa3d8db438f81247e17668f629f53b73295fe0dc35ce47f9c660a9eaef4f8",
        "prefills": [
            {
                "questionId": "5e52b55330bb2cee102b9a39",
                "input": "Kapil - 1",
                "input_Hash": "Kapil - 1"
            },
            {
                "questionId": "5e52b55e30bb2cee102b9a3c",
                "input": " kk04@cisco.com ",
                "input_Hash": "sha512:f0a19d5ea6630aa2f75f86c1d31e5762fe8c9a432a4ec3379b235f062af0e7a8a2b6c5597839af50d8b76bea2526700629cad6eb89d0ca729ebb416c3504478c"
            },
            {
                "questionId": "5e52b56b30bb2cee102b9a3f",
                "input": " 919953973304 ",
                "input_Hash": "sha512:a285024929f2ae46c1abd6014cad64c68f1a4b4190290e8a2262665d84f8d97ea93fa3d8db438f81247e17668f629f53b73295fe0dc35ce47f9c660a9eaef4f8"
            },
            {
                "questionId": "5e52b58230bb2cee102b9a42",
                "input": "123@gmail.com",
                "input_Hash": "123@gmail.com"
            },
            {
                "questionId": "5e54dbdd30bb2c6018f488ca",
                "input": "Delhi",
                "input_Hash": "Delhi"
            }
        ],
        "logMessage": null,
        "tags": [
            "UserData"
        ]
    }
]
  1. 401 Unauthorized in case of a wrong bearer token
  2. Empty response [] in case of a wrong API request

6.2 Notifications

Notifications can be seen as an extension of the built-in logging capabilities. As explained under the “Logging” section, all events are categorized under the DWIEF categories. Here is an architectural diagram providing an overview of the module.

delivery-Policy-screen-shot/invitations-delivery-architecture/Notification-architecture.png

6.2.1 Functionalities

NOTE: Housekeeping of log files is needed at regular intervals based on the storage capacity available on the servers that store logs. The purging of log files has to be done by the admins themselves and is not automated as part of this reference implementation.

6.2.2 Subscribe to Notifications

  1. Dispatch Level Notifications

Users can subscribe to notifications for events that are logged at the Dispatch level. This is achieved by adding the email IDs of subscribers to the specific event category in the partner hosted front end.

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step6.png


There are 2 modes in which the Notifications are delivered.

Note: Real-time notifications can be configured to include both Error and Failure/Fatal events. Alternatively, only Failures can be delivered in real time, with everything else included in the EOD digest.

  2. Account Level Notifications

A few logs are not attached to any Dispatch but apply at the account level. These can be accumulated and sent in real time or as an EOD digest.
One set of email IDs can be configured in the ACM for this category of notifications. All account-level logs will be delivered to these email IDs at EOD. Examples: failure to authenticate the bearer token, or failure to fetch data such as Delivery Policy/Dispatch/Questions from WXM.

delivery-Policy-screen-shot/invitations-delivery-architecture/Notifications-Account-level.png

6.2.3 Logs processing details

{
    "_id" : "5e9969c4fd16580fd0c24753",
    "TokenId" : "CCTRISH-721309",
    "DeliveryWorkFlowId" : null,
    "DispatchId" : "5e9099b7cb93e8c9e8f7119a",
    "BatchId" : "a8b7442a-a85f-424f-9728-d8d7937ffa1b",
    "User" : "cctrish",
    "Location" : null,
    "Created" : ISODate("2020-04-17T08:33:08.586Z"),
    "Updated" : ISODate("2020-04-17T08:33:08.586Z"),
    "Events" : null,
    "Target" : null,
    "TargetHashed" : "example@gmail.com",

    "LogMessage" : {
        "Message" : "Invitation's User-Data-Log-Event not configured => No matching document found in collection",
        "Level" : "Error",
        "Exception" : null
    },
    "Tags" : [
        "Dispatcher"
    ]
}

6.2.4 Configuration Requirements

7. Reporting

The Invitations Delivery module is used to send surveys in bulk through the SMS and Email channels. To improve the performance of an Email or SMS campaign, it is necessary to measure certain metrics that can help improve overall response rates and survey completion rates. These metrics can help in benchmarking the various strategies a brand may try to improve their CX. The reporting module gives the following metrics across different splits.

7.1 Architecture Overview

delivery-Policy-screen-shot/Invitaion-report-images/report-arch.png


7.2 How to download Reports

  1. Log in to ACM, click the “Reports” tab, and request a report by providing the details as per the screenshot below

    delivery-Policy-screen-shot/Invitaion-report-images/report-UI.png


  2. Two types of reports can be generated: the Operations Metrics Report and the Detailed Logs report. The latter can only be obtained through ACM, not through scheduling. The Operations Metrics report gives you various data points across data cuts. The overview sheet is shown below-

    delivery-Policy-screen-shot/Invitaion-report-images/report-excel-overview.png


  3. Reports also contain response and completion rates across different cuts such as Channel, Month, etc. All data in the data store is persisted for 90 days and deleted after that. A report can also be generated with data cuts across parameters as per the use case by configuring the parameter in ACM.

    delivery-Policy-screen-shot/Invitaion-report-images/report-excel-split-question.png


  4. The attributes you select in this menu will be used to split the data in the Operations Metrics Report. Each question you select generates a table of its own in the Excel sheet. Only Single Select WXM Pre-fill questions can be selected for these additional cuts.

    delivery-Policy-screen-shot/Invitaion-report-images/data-slice.png


  5. The detailed logs report contains logs of all the Event Actions that occur after requesting an SMS/Email campaign using customer data. The Event Actions cover the lifecycle of every survey token, from token creation to successful dispatch to the answering of the survey. These events are captured for each survey token.

    delivery-Policy-screen-shot/Invitaion-report-images/detail-log-report-excel.png


    delivery-Policy-screen-shot/Invitaion-report-images/detail-log-report.png


  6. The detailed logs report is also sent via email after requesting it in ACM.

7.3 API details for generating the report

For all these APIs, authentication through the ACM module is mandatory.

7.3.1 Metrics Report API

If the request is accepted by this API, it will return a 200 status code along with the message “Your report has been sent for processing”. If a report job is already running, it will return a BadRequest with the message “A report is being generated right now. It will be emailed to the listed recipients once it’s generated. You can request another report only after this request is completed.” Once your report request is accepted, allow some time for the report to be sent to your email.

Endpoint- “MetricsReport/{OnlyLogs}”

Type- POST

Request body

{OnlyLogs} is a boolean parameter which needs to be passed through the API URL. Set it to true if you want the Detailed Logs report, and to false if you want the Operations Metrics Report.

	{
        /// start date, in dd/MM/yyyy format
        string afterdate
        /// end date, in dd/MM/yyyy format
        string beforedate
	}

Response

Below are some response possibilities-

  1. For a 200 status code, you’ll get the message- “Your report has been sent for processing”
  2. Here are some Bad request scenarios-
    • “Entered date format is not correct. Please enter the date in dd/MM/yyyy format only.”
    • “Unable to convert the entered date range to UTC. Please Re-Login and contact administrator if issue persists”
    • “There are no dates provided in the request. Please enter valid dates and try again.”
    • “Entered date range is too long. Reports can be downloaded for 90 days of date range only.”
    • “No smtp details configured”
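
The date rules above can be mirrored client-side before calling the API. The sketch below is illustrative only; the helper name is hypothetical and simply echoes the Bad Request texts listed above, it is not part of the shipped service:

```python
from datetime import datetime

# Illustrative client-side pre-check mirroring the server's documented rules:
# dates must be in dd/MM/yyyy format and span at most 90 days.
def validate_report_request(afterdate: str, beforedate: str):
    """Return an error message echoing the API's Bad Request texts, or None if valid."""
    if not afterdate or not beforedate:
        return ("There are no dates provided in the request. "
                "Please enter valid dates and try again.")
    try:
        start = datetime.strptime(afterdate, "%d/%m/%Y")
        end = datetime.strptime(beforedate, "%d/%m/%Y")
    except ValueError:
        return ("Entered date format is not correct. "
                "Please enter the date in dd/MM/yyyy format only.")
    if (end - start).days > 90:
        return ("Entered date range is too long. "
                "Reports can be downloaded for 90 days of date range only.")
    return None
```

With a valid pair such as "01/01/2020" and "31/01/2020" the helper returns None, and the body can then be posted to MetricsReport/{OnlyLogs}.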

7.3.2 Fetch already set Prefill Slices for Metrics report

As shown earlier, it is possible to configure our own data slices for the Operations Metrics report. Using this API, we can get all the configured Pre-Fill slices. These Pre-Fills are simply questions configured in WXM. The metrics you get in the Operations Metrics report can be split by the options of a “Select”-type prefill question.

Say we have configured a question called “Product type” whose options are “Product A” and “Product B”. We will then be able to get the operations metrics across these two product categories.

Endpoint- “GetPrefillSlices”

Type- GET

Response-

[{
        /// Unique question identifier from WXM
        string Id 
        /// question note from WXM
        string Note
        /// Type of questions from WXM
        string DisplayType
        /// Question text from WXM
        string Text
        /// Options configured from WXM
        List<string> MultiSelect
    }]
    

7.3.3 Fetch all Qualified Prefill questions in WXM which can be configured as slices

This endpoint will give all the Prefills in WXM which are qualified to be configured for data slicing in the Operations Metrics Report.

Endpoint- “GetQualifiedPrefills”

Type- GET

Response- Same as the Question response in WXM.

7.3.4 Configure data slices for the Operations Metrics

Using this API, we can set the prefills by which the data is sliced in the Operations Metrics report.

Endpoint- “SetPrefillSlices”

Type- POST

Request

[{
        /// Unique question identifier from WXM
        string Id 
        /// question note from WXM
        string Note
        /// Type of questions from WXM
        string DisplayType
        /// Question text from WXM
        string Text
        /// Options configured from WXM
        List<string> MultiSelect
    }]

Response- For 200 status code,

[{
        /// Unique question identifier from WXM
        string Id 
        /// question note from WXM
        string Note
        /// Type of questions from WXM
        string DisplayType
        /// Question text from WXM
        string Text
        /// Options configured from WXM
        List<string> MultiSelect
    }]

For a Bad Request, it will return the error message “Unable to set prefill slices”.
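
To illustrate, the “Product type” example from earlier can be turned into a SetPrefillSlices request body. The question Id below is a placeholder, not a real WXM identifier; in practice it would come from the GetQualifiedPrefills response:

```python
import json

# Hypothetical SetPrefillSlices request body for the "Product type" example;
# Id is a placeholder and would come from GetQualifiedPrefills in practice.
slice_config = [{
    "Id": "000000000000000000000000",   # placeholder WXM question id
    "Note": "product-type",
    "DisplayType": "Select",
    "Text": "Product type",
    "MultiSelect": ["Product A", "Product B"],
}]

body = json.dumps(slice_config)  # POST this body to the SetPrefillSlices endpoint
```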

7.4 Log files- Reporting and Merging

One of the most crucial parts of any application service is the logs it generates. These log files help diagnose and rectify any issue that occurs during the application life cycle. The log files are stored in a separate folder which you can specify in the appsettings.json file. Here are some debugging scenarios using the logs.

7.4.1 Reporting

The reporting service can run on demand or on a schedule, and the report type can be Operations Metrics or Detailed Logs. Logs are generated for all report scenarios. We’ll walk through some common examples of the logs generated by this service.

Success - This is an example of successful report dispatches through email.

delivery-Policy-screen-shot/Invitaion-report-images/report-success.png


Configuration error - Configuration errors come up due to mistakes in configuring the values in the appsettings.json file. For example, if a certain parameter is expected to be an integer but a string value is passed, the app will not run. This can be diagnosed and rectified by checking the generated log file.

delivery-Policy-screen-shot/Invitaion-report-images/config-error.png


Unexpected errors - These errors are rare and unexpected in nature. They are mostly caused by a configuration issue in the account. When these errors come up, it is best to get in touch with the WXM Support team.

delivery-Policy-screen-shot/Invitaion-report-images/unexpected-error.png


7.4.2 Data merging

The data merging job uploads in small batches to ensure that not too much data is loaded into RAM at once, which could crash the application. Every stage of the data upload is logged into the log file.

Success - This is an example of a successful data merging process

delivery-Policy-screen-shot/Invitaion-report-images/data-merging-success.png


Error - As we saw in the Reporting logs section above, errors can come up for multiple reasons during the application life cycle. Here’s an example of the data merging job with an authentication error in WXM.

delivery-Policy-screen-shot/Invitaion-report-images/data-merging-error.png

7.5 Benchmarks

7.5.1 Reporting

Due to the large number of logs generated and the elaborate nature of the report, it takes some time to generate the report. The report is delivered to the email provided through ACM. Below is the benchmark for the time taken to get the Operations Metrics Report and the Detailed Logs report. Most reports are delivered within 5-10 minutes.

delivery-Policy-screen-shot/Invitaion-report-images/report-benchmark.png


7.5.2 Merging

The data merger is a batch process that pulls data from WXM and Invitations Delivery to merge the log for each survey request. Below is the benchmark for the time taken to finish one round of data merging, depending on the size of the CX program. The date filter used to fetch the data for merging moves on a rolling basis. The frequency at which the merging happens and the amount of data to merge in each cycle can be configured.

delivery-Policy-screen-shot/Invitaion-report-images/report-data-merging.png


8. Serviceability

For any serviceability aspects, we refer to the EventLogs captured in the database to figure out where exactly the issue is. How to call the EventLog API is covered in the Logging section of this document. These are the parameters which the EventLog API accepts:

{
  "BatchId": "",
  "DispatchId": "",
  "Token": "",
  "Created": "",
  "Target": ""
}

The more information we have on these fields, the easier the tracking will be. Here are some of the sample issues which can be resolved using the EventLog API.
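
As a convenience, the request body can be assembled from whatever identifiers you have, leaving the rest empty as in the samples below. This is a hypothetical helper, not part of the product:

```python
# Hypothetical helper that assembles an EventLog API request body, filling in
# only the fields you actually know; unspecified fields stay empty strings,
# matching the request shape documented above.
def build_eventlog_request(**known):
    fields = ["BatchId", "DispatchId", "Token", "Created", "Target"]
    return {f: known.get(f, "") for f in fields}
```

For example, build_eventlog_request(Token="XM-1369420") produces the token-only request used in section 8.3.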

8.1 Dispatch request payload is rejected entirely

This can be investigated in two ways. If you have access to the Dispatch API response, that message will have information on why the particular dispatch was rejected:

{
    "batchId": "1ed178ff-87ad-4aa1-ab05-86b649ab85b9",
    "statusByDispatch": [
        {
            "dispatchId": "5e7ceb1ccb93e80e60b31bf",
            "message": "Dispatch Invalid",
            "dispatchStatus": "400"
        }
    ]
}

In this case, it’s an invalid dispatch, so the immediate thing to check is the parameter called “dispatchId” which is passed in the API request. Check the dispatchId value, correct it, and pass it in the API request.

If the Dispatch API response looks like the one below, it can again be resolved from the message in the Dispatch API response itself.

{
    "batchId": "432sdff-243cs-53vc-sds3-dfvfr324dwdd",
    "statusByDispatch": [
        {
            "dispatchId": "5e7ceb1ccb93e80e60b31bf",
            "message": "DispatchID not present in SPA record.",
            "dispatchStatus": "400"
        }
    ]
}

In this case, your “dispatchId” is valid but not configured at all in the ACM front-end. To resolve this, log in to your ACM portal and configure this dispatch. It can take up to 1 hour for this to reflect in the Dispatch API due to caching of the response (explained in this section). If this is required immediately in the Dispatch API, the Dispatch API module needs to be restarted.

Alternatively, you can also call the EventLog API by passing the BatchId in the request and analysing the response.

{
  "BatchId": "1ed178ff-87ad-4aa1-ab05-86b649ab85b9",
  "DispatchId": "",
  "Token": "",
  "Created": "",
  "Target": ""
}

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step7.png


8.2 Dispatch request is only partially accepted

In this case, the dispatchStatus will be 206, which means partially processed. The response will also have a message stating how many records were rejected and how many were accepted for processing.

“message”: “Accepted for processing 1 Rejected: 18”

Now, make a call to EventLog API with all the available information.

{
  "BatchId": "8920cd83-11b2-489b-ae34-6b152b2440f8",
  "DispatchId": "5e7ceb1ccb93e80e60b31bf3",
  "Token": "",
  "Created": "22-04-2020",
  "Target": ""
}

While glancing through the logs, look at the “events” section of each record; it will have an “action” field. In this case, action = 1, which means rejected. To know the reason for rejection, see the “message” field under “logMessage”.

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step8.png


In this case, it failed due to an invalid email or mobile number. To verify, look at the prefills section of the same object. Here, the email is passed as “dummycisco.com”, which is invalid and should be corrected in the request payload before calling the Dispatch API again.

delivery-Policy-screen-shot/invitations-delivery-architecture/invitation-delivery-architecture-step9.png

Another failure message could be “Failed due to no Common Identifier or Channel”, where the action will also be 1. In this case, verify that the prefill declared in Experience Management as the UUID is available in the prefills section. Also check whether the value of this prefill is valid.
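
When triaging a partially accepted batch, the rejected records can also be pulled out of the EventLog response programmatically. The sketch below is an assumption-laden illustration: it assumes the response is a list of records shaped like the ones shown above, with an "events" array whose entries carry "action" and "logMessage" in the lower-camel-case form this section describes (casing may differ in your deployment):

```python
# Sketch: collect rejection reasons from an EventLog API response. Field
# names/casing follow the "events"/"action"/"logMessage" form described in
# the text above and may differ in your deployment.
def rejection_reasons(records):
    reasons = []
    for record in records:
        for event in record.get("events") or []:
            if event.get("action") == 1:  # action = 1 means rejected
                log = event.get("logMessage") or {}
                reasons.append(log.get("message", "unknown reason"))
    return reasons
```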

8.3 Verify Invitation Delivery and Token Journey

To quickly verify the status of an email/SMS dispatch for a recipient, all you need is the Token Number (e.g. XM-1369420) of the invitation that you are looking to track. Using just this token number, call the EventLog API with the following request:

{
  "BatchId": "",
  "DispatchId": "",
  "Token": "XM-1369420",
  "Created": "",
  "Target": ""
}


Now glance through the returned API response, which is an array of LogEvent objects, and look for the LogEvent object with Tags=[“UserData”]. Below is an example of the response returned (with sensitive details redacted as ‘xxxxxxxx’):
[
{
    "_id" : "5eba4ae62a90fe2f5c9be08d",
    "TokenId" : "XM-1369420",
    "DeliveryWorkFlowId" : "5ea5d9c3cb93f425b8ec553f",
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
    "User" : null,
    "Location" : "XYZ",
    "Created" : ISODate("2020-05-12T07:06:14.951Z"),
    "Updated" : ISODate("2020-05-12T07:10:26.942Z"),
    "Events" : [
        {
            "TimeStamp" : ISODate("2020-05-12T07:06:14.951Z"),
            "Channel" : 2,
            "Action" : 0,
            "Message" : null,
            "TargetId" : null,
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T07:08:04.697Z"),
            "Channel" : 2,
            "Action" : 2,
            "Message" : null,
            "TargetId" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T07:10:26.622Z"),
            "Channel" : 0,
            "Action" : 6,
            "Message" : "Message Template Id: 5ea5d7cecb93f425b8ec5536 | Additional Token Parameters: &c=0&n=0",
            "TargetId" : "xxxxxxxxxx@gmail.com",
            "EventStatus" : null,
            "LogMessage" : {
                "Message" : "Successfully Dispatched (via: CustomSMTP)",
                "Level" : "Information",
                "Exception" : null
            }
        }
    ],
    "Target" : "kar",
    "TargetHashed" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : "xxxxxxxxxx",
            "Input_Hash" : "xxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : "kar",
            "Input_Hash" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966"
        },
        {
            "QuestionId" : "5d1d1c4fdc2a7799007f5947",
            "Input" : "fr-FR",
            "Input_Hash" : "fr-FR"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : "xxxxxxxxxx@gmail.com",
            "Input_Hash" : "sha512:24162ec81e197070bfe9c530bf2467bd3d69dbec1f84a5892e1de259bc6f77d661a2c51e53719ea54ffe7bca2437b345d8905a5c8abc5edc82634e8548a55ad2"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : "5ea5d9c3cb93f425b8ec553f",
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
            "Input_Hash" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1"
        }
    ],
    "LogMessage" : null,
    "Tags" : [
        "UserData"
    ]
},
{
    "_id" : "5eba4bddc702fd000b94f3eb",
    "TokenId" : "XM-1369420",
    "DeliveryWorkFlowId" : null,
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
    "User" : "xm",
    "Location" : null,
    "Created" : ISODate("2020-05-12T07:10:21.267Z"),
    "Updated" : ISODate("2020-05-12T07:10:21.267Z"),
    "Events" : null,
    "Target" : null,
    "TargetHashed" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : null,
            "Input_Hash" : "xxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : null,
            "Input_Hash" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966"
        },
        {
            "QuestionId" : "5d1d1c4fdc2a7799007f5947",
            "Input" : null,
            "Input_Hash" : "fr-FR"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : null,
            "Input_Hash" : "sha512:24162ec81e197070bfe9c530bf2467bd3d69dbec1f84a5892e1de259bc6f77d661a2c51e53719ea54ffe7bca2437b345d8905a5c8abc5edc82634e8548a55ad2"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : null,
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : null,
            "Input_Hash" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1"
        },
        {
            "QuestionId" : "Token",
            "Input" : null,
            "Input_Hash" : "XM-1369420"
        }
    ],
    "LogMessage" : {
        "Message" : "Dequeued",
        "Level" : "Debug",
        "Exception" : null
    },
    "Tags" : [
        "Dispatcher"
    ]
},
{
    "_id" : "5eba4bddc702fd000b94f3ec",
    "TokenId" : "XM-1369420",
    "DeliveryWorkFlowId" : null,
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
    "User" : "xm",
    "Location" : null,
    "Created" : ISODate("2020-05-12T07:10:21.302Z"),
    "Updated" : ISODate("2020-05-12T07:10:21.302Z"),
    "Events" : null,
    "Target" : null,
    "TargetHashed" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : null,
            "Input_Hash" : "xxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : null,
            "Input_Hash" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966"
        },
        {
            "QuestionId" : "5d1d1c4fdc2a7799007f5947",
            "Input" : null,
            "Input_Hash" : "fr-FR"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : null,
            "Input_Hash" : "sha512:24162ec81e197070bfe9c530bf2467bd3d69dbec1f84a5892e1de259bc6f77d661a2c51e53719ea54ffe7bca2437b345d8905a5c8abc5edc82634e8548a55ad2"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : null,
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : null,
            "Input_Hash" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1"
        },
        {
            "QuestionId" : "Token",
            "Input" : null,
            "Input_Hash" : "XM-1369420"
        }
    ],
    "LogMessage" : {
        "Message" : "Validated (Additional Token Parameters: &c=0&n=0)",
        "Level" : "Debug",
        "Exception" : null
    },
    "Tags" : [
        "Dispatcher"
    ]
}
...
{
    "_id" : "5eba4be2c702fd000b94f3f4",
    "TokenId" : "XM-1369420",
    "DeliveryWorkFlowId" : null,
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
    "User" : "xm",
    "Location" : null,
    "Created" : ISODate("2020-05-12T07:10:26.622Z"),
    "Updated" : ISODate("2020-05-12T07:10:26.622Z"),
    "Events" : null,
    "Target" : null,
    "TargetHashed" : "kar",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : null,
            "Input_Hash" : "xxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : null,
            "Input_Hash" : "kar"
        },
        {
            "QuestionId" : "5d1d1c4fdc2a7799007f5947",
            "Input" : null,
            "Input_Hash" : "fr-FR"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : null,
            "Input_Hash" : "xxxxxxxxxx@gmail.com"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : null,
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : null,
            "Input_Hash" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1"
        },
        {
            "QuestionId" : "Token",
            "Input" : null,
            "Input_Hash" : "XM-1369420"
        }
    ],
    "LogMessage" : {
        "Message" : "Successfully Dispatched (via: CustomSMTP)",
        "Level" : "Information",
        "Exception" : null
    },
    "Tags" : [
        "Dispatcher"
    ]
}
]


Next, investigate the Events field of this LogEvent object, which is an array of InvitationLogEvent objects, to understand the status of the token.
{
    "_id" : "5eba4ae62a90fe2f5c9be08d",
    "TokenId" : "XM-1369420",
    "DeliveryWorkFlowId" : "5ea5d9c3cb93f425b8ec553f",
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
    "User" : null,
    "Location" : "XYZ",
    "Created" : ISODate("2020-05-12T07:06:14.951Z"),
    "Updated" : ISODate("2020-05-12T07:10:26.942Z"),
    "Events" : [
        {
            "TimeStamp" : ISODate("2020-05-12T07:06:14.951Z"),
            "Channel" : 2,
            "Action" : 0,
            "Message" : null,
            "TargetId" : null,
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T07:08:04.697Z"),
            "Channel" : 2,
            "Action" : 2,
            "Message" : null,
            "TargetId" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T07:10:26.622Z"),
            "Channel" : 0,
            "Action" : 6,
            "Message" : "Message Template Id: 5ea5d7cecb93f425b8ec5536 | Additional Token Parameters: &c=0&n=0",
            "TargetId" : "xxxxxxxxxx@gmail.com",
            "EventStatus" : null,
            "LogMessage" : {
                "Message" : "Successfully Dispatched (via: CustomSMTP)",
                "Level" : "Information",
                "Exception" : null
            }
        }
    ],
    "Target" : "kar",
    "TargetHashed" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : "xxxxxxxxxx",
            "Input_Hash" : "xxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : "kar",
            "Input_Hash" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966"
        },
        {
            "QuestionId" : "5d1d1c4fdc2a7799007f5947",
            "Input" : "fr-FR",
            "Input_Hash" : "fr-FR"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : "xxxxxxxxxx@gmail.com",
            "Input_Hash" : "sha512:24162ec81e197070bfe9c530bf2467bd3d69dbec1f84a5892e1de259bc6f77d661a2c51e53719ea54ffe7bca2437b345d8905a5c8abc5edc82634e8548a55ad2"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : "5ea5d9c3cb93f425b8ec553f",
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1",
            "Input_Hash" : "dfd28c8f-2b6a-4e12-9590-a9c685ecf9c1"
        }
    ],
    "LogMessage" : null,
    "Tags" : [
        "UserData"
    ]
}


As you can see, there is an InvitationLogEvent with Channel=0 (Email) and Action=6, which means that the email was successfully dispatched to the email vendor (in this case CustomSMTP) for end delivery. Similarly, in the case of reminders for this token, more InvitationLogEvent objects will be added to this same array. The Action for subsequent reminders will still be 6 (Dispatch Successful) or 7 (Dispatch Unsuccessful), and the Channel will still be 0 (for Email) or 1 (for SMS). However, the difference is demarcated in the Message field of the InvitationLogEvent that gets added.

The Message field within the InvitationLogEvent reveals the Additional Token Parameters that contain details about the reminder level. Here, n=0 means that this token was the first invitation delivered to the recipient. Thus any additional invitation delivery (i.e. any reminder) can be identified via the additional parameters mentioned in the Message field. Therefore, the Events field within the concerned LogEvent gives all the necessary details regarding the entire journey of the token.
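
The documented codes can be decoded mechanically when scanning a token's Events array. The mapping below covers only the codes explained in this document (Channel 0/1, Action 6/7, plus the rejection code 1 from the previous section); any other code falls through to "unknown" rather than being guessed at:

```python
# Decode only the Channel/Action codes documented in this section; other
# codes observed in raw logs are reported as "unknown".
CHANNELS = {0: "Email", 1: "SMS"}
ACTIONS = {1: "Rejected", 6: "Dispatch Successful", 7: "Dispatch Unsuccessful"}

def summarize_events(events):
    """Return one human-readable line per InvitationLogEvent."""
    return [
        f"{e.get('TimeStamp')} | {CHANNELS.get(e.get('Channel'), 'unknown')} | "
        f"{ACTIONS.get(e.get('Action'), 'unknown')}"
        for e in events
    ]
```

Applied to the Events array above, the final entry would read as an Email channel event with a Dispatch Successful action.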

Subsequently, in the rare case of a Dispatch Unsuccessful (Action=7), look for the LogMessage field, as it will have all the details regarding why the failure occurred. An example is provided below:

{
    "_id" : "5eba2ab7a531f21d80e2c328",
    "TokenId" : "XM-1369421",
    "DeliveryWorkFlowId" : "5ea5d9c3cb93f425b8ec553f",
    "DispatchId" : "5eb32477cb93f24c2c3812ed",
    "BatchId" : "8fc78fa8-fd87-4c39-9909-658ede102388",
    "User" : null,
    "Location" : "XYZ",
    "Created" : ISODate("2020-05-12T04:48:55.526Z"),
    "Updated" : ISODate("2020-05-12T04:51:45.221Z"),
    "Events" : [
        {
            "TimeStamp" : ISODate("2020-05-12T04:48:55.526Z"),
            "Channel" : 2,
            "Action" : 0,
            "Message" : null,
            "TargetId" : null,
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T04:50:20.550Z"),
            "Channel" : 2,
            "Action" : 2,
            "Message" : null,
            "TargetId" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
            "EventStatus" : null,
            "LogMessage" : null
        },
        {
            "TimeStamp" : ISODate("2020-05-12T04:51:44.915Z"),
            "Channel" : 0,
            "Action" : 7,
            "Message" : "Message Template Id: 5ea5d7cecb93f425b8ec5536 | Additional Token Parameters: &c=0&n=0",
            "TargetId" : "xxxxxxxxxxxxx@gmail.com",
            "EventStatus" : null,
            "LogMessage" : {
                "Message" : "Corresponding Vendor Details are missing from the Account Configuration Module",
                "Level" : "Error",
                "Exception" : null
            }
        }
    ],
    "Target" : "kar",
    "TargetHashed" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966",
    "Prefills" : [
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5929",
            "Input" : "xxxxxxxxxxxxx",
            "Input_Hash" : "xxxxxxxxxxxxx"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5923",
            "Input" : "kar",
            "Input_Hash" : "sha512:9da6363c5a071ebbac86bc2b90f1ed8203ee89866a25d6527401121d9b393159b309e990e090fbe3fe5541453a381222a3c17d2facaba9d08f04e1c6bc7ed966"
        },
        {
            "QuestionId" : "5d1d1c4cdc2a7799007f5926",
            "Input" : "xxxxxxxxxxxxx@gmail.com",
            "Input_Hash" : "sha512:24162ec81e197070bfe9c530bf2467bd3d69dbec1f84a5892e1de259bc6f77d661a2c51e53719ea54ffe7bca2437b345d8905a5c8abc5edc82634e8548a55ad2"
        },
        {
            "QuestionId" : "5d6f7655f482c52f30b70870",
            "Input" : "5ea5d9c3cb93f425b8ec553f",
            "Input_Hash" : "5ea5d9c3cb93f425b8ec553f"
        },
        {
            "QuestionId" : "5ea7c363cb93eae944bbe627",
            "Input" : "8fc78fa8-fd87-4c39-9909-658ede102388",
            "Input_Hash" : "8fc78fa8-fd87-4c39-9909-658ede102388"
        }
    ],
    "LogMessage" : null,
    "Tags" : [
        "UserData"
    ]
}