Using Webhooks with Dynamic Content

Sean Ives
September 30, 2019
9 mins
Engineering

Recipe for integrating with Dynamic Content using webhooks

Dynamic Content is great at managing content, but sometimes you might want to include other components in your architecture such as a search index, database or a file server. In these cases, it can be useful to synchronise content produced in Dynamic Content with these other systems. Webhooks provide a way to do just that.

TL;DR

Algolia is a hosted search engine that allows you to create a fast, as-you-type search experience with instant results.

Using Algolia to index your data in Amplience Dynamic Content provides a method for returning instant search results, which are nicely formatted and contain both your images and descriptive text.

In this article we talk about how we created a simple integration between Algolia and Dynamic Content (DC), which uses webhooks to keep an Algolia index in sync with content items as they are published in DC.

This article is a good starting point if you want to create your own integration between Dynamic Content and another third party service, such as Algolia. We provide a complete example of a third-party integration on GitHub.

What exactly are Webhooks then?

Webhooks are a means by which a web application can notify another of an event and pass information about the event. For this to work, your application must first 'subscribe' to certain events that it wants to be notified of.

In this context the web application is an ‘observer’ and will be automatically invoked whenever an event is broadcast by Dynamic Content, which is the ‘subject’ of the webhook.

You may have come across this pattern, known as the Observer Pattern, before. A webhook is simply an implementation of this pattern between two services over HTTP.
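
In our case the notification arrives as an HTTP POST with a JSON body describing the event. Based on the fields we make use of later in this article, the body of a snapshot published event looks roughly like the following sketch (real payloads carry additional fields):

```typescript
// A simplified sketch of the snapshot published webhook body, based only on
// the fields used later in this article; real payloads contain more data.
interface SnapshotPublishedWebhookBody {
  name: 'dynamic-content.snapshot.published';
  payload: {
    id: string;                       // id of the published snapshot
    rootContentItem: { id: string };  // id of the content item at the root of the snapshot
  };
}
```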

What do we need for a webhook integration?

Simply put, a web application. This will subscribe to certain events and then act upon their contents when it receives a notification: perhaps making a call to an API, pushing a message onto a queue, or almost anything else.

We also need a way of building, deploying and serving the application. To speed up development, it would be nice if we could do this locally on our development machine, so we need some way of making the web application externally visible to the system we are observing (in our case, Dynamic Content).

Creating a web application

For our web application we will be using Node.js and Express.js. We will also be using TypeScript, but feel free to follow along at home in vanilla JavaScript if you prefer.

Pre-requisites

Before starting, you'll need to download and run the Node.js installer for your platform: download the latest LTS version. Alternatively, you can install Node.js via your package manager.

Setting up the project

We'll start by creating a bare-bones Express application, get this running on our local machine and then iteratively add new features:

  • add a webhook handler

  • add calls to Dynamic Content to fetch content

  • add a call to the Algolia index to add our fetched content

  • add validation

We'll call our app `dc-integrations-algolia` and create a folder of the same name in the current working directory. Under this we'll create some folders for our source and test files:

```bash
$ mkdir dc-integrations-algolia
$ cd dc-integrations-algolia
$ mkdir src
$ mkdir test
```
We'll also need to create the essential files for the Express application:
The `express-application.ts` file sets up Express and configures the router, error handling and any additional middleware that we want. We'll define a bare-bones implementation for now:
```typescript
import * as express from 'express';

export default (): express.Application => {
  const app = express();
  const router = express.Router();

  app.use('/', router);
  return app;
};
```
The `index.ts` file is the main entry point for our application and is where we start our Express app to listen for connections:
```typescript
import getApp from './express-application';

(async (): Promise<void> => {
  const PORT: number = Number(process.env.PORT) || 3000;

  const app = getApp();
  app.listen(PORT, (): void => console.log(`Listening on port ${PORT}!`));
})().catch(
  (err): void => {
    console.log('Unexpected error whilst setting up app', err);
  }
);
```

Again, this is just bare-bones at the moment; we'll add more functionality in here for validation and configuration later.

The other two files, `package.json` and `tsconfig.json`, define the npm package dependencies and configure the TypeScript root files and compiler options, respectively. We won't go into the specifics of these here; instead we suggest that you download copies of these files from the full example integration on GitHub.
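
If you'd rather set the project up from scratch, the dependencies used in the rest of this article can be installed with npm. A rough sketch, assuming the usual package names (pin the versions to match the example `package.json`, which also defines the `start` script seen in the next section):

```bash
# Runtime dependencies (package names assumed; versions as per the example on GitHub)
$ npm install express class-validator algoliasearch dc-management-sdk-js
# TypeScript tooling used by the start script
$ npm install --save-dev typescript ts-node tsconfig-paths @types/express
```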

Running the application locally

To run the application locally, run the following (the default port is 3000):

```bash
$ npm run start
```

If you require this app to run on a different port, you can set the `PORT` environment variable:

```bash
$ PORT=1337 npm run start
```

If all is well, you should see output similar to the following:

```
> dc-integrations-algolia@1.0.0 start /Users/myuser/dc-integrations-algolia
> ts-node -r tsconfig-paths/register src/index.ts

Validating credentials
Credentials validated
Listening on port 3000!
```

You can check that your server is running by pointing your browser to http://localhost:3000. It will display `Cannot GET /` because we have not configured any routes yet.

Exposing your local port using ngrok

We'll use `ngrok` to make our locally running application visible to the outside world. This will allow it to be invoked by the Dynamic Content webhook handler.

Ngrok is reverse proxy tunnelling software which establishes a secure tunnel from a public endpoint (i.e. what will be our webhook 'callback' URL) to a locally running network service (our integration application).

Ngrok also provides a web UI where you can inspect all HTTP traffic running over your tunnels, which is immensely useful when it comes to debugging webhooks.

You can sign up for an Ngrok account for free and then download and install the version of Ngrok suitable for your platform. The setup and installation steps will show you the generated authtoken for your Ngrok account.

You can run the following command to add your account's authtoken to your `ngrok.yml` file. This will give you access to more features, and all open tunnels will be listed in the Ngrok dashboard:

```bash
$ ./ngrok authtoken <YOUR_AUTHTOKEN>
```

Then you just need to run one command to expose your chosen HTTP port (in this case we are running our application on port 3000):

```bash
$ ./ngrok http 3000
```
This should give you output similar to the following:

```
ngrok by @inconshreveable                                          (Ctrl+C to quit)

Session Status                online
Account                       Myuser (Plan: Free)
Version                       2.3.34
Region                        United States (us)
Web Interface                 http://127.0.0.1:4040
Forwarding                    http://33abcdef.ngrok.io -> http://localhost:3000
Forwarding                    https://33abcdef.ngrok.io -> http://localhost:3000

Connections                   ttl     opn     rt1     rt5     p50     p90
                              0       0       0.00    0.00    0.00    0.00
```

Adding a webhook handler

We'll create a folder to contain the TypeScript class files for our webhook processor and route handling:

```bash
$ cd ~/dc-integrations-algolia/src
$ mkdir webhooks
```

The Application Coordinator pattern

The default architecture for most web applications, Model View Controller (MVC), can begin to break down as Controllers become bloated and logic begins to creep into the view templates.

The Coordinator pattern addresses this problem by adding another layer of abstraction: a class representing the state of the view (this is usually called the 'Presenter'). The presenter forms a contract between components with different responsibilities.

We'll use this pattern to ensure that we explicitly handle all outcomes in our logic (i.e. responding differently depending on whether it is a success or failure scenario). This will help separate out the route handling logic in one component from the business logic which processes the webhook in another component.

We'll go into more details about the presenter interface when we talk about the webhook processor.

The webhook processing

In the webhooks directory, create a file named `snapshot-published-webhook.ts`. This contains the class responsible for handling the processing of our webhook:

```typescript
export class SnapshotPublishedWebhook {

  public static async processWebhook(
    request: SnapshotPublishedWebhookRequest,
    presenter: SnapshotPublishedWebhookPresenter
  ): Promise<void> {
    // ...
  }

}
```

In this case we've named our class `SnapshotPublishedWebhook` since it responds to Dynamic Content snapshot publishing events. This will allow us to index any new content when it is published.

On our class we define a method, `processWebhook`, that will be called by our route handler function (which will in turn be invoked when a webhook is received from Dynamic Content).

The method is passed two parameters: a copy of the request that was received by the webhook endpoint and the presenter used to notify the route handler whenever an exit point (failure or success) is reached during the webhook processing logic.
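
The presenter is just an interface with one method per outcome (success, or each kind of failure). Based on the calls made in the rest of this article, it might look roughly like the following sketch; the exact declaration in the GitHub example may differ:

```typescript
// A sketch of the presenter interface, inferred from the calls used later in
// this article; the example on GitHub may declare it differently (e.g. with a
// generic return type). WebhookRequest is the model class defined below.
export interface SnapshotPublishedWebhookPresenter {
  invalidWebhookRequestError(webhook: WebhookRequest): void;
  unsupportedWebhookError(webhook: WebhookRequest): void;
  dynamicContentRequestError(error: Error): void;
  noMatchingContentTypeSchemaError(schema: string, schemaWhitelist: string[]): void;
  noMatchingContentTypePropertiesError(properties: string[], propertyWhitelist: string[]): void;
  algoliaSearchRequestError(error: Error): void;
  successfullyAddedToIndex(algoliaIndexName: string, addedObject: object): void;
}
```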

The webhook processor must carry out the following operations:

  • validate the fields of the webhook request, also ensuring that the webhook event is the one we're expecting

  • create a connection to Dynamic Content and retrieve the snapshot

  • use the snapshot to retrieve the version of the content item related to the snapshot

  • check that the content item type is in the allowed 'type whitelist'

  • extract just the properties that we want to index from the content item using the 'property whitelist'

  • add the extracted properties to the Algolia index

Validating the webhook event type

We use the popular Class Validator library to validate the fields of the webhook request. This works using decorators that we define on our model:

```typescript
import { IsNotEmpty, IsString, ValidateNested } from 'class-validator';

export class WebhookRequest {
  @IsNotEmpty()
  @IsString()
  public name: string;

  @ValidateNested()
  @IsNotEmpty()
  public payload: WebhookSnapshot;
}
```

It is then fairly simple to validate the fields and report any errors by invoking the presenter:

```typescript
const validationErrors = await validate(request.webhook);

if (validationErrors.length > 0) {
  return presenter.invalidWebhookRequestError(request.webhook);
}
```

After ensuring that our fields are valid, we must also check that the received webhook event type matches the one we were expecting (just in case the webhook has been misconfigured in Dynamic Content):

```typescript
if (request.webhook.name !== 'dynamic-content.snapshot.published') {
  return presenter.unsupportedWebhookError(request.webhook);
}
```

Again, we invoke the presenter to return an error if the event is not a snapshot published event.

Creating a connection to Dynamic Content

We need to use the Dynamic Content SDK to create a client for the Dynamic Content API. We use the OAuth2 credentials (id and secret) passed into our processor to initialise the connection:

```typescript
const clientCredentials: OAuth2ClientCredentials = {
  client_id: request.dynamicContent.clientId,
  client_secret: request.dynamicContent.clientSecret
};
const dynamicContent = new DynamicContent(clientCredentials, request.dcConfig);
```

We also pass configuration details containing the URLs for the Dynamic Content API and the Amplience Authentication Service.

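The exact shape of this configuration object depends on the SDK; as a rough sketch, assuming the SDK accepts `apiUrl` and `authUrl` overrides (the environment variable names below are purely illustrative):

```typescript
// A rough sketch of the configuration passed alongside the credentials.
// Assumes the SDK accepts apiUrl/authUrl overrides; the environment
// variable names are illustrative, not part of the SDK.
const dcConfig = {
  apiUrl: process.env.DC_API_URL,   // Dynamic Content API base URL
  authUrl: process.env.DC_AUTH_URL  // Amplience Authentication Service URL
};
```
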
Retrieving the snapshot

Next, we'll fetch the snapshot (a representation of the content graph for a particular version of a content item) from the Dynamic Content API, and then use a templated HAL link on the snapshot to fetch the correct version of the content item corresponding to our snapshot (the 'root' content item on the snapshot):

```typescript
let contentItem;
try {
  const snapshot = await dynamicContent.snapshots.get(request.webhook.payload.id);
  contentItem = await snapshot.related.snapshotContentItem(request.webhook.payload.rootContentItem.id);
} catch (err) {
  return presenter.dynamicContentRequestError(err);
}
```

Again, we invoke the presenter to report any error.

Whitelist filtering

Now that we have the correct version of the content item, we need to check that its content type is one of the ones we are expecting and exclude it if not. This narrows down the amount of content that will be indexed, which helps us avoid exceeding the maximum capacity of the Algolia index.

We will keep a whitelist of content types which we handle in our web application: we need to check that the type of our fetched content item is in the whitelist. We do this using the schema identifier:

```typescript
const contentItemSchema = contentItem.body._meta.schema;
const contentTypeWhitelist = request.dynamicContent.contentTypeWhitelist;
if (!SnapshotPublishedWebhook.isContentTypeSchemaInWhitelist(contentItemSchema, contentTypeWhitelist)) {
  return presenter.noMatchingContentTypeSchemaError(contentItemSchema, contentTypeWhitelist);
}
```

If the type is not in the whitelist, we report that we've filtered out the unknown content type via the presenter.

To further narrow down the content that we're going to index, we will also 'whitelist' the *properties* of our content item that we want to include in the index (again, with the aim of reducing the amount of unnecessary data that we add into the index):

```typescript
const properties = Object.keys(contentItem.body);
const includedProperties = SnapshotPublishedWebhook.filterPropertiesInWhitelist(
  properties,
  request.dynamicContent.contentTypePropertyWhitelist
);
if (includedProperties.length === 0) {
  return presenter.noMatchingContentTypePropertiesError(
    properties,
    request.dynamicContent.contentTypePropertyWhitelist
  );
}
```

If we end up with no properties after our whitelist filtering, then this is probably due to a misconfiguration, so we report this as an error via the presenter.
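
The two static helper methods used above aren't shown in this article. As a minimal sketch, assuming plain exact-match whitelists (the full example on GitHub may support more flexible matching), they could be implemented on the `SnapshotPublishedWebhook` class like this:

```typescript
// Minimal sketches of the whitelist helpers referenced above, assuming
// simple exact-match lists; these live on the SnapshotPublishedWebhook class.
public static isContentTypeSchemaInWhitelist(schema: string, whitelist: string[]): boolean {
  return whitelist.includes(schema);
}

public static filterPropertiesInWhitelist(properties: string[], whitelist: string[]): string[] {
  return properties.filter((property): boolean => whitelist.includes(property));
}
```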

Updating the Algolia index

The final step is to build up the object that we want to pass to the Algolia indexing engine. This will have a copy of the keys/values of the properties that we've whitelisted from our content item.

It will also have the unique reference (`objectID`) for the Algolia index. In this case, we'll use the id of the content item so that we can retrieve it in full when we do a search using the index:

```typescript
const objectToAddToIndex = {
  ...includedProperties.reduce(
    (obj, prop) => ({ ...obj, [prop]: contentItem.body[prop] }), {}
  ),
  objectID: contentItem.id
};
```
Then we just need to add this object to the index by connecting to Algolia, initialising the index and making the call to add it:

```typescript
const algoliaIndexName = request.algolia.indexName;
try {
  const algoliaClient = algoliasearch(request.algolia.applicationId, request.algolia.apiKey);
  const index = algoliaClient.initIndex(algoliaIndexName);
  await index.addObject(objectToAddToIndex);
} catch (err) {
  return presenter.algoliaSearchRequestError(err);
}
```

As usual, any error is reported via the presenter.

If this indexing operation completes without error, then we need to signal to the presenter that we have been successful. This is the last step in our processing function:

```typescript
return presenter.successfullyAddedToIndex(algoliaIndexName, objectToAddToIndex);
```

Adding a route handler for the webhook callback

Now that we've created our webhook handler we need to wire this up to the controller in our application so that it gets invoked when the webhook callback endpoint is called by Dynamic Content.

The route handler function

In the webhooks directory, create a TypeScript file named `snapshot-published-webhook-route-handler.ts`. This contains the Express route handling function:

```typescript
export const snapshotPublishedWebhookRouteHandler = async (
  req: express.Request,
  res: express.Response,
  next: express.NextFunction
): Promise<void> => {
  // ...
};
```

The route handler function is invoked when a webhook is received by the application. It then invokes the webhook handler, passing it an instance of the presenter. The webhook handler uses the presenter to call back to the route handler (e.g. whenever a failure occurs), which is responsible for determining the response code.

For example:

```typescript
const presenter = new (class implements SnapshotPublishedWebhookPresenter {

  public unsupportedWebhookError(webhook: WebhookRequest): never {
    throw new UnsupportedWebhookError(webhook);
  }

  // other methods handling failure conditions go here...

  public successfullyAddedToIndex(algoliaIndexName: string, addedObject: AlgoliaObject): void {
    res.status(202).send({ message: 'Successfully added object to index' });
  }
})();

try {
  return await SnapshotPublishedWebhook.processWebhook(request, presenter);
} catch (err) {
  return next(err);
}
```

Here we handle the case when we receive a webhook event type that we do not support (i.e. not a snapshot published event). The code throws an exception, which is handled by the default error handling in Express.
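
The `request` object passed to `processWebhook` bundles the parsed webhook body together with the Dynamic Content and Algolia configuration. A rough sketch of gathering that data inside the route handler, assuming the configuration comes from environment variables (the variable names are illustrative, and the real `SnapshotPublishedWebhookRequest` class may be constructed differently):

```typescript
// A sketch of the data the webhook processor needs, gathered from environment
// variables (names are illustrative) and the parsed webhook body. In practice
// the body would be converted into a WebhookRequest instance (e.g. with
// class-transformer) so that the class-validator decorators can be applied.
const request = {
  webhook: req.body,
  dynamicContent: {
    clientId: process.env.DC_CLIENT_ID,
    clientSecret: process.env.DC_CLIENT_SECRET,
    contentTypeWhitelist: (process.env.CONTENT_TYPE_WHITELIST || '').split(';'),
    contentTypePropertyWhitelist: (process.env.CONTENT_TYPE_PROPERTY_WHITELIST || '').split(';')
  },
  dcConfig: {}, // optional API/auth URL overrides, as sketched earlier
  algolia: {
    applicationId: process.env.ALGOLIA_APPLICATION_ID,
    apiKey: process.env.ALGOLIA_API_KEY,
    indexName: process.env.ALGOLIA_INDEX_NAME
  }
};
```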

Registering the route handler with Express

We must register the route handling function that we've defined with the Express Router instance. We'll insert a call to the router in `express-application.ts` for our webhook callback endpoint:

```typescript
export default (): express.Application => {
  const app = express();
  const router = express.Router();

  router.post(
    '/webhook',
    snapshotPublishedWebhookRouteHandler
  );

  app.use('/', router);
  return app;
};
```

This ensures that our handler is invoked when we receive an incoming webhook callback to our `/webhook` endpoint.
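
As mentioned when we first created `express-application.ts`, this file is also where error handling can be configured. Because the route handler forwards failures with `next(err)`, one option (a sketch only; the error classes and status codes here are illustrative, and the GitHub example may handle this differently) is to register an error-handling middleware after the router:

```typescript
// A sketch of an error-handling middleware registered after app.use('/', router).
// The mapping of errors to status codes is illustrative only.
app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction): void => {
  if (err instanceof UnsupportedWebhookError) {
    // acknowledge but ignore webhook events we don't handle
    res.status(202).send({ message: err.message });
  } else {
    res.status(500).send({ message: 'Failed to process webhook' });
  }
});
```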

Create a webhook subscription in Dynamic Content

Follow these steps to set up the webhook subscription in Dynamic Content:

1. Go to the Webhook management section within Dynamic Content (“Development“ -> “Webhooks“ from the menu)

If this option is not available to you, please open a support ticket requesting Webhook access from Amplience Support

2. Click the "Add webhook" button in the top right corner of the screen

3. Copy the public URL displayed in the output from `ngrok`

Note that:

  • The URL must end in "/webhook"

  • You can provide your own secret, or follow our recommended approach of using the generate button to create a random signature.

    • This must be passed in as your `WEBHOOK_SECRET` environment variable, which will be used later when validating the webhook signature
  • You must select the "Snapshot - published" Webhook trigger

Publish a content item

If you've followed the steps above and you now publish a content item in Dynamic Content, it will trigger your Algolia integration.

If you have the debug log enabled for Express (define the environment variable `DEBUG=express:*` to enable this), you should see output similar to the following in your console:

```
express:router dispatching POST /webhook +999ms
express:router query  : /webhook +1ms
express:router expressInit  : /webhook +1ms
express:router router  : /webhook +1ms
express:router dispatching POST /webhook +0ms
```

Validating the webhook signature

It is important to secure your webhook handlers to avoid creating a back door into your system. Dynamic Content signs each webhook using a shared secret, so you can cryptographically prove it came from us.

With Express, you can provide multiple route handler functions that behave like middleware to handle a request.

We will use this facility to define an additional step which performs validation of the webhook request. Validation is necessary to ensure that we have received a valid HTTP request with a content type of `application/json`, and that the webhook signature (passed in the X-Amplience-Webhook-Signature header) is valid, by checking it against our known secret.

Edit `express-application.ts` and replace the router call for the /webhook endpoint with the following:

```typescript
router.post(
  '/webhook',
  ValidateWebhookRequest.middleware(process.env.WEBHOOK_SECRET),
  snapshotPublishedWebhookRouteHandler
);
```

This makes a call to a factory function, `ValidateWebhookRequest.middleware`, which will return a route handler function (or series of functions) that checks the content type and performs validation of the webhook body in the passed request against the known secret, which we have configured as an environment variable.

The implementation of this function involves defining one or more validation handler functions:

```typescript
export default class ValidateWebhookRequest {

  public static handlerfn0(
    req: express.Request,
    res: express.Response,
    next: express.NextFunction): void {
    // ...
  }

  public static handlerfn1(
    req: express.Request,
    res: express.Response,
    next: express.NextFunction): void {
    // ...
  }

  // ...

  public static middleware(webhooksecret: string): NextHandleFunction[] {
    return [ValidateWebhookRequest.handlerfn0, ValidateWebhookRequest.handlerfn1 /* , ... */];
  }
}
```
There can be multiple handlers defined, and each of these will either call the next function in the chain or end the request/response cycle in the case of a failure (a validation error, in our case).

Checking the content type header is fairly straightforward:

```typescript
public static validateHeaders(req: express.Request, res: express.Response, next: express.NextFunction): void {
  if (req.get('content-type') !== 'application/json') {
    return next(new BadRequestError());
  }
  return next();
}
```

We return a 400 HTTP error (Bad Request) if the content type does not match any in the content type filter list; otherwise we just continue with `next()`.

The function to validate the webhook signature is a little more involved: we must calculate the expected signature by signing the body of the webhook with the webhook secret and compare this against the supplied signature. If they don't match, we return an HTTP error:

```typescript
public static expressJson(webhooksecret: string): NextHandleFunction {
  return express.json({
    type: (): boolean => true,
    verify: (req: express.Request, res: express.Response, buf: Buffer): void => {
      const suppliedSignature = req.get(AMPLIENCE_WEBHOOK_SIGNATURE_HEADER);
      const calculatedSignature = WebhookSignature.calculate(buf, webhooksecret);
      if (suppliedSignature !== calculatedSignature) {
        throw new InvalidWebhookSecretError(webhooksecret);
      }
    }
  });
}
```

In this handler we need to use the raw request body of the webhook (i.e. before any parsing) so that the webhook signature can be calculated. For this we use the Express `json` middleware and pass it a `verify` function as a parameter.
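
This article doesn't show the implementation of `WebhookSignature.calculate`. Purely as an illustration, if the signature is an HMAC SHA-256 digest of the raw body keyed with the shared secret (an assumption; check how Dynamic Content actually signs its webhooks), the calculation with Node's built-in crypto module would look roughly like:

```typescript
import { createHmac } from 'crypto';

// A sketch only: assumes the signature is a base64-encoded HMAC SHA-256 of the
// raw request body, keyed with the shared webhook secret.
export class WebhookSignature {
  public static calculate(body: Buffer, secret: string): string {
    return createHmac('sha256', secret)
      .update(body)
      .digest('base64');
  }
}
```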

Finally, we just need to insert these functions into the sequence of functions called in the middleware processing chain:

```typescript
public static middleware(webhooksecret: string): NextHandleFunction[] {
  return [ValidateWebhookRequest.validateHeaders, ValidateWebhookRequest.expressJson(webhooksecret)];
}
```

Where to go from here

The code examples above do not cover the full details of creating an integration. There are plenty of other refinements to the basic application that you can make, including configuring the application using environment variables defined in a file.

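For instance, the `dotenv` package can load the credentials, secrets and whitelists used throughout this article from a local `.env` file at start-up. A minimal sketch, using the illustrative variable names from earlier:

```typescript
// At the top of index.ts: load environment variables from a local .env file.
// The variable names checked here are the illustrative ones used earlier.
import * as dotenv from 'dotenv';

dotenv.config();

const required = [
  'DC_CLIENT_ID',
  'DC_CLIENT_SECRET',
  'ALGOLIA_APPLICATION_ID',
  'ALGOLIA_API_KEY',
  'ALGOLIA_INDEX_NAME',
  'WEBHOOK_SECRET'
];
const missing = required.filter((name): boolean => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing environment variables: ${missing.join(', ')}`);
}
```
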
Take a look at the full sample application on GitHub for details of some other ways of enhancing the integration and ideas you can use to build your own.

The Algolia integration discussed in this post is just one example of how you can use webhooks to integrate Dynamic Content with other services. Webhooks allow developers to be notified of events in real time and can be used to trigger actions in apps such as JIRA or Slack - easy to do with our Zapier integration. We also use webhooks to trigger a rebuild of our blog site when content is updated and to integrate with e-commerce systems.

If you want to build your own integration with Dynamic Content, the sample code and the information in this blog post should be a good starting point.