Third-Party API Caching in Commerce Cloud

Improving the performance of your online channels should be a given, and that includes keeping an eye on third-party integrations. Every Salesforce Commerce Cloud site relies on external APIs in some capacity, whether for location data, weather updates, address verification, file submission, or something else, and each of those services comes with its own level of performance and stability. 😅

Understanding and applying caching to these third-party services can improve both the performance and the cost-effectiveness of your integrations.

Now, let’s delve into the process, its benefits, and some things to remember to get you going.

Caching with LocalServiceRegistry?

Salesforce Commerce Cloud Web Service Framework

The Web Service Framework is key to managing external service interactions. It enables developers (and other profiles) to manage third-party integrations with ease from the comfort of the Business Manager. 
One of those built-in features is caching responses from third-party APIs, which reduces the need for repeated network requests and, in many cases, reduces costs related to those services (usage-based licenses).

Creating a service through the LocalServiceRegistry takes only a small amount of configuration for request handling. Here’s a basic example of how to create a service with caching enabled:

    var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');

    var callTestGet = LocalServiceRegistry.createService("test.http.get", {
        createRequest: function (svc, args) {
            // Cache successful GET responses for 1000 seconds.
            svc.client.enableCaching(1000);
            svc.setRequestMethod("GET");
        },
        parseResponse: function (svc, client) {
            // Return the raw response body to the caller.
            return client.text;
        },
        mockCall: function (svc, client) {
            // Used when the service is set to mocked mode in the Business Manager.
            return {
                statusCode: 200,
                statusMessage: "Success",
                text: "MOCK RESPONSE (" + svc.URL + ")"
            };
        },
        filterLogMessage: function (msg) {
            // Strip sensitive parts from the communication log.
            return msg.replace("headers", "OFFWITHTHEHEADERS");
        }
    });

In this snippet, the `enableCaching` method is invoked on the underlying HTTP client, enabling caching for the GET requests handled by this service. The argument (in this case, `1000`) is the time to live in seconds: it dictates how long a cached response remains valid before a subsequent call goes back to the remote server.
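As a quick, hedged illustration of how the cached service might be used (this assumes the `callTestGet` service from the snippet above is in scope; the result properties come from the standard `dw.svc.Result` object):

    // First call reaches the third-party API and stores the 2xx response.
    var first = callTestGet.call();

    // A second call to the same URL within the TTL (1000 seconds here) should be
    // answered from the HTTP Client Response Cache rather than the network.
    var second = callTestGet.call();

    if (second.ok) {
        var payload = second.object; // whatever parseResponse returned (client.text in this example)
    }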

A screenshot of the official documentation on configuring the underlying client to add caching to a service.

Why Caching Matters

Caching has several benefits, especially for services with consistent data and infrequent updates. Let’s have a look at how this minor code change can significantly affect the way your Salesforce Commerce Cloud channel works:

  1. Faster performance: Caching allows your site to retrieve data from “local” storage instead of repeatedly calling an external server. When a cached response is available, the app server can quickly fulfil requests, significantly reducing wait times.
  2. Greater Reliability: With caching, your site becomes more robust. If a third-party service goes down or experiences issues, your app can still provide cached data, ensuring a smoother user experience.
  3. Better Rate Limit Management: Many APIs limit the number of requests you can make within a specific timeframe. Using cached responses reduces the number of requests sent out, helping you stay within these limits and preventing potential service interruptions.

Everyday Use Cases for Caching

Here are a few everyday situations where you might want to use caching:

  1. Google Location Services: Location data doesn’t change very often, so caching it can speed up response times in local applications.
  2. Address Verification Services: Address information stays the same over time. Caching these responses can improve efficiency.
  3. Weather Services: Weather data can be cached for short periods. While it might change frequently, most applications don’t require constant real-time updates.
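Different use cases therefore call for different lifetimes. As a rough sketch (the service IDs and TTL values below are illustrative assumptions, not recommendations), the only thing that really changes per integration is the value passed to `enableCaching`:

    var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');

    // Hypothetical address verification service: the data rarely changes,
    // so a long TTL (24 hours) is usually safe.
    var addressService = LocalServiceRegistry.createService('example.address.verify', {
        createRequest: function (svc, args) {
            svc.client.enableCaching(86400); // 24 * 60 * 60 seconds
            svc.setRequestMethod('GET');
        },
        parseResponse: function (svc, client) {
            return client.text;
        }
    });

    // Hypothetical weather service: the data changes often, but few storefronts
    // need it in real time, so a short TTL (10 minutes) is often enough.
    var weatherService = LocalServiceRegistry.createService('example.weather', {
        createRequest: function (svc, args) {
            svc.client.enableCaching(600);
            svc.setRequestMethod('GET');
        },
        parseResponse: function (svc, client) {
            return client.text;
        }
    });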

Implementing caching for these services can significantly enhance performance, boost speed, and reduce expenses. 

However, don’t anticipate any “magic 🪄”—it’s the accumulation of many small enhancements that can lead to significant improvements. The development effort here is minimal, yet the potential impact is substantial! Is anyone looking for quick wins?

Clearing the cache

A screenshot of the "Service Maintenance" configuration page in the Business Manager.
A screenshot of the "Service Maintenance" configuration page in the Business Manager.

You can clear the HTTPClient Response cache in the Business Manager by going to Administration > Operations > Service Maintenance. Here, you’ll find options related to this cache.

To clear cached responses for ALL services, simply click the Invalidate button next to “HTTP Client Response Cache.”

Some things to keep in mind

General Caching Warnings

Caching HTTP responses has numerous benefits, yet it’s crucial to be mindful of potential drawbacks. 

  • Stale Data: Cached data can become outdated, especially if the third-party API updates its responses frequently. Be sure to set a cache expiration period that matches the data’s update frequency.
  • Inconsistent States: Relying too much on cached data without refreshing it can lead to users receiving outdated or incorrect information. This can negatively affect user experience and erode trust.
  • Error Propagation: If there’s an error from the external service and you cache this response, users might keep encountering the same error until the cache is cleared or expires.
  • Debugging Complexity: Debugging can get tricky if cached responses interfere with your expectations while developing or testing your application. It’s important to know precisely what data is being cached.
  • Impact on Business Logic: Cached responses might not show real-time changes crucial for essential business processes. This can result in making incorrect decisions based on outdated data.
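One pragmatic way to handle the stale-data and debugging concerns above is to make the cache lifetime configurable instead of hard-coding it. The sketch below relies on a hypothetical custom site preference (`thirdPartyCacheTTL`) that is not part of the platform; setting it to 0 effectively disables caching while you develop or test:

    var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');
    var Site = require('dw/system/Site');

    var service = LocalServiceRegistry.createService('example.http.get', {
        createRequest: function (svc, args) {
            // Hypothetical custom site preference holding the TTL in seconds.
            var ttl = Site.getCurrent().getCustomPreferenceValue('thirdPartyCacheTTL');

            // Only enable caching when a positive TTL is configured, so the cache
            // can be switched off per environment while debugging or testing.
            if (ttl && ttl > 0) {
                svc.client.enableCaching(ttl);
            }

            svc.setRequestMethod('GET');
        },
        parseResponse: function (svc, client) {
            return client.text;
        }
    });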

LocalServiceRegistry-Specific Considerations

Limitations

Only responses with a 2xx status code, a content length, and a size under 50 KB, and which are not written directly to a file, are cached. The cache key consists of the URL and the user name. The system automatically manages and limits both the total size of the cached content and the number of cached items.
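Because the cache key is derived from the URL (plus the user name), two logically different requests to the same URL will share one cache entry. A common workaround, sketched below with an assumed service ID and parameter names, is to encode whatever the response depends on (locale, postal code, and so on) into the query string so that each variant gets its own entry:

    var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');

    var geoService = LocalServiceRegistry.createService('example.geo.lookup', {
        createRequest: function (svc, args) {
            svc.client.enableCaching(600);
            svc.setRequestMethod('GET');

            // args.locale and args.postalCode are assumed inputs from the caller.
            // Because they become part of the URL, each combination gets its own
            // cache entry instead of all requests sharing a single one.
            svc.addParam('locale', args.locale);
            svc.addParam('postalCode', args.postalCode);
        },
        parseResponse: function (svc, client) {
            return client.text;
        }
    });

    // Example call; the object passed to call() is forwarded to createRequest as args.
    var result = geoService.call({ locale: 'en_GB', postalCode: 'SW1A 1AA' });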

Initializing the HTTP Client

When configuring the HTTPClient, only call the `getClient` method and other HTTPClient functions from within the `createRequest` callback or any of the callbacks that follow it.

Accessing the client before the service has been invoked will yield `null`, resulting in service call failures.
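In other words, configuration such as `enableCaching` belongs inside the callbacks, not at module level. A minimal sketch of the right and wrong placement (the service ID is a placeholder):

    var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');

    var service = LocalServiceRegistry.createService('example.http.get', {
        createRequest: function (svc, args) {
            // Correct: the underlying HTTPClient is available once createRequest runs.
            svc.getClient().enableCaching(1000);
            svc.setRequestMethod('GET');
        },
        parseResponse: function (svc, client) {
            return client.text;
        }
    });

    // Incorrect: the service has not been invoked yet, so getClient() returns
    // null here and calling enableCaching on it would fail.
    // service.getClient().enableCaching(1000);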

Rate Limits and Circuit Breakers

Remember that cached requests still count toward your service’s rate limits and circuit breaker configuration (and quota limits). 

While caching helps reduce the number of direct external requests, every call to the service, whether answered from the cache or not, still counts toward those statistics. This could cause service disruptions if you exceed the configured thresholds.

Monitoring Cached Requests

Monitoring cached requests is a crucial part of making sure your caching mechanism is actually working. It isn’t just about pressing the enable button; it’s about actively tracking the impact on your site’s performance and usability.
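The platform does not report directly whether a given response came from the cache, but you can approximate it by logging the duration of each call and comparing that with the service reports in the Business Manager. The wrapper below is a hypothetical helper, not a platform feature, and the log category name is an arbitrary choice:

    var Logger = require('dw/system/Logger');
    var log = Logger.getLogger('service.monitoring');

    function callWithTiming(service, args) {
        var start = Date.now();
        var result = service.call(args);
        var duration = Date.now() - start;

        // Cached responses typically come back in a few milliseconds, while real
        // network calls take noticeably longer; the gap hints at your hit ratio.
        log.info('Service {0} returned ok={1} in {2} ms', service.URL, result.ok, duration);

        return result;
    }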

Conclusion

In conclusion, adding a caching mechanism to the LocalServiceRegistry for third-party services is a significant step towards boosting your performance and reducing operational costs. Even if the improvements are not always dramatic, every enhancement contributes to a smoother user experience. 

Here’s an example of a successful (anonymised) result from using this cache and rate limiting bot traffic:

A screenshot of a graph showing the results of the third-party service improvements: first rate limiting bot traffic through Cloudflare, then adding caching.
The number of requests handled by the API decreased considerably, leading to a lower monthly bill.

These improvements not only increased overall performance but also reduced costs on the third-party service, as each API call incurred a charge.

An illustration depicting a browser on the left, a third-party server on the right, and a person directing traffic in between to the cache or the third-party server.
