The Challenges of Ajax CDN

Published September 7, 2021

For the longest time, hosting static files on CDNs was the de facto standard for tuning the performance of website pages. The host offered browser caching advantages, better stability, and storage on fast edge servers in strategic geolocations. Not only did it bring performance benefits, it was also convenient for developers. Recent developments, however, show that self-hosting static files such as Ajax (Asynchronous JavaScript and XML) and jQuery libraries, CSS styles, and other includes is faster, more reliable, and more secure.
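
In practice, the switch is often just a change to the include itself. The sketch below (with hypothetical file names and paths) shows a library moved from a third-party CDN to the site's own origin:

  <!-- Before: library loaded from a third-party CDN (hypothetical URL) -->
  <script src="https://cdn.example.com/libs/jquery/3.6.0/jquery.min.js"></script>

  <!-- After: the same file copied to and served from your own origin -->
  <script src="/assets/js/jquery-3.6.0.min.js"></script>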

Possible Performance Degradation from CDN Usage

Most developers and administrators assume that adding a CDN-hosted static file improves performance. The idea has been that a CDN has fast edge servers that cache content and deliver it based on the user's geolocation. These edge servers respond faster than a single traditional hosting server, and developers get convenience as a bonus.

Some studies, however, show that there can actually be performance slowdowns when static files (such as the JavaScript, CSS, and HTML files used to enable Ajax functionality) are hosted on third-party servers, especially since the introduction of HTTP/2. In this study from CSS Wizardry's Harry Roberts, he highlights how one company he consulted with was able to reduce latency by 300ms after moving static files from a third-party server to a self-hosted solution. That 300ms saving had a direct impact on sales and revenue, increasing the company's bottom line by £8 million (c. $11,000,000).

The same study also showed that slower mobile connections experienced higher latency when loading files from third-party servers. Over 3G, the same client's customers experienced a 1.765 second slowdown compared to self-hosted files. After migrating the files to a local server, the client's load time dropped from 5.4 seconds to 3.6 seconds.

This might not seem like a considerable amount, but consider large enterprise sites that receive millions of visits a day, which can easily add up to hundreds of millions a month. At scale, latency issues quickly trickle down to the end-user experience. The speed of a website has been shown to affect bounce rate, customer satisfaction, and customer retention, not to mention the way in which Google bakes site speed into its ranking algorithm.

Avoid Single Points of Failure

If you’ve ever been through a disaster recovery exercise, you'll know that redundancy is the key to resiliency against failure. If those third-party servers fail, your own infrastructure fails with them unless you have failover systems configured. Popular third-party CDNs and cloud services have failover baked into their infrastructure, but as we've seen repeatedly in recent years, even the biggest cloud providers occasionally have outages.

One school of thought holds that services in the cloud rarely, if ever, fail, but it does happen, even to the biggest providers. For example, back in 2017, a simple operational error took AWS S3 offline across the entire US East (Northern Virginia) region. S3 buckets are used as cloud storage, and the downtime affected thousands of AWS customers. Businesses that relied solely on AWS without failover configured would certainly have experienced downtime.

Using a third-party provider to host static files - such as the JavaScript, HTML, and CSS files that support Ajax functionality - leaves the business open to the same kind of outage. It’s possible to code for failover, but many small and mid-size businesses do not have the resources to do so. Downtime could cost thousands in lost revenue, so it should always be a concern.
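
For teams that do want failover, one common pattern is to try the CDN first and fall back to a self-hosted copy if the library fails to load. This is only a sketch; the CDN URL and local path are hypothetical:

  <!-- Try the CDN copy first (hypothetical URL) -->
  <script src="https://cdn.example.com/libs/jquery/3.6.0/jquery.min.js"></script>
  <!-- If the CDN request failed, window.jQuery is undefined, so load the self-hosted copy instead -->
  <script>
    window.jQuery || document.write('<script src="/assets/js/jquery-3.6.0.min.js"><\/script>');
  </script>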

A small but related risk is that the third-party host retires its services. This is rare with a large organization such as AWS, GCP, or Azure, but smaller providers can shut down services at any time, leaving the application owner scrambling to find an alternative as quickly as possible.

How Network Penalties Can Impact Performance

The network penalties associated with third-party hosts tie into performance degradation, but they provide more insight into why it happens. For every new origin referenced during page load, the browser opens a new TCP connection, and it’s not uncommon for a web application to include several external scripts.

In addition, most sites now use SSL/TLS, so there is a handshake between the client and the host to negotiate the cipher suite and exchange the key material used to establish an encrypted session. If you load dozens of files from third-party domains, this can quickly add latency to load times. As a workaround, you can minimize the impact of third-party domains with the preconnect resource hint, which tells the browser to open the connection to that origin as early as possible.
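
A preconnect hint goes in the document head; dns-prefetch is a widely supported fallback for older browsers. The CDN hostname here is hypothetical:

  <!-- Perform the DNS lookup, TCP handshake, and TLS negotiation for the CDN origin early -->
  <link rel="preconnect" href="https://cdn.example.com">
  <!-- Fallback for browsers that only support DNS prefetching -->
  <link rel="dns-prefetch" href="https://cdn.example.com">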

The second penalty is the loss of the prioritization features available in HTTP/2, which most applications now use. HTTP/2 lets developers mark important connections so that critical files are returned first and less important content is delayed.

Connection prioritization works well within a single domain, but a separate dependency tree must be built for each new TCP connection opened for external files. This means you cannot build one dependency tree when several third-party domains host your files, which adds latency. Connection coalescence can overcome this limitation, but the domains must resolve to the same IP address and each browser handles it differently.

Take Security Considerations Into Account

For critical applications, hosting the static JavaScript, HTML, and CSS files that support Ajax functionality on third-party servers adds cybersecurity risk. Should the third-party host be compromised, the attacker could tamper with the code and execute arbitrary commands. The tampered code could then perform actions within your application in the context of the user’s session. If the user is authenticated into a banking application, for example, the tampered code could be used to transfer money or disclose financial information.

Third-party hosting also adds the risk of disclosing sensitive data to the third-party host if that data is sent in URLs. If OAuth tokens, for example, are sent in requests to a hosted script, those access tokens are disclosed to the third-party host and could allow an attacker to execute commands as the victim.

OWASP has a cheat sheet available to help developers code for this issue, and self-hosted static files are not exposed to it in the first place. Secrets and access tokens should never be included in query string parameters, because they can be recorded in several locations (e.g., logs and the browser cache). By self-hosting files, developers eliminate sensitive data disclosure to third parties.
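
As a minimal sketch of that guidance (the API endpoint and token variable are hypothetical), an access token belongs in a request header rather than in the URL:

  <script>
    // "token" is assumed to hold an OAuth access token obtained elsewhere

    // Avoid: the token becomes part of the URL and can be logged or cached
    // fetch('https://api.example.com/accounts?access_token=' + token);

    // Prefer: send the token in a header so it never appears in the URL
    fetch('https://api.example.com/accounts', {
      headers: { 'Authorization': 'Bearer ' + token }
    });
  </script>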

Conclusion

If performance and security are a concern, self-hosted static files are the better option for supporting Ajax functionality. The few hundred milliseconds saved can improve user experience, reduce load times, and lower resource costs. For security, developers reduce the risk of sensitive data disclosure and protect source code from outside tampering. Finally, organizations can eliminate the single point of failure that reliance on a third-party host creates, avoiding downtime should that provider suffer an unforeseen outage.
