Q&A: Why emerging IoT platforms require the same leading-edge security as industrial controls
By Byron V. Acohido
January 8, 2019
The heyday of traditional corporate IT networks has come and gone.
In 2019, and moving ahead, look for legacy IT business networks to increasingly intersect with a new class of networks dedicated to controlling the operations of IoT-enabled services of all types, including smart buildings, IoT-enabled healthcare services and driverless cars.
This coming wave of IoT networks, architected to carry out narrowly focused tasks, will share much in common with the legacy operational technology, or OT, systems long deployed to run physical plants — such as Industrial Control Systems (ICS), Supervisory Control and Data Acquisition (SCADA) systems, Distributed Control Systems (DCS) and Programmable Logic Controllers (PLC).
The global cybersecurity community is keenly aware of these developments and earnest discussions are underway about how to deal with the attendant security exposures. This includes a rising debate about the efficacy of the Common Vulnerability Scoring System, or CVSS. Initially introduced in 2005, CVSS is a framework for rating the severity of security vulnerabilities in software.
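To make the scoring concrete, here is a rough sketch of how a CVSS v3.1 base score is computed for the common scope-unchanged case. This is a simplified illustration, not the full specification (it omits the scope-changed equations and the temporal and environmental metric groups):

```python
import math

# CVSS v3.1 metric weights for scope-unchanged vulnerabilities
# (a subset of the specification's lookup tables).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}  # Attack Vector
AC = {"L": 0.77, "H": 0.44}                        # Attack Complexity
PR = {"N": 0.85, "L": 0.62, "H": 0.27}             # Privileges Required
UI = {"N": 0.85, "R": 0.62}                        # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}             # C/I/A impact levels

def base_score(av, ac, pr, ui, c, i, a):
    """CVSS v3.1 base score for a scope-unchanged vulnerability."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    # The spec "rounds up" to one decimal place.
    return math.ceil(min(impact + exploitability, 10) * 10) / 10

# A network-reachable, low-complexity flaw with high impact across the board:
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # → 9.8 ("Critical")
```

Note that the confidentiality, integrity and availability impacts all feed into a single impact sub-score — a point that matters for the OT discussion that follows.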
Last Watchdog recently sat down with a couple of senior executives at Radiflow, a Tel Aviv-based supplier of cybersecurity solutions for ICS and SCADA networks, to get their perspective about how NIST and ICS-CERT, the two main organizations for disclosing and rating vulnerabilities, are sometimes not aligned. Radiflow currently is conducting this survey to collect feedback from IT and OT professionals about the ramifications of this conflict.
Radiflow expects to release its survey findings in late January. This is not just another arcane tussle among nerdy IT professionals. New vulnerabilities and exposures are part and parcel of accelerating the deployment of vast distributed systems, fed by billions of IoT sensors. And they must be fully addressed if digital commerce is to reach its full potential. Here are excerpts of my discussion about this with Radiflow’s CEO Ilan Barda and CTO Yehonatan Kfir, edited for clarity and length:
LW: As we move forward with digital transformation and the Internet of Things, is it becoming more urgent to think about how we protect OT systems?
Barda: Yes. The risks are growing for two reasons. One is the fact that there are more and more of these kinds of OT networks, more and more sensors, more and more automation applications. And these systems are becoming more dominant, not just in bigger facilities, but especially in mid-sized and smaller organizations.
We’re seeing more and more smaller buildings where it makes sense to have everything automated because, on an operational level, you can optimize usage of the lights, heating and air conditioning, and elevators. All of this is done by automated systems.
The other thing is that, unfortunately, the tools to attack these kinds of systems have become easily accessible on the Internet. It’s not just nation-state attackers we have to worry about, it’s also the 18-year-old hacker who wants to play around and find the right tools to shut down a hospital’s automated systems. The combination of automated building systems becoming more popular and the tools to attack them becoming much more widely available has made the need to use dynamic security systems much more urgent.
LW: So how are the existing vulnerability scoring systems holding up? And what needs to be improved?
Kfir: CVSS is widely accepted as a way to capture the principal characteristics of a vulnerability, and eventually to produce a numerical score that reflects the severity level of that specific vulnerability. Although it is a sophisticated and very widely accepted scoring system, it has some problems, specifically when it comes to ICS.
One problem is that the CVSS metric was developed mostly based on IT cyber incidents. The main difference has to do with the requirements of the network. In IT networks, there is a high requirement for confidentiality, whereas in OT networks, the main requirement is integrity and availability. There have been a lot of arguments in the security community about what we should do and how to improve CVSS scoring in order to make it more relevant for SCADA.
LW: Aren’t the ICS-CERT standards more relevant to OT networks?
Kfir: What’s happening is that, even if we assume the ICS-CERT metric is accurate, when two big organizations use it to measure the same vulnerability, in a lot of cases they’ll assign different values, even though both are analyzing the same vulnerability, using the same metric.
LW: So there’s lots of room for confusion. How does that translate into exacerbating exposures?
Kfir: Whenever you have a metric requiring someone to input a value, there’s a good chance different people will input different things. But the problem is more than that. Even if the community were to put a lot of effort into developing a metric specifically for SCADA, it would still result in a static input.
So different organizations may see the value differently, and more important than that, you are assigning a static value, at the point in time that you detect a vulnerability. However, the threats are not static; they’re dynamically changing. So you may have a vulnerability that, at this time, is very hard to exploit, but one year from now someone will release a tool, and it will be very easy to exploit.
LW: How can this be addressed?
Kfir: There is an alternative. It involves moving into a more dynamic way to score vulnerabilities. In the short term, even if we use CVSS as a feed, we must change the scoring of the vulnerability to take into account the evolving threat, and the context in which the vulnerable device is being used in the network. If the analyst decides that a vulnerability has a bigger effect on a specific device, we need to adjust the current scoring method to provide more weight for the impacts on availability and integrity.
LW: Does the scoring tool itself need to be refined, or is this a matter of more training for the analyst?
Kfir: It’s a combination: both the analyst and the monitoring system should dynamically change the values according to the context of the device. For example, the analyst should tell the monitoring system that this is a critical process, and then the monitoring system would reevaluate all the scores of the vulnerability, while taking into account the fact that this is a critical process.
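The dynamic re-scoring Kfir describes can be sketched roughly as follows. This is a hypothetical illustration, not Radiflow’s actual algorithm: the requirement multipliers borrow the style of CVSS environmental security-requirement metrics (0.5 low, 1.0 medium, 1.5 high), and the exploit-availability bump is an arbitrary value chosen for demonstration:

```python
# Hypothetical sketch of dynamic vulnerability re-scoring for OT devices.
# Not Radiflow's actual method: names, multipliers and the exploit bump
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DeviceContext:
    integrity_requirement: float     # 0.5 low, 1.0 medium, 1.5 high
    availability_requirement: float  # same scale; OT cares most about these two
    exploit_public: bool             # updated dynamically from a threat-intel feed

def dynamic_score(static_base: float, ctx: DeviceContext) -> float:
    """Reweight a static CVSS-style base score for the device's OT context."""
    # OT networks prioritize integrity and availability over confidentiality,
    # so the dominant requirement scales the static score.
    requirement = max(ctx.integrity_requirement, ctx.availability_requirement)
    score = static_base * requirement
    if ctx.exploit_public:
        score += 1.5  # a public exploit tool raises effective exploitability
    return round(min(score, 10.0), 1)

# A mid-severity flaw on a critical process, before and after an exploit
# tool appears in the wild — the static score never changed, but the risk did:
plc = DeviceContext(integrity_requirement=1.5, availability_requirement=1.5,
                    exploit_public=False)
print(dynamic_score(6.0, plc))  # → 9.0
plc.exploit_public = True
print(dynamic_score(6.0, plc))  # → 10.0 (capped)
```

The point of the sketch is the workflow, not the particular numbers: the analyst supplies the device context once, and the monitoring system re-evaluates the score whenever the threat feed changes.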
LW: So where do we go from here?
Kfir: What we hope will happen is that the security monitoring systems will become more risk oriented, and more focused on reducing critical vulnerabilities. For that to happen, dynamic scoring will have to be more widely adopted. Monitoring systems will have to be able to automatically identify business processes and dynamically receive feeds of new exploits in the wild. And new algorithms need to be developed, to estimate the exploitability of the device.
Barda: Let me take a higher-level view. First of all, the industry is starting to move from using visibility and mapping towards actually taking risk into consideration — but it’s based on static scoring. What needs to be done to make this realistic? We need to move towards dynamic scoring of vulnerabilities. And for this to happen, different tools are needed to support these new processes.
It is too much to expect a human analyst to manually evaluate hundreds of parameters involved in dynamic scoring. It will never happen. It is too much manual work and involves too much complexity. So tools need to be used to automate these processes as much as possible — to gather the threat feeds, to correlate and analyze business processes, and to assess the impacts between different devices. The analyst can then focus on assessing criticality at the business level. There will always be something for the human analyst to do.
LW: This sounds very much like the same initiatives generally being pursued to secure IT networks.
Barda: It’s not just taking the same methods used for IT security into the OT environment. Accounting for the business processes requires specific understanding of how the OT algorithms and applications work. You can take the same concepts, but you need to apply the right methods, with the right algorithms for the OT side.
LW: How much traction do you think dynamic scoring will achieve, in OT settings, in 2019?
Barda: I think wide use of dynamic scoring in OT systems is inevitable. The market is evolving, and understanding of customers’ needs is evolving. You can just look at the previous year, and it is quite clear that this will continue to evolve. We’ve already moved through the early phases, from gaining visibility of assets, to getting a view of the attack vectors, to correlating vulnerability identifiers, to refining the scoring metrics. The next step is dynamic scoring.