
Real words or buzzwords? Attack surfaces


Editor’s note: This is the 61st article in the “Real Words or Buzzwords?” series about how real words become empty words and stifle technology progress.

Cyber-physical systems (CPS) are computerized systems that interact with the environment around them in physical ways. Securing them can be complicated because of their dual nature (cyber and physical). The purpose of cyber-physical security is to ensure that the entire system works as intended – not just the computing part. It requires both cyber controls and physical controls, as well as a level of due diligence appropriate for the consequences of system failure.

Attack surface is an IT term that we don’t commonly hear spoken in the physical security domain. An attack surface is defined as the complete set of possible entry points for unauthorized access into a system. It includes all vulnerabilities and endpoints that can be exploited to conduct a security attack. Cyber-physical systems have a larger and more vulnerable attack surface due to their overall complexity and because nowadays they are usually connected to and exchange data with other larger systems.

Complexity Increases Attack Surfaces

Today’s security systems are vastly more complex than in earlier decades. They have more failure points than earlier non-networked systems. For over two decades, security investigators have been telling me that 10% to 20% of the time the evidential video they look for isn’t there but should be.

In Chapter 4, “Systems and How They Fail”, of his outstanding book Beyond Fear: Thinking Sensibly About Security in an Uncertain World, Bruce Schneier writes, “Security experts worry more about how systems don't work, about how they react when they fail, how they can be made to fail.” IT folks scan and monitor their networks and devices to get ahead of problems before users experience them, because delivering an excellent user experience requires robust IT infrastructure management.

For various reasons, many security industry manufacturers don’t seem to give enough thought to how their products could possibly fail. In other fields of engineering, the opposite is true. Failure Mode and Effects Analysis (FMEA) is part of an engineer’s education and is standard practice in many industries. Read about it on the website of the American Society for Quality (ASQ).
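As a rough illustration of how FMEA works (the failure modes and the 1 to 10 scoring below are hypothetical examples, not drawn from the ASQ material), each failure mode is scored for severity, likelihood of occurrence and likelihood of detection; the product of the three, the Risk Priority Number (RPN), ranks what to fix first:

```python
# Minimal FMEA-style sketch: rank hypothetical security-system failure modes
# by Risk Priority Number (RPN = severity x occurrence x detection),
# each factor scored 1-10 per common FMEA practice.

failure_modes = [
    # (description,                           severity, occurrence, detection)
    ("Camera stops recording silently",              8,          6,         7),
    ("Door controller loses network link",           6,          4,         3),
    ("NVR disk fills and overwrites evidence",       9,          5,         6),
]

ranked = sorted(
    ((desc, sev * occ * det) for desc, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")
```

A high RPN flags a failure mode that is severe, likely and hard to notice – exactly the combination that produces missing evidential video.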

Attack Surface Exposure

Hackers work to discover both physical and digital access vulnerabilities and failure modes. Many of these are published both on the “surface web” (what search engines index) and on the “deep web” (the unindexed content that is several hundred times larger than the surface web), and especially on the “dark web” (a small fraction of the deep web), which requires a special browser (Tor) to access.

Attack surface access vulnerabilities and failure modes are also the subject of educational sessions at the annual Black Hat and DEF CON hacker conventions, where various hacking contests are held, including lockpicking contests.

Usefulness of the Attack Surface Concept

For those of us who deploy and rely on cyber-physical systems such as those used in physical security, the primary usefulness of the attack surface concept lies in two things.

1. Defining the attack surface lets us aggregate a wide variety of vulnerabilities that don’t fully surface during traditional security design, deployment and operations so that we can address them. Defining the attack surface in its entirety enables us to identify, document and properly remediate all the system’s digital, physical and functional weaknesses we can find.

2. A key purpose of the attack surface term is to emphasize the attacker’s perspective, which includes accidental and intentional insider threats. We do that (or should) for our facilities in physical security risk assessments. We need to do the same for the protection of our security systems, or they will remain vulnerable – which means that our people and assets will be more at risk than they should be because of what we have typically called security system glitches but should really have labeled security system failures.

There are two kinds of attack surfaces – digital and physical – and our electronic security systems have both. Evaluating attack surface vulnerabilities involves assessing how security system capabilities can be compromised or misused. Because our physical security systems are based on information technology plus physical sensor and control technology, they are even more vulnerable than business information systems.

Digital attack surfaces for security systems include workstation and server computers, computer operating systems and software applications, networks (wired and wireless), and their points of connection to other systems and the Internet, plus network- and software-based points of human interaction such as device and system configuration.

Physical attack surfaces for security systems encompass all endpoint devices, such as server, desktop, and laptop computers and their USB ports; personal mobile devices; security cameras; intrusion detection sensors and controllers; and access card readers, controllers and their door monitoring and control hardware. A door monitor switch that can be defeated using a simple magnet is part of the attack surface.
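One practical way to act on this is to keep a single inventory that spans both surfaces. The sketch below is a minimal, hypothetical data model; the entry points, fields and remediations are illustrative only, not an industry standard:

```python
# Minimal sketch of a combined digital + physical attack surface inventory.
# Entries and fields are illustrative only.

from dataclasses import dataclass

@dataclass
class EntryPoint:
    name: str
    surface: str         # "digital" or "physical"
    exposure: str        # e.g. "internet", "internal network", "public area"
    known_weakness: str  # documented vulnerability or failure mode
    remediation: str     # planned or completed control

inventory = [
    EntryPoint("VMS web interface", "digital", "internal network",
               "default admin credentials", "enforce unique credentials"),
    EntryPoint("Camera USB/SD slot", "physical", "public area",
               "unauthenticated local access", "tamper switch and enclosure"),
    EntryPoint("Door position switch", "physical", "public area",
               "defeated with a simple magnet", "high-security switch"),
]

# Report each entry point with its weakness and planned remediation.
for ep in inventory:
    print(f"[{ep.surface:8}] {ep.name}: {ep.known_weakness} -> {ep.remediation}")
```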

How can we effectively approach attack surfaceprotection?


The Security Triad for Cyber-Physical Systems

The Information Security Triad (sometimes just “Security Triad” for short) has provided a rock-solid three-pillar approach to developing a sound cyber security strategy, using the security design objectives of establishing and maintaining Confidentiality, Integrity and Availability (CIA).

I have updated the traditional CIA diagram to add a new perspective – the control element – for cyber-physical systems (see Figure 1). In the event of a compromise of system integrity or functionality, we need to ensure that the system won’t cause harm due to loss or misuse of its physical-world control capabilities, and that we can be alerted of the failure within an appropriate time frame to manually intervene if needed.

We already have this concept in physical security system design, which we know as the fail-safe and fail-secure modes.
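As a minimal sketch of the alerting side of that control element, assume a hypothetical deployment in which field controllers send periodic heartbeats; if one goes silent beyond an acceptable window, the monitoring side raises an operator alert so someone can intervene manually. The controller names and the 30-second window are assumptions for illustration:

```python
# Minimal watchdog sketch: alert when a controller's heartbeat goes stale,
# so loss of physical-world control capability is noticed in time to intervene.

import time

HEARTBEAT_TIMEOUT_SECONDS = 30
last_heartbeat = {}  # controller name -> time of last heartbeat

def record_heartbeat(controller: str) -> None:
    last_heartbeat[controller] = time.monotonic()

def check_controllers() -> list:
    """Return controllers whose heartbeat is older than the allowed window."""
    now = time.monotonic()
    return [name for name, seen in last_heartbeat.items()
            if now - seen > HEARTBEAT_TIMEOUT_SECONDS]

record_heartbeat("lobby-door-controller")
# Simulate a controller that last reported 45 seconds ago.
last_heartbeat["garage-gate-controller"] = time.monotonic() - 45

for stale in check_controllers():
    print(f"ALERT: {stale} has not reported within {HEARTBEAT_TIMEOUT_SECONDS}s; "
          "dispatch staff or fall back to manual control")
```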

Cyber-physical systems in many other industries have much more critical failure mode situations to deal with, such as surgical robotic assistance systems and autonomous vehicles, for which millisecond response times can be life critical. But even for access control and intrusion detection systems, a life-critical situation requiring a response within seconds or minutes can occur.

Figure 1. CIA for Cyber-Physical Systems


Image source: © 2022 RBCS, Inc.

Confidentiality

Confidentiality for information systems originally meant restricting data access only to authorized individuals. For cyber-physical systems, this also means controlling access to prevent unauthorized system control and misuse.
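A minimal sketch of that broader meaning of confidentiality, with hypothetical roles and commands: before a control command reaches the physical world, the system checks that the requester’s role is explicitly authorized for that specific action.

```python
# Minimal sketch: authorize control commands, not just data access.
# Roles, commands and the permission table are hypothetical.

CONTROL_PERMISSIONS = {
    "operator":   {"view_video", "acknowledge_alarm"},
    "supervisor": {"view_video", "acknowledge_alarm", "unlock_door"},
}

def authorize(role: str, command: str) -> bool:
    """Allow a command only if the role is explicitly granted it."""
    return command in CONTROL_PERMISSIONS.get(role, set())

for role, command in [("operator", "unlock_door"), ("supervisor", "unlock_door")]:
    verdict = "ALLOW" if authorize(role, command) else "DENY"
    print(f"{verdict}: {role} -> {command}")
```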

Additionally, systems that interact with people may capture data that has privacy restrictions governing it. Some data privacy restrictions are obvious, such as those for captured biometric access control data. Less obvious are simple records, like logs of customer or operator system use that include place and time information and can be linked to a specific individual – because GDPR defines an individual’s location data as personally identifiable information subject to privacy regulations, including data destruction deadlines and anonymization of data before sharing it with individuals or other systems.
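A minimal sketch of what those two obligations might look like in code: the log format, the 90-day retention period and the hashing scheme below are illustrative assumptions, not what GDPR specifically mandates.

```python
# Minimal sketch: retention deadline + pseudonymization before sharing.

import hashlib
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # hypothetical retention period

logs = [
    {"person": "j.doe", "place": "Lobby A", "time": datetime(2022, 1, 3, tzinfo=timezone.utc)},
    {"person": "a.lee", "place": "Dock 2",  "time": datetime(2022, 6, 1, tzinfo=timezone.utc)},
]

def purge_expired(records, now):
    """Drop records older than the retention deadline."""
    return [r for r in records if now - r["time"] <= RETENTION]

def anonymize(record):
    """Replace the identifying field with a one-way pseudonym before sharing."""
    pseudonym = hashlib.sha256(record["person"].encode()).hexdigest()[:12]
    return {**record, "person": pseudonym}

now = datetime(2022, 7, 1, tzinfo=timezone.utc)
for record in purge_expired(logs, now):
    print(anonymize(record))
```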

Integrity

Cyber-physical systems and devices are typically part of a “system of systems”, which means that the accuracy of some data can be more important to an external system than to the system or device capturing or generating the data. Often cyber-physical systems serve other systems – both machine systems and people systems – that have a larger purpose and are more impactful in the overall scheme of things.

Consider a city traffic management system that bases intersection traffic light control on counts and speeds of vehicles on the road. Traffic light timing can reduce or increase pollution based on whether it reduces or increases the overall number of brake-then-accelerate cycles. Some cities collect parking lot occupancy data, so that city mobile apps can present estimated valet parking and self-parking availability and direct vehicle occupants to available parking closest to their destination.

Warehouse vehicle-pedestrian accidents have been reduced through the use of forklift and vehicle management systems focused on dangerous intersections, blind crossing points and other hazardous zones. They provide automatic speed reduction or stopping based upon detected pedestrian activity. The integrity of pedestrian activity data is critically important for systems that automatically control vehicle speed and stopping. A failure of intended vehicle control could be catastrophic.
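A minimal sketch of why that integrity matters, using a made-up message format, checksum scheme and staleness limit: the safe behavior is to slow or stop not only when a pedestrian is detected, but also when the detection data is stale or fails a basic integrity check.

```python
# Minimal sketch: treat stale or unverifiable pedestrian data as unsafe.

import hashlib
import time

MAX_DATA_AGE_SECONDS = 2.0  # hypothetical staleness limit

def checksum(payload: str) -> str:
    return hashlib.sha256(payload.encode()).hexdigest()

def vehicle_command(msg: dict, now: float) -> str:
    """Return 'STOP' unless the data is fresh, intact, and shows no pedestrian."""
    if now - msg["timestamp"] > MAX_DATA_AGE_SECONDS:
        return "STOP"                              # stale data: fail safe
    if checksum(msg["payload"]) != msg["checksum"]:
        return "STOP"                              # integrity failure: fail safe
    return "STOP" if "pedestrian" in msg["payload"] else "PROCEED"

now = time.time()
msg = {"timestamp": now, "payload": "zone3:clear", "checksum": checksum("zone3:clear")}
print(vehicle_command(msg, now))                   # PROCEED
```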

Video surveillance technology can be involved in such scenarios. Increasingly, physical security systems are participating in business operations optimization in addition to their typical security function. In such cases, the larger system’s data integrity depends not just on the accuracy of the data received, but on the continuous operation of that incoming data stream without interruption or system failures. Data integrity or availability issues in a security system can impact the functionality of other systems in many ways, sometimes with larger consequences than the failure had within the security system itself.

Availability

The data center deployments of Amazon Web Services, Microsoft Azure and Google Cloud are vastly more complex than physical security system deployments. They offer system availability guarantees of 99.9999% uptime (called “six nines”) for their top-tier cloud services. See the downtime figures below.
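The downtime each availability level allows can be worked out directly from the percentage; the short sketch below (my own illustration, based on a 365-day year rather than any vendor’s SLA terms) reproduces the usual downtime-per-year figures.

```python
# Minimal sketch: annual downtime permitted at each availability level,
# assuming a 365-day year. Figures are approximate.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for availability in (99.0, 99.9, 99.99, 99.999, 99.9999):
    downtime_minutes = MINUTES_PER_YEAR * (1 - availability / 100)
    print(f"{availability:>8}%  ->  about {downtime_minutes:8.1f} minutes/year "
          f"({downtime_minutes / 60:.2f} hours)")
```

Three nines (99.9%) allows roughly 8.8 hours of downtime per year; six nines allows about half a minute.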


Thus, we know high availability is possible – and thanks to advances in cloud computing and information technology in general, high availability is better, simpler to deploy and more affordable than in previous decades.

However, if we don’t demand high availability for our systems, we won’t get it. Security systems should be at least 99.9% reliable (three nines). Why aren’t they?

Undoubtedly, it’s because we don’t treat our cyber-physical security systems (PACS, video, etc.) the way IT practitioners treat their critical information systems. We allow 80% reliability – things working 80% of the time we expect them to, which means a 20% failure rate – to prevail. Security alarms are so unreliable that many police departments now require video verification of an alarm before they will respond.

Today’s AI-enabled technologies can achieve much greater results than were possible in the previous generation of security systems. Now that they are becoming more valuable to security and to business operations, will we finally take action to protect their attack surfaces?

Learn more about attack surface protection at the website of Viakoo, an IT company that entered the IoT industry to help protect IoT attack surfaces and devices, including video surveillance cameras. Its Service Assurance Manager product is designed to address the problem of missing video and much more.


About the author: Ray Bernard, PSP CHS-III, is the principal consultant for Ray Bernard Consulting Services (RBCS), a firm that provides security consulting services for public and private facilities (www.go-rbcs.com). In 2018 IFSEC Global listed Ray as #12 in the world’s Top 30 Security Thought Leaders. He is the author of the Elsevier book Security Technology Convergence Insights, available on Amazon. Follow Ray on Twitter: @RayBernardRBCS.

© 2022 RBCS. All Rights Reserved.