Intrusion & Access Control

Personnel Peril
Physical Security | 2018-04-01

<p>When employees steal proprietary information, they don't just cause headaches for the organization—they erode confidence in the trustworthiness of screened employees and vetted business partners. Following a spate of high-profile incidents—including the 2009 Fort Hood shooting by U.S. Army Major Nidal Hasan, the 2013 leaks by U.S. National Security Agency contractor Edward Snowden, and the 2013 Washington Navy Yard shooting by Aaron Alexis—the U.S. government determined that existing vetting processes and security standards for sensitive programs were inadequate. Key policy changes were implemented, including a new requirement for government organizations and certain government contractors to establish an insider threat program. The requirements changed the way government-affiliated organizations approached employee management and codified existing insider threat practices.</p>

<p>What does that mean for private sector organizations, even if they don't work with the government? Certain features of a U.S. Department of Defense (DoD)-style insider threat program may be relatively easy to implement and offer considerable security enhancements. Traditional administrative and physical security practices—locked doors, alarm systems, and inventory controls—are focused externally and are largely ineffective at preventing employees and other authorized persons from committing harmful acts.</p>

<p>Integrating an insider threat policy with employee and event best practices can create a well-rounded employee management program that benefits workers and the organization.
Educating employees on how to recognize and report potential insider threat information can also have a positive effect on the organization's culture and emphasize everyone's role in keeping the work environment safe and secure.</p>

<p>Concurrent Technologies Corporation (CTC), an independent, nonprofit organization that conducts applied scientific research and development for government and industry, faced this exact challenge upon the creation of a nuclear research facility.</p>

<p>With industrial space and laboratories in five states, and more than 25 percent of employees telecommuting, CTC's potential insider threat profile is typical of many technology companies in the United States. Protection of sensitive government programs, client information, and intellectual property is paramount to success in a highly competitive environment.</p>

<p>But the August 2017 establishment of CTC's Center for Advanced Nuclear Manufacturing (CANM) in Johnstown, Pennsylvania, created new insider threat challenges that CTC had to address. The CANM is designed to bring fabrication technology and materials expertise to the emerging next generation of commercial nuclear power plants and will conduct business only with private sector organizations that are working on small nuclear reactors. While CTC works with both industry and sensitive government programs—and must abide by federal insider threat policies—it wanted CANM to have a government-grade insider threat program that would defend against all kinds of man-made threats, from petty theft to intellectual property loss to event security.</p>

<p>A planned ribbon-cutting and open house event at the CANM would place about 75 visitors in close proximity to CTC's intellectual property and advanced technology—and would serve as the first real test of the organization's new insider threat policy.</p>

<h4>Tailoring a Solution</h4><p>The FBI, U.S. Department of Homeland Security (DHS), and U.S.
Defense Security Service provide tools for industry organizations to develop insider threat programs, including online training courses and brochures available through public websites. The tools identify specific behaviors that may indicate the presence of an insider threat.</p>

<p>Simply educating employees on what to watch for may improve the chances of averting a workplace incident. Other insider threat program features, such as information sharing and incident reporting, could also prove beneficial. Initiatives can be tailored to fit the organization, and security practitioners may find that their programs already include parts of the overall insider threat framework outlined in government directives.</p>

<p>This was true for CTC as it began to build a more robust insider threat program. While the organization had taken an informal approach to communicating potential employee issues, that approach fell far short of the formalized program it needed. To make sure the program covered all threats, CTC created an insider threat working group.</p>

<p><strong>Comprehensive support. </strong>An insider threat program relies on buy-in throughout the organization. A single official with authority to develop policies and procedures should be appointed to manage the program. He or she should also be responsible for determining when to report substantive insider threat information to law enforcement and other entities outside the organization.</p>

<p>CTC appointed an insider threat program official and established a working group with membership based on relevant roles, including representatives from security, human resources, IT, executive management, and ethics and compliance. The working group conducted several program reviews and established the types of activities to watch for or report.
</p>

<p>The group also ensured that all employees completed awareness training in the time leading up to the CANM open house and helped foster a culture of communication so that employees would not hesitate to report concerns about visitors or fellow employees. Line employees are often the first to sense that something is off—if they notice changes in an employee's routine or behavior, they should know how to safely and effectively communicate the information to team leaders without fear of retribution.</p>

<p>Security staff and senior managers stood ready to work with department managers and labor representatives to reduce or eliminate social barriers to reporting. Reporting policy violations and unusual or suspicious behavior must not be viewed as tattling. Instead, it should be emphasized that timely reporting may save the company or business unit from significant financial loss, unfair competition, or even a tragic incident.</p>

<p><strong>Team approach. </strong>Effective information sharing and collaboration among security stakeholders in the organization are essential for a stalwart insider threat program. Functional leaders—like the ones in CTC's insider threat working group—typically monitor organizational performance in areas relevant to detecting a potential insider threat. For example, larger organizations usually rely on a chief information security officer (CISO) to detect violation or circumvention of policies regarding systems access, file transfers, software installation, and other network activities. Likewise, the human resources department should track, analyze, and share information on trends in employee misconduct, including harassment complaints and drug testing results. In reviewing such information, the team must take care to protect employee privacy, focus only on security-relevant factors that might indicate an insider threat, and identify needed adjustments in policies and training.
</p><p>For special events and unusual situations, organizations should not shy away from reaching out for help. The CTC insider threat program's leader contacted the FBI private sector coordinator, Defense Security Service representatives, and local law enforcement officials several weeks before the open house to inform them about the event and to obtain updated threat information. The FBI coordinator participated in an event rehearsal and walkthrough, and provided a tailored counterintelligence briefing to CANM engineers, program managers, and support staff, offering specific recommendations to limit risk while accomplishing overall open house objectives.  </p><p><strong>Training. </strong>Employees should feel that they share a common security interest—success for themselves and for the entire organization requires their commitment to protecting intellectual property, proprietary information, and other valuable resources. Leaders must emphasize these points and encourage employees to actively support security programs and procedures. Employee commitment and loyalty to a common cause cannot be assumed, particularly in industries that experience high employee turnover. </p><p>Training employees to watch for specific activities and behaviors that may indicate an insider threat is the key to viable information reporting within the organization. Employees tend to recognize differences in a coworker's attitude, work ethic, or behavior well before an incident occurs, so they must know when and how to report concerns. Employees must also know how to recognize suspicious emails, scams, phishing attempts, and social engineering tricks to avoid becoming an unwitting insider or being coerced into providing information or other assistance. Training should also emphasize the importance of following basic rules aimed at mitigating risk, such as locking or switching off computer workstations when unattended.  
</p>

<p>CANM employees were trained on traditional insider threat identification messages but were also given tips on identifying and reporting suspicious behavior at the open house event.</p>

<p>Because engineers, program managers, and event staff integrated security best practices into their job requirements, enhanced security was everywhere yet remained unseen at the event.</p>

<p><strong>Written plans. </strong>The insider threat working group at CTC identified all written guidance regarding employee behavior, from harassment policies and timekeeping systems to travel plans and procedures, and integrated it into the plan. The insider threat program features a risk mitigation plan that identifies insider threat stakeholders, roles and responsibilities, resources, policies, and procedures. The team of stakeholders meets periodically to review the plan, share and assess potential insider threat information, and determine additional actions needed to protect people, operations, intellectual property, and other resources.</p>

<p>For example, at a stakeholder meeting, someone in charge of travel finances might point out that the rental car budget for the previous month was 20 percent larger than normal. Human resources personnel can then revisit employee travel dates and potentially identify excessive use of rental vehicles for personal travel. The same insider threat reporting procedures should be followed to address the problem.</p>

<h4>Redefining Insider Threats</h4><p>CTC's reevaluation and preparation paid off—the open house event went smoothly for staff and visitors alike.</p>

<p>CTC security officials are also reaping longer-term benefits from the CANM experience.
For example, the department is improving its approach to training by conducting lunchtime seminars and more personal interviews with employees to reinforce the significant role that each employee plays in countering insider threats, even if security is not their primary role.</p>

<p>In addition to the CANM program, other business changes prompted CTC to reassess potential threats and strengthen routine security procedures. New contracts with government clients outside the DoD brought new requirements and concerns for protecting sensitive information processed and stored on company networks. The company invested in new equipment, and other areas of business development brought increased interaction with international customers—along with added challenges for ensuring compliance with U.S. export laws.</p>

<p>By thinking outside the box about insider threats, CTC was able to create a well-rounded employee management policy that is capable of addressing a variety of organizational concerns. Addressing a wide scope of potentially problematic employee-related activity—not just intellectual property or workplace violence concerns—through an insider threat lens strengthens the entire program and makes it more adaptable for addressing other business concerns.</p>

<p>As an example, security staff worked with shop floor staff and project managers to revise the facility's access control plan. Doors to certain industrial areas within the 250,000-square-foot CANM were closed to employees who did not have a clear need for access. Facility access hours were restricted for many employees, and a proximity card in addition to a six-digit PIN is now required to use doors that are not routinely monitored. Process owners and senior managers fully grasped the need for such procedural changes and strongly supported the recommendations.
</p>

<p>As international business contacts expanded, the security, contracts, and export compliance departments worked closely with program managers to ensure that export licenses encompass all international dealings involving protected technologies. The company's enterprise visitor system, internally developed in 2012 and upgraded in 2015, electronically routes international visit requests for coordination and approval. This ensures that the right managers and technicians are informed, projects are shrouded, or operations are suspended or rescheduled as needed.</p>

<p>With such low- or no-cost security enhancements in place, establishing an insider threat program required only a modest effort to formalize plans and procedures, charter a working group, and expand existing training. Other corporations working exclusively or extensively with government contracts can engineer similar results.</p>

<p>Increasing awareness of insider threats and encouraging employees to report suspicious behavior and policy violations have directly led to improved overall security. For example, information received in recent months from frontline employees has enabled managers to correct internal issues and mitigate vulnerabilities in how the company purchases, inventories, and accounts for low-cost supplies, equipment, and bench tools. Workers in the affected areas recognize how the changes reduce the risk of pilferage and unauthorized use of company assets. Minimizing such losses helps the company control overhead costs, remain competitive, and protect jobs and salaries.</p>

<p>If an organization is unaccustomed to a regimen of safety and security rules during daily business operations, it may take months to develop a security culture in which employees are likely to bring their concerns forward and key supervisors can evaluate information and respond effectively. The advantages of starting now almost certainly outweigh the risk of what could come later.
</p>

<h4>Sidebar: How Nuclear-Level Security Influenced Today's Insider Threat Programs</h4>

<p>Concerns about insider threats are not new. In the mid-1940s, during the highly secretive Manhattan Project—the United States' effort to develop the world's first atomic weapons—leaders were most concerned that a trusted insider could be blackmailed or tempted to commit espionage for money. Losing atomic secrets to enemies could have drastic—and deadly—consequences. The art of protecting critical research, test activities, materiel and weapons production, and plans for use of nuclear weapons was woven into the Manhattan Project and remains a hallmark of security within U.S. Department of Defense (DoD) nuclear programs.</p>

<p>The personnel clearance process and the personnel reliability program (PRP) have been central in addressing insider threats to nuclear capabilities since the 1960s. Clearance processes are designed to screen people for trustworthiness and must be strictly followed prior to granting an individual access to classified nuclear design information, plans, capabilities, or operating procedures. A personnel clearance is based on a favorable evaluation of factors such as the person's demonstrated financial responsibility, personal conduct, and allegiance to the United States. Cleared individuals are reinvestigated periodically to ensure continued access is appropriate. Those in unusually sensitive and critical positions may be subjected to polygraphs.</p>

<p>The PRP is an added layer of administrative security comprising procedures, automated notifications, tiered supervision, and other checks designed to ensure workers are mentally and physically fit at the time they perform critical tasks, such as nuclear command and control, maintenance, or armed security. PRP requirements and standards are risk averse—the slightest concern may result in temporary suspension from normal duties until circumstances change or a problem is resolved.
A common reason for temporary suspension from duties under the PRP is use of prescription medication that may cause drowsiness. Minor disciplinary infractions may also result in PRP suspension, triggering security measures that block access to restricted facilities and information systems.</p>

<p>Together, clearance processes and the PRP foster a heightened safety and security environment in which workers are duty-bound to report relevant information about themselves and others to appropriate authorities. Such an environment is essential given the destructive power and political significance of the nuclear arsenal. Senior government and military personnel hold leaders within the nuclear community accountable for evaluating conditions that may detract from the performance of assigned tasks under the PRP. For example, removal of the responsible unit commander is often the outcome of a failure to properly adhere to PRP guidelines.</p>

<p>Historically, these stringent screening and reliability standards have seldom been applied to government and contractor enterprises outside nuclear communities. Since 2013, however, government officials have increasingly acknowledged the threat of insiders. Personnel clearance processes are now bolstered with additional screening and random selection for background checks between the traditional timespans for periodic reinvestigation. Additionally, government clearance adjudicators may now review and consider social media information when determining overall eligibility for access to national security information.</p>

<p>A series of U.S. Department of Homeland Security and DoD documents and guidelines mandate insider threat programs for agencies and certain contractors but stop short of requiring self-reporting measures such as those associated with the DoD PRP due to cost, legal concerns, and other practical considerations.
A PRP-like mindset, however, can be encouraged within any operation where inattention to detail, slowed reaction time, or a lapse in judgment could result in injury, death, or unacceptable material or financial loss.</p>

<p><em>Ronald R. Newsom, CPP, is a retired U.S. Air Force officer now employed with Concurrent Technologies Corporation, a recipient of the DoD 2017 Colonel James S. Cogswell Award for sustained excellence in industrial security. Newsom is a member of ASIS International. He also serves as the chair of the National Classification Management Society's Appalachian Chapter.</em></p>


https://sm.asisonline.org/Pages/A-Cyber-Pipeline.aspxA Cyber Pipeline<p>​​It was a tense moment. Twenty minutes before taking the stage at the 2016 RSA Conference in San Francisco, U.S. Secretary of Defense Ash Carter had signed an agreement to create the first U.S. government bug bounty program.</p><p>"I was sitting in the front row there, just shaking my head and praying everything would work out the way it was supposed to," says Lisa Wiswell, former U.S. Department of Defense (DoD) bureaucracy hacker who oversaw the bug bounty program.</p><p>And work, it did. Dubbed "Hack the Pentagon," the program allowed 1,400 security researchers to hunt down vulnerabilities on designated public-facing DoD websites. More than 250 researchers found and reported those vulnerabilities to the DoD, which paid them a total of $150,000 for their efforts.</p><p>"It's not a small sum, but if we had gone through the normal process of hiring an outside firm to do a security audit and vulnerability assessment, which is what we usually do, it would have cost us more than $1 million," Carter said in a statement. </p><p>Based on the program's success, the DoD launched "Hack the Army" in 2016, followed by "Hack the Air Force" in 2017, to continue to address security vulnerabilities in its systems. This method of crowdsourcing cyber­security is one that many organizations are turning to as they continue to struggle to recruit and retain cyber talent.</p><p>According to the most recent Global Information Workforce Study, the cybersecurity workforce gap is on pace to increase 20 percent from 2015—leaving 1.8 million unfilled positions by 2020.</p><p>"Workers cite a variety of reasons why there are too few information security workers, and these reasons vary regionally; however, globally the most common reason for the worker shortage is a lack of qualified personnel," according to the report's findings. 
"Nowhere is this trend more common than in North America, where 68 percent of professionals believe there are too few cybersecurity workers in their department, and a majority believes that it is a result of a lack of qualified personnel."</p><p>To help address this issue, study respondents reported that more than one-third of hiring managers globally are planning to increase the size of their departments by 15 percent or more. However, the report found that historically, demand for cybersecurity talent has outpaced the supply—which will continue to exacerbate the current workforce gap if the trend continues.</p><p>"It is clear, as evidenced by the growing number of professionals who feel that there are too few workers in their field, that traditional recruitment channels are not meeting the demand for cybersecurity workers around the world," the report explained. "Hiring managers must, therefore, begin to explore new recruitment channels and find unconventional strategies and techniques to fill the worker gap."</p><p>One technique to fill the worker gap is being used by the FBI, which has a long history of workforce training and development to keep agents—and Bureau staff—at the top of their game to further its mission.</p><p>In an appearance at ASIS 2017, FBI Director Christopher Wray explained that the Bureau has created a training program to identify individuals with cyber aptitude and train them so they have the skills necessary to identify and investigate cybercrime.</p><p>"We can't prevent every attack or punish every hacker, but we can build our capabilities," Wray said. 
"We're improving the way we do business, blending traditional techniques, assigning work based on cyber experience instead of jurisdiction, so cyber teams can deploy at a moment's notice."</p><p>In an interview, Assistant Section Chief for Cyber Readiness Supervisory Special Agent John Caliano says the FBI is looking internally to beef up all employees' cyber abilities.</p><p>"There is a notional thought that all the cybersmart people are in the Cyber Division," he adds. "There are a lot of very talented people outside the Cyber Division, some have worked in other areas…the goal is to start to pick up in the investigative realm and lift the abilities of all employees, so they have a basic understanding of cyber and digital threats today."</p><p>To do this, the FBI has employees undergo a cyber talent assessment which looks at the skill sets they brought with them when they were hired, the skills they have learned on the job, and their aptitude for formalized and informalized training on cybersecurity and technology. </p><p>The FBI then sorts employees into three categories: beginners, slightly advanced, or advanced. Employees are then sent to outside educational courses, such as those provided by the SANS Institute or partnering universities, to learn more about cybersecurity and bring that knowledge back to the FBI. The FBI also works with the private sector to embed employees to teach them specialized skills, such as how SCADA networks operate.</p><p>In 2016, Caliano says, the FBI identified 270 employees for cyber training who were not part of the Cyber Division. Approximately two-thirds of those employees were categorized as beginners at the outset, and Caliano says the Bureau plans to continue the assessments and training for the foreseeable future.</p><p>And for its specialized teams, the FBI is continuously developing in-house training that will eventually be offered to the entire FBI. 
</p><p>"One day, all FBI employees will take these courses and pass these courses," he says. "People will understand what depth and defense mean, how to secure networks, and trace IP addresses."</p><p>These specialized teams include its Cyber Action Team (CAT), which is made up of employees who deploy when a major cyber incident occurs. For instance, when the Sony hack occurred in 2013 the initial FBI response team had a few members who were also CAT members who were sent to the scene.</p><p>Once the FBI became aware of the severity of the incident, it sent a full CAT to Sony's headquarters to sit with the network operators to comb through their logs to see how the attack spread.</p><p>While this training provides professional development opportunities to current employees, the FBI is also focused on identifying future talent that can be recruited into the FBI. </p><p>"We can't compete with dollars, but we can compete on mission," Caliano says, adding that the FBI often gets to look at cyber threats and address them in a way that the private sector does not, providing employees a "deeper sense of fulfillment."</p><p>To attract talent, the FBI has a variety of initiatives including an Honors Intern Program open to all college students. It also has a postgraduate program where the FBI will pay for a graduate or doctoral student's degree. It's also reaching out to students at the high school level through its Pay It Forward program, which engages students in math, science, and technology who might show cyber aptitude.</p><p>"We are, as a workforce planning objective, training at schools—driving down to the high school level," Caliano tells Security Management.</p><p>Another new recruiting channel has been championed by Wiswell since she left the DoD in 2017. After leaving the public sector, she went to work at GRIMM, a cybersecurity engineering and consultant firm, as a principal consultant. 
One of her main responsibilities is to oversee its GRIMM Academic Partnership Program that runs the HAX program.</p><p>Through HAX, undergraduate cybersecurity clubs can participate in friendly competitions and gain hands-on cyber experience. GRIMM has partnered with Penn State University at Altoona's Security Risk Analysis Club and Sheetz Entrepreneurial Fellows Program, the Michigan Technological University (MTU) Red Team, George Mason University Competitive Cyber Club, and the Rochester Institute of Technology's Rochester Cybersecurity Club.</p><p>Throughout the academic year, participants in HAX break into teams to complete programs designed by GRIMM engineer Jamie Geiger that are similar to computer Capture the Flag challenges. While participants have the option to compete individually, Wiswell says she encourages students to create a team to hone their communication skills.</p><p>"A lot of this field has an individualist focus a lot of the time, and what's really needed is the ability to communicate well, both up and down, to work well on teams, and to have effective analytical skills," she explains. "The kinds of things that you learn well by doing these kinds of team-based challenges."</p><p>GRIMM chose these programs in particular to create a talent pipeline for the company, which has offices in the Washington, D.C., area and in Michigan—near two of the universities it's partnered with. By engaging college students through HAX, GRIMM hopes to create a talent pipeline and increase diversity on its own staff.</p><p>"HAX is an effort to do both those things," Wiswell says. "We are kind of do-gooders on one hand. If folks that are participating in the program have no interest in coming to work for GRIMM, that's fine. 
We just hope that they use their talents and go somewhere."</p><p>That's why the challenges and the experience to connect with people working in cybersecurity are important, according to Wiswell, because it helps students make informed decisions about what they would like to do after graduation.</p><p>"We're trying to think outside the box in ways that students feel very well rounded, so students can make decisions on what sliver of this workforce is most interesting," Wiswell says, explaining that current challenges are focused on Linux and Microsoft systems, but in the future, might include hardware and other areas. </p><p>And to gain even more experience before graduation, Wiswell says she encourages students to take part in bug bounty programs to get connected to companies that might one day hire them.</p><p>"If you already have a lot of good skill and you're trying to hone skill—and make some cash—we think that bug bounty programs are a great way to do that," Wiswell explains to Security Management. "GRIMM is partnered with a couple bug bounty as a service providers to help them get in a broader group of individuals who are interested in participating, as well as companies that could benefit from hosting bug bounties themselves."   ​</p>GP0|#91bd5d60-260d-42ec-a815-5fd358f1796d;L0|#091bd5d60-260d-42ec-a815-5fd358f1796d|Cybersecurity;GTSet|#8accba12-4830-47cd-9299-2b34a4344465
https://sm.asisonline.org/Pages/Take-No-Chances.aspxTake No Chances<p>​Security processes are working properly if nothing happens, as the adage goes—much to the chagrin of the security manager looking for buy-in from the C-suite. But if something does go wrong at an organization, the error lies in either the company's risk profile or its implementation of mitigation procedures. Using risk management principles to create a risk profile and implement procedures to mitigate those risks should leave no gray areas for an incident to occur, says Doug Powell, CPP, PSP, security project manager at BC Hydro. Security Management sat down with Powell, the 2017 recipient of the Roy N. Bordes Council Member of Excellence Award, to discuss how to create a mitigation program that only gets stronger after a security incident.​</p><h4>Weigh the Risks…</h4><p>A basic tenet of risk management is understanding what risks an organization faces by conducting a thorough risk assessment. "For me, nothing should happen in the security program in terms of making key decisions around protection principles until you've been through your risk management exercise, which will do two things for you: tell you where you have gaps or weaknesses, and what the priority is for addressing those," Powell says. </p><p>Look for the risks that are high-probability, low-impact—such as copper theft—and low-probability, high-impact—such as a terror attack—and build a protection plan that primarily addresses those, Powell says. </p><p>"You use that prioritization to get funding," he explains. 
"I tell people there's a broad spectrum of risks you have to consider, but there are two that you focus on that I call the board-level risks—the ones the board would be interested in because they could bring down the company."​</p><h4>…And Use Them to Build a Strategy</h4><p>Establishing those risk categories will not only help get buy-in from the C-suite but also frame the company's security strategy.</p><p>"You should never say something like, 'well, the copper losses are so small that we're not going to deal with this at all,' in the same way you're not going to say that you'll never likely be attacked by terrorists so let's not worry about it," Powell says. "With that in place, you should have an effective mitigation strategy on the table."​</p><h4>Flesh Out the Baseline…</h4><p>While getting buy-in may rely on emphasizing the impact a risk can have on business operations, the security team needs to have a well-rounded understanding of the risk itself. Powell illustrates the distinction by using an example of how protesters might affect critical infrastructure.</p><p>"It's one thing to say that there's risk of work being disrupted or of a pipeline being taken out of service by protesters, but it's quite another thing to say that in the context of who these protesters are," according to Powell. </p><p>"You have one level of protesters who are just people concerned about the environment, but all they really do is write letters to the government and show up and carry picket signs to let you know they are concerned. The more extreme groups are the ones that would come with explosives or physically confront your workers or who would blockade machinery," Powell explains.</p><p>While these two groups of people both fall under the protester category, the risks they present—and how to respond to them—are vastly different.</p><p>"You have to understand the characteristics of your adversaries before you can adequately plot the seriousness of the risk," Powell explains. 
"Would it be serious if our pipeline got blown up? You bet it would. But who has the capability to do that? Are they on our radar? And what's the probability that we would ever interact with them? There's a bit more than just saying it's a bad thing if it happens."​</p><h4>…And Keep It Updated</h4><p>Don't let an incident be the impetus for conducting a new risk assessment. Creating a governance model will facilitate regular reviews of the risk assessment and how it is conducted.</p><p>"If you do it well at the head end, you should be mitigating to those standards," Powell says. "Risk doesn't happen once a year, it's an ongoing process where you establish the baseline, mitigate to the baseline, and start watching your environment to see if anything bad is coming at you that you should be taking seriously because the world is dynamic."</p><p>Consistent monitoring of threats allows the mitigation strategy to be adjusted before weaknesses are discovered and exploited.</p><p>"The monitoring aspect is critical, and after an incident you might say that the reason your mitigation plan failed is you simply didn't monitor your environment enough to realize there were new risk indicators you should have picked up," Powell says. "The risk management process is dynamic, it never stops, it's continually evolving, and whether something happens to cause you to reevaluate or whether you reevaluate because that's your normal practice, that has to happen."</p><h4>Establish a Process…</h4><p>In risk management terms, a security incident occurs when the risk assessment was not accurate or the mitigation processes were not properly carried out. After an incident, security managers should never feel blindsided—they must identify the shortcomings in their processes.</p><p>"When something critical happens, the first thing you will do is go back to your risk profile and ask yourself some key questions," Powell advises. "Did we get it right? Did we miss something? 
How did this incident occur if in fact we had our risk profile correct? Or did our mitigation planning not match well with the risk profile we had developed? If we had this assessed as low-risk but it happened anyway, maybe we got something wrong. If it was high-risk and it happened anyway, what was the cause?"</p><p>If the security program matches the risk profile and an incident still occurred, it's time for the organization to change the baseline.</p><p>"Did we understand our adversary?" Powell asks. "Was it someone we anticipated or someone we didn't anticipate? If it was someone we anticipated, how did they get in to do this thing without our being able to stop it or understand that they were even going to do it? Do we have the right security in place, did we do the right analysis on the adversarial groups in the first place? What did we miss? Are there new players in town? Is there something going on in another country that we weren't aware of or ignored because we didn't think it impacted us over here in our part of the world?"</p><p>And, if it turns out that the risk profile was inaccurate despite proper governance and maintenance, don't just update it—understand why it was wrong. "Look at whether your intelligence programs or social media monitoring are robust enough," Powell suggests.</p><p>"If you had 10 or 100 metal theft incidents in a month, you want to go back and ask why this is continuing to happen," Powell notes. "We've already assessed it as a risk and tried to mitigate it. For me, the two things are intrinsically connected. If you're performing risk management well, then your mitigation programs should mirror that assessment. 
If they don't, there's a problem, and that's what this review process does, it gets you into the problem."​</p><h4>…And Use It Consistently</h4><p>Whether it's copper theft or a terrorist attack, the incident management process should be carried out in the same way.</p><p>"That should always be a typical incident management process for any kind of event," Powell says. "What varies is input, but the methodology has got to be identical. If it's metal theft, it's a pretty simple thing—we have some thieves, they broke into a substation, removed ground wires, and as a result this happened. What can we do to mitigate that happening at other substations in the future? </p><p>"If it's a terrorist attack, of course a lot more people will be involved, and you'll be asking some very challenging questions. The process becomes a lot more complex because the potential for damage or consequence value is much higher, but the methodology has to be the same all the time."</p><p>"Overall, whether you're looking at a security breach that happened because you exposed your cables and the bad guys were able to cut them or whether it was a new, more dangerous group coming at you that you weren't aware of, or because you neglected to identify the risk appropriately—all of this has to go into that evaluative process after something happens," Powell says. "Then you have to reestablish your baseline, so you're going back into that risk analysis and move to mitigate it according to what that new baseline is. If something bad happens that's what you do—go back to the baseline and discover what went wrong, and once you know, you seek to mitigate it to the new baseline." </p>
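Powell's two-axis prioritization—flagging the high-probability/low-impact risks and the low-probability/high-impact "board-level" risks—can be sketched in code. This is a minimal illustration, not anything from the interview: the risk names, scores, and cutoff thresholds below are assumed values chosen only to show the sorting logic.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # estimated likelihood, 0.0-1.0 (assumed scale)
    impact: float       # consequence severity, 0.0-1.0 (assumed scale)

def priority_risks(risks, prob_cut=0.5, impact_cut=0.5):
    """Return the two categories Powell singles out for the protection plan:
    high-probability/low-impact and low-probability/high-impact risks."""
    flagged = []
    for r in risks:
        chronic = r.probability >= prob_cut and r.impact < impact_cut
        catastrophic = r.probability < prob_cut and r.impact >= impact_cut
        if chronic or catastrophic:
            flagged.append(r.name)
    return flagged

# Illustrative register of risks with made-up scores.
risks = [
    Risk("copper theft", probability=0.9, impact=0.2),
    Risk("terror attack", probability=0.05, impact=0.95),
    Risk("graffiti", probability=0.3, impact=0.1),
]
print(priority_risks(risks))  # ['copper theft', 'terror attack']
```

In practice the scores would come out of the organization's risk assessment rather than being assigned by hand, but the takeaway is the same: the mitigation plan is built against the flagged list, not against every risk equally.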
https://sm.asisonline.org/Pages/Find-the-Fire.aspxFind the Fire<p>​The University of Hawaii at Hilo (UHH), founded in 1941, is located on the largest island of the Hawaiian archipelago, Hawaii, also known as "the Big Island." The school offers 38 undergraduate areas of study, including a renowned astronomy program, to approximately 3,600 students.</p><p>The Hawaiian skies over the central Pacific Ocean offer a spectacular view of the heavens. </p><p>But despite the campus's magnificent panoramas, the university's security staff found itself gazing too often at fire panels that weren't functioning properly, says Ted LeJeune, project manager at UHH. </p><p>When the campus began major renovations about five years ago, the security department ran into challenges with the fire panels, which worked via radio signal. "We were starting to experience issues with the reflectivity and the inconsistencies of the radio system," LeJeune says, "so we were having trouble passing our final fire inspections with the fire marshal."</p><p>The institution's fire system includes panels that intermittently report back to a central station in the campus security office. "On a regular basis, the panels transmit signals that say, 'Hey, I'm here, I'm doing fine,'" LeJeune explains. "And as long as we get that heartbeat notification, the security office knows that we don't have any problems."</p><p>The fire panels report any issues to the central station, including triggered smoke detectors, pulled fire alarms, and offline panels. When any of these alarms are triggered, "we get an immediate notification to our campus security office that we have an issue with a building, and we need to dispatch somebody to investigate," LeJeune notes.</p><p>In the campus security operations center, which is staffed around the clock, security staff members monitor a large screen that displays the fire life safety system's current status, as well as active alarms. 
The screen allows operators to scroll through notifications and keep an archive of reports. In case of fire or another life-threatening hazard, the fire department is contacted. </p><p>The campus roofs are made of corrugated steel. But whenever the Hawaiian sun would hit the metal rooftops, the signals could get diffused or jammed, causing the radio-based fire alarm systems to report inconsistently, or not at all. This led to a host of issues for the campus security department. </p><p>"We were having intermittent connectivity and even losing connectivity to some of the locations because of the radio signal reflectivity of our roof systems," LeJeune says. </p><p>Besides the connectivity and transmission issues, the old radio units were burdensome to maintain, and an outside engineer had to travel to the campus to service the units. </p><p>These challenges led to a conversation with Digitize, which provides several aspects of the campus's fire life safety system. In the fall of 2016, Digitize suggested land-based radio units that connect into the university's existing fiber optic cable and Ethernet system. "We've done several upgrades over the last few years to standardize and stabilize our Internet," LeJeune explains, "and it was just a natural extension to add Digitize to the land system because we already had the existing backbone."</p><p>The land-based radio units allow the end user to remove the frequency transmitter on the fire panels, and connect into either the Ethernet or fiber connections in the buildings. This landline connection enables the panels to report back to the central station within seconds. </p><p>UHH launched a pilot project in the spring of 2017 to test the new product on its recently renovated College of Business and Economics building. The university upgraded its base unit in the campus security office to accommodate both the radio frequency and the land inputs. 
</p><p>During the testing, the land-based units successfully and accurately reported all issues to the central station. "Our pilot project went fantastically," LeJeune says. "We were able to retrofit the remote unit [with the landline], and we were able to clearly communicate and program the base unit," he says. The school also brought the fire department in to observe the new system. "They were thrilled that we were getting a more stable network and that we were able to more clearly manage and supervise our system." </p><p>Since installing the new system, the campus has not experienced any issues with fire alarm panel reporting. Over the next several months, the campus plans to add additional land-based units to at least 25 buildings. Some of the larger buildings will have their own unit while groups of smaller buildings can share units, LeJeune adds. </p><p>With the new system, UHH security staff can service the panels themselves, rather than relying on an outside engineer. "Digitize has given us in-house training, so that we can not only diagnose but also put new systems online, and program them at both ends to communicate consistently and properly," he notes. "The ability to work on them internally…and the training that we've been able to get from Digitize has just been a real major step forward for us." </p><p>He adds the new system allows security to fully focus on the issues that deserve attention. "It's about having confidence that we have consistent communications, and that we're not getting dropouts or false alarms," he says. "This allows the security office folks to focus on their assigned tasks rather than chasing ghosts and false alarms."</p><p>For more information: Abe Brecher, Digitize, www.digitize-inc.com, abeb@digitalize-inc.com, 973.219.2567 ​</p>
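The "heartbeat" supervision LeJeune describes—panels periodically checking in, and the central station flagging any panel that goes quiet—can be sketched as follows. This is a generic illustration of the pattern, not Digitize's actual implementation: the panel names, check-in interval, and missed-check-in limit are all assumed values.

```python
import time

HEARTBEAT_INTERVAL = 60  # seconds between expected check-ins (assumed)
MISSED_LIMIT = 3         # check-ins a panel may miss before being flagged (assumed)

class CentralStation:
    """Tracks 'Hey, I'm here, I'm doing fine' heartbeats from fire panels
    and reports any panel that has gone silent for too long."""

    def __init__(self):
        self.last_seen = {}  # panel id -> timestamp of most recent heartbeat

    def heartbeat(self, panel_id, now=None):
        """Record a check-in from a panel."""
        self.last_seen[panel_id] = now if now is not None else time.time()

    def offline_panels(self, now=None):
        """Return panels whose last heartbeat is older than the allowed window."""
        now = now if now is not None else time.time()
        cutoff = HEARTBEAT_INTERVAL * MISSED_LIMIT
        return sorted(p for p, t in self.last_seen.items() if now - t > cutoff)

# Simulated timeline with fixed timestamps instead of the wall clock.
station = CentralStation()
station.heartbeat("library", now=0)
station.heartbeat("gym", now=0)
station.heartbeat("library", now=120)   # library keeps reporting; gym goes quiet
print(station.offline_panels(now=300))  # ['gym']
```

A real central station would also receive explicit alarm messages (smoke detectors, pull stations) rather than only inferring trouble from silence; the heartbeat is what catches the failure mode UHH was seeing, where the radio link itself dropped out.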