National Security

 

 

The Virtual Lineup
By Lilly Chapa | February 1, 2017
https://sm.asisonline.org/Pages/The-Virtual-Lineup.aspx

U.S. state and federal agencies are amassing databases of American citizens’ fingerprints and images. The programs flew largely under the public radar until a government watchdog audited them. The so-called “virtual lineups” include two FBI programs that use facial recognition technology to search a database containing 64 million images and fingerprints.

In May 2016, the U.S. Government Accountability Office (GAO) released Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy, a report on the FBI programs. Since 1999, the FBI has used the Integrated Automated Fingerprint Identification System (IAFIS), which digitized the fingerprints of arrestees. In 2010, a $1.2 billion project began to replace IAFIS with Next Generation Identification (NGI), a program that pairs fingerprint data with facial recognition technology through the Interstate Photo System (IPS). The FBI began piloting NGI-IPS in 2011, and the system became fully operational in April 2015.

NGI-IPS draws most of its photos from some 18,000 federal, state, and local law enforcement entities and consists of two categories: criminal and civil identities. More than 80 percent of the photos are criminal, obtained during an arrest, while the rest are civil and include photos from driver’s licenses, security clearances, and other photo-based civil applications. The FBI, the only agency able to directly access NGI-IPS, can use facial recognition technology to support active criminal investigations by searching the database for potential matches to the image of a suspect.

Diana Maurer, director of justice and law enforcement issues on the homeland security and justice team at GAO, explains to Security Management that the FBI can conduct a search for an active investigation based on images from a variety of sources, such as camera footage of a bank robber. Officials input the image into NGI-IPS, and the facial recognition software returns as many as 50 possible matches. The results are investigative leads, the report notes, and cannot be used to charge an individual with a crime. A year ago, the FBI began allowing seven states (Arkansas, Florida, Maine, Maryland, Michigan, New Mexico, and Texas) to submit photos to be run through NGI-IPS. The FBI is working with eight additional states to grant them access, and another 24 states have expressed interest in using the database.
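At its core, a search like that is a ranked similarity lookup: the probe image is compared against every enrolled photo and the closest matches come back as a fixed-size candidate list. The sketch below is a toy illustration of that idea, not the FBI’s system; it assumes face photos have already been converted into fixed-length feature vectors (embeddings), and the function name and data are hypothetical.

```python
# Toy "virtual lineup" search: rank enrolled face templates by similarity to a
# probe image and return a fixed-size candidate list. Illustrative only; it
# assumes face embeddings already exist and is not the FBI's actual algorithm.
import numpy as np

def candidate_list(probe, gallery, photo_ids, k=50):
    """Return the k gallery photos most similar to the probe embedding."""
    probe = probe / np.linalg.norm(probe)                      # unit-normalize the probe
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe                                   # cosine similarity to every photo
    top = np.argsort(scores)[::-1][:k]                         # indices of the k highest scores
    return [(photo_ids[i], float(scores[i])) for i in top]

# Hypothetical example: 10,000 random 128-dimensional templates and one probe.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(10_000, 128))
photo_ids = [f"photo_{i}" for i in range(10_000)]
probe = rng.normal(size=128)
leads = candidate_list(probe, gallery, photo_ids, k=50)        # investigative leads, not IDs
print(leads[:3])
```

Because the output is a ranked list of leads rather than a single identification, every candidate still has to be vetted by an investigator before any action is taken.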
“The fingerprints and images are all one package of information,” Maurer says. “If you’ve been arrested, you can assume that you’re in, at a minimum, the fingerprint database. You may or may not be in the facial recognition database, because different states have different levels of cooperation with the FBI on the facial images.”

The FBI has a second, internal investigative tool called Facial Analysis, Comparison, and Evaluation (FACE) Services. This more extensive program runs similar automated searches using NGI-IPS as well as external partners’ face recognition systems, which contain primarily civil photos from state and federal government databases, such as driver’s license and visa applicant photos.

“The total number of face photos available in all searchable repositories is over 411 million, and the FBI is interested in adding additional federal and state face recognition systems to their search capabilities,” the GAO report notes.

Maurer, who authored the GAO report, says researchers found a number of privacy, transparency, and accuracy concerns with the two programs. Under federal privacy laws, agencies must publish a System of Records Notice (SORN) or Privacy Impact Assessment (PIA) in the Federal Register identifying the categories of individuals whose information is being collected. Maurer notes that the information in such notices is “typically very wonky and very detailed” and is “not something the general public is likely aware of, but it’s certainly something that people who are active in the privacy and transparency worlds are aware of.”

GAO found that the FBI did not issue timely or accurate SORNs or PIAs for its two facial recognition programs. In 2008, the FBI published a PIA of its plans for NGI-IPS but didn’t update the assessment after the program underwent significant changes during the pilot phase, including the addition of facial recognition services. Additionally, the FBI did not release a PIA for FACE Services until May 2015, three years after the program began.

“We were very concerned that the Department of Justice didn’t issue the required SORN or PIA until after FBI started using the facial recognition technology for real world work,” Maurer notes.

Maurer says the U.S. Department of Justice (DOJ), which oversees the FBI, disagreed with the GAO’s concerns over the notifications. Officials said the programs didn’t need PIAs until they became fully operational, but the GAO report noted that the FBI conducted more than 20,000 investigative searches during the three-year pilot phase of the NGI-IPS program.

“The DOJ felt the earlier version of the PIA was sufficient, but we said it didn’t mention facial recognition technology at all,” Maurer notes.

Similarly, the DOJ did not publish a SORN addressing the collection of citizens’ photos for facial recognition capabilities until GAO completed its review. Even though the facial recognition component of NGI-IPS has been in use since 2011, the DOJ said the existing version of the SORN (the 1999 version, which addressed only legacy fingerprint collection activities) was sufficient.

“Throughout this period, the agency collected and maintained personal information for these capabilities without the required explanation of what information it is collecting or how it is used,” the GAO report states.

It wasn’t until May 2016, after the DOJ received the GAO draft report, that an updated SORN was published, Maurer notes. “So they did it very late in the game, and the bottom line for both programs is the same: they did not issue the SORNs until after both of those systems were being used for real world investigations,” Maurer explains.
In the United States, there are no federally mandated repercussions for skirting privacy laws, Maurer says. “The penalty that they will continue to pay is public transparency and scrutiny. The public has very legitimate questions about DOJ and FBI’s commitment to protecting the privacy of people in their use of facial recognition technology.”

Another concern the GAO identified is the lack of oversight or audits of facial recognition use in active investigations. The FBI has not completed an audit of the effectiveness of NGI-IPS because, it says, the program has not been fully operational long enough. As with the PIA and SORN disagreements, the FBI says NGI-IPS has only been fully operational since it completed pilot testing in April 2015, while the GAO notes that parts of the system have been used in investigations since the pilot program began in 2011.

The FBI faces a different problem when it comes to auditing the FACE Services databases. Because FACE Services uses up to 18 different databases, the FBI does not have the primary authority or obligation to audit the external ones; that responsibility lies with the owners of the databases, DOJ officials stated. “We understand the FBI may not have authority to audit the maintenance or operation of databases owned and managed by other agencies,” the report notes. “However, the FBI does have a responsibility to oversee the use of the information by its employees.”

Audits and operational testing of the face recognition technology are all the more important because the FBI has conducted only limited assessments of the searches’ accuracy, Maurer notes. The FBI requires NGI-IPS to return a correct match of an enrolled person at least 85 percent of the time, a threshold that was met during initial testing. However, Maurer points out that this detection rate was based on a candidate list of 50 photos returned by the system, even though investigators sometimes request fewer results. Additionally, the FBI’s testing database contained 926,000 photos, while NGI-IPS contains about 30 million.

“Although the FBI has tested the detection rate for a candidate list of 50 photos, NGI-IPS users are able to request smaller candidate lists—specifically between two and 50 photos,” the report states. “FBI officials stated that they do not know, and have not tested, the detection rate for other candidate list sizes.”
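GAO’s concern is straightforward to demonstrate: shrinking the candidate list can only drop photos, so a detection rate measured at 50 does not automatically carry over to a list of five or ten. The sketch below is a hypothetical evaluation harness, not anything the FBI uses; it shows how a detection rate could be measured for each list size investigators can request, along with a false positive rate of the kind discussed below (how often a search for someone who is not in the database still returns high-scoring candidates). The SearchResult structure, the sample data, and the threshold value are assumptions for illustration.

```python
# Hypothetical evaluation harness (not the FBI's): for a labeled set of test
# searches, measure how often the true identity appears in the top-k candidate
# list ("detection rate") and how often a search for someone NOT enrolled still
# returns a candidate above a similarity threshold ("false positive rate").
from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchResult:
    ranked_ids: list                 # gallery identities, best match first
    scores: list                     # similarity score for each candidate, same order
    true_id: Optional[str] = None    # ground truth; None if subject is not enrolled

def detection_rate(results, k):
    mated = [r for r in results if r.true_id is not None]
    hits = sum(r.true_id in r.ranked_ids[:k] for r in mated)
    return hits / len(mated)

def false_positive_rate(results, k, threshold):
    nonmated = [r for r in results if r.true_id is None]
    alarms = sum(any(s >= threshold for s in r.scores[:k]) for r in nonmated)
    return alarms / len(nonmated)

# Tiny synthetic example: two searches for enrolled subjects, one for a stranger.
results = [
    SearchResult(["a", "b", "c"], [0.91, 0.52, 0.40], true_id="b"),   # found at rank 2
    SearchResult(["d", "e", "f"], [0.88, 0.47, 0.33], true_id="z"),   # true match missed
    SearchResult(["g", "h", "i"], [0.95, 0.60, 0.10], true_id=None),  # not enrolled
]
for k in (1, 2, 3):
    print(k, detection_rate(results, k), false_positive_rate(results, k, threshold=0.8))
```

In a setup like this, the detection rate can only fall as the list shrinks, which is exactly why GAO wanted the smaller list sizes tested rather than inferred from the 50-photo figure.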
Maurer notes that the GAO recommendation to conduct more extensive operational tests of accuracy in real-world conditions was the only recommendation the FBI fully agreed with. “It’s a start,” she says.

The FBI also has not tested the false positive rate, that is, how often NGI-IPS searches erroneously match a person to the database. Because the results are intended to serve only as investigative leads, not positive identifications, the false positive rates are not relevant, FBI officials stated.

“There was one thing they seemed to miss,” Maurer says. “The FBI kept saying, ‘if it’s a false positive, what’s the harm? We’re just investigating someone, they’re cleared right away.’ From our perspective, the FBI shows up at your home or place of business, thinks you’re a terrorist or a bank robber, that could have a really significant impact on people’s lives, and that’s why it’s important to make sure this is accurate.”

The GAO report notes that the collection of Americans’ biometric information, combined with facial recognition technology, will continue to grow both at the federal investigative level and in state and local police departments.

“Even though we definitely had some concerns about the accuracy of these systems and the protections they have in place to ensure the privacy of the individuals who are included in these searches, we do recognize that this is an important tool for law enforcement in helping solve cases,” Maurer says. “We just want to make sure it’s done in a way that protects people’s privacy, and that these searches are done accurately.”

This type of technology isn’t limited to law enforcement, according to Bloomberg’s Hello World video series. FindFace, a new Russian app from NTechLab, allows users to photograph anyone they come across and learn that person’s identity. Like the FBI databases, the app uses facial recognition technology to search a popular Russian social network and other public sources, with a 70 percent accuracy rate; the creators of the app boast a database of 1 billion photographs. Moscow officials are working with FindFace to integrate the city’s 150,000 surveillance cameras into the existing database to help solve criminal investigations. But privacy advocates are raising concerns about other ways the technology could be used. For example, a user could learn the identity of a stranger on the street and later contact that person. And retailers and advertisers have already expressed interest in using FindFace to target shoppers with ads or sales based on their interests.

Whether it’s a complete shutdown of Internet access or careful monitoring of potentially dangerous content, countries and companies around the world are taking advantage of the possibilities, and the power, inherent in controlling what citizens see online. As criminals and extremists move their activities from land and sea to technology, governments must figure out how to counter digital warfare while respecting and protecting citizens’ basic human right to Internet access.

 

 
