

    NEWS AND ARTICLES

    Follow GRC 31000 on social media to receive the latest news on Risk, Security and Related Technology from our team of experts, academics, professionals, partners and trainers.

    Analysis of Radiation Poisoning Crimes

    September 25, 2016
    Introduction

    There seem to be two certainties in the Alexander Litvinenko case. The first is that he died from polonium-210 (210Po) poisoning. The second is that 210Po as a poison is almost certain to kill anyone who ingests it. Alexander Litvinenko died on 23 November 2006, 22 days after first checking into Barnet General Hospital in north London. He may have claimed that he was poisoned, but he died not knowing what the poison was, a recurring theme of uncertainty and opaqueness consistent with a clandestine radiological incident of suspected espionage.

    On November 3rd, 2006, Litvinenko was admitted to Barnet General Hospital in north London with symptoms of vomiting, diarrhea, fatigue and abdominal pain, consistent with the prodromal phase of acute radiation syndrome. His condition deteriorated and he was transferred to University College Hospital (UCH) in central London on November 17th, alert but weak. Litvinenko would start experiencing hair loss and bleeding in the coming days. At the time he claimed in media interviews that he had been poisoned. Litvinenko showed all the symptoms of acute radiation syndrome, but no radiation had been detected at this point. According to Geoff Bellingan, the clinical director of the department of critical care at UCH, "The Geiger counter readings were negative". 210Po emits alpha particles that do not escape the body, so it is effectively undetectable once ingested, which gives rise to suspicions of Cold War era espionage rather than terrorism. Litvinenko became progressively worse and on November 22 was intubated and placed on mechanical ventilation. His vital organs failed one after the other, and in the end his immune system collapsed as his white blood cell count plummeted to virtually nil. Britain's Atomic Weapons Establishment confirmed on November 23 that 210Po was the radionuclide emitting alpha particles into his organs, but the news did not reach the hospital in time for Litvinenko to learn what had killed him.

    International Espionage

    Using common profiling techniques, the following analysis provides additional information about the radiological event, its implications for public health and security, and the long-term political and social consequences in the aftermath.

    Adversary Target

    Alexander Litvinenko was a former officer of the Soviet-era KGB and a former agent of the Russian Federal Security Service (FSB). During the late 1990s Litvinenko worked for the FSB organized crime unit assigned to combat corruption during Russia's transition to a free market economy. It was in this post that Litvinenko became aware of what is described as officials "lining their own pockets" and using the service to "settle accounts with undesirable persons." Litvinenko became a fierce critic of Putin, whom he singled out in a bizarre press conference and personally accused of being a pedophile, causing a media frenzy. Litvinenko publicly accused Putin of orchestrating the 1999 series of apartment bombings, which Putin blamed on Chechen separatists and used to justify an invasion of Chechnya that same year. Suspiciously, a new Russian law authorizing the elimination of individuals outside of Russia whom the Kremlin accuses of

    Weapons of Mass Deception

    September 18, 2016
    Introduction

    Times change and bring about knowledge and understanding that affect perceptions. Changes in perceptions lead to changes in the behaviors of individuals, nation states, special interest groups and society. The most notable change that the cyber world introduces is a collapse of trust. This study explores whether human behaviors and social norms are carried forward to a novel domain such as cyberspace. Even though cyberspace may not be new to Generation X or Y, it is now, for the first time in human history, an additional space that an entirely new Generation Z is born into. A new generation, many of whom have spent a considerable amount of time playing simulated computer war games, is entering adulthood, where real strategies and actions have real consequences. One must wonder how this incubation in cyberspace will affect behaviors in space-time. The scope of this review is focused on the psychological, social and behavioral consequences that cyberspace and cyber-security bring about.

    Literature Review

    A common thread in the literature is that cyber-security is so novel that reliable, complete, accurate and adequate research is not only hard to find, but raises the question of whether a sufficient number of people are capable of conducting research on this complex and esoteric topic. A second cause for concern is that research is limited to a Western governance perspective, which introduces bias, ironically a preferred method used to deceive victims in cyberspace. The only group that undoubtedly has an abundant source of information on cyber-security is the world's militaries. However, due to the nature and sensitivity of this information, it is almost never acknowledged or publicly demonstrated, remains classified, and is therefore unattainable. This is one of the many faces of cybersecurity that is able to mask its true nature, and is a primary attribution factor. A second feature of cyber-security that is as enigmatic as it is descriptive is the Internet. Describing and understanding the vastness of the Internet is comparable to describing an infinitely possible set of consequences in the space-time construct. Several authors have described the Internet as a function of the "Information Revolution" where communications flow without boundaries, from handheld devices to global networks, to cars, to computers, to supercomputers, and as of 2014 also quantum computers that have the potential to render all cryptological systems obsolete. With the expected growth of mobile phone usage reaching 2 billion in 2015, there are many affected stakeholders, even when they are unaware of the psychological effects.

    Psychological effects of loaded language

    One article addresses the need to monitor vocabulary used in "management guru" literature. Quigley, Burns and Stallard warn that special interest groups are using language to influence society when overstating the threat or using tropes to further political or financial interests. These three psychologists found that scientific articles and technical studies refrained from injecting loaded language into the literature, in contrast to media and privately sponsored articles promoting cybersecurity

    Risk Communication Strategy

    November 13, 2015
    Background

    Since the 9/11 terrorist attacks, there have been multiple other terror attacks on transportation and critical infrastructure, including the 2004 Madrid train bombings, the 2005 London bombings and the 2006 Mumbai train bombings, giving sufficient reason to believe that public transportation and critical infrastructure remain at risk in the United States. Title 6 U.S.C. § 1112 – Authorization of the Transportation Security Administration's (TSA) Visible Intermodal Prevention and Response (VIPR) teams authorizes the program to "augment the security of any mode of transportation at any location within the United States". The VIPR program's mission is [was, webpage now removed] to "promote confidence in and protect our nation's transportation systems through targeted deployment of integrated TSA assets utilizing screening and law enforcement capabilities in coordinated activities to augment security of any mode of transportation" (TSA.gov Website, 2013 until removed). The methods that are authorized by Title 6 U.S.C. § 1112, and employed by VIPR, are examples of what the general public can expect during an incident or emergency. For the purposes of this article, all security teams, agents and government agencies could effectively issue risk communications and benefit from this strategy, and the term Agency refers to these collectively. The activities discussed introduce an element of unpredictability to combat terror activities due to their random application and classified status. As a result, travelers who come into contact with security personnel unexpectedly may react with confusion or shock. This scenario is a source of risk, conflict and confusion that warrants explanation and proper risk communication procedures to avoid fear, panic, and public resistance. This article presents an example of a risk communications strategy that could be used in scenarios of a similar nature to inform and educate members of the general public, private and non-governmental organizations, domestic and international travelers, and public officials.

    Risk Communication Methodology

    The Elaboration Likelihood Model (ELM) of persuasion developed by Richard E. Petty and John Cacioppo (1986) was used to design this example risk communication strategy, incorporating attitude certainty and the theory of changing minds. Using the Model of Source Characteristics, it was determined who would deliver the message and how the message should be structured so that it is perceived to be credible and attractive and exerts the appropriate amount of power and influence, as Kelman describes. Two routes of communication, central and peripheral, were selected to reach the majority of stakeholders in a short to medium time-frame. Long-term communication strategies will be employed regularly, and security measures will be publicized repetitively using social media, placards, public meetings and feedback channels. The message was evaluated using a self-assessment checklist inspired by the 7 R's of Changing Minds for continual review and improvement.

    Stakeholder Analysis

    This risk communication strategy serves as a reference guide when releasing information externally, and describes how an Agency intends to communicate with stakeholders. The plan addresses stakeholder needs (see Figure A1, Stakeholder Analysis), including those who are responsible for making decisions about communication changes as a result of feedback or public reaction to the information that was received. Stakeholders are

    Common Pitfalls in Incident Management Implementation

    October 30, 2015
    Unity of Command

    A disorganized command structure may lead to units that self-dispatch to an incident or task and are not accountable to a supervisor. It is important that there is an established process for communications within the delegation-of-authority system, as well as free sharing of information between units.

    Prevention

    Neglecting a review of prevention measures in a post-incident investigation is often a cause of future incidents, especially when the incident was neutralized or managed successfully. Successful incident responses often provide false assurance that the control measures are effective. No assumptions should replace maintenance, observations, reviews, and audits after an incident. Post-incident investigations should prioritize prevention measure reviews so that the root cause of the control failure or success is established, documented, and corrected or enforced as required.

    Tailoring, Flexibility and Robustness

    No plan fits all incidents, and not all incidents may have been identified in the risk assessment. For this reason, emergency plans should be tailored to fit the context of the incident, and action plans should be flexible enough to accommodate these changes.

    Terminology

    Using complex terminology, abbreviations, or technical language during an incident could lead to misunderstanding, miscommunication, and errors that may escalate an incident to a crisis. Terminology should be in plain English and clearly defined in the emergency plan so that nothing is left to individual interpretation.

    Personnel Training and Competence

    Human and cultural factors must be taken into account when planning emergency operations. Human factors include the perceptions and intentions of individuals and groups; capabilities and competence; the experience, knowledge and skills of personnel; as well as resistance to change. A director may not hold that title during an incident, and may influence or be influenced by other personnel when reassigned or when losing their span of control. Human factors should be defined and understood so that responsibilities are assigned accordingly and stakeholders are enabled to execute their duties. Resources must be held accountable, and therefore must be competent, trained, and skilled in using tools and techniques to achieve their goals. There should be training programs and mechanisms in place to measure skills and knowledge on a continuous basis in order to maintain preparedness.

    Implementing Incident Management in Emergency Management

    October 30, 2015
    All organizations must have an emergency plan in place that includes resources, roles and responsibilities, procedures, logistics and contractual arrangements for an Emergency Operations Center (EOC) (NFPA® 1600, 2013, p. 12). The EOC is a physical location near the incident where the incident response will be coordinated. From the FEMA goals, it is understood that emergency management programs prioritize the overall coordination of an incident response. Although it is essential to minimize loss of life and impact to the environment or biodiversity, it may be more beneficial in the long term to prevent incidents from occurring. According to NFPA® 1600, "[t]he entity shall develop an incident management system to direct, control, and coordinate response, continuity, and recovery operations". Incident management is, however, more than coordinating the response, continuity, and recovery operations. The following three risk management processes are vital in implementing incident management in an organization:

    Hindsight
    Insight
    Foresight

    Together these three risk management processes form a "line of sight" for emergency management personnel.

    Hindsight: Root cause analysis

    All incidents, regardless of severity, must be investigated to ensure that maintenance of measures was conducted, that all risks were identified, and that the incident was not an indicator of a change in circumstances that may warrant a full risk assessment review. Root cause analysis is backward looking and aims to discover control inefficiencies and failures so that these can be corrected as appropriate. Successes and strengths must be considered as well so that these can be reinforced. There are many tools and standardized methods that can be used for root cause analysis. One such method is fault tree analysis (FTA). An FTA is a process where all causal factors are identified and associated with the incident to determine a hierarchy of failures, working backwards from the event to find the root of the problem or failure. As with all assessment methodologies, FTAs have a number of limitations and strengths to consider. Although an FTA provides an easy way to visually determine causal factors and binary failures, it does not consider the time-frame of each factor, and combined factors are challenging to compute, leading to a significant amount of uncertainty in estimating probabilities (a minimal probability sketch follows this excerpt). Nevertheless, FTAs produce a diagram that may be used to prioritize research activities and control design corrections. The lessons learned from root cause analysis are documented and incorporated to improve or renew measures, best practices, and training programs to educate responders, the public, and government about emerging issues and incident trends. Root cause analysis is applied to failures and successes to learn everything that is relevant to both outcomes.

    Insight: Control assurance

    All prevention measures should be maintained, monitored, and reviewed periodically, as these measures prevent incidents from occurring and are the first indicators that incident severity or frequency is increasing. Examples of preventive measures may include equipment and machinery maintenance, periodic procedural reviews, policy implementation, management review, structural, automated, and managerial control testing, and independent auditing. Prevention measures should include compensating measures so that a secondary
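
    As a rough illustration of how a fault tree combines basic-event probabilities through AND and OR gates, the short Python sketch below uses made-up event names and probabilities (power loss, pump wear, missed maintenance) and assumes the basic events are independent; it is not drawn from any real incident data or from the article above.

    # Minimal fault tree sketch (illustrative only): event names and probabilities
    # are assumptions, not data from any real incident investigation.

    def p_and(*probs):
        # AND gate: the output occurs only if every input occurs (independence assumed)
        result = 1.0
        for p in probs:
            result *= p
        return result

    def p_or(*probs):
        # OR gate: the output occurs if any input occurs (independence assumed)
        none_occur = 1.0
        for p in probs:
            none_occur *= (1.0 - p)
        return 1.0 - none_occur

    # Hypothetical basic events for a pump-failure top event
    power_loss = 0.02          # annual probability of losing power
    pump_wear = 0.05           # annual probability of excessive pump wear
    missed_maintenance = 0.10  # annual probability of a missed maintenance cycle

    # Intermediate event: the pump degrades only if wear AND missed maintenance coincide
    pump_degraded = p_and(pump_wear, missed_maintenance)

    # Top event: the pump stops if power is lost OR the pump has degraded
    top_event = p_or(power_loss, pump_degraded)

    print(f"P(top event) = {top_event:.4f}")  # about 0.0249

    Even this toy tree shows why probability estimation is flagged as an FTA weakness: the result depends entirely on the independence assumption and on basic-event probabilities that are rarely known with confidence.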