Growing up as a girl from the Australian bush, the United Nations was a long distance away, physically speaking, but not as far from my thoughts as one might think.
You see, I was a big Audrey Hepburn fan, and Hepburn’s service as a goodwill ambassador for the United Nations International Children’s Emergency Fund placed UN participation on my radar as a worthy, albeit improbable, dream to which I should aspire.
I am blown away that this dream is about to come true.
I will soon be part of the official Australian government delegation to the 62nd session of the UN Commission on the Status of Women. The session, which will take place 12 March to 23 March at UN headquarters in New York, will focus on how technology can help empower rural girls and women. It will be an honor to collaborate with Gillian Bird, the Australian ambassador to the UN, the Honourable Kelly O’Dwyer, Australia’s Minister for Women, and several other distinguished Australians to address a topic about which I am extraordinarily passionate. I will serve as one of only two members of the Australian delegation who work outside government.
Before going any further, I feel compelled to express my gratitude to ISACA for setting this incredible opportunity in motion. It is the work that I have done through ISACA, and particularly as a champion for the SheLeadsTech program, that opened the doors for me to be considered.
Throughout my career in the technology field, I have had to demonstrate considerable resilience, often finding myself to be the only woman in the room at a given meeting, conference session or client engagement. I have often had to persevere in the face of biased thinking and work environments. While I am a big believer in the importance of developing perseverance, I will continue to work toward a world in which the career path for the next generation of women in the tech workforce is much smoother than the one I encountered.
In addition to the work in which our delegation will participate – which will result in a publicly released report detailing our findings and recommendations – I look forward to the new connections that I will make in the coming weeks and months. Many influential people and groups are keenly interested in the UN Commission on the Status of Women, and this will be a prime opportunity to expand my network of like-minded advocates for women in the technology sector. Building global alliances, after all, is one of the three pillars of the SheLeadsTech program, and what better place to do so than at the UN?
I look forward to keeping the ISACA global community updated on the progress of the Commission on the Status of Women. While walking the corridors of the UN will be a new and thrilling experience for me, advocating for rural women and girls, for women in the tech industry and for the empowerment of women and girls across the world has been a lifelong passion. I pledge to make the most of this honor by doing my part to move this incredibly important work forward.

Category: ISACA Published: 2/15/2018 9:15 AM
The institutions we all serve are going to use big data, if not now, then soon, because of the value it can add to the products we make and the customers we serve. This is true across almost all industries, and as we adopt new technologies and more intelligent ways of working, we must also continually assess the regulatory landscape. In my recent Journal article, I wrote about how audit professionals who work with big data, deal with global privacy implications and handle sensitive research data need the knowledge and technical aptitude to audit the big data space if they are to stay relevant. Almost all enterprises are now taking on big data projects, and the need to stay compliant with growing regulatory requirements is leading internal compliance, risk and audit functions at these enterprises to demand auditors with these skill sets.
My article introduced three components for working with big data and protecting it:
Big data will grow exponentially and, as studies show, “A full 90 percent of all the data in the world has been generated over the last two years.” The use of big data to capitalize on this wealth of information is already happening, as can be seen in everyday technology such as Google Maps or predictive search suggestions on a website. The conversation around the use of big data is already under way, and it is our job as auditors to become part of it and help the organizations we serve navigate the risk of managing big data.
Read Mohammed Khan’s recent Journal article:
“Big Data Deidentification, Reidentification and Anonymization,” ISACA Journal, volume 1, 2018.
With less than 100 days to 25 May, many organizations outside the European Union have the same question: “Does the General Data Protection Regulation (GDPR) apply to my organization?”
The answer has to be “it depends” – an answer no one likes. You cannot immediately say yes or no. Instead, you need to take a step-by-step approach: identify the requirements of GDPR, determine the organization’s connection with the personal data of individuals in the EU, and consult an attorney specializing in GDPR as needed. The answer can only be given based on an analysis of the organization’s operations and use of personal data against Article 3, which defines the regulation’s territorial scope. Article 3 is what organizations outside the EU must use to determine whether they need to adhere to GDPR. It states that organizations must comply with GDPR if they offer goods or services to individuals in the EU (data subjects), even without payment, or monitor the behavior of those individuals. In today’s digital world, these practices are not rare.
The starting point should be to determine whether the organization processes personal data of individuals in the EU, either as a controller or a processor, or whether part of the organization operates within EU borders. If the answer to either question is yes, then it does not matter where your business headquarters are located. As long as you are in a “place where Member State law applies by virtue of public international law,” you need to comply with GDPR.
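The step-by-step test above can be sketched as a small decision helper. This is a thought aid only, not a legal test: the function and field names are invented for illustration, and a qualified attorney should make the actual determination.

```python
# Illustrative sketch of the Article 3 territorial-scope questions discussed above.
# The dictionary keys are hypothetical labels for the three triggers, not legal terms.

def gdpr_may_apply(org: dict) -> bool:
    """Return True if any of the territorial-scope triggers appears to be met."""
    triggers = [
        org.get("establishment_in_eu", False),             # part of the organization operates in the EU
        org.get("offers_goods_or_services_to_eu", False),  # offering goods/services, even without payment
        org.get("monitors_eu_data_subjects", False),       # monitoring behavior of individuals in the EU
    ]
    return any(triggers)

# Example: a non-EU retailer that ships goods to EU customers
print(gdpr_may_apply({"offers_goods_or_services_to_eu": True}))  # True
print(gdpr_may_apply({}))                                        # False
```

If any trigger is true, the location of the business headquarters no longer matters and a compliance analysis should follow.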
To help guide this process, the organization should perform a data protection impact assessment, which GDPR requires for high-risk processing. This is a useful initial step in determining the need to comply with GDPR and in planning its implementation. Once the organization determines that it must comply with the regulation, its compliance program must cover all parts of data processing. Data processing “includes the collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction of personal data.” GDPR applies to both automated and manual data processing.
An organization impacted by GDPR needs to assess, implement and comply with its specific requirements. These requirements will affect the entire organization and how day-to-day operations involving personal data are conducted. New processes and controls should be implemented to protect personal data and to protect the organization from liabilities caused by noncompliance with GDPR.
Organizations that see 25 May not only as a deadline, but as the starting point of a long-lasting GDPR compliance program, will have an advantage in processing personal data in line with GDPR principles. Organizations should use this moment as an opportunity to implement best practices and realize benefits from GDPR.
Editor’s note: ISACA’s Implementing the General Data Protection Regulation publication is an educational resource for privacy and other interested professionals; it is not legal or professional advice. Consult a qualified attorney on any specific legal question, problem or other matter. ISACA assumes no responsibility for the information contained in this publication and disclaims all liability with respect to the publication. © 2018 ISACA. All rights reserved. For additional ISACA resources on GDPR, visit www.isaca.org/GDPR.

Category: Privacy Published: 2/15/2018 3:00 PM
If you’re not seeing the results you want, you may need to switch SAP implementation partners. SAP implementation is becoming more important than ever, with revenues from enterprise resource planning (ERP) software expected to reach $84.1 billion by 2020, according to Apps Run the World. Not only does this technology help your organization become more efficient, but your top competitors are following suit – so you’ll need to increase your pace and attention if you want to keep up.
What happens if your chosen SAP implementation partner isn’t giving you what you need? What qualities should you look for in a new provider?
Signs It’s Time to Switch
If you notice any or all of the following signs, it’s a good indication that it’s time to switch providers:
Qualities to Look For
These are the qualities you should look for in a new partner:
Choosing a new SAP implementation partner doesn’t have to be a painful experience, despite the many options available. With enough research, you’ll find one that can serve your specific needs. Before you begin the process, outline the qualities that are most important to your organization, and start weeding out the candidates that can’t meet those criteria.

Category: Risk Management Published: 2/14/2018 3:05 PM
Big data is a volume of data so large, and mostly so unstructured and complex, that it cannot be handled by traditional data-processing techniques. Proper collation, coordination and harnessing of such data is therefore necessary for relevant users, such as chief information officers, IS auditors and chief executive officers, to make meaningful decisions. My recent Journal article describes a six-stage cycle for implementing big data in organizations, especially commercial banks, summarized by the acronym DIRAPT: definition, identification, recognition, analysis, ploughing-back and training. I consider DIRAPT a cycle because the stages need to be repeated over and over:
The DIRAPT cycle can prove beneficial to organizations, such as commercial banks, to enjoy the dividends of big data.
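The cyclical nature of DIRAPT can be pictured as a simple loop over the six stages. The sketch below is purely illustrative; the stage names come from the article, while the function is an invented placeholder for whatever work each stage involves.

```python
# Minimal sketch of the DIRAPT cycle: the six stages repeat indefinitely.
from itertools import cycle

DIRAPT_STAGES = [
    "definition", "identification", "recognition",
    "analysis", "ploughing-back", "training",
]

def run_dirapt(iterations: int):
    """Yield (cycle_number, stage) pairs, restarting after 'training'."""
    stages = cycle(DIRAPT_STAGES)
    for i in range(iterations):
        yield i // len(DIRAPT_STAGES) + 1, next(stages)

for n, stage in run_dirapt(7):
    print(f"cycle {n}: {stage}")  # after six stages, cycle 2 begins again with 'definition'
```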
Read Adeniyi Akanni’s recent Journal article:
“Implementation of Big Data in Commercial Banks,” ISACA Journal, volume 1, 2018.
Nestled in William Craig’s book Enemy at the Gates, which recounts World War II’s epic Battle of Stalingrad, is the story about a Soviet division that was plagued by failure in the face of the enemy. Desertions were rising, officers’ orders were not being followed, and the invading enemy was making gains. Faced with this calamitous condition, the regimental commander called the troops into formation and let them know that collectively, they were failing and would be held responsible. Then, in an outrageously cold manner, he walked through the ranks and summarily executed every 10th soldier until six soldiers lay dead on the field. He got their attention, and the unit was instrumental in the subsequent Soviet counterattack that led to victory against the Nazi invaders.
Obviously, I do not support such extreme and violent methods of accountability, yet the example does make you pay attention. As we grapple with today’s digital “enemy at the gates,” or even the “enemy inside the gates,” accountability for failing to properly protect the information on which our national prosperity and security depend has never been more important. Firing CEOs and CIOs is typically a public gesture enacted to deflect blame rather than address root causes. Sadly, accountability and ownership often are missing from cyber strategies and risk management planning at a time when risk is ever-increasing. Therefore, it is critically important that all organizations better manage cyber risk by embracing a culture of accountability and ownership that guides the implementation of due care and due diligence measures.
I define due care as “doing the right things” and due diligence as “doing the right things right.” Unfortunately, I’ve found too many organizations where due care and due diligence are not occurring. For example, ask most cyber incident responders about the root cause of cyber incidents and they likely will sigh and point to the “usual suspects” – failure to patch, misconfigured systems, failure to follow established policies, misuse of systems, lack of training, etc. As someone who led incident responders in both military and civilian government organizations, I found one of the great frustrations of cyber professionals is when they see leadership ignoring or tolerating the so-called “usual suspects” and not holding people accountable for a glaring lack of due care and due diligence.
While many media reports these days focus on the very real and present threat of well-funded nation-state actors, I contend that the greatest cyber threat we all face is what I call the “Careless, Negligent and Indifferent” in our own ranks. Failing to properly configure a system so that it exposes information to unauthorized personnel is carelessness. Failing to patch critical vulnerabilities quickly, or to implement compensating controls until the patch is ready for promotion, could be considered negligence. Personnel who are indifferent to following established policies, such as prohibitions on password-sharing, expose their organizations to increased cyber risk. While nation-state actors get all the hype, I contend that more than 95% of all cyber incidents are preventable and are the result of the Careless, Negligent and Indifferent. We should not accept this!
Do we need more legislation, regulation or policies to thwart the threat posed by the Careless, Negligent and Indifferent? Do we need to continue our habit of buying the next neat technology in hopes that its “silver bullet” defense will save the day? I don’t think so. I believe what is needed is to execute our existing policies better and hold those who do not follow those policies accountable. While we can’t eliminate our cyber risks, we certainly can reduce our risk exposure by executing our plans, policies and procedures with greater velocity and precision. When we do so, we are exercising due care and due diligence that protects our brands, reputations, customer data, intellectual property, corporate value, etc.
Accountability must be clearly defined, especially in strategies, plans and procedures. Leaders at all levels need to maintain vigilance and hold themselves and their charges accountable for executing established best practices and other due care and due diligence mechanisms. Organizations should include independent third-party auditing and penetration testing to better understand their risk exposure and compliance posture. Top organizations don’t use auditing and pen testing punitively, but rather to find weaknesses that should be addressed. Often, they find that personnel need more training and regular cyber drills and exercises to reach a level of proficiency commensurate with their goals. The organizations that fail are those that do not actively seek out weaknesses or do not address known weaknesses properly.
Sound execution of cyber best practices buys down your overall risk. With today’s national prosperity and national security reliant on information technology, the stakes have never been higher.

Category: Security Published: 2/12/2018 3:07 PM
Editor’s note: Technology futurist Shara Evans, founder and CEO of Market Clarity, will deliver the closing keynote address at North America CACS 2018, which will take place 30 April-2 May in Chicago, Illinois, USA. Evans recently visited with ISACA Now to discuss topics ranging from the future of travel to why many executives struggle to take a long view of technology. The following is an edited transcript:
ISACA Now: What inspired your passion for technology and scientific research?
For as long as I can remember, I’ve been a science fiction fan, devouring sci-fi novels like they were candy. One of my earliest recollections was watching the original Astro Boy cartoons. I remember him flying around fighting giant robots, aliens and all sorts of bad guys. So, there I was at 4 or 5 years old, trying to figure out how to design rocket jets for the heels of my shoes.
So much of what I read in science fiction novels inspired me. Unfortunately, I grew up in a period where science and technology were thought to be outside of what little girls should aspire to. In fact, my teachers actively discouraged me from taking classes in this area, instead enrolling me in things like “Home Economics” (being a good little housewife) and Typing (which actually worked out well once I started programming).
I originally thought I was going to be a lawyer/politician because I saw so much injustice in the world and wanted to do something with my life that could make a difference … But much to my surprise, I found I had an aptitude for computer programming and logic when I was finishing my undergraduate degree in political science. In the last semester of my senior year, while taking a sociology course, I had to use SPSS for a research project about cultural bias in the media. That was a long time ago, in the days when computer mainframes used card-punch machines for input. I picked up SPSS very quickly, then went on to teach myself a range of programming languages. I did my graduate work in computer science rather than going to law school. I’ve been in the technology field ever since.
ISACA Now: What are some recent technology innovations that you think bode especially well for society?
Whenever I look at technologies, I always see a double-edged sword: wonderful advances that can come from using a given technology, and conversely, threats to our security and privacy. We really need to think ahead to what can happen and balance how we use technologies so that we end up with a wonderful future, rather than a dystopian nightmare.
Some of the advances that I'm particularly excited about are in the med-tech area. For instance, a project in Brazil called Walk Again used a combination of virtual reality, robotic exoskeletons and brain machine interfaces to help eight quadriplegics regain feeling below their waists, dramatically changing the lives of the people involved with this experiment.
There are so many examples of combining biology, chemistry, medical science and technology that have the potential to really help people. Bio-printing is another example, where 3D printers are used to print living tissue. Already, noses and ears have been bio-printed, sometimes in combination with cutting-edge stem cell therapies that allow cells matching the recipient to grow on top of the 3D bio-printed scaffolding. Eventually, we will be able to print entire organs – no more waiting for transplants.
ISACA Now: You recently gave a talk about the future of travel. What do you anticipate will be the biggest changes on that front?
Two things: Hyperloop transport and the exploration of outer space.
There are already a number of companies actively exploring pilot hyperloop transport systems. So far, experiments with pilot test tracks are going well. If this comes to fruition, we will have large transport tubes taken to near-vacuum conditions to eliminate air resistance, which can autonomously transport people or cargo at speeds of 760 mph or more. This technology has the potential to change what it means to live in regional areas, taking the pressure off increasingly dense and expensive metropolitan areas.
Outer space exploration is another exciting area. Because of the emergence of re-usable rockets, exploration of space is going to be feasible within the next 10 years. Imagine the job possibilities and adventure that would bring!
ISACA Now: What are the biggest keys to getting more women into the tech workforce?
I think the only way to successfully achieve gender balance in the technology workforce is to encourage little girls (elementary school or younger) by presenting technology as something they can relate to, have fun with, and imagine themselves doing as a career.
One of the things that I noticed at [the Consumer Electronic Show] last year was the number of robot educational games designed to teach children how to program robots. We need to ensure that the teaching games being developed also appeal to little girls.
I think companies also need to be more open-minded about hiring women with a technical aptitude who may not have formal qualifications in this area. Give them a chance, offer training opportunities, and you’d be surprised how many shining stars emerge.
It’s also important to understand and communicate that not all tech industry jobs are about writing code. So much of what we have developed is the result of imagination (often taking hints from science fiction) or trying to solve real-world problems. And, with AI and many other technologies, there are repercussions with respect to privacy, security and ethics. We need input from women on these issues, too.
ISACA Now: What are some common challenges with inspiring executive teams and boards of directors to take the long view in thinking about the potential impact of technology on their businesses?
I do keynotes and workshops for boards of directors and executive teams all the time. One of the biggest challenges is getting them to look beyond the 12-month horizon. As a futurist, my medium-term horizon is two to five years out – but to my clients, medium-term is six months. My long-term is the five- to 10-year horizon and beyond, whereas long-term to my clients is typically 12 to 18 months out.
So, when I talk about medium- or long-term opportunities and risks, I need to take them on a timeline journey: all kinds of technologies, how they relate to their businesses, and the consequences of failing to take action – missing out on huge opportunities, losing competitive advantage, failing to plan for the impact of automation (especially on their workforces) – as well as the exponentiality factor in today’s technologies. Most people are used to gradual changes, but we’ve now hit a curve in our technological advancement where the baseline is already high and we’re doubling capabilities every 18 to 24 months.

Category: ISACA Published: 2/9/2018 3:07 PM
Determining the level of process maturity for a given set of IT-related processes allows organizations to determine which processes are essentially under control and which represent potential “pain points.” Process maturity has been a core component of COBIT for more than a decade; however, in COBIT 5, there was a change from the Maturity Model used in COBIT 4.1 to a Process Capability Model.
Currently, the COBIT 5 Process Assessment Model (PAM) is based on International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) standard ISO/IEC 15504, a global reference for conducting process capability assessments. However, the ISO/IEC 15504 family has since been withdrawn and replaced by the new ISO/IEC 330xx family, which also extends it. An update of the ISACA publication COBIT Process Assessment Model (PAM): Using COBIT 5 should therefore be considered.
The new ISO/IEC 330xx family of standards presents a more detailed and well-defined process assessment model than the older ISO/IEC 15504 family. The gaps in the older standard regarding rating and aggregation methods have now been closed with clear, standardized guidance on how to perform them. The definitions of some process attributes, outcomes and base practices are also more consistent. For all these reasons, updating the COBIT 5 PAM to the new standard is not only a necessity, but also an opportunity to improve the assessment of COBIT 5 processes.
Read Joao Souza Neto, Rafael Almeida, Pedro Linares Pinto and Miguel Mira da Silva’s recent Journal article:
“A COBIT 5 PAM Update Compliant with ISO/IEC 330xx Family,” ISACA Journal, volume 1, 2018.
Machine learning is bandied about in the media often these days, many times erroneously. The key question for auditors is not how to build machine learning algorithms, or how to debate the relative merits of L1 versus L2 regularization, but rather: in what context is the algorithm operating within the business? And do we have assurance that it meets all regulatory and business constraints and fulfills the needs of the enterprise?
Data scientists, of whom I am one, have the most fun working with algorithms, clustered together attempting to eke out another half a percentage point of accuracy from our models. However, that extra half point almost never translates into improved results for the organization, at least relative to the risk and reward. For technology auditors, knowing how to create machine learning algorithms, or understanding the mathematical mechanics behind the models, is not required – or even very helpful – in evaluating the effectiveness of machine learning in the enterprise. As auditors, we need to provide assurance over how the algorithms function in the business and determine whether proper governance and controls are in place so that the models operate in the best interests of the enterprise. A favorite example of mine: you can have a model with 99% accuracy, say for fraud detection, that is practically worthless. If 99 out of 100 transactions are not fraud, we can achieve 99% accuracy simply by labeling every transaction as not fraud. That does nothing for us; what we care about in this example is the recall of the model. We want to make sure we detect the 1% of fraudulent transactions, even if we incur some false positives. The moral of the story is to understand the business ramifications and provide assurance that the models actually accomplish something. This is where audit can provide the most value and where we as auditors should focus: understanding the context of machine learning algorithms and applications, and providing assurance that they fulfill the business requirements while staying within the bounds of the relevant regulations.
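The fraud example above can be made concrete. In the pure-Python sketch below (the data is made up), a "model" that never predicts fraud scores 99% accuracy while detecting no fraud at all, which is exactly why recall, not accuracy, is the metric that matters here.

```python
# Toy illustration of the accuracy-vs-recall point above.
# 100 transactions, 1 of which is fraud; the "model" flags nothing.

actual    = [0] * 99 + [1]   # 1 = fraud, 0 = legitimate
predicted = [0] * 100        # a model that never predicts fraud

# Accuracy: fraction of predictions that match the actual labels
accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)

# Recall: fraction of actual fraud cases that the model caught
true_positives = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
actual_positives = sum(actual)
recall = true_positives / actual_positives if actual_positives else 0.0

print(f"accuracy: {accuracy:.0%}")  # 99% - looks great on paper
print(f"recall:   {recall:.0%}")    # 0%  - catches no fraud at all
```

An auditor who checks only headline accuracy would sign off on a model that delivers nothing for the business.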
Read Andrew Clark’s recent Journal article:
“The Machine Learning Audit—CRISP-DM Framework,” ISACA Journal, volume 1, 2018.
We go into the hospital with a great deal of trust. We trust that doctors will help us and potentially even save our lives. Beyond hospitals, there are not many places in the world where we are willing to do anything we are asked: take off our clothes, talk about our sex lives, etc.
Recent cyberattacks, such as WannaCry and NotPetya, put this trust into question. An increasing number of cybersecurity incidents have impacted many hospitals and made them unsafe. Not only was patient information stolen and privacy impaired, but, in some cases, the cyberattacks interrupted normal operations and services. In hospitals, that could mean life or death.
Over the last decade, the healthcare industry made significant progress on digital transformation. Patients’ healthcare records are online, test results and images are digitized, an increasing number of medical devices are connected, and medical equipment can be remotely monitored and maintained. This technology has brought tremendous improvements in efficiency and convenience to medical staff and patients alike, while helping reduce human errors and lower operational costs. At the same time, however, this high level of connectivity has created a much larger surface area for security risks. Because there are so many connected devices and a large variety of different types of connected devices, it is becoming increasingly difficult to completely secure all of them at all times.
Hackers can not only use these devices as stepping stones to access critical assets, such as patients’ healthcare records, they also can compromise these devices to cause physical harm and put people’s lives at risk. For example, we demonstrated in our research lab that we can hack into an infusion pump from a leading vendor to change the dosage of the medication that is going directly into a patient’s body. This dosage change alone could be fatal to a patient.
Mid- to large-size hospitals use hundreds, if not thousands, of third-party products and services. Even if the hospital itself is secured, these third-party vendors can introduce many vulnerabilities. Each of these third parties in turn uses many other external vendors. If any of those external vendors is compromised, there could be a domino effect on the hospital’s security – yet another reason it is extremely challenging to secure a hospital and all its IoT devices.
Is there a solution? In many ways, an IoT system is very similar to the human body – a large and complex system that is always on. Let’s use a heart attack as an analogy. We all know that a heart attack can be catastrophic. Although a heart attack usually happens suddenly, the conditions that make it likely actually take days, months or even years to build up. If we could continuously, automatically and intelligently monitor the heart and body, we could detect early signs of problems and take preventive actions to avoid the heart attack.
Doctors detect and cure diseases through their detailed knowledge of the different parts of our body and their functions. Surprisingly, we don’t have similar information on IoT networks. Most hospitals we have talked to don’t have up-to-date information about what types of IoT devices they have, much less how many of those devices are connected to their networks. So, IoT device visibility is the first task for each organization. At any given time, we need to know which devices are connected to the network – plus what they are and are not supposed to do – and conduct real-time monitoring of their behavior for early detection of potential cyberattacks.
Yet another challenge, beyond the number and variety of devices: these devices join and leave the network dynamically. How do we handle such a large, highly dynamic system? Obviously, manual monitoring is not feasible. The key is to leverage artificial intelligence (AI) to identify and monitor devices automatically, so that we can further protect them – and the hospital and its patients – in the event of a cyberattack.
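The visibility-plus-monitoring idea can be sketched in miniature: keep an inventory of known devices with an expected-behavior baseline, then flag anything that deviates. Everything below – device names, allowed destinations, the rule itself – is invented for illustration; real systems learn such baselines automatically rather than hard-coding them.

```python
# Toy sketch: inventory of connected devices plus a simple
# expected-destination baseline for each one.

EXPECTED = {
    "infusion-pump-12": {"pharmacy-server"},             # allowed destinations
    "mri-console-03":   {"pacs-server", "ehr-server"},
}

def audit_traffic(flows):
    """flows: iterable of (device, destination) pairs.
    Return alerts for unknown devices or unexpected destinations."""
    alerts = []
    for device, dest in flows:
        if device not in EXPECTED:
            alerts.append(f"unknown device on network: {device}")
        elif dest not in EXPECTED[device]:
            alerts.append(f"{device} contacted unexpected host: {dest}")
    return alerts

print(audit_traffic([
    ("infusion-pump-12", "pharmacy-server"),  # normal, not flagged
    ("infusion-pump-12", "external-ip"),      # flagged: unexpected destination
    ("smart-tv-lobby", "ad-network"),         # flagged: not in the inventory
]))
```

At hospital scale, with devices joining and leaving constantly, maintaining the inventory and baselines by hand is exactly what becomes infeasible – hence the case for AI-driven identification and monitoring.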
In summary, visibility and AI are the keys to IoT security in healthcare.

Category: Security Published: 2/7/2018 3:03 PM
IT auditors can act as strategic but independent partners to businesses currently working toward compliance with the European Union General Data Protection Regulation (GDPR), with enforcement beginning 25 May 2018.
Executive management increasingly expects the audit function to add more value to the business as a subject matter expert in all areas of risk management, as well as by supporting key business objectives and strategic initiatives. GDPR compliance is fundamentally a risk management exercise, which the audit function is well equipped to support.
Technology breaks down organizational silos
GDPR compliance requires attention and remediation expertise from various functions within the business, including human resources, legal, compliance, marketing, communications and IT. For compliance efforts to succeed, the unintentional walls that often exist between these functions need to be broken down.
While GDPR compliance is not solely a technology issue, technology acts as a common denominator across business processes and plays a significant role in the collection, processing, storage and transfer of personal data. This is why IT auditors in particular can use their overarching view of technology across the organization to highlight interdependencies and gaps in GDPR compliance efforts.
In addition to supporting a robust control environment, IT auditors can act as risk consultants while maintaining their auditor independence.
During GDPR remediation activity, IT auditors should establish strategic partnerships within the business through:
Below are five examples of GDPR compliance workstreams and technology domains where IT audit can add value by providing an independent view.
1. Data Protection Impact Assessments (DPIA)
IT auditors acting as subject matter experts can help facilitate discussions so that the risks and impact of processing personal data are considered as early as possible when initiating new IT projects or vendor relationships.
The early identification of data protection risks through DPIA exercises is a significant step for successful implementation of privacy-by-design within:
Beyond merely satisfying compliance requirements, IT auditors should help the business take a longer-term view by institutionalizing data protection impact assessments (Article 35) and fostering new ways of thinking about the impact of privacy on data processing activities.
2. Data Governance and Data Flows
Organizations (data controllers and data processors) must demonstrate their compliance with GDPR by maintaining records of processing activities under their responsibility and implementing technical and organizational measures (Article 32).
This requirement aligns perfectly with the main objective of data governance – to ensure the management of data as a strategic business asset in order to derive maximum value.
Effective data governance involves understanding data flows within business processes and ensuring data stewardship through activities such as data architecture development, quality management, data integration and metadata management.
As organizations develop and maintain records of their personal data processing, IT auditors can provide a view on data flow mapping activities. Key questions to ask business representatives include:
IT auditors can help facilitate evaluations of the completeness of data flows by sharing good practices from their experience in mapping business processes during scoping activity.
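A record of processing activities can itself be treated as data worth auditing for completeness. The sketch below is a hedged illustration, not a compliance tool: the field names loosely follow GDPR Article 30(1), and the payroll example is entirely invented.

```python
# Hedged sketch: representing a record of processing activities as a plain
# data structure and checking it for required fields. Field names loosely
# follow GDPR Article 30(1); the example record is invented.
REQUIRED_FIELDS = {
    "purpose",            # purpose of the processing
    "data_categories",    # categories of personal data
    "data_subjects",      # categories of data subjects
    "recipients",         # categories of recipients
    "retention_period",   # envisaged time limits for erasure
    "security_measures",  # general description of technical measures
}

def missing_fields(record):
    """Return the set of required fields absent or empty in a record."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}

payroll = {
    "purpose": "payroll administration",
    "data_categories": ["name", "bank details", "salary"],
    "data_subjects": ["employees"],
    "recipients": ["payroll provider"],
    "retention_period": "7 years after employment ends",
    # security_measures not yet documented -- a gap an auditor should flag
}

assert missing_fields(payroll) == {"security_measures"}
```

Even a lightweight check like this gives auditors a repeatable way to spot gaps across dozens of processing records rather than reviewing each by eye.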
3. Risk-Based Data Protection Controls
While it may be tempting to rush toward implementing encryption and pseudonymization as solutions to data protection, it is important to question whether these controls are necessary in the first place (see GDPR Recital 28). Other protection strategies might be more appropriate, depending on the risk.
Where a risk assessment determines that pseudonymization is required as a method of data protection, IT auditors can help the business consider whether:
By challenging the business to consider the real risks to data, it is possible to arrive at pragmatic solutions for data protection, which may include applying controls like pseudonymization.
4. Big Data and Machine Learning
According to the EU Agency for Network and Information Security (ENISA), “The extensive collection and further processing of personal information in the context of big data analytics has given rise to serious privacy concerns, especially relating to wide scale electronic surveillance, profiling, and disclosure of private data.”
While unlocking the business value of data is a critical part of any digital agenda, businesses must thoroughly consider the potential impact on data subjects of unfair or biased data models, inaccurate analysis and prediction of future events (for example, through machine learning), and profiling (Article 22).
IT auditors can challenge data scientists within their organizations to consider questions such as:
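One concrete question is whether a model's decisions differ materially across groups of data subjects. The sketch below is a hedged illustration of a simple disparate-impact check; the 0.8 ("four-fifths") threshold, group labels and data are all illustrative, not a legal standard under GDPR.

```python
# Hedged sketch: a demographic-parity (disparate impact) check on model
# decisions. Threshold, groups and data are illustrative only.
def selection_rates(decisions):
    """decisions: list of (group, approved: bool) pairs."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Ratio of the lowest group approval rate to the highest."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Invented example: group A approved 8/10, group B approved 4/10
decisions = [("A", True)] * 8 + [("A", False)] * 2 \
          + [("B", True)] * 4 + [("B", False)] * 6
ratio = disparate_impact_ratio(decisions)
assert ratio < 0.8  # below the illustrative four-fifths threshold: investigate
```

A check this simple obviously cannot establish fairness, but it gives auditors a quantitative starting point for the conversation with data scientists.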
5. Data Processing in the Cloud
While IT auditors’ focus on cloud computing is not new, GDPR compliance requires renewed attention on data processing performed by third parties, including cloud service providers (CSPs).
Data privacy/protection-related control considerations for cloud-based data processing include:
Rather than a sprint to the finish line, organizations must see GDPR compliance as a marathon toward the goal of institutionalizing data privacy and data protection in the corporate culture. IT auditors can support this cultural change by looking beyond annual IT audit calendars and one-off GDPR-related audit engagements.
Through early and consistent engagement with the business through conversations, training and workshops, the IT audit function can mature from its traditional focus as a control watchdog to become a strategic business partner supporting longer-term organizational objectives.
Category: Audit-Assurance Published: 2/6/2018 3:00 PM
The adoption of cloud applications (apps) and services is accelerating unabated as organizations increasingly look to take advantage of the business, collaboration and productivity benefits these apps provide. The flip side, however, is that the cloud is increasingly home to high-value confidential corporate and personal data, making cloud apps prime targets of cybercriminals.
Exploitation and malware distribution attacks in the cloud, in particular, should be treated as an arms race between cloud security firms and cybercriminals. As cybercriminals find design weaknesses that leave cloud apps open to attack and identify exploitable cloud user behaviors, cloud security vendors need to step in to fill the security gaps that cloud app vendors cannot.
Malware distribution mechanisms have become more advanced as attackers have begun using cloud storage services, such as Google Drive and Dropbox, to distribute malware. Recent examples include the Petya and Cerber ransomware families, distributed via Dropbox and Office 365, respectively. Attackers are deploying advanced malware-hosting techniques, such as obfuscation, camouflaging and metamorphism, to hide malicious content in cloud-hosted files and then distribute those files to large numbers of Internet users as part of drive-by download attacks.
Malware distribution is not the only threat. Cloud apps are also susceptible to targeted phishing attacks, sensitive data exposure, account hijacking and other exploits. By leveraging new security approaches, such as massively scalable cloud-based architectures and sophisticated data science and machine learning technologies, cloud security vendors, and the enterprises they seek to protect, can get a leg up on the “bad guys.” Some countermeasures and proactive steps include:
Read Aditya K Sood and Rehan Jalil’s recent Journal article:
“Cloudifying Threats—Understanding Cloud App Attacks and Defenses,” ISACA Journal, volume 1, 2018.
We live in a world full of risk, and nowhere is risk more prevalent than in technology.
The Center for Internet Security (CIS) has recommended 20 critical security controls to respond to threats and vulnerabilities associated with the internet. The premise is that proper implementation of these controls will mitigate the risks of damage, unauthorized alteration or theft of information and technology assets. However, when it comes to risk mitigation, how much is enough? How much reduction of risk is required? In other words, what is the risk appetite of the enterprise?
This varies from company to company depending on multiple factors, such as the industry in which it operates, the type of service or product provided, the current economic climate and the company’s financial position. Risk appetite also depends on the overall risk landscape. As evidenced by a continual wave of news reports, the cyber arena is full of threats designed to steal, destroy, alter or simply gain unauthorized access to information assets.
In this digital world, it stands to reason that management is increasingly cognizant of the cyber threats that endanger company assets. Managing these risks can benefit immensely from a cybersecurity audit. While the CIS Controls Audit/Assurance Program is not designed to provide assurance beyond the security program of an enterprise, the controls are presented in a prioritized fashion to help the enterprise leverage its potentially limited resources to protect key assets and realize the most benefit.
The purpose of an audit is to assess the efficiency and effectiveness of current controls and provide a level of assurance that assets are adequately protected and accessible to authorized users when needed.
To ensure proper safeguards are in place, management should not rely solely on the CIS Controls IS Audit/Assurance Program. Audits of other pertinent operational processes should take place. A holistic approach is necessary and requires a strategic partnership between the board of directors, senior management, IT and functional business units, and audit. While the board of directors provides guidance and direction, management is responsible for executing based on those directives. This holistic approach can result in the creation and implementation of policies and processes that are designed for business value, as well as the security of all company assets.
Category: Audit-Assurance Published: 2/5/2018 3:01 PM
After I passed the CISM exam late last year, ISACA offered to let me share my experience of how (and why) I chose to become a CISM, and what I did to accomplish my goal. I hope this article provides some useful ideas to help you go after your professional development goals, as well.
Why the exam mattered to me
My company, GSWS, is a small business that provides cybersecurity and compliance-related services to other small and mid-size organizations in the Southern California region of the U.S. Our clients include optometrists, dentists, CPAs and attorneys. I mention this because our work environment isn’t like that of many other CISMs, who are employed by much larger organizations.
Our clients are woefully unprepared for the types of cyber risks they face on a daily basis. They are highly skilled within their respective trades, but they have no clue when it comes to understanding cybersecurity. They rely on us to provide the knowledge, experience and solutions they lack. I needed a way to ensure my skills were at a high level and to communicate our qualifications to clients and prospects in an easy-to-understand way.
I was familiar with www.cyberseek.org, but when revisiting the site, I saw how highly the CISM and CISA certifications were recognized. I had recently joined ISACA and passed the CSX-F exam, which gave me some degree of familiarity with how ISACA works. That’s when the CISM and CISA certifications became the obvious choices for me. I chose to go after the CISM first.
How I studied
In preparing for the exam, I used the following resources:
Depending on your budget, select what is best for you. I was fortunate to have access to all these resources.
Some additional recommendations to help you prepare for and pass the exam:
I hope this helps. I’m scheduled to take my CISA exam in April and am studying for that now. My preparation for the CISA is identical to what I’ve described in this article. Good luck to you!
Category: Certification Published: 2/1/2018 3:03 PM
The purpose of the General Data Protection Regulation (GDPR) is to harmonize the data privacy regulations that each European Union member state implemented to comply with GDPR’s predecessor, the Data Protection Directive. GDPR provides a single, comprehensive regulation that is compulsory for all organizations processing the personal data of individuals living within the European Union.
The regulation becomes enforceable on 25 May 2018, after a two-year grace period to allow organizations to implement GDPR. GDPR substantially increases data subjects’ rights – and with penalties of up to 4% of gross turnover, the regulation has the potential to fundamentally change the way organizations view and process personal data. That said, the purpose of this blog post is not to tell you what GDPR is, who it will impact, nor to pour more oil on the fear-mongering flames. Over the past two years, most of us have seen more than enough of these types of articles from privacy experts. I am writing today to introduce ISACA’s new GDPR guide.
Six months ago, ISACA brought together a team of information technology, information security, audit and data privacy professionals from around the world to help develop a guide that provides a pragmatic approach to implementing GDPR in organizations large and small. This guide provides a comprehensive introduction to GDPR, along with a plan to help organizations implement a data privacy program that complies with GDPR requirements.
The guide also incorporates available guidance from the Article 29 Data Protection Working Party (WP 29), which clarifies various topics covered in the regulation. At 100 pages, the guide can easily be read in a weekend. It will serve as a handy companion during the implementation of your data privacy program, as well as a solid reference during your day-to-day activities.
The guide provides advice on topics such as identifying and classifying personal data, data governance, information security, managing compliance in your supply chain, data breaches, employee awareness and more. The guide also includes several annexes that provide specific recommendations to help practitioners implement an effective and efficient data privacy program. Annex 1 is divided into nine domains that cover 46 processes organizations should implement as part of their GDPR programs. Annex 2 provides guidance on how to set up and manage the Data Privacy Impact Assessment (DPIA) process. Annex 3 provides a sample personal data register that must be created, maintained and readily available in the event of an audit. Throughout the document, we have defined common data privacy terminology and included a glossary of terms that we suggest you ensure are correctly used within your organization to avoid confusion.
The ultimate purpose of the guide is not simply to help organizations become GDPR compliant, but also to ensure the privacy of real people. To this end, we stress that the comprehensiveness of your data privacy program should be based on the risk to the subjects’ data that you hold and not solely on the risk to your organization.
ISACA’s GDPR Working Group believes that implementing GDPR will not only reduce the risks to your organization, partners and customers, but also has the potential to improve the effectiveness of your organization through the implementation of sound policies and processes. Many of us on the working group are privacy practitioners who will use the guide to help implement GDPR in our organizations. This will allow us to see first-hand what worked well and what could be improved. Stay tuned to this space, as we will provide regular updates as we count down to 25 May. Once we’ve received sufficient feedback, we will review and update the guide. In the meantime, we hope this guide is beneficial to you and your organization.
Category: Privacy Published: 1/30/2018 10:00 AM
The recent Global Risks Report by the World Economic Forum offers the latest evidence that cybersecurity is rising among the top global risks. Cyberattacks are now the global risk of highest concern to business leaders in advanced economies. This reflects the inability of enterprises to keep pace with today’s challenging threat landscape, and points to an urgent need for increased prioritization of and investment in cybersecurity by executive leadership.
While a cyberattack does not qualify as a natural disaster – one of the other top risks identified in the Global Risks Report – large-scale cyberattacks are capable of devastating critical infrastructure in similar fashion. A cyberattack has the potential to disrupt many of the most essential aspects of our lives, from electric, gas and water utilities to banking and cellphone coverage.
It is evident that the status quo will not be sufficient if we are to expect a reasonable level of security in both our personal and professional lives. Society and enterprises will need to focus on resilience, both technological and human. While contending with threats may be inevitable, our ability to recover cannot be undermined. We will need to build real and virtual firebreaks to ensure critical infrastructure elements do not fall due to the domino effect of a potential collapse.
Systemic challenges and threats require systemic solutions. Enterprises must focus not just on providing the next big app or solution to customers, but also on educating customers about potential threats and actions that can be taken to prevent or address them. In this context, it was encouraging to see the World Economic Forum announce plans for a new Global Centre for Cybersecurity. Deeper collaboration between the public and private sectors – while also tapping into the knowledge base of global industry associations such as ISACA – must be part of any substantive solutions going forward.
The increasing cybersecurity challenges that accompany the expanding threat landscape also call for the constant skilling and re-skilling of the technology workforce. Enterprises must be more committed to investing in real-world training for their security teams that takes into account the most up-to-date threats and vulnerabilities. Why is it so necessary to develop a more robust, highly skilled cybersecurity and tech governance workforce? Consider several realistic possibilities that I suspect we could encounter as 2018 progresses:
These, and other technology-driven stress points, are unprecedented challenges that demand proactive defense strategies. Disruptive technologies have the potential to power our global economy in many promising and innovative ways, but we must nurture new and more collaborative solutions to ensure these technologies are implemented effectively and securely.
While cybersecurity rising on the list of top global threats cannot be construed as good news, at least the global community has begun to recognize the scope of the challenge. Now it is time to pull together and meet that challenge as a global community.
Category: Security Published: 1/30/2018 3:04 PM
As auditors and security professionals, we devote much of our focus to the network perimeter. However, with the trifecta of porous perimeters, misconfigured cloud environments, and the enormous amount of data compromised and exposed by breaches, we must rethink how we scope our audits. Zero-trust concepts must be considered due to the increased likelihood of malicious actors getting past a somewhat hardened exterior to the soft, chewy middle of our corporate networks and virtual private clouds. We must focus our security assessment procedures on data protection.
The ISACA website contains many useful free resources, such as GDPR Data Protection Impact Assessments, HIPAA Audit/Assurance Program, and ICQ and Audit/Assurance Program for PCI DSS Compliance Program, addressing elements of data protection controls. The following are some questions that can be covered in data protection audit programs:
I wish you the best of luck as you drive your client or organization’s data protection strategy to new levels of maturity.
Read Mike Van Stone and Ben Halpert’s recent Journal article:
“Mistakes Happen—Mitigating Unintentional Data Loss,” ISACA Journal, volume 1, 2018.
The most prominent data security events of 2017, such as WannaCry and Equifax, were direct results of poor patching practices. Now, 2018 is off to a menacing start with disclosure of two hardware vulnerabilities affecting most modern microprocessors and requiring a number of patches on several levels of defenses.
To clarify, Meltdown is a vulnerability that allows core system memory access by any user process, while Spectre allows an unprivileged application to access the memory space of others.
What can happen? In simplest terms, one program executed on your computer can gain access to data that belongs to other users or utilize the operating system to access data, including passwords and personal data. What is affected? Most personal computers, servers and mobile devices. What can we do about this? The simple answer: patch everything that is affected, including BIOS, OS and browsers.
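Before patching, it helps to know where you stand. On Linux, kernels from early 2018 onward expose per-vulnerability mitigation status under sysfs. The sketch below is illustrative: exact status strings vary by kernel version, and the classification logic is a simplifying assumption.

```python
# Hedged sketch: checking Linux kernel mitigation status for Meltdown and
# Spectre via sysfs. Status strings vary by kernel; parsing is illustrative.
import os

SYSFS_DIR = "/sys/devices/system/cpu/vulnerabilities"

def classify(status_line: str) -> str:
    """Map a sysfs status line to a coarse category."""
    line = status_line.strip()
    if line.startswith("Not affected"):
        return "not_affected"
    if line.startswith("Mitigation"):
        return "mitigated"
    return "vulnerable"

def report():
    """Read every vulnerability file present and classify it."""
    results = {}
    if os.path.isdir(SYSFS_DIR):  # absent on pre-4.15 kernels and non-Linux
        for name in os.listdir(SYSFS_DIR):
            with open(os.path.join(SYSFS_DIR, name)) as f:
                results[name] = classify(f.read())
    return results

# classify() can be exercised even without the sysfs files present:
assert classify("Mitigation: PTI") == "mitigated"
assert classify("Vulnerable") == "vulnerable"
assert classify("Not affected") == "not_affected"
```

Run across a fleet, a report like this gives security teams a quick inventory of which hosts still lack OS-level mitigations, though BIOS/microcode and browser patches must be tracked separately.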
If everything seems so simple, why is this such a big problem? The answer is not so simple. In terms of scope, possible attack vectors and potential ramifications, these two vulnerabilities present perhaps the largest impact to our computer systems and networks that we have seen in a very long time.
Let’s start with the fact that it is likely every computer and mobile device in your infrastructure is somehow affected, along with a significant number of IoT devices. Arguably, your shared environments (such as Citrix) present the greatest exposure, as these systems are designed for multiple users and their core design depends on secure segregation between user resources.
Now consider the work facing many of us in the security community. We need to identify all the systems and software that must be patched, test the patches, implement them and deal with “side effects.” This includes legacy systems, as the vulnerabilities affect microprocessors manufactured as far back as 1995.
Today, while some patches introduce processing slowdowns and compatibility issues, not patching is not an option. We learned our lesson with the 2017 NotPetya ransomware, where the compromise of a single unpatched system was enough to begin infecting adjacent network devices.
As of now, there are no known mass exploitations of these vulnerabilities, but it is not because the hackers discounted these issues as “unexploitable.” In the world of hackers, exploitation of a vulnerability is only part of the equation. First, you must have a reliable distribution vector for the malware. Can an exploit be distributed in an email, on malicious sites or through other means to facilitate infection?
After malware executes its exploit, it must deploy a malicious payload – a set of instructions for what to do next. Sometimes it is an instruction set that allows the victim system to interact with a command-and-control server; other times it simply deploys ransomware. At this stage, attackers must take considerable care to bypass typical security controls such as anti-virus, IPS and other safety tools.
Lastly, there must be a mass monetization component – for ransomware, a setup to demand ransom, receive payments and release the encryption keys; in other cases, to facilitate data identification and exfiltration. None of these tasks is simple for hackers, and they can rarely be accomplished by a single person. Thus, nearly a month after the world became aware of the microprocessor vulnerabilities, there is still no mass exploitation.
Today on the dark web, the most common relevant conversation is not about abuse of Meltdown or Spectre. The most entrepreneurial hackers want to know if there are similar microprocessor vulnerabilities that have not yet been discovered and patched. Hacker bounties for these zero-day bugs are astronomical, and for good reason. No matter how good your system security is, if there is a fundamental hardware flaw, almost nothing will stop hackers from exploiting it on any vulnerable target of their choice.
Meanwhile, as hackers are regrouping and fantasizing about the unexploited data caches, let’s keep diligently patching and hope that the next vulnerability or wave of exploitation will not be brutal.
Category: Security Published: 1/26/2018 1:38 PM
Medical device security flaws are increasingly in the news. Consider the announcement from the U.S. Food & Drug Administration last year about a flaw in one model of a St. Jude Medical implantable pacemaker, which has subsequently been covered in more than 14,000 published reports to date. Thirty-four different individuals sent me a message soon after the news broke, asking if I had heard about the approximately 750,000 pacemakers of this specific model that had significant security vulnerabilities. Many reports about other types of wirelessly connected medical device flaws came before that, and more have appeared in the few months since.
Medical devices are integral parts of hospital networks
According to various estimates from research organizations – and healthcare CISOs I chatted with at the Detroit SecureWorld event last fall, where I delivered a keynote about medical devices – anywhere from 30-70% of medical devices within hospitals and clinics are “smart” – digitally connected to smartphones, the internet, clinic networks, directly to other devices, etc. These large numbers of medical devices attached to healthcare networks increase the possibilities for a wide range of security and privacy incidents to occur through exploiting their vulnerabilities – especially from and through the medical devices that have no legitimate security controls engineered within them.
Security and privacy incidents can occur due to various factors, such as:
Security complexity requires multiple layers of controls
Some changes to medical devices can be made remotely. Others must be made in proximity using near field communication (NFC) protocols. However, I’ve communicated with too many in the medical device industry who have expressed the belief, or outright claimed, that using NFC is a 100% solution for security. On three different occasions in 2017, when I asked about the security of their newly announced medical devices, representatives (IT security VPs/management) from three different large medical device manufacturers told me, “We use NFC, so security is not an issue.” I explained that if a medical device attaches via NFC to a computer that is part of a network, then essentially any other node on that network may be able to reach the device through that network connection – for example, through control settings necessary for network functions, or through discovery tools such as Shodan. Each time, the medical device representatives stopped communicating with me. Avoiding a security risk discussion does not solve the associated security risk.
Lack of planning for how devices integrate with networks and systems can shut down medical devices, sometimes during operations. Medical devices used for performing procedures, such as heart operations, have already shut down as a result of an anti-virus scan. Or consider the nurse who tried charging her cellphone using the USB port of an anesthesia machine and shut the machine down. I could provide a hundred additional examples. If medical device manufacturers do not improve the security engineering of their devices, security incidents will increase, along with privacy breaches and patient harm.
Medical device security concerns are justified
Healthcare providers (doctors, nurses and surgeons) are concerned. Rightly so. Flawed devices negatively impact their ability to assure patients they are providing them with safe devices that will help, and not potentially harm, them.
Healthcare information security practitioners (CISOs, CIOs, VPs, managers, etc.) are concerned. And for good reason. Security flaws within medical devices create vulnerabilities to data and functioning not only within the devices themselves, but also to the networks to which they are attached, and other devices on the networks.
Healthcare IT auditors are concerned. And they should be. Insufficient medical device security controls are compliance violations for growing numbers of regulations, laws and contractual requirements, in addition to facilities’ own posted privacy and security notices, which contain promises to which they are legally bound.
Healthcare regulators are increasingly concerned. Justifiably so. They are accountable for ensuring information security and privacy regulations are followed. When regulators see more reports of medical device security flaws and vulnerabilities, they are going to become more proactive to pressure medical device-makers to improve security controls, and to pressure device users to ensure devices are implemented with appropriate security.
Patients are concerned. Of course. Their lives could be at stake.
Dedicate 2018 to improving medical device security
As Data Privacy Day approaches this Sunday, here’s a recommendation for those in the medical device space (manufacturers, engineers and vendors): make it a goal in 2018 to establish effective and practical information security controls within your devices. Stop telling hospitals and clinics that it is not practical for you to do this. Building security controls into devices from the start is actually more practical, and will significantly improve security protections for those using them. This idea is supported not only by the information security profession, but also by the FDA and other regulators.
This will not let healthcare data security practitioners off the hook. Even if medical device creators improve the security of their devices, healthcare IT and security practitioners will still need to remain diligent to ensure the security of those devices in how they are connected to their networks, the control settings to access them, and the management of the data that comes from them. But improved device security will support these efforts.
Establish your baseline for current levels of medical device security now. Then, in December of this year, determine if and where there have been improvements, or whether data security, privacy and patient protections have actually degraded. It all depends upon where medical device companies decide to place their priorities.
Category: Privacy Published: 1/26/2018 9:59 AM
Editor’s note: Motivational business speaker Caspar Berry will bring his unique poker player’s perspective on risk to his opening keynote address at EuroCACS 2018, which will take place 28-30 May in Edinburgh, Scotland. Berry recently visited with ISACA Now to discuss topics such as overcoming the fear of failure and the dynamics of risk-aversion. The following is an edited transcript:
ISACA Now: You contend that all decision-makers are investors. What do you mean by that?
I mean that all decisions are investment decisions when you break them down. All decisions are resource allocation decisions. Allocations of money, yes, but often time. Sometimes we measure time in hours, but sometimes in less tangible units of passion or patience or dedication. All these resources are limited, and all are being allocated in a world of inherent uncertainty. By that, I don't mean the next five years. The next five minutes are uncertain … So, we're all investors, because everything we do is an allocation of a scarce resource in an uncertain world with a view of getting some kind of return on that investment by a variety of different criteria. That’s what investment is, at a fundamental level, and that's what we're all doing thousands of times a day.
ISACA Now: In an enterprise context, how can decision-makers push past their fear of failure?
In any context, the key to pushing through fear of failure is to understand what fear of failure is and where it comes from. Actually, what we colloquially call fear of failure is the product of two – arguably three – very prosaic psychological phenomena acting on us all the time. At the basic level is what we call “loss aversion,” a product of the diminishing marginal utility we get from most things we consume. Then there's time preference, which encourages us to seek short-term rewards and thus eschew long-term investments or delayed gratification. Then there's our judgment, which makes us pessimistic that new things can work compared to old things that apparently do.
In pretty much all these cases, the trick is to think long-term. In many contexts, that is an act of overriding our basic psychological hardwiring, which is still mostly – though not totally – designed to get us home safely at the end of each day. But, a poker player is not concerned about results at the end of any particular day. It’s irrelevant to us. We're concerned with maximizing our long-term expected value. If you do the right thing, then the law of large numbers gives you what you deserve at the end of the long term.
ISACA Now: Do you believe that millennials and other younger members of the workforce are any more or less risk-averse than those who came before them?
I don't think there is any data that proves this either way, per se. A lot of metrics for measuring risk tolerance are bunkum, anyway. Much of what we call risk tolerance is actually a product of the timeframe within which people weigh the consequences of failure ... don't get me started.
My gut, however, says that broadly speaking, millennials are inherently no more or less risk-tolerant than older generations. I hate the brushes with which a lot of people tar the millennial generation. That video on YouTube of Simon Sinek saying how short-term they all are and how they're all the beneficiaries of nepotism is also bunkum. (That’s logically impossible, by the way; how can they all be beneficiaries of nepotism over people of their own generation?) Not a single statistic is cited. You may as well ask a man in a pub.
The reality is that risk tolerance is a product of genetics (which won't have changed noticeably between boomers and millennials) and circumstance. So, for example, if someone sees less future in following the tried-and-tested path of university and then a corporate job, they may be inclined to take more risk in their life, but no more than their parents would have done had they been in that situation. Indeed, look at my grandparents’ generation. They put their lives literally on the line every day in a way that we would never think of doing, but only because the alternative was Nazi occupation. They weren't genetically different; they just had different situations producing different upsides and downsides, or what economists call incentives.
ISACA Now: How does one become a professional poker player? What set you on that path?
Oh, that's easy. Poker is the easiest career in the world to get into. You just go to Las Vegas ... put your money on the table, and ta da!