ISACA

SheLeadsTech Returns to United Nations

ISACA Now Blog - March 16, 2019 04:18:38
Body:

SheLeadsTech was back this week at the United Nations for the 63rd Session of the Commission on the Status of Women to continue the critically important work of empowering women and girls by providing access to social protection and appropriate infrastructure, including technology infrastructure. This annual meeting attracts delegations from each of the UN’s member-states and up to 4,000 civil society representatives.

 

A range of events and presentations takes place in parallel with the main business of the Commission, which is to agree on a set of conclusions that will form the roadmap for all 193 member-states in relation to the theme of the session. Winning the right to run such an event is highly competitive and not all proposals are accepted, but ours was! We were very excited and just a little nervous.

 

SheLeadsTech proposed a panel discussion, “Building a Global Alliance to Empower Women through SheLeadsTech.” Our panelists were chosen because they either work in collaboration with SheLeadsTech or are experts in creating coalitions or alliances. Even though our session was scheduled at 8:30 a.m. Wednesday after a big reception the night before, we need not have been concerned ... we ended up with a full house and our panel discussion was very well-received.

 

While the women’s leadership space may seem crowded, that just means there are more voices, and more voices mean we can elevate those voices beyond our traditional community. To be heard, you first need to listen, and you need to be authentic and believe in what you say. The panel (pictured with this post) provided another important platform for those voices to be heard.

 

In addition, we have been involved in ensuring that the requirements of our global constituents are not lost during the negotiation process and will be enshrined in the final roadmap to move the member states toward greater equity and empowerment for women and girls.

 

Editor’s note: Find out more about SheLeadsTech’s involvement in the Commission on the Status of Women last year in this blog post.

Category: ISACA Published: 3/15/2019 3:57 PM

ISACA at RSA 2019: Sharing Research and Spurring Conversations

ISACA Now Blog - March 14, 2019 23:21:04
Body:

The theme of last week’s RSA Conference 2019, “Better,” gave ISACA the opportunity to engage with information and cybersecurity professionals on how we collaboratively move the technology field into a better future.

ISACA kicked off RSA with the release of part 1 of the 2019 State of Cybersecurity report, which revealed insights into issues affecting the cybersecurity workforce and the skills that are currently most in demand.

ISACA leaders addressed this topic in their panel, “Building—and Keeping—Your Cybersecurity Team with Nontraditional Staff.” Rob Clyde, CISM, ISACA Board Chair, walked through key data from the 2019 State of Cybersecurity report, then moderated a discussion on how cybersecurity teams can source talent from diverse backgrounds and skill sets. Joining Clyde were panelists Tracey Dedrick, ISACA Board Director; Tammy Moskites, managing director, Accenture; Gabriela Reynaga, ISACA Board Director and founder/CEO, Holistics GRC Consultancy; and Gregory Touhill, ISACA Board Director and president, Cyxtera Federal Group, Cyxtera Technologies, Inc.

To illustrate how common it is for people to join the industry from other career or educational paths, Clyde asked the audience of over 75 attendees, “How many of you studied cybersecurity and then went immediately into a job?” Only a couple of attendees raised their hands.

The panelists shared some of the qualities that non-traditional job candidates can bring to a team. Dedrick noted that she likes to hire people who are the first generation in their family to go to college, noting they often “know how to not take things at face value, and to negotiate.” Both Moskites and Touhill emphasized the importance of new hires having a strong desire to get into the industry. “You want folks who are naturally curious and eager to solve problems,” said Touhill. Added Moskites, “I look for anyone with that ‘fire in their belly’ who is eager to learn,” noting that she once hired someone who had previously worked at a grocery store filling food containers, because of that person’s drive and passion—and this person turned out to be one of her best employees.

Reynaga and Touhill also shared their recommendations on how to attract more women and military veteran candidates to cybersecurity jobs. “There needs to be flexibility in scheduling for both men and women,” said Reynaga. Touhill noted that HR professionals need to have relationships with their local military bases’ transition offices, saying, “If you’re not, you’re cheating yourself out of a great talent pool. Veterans want to continue their mission.” He added that the Wounded Warrior Cyber Combat Academy is another place to look for veterans with cybersecurity skills.

Later that day, ISACA’s SheLeadsTech program hosted a panel that continued the conversation, focusing on how to attract and retain women in cybersecurity roles. Clyde introduced the group, including moderator Moskites and panelists Reynaga, Dedrick and Kim Dale, CISA, CISSP, IT Audit Specialist, Federal Reserve Bank of Chicago. The four women shared the stories of their career paths and challenges faced along the way—including one sharing how it felt to be the only woman on the leadership team, and another panelist recalling how she was once told that a more senior role wouldn’t be appropriate for her because she should be focused on getting married and having children. They provided advice for other women around applying for jobs even if not all qualifications are met, charting your own path if others try to hold you back, and defining what success means to you, even if it’s not considered “traditional.”

ISACA plans to continue these conversations well beyond the conference, promoting initiatives that lead the industry and engage all information and cybersecurity professionals toward “better.”

To learn more about the ISACA 2019 State of Cybersecurity Report, click here.

To learn more about SheLeadsTech and to get involved, click here.

Category: Security Published: 3/15/2019 10:16 AM

‘Didn’t You Read My Email??’ and Other Security Awareness Fallacies

ISACA Now Blog - March 13, 2019 06:39:36
Body:

I live in Austin, Texas, USA, where the bumper sticker quotient is fairly high, although diminishing with every vehicle that comes here from places like Dallas (no offense, Dallas — I don’t have any bumper stickers on my car either). One of my favorites is, “If you’re not appalled, you’re not paying attention.”

I’m sure it was written with politics in mind, but it’s absolutely relevant for cybersecurity, too. Most security professionals — me included — remember a time when we were appalled, closely followed by a desire to be part of the solution.

I see stacks of security awareness materials. To be effective, those materials rely on an appalled and aghast audience. Fear, uncertainty and doubt often provide an “easy out” for those looking for shortcuts.

"If you only understood how important this is, and all the bad things that really bad people are doing in the world, you’d stop reading this poster/email/training module and change your password/use a password vault/enable MFA right now.”

The problem is, people are only temporarily appalled, and after the shocking breach headline fades, they are no longer paying attention.

When considering the world of consumer messaging and advertising, we’re led to believe that humor, optimism and a sense of purpose are better levers than fear to motivate action. Let’s look at three common security awareness fallacies and how we can improve the ways we communicate to get people’s attention and create positive, engaging awareness campaigns, instead of shock and awe.

Awareness fallacies and corrective controls

Didn’t you read my email/policy/standard? People are bombarded all day by messages from all kinds of media — email, TV, billboards, Facebook and Twitter. They cannot escape it. I’ve worked at companies where people routinely receive 200 emails a day. With that much noise, people cannot read and intelligently process every email they receive. They read or skim what they think is important. Their focus is on their priorities and no one else’s. Corrective control: Use more than one channel to say the same thing over and over again. Not everyone is reading everything, so use email, posters, social media, videos, graphics, events and more to get your message across in every media channel available to you.

Up to and including termination. You cannot threaten your employees into a culture of security. Creating a culture is a lot like creating a brand – you can influence it, but you never completely control it. A brand lives in the hearts and minds of everyone who chooses to participate in it. People have to want to be a part of it – you can’t force them. Compliance is critical, and there’s a time for language like “up to and including termination” when you’re assigning mandatory training or writing policy. But if you use this type of threatening language with your security awareness materials, you should realize that it’s contrary to creating a culture people will embrace. Corrective control: I know a lot of training and awareness managers (I was one) who run a small part of their program for compliance, but the rest is optional. That requires you to be good at engaging people to take part and be new culture adopters. Identify those in the organization who are early and eager adopters and enlist them to help spread the message.

Human firewall, weakest link, end user. Way too many security communications refer to people in really unappealing terms – how can we blame them for not paying attention? I looked for an example of a successful consumer messaging campaign that instructed people to be more like technology, instead of illustrating how technology serves humanity. I did not find one, and that’s probably a good thing. Corrective control: Use language that empowers. Impart information that makes people better people — not the human element, firewalls, links or users.

Think about your company’s culture and your current approach to these common fallacies. Take a razor blade to all those appalling bumper stickers you might have on your security awareness training vehicles. Replace them with upbeat and engaging messages that educate and empower.

Editor’s note: For more insights on this topic, download a joint white paper from ISACA and Infosec.

Category: Security Published: 3/13/2019 9:10 AM

C-Suite: The New Main Target of Phishing

ISACA Now Blog - March 12, 2019 00:45:12
Body:

We know that phishing attacks are on the rise, but did you know that more and more executives are falling for these phishing emails every day? New phishing campaigns targeting executives are intelligently crafted and difficult to spot. Traditional hardware/software protection cannot keep up with rapidly evolving phishing methods. These campaigns easily bypass spam filters and Business Email Compromise protection solutions, and successfully get executives to reply, click on links and open documents.

One blatant example, according to the Agari Cyber Intelligence Division's London Blue Report, describes how a criminal organization, structured just like any modern organization, created “a list of more than 50,000 finance executives that was generated over a five-month period in early 2018. This list was likely used by London Blue as a massive targeting repository for their BEC attacks. Among them, 71 percent held a CFO title, 12 percent were finance directors or managers, nine percent were controllers, six percent held accounting roles, and two percent had executive assistant titles.”

According to Intermedia, 34 percent of executives/owners and 25 percent of IT workers themselves report being victims of a phishing email, more often than any other group of office workers.

From my latest research speaking with customers whose executives were successfully targeted, the first emails that came in contained NO links or files. The hackers are doing their research on these executives and their contact circles, so they can send simple emails from organizations and people that the targeted executive has done business with or interacted with before. These first few emails are used to build trust, so that at some point in the future the target will click on a link, open a document or, even worse, tell an assistant to respond on his or her behalf.

By August 2018, at least 400 industrial companies were targeted by spear-phishing attacks disguised as legitimate procurement and accounting letters, according to Kaspersky Lab.

These folks are smart – very smart. They know that for lower amounts, fewer approvals are required, so they will typically seek approvals for the release of funds under US $50,000 per transaction. Now, add to this the fact that some organizations may not realize they have been phished until five months later, and that makes for a scary proposition.

Evolving phishing attacks mean that criminals are continually looking for new ways to completely mask their malicious URLs, especially on mobile devices. They either hide them behind a page like Google Translate that users are already familiar with or completely trick users with custom web fonts and altered characters. One of the latest approaches is to create an Office 365 meeting invite that contains quiz buttons or a poll asking recipients to pick the topic or date for the next meeting; employees that end up clicking are presented with a fake Office 365 login page where they enter their O365 credentials and then lose control over their email account. Another approach is an email that comes from someone you know with a request to take a look at something for them. When you click on the link or attachment, malware installs on your system, takes over your email client, and then emails the same message from you to all your contacts.

All is not lost, however. There is a way to help prevent and thwart these attacks. You need a security awareness program that instils a culture of security throughout your organization starting in the boardroom and leading by example.

According to Cybersecurity Ventures 2019 Cybercrime Report, “Training employees how to recognize and defend against cyberattacks is the most underspent sector of the cybersecurity industry.”

If more than 92 percent of all breaches and hacks are due to phishing, then employees with an email address, social media account, phone or tablet are your organization’s largest attack surface. Millions of dollars are spent on hardware and software security measures, yet still today, a single click from a single user can circumvent all the expensive protections in place. It may be time to rethink your approach to cybersecurity and start applying the Human Fix to Human Risk.

To effectively change phishing behaviors and build a security culture among executives and all employees, you need a comprehensive awareness program that is carefully planned, and which is based on your organization’s specific needs and objectives. This is difficult to achieve unless you apply a proven security awareness framework—an ongoing methodical approach – which should include these five steps:

Step 1 – Analyze your organization’s needs and objectives and develop a cybersecurity awareness program that generates results.
Step 2 – Plan your campaigns to stay on track and engage your workforce as well as your stakeholders.
Step 3 – Deploy an effective training initiative and witness behavior change as it happens.
Step 4 – Measure the performance of your campaigns against your objectives and demonstrate progress to stakeholders.
Step 5 - Optimize campaigns accordingly and update your program to incorporate new insights.
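
As a small illustration of Step 4, the sketch below computes click and report rates from hypothetical per-employee phishing simulation records and compares them against example objectives. The record fields, names and thresholds are assumptions for illustration, not part of any particular product or framework.

```python
from dataclasses import dataclass

@dataclass
class SimulationResult:
    # One record per employee per simulated phishing email (fields are illustrative)
    employee_id: str
    clicked_link: bool
    reported_email: bool

def campaign_metrics(results):
    """Compute click and report rates for one phishing simulation campaign."""
    total = len(results)
    clicks = sum(r.clicked_link for r in results)
    reports = sum(r.reported_email for r in results)
    return {
        "click_rate": clicks / total if total else 0.0,
        "report_rate": reports / total if total else 0.0,
    }

# Hypothetical objectives: keep clicks under 5 percent, get reports above 30 percent
objectives = {"click_rate": 0.05, "report_rate": 0.30}

results = [
    SimulationResult("e001", clicked_link=True, reported_email=False),
    SimulationResult("e002", clicked_link=False, reported_email=True),
    SimulationResult("e003", clicked_link=False, reported_email=False),
]

metrics = campaign_metrics(results)
print(f"Click rate: {metrics['click_rate']:.1%} (objective: under {objectives['click_rate']:.0%})")
print(f"Report rate: {metrics['report_rate']:.1%} (objective: over {objectives['report_rate']:.0%})")
```

Tracked campaign over campaign, these two numbers give stakeholders a simple, comparable view of behavior change over time, which feeds directly into Step 5.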

Without a framework, it’s just hit and miss, and you will never get your users, whether they are executives or not, to change their risky behaviors with an unorganized approach. A framework is designed to take everything into consideration – especially how people learn, adopt and maintain new habits. Taking such a methodical approach ultimately leads to a culture of security awareness … with dramatically fewer human-related security breaches.

Malicious and fraudulent emails will continue to bypass filters and malware detection solutions for the foreseeable future, allowing cybercriminals to make more money. But, there is hope if you leverage a tried and proven combination of phishing simulations targeting the C-Suite that include executive awareness training based on a pedagogical approach, continually reinforced with communication to change current behavior and help reduce your largest attack surface.

Editor’s note: For more insights on this topic, download the Phishing Defense and Governance white paper, released by ISACA in partnership with Terranova Security.

Category: Security Published: 3/14/2019 10:31 AM

GDPR Audits for SMEs Are All About the Language

ISACA Now Blog - March 12, 2019 00:34:39
Body:

It is often said that a good auditor is a good communicator, and this is particularly true when dealing with smaller organizations.

Small and medium-sized enterprises (SMEs) tend not to have the capacity to employ specialists in every role, instead relying upon generalists who fulfil many roles in the organization.

Unless the SME’s business is data processing or falls into one of the other categories that require a data protection officer (DPO), then the chances are that as auditors we will be speaking to the finance head or IT manager or HR manager about data protection.

ISACA’s new GDPR Audit Program for Small and Medium Enterprises is written not with the professional IT auditor in mind, but the auditee. Consequently, its language is simplified from that of the enterprise version.

One of the biggest issues I have found when dealing with SMEs is ensuring my conversations and questions are designed to fit the audience and are jargon-free. Only by adjusting the narrative to fit the audience can we hope to deliver an audit product that adds value. This is particularly important with GDPR in the SME space. Indeed, many SMEs still have not fully embraced the central theme of the GDPR – it’s all about the data subject, not the organization.

When auditing SMEs, it’s as much about education as compliance. GDPR is about how following some basic rules of good data governance, such as ensuring data quality, can add value, not just cost, to an SME. As auditors, we can help owners and managers embrace this concept, so that we are adding value above and beyond what is derived from a compliance report.

It is also important to be aware that many SMEs will not have received the best advice leading up to GDPR. Many will have scoured the internet, talked with fellow business owners or at best attended a seminar or two – or, worse, been drawn into spending money on software solutions that are generic and not a good fit for their businesses.

In the hands of an experienced auditor, the audit program should be used as much to help devise a remediation plan as to arrive at an audit opinion. After all, the audit is designed to validate controls implemented to manage risk and to agree to a risk treatment plan.

A survey by Q2Q in November 2018 found that 41 percent of SMEs are still unsure about the rules and regulations surrounding GDPR. This, combined with 22 percent saying that emerging online risks are their biggest headache, presents an opportunity for the auditor to use the program to offer genuine guidance to their SME clients.

One of the major issues that organizations and their auditors had with the previous Data Protection Act was that it was primarily viewed as an IT problem to be solved with technology. Complying with GDPR is about managing information risk and needs to consider a trio of risks: people, processes and technology. These risks must be considered across all facets of an organization.

Category: Privacy Published: 3/12/2019 3:44 PM

Cybersecurity: A Global Threat That We Can Control

ISACA Now Blog - March 6, 2019 08:31:54
Body:

If there were any question about the critically important role that information and cyber security practitioners play in the welfare of today’s society, there is new evidence spelling it out in stark, attention-grabbing terms.

Data fraud/theft and large-scale cyberattacks were each identified among the top five global threats in the latest edition of the World Economic Forum’s Global Risks Report. The other elements on the list: extreme weather events, failure of climate change mitigation and major natural events, such as earthquakes and tsunamis.

Think about that for a moment: protecting data and thwarting cyberattacks now have ascended alongside dealing with natural catastrophes as the most pressing threats demanding the world’s full attention.

In some ways, the cybersecurity dangers we face are similar to the other, naturally occurring disasters that occupy the top spots on the global risks list. Just like a city or village can appear perfectly tranquil one day, only to be torn asunder the next by a raging storm or fierce earthquake, too many organizations today are lulled into a false sense of security, preoccupied by business as usual, and then are blindsided by a major cyber incident that causes business upheaval from which they may never fully recover. But unlike most of the natural disasters that cause so much damage, humans are capable of preventing much of the suffering that results from attacks on our digital world. That is a challenge the security community must commit to addressing on a global scale.

Given that backdrop, it is encouraging that the gathering of world leaders in Davos for the 2019 World Economic Forum included extensive discussions around cybersecurity and its rising importance in the global digital economy. As Brad Smith, president and chief legal officer at Microsoft, said in a panel discussion in Davos, “It’s all about keeping the world safe. The world depends on digital infrastructure and people depend on their digital devices, and what we’ve found is that these digital devices are under attack every single day.”

Cybersecurity is a fundamental enabler of the digital economy, protecting organizational assets, contributing to business continuity, defending brand names, potentially providing a competitive advantage, and managing liabilities and risk as a whole. The failure of organizations to take sufficient action in protecting themselves and their customers from cyber threats has necessitated increasing regulatory involvement, with 2018 marked by the enforcement of the EU’s General Data Protection Regulation (GDPR) and similar policies being crafted in the US and elsewhere; Smith anticipates a large-scale federal privacy law in the US to be enacted within the next year or two.

While new regulation and the development of national cybersecurity strategies can be helpful, no one or two isolated steps alone can keep us safe. Cybersecurity requires a holistic approach, taking into account people, processes, technology, organizational structures and business strategies, and addressing the overall business ecosystem, which nowadays is built through the interfacing of many actors. These actors increasingly work across international borders, meaning the more substantive dialogue international leaders have, such as the conversations that took place in Davos, the more opportunity there is for meaningful collaborations that will drive toward real solutions. This dialogue must be ongoing and include both the public and private sectors, as well as academia and industry professional associations.

These challenges are only going to intensify in the coming years. The evolution of the cyberthreat landscape cannot be ignored, especially with the rapid proliferation of new technologies and the corresponding changes to business models. The fact that only 40 percent of respondents to ISACA’s 2018 Digital Transformation Barometer express confidence in their organization’s ability to assess the security of systems based on AI and machine learning suggests that the challenges will only escalate as AI and other fast-developing technologies are deployed more frequently. The global public and private sectors are still far from being prepared for this reality. In particular, there is much work to be done in recognizing the need to take a risk-based approach to understanding organizational cybersecurity preparedness and in appropriately prioritizing and investing in training resources for security teams.

One of the more interesting comments at the World Economic Forum came from Troels Oerting Jorgensen, Head of Centre for Cybersecurity at the WEF, who said, “We must not sell fear but protect hope to make sure the good side of the internet is always in focus.” That is a great way to look at it, but even better than hope is confidence, and confidence must be earned by being prepared. While cybersecurity appearing so prominently among top global threats is a jarring sight for all security professionals, at least there is no ambiguity about the extent of the challenge. While there is only so much humans can do about a tsunami or prolonged drought, cybersecurity is a people-driven challenge that our collective ingenuity and resolve can go a long way toward addressing.

Editor’s note: This post originally appeared in CSO.

Category: Security Published: 3/8/2019 2:58 PM

Artificial Intelligence and Cybersecurity: Attacking and Defending

ISACA Now Blog - March 6, 2019 00:18:28
Body:

Cybersecurity is a manpower-constrained market – therefore, the opportunities for artificial intelligence (AI) automation are vast. Frequently, AI is used to make certain defensive aspects of cybersecurity more wide-reaching and effective. Combating spam and detecting malware are prime examples.

On the opposite side, there are many incentives to using AI when attempting to attack vulnerable systems belonging to others. These incentives include the speed of attack, low costs and difficulties attracting skilled staff in an already constrained environment.

Current research in the public domain is limited to white hat hackers employing machine learning to identify vulnerabilities and suggest fixes. At the speed AI is developing, however, it won’t be long before we see attackers using these capabilities on a mass scale, if they aren't already.

How do we know for sure? The fact is that it is quite hard to attribute a botnet or a phishing campaign to AI rather than a human. Industry practitioners, however, believe that we will see an AI-powered cyber-attack within a year; 62 percent of surveyed Black Hat conference participants seem to be convinced of such a possibility.

Many believe that AI is already being deployed for malicious purposes by highly motivated and sophisticated attackers. It’s not at all surprising given the fact that AI systems make an adversary’s job much easier.

Why? Resource efficiency aside, AI systems introduce psychological distance between attackers and their victims. Indeed, many offensive techniques traditionally involved engaging with others and being present, which, in turn, limited attackers’ anonymity. AI increases both the anonymity and the distance. Autonomous weapons are a case in point; attackers are no longer required to pull the trigger and observe the impact of their actions.

It doesn’t have to be about human life, either. Let’s explore some of the less severe applications of AI for malicious purposes: cybercrime.

Social engineering remains one of the most common attack vectors. How often is malware introduced in systems when someone just clicks on an innocent-looking link?

The fact is, to entice the victim to click on that link, quite a bit of effort is required. Historically, it’s been labor-intensive to craft a believable phishing email. Days and sometimes weeks of research, and the right opportunity, were required to successfully carry out such an attack. Things are changing with the advent of AI in cyber.

Analyzing large data sets helps attackers prioritize their victims based on online behavior and estimated wealth. Predictive models can go further and determine willingness to pay the ransom based on historical data, and even adjust the size of pay-out to maximize the chances and, therefore, revenue for cybercriminals.

Imagine all the data available in the public domain, as well as secrets previously leaked through various data breaches, now combined for the ultimate victim profiling in a matter of seconds with no human effort.

Once the victim is selected, AI can be used to create and tailor emails and sites that are most likely to be clicked on, based on the crunched data. Trust is built by engaging people in longer dialogues over extended periods of time on social media, requiring no human effort. Chatbots are now capable of maintaining such interaction and even impersonating real contacts by mimicking their writing style.

Machine learning used for victim identification and reconnaissance greatly reduces attackers’ resource investments. Indeed, there is no longer even a need to speak the same language. This inevitably leads to an increase in the scale and frequency of highly targeted spear phishing attacks.

The sophistication of such attacks can also go up. Exceeding human capabilities of deception, AI can mimic voice thanks to the rapid development in speech synthesis. These systems can create realistic voice recordings based on existing data and elevate social engineering to the next level through impersonation. This, combined with other techniques discussed above, paints a rather grim picture.

So, what do we do?

Let’s outline some potential defense strategies that we should be thinking about already.

First and rather obviously, increasing the use of AI for cyber defense is not such a bad option. A combination of supervised and unsupervised learning approaches is already being employed to predict new threats and malware based on existing patterns.

Behavior analytics is another avenue to explore. Machine learning techniques can be used to monitor system and human activity to detect potential malicious deviations.
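
As a minimal sketch of that behavior-analytics idea, the snippet below fits an unsupervised anomaly detector on synthetic per-session activity features (login hour, megabytes transferred, failed logins) and flags deviations from the learned baseline. The features, values and thresholds are invented for illustration; a real deployment would rely on much richer telemetry and careful tuning.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic "normal" activity: [login hour, MB transferred, failed logins]
rng = np.random.default_rng(42)
normal_activity = np.column_stack([
    rng.normal(10, 2, 500),    # logins cluster around mid-morning
    rng.normal(50, 15, 500),   # typical data transfer volumes
    rng.poisson(0.2, 500),     # occasional failed logins
])

# Fit an unsupervised model on the baseline of normal behavior
detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_activity)

# New observations: one ordinary session, one suspicious session
# (3 a.m. login, very large transfer, many failed logins)
new_sessions = np.array([
    [11, 55, 0],
    [3, 900, 12],
])
labels = detector.predict(new_sessions)  # 1 = normal, -1 = anomaly
for session, label in zip(new_sessions, labels):
    status = "anomalous" if label == -1 else "normal"
    print(f"session {session.tolist()} -> {status}")
```

An isolation forest is only one option; the same pattern applies to clustering or density-based methods, and supervised classifiers can complement it where labeled malicious activity is available.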

Importantly though, when using AI for defense, we should assume that attackers anticipate it. We must also keep track of AI development and its application in cyber to be able to credibly predict malicious applications.

To achieve this, a collaboration between industry practitioners, academic researchers and policymakers is essential. Legislators must account for potential use of AI and refresh some of the definitions of “hacking.” Researchers should carefully consider malicious application of their work. Patching and vulnerability management programs should be given due attention in the corporate world.

Finally, awareness should be raised among users on preventing social engineering attacks, discouraging password re-use and advocating for two-factor authentication where possible.

Category: Security Published: 3/6/2019 3:05 PM

Paying for Apps with Your Privacy

ISACA Now Blog - February 28, 2019 06:53:19
Body:

Don’t look at your device when I ask you this question: How many apps do you have on your smartphone? Or, if you use your tablet more often, how many apps do you have on your tablet? Remember this number or write it down.

OK, now look at your device. How many apps do you actually have installed? Is that number higher than what you wrote down previously?

For most people, it would be. In many of my keynotes, and in most of my client key stakeholder meetings, I ask this question. I’ve seen around 90-95 percent of people severely underestimate the number of apps they have on their devices. For example, I’ve had people tell me they had maybe 15 or 20 apps installed, and after they checked, they found they actually had well over 100. But they were only using around 15 of them.

Keep this in mind: just because you are not actively using apps does not mean that those apps are not actively harvesting data from you.

Most people download apps willy-nilly. The mentality is often, if it is free, then, hey…let’s get it and see what it does! Oftentimes those never-used-but-still-installed apps are silently and often continuously taking data from the device and sending it to the app vendor, which then shares the data with unlimited numbers of other third, fourth, and beyond parties. Who are those third parties and beyond? What are they doing with your app data? How can those actions have negative impacts on those associated with the data?

Throughout my career, when doing my hundreds of assessments and risk analyses, I’ve often heard the following from those reading the reports, “Have these possibilities you’ve outlined actually happened? Has such misuse of data actually happened? Why is sharing data from devices a problem?” The overwhelming opinion was, "If nothing bad has happened yet, or we haven’t heard about bad things happening, then why worry? Probably nothing bad will happen." This often-stated denial of risks, and the lack of accountability that such opinions try to establish, are factors motivating app vendors and tech companies to share as much app data as possible, monetizing it along the way, and leading to a wide range of emerging invasions of privacy that don’t fall neatly under the definitions of “privacy breaches,” even though those involved certainly feel creeped-out and victimized, often in multiple ways.

Recent reports, including an intriguing one from the Wall Street Journal, are shining light on how many app vendors are sharing data with Facebook, one of the many social media and tech giants involved. For example, the report noted, “Instant Heart Rate: HR Monitor, the most popular heart-rate app on Apple’s iOS, made by California-based Azumio Inc., sent a user’s heart rate to Facebook immediately after it was recorded.” Do you think the app users knew this would happen? To what other businesses was their data sent? What about all the other apps being used? How many other organizations are they sending data to, unbeknown to the app users?

The types of data from apps that are being shared, and the insights they can give into people’s lives, are alarming, and go far beyond heart rate data. Apple and Alphabet Inc. (Google’s parent company) reportedly don’t require apps to disclose to the app users all the third parties that receive their personal data. So, in the HR Monitor example, the app users were likely not told that Facebook was going to get their data immediately as the data was collected. How many other third parties, and which ones, also got their data?

There are some huge problems that app creators and tech companies are generally not addressing in any meaningful or long-term way. Here are a few of them:

  • They do not clearly describe all the data they are collecting, deriving, sharing, processing and storing that can be linked to specific individuals. In other words, they are not defining the personal data involved with the apps.
  • They do not specify the types of other data being associated with personal data, a combination that can result in very sensitive data.
  • They do not list the third parties with whom they are sharing that data, nor how the app users can determine how those third parties are using their data.

App creators and distributors need to do a better job at communicating the answers to these important questions to all those using their apps. But app users also need to be more proactive. They need to be more vigilant with how they download, use, and remove apps from their devices. I provided advice to app users about this in a couple of recent news stories – you can check them out at USA Today and Nerdwallet.

Category: Privacy Published: 2/28/2019 9:58 AM

Environmental Drift Yields Cybersecurity Ineffectiveness

ISACA Now Blog - February 27, 2019 07:52:36
Body:

Your cybersecurity tools are working, optimized, and providing real, measurable, business value. They are successfully blocking attacks, detecting nefarious activity, and alerting the security team.

Then it happens. Somewhere a change is made by someone outside of the security department. That change isn’t communicated to the security team. Now all of a sudden, your cybersecurity tools are becoming ineffective and, worse, financial, brand, and operational risk has been introduced to the organization. Your cybersecurity effectiveness has drifted from a known good state. You are experiencing environmental drift.

Environmental drift
There are countless causes of environmental drift. More often than not, environmental drift is the result of someone in IT or a related group making a change without any malicious intent. However, the change might not be communicated to the security team or it might have unintended consequences that degrade cybersecurity effectiveness.

Here are some examples of ways that environmental drift can be introduced and the impacts that can result:

  • A proxy is installed that is inadvertently dropping syslog traffic between cybersecurity tools and their management consoles. This can result in a lack of visibility on the management console, and if a SIEM is involved, events relevant for correlation and alerting are simply not seen.
  • A tap or span is modified to only send unidirectional traffic to a cybersecurity tool. This can result in that cybersecurity tool becoming totally ineffective because many tools require access to bidirectional traffic to operate correctly.
  • A firewall rule configuration change is made to open various ports for testing, but the configuration is never returned to the prior state. This can result in a wide range of issues such as data exfiltration, cleartext protocols being allowed, successful beaconing, and active C2, for example.
  • An update made to an endpoint cybersecurity tool before being fully tested breaks some existing capabilities. This can result in endpoints such as laptops and servers being made vulnerable to credential theft, data theft and sabotage.
  • A configuration modification made in the cloud alters network segmentation. This can result in webservers, databases, and other assets not being protected by cybersecurity tools such as firewalls or WAFs because from a networking perspective those assets are now on the internet side of those cybersecurity tools. This type of mistake is pretty easy to make in the cloud while less likely in a data center, where you are physically connecting cables.

These are just a few simple examples where a known good cybersecurity effectiveness baseline can drift because of environmental changes of which the security team may not even be aware. Environmental drift happens all the time, everywhere, regardless of company size, processes, and tools. It greatly reduces the value the cybersecurity tools and teams provide and puts organizations at risk.

Detecting and mitigating the drift
Because environmental drift can happen at any time, impacting any cybersecurity tools, it’s essential to utilize an automated approach to detect when you have drifted from a known good cybersecurity effectiveness state. In other words, you need to know when that thing that was working has stopped working. For example, my WAF was stopping XSS attacks for my cloud-based webserver, my DLP was preventing PII from going out to the Internet over ICMP regardless of compression type, and my SIEM was correlating and alerting on lateral movement based on cybersecurity tool and operating system logs. But now something has stopped.

So, what can be done?

Create a baseline of known good cybersecurity effectiveness. Understand how your cybersecurity tools are reacting to various tests such as data exfiltration, the installation and execution of malware, beaconing, and thousands of other measures across endpoint, email, network, and cloud cybersecurity tools. In most cases, you’ll need to tune those cybersecurity tools to operate the way you want because the process of validating their effectiveness will often yield a number of shortcomings.

Use automation to detect drift from that known good baseline. You still may have an imperfect environment, but as you continue to improve, you’ll know at each stage how your cybersecurity tools should be preventing, detecting, alerting, and so on. Any deviation from this known good baseline detected through automation is an anomaly. Responding to the anomaly means you’ll be managing by exception, making your responses more precise, such as knowing that the WAF in the cloud that was preventing XSS stopped preventing XSS 15 minutes ago.
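
The fragment below is one way to picture that automation, assuming a stored known-good baseline of validation-test outcomes and a dictionary of the latest results; the test names and result values are hypothetical placeholders rather than output from any specific tool.

```python
# Known-good baseline: expected outcome of each automated validation test,
# captured when the cybersecurity tools were last tuned (names are hypothetical)
baseline = {
    "waf_blocks_xss_on_cloud_webserver": "prevented",
    "dlp_blocks_pii_over_icmp": "prevented",
    "siem_alerts_on_lateral_movement": "alerted",
}

def detect_drift(expected_results, latest_results):
    """Return a finding for every control that has drifted from the baseline."""
    findings = []
    for test, expected in expected_results.items():
        actual = latest_results.get(test, "not_run")
        if actual != expected:
            findings.append(f"DRIFT: {test} expected '{expected}' but got '{actual}'")
    return findings

# Latest automated run: the WAF control has silently stopped working
latest = {
    "waf_blocks_xss_on_cloud_webserver": "allowed",
    "dlp_blocks_pii_over_icmp": "prevented",
    "siem_alerts_on_lateral_movement": "alerted",
}

for finding in detect_drift(baseline, latest):
    print(finding)  # manage by exception: only deviations need a response
```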

Use these drifts as scenarios to dissect in postmortems with a goal of improving the processes and communication between the cybersecurity team and others.

The ongoing effort to mitigate environmental drift will often result in highlighting where investments need to be made to further improve cybersecurity effectiveness as well as where legacy or redundant solutions can be removed, thus allowing those dollars to be reinvested in more critical areas. Continue to expand the reach of your known good baseline and the types of tests you are using to validate your cybersecurity effectiveness.

Environmental drift will not go away. There are simply too many variables and too much complexity to fully remediate it. However, using automation, environmental drift can be detected and mitigated, and the process of doing so will have broader ramifications for cybersecurity effectiveness as a whole – leading to improved change management and communication, greater value and precision from cybersecurity investments, and ultimately, reduced financial, brand, and operational risk from cyber threats.

Category: Security Published: 2/27/2019 3:00 PM

Challenges on Cybersecurity Landscape Demand Strong Leadership

ISACA Now Blog - February 26, 2019 06:45:11
Body:

Senior leaders in business and government ought to take note of ISACA’s State of Cybersecurity 2019 research, which details the findings of a global survey of cybersecurity professionals.

The report highlights many of the issues of which we cybersecurity professionals long have been painfully aware: that it is increasingly difficult to recruit and retain technically adept cybersecurity professionals; that while gender diversity programs have yielded positive results, support for these programs may be waning; and that cybersecurity professionals are concerned that budgets for cybersecurity programs are flattening or on the decline.

While most senior leaders are already sensitive to these issues, the report should kindle a sense of urgency to address them. I submit that traditional methods of addressing these issues are inadequate to remedy the situation and we need to look to other leadership approaches to fill the gaps.

With cybersecurity professionals being such a high demand/low density asset, organizations ought to think out-of-the-box to ensure they have the right people, with the right skills, in the right place, at the right time. They need to look at other sources of talent. As an example, I am a huge fan of reskilling personnel. Reskilling means training an existing employee in new skills to fill gaps. During my time in the US Air Force, I saw this technique used to great effect as we took mid-level security forces personnel and trained them in information technology and cybersecurity skill sets. Some of the best cybersecurity professionals I know are former Air Force cops. Reskilling personnel is a tool that senior leaders can use to close the gaps.

Retention of coveted cybersecurity personnel is always going to be difficult, especially when the promise of a better salary and benefits is presented. Senior leaders ought to take a hard look at the compensation of their cybersecurity personnel, most of whom are undervalued. When you consider the “value at risk” protected by the cybersecurity professional, there is a good case to be made that, in many organizations, the cybersecurity staff is not receiving proper or competitive compensation.

For cybersecurity professionals, compensation is more than just making money. It is about being valued. It means seeing the organization demonstrate its commitment to its workforce (and its clients) by investing in the right technology and ensuring that its staff receive continuing professional education paid for by the organization. It means assigning leaders who understand and appreciate technology’s role in driving business success and sharing the rewards equitably. The best organizations that I served in made sure staff training was in the budget and that every member of the team knew what we, as an organization, were investing in them. In fact, I received my CISM certification through ISACA thanks to a commitment from my organization. Leadership matters when it comes to retention.

Likewise, leadership matters when it comes to fostering an environment where everyone’s contributions are valued. I know the value that diversity provides organizations and take notice when I see diversity programs being perceived as on the decline. ISACA’s 2019 State of Cybersecurity findings ought to spur an internal look into your organization. Is your diversity program on-track and meeting your current and future goals? Do you have the right personnel to ensure that you have diversity of experience, thought, culture and perspectives? Is your diversity program training producing the results you need? If the answer is no to any of these questions, it is time for leaders to step in and step up.

Effective and informed leadership is needed to address the issues this report highlights. Let’s all take a leadership role to make things better.

Category: Security Published: 3/4/2019 7:59 AM

Three Keys to Improving Medical Device Security

ISACA Now Blog - February 26, 2019 04:16:11
Body:

A report released in January by the Healthcare & Public Health Sector Coordinating Councils details the need for better security for medical devices, a topic infrequently discussed in healthcare until recently. Successful cyber incidents that have used medical devices as the attack vector have brought the reality home. In one instance in the US, a network-connected portable X-ray machine was infected and the malware spread through the entire corporate network. It took about 16 months to completely eradicate the malware.

While medical device security is a vital segment of overall security, it’s often relegated to the bottom of the action plan. Historically, securing medical devices was in the hands of the manufacturers who often stood by the claim that they could not update their software without violating FDA regulations or going through arduous approvals. While that was true, recent efforts between regulators, manufacturers and healthcare leaders are beginning to show signs of progress on this front.

In the meantime, what can healthcare IT leaders do to secure medical devices? Here are three basic steps every healthcare organization should take immediately.

1. Connect information technology and clinical engineering.
In some healthcare organizations, leadership of IT and clinical engineering (CE) reside in one role. That may be a VP or a director, but at some point in the hierarchy, these two departments need unified leadership to ensure they are working in a coordinated manner. Connect and cross-train. While IT and CE aims are different, they are well-aligned. IT knows systems and security; CE knows medical devices and associated regulatory requirements. CE is a foreign world to most IT people, but it’s knowable. When you have an IT VP or director facilitating discussions about medical device network segmentation, for example, both IT and CE have to figure out the details together. While an IT leader might struggle at first to understand the world of medical devices, it’s an interesting and worthwhile endeavor that will pay dividends in streamlined communication and more effective security risk management.

2. Inventory network connected devices.
Your CE leadership should have an accurate inventory of the medical devices in the organization and that data should include whether or not the device is network-connected. If that data doesn’t exist, that should be your priority. It’s relatively easy to discover based on the class of medical device. For example, your CE leader will know whether your medication pumps are on the network (likely yes) or your intra-aortic balloon pumps are network-enabled (it depends) or your CT or MRI machines are connected (likely yes). With that data, you can begin developing your security plan. Start with the easiest ones (stationary devices like CT and MRI) and work your way down the list. Or, start with the devices you deem to be highest risk. The point is to make a plan and get to work.

3. Segment medical devices on isolated networks.
Since you are unlikely to be able to run any sort of protective software (anti-virus, anti-malware, etc.) on most of your medical devices, you have to protect them via firewalls, network segmentation, white lists, network monitoring, etc. There are many things you CAN do that are non-invasive, but don’t take any action until you’ve thoroughly vetted it with your CE leaders and done tests at a time when patients won’t be impacted by any glitches. For example, you want to protect your X-ray and CT machines. Test your solution at 1 a.m. on a Sunday after you’ve notified X-ray and CT leaders of the test. It may sound extraordinarily cautious, but you can’t introduce issues that might impact patient care. If a case is underway and a network change causes the machine to reboot, you could cause problems that delay patient care. It’s important to remember that things taken for granted in IT must be done with much more care in the medical environment. Ensure you plan and test in advance of making any changes that could impact patient care.
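
One non-invasive check that follows from steps 2 and 3 is sketched below: given the CE inventory of network-connected devices, confirm that each device address actually sits inside an approved, isolated medical-device subnet. The device names, addresses and subnets are made-up examples, not a reference configuration.

```python
import ipaddress

# Approved isolated subnets for medical devices (illustrative values)
MEDICAL_SEGMENTS = [ipaddress.ip_network("10.20.0.0/24"),
                    ipaddress.ip_network("10.20.1.0/24")]

# Excerpt of a CE inventory: network-connected devices only (hypothetical data)
inventory = [
    {"device": "CT scanner 1", "ip": "10.20.0.15"},
    {"device": "Medication pump 7", "ip": "10.20.1.42"},
    {"device": "Portable X-ray 3", "ip": "10.5.8.23"},   # outside the approved segments
]

def outside_segments(devices, segments):
    """Return devices whose addresses are not in any approved segment."""
    findings = []
    for item in devices:
        addr = ipaddress.ip_address(item["ip"])
        if not any(addr in net for net in segments):
            findings.append(item)
    return findings

for item in outside_segments(inventory, MEDICAL_SEGMENTS):
    print(f"Review segmentation for {item['device']} at {item['ip']}")
```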

While the FDA and manufacturers continue to work out the kinks in the process, there are proactive steps you can take to improve medical device security. It may not be perfect, but information security is an always-evolving field and there is no perfect state – only improvements along the way.

Category: Security Published: 2/26/2019 3:05 PM

How to Ensure Data Privacy and Protection Through Ecosystem Integration

Journal Author Blog Posts - February 25, 2019 23:32:00
Body:

My recent ISACA Journal article, “Data Privacy, Data Protection and the Importance of Integration for GDPR Compliance,” describes how the movement and processing of personal data, along with the procedures around those workflows, are central to General Data Protection Regulation (GDPR) compliance. Here are actionable steps enterprises can take to implement a modern integration strategy that ensures both data protection and data privacy.

Ensure Data Protection
The keys to ensuring enterprise data protection through a combination of tools and policy include:

  • PGP encryption—Apply Pretty Good Privacy (PGP) encryption standards for data in motion and data at rest, and control the keys.
  • Secure protocols—Leverage built-in secure communication protocols like Secure File Transfer Protocol (SFTP) and Applicability Statement 2 (AS2) rather than standard email- or File Transfer Protocol- (FTP-) based workflows, and use digital certificates and keys rather than usernames and passwords to authenticate.
  • A backup strategy—Quickly accessing disaster recovery (DR) data is imperative to keep operations running—and compliant. But it is equally important to ensure data in the DR environment is protected in the same way as production.
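
To make the first two items concrete, here is a minimal sketch that assumes the python-gnupg and paramiko libraries, a recipient public key already imported into the keyring, and placeholder file, host and account names: it PGP-encrypts a file and then transfers it over SFTP using key-based authentication instead of a username and password.

```python
import gnupg
import paramiko

# 1. PGP-encrypt the payload before it leaves the trusted environment
gpg = gnupg.GPG(gnupghome="/home/integration/.gnupg")  # keyring location (placeholder)
with open("payroll.csv", "rb") as payload:
    result = gpg.encrypt_file(
        payload,
        recipients=["partner-key-id"],   # recipient public key (assumed already imported)
        output="payroll.csv.pgp",
        always_trust=False,
    )
if not result.ok:
    raise RuntimeError(f"PGP encryption failed: {result.status}")

# 2. Transfer over SFTP, authenticating with a private key instead of a password
key = paramiko.RSAKey.from_private_key_file("/home/integration/.ssh/id_rsa")
transport = paramiko.Transport(("sftp.partner.example.com", 22))  # placeholder host
transport.connect(username="integration", pkey=key)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    sftp.put("payroll.csv.pgp", "/inbound/payroll.csv.pgp")
finally:
    sftp.close()
    transport.close()
```

Keeping credentials out of the workflow entirely, whether through SSH keys as shown here or digital certificates with AS2, is the point of the second item above.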

Ensure Data Privacy
Data privacy has more to do with how the information is governed and used, and it is ensured through enhanced:

  • Data minimization—Only collect and keep data that you need. It may seem obvious, but many challenges can be avoided if you do not collect the data in the first place or delete it as soon as it is no longer relevant.
  • Governance—End-to-end integration enables a full view of the entire life cycle of your data. Leverage dashboards to see every touch along the data journey, and safeguard against unauthorized access.
  • Control—How do you know who can access data and for how long? Do they really need access? Similar to data minimization, reducing the number of people who have access will simplify control.
  • Education—Regardless of the technology in place, you are ultimately at the mercy of your people. Make sure you educate them on what is expected and what their responsibilities are.
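
As a rough sketch of the data minimization and control points above, the snippet below flags records held longer than the retention period defined for their purpose; the record fields, purposes and retention limits are invented for illustration, not a prescribed schema.

```python
from datetime import datetime, timedelta, timezone

# Illustrative records of personal data held, with purpose and collection date
records = [
    {"subject_id": "u100", "purpose": "billing", "collected": "2018-01-10"},
    {"subject_id": "u101", "purpose": "marketing", "collected": "2016-05-03"},
]

# Hypothetical retention limits per purpose
RETENTION = {"billing": timedelta(days=365 * 7), "marketing": timedelta(days=365 * 2)}

def review(records, retention, now=None):
    """Flag records held past their purpose's retention limit (data minimization)."""
    now = now or datetime.now(timezone.utc)
    flagged = []
    for rec in records:
        collected = datetime.fromisoformat(rec["collected"]).replace(tzinfo=timezone.utc)
        if now - collected > retention[rec["purpose"]]:
            flagged.append(rec["subject_id"])
    return flagged

print("Candidates for deletion:", review(records, RETENTION))
```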

Conclusion

While you could lock a few developers in a room and build out solutions that enable all these things, it will be more cumbersome and more expensive in the long run. What happens, for instance, when the next GDPR gets passed and your solution does not quite comply? You end up modifying your existing solution or rebuilding it altogether.

I recommend a single-platform ecosystem integration solution with built-in security, governance and control mechanisms to manage your data workflows.

Read Dave Brunswick’s recent Journal article:
“Data Privacy, Data Protection and the Importance of Integration for GDPR Compliance,” ISACA Journal, volume 1, 2019.

Category: Privacy Published: 2/25/2019 2:46 PM BlogAuthor: Dave Brunswick

Is Your GRC Program Ready to Thrive in the Digital Economy?

ISACA Now Blog - February 22, 2019 04:16:16
Body:

Digital technologies have profoundly changed our lives, blurring the lines between the digital and physical worlds. From humble beginnings, the constellation of tools and technologies that empowers organizations has grown ever smarter. While digitalization makes businesses intelligent and offers immense value, it also opens up a diverse range of risks. Organizations often face challenges in effectively sensing and managing digital risks and in demonstrating reasonable compliance.

The impediments inhibiting effective GRC often get reflected as operational shortcomings, such as inadequate visibility into crown-jewel assets, a siloed view of risks, risk and compliance reports not catering to the right audience, redundant approaches restraining correlation and compounding exposure of risks, poor user experience and overwhelmingly complex GRC automation. With digital transformation going mainstream, organizations that fail to keep pace with relevant GRC strategies are likely to put themselves at a competitive disadvantage.

The following list summarizes the common misconceptions about the role of GRC in the digital ecosystem:

1) Traditional risk and compliance management practices organize operations into chunks of disconnected units, often seen as disparate departments merely administering their own chores to satisfy compliance requirements, with no homogeneity among risk frameworks, risk-scoring techniques and terminologies. This leads to misconceptions and cognitive disparities around GRC. The silo model also results in wasted resources and inefficiencies due to isolated approaches. Organizations should focus on bolstering the effectiveness of GRC by breaking down silos and setting common or comparable frameworks and definitions.

2) With digitalization, businesses end up processing heaps of data of all forms, ranging from users' searches, clicks, website visits and likes to daily habits, online purchases and much more, to achieve their competitive edge. With data being the juice of digitalization, this also puts the organization in the path of malicious attacks and information theft. Given the fast pace of digital business and the burgeoning data underpinning the processes, GRC cannot work as a separate competence outside the digital processes – instead, GRC should be integrated into the design of digital transformation.

3) Digitalization is making inroads with novel delivery methods, and the supply chain is too big to ignore. The burgeoning growth of third-party relationships demands credible and timely insights into the risk and compliance posture underpinning supply chain entities. Remember, your organization is only as strong as its chain of suppliers, and any weak link in the chain is an opportunity for perpetrators to intrude. GRC cannot make the cut with a checklist focus.

4) GRC should communicate in the language of its audiences to demonstrate its value. How many times have we seen a risk assessment conducted at a purely theoretical level, highlighting issues management is already aware of; a frontline employee questioning how a requirement in the controls framework applies to their area of support; or a board losing attention during technically overloaded risk presentations? It all comes down to a simple yet deceptively complex expectation: communication. GRC should tailor its language to its audiences to improve the user experience and to demonstrate value to the business.

5) As speed and agility are the key drivers of success in the digitalization journey, administering GRC in spreadsheets and shared drives delivers clearly diminishing value for organizations. At the same time, automation is not the ultimate fix; the use of siloed technologies without sufficient collaboration is far more damaging than manual paperwork. Remember, the goal of GRC solutions is to deliver business value by providing accurate, credible and timely intelligence on risk and compliance, rather than getting tangled in solution warfare.

Digitalization is spreading its tentacles across organizations. Though organizations are challenged to find new avenues for bulletproofing GRC, successful risk practitioners are staying ahead of the game by focusing on business value creation.

Editor’s note: Sathiyamurthy will provide more insights on this topic in his “Bulletproof your Governance, Risk and Compliance program - GRC by Design” session at ISACA’s 2019 North America CACS conference, to take place 13-15 May in Anaheim, California, USA.

Author’s note: Sudhakar Sathiyamurthy, CISA, CRISC, CGEIT, CIPP, ITIL (Expert) is an experienced executive and director with Grant Thornton’s Risk Advisory Services, with a broad range of international experience in building and transforming outcome-driven risk advisory services and solutions. His experience has been shaped by helping clients model and implement strategies to achieve a risk-intelligent posture. Sathiyamurthy has led various large-scale programs helping clients stand up and scale risk capabilities. He has led and contributed to various publications, authored editorials for leading journals and frequently speaks at international forums. He can be contacted at sudsathiyam@gmail.com.

Category: Risk Management Published: 2/22/2019 3:08 PM
カテゴリー: ISACA

Certifications and the Paycheck: Trends and Truth

ISACA Now Blog - 2019年02月21日 04:52:28
Body:

New highly validated data from 3,305 employers reveals that the average cash market value for hundreds of tech certifications is at its lowest point in four years. Meanwhile, pay premiums for non-certified skills in the same period have gained 6 percent in value on average. What gives?

There’s always been a tug of war within employers about hiring tech people with skill certifications versus those who have learned by experience on the job. Eventually the question of comparable pay arises, shining a light on whether certification is a valid factor when measuring a worker’s value or potential on the job. And if it isn’t, then how should employers be assessing skills competence?

Pay disparities between certified and non-certified tech skills
While the performance of ISACA certifications in the compensation landscape has been mixed, as a group they are earning the equivalent of 12.4 percent of base salary in average pay premium, with CGEIT and CRISC earning the most. Compare this to the 7.5 percent average premium across all 466 certs reported in Foote Partners’ quarterly updated IT Skills and Certifications Pay Index™ (ITSCPI).

The fact is, employers have been perfectly willing to throw cash at both certified and non-certified skills for many years, typically in the form of premiums above and beyond base salary. Foote Partners has been capturing and reporting these cash premiums since 2000. Until 2007, certified skills were earning more on average than non-certified skills, but beginning in mid-2007 this trend reversed. Since then, the gap in pay premiums between the two has widened, with the 551 non-certified skills tracked now earning, on average, the equivalent of nearly 2 percent of base salary more than the more than 466 tech certifications tracked at more than 3,300 employers.

Certifications had a long run of consistently losing overall value from late 2006 to 2012. These were dark years marked by charges of fraud in the certifications-testing business and a prevailing opinion among many that certifications were simply too easy to attain, in particular those being offered by vendors to support their product lines. Technology vendors and vendor-independent certifying organizations fought back by adding prerequisites to sit for exams, real-time labs and peer-review panels.

It seemed to work, as certification pay began to rise, although not nearly to the level of non-certified skills premiums, often for the same technologies. More and more management, process, and methodology skills and certifications gained popularity in the growth years, at both intermediate and advanced skill levels, and pay continued to rise for both segments until about two years ago.

Average pay premiums for tech certifications recorded in the ITSCPI have most recently decreased in the last quarter of 2018, down 1.8 percent overall. They lost 2.4 percent of their value in calendar year 2018 and nearly 3 percent over the last two years. In the last three months of 2018, 57 certifications recorded cash pay premium losses against 17 gaining value.

Meanwhile, cash pay premiums for non-certified skills increased 0.6 percent overall in October-November-December, with 87 of these skills recording pay premium gains and 72 losing market value. Pay gains for non-certified skills have been consistently higher in most quarters in each of the past three years.

Declines in certification market value can be misleading
Certification values decline in the marketplace for a number of obvious and not-so-obvious reasons.

Pay premiums diminish as certifications expire or are retired, or as technology evolves and they are replaced with more appropriate certifications. Since certifications have traditionally been attached to infrastructure tech (networking, systems, security), they have natural market pay volatility: nearly 17 percent per calendar quarter over the past four years. Volatility here measures the percentage of tracked certifications whose value changes every three months – at that rate, roughly 80 of the 466 certifications tracked would shift in value in a given quarter.

There also are certifications for architecture and for processes such as project management, frameworks and methodologies; as a group, they earn the highest average premiums among all categories reported, but they have their non-certified skill counterparts. This subjects them to pay erosion as employers feel more confident measuring talent in these disciplines based on work experience, especially at intermediate and advanced certification levels.

Non-certified skills can be found in far greater numbers than certifications in other segments such as programming and applications development, web, and database. Again, employers devise their own ways to judge proficiency in these areas – for example, coding tests, evaluation of past work experience, references, and trial-to-hire employment. Employers also drive down certification values by building robust internal training and development programs, in effect devising their own certification programs.

There also remains a lingering bias that passing a proctored exam does not necessarily confer expertise in a subject on the test-taker, especially when the passing mark requires getting only 70 percent of answers correct. Adding a laboratory requirement only works if the lab is a sufficient test of a candidate’s capabilities in the real world.

But, in a counterintuitive twist, cash market value can be a victim of a certification’s success. As interest in a certification escalates and more people attain the certification, the gap between supply and demand for the certification narrows, driving down its price as the laws of scarcity would dictate. This has been documented in the case of hundreds of certifications over the almost two decades of Foote Partners tracking and reporting cash pay premiums. The media rarely recognizes this contradiction in its reporting.

Perhaps the most common reason for certification values falling is a fundamental weakness that persists in the certification industry: a vast number of popular tech skills simply do not have a certification associated with them. In these cases, no vendor owns the underlying technology, and therefore no vendor has product sales and upgrade investments to protect with certification training.

And what about so-called soft skills? Employers often are just as willing to recognize their value with pay premiums both inside and outside of salary, especially if they are combined with hard tech skills and industry, domain, or customer knowledge and experience.

Author’s note: For more information on certification market value in 2018, view this news release summarizing our latest IT Skills Demand and Pay Trends Report.

Category: Certification Published: 2/21/2019 3:05 PM
カテゴリー: ISACA

Data Analytics in Internal Audit: State of the Data, 2019

ISACA Now Blog - 2019年02月20日 07:21:46
Body:

Back in 2008, I placed a talented senior IT auditor who was one of the first I had seen with excellent data analytics skills, an ACL certification, and a vision for how to apply data analytics to a broader suite of audits. Our Fortune 500 client seemed very keen to capitalize on his skills. However, in the end, our client couldn’t clearly articulate a vision for Audit’s use of data analytics. The senior IT auditor moved to another company where Internal Audit had fully embraced analytics. Today, he is a senior IT audit & data analytics manager for a Fortune 100 company.

I’ve seen this story unfold more than once over the years. The takeaway: even with great people evangelizing the power that data analytics can bring, data analytics has taken a long time to take root within Internal Audit. With the seeds long planted, the garden is finally burgeoning forth.

Now is the time to embrace data analytics, people! But this isn’t the data analytics of even five years ago. The real power play today is coming from Python, R, and SQL. These are the tools IT audit professionals need to embrace and learn to use. Alteryx is on the horizon but doesn’t seem to have made big inroads into Internal Audit yet.

In the course of doing the research for this piece, I spoke with more than a dozen Internal Audit data analytics leaders and senior practitioners, at companies ranging from a major airline to a behemoth in the search engine and innovation space, to get their views on where things stand now, and what proactive IT auditors can do to hang ten on the data wave.

The current Internal Audit data analytics landscape
You might be wondering how embedded data analytics are at this point. Good question. My research shows that, in terms of the percentage of audits that use data analytics, the range goes from 25-30 percent to 50 percent, with at least two very large companies looking to be at 100 percent by this year.

How to get into the game
Build skills. How do you do that? The consensus among the leaders I spoke with is that online courses are a fantastic way to start. Check out Coursera, Udacity and Datacamp. Take this first step and dip your toes into the water with their free course offerings on Python, SQL and more. Boot camps are another way to go if you have the time. Once you have some skills, take on a project at work – a small one that you can drive to an early win. Do you need some sort of analytics certification? The answer across the board was no. What you need is curiosity, fearlessness, some skills and a growing portfolio of projects to build a solid use case.
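
To make that first small project concrete, here is a minimal sketch in Python with pandas of the kind of test an IT auditor might start with – flagging potential duplicate vendor payments. The file name and column names are illustrative assumptions, not taken from any particular audit program.

# A minimal starter analytics test: flag potential duplicate vendor payments.
# Assumes a hypothetical extract named payments.csv with columns
# vendor_id, invoice_number, amount, payment_date (all illustrative).
import pandas as pd

def flag_duplicate_payments(path: str) -> pd.DataFrame:
    payments = pd.read_csv(path, parse_dates=["payment_date"])

    # Potential duplicates: same vendor and amount appearing more than once.
    keys = ["vendor_id", "amount"]
    dupes = payments[payments.duplicated(subset=keys, keep=False)].copy()

    # Keep only clusters spanning more than one invoice number,
    # sorted so reviewers can examine each cluster together.
    multi_invoice = dupes.groupby(keys)["invoice_number"].transform("nunique") > 1
    return dupes[multi_invoice].sort_values(keys + ["payment_date"])

if __name__ == "__main__":
    findings = flag_duplicate_payments("payments.csv")
    print(f"{len(findings)} payments flagged for follow-up")
    print(findings.head(20).to_string(index=False))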

Other tools that are ancillary but useful to start getting your arms around: Power BI, Tableau, QlikView and Spotfire. What’s in the works: Robotic Process Automation, AI, Neural Nets.

Now’s the time to start reading up. Who makes the best data analysts for Internal Audit? IT audit professionals. Why? Because being an adept and successful IT auditor requires the ability to translate complex technical topics for non-techies. Business acumen and business process knowledge come with the territory, as do customer-facing skills. What problem are you trying to solve? What data do you think will help you find an answer? IT auditors know the audit process and the evidentiary requirements for solid audit findings and recommendations. They also know how to write for a variety of audiences, which every expert I spoke with identified as a critical skill.

Sure, data scientists know the tools inside and out, but they don’t have these other pieces, many of which are part of the intangible art of auditing.

Editor’s note: For more resources on what’s next in audit, visit ISACA’s future of IT audit page.

Category: Audit-Assurance Published: 2/20/2019 3:10 PM
カテゴリー: ISACA

Moving Beyond Stubborn Reluctance to Comply with GDPR

ISACA Now Blog - 2019年02月19日 04:56:00
Body:

Last May marked the beginning of the application of the General Data Protection Regulation (GDPR), which harmonized and unified the rules governing privacy in the European Union. Leading up to and following the adoption of the regulation, data protection has been a focus of attention all around the world. Governments introduced new legislation, while supervisory authorities, civil society, data controllers and processors publicly discussed the rules, obligations and institutions set out in the GDPR, and campaigns have been launched to raise privacy awareness among data subjects and the public.

Despite all this, six months after the compliance deadline, only 30 percent of companies located in the EU could be considered GDPR compliant, a recent study showed.

Perhaps we should not be entirely surprised by that underwhelming statistic. GDPR compliance can be time-consuming and resource-intensive. It necessitates a strategic approach and a permanent focus on all activities related to data processing. Unfortunately, these demands can foster certain hazardous attitudes among controllers and processors. Many of these actors are aware of the new rules introduced by the GDPR, yet they choose to ignore the relevant obligations, hoping to avoid inspections and further consequences. Others are reluctant to comply with the regulation and may treat other responsibilities as trumping privacy – for instance, giving economic benefit more weight than the protection of personal data. Finally, it is a common misconception that if a controller publishes its privacy notice or policy, its activities are automatically in line with all obligations deriving from the GDPR.

Nonetheless, data subjects are becoming more conscious of their privacy and demand effective control over their personal data. Besides this heightened interest in the activities of controllers and processors, the monitoring and enforcement mechanisms set out in the GDPR are now operated by supervisory authorities across the European Union. Fines have been issued for non-compliance and, as a further consequence, the publicity given to unlawful conduct further damages the reputation of controllers. Thus, the previously mentioned attitudes are harmful to the rights and freedoms of individuals; they violate provisions protecting the privacy of data subjects, and they may also lead to a significant loss of income for controllers and processors.

Instead of demonstrating the previously mentioned attitudes, controllers and processors should realize that certain easy steps can promote GDPR readiness. First, they need to be self-aware concerning activities connected to the use of personal data. An updated record of processing activities and the designation of a data protection officer may be of great help in this respect. Application of data governance tools can also assist in setting the relevant internal policies. Furthermore, it is necessary to document every aspect of these activities, thus demonstrating compliance with the principle of accountability. Finally, controllers and processors should make their operations transparent to supervisory authorities and data subjects as well as to the general public, via data protection notices and other methods of providing information.

These are certainly not exhaustive conditions for GDPR compliance, but they help controllers and processors achieve it, and they constitute valuable proof that an organization is willing to abide by the rules of the regulation and respect the privacy of data subjects.

Category: Privacy Published: 2/19/2019 3:22 PM
カテゴリー: ISACA

Getting Your GDPR Compliance Program Into Gear With Proper Record Keeping

Journal Author Blog Posts - 2019年02月19日 01:32:35
Body:

Compliance procedures are notoriously demanding, and European Union General Data Protection Regulation (GDPR) compliance programs are no different. My recent Journal article underlined some of the challenges that may be experienced by organizations as they try to meet GDPR requirements and introduced a series of steps that organizations can take to help them in their GDPR compliance journey.

Arguably, one of the integral first steps is developing and maintaining a record of processing activities undertaken by the organization.1 This will help in understanding:

  • The categories of personal data being processed
  • The categories of processing being undertaken
  • The external and internal flows of personal data throughout the organizational ecosystem
  • The key accountabilities associated with processing
  • The risk associated with processing

Article 30 of the GDPR requires that an organization, whether a controller or a processor, maintain a proper record of processing activities under its responsibility. Therefore, your organization must maintain a record if it carries out certain operations or sets of operations on the personal data under its responsibility. This may include collecting customer demographic details, recording employees’ personal information or other types of operations.

Your record must be in writing, including electronic form, and made available to the supervisory authority on request.

For these reasons, your record should be current, complete and accurate at all times and in a form suitable for scrutiny.

The obligation to maintain a record may not apply if your organization employs fewer than 250 persons. However, this exception does not apply if your processing activity:

  • Is likely to result in a risk to the rights and freedoms of data subjects. Therefore, if your customer records are stored with a cloud services provider, for example, this form of processing may be viewed as a risk.
  • Is not occasional. This means that if your processing is anything other than random or rare, a record of processing activities is required.
  • Includes special categories of data, such as personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or other special categories referred to in Article 9(1).
  • Includes personal data relating to criminal convictions and offenses as referred to in Article 10.

Based on the foregoing, it may be best that your organization err on the side of caution and maintain a record of its processing activities.

The obligations in relation to the details of the record differ between a controller and processor. Common among them, however, are the requirements to maintain:

  • A description of the technical and organizational security measures undertaken to minimize the risk to processing
  • Details relating to transfers of personal data to a third country or an international organization
  • The name and contact details of your organization, or its representative, and your data protection officer

Knowing the extent of your processing activities will greatly assist your organization as it moves forward in its GDPR compliance program.
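
As an illustration only, the sketch below shows one simple way such a register could be kept in a structured, exportable form. The field names are drawn loosely from the Article 30 items discussed above, and the example entry is hypothetical – this is not an official GDPR template.

# A minimal sketch of a record of processing activities register,
# loosely modeled on the Article 30 items discussed above.
# Field names and the example entry are illustrative assumptions.
from dataclasses import dataclass, asdict
from typing import List
import csv

@dataclass
class ProcessingActivity:
    activity: str                      # e.g., "Customer onboarding"
    purpose: str
    data_subject_categories: List[str]
    personal_data_categories: List[str]
    recipients: List[str]
    third_country_transfers: str       # safeguards in place, or "None"
    retention_period: str
    security_measures: str
    controller_contact: str
    dpo_contact: str

def export_register(entries: List[ProcessingActivity], path: str) -> None:
    """Write the register in writing (electronic form), suitable for scrutiny on request."""
    rows = [asdict(e) for e in entries]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    register = [
        ProcessingActivity(
            activity="Customer onboarding",
            purpose="Account creation and identity verification",
            data_subject_categories=["customers"],
            personal_data_categories=["name", "email", "date of birth"],
            recipients=["cloud hosting provider"],
            third_country_transfers="None",
            retention_period="7 years after account closure",
            security_measures="Encryption at rest, role-based access",
            controller_contact="privacy@example.com",
            dpo_contact="dpo@example.com",
        )
    ]
    export_register(register, "processing_register.csv")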

Read Corlane Barclay’s recent Journal article:
“The Road to GDPR Compliance: Overcoming the Compliance Hurdles,” ISACA Journal, volume 1, 2019.

1 This article assumes that an organization falls within the scope of GDPR. For further discussion on the scope of GDPR, please see my Journal article.

Category: Government-Regulatory Published: 2/18/2019 3:08 PM BlogAuthor: Corlane Barclay, Ph.D., PMP PostMonth: 2 PostYear: 2019
カテゴリー: ISACA

More on Password Dictionaries

Journal Author Blog Posts - 2019年02月15日 03:48:24
Body:

As a follow-up to our recent ISACA Journal article, “NIST’s New Password Rule Book: Updated Guidelines Offer Benefits and Risk,” we wanted to provide some additional thoughts on the password dictionary concepts. As our article suggests, organizations should place appropriate controls around the establishment and maintenance of the password dictionary. Under the passphrase approach advocated by the latest US National Institute of Standards and Technology (NIST) guidelines, the dictionary becomes the primary tool for enforcing complexity and uniqueness in user authentication credentials. As such, it is integral to ensuring secure access to IT resources.

With respect to initially establishing the password dictionary, it can be difficult to build a comprehensive and highly secure dictionary from scratch. Enterprises should remember:

  • Open-source lists of bad and commonly used passwords are publicly available and may provide a sound starting place. Commercial services have spent considerable time and resources researching and compiling password dictionaries and may be worth the investment.
  • Implementing a standard dictionary alone is not really enough. It would not include prohibitions specific to the organization and its context. Involve organization leaders and/or interested users in contributing names and terms associated with the organization, its brand image, close affiliations, products, lines of business and people. Be sure to block known (or suspected) compromised credentials, and consider using the dictionary to also block the use of employee-specific information (such as names and usernames).

It is important to note that a password dictionary should not be considered a “one-shot and done” task. Organizations and the environment they operate in are dynamic, and the password dictionary will become obsolete over time. Organizations should consider the following:

  • Regularly refresh the standard dictionary as lists of bad and commonly used passwords evolve. Customized dictionaries of prohibited words and phrases need to be reevaluated, augmented, and updated periodically.
  • If a breach occurs (or is suspected), the password dictionary should be quickly updated to prevent the potential use of compromised phrases.

Maintenance of the dictionary should become a routine and continuous process for the organization. Establish an appropriate owner of the dictionary maintenance process (for example, a leader in the IT security or compliance functions), and put controls in place to ensure periodic and ad-hoc maintenance of the dictionary. In highly sensitive applications, consider a periodic independent audit of the dictionary and its use. The organization needs assurance that the effectiveness and robustness of the dictionary does not erode over time.
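
As a rough sketch of how the dictionary might be applied at the point of credential creation, the screening logic can be a simple normalized lookup against the combined list, as shown below in Python. The file name, minimum length and organization-specific terms are assumptions for illustration, not values prescribed by the NIST guidelines.

# A rough sketch of passphrase screening against a password dictionary.
# The file name, minimum length and organization-specific terms below are
# illustrative assumptions, not values prescribed by the NIST guidelines.
from pathlib import Path

MIN_LENGTH = 12  # assumed policy value, not a NIST-mandated number

# Organization-specific terms (brand names, products, affiliations) that
# supplement a standard open-source or commercial dictionary.
ORG_TERMS = {"examplecorp", "examplebrand", "hqcampus"}

def load_dictionary(path: str) -> set:
    """Load a newline-delimited list of prohibited passwords and phrases."""
    return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}

def is_acceptable(candidate: str, username: str, dictionary: set) -> bool:
    normalized = candidate.lower()
    if len(candidate) < MIN_LENGTH:
        return False
    if normalized in dictionary:                       # known bad or compromised phrase
        return False
    if username.lower() in normalized:                 # employee-specific information
        return False
    if any(term in normalized for term in ORG_TERMS):  # organization-specific blocklist
        return False
    return True

if __name__ == "__main__":
    # Assumes a hypothetical local file of prohibited phrases.
    prohibited = load_dictionary("password_dictionary.txt") | ORG_TERMS
    print(is_acceptable("correct horse battery staple", "jdoe", prohibited))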

Read Bachman Fulmer, Melissa Walters and Bill Arnold’s recent Journal article:
“NIST’s New Password Rule Book: Updated Guidelines Offer Benefits and Risk,” ISACA Journal, volume 1, 2019.

Category: Security Published: 2/14/2019 3:02 PM BlogAuthor: Bachman Fulmer, Ph.D., CISA, Melissa Walters, Ph.D., and Bill Arnold, CISSP PostMonth: 2 PostYear: 2019
カテゴリー: ISACA

Women in Cybersecurity Often Worth More Than They Realize

ISACA Now Blog - 2019年02月15日 03:46:20
Body:

Before beginning my career in cybersecurity recruitment, I worked in the female-dominant industry of travel public relations. I was largely oblivious to the challenges of being a woman in the workplace because I was surrounded by other strong businesswomen on a day-to-day basis. As a result, it came as quite a shock when I entered the male-centric world of cybersecurity. I was surprised by just how little women trusted themselves when it came to applying for high-level managerial positions, and by how few women there were in this space.

It’s become a running theme at the cyber networking events I attend that for every 20 men I see, there will be one woman. While many of the clients I work with emphasize that they need more women in their workplace, they tend to see it only from a gender diversity “tick a box” standpoint and are often frustrated or confused as to why they need more women on their team.

In Forbes’ article on the shortage of women in cybersecurity, Priscilla Moriuchi, Director of Strategic Threat Development at Recorded Future, said, “We need people with disparate backgrounds because the people we are pursuing, (threat actors, hackers, 'bad guys') also have a wide variety of backgrounds and experiences. The wider variety of people and experience we have defending our networks, the better our chances of success.”

While I believe this is true, I’d also like more attention paid to the value being added by women in the security space. Some of the work done by women in cyber recently was driven into the spotlight by Forbes’ US list of the 50 Top Women in Technology.

This list includes Manal Al-Sharif, who resides in Sydney, and is well-known for being the first Saudi woman to specialize in information security. She is also the Founder of Women 2 Hack Academy, Australia's first social enterprise dedicated to discovering women leaders and nurturing them to pursue a career in cybersecurity. She’s breaking down barriers and really proving what females are capable of achieving in this space.

New research from Cybersecurity Ventures found that women now represent 20 percent of the global cybersecurity workforce. While this is up from 11 percent in 2013, there’s still so much work that needs to be done.

Often, when I meet with female candidates, they’re completely unaware of their value in the market and just how much their skillset is worth. They undersell themselves both in terms of seniority and salary. There need to be better recruitment strategies around attracting and influencing more women to get into, or progress within, cybersecurity.

It’s encouraging that we’re seeing more and more female cybersecurity graduates coming through now – the level of job applications I’m receiving from junior-level women is proof of this. That said, it’s imperative that we continue to help young women see cybersecurity as a progressive and attractive career path, and also help women at mid-senior levels recognize their own worth.

In order to do this, recruitment and HR professionals need to be consultative with female candidates about the value of their skillset in the current market. If we do not give them this kind of education, along with the confidence to ask for more, their abilities – which are like gold dust in today’s market – will be taken advantage of by employers who will often try to get them at the cheapest price possible.

Similarly, when I meet with exceptional female candidates and they say they lack the skills or experience to apply for a more senior role or managerial position, I do everything in my power to provide them with the confidence to go for it. In a male-centric industry, it can be intimidating to imagine managing peers, which will almost always include managing men.

This hesitation is understandable, as there will, unfortunately, always be those who have a problem with female authority. Even as a young female recruiter, I come across clients in cybersecurity who are initially hesitant to work with me, and some even make it obvious they’d rather work with my male counterparts instead. That is, until I deliver a good service and prove their initial judgments false. If this is the kind of predisposition that even recruiters have to deal with in the security space, then I can understand why the female candidates I work with are hesitant to apply for those senior positions. It’s imperative that we challenge the status quo and encourage girl power in this thriving industry.

Editor’s note: For more resources on this topic, visit ISACA’s SheLeadsTech website.

Category: Security Published: 2/15/2019 3:31 PM
カテゴリー: ISACA

North America CACS Keynoter Guy Kawasaki Sizes Up Innovation, Entrepreneurship

ISACA Now Blog - 2019年02月14日 04:30:49
Body:

Editor’s note: Guy Kawasaki, a Silicon Valley-based author, speaker, entrepreneur and evangelist, will be the opening keynoter at ISACA’s 2019 North America CACS conference, to take place 13-15 May in Anaheim, California, USA. Kawasaki recently visited with ISACA Now to discuss some of the themes he will explore at North America CACS, including innovation and entrepreneurship. The following is an edited transcript. For more of Kawasaki’s insights, listen to his recent interview on the ISACA Podcast.

ISACA Now: At North America CACS, ISACA will be celebrating its 50th anniversary. Of all the technology-driven changes during that time, which do you consider to be the one that will have the longest-lasting impact?
It’s an all-encompassing change, but the internet by far will have the longest-lasting impact. And not all of the impact may be positive.

ISACA Now: How can practitioners be evangelists in their own right in support of innovation at their enterprises?
It would take a book to answer this question (which I have written). But the gist is that your innovation has to be “good news” that improves the life of your customers. It’s easy to evangelize good stuff. It’s hard to evangelize crap. Then you have to believe it’s good news and develop the skill set to do great demos to show, as opposed to tell, people why it’s good news.

ISACA Now: Which aspects of successful entrepreneurship tend to be most misunderstood?
That it’s fast, fun and easy. Entrepreneurship is slow, painful, and hard – and that’s if you succeed. Also, people vastly underestimate the role of luck and overestimate the impact of their skills.

ISACA Now: Your career has included time with Goliaths such as Apple and Google – what lessons from enterprises of that size are most applicable to smaller and medium-size businesses?
The lessons that Goliaths can teach are:

  1. Anything is possible. Two guys/gals in a garage can create the next big thing.
  2. Engineering counts more than marketing. We’re not talking about selling sugared water here.
  3. Trees don’t grow to the sky. Every company hits a wall or two. What matters is what you do after you hit the wall.

ISACA Now: What should organizations focus on to make sure they are bringing in leaders with the right skills for today’s fast-evolving business landscape?
Organizations should add a third parameter to what makes a good candidate. The first two are always work experience and educational background. I would make the case that a love of what the company does is just as important. Honestly, I’d rather have a candidate with imperfect experience and background who loves the product than a perfect candidate who “doesn't get it.”

Category: ISACA Published: 2/14/2019 3:01 PM
カテゴリー: ISACA
