Happy ISACA Volunteer Appreciation Week! While my colleagues and I agree that we should celebrate our volunteer partners at the chapter and international levels every day, we are thrilled to participate in a week of highlighting some of the ways volunteer support is essential. After all, ISACA exists to support our members in the IT audit, risk, governance, assurance and security industries, and our local and international volunteers are the ones fulfilling our purpose and promise, and exemplifying our values.
Where would we be without the passionate, dedicated and innovative experts advancing ISACA’s great work? For one, we would miss out on the camaraderie in networking and bonding over accomplishing sometimes-challenging objectives to advance our work. We love working with people like Jack Freund, CRISC Certification Working Group member and 2018 ISACA John W. Lainhart IV Common Body of Knowledge Award recipient, who is a huge proponent of giving back. “You should volunteer and get involved with ISACA because it is important, hard work,” he said. “It’s work that will put you in touch with the best in your industry at the local and international level, and working with the best makes you better as well.”
What makes our volunteers the best? It’s their interest and expertise that makes it possible to accomplish impactful initiatives. Check out these highlights from the first quarter of 2018.
Just think of what we can accomplish for the rest of the year! Why should you join the more than 4,200 people spending their valuable free time giving back each year? Not only are they meeting new people, expanding their professional network, gaining new experiences to advance in their careers, ensuring the security of the future of their profession, and earning CPE hours, but they’re also gaining personal satisfaction by mentoring, teaching, learning leadership skills and much more.
As ISACA Belgium Chapter President and past ISACA board director Marc Vael says, “In return for the time you invest as a volunteer, you meet so many people from different backgrounds, with different experiences and knowledge in an international context. Basically, you get so much more back for the rest of your career. And that is priceless.”
Our volunteers are priceless, and there is no doubt that every day should be ISACA Volunteer Appreciation Day! Without you, our organization would not have existed for nearly 50 years, much less be looking to grow in the next 50. You are the reason ISACA exists and continues to provide valuable resources to our global professional community. Thank you!
Editor’s note: Learn more about volunteering and apply for an open opportunity at www.isaca.org/volunteer.
Learn how to recognize outstanding international and chapter volunteer service with an ISACA Award at www.isaca.org/awards.

Category: ISACA Published: 4/18/2018 3:03 PM
Cyberinsurance and data privacy will garner more focus for the remainder of 2018 and beyond. The impending “Equifax effect,” which most of us anticipated, was put forth in late February 2018 by the US Securities and Exchange Commission (SEC) in the form of guidance that states that public companies should inform investors about cybersecurity risk even if they have never succumbed to a cyberattack. The guidance also emphasizes that companies should publicly disclose breaches in a timely manner.
This development perfectly aligns with the (cyber)consumers, providers and regulators (CPR) cycle (see figure 1) I propose in my recent Journal article, which necessitates participation from 3 key players—cyberinsurance providers, consumers and regulators. This collaborative effort not only improves how cybersecurity risk is addressed and estimated from an insurance coverage perspective but also minimizes cataclysmic breaches. Providers need to be able to identify the right amount of cyberrisk that they are willing to undertake to provide ideal pricing for the coverage. This, in turn, depends on the consumers themselves knowing quantitatively how much risk they own.
Figure 1: CPR Cycle
Today, there are numerous ever-evolving cyberthreats (e.g., zero-day, Internet of Things botnet distributed denial-of-service attacks, ransomware attacks) that result in costs that are not inherently covered by most cyberinsurance policies. Above all this, cyberinsurance has always been an add-on policy to traditional insurance policies. Historically, insurance companies relied on abundant data to make decisions on how much auto or home insurance coverage may be offered to a person or entity. In the cyberworld, the common backlash by both the providers and consumers is that there are not enough data to rely on.
Heat maps continue to be a staple resource for IT risk professionals to estimate risk worldwide. In my experience performing security risk assessments, I was always uneasy leveraging heat maps to estimate risk quantitatively. It turns out there are better, proven statistical and probabilistic methods that can be adopted to quantitatively estimate cyberrisk (in monetary figures rather than in colors of red, yellow or green), especially when there is a dearth of data. An organization’s emphasis should be on addressing the burgundy arrows in the CPR cycle, and my recent Journal article provides an overview of these methods, their potential benefits and references for attaining these goals.
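As a minimal illustration of the general approach (this is not the article’s own model, and the probability and loss figures below are invented for the example), a simple Monte Carlo simulation can turn an estimated breach likelihood and a calibrated loss range into an annualized loss figure in monetary terms:

```python
import math
import random
import statistics

def simulate_annual_loss(prob_breach, loss_low, loss_high, trials=100_000, seed=42):
    """Estimate annualized loss exposure in monetary terms via Monte Carlo.

    prob_breach: estimated probability of at least one breach in a year.
    loss_low, loss_high: a 90% confidence interval for the cost of a single
    breach, modeled with a lognormal distribution (a common choice for loss
    magnitudes, which skew toward rare, very large events).
    """
    # Derive lognormal parameters so ~90% of simulated losses fall in the interval.
    mu = (math.log(loss_low) + math.log(loss_high)) / 2
    sigma = (math.log(loss_high) - math.log(loss_low)) / (2 * 1.645)
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        # In each simulated year, a breach either occurs (with a sampled cost)
        # or does not (zero loss).
        losses.append(rng.lognormvariate(mu, sigma) if rng.random() < prob_breach else 0.0)
    return statistics.mean(losses)

# Example: 10% annual breach probability, single-breach cost interval of $50K-$2M.
annual_loss_exposure = simulate_annual_loss(0.10, 50_000, 2_000_000)
```

A calibrated confidence interval of this kind can be elicited from subject matter experts even when historical breach data is scarce, which is precisely the situation both providers and consumers complain about.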
The purpose of attempting cyberinsurance CPR is to build a continuously maturing ecosystem comprising:
Read Indrajit Atluri’s recent Journal article:
“Why Cyberinsurance Needs Probabilistic and Statistical Cyberrisk Assessments More Than Ever,” ISACA Journal, volume 2, 2018.
If anyone had any doubts, data privacy is still kind of a big deal. Beyond being at the core of regulations ranging from the Health Insurance Portability and Accountability Act of 1996 (HIPAA) in the United States to the global, far-reaching General Data Protection Regulation (GDPR), data privacy has its own annual day of recognition – 28 January. As organizations design operational strategies and tactics around data privacy, opportunities to leverage applications with built-in functionality to safeguard sensitive and confidential data are valued. For those using Microsoft SQL Server 2016, there are a couple of areas where built-in functionality can assist with data privacy initiatives.
Where is the data?
Safeguarding of sensitive or confidential data generally begins with data classification. Once data has been identified and appropriately classified, the next effort is establishing internal controls commensurate with the sensitivity/confidentiality level of the data. Depending on the organization, designing and implementing internal controls may be a bit of a hurdle. In its 2017 State of Cybersecurity Metrics Annual Report, IT consulting firm Thycotic reported that 4 in 5 companies don’t know where their sensitive data is. Understandably, unknown data locations make it difficult to identify safeguards to protect the data. As in prior versions of SQL Server, SQL Server Management Studio (SSMS) in SQL Server 2016 can provide a list of databases. In addition to a variety of other data querying options, Transact-SQL (T-SQL) queries can be used to locate data and related tables.
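As a simple starting point for the T-SQL approach (the column-name patterns below are illustrative assumptions, not a complete classification scheme), the INFORMATION_SCHEMA views can surface candidate locations of sensitive data within a database:

```sql
-- Find columns whose names suggest sensitive data in the current database.
-- Adapt the LIKE patterns to your organization's own classification scheme.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME LIKE '%SSN%'
   OR COLUMN_NAME LIKE '%Birth%'
   OR COLUMN_NAME LIKE '%CreditCard%'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```

Name-based searches are only a first pass; they complement, rather than replace, a formal classification exercise.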
Who has the data?
Having identified where the data resides, entities are faced with ensuring that access to the data is limited to those with the appropriate roles in their organizations. Once those access determinations are made (following the Principle of Least Privilege), organizations can then use Microsoft SQL Server 2016’s Dynamic Data Masking (DDM) feature to support their access strategies. With Dynamic Data Masking, sensitive/confidential data remains unchanged in the database while masked values are returned in query results for designated fields. Organizations can fully or partially mask the sensitive/confidential data depending on how they configure DDM.
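A minimal sketch of configuring DDM, assuming a hypothetical dbo.Customers table with Email and NationalID columns (the table, column and role names are examples, not part of the feature):

```sql
-- Mask the email and national ID columns of a hypothetical Customers table.
-- Unprivileged users see masked results such as 'aXXX@XXXX.com' and
-- 'XXX-XX-1234'; the stored data itself is unchanged.
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
ALTER TABLE dbo.Customers
    ALTER COLUMN NationalID ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)');

-- Control who sees unmasked data at the database level.
GRANT UNMASK TO AuditRole;
```

Because masking is applied at query time, DDM governs what users see rather than how the data is stored, which is why it pairs naturally with a least-privilege access strategy.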
Another option for limiting access to data is to use Always Encrypted. This feature allows encryption of sensitive data (at rest and in transit) within client applications. Since encryption and decryption happen outside of the SQL environment, it facilitates least privilege by limiting data access to those who own the data and need to view it.
As data privacy expectations become more permanent fixtures of entities’ operational landscapes, built-in features such as Dynamic Data Masking will become more commonplace. The newer DDM functionality, coupled with existing functionality through SQL Server Management Studio, can help entities achieve and maintain data privacy goals. Together with best practices in data management, this built-in functionality should provide an easier path to meeting the data privacy expectations of customers and compliance regulations.

Category: Privacy Published: 4/13/2018 3:20 PM
As internal auditors, we’ve seen an uptick in usage of the term “Agile” in reference to how more and more companies are developing software. Agile software development has grown increasingly popular as both software and non-software companies transition from traditional development methodologies, such as the waterfall model, to a value-driven Agile approach. Like any auditable area, this requires internal auditors to understand the key concepts, evaluate the risks and determine how to effectively audit the process based on pre-defined objectives. However, that’s not the purpose of this blog post. What we auditors find even more intriguing is how the values and principles behind Agile software development apply to the field of internal auditing.
The Agile foundation
Agile is an overarching term for various software development methods and tools, such as Scrum and Scaled Agile Framework (SAFe), that share a common value system. Developed in 2001, the Agile Manifesto provides a set of fundamental principles that Agile teams and their leaders embrace to successfully develop software with agility. Companies that have adopted Agile development practices recognize the urgency to adapt quickly to changing technology and deliver enterprise-class software in a short amount of time; otherwise, they run the risk of becoming extinct.
Some of the top benefits of agile development include:
Why apply Agile to internal audit?
At The Mako Group, we have found that applying Agile concepts to the internal audit function is not a new concept, but has never been more crucial than in our current environment. Like the companies we aspire to protect through objective assurance and advice, internal audit must be able to address emerging critical risks and provide relevant insight in a timely fashion. Despite our best intentions, many audit departments still develop a long-term plan that cannot be easily changed and often employ antiquated audit methodologies. If we truly want to add significant organizational value and be a trusted partner with management, internal auditing must evolve, and Agile techniques can help us do that.
Agile internal audit tactics
Just as companies are scaling Agile software development based on the size, capabilities and culture of the organization, the extent of an internal audit function’s agility will vary widely for one group versus another. Nonetheless, we have narrowed our focus to three key areas that every internal audit department should consider when becoming more agile:
Furthermore, internal auditors must not wait until the end of an audit to provide results. Early and frequent communication with stakeholders means that the final report or presentation should simply reflect a visual summary of the insights already discussed. We should not only identify opportunities to enhance an organization’s operations but also continuously improve our own audit processes. A crucial role on an Agile team to help foster an environment of high performance and relentless improvement is the scrum master. Acting as the coach of an internal audit team, a scrum master would ensure that the agreed Agile process is followed and encourage a good relationship among team members as well as with others outside the team.

Category: Audit-Assurance Published: 4/17/2018 8:58 AM
By now, most practitioners have heard (probably from a few different sources) that organizations struggle when it comes to finding, hiring and retaining the right information security and/or cybersecurity professionals. There has been quite a bit written about this trend: the impact that it has on security efforts within the enterprise, advice and guidance about how to staff and manage your security team in light of the talent challenges, strategies for working around it, etc. However, there is another angle that is comparatively less analyzed: the impact of the shortage on existing practitioners, both in the short and long term.
Understanding this is important for practitioners as preparation now translates directly to continued success down the road. In knowing what we do about the workforce dynamics, we can make sure that we’re optimally positioned when the time comes for us to change jobs and continue to be in demand down the line.
Skills gap characteristics
The first thing to note is that the skills gap has characteristics that can be measured. We know that it exists from numerous research reports and surveys, specifically findings citing the lengths of time required to fill open positions, perceived difficulty in finding qualified candidates and challenges in retaining existing staff. ISACA’s 2018 State of Cybersecurity research was no exception in pointing this out. Findings from previous years of ISACA research, as well as studies from other organizations, suggest that these challenges are persistent.
However, the actual areas of need have been comparatively less thoroughly analyzed, including which positions are most problematic to staff and retain, which skills are in more demand, where the most hiring activity occurs, etc. Much like the skills gap itself can be measured, so, too, can these other characteristics. This year, we attempted to gather more information about these secondary characteristics of the skills gap.
What we learned was that individual contributors are in higher demand than managers. We also learned that there is a higher demand for technical resources, relative to non-technical ones. While that may not be a complete surprise to anyone who has tried to staff a security team, it is an interesting data point because it informs organizational staffing and retention strategies. The report data can also be useful for practitioners – i.e., those on the other end of the staffing equation. Meaning, individuals wishing to position themselves optimally for their future career growth can use this information as part of the “career strategy.”
Career “Future Proofing”
We as practitioners can maximize our competitiveness in the short term and ensure that we continue to be marketable over the long term by taking this information into account. For example, the finding that technical resources are harder to find than non-technical ones can motivate us to stand out in the workforce by actively investing in our personal technical acumen. There are a number of ways to do this, of course, but remaining abreast of new technologies, diversifying the set of technologies with which we are conversant and keeping up with new attack methods is a good way to start.
In fact, there are many resources available to ISACA members to assist; for example, our partnership with Wapack Labs can help ensure that members stay abreast of attacker tradecraft; ISACA webinars (particularly those of a technical nature) and publications like the ISACA Journal can keep technical skills honed; and chapter activities can provide opportunities to learn new technical skills. This is potentially advantageous even for those who are more senior in their careers. For example, if a hiring decision came down to two candidates, and all other things were equal but one was more “current” in their technical understanding, who would you hire? See what I mean?
Over the long term, this information about the skills gap is likewise important for practitioners as it can inform their future career planning. Why? Because logic dictates that the dynamics will change over time in a few specific ways. For those with a decade or more before retirement, planning accordingly is valuable.
First, current challenges in obtaining qualified technical staff mean that organizations (and, in fact, the market at large) are likely to innovate toward automation strategies for the technical work being done by human analysts today. Will this mean the existing workforce will be left high and dry? Not necessarily … but it does mean that technical acumen, while useful to help differentiate you among candidates in the short to intermediate term, isn’t a guaranteed way to future-proof your career over the long haul. This in turn means that establishing a diverse set of skills – as well as building a strong professional network – is important in the long term, in addition to building technical skills.
Second, the fact that there is increased demand for individual contributors relative to managers means that (again, thinking long-term), those who desire to move into manager positions should be looking to differentiate themselves as well from a competitive point of view. They might, for example, consider taking on management responsibilities now to give them skills that, down the road, will be important to their overall competitiveness.
As with most things, there’s no “one-size-fits-all” advice – there are as many viable career tracks as there are practitioners themselves. That said, one thing that’s probably universally true is that having a “career plan” that accounts for both near-term and longer-term changes is a good idea. The findings from this research can help accomplish that.

Category: Security Published: 4/16/2018 8:30 AM
Many may be familiar with the guidelines on personal data breach notification prepared in October 2017 by the Article 29 Working Party (WP29) under Regulation 2016/679. The General Data Protection Regulation (GDPR) introduces the requirement for a personal data breach (henceforth “breach”) to be notified to the competent national supervisory authority.
The basic concept of personal data breaches was not first introduced by the GDPR, and some EU Member States already have their own national breach notification obligations. These may include the obligation to provide notification of breaches involving categories of controllers in addition to providers of publicly available electronic communication services (for example, in Germany and Italy), or an obligation to report all breaches involving personal data (such as in the Netherlands).
GDPR contains several provisions relating to personal data breaches that data controllers (and processors) must also be aware of. Additional information can be found in ISACA’s Implementing the General Data Protection Regulation publication; however, I’ve outlined some key highlights on breaches below.
So first, what is a personal data breach?
The GDPR defines a “personal data breach” in Article 4(12) as: “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed.”
What type of personal data breaches exist?
It is also apparent from the above that the concept of personal data breaches is closely linked to the principle of the integrity and confidentiality of personal data (Article 5 (1) (f) of the GDPR). Therefore, a wide variety of personal data breaches may occur, such as losing a laptop or USB drive that contains personal data, an attack on an IT system, or even sending a letter or an email to the wrong recipient.
Four years earlier, WP29, in its Opinion issued in 2014 (Opinion No. 03/2014), presented a number of practical examples of what is considered to be a personal data breach and the consequences it may have.
Why is it so important that the personal data breach is handled as soon as possible?
The Preamble to the GDPR (Point 85) states that “a personal data breach may, if not addressed in an appropriate and timely manner, result in physical, material or non-material damage to natural persons,” such as:
What should you do if a personal data breach occurs?
The data controller has several tasks when a personal data breach is noticed:
When does the personal data breach not need to be reported to the authority and when do the persons concerned not have to be notified directly?
If the data controller can demonstrate, in accordance with the principle of accountability, that the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons, the notification may be omitted (for example, if mail sent by a controller to a wrong address is returned without being opened, meaning that no personal data has been accessed by an unauthorized person).
How can controllers prepare for handling personal data breaches?
Given that personal data breaches can occur at any data controller, and in such cases data controllers need to react quickly, it is important for controllers to be prepared in this respect as well.
First, every actor must prepare a data breach response plan, for which there may be internal rules as well. A data breach response plan enables an entity to respond quickly to a data breach. By responding quickly, an entity can substantially decrease the impact of a breach on affected individuals, reduce the costs associated with dealing with a breach, and reduce the potential reputational damage that can result.
Below is a data breach response plan quick checklist to help with this preparation. For each item, record whether it is included (yes/no) and any comments:
What a data breach is and how staff can identify one
Clear escalation procedures and reporting lines for suspected data breaches
Members of the data breach response team, including roles, reporting lines and responsibilities
Details of any external expertise that should be engaged in particular circumstances
How the plan will apply to various types of data breaches and varying risk profiles with consideration of possible remedial actions
An approach for conducting assessments
Processes that outline when and how individuals are notified
Circumstances in which law enforcement, regulators (such as the OAIC), or other entities may need to be contacted
Processes for responding to incidents that involve another entity
A record-keeping policy to ensure that breaches are documented
Requirements under agreements with third parties such as insurance policies or service agreements
A strategy identifying and addressing any weaknesses in data handling that contributed to the breach
Regular reviewing and testing of the plan
A system for a post-breach review and assessment of the data breach response and the effectiveness of the data breach response plan
Recommendations on next steps:
An effective data breach response generally follows a four-step process — contain, assess, notify and review:
How does the Hungarian DPA prepare to perform its duties in relation to personal data breaches?
Based on available information from the Hungarian DPA, there is a separate department within the Hungarian DPA’s organization that handles receiving and managing personal data breach notifications. It is also expected that breach notifications will be made via the authority’s website, or that there will be an online interface through which notifications can be sent to the authority.
Editor’s note: ISACA’s Implementing the General Data Protection Regulation publication is an educational resource for privacy and other interested professionals; it is not legal or professional advice. Consult a qualified attorney on any specific legal question, problem or other matter. ISACA assumes no responsibility for the information contained in this publication and disclaims all liability with respect to the publication. © 2018 ISACA. All rights reserved. For additional ISACA resources on GDPR, visit www.isaca.org/GDPR.

Category: Security Published: 4/11/2018 3:09 PM
Editor’s Note: Mike Walsh, CEO of Tomorrow and futurist, innovation and technology speaker and authority on emerging markets and IoT, will bring his experience and perspective on Big Data to his closing keynote for ISACA’s 2018 EuroCACS Conference. The event will gather information systems audit, assurance, control, governance and security professionals, from 28-30 May 2018 in Edinburgh, Scotland.
What are some of the most promising applications of big data that you have observed in recent years?
Lately, the most interesting thing about data is not so much what we have been doing with it, but rather how our thinking about the strategic importance of data has changed. Instead of hiring an army of data scientists and building a monolithic bureaucracy to collect and analyze their data, the new focus of companies is on how to become AI-first. In other words, how do you leverage machine learning and algorithms in combination with data, to reimagine how you engage your customers and the way you do business?
What are some common missteps enterprises should seek to avoid when it comes to leveraging big data?
One of the missed opportunities for enterprises when it comes to data is allowing each operating unit to make its own disparate, non-aligned decisions about data and platforms, as opposed to supporting a company-wide vision to aggregate, process and analyze data in a single data lake. This is not merely a cost issue. With the rise of machine learning, gaining scope and scale in data has become more important than ever.
How can business technology professionals – such as those that will be in attendance at EuroCACS – help their executive teams put data to good use?
The most important discussion that business technology professionals need to have with the other leaders in their organization is on how real-time data and algorithms should impact their approach to solving problems and making decisions. You can change your enterprise technology stack, but unless you also challenge the culture of leadership and decision making, then nothing really changes at all.
What are some emerging technologies that you anticipate will be most disruptive in 2018 and beyond?
Quantum computing is starting to look less like science fiction and more like a breakthrough technology with real-world applications. I recently spent some time in Tokyo, where transportation companies are already working on designing algorithms that leverage quantum platforms to solve the kind of tricky optimization problems that tomorrow’s traffic control and congestion systems will need to handle at scale.
What type of feedback have you received from readers of The Dictionary of Dangerous Ideas? What have been some of the major takeaways?
Readers of The Dictionary of Dangerous Ideas tell me that they love how the book helps them start interesting conversations and debates with their colleagues and clients. It is easy to see change in the world as merely the function of inevitable technology advancement, whereas in fact, many of the ideas and innovations – whether it be AI, automation or other algorithmic platforms – are really things that we should be debating and discussing in greater detail. The future may be now, but it is still for us to decide.
“Governance” and “innovation” are terms of such global importance today that an innovation governance event billed as “the first global leadership roundtable centered on issues at the intersection of [artificial intelligence] innovation and governance” was hosted in Belgium in March. No less than the country’s deputy prime minister cohosted the event.
Few can forget Elon Musk’s comments at the Massachusetts Institute of Technology (Massachusetts, USA) as quoted by The Guardian on 27 October 2014: “I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.” USA Today reported cosmologist Stephen Hawking saying that artificial intelligence (AI) could prove to be “the worst event in the history of civilization” on 2 January 2018. The source reminds us that Facebook’s Mark Zuckerberg pooh-poohed these warnings. The summit’s participants, however, recognize that there is a potential issue and, therefore, aim to begin the conversation of AI innovation governance at a global policy level.
Closer to home, ISACA’s CGEIT Review Manual reminds us of John C. Henderson and N. Venkatraman’s strategic IT-business alignment model, published in the IBM Systems Journal back in 1999, titled “Strategic Alignment: Leveraging Information Technology for Transforming Organizations.” While the model provides a “competitive potential alignment” perspective, the question of the governance of that transformation is unanswered. For 1999, this was forgivable, as innovation governance was likely nowhere near top-of-mind. Today, forgiveness is increasingly less likely.
Pragmatically, the call for global policy-led AI innovation governance is at an extreme end of the IT innovation governance spectrum. For corporations, innovation governance matters because organizational resources are involved and because it is a governance imperative to ensure that those resources are appropriately directed toward fulfilling the organization’s strategy. While some may be familiar with the risk and compliance aspects of innovation, fewer might be familiar with the corporate governance imperatives associated with corporate innovation. My Journal article aims to create awareness of the need for improved corporate innovation governance in the interest of good corporate governance.
A follow-up AI innovation governance summit is already planned for the United States this year. If its future impact results in various government policies being established, regional regulations are sure to follow. And where there are more regulations, governance oversight and compliance management are imperatives, which ensures that innovation governance becomes increasingly topical at the board level.
Read Guy Pearce’s recent Journal article:
“Minimizing the High Risk of Failure of Corporate Innovation,” ISACA Journal, volume 2, 2018.
GDPR, the much-discussed General Data Protection Regulation from the European Union, will not be a cure-all for the world’s data privacy problems simply because the GDPR, like every law, is subject to the bureaucracy out of which it was born. This bureaucracy can be compared to a super tanker and those who would violate the law to speedboats. While the super tanker takes miles to make a simple course adjustment, speedboats can dance around the super tanker with little fear of a collision.
Sure, there will be times when a speedboat captain makes a mistake and collides with the super tanker resulting in the organization being penalized, but my current expectation is that the organizations that will ultimately pay the potential fine of 4 percent of global turnover will be few and far between. I say this because the GDPR, for all its good intentions, was created by humans, and lawyers will quickly find the loopholes, unintentionally created by the humans, to keep their customers from paying significant fines. Moreover, I simply do not believe that many of the organizations charged with enforcing the GDPR currently have the required manpower and skills to successfully enforce the law. Add to this the fact that Working Party 29 continues to provide guidance on what different sections of the law mean and, at least in the short term, we have a construct that may be difficult to enforce.
That said, I think the GDPR could have a very positive effect on the events we have recently seen involving Facebook, Cambridge Analytica and the political decisions they are claimed to have influenced. GDPR clearly lays out individuals’ rights, and a primary focus of data privacy and information security professionals should be training colleagues, family and friends about those rights under this law and the threats that attempt to undermine them. The key to success is education, for it is only education that can fix stupid. We, the world, must add critical thinking to educational programs at all levels. An educated population, with solid critical thinking skills, will significantly improve our ability to reduce the effectiveness of fake news and to take back our democracies from the forces that would use our data and opinions against us.
Despite these observations, don’t despair. The GDPR is a well-intended regulation that has the potential to change the way the world views data privacy. This value will be derived, however, through education rather than through fines. We must all understand that we do not have to accept our employers, governments or, perhaps worst of all, non-governmental organizations that attempt to sway public opinion on crucial political decisions, misusing our data. We have options. We can inform ourselves using multiple accredited sources. We must demand that our rights are respected. We should confront those who spread fake news, both on the internet and at our own dinner tables. Most importantly, we can vote with a few mouse clicks, and can close our accounts on those social media platforms that exploit our data for their gain. We must all understand that data privacy is a universal right, and thinking critically about what those with access to our data will do with it is the ultimate safeguard for our data, our privacy and, ultimately, our democracies.
Author’s note: The author’s views are his own and do not necessarily reflect the views of his employer.
Category: Privacy Published: 4/6/2018 3:01 PM
The healthcare industry has been revolutionized as the result of new technologies, advanced data collection methods, and the growth of cloud solutions. It’s equal parts exciting and intimidating. The only question is, are you staying up to date?
It’s time to take IT responsibilities seriously
In an age where data integrity is becoming increasingly important, healthcare organizations continue to be targeted and exposed. The Sixth Annual Benchmark Study on Privacy & Security of Healthcare Data, released last year by Ponemon, shows how serious the situation is.
“For the sixth year in a row, data breaches in healthcare are consistently high in terms of volume, frequency, impact, and cost,” explains Dr. Larry Ponemon, chairman and founder of the Ponemon Institute. “Nearly 90 percent of healthcare organizations represented in this study had a data breach in the past two years, and nearly half, or 45 percent, had more than five data breaches in the same time period.”
It’s not just outside attacks and data breaches, though. If you look at this industry, it’s clear that regulatory compliance – in the face of shifting digital requirements – is also a major challenge.
It’s time for healthcare organizations to slow down and focus on what they’re doing to protect themselves, their data and their clients. Here are a few IT-related suggestions to get the ball rolling in a positive direction:
1. Invest in training. You can implement sophisticated data platforms and develop intensive processes that protect patient data and promise to reduce risk, but it all comes down to the people. Your employees – i.e. the end user – will always be the weakest link in the chain. If you aren’t investing in training and providing them with the resources they need to be successful, then you’re compromising your entire approach.
2. Try predictive analytics. Regulatory compliance is obviously a chief concern in today’s environment, but you don’t have to feel like you’re constantly playing catch-up. With the right system in place, you can take a proactive stance and add value to your organization.
Many leading healthcare organizations are turning to predictive analytics. For example, a platform like IgniteQ uses proprietary algorithms and organization-specific CMS data to provide real-time analysis of how your company lines up with industry benchmarks and what you can do to improve quality of care, MIPS scores, and overall performance. This forward-facing approach is far more effective and powerful than the typical review-based strategy.
3. Get serious about limiting access. Nothing is worse than having your patients’ data stolen. Professional hackers can use this information to hack into bank accounts, steal identities and cause havoc for everyone involved. And while you may not be directly blamed for data theft, you’re almost always indirectly responsible.
The smartest thing you can do is limit access to protected patient data. The fewer people who have access to the data, the less risk there is that confidential information will get into the wrong hands. Nobody should be able to access patient information unless they have a specific need for it. Loose policies in this area will come back to bite you.
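As a loose illustration of the need-to-know principle described above, access to patient data can be enforced through an explicit allow-list rather than broad defaults. The roles and data fields below are hypothetical, not drawn from any particular healthcare system:

```python
# Hypothetical need-based access check: a role can see a field only if
# that need is explicitly documented in the allow-list.
ACCESS_NEEDS = {
    "attending_physician": {"diagnosis", "medications", "lab_results"},
    "billing_clerk": {"insurance", "billing_codes"},
    "receptionist": {"contact_info", "appointments"},
}

def can_access(role: str, field: str) -> bool:
    """Deny by default; grant only when the role has a documented need."""
    return field in ACCESS_NEEDS.get(role, set())

# A billing clerk can see billing codes, but not lab results.
print(can_access("billing_clerk", "billing_codes"))  # True
print(can_access("billing_clerk", "lab_results"))    # False
```

The key design choice is that an unknown role or an undocumented field denies access automatically, so "loose policy" failures require someone to add an entry deliberately.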
Putting it all together
It’s easy to feel as if your organization is immune to the larger problems of the industry. Data breaches and compliance issues are things that other, less responsible organizations deal with. But this simply isn’t true. No modern healthcare organization is safe.
It’s imperative to understand that breaches and mistakes can come from both inside and outside the company. In an effort to strengthen your organization and safeguard data, you have to account for both forces. Better training, a focus on predictive analytics and initiatives to limit access to confidential information will give you a good place to start.
Category: Risk Management Published: 4/5/2018 3:10 PM
I want to take this opportunity to dive a little more into the metrics that come out of an access certification program. One of the greatest joys in life is when you have enough data that you can identify patterns and trends in your certification program to monitor the health of your access controls. I can certainly identify plenty of research and articles on the textbook way to manage privileged access or to set up access control and role-based security. However, I have found very little research or guidance to indicate what “good” results are for an access certification.
This is one opportunity where you can use a data-driven approach in determining the health of your controls. Many organizations will see a slew of changes from their access certifications, despite front-end controls to add and remove access that are perceived as effective. Of course, in some situations this could be by design. Maybe the changes found are low risk, such as view-only access to data that are not confidential. However, a larger number of changes (more than 5%) usually indicates opportunities for improvement.
To generate more conversation around what good access certification metrics look like, I am inviting others to share their experiences and access certification metrics, and whether they feel the metrics indicate a positive or negative trend for their access controls. During your last certification event, what percentage of entitlements were removed? How would you characterize your front-end provisioning and deprovisioning/termination controls? Does the percentage of entitlements removed match up to the strength of the front-end controls? If you have strong controls, the entitlements removed should be low. If you have weak controls, the number of entitlements removed should be larger.
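As a rough sketch of the health check described above, the removal rate can be computed and compared against a threshold. The 5 percent figure comes from the discussion above; the sample counts and the threshold default below are illustrative, not a standard:

```python
def removal_rate(entitlements_reviewed: int, entitlements_removed: int) -> float:
    """Percentage of reviewed entitlements removed during a certification event."""
    if entitlements_reviewed <= 0:
        raise ValueError("at least one entitlement must be reviewed")
    return 100.0 * entitlements_removed / entitlements_reviewed

def assess(rate: float, threshold: float = 5.0) -> str:
    # Above the threshold, front-end add/remove controls likely need attention.
    return "investigate front-end controls" if rate > threshold else "healthy"

rate = removal_rate(entitlements_reviewed=2400, entitlements_removed=180)
print(f"{rate:.1f}% removed -> {assess(rate)}")  # 7.5% removed -> investigate front-end controls
```

Tracking this rate across successive certification events is what turns a one-off cleanup number into a trend you can use to judge control health.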
Do you have strong controls and the results of access certifications are bearing that out? Then consider doing certifications on a less frequent basis. Your users will appreciate it!
Read Vincent J. Schira’s recent Journal article:
“Rethinking User Access Certifications,” ISACA Journal, volume 2, 2018.
There have been many developments for policymakers, privacy advocates, corporate execs and, in fact, the public at large to contemplate considering recent news about Cambridge Analytica and the information collected by Facebook. The facts have been covered heavily elsewhere in the mainstream and industry press, so I’ll spare you a repeat play-by-play here. However, I do think there are a few important, timely observations to call out for leaders and practitioners in the security, risk and assurance communities.
Specifically, regardless of whether you are an end user of enterprise technology, an enterprise software vendor, or just an individual concerned about keeping your (and your household’s) information protected, there are some “lessons learned” to prevent or mitigate headaches down the road.
I should mention at the outset that these aren’t the only lessons that can be gleaned from these events, and they may not even be the best ones depending on your environment and circumstances. And, of course, we will continue to track developments as the story evolves, with perhaps more lessons on the horizon. Consider these, then, prudent measures for anyone – either observer or impacted party – and for organizations to benefit from, with current events serving as a useful “proof points” to explore at the enterprise level.
A punch in the face?
The first factor is being aware of permissions you give and agreements that you enter into – particularly in relation to privacy and security. Quite a few people were surprised and concerned about the volume of information collected by Facebook on mobile platforms, and many viewed with alarm the realization that Facebook collects call records and sent/received SMS messages on Android phones. However, the permissions requested by the Facebook-supplied app (which users agree to when they install it) let it do exactly that. While some might view the outcome as undesirable, the app specifically requested these permissions and users agreed to them at the outset.
An analogy would be someone asking you if they can punch you in the face. If you give them your consent to go ahead and take a swing, are they in the right or in the wrong when they follow through? That’s a thorny question, and arguments can be made on both sides (for example, it might matter how they asked the question in the first place). But they did ask for your consent first and, if you don’t want to get punched, you can say no.
This might sound a bit like “blame the victim” – and, if so, that is not my intent. I bring it up because there are lessons here for those on both sides of this equation: end user and technology supplier alike. For the end user, viewing critically (and with a healthy skepticism) the permissions that apps request – and the measures agreed to by a supplier or service provider – is always an exercise in prudence. While some vendors might be more transparent about what they’re doing than others, keeping a handle on what is being requested (or promised) is absolutely critical. This is, in fact, what the Android permission system was built for in the first place.
This same principle extends beyond mobile. For example, if your cloud provider says it is performing a certain task (such as a security countermeasure), how confident are you in that? Are you checking? How would you know if not? For those supplying those services or products, being transparent about why you’re asking for the permissions you’re asking for (and how they’ll be used) can save you quite a bit of hassle down the road, and being explicit about what you’re doing to protect information (and how) is likewise valuable.
The supply chain
The second item I’d call to your attention is the “transitive property” that exists between suppliers and the end entity – at least from a perception and customer point of view. For example, in this case, while it is true that Cambridge Analytica allegedly broke the rules and violated Facebook’s terms in how they acquired data, public angst (at least quite a bit of it) is directed at Facebook.
Are there reasons to be concerned about Facebook’s privacy and security more generally? Perhaps. But in this case, much of the pain that Facebook seems to be in results from actions taken by a member of its ecosystem rather than itself directly. As organizations become more interdependent on suppliers, contractors, business partners, and even customers, the lesson of how customers and the world at large will view a failure of trust is important. This is particularly true as it relates to private information about those users and customers.
So, lest we needed to be reminded, a lack of confidence in an organization’s data stewardship (i.e., a privacy issue, a security breach, or any other issue that impacts users’ information) caused by someone in the broader ecosystem can and often does generate ill-will toward those connected to it via supplier, partner or other relationships. You’ve heard the old saying that “you can’t outsource liability”? It’s as true now as ever.
I’m sure as events unfold, we’ll all learn more about these circumstances and, with that, new lessons will continue to emerge that we can adapt to the work we do on behalf of our organizations. But starting with these, and working to make sure that we are aware of permissions and agreements that we might have entered into (including potential consequences that might arise), and the relationships that we have in our supply chain that can potentially impact us, is a useful way to ensure we’re keeping our organizations in solid shape.
Category: Privacy Published: 3/29/2018 5:05 PM
CISOs have traditionally focused on the triad of “Confidentiality, Integrity and Availability.” Recently, emphasis has been placed on confidentiality, hackers and zero-day attacks. However, industry trends now require that focus to broaden to all business information risks within organizations.
Since information is a key part of almost all business transactions, information risks are becoming pervasive. The trends I want to highlight include increased need for Security departments to partner with business colleagues to understand risks from their point of view, and increased importance of integrity and availability.
In my mind, integrity issues go back to the ChoicePoint data breach in 2005. This breach did not result from a zero-day attack. It was carried out by fraudulent customers using fake accounts. This falls under the “data integrity” mandate. At the time, many would have thought that this breach was outside of the scope of information security. But this needs to change today.
Such incidents have taken off in recent years. Fake news incidents have regularly made headlines. The potential effects of fake information on SEO results also have been highlighted. Consider the reports of identity “theft” using synthetic identities. Or the recent scandal at Kobe Steel over the internal falsification of quality data.
After the Yahoo breaches cost that company US $300M, cybersecurity assessments have become a more important part of M&A transactions. This type of assessment helps mitigate business risk. Is the firm’s risk posture what it says it is? Class action lawsuits in the state of Michigan over faulty software algorithms bring up another information business risk. Software development errors may have real human life consequences as well as business consequences.
In the recent volatile financial market, several investment firms suffered outages, even in our era of scalable, virtualized application architectures. Ransomware attacks last year led to real money being lost from victims, not from ransoms, but from outages. The largest ever DDoS attack recently was reported. These attacks are likely to continue to be common.
Information theft is still an important issue, but the diversity of incidents is increasing. An ex-Expedia employee pleaded guilty to stealing company information to facilitate his insider trading of company stock. Better keyless entry systems now facilitate faster theft by car thieves, not just theft of information. In 2016, steelmaker ThyssenKrupp lost trade secrets to cyber criminals. A large retailer recently was hit with a $27 million fine for stealing a small contractor’s intellectual property. Instead of just stealing IDs, criminals are now stealing whole systems and the intellectual property that goes along with those systems.
These incidents highlight newer ways to misuse information resources and adversely affect a business. More longstanding hacker attacks using technology are not going away; traditional technology controls are still needed to mitigate these risks and significant progress has been made in doing so. But these newer incidents highlight threats in which the misuse case and consequences are highly entwined with the business. To find these risks, CISOs will need, more than ever, to understand the business they are protecting and the risks that are seen by senior management. Security controls will need to be more integrated in business operations to be effective.
A recent presentation by Facebook CISO Alex Stamos also highlighted these issues. In his talk, Stamos distinguishes between two components of technology risk: traditional InfoSec and “abuse.” He defines abuse as “technically correct use of a technology to cause harm.” In his view, the abuse category of risk is much broader than the traditional InfoSec concerns. Some of his solutions to better manage the abuse category of risk include broadening the focus of security practitioners and increasing empathy toward business users and leaders.
My own conclusion is: if the issue involves company information, and misuse can affect the company’s risk posture, then CISOs need to play an active role in mitigating that risk.
Category: Risk Management Published: 4/2/2018 3:12 PM
For all of the benefits remote working offers businesses, it’s hard to ignore the security risks and threats.
According to a Gallup survey, more US employees are working remotely than ever before (and for longer periods of time). In 2016, 43 percent of employed Americans said they spent at least some time working remotely. Of these employees, 42 percent of survey respondents report working remotely 60-100 percent of the time.
As remote working becomes more popular, there is more and more pressure on employers to offer remote working opportunities to employees. And while your organization would be wise to adjust to the preferences of its workforce, prematurely deciding to allow remote working without thinking about safety will leave your company vulnerable to considerable security risks.
Whether you already have a remote working policy in place or are just now considering the feasibility of it, here are some practical ways, from an information security perspective, that you can keep your remote employees safe.
Switch to cloud-based storage. If you haven’t already, switch your organization to cloud-based storage. Not only does this improve the data integrity of your entire company, but it gives remote workers the ability to access files and programs without needing to store sensitive information on their devices. Providers offer encrypted cloud storage at very affordable rates.
Require regular password changes. While it might seem obvious, poor password hygiene remains one of the single biggest risk factors for remote workers. Whether it’s simple passwords or passwords that never get changed, inadequate passwords increase the risk of being compromised by hackers and other cybercriminals. For best results, encourage your employees to select passwords that contain at least 12 characters (with numbers, symbols, uppercase letters, and lowercase letters). They should then be prompted to change passwords every six to nine weeks.
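A minimal sketch of the password rules suggested above (at least 12 characters, with numbers, symbols, uppercase and lowercase letters); the sample passwords are illustrative only:

```python
import re

def meets_policy(password: str, min_length: int = 12) -> bool:
    """Check length plus all four character classes from the policy above."""
    return (
        len(password) >= min_length
        and re.search(r"[a-z]", password) is not None      # lowercase letter
        and re.search(r"[A-Z]", password) is not None      # uppercase letter
        and re.search(r"\d", password) is not None         # digit
        and re.search(r"[^a-zA-Z0-9]", password) is not None  # symbol
    )

print(meets_policy("Summer2018"))         # False: too short, no symbol
print(meets_policy("c0rrect-H0rse-42!"))  # True
```

A check like this is most useful at the point where passwords are set or changed, so weak choices are rejected before they ever reach production systems.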
Limit as much access as possible. Just because you can give an employee access to a program or file doesn’t mean you should. Each employee and/or device that has access to confidential data increases the risk of being compromised. To enhance security, limit access on an as-needed basis.
Have remote support systems in place. When something goes wrong with a computer or system in the office, all it takes is a quick call to the IT department and somebody can be quickly dispatched to deal with the issue and, ideally, neutralize any security risk.
While not quite as convenient, you can have the same sort of control and responsiveness with remote devices if you have the right support systems in place. Tools like Dameware Remote Support make it possible to log into a remote worker’s computer and troubleshoot problems in real-time to limit issues or security threats.
Keep software and programs up to date. When it comes to security risk factors, outdated software and programs are high on the list. Vendors don’t pay much attention to outdated versions, and this often means there are vulnerable loopholes that could leave you exposed. For best results, enable automatic updates on all employee devices.
Give your remote team a chance to succeed. If you genuinely want remote working to be a viable option for your organization, then you have to give your employees a chance to be successful. This means – among other things – paying close attention to security. If need be, meet with an outside security analyst or consultant to get some feedback on your setup. You can never be too safe.
Category: Security Published: 3/30/2018 3:06 PM
As January 2018 rolled around, I went platinum. No, this had nothing to do with a New Year’s resolution, nor did I become a platinum blond, though that does bring up some interesting and hilarious possibilities (I can imagine the double-takes every time I would enter an airport or some other location requiring a photo ID). I did not become a platinum album-selling artist (though this would have trimmed one item off my to-do list!). Instead, January 2018 meant that I had entered my 15th year of ISACA membership!
Whew! What a memorable journey it has been.
I still remember entering a room on a leafy road in Bangalore in late 2002. The venue belonged to the University of Agricultural Sciences, yet we were going to discuss IT audit! The memory is still fresh. The room was abuzz with energy, and there must have been at least 100 people present. There were a bunch of important-looking people at the front of the room and a whole bunch of people from different age groups milling about in the back.
The session came to order quickly and we discussed this organization called Information Systems Audit and Control Association – ISACA for short. I recall going to that session, which the ISACA Bangalore Chapter called an “introductory seminar,” because, at that point in my career, I needed (not necessarily in the order below):
In the session, I heard about certifications offered by ISACA, primarily Certified Information Systems Auditor (CISA) and Certified Information Security Manager (CISM), and of course, the crown jewel, Control Objectives for Information and Related Technologies (COBIT), uttered with great reverence and respect. I have always loved good acronyms, and this was heaven – a great acronym always sticks in your mind, makes you look intelligent when you use it in public and, of course, is a great addition to your name! Exactly one year after attending the session, I attempted the CISA exam in December 2003. By then, I had been a member for almost a year. I took the exam in December, as it was offered only twice a year then, and my work commitments meant that December was the best time to take the exam.
The rest, as they say, is history, water under the bridge or whatever cliché catches your fancy. I won’t shy away from saying that attending that session was probably one of the best decisions I have made, and there has been no looking back. In hindsight, I realize that one thing that was not communicated at that introductory seminar, but which was and is crucial, was the volunteer ethic on which ISACA was built and which is one of the reasons ISACA is so successful.
Over the years, my relationship with ISACA via the Bangalore Chapter went from one level to the next as that volunteer ethic slowly rose to the fore. I was asked to be part of the local chapter board and, to cut a long story short, I went from member to president of the chapter in less than a decade. I also was fortunate to be invited to be part of the ISACA board of directors, which was such an honor that, even as I write this, I feel warm and elated. My relationship with ISACA has been rewarding in so many ways, including the opportunity to work with some great ISACA staff and volunteers, some of whom have become dear friends. I am thankful to many who have been with me on this journey.
It has been wonderful to see the whole organization steadily evolve. ISACA definitely has come a long way. We have more chapters now than in 2002 (up to 217), and our professional community has grown to more than 135,000 members and around 450,000 engaged professionals. The CRISC certification was introduced in 2010. This was a credential I pursued to test my exam-taking chops, and I have been CRISC-certified for a few years now. Of course there is a sea change in certification exams today; they are offered through computer-based testing and can be taken more or less at any time in the year – no need to wait for June or December to roll around.
Membership benefits are awesome to say the least. Members have access to a host of leading-edge research outputs, not only from ISACA, but also from organizations such as MIT, apart from a host of other benefits. How can I miss the fact that CMMI Institute (another organization I have always held in awe for coming up with great frameworks) is now part of the ISACA family, and I was fortunate enough to be part of history being made?
But enough of the past and the present – what should interest all of us, and what I am definitely looking forward to most, is what the future will hold. I am looking forward to the next 15 years and beyond, and wonder what that will look like. What would a young person on the cusp of beginning his or her professional life think of ISACA? I am sure ISACA – nearing its 50th year – will continue to be the go-to source for knowledge in its chosen areas of audit, risk, governance, information security, and whatever else we expand to encompass.
Whatever it is, I am sure ISACA will continue to live up to its purpose, “Help you realize the positive potential of technology,” and its promise, “Inspire confidence that enables innovation through technology.” This will need to stand the test of time – the test of a tomorrow that is unknown and of the risks and threats that will continue to emerge. It is imperative that we pause now and then, stand at arm’s length, and take a deep look at what we are doing, where we are headed and what an organization like ISACA means in the grand scheme of things.
It is my personal conviction that the ISACA of tomorrow will be more than a purveyor of knowledge or expertise or certifications, and play a bigger role in providing the necessary scaffolding for the safe use of technology while also being true to its founding tenets, with expanded interpretations to suit the evolving challenges of our professional community. The ball already is rolling in this direction via the SheLeadsTech initiative, which means that ISACA also becomes synonymous with “opportunity” across the globe. This segues into the ISACA Foundation, which (in my mind) will have the task of expanding ISACA’s reach and impact in communities around the world.
I am looking forward to that tomorrow – a tomorrow in which ISACA continues to shine!
Category: ISACA Published: 3/29/2018 3:06 PM
Everyone is talking about blockchain and is curious to know more. In addition to blockchain conversations among cybersecurity and IT professionals, TV programs are discussing the topic, investors are clamoring about it and many people are asking just what the heck it is. Blockchain is the trending topic in seemingly every technology conference, journal and summit.
I recently spoke at one of the famous technical universities in India on digital payments and their impact on the global economy. We explored various technologies around digital payments, including USSD payments, mobile banking/payments, e-wallets and payment devices to debit/credit card with biometrics. I also discussed cryptocurrencies and blockchain technologies. Though the topics were diverse, most of my Q&A session ended up focusing on blockchain, security risks around it, use cases and how to secure it. People want to know!
There are many initiatives in the US, Europe, and APAC driven by governments and technology companies to enable blockchain in multiple use cases covering healthcare, banking and finance, manufacturing, utilities and civil identity programs.
Blockchain in banking
Blockchain is adding value to a bank’s technology stack by enabling efficiency and faster execution, along with secure and robust features. Most banks prefer private blockchains for implementing these use cases. A private blockchain has its own set of benefits: it is faster, centralized, restricts access to authenticated users, and makes transactions easier to control and monitor.
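As a loose illustration of the tamper-evidence property that underlies these benefits (this is a toy sketch, not any bank's implementation, and the sample remittance record is invented), each block carries a hash of its contents that includes the previous block's hash, so altering any recorded transaction invalidates the chain:

```python
import hashlib
import json

def block_hash(block):
    """Hash the block's contents, including the previous block's hash."""
    payload = json.dumps(
        {k: block[k] for k in ("index", "data", "prev_hash")}, sort_keys=True
    ).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def is_valid(chain):
    """Tampering with any block's data breaks its hash and every later link."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(0, "genesis", "0" * 64)
chain = [genesis, make_block(1, {"remittance": 500}, genesis["hash"])]
print(is_valid(chain))   # True
chain[1]["data"] = {"remittance": 9999}  # tamper with a recorded transaction
print(is_valid(chain))   # False
```

In a private deployment, the same chained-hash structure is combined with restricted, authenticated membership, which is what gives banks both the audit trail and the access control they want.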
Blockchain adoption in the banking and finance industry has grown significantly in the past two years. Three use cases are gaining wide acceptance in this industry – international remittance, eKYC (Know Your Customer) and smart contracting:
4 things that disrupt the blockchain party
Top 3 security practices for secure governance around private blockchain
Author’s note: Ameya Jhawar, Consultant – Digital Security at Aujas, contributed to this blog post.
Category: Security Published: 3/28/2018 3:05 PM
During my time as an IT auditor, I have been privileged to attend many excellent and inspiring presentations at ISACA Ireland conferences and seminars, ISACA webinars and, of course, EuroCACS. The speakers are almost always passionate and extremely knowledgeable about their subject matter.
As technology and regulations have changed, the subject matter at these conferences has changed too. For example, the current hot topic (certainly in Europe) is the EU General Data Protection Regulation (GDPR). Prior to that, it may have been cryptocurrencies or, to be more specific, blockchain. Prior to that again, it may have been advanced persistent threats (APTs).
These are all important subjects and most certainly should be covered at ISACA conferences. However, one thing that does strike me is that the talks are almost exclusively on what we audit and not how we audit it. We, as IT auditors, rarely discuss the IT audit process or, to be more specific, how we perform an IT audit. This is somewhat paradoxical as we spend a considerable amount of our time auditing and trying to improve information technology processes, such as change management and vulnerability management.
I discuss innovation while performing an IT audit in my Journal volume 2, 2018, column, “Innovation in the IT Audit Process.” As innovation is defined as the introduction of something new or a new idea, method or device, I am more than aware that many of my suggestions will not be new or indeed remotely innovative to many readers. However, I would be delighted if the article inspired a conversation around improving and innovating the IT audit process.
Can I ask each of you to please review the article, consider each phase of the audit process and use the comments section to explain how you innovate during the audit process in your organization? Even if there is nothing new in the article of relevance to you, your comment may inspire innovation in a fellow reader on the other side of the world. And this could change how, not what, they audit.
Read Ian Cooke’s recent Journal article,
“Innovation in the IT Audit Process,” ISACA Journal, volume 2, 2018.
Editor’s note: ISACA board director Jo Stewart-Rattray has provided updates from her participation in the UN Commission on the Status of Women, which took place from 12-23 March at UN headquarters in New York.
The final week of the session brought a lot of hard work; there were times when I thought we were going backwards and other times when there appeared to be no agreement, and that consensus was a long way off.
With a looming deadline, we negotiated through the night on Thursday. Those members of the Delegation on “night shift” walked out of UN headquarters at 5:45 a.m. on Friday in the pre-dawn light with a lot of hard negotiation and high-level diplomacy still on the horizon to get all 193 member-states on the same page in relation to what the roadmap for the empowerment of rural women and girls through the use of technology would look like.
Even those of us who had pulled the night shift were back at the UN at noon on Friday. Small group discussions and bilateral work took place to complete the tasks at hand. Paragraph after paragraph fell under the gavel as it was agreed, each one met by a round of applause. Then, at 3:50 p.m., the gavel fell on the final paragraph. The conference room erupted in applause and sighs of relief, and we were on our feet hugging members of our own delegations and, in some cases, of other delegations. We had made it! We had a set of agreed-upon conclusions!
I felt a lump rise in my throat knowing that not only had the dreams of a 7-year-old kid from the Australian bush come true, but that kid, as an adult, had worked to make a difference for rural women and girls across the world. It was particularly moving when I realized that the last time this session theme was attempted, in 2012, the member states of the United Nations were anything but united and could not agree on many of the thorny issues. As a result, there was no final roadmap.
I will head home with a sense of achievement like I have never had before. I know that ISACA and the SheLeadsTech program have a place in the world at the highest level, and I will continue to strive to make a difference with our advocacy efforts. Thank you, ISACA, for playing such a significant role in making a dream of a lifetime come true!

Category: ISACA Published: 3/26/2018 9:45 AM
Although we are less than two months from the European Union’s General Data Protection Regulation (GDPR) compliance deadline of 25 May, many organizations are not yet confident in their level of preparedness for this landmark new data privacy regulation.
If that concern applies to you and your enterprise, know that you are in good company. Many of your colleagues across the globe are in a similar position, still working diligently to make the needed headway to be in solid position once GDPR takes effect.
Another reason not to panic: ISACA is here for you. Our new GDPR Assessment helps users and their enterprises identify gaps in their GDPR readiness and offers guidance on how to resolve those gaps. It provides customized output highlighting the areas in which your enterprise needs to focus, along with the opportunity to retake the assessment after implementing the initial guidance.
The complimentary assessment was powered by the contributions of leading global security and privacy experts and includes gap analysis expertise from CMMI Institute. The tool is part of ISACA’s ongoing commitment to help our global professional community prepare for GDPR; if you have not recently viewed ISACA’s frequently updated array of resources on the topic, I encourage you to visit www.isaca.org/GDPR.
After such a long buildup, it is hard to believe that we are now less than two months away from the deadline. GDPR compliance should be seen as a business opportunity, rather than a roadblock. GDPR is not a checklist to be completed, separate from the enterprise’s core functions and capabilities. Instead, complying with GDPR needs to be a basic, foundational element of the organization’s operations, capabilities and decision-making. It requires a level of cross-functional collaboration that will serve the enterprise well long beyond the compliance deadline.
It will be fascinating to watch how data privacy regulations around the world evolve in the coming years. As the world becomes acclimated to conducting business with the EU in the era of GDPR, expect other nations to develop similar policies in an effort to deal with universal challenges in data protection and data privacy.
I fully expect the ISACA professional community to demonstrate leadership in embracing the challenge of helping their enterprises adjust to the new regulatory environment. GDPR represents an excellent opportunity to put our enterprises on stronger footing and better serve our customers.

Category: Privacy Published: 3/27/2018 10:06 AM