Editor’s note: Self-described “passionpreneur” and award-winning author Moustafa Hamwi will deliver the closing keynote address at Asia Pacific CACS 2017, to take place 29-30 November in Dubai. Hamwi will address an often overlooked ingredient in business success – passionate leadership, also the subject of his recent visit with ISACA Now. The following is an edited transcript:
ISACA Now: Why is passionate leadership so important?
Passion is the key differentiating factor for true leaders. It is easy to lead when times are good; truly passionate leaders reveal themselves when times are tough. I always say, “The longest distance leaders have to walk is between their mouth and their feet.” Passionate leaders care about the purpose they are serving and find the energy and drive to keep going against all odds.
ISACA Now: Is passion innate, or are there techniques to become a more passionate leader?
The quality of one’s passion comes from the quality of their purpose. In my opinion, we are all born to serve a bigger purpose, and when leaders are aligned with their purpose, their ability to lead and to achieve increases. This takes a high dose of brutal honesty for the leader to ask, “Do I really care about what I’m leading?” If not, then any technique will be symptomatic rather than bringing any deep change.
ISACA Now: Can someone be too passionate? Is there a risk of burnout?
Great question. The best way to answer this is to quote Benjamin Franklin: “If passion drives you, let reason hold the reins.”
There is a huge difference between being guided by passion and being blinded by it! Planning and constant learning to adjust the plan is a crucial element in any successful venture and in avoiding burnout. Passion is using your heart as your guide and your mind as the planning and execution tool. With this mix, there is low risk of burnout and faster recuperation because the journey becomes more fulfilling.
ISACA Now: You once met a swami who was influential in your trajectory. Tell us how that came about.
When the student is ready, the teacher appears! It was a pure coincidence. I bought a one-way ticket to India in 2012 seeking some answers about my life and the bigger meaning, and through a friend of a friend of a friend, I ended up meeting him. The interactions with him were eye-opening; however, it was one of the questions he asked me that changed a lot of things for me.
One day I was asking him about life, and he turned to me and asked, “Do you know what you are thirsty for? Because if you do not know what you are thirsty for, you cannot quench your thirst.” This was the beginning of my search for how to quench my own thirst, which is to have a great impact on the world and to help people find and quench theirs.
ISACA Now: You are from Syria, then moved to Dubai and have visited plenty of other places. Is passion a universally important ingredient you have observed in successful leaders?
I have lived in, traveled to and spoken in around 30 countries, and have worked with thousands of leaders and executives, and one thing is for sure: passion is the key to success anywhere in the world. The world is becoming much more competitive, and without passion, you will run out of energy long before you achieve the desired results. Passion also gives you a joyful competitive advantage (when you are passionate about what you do, you perform better than someone doing it just for the money), so your quality of work is better, and people enjoy receiving services from you, which means demand for your work is higher.
With an ever-growing digital and virtualized world of interconnected devices, we are seeing the rise of an Internet of Things (IoT) ecosystem that impacts everyday actions. This ecosystem of devices can be managed and monitored remotely, can leverage mobile and wireless networks for communication, and capitalizes on a combination of the cloud and data centers for analysis.
While this is a good thing for technological and human-machine interaction, the confluence of many devices and applications, coupled with fragmented solutions, inherently opens the door to risk. This risk comes in the form of security threats, as well as the risks of failing to align with a standard framework, to take data privacy into account, and to meet regulatory and compliance requirements.
If we look at IoT through a security lens, then we have to consider the integration of network, sensors, human-machine interactions, virtualized systems and other endpoints that must be able to provide actionable security intelligence in near real time, and which can align to a security framework or model. This model should identify and mitigate environmental risk, ensure data privacy and drive threat mitigation around:
Delving a layer deeper, and as we consider these potential areas of vulnerabilities, any security program for an IoT environment should leverage the principle of trust in the form of trusted IDs, trusted software and systems, integrated with trusted and protected communications, linked to trusted data sources and data, and all encircled within trusted secured connections and secured device configurations. In the telecommunications world, we can cite the example of authentication, where a mobile services provider can “authenticate applications within a federated gateway via SIM, but also through sensors on short-range networks” – Ericsson Innovation Day 2015.
These “trust principles,” though, should all follow a premise where we understand the areas of involvement and operations for any “thing” as it connects to any system or other device that can be part of a known framework, based on implied architecture or intrinsic architectural embedding into a defined framework. Any such framework should have a policy that defines a process and drives an implementation that can align with the mandate of a governance model covering architectural device identity and access management, authentication and security associations, risk management, and privacy, as well as regulatory/compliance alignments.
This post just scratches the surface of the impact of the IoT on our environments, as companies move more toward digitalization and drive digital transformation. These transformations will only see the growth of more connected “things” interacting with and impacting our environments. Some of these things will not necessarily be governed by human machine interactions, but may exist as components embedded within architectures found within smart buildings, vehicles and environmental systems.
Data privacy, legal oversight, IoT frameworks, and regulatory and compliance considerations are among the other areas that we need to have defined and standardized for IoT.
In recent years, board-level supervision in information technology matters has become a key IT governance topic. It is often assumed that national corporate governance codes can guide board members to design and potentially improve their IT governance practices. At the Antwerp Management School (AMS), we conducted a study to understand what IT governance-related guidelines are included in national corporate governance codes.
We selected 15 national corporate governance codes to study. These codes were selected based on income level and geographic dispersion across different continents. Surprisingly, we found that most national corporate governance codes do not include key IT governance topics. There is hardly any IT governance information incorporated in the codes at all. The only exception we found was the South African corporate governance code, King III, which contains an entire chapter on IT governance-related guidelines. We also note that the committee responsible for drafting the South African corporate governance code recently finalized King IV, in which IT-related matters assume an even more prominent role. Based on our findings, we conclude that:
This study was performed by researchers at AMS around an industry-sponsored research project on board-level IT governance. The research project focused on the need for boards to extend their governance accountability from a mono-focus on finance and legal as proxy to corporate governance. This extended accountability should include technology and provide digital leadership and organizational capabilities to ensure that the enterprise’s IT department sustains and extends the enterprise’s strategies and objectives. We discovered that board members are increasingly seeking guidance on how they can expand their IT governance accountability within the board and also in an appropriate modus vivendi with executive management. More information, including intermediary results, can be found on the AMS website.
Read Steven De Haes, Anant Joshi, Tim Huygh and Salvi Jansen’s recent Journal article:
“Exploring How Corporate Governance Codes Address IT Governance,” ISACA Journal, vol. 4, 2017.
Having had the privilege to have visited a number of cities throughout the world, I have learned that Chengdu is not Mexico City, Brussels is not Houston, Abuja is not Melbourne, and Johannesburg is not Dubai. That’s because the heart of every city beats differently. Each has its own character, its own vibe, and its own goals for assuring the best standard of living possible for its citizens and for the visiting public.
Likewise, every city is evolving at its own particular pace, though all are aligned to a common principle of modernizing their infrastructure services – public transportation, utilities, health care – by leveraging technology and law enforcement in “smart” ways to improve quality of life while assuring operational efficiency, stability and security. As noted by Eduardo Paes, the former mayor of Rio de Janeiro, “Smart cities are those who manage their resources efficiently. Traffic, public services and disaster response should be operated intelligently in order to minimize costs, reduce carbon emissions and increase performance.”
The term “smart cities” has been used recently as a label for those seemingly few cities of the world that are consciously embedding technology into all aspects of city planning. However, with current forecasts estimating close to 50 “megacities” housing over 10 million people and about two-thirds of the world’s population living in urban environments by 2050, the mindset must shift to think of the “smartness” of any urban center as a non-negotiable element.
Many urban centers claim to be ahead of the “smart” curve though, in actuality, they find themselves handcuffed by custom systems that are not interconnected, interoperable, portable, extensible or efficient in their operations, maintenance and overall cost-effectiveness. Overcoming this challenge is burdensome, especially under pressure to show progress: concurrent initiatives get deployed in an uncoordinated way, producing solutions misaligned with the intended outcomes. No wonder city planners are not sleeping at night.
The diagnosis is seemingly familiar, and not unlike the challenges many enterprises are facing with what are currently referred to as digital transformation projects. What's different for the urban center, however, is the scale of the complexity. The complexity is not just a question of technology deployment, but also taking into consideration economic, political and social issues that shape a city’s being. It’s an extreme case of a “system of systems of systems and more systems” problem, for which the only “smart” solution is a universal consensus-based governance framework.
Technology companies like Cisco and AT&T have developed their own frameworks, driven by their product strategies, especially for IoT. Standards-developing organizations such as ISO, IEC, ITU, IEEE and a number of others are facilitating the development of new standards related to specific pieces of the overall urban development challenges. Recognizing the fragmented (yet well-intended) and disparate approaches, NIST has launched a working group intended to converge these groups and their respective knowledge assets under the guise of a Smart City Framework.
The key to the success of any framework is its acceptance by universal consensus. This means the framework is created, maintained and endorsed by the professional community for the benefit of the community itself. The framework provides guidance on how to carry out the work aligned to desired outcomes in conjunction with tools that enable stakeholders to self-assess, benchmark, and measure capability maturity and progress toward the goals. This is indicative of the 20-plus year success experienced by ISACA's globally recognized COBIT framework for the governance and management of enterprise technology, which itself has the potential to be foundational for smart urban initiatives.
For now, city planners find themselves challenged across a wide spectrum of issues, ranging from technology to compliance. As members of the technology community, we need to help them by leveraging our knowledge of technology governance frameworks and their development and deployment, our holistic systems thinking and problem-solving capabilities, and our innate ability to assess and mitigate risk. Doing so can inspire the confidence necessary to enable innovations that evolve the urban environment by leveraging the best technology has to offer.
Our work has never been more important. And because we recognize the pervasive nature of technology and understand how to leverage its positive potential, I am confident that we can contribute enough to the evolution of so-called “smart cities” that the term “smart” will eventually be dropped from the lexicon. That in itself would be a great accomplishment.
Editor’s note: This blog post by ISACA CEO Matt Loeb originally appeared in CSO.
Editor’s note: Social business strategist, author and radio show host Ryan Hogarth will deliver the opening keynote address at Africa CACS 2017, to take place 11-12 September in Accra, Ghana. Hogarth’s keynote is titled “We Are Not Robots.” Hogarth recently spoke with ISACA Now about some of the themes he will address, such as navigating digital disruption and how to strengthen relationships with customers. The following is an edited transcript:
ISACA Now: You use the term a “frictionless economy.” What do you mean by that?
Everything about being a customer has been transformed through our use of customizable technology because friction is constantly eliminated. We can get what we want or what we need with a click, a tap, a swipe or a gesture. The businesses that will succeed are those that understand their customers’ journey enough to use the right technology to remove friction and make interaction, servicing and purchasing seamless and effortless, or frictionless.
ISACA Now: What are some common mistakes that organizations make in navigating digital disruption?
The two most common mistakes are:
ISACA Now: What are the best ways for an organization to strengthen its relationship with customers?
First, map your customers’ journey. What is their actual, real-world experience with your brand or business? Knowing this is far harder than we at first think because we make assumptions about what our customers actually do and experience. Once you plot this out, the shortcomings become far clearer and solutions a lot more obvious.
ISACA Now: Do you think most enterprises are utilizing social media effectively?
No. Most enterprises that are on social media still view it as just another tool of sales or marketing rather than a means of communication and relationship-building. Again, this requires a shift in thinking. Are you thinking about how you can build a relationship with a customer or just how you can push your latest offering?
ISACA Now: What technological innovations do you anticipate having the most impact on the global economy within the next few years?
There are several: Self-driving cars, clean energy, augmented reality, artificial intelligence, blockchain, high-quality online education, food science and medical technology. All of these are important because their impact will affect so many industries beyond the obvious. We see the immediate effect in how global businesses are playing in fields far outside their traditional spaces. Social media wants to get into banking, tech firms are playing in areas of transportation or health, and banks are pushing hard into the tech space.
Editor’s note: Vicki Gavin, CRISC, MBCI, is compliance director, and head of business continuity, cyber security and data privacy for The Economist. Gavin, based in London, recently visited with ISACA Now to discuss how her areas of expertise are being affected by the fast-changing technology and regulatory landscape. The following is an edited transcript.
ISACA Now: At InfoSec Europe last month, you were part of a panel that discussed building an agile team for the future. What were the major takeaways for you?
For me, the most significant takeaway was the need to do things differently. Current hiring processes are designed to exclude candidates. We need to get smarter about including candidates from a variety of backgrounds by systematically removing bias from role profiles, job descriptions and advertisements, screening and interviewing.
ISACA Now: How critical is it for organizations to have tech-savvy boards in terms of fostering strong governance?
I do not think the board needs to be tech-savvy. Tech awareness is sufficient. Security professionals need to become more business aware to communicate effectively with the board.
ISACA Now: What are some shortcuts that organizations tend to take in their governance that often come back to haunt them?
I think one of the biggest IT governance mistakes made by technology professionals is the assumption that risk is to be eliminated. Risk is to be managed; the key is to determine what level of risk your organization is willing to accept.
ISACA Now: What are the biggest keys to successful business continuity planning?
The value in planning is the process, not the plan. As Mike Tyson said, “Everybody has a plan until they get punched in the face.” The same is true for BCPs. The process, on the other hand, done properly, ensures a common risk appetite and approach to recovery when the time comes.
ISACA Now: Which emerging technologies present the greatest challenges from a compliance standpoint?
All of them. All change is disruptive. The challenge is to balance the risks and benefits of compliance.
ISACA Now: As we move closer to GDPR taking effect next year, are you sensing a greater sense of calm or of anxiety from your peers?
From my peers, anxiety. From my business, calm. We started on our GDPR journey about a year ago and will be ready by November 2017, giving us plenty of time to bed in new processes.
In today’s competitive environment, enterprises are under enormous pressure to focus valuable resources on initiatives that provide value. The inherent issue with most approaches is that the methods used to determine organizational priorities are often flawed by focusing on compliance as the primary navigation aid. A “compliance only” program can drag down performance by diverting resources from value-creating work. Of course, compliance is crucial for business survival, but it should not be the only guidance system used for value creation.
A solution to this narrow approach is to prioritize efforts using multiple perspectives to offer a balanced approach to determining priorities, allocating resources and, ultimately, providing value. As in travel, you need to have a good fix on your coordinates – location, altitude, heading and speed – before determining future moves. Where most companies go wrong is in choosing only one of these perspectives. Just like using a GPS to help you navigate, you should use more than one guidance system to help you focus efforts.
Having tools available that pinpoint where you need to focus efforts in an organization is crucial – hence, the GPS analogy. GPS satellites broadcast their time and position, and a receiver needs signals from multiple satellites to determine a precise location on the ground. Decisions around funding, assurance, improvements and compliance are all areas in an enterprise that require resources, and they should not be determined from only one signal. The more ‘GPS’ signals you have looking into your ecosystem, the more accurately you can focus your efforts.
Using these multiple guidance systems will drastically improve your chances of success. These four GPS signals can include: 1) cascading goals, 2) risk scenarios, 3) pain points, and 4) legal/regulatory/compliance requirements (see figure 1).
Figure 1—Using Multiple Perspectives to Prioritize Efforts
Guidance System 1: Cascading goals
I believe that one of the best-kept secrets in our industry today is the goals cascade. The model begins with stakeholder drivers that influence stakeholder needs. Stakeholder needs can be mapped directly to enterprise goals, IT-related goals and enabler goals. The enabler level is a more holistic view of the ingredients required to govern and manage enterprise IT. For example, if you know that a particular enterprise goal is the most important goal for the next year, then you can map that goal through the cascade and determine which processes are critical to its success. The mapping is already done for you in COBIT, which provides a set of tables linking each of these levels.
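To make the cascade concrete, here is a minimal Python sketch of walking from an enterprise goal down to candidate processes. The single mapping shown is an illustrative placeholder, not a reproduction of the COBIT mapping tables; consult the tables in COBIT itself for the real linkages:

```python
# Illustrative-only cascade (NOT the actual COBIT mapping tables):
# enterprise goal -> IT-related goals -> processes.
cascade = {
    "enterprise_to_it": {
        "Customer-oriented service culture":
            ["Delivery of IT services in line with business requirements"],
    },
    "it_to_process": {
        "Delivery of IT services in line with business requirements":
            ["DSS02 Manage Service Requests and Incidents"],
    },
}

def processes_for(enterprise_goal):
    """Walk the cascade from an enterprise goal down to candidate processes."""
    procs = []
    for it_goal in cascade["enterprise_to_it"].get(enterprise_goal, []):
        procs += cascade["it_to_process"].get(it_goal, [])
    return procs
```

With the real tables loaded in place of this stub, the same lookup would surface every process critical to a chosen enterprise goal.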
Guidance System 2: Risk scenarios
An IT risk scenario describes IT-related events that could lead to a business impact. COBIT 5 for Risk contains a set of generic IT risk scenarios and can serve as inputs to risk analysis activities and their effects on overall business objectives. This process results in the risk register and provides valuable information for informed decision-making. Use the results of this “GPS signal” to come up with the most critical risk scenarios that could hinder enterprise objectives, determine pain points or guide mitigation responses.
Guidance System 3: Pain points
Pain points are those areas that need little effort to identify. Use pain points as perspectives from which efforts toward the governance of enterprise IT initiatives are chartered. This can have a positive effect on the buy-in of your business case and create a sense of urgency and support. The COBIT 5 Implementation Guide identifies some common pain points associated with enterprise IT and maps these pain points to specific processes in COBIT.
Guidance System 4: Legal/regulatory/compliance requirements
No organization can be 100 percent compliant with everything. Synchronize this guidance system with your risk management process to determine the right response to each requirement. Some requirements are legal mandates and must be adhered to, but what level of adherence is most appropriate?
Aligning your satellites
Each of these guidance systems should result in a very clear list of high-interest areas. Devise a prioritization scheme for each of these lists and normalize them into a single list. Now that the most important areas have been identified, compared and analyzed, more focused efforts can be identified. These results can assist in scoping assurance activities, allocating and prioritizing resources, and ensuring business/IT alignment.
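As a sketch of what normalizing the four lists into one might look like, the following Python assumes each guidance system produced simple numeric scores per focus area; the areas, scales and averaging scheme are invented for illustration, and a real exercise would use whatever prioritization scheme the organization devises:

```python
# Sketch: combine priority scores from four "guidance systems" into one ranked list.

def normalize(scores):
    """Scale a dict of raw scores to the 0-1 range."""
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1  # avoid division by zero when all scores are equal
    return {k: (v - lo) / span for k, v in scores.items()}

def combine(*systems):
    """Average normalized scores across all guidance systems, highest first."""
    combined = {}
    for system in systems:
        for area, score in normalize(system).items():
            combined[area] = combined.get(area, 0) + score / len(systems)
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical 1-5 scores per focus area from each guidance system:
goals      = {"access mgmt": 5, "change mgmt": 3, "backup": 1}
risks      = {"access mgmt": 4, "change mgmt": 5, "backup": 2}
pains      = {"access mgmt": 2, "change mgmt": 5, "backup": 1}
compliance = {"access mgmt": 5, "change mgmt": 2, "backup": 4}

ranked = combine(goals, risks, pains, compliance)
```

The ranked output is the single normalized list described above, ready to drive scoping and resource allocation decisions.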
The enterprise exists to create value for its stakeholders. Realizing benefits while optimizing risks and resources requires more than one perspective, or ‘guidance system,’ to fully understand what is required. This post has identified four potential perspectives that worked for one organization. Yours might have more, but should never have fewer.
Editor’s note: Mark Thomas will deliver a keynote session on using multiple guidance systems for the governance of enterprise IT at the GRC Conference 16-18 August in Dallas, Texas, USA.
One of the most common cyber security questions I get is: How do attackers plan/carry out their attacks? I thought this would be a great topic to address since we are always asked to explain the risk of any audit observation we make. So, what is risk anyway? In a cyber security context, think of risk as the overall probability of our systems or data being compromised by a malicious individual.
Attackers (which could be insiders) make up one piece of our risk equation, the other piece being vulnerabilities. If one piece of the risk equation does not exist (attackers or vulnerabilities), then there would be no risk to our systems and/or data. Why? Because if the world were full of attackers, but our systems/data were not vulnerable to any attack, then the attackers could not steal our data. In a similar way, if we ran a system full of vulnerabilities (think Windows XP, which is no longer supported by Microsoft), but attackers simply did not exist, then there would not be a risk of our systems or data being compromised.
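The two-factor idea above can be sketched as a toy multiplication in which either factor at zero eliminates the risk entirely. The 0-5 scales and the multiplicative form are my own illustration, not a formal risk model:

```python
# Toy illustration of the two-piece risk equation from the text:
# no attackers OR no vulnerabilities -> no risk.

def risk_score(threat_level, vulnerability_level):
    """Both factors on a notional 0-5 scale; either at zero zeroes out risk."""
    return threat_level * vulnerability_level

assert risk_score(5, 0) == 0   # many attackers, no vulnerabilities (hardened system)
assert risk_score(0, 5) == 0   # highly vulnerable system, no attackers
assert risk_score(3, 4) == 12  # both present -> real, non-zero risk
```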
So, how do attackers operate? Here are some common techniques:
1. Attackers perform reconnaissance activities on the targeted organization and can gather data from the following:
2. The data uncovered during reconnaissance allows the attacker to identify who/what to target within your organization. Next, the attacker prepares and delivers the exploit to your organization. The following are common methods of delivery:
3. Once on your network, the attacker will attempt to compromise additional systems and exfiltrate your data. They do this by exploiting known and unknown system vulnerabilities, coordinating their activity through a command-and-control channel.
There you have it – those are the basic steps of an attack. I recommend you watch this video produced by Cisco that illustrates an attack better than I can. Here are some recommendations that can be acted upon:
Editor’s note: Jesse Fernandez presented on auditing cyber security at North America CACS 2017. For highlights and key takeaways from the North America CACS and EuroCACS conferences, read the CACS 2017 Conference Report.
As new technologies facilitate innovative uses of data, the corporations, governments and nonprofits using these technologies assume responsibility for ensuring appropriate safeguards over the collection, storage and purging of the data.
Highly publicized data breaches have heightened corporations’ concerns about their ability to successfully meet this task. The concern is well-founded, as the consequences of a data breach extend beyond reputational loss to include regulatory consequences as well as the possibility of class action lawsuits.
In this landscape, an audit of data privacy is a prime assessment for IT auditors to showcase the value that they bring to their organizations. This opportunity stems from data privacy relating to all areas for which organizations rely on IT auditors for expertise: providing assurance over information systems, ensuring that compliance expectations are met, and consulting on changing and emerging technologies.
In performing an audit of data privacy, inclusion of the following areas in the IT audit program is beneficial:
Data governance and classification
The primary objective of this portion of the audit is to confirm that the organization has identified and classified its data. The IT auditor’s assessment of data classification assures the organization that controls are commensurate with the sensitivity of the data. If a control requires significant resources (either in time or expense), the results of this assessment can allow management to make informed decisions on where to reduce costs or gain efficiency. Similarly, efficiency gains can be made when roles and responsibilities for the people involved in the organization’s Data Governance for Privacy, Confidentiality and Compliance (DGPC) program have been clearly defined. Well-defined roles mitigate the potential for duplicated responsibilities and the resulting inefficiency.
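One way to picture “controls commensurate with sensitivity” is a baseline table keyed by classification level, which an auditor could compare against the controls actually observed on an asset. The classes, control attributes and thresholds below are hypothetical, for illustration only:

```python
# Hypothetical classification baseline: controls scale with data sensitivity.
CONTROLS = {
    "public":       {"encryption_at_rest": False, "access_review_days": 365},
    "internal":     {"encryption_at_rest": False, "access_review_days": 180},
    "confidential": {"encryption_at_rest": True,  "access_review_days": 90},
    "restricted":   {"encryption_at_rest": True,  "access_review_days": 30},
}

def audit_gap(asset_class, actual):
    """Return control attributes where an asset falls short of its class baseline.

    Boolean controls must match exactly; numeric controls are maximums
    (e.g. access reviews must happen at least this often).
    """
    baseline = CONTROLS[asset_class]
    return [k for k, v in baseline.items()
            if (actual.get(k) != v if isinstance(v, bool)
                else actual.get(k, 10**9) > v)]
```

An auditor running this over an asset inventory would get, per asset, the list of controls that are weaker than its classification demands.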
Data security

Two of the essential areas addressed under data security are data loss prevention and authentication/credentialing. Concerns with data security often arise from the new technologies that fuel innovation, discussed earlier. For example, as an organization explores and implements tools that enhance communication and collaboration (think instant messaging, removable media and, yes, email), data sharing by those who should have access to the data is enhanced. On the other side, the intentional or unintentional ways that data can leave the organization (data leakage) also have increased.
Data leakage also can occur if weaknesses in the organization’s authentication and credentialing processes do not adequately limit access to data. However, the IT auditor’s assessment of the controls and vulnerabilities in both these areas (authentication/credentialing and the organization’s data loss prevention program) add a layer of defense to avert data breaches.
As organizations partner with vendors for data storage and other needs, ensuring the vendor’s ability to protect the data is paramount. Before organizations can conclude one way or the other in that regard, however, there must be clarity around what data the organization has and the level of protection the data requires. During its data privacy audit, the IT auditor can contribute to the success of the organization’s data management partnerships by reviewing an inventory of the data and its locations; this may not be information the organization solidly understood prior to engaging a third-party provider.
In conclusion, a data privacy audit may appear to be just another instance where the IT auditor wears the hats of assurance, compliance and consulting. Looking deeper, however, a data privacy audit presents an opportunity to contribute to achieving organizational objectives. The likelihood is strong that organizations will continue to look to manage costs and efficiency, to balance implementation of innovative technologies with mitigating the risk of data breaches, and to engage the services of third parties for data management. Given that, a conscious effort by the IT audit team to connect its data privacy audit to these organizational objectives will reinforce the value that IT audit brings to the organization.
Editor’s note: For further guidance on this topic, download ISACA’s data privacy audit program.
Despite the prominence of larger companies, the growth of small businesses and entrepreneurs also is critical to a society’s development. Entrepreneurship can drive the growth of new businesses, provide solutions for various market niches, foster innovation and generate job creation. The entrepreneurial activities of today can impact the Fortune 500 of tomorrow.
Small businesses or start-ups serve as the beginning point for many who are seeking to navigate the complexities of modern enterprise. Among the things that may be overlooked at the beginning are the implications of IT governance and security for an enterprise’s future health. Regardless of the sector, both have important roles to play in continued success. Below are some standard considerations for both areas.
General security perspectives needing consideration:
General IT perspectives needing consideration:
There is a certain excitement for an entrepreneur entering the market – the joys of prospects unknown and the hope of building a satisfied, stable customer base. However, cash flow can be a major challenge, so many things are overlooked in the rush to get the business off the ground. This can result in problems down the road, such as regulatory fines, data breaches and compliance issues, to name a few.
The alignment of the entrepreneurial vision, security and IT can provide a strong foundation to build out the enterprise. GEIT principles can be helpful in the smallest of enterprises since they can be tailored as business expands and provide the necessary checks and balances to mitigate risk. A little time at the start can be helpful in the long run to face the digital disruption roller coaster of the future.
There are far more ways to apply encryption incorrectly than there are ways to apply it correctly. Sadly, many people think they already know everything they need to know about encryption because they have read a few articles online. Recently, I published an article in which I discuss methods for assessing your HTTPS posture. While I was specifically focused on internal systems where you have some degree of control or are obligated to inform those who do have the degree of control, it is also extremely important not to overlook the necessity of performing the same type of assessment against vendor solutions.
Many times, I have pressed vendors for details regarding security only to receive the responses, “I do not have the information,” or my personal favorite, “It is encrypted.” Not having the information is inexcusable, and responding with, “It is encrypted,” is arguably even worse. It implies they cannot articulate the details and hope that you simply nod your head and do not ask any further questions.
When considering HTTPS posture, there are a few key points to keep in mind. While these points do apply to internal configurations, they especially apply to vendor-provided solutions and information:
The previous points will help drive out the true encryption details. By clarifying these details, not only does the level of security increase, but so does the level of understanding of how security is implemented.
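To make the idea of driving out true encryption details concrete, here is a minimal sketch in Python of grading the TLS details a vendor reports (or that you observe in an assessment) against an acceptance baseline. The baseline thresholds below are illustrative assumptions, not a formal standard:

```python
# Illustrative baseline: protocols and cipher markers that should prompt
# follow-up questions for a vendor. Thresholds are assumptions for the sketch.
DEPRECATED_PROTOCOLS = {"SSLv2", "SSLv3", "TLSv1", "TLSv1.1"}
WEAK_CIPHER_MARKERS = ("RC4", "DES", "NULL", "EXPORT", "MD5")

def grade_tls(protocol: str, cipher: str, key_bits: int) -> list:
    """Return a list of findings to raise with the vendor; empty means acceptable."""
    findings = []
    if protocol in DEPRECATED_PROTOCOLS:
        findings.append(f"deprecated protocol: {protocol}")
    if any(marker in cipher.upper() for marker in WEAK_CIPHER_MARKERS):
        findings.append(f"weak cipher: {cipher}")
    if key_bits < 128:
        findings.append(f"insufficient key strength: {key_bits} bits")
    return findings

# "It is encrypted" is not an answer; details like these are.
print(grade_tls("TLSv1", "RC4-MD5", 128))
# a modern configuration produces no findings
print(grade_tls("TLSv1.3", "TLS_AES_256_GCM_SHA384", 256))
```

A vendor who can supply the three inputs to a check like this has answered the question; one who cannot has answered a different question entirely.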
Read Kurt Kincaid’s recent Journal article:
“HTTPS Posture Assessment,” ISACA Journal, volume 3, 2017.
Whether in banking or any other industry, business needs take precedence; senior management regards anything less tangibly connected to organizational objectives and profitability as less important.
Information security and the CISO role have struggled to gain prominence – this despite ISACA’s best efforts, shouting from the rooftops that information security must be part of boards of directors’ agendas and that CISOs should be installed, reporting to the CEO.
During the late ’90s, the CISO position was always thought of as something connected to “IT.” It was more data security than information security. Even when I passed my CISA examination in 2005, I was given the role of “Data Security Officer” in my organization, reporting to the VP-IT.
In the banking sector, the CISO position was normally held by somebody handling network security, who reported to the CTO (GM-IT). We had a position called “head of IT,” and designating a CIO was quite infrequent.
Then, Reserve Bank of India (RBI) published a comprehensive report and recommendations of the working group on information security, electronic banking, technology risk management and cyber frauds, popularly known as the “Gopalakrishna Committee” report, in January 2011. This report not only mandated that the CISO position be held by a sufficiently senior-level official of the rank of GM/DGM/AGM, but also stated that the CISO report directly to the head of risk management. Thereafter, in most banks, the CISO position was held as a part of the risk management department and reported to GM-Risk Management, alternatively designated as Chief Risk Officer (CRO). Interestingly, the report also mandated that the CISO not have a direct reporting relationship with the CIO.
Not satisfied with the various banks’ response to continuing cyber attacks, RBI came out with a comprehensive cyber security framework consisting of baseline measures on 2 June 2016. Board level sponsorship was mandated, baseline controls were established and strict compliance was required, in addition to having a cyber-crisis management plan. The CISO position assumed huge relevance, and RBI expected the CISO to play a pivotal role.
Within a year’s time, RBI once again came out with a document clearly articulating the CISO role. Apparently wanting significant improvement in remediation of cyber security attacks by banks, the new mandate was for the CISO to directly report to Executive Director (ED) or the equivalent, overseeing the risk management function. Therefore, the CISO now has more board visibility than ever.
In addition, the regulator very clearly positioned the CISO role along with the CRO to establish a strong risk management framework. They both should have strong communication and work together to enable a holistic risk management approach.
This is a very good development, which will make cyber security in the banking sector more effective and the position of CISO more challenging and fulfilling. Both positions report to the ED, with their respective teams, credit risk management and information risk management (IRM), backing them.
With credit risk management being a proper discipline, we can soon expect that information risk management will fully mature into a robust discipline as it evolves to defend the entity against continuing cyberattacks and threats, and shapes itself to comply with associated advisories from the regulatory bodies.
Very exciting times ahead!
Category: Security Published: 7/3/2017 9:02 AM
In May, US President Trump set into motion a series of requirements to obtain an understanding of where US federal agencies stood in terms of readiness to ward off cyber attacks and assured the American public his administration valued the importance of understanding the risk, mitigating it and building a world-class workforce.
As a CISO for several organizations, including a major healthcare contractor to the US government and a global accounting firm engaged in government contracts, I have watched the evolution of the US federal government’s focus on cyber security with great care. It has always been important, but as the tools and criminals become more sophisticated, our work to benchmark our current status, manage risk and develop a highly skilled workforce of tomorrow becomes even more critical.
My professional association, ISACA, for which I’ve spent over a decade providing information security presentations and certification workshops, and whose work I am highly passionate about, also supports a strong focus on cyber security risk management and workforce development. By mid-July, the federal agencies owe the Office of Management and Budget (OMB) and Department of Homeland Security (DHS) a risk management report or current assessment. It is the next milestone in the US Executive Order Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure. Agencies will report information based on FY 2017 FISMA CIO metrics, so that everyone reports against a consistent methodology, grounded in the 2016-2017 Guidance on Federal Information Security and Privacy Requirements.
Then, the OMB and DHS will use the FISMA metrics to produce agency-specific risk assessments in a report to the President. The results will serve as a way to identify new needs and offer ways to think about how to offer services differently to improve cyber security. In the future, agencies will have to conduct reviews bi-annually.
The order also calls for an action plan for implementing a framework that reduces risks and allows the network to:
Aligning with the NIST Special Publications, as well as other associated standards and guidelines, this structure will provide a common vocabulary and process sharing across the agencies. The framework plan is also due in mid-July.
From a cyber security practitioner’s viewpoint, while the timeframe is aggressive, the order is welcomed because 1) it acknowledges the importance of our interconnected government and that risks in one agency or critical infrastructure may impact others; 2) the approach is from a risk management perspective (vs. compliance), and requires an explanation as to why choices were made (strategic, operational, and budgetary considerations) to accept the current level of risk; and 3) it indicates a desire to move to current technology and updated systems.
After the reports and planning are in, there will need to be a time for reflection—a time to analyze our current situation and how we are best served as we enter a new era. Customized training will be critical. Traditional four-year degrees in computer science may not be our path forward in our new world. We’ll need to engage partners from academia, Veterans Affairs (veterans have a strong skill set that could be leveraged) and private industry.
ISACA offers cyber security training and supports exploring opportunities for non-traditional educational training to build a hardened IT infrastructure.
As a cyber security professional who has had the opportunity to see many changes throughout the landscape of my career, I believe thinking differently allows us to offer solutions that will strengthen our borders and build a more prepared workforce.
Category: Security Published: 6/30/2017 3:02 PM
Wow. If only there were some way to defeat these terrible cyber attacks. Imagine if there were some kind of discipline, let’s call it cyber security, that contained steps we could follow to prevent malware from causing massive system outages like this.
While some news outlets have branded the latest Petya/NotPetya malware-based cyber attack as powerful and unprecedented, the reality is that only the organizations with gaping holes in their security basics were taken down. It just goes to show that no organization can afford to ignore these basic cyber steps:
So – yes – if you did those things above you would have been pretty much protected from this ransomware. But wait – is this really ransomware?
At first glance, this new malware looks like it is ransomware because it goes about encrypting files and making an empty offer to restore files in return for a payment – but that ransom payment route was swiftly blocked (the email address to send and receive the unlock key was disabled). This latest attack may be dressed like ransomware but that certainly is not the real intent.
In fact, this malware is really insidious. Like some superbug, it contains a combination of a number of different nasty components. It is not only encrypting files, but also appears to have learned from WannaCry that if you blend a lot of recent and powerful exploits, you can likely catch organizations that are slow at patch management and security response without the right countermeasures in place. Security firms are still analyzing how this malware operates, but it has been identified that it includes worm capabilities to spread within networks and is also likely to be found to be doing some credential scraping – actually stealing username and password details.
About a week ago, I retweeted this post from New York Times journalist Nicole Perlroth. That article seems to have disappeared for now – however, it did seem to point to an early version of something that sounded and looked a lot like this latest malware attack. The attack recipient was running network recording and was able to identify that the so-called ransomware was not only trying to encrypt files – but also had set about trying to steal credentials.
In my ISACA Now blog post six weeks ago, I suggested that WannaCry should have been a watershed moment. I firmly believe that it was, and this latest attack will have proven beyond any doubt that any organization of any reasonable size that thinks it can run without following all of the basic security practices will have to change its mind. There may still be some organizations that keep running unpatched, unsupported software – but it would be fair to estimate that they are not likely to stay operational for much longer at the rate that current security threats are evolving.
These are not malware attacks that are using previously unknown and sophisticated exploits – these are attacks that are downright easy to defend against. However, these attacks are also showing just how well armed the major nation-states are likely to be because both WannaCry and NotPetya have used powerful cyber exploits alleged to have been stolen from the NSA. If there are other unknown (zero day) exploits with this kind of power, it may only be a matter of time before just applying basic cyber security practices may no longer be enough.
One thing is for certain – for organizations that want to stay in business, it is time to get all of the basic cyber security practices right.
Category: Security Published: 6/29/2017 9:01 AM
If you work in technology and have a working Internet connection, chances are good that you heard of (best case) or experienced firsthand (worst case) the ransomware variant making the rounds yesterday that most are referring to as a new Petya variant. It is fast, it is sophisticated and it has left a trail of global chaos in its wake as it impacted everything from national electrical grids to banks to shipping and logistics.
While this attack would be noteworthy on its own, it is particularly so coming as it does on the heels of the WannaCry attack just a few weeks ago. The reason that this fact is both relevant and noteworthy is that it leverages one of the same transmission vectors that WannaCry did: specifically, the EternalBlue Server Message Block (SMB) exploit (i.e., CVE-2017-0144)—an SMB issue addressed by MS17-010.
It bears noting, of course, that the situation is a bit more nuanced in this instance compared to WannaCry. WannaCry was fairly simple in its operation: Exploit the EternalBlue SMB issue, establish control, execute the payload, rinse and repeat. In the case of Petya, researchers are confirming that other transmission vectors are used beyond the relatively straightforward methodology employed by WannaCry. For example, IBM X-Force is reporting that this new malware also leverages the administrative remote execution tools Windows Management Instrumentation (WMIC) and PsExec as mechanisms to move laterally inside a network once it has established a foothold. Point being, it is not just a straightforward SMB worm anymore.
That said, the link with EternalBlue is significant for us as security practitioners because it highlights something to which we should pay attention. Specifically, this issue is fixable. The SMB issue can be patched. One might also reasonably question the rationale behind allowing unfiltered SMB traffic into an environment (particularly a relatively protected one such as a clinical or industrial control network) in the first place. But the fact that this ransomware is spreading as rapidly as it is—and causing the incredible damage that it has—highlights the fact that in many organizations, actions are not (or cannot be) taken to address these issues. This, even after WannaCry caused a first round of serious chaos on the back of the same bug.
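The patch-or-filter point above can be made concrete with a quick triage sketch: flag internal hosts that accept connections on TCP 445, the SMB port exploited by EternalBlue. This is only a rough illustration; reachability of port 445 is a proxy for exposure, not proof that a host is missing MS17-010, and the host list shown is a placeholder:

```python
import socket

def smb_exposed(host: str, port: int = 445, timeout: float = 1.0) -> bool:
    """Return True if the given TCP port (SMB by default) accepts a connection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder internal addresses; a real sweep would come from asset inventory.
# internal_hosts = ["10.0.0.5", "10.0.0.6"]
# for host in internal_hosts:
#     if smb_exposed(host):
#         print(f"{host}: port 445 reachable; confirm MS17-010 is applied")
```

A check like this takes minutes to run and turns "we think SMB is filtered" into a list of hosts where that assumption actually holds.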
Is Your Risk Management Approach Working?
I apologize if that smacks of a “blame the victim” mentality. I am not calling it out to establish blame, but instead to highlight a point about this fact that is important. Specifically, there are some situations that are so dire—issues that have such a high level of potential risk associated with them—that they should cause us to at least question the status quo processes that we have guiding our organizations’ operations, meaning there are situations where the risk should cause us to approach the situation differently than might otherwise be the case. I think EternalBlue is one of those times.
There are often very good reasons why organizations cannot immediately apply every patch that comes along the exact second that they learn about it. Installing a patch is often a risky proposition from a critical business application uptime standpoint. Some applications are “temperamental,” requiring extensive testing before a patch can be applied. Other times, patching can require applying pressure to external vendors, for example, in situations when those vendors support a critical system directly due to a contractual relationship or existing business arrangement. In fact, even when employing technologies such as virtualization or patch management tools that can assist in minimizing the downtime risk associated with applying a patch, there are some systems and applications that are so critical that even a “ghost of a chance” of downtime is too great a risk to take.
And that is just the patching side of things. An organization could also have valid business reasons requiring it to allow SMB traffic directly into the production environment (even a protected or cloistered one). It might require this to support legacy applications, for example, that need to transfer files between environments—maybe those applications are both critical to business operations and challenging to replace. All these are potentially excellent reasons.
On the other hand, there is EternalBlue. There are two points I would make about this. The first one is that the ability to evaluate the potential impact of a given issue—and adjust processes as needed in a manner commensurate with that risk—is the quintessential touchstone of risk management. There are times when, even if your organization is one that has special needs with respect to applying patches or filtering inbound traffic such as SMB, the risk associated with a given issue may cause you to adjust your approach. The point? An issue such as this one is a useful time to examine performance and whether or not those risk management approaches are working as you want and expect. I am not suggesting you do this now if you are still fighting fires and dealing with the cleanup. But maybe make a note to address it after cleanup is resolved.
The second point I would make is that gaining foreknowledge about issues of this magnitude—and enabling preplanning as a result—is exactly the point of intelligence-driven security approaches. Meaning, getting a warning about an issue like this, e.g., its relative severity, ubiquity of vulnerability, is exactly why threat intelligence is valuable in the first place. So, your foreknowledge of both Petya and WannaCry (regardless of whether or not you were in the position to do anything about it in advance) is a useful barometer of the value that threat intelligence is providing to you. If you subscribe to a threat intelligence feed, have an internal threat intelligence capability or otherwise have some mechanism for advance warning and this was not on your radar, now is a good time to examine why not. Was this due to a breakdown in the process? Did the right people not hear about the issue? What were the available data and did the right people have them in time to make the right decision?
The point is, situations like these—dire as they may appear in the heat of the moment—can be a great opportunity to improve how we do things. Whether we act on them or not is what separates effective organizations from less effective ones.
Ed Moyle is director of thought leadership and research at ISACA. Prior to joining ISACA, Moyle was senior security strategist with Savvis and a founding partner of the analyst firm Security Curve. In his nearly 20 years in information security, he has held numerous positions including senior manager with CTG’s global security practice, vice president and information security officer for Merrill Lynch Investment Managers and senior security analyst with Trintech. Moyle is coauthor of Cryptographic Libraries for Developers and a frequent contributor to the information security industry as an author, public speaker and analyst.
Category: Risk Management Published: 6/28/2017 2:17 PM
Leftover qualified WannaCry victims – those that were vulnerable, didn’t get caught, and somehow continued to decline to patch – have become caught up in the next round of ransomware attacks. Victims of the attack initially referred to as Petya, including ATMs, trains, nuclear power plants, and the advertising industry, are being compromised due to a failure to conduct a couple of simple security no-brainers: back up and patch.
Incredibly, backups and patching can be hard. Why? A lot of the time we just don’t know what we have. Asset management is such a basic function it traditionally doesn’t even belong in the CISO’s domain. Maybe that’s the problem.
Governance issues aside, the Internet of Things has exploded faster than our ability to manage and protect the Things. When it comes to tackling the security of connected devices, administrators in and outside the security suite need to think (fast) about asset management. You can’t protect what you don’t know about. Version management layers on top of this, allowing systems administrators to understand what they have, and what needs attention.
Armed with this knowledge, procedures can be developed to manage and maintain systems. For those manufacturers and/or vendors that have declined to provide updates, the era of security by obscurity has come to an end. Increased transparency distinguishes manufacturers who respond to security vulnerabilities versus those that deny.
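The asset and version management described above can be sketched in a few lines: compare what the inventory says is deployed against known-current versions and surface what needs attention. The device entries, product names and version numbers below are illustrative assumptions:

```python
def needs_attention(inventory, current_versions):
    """Return devices running something older than the known-current version."""
    stale = []
    for device in inventory:
        current = current_versions.get(device["product"])
        if current is None:
            # no known update source: the "security by obscurity" case
            stale.append((device["name"], "unknown product: no update source"))
        elif device["version"] < current:
            stale.append((device["name"], f"{device['version']} < {current}"))
    return stale

# Toy inventory: versions as comparable (major, minor) tuples.
inventory = [
    {"name": "cam-lobby-01", "product": "ipcam-fw", "version": (2, 1)},
    {"name": "hvac-ctl-03", "product": "hvac-fw", "version": (5, 0)},
]
current_versions = {"ipcam-fw": (2, 4), "hvac-fw": (5, 0)}
print(needs_attention(inventory, current_versions))
# → [('cam-lobby-01', '(2, 1) < (2, 4)')]
```

The hard part, as the post notes, is not the comparison; it is having an inventory accurate enough to feed it.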
Attackers are looking to monetize their efforts, and ransomware presents them with a viable business model. These attacks will continue, pressure on IT administrators and manufacturers will mount, users will demand change, and the markets will react.
If the recent WannaCry ransomware attack did not make a clinching case to corporate entities across the world, with entities scampering to patch various computers quite reactively, then the Petya ransomware attack that followed across Europe and spread all over the world surely did.
The targets are sensational: airlines, hospitals, police stations and major financial services players.
The attacks are well-planned and targeted. Suddenly, the world has become one small village with quick sharing of the patches globally, whether the patch was identified by a research analyst in the US or a cyber lab scientist in Russia.
As predicted in early 2017, the attacks are growing in geometric proportions, both in terms of dollars spent and tenacity of the attack. The world has not yet solved the problem of patching dated computers, and now entities are scampering to isolate SMB traffic (the vector targeted by the Petya attack).
The main objective of the attacks is to create havoc, spread panic and hold entities ransom. Payments have been extracted even from government entities, law enforcement and other organizations that one would expect to take a tough stand.
The root causes could be many, from vulnerabilities not addressed swiftly enough, to the need for clinical incident management, to the proactive threat modeling and surgical execution of the perpetrators. Expect more attacks until the world gets together on more proactive threat modeling and global execution.
There is a certain satisfaction that comes from turning the tables on a seemingly unbeatable adversary. Luke Skywalker exploited a design flaw to destroy the Death Star. Rocky Balboa exploited Ivan Drago’s arrogance to win a boxing round. Sarah Connor exploited a reprogrammed Arnold Schwarzenegger to beat the T-1000 in Terminator 2.
In cyber security, the hacker community often seems as evil as Darth Vader, as cold as Ivan Drago and as relentless as the Terminator. It would be nice if there were a way to turn the tables and beat hackers at their own game.
Whether for financial gain, social activism, mischievous vandalism or other malicious motivation, hackers have been exploiting weaknesses in human nature and network defenses and making life miserable for enterprises for decades. But today, security professionals are starting to turn things to our advantage. By amassing a knowledge of the millions of techniques known to be used by hackers and combining that information with real-time threat intelligence and continuous, automated vulnerability testing, it is possible to beat hackers at their own game.
Imagine, as in The Terminator, that you could see how an adversary attacked you, understand the weaknesses they would exploit, quantify which security defenses were failing and then go back in time to fix the problem before it happens. That is what we are talking about. And it is not a point-in-time assessment that may be valid today and obsolete tomorrow. It is a constant process based on up-to-the-minute analysis and intelligence.
It is important to use breach simulations to “breach your own castle.” It is a process that not only ensures your investments in cyber security are calibrated to meet the specific needs of your enterprise, but also creates a sort of incident response muscle memory that ensures a timely, efficient response when an attack does take place.
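The breach-your-own-castle idea can be sketched as a simple loop: run a catalog of simulated attack techniques against the controls meant to stop them and report which techniques nothing handles. The technique names and toy controls below are illustrative assumptions, not the author’s product:

```python
def run_simulation(techniques, controls):
    """Return techniques that no deployed control detected or blocked."""
    gaps = []
    for technique in techniques:
        # a control is modeled as a callable that returns True if it stops
        # the simulated technique
        handled = any(control(technique) for control in controls)
        if not handled:
            gaps.append(technique)
    return gaps

# Toy controls for the sketch.
blocks_smb = lambda t: t == "smb-lateral-movement"
blocks_phish = lambda t: t == "credential-phishing"

techniques = ["smb-lateral-movement", "credential-phishing", "dns-exfiltration"]
gaps = run_simulation(techniques, [blocks_smb, blocks_phish])
print(gaps)  # techniques no control handles become the remediation backlog
```

Run continuously against an up-to-date technique catalog, the gap list is exactly the point-in-time-versus-constant-process distinction made above.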
My recent Journal article goes into detail about my company’s approach, but to improve our industry’s readiness and efficacy, we believe in sharing information and in having a robust dialog that challenges assumptions and improves processes. In that spirit, I look forward to reading and responding to your comments.
Read Danelle Au’s recent Journal article:
“Breach Your Castle for Better Security,” ISACA Journal, volume 3, 2017.
There are few industries that need strong cyber security as much as the healthcare industry. Patients often deal with life-threatening conditions, exchange large amounts of money and financial information, and must have their privacy protected when it comes to medical records.
Yet, healthcare is still one of the biggest targets for cybercriminals. In 2015 alone, according to IBM, over 100 million medical records were breached. While some organizations are committed to patient privacy no matter what it takes, most healthcare organizations are behind in terms of cyber security adoption and advancement.
Why is it that so many hospitals and care facilities are lagging when it comes to cyber security?
So what can hospitals and healthcare facilities start doing to make up for this lack of preparedness?
As with any other industry, cyber security improvement in healthcare isn’t going to occur overnight. It’s going to take ongoing commitment, by many organizations working together, for patient protection to improve. Even basic practices, like better informing staff members about potential scams and the importance of changing passwords regularly, can go a long way toward healthcare organizations better securing their networks.
Category: Security Published: 6/26/2017 3:11 PM
Employees perform emotional labor (EL) when they conform their emotions to organizational expectations while interacting with customers. They can only express appropriate emotions that are specified by certain corporate rules and conventions. While not always recognized, this is one of the many factors that increases stress for IS auditors during audit engagements.
An IS audit engagement can be stressful as EL is required at different stages in the audit:
A number of personality factors influence emotional labor. Being conscientious, hardworking, persistent and achievement-oriented leads to higher job performance. Auditors also demonstrate these characteristics by highlighting significant and relevant issues of material risk to the audit entity and the organization. The higher the position, the higher the requirement for emotional intelligence (EI). However, higher-level client management does not always demonstrate EI. They may often tend to ignore the fact that auditors have a responsibility to provide assurance about the effectiveness of controls to the board and are not around to help them with their personal responsibilities and job functions. Emotional dissonance (ED) exists as IS auditors are officially expected to behave as though their role is positively helping the organization to achieve its goals, while they really may be perceived as a potential career threat to auditees, making them look “less capable” in the eyes of management. This requires IS auditors to perform EL.
The solution to this problem would be for IS auditors and IS audit management to address the causes of resistance to change, give more consideration to organizational culture and involve the audit client in designing and implementing the change. Additionally, auditors could provide facilitation (more emotional support), make greater use of EI and use psychological capital by leveraging positive emotions. IS auditors should focus on positive aspects—what the client is doing well—and what can be done to help them do it better. As there is an understandable emphasis on roles, rules and procedures, emotions are taken out of the mix as dehumanized organizations are perceived as more successful. Despite this perception, EI should be used to make organizations and IS auditors more genuinely human.
IS auditors must perform EL deliberately at times and unknowingly at others. They are trained in a way that teaches them to suppress their emotions, which forces them into intrapersonal conflicts and eventually leads them to perform EL. If they learn how to manage and regulate their emotions, IS auditors are more likely to become more productive members of business organizations and society at large.
Kamal Khan, CISA, CISSP, CITP, MBCS, has been an IS audit manager, senior auditor and auditor over a period of nearly 30 years within a number of prestigious international organizations in the oil and gas, investment banking and utilities sectors. As a past member of the IS Audit and Control Foundation (ISACF) Research Board, he researched, served as project manager and reviewed a number of publications covering information security, audit and control. He can be reached at email@example.com.
Category: Audit-Assurance Published: 6/22/2017 3:05 PM