According to the ISACA State of Cyber Security 2017 research, 80% of respondents believe it is either "likely" or "very likely" that they will be attacked in 2017. In 2018 and beyond, based on current trends in the risks organizations face from their infrastructure, employees, supply chains and external threat actors, this figure is unlikely to drop.
Cyber threat intelligence (CTI) plays an important role in an organization's defense-in-depth strategy and is often leveraged by other cyber security functions, such as security event monitoring, incident response and forensic investigations.
To derive value from CTI, raw or processed data feeds must be analyzed and applied within the context of the organization to improve, among other capabilities, the ability to detect threats and respond to incidents.
Visibility into the design and operating effectiveness of CTI processes can provide some assurance to management and potentially support funding requests for further investment in this area. Based on that premise, below are five areas to consider when conducting a review of your organization’s CTI capabilities.
Alignment with your organization’s threat model
Commonalities exist in the threats to organizations operating in the same industry sector. However, because no two businesses are exactly alike, there is a high likelihood that each one will have a slightly different threat model.
Threat modeling is a necessary risk management step to ensure that resources are directed at controls that address the real threats to the organization. Therefore, to ensure that CTI sourced by an organization is effective, it must support an existing threat model.
A key initial part of your review should involve checking whether your organization maintains a threat model, whether the CTI sourcing strategy adds more visibility to that model and whether the combination of both supports effective decision-making when managing risk.
Quality of threat intelligence
Threat and vulnerability information originates from a variety of internal and external sources and is often ingested manually or through automation by the user organization.
Externally, sources include commercial CTI vendors, industry/community collaboration forums, and security product/vendor intelligence feeds. Internal sources include proactive vulnerability scanning, network monitoring and behavioral analysis tools.
Whether derived internally or externally, the quality of CTI is critical for it to effectively contribute toward improving an organization’s cyber security posture.
According to leading threat intelligence expert Sergio Caltagirone, the quality of threat intelligence is determined by four factors: completeness, accuracy, relevance and timeliness.
Start by obtaining a list of your organization’s internal and external sources and reviewing them against each of these factors.
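One way to record such a review is to rate each source against the four factors and rank the results. The sketch below is illustrative only; the source names and scores are hypothetical, not drawn from any real feed.

```python
# Hypothetical scoring sheet: each CTI source rated 1-5 on Caltagirone's
# four quality factors. Names and numbers are invented for illustration.
SOURCES = {
    "commercial-feed-a": {"completeness": 4, "accuracy": 5, "relevance": 3, "timeliness": 4},
    "industry-forum":    {"completeness": 3, "accuracy": 4, "relevance": 5, "timeliness": 2},
    "internal-scanning": {"completeness": 2, "accuracy": 5, "relevance": 5, "timeliness": 5},
}

def quality_score(ratings):
    """Average the four factor ratings into a single 1-5 score."""
    return sum(ratings.values()) / len(ratings)

def rank_sources(sources):
    """Return (name, score) pairs, best-scoring source first."""
    return sorted(
        ((name, quality_score(r)) for name, r in sources.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
```

A simple average treats all four factors as equally important; a real review might weight relevance or timeliness more heavily depending on the use case.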
Integration with security monitoring
There are many use cases for CTI. According to the 2017 SANS Institute Cyber Threat Intelligence report, the top use case for CTI is security operations, with 72% of respondents saying they use CTI when detecting potential cyber security events, locating sources and/or blocking malicious activities or threats.
An effective security monitoring strategy is one which correlates and analyzes data from multiple sources to detect threats before they can cause harm to the organization. Leveraging available CTI is one way to ensure the optimal use of security operations resources by focusing monitoring efforts on indicators of compromise that pose the highest risk.
Conduct a review of security monitoring procedures to determine how much CTI influences monitoring strategies.
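The idea of focusing monitoring effort on the highest-risk indicators can be sketched as a simple triage routine. This is a minimal illustration; the indicator values are placeholders, not real threat data.

```python
# Minimal sketch: bucket log entries for review priority by matching them
# against CTI indicators of compromise (IOCs). Indicator values are made up.
HIGH_RISK_IOCS = {"198.51.100.23", "evil-downloads.example"}
LOW_RISK_IOCS = {"203.0.113.9"}

def triage(log_entries):
    """Split log entries into high-, low- and unknown-priority buckets."""
    high, low, other = [], [], []
    for entry in log_entries:
        indicators = set(entry.get("indicators", []))
        if indicators & HIGH_RISK_IOCS:
            high.append(entry)
        elif indicators & LOW_RISK_IOCS:
            low.append(entry)
        else:
            other.append(entry)
    return high, low, other
```

In practice, this matching happens inside a SIEM with feeds refreshed continuously, but the principle is the same: CTI determines which events get an analyst's attention first.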
Integration with incident response
Improving visibility into threats and attack methodologies is vital to an organization’s ability to respond to incidents. Effective CTI provides insight into the intent, opportunity and capability of a cyber-attacker. It is this insight which gives an organization some assurance that it can deploy appropriate defense mechanisms to prevent a successful attack.
As part of your review, assess the degree to which CTI is integrated with the steps in your organization’s incident response approach, including preparation, detection, analysis, containment, eradication and recovery.
Measuring the impact of incidents
A post-mortem review of security incidents could give an organization insight into what worked well (and what did not) during incident detection and response and help to identify improvement opportunities.
It is worth reviewing security incidents to determine whether the use of CTI in security monitoring and incident response played a significant role in areas such as detecting unknown threats, reducing time to identify and respond to threats, and preventing significant damage to systems and data.
An assessment of the relevance of CTI to reducing the impact of security incidents could provide a view on which intelligence sources provide the best value to the organization and deserve continued investment.
The value of CTI to any organization is in its ability to support timely decision-making by stakeholders including executive management, corporate security, security operations and risk teams.
Regardless of which cyber security functions it is applied to, this is the key consideration to remember when conducting a review of the design and operating effectiveness of CTI processes.
Editor’s note: For more insights on threat intelligence, download ISACA’s threat intelligence tech brief.

Category: Security Published: 11/22/2017 3:04 PM
When I finished my proof-of-concept presentation to the CIO of a prospective client at a recent meeting, he was more than surprised – he was upset. He almost yelled at me: “How did you do it?”
For my demo, my client had to complete a paper application form used by his company’s sales force. He needed to do this by hand, as would any customer, but using a digital pen equipped not only with an ordinary ink cartridge, but also with a micro-camera that captured each trace of the pen on the paper. When he had finished the application, he checked one box at the end of it that read “Transmit.” While explaining the features of the digital pen, I opened my laptop and remotely connected to our demo server. From there, just a few seconds after he had completed the application, I could show him not only a high-quality scan of the completed application, but also all the data already translated into usable fields: numbers, dates, addresses, ready for ERP integration. He stood up in astonishment and asked: “How did you do it? How??”
This appears to be a nice example of a presentation that went so well that I took my audience completely by surprise with an emerging, unexpectedly beautiful technology. But the truth is, less than two years after launching our work with digital writing, we had to completely write off those two years of work and investment put into an offering that appeared to be “The Next Big Thing.”
Talking about our digital transformation successes is always nice, but I would like to share these five innovation facts that, from my experience, should be understood to avoid failing in this era where all of us are at the brink of launching The Next Big Thing, whether on top of blockchain or IoT or AI or machine learning technologies.
1. “Innovation Chasm” does exist. I am sure that many of you have seen the Technology Adoption Lifecycle graph that describes the Innovators, Early Adopters, etc. Well, in that graph, there is a chasm between being loved by technology fans and getting a growing majority of users that will make your product the next iPhone. In the case I described, we could not convince owners of the intellectual property in a timely fashion to simplify the pricing model to accelerate the creation of a minimum user base. Check your business model for scenarios where the chasm is bigger than anticipated.
2. Platforms and ecosystems matter. The possibilities of emerging technologies are immense, but decisions need to be made in relation to the platform or ecosystem you want to belong to or create for others. No one cares for a solution that cannot integrate and evolve for future needs. Our digital writing offering did use industry standards like XML and GSM but relied heavily on proprietary technology within the core product.
3. The “Innovator’s Dilemma” is real. Professor Clayton Christensen has said that companies are designed for the status quo and innovation efforts are killed by design. That is, although companies may not say it, they do not really want to disrupt themselves. So, your presentation to whoever approves your innovation effort needs to avoid a collision trajectory and instead explain the complementary nature of the business and customer bases that you are bringing to the table.
4. Being a maverick is cool, but … In the end, a successful launch of an emerging technology needs to be on good terms with the leading powers that will put your product in front of users. It needs to integrate seamlessly with dominant social platforms as well as with online and app stores, and be designed to quickly open its features to the newcomers that will play a dominant role in your marketplace. That is why you see such collaboration among companies that otherwise would be rivals to create the future ecosystems for blockchain, machine learning, etc.
5. ITBMS! I have a blog post called It’s the Business Model, Stupid. We have seen for several years that, in the end, all successful technology companies have managed to build a credible business model that will turn around years of losses (sorry, capital investments) by creating value for an ever-growing number of users. So, be bold in pursuing your dreams for a better world, but keep close your friends that can make sense of it in terms of a sustainable, long-term business model.
Author’s note: Jose Angel Arias has started and led several technology and business consulting companies over his 30-year career. In addition to having been an angel investor himself, as head of Grupo Consult, he participated in TechBA’s business acceleration programs in Austin and Madrid. He transitioned his career to lead the Global Innovation Group in Softtek for four years. He is currently technology audit director with a global financial services company. He has been a member of ISACA and a Certified Information Systems Auditor (CISA) since 2003.

Category: Risk Management Published: 11/21/2017 3:10 PM
Ransomware holds a tight grip on its victims and their most valuable data and is a global epidemic reaching all corners of the world.
The infection vectors most commonly used by ransomware are email attachments, links in emails, compromised websites and malvertising. The first type, attacks via email attachments, can be intercepted by a security or gateway appliance before a user even receives the lure.
When an attack is using a website that security products have already identified as having been compromised or hosting malicious behavior, it can be blocked by looking at the domain or IP used in the link embedded in the email or the URL visited by a user. In practice, however, simple blacklisting approaches suffer from the relatively short lifespan of these drive-by landing pages.
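A blocklist lookup of the kind described here can be sketched in a few lines. The domains and IP below are placeholders, not real threat data, and a production system would also normalize hosts and check IP ranges, not just exact strings.

```python
from urllib.parse import urlparse

# Illustrative blocklist of known-bad domains and IPs (placeholders only).
BLOCKLIST = {"malicious.example", "203.0.113.50"}

def is_blocked(url):
    """Return True if the URL's host appears on the blocklist."""
    host = urlparse(url).hostname or ""
    return host in BLOCKLIST
```

The weakness the next paragraph describes is visible even in this sketch: the check is only as good as the blocklist, and drive-by landing pages are often retired before they are ever listed.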
To cope with this problem of blacklisting short-lived content, security solutions must find the attack “on the wire.” This means that the system either proactively probes for the content of a website or it waits until a real user is tricked into following the link to the exploit site and finds the attack in the live traffic.
However, not all attacks make use of exploit kits; often, victims are simply tricked into downloading and running the ransomware payload. Thus, security technologies need to intercept these downloads and evaluate whether the file is safe to be opened by a user—typically by running the program inside a sandbox.
As ransomware evolves, it is imperative for enterprises to adopt solutions that intercept ransomware on the wire to protect their users from these emerging and ongoing attacks.
Read Clemens Kolbitsch’s recent Journal article:
“Evasive Malware Tricks: How Malware Evades Detection by Sandboxes,” ISACA Journal, volume 6, 2017.
The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is a central concern of US organizations that are in any way involved with the creation, access, processing or storage of sensitive confidential health records – electronic protected health information (ePHI). The Security and Privacy Rules are a particular point of focus since violation of those guidelines often leads to federal fines and settlements; those parameters are covered under Title II of HIPAA.
A newer piece of healthcare legislation is the Health Information Technology for Economic and Clinical Health Act (HITECH) of 2009. The first act is typically discussed in terms of concern with security and privacy of health records, while the second is generally described as increasing the implementation of digital health records and technologies. However, Subtitle D of HITECH is specifically focused on issues of security and privacy of electronic health data; it achieves this end by modifying and elaborating on those parameters within HIPAA. Essentially, if an organization is HITECH-compliant, it is compliant with the most recent HIPAA security and privacy stipulations contained within the 2013 Omnibus Rule.
HITECH gives professionals a chance to work with an access governance model so that they can better control who does and does not get access to information – particularly for any systems that contain ePHI. When companies build the lessons of HITECH into the structure of their organizations, they operate at lower cost and can create more efficient workflows for managing access risk. Both the reduction in operating cost and the streamlining of workflow improve the security of the organization while boosting its value.
To consider that specific notion of value from a security system, it helps to look at the return on security investment (ROSI) of a HIPAA-compliant system – and we can use the analogy of soccer.
ROI and ROSI—Like Offense and Defense
Return on investment (ROI) and return on security investment (ROSI) initially seem to be almost identical concepts. However, you can start to understand what makes them dissimilar when you think about how you arrive at an ROI figure: add up the gains, subtract the cost and divide the difference by the cost. It is immediately clear that this formula will not work for security: you will not typically profit from adding security measures. Instead of focusing on gain, the intent of the ROSI concept is to limit your losses and protect your organization’s value from that perspective. For that reason, rather than thinking in terms of gain and scoring goals as you would with a soccer team, think in terms of not letting the other team score.
You can figure out how much value is being achieved with your security controls by performing a quantitative risk assessment, as noted by the European Union Agency for Network and Information Security (ENISA). In order to come up with your ROSI number, you need to first look at other data: the ARO, SLEs, ALE and mALE.
The single loss expectancy (SLE) denotes the total cost of a single security incident. The annual rate of occurrence (ARO) is the probability that the incident will take place during a single year. The annual loss expectancy (ALE) is the complete loss from security incidents throughout the year. Finally, the modified ALE (mALE) is the ALE, plus whatever losses are avoided through adoption of the security mechanism – as expressed by the mitigation ratio (the percent of threats the solution is able to counter).
To get the ROSI itself, you want to multiply the ALE by the mitigation ratio (producing the mALE), and then subtract the cost of the security apparatus. Divide that total by the cost of the security plan. The end result is the return on security investment.
In other words, you will get the ROSI figure by adding up your loss reduction numbers, subtracting how much you spent on the security mechanism to achieve that loss reduction, and then dividing by the amount you spent on the protective system. As with ROI, a higher number is better; the difference is that ROSI measures losses avoided rather than gains made.
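The ENISA-style calculation above can be put into a short sketch. The dollar figures are hypothetical, chosen only to make the arithmetic easy to follow.

```python
def rosi(ale, mitigation_ratio, solution_cost):
    """ROSI = (ALE * mitigation ratio - cost of solution) / cost of solution."""
    male_gain = ale * mitigation_ratio  # annual loss avoided by the control
    return (male_gain - solution_cost) / solution_cost

# Worked example (hypothetical figures):
# SLE = $50,000 per incident, ARO = 2 incidents/year -> ALE = $100,000.
ale = 50_000 * 2
# A control that mitigates 75% of that loss and costs $25,000/year yields
# ROSI = (100,000 * 0.75 - 25,000) / 25,000 = 2.0, i.e., a 200% return.
```

A ROSI above zero means the control avoids more loss than it costs; at exactly zero the control only pays for itself.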
Problems with ROSI
What exactly is the loss reduction, though? By subtracting the annual loss expectancy once the security system is implemented from the annual loss expectancy prior to its adoption, you get the loss reduction. The issue is that the second figure is not easy to measure accurately, with confidence. The figure often has more to do with suggestions made within individual projections and broader polling than it does with real objective measurement.
Pete Lindstrom has said that what must be involved when looking at any solution is effectively a “gut check,” asking oneself point-blank if the amount spent on security achieves a loss reduction that justifies the cost.
As you can see, ROSI can be problematic when it is taken too seriously as an absolute. For greater accuracy when determining the value of security, it helps to consider security from several perspectives and factors, as indicated by Steven J. Ross, CISA, CISSP, MBCP. First, there is the notion of a threshold condition for adequacy of security solutions, without which a business could not be sold because protections do not meet standards of “adequacy.” A higher degree of security would be sufficiency – based on an independent metric that goes beyond the needs of adequacy. Intellectual property should be factored into any estimation of the worth of security solutions, since that asset is being protected. Finally, security should be considered in terms of facilitating sales, since security solutions will often lead to greater revenue.
In the context of healthcare, you want to consider how precious the ePHI is. Because of the various costs related to compliance and general data protection, expenses incurred in a healthcare data breach are diverse, ranging from forensics to breach notifications to lawsuits to lost revenue to lost brand value to post-breach cleanup – and that doesn't even include the federal fine. By implementing industry standards such as those of ISACA, you can systematize your controls and auditing, resulting in security that you and your clients can trust – a genuinely valuable data defense.
Author’s note: Adnan Raja is the Vice President of Marketing at Atlantic.Net. During his tenure, Atlantic.Net has grown from having a primarily regional presence to garnering and developing attention nationwide and internationally.

Category: Government-Regulatory Published: 11/20/2017 3:04 PM
As a 2003 CISA recipient and a former honorary secretary of the ISACA Singapore Chapter’s board of directors, I am honored to be selected as the ISACA liaison to the International Organization for Standardization (ISO) Technical Committee 309 – Governance of Organizations.
Having served nearly three years as the chair of the US Technical Advisory Group to ISO Project Committee 278 to help develop, draft and evangelize the ISO 37001 Anti-Bribery Management System Standard, I see this as a wonderful opportunity to not only keep both the ISACA and TC-309 communities informed of significant developments in the world of governance and compliance, but also to help shape and develop newly proposed ISO standards while supporting and strengthening existing ones.
As you may already be aware, TC-309 is focused on standardization in the field of governance relating to aspects of direction, control and accountability of organizations, and is responsible for:
The symbiotic relationship of COBIT and ISO governance and compliance standards, particularly in the realms of data governance, privacy, security in the cloud and the Internet of Things, likely goes without saying. However, having the opportunity to proactively and positively engage, inform, shape and contribute to this relationship with fellow subject matter experts from 40-plus countries is rare, and I thank ISACA for enabling me to participate in this partnership.
Author’s note: Judd Hesselroth is a Director in Microsoft’s Office of Legal Compliance, where he has focused primarily on anti-corruption programs and ISO 37001 since 2010, and prior to that, internal audit.
This is a story about researching a simple question: Why are there so many vulnerabilities in information systems? One answer that might strike a chord with ISACA members is: “failure to listen to experts.”
Many of us have spent years advising companies to adhere to the principles of security by design and privacy by design, yet some still ship products with holes in them, vulnerabilities that leak sensitive data or act as a conduit to unauthorized system access. We’ve been teaching cyber-hygiene to end users since before it was called that, and we’ve all encountered organizations that don’t listen to our warnings about the risks inherent in their deployment of digital technologies.
But why do some people not listen to experts? I decided to study this question with help from my research colleague at ESET, Lysa Myers. We found an established body of research that examines the way people perceive risk and explores the ways in which risk communication can become more effective. Many of these studies centered on the rejection of warnings about risks inherent in successive waves of technology. For example, some were funded back when people argued about the risks from nuclear power and radioactive waste disposal. More recent research has explored why so many people don’t heed the warnings of climatologists.
Many studies used survey questions phrased like this: “How much risk do you believe [this hazard] poses to human health, safety, or prosperity?,” where this hazard might be global warming, genetically modified foods, and so on. Responses to these questions revealed interesting patterns when subjected to demographic analysis, particularly when that analysis included profiles derived from the cultural theory of risk perception (CT for short). According to this theory, we tend to perceive risk in a way that affirms our understanding of social structures and our place within them.
People who see society as a hierarchy of individuals rather than as a community of equals typically rate global warming less risky than folks who are more egalitarian and communitarian. Studies also found that, as a group, white males rated risks from a variety of technologies lower than white females, non-white males, and non-white females. Dubbed “the white male effect” by researchers who first observed it in 1994, this phenomenon appears to be caused by a subset of white males drastically under-rating risk relative to the mean (these men are predominantly hierarchical individualists with above average education and income).
What we didn’t find in our literature review was comparable surveying around risks arising from digital technology, so we conducted our own. We mixed six digital hazards in with nine risks unrelated to information systems, like air pollution. Using Survey Monkey, we polled more than 700 adults in the US. Our first surprise when analyzing responses was that “criminals hacking into computer systems” rated higher than any other risk, ahead of air pollution and hazardous waste disposal. A second digital hazard, theft or exposure of private data, rounded out the top four.
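The demographic breakdown used in these studies amounts to comparing mean risk ratings across groups. As a toy illustration only (the respondents and ratings below are invented, not our actual survey data):

```python
# Invented survey rows for illustration; each respondent rates a hazard 1-5.
responses = [
    {"group": "white male",   "hazard": "hacking", "rating": 4},
    {"group": "white male",   "hazard": "hacking", "rating": 2},
    {"group": "white female", "hazard": "hacking", "rating": 5},
    {"group": "white female", "hazard": "hacking", "rating": 4},
]

def mean_rating(rows, group, hazard):
    """Average rating a demographic group assigns to a given hazard."""
    vals = [r["rating"] for r in rows
            if r["group"] == group and r["hazard"] == hazard]
    return sum(vals) / len(vals)
```

A gap between the group means is the kind of signal that, at scale and with proper statistical testing, underlies findings like the white male effect.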
These results suggest that a significant portion of the American public now “gets” that digital technology brings serious risks, but what did our survey tell us about communicating with those who don’t “get” it? We did find a white male effect in our sample, but it was less pronounced for digital risks. The cultural alignment of respondents followed earlier studies for global warming, but looked quite different for digital risks. That tells me there is more work to do in this field, but we can improve our risk communication skills by learning from the work of those studying how cultural theory informs the science of science communication.
I encourage you to read Dan Kahan’s articles on this at CulturalCognition.net, and hope to see more people studying why the advice of information security experts is not universally embraced.
For more of our results, see our slides on SlideShare: https://www.slideshare.net/secret/j6a7vyrtlEgzOf.

Category: Risk Management Published: 11/14/2017 3:03 PM
Without a doubt, the information security space is experiencing a dramatic increase in hiring. Finding qualified candidates is continuing to get more difficult, and the duties of managers are steadily increasing. As a result, hiring managers and human resource recruiters are looking for ways to make the process more efficient. Because most certifications in the information security industry come with experiential requirements, the search for candidates possessing industry credentials is seen as a good way to achieve this goal. However, other challenges begin to surface if the proper value of certification is not considered, which I explore in further detail in my recent Journal article.
I personally value certification in the hiring process and use this as a tool to screen potential employees before evaluating their resumes. Some scoff at this idea, as there are many qualified candidates without certification. While these candidates will almost certainly be filtered out, there are few better qualifiers to help parse through resumes and candidate requests in an efficient manner.
Whether it be on Internet forums or in discussion with industry peers, there are widely varied opinions about requiring certification as part of a job search. It appears that this practice is taking place in many organizations—glancing through job postings recently, I have seen many job postings requiring certification. Pushback from a few of my peers in the industry caused me to reevaluate my stance and to dig deeper into understanding the value certification brings to the process, the person and the organization. While my evaluation was not scientific in nature, it highlights many experiences I have had over the years as a hiring manager and is an aggregation of conversations I have had with many of my peers over the past year.
I suspect that some may feel that certification is becoming irrelevant or that candidates do not possess the skills that are expected, but if you put certification in the proper context, I truly feel that it helps in the hiring process and also helps identify a great employee with some of the positive characteristics I mention in my Journal article.
Read Thomas Johnson’s Journal article:
“The Value of Certification,” ISACA Journal, volume 6, 2017.
“What could cause a digital Armageddon?” That is a popular question to pose to information and cyber security professionals, and when asked, I don’t hesitate: Quantum computing.
While the principles of quantum computing are certainly complex, at a high level the risk from quantum computing can be understood fairly quickly. Unlike a digital computer bit, which can only be a zero or a one, a quantum bit, or qubit, can be a zero, a one, and everything in between – all at the same time. For those who are not quantum physicists, this can be mind-blowing, but the result is that a quantum computer offers such a huge speed-up on certain problems that some previously thought nearly impossible to solve may soon become tractable.
For instance, it isn’t a question of if, but when, today’s cryptography that protects the Internet will be broken. Some experts have said that this is likely to occur in the next 3-7 years – it’s just a matter of having enough qubits, and it will likely take 100 to 300 qubits to fuel a quantum computer powerful enough to do this. Working quantum computers with fewer qubits already have been developed.
In addition to governments like the United States and China, today, there are major companies – IBM, Google and D-Wave – that are pursuing quantum computing. D-Wave already has quantum computers available for purchase commercially (including one with 2,000 qubits), but its systems are primarily useful for solving optimization problems, rather than for general purpose, and are not suitable for breaking cryptography. IBM is working on a general-purpose quantum computer that likely would be suitable. Earlier this year, IBM announced that it had built a working prototype with a real quantum processor and 16 qubits. Google indicated that it had a prototype with 22 qubits. Money plays a role, as quantum computers must be cooled to almost absolute zero (the temperature of outer space) to operate, making them very expensive and something that only large corporations and governments would be able to afford.
The underlying security of the Internet today is primarily based on the difficulty of factoring large semi-prime numbers. Peter Shor published a quantum algorithm for factoring semi-primes 20 years ago, but it requires a quantum computer to implement. With today’s computers, it would take thousands upon thousands of years to factor a large semi-prime; with quantum computing, that timeframe is potentially slashed to minutes, or even seconds.
Not long ago, Shor’s algorithm was implemented on a small quantum computer with four qubits to quickly factor small semi-prime numbers (like the number 15), and was able to do so in a matter of seconds. If replicated with a future, more powerful quantum computer to handle larger semi-primes like the ones that form the foundation used to encrypt the Internet, the security of the Internet would essentially be broken. This will occur as soon as a quantum computer is available with sufficient qubits.
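To see why classical factoring is so slow, consider the naive approach. The sketch below uses trial division, which works instantly for a toy semi-prime like 15 but scales with the square root of n, so doubling the bit length of n roughly squares the work. That asymmetry is what RSA-style cryptography relies on, and what Shor's algorithm removes.

```python
def factor_semiprime(n):
    """Classical trial division: trivial for 15, hopeless for 2048-bit semi-primes.

    Tries every candidate divisor up to sqrt(n); returns the factor pair,
    or None if n is prime.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None
```

Running `factor_semiprime(15)` mirrors the 4-qubit demonstration mentioned above, but only because 15 is tiny; for the semi-primes used in real key exchange, the loop would not terminate in any useful timeframe.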
Post-quantum cryptographic solutions have been proposed. NIST considers quantum crypto breaking a serious enough risk that it has issued a call for papers on the subject, with the deadline upcoming later this month. Experts and scientists have been working to find solutions that can be implemented into the Internet to replace the current method we’re using now – hopefully, before powerful enough quantum computers come out and disrupt the Internet’s security.
Likely because of the complexity of the scientific principles – you need to be a quantum physicist to have a true appreciation for Shor’s algorithm – this topic does not generate nearly as much attention as it should. At this stage, the risk from quantum computing is well-understood by top cryptographers, but by few others. That will certainly change, and I would love to see members of ISACA’s global professional community play a leading role. Information security and business technology professionals, executives and boards should closely monitor the situation, follow how things progress with NIST, and begin giving thought to what could unfold in the coming years.
Ultimately, quantum computing could have staggering implications on our professions, and society as a whole, transforming everything from space exploration to the financial markets.
In the meantime, the next time someone asks you what could cause digital Armageddon, you should not need to hesitate to come up with your response.
In my last post, I spoke about the Internet of Things (IoT) in terms of trust, security and privacy at a high level. Here, I will take a deeper dive in terms of how IoT security and privacy can impact an ecosystem interconnect.
When we talk about IoT, we think about the process we implement as we migrate to sensor-driven infrastructure for automated processes.
Looking at economies and technology ramp-up trends from a financial perspective, we can expect standardization around policies and processes, as well as around the interfaces that connect sensors to networks, platforms and application systems, or a combination of services.
It can all appear to be complex and large scale, especially in the borderless world of IoT. However, if as security and privacy professionals we ask ourselves, “What are the major areas we should focus on?” my perspective is that we will have to look at:
IoT PriSec Model
The team at The Cyber Policy and Security Governance Institute has been developing an IoT PriSec Model. This model:
One area that will have an impact on IoT environments, given that the growth of cloud and big data are enablers of IoT, is that of unikernel security.
In the paper “Unikernels: Library Operating Systems for the Cloud,” A. Madhavapeddy and team describe a unikernel as follows: “In the context of virtual machines and cloud computing, it makes sense to describe the whole virtual machine as a unikernel.”
Bratterud, Happe and Duncan presented a paper on “Enhancing Cloud Security and Privacy: The Unikernel Solution,” which lists six properties exhibited by unikernel systems as follows:
In a following piece, I will present further details on this aspect, as well as other areas that we are seeing leading IoT vendors focus on from a security and privacy best practice perspective.
Cyber security is now on the agenda in board rooms. The threats and risks in cyberspace are significant enough to warrant attention at the highest levels.
In 2017, those conversations have often focused on ransomware, as the global community has experienced a large number of related incidents. Organizations are anxious to ensure that the necessary approach and countermeasures against ransomware are understood and implemented. Security professionals therefore need to update their knowledge on the subject.
My session later this month at Asia Pacific CACS 2017 – titled “WannaCry? No, Wanna Get Wiser” – discusses the theme of ransomware. The session will introduce the ransomware attack chain, or methodology, that depicts the steps ransomware takes to infect targets. Gaining an understanding of the methodology adopted by malware is an essential step toward defeating its nefarious designs.
The session will identify the different stages of ransomware infection. This knowledge is useful in designing the countermeasures to be deployed at different stages of a ransomware attack to counter it. The session also will provide recommendations to build resilient operations and capable organizations to counter ransomware.
The WannaCry attack from earlier this year will provide the perfect backdrop for the discussion.
Author’s note: I am happy to share that my book – Pilgrims In The Digital World – will soon be published and available to book-lovers. The book discusses various facets of technology and the digital world – the opportunities and innovations, the issues, the threats and the response that can make the digital world a safer place. The book, written both for the general reader who does not possess much technical background and for technology experts, takes a non-technical approach to discussing the larger issues related to technology and the digital world.
Category: Security Published: 11/9/2017 3:06 PM
Cyber security gets a lot of discussion in terms of small business, but what few outside the industry know is that many cyber attacks actually strike much closer to home. In fact, thousands of attacks occur in the home. Part of the role of security practitioners moving forward can be to educate homeowners and help them protect their households with stronger, more secure solutions.
Hackers target home “security” systems
The entire objective of a home security system is to keep threats out of the home. A security system is designed to be both a deterrent and a defense mechanism. But while most security systems are focused on physical threats – like burglars – the rise of internet-connected systems has created an entirely new risk category. With some basic hacking strategies, cybercriminals can gain access to security cameras, disarm alarm systems, and prey on homeowners and their families.
SimpliSafe and other Internet-connected systems have proven vulnerable to manipulation – something most homeowners aren’t aware of. It’s the job of the security community to be part of the solution and help educate homeowners and customers on the risks, while providing them with specialized guidance that helps them select security systems that are actually secure.
“The impression that I’ve got is that the home security product industry isn’t really actually putting any effort into security, whether it’s because they don’t realize the problem, or they don’t care, is not something I’m going to be able to tell you. It’s not just the SimpliSafe system that’s insecure,” Dr. Andrew Zonenberg, a security consultant, told Forbes. “These people are advertising security products that provide little to no actual security.”
While you may have an obligation to sell and drive revenue for your business, you shouldn’t be doing it at the expense of selling products that have loopholes and deep-seated issues. Believe it or not, there’s a lot of money to be made from telling the truth and establishing yourself as an authority figure in the industry. There are only a few people currently doing this, and you can make a name for yourself by opening up.
James Risley of Security Baron is the perfect example. He’s constantly publishing high-quality content that puts clients first and products second. One topic that he’s really passionate about is the hacking of cameras, which actually happens fairly frequently.
“If you’re looking for a camera, ensure that you’re buying from a company that updates its firmware in response to security flaws,” Risley tells his audience. “Many DIY systems make this a manual process, but more popular cameras like the Nest Cam or Logi Circle work in the background. Also, always update your passwords on your IoT devices as soon as you set them up. Ideally, you want a secure, unique password for each device you own.”
Whether it’s buying a used car or installing a smart security system in a million-dollar home, consumers want to understand the pros and cons of purchases and appreciate the transparency they receive from vendors. Be a part of the solution – not the problem.
Are you doing your part?
The worst thing about the loopholes found in security systems is that these systems are designed for protection. People install cameras and other connected devices in their homes with the purpose of being secure. The fact that they could actually be introducing more risk is rather alarming – no pun intended. The more honest you are with consumers – and the more you work to improve the integrity of smart security solutions – the better the industry will be.
I admit it … I am one of the 143,000,000 people affected by the Equifax breach. For those of us who reside in the US, that number approaches 60% of all adults, based on recent numbers from the US Census Bureau. Perhaps most unsettling is that failing to perform something as routine as a timely patch produced an event so catastrophic that it cost the CISO, CIO and CEO their jobs. Make no mistake about it, accountability for cyber resilience is in the boardroom and rests heavily on the shoulders of those in the C-suite. This is accentuated by data from a recently completed study by ISACA and MIT, which overwhelmingly confirmed that CEOs and boards are leading enterprise digital technology initiatives.
Strong oversight of cyber security is now a critical component of organizations’ overall governance of their information and technology, and on that front, there remain some steep hills to climb. ISACA’s new Better Tech Governance is Better for Business research shows that only a little more than half of senior business leaders think their organization’s leadership team and board are doing all that they can to safeguard the organization’s digital assets, and less than half of boards intend to fund a significant expansion of their cyber defenses in the coming year, despite expanding attack surfaces and daily changes to the threat landscape.
There is much in the media and literature today calling for increasing technology competency in directors and senior executive leaders to achieve better oversight of what’s happening in the enterprise operations. There are also repeated calls for boards and the C-suite to further invest in cyber security and risk management, not only as a path to averting disaster, but as an enabler of the innovation required to thrive within a rapidly changing and increasingly complex technology landscape and regulatory and compliance environment.
The answer seems simple enough: recruit some new subject matter experts who can ask the right questions to serve on the board. While this is a good start, there’s still something missing: the fundamental ability to qualitatively and quantitatively measure the capabilities of an enterprise, allowing the enterprise to build its cyber resilience.
A CISO for a leading global payment company recently shared with me his story of being asked by a director on the company’s Board, “Are we safe?” His response was, “I think so,” to which the director retorted, “What do you mean you think so?” The story was instructional for me, confirming the need for ISACA and our CMMI Institute subsidiary to work with industry leaders on the development of a risk-based, enterprise-wide self-assessment that presents a holistic view of an organization’s established capabilities to protect and defend itself from cyber security attacks. Upon completion of the assessment, a report will be provided indicating the current state of the enterprise, including views on how the organization compares to other organizations of similar size, geographic location or industry. Assessment outcomes can be used by boards and senior executives to understand the current state, along with a roadmap to improved cyber resilience that can serve as the basis for further risk management-based and business-focused investments. CISOs and board members won’t need to think their organization is safe; they will know it is.
With industry and government support, along with stakeholders in our professional community, this assessment can evolve into a community accepted “universal consensus model” to measure progress in our respective industry sectors. Without such a tool, organizations, many of which are struggling to find tech-savvy board members, will continue to operate with incomplete or misleading information to decide how to invest in the equipment, training and personnel required to build and maintain effective security programs.
The pressure on today’s executives when it comes to reliable cyber security and risk management is significant. The job of leading and managing these critical enterprise concerns is anything but easy. The days of cyber security being treated as a technology concern have passed us by. Cyber security is now and will remain a strategic business risk that, if properly managed, can fortify an enterprise to effectively and securely innovate. Perhaps the timing is now right for this new ability to measure cyber resilience, thereby creating the rising tide that will raise all ships.
Editor’s note: This blog post by ISACA CEO Matt Loeb originally appeared in CSO.
Category: Security Published: 11/7/2017 3:08 PM
I have developed and published a risk-based management methodology for third-party data security, risk and compliance. It provides process guidelines and a framework for enterprises’ boards of directors and senior management teams to consider when providing oversight, examination and risk management of third-party business relationships in the areas of information technology, systems and cyber security.
Through my business relationships and the research I have conducted, I have found that a number of professional surveys indicate that information technology and security managers, directors and executives report significant data breaches linked directly or indirectly to third-party access. Unfortunately, these security breaches are trending upward.
I have also found that there is an absence of a structured and quantifiable methodology to measure the risk a third party poses to an enterprise, and to define what evidence is required from the third party to substantiate that sound risk management is in place.
Types of Risk a Third Party May Have on an Enterprise
When a third party stores, accesses, transmits or performs business activities for and with an enterprise, it represents a potential risk to the enterprise. The degree of risk and its material effect are highly correlated with the sensitivity and transaction volume of the data.
Outsourcing certain activities to a third party poses potential risk to the enterprise. Some of those risk factors could have adverse impacts in the form of, but not limited to, strategic, reputational, financial, legal or information security issues. Other adverse impacts include service disruption and regulatory noncompliance.
I have to emphasize that the third parties include, but are not limited to, technology service providers; payroll services; accounting firms; invoicing and collection agencies; benefits management companies; and consulting, design and manufacturing companies. Most third-party commercial relationships require sending and receiving information, accessing the enterprise networks and systems, and using the enterprise’s computing resources. The risk posed at different levels and the impacts range from low to very significant.
In my experience, it is critical to share with enterprise management teams that outsourcing an activity to an outside entity by no means removes the responsibility, obligation or liability from the enterprise; these outsourced activities are considered integral and inherent to operations. As a result, the enterprise is obliged to identify and mitigate the risk imposed on it by third-party commercial relationships.
I encourage subject matter experts and professionals with management responsibility to read my Journal article describing this methodology and its quantifiable representation of a risk-based management approach to third-party data security, risk and compliance.
Read Robert Putrus’ recent Journal article:
“A Risk-Based Management Approach to Third-Party Data Security, Risk and Compliance,” ISACA Journal, vol. 6, 2017
I recently received my CGEIT exam result, with a final score of 557. It is not an elite score, but surpassed the required number of 450. I was happy with this result, and glad about my CGEIT learning journey.
For me, each autumn is a yearly planning and budget discussion season. It has become harder to balance all stakeholders’ expectations and to keep pace with the fast-changing business landscape. Through CGEIT preparations, I could verify my perceptions, discover theoretical systems to support my ideas, and find more methods to convince others.
Let me share my lessons learned in preparing for the CGEIT. I hope it is helpful for your preparations.
For me, the “journey” took about two months, from getting two books – the CGEIT Review Manual and CGEIT Review Questions, Answers & Explanations Manual – to passing the exam. Because my daily job is very busy, I estimate I spent about 30 hours in total to read the books and other related materials.
My approach was:
Note: As everyone’s knowledge gap is different, the time required for step two will vary widely.
Candidates have four hours to complete the exam; I spent about 130 minutes on the 150 questions, so everyone should have enough time to finish. The questions are well designed to match real business situations. If a candidate has the capability to make proper business decisions in his or her daily work, getting the right answer is no problem.
Lastly, I want to recommend three other resources for candidates who want to start the CGEIT journey.
Good luck with your CGEIT journey!
Category: Certification Published: 11/6/2017 3:09 PM
Employees are at their best when they are encouraged to take calculated risks, rather than becoming complacent with what they know and what has become comfortable. The same holds true for enterprises.
Some of the best risks enterprises can take in our technology-driven business landscape involve deploying transformative technologies that allow them to connect with customers in new and innovative ways. Yet, in many cases, organizations are failing to capitalize on the widening array of opportunities.
ISACA’s new Digital Transformation Barometer research shows that only 31% of organizations frequently evaluate opportunities arising from emerging technology. Given the swift pace with which technology is introduced and refined, this shows that most enterprises are undercutting their ability to seize marketplace opportunities and better serve their customers.
Boards of directors and the C-suite should be challenging their operational teams to research, pilot and ultimately become experts in emerging technologies capable of transforming their enterprises. Big data, artificial intelligence, Internet of Things devices and blockchain are just a few examples of technologies capable of delivering transformational change. To lead effectively, senior leaders have to be able to articulate the future vision for their companies in the context of the technologies that will get them there.
There isn’t a board chair or CEO on the planet who would not be thrilled to open new revenue streams or reach new customers – some of the top motivators for pursuing digital transformation. So, what is holding so many organizations back? A shortage of digitally fluent leaders is one impediment. Only a little more than half of survey respondents expressed confidence that their organizations’ leaders have a solid understanding of technology and its related benefits and risks. ISACA’s research shows that those organizations lacking digitally fluent leadership are less likely to evaluate technology opportunities.
Even those organizations that perform their due diligence in vetting new technologies often develop reservations once more is learned about the associated risks. A whopping 96% of survey respondents believe there is high or medium risk in deploying IoT devices, and more than 9 in 10 respondents also categorized public cloud and AI/machine learning/cognitive technology as posing medium to high risk.
The reality is every new technology introduced expands the attack surfaces and presents new risks. Organizations must move beyond that inherent discomfort and devote the necessary resources to mitigate risk to acceptable levels. Enterprises with effective information and technology governance programs can deliver better customer experiences, innovate more, and improve their business performance and profitability. Investing in well-trained, highly skilled professionals in areas such as audit, risk, governance and cyber security can provide enterprises the confidence they need to effectively and securely leverage their technology. Organizations should also resist the urge to take shortcuts in pilot testing or research and development when evaluating new technologies.
It’s important to have realistic expectations about digital transformation. Not every turn of the wheel on an enterprise’s journey can be a smashing success, and organizational leaders must give their team members the freedom to take a well-reasoned risk that may – or may not – yield the anticipated results. Those failures can provide unparalleled learning opportunities.
Organizations that remain committed to digital transformation can reap great rewards. From telecommunications giant Sprint tapping into big data, to a town in North Carolina, USA, shedding the yoke of legacy applications, there is no shortage of examples of enterprises large and small successfully harnessing digital transformation.
As the Latin proverb goes, fortune favors the bold. Enterprise leaders should embrace that mindset and make digital transformation a centerpiece of their organizations’ roadmaps toward a prosperous future.
Category: Risk Management Published: 11/14/2017 9:00 PM
Enterprises are becoming increasingly digital. Consider a bank that refers to itself as an information technology firm that happens to process financial transactions. Or, perhaps a manufacturer that likewise refers to itself as a technology company. The management of data is critical to all enterprises.
A breach can cause enormous harm outside of the core business of the enterprise. Target had a significant data breach that caused the company material damage. Technology firms are obviously at risk. Witness the recent breach at Equifax – the repercussions of that event are still being measured.
The short story is that no matter what business you’re in, data must be cared for!
The Getting Started with Data Governance using COBIT 5 paper looks at these issues from the perspective of using enablers to put goals and internal controls in place that will assist in the good shepherding of data. The paper extends the application of the COBIT 5 framework to the practice of data governance. The practice of data governance is described, and then elements of COBIT 5: Enabling Information are explored. Specific examples are provided against each of the COBIT 5 enablers.
Data maintenance and management are becoming ever more complicated. Data environments (e.g., the cloud) change rapidly, and so do internal enterprise data requirements. COBIT 5 provides definitions, good practices and modeling to assist practitioners in dealing with the critical role of data within the enterprise. Strong management provides the underpinning of good data governance.
Corporate governance and IT governance are credited with putting frameworks and standards in place to assist enterprises in using their resources effectively and efficiently to create and deliver value to their stakeholders. Data governance uses the same concepts, but applies them more narrowly to the protection and use of data. Enterprises must still define their needs for data and what resources will be available to accomplish those goals.
Once the right resources are in place, performance measurement mechanisms need to be put in place to ensure that the newly created, or altered, processes are functioning as needed. Reporting on the performance of data governance processes completes the data governance cycle. The governing body can then issue additional, or new, directives to accomplish the enterprise’s data governance needs.
Category: COBIT-Governance of Enterprise IT Published: 11/3/2017 3:03 PM
Emerging technologies – such as machine learning, artificial intelligence (AI), blockchain, Internet of Things (IoT), augmented reality, and 3-D printing – are swiftly disrupting several industries. To paraphrase Klaus Schwab, co-founder of the World Economic Forum, these mind-boggling innovations are redefining humanity, pushing the thresholds of lifespan, health, cognition, and capabilities in ways previously considered to be preserves of science fiction.
The possibilities presented by digital transformation are indeed captivating. The uses are as varied as the organizations putting them to use. Sensors attached to jet engines are transmitting signals mid-flight, enabling airlines to promptly detect sub-optimal performance and conduct pre-emptive maintenance, boosting safety and minimizing downtime. Physicians are replicating flesh and bones using 3-D technology to simulate high-risk surgical operations, lifting patients’ confidence and shortening their anaesthesia durations. Meanwhile blockchain – an open source, distributed ledger of everything – is being used to develop self-executing contracts, eliminating record labels and enabling artists to interact directly with consumers, maximizing their ingenuity rewards.
The benefits of digital transformation are unquestionable, but enterprises must manage these programs carefully. Here are three key recommendations:
Drive cultural change
Digital transformation transcends IT – it’s an enterprise-wide matter that requires unwavering commitment from the C-suite to front-line staff. To succeed, enterprises must place cultural change, not technology, at the core of their strategies. This requires eliminating unnecessary barriers to innovation, agility and change that exist within organizations, including breaking down functional silos and revising bureaucratic governance structures. As Jeffrey R. Immelt, CEO of General Electric, said, “You can’t have a transformation without revamping the culture and the established ways of doing things.”
Leadership from the top is essential to establish vision, institute appropriate governance structures and drive cultural change during any major change, and digital transformation is no exception. Executive messages must be clear and consistent, persuading employees that creating a nimbler enterprise that can swiftly respond to market needs is an existential matter; status quo is untenable. This fosters an environment of trust and spurs employee engagement, prerequisites for success.
By contrast, inconsistent messages fuel doubts, driving employees to work in silos and resent change. This risk looms large when transformation is perceived as a threat to people’s jobs. Consistent with this view, the majority of respondents to ISACA’s Digital Transformation Barometer rated AI and public cloud as the top candidates to face organizational resistance. While initial reservations about public cloud are waning, migration efforts and radical process changes can still pose such organizational challenges.
Make security a priority
In the race to keep up with competitors, enterprises often place a disproportionate emphasis on the pace of transformation. Security and infrastructure considerations are often afterthoughts, and such missteps can have lasting business repercussions.
Emerging technologies are exerting enormous pressure on traditional security models. For instance, billions of IoT devices with glaring vulnerabilities are integrating with critical infrastructure, creating numerous backdoors for malefactors to exploit. Cloud is enabling employees to bypass IT governance processes and export volumes of sensitive data to unsanctioned environments, aggravating the enduring shadow IT problem. At the same time, location-based applications collect troves of personal data, raising safety and privacy concerns. Each emerging technology presents new security issues, many of which have not been sufficiently evaluated nor understood.
To thrive, businesses need to make security an inescapable facet of digital transformation programs, considering implications early during business case evaluations. Enterprises also must have a nuanced understanding of each technology, carefully balancing pace of adoption, security and convenience. Traditional one-size-fits-all models don’t cut it anymore. Securing an implanted cardiac pacemaker that can resuscitate a faltering heart, for example, requires more rigor when compared to securing a wearable device that tracks steps.
As this revolution unfolds, several jurisdictions are also tightening privacy laws. For instance, the EU’s General Data Protection Regulation (GDPR) will impose fines of up to €20 million or up to 4% of annual worldwide turnover, whichever is greater. Businesses must have a strong grasp of applicable privacy laws to ensure compliance and retain customers’ trust.
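The “whichever is greater” rule above can be surprisingly easy to misread, so a minimal sketch may help; the turnover figures used here are purely hypothetical, and the function name is our own:

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a top-tier GDPR fine: the greater of
    EUR 20 million or 4% of annual worldwide turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical firm with EUR 300M turnover: 4% is EUR 12M,
# below the floor, so the EUR 20M figure applies.
print(max_gdpr_fine(300_000_000))    # → 20000000.0
# Hypothetical firm with EUR 1B turnover: 4% = EUR 40M, above the floor.
print(max_gdpr_fine(1_000_000_000))  # → 40000000.0
```

The point of the sketch: for any business with annual turnover above €500 million, the percentage cap, not the flat €20 million figure, bounds the exposure.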
Consider the impact of legacy applications
As digitization gains pace, several enterprises are finding themselves saddled by jumbles of complex, aged and proprietary applications, referred to as “legacy spaghetti.” Several of these decades-old digital workhorses have developed a reputation for reliability and still underpin vital operations. But they can also be daunting obstacles to digital transformation. Specifically, they are not designed to handle the flexibility, speed and performance demanded by today’s digital enterprise. Furthermore, they often lack well-defined interfaces, sufficient documentation and available subject matter experts.
To manage this risk, business leaders should ask the following questions:
An effective digital transformation strategy, therefore, carefully balances the need to rejuvenate customer experiences with the steadiness of core processes. Neither can be dealt with in isolation.
This wave of digital transformation calls for enterprises to deeply rethink their strategies. Those that stick their heads in the sand may soon be irrelevant to their customers.
About the authors
Phil Zongo is head of cyber security for an Australian investment management firm. He is the 2016-17 winner of ISACA’s Michael Cangemi Best Book/Article Award, a global award that recognizes individuals for major contributions to publications in the field of IS audit, control and/or security. Phil has more than 13 years of technology risk consulting experience, advising executives on how to manage critical risk in complex technology transformation programs across multiple industries.
Natasha Barnes, CISA, is a manager with a global consulting firm, based in the Washington D.C. metro area. She has provided IT risk and compliance consulting services within both public and private sectors for more than seven years. Natasha helps her clients to optimize their control environments and address evolving cyber security challenges. Natasha is also a member of ISACA and a career coach with Careerly, where she mentors aspiring cyber security professionals by providing students with practical guidance to make informed career decisions.
Category: Risk Management Published: 11/14/2017 8:59 PM
The security risk of running an unsupported version of Windows File Servers is not at the top of the IT topic debate list. Most will concur that enterprises electing to use an unsupported version of Windows may expose themselves to security vulnerabilities.
These vulnerabilities arise because the patches and fixes that were formerly provided by Microsoft are no longer available. As a result, the enterprise may incur additional operational costs as it identifies (and sometimes purchases) its own solutions to vulnerabilities. Beyond that, there may also be compliance implications of running an unsupported version of Windows. For example, under section 6.2 of the Payment Card Industry Data Security Standard (PCI DSS), there is a requirement that organizations complying with PCI DSS must “ensure that all system components and software are protected from known vulnerabilities by installing applicable vendor-supplied security patches.” So, for the IT auditor, this means that assurance around Windows File Servers is broader than security: operational and compliance considerations should be on the radar as well.
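An auditor can operationalize the support-status concern above with a simple inventory check. The sketch below is illustrative only: the server names are hypothetical, and the end-of-support dates, while drawn from Microsoft’s published lifecycle schedule, should be verified against the vendor’s lifecycle documentation before being relied upon:

```python
from datetime import date

# Illustrative end-of-extended-support dates; confirm against
# Microsoft's product lifecycle documentation before relying on them.
END_OF_SUPPORT = {
    "Windows Server 2003": date(2015, 7, 14),
    "Windows Server 2008 R2": date(2020, 1, 14),
    "Windows Server 2012 R2": date(2023, 10, 10),
}

def flag_unsupported(inventory, as_of):
    """Return names of servers whose OS is past end of support on `as_of`.
    Versions missing from the table are treated as still supported."""
    return [
        name for name, os_version in inventory
        if END_OF_SUPPORT.get(os_version, date.max) <= as_of
    ]

# Hypothetical file server inventory for an audit dated 2017-11-01.
servers = [
    ("FS-ACCT-01", "Windows Server 2003"),
    ("FS-HR-02", "Windows Server 2012 R2"),
]
print(flag_unsupported(servers, date(2017, 11, 1)))  # → ['FS-ACCT-01']
```

A real engagement would pull the inventory from a CMDB or directory export rather than a hard-coded list, but the comparison logic stays the same.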
IT auditors should also remain cognizant of functionality changes in different versions of Windows. For those who have been in the Information Technology realm for some time, Windows File Server functionality changes appear to be steady and, at times, significant. These functionality changes present opportunities to audit familiar areas such as access management, authentication, patch management, incident response, and physical security. In looking at these areas, IT auditors may consider the following:
Beyond the audit, there are opportunities for IT auditors to collaborate with their organizations. For example, organizations do not automatically apply each patch that Microsoft provides. Given that, IT auditors may have an opportunity to partner with management to assess the risks and benefits of applying certain patches. Also, Windows File Server functionality changes can provide career development opportunities for IT auditors. Looking at remote server management may be a learning moment for IT auditors whose experience has been primarily in a physical server environment. Lastly, IT auditors can assist with identification of those compliance-related server issues – before they become issues.
Editor’s note: For more resources on this topic, download ISACA’s Windows File Server Audit/Assurance Program.
Editor’s note: The ISACA Now series titled “Faces of ISACA” highlights the contributions of ISACA members to our global professional community, as well as providing a sense of their lives outside of work. Today, we spotlight Paul Yoder, head of information systems security at El Camino College (Torrance, California, USA). Yoder recently was honored in the education category of the Center for Digital Government Cybersecurity Leadership & Innovation Awards, underwritten by McAfee. Yoder visited with ISACA Now to discuss the award and more; an edited transcript is below.
ISACA Now: You were recently honored for your innovation efforts at El Camino College – how were you able to gain the administration’s support for taking cyber security seriously?
It was a tenuous process since the college had never heavily invested in cyber security before. I’m sure that some people thought that hiring a dedicated security person was the first and last measure to be taken, and didn’t realize that specialized tools would have to be purchased as well to facilitate the hardening of digital assets.
First, I took a hands-on approach by joining the Technology Committee that would drive any future change in info security policy and spending. They had paid a consulting firm to write a five-year plan for upgrading the IT assets across the entire campus, and info security was one of those sections. After about 15 minutes of reading that section, I decided to throw it out and start from scratch! Since it was my first week on the job, I knew this would be a make-it or break-it kind of moment. I decided to craft a new five-year info security plan based on the SANS five-step Security Awareness Roadmap (I actually have a poster of it on my office wall). Fortunately, this didn't result in a pink slip and was actually embraced by all on the committee!
They also ranked all of the sections as to which were the most important to focus on, and info security rose to the top of the list. I reinforced this with some one-on-one “evangelism” with several key stakeholders, such as the president, VPs and deans. I met with everyone who would put me on their calendar. Let’s face it, it’s hard for us “computer geek” types to be social and outgoing sometimes, but this is a much more effective way to communicate your message than emails or phone calls.
ISACA Now: What attracted you to working in a higher education environment?
I think it was the opportunity to finally create my own info security program that led me to take the job. The money wasn’t spectacular, and the drive would turn out to be pretty horrendous, but how often do cyber security professionals get to put their own individual spin on things? I just couldn't pass up the opportunity!
ISACA Now: ISACA recently released research about how stronger board oversight of cyber security and risk management leads to improved business outcomes. What are some examples you have seen of that from your career?
I couldn't agree more with this concept. Ever heard of the two-story outhouse principle? It's not only true for nasty things flowing downhill, but also for good things. If you achieve buy-in at the top, then the warm bodies further down the food chain are more likely to follow in lockstep.
One important thing to remember though when dealing with C-level executives – they don't understand or speak security like you do. Keep it simple!
ISACA Now: How has your ISACA membership furthered your professional development?
First of all, being associated with one of the top security organizations provides credibility. The well-written articles provide deep insight into the threats we face every day. I find that they usually have more substance and meat to them than the typical security blogs, which are often filled with top-level or non-essential information. ISACA also provides monthly meet-ups, and that is something that I would like to be more involved with in the future. I proudly display the ISACA chapter logo on the front page of my resume.
ISACA Now: What are some of your major interests outside work?
I have been a professional musician for many years (started trumpet lessons when I was 4!). I currently have two CDs completed and hope to start on a third project soon. I also have been involved with Togakure Ryu Ninjutsu since I was a kid, and I hope to finish a book in 2018 that teaches ordinary people how to implement effective info security at home.
My name is Chris, and I am a CSX addict.
It wasn’t always that way. To be fair, my gateway drug was the workshops and lectures at my local ISACA chapter. Then, one thing led to another, and before I knew it, I was hooked. First, in Vegas. Then, earlier this month, in Washington, D.C., and next year, back to Vegas for CSX North America.
In my defense, the quality is irresistible, the “high” of discovery intense. Between the keynotes, the workshops, and the individual presentations, I mean … how can a professional resist? I tried others, but there was no “there there.” Useful vendor presentations, to be sure, but nowhere near the quality, diversity, or depth that I find at CSX conferences.
Where do I start my confessions to you? Do I tell you about the workshops, surrounded by accomplished peers, led by experts, and resulting in practical, actionable takeaways? Or, perhaps, I should emphasize the presentations, organized by track, that fill the days of the main conference.
Pick your flavor, and CSX has it! You prefer to focus on IDENTIFY? Got that. Perhaps PROTECT or DETECT floats your boat? CSX has that, too! For me, it was all about RESPOND, RECOVER, and DEFEND. Attending those presentations allowed me to learn more and hone my skills more than any reading I could have done.
And, that’s the thing with CSX. At the end of the day, it brings together the best professionals in the field, relaying, discussing, and sharing best practices with peers. This is not “a show.” It is a relationship among professionals, a true exchange of ideas, best practices, and lessons learned that I have not found anyplace else.
The good news is that I am not alone. I met dozens of fellow addicts at this year’s conference in DC. We compared notes, and we agreed: Unlike most other conferences, you could trust CSX to deliver objective, unbiased, actionable data. For practitioners in the field, there is nothing like it.
I will further confess that my addiction has shown no mercy. It’s not enough to be a regular delegate. I need more. And, for troubled people like me, ISACA has an answer: ELP. The Enhanced Learning Package. Cut straight to the front of the line, reserved seats at the presentations and keynotes, VIP treatment at the venue, help around every corner in getting me the information that I need when I need it, and so much more. This is not just “an incremental add-on.” For me, ELP has made the difference between a great conference, and an incredible-make-sure-you-do-not-miss-this-event experience.
I know what you’re thinking.
I should get help. Sadly, I am past that point. But, in the interest of serving the community, I needed to confess to you, my peers, and warn you. Stay away from CSX conferences, and stay clear of the ELP package in particular.
You’ll never be the same again.
Author’s note: Chris Moschovitis is the CEO of tmg-emedia, a 29-year-old independent consultancy in New York. He is the co-author of “History of the Internet: 1843 to the Present.” Chris’ latest book, “Cybersecurity Program Development for Business: The Essential Planning Guide,” is being published by Wiley later this year. He can be reached at Chris.Moschovitis@tmg-emedia.com.