mimecast
What AI can (and can't) do for organisations' cyber resilience
Technologies such as artificial intelligence (AI), machine learning, the internet of things and quantum computing are expected to unlock unprecedented levels of computing power. These so-called fourth industrial revolution (4IR) technologies will power the future economy and bring new levels of efficiency and automation to businesses and consumers. AI in particular holds enormous promise for organisations battling a scourge of cyber attacks. Over the past few years, cyber attacks have been growing in volume and sophistication. The latest data from Mimecast's State of Email Security 2022 report found that 94% of South African organisations were targeted by e-mail-borne phishing attacks in the past year, and six out of every 10 fell victim to a ransomware attack.
Artificial intelligence is the future of cybersecurity
Cybercriminals are using artificial intelligence (AI) to rapidly increase the sophistication of their attacks. In response, a growing number of organisations are adopting the technology as part of their cybersecurity strategies. According to research conducted for Mimecast's State of Email Security Report 2021, 39 per cent of organisations are utilising AI to bolster their email defences. Although these technologies and their application to cybersecurity are still in their early phases, this is a rising trend. Businesses that use advanced technologies such as AI and layered email defences, while also regularly training their employees in attack-resistant behaviours, will be in the best possible position to sidestep future attacks and recover quickly.
Artificial Intelligence in Cybersecurity: Where are We on the Technology Adoption/Hype Cycle?
You have likely noticed how prevalent artificial intelligence (AI) and related terms such as machine learning, neural networks, and big data analytics have become over the last several years in the world of cybersecurity. Doesn't it make sense for the security industry to be searching for the next big thing, given the distressing rate of incidents and breaches the world is currently experiencing? Maybe you, like me, have gone to big security events like the RSA Conference or Black Hat and come away confused as to how these analytic concepts relate to the everyday job of keeping an organization safe. What is the proper role of AI in cybersecurity, and when will it assume that role in force? One way to answer that question is to figure out where AI in cybersecurity sits in the technology adoption lifecycle.
Why artificial intelligence is key to improving phishing defenses
As attackers constantly evolve their tactics to side-step more traditional defenses, artificial intelligence and machine learning technologies are stepping in to help organizations improve their defenses. Technologies like MessageControl offer a critical extra layer of protection, especially when fully integrated into a multi-tenant platform to help inform cross-product detection. A Capgemini Research Institute study found that 69% of senior executive respondents said they would be unable to respond to a cyberattack without artificial intelligence. The same study found that two-thirds of organizations planned to employ artificial intelligence by 2020, demonstrating the mandate security leaders face to implement this technology in a focused and valuable way: at their email perimeters and inside their organizations. By constantly 'learning' an organization's environment and user behaviors, these systems get smarter over time and build a baseline of normal activity; deviations from that baseline highlight potential threats.
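The baselining idea described above can be sketched in a few lines. This is a deliberately minimal illustration, not Mimecast's actual algorithm: it assumes per-user daily email counts as the learned signal and uses a simple z-score threshold to flag deviations from the learned "normal".

```python
# Minimal sketch of baseline-and-deviation anomaly detection:
# learn a user's normal behavior from history, then flag outliers.
# The data and threshold here are illustrative assumptions.
from statistics import mean, stdev

def build_baseline(history):
    """Learn a baseline (mean, std dev) from past daily email counts."""
    return mean(history), stdev(history)

def is_anomalous(baseline, observed, z_threshold=3.0):
    """Flag an observation whose z-score exceeds the threshold."""
    mu, sigma = baseline
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > z_threshold

# A user who normally sends ~20 emails a day suddenly sends 200 --
# the kind of deviation that could indicate a compromised account.
history = [18, 22, 19, 21, 20, 23, 17]
baseline = build_baseline(history)
print(is_anomalous(baseline, 21))   # typical day: False
print(is_anomalous(baseline, 200))  # large deviation: True
```

In practice such systems track many signals at once (login locations, recipients, sending times) and update the baseline continuously, which is what "getting smarter over time" amounts to; the single-feature z-score is just the simplest version of that idea.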