If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Cybercrime is on the rise, and organizations across a wide variety of industries -- from financial institutions to insurance, health care providers, and large e-retailers -- are rightfully worried. In the first half of 2017 alone, over 2 billion records were compromised. With PII (personally identifiable information) stolen in these breaches, fraudsters can gain access to customer accounts, create synthetic identities, and even craft phony business profiles to commit various forms of fraud. Naturally, companies are frantically looking to beef up their security teams. But a large skills gap is making hiring difficult across the cybersecurity industry, so much so that the Information Systems Audit and Control Association found that fewer than one in four candidates who apply for cybersecurity jobs are qualified.
The best sources of information are organized according to what you want to know and how you wish to know it. I recommend the Hillis Plot, a circular map of evolutionary relationships between thousands of animal, botanical, and microbial species. If it's straight news you crave (RIP, Google Reader), I use Feedly. Wikipedia is the place for overviews and preliminary research. Transformation Maps, a new platform developed by the World Economic Forum, is a completely different sort of visualization, one I've never seen before.
History will look back on this time as the beginning of the artificial intelligence revolution. Artificial intelligence is beating us at Go, inventing its own languages, writing for us, and composing music. Machine learning is tackling complex, concrete challenges such as image compression, and solving classification problems such as speech recognition and image classification. Even the security and IT industries are benefiting from AI. The problem with any complex new technology -- apart from actually inventing and building it -- is figuring out how to explain it to the public.
We believe it needs to be simpler for enterprises to use AI effectively. Our products strategically combine human expertise with machine learning to deliver results that are better than what either an expert or a machine could achieve alone. Our first proof case is in cybersecurity: enabling enterprises to surface high-risk threats that are currently invisible to other approaches. Our approach generates meaningful results, with low false positives, by combining automated modeling of each customer environment with a focus on fundamental adversary behaviors. Versive is recognized on CB Insights' prestigious AI 100 list of the most promising privately held artificial intelligence companies, as well as on the SINET 16 list of the most innovative cybersecurity companies.
Earlier this week, a story surfaced about Google's artificial intelligence (AI) being duped. Researchers showed that, by understanding the patterns an AI system uses to classify images, they could 3D-print a turtle that Google's systems identified as a rifle from every angle. It's a funny story (and I haven't even mentioned the baseball classed as an espresso or the cat categorised as guacamole), but one that serves to prove a wider point regarding the fragility of AI systems and the extent to which they can actually be deemed 'intelligent'. Why isn't artificial intelligence more… intelligent? I've had back-and-forth arguments about the definition of AI with more than one person, and quite often it comes down to which definition you are actually using.
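The turtle-as-rifle result rests on adversarial perturbations: tiny, deliberate changes to an input that flip a classifier's output. The mechanics can be illustrated with a toy linear model (this is a minimal sketch with made-up weights and inputs, not Google's actual system; real attacks like FGSM apply the same sign-of-the-gradient idea to deep networks):

```python
import math

# Toy linear classifier: score = w.x + b, "rifle" if score > 0, else "turtle".
# Weights and inputs are invented purely for illustration.
w = [0.9, -0.4, 0.7]
b = -0.5

def predict(x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "rifle" if score > 0 else "turtle"

# A benign input the model classifies correctly.
x = [0.2, 0.8, 0.1]
print(predict(x))  # turtle

# Fast-gradient-sign-style attack: nudge every feature a small step in the
# direction that raises the "rifle" score. For a linear model, the gradient
# of the score with respect to x is simply w, so we step along sign(w).
eps = 0.5
x_adv = [xi + eps * math.copysign(1.0, wi) for xi, wi in zip(x, w)]
print(predict(x_adv))  # rifle
```

The perturbation is bounded by `eps` per feature, yet it is enough to cross the decision boundary, which is exactly why physically printed objects can be tuned to fool a vision model from many angles.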
Dr. Sumeet Dua is currently the Upchurch Endowed Associate Professor and the Coordinator of IT Research at Louisiana Tech University, Ruston, USA. He received his PhD in computer science from Louisiana State University, Baton Rouge, Louisiana. His areas of expertise include data mining, image processing and computational decision support, pattern recognition, data warehousing, biomedical informatics, and heterogeneous distributed data integration. The National Science Foundation (NSF), the National Institutes of Health (NIH), the Air Force Research Laboratory (AFRL), the Air Force Office of Scientific Research (AFOSR), the National Aeronautics and Space Administration (NASA), and the Louisiana Board of Regents (LA-BoR) have funded his research with over $2.8 million. He frequently serves as a study section member (expert panelist) for the NIH and as a panelist for the NSF/CISE Directorate.
If you haven't heard, universities around the world are offering their courses online for free (or at least partially free). These courses are collectively called MOOCs, or Massive Open Online Courses. In the past six years or so, close to 800 universities have created more than 8,000 of these MOOCs. And I've been keeping track of them the entire time over at Class Central, ever since they rose to prominence. In the past three months alone, over 200 universities have announced 600 such free online courses.
We receive consistent feedback from our clients that it's challenging for them to keep up with the expanding volume of noisy, false-positive findings that arise from their Static Application Security Testing (SAST) activities. With that in mind, we now offer a cognitive learning capability in our IBM Application Security on Cloud and IBM Security AppScan Source solutions called Intelligent Finding Analytics (IFA).
DUBAI – Blockchain, together with artificial intelligence, machine learning, robotics, and virtual and augmented reality, has the potential to deliver disruptive outcomes and reshape digital business in 2018. Companies that have not started the digital investment cycle are at high risk of being disrupted. This is according to the list of top IT predictions for 2018 published Saturday by Dimension Data. The top trend for the coming year is the adoption of blockchain, the technology behind Bitcoin, with its immense potential to disrupt and transform the world of money, business, and society through a variety of applications. Ettienne Reinecke, Dimension Data's Group Chief Technology Officer, said blockchain has gone from strength to strength.
Since the 2013 Target breach, it's been clear that companies need to respond better to security alerts even as volumes have gone up. With this year's fast-spreading ransomware attacks and ever-tightening compliance requirements, response must be much faster. Adding staff is tough amid the cybersecurity hiring crunch, so companies are turning to machine learning and artificial intelligence (AI) to automate tasks and better detect bad behavior. In a cybersecurity context, AI is software that perceives its environment well enough to identify events and take action in pursuit of a predefined purpose. AI is particularly good at recognizing patterns and the anomalies within them, which makes it an excellent tool for detecting threats.
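The pattern-and-anomaly idea can be made concrete with a minimal statistical sketch (this is an illustration of the general technique, not any particular vendor's product; the event counts and threshold are invented): learn what "normal" looks like from historical data, then flag observations that fall far outside it.

```python
import statistics

# Hypothetical hourly failed-login counts for one account. The final hour
# shows a burst that a simple baseline model should flag as suspicious.
counts = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4, 48]

# Build a baseline from all but the most recent observation.
baseline = counts[:-1]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(value, threshold=3.0):
    """Flag values more than `threshold` standard deviations above the baseline mean."""
    return (value - mean) / stdev > threshold

print(is_anomalous(counts[-1]))  # True: 48 failures is far outside the norm
print(is_anomalous(5))           # False: within the normal range
```

Production systems replace this single z-score with learned models over many signals at once, but the underlying loop is the same: model normal behavior per environment, then surface the deviations for analysts instead of raw alert volume.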