
Three opportunities of Digital Transformation: AI, IoT and Blockchain

#artificialintelligence

Koomey's law This law posits that the energy efficiency of computation doubles roughly every one-and-a-half years (see Figure 1–7). In other words, the energy necessary for the same amount of computation halves in that time span. To visualize the exponential impact this has, consider the fact that a fully charged MacBook Air, when operating at the computational energy efficiency of 1992, would completely drain its battery in a mere 1.5 seconds. According to Koomey's law, the energy requirements for computation in embedded devices are shrinking to the point that harvesting the required energy from ambient sources like solar power and thermal energy should suffice to power the computation needed in many applications. Metcalfe's law This law has nothing to do with chips, but everything to do with connectivity. Formulated by Robert Metcalfe as he invented Ethernet, the law essentially states that the value of a network grows in proportion to the square of the number of its nodes (see Figure 1–8).
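The MacBook Air claim can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes a roughly 12-hour battery life on 2014-era hardware; both figures are assumptions for illustration, not stated in the excerpt:

```python
# Rough check of the MacBook Air claim under Koomey's law.
# Assumptions (not from the excerpt): ~12 h battery life, 2014-era hardware.
DOUBLING_PERIOD_YEARS = 1.5

def efficiency_ratio(years_elapsed):
    """Factor by which computational energy efficiency improves over `years_elapsed`."""
    return 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

modern_battery_seconds = 12 * 3600           # assumed ~12 h runtime
ratio = efficiency_ratio(2014 - 1992)        # ~14.7 doublings since 1992
drain_at_1992_efficiency = modern_battery_seconds / ratio
print(f"{drain_at_1992_efficiency:.1f} s")   # on the order of the quoted 1.5 seconds
```

With these assumed inputs the result lands close to the quoted figure, which is the point of the exponential: a couple of decades of doublings compresses hours into seconds.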


M. Tech. in Artificial Intelligence

#artificialintelligence

The core courses give students sufficient expertise in the areas of Algorithm Analysis and Design, Modern Computer Architecture, Artificial Intelligence Foundations, Data Science and Machine Learning, Parallel and Distributed Data Management, etc. Elective courses cover various application domains of AI such as Robotics, Video/Image Analytics, Medical Signal Processing, Agent-Based Systems, Data Mining and Business Analytics, Natural Language Processing, Wireless Sensor Networks, Internet of Things, etc. Once they complete the course, students get opportunities for fully paid internships and placement offers at MNCs and IT/ITES companies like Intel, Cerner, Robert Bosch, Dell, etc. They can also publish quality research papers based on the case studies/dissertations done as part of their M. Tech. Along with the regular M. Tech., this program also offers a Dual Degree Program (M. Tech. from Amrita and an MS from international universities) and one-semester/one-year abroad programs offered by premier universities like KTH (Sweden), Politecnico di Milano (Italy), the University of New Mexico (USA) and RWTH Aachen University (Germany).


Wazuh and Its XDR Approach

#artificialintelligence

Milestones in the evolution of cyber security technology for effective detection and response include Endpoint Detection and Response (EDR), Managed Detection and Response (MDR), and Network Detection and Response (NDR). However, these solutions all run independently and lack correlated, high-level processed alerts. Extended Detection and Response (XDR) emerged to address this: rather than adding another tool, XDR aims to change the security landscape and enable a more compelling use of the security stack. What problem does XDR solve? Attackers often target endpoints, but they also target other layers of the IT domain in the corporate network, such as email servers and cloud systems, and they may bounce between layers or hide in the interfaces between them to evade detection. XDR addresses both problems at once.
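The cross-layer correlation idea can be sketched in a few lines. The toy alerts and field names below are illustrative assumptions, not Wazuh's actual schema or rules; the sketch escalates when two or more detection layers report on the same host within a time window:

```python
from collections import defaultdict

# Toy alerts from different detection layers; field names are assumptions,
# not Wazuh's real alert schema.
alerts = [
    {"source": "EDR",   "host": "ws-042", "time": 100, "detail": "suspicious process"},
    {"source": "email", "host": "ws-042", "time": 130, "detail": "phishing attachment"},
    {"source": "NDR",   "host": "db-001", "time": 200, "detail": "port scan"},
]

def correlate(alerts, window=120):
    """Escalate to an incident when >= 2 layers report on one host within `window`."""
    by_host = defaultdict(list)
    for alert in alerts:
        by_host[alert["host"]].append(alert)
    incidents = []
    for host, items in by_host.items():
        items.sort(key=lambda a: a["time"])
        layers = {a["source"] for a in items}
        if len(layers) >= 2 and items[-1]["time"] - items[0]["time"] <= window:
            incidents.append({"host": host, "sources": sorted(layers)})
    return incidents

print(correlate(alerts))  # ws-042 escalates; db-001 (single layer) does not
```

A single port scan stays a low-level alert, while endpoint plus email activity on the same host becomes one correlated incident, which is the gap the excerpt says standalone EDR/MDR/NDR tools leave open.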


Delhi traffic police to use AI-based systems to manage traffic

#artificialintelligence

The Delhi Traffic Police signed a Memorandum of Understanding for an integrated traffic management system with the Centre for Development of Advanced Computing (C-DAC), which has further assigned the task to a consultant firm. The new system, which works on machine learning and artificial intelligence, will also play a vital role in facilitating quick passage for emergency vehicles such as ambulances and fire tenders.


The Evolution And Expansion Of AIOps In Network Management – Forbes

#artificialintelligence

Instead, by leveraging big data, machine learning, artificial intelligence (AI) and other advanced technologies, companies can bring greater …


Trends In Artificial Intelligence

#artificialintelligence

Artificial intelligence (AI) is a cutting-edge technology that is being adopted by forward-thinking businesses. The concept of artificial intelligence, however, has been around for decades. In 1955, "A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence" was published, which coined the term "artificial intelligence." Dartmouth College hosted the first AI research project in 1956, which is widely regarded as the start of artificial intelligence. So, why is AI gaining popularity now, more than sixty years later?


What's in the TensorFlow Federated (TFF) box?

#artificialintelligence

Krzysztof Ostrowski is a Research Scientist at Google, where he heads the TensorFlow Federated development team. This blog post is inspired by his talk at the OpenMined Privacy Conference. TensorFlow Federated (TFF) is a development framework for federated computations, which typically involve computations on data that is born decentralized and stays decentralized. TFF provides a common framework for federated computations in both research and production and is an open-source project within the TensorFlow ecosystem. The TFF library has been designed to facilitate an easy path from research to production.
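The core idea behind a federated computation is that raw data stays on client devices and only aggregates reach the server. A minimal plain-Python sketch of federated (weighted) averaging illustrates the concept only; it deliberately does not use the TFF API, and the temperature data is made up:

```python
def federated_mean(client_values, client_weights):
    """Server-side aggregate: weighted mean of per-client contributions.
    In a real federated system, raw client data never leaves the device;
    only these (value, weight) pairs are communicated to the server."""
    total_weight = sum(client_weights)
    return sum(v * w for v, w in zip(client_values, client_weights)) / total_weight

# e.g. each client reports a locally computed average temperature
# plus the number of readings behind it (illustrative data).
values = [68.0, 70.0, 71.0]
weights = [10, 20, 10]
print(federated_mean(values, weights))  # prints 69.75
```

In TFF the same pattern is expressed declaratively as a federated computation over client-placed values, so the framework can compile it to run across real or simulated devices.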


Baidu Research: 10 Technology Trends in 2021 - KDnuggets

#artificialintelligence

While global economic and social uncertainties in 2020 caused significant stress, progress in intelligent technologies continued. The digital and intelligent transformation of all industries significantly accelerated, with AI technologies showing great potential in combatting COVID-19 and helping people resume work. Understanding future technology trends may never have been as important as it is today. Baidu Research is releasing our prediction of the 10 technology trends in 2021, hoping that these clear technology signposts will guide us to embrace the new opportunities and embark on new journeys in the age of intelligence. In 2020, COVID-19 drove the integration of AI and emerging technologies like 5G, big data, and IoT.


AI Hardware Technology Imitates Changes in Neural Network Topology

#artificialintelligence

A group of researchers at the Korea Advanced Institute of Science and Technology (KAIST) has proposed a new system inspired by the neuromodulation of the brain, called a "stashing system." This newly proposed system consumes less energy.


The role of streaming machine learning in encrypted traffic analysis - Help Net Security

#artificialintelligence

Organizations now create and move more data than ever before. Network traffic continues to increase: global internet bandwidth grew by 29% in 2021, reaching 786 Tbps. In addition to record traffic volumes, 95% of traffic is now encrypted, according to Google. As threat actors continue to evolve their tactics and techniques (for example, hiding attacks in encrypted traffic), securing organizations is becoming more challenging. To help address these problems, many network security and operations teams are relying more heavily on machine learning (ML) technologies to identify faults, anomalies, and threats in network traffic.
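Because encrypted payloads are opaque, streaming models typically operate on per-flow metadata such as sizes and timings. The sketch below shows one such technique, a running z-score over flow byte counts using Welford's online algorithm; it is an illustrative choice, not any particular vendor's method, and the flow sizes are synthetic:

```python
class StreamingAnomalyDetector:
    """Flags flows whose byte count deviates strongly from the running mean.
    Welford's online algorithm keeps constant memory, so no traffic history
    is stored -- a key property for streaming (as opposed to batch) ML."""

    def __init__(self, z_threshold=3.0, warmup=30):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.z_threshold = z_threshold
        self.warmup = warmup

    def observe(self, flow_bytes):
        """Update running stats with one flow; return True if it looks anomalous."""
        self.n += 1
        delta = flow_bytes - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (flow_bytes - self.mean)
        if self.n < self.warmup:  # not enough data to judge yet
            return False
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(flow_bytes - self.mean) / std > self.z_threshold

detector = StreamingAnomalyDetector()
flows = [1500] * 50 + [5_000_000]   # steady small flows, then one outsized flow
flags = [detector.observe(b) for b in flows]
print(flags[-1])                     # the outsized flow is flagged: True
```

Real encrypted-traffic analysis combines many such metadata features (packet timing, TLS handshake fields, flow direction), but the constant-memory, update-per-event structure is the same.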