Results


Deep Learning at the Edge on an Arm Cortex-Powered Camera Board

#artificialintelligence

It's no secret that I'm an advocate of edge-based computing, and after a number of years in which cloud computing has definitely been in the ascendancy, the swing back towards the edge is now well underway. It is driven not by the Internet of Things, as you might expect, but by the movement of machine learning out of the cloud. Until recently most of the examples we've seen, such as the Neural Compute Stick or Google's AIY Projects kits, were based around custom silicon like Intel's Movidius chip. However, Arm has now quietly released its CMSIS-NN library, a neural network library optimised for Cortex-M-based microcontrollers. Machine learning development is done in two stages: a model is first trained offline, then deployed to the device for inference.
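For a sense of what that second stage looks like in practice, here is a minimal C sketch of on-device inference using three kernels from the CMSIS-NN q7 API (arm_fully_connected_q7, arm_relu_q7 and arm_softmax_q7). The layer sizes, fixed-point shifts and zeroed weights are illustrative placeholders rather than values from the article; in a real project they would be exported from a model trained and quantised offline.

```c
/* Minimal sketch of on-device inference with CMSIS-NN q7 kernels.
 * Sizes, shifts and weights below are placeholders: in practice they
 * are produced by the offline training and quantisation step. */
#include "arm_nnfunctions.h"

#define IN_DIM  64   /* input feature vector length (placeholder) */
#define OUT_DIM 10   /* number of output classes (placeholder)    */

/* Quantised parameters exported from the offline training stage. */
static const q7_t weights[IN_DIM * OUT_DIM] = { 0 /* trained values */ };
static const q7_t bias[OUT_DIM]             = { 0 /* trained values */ };

void classify(const q7_t input[IN_DIM], q7_t probs[OUT_DIM])
{
    q7_t  logits[OUT_DIM];
    q15_t scratch[IN_DIM];   /* working buffer required by the kernel */

    /* One fully connected layer: logits = weights * input + bias,
     * with fixed-point shifts chosen during quantisation. */
    arm_fully_connected_q7(input, weights, IN_DIM, OUT_DIM,
                           0 /* bias_shift */, 7 /* out_shift */,
                           bias, logits, scratch);

    arm_relu_q7(logits, OUT_DIM);             /* activation          */
    arm_softmax_q7(logits, OUT_DIM, probs);   /* class probabilities */
}
```

The same pattern extends to convolutional layers through kernels such as arm_convolve_HWC_q7_basic, which is how the library supports image models on Cortex-M parts.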


Using AI Analytics to Avoid Supply Chain Disruption - SIPMM INSTITUTE

#artificialintelligence

Is there any way for manufacturers to predict machine failures before they happen, and so avoid supply chain disruptions, delays and customer dissatisfaction? An emerging solution is the use of Artificial Intelligence (AI) data analytics in the supply chain, which has considerable potential to minimize supply chain disruption as well as to dramatically reduce supply chain costs. The article illustrates a supply chain disruption in an automotive factory; according to Wollenhaupt (2016), downtime in auto manufacturing can cost $1.3 million per hour. It also includes a diagram depicting the prediction of a motor failure.
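The article does not spell out the analytics behind such a prediction, but the basic idea, flagging a motor whose sensor readings drift outside their normal band, can be sketched in a few lines of C. The smoothing factor, alarm threshold and sample values below are assumptions chosen for illustration, not figures from the article.

```c
/* Illustrative predictive-maintenance check: track an exponentially
 * weighted moving average (EWMA) of a motor's vibration readings and
 * flag readings that deviate sharply from the learned normal band. */
#include <math.h>
#include <stdbool.h>
#include <stdio.h>

#define ALPHA     0.1  /* EWMA smoothing factor (assumed)                  */
#define THRESHOLD 3.0  /* alarm when |reading - mean| > THRESHOLD * stddev */

static double ewma_mean = 0.0;
static double ewma_var  = 1.0;

/* Returns true when a reading looks anomalous, i.e. the motor may be
 * drifting toward failure and is worth inspecting before it stops a line. */
bool check_reading(double reading)
{
    double deviation = reading - ewma_mean;
    bool anomaly = fabs(deviation) > THRESHOLD * sqrt(ewma_var);

    /* Update the running estimates of normal behaviour. */
    ewma_mean += ALPHA * deviation;
    ewma_var   = (1.0 - ALPHA) * (ewma_var + ALPHA * deviation * deviation);

    return anomaly;
}

int main(void)
{
    const double samples[] = { 1.0, 1.1, 0.9, 1.0, 1.05, 4.8 };
    for (int i = 0; i < 6; i++)
        if (check_reading(samples[i]))
            printf("sample %d: anomaly, schedule maintenance\n", i);
    return 0;
}
```

In a production system this threshold test would be replaced by a trained model over many sensors, but the principle of learning a baseline and acting on deviations from it is the same.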


AI Predictions 2019

#artificialintelligence

Mark has over 10 years of experience in AI and analytics. Before joining the group, he led AI within Sogeti Netherlands, where he was responsible for developing the team and the business as well as the AI strategy. He has worked internationally with clients from multiple markets on technologies around AI, deep learning and machine learning. When a mortgage application is accepted or rejected, the decision is often based on a whole range of factors that an automated AI is configured to measure. Many organizations have adopted a "black box" neural network model, for example using IBM's Watson, that safeguards the data yet also prevents them from providing a reason for the decision that has been made - something legislation now requires.


How AI and Quantum Computing May Alter Humanity's Future

#artificialintelligence

Quantum computing is a relatively new technology that pioneering scientists, researchers and entrepreneurs worldwide are actively seeking to commercialize. For example, at CES (the Consumer Electronics Show) in January 2019, IBM debuted its "Q System One" as the first standalone quantum computer geared for scientific and commercial use. Making quantum computing accessible will help accelerate progress in artificial intelligence (AI): speed up computing, and you enhance the performance of deep learning. The parallel processing capabilities of GPUs (graphics processing units), for example, accelerated deep learning by providing far greater computational throughput than serial CPUs (central processing units) for the large volumes of data used in machine learning.


AI Chip Startup Aims to Take on Industry Giants

#artificialintelligence

In the middle of the historic city of Bristol in England, about 150 engineers are currently designing the most sophisticated AI chip in the world. The "Colossus" packs 1216 processors onto a chip the size of a postage stamp. Designed specifically for artificial intelligence (AI) applications, the chip draws its name from the computer used by cryptographers at Bletchley Park during World War II. "[Colossus] was all top-secret for decades after the war, so the Americans thought they invented everything first. Now it is clear to the world that they didn't," claimed Simon Knowles, the inventor of the new AI chip.


Google to open artificial intelligence lab in Princeton and collaborate with University researchers

#artificialintelligence

Two Princeton University computer science professors will lead a new Google AI lab opening in January in the town of Princeton. The lab is expected to expand New Jersey's burgeoning innovation ecosystem by building a collaborative effort to advance research in artificial intelligence. The lab, at 1 Palmer Square, will start with a small number of faculty members, graduate and undergraduate student researchers, recent graduates and software engineers. The lab builds on several years of close collaboration between Google and professors Elad Hazan and Yoram Singer, who will split their time working for Google and Princeton. The work in the lab will focus on a discipline within artificial intelligence known as machine learning, in which computers learn from existing information and develop the ability to draw conclusions and make decisions in new situations that were not in the original data.


Pakistan's place in artificial intelligence and computing

#artificialintelligence

In the world of science and technology, it is said that we are at the beginning of the Fourth Industrial Revolution. The first, which lasted from 1760 to 1840, ushered in the age of mechanised production. It was the result of new materials like iron and steel which, combined with the new energy sources of coal and steam, led to 'mass production' and a factory system with a division of labour. The second industrial revolution, from 1870 to the early part of the 20th century, was the result of electricity and the internal combustion engine. Both powered industrial machines and made transport possible.


AI Blockchain: A Peek into the Future

#artificialintelligence

Two new technologies are currently roaring through the tech world: Artificial Intelligence and Blockchain. Both have the potential to revolutionize the world, but the most discussed question so far is whether the two can really benefit each other. We already know that blockchain can offer a decentralized ledger system, and many are already adopting the tech. Artificial Intelligence, on the other hand, has started streamlining processes for our benefit. But can blockchain-based AI be the next technological milestone? Let's see whether blockchain is really capable of powering AI. Artificial Intelligence is the simulation of human-like intelligence by computer systems, which are usually programmed to mimic human-like actions. The process is, of course, extremely complex, as human activities are difficult to simulate. But the primary ability of artificial intelligence would be to rationalize like humans and take actions based on intelligent thinking.

