15 PhD positions in physics, materials science, chemistry, computer science, mathematics, artificial intelligence and/or electrical engineering

#artificialintelligence

Apply for a position in our exciting research on "Materials for Neuromorphic Circuits" (MANIC), and become part of the next generation of neuromorphic experts! Funded by the European Commission through the Horizon 2020 Marie Sklodowska-Curie ITN Programme, the MANIC network offers 15 high-level fellowships for joint research on new materials for cognitive applications. The most talented and motivated students will be selected for advanced multidisciplinary research training, preferably starting July 2020. The scientific aim of MANIC is to synthesize materials that can function as networks of neurons and synapses by integrating conductivity, plasticity and self-organization. The successes of deep learning show that the neuromorphic computing paradigm is very attractive.


The state of digital transformation in 2020

#artificialintelligence

The past year has seen many businesses question exactly how transformational digital transformation really is. The answer, as with all IT initiatives, depends on the scope of the ambition, the skill of the leadership, and the ultimate degree of business impact. Yet we've seen a pattern emerge: Those with transformational aspirations discover that boil-the-ocean schemes seldom meet their objectives, while carefully planned and targeted initiatives often have broader benefit than even the original instigators imagined. The latter is particularly true of initiatives that reform fundamental processes. Transformation usually implies moving from one fixed state to another, yet digital transformation at its best involves a journey from inflexibility to a "permanently agile" condition.


Deep learning enables real-time imaging around corners: Detailed, fast imaging of hidden objects could help self-driving cars detect hazards

#artificialintelligence

"Compared to other approaches, our non-line-of-sight imaging system provides uniquely high resolutions and imaging speeds," said research team leader Christopher A. Metzler from Stanford University and Rice University. "These attributes enable applications that wouldn't otherwise be possible, such as reading the license plate of a hidden car as it is driving or reading a badge worn by someone walking on the other side of a corner." In Optica, The Optical Society's journal for high-impact research, Metzler and colleagues from Princeton University, Southern Methodist University, and Rice University report that the new system can distinguish submillimeter details of a hidden object from 1 meter away. The system is designed to image small objects at very high resolutions but can be combined with other imaging systems that produce low-resolution room-sized reconstructions. "Non-line-of-sight imaging has important applications in medical imaging, navigation, robotics and defense," said co-author Felix Heide from Princeton University.


Operationalizing AI

#artificialintelligence

When AI practitioners talk about taking their machine learning models and deploying them into real-world environments, they don't call it deployment. Instead, the term used is "operationalizing." This can be confusing for traditional IT operations managers and application developers. Why don't we simply deploy AI models or put them into production? What does AI operationalization mean, and how is it different from typical application development and IT systems deployment?
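
As a concrete illustration of one small slice of what operationalizing can involve, here is a minimal, hypothetical sketch (not from the article) of wrapping an already-trained scikit-learn model in a small FastAPI service so other systems can call it. The model path, payload schema, and endpoint name are all assumptions made for illustration.

```python
# Minimal sketch: exposing a previously trained model as a prediction service.
# The file "model.joblib", the feature schema, and the endpoint are illustrative only.
from typing import List

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # load the serialized model once at startup


class Features(BaseModel):
    values: List[float]  # one flat feature vector per request


@app.post("/predict")
def predict(features: Features):
    x = np.asarray(features.values, dtype=float).reshape(1, -1)
    return {"prediction": model.predict(x).tolist()}

# Run with, e.g.: uvicorn service:app --port 8000  (module name "service" is hypothetical)
```

Operationalization covers far more than a serving endpoint (monitoring, retraining, and data pipelines, among other things), but a callable service like this is usually the first visible step of moving a model into a real-world environment.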


Angular Image Classification App Made Simple With Google Teachable Machine

#artificialintelligence

AI is a general field that encompasses machine learning and deep learning. The history of artificial intelligence in its modern sense begins in the 1950s, with the work of Alan Turing and the Dartmouth workshop, which brought together the first enthusiasts of the field and at which the basic principles of the science of AI were formulated. Since then, the field has gone through several cycles of surging interest followed by downturns (the so-called "AI winters") before becoming one of the key areas of science today. However, even though there are many examples and applications of artificial intelligence in use today, a large community of developers is still wondering how, or where, to start developing AI-driven applications. This article may be a kick start for those who are eager to start building AI- or ML-driven applications.


How the Pentagon's JAIC Picks Its Artificial Intelligence-Driven Projects

#artificialintelligence

The Pentagon launched its Joint Artificial Intelligence Center in 2018 to strategically unify and accelerate AI applications across the nation's defense and military enterprise. Insiders at the center have now spent about nine months executing that defense-driven AI support. At an ACT-IAC forum in Washington Wednesday, Rachael Martin, the JAIC's mission chief of Intelligent Business Automation Augmentation and Analytics, highlighted insiders' early approach to automation and innovation. "Our mission is to transform the [Defense] business process through AI technologies, to improve efficiency and accuracy--but really to do all those things so that we can improve our overall warfighter support," Martin said. Within her specific mission area, Martin and her team explore and develop automated applications that support a range of efforts across the Pentagon, such as business administration, human capital management, acquisitions, finance and budget training, and beyond.


Liquid Cooling Trends in HPC - insideHPC

#artificialintelligence

In this special guest feature, Bob Fletcher from Verne Global reflects on how the liquid cooling technologies on display at SC19 represent more than just a wave. Bob Fletcher is VP of Artificial Intelligence at Verne Global. Perhaps it is because I returned from my last business trip of 2019 to a flooded house, but more likely it's all the wicked cool water-cooled equipment I encountered at SC19 that has put me in a watery mood! Many of the hardware vendors at SC19 were pushing their exascale-ready devices, and about 15% of the devices at a typical computer manufacturer's booth were water-cooled. Adding rack-level water cooling is theoretically straightforward, so I spent a few minutes checking out the various options.


C3.ai: accelerating digital transformation

#artificialintelligence

As one of the leading enterprise AI software providers, C3.ai is renowned for building enterprise-scale AI applications and accelerating digital transformation. The C3 AI Suite is software that uses a model-driven architecture to speed up delivery and reduce the complexity of developing enterprise-scale AI applications. Supply Chain Digital takes a closer look at the AI firm. The Suite enables organisations to deliver AI-enabled applications faster than alternative methods while reducing the technical debt of maintaining and upgrading those applications. Its solutions cater to a range of industries such as manufacturing, oil and gas, utilities, banking, aerospace and defence, healthcare, retail, telecoms, smart cities and transportation.


MIT's new tool predicts how fast a chip can run your code

#artificialintelligence

Folks from the Massachusetts Institute of Technology (MIT) have developed a new machine learning-based tool that will tell you how fast code can run on various chips. This will help developers tune their applications for specific processor architectures. Traditionally, developers have used compilers' performance models in simulation to run basic blocks -- fundamental sequences of instructions at the machine level -- in order to gauge the performance of a chip. However, these performance models are rarely validated against real-life processor performance. The MIT researchers developed an AI model called Ithemal by training it to predict how fast a chip can run unknown basic blocks.
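
The core idea described above is to learn a mapping from a basic block's instruction sequence to its throughput rather than relying on hand-built performance models. As a rough, hypothetical sketch of that idea (far simpler than the researchers' actual model), a tiny PyTorch regressor over tokenized instructions might look like this; the vocabulary size, tokenization, and layer dimensions are all placeholder assumptions.

```python
# Illustrative sketch only (not the MIT tool): predict a basic block's
# throughput from a tokenized instruction sequence with a small LSTM.
import torch
import torch.nn as nn


class ThroughputPredictor(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # instruction-token embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)                # regress a single throughput value

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:   # tokens: (batch, seq_len) int64
        x = self.embed(tokens)
        _, (h, _) = self.lstm(x)            # final hidden state summarizes the block
        return self.head(h[-1]).squeeze(-1)  # (batch,) predicted cycles per block


# Toy usage: 3 basic blocks, each represented by 10 placeholder token IDs.
model = ThroughputPredictor(vocab_size=500)
blocks = torch.randint(0, 500, (3, 10))
print(model(blocks))  # untrained, so outputs are meaningless until fit to measured timings
```

Training such a model requires basic blocks paired with timings measured on the target chip, which is exactly the kind of real-life ground truth the excerpt notes is missing from traditional compiler performance models.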