NVIDIA and the battle for the future of AI chips

#artificialintelligence

There's an apocryphal story about how NVIDIA pivoted from games and graphics hardware to dominate AI chips – and it involves cats. Back in 2010, Bill Dally, now chief scientist at NVIDIA, was having breakfast with a former colleague from Stanford University, the computer scientist Andrew Ng, who was working on a project with Google. "He was trying to find cats on the internet – he didn't put it that way, but that's what he was doing," Dally says. Ng was working at the Google X lab on a project to build a neural network that could learn on its own. The neural network was shown ten million YouTube videos and learned how to pick out human faces, bodies and cats – but to do so accurately, the system required thousands of CPUs (central processing units), the workhorse processors that power computers. "I said, 'I bet we could do it with just a few GPUs,'" Dally says. GPUs (graphics processing units) are built for massively parallel workloads such as 3D rendering – and that same parallelism makes them far better suited than CPUs to the matrix arithmetic at the heart of neural networks. Dally turned to Bryan Catanzaro, who now leads deep learning research at NVIDIA, to make it happen.
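The gap Dally was betting on is easy to demonstrate. Below is a minimal micro-benchmark – my own illustration, not anything from the article – that times the same large matrix multiplication, the core operation of a neural network, on a CPU and (when available) a GPU using PyTorch:

```python
# Illustrative micro-benchmark: the same matrix multiplication on CPU vs. GPU.
import time
import torch

def time_matmul(device: str, n: int = 4096) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # finish allocation before timing
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the asynchronous GPU kernel
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")  # typically orders of magnitude faster
```

On typical hardware the GPU run finishes many times faster, which is the whole point of Dally's "just a few GPUs" remark.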


Towards Broad Artificial Intelligence (AI) & The Edge in 2021

#artificialintelligence

Artificial intelligence (AI) has quickened its progress in 2021. A new administration is in place in the US, and the talk is of a major push for green technology and the need to stimulate next-generation infrastructure, including AI and 5G, to generate economic recovery, with David Knight forecasting that 5G has the potential to drive GDP growth of 40% or more by 2030. The Biden administration has stated that it will boost spending on emerging technologies, including AI and 5G, to $300Bn over a four-year period. On the other side of the Atlantic, the EU has announced a Green Deal and also needs to consider European AI policy to develop the next generation of companies that will drive economic growth and employment. It may well be that the EU and US (alongside Canada and other allies) will seek ways to work together on issues such as 5G policy and infrastructure development. The UK will be hosting COP 26 and has also signalled ambitions for AI and 5G development.


Cybersecurity challenges in the AI age

#artificialintelligence

Cybersecurity failure could be among the greatest challenges confronting the world in the next decade, according to the World Economic Forum's Global Risks Report 2021. As artificial intelligence (AI) becomes increasingly embedded worldwide, fresh questions arise about how to safeguard countries and systems against attacks. To deal with the vulnerabilities of AI, engineers and developers need to evaluate existing security methods, develop new tools and strategies, and formulate technical guidelines and standards, said Arndt Von Twickel, Technical Officer at Germany's Federal Office for Information Security (BSI), at a recent AI for Good webinar. So-called "connectionist AI" systems support safety-critical applications like autonomous driving, which is set to be allowed on United Kingdom roads this year. Despite reaching "superhuman" performance levels in complex tasks like manoeuvring a vehicle, AI systems can still make critical mistakes based on misunderstood inputs.
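One concrete class of "misunderstood inputs" is the adversarial example: an input nudged just enough to flip a model's prediction while looking unchanged to a human. Here is a minimal sketch of the classic Fast Gradient Sign Method (FGSM); the trained classifier `model` and the input tensor are hypothetical stand-ins, not anything from the webinar:

```python
# Illustrative sketch of an adversarial perturbation (FGSM).
# `model` is assumed to be a trained PyTorch image classifier (hypothetical).
import torch
import torch.nn.functional as F

def fgsm(model, x, label, epsilon=0.03):
    """Nudge each input value in the direction that most increases the loss."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), label)
    loss.backward()
    # A small step along the sign of the gradient, often imperceptible to people
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0, 1).detach()
```

Run against a well-trained image classifier, the perturbed input frequently changes the predicted label even though it looks identical to the original – exactly the class of failure safety-critical systems must be hardened against.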


AI Weekly: AI helps companies design physical products

#artificialintelligence

This week, in a paper published in the journal Nature, researchers at Google detailed how they used AI to design the next generation of tensor processing units (TPUs), the company's application-specific integrated circuits optimized for AI workloads. While the work wasn't novel -- Google has been refining the technique for years -- it gave the clearest illustration yet of AI's potential in hardware design, and the Nature paper suggests AI can at the very least augment human designers to accelerate the brainstorming process. Beyond chips, companies like U.S.- and Belgium-based Oqton are applying AI to design domains including additive manufacturing. Oqton's platform automates CNC, metal, and polymer 3D printing, as well as hybrid additive and subtractive workflows, like creating castable jewelry wax.


American Express has revolutionized its credit checks with machine learning

#artificialintelligence

American Express (Amex) is a globally integrated payments company, providing customers with access to products, insights and experiences that enrich lives and build business success. Inside the company, the Amex Credit Fraud Risk business unit's mission is to minimise credit and fraud losses while promoting business growth and delivering superior customer service. Nothing about this will surprise you so far, we're presuming. What may: while the financial services industry uses digital for just about every process imaginable, there is one surprising remaining exception – the commercial card underwriting process, which to you and me is 'Are you going to lend my small business any money?' In much of Europe, this process is still completely manual and takes an underwriter a good chunk of time to complete.
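For a sense of what automating that judgment can look like, here is a deliberately simplified sketch – the features, labels and data are synthetic inventions of mine, not Amex's actual system – of the kind of supervised risk model an automated underwriting pipeline might train:

```python
# Hypothetical underwriting model on synthetic applicant data (not Amex's system).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Invented applicant features: annual revenue, years in business, past delinquencies
X = np.column_stack([
    rng.lognormal(10, 1, n),
    rng.integers(0, 30, n),
    rng.poisson(0.3, n),
])
# Synthetic label: risk falls with revenue and tenure, rises with delinquencies
risk = -0.3 * np.log(X[:, 0]) - 0.05 * X[:, 1] + 0.8 * X[:, 2]
y = (risk + rng.normal(0, 0.5, n) > risk.mean()).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"Held-out accuracy: {model.score(X_te, y_te):.2f}")
```

A production system would of course add far richer features, fairness and explainability checks, and a human-in-the-loop for borderline cases; the point is only that the yes/no decision can be scored in milliseconds rather than hours.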


What role does AI play in cybersecurity?

#artificialintelligence

Many believe that cybersecurity is an exciting field to work in, and indeed it is. Yet being responsible for an organization's IT security is no easy feat. Attackers always seem to be a few steps ahead of defenders. It often feels like a game of one against many – from petty criminals to nation-states. It would be highly advantageous if our cybersecurity tools could automatically adapt to these threats.
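One common building block for such adaptive tooling is anomaly detection: fit a model to recent "normal" activity and flag departures from it, refitting as the baseline drifts. A minimal sketch – my illustration with invented traffic features, not a tool named in the article:

```python
# Illustrative anomaly detector over hypothetical per-connection features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Invented features: bytes sent, duration (s), destination-port entropy
normal_traffic = rng.normal(loc=[500, 2.0, 0.5],
                            scale=[100, 0.5, 0.1],
                            size=(1000, 3))
# Refit periodically so "normal" tracks the current environment
detector = IsolationForest(contamination=0.01, random_state=1).fit(normal_traffic)

suspicious = np.array([[50_000, 0.1, 0.9]])  # bulk-exfiltration-like pattern
print(detector.predict(suspicious))          # -1 flags an anomaly
```

Because the detector learns from observed traffic rather than hand-written signatures, it can flag behaviour no rule anticipated – which is exactly the adaptivity the defenders are after.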


Machine learning reduces microscope data processing time from months to just seconds

#artificialintelligence

Ever since the world's first microscope was invented in 1590 by Hans and Zacharias Janssen--a Dutch father and son--our curiosity about what goes on at the tiniest scales has driven the development of increasingly powerful devices. Fast forward to 2021: we not only have optical microscopy methods that allow us to see tiny particles in higher resolution than ever before, we also have non-optical techniques, such as scanning force microscopes, with which researchers can construct detailed maps of a range of physical and chemical properties. The Institute for Bioengineering of Catalonia (IBEC)'s Nanoscale Bioelectrical Characterization group, led by UB Professor Gabriel Gomila, in collaboration with members of IBEC's Nanoscopy for Nanomedicine group, has been analyzing cells using a special type of microscopy called Scanning Dielectric Force Volume Microscopy, an advanced technique developed in recent years with which they can create maps of an electrical physical property called the dielectric constant. Each of the biomolecules that make up cells--that is, lipids, proteins and nucleic acids--has a different dielectric constant, so a map of this property is essentially a map of cell composition. The technique they developed has an advantage over the current gold-standard optical method, which involves applying a fluorescent dye that can disrupt the cell being studied.
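The months-to-seconds speed-up in the headline reflects a now-common pattern: train a regressor to approximate a slow physics-based calculation, then apply it cheaply per pixel. The sketch below shows the general idea only – the forward model, features and parameters are stand-ins I invented, not IBEC's actual pipeline:

```python
# Illustrative ML surrogate for a slow per-pixel physics fit (hypothetical model).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def slow_physics_model(eps_r, z):
    """Stand-in for the expensive forward model: force vs. tip-sample
    distance for a given dielectric constant (purely illustrative)."""
    return eps_r / (1.0 + z) ** 2

z_grid = np.linspace(0.1, 5.0, 50)
eps_train = rng.uniform(1, 20, 2000)  # dielectric constants for training
curves = np.array([slow_physics_model(e, z_grid) for e in eps_train])

# Train once (slow), then map each measured curve to a dielectric constant
# in microseconds instead of re-running a numerical fit for every pixel.
reg = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=2)
reg.fit(curves, eps_train)
print(reg.predict(curves[:3]))  # approximately recovers eps_train[:3]
```

The upfront training cost is paid once; every subsequent map of the cell then takes seconds instead of months of curve-by-curve fitting.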


AI & Machine Learning Operationalization Software Market Technology Developments and Future Growth to 2026

#artificialintelligence

A newly published study on the Global AI & Machine Learning Operationalization Software Market examines numerous in-depth and influential factors that shape the market and industry. All of the findings, data, and information provided in the report are validated and revalidated with the help of trustworthy sources. The analysts who authored the report took an industry-best research and analysis approach for an in-depth study of the market. The report forecasts demand, trends, and revenue growth at regional and country levels and analyses industry trends in each of the sub-segments from 2021 to 2026. The global AI & Machine Learning Operationalization Software market is projected to grow at a CAGR of 44.2% over the forecast period of 2021-2026.
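Taken at face value, compounding at that rate implies the market grows roughly 6.2-fold over the five-year window – a quick check:

```python
# Implied growth multiple from the quoted figure: 44.2% CAGR compounded
# over the five years from 2021 to 2026.
cagr, years = 0.442, 5
multiple = (1 + cagr) ** years
print(f"{multiple:.1f}x")  # ~6.2x
```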


114 Milestones In The History Of Artificial Intelligence (AI)

#artificialintelligence

In an expanded edition of Perceptrons published in 1988, Marvin Minsky and Seymour Papert responded to claims that their 1969 conclusions significantly reduced funding for neural network research: "Our version is that progress had already come to a virtual halt because of the lack of adequate basic theories… by the mid-1960s there had been a great many experiments with perceptrons, but no one had been able to explain why they were able to recognize certain kinds of patterns and not others."
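The pattern the quote alludes to maps onto a now-classic limitation that Minsky and Papert analysed: a single-layer perceptron can only learn linearly separable functions, so it cannot learn XOR, while adding one hidden layer suffices. A minimal demonstration using scikit-learn:

```python
# A single-layer perceptron cannot learn XOR; one hidden layer can.
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR: not linearly separable

single_layer = Perceptron(max_iter=1000).fit(X, y)
print("perceptron:", single_layer.score(X, y))  # cannot reach 1.0

mlp = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs",
                    max_iter=5000, random_state=0).fit(X, y)
print("one hidden layer:", mlp.score(X, y))     # typically 1.0
```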


Client Alert: Artificial intelligence and GDPR – teaching machines 'fairness'

#artificialintelligence

This week the Chair of the European Parliament's committee on AI expressed concerns about the enforcement of the European Commission's proposed AI rules, which he said could create national fragmentation similar to that seen with the GDPR. So what are the issues involved, what is the proposed new EU law, and how does the GDPR already regulate AI? At the start of 2020, 42% of companies in the EU said they use technologies that depend on AI, and another 18% said they are planning to use AI in the future (European Enterprise Survey – FRA, 2020). So this is clearly an area that is justifiably generating considerable activity and interest from both industry and regulators. It is important to note, however, that the available technologies currently involve varying levels of complexity, automation and human review and that, despite some companies' optimism about their AI capabilities, many applications currently in use remain at the development stage.