

6 reasons to ditch your old PC and buy a modern laptop

PCWorld

Intel, Microsoft, and other PC power players like to brag about how fast their computers are by drawing comparisons to "five-year-old PCs." But does this obsession with performance miss a more important message? We think it does, and there are many more reasons to buy a modern PC than just raw speed. Interactivity, convenient security, ease of use--these are critical features that don't appear on spec lists. Even a modern PC's sheer portability may not be immediately apparent.


78 Percent of UK Office Workers Believe Their Jobs Will Survive Automation

#artificialintelligence

LONDON--(BUSINESS WIRE)--Seventy-eight percent of UK office workers are confident their jobs will survive automation, according to new independent research* undertaken for leading enterprise Robotic Process Automation (RPA) software company UiPath. The research, exploring the attitudes of 1,000 office employees toward automation, revealed that nearly half (49 percent) do not see how their administrative duties can be undertaken by a software robot. "RPA is one of the most far-reaching revolutions in the UK workplace," said Kulpreet Singh, managing director EMEA, UiPath. "Office employees may be in for a pleasant surprise as the burden of boring and routine tasks moves to being performed by software robots instead. They will need to adjust to having a greater amount of time for more valuable work."


Coding skills won't save your job -- but the humanities will

#artificialintelligence

Coding boot camps are becoming almost as popular as college degrees: Code schools graduated more than 22,000 students in 2017 alone. The bet for many is that coding and computer programming will save their jobs from automation, and there's a resulting wave of emphasis on STEM skills. But while a basic understanding of computer science may always be valuable, it is not a future-proof skill. If people want a skill set that can adapt and ride the wave of workplace automation, they should look to the humanities. Knowledge of human culture and history allows us to shape the direction in which technology is developed, identifying what problems it should solve and what real-world concerns should be considered throughout the process.


Is Machine Learning The Biggest Focus In The Mobile Industry? - insideBIGDATA

#artificialintelligence

Whether people are aware of it or not, artificial intelligence and machine learning have had a huge impact on how humans interact with machines, computers, and devices. The impact can be felt across a range of industries, including travel, retail, and advertising. Both the Android and iOS mobile platforms have utilized this technology to create innovative and exciting new apps. How is machine learning currently being used? Artificial intelligence and machine learning are already being used to improve our everyday experiences.


Windows 10 may offer deeper support for AI helpers like Alexa

Engadget

While you can use voice assistants like Alexa on Windows 10, they still play second fiddle to Cortana. You can't just talk to your computer -- you have to either click a button or use a keyboard shortcut. Thankfully, Microsoft might be a little more egalitarian in the future. Albacore, WalkingCat and others have discovered that Windows 10 test releases may offer deeper support for third-party voice assistants. You could activate apps with a hotword (including when your PC is locked), and possibly "replace" Cortana on a system level.


If tech experts worry about artificial intelligence, shouldn't you as well? | John Naughton

#artificialintelligence

Fifty years ago last Sunday, a computer engineer named Douglas Engelbart gave a live demonstration in San Francisco that changed the computer industry and, indirectly, the world. In the auditorium, several hundred entranced geeks watched as he used something called a "mouse" and a special keypad to manipulate structured documents and showed how people in different physical locations could work collaboratively on shared files, online. It was, said Steven Levy, a tech historian who was present, "the mother of all demos". "As windows open and shut and their contents reshuffled," he wrote, "the audience stared into the maw of cyberspace. Engelbart, with a no-hands mic, talked them through, a calm voice from Mission Control as the truly final frontier whizzed before their eyes."


Does AI Truly Learn And Why We Need to Stop Overhyping Deep Learning

#artificialintelligence

AI today is described in breathless terms: computer algorithms that use silicon incarnations of our organic brains to learn and reason about the world, intelligent superhumans rapidly making their creators obsolete. The reality could not be more different. As deep learning moves from the lab into production use in mission-critical fields from medicine to driverless cars, we must recognize its very real limitations: these systems are piles of software code and statistics, not the learning, thinking intelligences we describe them as. Every day, data scientists build machine learning algorithms to make sense of the world and distill large piles of data into marketable insights. As guided machine-assistance tools, they operate much like the large classical observation equipment of the traditional sciences: software microscopes and telescopes pointed at society.
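
To make the "code and statistics" point concrete, here is a minimal sketch (in TypeScript, dependency-free; the function and data are invented for illustration, not taken from the article): the entire "knowledge" of a trained linear model is a couple of numbers fitted by arithmetic over data.

```typescript
// A "trained model" is just fitted numbers: ordinary least squares on
// (x, y) pairs yields a slope and an intercept -- statistics, not thought.
function fitLine(xs: number[], ys: number[]): { slope: number; intercept: number } {
  const n = xs.length;
  const meanX = xs.reduce((s, x) => s + x, 0) / n;
  const meanY = ys.reduce((s, y) => s + y, 0) / n;
  let cov = 0;
  let varX = 0;
  for (let i = 0; i < n; i++) {
    cov += (xs[i] - meanX) * (ys[i] - meanY);
    varX += (xs[i] - meanX) ** 2;
  }
  const slope = cov / varX;
  return { slope, intercept: meanY - slope * meanX };
}

// "Training": the model "learns" that y is roughly 2x + 1.
const model = fitLine([1, 2, 3, 4], [3.1, 4.9, 7.2, 8.8]);
// "Inference": prediction is plain arithmetic on the fitted numbers.
const predict = (x: number) => model.slope * x + model.intercept;
console.log(model, predict(5));
```

Deep networks fit millions of such numbers instead of two, but the character of the artifact is the same: parameters estimated from data, not understanding.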


3Q: Aleksander Madry on building trustworthy artificial intelligence

MIT News

Machine learning algorithms now underlie much of the software we use, helping to personalize our news feeds and finish our thoughts before we're done typing. But as artificial intelligence becomes further embedded in daily life, expectations have risen. Before autonomous systems fully gain our confidence, we need to know they are reliable in most situations and can withstand outside interference; in engineering terms, that they are robust. We also need to understand the reasoning behind their decisions: that they are interpretable. Aleksander Madry, an associate professor of computer science at MIT and a lead faculty member of the Trustworthy AI initiative at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), compares AI to a sharp knife: a useful but potentially hazardous tool that society must learn to wield properly.
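
Madry's group is best known for its work on adversarial examples, which make the robustness concern concrete. As a toy sketch (TypeScript; the classifier and numbers are invented for illustration, not from the interview): for a linear model, nudging every feature slightly against the sign of its weight is guaranteed to lower the score, so a small, systematic perturbation can flip the decision.

```typescript
// Toy linear classifier: score(x) = w . x + b, decision = score > 0.
const w = [1.2, -0.7, 0.9, -1.1, 0.8];
const b = 0.05;
const score = (x: number[]) => x.reduce((s, xi, i) => s + xi * w[i], b);

// FGSM-style attack on a linear model: step each feature by eps against
// the sign of its weight; the score drops by exactly eps * sum(|w_i|).
const attack = (x: number[], eps: number) =>
  x.map((xi, i) => xi - eps * Math.sign(w[i]));

const x = [0.4, 0.1, 0.3, -0.2, 0.5];   // a confidently positive input
const margin = score(x);                 // distance from the decision boundary
const sumAbsW = w.reduce((s, wi) => s + Math.abs(wi), 0);
const eps = (1.1 * margin) / sumAbsW;    // just enough to cross the boundary

console.log(score(x) > 0);               // true
console.log(score(attack(x, eps)) > 0);  // false: the decision flips
```

In high-dimensional inputs such as images, the same arithmetic means an imperceptibly small per-pixel change can flip a classifier's output, which is why robustness has to be engineered rather than assumed.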


5 Free eBooks to Help You Learn Machine Learning in 2019 - DZone AI

#artificialintelligence

Today, Machine Learning is one of the most important trends in every area of software engineering. No longer limited to researchers and analysts, it's a vital part of everything from cybersecurity to web development. To help you get started with Machine Learning, we've put together this list of 5 free Machine Learning eBooks from Packt. You can download as many of them as you like -- all you'll need to do is register when you download your first title. And there's a good reason the list opens with a Python title: Python is the go-to language if you want to develop Machine Learning models.


Creating Neural Networks in JavaScript: Quick-Start Guide

#artificialintelligence

Neural networks, and the machine learning techniques that rely on them to give computers the ability to learn without being explicitly programmed, seem to be everywhere these days. Scientists want to use advanced neural networks to find energy materials, Wall Street would like to train neural networks to manage hedge funds and pick stocks, and Google has been relying on neural networks to deliver highly accurate translations and transcriptions. At the same time, powerful consumer-grade hardware for machine learning is getting more affordable. For example, Nvidia recently introduced its Titan V graphics card, which is targeted specifically at machine learning developers who would like to create neural networks without paying for a special server to handle all the complex math operations involved. With the future of machine learning looking so bright and the necessary computational resources so available, many JavaScript developers are wondering about the easiest way to create neural networks in JavaScript, as in the sketch below.
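
As a taste of how small the core of such a quick-start can be, here is a dependency-free sketch in TypeScript (typed JavaScript; everything in it is illustrative rather than taken from the guide): a single sigmoid neuron trained by gradient descent to learn logical OR. A real project would more likely reach for a library such as TensorFlow.js, but the underlying training loop is this compact.

```typescript
// A single sigmoid neuron trained with plain gradient descent to learn
// logical OR (linearly separable, so one neuron is enough).
const sigmoid = (z: number) => 1 / (1 + Math.exp(-z));

const X = [[0, 0], [0, 1], [1, 0], [1, 1]]; // inputs
const y = [0, 1, 1, 1];                     // OR targets

const w = [0, 0]; // weights
let b = 0;        // bias
const lr = 0.5;   // learning rate

for (let epoch = 0; epoch < 5000; epoch++) {
  for (let i = 0; i < X.length; i++) {
    const out = sigmoid(w[0] * X[i][0] + w[1] * X[i][1] + b);
    // For sigmoid + cross-entropy loss, d(loss)/d(pre-activation) = out - y.
    const err = out - y[i];
    w[0] -= lr * err * X[i][0];
    w[1] -= lr * err * X[i][1];
    b -= lr * err;
  }
}

// The trained neuron now reproduces OR: ~0 for (0, 0), ~1 otherwise.
for (const [a, c] of X) {
  console.log(a, c, sigmoid(w[0] * a + w[1] * c + b).toFixed(3));
}
```

Harder functions such as XOR are not linearly separable and would need a hidden layer, which is exactly where a library's layer and optimizer abstractions start to pay off.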