Chromebooks versus Windows laptops: Which should you buy?

PCWorld

Should I buy a Chromebook or a Windows laptop? It's a common question, whether asked by parents weighing the best computer option for back-to-school or by people who just want an inexpensive computer for themselves. We'll help you choose the right one. Our latest update answers more questions you might have, such as: How slow (and inexpensive) can a Chromebook be before it stops being usable? What do Windows 11 and Windows 11 SE mean for laptops? Read on for the answers, along with our up-to-date buying guide for November 2021 and Black Friday. A notebook PC or laptop powered by Microsoft Windows offers several advantages. Windows offers the most flexibility: you can run just about any app, use the browser of your choice, and configure antivirus options, utilities, and more, tweaking your PC as you see fit. That flexibility demands more computing horsepower, and often a higher price, than most Chromebooks require. Prices can soar into the thousands of dollars, and if you need a powerful PC for gaming or video editing, Chromebooks can't compete, and they don't try to. But you'll find some great deals among our more affordably priced top Windows picks. See our buying guide to the best laptops for even more options.


When Hackers Were Heroes

Communications of the ACM

Forty years ago, the word "hacker" was little known. Its march from obscurity to newspaper headlines owes a great deal to tech journalist Steven Levy, who in 1984 defied his publisher's advice and titled his first book Hackers: Heroes of the Computer Revolution.11 Hackers were a subculture of computer enthusiasts for whom programming was a vocation and playing around with computers constituted a lifestyle. Hackers was published only three years after Tracy Kidder's The Soul of a New Machine, explored in my last column (January 2021, pp. 32–37), but a lot had changed during the interval. Kidder's assumed readers had never seen a minicomputer, still less designed one. By 1984, in contrast, the computer geek was a prominent part of popular culture. Unlike Kidder, Levy had to make people reconsider what they thought they already knew. Computers were suddenly everywhere, but they remained unfamiliar enough to inspire a host of popular books pondering the personal and social transformations triggered by the microchip. The short-lived home computer boom had brought computer programming into the living rooms and basements of millions of middle-class Americans, sparking warnings about the perils of computer addiction. A satirical guide published the same year warned of "micromania."15 The year before, the film WarGames had suggested that computer-obsessed youth might accidentally trigger nuclear war.


Deep Learning NVIDIA GPU Workstations

#artificialintelligence

We understand every development environment is different, so shouldn't you have the option to choose what's best for you? All EMLI (Exxact Machine Learning Images) environments are available in the latest Ubuntu or CentOS Linux versions, and are built to perform right out of the box.


GPT-3 Creative Fiction

#artificialintelligence

"What if I told a story here, how would that story start?" Thus, the summarization prompt: "My second grader asked me what this passage means: …" When a given prompt isn't working and GPT-3 keeps pivoting into other modes of completion, that may mean one hasn't constrained the model enough by imitating a correct output, and one needs to go further: writing the first few words or sentence of the target output may be necessary.
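
To make the technique concrete, here is a minimal sketch of that constraining trick using the legacy OpenAI Completions API that was current in the GPT-3 era; the model name, sampling settings, and placeholder passage are illustrative assumptions, not the article's exact setup.

    # Constrain GPT-3 by ending the prompt with the first words of the
    # desired output, so the model continues in summarization mode instead
    # of pivoting into some other mode of completion.
    # Assumes the legacy openai package (pre-1.0 Completions API).
    import openai

    openai.api_key = "sk-..."  # your API key (placeholder)
    passage = "..."            # the text to be summarized (placeholder)

    prompt = (
        f'My second grader asked me what this passage means:\n\n"{passage}"\n\n'
        'I rephrased it for him, in plain language a second grader can '
        'understand:\n\n"Basically,'
    )

    response = openai.Completion.create(
        engine="davinci",   # base GPT-3 model (assumption)
        prompt=prompt,
        max_tokens=150,
        temperature=0.7,
        stop=['"'],         # stop at the closing quote of the rephrasing
    )
    print("Basically," + response.choices[0].text)

Seeding the completion with "Basically," is exactly the first-few-words constraint described above: it leaves GPT-3 little to do but continue the summary.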


Windows 10's Linux subsystem gets GPU compute and an easier install in new preview

PCWorld

Microsoft released improvements to the Windows Subsystem for Linux 2 (WSL2) in a Windows 10 preview build on Wednesday, with features benefiting newcomers and developers alike. As part of the update, WSL2 can now perform GPU compute functions, including using Nvidia's CUDA technology. The new additions deliver on promises Microsoft made at May's Build 2020 conference, where the company also teased graphical user interface support for the Windows Subsystem for Linux. WSL's improvements are part of Windows 10 Build 20150, part of the Dev Channel of Insider builds. Formerly known as the Fast Ring, the Dev Channel is devoted to testing new features that aren't necessarily tied to any upcoming Windows 10 feature release. As the name suggests, the Windows Subsystem for Linux 2 allows you to run a Linux kernel from within Windows.
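
One quick way to confirm the new GPU compute path works from inside a WSL2 distro is to run a small tensor operation on the GPU. The sketch below uses PyTorch and assumes the preview Windows build, Nvidia's CUDA-on-WSL driver, and a CUDA-enabled PyTorch install; treat it as a sanity check, not an official test.

    # Sanity check: is CUDA visible from inside this WSL2 distro?
    import torch

    if torch.cuda.is_available():
        device = torch.device("cuda")
        print("CUDA device:", torch.cuda.get_device_name(0))
        # Multiply two matrices on the GPU and bring the result back.
        a = torch.randn(1024, 1024, device=device)
        b = torch.randn(1024, 1024, device=device)
        print("Checksum:", (a @ b).sum().item())
    else:
        print("No CUDA device visible from WSL2.")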


Meltdown

Communications of the ACM

Moritz Lipp is a Ph.D. candidate at Graz University of Technology, Graz, Austria. Michael Schwarz is a postdoctoral researcher at Graz University of Technology, Graz, Austria. Daniel Gruss is an assistant professor at Graz University of Technology, Graz, Austria. Thomas Prescher is a chief architect at Cyberus Technology GmbH, Dresden, Germany. Werner Haas is the Chief Technology Officer at Cyberus Technology GmbH, Dresden, Germany.


Nvidia Jetson Xavier NX review: Redefining GPU-accelerated machine learning

#artificialintelligence

Nvidia launched the Jetson Xavier NX embedded System-on-Module (SoM) at the end of last year. It is pin-compatible with the Jetson Nano SoM and includes a CPU, a GPU, PMICs, DRAM, and flash storage. However, it was missing an important accessory: its own development kit. Since an SoM is an embedded board with just a row of connector pins, it is hard to use out of the box. A development board connects all the pins on the module to ports like HDMI, Ethernet, and USB.


A Survey on Edge Intelligence

arXiv.org Artificial Intelligence

Edge intelligence refers to a set of connected systems and devices for data collection, caching, processing, and analysis, based on artificial intelligence, in locations close to where the data is captured. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although it emerged only recently, spanning the period from 2011 to the present, this field of research has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the solutions by examining research results and observations for each of the four components, and present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate, compare, and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages, and drawbacks. This survey article provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of this emerging research field and the current state of the art, and discuss important open issues and possible theoretical and technical solutions.
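
Of the four components, edge offloading is perhaps the easiest to illustrate in miniature: a device runs inference locally when it can meet the application's latency budget and ships the work to an edge server otherwise. The sketch below is a toy policy of that kind; every name and number in it is hypothetical, not drawn from the survey.

    # Toy edge-offloading policy: offload only when the edge server beats
    # local execution and still meets the application's deadline.
    from dataclasses import dataclass

    @dataclass
    class Task:
        input_size_kb: float    # size of the data to process
        local_ms_per_kb: float  # estimated on-device compute cost
        deadline_ms: float      # application latency budget

    def should_offload(task: Task, uplink_kbps: float,
                       server_ms_per_kb: float) -> bool:
        local_latency = task.input_size_kb * task.local_ms_per_kb
        transfer_ms = task.input_size_kb / uplink_kbps * 1000.0
        remote_latency = transfer_ms + task.input_size_kb * server_ms_per_kb
        return remote_latency < local_latency and remote_latency <= task.deadline_ms

    task = Task(input_size_kb=500, local_ms_per_kb=0.8, deadline_ms=300)
    print("offload" if should_offload(task, uplink_kbps=10_000,
                                      server_ms_per_kb=0.05) else "run locally")

Real offloading decisions also weigh energy, privacy, and server load, which is precisely why the survey treats edge offloading as a research area of its own.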


The 84 biggest flops, fails, and dead dreams of the decade in tech

#artificialintelligence

The world never changes quite the way you expect. But at The Verge, we've had a front-row seat while technology has permeated every aspect of our lives over the past decade. Some of the resulting moments -- and gadgets -- arguably defined the decade and the world we live in now. But others we ate up with popcorn in hand, marveling at just how incredibly hard they flopped. This is the decade we learned that crowdfunded gadgets can be utter disasters, even if they don't outright steal your hard-earned cash. It's the decade of wearables, tablets, drones and burning batteries, and of ridiculous valuations for companies that were really good at hiding how little they actually had to offer. Here are 84 things that died hard, often hilariously, to bring us where we are today. Everyone was confused by Google's Nexus Q when it debuted in 2012, including The Verge -- which is probably why the bowling ball of a media streamer crashed and burned before it even came to market.


My Programming Start

#artificialintelligence

I started my programming journey recently, having previously used computers only for work, gaming, and one high-school Maya animation class. I upgraded from an old PC to a new custom-built Maingear PC, good for both gaming and work. I wanted to learn how to code, find work as a developer, and learn the best programming languages of 2019. I looked at bootcamps and online certifications, but the cost of these programs, the length of commitment they demand, and the fact that their curricula are not always up to date made me decide to go the self-taught route. I found a lot of media and tutorials offering to get me started, and trusting reviews for FreeCodeCamp.org,