Technology
Cray Announces New, AI-Focused Supercomputers - ExtremeTech
AMD plans to enter these markets with deep learning accelerators based on its Polaris and Vega architectures, but those chips haven't actually launched yet. By all accounts, these are the killer growth markets for the industry as a whole, and they help explain why even game developers like Blizzard want in on the AI craze. As compute resources shift toward Amazon, Microsoft, and other cloud service providers, the companies that can supply the hardware these workloads run on will be best positioned for the future. Smartphones and tablets didn't really work out for Nvidia or Intel (making AMD's decision to stay out of those markets look very wise in retrospect), but both are well positioned to capitalize on these new dense-server trends. AMD is obviously playing catch-up on the CPU and GPU fronts, but Ryzen should deliver strong server performance when Naples launches later this quarter.
Care and Feeding of Machine Learning in Marketing. - Wednesday, 24th May 2017 at 4Hoteliers
The most straightforward definition of machine learning is making computers work without explicitly telling them what to do (i.e., without programming every step). In terms of marketing, machine learning has many practical applications today. For example, improving the ability to predict how consumers will respond to marketing messages based on how they've responded in the past. Or, decreasing churn rates by better predicting the conditions and behaviors that indicate a customer is likely to reduce or cancel service. Machine learning can also help marketers predict the best time to send an email and which mix of copy, imagery, and call to action is likely to work best with specific customer segments.
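To make the "learning from data rather than explicit rules" idea concrete, here is a minimal sketch of churn prediction in plain Python. The segments and observations are entirely hypothetical; the point is that the churn-risk estimate is derived from past behavior instead of being hand-coded.

```python
# Minimal sketch of learning from data: estimate each customer segment's
# churn rate from past observations and use it to flag at-risk customers.
# All segment names and records below are hypothetical illustration data.
from collections import defaultdict

history = [
    # (customer_segment, churned?)
    ("low_usage", True), ("low_usage", True), ("low_usage", False),
    ("high_usage", False), ("high_usage", False), ("high_usage", True),
]

# segment -> [number churned, total observed]
counts = defaultdict(lambda: [0, 0])
for segment, churned in history:
    counts[segment][0] += int(churned)
    counts[segment][1] += 1

def churn_risk(segment):
    """Return the observed churn rate for a segment (churned / total)."""
    churned, total = counts[segment]
    return churned / total

print(churn_risk("low_usage"))   # 2/3 of low-usage customers churned
print(churn_risk("high_usage"))  # 1/3 of high-usage customers churned
```

A real marketing pipeline would replace these frequency counts with a trained classifier over many behavioral features, but the principle is the same: the prediction rule comes from the historical data, not from an explicit program.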
An Introduction to the MXNet Python API
In this series, I will try to give you an overview of the MXNet deep learning library: we'll look at its main features and its Python API (which I suspect will be the #1 choice). Later on, we'll explore some of the MXNet tutorials and notebooks available online, and we'll hopefully manage to understand every single line of code! If you'd like to learn more about the rationale and the architecture of MXNet, you should read the paper "MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems". We'll cover most of the concepts presented in the paper, but hopefully in a more accessible way. First things first: let's install MXNet.
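As a concrete starting point, here is a minimal install sketch. The GPU package name is an assumption based on the CUDA-specific pip packages MXNet publishes; check PyPI for the variant matching your setup.

```shell
# Install the CPU-only build of MXNet from PyPI
pip install mxnet

# For NVIDIA GPU support, install the matching CUDA build instead,
# e.g. (package name assumed; pick the one for your CUDA version):
# pip install mxnet-cu80

# Sanity check: import the library and print its version
python -c "import mxnet as mx; print(mx.__version__)"
```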
The past, present and future of AI in customer experience
However, AI represents an opportunity to introduce intelligent, scalable engagement and more personalised experiences that help customers accomplish tasks or solve problems while also improving overall satisfaction. Whether they're based in messaging platforms or hardware devices, virtual concierges are bots designed to provide personalised services, and AI applications like these are already being implemented today. Today's customers live in a multi-screen, omnichannel world. Whether it's integrating back-end CRM, enhancing commerce, personalising experiences, introducing new touch points, or predicting behaviours, trends and expectations, successful AI implementations require a new blueprint.
a16z Podcast: Quantum Computing, Now and Next – Andreessen Horowitz
However, we're now resorting to brute-force hacks to keep pushing it beyond its limits and are getting closer to the point of diminishing returns (especially given costly manufacturing infrastructure). Yet this very dynamic is leading to "a Cambrian explosion" in computing capabilities; just look at what's happening today with GPUs, FPGAs, and neuromorphic chips. Through such continuing performance improvements and parallelization, classical computing continues to reshape the modern world. But we're so focused on making our computers do more that we're not talking enough about what classical computers can't do: compute things the way nature does, which is according to quantum mechanics. So our smart machines are really quite dumb, argues Rigetti Computing founder and CEO Chad Rigetti; they're limited to human-made binary code versus the natural reality of continuous variables.
The Music Industry in 2026 – Technology and Trends Changing the Future of Music
The music industry has been rapidly transforming, with technology as the accelerant. Ten years from now the music industry will have changed significantly due to the rise of streaming, the proliferation of digital distribution, the marginalization of terrestrial radio, the rise of cloud-based personalization driven by artificial intelligence and machine-learning algorithms fed by Big Data, and the emergence of new distribution channels such as social media and virtual reality. Video didn't kill the Radio Star, and neither will streaming… yet. Similar to the collapse and shakeout of the print media industry (newspapers and publishing), terrestrial radio has undergone consolidation (Media Life Magazine). Even with the consolidation, the remaining big players are struggling. The nation's largest owner of radio stations, iHeartMedia, with 850 AM and FM stations in the US, is saddled with debt (Shaw, Lucas and Keller, Laura).
- Media > Music (1.00)
- Information Technology > Services (1.00)
3 Trends that Will Define Our Future – Erik P.M. Vermeulen – Medium
This week, I am in Japan teaching a course on business and law in a digital world. To prepare the next generation for the future, it is necessary to think about recent developments in technology. But we don't hear much from Japan these days (compared to the late 80s when Sony, Nikon, Toshiba and other amazing Japanese businesses were dominant). Yet, for various technologies, companies in Japan are still innovative and important. Japan approaches robotic technologies in a different way from other technology regions, such as Silicon Valley.
- Asia > Japan (0.90)
- North America > United States > California (0.25)
- Information Technology (0.91)
- Banking & Finance (0.73)
- Semiconductors & Electronics (0.55)
When is Big Data Too Big?
In fact, the 'big' in Big Data will likely get even larger as billions of IoT (Internet of Things) devices come online. For self-styled 'quants' like myself, Big Data is manna from heaven, as we can use large data sets to gain insight on almost everything. However, the rush of data and the thrill of cracking an insanely large data set oftentimes breed a sense of overconfidence in the power of Big Data. 'We make a lot of good decisions from small data and spreadsheets.' I would have to agree: more is not always better.
- Information Technology > Data Science > Data Mining > Big Data (1.00)
- Information Technology > Artificial Intelligence (1.00)
20 years after Deep Blue, a new era in human-machine collaboration
On May 11, 1997, an IBM computer called Deep Blue defeated the reigning world chess champion, Garry Kasparov, capturing the attention and imagination of the world. Distinguished IBM Research Staff Member Murray Campbell, one of the original developers of Deep Blue, looks back at the match and explains how AI has evolved over the last 20 years to embody augmented intelligence.
This app uses artificial intelligence to turn design mockups into source code
While traditionally it has been the task of front-end developers to transform the work of designers from raw graphical user interface mockups into actual source code, this could soon be a thing of the past, courtesy of artificial intelligence. Copenhagen-based startup UIzard Technologies has leveraged the latest developments in the field of machine learning to build a neural network that, once fed with raw screenshots of a graphical user interface, proceeds to automatically generate code. What is particularly intriguing is that the so-called Pix2Code model can produce code for three different platforms: Android, iOS, and web-based technologies. As UIzard founder Tony Beltramelli explains in his research, the novel approach could potentially "end the need for manually-programmed" user interfaces altogether. At present, the method generates code from screenshots with an impressive accuracy of over 77 percent, and the consistency of the algorithm is likely to improve in the future.