Software's Accelerating Data Needs May Benefit From Intel's High-Optane Push

#artificialintelligence

For all of computing's advances over the past few decades, one aspect that has remained fairly constant is the fundamental relationship among processors, memory chips, and storage, coupled with software designed to route data where it is most urgently needed. New memory technology that reached the market this year in response to big computing trends could upend this longstanding choreography among the components that shuttle information around the world's data centers. Many of today's business applications thrive on supersize datasets that need to be processed in near real time, which means even today's solid-state drives (SSDs) force processors to wait too long for data in storage. At the same time, memory performance gains haven't kept up with those of either CPUs or storage, and simply packing servers with more memory can be expensive.
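To put rough numbers on the latency gap described above, the short sketch below prints order-of-magnitude access times for each tier of the memory/storage hierarchy. The figures are illustrative ballpark values, not measurements from any particular system, and the Optane entry assumes a first-generation Optane SSD rather than Optane persistent memory.

# Rough, order-of-magnitude access latencies (illustrative figures, not benchmarks)
# showing why CPUs stall whenever hot data lives on storage instead of DRAM.
latency_ns = {
    "DRAM":             100,
    "Optane-class SSD": 10_000,
    "NVMe NAND SSD":    100_000,
    "Hard disk drive":  10_000_000,
}

for tier, ns in latency_ns.items():
    # Show each tier's latency and how many times slower it is than DRAM.
    print(f"{tier:<17} ~{ns:>12,} ns  (~{ns // latency_ns['DRAM']:,}x DRAM)")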


Three Myths About Today's In-Memory Databases

@machinelearnbot

In-memory database technology has become fashionable in recent years as the price of RAM has dropped substantially and multi-gigabyte chips have become affordable. By taking advantage of the cost-performance value of RAM, leading-edge database developers are boosting the performance of next-generation databases with in-memory technology. However, many developers who intend to adopt in-memory technology think of it only in terms of the speed of RAM and do not exploit its true power. The argument here is that in-memory technology means not only taking advantage of the speed of RAM, but also a new way of trading space for time. We understand that RAM offers lower latency than hard disk drives and even the newer SSDs.
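As a concrete illustration of "trading space for time", the following Python sketch contrasts a scan-per-query access pattern with an extra in-memory hash index that spends RAM to make lookups constant-time. The data and function names are invented for the example; this is a toy picture of the idea, not the internals of any particular in-memory database.

# Toy example: spend extra memory on an index so each query avoids a full scan.
from collections import defaultdict

orders = [
    {"id": 1, "customer": "acme", "total": 120.0},
    {"id": 2, "customer": "globex", "total": 75.5},
    {"id": 3, "customer": "acme", "total": 300.0},
]

def orders_by_customer_scan(customer):
    # Disk-style access pattern: scan every record for each query -> O(n) per lookup.
    return [o for o in orders if o["customer"] == customer]

# In-memory pattern: prebuild a hash index (more space) -> O(1) per lookup (less time).
index = defaultdict(list)
for o in orders:
    index[o["customer"]].append(o)

def orders_by_customer_indexed(customer):
    return index[customer]

print(orders_by_customer_scan("acme"))
print(orders_by_customer_indexed("acme"))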


AWS rolls out new EC2 high memory instances, tailored for SAP HANA

ZDNet

Amazon Web Services on Thursday announced new High Memory EC2 instances designed to run large in-memory databases like SAP HANA. The instances currently deliver 6 TB, 9 TB, and 12 TB of memory, with 18 TB and 24 TB instances coming in 2019. These High Memory instances enable customers to run in-memory databases in the same Amazon Virtual Private Cloud (VPC) as the rest of their enterprise applications, which means they can scale their in-memory database and easily connect it to storage, networking, analytics, IoT, or machine learning services. The deployment of in-memory databases is becoming more common as enterprises process more real-time data.
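For readers curious what launching such an instance inside an existing VPC might look like, below is a minimal boto3 sketch. The AMI ID, subnet ID, and Dedicated Host ID are placeholders, and the u-6tb1.metal type with host tenancy reflects how the 6 TB High Memory instances were offered at launch; details may differ for your account and region.

# Minimal boto3 sketch: launch a 6 TB High Memory instance in an existing VPC subnet,
# alongside the rest of an enterprise's applications. IDs below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder: a SAP HANA-certified AMI
    InstanceType="u-6tb1.metal",          # 6 TB High Memory instance type
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",  # placeholder: subnet in the same VPC as other apps
    Placement={
        "Tenancy": "host",
        "HostId": "h-0123456789abcdef0",  # placeholder: pre-allocated Dedicated Host
    },
)

print(response["Instances"][0]["InstanceId"])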


In-Memory Computing

#artificialintelligence

I'm going to detour briefly into my own musing here, wondering whether other cell technologies could be employed in this way. If you could get partial melting/crystallization of PCRAM cells, might that work? MRAM sounds like it wouldn't, given the strict north/south orientation of a magnetic domain, but it's not that simple. The programmed state is determined by the amount and angle of aligned dipoles, which could be something other than 100% and something other than parallel or anti-parallel. So MRAM, too, could ideally take on an analog value.
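To make the "analog value" idea concrete, here is a small NumPy sketch (not a device model) in which each crossbar cell stores one of a handful of intermediate levels, standing in for partial crystallization or partial dipole alignment, and a column-wise sum performs the multiply-accumulate in place. The number of levels and all values are invented for illustration.

# Illustrative sketch of analog in-memory compute: cells hold intermediate states,
# so a crossbar of cells performs a multiply-accumulate in a single analog step.
import numpy as np

levels = 8  # suppose partial crystallization / partial alignment yields 8 distinguishable states

def quantize(weights, levels):
    """Snap ideal weights in [0, 1] to the nearest of `levels` analog cell states."""
    return np.round(weights * (levels - 1)) / (levels - 1)

rng = np.random.default_rng(0)
weights = rng.random((4, 3))       # ideal analog values we would like each cell to store
cells = quantize(weights, levels)  # what the cells can actually hold
inputs = rng.random(4)             # "voltages" applied to the rows

# Summing along each column is the dot product the memory array computes "for free".
print("ideal MAC:    ", inputs @ weights)
print("in-memory MAC:", inputs @ cells)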


North Korean macOS Malware Adopts In-Memory Execution

#artificialintelligence

A new piece of macOS malware linked to the North Korean hacking group Lazarus employs in-memory execution of payloads, researchers revealed this week.