2022 Technology Trends: Digital Health Marks the Future of Medical Development

#artificialintelligence

Digital health products played a prominent role in addressing the COVID-19 pandemic and in helping caregivers and patients navigate their care in the past year. Going into 2022, remote monitoring, wearables, sensors, and other mobile health (mHealth) products are taking center stage in defining the future of medicine. "One of the clearest areas of excitement now and into the future is the sector of healthcare products referred to as wearables. These are devices like fitness trackers, heart monitors, and other devices that record in real time and communicate biometric data either directly to the user or to a connected platform for a variety of purposes, including coaching, intervention, analysis and even within clinical trials administration," notes a recent report from contract manufacturer Jabil, St. Petersburg, FL. The report, "Digital Health Technology Trends," finds that "the top three solution categories providers are developing or plan to develop are in patient monitoring, diagnostic equipment, and on-body or wearable devices." As digital and mHealth capabilities have become an integral part of many medical devices and diagnostics, they have enabled a more agile and flexible healthcare system to emerge in the face of COVID-19. These products will continue to improve access to patient care. Digital transformation of healthcare is not just about adopting new digital technology, notes a recent position paper from medtech giant Philips. It's about "reimagining healthcare for the digital age, using the power of data, artificial intelligence (AI), cloud-based platforms, and new business models to improve health outcomes, lower the cost of care, and improve the human care experience for patients and staff alike."


Forecasting: theory and practice

arXiv.org Machine Learning

Forecasting has always been at the forefront of decision making and planning. The uncertainty that surrounds the future is both exciting and challenging, with individuals and organisations seeking to minimise risks and maximise utilities. The large number of forecasting applications calls for a diverse set of forecasting methods to tackle real-life challenges. This article provides a non-systematic review of the theory and the practice of forecasting. We provide an overview of a wide range of theoretical, state-of-the-art models, methods, principles, and approaches to prepare, produce, organise, and evaluate forecasts. We then demonstrate how such theoretical concepts are applied in a variety of real-life contexts. We do not claim that this review is an exhaustive list of methods and applications. However, we hope that our encyclopedic presentation will offer a point of reference for the rich work that has been undertaken over the last decades, along with some key insights for the future of forecasting theory and practice. Given its encyclopedic nature, the intended mode of reading is non-linear. We offer cross-references to allow readers to navigate through the various topics. We complement the theoretical concepts and applications covered with large lists of free or open-source software implementations and publicly available databases.
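
As a concrete taste of the kind of method such a review covers, here is a minimal, self-contained sketch of simple exponential smoothing, one of the classical forecasting techniques. The example series and the smoothing level alpha are purely illustrative and are not taken from the article.

```python
# Minimal sketch of simple exponential smoothing; data and alpha are illustrative.
import numpy as np

def simple_exponential_smoothing(y, alpha):
    """Return the smoothed levels; each level is the one-step-ahead forecast."""
    level = y[0]                 # initialise the level with the first observation
    levels = [level]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level   # update the smoothed level
        levels.append(level)
    return np.array(levels)

y = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119], dtype=float)
fitted = simple_exponential_smoothing(y, alpha=0.3)
print(f"forecast for the next period: {fitted[-1]:.1f}")  # flat forecast ahead
```

Because the method carries only a single level forward, the forecast for every future horizon is the same number, which is exactly why it serves as a baseline against which richer models are judged.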


Artificial intelligence simulates microprocessor performance in real-time

#artificialintelligence

This approach is detailed in a paper presented at MICRO-54, the 54th IEEE/ACM International Symposium on Microarchitecture and one of the top conferences in the field of computer architecture, where it was selected as the conference's best publication. "This is a problem that has been intensively studied and has traditionally relied on additional circuits to solve it," said Zhiyao Xie, lead author of the paper and a doctoral candidate in the lab of Yiran Chen, a professor of electrical and computer engineering at Duke. "But our approach runs directly on microprocessors in the background, which opens up a lot of new opportunities. I think that's why people are excited about it." Modern computer processors complete their computation cycles about 3 trillion times per second. Tracking the energy consumed by such fast transitions is important to maintaining the performance and efficiency of the entire chip.


AI models microprocessor performance in real-time: New algorithm predicts processor power consumption trillions of times per second while requiring little power or circuitry of its own

#artificialintelligence

The approach is detailed in a paper published at MICRO-54: 54th Annual IEEE/ACM International Symposium on Microarchitecture, one of the top-tier conferences in computer architecture, where it was selected as the conference's best publication. "This is an intensively studied problem that has traditionally relied on extra circuitry to address," said Zhiyao Xie, first author of the paper and a PhD candidate in the laboratory of Yiran Chen, professor of electrical and computer engineering at Duke. "But our approach runs directly on the microprocessor in the background, which opens many new opportunities. I think that's why people are excited about it." In modern computer processors, cycles of computations are made on the order of 3 trillion times per second. Keeping track of the power consumed by such intensely fast transitions is important to maintain the entire chip's performance and efficiency.
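
The underlying idea lends itself to a small illustration. The sketch below is not the Duke team's implementation; with synthetic data and hypothetical signal names, it only shows how a sparse linear model can be trained to estimate per-cycle power from a small subset of monitored hardware signals, which is what makes an always-on, low-overhead power estimator plausible.

```python
# Illustrative sketch only, NOT the paper's method or code. Synthetic data stands
# in for per-cycle power measurements and candidate hardware signals.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Each row is one clock cycle; each column is the toggle activity of one
# candidate signal. y is the "measured" per-cycle power used for training.
n_cycles, n_signals = 5000, 200
X = rng.integers(0, 2, size=(n_cycles, n_signals)).astype(float)
true_weights = np.zeros(n_signals)
true_weights[rng.choice(n_signals, size=10, replace=False)] = rng.uniform(0.5, 2.0, 10)
y = X @ true_weights + rng.normal(0.0, 0.05, n_cycles)

# The L1 penalty drives most weights to zero, so only a handful of signals
# would need to be monitored on-chip at runtime.
model = Lasso(alpha=0.01).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(f"signals kept: {len(selected)} of {n_signals}")

# Estimating power for new cycles is then just a small weighted sum,
# cheap enough to evaluate continuously in the background.
estimates = X[:5, selected] @ model.coef_[selected] + model.intercept_
print("first five power estimates:", np.round(estimates, 3))
```

The sparsity is the point: only the signals with non-zero weights need to be tapped by the on-chip estimator, which keeps its own power and circuitry cost negligible.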


When Creators Meet the Metaverse: A Survey on Computational Arts

arXiv.org Artificial Intelligence

The metaverse, an enormous virtual-physical cyberspace, has brought unprecedented opportunities for artists to blend every corner of our physical surroundings with digital creativity. This article conducts a comprehensive survey of computational arts, covering seven critical topics relevant to the metaverse and describing novel artworks in blended virtual-physical realities. The topics first cover the building elements of the metaverse, e.g., virtual scenes and characters and auditory and textual elements. Next, we reflect on several remarkable types of novel creations in the expanded horizons of metaverse cyberspace, such as immersive arts, robotic arts, and other user-centric approaches fuelling contemporary creative outputs. Finally, we propose several research agendas: democratising computational arts, digital privacy and safety for metaverse artists, ownership recognition for digital artworks, technological challenges, and so on. The survey also serves as introductory material for artists and metaverse technologists to begin creations in the realm of surrealistic cyberspace.


The age of exascale and the future of supercomputing

#artificialintelligence

Argonne looks to exascale and beyond, sorting out the relationship between computing and experimental facilities, the need for speed, and AI's role in making it all work. In 1949, physicists at the U.S. Department of Energy's (DOE) newly minted Argonne National Laboratory ordered the construction of the Argonne Version of the Institute's Digital Automatic Computer, or AVIDAC. A modified version of the first electronic computer built at the Institute for Advanced Study in Princeton, New Jersey, it was intended to help solve complex problems in the design of nuclear reactors. With a floor area of 500 square feet and power consumption of 20 kilowatts, AVIDAC boasted remarkable computing power for the time. It possessed a memory of 1,024 words (about 5.1 kilobytes in total), could perform 1,000 multiplications per second, and had a programming capability that allowed it to solve problems consistently and accurately. Today, your smartphone can store around 100 million times more data, and can do in a single second what would have taken AVIDAC two months.


Supercomputing, 5G, & Enterprise AI: Takeaways from Nvidia's 2021 Conference

#artificialintelligence

From powering video games with its GPUs to driving groundbreaking innovations in data science, AI, and computing, Nvidia is helping lead the fourth industrial revolution. After years of innovation and experimentation, Nvidia launched "Grace," the company's first data center CPU, designed to tap large datasets for workloads such as natural language processing, recommendation systems, and AI supercomputing. After opening its Omniverse design and collaboration platform in December, the company is now developing Omniverse Enterprise, with over 400 major companies already using the platform. With BlueField-3, the company is enabling real-time network visibility and detection of cyberthreats across organisations, and it announced many other innovations, from autonomous vehicles to a new SDK for quantum circuit simulation, that have yet to make their mark on AI and technology.


Four MIT faculty members receive 2021 US Department of Energy early career awards

#artificialintelligence

The U.S. Department of Energy (DoE) recently announced the names of 83 scientists who have been selected for its 2021 Early Career Research Program. The list includes four faculty members from MIT: Riccardo Comin of the Department of Physics; Netta Engelhardt of the Department of Physics and Center for Theoretical Physics; Philip Harris of the Department of Physics and Laboratory for Nuclear Science; and Mingda Li of the Department of Nuclear Science and Engineering. Each year, the DoE selects researchers for significant funding intended to strengthen the "nation's scientific workforce by providing support to exceptional researchers during crucial early career years, when many scientists do their most formative work." The quantum technologies of tomorrow, including more powerful computing, better navigation systems, and more precise imaging and magnetic sensing devices, rely on understanding the properties of quantum materials. Quantum materials exhibit unique physical characteristics and can give rise to phenomena like superconductivity.


A Review on Edge Analytics: Issues, Challenges, Opportunities, Promises, Future Directions, and Applications

arXiv.org Artificial Intelligence

Edge technology aims to bring Cloud resources (specifically, compute, storage, and network) into close proximity of the Edge devices, i.e., the smart devices where the data are produced and consumed. Embedding computing and applications in Edge devices leads to two emerging concepts in Edge technology, namely, Edge computing and Edge analytics. Edge analytics uses techniques or algorithms to analyze the data generated by the Edge devices. With the emergence of Edge analytics, Edge devices have become a complete package, both producing data and analyzing it. Currently, however, Edge analytics cannot provide full support for the execution of analytic techniques: Edge devices cannot execute advanced and sophisticated analytic algorithms owing to various constraints such as a limited power supply, small memory size, and limited resources. This article aims to provide a detailed discussion of Edge analytics. It gives a clear explanation distinguishing the three concepts of Edge technology, namely, Edge devices, Edge computing, and Edge analytics, along with their issues. Furthermore, the article discusses the implementation of Edge analytics to solve many problems in various areas such as retail, agriculture, industry, and healthcare. In addition, state-of-the-art research papers on Edge analytics are rigorously reviewed to explore the existing issues, emerging challenges, research opportunities and directions, and applications.
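
To make those constraints concrete, here is a hypothetical sketch of the sort of lightweight analytic an Edge device can realistically run: a rolling mean-and-deviation check that flags anomalous sensor readings locally and forwards only those upstream. The window size, threshold, and simulated stream are all made-up illustration values, not from the article.

```python
# Hypothetical Edge-analytics sketch: keep a short window of recent readings,
# flag values far from the local mean, and forward only those to the cloud.
from collections import deque

WINDOW = 20        # readings kept in the device's small memory
THRESHOLD = 3.0    # deviation (in standard deviations) treated as anomalous

recent = deque(maxlen=WINDOW)

def process_reading(value):
    """Return True if the reading should be forwarded to the cloud as an anomaly."""
    if len(recent) == WINDOW:
        mean = sum(recent) / WINDOW
        std = (sum((x - mean) ** 2 for x in recent) / WINDOW) ** 0.5
        if std > 0 and abs(value - mean) > THRESHOLD * std:
            return True          # anomalous: forward, and keep it out of the baseline
    recent.append(value)         # normal (or still warming up): stays on the device
    return False

# Simulated temperature stream: steady readings followed by one spike.
stream = [21.0, 21.2, 20.9] * 10 + [35.0, 21.1]
forwarded = [v for v in stream if process_reading(v)]
print("readings forwarded to the cloud:", forwarded)
```

Even this tiny routine captures the trade-off the review describes: the device does just enough computation to decide what matters, and the heavier analysis stays in the Cloud.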


Data Science With Raspberry Pi and Smart Sensors

#artificialintelligence

Ever thought that IoT could be used with data science? Most probably you did not even think of it (if you did, bravo!). I am about to share with you how an IoT device works and how we can benefit from it in data science. Before getting in, I want to tell you that I will mostly talk about the Raspberry Pi, a mini-computer, and about the different sensors and add-ons that we can use with it. There are a lot of IoT devices out there, but I will talk about this particular device. IoT stands for "Internet of Things".
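
To ground that, here is a minimal sketch of what data collection on a Raspberry Pi can look like, assuming a DHT22 temperature/humidity sensor wired to GPIO pin 4 and the legacy Adafruit_DHT Python library; the pin number, file name, and sampling schedule are placeholders to adapt to your own setup.

```python
# Minimal sketch: log DHT22 readings on a Raspberry Pi to a CSV for later analysis.
# Assumes GPIO pin 4 and the legacy Adafruit_DHT library (pip install Adafruit_DHT).
import csv
import time
from datetime import datetime

import Adafruit_DHT

SENSOR = Adafruit_DHT.DHT22
GPIO_PIN = 4
LOG_FILE = "readings.csv"

with open(LOG_FILE, "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(10):                      # take ten samples, one per minute
        humidity, temperature = Adafruit_DHT.read_retry(SENSOR, GPIO_PIN)
        if humidity is not None and temperature is not None:
            writer.writerow([datetime.now().isoformat(), temperature, humidity])
            f.flush()                        # keep the CSV usable even if interrupted
        time.sleep(60)
```

The resulting CSV can then be loaded into pandas, on the Pi itself or on any other machine, for the usual data-science workflow of cleaning, plotting, and modelling.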