Companies today are leveraging ever more user data to build models that improve their products and user experience, measuring user sentiment to develop products that match users' needs. However, this predictive capability can be harmful to individuals who wish to protect their privacy. Building models on sensitive personal data undermines the privacy of users and can cause real damage to a person if the data is leaked or misused. A simple solution that companies have employed for years is data anonymisation: removing personally identifiable information from datasets.
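In its simplest form, this kind of anonymisation just drops the fields that directly identify a person before the data is used for modelling. A minimal sketch in Python (the field names such as `name` and `email` are illustrative assumptions, not a standard schema):

```python
# Minimal PII-removal sketch: drop direct identifiers from records
# before they are used for modelling. Field names are illustrative.

PII_FIELDS = {"name", "email", "phone", "ssn"}

def anonymise(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

users = [
    {"name": "Alice", "email": "alice@example.com", "age": 34, "city": "Leeds"},
    {"name": "Bob", "phone": "555-0100", "age": 29, "city": "York"},
]
anonymised = [anonymise(u) for u in users]
# Each anonymised record keeps only non-identifying fields such as age and city.
```

Note that stripping direct identifiers is often insufficient on its own: combinations of the remaining fields (so-called quasi-identifiers, such as age plus city) can still re-identify individuals when joined with other datasets.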
The Army Engineer Research and Development Center (ERDC) is working with Microsoft to improve climate modeling and natural-disaster resilience planning through predictive analytics, cloud-based tools and artificial intelligence services. Under a new agreement, ERDC will test the scalability of its coastal storm modeling system, CSTORM-MS -- which was previously run on high-performance computers -- inside Microsoft's Azure Government cloud. CSTORM-MS gives coastal communities a robust, standardized approach for determining the risk posed by future tropical and extra-tropical storms, as well as wind, waves and water levels, and for evaluating flood risk reduction measures. Currently, CSTORM-MS models are run at ERDC's Department of Defense Supercomputing Resource Center. In 2020, ERDC worked with DOD's High Performance Computing Modernization Program (HPCMP) on a feasibility study testing whether CSTORM-MS could be run in a commercial cloud.
Cloud computing on a certified, compliant, properly run cloud service like Microsoft Azure is likely to be far more secure than on-premises servers in your office or your data centre. Your data is encrypted at rest and in motion; cloud systems are probably patched more often and configured more securely than your own servers; and admin access is locked down, enabled only for 'just enough access, just in time' to run specific commands within specific time windows. The admins will also have gone through background checks and work in secure locations that require biometric credentials to access. There are still problems, though.
Cloud-based speech recognition APIs give developers and enterprises an easy way to add speech-enabled features to their applications. However, sending audio containing personal or company-internal information to the cloud raises privacy and security concerns, and the recognition results generated in the cloud may also reveal sensitive information. This paper proposes a deep polynomial network (DPN) that can be applied to encrypted speech as an acoustic model. Clients send their data to the cloud in encrypted form to keep it confidential, while the DPN can still make frame-level predictions over the encrypted speech and return them in encrypted form. One useful property of the DPN is that it can be trained on unencrypted speech features in the traditional way. To keep the cloud away from both the raw audio and the recognition results, a cloud-local joint decoding framework is also proposed. We demonstrate the effectiveness of the model and framework on the Switchboard and Cortana voice assistant tasks, with small performance degradation and increased latency compared with traditional cloud-based DNNs.
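The reason a polynomial network can operate on encrypted data is that its forward pass uses only additions and multiplications, the operations that leveled homomorphic encryption schemes support; a sigmoid or ReLU activation cannot be evaluated directly on ciphertexts. A toy sketch of such a forward pass in plaintext NumPy (the layer sizes and the square activation here are illustrative assumptions, not the architecture from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy deep polynomial network: affine layers with a square activation.
# Every operation below is an addition or a multiplication, so the same
# computation can in principle be carried out over homomorphically
# encrypted inputs. Dimensions are illustrative, not from the paper.
W1 = rng.standard_normal((40, 16)) * 0.1   # input: 40-dim acoustic frame
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 8)) * 0.1    # output: 8 frame-level classes (toy)
b2 = np.zeros(8)

def dpn_forward(frame: np.ndarray) -> np.ndarray:
    h = frame @ W1 + b1
    h = h * h            # polynomial (square) activation: multiplication only
    return h @ W2 + b2   # frame-level scores, one per class

frame = rng.standard_normal(40)  # stand-in for one speech feature frame
scores = dpn_forward(frame)
```

In the scheme the abstract describes, such a model would be trained on plaintext features in the usual way; at inference time the client encrypts each frame, the server evaluates the same additions and multiplications over ciphertexts, and the encrypted frame-level scores are returned for decoding on the client side.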
Amid ever-increasing demands for privacy and security for highly sensitive data stored in the cloud, Google Cloud this week announced Confidential Computing. Terming it a "breakthrough technology," Google said the technology, which will span a number of products in the coming months, allows users to keep sensitive data encrypted not only as it is stored or sent to the cloud, but while it is being worked on as well. Confidential Computing keeps data encrypted as it is "used, indexed, queried, or trained on" in memory and "elsewhere outside the central processing unit," Google said in a statement about the new technology. The first product, Confidential Virtual Machines, was formally announced at Google's annual Cloud Next conference, held online over a nine-week period this year due to COVID-19 restrictions. It builds on Google Cloud services unveiled by Google and AMD earlier this year that featured processors capable of generating and managing encryption keys that remain on the chip.