Six Security Considerations for Machine Learning Solutions - Microsoft Community Hub
Model Theft: Because models represent a significant investment in intellectual property, they can be a valuable target for theft. And like other software assets, they are discrete artifacts that can be copied or stolen. Model theft happens when a model is taken outright from a storage location or re-created through deliberate query manipulation. An example of this type of attack was demonstrated by a research team at UC Berkeley, who used public endpoints to re-create language models with near state-of-the-art translation quality. The researchers were then able to degrade the performance and erode the integrity of the original machine learning model using data input techniques (see Data Poisoning above).
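The query-based re-creation described above can be sketched in a few lines. This is a minimal, hedged illustration (not the Berkeley team's method): a "victim" classifier is simulated locally, the attacker only calls a prediction function as if it were a public endpoint, and a surrogate model is trained purely on the recorded query/response pairs. All names here (`victim_predict`, the synthetic data, the choice of logistic regression) are assumptions for the sketch.

```python
# Sketch of model extraction ("model theft by query"): an attacker with
# only black-box access to a prediction endpoint trains a surrogate
# that mimics the victim model's behavior.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Victim model -- the attacker never sees its parameters or training data.
X_train, y_train = make_classification(n_samples=500, n_features=10, random_state=0)
victim = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def victim_predict(x):
    """Stand-in for a public prediction API: returns labels only."""
    return victim.predict(x)

# Attacker crafts queries and records the endpoint's answers.
queries = rng.normal(size=(2000, 10))
stolen_labels = victim_predict(queries)

# Surrogate trained purely on the harvested query/response pairs.
surrogate = LogisticRegression(max_iter=1000).fit(queries, stolen_labels)

# Agreement with the victim on fresh inputs measures extraction fidelity.
X_test = rng.normal(size=(1000, 10))
agreement = (surrogate.predict(X_test) == victim.predict(X_test)).mean()
print(f"surrogate/victim agreement: {agreement:.2%}")
```

Even this toy surrogate typically agrees with the victim on a large fraction of inputs, which is why rate limiting, query auditing, and returning coarse outputs (labels rather than full probability scores) are common mitigations.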
Feb-13-2023, 20:15:26 GMT