Investigating User Perspectives on Differentially Private Text Privatization
Meisenbacher, Stephen, Klymenko, Alexandra, Karpp, Alexander, Matthes, Florian
arXiv.org Artificial Intelligence
Recent literature has seen a considerable uptick in Differentially Private Natural Language Processing (DP NLP). This includes DP text privatization, where potentially sensitive input texts are transformed under DP to achieve privatized output texts that ideally mask sensitive information and maintain original semantics. Despite continued work to address the open challenges in DP text privatization, there remains a scarcity of work addressing user perceptions of this technology, a crucial aspect which serves as the final barrier to practical adoption. In this work, we conduct a survey study with 721 laypersons around the globe, investigating how the factors of scenario, data sensitivity, mechanism type, and reason for data collection impact user preferences for text privatization. We learn that while all these factors play a role in influencing privacy decisions, users are highly sensitive to the utility and coherence of the private output texts. Our findings highlight the socio-technical factors that must be considered in the study of DP NLP, opening the door to further user-based investigations going forward.
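To make the idea of "transforming input texts under DP" concrete, below is a minimal sketch of one common family of word-level DP privatization mechanisms: perturbing a word's embedding with calibrated noise and replacing the word with its nearest vocabulary neighbor (a metric-DP approach in the spirit of mechanisms surveyed in this literature). The embedding values, vocabulary, and function names here are hypothetical toy choices for illustration, not the specific mechanisms evaluated in the paper; real systems would use pretrained embeddings such as GloVe.

```python
import numpy as np

# Hypothetical toy 2-D word embeddings; real mechanisms use pretrained
# high-dimensional embeddings (e.g. GloVe or word2vec).
EMBEDDINGS = {
    "doctor":  np.array([1.0, 0.0]),
    "nurse":   np.array([0.9, 0.2]),
    "teacher": np.array([0.0, 1.0]),
    "lawyer":  np.array([0.2, 0.9]),
}

def privatize_word(word, epsilon, rng):
    """Metric-DP style word replacement: add noise with density
    proportional to exp(-epsilon * ||z||) to the word's embedding,
    then return the vocabulary word nearest to the noisy vector."""
    vec = EMBEDDINGS[word]
    # Sample the noise as a uniformly random direction scaled by a
    # Gamma-distributed magnitude (shape = dimension, scale = 1/epsilon).
    direction = rng.normal(size=vec.shape)
    direction /= np.linalg.norm(direction)
    magnitude = rng.gamma(shape=len(vec), scale=1.0 / epsilon)
    noisy = vec + magnitude * direction
    return min(EMBEDDINGS, key=lambda w: np.linalg.norm(EMBEDDINGS[w] - noisy))

def privatize_text(text, epsilon, seed=0):
    """Privatize a whitespace-tokenized text word by word."""
    rng = np.random.default_rng(seed)
    return " ".join(privatize_word(w, epsilon, rng) for w in text.split())
```

Lower values of epsilon inject more noise, so output words drift further from the originals; this is exactly the privacy-utility trade-off that, per the abstract, shapes users' sensitivity to the coherence of privatized outputs.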
Mar-12-2025