Can Matchmaking Platforms Save Us From Dating App Fatigue?
One might assume, with good reason, that a romantic recession is underway. That's the story the numbers tell, at least. Forty-seven percent of US adults say dating is more difficult today than it was a decade ago, according to a Pew Research Center analysis. Even as singledom declines (42 percent of adults were unpartnered in 2023, down from 44 percent in 2019, a different Pew survey found), it doesn't feel that way. The dating landscape is in the throes of another tectonic shift.
- North America > United States > New York (0.06)
- North America > United States > California > Los Angeles County > Santa Monica (0.06)
- North America > United States > California > Los Angeles County > Los Angeles (0.06)
Artificial intelligence suffers from some very human flaws. Gender bias is one of them.
Last month, Facebook parent Meta unveiled an artificial intelligence chatbot said to be its most advanced yet. BlenderBot 3, as the AI is known, can search the internet to talk to people about almost anything, and it has abilities related to personality, empathy, knowledge, and long-term memory. BlenderBot 3 is also good at peddling anti-Semitic conspiracy theories, claiming that former US President Donald Trump won the 2020 election, and calling Meta Chairman and Facebook co-founder Mark Zuckerberg "creepy". It's not the first time an AI has gone rogue. In 2016, Microsoft's Tay took less than 24 hours to morph into a right-wing bigot on Twitter, posting racist and misogynistic tweets and praising Adolf Hitler.
- North America > United States (1.00)
- Asia (0.40)
- Europe (0.05)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.55)
Explanation as a process: user-centric construction of multi-level and multi-modal explanations
Finzel, Bettina, Tafler, David E., Scheele, Stephan, Schmid, Ute
In recent years, XAI research has mainly been concerned with developing new technical approaches to explaining deep learning models. Only recently has research begun to acknowledge the need to tailor explanations to the different contexts and requirements of stakeholders. Explanations must suit not only the developers of models, but also domain experts and end users. Thus, in order to satisfy different stakeholders, explanation methods need to be combined. While multi-modal explanations have been used to make model predictions more transparent, less research has treated explanation as a process, in which users can ask for information according to the level of understanding gained at a certain point in time. Consequently, besides multi-modal explanations, an opportunity to explore explanations at different levels of abstraction should be provided. We present a process-based approach that combines multi-level and multi-modal explanations. The user can ask for textual explanations or visualizations through conversational interaction in a drill-down manner. We use Inductive Logic Programming, an interpretable machine learning approach, to learn a comprehensible model. Further, we present an algorithm that creates an explanatory tree for each example whose classifier decision is to be explained. The user can navigate the explanatory tree to get answers at different levels of detail. We provide a proof-of-concept implementation for concepts induced from a semantic net about living beings.
- North America > United States > Massachusetts > Suffolk County > Boston (0.14)
- North America > United States > New York (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- (2 more...)
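The abstract above describes an explanatory tree that a user navigates in a drill-down manner, receiving coarser or finer explanations in different modalities. A minimal sketch of one way such a tree could be modeled, assuming a simple node structure (the class, method, and example names here are hypothetical illustrations, not the authors' implementation):

```python
# Sketch of a multi-level, multi-modal explanatory tree: each node carries
# an explanation at one level of abstraction plus a modality tag, and the
# user drills down from a coarse answer to finer-grained justifications.

class ExplanationNode:
    def __init__(self, text, modality="text", children=None):
        self.text = text            # explanation at this level of detail
        self.modality = modality    # e.g. "text" or "visual"
        self.children = children or []

    def drill_down(self, index):
        """Return a more detailed child explanation on user request."""
        return self.children[index]

# Hypothetical tree for a classifier decision, loosely in the spirit of
# the paper's living-beings example.
root = ExplanationNode(
    "X was classified as a mammal.",
    children=[
        ExplanationNode(
            "X is warm-blooded and nurses its young.",
            children=[
                ExplanationNode("Body temperature is internally regulated.",
                                modality="visual"),
                ExplanationNode("Mammary glands produce milk."),
            ],
        )
    ],
)

# Drill-down interaction: top-level answer first, details on request.
level1 = root.drill_down(0)
level2 = level1.drill_down(1)
print(root.text)    # coarse answer
print(level1.text)  # intermediate justification
print(level2.text)  # most detailed level
```

In this sketch the process aspect is simply repeated `drill_down` calls driven by the user; a conversational front end could map questions like "why?" to a descent into the tree and pick the modality each node advertises.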