Collaborating Authors

 Qatar Computing Research Institute, HBKU


A Social Media Based Examination of the Effects of Counseling Recommendations after Student Deaths on College Campuses

AAAI Conferences

Student deaths on college campuses, whether brought about by suicide or an uncontrollable incident, have serious repercussions for the mental wellbeing of students. Consequently, many campus administrators implement post-crisis intervention measures to promote student-centric mental health support. Information about these measures, which we refer to as "counseling recommendations", is often shared via electronic channels, including social media. However, the current ability to assess the effects of these recommendations on post-crisis psychological states is limited. We propose a causal analysis framework to examine the effects of these counseling recommendations after student deaths. We leverage a dataset of 174 Reddit campus communities comprising ~400M posts from ~350K users. We then employ statistical modeling and natural language analysis to quantify psychosocial shifts in the behavioral, cognitive, and affective expression of grief in individuals who are "exposed" to (comment on) the counseling recommendations, compared to a matched control cohort. Drawing on crisis and psychology research, we find that exposed individuals show greater grief, psycholinguistic, and social expressiveness, providing evidence of a healing response to the crisis and thereby of positive psychological effects of the counseling recommendations. We discuss the implications of our work in supporting post-crisis rehabilitation and intervention efforts on college campuses.
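The following is a minimal illustrative sketch, not the paper's actual pipeline, of the exposed-versus-control comparison the abstract describes: matching each "exposed" user to a behaviorally similar control and then testing for a shift in a post-event outcome. All column names (`exposed`, `pre_posts`, `tenure_days`, `grief_score_shift`) and the simple 1:1 nearest-neighbour matching are assumptions for illustration; the authors' statistical matching and outcome measures may differ.

```python
"""Sketch: match exposed users to controls on pre-event covariates,
then compare post-event shifts between the two cohorts."""
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.neighbors import NearestNeighbors


def match_controls(df, covariates):
    """Greedy 1:1 nearest-neighbour matching (with replacement) of controls
    to exposed users on standardised covariates."""
    X = (df[covariates] - df[covariates].mean()) / df[covariates].std()
    exposed = df[df["exposed"]]
    controls = df[~df["exposed"]]
    nn = NearestNeighbors(n_neighbors=1).fit(X.loc[controls.index])
    _, idx = nn.kneighbors(X.loc[exposed.index])
    matched = controls.iloc[idx.ravel()]
    return exposed, matched


def compare_shift(exposed, matched, outcome="grief_score_shift"):
    """Welch t-test on the post-event shift in a hypothetical outcome."""
    t, p = stats.ttest_ind(exposed[outcome], matched[outcome], equal_var=False)
    return {"exposed_mean": exposed[outcome].mean(),
            "control_mean": matched[outcome].mean(),
            "t": t, "p": p}


if __name__ == "__main__":
    # Synthetic data standing in for per-user features derived from Reddit posts.
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "exposed": rng.random(n) < 0.2,
        "pre_posts": rng.poisson(30, n),
        "tenure_days": rng.integers(30, 2000, n),
    })
    df["grief_score_shift"] = rng.normal(0.1 * df["exposed"], 1.0)
    exp, ctl = match_controls(df, ["pre_posts", "tenure_days"])
    print(compare_shift(exp, ctl))
```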


Fact Checking in Community Forums

AAAI Conferences

Community Question Answering (cQA) forums are very popular nowadays, as they provide an effective means for communities built around particular topics to share information. Unfortunately, this information is not always factual. Thus, here we explore a new dimension in the context of cQA, which has been ignored so far: checking the veracity of answers to particular questions in cQA forums. As this is a new problem, we create a specialized dataset for it. We further propose a novel multi-faceted model, which captures information from the answer content (what is said and how), from the author profile (who says it), from the rest of the community forum (where it is said), and from external authoritative sources of information (external support). Evaluation results show a MAP value of 86.54, which is 21 points absolute above the baseline.
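As a rough sketch of the multi-faceted idea in the abstract, the snippet below combines the four facets (answer content, author profile, forum context, external support) into a single classifier. This is not the authors' system; the column names (`answer_text`, `author_num_answers`, `thread_position`, `external_overlap`, `is_factual`) and the logistic-regression model are hypothetical placeholders.

```python
"""Sketch: combining content, author, forum, and external-support signals
for answer-veracity classification."""
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler


def build_pipeline():
    features = ColumnTransformer([
        # What is said and how: bag-of-words / bigrams over the answer text.
        ("content", TfidfVectorizer(ngram_range=(1, 2)), "answer_text"),
        # Who says it, where it is said, and external support: numeric signals.
        ("context", StandardScaler(),
         ["author_num_answers", "thread_position", "external_overlap"]),
    ])
    return Pipeline([
        ("features", features),
        ("clf", LogisticRegression(max_iter=1000)),
    ])


if __name__ == "__main__":
    # Toy data standing in for labeled question-answer threads.
    df = pd.DataFrame({
        "answer_text": ["Yes, the metro runs until midnight.",
                        "No idea, maybe ask someone else?",
                        "The visa office is open on Fridays.",
                        "It closes at 2pm, I checked yesterday."],
        "author_num_answers": [120, 3, 45, 80],
        "thread_position": [1, 4, 2, 1],
        "external_overlap": [0.8, 0.1, 0.3, 0.7],
        "is_factual": [1, 0, 0, 1],
    })
    pipe = build_pipeline()
    pipe.fit(df.drop(columns="is_factual"), df["is_factual"])
    print(pipe.predict_proba(df.drop(columns="is_factual"))[:, 1])
```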