The parents of several Oxford High School students, including Tate Myre, who was killed in the attack, have filed a lawsuit against shooting suspect Ethan Crumbley, his parents and school staff.

The parents of two victims of the Nov. 30, 2021, shooting at Oxford High School in Michigan are demanding more transparency from the Oxford Community School District after the board voted against moving forward with an independent investigation into the tragedy last fall. The Oxford Board of Education announced Tuesday that the district has, for the second time, declined an offer from Michigan Attorney General Dana Nessel to conduct a third-party investigation into the shooting, with the goal of determining how the 15-year-old suspect, Ethan Crumbley, managed to kill four students and injure seven others.

"To me, this is an admission of guilt," Buck Myre, father of 16-year-old Tate Myre, said during a Thursday press conference. "They know that things didn't go right that day, and they don't want to stand up and fix it. They're going to hide behind governmental immunity and they're going to hide behind insurance and the lawyers. What's this teach the kids?

"We just want accountability," he added later when asked why an independent investigation is important to parents.

Oakland County Prosecutor Karen McDonald revealed in December 2021 that school officials met with Crumbley and his parents to discuss violent drawings he had created just hours before the deadly rampage. During that meeting, the suspect was able to convince them that the concerning drawings were for a "video game," and his parents "flatly refused" to take their son home.

The shooting has also resulted in several lawsuits against the school district and school employees, including two filed on behalf of the family of two sisters who attend the school, each seeking $100 million in damages.
Ethan Robert Crumbley, 15, charged with first-degree murder in the high school shooting, poses in a jail booking photograph taken at the Oakland County Jail in Pontiac, Michigan.

Myre and Meghan Gregory, the mother of 15-year-old Keegan Gregory, who survived the shooting but witnessed and was traumatized by Crumbley's rampage, are suing the suspect's parents, James and Jennifer Crumbley, as well as school staff for negligence.

"They're the ones that know what happened that day.
Machine-learning algorithms that generate fluent language from vast amounts of text could change how science is done -- but not necessarily for the better, says Shobita Parthasarathy, a specialist in the governance of emerging technologies at the University of Michigan in Ann Arbor. In a report published on 27 April, Parthasarathy and other researchers try to anticipate societal impacts of emerging artificial-intelligence (AI) technologies called large language models (LLMs). These can churn out astonishingly convincing prose, translate between languages, answer questions and even produce code. The corporations building them -- including Google, Facebook and Microsoft -- aim to use them in chatbots and search engines, and to summarize documents. But the models can also parrot errors or problematic stereotypes found in the millions or billions of documents they're trained on.
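The core idea behind these models -- learn statistics over huge amounts of text, then sample continuations from those statistics -- can be illustrated with a deliberately tiny sketch. The toy below is a bigram model, not a neural LLM, and the corpus is made up for the example; it only shows, in miniature, why a language model can sound fluent while still parroting whatever its training text contains.

```python
import random
from collections import defaultdict

# Illustrative toy corpus; a real LLM trains on billions of documents.
corpus = "the models can translate text and the models can answer questions".split()

# Count bigram transitions: each word maps to the words observed after it.
bigrams = defaultdict(list)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1].append(w2)

def generate(start, n_words, seed=0):
    """Sample a short continuation by repeatedly following bigram counts."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_words):
        followers = bigrams.get(out[-1])
        if not followers:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(followers))
    return " ".join(out)

print(generate("the", 5))
```

Because the model can only recombine what it has seen, any error or stereotype present in the training text is a candidate for regurgitation -- the failure mode the report warns about, at a vastly larger scale.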
The latest edition of our flagship learning series on everything in and about data analytics is sure to excite your minds: be prepared for the DataHour on Building Your First Chatbot Using Open-Source Tools. The session will be hosted by Dr. Rachael Tatman, Staff Developer Advocate at Rasa, the world's leading conversational AI platform, which enables enterprises to revamp customer experience with cutting-edge open-source machine learning. An ex-Googler and an instructor at the University of Michigan, Dr. Tatman will lead an engaging, hands-on journey through the open-source Rasa platform. The session is for freshers and professionals alike who would like to design chatbots to improve the CX of their organisations, or who simply want hands-on experience with open-source tools like Rasa. Chatbots have been around for some time.
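At the heart of a chatbot like the ones built in Rasa is intent classification: map the user's message to an intent, then pick the response attached to that intent. The sketch below is not Rasa's API (Rasa configures intents and responses in YAML training files and uses trained NLU models); it is a bare-bones keyword-overlap version of the same pipeline, with made-up intents, to show the shape of the idea.

```python
# Hypothetical intents and replies, purely for illustration.
INTENTS = {
    "greet": {"keywords": {"hello", "hi", "hey"}, "reply": "Hello! How can I help?"},
    "hours": {"keywords": {"hours", "open", "close"}, "reply": "We're open 9am-5pm."},
}
FALLBACK = "Sorry, I didn't understand that."

def classify(message):
    """Pick the intent whose keywords overlap the message most; None if no overlap."""
    words = set(message.lower().split())
    best, best_overlap = None, 0
    for name, intent in INTENTS.items():
        overlap = len(words & intent["keywords"])
        if overlap > best_overlap:
            best, best_overlap = name, overlap
    return best

def respond(message):
    intent = classify(message)
    return INTENTS[intent]["reply"] if intent else FALLBACK

print(respond("hi there"))
```

A framework like Rasa replaces the keyword matcher with a trained classifier and adds dialogue management on top, but the classify-then-respond loop is the same.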
May Mobility is transforming cities through autonomous technology to create a safer, greener, more accessible world. Based in Ann Arbor, Michigan, May develops and deploys autonomous vehicles (AVs) powered by our innovative Multi-Policy Decision Making (MPDM) technology, which reimagines the way AVs think. Our vehicles do more than just drive themselves: they provide value to communities, bridge public transit gaps and move people where they need to go safely, easily and with a lot more fun. We're building the world's best autonomy system to reimagine transit by minimizing congestion, expanding access and encouraging better land use in order to foster more green, vibrant and livable spaces. Since our founding in 2017, we've given more than 300,000 autonomy-enabled rides to real people around the globe.
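The general idea behind a multi-policy approach -- enumerate a small set of candidate high-level behaviors, forward-simulate each one, and execute the one that scores best -- can be sketched in a few lines. This is a drastically simplified toy, not May Mobility's implementation: the policy names, the one-dimensional dynamics and the cost function are all invented for illustration.

```python
# Toy sketch of the multi-policy idea: score each candidate high-level
# policy by rolling a simple model forward, then execute the cheapest.

def simulate(policy, gap_to_lead_car, horizon=5):
    """Roll a trivial 1-D following model forward and return accumulated cost."""
    speed = {"maintain": 1.0, "slow_down": 0.5, "stop": 0.0}[policy]
    cost = 0.0
    gap = gap_to_lead_car
    for _ in range(horizon):
        gap -= speed           # we close the gap at our chosen speed
        cost += 1.0 - speed    # penalty for making slow progress
        if gap <= 0:           # collision: overwhelming penalty
            cost += 1000.0
            break
    return cost

def choose_policy(gap_to_lead_car):
    """Pick the candidate policy with the lowest simulated cost."""
    candidates = ["maintain", "slow_down", "stop"]
    return min(candidates, key=lambda p: simulate(p, gap_to_lead_car))

print(choose_policy(10))  # plenty of room: keep going
print(choose_policy(1))   # almost no room: stop
```

A production system evaluates far richer policies against probabilistic predictions of every nearby agent, but the select-by-simulation loop is the distinguishing feature of the approach.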
Whether it's due to a lack of funding, lack of know-how or censorship, some governments and entities are shrinking the amount of data that they incorporate into their AI. Does this compromise the integrity of AI results? Intentional data shrinking is occurring as a matter of policy and expediency. Roya Ensafi, assistant professor of computer science and engineering at the University of Michigan, discovered that censorship was increasing in 103 countries. Most censorship actions "were driven by organizations or internet service providers filtering content," Ensafi reported.
Shobita Parthasarathy says that LLMs could help to advance research, but their use should be regulated.
Neither technologies nor societies are neutral, and failing to acknowledge this results, at best, in a narrow view of both. At worst, it leads to technology that reinforces oppressive societal norms. We agree with Alex Hanna, Timnit Gebru, and others who argue that individual harms reflect institutional problems, and thus require institutional and systemic solutions. We believe computer science (CS) as a discipline often promotes itself as objective and neutral. This tendency allows the field to ignore systems of oppression that exist within, and because of, CS. As scholars in educational psychology, computer science education, and social studies education, we suggest a way forward through institutional change, specifically in the way we teach CS.
In a small but multi-institutional study, an artificial intelligence-based system improved providers' assessments of whether patients with bladder cancer had a complete response to chemotherapy before a radical cystectomy (bladder removal surgery). Yet the researchers caution that AI isn't a replacement for human expertise and that their tool shouldn't be used as such. "If you use the tool smartly, it can help you," said Lubomir Hadjiyski, Ph.D., a professor of radiology at the University of Michigan Medical School and the senior author of the study. When patients develop bladder cancer, surgeons often remove the entire bladder in an effort to keep the cancer from returning or spreading to other organs or areas. More evidence is building, though, that surgery may not be necessary if a patient has no evidence of disease after chemotherapy.
Today's recruiting landscape is characterized by low unemployment, a record level of open jobs and massive churn in the labor market. Recruiting technology can help employers find, attract and ultimately hire the people they need. "But if you suck at recruiting, having great recruiting technology will only allow you to suck much faster," said Tim Sackett, SHRM-SCP, an industry veteran, thought leader and president of HRU Technical Resources, an engineering and design staffing firm based in Lansing, Mich. Sackett, who spoke at the SHRM Talent Conference & Expo 2022 in Denver on April 13, demos more than a hundred recruiting technologies each year and works with HR and recruiting leaders to build recruiting processes and technology systems. Talent acquisition technology stacks build upon the foundational applicant tracking system (ATS).