Draft EU AI Act regulations could have a chilling effect
In-brief New rules drafted by the European Union aimed at regulating AI could prevent developers from releasing open-source models, according to the American think tank Brookings. The proposed EU AI Act, yet to be signed into law, states that open-source developers must ensure their AI software is accurate and secure, and must be transparent about risk and data use in clear technical documentation. Brookings argues that if a private company deployed a public model, or used it in a product, and ran into trouble because of unforeseen or uncontrollable effects of the model, the company would likely try to shift blame to the open-source developers and sue them. That prospect might force the open-source community to think twice about releasing its code, and would, unfortunately, mean the development of AI is driven by private companies. Proprietary code is difficult to analyse and build upon, so innovation would be hampered.
- Government (0.56)
- Law (0.36)
The EU's AI Act could have a chilling effect on open source efforts, experts warn
The nonpartisan think tank Brookings this week published a piece decrying the bloc's regulation of open-source AI, arguing it would create legal liability for general-purpose AI systems while simultaneously undermining their development. Under the EU's draft AI Act, open-source developers would have to adhere to guidelines for risk management, data governance, technical documentation and transparency, as well as standards of accuracy and cybersecurity. If a company were to deploy an open-source AI system that led to some disastrous outcome, the author asserts, it's not inconceivable that the company could attempt to deflect responsibility by suing the open-source developers on whose work it built its product. "This could further concentrate power over the future of AI in large technology companies and prevent research that is critical to the public's understanding of AI," wrote Alex Engler, the analyst at Brookings who published the piece. "In the end, the [E.U.'s] attempt to regulate open-source could create a convoluted set of requirements that endangers open-source AI contributors, likely without improving use of general-purpose AI."
- Government (1.00)
- Information Technology > Security & Privacy (0.70)
- Law > Statutes (0.52)
- Information Technology > Software (1.00)
- Information Technology > Artificial Intelligence > Issues > Social & Ethical Issues (1.00)
Open source developers urged to ditch GitHub following Copilot launch – TechCrunch
Software Freedom Conservancy, a not-for-profit organization that provides support and legal services for open-source software projects, has called on the open-source community to ditch GitHub, having quit the code-hosting and collaboration platform itself. The move comes a week after Microsoft-owned GitHub launched the commercial version of Copilot, an AI-powered pair programmer that assists software developers by suggesting lines or functions as they type. It's a little like Gmail's Smart Compose feature, which aims to speed up your email writing by suggesting the next piece of text in your message using contextual cues. Software Freedom Conservancy is financially backed by a number of big-name companies, such as Google, Red Hat, and Mozilla, and its members span more than 40 projects, including Git (on which GitHub relies heavily), Selenium, and Godot. While the Software Freedom Conservancy's grievances with GitHub predate Copilot by some margin, GitHub's latest launch appears to have been the final straw.
- Information Technology > Software (1.00)
- Information Technology > Artificial Intelligence > Natural Language (0.35)