Flyte: An Open Source Orchestrator for ML/AI Workflows - The New Stack

#artificialintelligence

Does data for artificial intelligence and machine learning need its own workflow and orchestration system? It does, according to Union.ai, which offers an open source solution called Flyte that provides workflow and orchestration to fit the unique demands of data, not software. "The number one feedback we get from people who use orchestrators for machine learning is that they're not made for AI workflows, machine learning workflows, because you're forced to write YAML code, you're forced to understand Dockerfiles," Martin Stein, chief marketing officer and head of developer relations at Union.ai, told The New Stack. "You're forced to really do things that machine learning engineers, data scientists and researchers don't do." Basically, with Flyte, developers write their code and then run it locally or remotely, he added.


At QCon: Why Generative AI Is Harmful to Earth and Society - The New Stack

#artificialintelligence

"My views are my own, as are my biases." That's how Leslie Miley, investor, ex-Googler, and former CTO of the Obama Foundation, kicked off his QCon London keynote. But can the same be said for generative artificial intelligence (AI)? Not likely, as collective biases are baked in at scale, influencing everyone's views. If it keeps going unchecked, it will have devastating effects both on the Earth and the people living on it.


The End of Programming Is Nigh - The New Stack

#artificialintelligence

Is the end of programming nigh? If you ask Matt Welsh, he'd say yes. As Richard MacManus wrote on The New Stack, Welsh is a former professor of computer science at Harvard who spoke at a virtual meetup of the Chicago Association for Computing Machinery (ACM), explaining his thesis that ChatGPT and GitHub Copilot represent the beginning of the end of programming. Welsh joined us on The New Stack Makers to discuss his perspective on the end of programming and answer questions about the future of computer science, distributed computing, and more. Welsh is now the founder of Fixie.ai, a platform being built to let companies develop applications on top of large language models and extend them with different capabilities. For 40 to 50 years, programming language design has had one goal.


Responsible AI at Amazon Web Services: Q&A with Diya Wynn - The New Stack

#artificialintelligence

Last year's release of ChatGPT alerted many to the great strides that machine learning has made, and will continue to make, in the years going forward. But how do we make sure that this great power is being used responsibly, free from bias and malicious intent? For Amazon Web Services, Diya Wynn is the senior practice manager for Responsible AI. Recently, she sat down with The New Stack to discuss all things Responsible AI. At AWS, Wynn created the customer-facing responsible AI practice and built a team of individuals with diverse backgrounds, including members of the LGBTQIA and differently-abled communities.


AI21 Labs Releases Jurassic-2, its New Large Language Model - The New Stack

#artificialintelligence

AI21 Labs, an Israeli generative AI company, today announced its latest large language model (LLM), called Jurassic-2. Up till now, the base model of AI21 Labs has been Jurassic-1, the largest version of which has 178 billion parameters. That made it among the largest LLMs on the market -- slightly bigger than OpenAI's 175B parameter GPT-3 davinci model. However, when I spoke this week to AI21 Labs co-founder and co-CEO, Ori Goshen, he was reluctant to tell me how large Jurassic-2 is. LLM size "plays a factor, but it's not the only factor," said Goshen. "So we've stopped referring to the size, because it can be misleading about the actual performance of the model."


Congress and AI - The New Stack

#artificialintelligence

Congress has never been the quickest off the mark when it comes to making laws dealing with technology. Now, even as AI takes over creative writing and art, Congress continues to sit idle. As legislators endeavor to comprehend generative AI programs such as Microsoft Bing, ChatGPT and Google Bard, some of the more technology-oriented lawmakers are apprehensive about a repeat of Congress's unpreparedness in responding to the previous major tech wave -- social media. Worries, however, don't appear to be leading to action. True, there is now a backlash against letting tech companies keep Washington at arm's length with promises of "self-regulation" on critical issues such as privacy protection, child safety, disinformation, cryptocurrency, and data portability.


Unlock the Next Wave of Machine Learning with the Hybrid Cloud - The New Stack

#artificialintelligence

Machine learning is no longer about experiments. Most industry-leading enterprises have already seen dramatic successes from their investments in machine learning (ML), and there is near-universal agreement among business executives that building data science capabilities is vital to maintaining and extending their competitive advantage. The bullish outlook is evident in the U.S. Bureau of Labor Statistics' predictions regarding growth of the data science career field: Employment of data scientists is projected to grow 36% from 2021 to 2031, much faster than the average for all occupations. The aim now is to grow these initial successes beyond the specific parts of the business where they first emerged. Companies are looking to scale their data science capabilities to support their entire suite of business goals and embed ML-based processes and solutions everywhere the company does business.


JavaScript Library Lets Devs Add AI Capabilities to Web - The New Stack

#artificialintelligence

AI company Hugging Face has released a new open source JavaScript library that allows frontend and web developers to add machine learning capabilities to webpages and apps. Traditionally, Python notebooks are the toolkit for data scientists, but for most web and frontend developers, it's JavaScript. Until now, adding those functions meant a Python app on the backend that did the work, said Jeff Boudier, head of product and growth at the startup. Using JavaScript, the browser can request machine learning models to serve predictions and obtain answers for a visitor. "We provide some low code/no code tools, but if you want to dig in a little bit, you still have to whip out some Python notebooks, etc. And that's the traditional toolkit of data scientists," Boudier told The New Stack.


Cohere vs. OpenAI in the Enterprise: Which Will CIOs Choose? - The New Stack

#artificialintelligence

OpenAI has just announced an enterprise version of its popular generative AI product, ChatGPT. But in this case, OpenAI is a fast follower -- not the first-to-market. Cohere, a Toronto-based company with close ties to Google, is already bringing generative AI to businesses. I spoke with Cohere's President and COO, Martin Kon, about how its machine learning models are being used within enterprise companies. Cohere is only a few years old, but it has an impressive pedigree.


Large Language Models Aren't the Silver Bullet for Conversational AI - The New Stack

#artificialintelligence

Machine learning's large language models (LLMs) -- like ChatGPT, GPT-3 and BERT -- have recently captured the attention of the world. Put simply, LLMs are artificial intelligence (AI) tools that read, summarize, translate and generate text. They're able to predict which words would come next in a sentence with high confidence, which allows them to generate language similar to how humans speak and write. These models are so advanced, in fact, that some have even questioned their ability to achieve sentience. But, while it's no secret that LLMs have become an important foundation for conversational AI systems, many people incorrectly assume that LLMs will eventually be the silver bullet that will solve all conversational AI problems -- and that's just not the case.
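The next-word prediction described above can be sketched in miniature. The toy score table and vocabulary below are illustrative assumptions (not any real model's weights); a real LLM produces such scores from billions of learned parameters, but the final step -- turning raw scores into a probability distribution over the next word via softmax and picking a likely continuation -- works the same way:

```python
import math

# Toy "model": raw scores (logits) for the next word, conditioned on the
# previous word. The numbers are made up purely for illustration.
LOGITS = {
    "the": {"cat": 2.0, "dog": 1.5, "ran": -1.0},
    "cat": {"sat": 2.5, "ran": 1.0, "the": -2.0},
}

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(scores.values())  # subtract the max for numerical stability
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

def predict_next(word):
    """Return (word, probability) pairs for the next word, highest first."""
    probs = softmax(LOGITS[word])
    return sorted(probs.items(), key=lambda kv: -kv[1])

# Greedy decoding: take the single most probable continuation of "cat".
best_word, best_prob = predict_next("cat")[0]
print(best_word)
```

Real systems rarely decode greedily; they sample from this distribution (with temperature, top-k, or similar tricks) so generated text varies between runs, but the per-step mechanism is exactly this score-then-normalize loop.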