How to Run Inference on Ludwig Models Using TorchScript - Predibase
In Ludwig 0.6, we introduced the ability to export Ludwig models to TorchScript, making it easier than ever to deploy models for highly performant inference. In this blog post, we describe the benefits of serving models with TorchScript and demonstrate how to train a model, export it, and run the exported model on an example dataset.

A common way to serve machine learning models is to wrap them in REST APIs and expose their endpoints. This works well if you do not have particularly strict SLA requirements and backwards compatibility is not a concern. For production environments with tighter latency and reliability requirements, however, you will likely need a more robust solution.
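The train/export/serve flow described above can be sketched as follows. The Ludwig-specific calls shown in the comments (`LudwigModel.train`, `save_torchscript`) follow the Ludwig 0.6 Python API, and the config and file names are illustrative assumptions; the serving side uses only standard `torch.jit` calls, so the inference process needs no Ludwig dependency. A toy module stands in for an exported Ludwig model here.

```python
import io

import torch


# Training side (sketch, assuming the Ludwig 0.6 Python API):
#   from ludwig.api import LudwigModel
#   model = LudwigModel(config="config.yaml")        # hypothetical config file
#   model.train(dataset="train.csv")                 # hypothetical dataset
#   model.save_torchscript("exported_model")
#
# The toy module below stands in for the exported model so the
# serving pattern can be shown end to end.
class ToyModel(torch.nn.Module):
    """Placeholder for an exported model; doubles its input."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2.0


# Compile to TorchScript and serialize (to an in-memory buffer here;
# save_torchscript writes the same artifact to disk).
buffer = io.BytesIO()
torch.jit.save(torch.jit.script(ToyModel()), buffer)
buffer.seek(0)

# Serving side: load the artifact and run it like any nn.Module.
# No Python model definition is required at this point.
loaded = torch.jit.load(buffer)
result = loaded(torch.tensor([1.0, 2.0, 3.0]))
print(result)
```

Because the loaded module is self-contained, the same artifact can also be served from C++ via `torch::jit::load`, which is one of the main deployment benefits TorchScript provides.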
Predibase exits stealth with a platform for building AI models – TechCrunch
Data science teams are stymied by disorganization at their companies, hampering efforts to deliver timely AI and analytics projects. In a recent survey of "data executives" at U.S.-based companies, 44% said their teams had not hired enough people, were too siloed off to be effective, and had not been given clear roles. Respondents said they were most concerned about revenue loss or damage to brand reputation stemming from failing AI systems, and about a trend toward splashy investments with only short-term payoffs. These are ultimately organizational challenges. But Piero Molino, co-founder of the AI development platform Predibase, says that inadequate tooling often exacerbates them.
- Banking & Finance (0.49)
- Information Technology (0.30)