Machine Learning Operations – Make the best use of MLOps

Organizations looking to take full advantage of artificial intelligence are turning to MLOps – a new set of best practices and tools aimed at operationalizing AI. […]
When companies start using artificial intelligence and building machine learning projects, the focus is usually on theory. Is there a model that can provide the required results? How can it be created? How can it be trained?
But the tools data scientists use to create these proofs of concept often don't translate well to production systems. As a result, it can take an average of more than nine months to deploy an AI or ML solution, according to IDC.
"We call this 'model velocity,' which is the time it takes from start to finish," says IDC analyst Sriram Subramanian.
This is where MLOps comes into play. MLOps – Machine Learning Operations – is a set of best practices, frameworks and tools that helps companies manage data, models, deployment, monitoring and everything else involved in taking an AI system from theoretical proof of concept to production.
"MLOps reduces model speed to weeks – sometimes even days," says Subramanian. “ Just as the average time to build an application is accelerated with DevOps, this is why you need MLOps.”
By adopting MLOps, he believes, companies can build more models, innovate faster, and cover more use cases. “The value proposition is clear,” he says.
IDC predicts that by 2024, 60% of organizations will have operationalized their ML workflows using MLOps. And when companies were asked about the challenges of adopting AI and ML, the lack of MLOps ranked as a top barrier, second only to cost, Subramanian says.
Below, we examine what MLOps is, how it evolved, and what organizations need to do and consider to make the most of this emerging approach to operationalizing AI.

The development of MLOps

When Eugenio Zuccarelli started developing machine learning projects a few years ago, MLOps was just a collection of best practices. Since then, Zuccarelli has worked on AI projects at several companies, including in healthcare and financial services, and has seen MLOps evolve over time into a discipline spanning tools and platforms.
Today, MLOps provides a fairly robust framework for operationalizing AI, says Zuccarelli, who is now an innovation data scientist at CVS Health. As an example, Zuccarelli points to a project he previously worked on to develop an app that would predict adverse outcomes such as hospital readmission or disease progression.
"We examined datasets and models and spoke to physicians to find out the characteristics of the best models," he says. “But to make these models really useful, we had to make them accessible to actual users.
That meant developing a reliable, fast and stable mobile app with a machine learning system in the background connected via an API. "Without MLOps, we wouldn't have been able to guarantee that," he says.
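To make that architecture concrete, here is a minimal sketch of what "a machine learning system in the background connected via an API" can look like. It is illustrative only, not Zuccarelli's actual system: the model file name, endpoint path and response field are assumptions, and it presumes a scikit-learn classifier saved with joblib.

```python
# Minimal sketch: serve a trained model behind an HTTP API for a client app.
# Assumes a scikit-learn classifier was saved to "model.pkl" with joblib
# during training (hypothetical artifact).
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.pkl")

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()                          # e.g. {"features": [0.1, 3.2, ...]}
    score = model.predict_proba([payload["features"]])[0][1]  # probability of the adverse outcome
    return jsonify({"risk_score": float(score)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

In practice an MLOps platform adds the parts this sketch omits: authentication, autoscaling, logging of every prediction, and monitoring of the model behind the endpoint.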
His team used the H2O MLOps platform and other tools to create a health dashboard for the model. "You don't want the model to change significantly," he says. "And you don't want to introduce bias. With the health dashboard we can see if the system has changed."
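One simple technique behind such a dashboard is a statistical drift check: compare the distribution of recent production predictions against a reference sample from validation. The sketch below is illustrative only; the test, threshold and window size are assumptions, not settings of the H2O MLOps platform.

```python
# Illustrative drift check: flag when the distribution of production
# predictions diverges from a reference sample (two-sample KS test).
import numpy as np
from scipy.stats import ks_2samp

def drift_alert(reference_scores: np.ndarray, recent_scores: np.ndarray,
                p_threshold: float = 0.01) -> bool:
    """Return True if the prediction distribution has shifted significantly."""
    statistic, p_value = ks_2samp(reference_scores, recent_scores)
    return p_value < p_threshold

# Example with placeholder data: validation scores vs. last week's production scores
reference = np.random.beta(2, 5, size=5000)
recent = np.random.beta(2, 3, size=1000)
print(drift_alert(reference, recent))   # True -> the model's behaviour has changed, investigate
```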
The use of an MLOps platform also enabled updates to the production systems. "It's very difficult to swap out a file without breaking the application," says Zuccarelli. "With MLOps tools, a system can be replaced while in production with minimal disruption to the system itself."
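The core idea behind swapping a model "while in production with minimal disruption" is that the serving layer holds the live model behind a single reference and replaces it atomically, so in-flight requests finish on the old version while new requests use the new one. A minimal, library-free sketch of that pattern follows; file names and the locking strategy are assumptions for illustration.

```python
# Sketch of an atomic in-place model swap: load the new artifact fully,
# then replace the reference under a lock so requests never see a half-loaded model.
import threading
import joblib

class ModelHolder:
    def __init__(self, path: str):
        self._lock = threading.Lock()
        self._model = joblib.load(path)

    def reload(self, new_path: str) -> None:
        new_model = joblib.load(new_path)   # load and validate before the swap
        with self._lock:
            self._model = new_model         # atomic reference swap

    def predict(self, features):
        with self._lock:
            model = self._model             # snapshot the current version
        return model.predict([features])[0]

holder = ModelHolder("model_v1.pkl")
# later, after a new candidate has been validated offline:
holder.reload("model_v2.pkl")
```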
As MLOps platforms mature, they speed up the entire model development process because companies don't have to reinvent the wheel with every project, he says. And the capabilities to manage the data pipeline are also critical to operationalizing AI.

MLOps basics: a moving target

But don't think that just because platforms and tools become available, you can ignore the core principles of MLOps. Companies just starting out in this discipline should remember that at its core, MLOps is about creating strong connections between data science and data engineering.
"To ensure the success of an MLOps project, both data engineers and data scientists must work on the same team," says Zuccarelli.
Additionally, the tools needed to protect against bias, ensure transparency and explainability, and support ethics platforms are still evolving, he says. "There's definitely still a lot of work to be done because it's such a new area."
Without a complete turnkey solution, companies must master all facets that make MLOps so effective in operationalizing AI. And that means developing expertise across a wide range of activities, says Meagan Gentry, national practice manager for the AI team at Insight, a Tempe-based technology consulting firm.
But mastering the technical aspects is only part of the equation.
MLOps also draws on the agile methodology of DevOps and its principle of iterative development, says Gentry. And as with any agile discipline, communication is crucial.
“Communication is key in any role,” she says. “The communication between the data scientist and the data engineer. Communicating with DevOps and with the entire IT team.”
"This is where the pitfalls lie," said Helen Ristov, senior manager of enterprise architecture at Capgemini Americas. “A lot of this is in development. There are no formal guidelines like those seen in DevOps. It is an emerging technology and policies and policies will take time to become established.”
Ristov recommends that companies start their MLOps journey with their data platforms. "Maybe they have data sets that are in different places, but they don't have a consistent work environment," she says.
MLOps platforms typically have tools for creating and managing data pipelines and tracking different versions of training data, but setting that up doesn't happen right out of the box, she says.
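The underlying idea of training-data versioning can be shown in a few lines: fingerprint each dataset snapshot so any model can later be traced back to the exact data it was trained on. The sketch below is a simplification of what platforms (or tools such as DVC) provide; the file paths and manifest format are assumptions.

```python
# Illustrative training-data versioning: record a content hash per dataset snapshot.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Content hash of a data file, independent of its name or location."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def record_version(data_file: str, manifest: str = "data_versions.json") -> str:
    entry = {
        "file": data_file,
        "sha256": fingerprint(Path(data_file)),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    path = Path(manifest)
    versions = json.loads(path.read_text()) if path.exists() else []
    versions.append(entry)
    path.write_text(json.dumps(versions, indent=2))
    return entry["sha256"]

# record_version("training_data_2024_q1.csv")   # hypothetical snapshot
```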
Then there is the model creation, versioning, logging, feature balancing, and other aspects of managing the models themselves.
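In the same spirit, a minimal, library-free sketch of model versioning and logging looks like the registry below: every trained model is stored together with its parameters, metrics and the fingerprint of the data it was trained on. Field names are illustrative assumptions; in practice this is what experiment trackers and the registries built into MLOps platforms handle.

```python
# Illustrative model registry: log each trained model version with its
# parameters, metrics and the training-data fingerprint from the data manifest.
import json
from datetime import datetime, timezone
from pathlib import Path

def register_model(name: str, version: str, params: dict, metrics: dict,
                   data_sha256: str, registry: str = "model_registry.json") -> None:
    record = {
        "name": name,
        "version": version,
        "params": params,
        "metrics": metrics,
        "training_data_sha256": data_sha256,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    reg_path = Path(registry)
    entries = json.loads(reg_path.read_text()) if reg_path.exists() else []
    entries.append(record)
    reg_path.write_text(json.dumps(entries, indent=2))

# register_model("readmission-risk", "1.3.0",
#                params={"max_depth": 6}, metrics={"auc": 0.81},
#                data_sha256="<hash from the data manifest above>")
```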
"There's a significant amount of coding work going on," Ristov says, adding that setting up an MLOps platform can take months and that platform vendors still have a lot of work to do when it comes to integration.
"There are so many developments going in different directions," she says. “There are many tools being developed and the ecosystem is very large and people just pick and choose what they need. MLOps is still in the early stages. Most companies are still in the process of finding the optimal configurations.

Overview of the MLOps landscape

According to IDC's Subramanian, the MLOps market will grow to approximately $700 million by 2025, up from $185 million in 2020. However, this is likely a significant underestimate, given that MLOps products are often bundled with larger platforms. The true size of the market, Subramanian says, could top $2 billion by 2025.
MLOps providers typically fall into three categories, starting with the big cloud providers like AWS, Azure, and Google Cloud that offer MLOps capabilities as a service, Subramanian said.
Then there are ML platform providers like DataRobot, Dataiku and Iguazio.
"The third category is what used to be called data management vendors," he says. “These are providers like Cloudera, SAS and DataBricks. Their strength was data management skills and data operations, and they expanded into ML skills and eventually MLOps skills.”
All three areas are exploding, Subramanian says, adding that what sets an MLOps provider apart is the ability to support both on-prem and cloud deployment models, the ability to implement trusted and responsible AI, whether it is plug-and-play-enabled, and how easily it scales. "That's where the differentiation kicks in," he says.
According to a recent IDC survey, the lack of methods to implement responsible AI was one of the top three barriers to AI and ML adoption, tied with the lack of MLOps itself in second place.
That's in large part because there are no alternatives to adopting MLOps, says Sumit Agarwal, AI and machine learning research analyst at Gartner.
"The other approaches are manual," he says. "So there's really no other way. If you want to scale, you have to automate. You need traceability of code, data and models.
According to a recent Gartner survey, the average time it takes to bring a model from proof of concept to production has dropped from nine months to 7.3 months. "But 7.3 months is still very long," says Agarwal. "There are a lot of opportunities for companies to take advantage of MLOps."

The cultural shift to MLOps

MLOps also requires a cultural shift on the part of an organization's AI team, says Amaresh Tripathy, global leader of analytics at Genpact.
"The common image of a data scientist is that of a mad scientist trying to find a needle in a haystack," he says. “The data scientist is an explorer and researcher — not a factory worker making widgets. But that's what you have to do to actually scale it."
And companies often underestimate the effort they have to put into it, he says.
"People have a better understanding of software engineering," he says. “There's a lot of discipline around user experience and requirements. But somehow people don't think that when I use a model I have to go through the same process. Also, there is a misconception that any data scientist who knows their way around a test environment will be able to implement the model as a matter of course, or that they can bring in a few IT colleagues who can then do the same. There is a lack of appreciation for what is necessary to achieve this.”
Companies also fail to understand that MLOps can impact other parts of the business, often resulting in dramatic changes.
"You can put MLOps in a call center and the average response time will actually increase because the simple things get done by the machine, the AI, and the things that go to humans actually take longer because they're more complex," he says. "So you have to think about what the work will be like, what people you need and what skills they should have."
Today, he says, less than 5% of a company's decisions are driven by algorithms, but that's changing fast. “We assume that in the next five years 20 to 25% of the decisions will be controlled by algorithms. Every stat we look at shows we are at an inflection point where AI is increasing rapidly.”

And MLOps is the key factor, he says.

"One hundred percent," he emphasizes. “Without it, you won't be able to use AI consistently. MLOps is the scaling catalyst for AI in the enterprise.”
*Maria Korolov has been reporting on new technologies and emerging markets for 20 years.
