Machine Learning (ML) models offer powerful analytical and predictive capabilities when processing vast amounts of data, and they provide significant value when deployed in real-world production environments.
DSS Founder and CEO Anna Anisin sat down with Miriam Friedel, Senior Director of Machine Learning Engineering at Capital One, at the DataConnect conference earlier this month to talk about the latest machine learning trend: MLOps.
This article summarizes the main takeaways of that insightful and fun conversation, and covers what you should know about MLOps.
Miriam defines MLOps as:
“The processes around taking a trained machine learning model and bringing it into some production or serving environment.”
She explains MLOps with a simple real-world example of a transaction fraud model inside card readers at grocery stores. In this example, a fraud-prediction machine learning model is operationalized by deploying it inside a card reader system. The system takes the results from the model and performs the required action within seconds of a card swipe.
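The card-reader example can be sketched in a few lines of code. This is a minimal, illustrative sketch, not Capital One's actual system: the scoring function stands in for a trained fraud model (in production, the model would be loaded from a registry), and all names and thresholds here are hypothetical.

```python
# Illustrative sketch: a fraud model operationalized inside a card-reader
# system. The "model" is a hand-written stand-in for a trained ML model.

def fraud_score(transaction: dict) -> float:
    """Stand-in for a trained fraud model: returns a risk score in [0, 1]."""
    score = 0.0
    if transaction["amount"] > 500:  # unusually large purchases score higher
        score += 0.5
    if transaction["country"] != transaction["card_country"]:
        score += 0.4                 # cross-border swipe adds risk
    return min(score, 1.0)

def handle_swipe(transaction: dict, threshold: float = 0.7) -> str:
    """Called by the card-reader system within seconds of a card swipe:
    it takes the model's result and performs the required action."""
    return "decline" if fraud_score(transaction) >= threshold else "approve"

decision = handle_swipe({"amount": 900.0, "country": "FR", "card_country": "US"})
print(decision)  # a large cross-border swipe exceeds the threshold: decline
```

The point of the sketch is the wiring, not the model: operationalization means the serving system calls the model on live data and acts on its output in real time.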
MLOps involves many aspects that ensure the model serves correctly. Some behind-the-scenes processes of MLOps include:
According to Miriam, “Companies who do not have MLOps pipelines embedded in some product are leaving value on the table.”
Ad hoc analytics using ML models is not enough to drive business value. MLOps is a great way to fully leverage the potential of the models by taking them from model training to model operationalization. This process embeds trained ML models inside real-world systems so these systems can benefit from the real-time results on real-world data that the model provides.
Operationalized ML models that fit your business use cases can give you a competitive advantage over your rivals. Beyond business profitability, MLOps also encourages learning and collaboration: it spans tasks that involve almost every organizational team, resulting in a stronger and more collaborative business environment.
What are the challenges of MLOps?
A trained ML model cannot simply be deployed. It is called ML operationalization because a great deal of work goes into, and beyond, getting a model ready for deployment. MLOps is a complex blend of many different pieces that come together to operationalize ML models, and each piece comes with its own set of challenges, some of which are:
These are just a few of the many challenges of MLOps, which according to Miriam “require a massive breadth of people and skills” to overcome.
What are the requirements for MLOps?
Before listing MLOps requirements for a particular use case, it is always best to confirm that the use case actually needs MLOps, to avoid wasting time and resources.
MLOps will always be use-case-dependent. But generally, when you have use cases that require MLOps, you need to:
As with any technology, the build vs. buy debate arises, and as with any other platform, whether to build or buy an MLOps platform depends on your organizational goals. If an MLOps platform is central to your business function, building one to suit your needs is the better approach. In contrast, if you need an MLOps platform only to achieve a certain goal, you should consider buying. Even with your MLOps requirements in place, some of the factors that will inform the build-or-buy decision include:
Whenever data scientists encounter complex tasks, they are very likely to look for ways to automate them. The same goes for MLOps, which is complex and has many layers. While it may be possible to automate certain aspects of MLOps, it is impossible to completely replace the human mind and fully automate the process.
According to Miriam, you should automate as much as possible in areas involving tedious work or where humans are prone to mistakes, e.g., model retraining. Tasks where human judgment and ethics are necessary, such as justifying model predictions, can be left un-automated.
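That split between automatable and human-in-the-loop work can be sketched as follows. This is a hypothetical illustration of the principle, not a real pipeline; the function names, metrics, and thresholds are all assumptions.

```python
# Illustrative sketch of the automation split: retraining is triggered
# automatically when monitored accuracy drifts, while predictions that
# need justification are routed to a human reviewer.

def should_retrain(recent_accuracy: float, baseline: float,
                   tolerance: float = 0.05) -> bool:
    """Automatable, tedious work: flag retraining when live accuracy
    falls more than `tolerance` below the baseline."""
    return (baseline - recent_accuracy) > tolerance

def human_review_queue(predictions: list[dict]) -> list[dict]:
    """Left un-automated: predictions needing justification go to a person."""
    return [p for p in predictions if p["needs_justification"]]

# Live accuracy has slipped 8 points below baseline, past the 5-point
# tolerance, so an automated retraining job would be kicked off.
print(should_retrain(recent_accuracy=0.82, baseline=0.90))  # True
```

The design choice is to automate the monitoring-and-retrain loop, which is mechanical, while keeping a human in the loop wherever a prediction must be explained or ethically justified.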
Watch the full MLOps coffee chat conversation with Miriam Friedel and Anna Anisin here on the DSS YouTube channel: