AI/ML can be leveraged in a microservices platform where the building-block components and services race for resources, either to detect such contention or to predict resource demand and manage it proactively.
The ability to deploy machine-learning applications as containers and to cluster those containers has several advantages, including:
- The ability to make machine learning applications self-contained. They can be mixed and matched on any number of platforms, with virtually no porting or testing required. Because they exist in containers, they can operate in a highly distributed environment, and you can place those containers close to the data the applications are analyzing.
- The ability to expose the capabilities of machine learning systems that exist inside containers as services or microservices. This allows external applications, container-based or not, to leverage those capabilities at any time, without having to embed the machine-learning code in their own codebase.
- The ability to cluster and schedule container processing so that the machine learning applications running in containers can scale. You can run those applications on efficient cloud-based systems, ideally orchestrated by a container management system such as Google's Kubernetes or Docker's Swarm.
- The ability to access data using well-defined interfaces that deal with complex data using simplified abstraction layers. Containers have mechanisms built in for external and distributed data access, so you can leverage common data-oriented interfaces that support many data models.
- The ability to create machine learning systems made up of containers functioning as loosely coupled subsystems. This simplifies building an effective application architecture; for example, you can isolate volatile components in their own containerized domains so that changes to them do not ripple through the rest of the system.
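To make the second and third points concrete, here is a minimal sketch of a model exposed as a containerized microservice, using only the Python standard library. The `predict` function and its weights are hypothetical stand-ins for a real trained model that would be packaged into the container image; external callers only ever see the HTTP interface, never the model code.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical toy "model": a linear scorer standing in for a real
# trained model that would be loaded inside the container.
WEIGHTS = [0.4, 0.6]

def predict(features):
    """Score a feature vector with the toy linear model."""
    return sum(w * x for w, x in zip(WEIGHTS, features))

class PredictHandler(BaseHTTPRequestHandler):
    """Exposes the model as a small JSON-over-HTTP microservice."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        score = predict(payload["features"])
        body = json.dumps({"score": score}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8080):
    # Inside a container, this port would be published by the runtime
    # (e.g. `docker run -p 8080:8080 ...`) and could be scheduled and
    # scaled by an orchestrator such as Kubernetes or Swarm.
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

Because the service is self-contained behind a well-defined interface, a scheduler can run as many replicas of this container as load requires without the calling applications noticing.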