The technologies selected here to implement lightweight hosting of machine learning predictive models are:
- Docker, as a container standard, used here to easily build and deploy a Python environment,
- Python, the de facto preferred language for ML,
- Flask and Flask-RESTPlus, frameworks providing web applications and RESTful APIs,
- Pickle, an object serialization module for Python,
- Joblib, another object serialization library for Python.
The ML hosting is composed of 3 projects:
- ML model creation: Several source files to create variations of ML models with scikit-learn to predict a default on a loan repayment. These models are stored on the file system through pickle or joblib serialization (see the first sketch after this list).
- A static RESTful ML microservice for scikit-learn models serialized with pickle: A sample predictive microservice running a Random Forest classification model to predict a default on a loan repayment. The ML model has been serialized with pickle, feature values are sent directly as HTTP parameters, and the microservice exposes an OpenAPI descriptor (see the second sketch below).
- A generic REST ML microservice for scikit-learn models serialized with joblib: A sample lightweight REST/JSON microservice to run multiple scikit-learn ML models captured as joblib files (see the third sketch below).
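To illustrate the first project, the minimal sketch below trains a Random Forest classifier on synthetic data standing in for the loan-repayment dataset and persists it with both pickle and joblib. File names, feature count, and hyperparameters are illustrative assumptions, not the repository's actual choices.

```python
# Sketch of the model-creation step: train a classifier and serialize it twice.
import pickle

import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the loan-repayment dataset (binary "default" label).
X, y = make_classification(n_samples=1000, n_features=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.3f}")

# Persist the trained model with both serialization mechanisms used in the projects.
with open("loan_default_rf.pkl", "wb") as f:
    pickle.dump(model, f)                      # pickle variant (assumed file name)
joblib.dump(model, "loan_default_rf.joblib")   # joblib variant (assumed file name)
```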
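For the second project, a static Flask-RESTPlus microservice loading a pickled model could look like the sketch below; the feature names, route, and file path are assumptions. Flask-RESTPlus generates the OpenAPI (Swagger) descriptor automatically from the declared resources and request parser.

```python
# Sketch of the static pickle-based microservice: features arrive as HTTP parameters.
import pickle

from flask import Flask
from flask_restplus import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app, title="Loan Default Prediction API", version="1.0")

# Load the pickled Random Forest once at startup (path is an assumption).
with open("loan_default_rf.pkl", "rb") as f:
    model = pickle.load(f)

# Hypothetical feature names; the actual project defines its own list.
FEATURES = ["income", "loan_amount", "credit_history", "age", "dependents", "employment_years"]

parser = reqparse.RequestParser()
for name in FEATURES:
    parser.add_argument(name, type=float, required=True)

@api.route("/predict")
class Predict(Resource):
    @api.expect(parser)
    def get(self):
        """Predict a loan-repayment default from feature values sent as HTTP parameters."""
        args = parser.parse_args()
        row = [[args[name] for name in FEATURES]]
        return {"default": int(model.predict(row)[0])}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```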
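For the third project, one possible shape of the generic joblib-based microservice is to discover `*.joblib` files in a models directory at startup and serve predictions from a JSON payload; the directory layout, routes, and payload shape below are assumptions, not the project's actual API.

```python
# Sketch of the generic joblib-based microservice: one endpoint per discovered model.
from pathlib import Path

import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
MODEL_DIR = Path("models")  # assumed location of the *.joblib files

# Load every joblib model found at startup, keyed by file name (without extension).
models = {p.stem: joblib.load(p) for p in MODEL_DIR.glob("*.joblib")}

@app.route("/models", methods=["GET"])
def list_models():
    """List the model names that can be used in /models/<name>/predict."""
    return jsonify(sorted(models))

@app.route("/models/<name>/predict", methods=["POST"])
def predict(name):
    """Run the named model on a JSON payload such as {"features": [[...]]}."""
    if name not in models:
        return jsonify({"error": f"unknown model '{name}'"}), 404
    payload = request.get_json(force=True)
    predictions = models[name].predict(payload["features"]).tolist()
    return jsonify({"model": name, "predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5001)
```

A client would then POST a JSON body such as `{"features": [[...]]}` to `/models/loan_default_rf/predict`, which keeps the service generic: adding a model is just dropping a new joblib file into the models directory.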