porter is a framework for data scientists who want to quickly and reliably deploy machine learning models as REST APIs.
Simplicity is a core goal of this project. The following six lines of code are a fully functional example. While this covers the most common use case,
porter is also designed to be easily extended to handle cases not supported out of the box.
```python
from porter.datascience import WrappedModel
from porter.services import ModelApp, PredictionService

my_model = WrappedModel.from_file('my-model.pkl')
prediction_service = PredictionService(model=my_model, name='my-model', api_version='v1')
app = ModelApp([prediction_service])
app.run()
```
- Practical design: suitable for projects ranging from proof-of-concept to production-grade software.
- Framework-agnostic design: any object with a `predict()` method will do, which means porter plays nicely with sklearn, keras, or xgboost models. Models that don't fit this pattern can be easily wrapped and used in porter.
- OpenAPI integration: lightweight, Pythonic schema specifications support automatic validation of HTTP request data and generation of API documentation using Swagger.
- Boilerplate reduction: porter takes care of API logging and error handling out of the box, and supports streamlined model loading from `.h5` files stored locally or on AWS S3.
- Robust testing: porter includes a comprehensive test suite, and has been extensively field-tested by the Data Science team at Cadent.
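To illustrate the framework-agnostic design above, here is a minimal sketch of wrapping a model that does not expose a `predict()` method. `LegacyScorer` and `PredictAdapter` are hypothetical names invented for this example; only the `predict()` contract and the `PredictionService(model=...)` parameter come from porter itself.

```python
class LegacyScorer:
    """Hypothetical stand-in for a model whose scoring method is not named predict()."""

    def score_rows(self, rows):
        # Toy logic: return the mean of each input row.
        return [sum(row) / len(row) for row in rows]


class PredictAdapter:
    """Adapts any scorer to the predict() interface porter expects."""

    def __init__(self, scorer):
        self.scorer = scorer

    def predict(self, X):
        # Delegate to the wrapped model's native method.
        return self.scorer.score_rows(X)


model = PredictAdapter(LegacyScorer())
print(model.predict([[1, 2, 3]]))  # -> [2.0]
```

An adapter instance like this can then be passed as the `model` argument to `PredictionService`, just like the `WrappedModel` in the example at the top.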