Package Reference

Submodules

porter.api module

Light wrappers around flask and requests.

class porter.api.App(import_name: str, static_url_path: str | None = None, static_folder: str | PathLike[str] | None = 'static', static_host: str | None = None, host_matching: bool = False, subdomain_matching: bool = False, template_folder: str | PathLike[str] | None = 'templates', instance_path: str | None = None, instance_relative_config: bool = False, root_path: str | None = None)[source]

Bases: Flask

Light wrapper around flask.app.Flask.

json_provider_class: type[JSONProvider] = functools.partial(<class 'porter.api._PorterJSONProvider'>, encoder_factory=<class 'porter.utils.AppEncoder'>)

A subclass of JSONProvider. An instance is created and assigned to app.json when creating the app.

The default, DefaultJSONProvider, uses Python’s built-in json library. A different provider can use a different JSON library.

New in version 2.2.

porter.api.get(*args, **kwargs)[source]
porter.api.get_model_context()[source]

Returns porter.services.BaseService or None

porter.api.jsonify(data, *, status_code)[source]

‘Jsonify’ a Python object into something an instance of App can return to the user.

porter.api.post(*args, data, **kwargs)[source]
porter.api.request_id()[source]

Return a “unique” ID for the current request.
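
The scare quotes around "unique" suggest a best-effort identifier rather than a guaranteed one. A stdlib-only sketch of the idea (not porter's actual implementation, which may for instance cache the ID on the request context):

```python
import uuid

def request_id() -> str:
    """Return a best-effort unique ID for a request.

    Hypothetical sketch; porter's actual implementation may differ.
    """
    return uuid.uuid4().hex

rid = request_id()
print(len(rid))  # 32 hex characters
```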

porter.api.request_json(silent=False)[source]

Return the JSON from the current request.

Parameters:

silent (bool) – Silence parsing errors and return None instead.

porter.api.request_method()[source]

Return the HTTP method of the current request, e.g. ‘GET’, ‘POST’, etc.

porter.api.set_model_context(service)[source]

Register a model on the request context.

Parameters:

service (porter.services.BaseService) –

porter.api.validate_url(url)[source]

Return True if url is valid and False otherwise.

Roughly speaking, a valid URL is one containing sufficient information for post() and get() to send requests, whether or not the URL actually exists.
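
porter's exact criteria are not documented here; a comparable check using only the standard library, treating a URL as valid when it has an HTTP(S) scheme and a network location, might look like:

```python
from urllib.parse import urlparse

def validate_url(url: str) -> bool:
    """Return True if `url` carries enough information to send a request.

    Sketch only; porter's actual rules may differ.
    """
    try:
        parsed = urlparse(url)
    except ValueError:
        return False
    return bool(parsed.scheme in ("http", "https") and parsed.netloc)

print(validate_url("http://localhost:5000/model/v1/prediction/"))  # True
print(validate_url("not-a-url"))  # False
```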

porter.config module

Configuration options.

porter.constants module

Global constants defining endpoint naming conventions, etc.

class porter.constants.BASE_KEYS[source]

Bases: object

REQUEST_ID = 'request_id'
class porter.constants.ERROR_BODY_KEYS[source]

Bases: object

MESSAGES = 'messages'
NAME = 'name'
TRACEBACK = 'traceback'
USER_DATA = 'user_data'
class porter.constants.GENERIC_ERROR_KEYS[source]

Bases: BASE_KEYS

ERROR = 'error'
class porter.constants.HEALTH_CHECK_KEYS[source]

Bases: BASE_KEYS

APP_META = 'app_meta'
DEPLOYED_ON = 'deployed_on'
PORTER_VERSION = 'porter_version'
SERVICES = 'services'
class porter.constants.HEALTH_CHECK_SERVICES_KEYS[source]

Bases: object

ENDPOINT = 'endpoint'
MODEL_CONTEXT = 'model_context'
STATUS = 'status'
class porter.constants.HEALTH_CHECK_VALUES[source]

Bases: object

DEPLOYED_ON = '2024-01-31T17:58:55.405146'
IS_READY = 'READY'
class porter.constants.MODEL_CONTEXT_ERROR_KEYS[source]

Bases: object

MODEL_CONTEXT = 'model_context'
class porter.constants.MODEL_CONTEXT_KEYS[source]

Bases: object

API_VERSION = 'api_version'
MODEL_META = 'model_meta'
MODEL_NAME = 'model_name'
class porter.constants.PREDICTION_KEYS[source]

Bases: BASE_KEYS

MODEL_CONTEXT = 'model_context'
PREDICTIONS = 'predictions'
class porter.constants.PREDICTION_PREDICTIONS_KEYS[source]

Bases: object

ID = 'id'
PREDICTION = 'prediction'
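
These key classes pin down the vocabulary of response bodies. As an illustration (constants re-declared locally rather than imported from porter, and values such as the ID and prediction purely made up), a prediction response assembled from them nests like this:

```python
# Key names copied from the documented constants above.
REQUEST_ID = "request_id"
MODEL_CONTEXT = "model_context"
PREDICTIONS = "predictions"
MODEL_NAME = "model_name"
API_VERSION = "api_version"
ID = "id"
PREDICTION = "prediction"

# Illustrative body following PREDICTION_KEYS / MODEL_CONTEXT_KEYS /
# PREDICTION_PREDICTIONS_KEYS; the full response may carry more fields.
response_body = {
    REQUEST_ID: "abc123",
    MODEL_CONTEXT: {MODEL_NAME: "churn-model", API_VERSION: "v1"},
    PREDICTIONS: [{ID: 1, PREDICTION: 0.87}],
}

print(sorted(response_body))  # ['model_context', 'predictions', 'request_id']
```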

porter.datascience module

Definitions of interfaces for data science objects expected by porter.services.

class porter.datascience.BaseModel[source]

Bases: ABC

Class defining the model interface required by porter.services.ModelApp.add_service().

abstract predict(X)[source]

Return predictions corresponding to the data in X.

class porter.datascience.BasePostProcessor[source]

Bases: ABC

Class defining the postprocessor interface required by porter.services.ModelApp.add_service().

abstract process(X_input, X_preprocessed, predictions)[source]

Process and return predictions.

Parameters:
  • X_input (pandas.DataFrame) – The raw input from a POST request converted to a pandas.DataFrame.

  • X_preprocessed – The POST request data with preprocessing applied.

  • predictions – The output of an instance of BaseModel.

Returns:

predictions processed as desired.

Note: X_input and X_preprocessed are included to provide additional context for postprocessing predictions if necessary.

class porter.datascience.BasePreProcessor[source]

Bases: ABC

Class defining the preprocessor interface required by porter.services.ModelApp.add_service().

abstract process(X_input)[source]

Process and return X_input.

Parameters:

X_input (pandas.DataFrame) – The raw input from a POST request converted to a pandas.DataFrame.

Returns:

X_input processed as desired.
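
Putting the three interfaces together, a request flows through preprocessor.process(), then model.predict(), then postprocessor.process(). A self-contained sketch of that flow, using plain lists in place of pandas.DataFrames and local stand-in classes rather than the porter base classes:

```python
class DoublingPreProcessor:
    """Stand-in for a BasePreProcessor: doubles each feature value."""
    def process(self, X_input):
        return [x * 2 for x in X_input]

class ThresholdModel:
    """Stand-in for a BaseModel: predicts 1 when the value exceeds 5."""
    def predict(self, X):
        return [int(x > 5) for x in X]

class LabelPostProcessor:
    """Stand-in for a BasePostProcessor: maps 0/1 predictions to labels."""
    def process(self, X_input, X_preprocessed, predictions):
        # X_input and X_preprocessed are available for extra context.
        return ["high" if p else "low" for p in predictions]

X_input = [1, 4]
X_pre = DoublingPreProcessor().process(X_input)   # [2, 8]
preds = ThresholdModel().predict(X_pre)           # [0, 1]
final = LabelPostProcessor().process(X_input, X_pre, preds)
print(final)  # ['low', 'high']
```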

class porter.datascience.WrappedModel(model)[source]

Bases: BaseModel

A convenience class that exposes a model persisted to disk with the BaseModel interface.

Parameters:

model – An object with a scikit-learn-compatible .predict() method.

classmethod from_file(path, *args, s3_access_key_id=None, s3_secret_access_key=None, **kwargs)[source]
predict(X)[source]

Return predictions corresponding to the data in X.

class porter.datascience.WrappedTransformer(transformer)[source]

Bases: BasePreProcessor

A convenience class that exposes a transformer persisted to disk with the BasePreProcessor interface.

Parameters:

transformer – An object with a scikit-learn-compatible .transform() method.

classmethod from_file(path, *args, s3_access_key_id=None, s3_secret_access_key=None, **kwargs)[source]
process(X)[source]

Process and return X_input.

Parameters:

X_input (pandas.DataFrame) – The raw input from a POST request converted to a pandas.DataFrame.

Returns:

X_input processed as desired.

porter.exceptions module

Generic HTTP exception for use with porter.

exception porter.exceptions.PorterException(*args, code)[source]

Bases: Exception

Generic HTTP Exception.

Parameters:
  • *args – Passed to Exception()

  • code (int) – The HTTP status code.
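
Given the signature, code is keyword-only and the positional arguments are forwarded to Exception. A local sketch of the same pattern (storing the status code on the instance as .code is an assumption, not documented above):

```python
class PorterException(Exception):
    """Local sketch mirroring porter.exceptions.PorterException."""
    def __init__(self, *args, code):
        super().__init__(*args)
        self.code = code  # assumed attribute holding the HTTP status code

try:
    raise PorterException("resource not found", code=404)
except PorterException as err:
    print(err.code, err.args[0])  # 404 resource not found
```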

porter.loading module

Loading utilities.

porter.loading.load_file(path, s3_access_key_id=None, s3_secret_access_key=None)[source]

Load a file and return the result.

Raises:

ValueError – If path specifies an unknown file type or specifies an s3 resource but credentials are not provided.
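
The documented ValueError conditions imply a dispatch on file extension plus a credential check for s3 paths. A hedged sketch of that control flow (the actual loader calls are replaced by placeholder strings, and the s3-detection rule is an assumption):

```python
def load_file(path, s3_access_key_id=None, s3_secret_access_key=None):
    """Sketch of the documented dispatch; not porter's actual implementation."""
    if path.startswith("s3://") and not (s3_access_key_id and s3_secret_access_key):
        raise ValueError("s3 resource specified but credentials not provided")
    if path.endswith(".pkl"):
        return "load_pkl"   # would delegate to porter.loading.load_pkl(path)
    if path.endswith(".h5"):
        return "load_h5"    # would delegate to porter.loading.load_h5(path)
    raise ValueError(f"unknown file type: {path}")

print(load_file("model.pkl"))  # load_pkl
```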

porter.loading.load_h5(path)[source]

Load and return an object stored in h5 with tensorflow.

porter.loading.load_pkl(path)[source]

Load and return a pickled object with joblib.

porter.responses module

class porter.responses.Response(data, *, status_code=200)[source]

Bases: object

jsonify()[source]
porter.responses.make_alive_response(app)[source]
porter.responses.make_batch_prediction_response(id_values, predictions)[source]
porter.responses.make_error_response(error)[source]
porter.responses.make_prediction_response(id_value, prediction)[source]
porter.responses.make_ready_response(app)[source]

porter.schemas module

Tools for validating and documenting OpenAPI schemas with porter.

class porter.schemas.Array(*args, item_type=None, **kwargs)[source]

Bases: ApiObject

Array type.

class porter.schemas.Boolean(description=None, *, additional_params=None, reference_name=None)[source]

Bases: ApiObject

Boolean type.

class porter.schemas.Integer(description=None, *, additional_params=None, reference_name=None)[source]

Bases: ApiObject

Integer type.

class porter.schemas.Number(description=None, *, additional_params=None, reference_name=None)[source]

Bases: ApiObject

Number type.

class porter.schemas.Object(*args, properties=None, additional_properties_type=None, required='all', **kwargs)[source]

Bases: ApiObject

Object type.

class porter.schemas.RequestSchema(api_obj, description=None)[source]

Bases: object

to_openapi()[source]
class porter.schemas.ResponseSchema(api_obj, status_code, description=None)[source]

Bases: object

to_openapi()[source]
class porter.schemas.String(description=None, *, additional_params=None, reference_name=None)[source]

Bases: ApiObject

String type.
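
These classes mirror OpenAPI's primitive and container types. For orientation, the OpenAPI fragment that a composition like Object(properties={...}) wrapped in Array describes can be written out as a plain dict. This is the target notation only, not porter code, and the property names are invented for illustration:

```python
# The kind of OpenAPI schema an Object/Integer/Number/Array composition maps onto.
feature_schema = {
    "type": "object",
    "required": ["id", "value"],
    "properties": {
        "id": {"type": "integer", "description": "Instance ID."},
        "value": {"type": "number"},
    },
}
batch_schema = {"type": "array", "items": feature_schema}

print(batch_schema["items"]["properties"]["id"]["type"])  # integer
```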

porter.schemas.make_docs_html(docs_prefix, docs_json_url)[source]
Parameters:
  • docs_prefix (str) – Prefix applied to all documentation endpoints.

  • docs_json_url (str) – URL where documentation JSON is exposed.

Returns:

Static html docs to serve.

Return type:

str

porter.schemas.make_openapi_spec(title, description, version, request_schemas, response_schemas, additional_params)[source]
Parameters:
  • title (str) – The title of the application.

  • description (str) – A description of the application.

  • version (str) – The version of the application.

  • request_schemas (dict) – Nested dictionary mapping endpoints to a dictionary of HTTP methods to instances of RequestSchema. E.g. {“/foo/bar”: {“GET”: RequestSchema(…)}}.

  • response_schemas (dict) – Nested dictionary mapping endpoints to a dictionary of HTTP methods to lists of instances of ResponseSchema. E.g. {“/foo/bar/”: {“GET”: [ResponseSchema(…), ResponseSchema(…)]}}

  • additional_params (dict) – A nested dictionary mapping tuples of endpoints and HTTP methods to a dictionary containing arbitrary OpenAPI values that will be applied to the OpenAPI spec for that endpoint/method. E.g. {("/foo/bar/", "GET"): {"tags": ["tag1", "tag2"]}}

Returns:

The OpenAPI spec describing the provided arguments.

Return type:

dict
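
The nesting of the three mapping arguments is easy to get wrong. With placeholder strings standing in for the RequestSchema and ResponseSchema instances, the expected shapes are:

```python
# Placeholder strings stand in for RequestSchema / ResponseSchema instances.
request_schemas = {"/foo/bar": {"POST": "RequestSchema(...)"}}
# Response schemas map each method to a *list* of schemas (one per status code).
response_schemas = {"/foo/bar": {"POST": ["ResponseSchema(...)", "ResponseSchema(...)"]}}
# additional_params is keyed by (endpoint, method) tuples.
additional_params = {("/foo/bar", "POST"): {"tags": ["tag1"]}}

print(list(additional_params)[0])  # ('/foo/bar', 'POST')
```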

porter.services module

Tools for building RESTful services that expose machine learning models.

Building and running an app with the tools in this module is as simple as

  1. Instantiating ModelApp.

  2. Instantiating a “service”. E.g. instantiate PredictionService for each model you wish to add to the service.

  3. Use the service(s) created in step 2 to add models to the app with either ModelApp.add_service() or ModelApp.add_services().

For example,

>>> model_app = ModelApp()
>>> prediction_service1 = PredictionService(...)
>>> prediction_service2 = PredictionService(...)
>>> model_app.add_services(prediction_service1, prediction_service2)

Now the model app can be run with model_app.run() for development, or served in production with a WSGI server, e.g. $ gunicorn my_module:model_app.

class porter.services.BaseService(*args, **kwargs)[source]

Bases: ABC, StatefulRoute

A service class contains all necessary state and functionality to route a service and serve requests.

This class does nothing on its own and is meant to be extended.

Parameters:
  • name (str) – The model name. The final routed endpoint is generally derived from this parameter.

  • api_version (str) – The service API version. The final routed endpoint is generally derived from this parameter.

  • meta (dict) – Additional meta data added to the response body. Optional.

  • log_api_calls (bool) – Log request and response data. Default is False.

  • namespace (str) – String identifying a namespace that the service belongs to. Used to route services by subclasses. Default is “”.

  • validate_request_data (bool) – Whether to validate the request data or not. Applies to all HTTP methods and does nothing if add_request_schema() is never called.

  • validate_response_data (bool) – Whether to validate the response data or not. Applies to all HTTP methods and does nothing if add_response_schema() is never called.

id

A unique ID for the service.

Type:

str

name

The model name. The final routed endpoint is generally derived from this attribute.

Type:

str

api_version

The service version.

Type:

str

meta

Additional meta data added to the response body. Optional.

Type:

dict

log_api_calls

Log request and response data. Default is False.

Type:

bool

namespace

A namespace that the service belongs to.

Type:

str

validate_request_data

Whether to validate the request data or not. Applies to all HTTP methods and does nothing if add_request_schema() is never called.

Type:

bool

validate_response_data

Whether to validate the response data or not. Applies to all HTTP methods and does nothing if add_response_schema() is never called.

Type:

bool

action

str describing the action of the service, e.g. “prediction”. Used to determine the final routed endpoint.

Type:

str

endpoint

The endpoint where the service is exposed.

Type:

str

request_schemas

Dictionary mapping HTTP methods to instances of porter.schemas.RequestSchema. Each RequestSchema object is added from calls to add_request_schema() and instantiated from the corresponding arguments.

Type:

dict

response_schemas

Dictionary mapping HTTP methods to a list of porter.schemas.ResponseSchema. Each ResponseSchema object is added from calls to add_response_schema() and instantiated from the corresponding arguments.

Type:

dict

abstract property action

str describing the action of the service, e.g. “prediction”. Used to determine the final routed endpoint.

add_request_schema(method, api_obj, description=None)[source]

Add a request schema.

Parameters:
  • method (str) – The HTTP method, usually GET or POST.

  • api_obj (porter.schemas.ApiObject) – The request data schema.

  • description (str) – Description of the schema. Optional.

add_response_schema(method, status_code, api_obj, description=None)[source]

Add a response schema.

Parameters:
  • method (str) – The HTTP method, usually GET or POST.

  • status_code (int) – The HTTP response status code.

  • api_obj (porter.schemas.ApiObject) – The response data schema.

  • description (str) – Description of the schema. Optional.

property api_version

The service API version. The final routed endpoint is generally derived from this parameter.

check_meta(meta)[source]

Raise ValueError if meta contains invalid values, e.g. meta cannot be converted to JSON properly.

Subclasses overriding this method should always use super() to call this method on the superclass unless they have a good reason not to.

define_endpoint()[source]

Return the service endpoint derived from instance attributes.

define_id()[source]

Return a unique ID for the service. This is used to set the id attribute.

get_post_data()[source]

Return POST data.

Returns:

The result of porter.config.json_encoder

Raises:

werkzeug.exceptions.UnprocessableEntity – Raised if self.validate_request_data is True, a request schema has been defined, and the data fails validation against the schema.

property id

A unique ID for the instance.

property name

The model name. The final routed endpoint is generally derived from this parameter.

property namespace

A namespace that the service belongs to.

property route_kwargs

Keyword arguments to use when routing self.serve().

abstract serve()[source]

Return a response to be served to the user.

Users extending this base class will want to return a native Python object such as a str or dict. In such cases the object must be compatible with porter.config.json_encoder.

For subclasses defined internally, this should be the return value of one of the functions in porter.responses or an instance of porter.responses.Response.

abstract property status

Return str representing the status of the service.

update_meta(meta)[source]

Update meta data with instance state if desired and return.

class porter.services.ModelApp(services, *, name=None, description=None, version=None, meta=None, expose_docs=False, docs_url='/docs/', docs_json_url='/_docs.json', docs_prefix='')[source]

Bases: object

Abstraction used to simplify building REST APIs that expose predictive models.

Essentially this class is a wrapper around an instance of flask.Flask.

Parameters:
  • name (str) – Name for the application. This will appear in the documentation if expose_docs=True. Optional.

  • description (str) – Description of the application. This will appear in the documentation if expose_docs=True. HTML allowed. Optional.

  • version (str) – Version of the application. This will appear in the documentation if expose_docs=True. Optional.

  • meta (dict) – Additional meta data added to the response body in health checks. Optional.

  • expose_docs (bool) – If True API documentation will be served at docs_url. The documentation is built from the request_schemas and response_schemas attributes of services added to the instance. Default is False.

  • docs_url (str) – Endpoint for the API documentation. Ignored if expose_docs=False. Defaults to ‘/docs/’. Note this does _not_ override docs_prefix.

  • docs_json_url (str) – URL where documentation JSON is exposed. Ignored if expose_docs=False. Defaults to ‘/_docs.json’. Note this does _not_ override docs_prefix.

  • docs_prefix (str) – Prefix applied to all documentation endpoints. Must begin with a / and end without one.

name

Name for the application.

Type:

str

description

Description of the application.

Type:

str

version

Version of the application.

Type:

str

meta

Additional meta data added to the response body in health checks.

Type:

dict

expose_docs

Whether the instance is configured to expose API documentation.

Type:

bool

docs_url

Endpoint the API documentation is exposed at.

Type:

str

docs_json_url

URL where documentation JSON is exposed.

Type:

str

docs_prefix

Prefix applied to all documentation endpoints.

Type:

str

docs_json

The OpenAPI spec used to serve the Swagger documentation. None if expose_docs is False.

Type:

dict or None

check_meta(meta)[source]

Raise ValueError if meta contains invalid values, e.g. meta cannot be converted to JSON properly.

Subclasses overriding this method should always use super() to call this method on the superclass unless they have a good reason not to.

run(*args, **kwargs)[source]

Run the app.

Parameters:
  • *args – Positional arguments passed on to the wrapped flask app.

  • **kwargs – Keyword arguments passed on to the wrapped flask app.

class porter.services.PredictionService(*args, **kwargs)[source]

Bases: BaseService

A prediction service. Instances can be added to instances of ModelApp to serve predictions.

Parameters:
  • name (str) – The model name. The final routed endpoint will become “/<namespace>/<name>/<api version>/<action>/”.

  • api_version (str) – The model API version. The final routed endpoint will become “/<namespace>/<name>/<api version>/<action>/”.

  • meta (dict) – Additional meta data added to the response body. Optional.

  • log_api_calls (bool) – Log request and response data. Default is False.

  • namespace (str) – String identifying a namespace that the service belongs to. The final routed endpoint will become “/<namespace>/<name>/<api version>/<action>/”. Default is “”.

  • action (str) – str describing the action of the service. Used to determine the final routed endpoint. Defaults to “prediction”. The final routed endpoint will become “/<namespace>/<name>/<api version>/<action>/”.

  • model (object) – An object implementing the interface defined by porter.datascience.BaseModel.

  • preprocessor (object or None) – An object implementing the interface defined by porter.datascience.BasePreProcessor. If not None, the .process() method of this object will be called on the POST request data and its output will be passed to model.predict(). Optional.

  • postprocessor (object or None) – An object implementing the interface defined by porter.datascience.BasePostProcessor. If not None, the .process() method of this object will be called on the output of model.predict() and its return value will be used to populate the predictions returned to the user. Optional.

  • batch_prediction (bool) – Whether batch predictions are supported. If True the API will accept an array of objects to predict on. If False the API will only accept a single object per request. Optional.

  • additional_checks (callable) – If additional_checks raises a ValueError when called, a 422 UnprocessableEntity response will be returned to the user. This callable allows users to implement additional data validations that cannot be expressed with an OpenAPI schema. The signature should accept a single positional argument for the validated POST input parsed to a pandas.DataFrame.

  • feature_schema (porter.schemas.Object or None) – Description of a single feature set. Can be used to validate inputs if validate_request_data=True and to document the API if added to an instance of ModelApp where expose_docs=True.

  • prediction_schema (porter.schemas.Object or None) – Description of a single model prediction. Can be used to validate outputs if validate_response_data=True and to document the API if added to an instance of ModelApp where expose_docs=True.

  • **kwargs – Keyword arguments passed on to BaseService.
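
The additional_checks hook covers validations an OpenAPI schema cannot express, such as cross-column constraints. A sketch of such a validator, using a plain list of records where porter would pass a pandas.DataFrame (and with the column names start/end invented for illustration):

```python
def check_dates(records):
    """additional_checks-style validator: start must precede end.

    porter would call this with the validated POST input as a
    pandas.DataFrame; a list of dicts stands in here to keep the
    sketch dependency-free.
    """
    for row in records:
        if row["start"] >= row["end"]:
            raise ValueError(f"start must precede end, got {row}")

check_dates([{"start": 1, "end": 2}])  # passes silently
try:
    check_dates([{"start": 3, "end": 2}])
except ValueError as err:
    print("rejected:", err)
```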

id

A unique ID for the model. Composed of name and api_version.

Type:

str

name

The model’s name.

Type:

str

meta

Additional meta data added to the response body. Optional.

Type:

dict

log_api_calls

Log request and response data. Default is False.

Type:

bool

namespace

String identifying a namespace that the service belongs to. The final routed endpoint will become “/<namespace>/<name>/<api version>/prediction/”. Default is “”.

Type:

str

api_version

The model API version.

Type:

str

endpoint

The endpoint where the model predictions are exposed. This is computed as “/<name>/<api version>/prediction/”.

Type:

str

model

An object implementing the interface defined by porter.datascience.BaseModel.

Type:

object

preprocessor

An object implementing the interface defined by porter.datascience.BasePreProcessor. If not None, the .process() method of this object will be called on the POST request data and its output will be passed to model.predict(). Optional.

Type:

object or None

postprocessor

An object implementing the interface defined by porter.datascience.BasePostProcessor. If not None, the .process() method of this object will be called on the output of model.predict() and its return value will be used to populate the predictions returned to the user. Optional.

Type:

object or None

batch_prediction

Whether the endpoint supports batch predictions. If True the API will accept an array of objects to predict on. If False the API will only accept a single object per request. Optional.

Type:

bool

additional_checks

Raises ValueError or subclass thereof if POST request is invalid.

Type:

callable

feature_schema

Description of an individual instance to be predicted on. Can be used to validate inputs if validate_request_data=True and document the API if added to an instance of ModelApp where expose_docs=True.

Type:

porter.schemas.Object or None

prediction_schema

Description of an individual prediction returned to the user. Can be used to validate outputs if validate_response_data=True and to document the API if added to an instance of ModelApp where expose_docs=True.

Type:

porter.schemas.Object or None

request_schema

The POST request format, including instance IDs, wrapped as an Array if batch_prediction=True. Can be used for validation outside of porter.

Type:

porter.schemas.Object or None

response_schema

The POST 200 response format, including request_id, model_context, etc.

Type:

porter.schemas.Object or None

property action

str describing the action of the service. Used to determine the final routed endpoint. The final routed endpoint will become “/<namespace>/<name>/<api version>/<action>/”.

get_post_data()[source]

Return data from the most recent POST request as a pandas.DataFrame.

Returns:

pandas.DataFrame. Each row represents a single instance to predict on. If self.batch_prediction is False the DataFrame will only contain one row.

route_kwargs = {'methods': ['GET', 'POST'], 'strict_slashes': False}
serve()[source]

Retrieve POST request data from flask and return a response containing the corresponding predictions.

Returns:

A “jsonified” object representing the response to return to the user.

Return type:

object

Raises:
  • werkzeug.exceptions.BadRequest – Raised when request data cannot be parsed (in super().get_post_data).

  • werkzeug.exceptions.UnprocessableEntity – Raised when parsed request data does not follow the specified schema (in super().get_post_data).

  • werkzeug.exceptions.UnsupportedMediaType – Raised when request data is given in an unsupported Content-Encoding.

property status

Return ‘READY’. Instances of this class are always ready.

class porter.services.ServeAlive(*args, **kwargs)[source]

Bases: StatefulRoute

Class for building stateful liveness routes.

Parameters:

app (object) – A ModelApp instance. Instances of this class inspect app when called to determine if the app is alive.

logger = <Logger porter.services (WARNING)>
class porter.services.ServeReady(*args, **kwargs)[source]

Bases: StatefulRoute

Class for building stateful readiness routes.

Parameters:

app (object) – A ModelApp instance. Instances of this class inspect app when called to determine if the app is ready.

logger = <Logger porter.services (WARNING)>
class porter.services.ServeRoot(*args, **kwargs)[source]

Bases: StatefulRoute

class porter.services.StatefulRoute(*args, **kwargs)[source]

Bases: object

Helper class to ensure that classes we intend to route via their __call__() method satisfy the flask interface.

porter.services.serve_error_message(error)[source]

porter.utils module

class porter.utils.AppEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Bases: NumpyEncoder, PythonEncoder

A JSON encoder that handles numpy and python data types.

default(obj)[source]

Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).

For example, to support arbitrary iterators, you could implement default like this:

def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return JSONEncoder.default(self, o)
porter.utils.JSONFormatter

alias of JSONLogFormatter

class porter.utils.JSONLogFormatter(*fields, indent=None, encoder=None, **kwargs)[source]

Bases: Formatter

A JSON formatter for logs.

Usage:

>>> logger = logging.getLogger(__name__)
>>> logger.setLevel('INFO')
>>> console = logging.StreamHandler()
>>> formatter = JSONLogFormatter('asctime', 'message', 'levelname')
>>> console.setFormatter(formatter)
>>> logger.addHandler(console)
>>> logger.info({'something': 'interesting'})
{"message": {"something": "interesting"}, "levelname": "INFO",
 "asctime": "2018-07-05 11:48:54,248"}

Any attribute of logging.LogRecord can be specified to log. See

Exception information does not need to be specified. If there was an exception, that information is added automatically to the log.

>>> try:
>>>     raise Exception('something bad')
>>> except Exception as err:
>>>     logger.exception(err)
{"exc_text": "...", "exc_info": "...", "levelname": "ERROR",
 "asctime": "2018-07-05 11:51:26,179", "message": "something bad"}
Parameters:
  • fields (list of str) – List of fields to include in the log. This can be any attribute of a logging.LogRecord object plus “asctime” and “message”. See the logging module documentation for a list of logging.LogRecord attributes.

  • indent (int or None) – The indentation level. Values are the same as json.dump.

  • encoder (object) – A json.JSONEncoder subclass. It is recommended to use Encoder or a subclass thereof if you need additional handling so that Exceptions and datetimes are properly handled.

  • **kwargs – Additional keyword arguments to be passed to logging.Formatter.__init__.

format(record)[source]

Format the specified record as text.

The record’s attribute dictionary is used as the operand to a string formatting operation which yields the returned string. Before formatting the dictionary, a couple of preparatory steps are carried out. The message attribute of the record is computed using LogRecord.getMessage(). If the formatting string uses the time (as determined by a call to usesTime(), formatTime() is called to format the event time. If there is exception information, it is formatted using formatException() and appended to the message.

class porter.utils.NumpyEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Bases: JSONEncoder

A JSON encoder that handles numpy data types.

default(obj)[source]

Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).

For example, to support arbitrary iterators, you could implement default like this:

def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return JSONEncoder.default(self, o)
class porter.utils.PythonEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Bases: JSONEncoder

A JSON encoder that extends json.JSONEncoder to handle additional Python types.

default(obj)[source]

Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).

For example, to support arbitrary iterators, you could implement default like this:

def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return JSONEncoder.default(self, o)
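
The default() docstring above is inherited verbatim from json.JSONEncoder; the pattern porter's encoders follow is to extend default() for extra types. A stdlib-only sketch of that pattern handling datetimes and sets (porter's actual type coverage may differ):

```python
import datetime
import json

class ExtendedEncoder(json.JSONEncoder):
    """Sketch of the PythonEncoder pattern: extend default() for extra types."""
    def default(self, o):
        if isinstance(o, (datetime.date, datetime.datetime)):
            return o.isoformat()
        if isinstance(o, set):
            return sorted(o)
        # Fall back to the base class, which raises TypeError.
        return super().default(o)

payload = {"when": datetime.date(2024, 1, 31), "tags": {"b", "a"}}
print(json.dumps(payload, cls=ExtendedEncoder))
# {"when": "2024-01-31", "tags": ["a", "b"]}
```
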
porter.utils.object_constants(obj)[source]