Information

Model and demo information.
Automated Analysis

This demo uses machine learning (ML) to analyse nocturnal pulse oximetry data in children. Statistical, non-linear, spectral, and oximetric features computed from the recording are used in the prediction process.
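As an illustration of the kind of feature extraction involved, the sketch below computes a few generic statistical and oximetric features (mean, minimum, time below 90%, and a simple 3% desaturation count) from an SpO2 signal. These are examples only, not the demo's exact feature set.

```python
import numpy as np

def oximetry_features(spo2, fs=1.0, desat_drop=3.0):
    """Illustrative features from an SpO2 signal sampled at fs Hz.

    Generic examples (mean, variability, a crude 3% oxygen desaturation
    count); not the demo's actual feature definitions.
    """
    spo2 = np.asarray(spo2, dtype=float)

    # Basic statistical features.
    feats = {
        "mean_spo2": spo2.mean(),
        "std_spo2": spo2.std(),
        "min_spo2": spo2.min(),
        "frac_below_90": np.mean(spo2 < 90.0),  # fraction of recording below 90%
    }

    # Crude desaturation count: drops of >= desat_drop percentage points
    # from a running baseline (here, a 60-second rolling maximum).
    window = int(60 * fs)
    baseline = np.array([spo2[max(0, i - window):i + 1].max() for i in range(len(spo2))])
    desat = (baseline - spo2) >= desat_drop

    # Count rising edges of the desaturation mask as discrete events.
    events = np.sum(np.diff(desat.astype(int)) == 1)
    hours = len(spo2) / fs / 3600.0
    feats["odi3"] = events / hours if hours > 0 else 0.0
    return feats
```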

Sleep staging of the oximetry data is performed using binary classification. Thirty-second segments (epochs) of the recording are classified as sleep or wake, and these labels are used to filter the recording before the analysis for sleep-disordered breathing. Sleep staging enables the computation of sleep statistics traditionally obtained from overnight polysomnography, such as sleep duration, latency, efficiency, and wake after sleep onset (WASO). While this sleep staging step can be disabled, it is recommended for analysing oximetry data collected in uncontrolled environments (e.g., in the home).
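The sketch below shows how such summary statistics could be derived from per-epoch sleep/wake labels. The classifier producing the labels is left abstract, and the statistic definitions are the standard textbook ones, not necessarily the demo's exact implementation.

```python
import numpy as np

EPOCH_SEC = 30

def sleep_statistics(epoch_is_sleep, epoch_sec=EPOCH_SEC):
    """PSG-style summary statistics from per-epoch sleep/wake labels.

    epoch_is_sleep: boolean array with one entry per 30-second epoch
    (the output of a sleep/wake classifier).
    """
    epoch_is_sleep = np.asarray(epoch_is_sleep, dtype=bool)
    total_min = len(epoch_is_sleep) * epoch_sec / 60.0
    sleep_min = epoch_is_sleep.sum() * epoch_sec / 60.0

    sleep_idx = np.flatnonzero(epoch_is_sleep)
    if sleep_idx.size == 0:
        return {"tst_min": 0.0, "latency_min": total_min,
                "efficiency_pct": 0.0, "waso_min": 0.0}

    # Sleep onset latency: time from the start of the recording to the first sleep epoch.
    latency_min = sleep_idx[0] * epoch_sec / 60.0

    # WASO: wake epochs between sleep onset and the final awakening.
    onset, offset = sleep_idx[0], sleep_idx[-1]
    waso_min = np.sum(~epoch_is_sleep[onset:offset + 1]) * epoch_sec / 60.0

    return {
        "tst_min": sleep_min,                          # total sleep time
        "latency_min": latency_min,                    # sleep onset latency
        "efficiency_pct": 100.0 * sleep_min / total_min,
        "waso_min": waso_min,                          # wake after sleep onset
    }
```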

Analysis of the recording uses either a regressor, which produces a point estimate of the apnoea-hypopnoea index (AHI) with accompanying uncertainty bounds, or a classifier, which predicts whether the AHI is ≥5.
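A minimal sketch of how the two analysis modes could be exposed is shown below. The model objects, their `predict(..., return_std=True)` and `predict_proba` interfaces, and the Gaussian ~95% interval are assumptions for illustration, not the demo's actual API.

```python
import numpy as np

def summarise_prediction(features, regressor=None, classifier=None):
    """Illustrative wrapper around the two analysis modes.

    `regressor` is assumed to return (mean, std) per recording (e.g. a
    Gaussian Process); `classifier` is assumed to expose predict_proba.
    Both interfaces are assumptions for this sketch.
    """
    x = np.atleast_2d(features)
    result = {}

    if regressor is not None:
        mean, std = regressor.predict(x, return_std=True)
        # Approximate 95% uncertainty bounds assuming a roughly Gaussian posterior.
        result["ahi_estimate"] = float(mean[0])
        result["ahi_lower"] = float(mean[0] - 1.96 * std[0])
        result["ahi_upper"] = float(mean[0] + 1.96 * std[0])

    if classifier is not None:
        p = classifier.predict_proba(x)[0, 1]
        result["p_ahi_ge_5"] = float(p)
        result["ahi_ge_5"] = bool(p >= 0.5)

    return result
```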

A peer-reviewed article containing out-of-sample performance data is coming soon.
Online Retraining

This demo offers a streamlined pipeline for training and validating models using a bank of pre-computed features and datasets. Newly trained models are immediately available for use after saving, enabling rapid prototyping and rollout of updated models. All trained models are automatically calibrated and provide estimates of uncertainty alongside their predictions.
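The sketch below shows the general shape of such a train/validate/calibrate loop over pre-computed features. It uses scikit-learn and synthetic placeholder data purely for illustration; the demo relies on its own solvers and calibration, not this code.

```python
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder feature bank and labels standing in for the pre-computed datasets.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 0] + rng.normal(size=500)) > 0  # placeholder "AHI >= 5" labels

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit the base model and wrap it in probability calibration (sigmoid/Platt scaling).
model = CalibratedClassifierCV(LogisticRegression(max_iter=1000), method="sigmoid", cv=5)
model.fit(X_train, y_train)

# Validate; a saved calibrated model would then be immediately available for use.
p_val = model.predict_proba(X_val)[:, 1]
print("validation AUC:", roc_auc_score(y_val, p_val))
```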

Model training uses a set of custom solvers. Gaussian Processes (GPs) use an exact solver. Gradient Boosting Machines (GBMs) are gradient-boosted decision trees trained with an XGBoost-like algorithm using the histogram tree method. Linear models are trained using the closed-form solution for linear regression and Iteratively Reweighted Least Squares (IRLS) for logistic regression. Support Vector Machines (SVMs) are trained using Sequential Minimal Optimisation (SMO) with heuristics that leverage second-order information to accelerate convergence. The performance of these solvers is comparable to that of other widely used machine learning libraries.
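For concreteness, the sketch below implements the textbook IRLS (Newton) update for logistic regression mentioned above. The demo's solver may differ in details such as regularisation, line search, and stopping criteria; the small ridge term here is added only for numerical stability.

```python
import numpy as np

def logistic_irls(X, y, n_iter=25, tol=1e-8, ridge=1e-6):
    """Logistic regression via Iteratively Reweighted Least Squares (IRLS).

    X: (n, d) design matrix (include a column of ones for an intercept).
    y: (n,) binary labels in {0, 1}.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.zeros(X.shape[1])

    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        W = p * (1.0 - p)                  # diagonal of the IRLS weight matrix
        # Newton step: solve (X^T W X + ridge*I) delta = X^T (y - p).
        H = (X * W[:, None]).T @ X + ridge * np.eye(X.shape[1])
        delta = np.linalg.solve(H, X.T @ (y - p))
        w += delta
        if np.linalg.norm(delta) < tol:
            break
    return w

# Example: recover coefficients on a small synthetic problem.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
true_w = np.array([-0.5, 2.0, -1.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)
print(logistic_irls(X, y))
```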