{ "id": "1710.11595", "version": "v1", "published": "2017-10-31T17:15:37.000Z", "updated": "2017-10-31T17:15:37.000Z", "title": "Partial Least Squares Random Forest Ensemble Regression as a Soft Sensor", "authors": [ "Casey Kneale", "Steven Brown" ], "comment": "Rough draft", "categories": [ "stat.ML", "cs.LG", "stat.ME" ], "abstract": "Six simple, dynamic soft sensor methodologies with two update conditions were compared on two experimentally obtained datasets and one simulated dataset. The soft sensors investigated were: moving window partial least squares regression (and a recursive variant), moving window random forest regression, feedforward neural networks, mean moving window, and a novel random forest partial least squares regression ensemble (RF-PLS). We found that, on two of the datasets studied, very small window sizes (4 samples) led to the lowest prediction errors. The RF-PLS method offered the lowest one-step-ahead prediction errors compared to those of the other methods, and demonstrated greater stability at larger time lags than moving window PLS alone. We found that this method most adequately modeled the datasets that did not feature purely monotonic increases in property values. In general, we observed that linear models deteriorated most rapidly at more delayed model update conditions, while nonlinear methods tended to provide predictions that approached those from a simple mean moving window. Other data-dependent findings are presented and discussed.", "revisions": [ { "version": "v1", "updated": "2017-10-31T17:15:37.000Z" } ], "analyses": { "keywords": [ "squares random forest ensemble regression", "soft sensor", "moving window", "window random forest regression", "one-step-ahead prediction errors" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }