Tidymodels

Hierarchical forecasting of hospital admissions - ML approach (ensemble)

R
Contents: 1 Recap · 2 Tune again (Modelling; Retuning) · 2.1 Retune Random Forest · 2.2 Retune Prophet boost · 2.3 Performance (after retuning) · 3 Ensemble · 3.1 Performance (ensemble) · 4 Performance (individual levels): Hospital, Cluster level, National level · 5 The future: Hospital, Cluster, National · 6 KIV Plans; Errors

The aim of this series of blog posts is to predict monthly admissions to Singapore public acute adult hospitals.

Hierarchical forecasting of hospital admissions - ML approach (screen variables)

R
Contents: 1 Intro · 2 Data wrangling · 2.1 Long format with aggregated values · 2.2 Extend into the future · 2.3 External regressor · 2.3.1 Lags and rolling lags · 2.3.2 Covid · 2.3.3 Time series features · 3 Splitting · 4 Pre-processing recipes (pre-processing order) · 5 Modelling (workflow) · 6 Evaluate · 6.1 Evaluate against the training set (what's inside the calibrated table) · 6.2 Evaluate with cross-validation · 8 Conclusion

The aim of this series of blog posts is to predict monthly admissions to Singapore public acute adult hospitals.

Hierarchical forecasting of hospital admissions

R
Contents: Introduction · Visualization · 1. Trend · 2. Seasonality · Trend and seasonality · 3. Anomaly · Conclusion

The aim of this series of blog posts is to do time series forecasting with libraries that conform to tidyverse principles. There are two such time series meta-packages: modeltime, created to be the time series equivalent of tidymodels, and fpp3, created for tidy time series analysis and nicknamed the tidyverts.

Explaining Predictions: Boosted Trees Post-hoc Analysis (Xgboost)

R
Recap: We've covered various approaches to explaining model predictions globally. Today we will learn about another model-specific post-hoc analysis: understanding the workings of gradient boosting predictions. As in past posts, the Cleveland heart dataset and tidymodels principles will be used; refer to the first post of this series for more details. Gradient boosting: besides the random forest introduced in a past post, another tree-based ensemble model is gradient boosting.
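As a rough sketch of the kind of model that post analyses (using the xgboost package directly rather than the post's tidymodels workflow, and the built-in mtcars data standing in for the Cleveland heart dataset):

```r
# Illustrative only: fit a small gradient-boosted tree model with xgboost.
# Assumes the xgboost package is installed; mtcars stands in for the
# Cleveland heart data used in the actual post.
library(xgboost)

X <- as.matrix(mtcars[, setdiff(names(mtcars), "am")])
y <- mtcars$am  # binary outcome: transmission type

bst <- xgboost(data = X, label = y, nrounds = 20,
               objective = "binary:logistic", verbose = 0)

# Per-feature gain/cover/frequency: a common starting point for
# post-hoc analysis of boosted trees
xgb.importance(model = bst)
```

Post-hoc tools then build on a fitted booster like this one to explain individual predictions.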

Explaining Predictions: Random Forest Post-hoc Analysis (randomForestExplainer package)

R
Recap: This is a continuation of the explanation of machine learning model predictions, specifically random forest models. We can depend on the randomForest package itself to explain predictions based on impurity importance or permutation importance. Today we will explore external packages that aid in explaining random forest predictions. External packages: there are a few external packages that calculate variable importance for random forest models, apart from the conventional measurements found within the randomForest package.
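For context, the "conventional measurements" mentioned above are the two importance scores the randomForest package computes itself. A minimal sketch, assuming randomForest is installed and with the built-in iris data standing in for the post's dataset:

```r
# Assumes the randomForest package is installed; iris stands in for the
# post's actual data.
library(randomForest)

set.seed(42)
fit <- randomForest(Species ~ ., data = iris, ntree = 200, importance = TRUE)

# importance = TRUE yields both measures the post refers to:
# MeanDecreaseAccuracy (permutation) and MeanDecreaseGini (impurity)
importance(fit)
```

Packages such as randomForestExplainer extend these built-in measures with additional ones (e.g. depth-based statistics) computed from the same fitted forest.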