distillML

Model Distillation and Interpretability Methods for Machine Learning Models

Installation
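
A minimal sketch of installation: the first command is the standard way to install any released CRAN package; the commented alternative assumes the `remotes` package and installs the development version from the GitHub repository listed below.

```r
# Install the released version from CRAN
install.packages("distillML")

# Or install the development version from GitHub
# (assumes the remotes package is installed)
# remotes::install_github("forestry-labs/distillML")
```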

About

Provides several methods for model distillation and interpretability for general black-box machine learning models and treatment effect estimation methods. For details on the algorithms implemented, see the accompanying paper by Brian Cho, Theo F. Saarinen, Jasjeet S. Sekhon, and Simon Walter.

github.com/forestry-labs/distillML
Bug reports: file via the GitHub repository

Key Metrics

Version: 0.1.0.13
Published: 2023-03-25
Needs compilation: no
License: GPL (≥ 3)
CRAN checks: distillML check results

Downloads

Yesterday: 19 (0%)
Last 7 days: 48 (−11%)
Last 30 days: 268 (+4%)
Last 90 days: 968 (+23%)
Last 365 days: 3,339 (+53%)

Maintainer

Theo Saarinen

theo_s@berkeley.edu

Authors

Brian Cho (aut)
Theo Saarinen (aut, cre)
Jasjeet Sekhon (aut)
Simon Walter (aut)

Material

README
Reference manual
Package source

macOS

r-release (arm64)
r-oldrel (arm64)
r-release (x86_64)
r-oldrel (x86_64)

Windows

r-devel (x86_64)
r-release (x86_64)
r-oldrel (x86_64)

Old Sources

distillML archive

Imports

ggplot2
glmnet
Rforestry
dplyr
R6 ≥ 2.0
checkmate
purrr
tidyr
data.table
mltools
gridExtra

Suggests

testthat
knitr
rmarkdown
mvtnorm