Package: fairmodels
Type: Package
Title: Flexible Tool for Bias Detection, Visualization, and Mitigation
Version: 1.2.1
Authors@R:
    c(person("Jakub", "Wiśniewski", role = c("aut", "cre"),
             email = "[email protected]"),
      person("Przemysław", "Biecek", role = c("aut"),
             comment = c(ORCID = "0000-0001-8423-1823")))
Description: Measure fairness metrics in one place for many models. Check how large a model's bias is towards different races, sexes, nationalities, etc. Use measures such as Statistical Parity and Equal Odds to detect discrimination against unprivileged groups. Visualize the bias using a heatmap, radar plot, biplot, bar chart (and more!). Various pre-processing and post-processing bias mitigation algorithms are implemented. The package also supports calculating fairness metrics for regression models. Find more details in Wiśniewski and Biecek (2021) <arXiv:2104.00507>.
License: GPL-3
Encoding: UTF-8
LazyData: true
Depends: R (>= 3.5)
Imports:
    DALEX,
    ggplot2,
    scales,
    stats,
    patchwork
Suggests:
    ranger,
    gbm,
    knitr,
    rmarkdown,
    covr,
    testthat,
    spelling,
    ggdendro,
    ggrepel
RoxygenNote: 7.1.1.9001
VignetteBuilder: knitr
URL: https://fairmodels.drwhy.ai/
BugReports: https://github.com/ModelOriented/fairmodels/issues
Language: en-US
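
The Description field above summarizes the intended workflow: wrap a fitted model in a DALEX explainer, then compute fairness metrics for a protected attribute. Below is a minimal R sketch of that workflow (not part of the DESCRIPTION file itself); it assumes the german credit dataset shipped with fairmodels, with a binary Risk target and a Sex column, and uses ranger from Suggests.

# Minimal usage sketch; dataset column layout is an assumption, not a guarantee.
library(fairmodels)
library(DALEX)
library(ranger)

data("german", package = "fairmodels")
y_numeric <- as.numeric(german$Risk) - 1          # encode the target as 0/1

model     <- ranger(Risk ~ ., data = german, probability = TRUE)
explainer <- explain(model, data = german[, -1], y = y_numeric)  # assumes Risk is column 1

fobject <- fairness_check(explainer,
                          protected  = german$Sex,
                          privileged = "male")
print(fobject)   # fairness metrics such as Statistical Parity and Equal Odds
plot(fobject)    # bar-chart fairness check visualization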