
Cv_results best method #621

Open
GeeseAndQuack opened this issue May 5, 2022 · 2 comments
Labels
enhancement New feature or request

Comments

@GeeseAndQuack

Hi,

The docstring for the best method says it 'Returns a model initialised with the hyper-parameters that perform optimally on average across folds for a given metric.'.

For some fairness metrics, however, a higher value indicates worse performance. As a result, calling best for such measures returns the worst-performing model, since (I believe) best simply takes the entry with the maximum value in mean_storage?

Could an argument to somehow specify what constitutes best for the given measure be added?

Thanks, and let me know if I am just plain wrong.
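For concreteness, here is a hypothetical sketch of what such an argument could look like. The class, the best_hyper_params signature, and the mean_storage attribute are assumptions based on the description above, not EthicML's actual code:

```python
from typing import Dict


class CVResults:
    """Minimal stand-in for the cross-validation results object (illustration only)."""

    def __init__(self, mean_storage: Dict[str, float]):
        # maps a hyper-parameter configuration to its mean score across folds
        self.mean_storage = mean_storage

    def best_hyper_params(self, *, higher_is_better: bool = True) -> str:
        # Pick the maximum for accuracy-like metrics,
        # the minimum for lower-is-better fairness measures.
        selector = max if higher_is_better else min
        return selector(self.mean_storage, key=self.mean_storage.get)


results = CVResults({"C=0.1": 0.20, "C=1.0": 0.05})
results.best_hyper_params(higher_is_better=False)  # picks "C=1.0"
```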

@GeeseAndQuack GeeseAndQuack changed the title Clarification for the cv_results best method Cv_results best method May 5, 2022
@olliethomas olliethomas added the enhancement New feature or request label May 6, 2022
@olliethomas
Member

Good idea. We'd take a pull request with this feature if you'd like to work on it.

@tmke8
Member

tmke8 commented May 6, 2022

A work-around would be to define a new metric that is the negative of the existing metric.

from dataclasses import dataclass

# Import paths may vary between EthicML versions; adjust as needed.
from ethicml import Accuracy, DataTuple, Prediction


@dataclass
class InverseAccuracy(Accuracy):
    """Accuracy negated so that best (which maximises) selects the minimum."""

    def score(self, prediction: Prediction, actual: DataTuple) -> float:
        return -super().score(prediction, actual)
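To see why negating works: if best always picks the configuration with the maximum mean score, then negating a lower-is-better metric makes its maximum coincide with the original minimum. A standalone sketch of that argument, with made-up scores (not EthicML's actual storage):

```python
# Mean scores per hyper-parameter configuration; lower is better for this metric.
scores = {"params_a": 0.30, "params_b": 0.10, "params_c": 0.25}

# A max-based "best" picks the worst configuration here.
best_by_max = max(scores, key=scores.get)      # "params_a"

# Negating the metric flips the ordering, so max now finds the true best.
negated = {k: -v for k, v in scores.items()}
best_negated = max(negated, key=negated.get)   # "params_b"
```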
