
There is no guarantee of optimality with golden section search #1

@ljwolf

Description

Hi!

This is very interesting, and a great package to see! I have similar goals to implement most of our estimator classes in sklearn-style APIs, and GW-learners should definitely be generalizable in this fashion! Really cool!

For the "optimal bandwidth search" procedure, there is no guarantee that the default finds an optimal value, even for the GWLogisticClassifier, because the information criterion may not be unimodal (let alone convex) as a function of the bandwidth parameter, and

> for an interval containing multiple extrema (possibly including the interval boundaries), [golden-section search] will converge to one of them.

Sometimes you can get lucky, in the sense that your min_bandwidth and max_bandwidth happen to bracket a region of the information criterion with a single optimum. But such non-convexity is often pronounced at small values of k, as well as when the data generating process mixes local, regional, and global patterns. And I'm not aware of a proof of convexity (or even unimodality) for any of the other prediction algorithms under any conditions...
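To make the failure mode concrete, here's a minimal self-contained sketch (all names are illustrative, and the bimodal `ic` is a synthetic stand-in for AICc, not anything from this package): a textbook golden-section search discards the half-interval containing a narrow global minimum on its very first comparison, and converges to an inferior local minimum.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Textbook golden-section minimization.

    Correct ONLY if f is unimodal on [a, b] -- exactly the assumption
    that can fail for an information criterion as a function of bandwidth.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1 / golden ratio
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # assume the minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                    # assume the minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Synthetic bimodal "criterion": a wide local minimum at x = 1 (value ~ -1)
# and a narrow global minimum at x = 3.8 (value ~ -2).
def ic(x):
    return -math.exp(-(x - 1) ** 2) - 2 * math.exp(-50 * (x - 3.8) ** 2)

x_golden = golden_section_search(ic, 0.0, 4.0)         # settles near 1.0
x_grid = min((i / 1000 for i in range(4001)), key=ic)  # brute force finds ~3.8
```

The first probe pair (at roughly 1.53 and 2.47) never touches the narrow valley near 3.8, so the right portion of the interval is thrown away immediately and the search converges to the local minimum. (For brevity this version re-evaluates f at the cached probe; a production version would reuse function values.)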

I'd definitely recommend testing this for your data, and maybe implementing a more robust optimizer?
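One pragmatic shape for such an optimizer -- a sketch, assuming the criterion is cheap enough to evaluate a few dozen times, with all names (`robust_minimize`, `toy_ic`) purely illustrative -- is a coarse grid scan to locate the basin of the global minimum, followed by a local search confined to the bracket around the grid winner (ternary search here for brevity; golden-section refinement in that bracket would work the same way):

```python
import math

def robust_minimize(score, lo, hi, n_grid=25, n_refine=40):
    """Grid-scan `score` (lower is better) over [lo, hi], then refine locally.

    The grid scan handles multimodality; the local search only needs
    unimodality within one grid cell's neighborhood, a much weaker assumption.
    """
    step = (hi - lo) / (n_grid - 1)
    grid = [lo + i * step for i in range(n_grid)]
    best = min(grid, key=score)              # basin of the approximate global minimum
    a, b = max(lo, best - step), min(hi, best + step)
    for _ in range(n_refine):                # ternary search inside the bracket
        m1, m2 = a + (b - a) / 3, b - (b - a) / 3
        if score(m1) < score(m2):
            b = m2
        else:
            a = m1
    return (a + b) / 2

# Same kind of bimodal toy criterion that defeats plain golden-section search:
# wide local minimum at 1.0, narrow global minimum at 3.8.
def toy_ic(x):
    return -math.exp(-(x - 1) ** 2) - 2 * math.exp(-50 * (x - 3.8) ** 2)

best_bw = robust_minimize(toy_ic, 0.0, 4.0)  # lands near 3.8, the global minimum
```

The trade-off is cost: the grid scan spends n_grid criterion evaluations up front, which matters when each evaluation means refitting a geographically weighted model. A coarser grid is cheaper but can still miss a valley narrower than one grid cell, so the grid spacing effectively sets the resolution of the guarantee.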
