
Make it easier to set activations #283

Open
EssamWisam opened this issue Nov 9, 2024 · 1 comment
Comments

@EssamWisam
Collaborator

Motivation and description

[image attachment]

It would be easier if I could just specify :relu without having to explicitly import Flux; I came to MLJFlux precisely because I am not interested in using or importing Flux.

Possible Implementation

Something like:

function get_activation(func_symbol::Symbol)
    if hasproperty(Flux, func_symbol)
        return getproperty(Flux, func_symbol)
    else
        error("Function $func_symbol not found in Flux.")
    end
end
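As a self-contained sketch of the idea (get_activation is the name proposed here, not an existing MLJFlux function):

```julia
import Flux

# Look up an activation function in Flux by symbol, e.g. :relu -> Flux.relu.
function get_activation(func_symbol::Symbol)
    if hasproperty(Flux, func_symbol)
        return getproperty(Flux, func_symbol)
    else
        error("Function $func_symbol not found in Flux.")
    end
end

act = get_activation(:relu)
act(-1.0)  # 0.0 — relu clamps negatives to zero
```

An unknown symbol such as :no_such_fn would hit the error branch, so typos fail loudly rather than silently.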
@ablaom
Collaborator

ablaom commented Nov 13, 2024

I think to keep things simple we either re-export the entire Flux namespace, or we re-export none of it. I think the latter, which is the current choice, gives the user more control, and it's not too burdensome to run using Flux to get direct access to Flux's exported methods.

In case it was not clear to you, after using Flux you can do relu, Dense, etc without the Flux. qualification.
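To make that concrete, a minimal illustration of unqualified access after using Flux:

```julia
using Flux  # brings Flux's exported names into scope

# Exported names now need no Flux. qualifier:
relu(-2.0)     # 0.0, same as Flux.relu(-2.0)
Dense(3 => 2)  # a Dense layer, same as Flux.Dense(3 => 2)
```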

In docs we always try to put in the Flux. qualifier, because we cannot assume the user has done using Flux.

Even if we did re-export Flux's namespace, it wouldn't help a user who never runs using MLJFlux, which they can indeed avoid with @load NeuralNetworkClassifier or similar, since that macro only does import MLJFlux.

My vote would be for the status quo. Anyone else prefer we re-export the Flux namespace?
