fix(docs) Fix typos in documentation #4374

Open
wants to merge 3 commits into main
4 changes: 2 additions & 2 deletions doc/source/tutorial-quickstart-jax.rst
@@ -226,10 +226,10 @@ set that the client might have:

 Finally, we can construct a ``ClientApp`` using the ``FlowerClient`` defined above by
 means of a ``client_fn()`` callback. Note that the `context` enables you to get access
-to hyperparemeters defined in your ``pyproject.toml`` to configure the run. In this
+to hyperparameters defined in your ``pyproject.toml`` to configure the run. In this
 tutorial we access the ``local-epochs`` setting to control the number of epochs a
 ``ClientApp`` will perform when running the ``fit()`` method. You could define
-additioinal hyperparameters in ``pyproject.toml`` and access them here.
+additional hyperparameters in ``pyproject.toml`` and access them here.

.. code-block:: python

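The hunk above concerns reading run hyperparameters from the `context`. A minimal sketch of that pattern is shown below; the ``Context`` class here is a simplified stand-in for Flower's ``flwr.common.Context`` (an assumption for illustration), and the real ``client_fn()`` would return ``FlowerClient(...).to_client()`` rather than a plain dict:

```python
# Sketch only: simplified stand-in for flwr.common.Context, showing how a
# client_fn() callback can read a hyperparameter such as "local-epochs"
# defined in pyproject.toml.
from dataclasses import dataclass, field


@dataclass
class Context:
    # In Flower, run_config is populated from the app config in pyproject.toml.
    run_config: dict = field(default_factory=dict)


def client_fn(context: Context) -> dict:
    # Read the per-run hyperparameter; in the tutorial this value would be
    # passed to FlowerClient(...) and the client returned via .to_client().
    local_epochs = context.run_config["local-epochs"]
    return {"local-epochs": local_epochs}
```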
2 changes: 1 addition & 1 deletion doc/source/tutorial-quickstart-pytorch.rst
@@ -226,7 +226,7 @@ The ClientApp
 The main changes we have to make to use `PyTorch` with `Flower` will be found in the
 ``get_weights()`` and ``set_weights()`` functions. In ``get_weights()`` PyTorch model
 parameters are extracted and represented as a list of NumPy arrays. The
-``set_weights()`` function that's the oposite: given a list of NumPy arrays it applies
+``set_weights()`` function does the opposite: given a list of NumPy arrays it applies
 them to an existing PyTorch model. Doing this is fairly easy in PyTorch.

.. note::
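The two functions this hunk describes typically look like the following (a sketch based on the conversion the text describes, not copied from this PR; ``net`` is any ``torch.nn.Module``):

```python
from collections import OrderedDict

import numpy as np
import torch


def get_weights(net: torch.nn.Module) -> list:
    # Extract model parameters and represent them as a list of NumPy arrays.
    return [val.cpu().numpy() for _, val in net.state_dict().items()]


def set_weights(net: torch.nn.Module, parameters: list) -> None:
    # The opposite: apply a list of NumPy arrays to an existing PyTorch model.
    params_dict = zip(net.state_dict().keys(), parameters)
    state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
    net.load_state_dict(state_dict, strict=True)
```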
2 changes: 1 addition & 1 deletion doc/source/tutorial-quickstart-xgboost.rst
@@ -399,7 +399,7 @@ We first define a strategy for XGBoost bagging aggregation.
return config

 We use two clients for this example. An ``evaluate_metrics_aggregation`` function is
-defined to collect and wighted average the AUC values from clients. The ``config_func``
+defined to collect and weighted average the AUC values from clients. The ``config_func``
 function is to return the current FL round number to client's ``fit()`` and
 ``evaluate()`` methods.

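The aggregation described in this hunk amounts to a weighted average over ``(num_examples, metrics)`` pairs reported by clients. The signatures below are assumptions modeled on Flower's XGBoost quickstart, not code taken from this PR:

```python
def evaluate_metrics_aggregation(eval_metrics):
    # eval_metrics is a list of (num_examples, metrics_dict) pairs, one per
    # client. Weight each client's AUC by its number of examples.
    total_num = sum(num for num, _ in eval_metrics)
    auc_aggregated = sum(m["AUC"] * num for num, m in eval_metrics) / total_num
    return {"AUC": auc_aggregated}


def config_func(rnd: int) -> dict:
    # Send the current FL round number to the clients' fit()/evaluate() methods.
    return {"global_round": str(rnd)}
```

For example, clients reporting AUC 0.8 on 10 examples and AUC 0.6 on 30 examples aggregate to a weighted AUC of 0.65.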