
2.5.2

@IsaacYangSLA released this 14 Dec 02:04

Contributions

Special thanks to all the contributors for this release (in git shortlog order):
@IsaacYangSLA, @yhwen, @yanchengnv, @nvidianz, @YuanTingHsieh

What's New

In this release, we have introduced several exciting new features and enhancements, building on the foundation of version 2.5.1. Key updates include:

Extended Python Version Support (2.5.1)

We now support a broader range of Python versions, from 3.9 to 3.12, ensuring greater compatibility and flexibility for your development needs.

Secure Federated XGBoost Enhancements

Please see this note

The Secure Federated XGBoost framework has been significantly improved with optimizations to the CUDA Paillier Plugin:

New Parallel CUDA-Based Reduction Algorithm:

Version 2 of the CUDA Paillier Plugin introduces a new parallel reduction algorithm (see the conceptual sketch after this list). This improvement:

  • Doubles the performance compared to version 1 on certain datasets (e.g., small feature sets with a large number of rows).
  • Dramatically improves efficiency on datasets with a large number of features (over 2,000).
  • Parameter Conversion Optimization: unnecessary parameter conversions have been reduced, streamlining overall performance.
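To make the idea concrete, here is a conceptual Python sketch of a pairwise (tree) reduction. It is not the plugin's CUDA code; plain integers and a thread pool stand in for Paillier ciphertexts and GPU threads, only to illustrate why each round of the reduction parallelizes well.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_reduce(values, combine):
    """Pairwise (tree) reduction: each round halves the number of operands,
    and the pairs within a round are combined independently, which is the
    property a GPU implementation can exploit across threads."""
    with ThreadPoolExecutor() as pool:
        while len(values) > 1:
            pairs = [(values[i], values[i + 1]) for i in range(0, len(values) - 1, 2)]
            leftover = [values[-1]] if len(values) % 2 else []
            values = list(pool.map(lambda p: combine(*p), pairs)) + leftover
    return values[0]

# Stand-in for homomorphic addition: adding two Paillier ciphertexts
# corresponds to adding the underlying plaintexts.
ciphertexts = list(range(1_000))                       # pretend these are encrypted gradients
total = parallel_reduce(ciphertexts, lambda a, b: a + b)
assert total == sum(range(1_000))
```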

Performance Benchmarks:

Benchmarks conducted on the V100 GPU highlight the remarkable improvements achieved with these enhancements:

  • For small-feature datasets, our solution is 30x to 36.5x faster than third-party CPU-based implementations.
  • For wide-feature datasets, we maintain a competitive edge at 4.6x faster.
  • The CPU-based plugin has also been optimized to reduce memory usage during ciphertext operations by using shared memory.

End-to-end fraud detection example enhancements

In addition to the existing manual feature engineering, we added an example that uses graph embeddings as input features for XGBoost; the embeddings, used as new features, work better than manual feature enrichment.
We also show how to use federated explainability with Federated XGBoost.
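As a rough illustration of the pattern (not the example's actual code), the sketch below concatenates hypothetical graph embeddings with tabular features before training an XGBoost model; the arrays, shapes, and parameters are made up for demonstration.

```python
import numpy as np
import xgboost as xgb

# Hypothetical inputs: tabular transaction features plus per-account graph
# embeddings produced upstream by a GNN (e.g., GraphSAGE). Shapes are examples.
rng = np.random.default_rng(0)
tabular = rng.normal(size=(1000, 20))      # manually engineered features
graph_emb = rng.normal(size=(1000, 64))    # learned graph embeddings
labels = rng.integers(0, 2, size=1000)     # fraud / not-fraud

# Use the embeddings as additional columns alongside (or instead of) the
# manual features before handing the matrix to XGBoost.
features = np.hstack([tabular, graph_emb])
dtrain = xgb.DMatrix(features, label=labels)

booster = xgb.train(
    {"objective": "binary:logistic", "eval_metric": "auc", "max_depth": 6},
    dtrain,
    num_boost_round=50,
)
```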

Support Normal TLS & signed messages

By default, NVIDIA FLARE uses mutual TLS (mTLS) connections. Some customers need to use normal (one-way) TLS, so this release adds support for it.

Description
Currently, FLARE's message security comes from mutual TLS: the server and clients authenticate each other when making connections, so only clients with the right startup kits can connect to the server.
Allowing one-way SSL between the server and clients breaks this assumption: the server could be exposed to the internet, and anyone could write a client to connect to it. To ensure message security, explicit message authentication is required.
This release implements message authentication: messages received by the server must carry an auth token, and the token must be validated successfully to prove that it was issued by the server.

Here is how it works:

  • The client first logs in to the server. The server and client authenticate each other explicitly with the credentials in their startup kits; this step is independent of how the client and server are connected.
  • If the client's credentials are validated, the server issues a token and a signature that binds the client name and token together. The signature is generated with the server's private key, so it can only have been issued by the server.
  • When sending a message to the server, the client adds its client name, the token, and the signature as headers on the message.
  • When a message is received, the server validates the token and the signature. Messages that are missing these headers or fail validation are rejected.

Note that this mechanism relies on the security of the startup kits, so all sites must protect their startup kits carefully. A minimal sketch of the token-and-signature flow is shown below.
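The following is a minimal, hypothetical sketch of the flow using the cryptography package; the header names, token format, and RSA-PSS choice are assumptions for illustration and do not reflect FLARE's actual implementation.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# Server side: after the login credentials check out, issue a token and a
# signature that binds the client name and the token together.
server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def issue_token(client_name: str):
    token = os.urandom(16).hex()
    signature = server_key.sign(f"{client_name}:{token}".encode(), PSS, hashes.SHA256())
    return token, signature

# Client side: attach name, token, and signature as message headers
# (header names here are hypothetical).
def build_headers(client_name, token, signature):
    return {"client_name": client_name, "auth_token": token, "auth_signature": signature}

# Server side: reject any message whose headers are missing or fail to verify.
def validate_message(headers) -> bool:
    try:
        name, token, sig = headers["client_name"], headers["auth_token"], headers["auth_signature"]
        server_key.public_key().verify(sig, f"{name}:{token}".encode(), PSS, hashes.SHA256())
        return True
    except (KeyError, InvalidSignature):
        return False

token, sig = issue_token("site-1")
assert validate_message(build_headers("site-1", token, sig))
assert not validate_message(build_headers("site-2", token, sig))  # name/token binding broken
```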

Bug fixes:

We fixed various bugs discovered by our users and customers.

What's Changed

Full Changelog: 2.5.1...2.5.2