
Network Traffic Transformer to learn network dynamics from packet traces. Learn fundamental dynamics with pre-training and fine-tune to multiple applications.


Network Traffic Transformer (NTT)

This work was undertaken as part of my master's thesis at ETH Zurich, from February 2022 to August 2022, titled Advancing packet-level traffic predictions with Transformers. We present a new transformer-based architecture to learn network dynamics from packet traces.

We design a pre-training phase in which the model learns fundamental network dynamics, followed by a fine-tuning phase on different network tasks, and demonstrate that effective pre-training generalizes to multiple fine-tuning tasks. A minimal illustration of this pre-train/fine-tune workflow is sketched below.
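
The following sketch shows the general idea of the pre-train/fine-tune workflow in PyTorch. It is a minimal illustration only: the feature layout, model sizes, masked-delay objective, and all names are assumptions for the example and do not correspond to the code in this repository.

```python
# Minimal sketch (not the repository's actual code): a transformer encoder over
# per-packet features, pre-trained to reconstruct a masked delay, then reused
# with a fresh head for a downstream task. All dimensions are illustrative.
import torch
import torch.nn as nn

class PacketEncoder(nn.Module):
    def __init__(self, n_features=3, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)            # per-packet feature embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, x):                                      # x: (batch, seq_len, n_features)
        return self.encoder(self.embed(x))                     # (batch, seq_len, d_model)

# Pre-training: predict the (masked) delay of the last packet in each sequence.
encoder = PacketEncoder()
pretrain_head = nn.Linear(64, 1)
opt = torch.optim.Adam(list(encoder.parameters()) + list(pretrain_head.parameters()), lr=1e-3)

packets = torch.rand(32, 16, 3)           # toy trace: 32 sequences of 16 packets, 3 features each
target_delay = packets[:, -1, 0].clone()  # pretend feature 0 of the last packet is its delay
packets[:, -1, 0] = 0.0                   # mask it out

for _ in range(5):
    pred = pretrain_head(encoder(packets)[:, -1, :]).squeeze(-1)
    loss = nn.functional.mse_loss(pred, target_delay)
    opt.zero_grad(); loss.backward(); opt.step()

# Fine-tuning: keep the pre-trained encoder and attach a new, task-specific head
# (e.g. for a flow-level prediction task), then train on the downstream labels.
finetune_head = nn.Linear(64, 1)
```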

Original proposal:

Supervisors:

Research Lab:

We refer you to the following sections for further details.

NOTE 1:

The experiments conducted in this project are quite involved. Understanding and reproducing them from the code and comments alone will be hard, despite the instructions in this README. For a more detailed understanding, we invite you to read the thesis (direct link). You can also check out an overview in the presentation slides (direct link).

For any further questions or to discuss related research ideas, please feel free to contact me by email.

NOTE 2:

Some results from the thesis have been written up as a paper titled A new hope for network model generalization, which has been accepted for presentation at ACM HotNets 2022. The paper is now online and open access; it can be accessed via the ACM Digital Library at this link (DOI: 10.1145/3563766.3564104).

NOTE 3:

The thesis has now been published in the ETH Research Collection, which is open access. It can be accessed here (direct link).
