
Implementation of "FLOWING 🌊: Implicit Neural Flows for Structure-Preserving Morphing", to appear at NeurIPS 2025

schardong/flowing

FLOWING 🌊: Implicit Neural Flows for Structure-Preserving Morphing

Arthur Bizzi [1], Matias Grynberg Portnoy [2], Vitor Pereira Matias [3], Daniel Perazzo [4], João Paulo Lima [5], Luiz Velho [4], Nuno Gonçalves [6,7], João M. Pereira [8], Guilherme Schardong [6], Tiago Novello [4]
[1] École Polytechnique Fédérale de Lausanne (EPFL),
[2] Buenos Aires University (UBA),
[3] University of São Paulo (USP),
[4] Institute for Pure and Applied Mathematics (IMPA),
[5] Federal Rural University of Pernambuco (UFRPE),
[6] Institute of Systems and Robotics, University of Coimbra (ISR-UC),
[7] Portuguese Mint and Official Printing Office (INCM),
[8] University of Georgia (UGA)

This is the official implementation of "FLOWING 🌊: Implicit Neural Flows for Structure-Preserving Morphing", to appear at NeurIPS 2025. In the meantime, check the arXiv version. More results and examples are available on the project page.

Note that this repository is not ready yet!

Capabilities of our method

Getting started

TL;DR: If you just want to run the code, follow the steps below (assuming a UNIX-like system with Make installed). For more details, jump to the Setup and sample run section.

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
pip install -e .
make data/frll_neutral_front data/frll_neutral_front_cropped
...

Prerequisites

  1. Python
  2. venv or PyEnv
  3. Git (or just download a tarball of the repository)

Basically, you need Python to run the code and a GPU with GPGPU drivers installed (e.g., the CUDA drivers and toolkit for an NVIDIA GPU). Optionally, you may want a virtual environment (venv, PyEnv, or Anaconda) to isolate the code, and Git for version control. On a POSIX-compliant system, you may also want Make to automate some steps, but wherever applicable, we provide Python-based alternatives.

Code organization

Inside the standalone folder, we've stored scripts used for experiments in our paper, mainly the metrics (FID and LPIPS), face image cropping, landmark detection, and dataset downloads (in case you are not on a POSIX-compliant system). These are:

  • align.py - crop/resize/alignment script for the face images. Modified from the FFHQ repo. We mostly use it for cropping, since we perform the alignment ourselves.
  • calc-fid.sh - calculates the Fréchet Inception Distance (FID) between two sets of images.
  • calc-lpips.py - calculates the LPIPS between pairs of images.
  • detect-face-landmarks.py - given a list of face images, detects the facial landmarks using DLib and stores them as .DAT files, which are read by the experiment scripts.
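For reference, the FID compared by calc-fid.sh is the Fréchet distance between two Gaussians fitted to Inception features. The sketch below (not the repository's script) shows only the distance computation in plain NumPy; the feature matrices are random placeholders standing in for real Inception activations:

```python
import numpy as np

def sym_sqrtm(mat):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(mat)
    vals = np.clip(vals, 0.0, None)  # guard against tiny negative eigenvalues
    return (vecs * np.sqrt(vals)) @ vecs.T

def frechet_distance(feats_a, feats_b):
    """Fréchet distance between Gaussians fitted to two feature sets (rows = samples)."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    # tr((cov_a @ cov_b)^{1/2}) computed through the symmetric form
    # sqrt(cov_a) @ cov_b @ sqrt(cov_a), which shares its eigenvalues.
    s_a = sym_sqrtm(cov_a)
    cross = np.trace(sym_sqrtm(s_a @ cov_b @ s_a))
    diff = mu_a - mu_b
    return diff @ diff + np.trace(cov_a) + np.trace(cov_b) - 2.0 * cross

rng = np.random.default_rng(0)
feats = rng.normal(size=(512, 8))  # placeholder "Inception" features
d_same = frechet_distance(feats, feats)  # identical sets give distance ~0
```

The real script additionally runs the images through an Inception network to produce the features; only the closed-form distance is reproduced here.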

Setup and sample run

The command below trains a warping using Neural Conjugate Flows (NCF) as the base model, saving the results under results/001_002-ncf-good_manual_landmarks.

python ncf-warp-train.py experiments/faces/001_002-ncf-good_manual_landmarks.yaml

For Neural ODEs, you can simply switch the training script and configuration file as follows:

python node-warp-train.py experiments/faces/001_002-node-good_manual_landmarks.yaml
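Conceptually, both backends warp points by integrating a velocity field over time, and running the flow backward inverts the warp, which is what keeps the morph structure-preserving. The toy sketch below illustrates only that idea: the rotational `velocity` field and Euler integrator are made-up stand-ins, not the paper's learned networks or training scripts.

```python
import numpy as np

def velocity(points, t):
    """Toy velocity field: rigid rotation about the origin.
    In FLOWING this role is played by a learned network."""
    x, y = points[:, 0], points[:, 1]
    return np.stack([-y, x], axis=1)

def flow(points, t_final, steps=200):
    """Warp points by Euler-integrating the velocity field from t=0 to t_final.
    A negative t_final runs the flow backward, approximating the inverse warp."""
    dt = t_final / steps
    p = points.copy()
    for k in range(steps):
        p = p + dt * velocity(p, k * dt)
    return p

src = np.array([[1.0, 0.0], [0.0, 2.0]])
warped = flow(src, 0.5)        # forward warp
restored = flow(warped, -0.5)  # backward warp approximately recovers src
```

Because the warp is a flow, forward and backward integration compose to (near) identity, unlike a generic displacement field, which has no such inverse by construction.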

Reproducing the experiments

Obtaining the datasets

We employed three datasets for our experiments: two image datasets and a 3DGS dataset.

  • Face Research Lab London (FRLL) The FRLL dataset can be obtained from its page on figshare. Simply download and extract the frontal-facing neutral images to the data/frll_neutral_front folder. Our Makefile has a rule (data/frll_neutral_front) to automate this process. To crop the images, use standalone/crop-face-images.py, which crops all face images in the directory and saves them to data/frll_neutral_front_cropped. Note that we also provide a Make rule to automate this, named after the output directory.

  • MegaDepth The MegaDepth v1 dataset is made available on its project website. There are also rules in our Makefile to download, crop, and pair the images adequately. See the data/megadepth and data/megadepth_pairs rules.

  • NeRSemble The NeRSemble dataset is available at the authors' website. For complete access, you must request it by following the instructions there. Afterwards, ...
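If you are not on a POSIX-compliant system, the extract-into-a-data-folder step that the Make rules above automate can be mimicked with the Python standard library. This is a hedged sketch, not a script from the repository; the archive name in the usage comment is illustrative:

```python
import zipfile
from pathlib import Path

def extract_dataset(archive_path, target_dir):
    """Extract an already-downloaded dataset archive into the given data folder,
    creating the folder if necessary (stand-in for a Makefile rule)."""
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(target)
    # Return the extracted file names so the caller can sanity-check the layout.
    return sorted(p.name for p in target.iterdir())

# Example (hypothetical archive name):
# extract_dataset("frll_neutral_front.zip", "data/frll_neutral_front")
```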

Citation

If you find our work useful in your research, consider citing it in your tech report or paper.

@misc{bizzi2025flowing,
      title={{FLOWING}: Implicit Neural Flows for Structure-Preserving Morphing},
      author={Arthur Bizzi and Matias Grynberg and Vitor Matias and Daniel Perazzo and João Paulo Lima and Luiz Velho and Nuno Gonçalves and João Pereira and Guilherme Schardong and Tiago Novello},
      year={2025},
      eprint={2510.09537},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2510.09537},
}
