Commit

Initial commit
y-zheng18 committed Dec 19, 2023
1 parent 4be88c6 commit 255df1e
Showing 31 changed files with 6,943 additions and 35 deletions.
39 changes: 6 additions & 33 deletions .gitignore
@@ -20,6 +20,7 @@ parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
@@ -49,7 +50,6 @@ coverage.xml
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
@@ -72,7 +72,6 @@ instance/
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
@@ -83,9 +82,7 @@ profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
@@ -94,22 +91,7 @@ ipython_config.py
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
@@ -145,16 +127,7 @@ dmypy.json

# Pyre type checker
.pyre/
.idea/
.DS_Store

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
logs/
56 changes: 54 additions & 2 deletions README.md
@@ -1,2 +1,54 @@
# HyFluid
Official code for Inferring Hybrid Neural Fluid Fields from Videos (NeurIPS 2023)
# Inferring Hybrid Neural Fluid Fields from Videos
This is the official code for Inferring Hybrid Neural Fluid Fields from Videos (NeurIPS 2023).

![teaser](assets/demo_hyfluid.gif)

**[[Paper](https://arxiv.org/pdf/2312.06561.pdf)] [[Project Page](https://kovenyu.com/hyfluid/)]**

## Installation
Install with conda:
```bash
conda env create -f environment.yml
conda activate hyfluid
```
or with pip:
```bash
pip install -r requirements.txt
```

## Data
The demo data is available at [data/ScalarReal](data/ScalarReal).
The full ScalarFlow dataset can be downloaded [here](https://ge.in.tum.de/publications/2019-scalarflow-eckert/).

## Quick Start
To learn the hybrid neural fluid fields from the demo data, first reconstruct the density field by running (~40 min):
```bash
bash scripts/train.sh
```
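
A rough mental model for this stage: a time-conditioned neural density field is fitted to the multi-view videos via differentiable volume rendering. The sketch below only illustrates what such a field could look like (a positional-encoded MLP mapping `(x, y, z, t)` to a non-negative density); it is not the architecture used in this repository.
```python
# Illustrative sketch only; the actual model in this repository may differ.
import torch

def positional_encoding(x, num_freqs=6):
    # Lift coordinates to sin/cos features at increasing frequencies.
    feats = [x]
    for i in range(num_freqs):
        feats += [torch.sin((2.0 ** i) * x), torch.cos((2.0 ** i) * x)]
    return torch.cat(feats, dim=-1)

class DensityField(torch.nn.Module):
    """Maps encoded (x, y, z, t) samples to a non-negative density value."""
    def __init__(self, num_freqs=6, hidden=128):
        super().__init__()
        in_dim = 4 * (1 + 2 * num_freqs)  # (x, y, z, t) after encoding
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(in_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1), torch.nn.Softplus(),
        )

    def forward(self, xyzt):  # xyzt: (N, 4)
        return self.mlp(positional_encoding(xyzt))
```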
Then, reconstruct the velocity field by jointly training it with the density field (~15 hours on a single A6000 GPU):
```bash
bash scripts/train_j.sh
```
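
One way a joint density/velocity training can couple the two fields is an advection (transport) residual, penalizing deviations from ∂ρ/∂t + u·∇ρ = 0 with autograd. The sketch below assumes the hypothetical `DensityField` above plus an analogous `velocity_fn`; it is not the actual training objective of this repository.
```python
# Illustrative sketch (assumes the hypothetical DensityField above and an
# analogous velocity_fn mapping (x, y, z, t) -> (u, v, w)); not the actual loss.
import torch

def transport_residual(density_fn, velocity_fn, xyzt):
    # Penalize violations of the advection equation drho/dt + u . grad(rho) = 0.
    xyzt = xyzt.detach().clone().requires_grad_(True)
    rho = density_fn(xyzt)
    grads = torch.autograd.grad(rho.sum(), xyzt, create_graph=True)[0]
    d_rho_dxyz, d_rho_dt = grads[..., :3], grads[..., 3:]
    u = velocity_fn(xyzt)                                      # (N, 3)
    residual = d_rho_dt + (u * d_rho_dxyz).sum(-1, keepdim=True)
    return (residual ** 2).mean()
```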
Finally, add vortex particles and optimize their physical parameters (~40 min):
```bash
bash scripts/train_vort.sh
```
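
Intuitively, the vortex particles inject localized rotational motion that a smooth velocity field tends to miss. As a rough illustration only, the velocity induced by a set of particles can be written as a regularized Biot-Savart sum over particle positions and strengths (both hypothetical learnable parameters here, not the repository's actual parameterization):
```python
# Illustrative regularized Biot-Savart sum; particle positions and strengths
# are hypothetical learnable parameters, not the repository's variables.
import torch

def vortex_velocity(query_pts, particle_pos, particle_strength, eps=1e-2):
    # query_pts: (N, 3); particle_pos: (P, 3); particle_strength: (P, 3) vorticity.
    r = query_pts[:, None, :] - particle_pos[None, :, :]      # (N, P, 3)
    dist2 = (r ** 2).sum(-1, keepdim=True) + eps ** 2         # smoothing avoids the singularity
    omega = particle_strength[None].expand_as(r)              # broadcast strengths to (N, P, 3)
    induced = torch.cross(omega, r / dist2 ** 1.5, dim=-1)    # omega x r / |r|^3
    return induced.sum(dim=1) / (4.0 * torch.pi)              # total induced velocity, (N, 3)
```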
The results will be saved in `./logs/exp_real`. With the learned hybrid neural fluid fields, you can re-simulate the fluid by using the velocity fields to advect density:
```bash
bash scripts/test_resim.sh
```
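
Advecting density with a velocity field can be done with a semi-Lagrangian backtrace: the density arriving at a point is read off at the location the flow carried it from. A minimal sketch, assuming query functions shaped like the hypothetical ones above (not the repository's resimulation code):
```python
# Illustrative semi-Lagrangian advection step; not the actual resimulation code.
import torch

def advect_density(density_fn, velocity_fn, xyz, t, dt):
    # Backtrace: density at xyz at time t + dt comes from xyz - u * dt at time t.
    t_col = torch.full_like(xyz[..., :1], t)
    u = velocity_fn(torch.cat([xyz, t_col], dim=-1))
    xyz_prev = xyz - dt * u
    return density_fn(torch.cat([xyz_prev, t_col], dim=-1))
```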
Alternatively, you can predict future states by extrapolating the velocity fields:
```bash
bash scripts/test_future_pred.sh
```
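
Beyond the captured frames there is no observed density, so one simple extrapolation strategy (illustrated below; not necessarily what the script implements) is to hold the velocity field at the last observed time and backtrace future query points through it:
```python
# Illustrative rollout under a frozen velocity field; assumes the hypothetical
# density_fn / velocity_fn interfaces sketched above.
import torch

def future_density(density_fn, velocity_fn, xyz, t_last, num_steps, dt):
    t_col = torch.full_like(xyz[..., :1], t_last)
    pts = xyz
    for _ in range(num_steps):
        u = velocity_fn(torch.cat([pts, t_col], dim=-1))  # velocity held at t_last
        pts = pts - dt * u                                # trace one step backwards
    return density_fn(torch.cat([pts, t_col], dim=-1))    # read density at t_last
```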

## Citation
If you find this code useful for your research, please cite our paper:
```
@article{yu2023inferring,
  title={Inferring Hybrid Neural Fluid Fields from Videos},
  author={Yu, Hong-Xing and Zheng, Yang and Gao, Yuan and Deng, Yitong and Zhu, Bo and Wu, Jiajun},
  journal={NeurIPS},
  year={2023}
}
```
Binary file added assets/demo_hyfluid.gif
8 changes: 8 additions & 0 deletions configs/scalarflowreal.txt
@@ -0,0 +1,8 @@
expname = scalarflowreal
basedir = ./logs
datadir = ./data/ScalarReal

N_samples = 192
N_rand = 1024

half_res = True
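
These keys follow the usual NeRF-style naming, which suggests `N_samples` points per ray, `N_rand` rays per training batch, and `half_res` for training at half the video resolution; the training code is authoritative for their exact meaning. A minimal standalone parser for this `key = value` format could look like:
```python
# Minimal parser for the "key = value" config format above; the repository
# likely uses a library such as configargparse, so treat this as a stand-in.
def load_config(path):
    cfg = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, value = (s.strip() for s in line.split("=", 1))
            cfg[key] = value
    return cfg

cfg = load_config("configs/scalarflowreal.txt")
print(cfg["N_samples"], cfg["half_res"])   # values are returned as strings
```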
231 changes: 231 additions & 0 deletions data/ScalarReal/info.json
@@ -0,0 +1,231 @@
{
"train_videos": [
{
"file_name": "train00.mp4",
"frame_rate": 30,
"frame_num": 120,
"camera_angle_x": 0.40746459248665245,
"camera_hw": [
1920,
1080
],
"transform_matrix": [
[
0.48627835512161255,
-0.24310240149497986,
-0.8393059968948364,
-0.7697111964225769
],
[
-0.01889985240995884,
0.9573688507080078,
-0.2882491946220398,
0.013170702382922173
],
[
0.8735995292663574,
0.15603208541870117,
0.4609531760215759,
0.3249526023864746
],
[
0.0,
0.0,
0.0,
1.0
]
]
},
{
"file_name": "train01.mp4",
"frame_rate": 30,
"frame_num": 120,
"camera_angle_x": 0.39413608028840563,
"camera_hw": [
1920,
1080
],
"transform_matrix": [
[
0.8157652020454407,
-0.1372431218624115,
-0.5618642568588257,
-0.39192497730255127
],
[
-0.04113851860165596,
0.9552109837532043,
-0.2930521070957184,
0.010452679358422756
],
[
0.5769183039665222,
0.262175977230072,
0.7735819220542908,
0.8086869120597839
],
[
0.0,
0.0,
0.0,
1.0
]
]
},
{
"file_name": "train03.mp4",
"frame_rate": 30,
"frame_num": 120,
"camera_angle_x": 0.41320072172607875,
"camera_hw": [
1920,
1080
],
"transform_matrix": [
[
0.8836436867713928,
0.15215487778186798,
0.44274458289146423,
0.8974969983100891
],
[
-0.021659603342413902,
0.9579861760139465,
-0.28599533438682556,
0.02680988796055317
],
[
-0.46765878796577454,
0.24312829971313477,
0.8498140573501587,
0.8316138386726379
],
[
0.0,
0.0,
0.0,
1.0
]
]
},
{
"file_name": "train04.mp4",
"frame_rate": 30,
"frame_num": 120,
"camera_angle_x": 0.40746459248665245,
"camera_hw": [
1920,
1080
],
"transform_matrix": [
[
0.6336104273796082,
0.20118704438209534,
0.7470352053642273,
1.2956339120864868
],
[
0.014488859102129936,
0.9623404741287231,
-0.27146074175834656,
0.02436656318604946
],
[
-0.7735165357589722,
0.1828240603208542,
0.6068339943885803,
0.497546911239624
],
[
0.0,
0.0,
0.0,
1.0
]
]
}
],
"test_videos": [
{
"file_name": "train02.mp4",
"frame_rate": 30,
"frame_num": 120,
"camera_angle_x": 0.41505697544547304,
"camera_hw": [
1920,
1080
],
"transform_matrix": [
[
0.999511182308197,
-0.0030406631994992495,
-0.03111351653933525,
0.2844361364841461
],
[
-0.005995774641633034,
0.9581364989280701,
-0.2862490713596344,
0.011681094765663147
],
[
0.03068138100206852,
0.28629571199417114,
0.9576499462127686,
0.9857829809188843
],
[
0.0,
0.0,
0.0,
1.0
]
]
}
],
"frame_bkg_color": [
0.0,
0.0,
0.0
],
"voxel_scale": [
0.4909,
0.73635,
0.4909
],
"voxel_matrix": [
[
7.549790126404332e-08,
0.0,
1.0,
0.081816665828228
],
[
0.0,
1.0,
0.0,
-0.044627271592617035
],
[
-1.0,
0.0,
7.549790126404332e-08,
-0.004908999893814325
],
[
0.0,
0.0,
0.0,
1.0
]
],
"render_center":[
0.3382070094283088,
0.38795384153014023,
-0.2609209839653898
],
"near":1.1,
"far":1.5,
"phi":20.0,
"rot":"Y"
}
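
The per-video entries resemble the NeRF-synthetic camera format. Assuming `camera_angle_x` is the horizontal field of view, `camera_hw` is `[height, width]`, and `transform_matrix` is a camera-to-world pose (conventions inferred, not confirmed by the data loader), the intrinsics can be recovered as follows:
```python
# Illustrative reader for info.json; the [height, width] ordering and the
# camera-to-world interpretation are assumptions based on NeRF-style datasets.
import json
import numpy as np

with open("data/ScalarReal/info.json") as f:
    info = json.load(f)

cam = info["train_videos"][0]
H, W = cam["camera_hw"]
focal = 0.5 * W / np.tan(0.5 * cam["camera_angle_x"])   # pinhole focal length in pixels
c2w = np.array(cam["transform_matrix"])                  # 4x4 pose matrix
origin = c2w[:3, 3]                                      # camera position in world space
view_dir = -c2w[:3, 2]                                   # looking down the camera -z axis
print(focal, origin, view_dir)
```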
Binary file added data/ScalarReal/train00.mp4
Binary file not shown.
Binary file added data/ScalarReal/train01.mp4
Binary file not shown.
Binary file added data/ScalarReal/train02.mp4
Binary file not shown.
Binary file added data/ScalarReal/train03.mp4
Binary file not shown.
Binary file added data/ScalarReal/train04.mp4
Binary file not shown.