3D-PRNN

Torch implementation of our ICCV 2017 paper: "3D-PRNN: Generating Shape Primitives with Recurrent Neural Networks"

Prerequisites

Data

  • Download the primitive data to the current folder

This includes our ground truth primitives (folder "prim_gt") and the original ModelNet meshes (folder "ModelNet10_mesh").

Train

  • For shape generation from scratch:
th driver.lua
  • For shape generation conditioned on a single depth map:
th driver_depth.lua

Generation

  • For shape generation from scratch:
th testNet_3dp.lua
  • For shape generation conditioned on a single depth map:
th testNet_3dp_depth.lua

Visualization

  • To visualize the ground truth primitives, run visualizeGTPrimitive.m in Matlab
  • To visualize sampled shape generations, run visualizeRandomGeneration.m
  • To visualize sampled shape generations conditioned on depth, run visualizeDepthReconGeneration.m

Primitive ground truth

  • See ./matlab/ folder

Note

For shape generation conditioned on depth, as explained in Sec. 5.1 of the paper, we perform a nearest-neighbor query on the encoded feature of the depth map to retrieve the most similar shape in the training set, and use its primitive configuration as the initial state for the RNN. For convenience, we include our pre-computed initial configurations for each test class in the folder "data/sample_generation".
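
For reference, a minimal Matlab sketch of that retrieval step is shown below. The variable names (trainFeats, testFeat, trainConfigs) are hypothetical placeholders rather than names used in this codebase, and the sketch assumes the encoded depth features are stored as one row per training shape.

% Hedged sketch: nearest-neighbor retrieval over encoded depth features.
% trainFeats: N x D training features, testFeat: 1 x D test feature,
% trainConfigs: N x P primitive configurations (all names are placeholders).
dists = sqrt(sum(bsxfun(@minus, trainFeats, testFeat).^2, 2));  % L2 distance to each training shape
[~, nnIdx] = min(dists);                                        % most similar training shape
initConfig = trainConfigs(nnIdx, :);                            % configuration used to seed the RNN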

Primitive parsing

We provide in the matlab folder the demo code (demo.m) to parse a single primitive. To sequentially parse primitives in batch, see "script_parse_primitive.m". After each run of "script_parse_primitive.m", run "script_parse_primitive_symmetry.m" to recover the symmetric primitives. After every three parses, run "script_refine_parse_primitive.m" to refine the parsed primitives.
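
Read as a loop, the batch workflow above looks roughly like the sketch below; the scripts are the ones named in this section, and the iteration count of three is illustrative only.

% Rough sketch of the batch-parsing workflow described above.
for k = 1:3
    script_parse_primitive;            % parse one more primitive per shape
    script_parse_primitive_symmetry;   % recover its symmetric counterpart
end
script_refine_parse_primitive;         % refine after every three parses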

Citation

@inproceedings{zou20173d,
  title={3d-prnn: Generating shape primitives with recurrent neural networks},
  author={Zou, Chuhang and Yumer, Ersin and Yang, Jimei and Ceylan, Duygu and Hoiem, Derek},
  booktitle={The IEEE International Conference on Computer Vision (ICCV)},
  year={2017}
}

Acknowledgement
