GeoCode: A Technique for Interpretable 3D Shape Synthesis
GeoCode: Interpretable Shape Programs
We present GeoCode, a technique for 3D shape synthesis using an intuitively editable parameter space.
We build a novel program that enforces a complex set of rules and enables users to perform intuitive and controlled high-level edits that procedurally propagate at a low level to the entire shape.
Our program produces high-quality mesh outputs by construction.
Once produced by our procedural program, shapes can be easily modified.
We use a neural network to map a given point cloud or sketch to our interpretable parameter space. We show that our program can infer and recover 3D shapes more accurately than existing techniques, and we demonstrate its ability to perform controlled local and global shape manipulations.
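To make the idea concrete, below is a minimal sketch of a procedural program driven by an interpretable parameter space. The chair construction, the parameter names (seat_width, leg_thickness, has_back, and so on), and the box-based geometry are illustrative assumptions, not the actual GeoCode program, which encodes far richer rules and shape variations.

```python
# Minimal sketch of a procedural shape program controlled by interpretable
# parameters. The parameter set and the box-based chair are assumptions for
# illustration only.
from dataclasses import dataclass

import numpy as np


@dataclass
class ChairParams:
    seat_width: float = 0.5      # continuous parameters
    seat_depth: float = 0.5
    seat_height: float = 0.45
    leg_thickness: float = 0.05
    back_height: float = 0.4
    has_back: bool = True        # discrete parameter


def _box(center, size):
    """Return (vertices, quad faces) of an axis-aligned box."""
    cx, cy, cz = center
    sx, sy, sz = (s / 2.0 for s in size)
    v = np.array([[cx + dx * sx, cy + dy * sy, cz + dz * sz]
                  for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)])
    f = np.array([[0, 1, 3, 2], [4, 6, 7, 5], [0, 4, 5, 1],
                  [2, 3, 7, 6], [0, 2, 6, 4], [1, 5, 7, 3]])
    return v, f


def build_chair(p: ChairParams):
    """Translate interpretable parameters into a mesh.

    A high-level edit (e.g. increasing seat_width) propagates to every part
    that depends on it, so the output stays structurally consistent by
    construction.
    """
    parts = []
    # Seat slab sits on top of the legs.
    parts.append(_box((0, 0, p.seat_height), (p.seat_width, p.seat_depth, 0.05)))
    # Four legs placed at the seat corners, derived from the same parameters.
    for sx in (-1, 1):
        for sy in (-1, 1):
            x = sx * (p.seat_width / 2 - p.leg_thickness / 2)
            y = sy * (p.seat_depth / 2 - p.leg_thickness / 2)
            parts.append(_box((x, y, p.seat_height / 2),
                              (p.leg_thickness, p.leg_thickness, p.seat_height)))
    # Optional backrest controlled by a discrete parameter.
    if p.has_back:
        parts.append(_box((0, -p.seat_depth / 2 + 0.025,
                           p.seat_height + p.back_height / 2),
                          (p.seat_width, 0.05, p.back_height)))
    # Merge the parts into a single vertex/face list.
    verts, faces, offset = [], [], 0
    for v, f in parts:
        verts.append(v)
        faces.append(f + offset)
        offset += len(v)
    return np.vstack(verts), np.vstack(faces)


# A single high-level edit regenerates the whole shape consistently.
mesh_vertices, mesh_faces = build_chair(ChairParams(seat_width=0.7, has_back=False))
```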
Authors
Ofek Pearl, Itai Lang, Yuhua Hu, Raymond A. Yeh, Rana Hanocka
We propose a technique for representing and manipulating 3D shapes using an intuitively editable parameter space.
Our human-interpretable parameter space describes a wide range of geometric properties and enables producing broad variations of detailed shapes.
Shapes produced by existing generation techniques, however, often contain undesirable artifacts or even floating parts and are not directly usable in existing 3D modeling and computer-graphics pipelines.
In this work, we leverage procedural methods together with a neural network that learns to map a given point cloud or sketch to the parameter space; our program then translates this parameter representation into a high-quality mesh output by construction.
The resulting shapes are native to conventional 3D modeling pipelines and can be easily mixed and interpolated using their interpretable parameter representation.
We also show that our system generalizes to inputs from different distributions than the training set, such as free-form user-created sketches, sketches generated from images in the wild, noisy point cloud data, real-world point cloud scans, and more.
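As a rough illustration of the mapping stage, here is a minimal PointNet-style sketch in PyTorch that predicts a mix of continuous and discrete parameters from a point cloud. The layer sizes, parameter counts, and the split into a regression head and a classification head are assumptions for illustration, not the network described in the paper.

```python
# Minimal sketch of the point-cloud-to-parameters mapping (PyTorch).
# Architecture and parameter layout are illustrative assumptions.
import torch
import torch.nn as nn


class ParamPredictor(nn.Module):
    def __init__(self, n_continuous=16, n_discrete=4, n_classes=3):
        super().__init__()
        # Shared per-point MLP followed by a global max pool (PointNet-style).
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        # One head regresses normalized continuous parameters in [0, 1];
        # the other classifies each discrete parameter over its options.
        self.continuous_head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, n_continuous), nn.Sigmoid())
        self.discrete_head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, n_discrete * n_classes))
        self.n_discrete, self.n_classes = n_discrete, n_classes

    def forward(self, points):                            # points: (B, N, 3)
        feat = self.point_mlp(points).max(dim=1).values   # global feature (B, 256)
        cont = self.continuous_head(feat)                 # (B, n_continuous)
        disc = self.discrete_head(feat).view(-1, self.n_discrete, self.n_classes)
        return cont, disc                                 # disc: per-parameter logits


# The predicted parameters then feed the procedural program, which rebuilds the mesh.
pred_cont, pred_disc_logits = ParamPredictor()(torch.rand(2, 1024, 3))
```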
Result
In this paper, we present a novel method that represents shapes using a human-interpretable parameter space.
We achieve this by building a procedural program controlled by the intuitive parameter space and training a neural network to predict the parameter representation for an input point cloud or sketch.
We show that our system produces structurally valid 3D geometry and enables easy and intuitive editing of the resulting shape.
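For illustration, the sketch below shows how shapes can be edited, mixed, and interpolated directly in the parameter space. The dictionary-of-parameters representation and the parameter names are hypothetical; any blended or edited parameter set is passed back through the procedural program to obtain the new mesh.

```python
# Minimal sketch of parameter-space editing, mixing, and interpolation,
# assuming shapes are dictionaries of named, normalized parameter values
# (hypothetical names; the real parameter set is defined by the program).
def interpolate(params_a, params_b, t):
    """Linearly blend two shapes in the interpretable parameter space."""
    return {k: (1 - t) * params_a[k] + t * params_b[k] for k in params_a}


def mix(params_a, params_b, keys_from_b):
    """Mix two shapes by copying selected parameters from the second shape."""
    return {k: (params_b if k in keys_from_b else params_a)[k] for k in params_a}


chair_a = {"seat_width": 0.4, "leg_height": 0.45, "back_height": 0.5}
chair_b = {"seat_width": 0.8, "leg_height": 0.40, "back_height": 0.2}

edited = dict(chair_a, seat_width=0.6)              # a single high-level edit
halfway = interpolate(chair_a, chair_b, t=0.5)      # a plausible in-between chair
hybrid = mix(chair_a, chair_b, {"back_height"})     # A's base with B's backrest
```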