diff --git a/.gitignore b/.gitignore
index 2f19259..8083f2b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -159,3 +159,4 @@ cython_debug/
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 #.idea/
 .vscode
+.DS_Store
diff --git a/README.md b/README.md
index cd1e5c5..89e4682 100644
--- a/README.md
+++ b/README.md
@@ -2,20 +2,18 @@
 Extension for 3D Slicer for bone mesh morphing.
 
-At the moment, this module specializes for the *humerus* bone.
+At the moment, this module specializes in the *humerus* bone, but its use is not limited to it.
 
-## Special thanks <3
-Special thanks goes to my wonder colleague Eva C. Herbst (@evaherbst) and Arthur Port (@agporto) for creating the initial idea and their huge help during the development of this module!
+## Special thanks
+Special thanks go to my wonderful colleagues Eva C. Herbst (@evaherbst) and Arthur Porto (@agporto) for creating the initial idea and for their huge help during the development of this module!
 
 Also, I would like to thank O. Hirose (@ohirose) for the research on BCPD/GBCPD and its implementation (available [here](https://github.com/ohirose/bcpd)).
 
-
 ## Installation
 
 **Supported platforms:**
 - Linux (x86_64)
 - Windows (x86_64)
 - macOS (both x86_64 and ARM; Slicer runs through Rosetta on ARM-based Macs)
-
 Steps:
 - Download the latest ZIP package from Releases
 - Extract the ZIP contents to your desired folder
@@ -40,70 +38,68 @@ The UI consists of **4** main sections
 - Postprocessing
 
 ## Architecture
-
-To be added.
+<div align="center">

+  <img src="docs/assets/module_diagram.svg" alt="Module architecture diagram">
+</div>
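+
+For orientation, a minimal sketch of the three classes shown in the diagram, assuming Slicer's standard scripted-module base classes; the method bodies are illustrative only:
+
+```python
+from slicer.ScriptedLoadableModule import (
+    ScriptedLoadableModule,
+    ScriptedLoadableModuleWidget,
+    ScriptedLoadableModuleLogic,
+)
+
+class SlicerBoneMorphing(ScriptedLoadableModule):
+    """Module descriptor: title, category and other metadata."""
+    def __init__(self, parent):
+        ScriptedLoadableModule.__init__(self, parent)
+        self.parent.title = "SlicerBoneMorphing"
+
+class SlicerBoneMorphingWidget(ScriptedLoadableModuleWidget):
+    """UI layer: builds the panel and forwards parameters to the logic."""
+    def setup(self):
+        ScriptedLoadableModuleWidget.setup(self)
+        self.logic = SlicerBoneMorphingLogic(self)  # the widget instantiates the logic
+
+class SlicerBoneMorphingLogic(ScriptedLoadableModuleLogic):
+    """Processing layer: preprocessing, registration, BCPD, postprocessing."""
+    def generate_model(self, source_model, target_model, parameters):
+        ...
+```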
 ## Module sections
 
 ### Input section
 This section is self-explanatory. Here, you choose two input models:
-- Source = Source for the generation; This is the model that represents the
-- Target = Model which is non-complete => Needs its missing portions generated
+- Source = the mean model, i.e. a full humerus
+- Target = the partial model to be reconstructed
 
 ### Preprocessing section
 
-#### Point cloud preprocessing
-Before the generation process, we usually want to preprocess the model.
-Reasons could be to remove unwanted **outliers** or to smooth out the models.
-First of all, the model is converted to a *point cloud* to be able to effectively perform some preprocessing steps.
-
-Then, a downsampling is performed. For this, you can configure the threshold for downsampling by the following parameter:
+Before the generation process, we want to preprocess the models.
+The first step is an optional downsampling, controlled by the following parameter:
 - **Downsampling distance threshold**
-
-After the downsampling, we compute the normals of the point cloud. It uses a radius for which the normals are calculated and maximum number of neighbours. This can be adjusted with the following parameter:
-- **Normals estimation radius**
-- **Normals estimation max neighbours**
-
-Also, we want calculate a *Fast point feature histogram* to somehow encode the local geometric properties. This method takes in two following parameters:
-- **FPFH search radius** - Radius in which the FPFH is calculated in
-- **FPFH max neighbours** - Maximum number of neighbours taken into account
+  - If set to 0.0, no downsampling is performed
+
+After the downsampling, we compute the normals of the point cloud.
+The computation needs a search radius within which the normals are estimated and a maximum number of neighbours.
+These can be adjusted with the following parameters:
+- **Normals estimation radius** - maximum radius in which points are considered neighbouring
+- **Normals estimation max neighbours** - maximum number of neighbours taken into account
+
+Also, we need to calculate a *Fast Point Feature Histogram* (FPFH) in order to encode the local geometric properties of the models.
+This method uses the following parameters (see the sketch after this list):
+- **FPFH search radius** - maximum radius in which points are considered neighbouring
+- **FPFH max neighbours** - maximum number of neighbours taken into account
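+
+Putting the steps together, a minimal Open3D sketch of the preprocessing stage. It mirrors the module's `__preprocess_point_cloud` method (its diff appears later in this changeset); the standalone function form is illustrative:
+
+```python
+import open3d as o3d
+
+def preprocess_point_cloud(pcd: o3d.geometry.PointCloud,
+                           downsampling_distance_threshold: float,
+                           normals_estimation_radius: float, max_nn_normals: int,
+                           fpfh_estimation_radius: float, max_nn_fpfh: int):
+    # Optional voxel downsampling; a threshold of 0.0 keeps the full cloud
+    if downsampling_distance_threshold > 0.0:
+        pcd = pcd.voxel_down_sample(downsampling_distance_threshold)
+
+    # Per-point normals, limited by search radius and neighbour count
+    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(
+        radius=normals_estimation_radius, max_nn=max_nn_normals))
+
+    # FPFH descriptors encoding the local geometry around each point
+    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
+        pcd, o3d.geometry.KDTreeSearchParamHybrid(
+            radius=fpfh_estimation_radius, max_nn=max_nn_fpfh))
+    return pcd, fpfh
+```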
-#### Registration section
-At this moment, we have our point clouds preprocessed and ready for the next step, which is the "registration" section.
-Here we try to define and calculate how and how much we need to adjust the source mesh to match the target one.
-
-For this, we will use the downsampled point clouds with their corresponding FPFHs from the previous step.
-The concrete method we use is called **RANSAC**. It uses a process called "repeated random sub-sampling" to mitigate the effect of outliers and rotational differences as much as possible.
+#### Registration
+At this moment, we have our models preprocessed and ready for the next step, which is the registration.
+Here we calculate a rigid alignment that pre-aligns the source to the target.
+The concrete method we use is called **RANSAC** (random sample consensus).
 
 The behaviour of this algorithm can be adjusted by the following parameters:
 - **Max iterations**
-- **Distance threshold** - same meaning as in previous steps
-- **Fitness threshold** - the lowest fitness between the point clouds to be accepted. The lower, the higher chance of finding a good fit. The higher, higher the chance that either *max iterations* are reached
+- **Distance threshold** - maximum distance in which points are considered neighbouring
+- **Fitness threshold** - the lowest fitness between the models to be accepted
 
-The result of the *RANSAC* algorithm is a bit "raw". To get the best possible fit, we perform the **ICP registration algorithm** upon the result.
+The fit computed by the RANSAC algorithm is a bit "raw". To refine it further, we run the **ICP** (Iterative Closest Point) algorithm.
 This can be tuned by the following parameter (see the sketch below):
-- **ICP Distance threshold** - same meaning as in previous steps
+- **ICP Distance threshold** - maximum distance in which points are considered neighbouring
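+
+A minimal Open3D sketch of this two-stage rigid alignment; the module's fitness-threshold retry loop is omitted, and the exact estimation settings shown here are assumptions:
+
+```python
+import open3d as o3d
+
+def rigid_align(source_down, target_down, source_fpfh, target_fpfh,
+                distance_threshold: float, max_iterations: int):
+    # Coarse global alignment: RANSAC over FPFH feature correspondences
+    ransac = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
+        source_down, target_down, source_fpfh, target_fpfh,
+        mutual_filter=True,
+        max_correspondence_distance=distance_threshold,
+        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
+        ransac_n=3,
+        checkers=[],
+        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(max_iterations, 0.999))
+
+    # Fine refinement: point-to-plane ICP seeded with the RANSAC transform
+    icp = o3d.pipelines.registration.registration_icp(
+        source_down, target_down, distance_threshold,
+        ransac.transformation,
+        o3d.pipelines.registration.TransformationEstimationPointToPlane())
+    return icp.transformation
+```
+
+Running RANSAC on the downsampled clouds keeps the global search tractable; ICP then only has to correct a small residual error.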
 
-### Generation section
-Since we now have a preprocessed meshes and with defined transformations from the *source* to the *target*, we can proceed to the **generation section**.
-
-For this purpose, we use a method called [Bayesian coherent point drift](https://github.com/ohirose/bcpd). It falls into the *non-rigid registration* category of algorithms, which actually performs the deformation of the mesh to increase the fit of the source.
-It takes in both meshes and deforms the source into the target, similarly as we've already done in the [Registration section](#preprocessing-section).
-Due to the problem that BCPD allows for "unrealistic" deformations, we have done the pre-registration steps, which lets us mitigate the chance of getting into unrealistic deformations.
-
-Now, BCPD allows for very fine adjustments of its behaviours using lots of different parameters. For the exact description of their effects, please refer to the documentation [here](https://github.com/ohirose/bcpd/blob/master/README.md).
-
-> **Note: You do NOT have to perform any kind of installation process, the BCPD and its geodesic variant are already pre-built and preconfigured for immediate using in this module.**
+### Reconstruction section
+Since we now have the meshes preprocessed and the transformation from the *source* to the *target* defined, we can proceed to the **reconstruction section**.
+For the reconstruction we use the **BCPD** (Bayesian coherent point drift) algorithm.
+BCPD allows for very fine adjustment of its behaviour through a large number of parameters.
+For the exact description of their effects, please refer to the official documentation [here](https://github.com/ohirose/bcpd/blob/master/README.md); an illustrative invocation sketch follows below.
+
+> **Note: You do NOT have to perform any kind of installation process; BCPD and its geodesic variant are already pre-built and preconfigured for immediate use in this module.**
 
 **Not implemented options:**
 - Terminal output
 - File output
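+
+For illustration only, this is roughly how a BCPD run can be driven from Python. The file names and the `-w` value are made up, and the module assembles the actual flag list internally from the UI settings; the authoritative flag reference is the upstream README:
+
+```python
+import subprocess
+
+# BCPD is a standalone executable: -x is the target point set, -y the source
+# (the set that gets deformed), -w the expected outlier fraction (omega),
+# and -o a prefix for the result files.
+subprocess.run([
+    "./bcpd",
+    "-x", "target_points.txt",
+    "-y", "source_points.txt",
+    "-w", "0.1",
+    "-o", "result_",
+], check=True)
+# The deformed source is then read back from the result files and meshed.
+```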
 ### Postprocessing section
-After our models have been merged successfully, we still want to apply a slight amount of postprocessing to reach the most optimal results.
-We are basically using a bit of **filtering and smoothing** to the meshes.
+After the model is reconstructed, we include a postprocessing section for slightly modifying the result, if necessary.
 For this, we let you modify the following parameters (see the sketch below):
-- **Clustering scaling** - Scaled size of voxel for within vertices that are clustered together (additionally refer to [here](http://www.open3d.org/docs/0.7.0/python_api/open3d.geometry.simplify_vertex_clustering.html)
+- **Clustering scaling**
+  - Scaled voxel size within which vertices are clustered together (see also the Open3D documentation [here](http://www.open3d.org/docs/0.7.0/python_api/open3d.geometry.simplify_vertex_clustering.html))
+  - If set to 1.0, no scaling is performed
 - **Smoothing iterations** - Number of iterations of mesh smoothing
+  - If set to 0, no smoothing is applied
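+
+A sketch of what such postprocessing can look like in Open3D. How the module derives the voxel size from **Clustering scaling** is an assumption here, and the choice of smoothing filter is illustrative as well:
+
+```python
+import open3d as o3d
+
+def postprocess(mesh: o3d.geometry.TriangleMesh,
+                clustering_scaling: float,
+                smoothing_iterations: int) -> o3d.geometry.TriangleMesh:
+    # Merge vertices that fall into the same voxel of the clustering grid
+    if clustering_scaling > 1.0:
+        voxel_size = max(mesh.get_max_bound() - mesh.get_min_bound()) / clustering_scaling
+        mesh = mesh.simplify_vertex_clustering(voxel_size)
+
+    # Simple neighbourhood-averaging smoothing; 0 iterations skips this step
+    if smoothing_iterations > 0:
+        mesh = mesh.filter_smooth_simple(number_of_iterations=smoothing_iterations)
+        mesh.compute_vertex_normals()
+    return mesh
+```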
 After the whole process is done, both the generated mesh (source transformed into target, standalone) and the merged mesh (generated mesh merged with the target; "combined model") are imported back into the current Slicer scene.
@@ -111,14 +107,8 @@ After the whole process is done, both the generated mesh (source transformed int
 
-## FAQ
-To be added.
-
-## Troubleshooting
-To be added.
-
 ## Contributors
-
+A huge thank you to all of the contributors!
diff --git a/SlicerBoneMorphing/Resources/Samples/O35 M humerus.stl b/SlicerBoneMorphing/Resources/Samples/testing_humerus_full.stl
similarity index 100%
rename from SlicerBoneMorphing/Resources/Samples/O35 M humerus.stl
rename to SlicerBoneMorphing/Resources/Samples/testing_humerus_full.stl
diff --git a/SlicerBoneMorphing/Resources/Samples/U35 F humerus partial.stl b/SlicerBoneMorphing/Resources/Samples/testing_humerus_partial.stl
similarity index 100%
rename from SlicerBoneMorphing/Resources/Samples/U35 F humerus partial.stl
rename to SlicerBoneMorphing/Resources/Samples/testing_humerus_partial.stl
diff --git a/SlicerBoneMorphing/src/logic/SlicerBoneMorphingLogic.py b/SlicerBoneMorphing/src/logic/SlicerBoneMorphingLogic.py
index 22511a6..fb6abce 100644
--- a/SlicerBoneMorphing/src/logic/SlicerBoneMorphingLogic.py
+++ b/SlicerBoneMorphing/src/logic/SlicerBoneMorphingLogic.py
@@ -283,13 +283,14 @@ def __preprocess_point_cloud(
         FPFH: open3d.pipelines.registration.Feature]
         '''
 
-        pcd_downsampled: o3d.geometry.PointCloud = pcd.voxel_down_sample(downsampling_distance_threshold)
+        if downsampling_distance_threshold > 0.0:
+            pcd = pcd.voxel_down_sample(downsampling_distance_threshold)
 
-        pcd_downsampled.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=normals_estimation_radius, max_nn=max_nn_normals))
+        pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=normals_estimation_radius, max_nn=max_nn_normals))
 
-        pcd_fpfh = o3d.pipelines.registration.compute_fpfh_feature(pcd_downsampled, o3d.geometry.KDTreeSearchParamHybrid(radius=fpfh_estimation_radius, max_nn=max_nn_fpfh))
+        pcd_fpfh = o3d.pipelines.registration.compute_fpfh_feature(pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=fpfh_estimation_radius, max_nn=max_nn_fpfh))
 
-        return pcd_downsampled, pcd_fpfh
+        return pcd, pcd_fpfh
 
     def __ransac_pcd_registration(
             self,
diff --git a/docs/assets/module_diagram.svg b/docs/assets/module_diagram.svg
new file mode 100644
index 0000000..eedf736
--- /dev/null
+++ b/docs/assets/module_diagram.svg
@@ -0,0 +1,3 @@
+[SVG source omitted. The file renders a UML class diagram: SlicerBoneMorphing, SlicerBoneMorphingWidget and SlicerBoneMorphingLogic extend Slicer's ScriptedLoadableModule, ScriptedLoadableModuleWidget and ScriptedLoadableModuleLogic respectively;
+the widget instantiates the logic, which exposes generate_model() plus private conversion, preprocessing, RANSAC, deformable-registration and postprocessing methods; supporting enums (BcpdKernelType, BcpdStandardKernel, BcpdAccelerationMode, BcpdNormalizationOptions) are used throughout,
+with the aliases TriangleMesh, PointCloud, RegistrationResult and Feature referring to the corresponding open3d.geometry and open3d.pipelines.registration classes.]
\ No newline at end of file