CODE_OF_CONDUCT.md (+2 −1)
@@ -1,6 +1,7 @@
+
 # Code of Conduct

--**Purpose**: The purpose of this Code of Conduct is to establish a welcoming and inclusive community around the `Mambular` project. We want to foster an environment where everyone feels respected, valued, and able to contribute to the project.
+-**Purpose**: The purpose of this Code of Conduct is to establish a welcoming and inclusive community around the `STREAM` project. We want to foster an environment where everyone feels respected, valued, and able to contribute to the project.

 -**Openness and Respect**: We strive to create an open and respectful community where everyone can freely express their opinions and ideas. We encourage constructive discussions and debates, but we will not tolerate any form of harassment, discrimination, or disrespectful behavior.
README.md (+18 −17)
@@ -19,7 +19,7 @@
 <h1>Mambular: Tabular Deep Learning Made Simple</h1>
 </div>

-Mambular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP inspired tabular models.
+Mambular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP inspired tabular models.

 <h3> Table of Contents </h3>
@@ -66,6 +66,7 @@ Mambular is a Python package that brings the power of advanced deep learning arc
 |`TabulaRNN`| A Recurrent Neural Network for Tabular data, introduced [here](https://arxiv.org/pdf/2411.17207). |
 |`MambAttention`| A combination between Mamba and Transformers, also introduced [here](https://arxiv.org/pdf/2411.17207). |
 |`NDTF`| A neural decision forest using soft decision trees. See [Kontschieder et al.](https://openaccess.thecvf.com/content_iccv_2015/html/Kontschieder_Deep_Neural_Decision_ICCV_2015_paper.html) for inspiration. |
+|`SAINT`| Improves neural networks via Row Attention and Contrastive Pre-Training, introduced [here](https://arxiv.org/pdf/2106.01342). |
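
The models in the table above are all used through the same high-level interface. Below is a minimal sketch of that usage; the `mambular.models` import path, the scikit-learn-style `fit`/`predict` calls, and the `max_epochs` argument are assumptions not shown in this diff:

```python
# Sketch only: assumes mambular exposes sklearn-style estimators under
# mambular.models; any model from the table above could be swapped in.
import numpy as np
from mambular.models import MambularClassifier  # assumed import path

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))            # toy numerical features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy binary target

model = MambularClassifier()             # default hyperparameters
model.fit(X, y, max_epochs=10)           # 'max_epochs' is an assumed fit argument
preds = model.predict(X)
print("train accuracy:", (preds == y).mean())
```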
@@ -90,7 +91,7 @@ If you want to use the original mamba and mamba2 implementations, additionally i
 pip install mamba-ssm
 ```

-Be careful to use the correct torch and cuda versions:
+Be careful to use the correct torch and cuda versions:
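
Before installing `mamba-ssm`, it can help to confirm which torch and CUDA builds are already present. This generic check is not part of the README:

```python
# Generic version check (not from the README): confirm the installed torch build
# and the CUDA toolkit it was compiled against before adding mamba-ssm.
import torch

print(torch.__version__)          # e.g. "2.4.0+cu121"
print(torch.version.cuda)         # CUDA version torch was built with, or None for CPU-only builds
print(torch.cuda.is_available())  # whether a usable GPU is visible
```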
@@ -115,7 +116,7 @@ Mambular simplifies data preprocessing with a range of tools designed for easy t
 -**Polynomial Features**: Automatically generates polynomial and interaction terms for numerical features, enhancing the ability to capture higher-order relationships.
 -**Box-Cox & Yeo-Johnson Transformations**: Performs power transformations to stabilize variance and normalize distributions.
 -**Custom Binning**: Enables user-defined bin edges for precise discretization of numerical data.

Note that, using this, you can also optimize the preprocessing. Just use the prefix ``prepro__`` when specifying the preprocessor arguments you want to optimize, as in the sketch below:
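
For illustration, a hypothetical search space using that prefix might look like the following. Only the ``prepro__`` convention is taken from the text above; the specific argument names and the tuning call are assumptions:

```python
# Hypothetical search space: only the "prepro__" prefix convention comes from the
# README; the argument names and the tuning routine below are assumptions.
param_space = {
    "d_model": [32, 64, 128],         # model hyperparameter, passed through as-is
    "lr": [1e-4, 1e-3],               # training hyperparameter
    "prepro__n_bins": [20, 50, 100],  # routed to the preprocessor via the prefix
    "prepro__numerical_preprocessing": ["ple", "standardization"],  # assumed option names
}

# run_search(model, param_space, X, y)  # placeholder for whatever tuning routine you use
```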