Description
Machine learning packages such as scikit-learn and mlr3 allow mtry/max_features-type hyperparameters to be supplied (and tuned) as a proportion of the available features ('max_features' in sklearn and 'mtry_ratio' in mlr3).
Currently, parsnip appears to require mtry to be supplied as an integer number of features for tree-based models, with mtry needing to be finalized if this is unknown. Maybe I have missed something, but this seems problematic within a workflow whose preceding recipe steps add or select features, because the total number of available features is then unknown. I couldn't see a way of ensuring a proper range for mtry when tuning hyperparameters as part of a workflow that contains such recipe steps. Currently, the only option appears to be setting the upper bound of mtry to the maximum possible number of features and letting tune/ranger warn that mtry has exceeded the number of features available.
Although most random forest implementations in R seem to only allow mtry as a number of features rather than a proportion, mlr3 appears to handle this in its fit method via the mtry_ratio hyperparameter. Would this be possible in parsnip?
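To illustrate the idea, the ratio-to-count conversion could be resolved at fit time, once the post-preprocessing feature count is known. This is only a hypothetical sketch (the function name `mtry_from_ratio` and the rounding rule are my own, not taken from parsnip, mlr3, or scikit-learn), written in Python for brevity:

```python
def mtry_from_ratio(mtry_ratio, n_features):
    """Convert a proportion of features into a concrete mtry count.

    Hypothetical helper mirroring the idea behind a ratio-style
    hyperparameter: the ratio is resolved against the number of
    features actually present at fit time, so it stays valid even
    when preprocessing steps add or drop columns.
    """
    if not 0.0 < mtry_ratio <= 1.0:
        raise ValueError("mtry_ratio must be in (0, 1]")
    # Round to the nearest whole feature, but never go below 1.
    return max(1, round(mtry_ratio * n_features))

# The same tuned ratio adapts to however many features survive preprocessing:
print(mtry_from_ratio(0.5, 10))  # 5
print(mtry_from_ratio(1.0, 8))   # 8
```

Because the conversion happens inside the fit method, the tuning range for the ratio is always a fixed (0, 1], and the "mtry exceeds number of features" warning can never arise.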