raffaello-camoriano/Enitor
Enitor provides MATLAB implementations of several large-scale kernel methods.
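For context on the issues below, kernel regularized least squares (KRLS) is the basic method the library scales up. A minimal NumPy sketch with a Gaussian kernel (illustrative only; function names are hypothetical and not Enitor's MATLAB API):

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krls_fit(X, y, lam=1e-3, sigma=1.0):
    # Solve (K + n * lam * I) c = y for the coefficient vector c.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krls_predict(X_train, c, X_new, sigma=1.0):
    # f(x) = sum_i c_i * k(x, x_i)
    return gaussian_kernel(X_new, X_train, sigma) @ c
```

The n-by-n kernel matrix is what makes the batch solver cubic in n; the Nyström and random-feature variants discussed in the issues below replace K with low-rank approximations.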
Issues
Remove filter hyperparameter cross-validation logic from nystromUniformIncremental
#36 opened by raffaello-camoriano
Decouple loss function from dataset
#37 opened by raffaello-camoriano
obj.errorStorageMode in kigdesc and ksgdesc should be a flag, not a string
#45 opened by raffaello-camoriano
Datasets to be added
#47 opened by raffaello-camoriano
Add option for rank-k update in randomFeaturesGaussianIncremental and nystromUniformIncremental
#48 opened by raffaello-camoriano
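Context for the rank-k option: an incremental Nyström solver can extend a Cholesky factorization by a block of k new landmarks at once instead of one column at a time. A minimal NumPy sketch of such a rank-k factor update (illustrative, not the repository's code):

```python
import numpy as np

def chol_append_block(L, B, C):
    """Rank-k update of a lower Cholesky factor.

    L: m x m lower factor with L @ L.T = A (the current landmark kernel block).
    B: m x k cross block, C: k x k block of the k new landmarks.
    Returns the (m+k) x (m+k) lower factor of [[A, B], [B.T, C]]
    without refactorizing the whole matrix.
    """
    F = np.linalg.solve(L, B).T            # solves L @ F.T = B
    T = np.linalg.cholesky(C - F @ F.T)    # factor of the Schur complement
    m, k = B.shape
    top = np.hstack([L, np.zeros((m, k))])
    bot = np.hstack([F, T])
    return np.vstack([top, bot])
```

A k-column block update costs one m-by-k triangular solve plus a k-by-k factorization, versus k separate rank-1 updates.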
Unify the compute() method of randomFeaturesGaussianIncremental and nystromUniformIncremental
#49 opened by raffaello-camoriano
Fix Hessenberg version of Tikhonov
#11 opened by raffaello-camoriano
Add configuration parameter for validation part in experiment class
#19 opened by raffaello-camoriano
The first guess for m in incremental and batch Nyström should never be 1
#20 opened by raffaello-camoriano
Remove the 'hasRealValues' bottleneck
#21 opened by raffaello-camoriano
Fix nystromUniform interface
#31 opened by raffaello-camoriano
Change name of filter class
#52 opened by raffaello-camoriano
Make bandWaterfall plot function
#54 opened by raffaello-camoriano
Add member functions to dataset class to convert between codings in multiclass scenarios
#55 opened by raffaello-camoriano
Add Falcon
#58 opened by raffaello-camoriano
Add mixUpTrainIdx and mixUpTestIdx to dataset to shuffle indices without resampling them from scratch
#56 opened by raffaello-camoriano
Do not recompute trainval, traintest and traintrain kernels from scratch at every filter iteration
#53 opened by raffaello-camoriano
Incremental RF provides results which are incompatible with batch RF
#34 opened by raffaello-camoriano
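Background for the incompatibility report: batch and incremental random features (RF) should target the same kernel approximation. A minimal NumPy sketch of Rahimi-Recht random Fourier features for the Gaussian kernel (illustrative, not the repository's implementation):

```python
import numpy as np

def random_fourier_features(X, D, sigma=1.0, seed=0):
    # Feature map z with z(x) @ z(y) ~= exp(-||x - y||^2 / (2 * sigma^2)).
    # Batch and incremental variants agree only if they draw the same (W, b).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, D)) / sigma
    b = rng.uniform(0.0, 2.0 * np.pi, D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

Since the approximation is defined entirely by the random draw (W, b), a mismatch in how the two code paths generate or extend these projections is a typical source of batch/incremental disagreement.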
Use try/catch for efficiency in SIsubGD_dual_hinge_loss and SsubGD_dual_hinge_loss
#46 opened by raffaello-camoriano
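For reference, the step these filters take: the hinge loss max(0, 1 - y⟨w, x⟩) is non-differentiable at the margin, so SGD uses a subgradient. A minimal primal-form sketch (the SIsubGD/SsubGD filters above work in the dual; this is only illustrative):

```python
import numpy as np

def hinge_subgradient_step(w, x, y, lam, eta):
    # One step on (lam/2)||w||^2 + max(0, 1 - y * <w, x>).
    # The hinge term contributes -y * x only when the margin is violated.
    margin = y * (w @ x)
    grad = lam * w - (y * x if margin < 1.0 else 0.0)
    return w - eta * grad
```

Branching on the margin (rather than, say, catching an exception) is the standard way to handle the two subgradient cases.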
In SGD, add option to compute CV performance only at the last step, instead of at every iteration
#44 opened by raffaello-camoriano
Add cross-validation mechanism to SGD
#41 opened by raffaello-camoriano
Restore the 'retraining' option
#29 opened by raffaello-camoriano
Add perceptron loss
#40 opened by raffaello-camoriano
Merge S-IGD filters
#39 opened by raffaello-camoriano
Add iteration timing option to incremental Nyström and incremental random features
#33 opened by raffaello-camoriano
In incrementalNkrls, choosing numNysParGuesses == 1 provides unreliable results
#30 opened by raffaello-camoriano
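Context for the report: with numNysParGuesses == 1, only a single landmark count m is tried, so model quality hinges on one arbitrary subsampling level. For reference, uniform Nyström approximates the kernel matrix from m sampled landmark columns; a NumPy sketch (illustrative, not incrementalNkrls itself):

```python
import numpy as np

def nystrom_approx(X, m, kernel, seed=0):
    # Uniform Nystrom: sample m landmarks and approximate
    # K ~= K_nm @ pinv(K_mm) @ K_nm.T.  The approximation is exact
    # when rank(K) <= m and the landmarks span the range of K.
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)
    K_nm = kernel(X, X[idx])
    K_mm = kernel(X[idx], X[idx])
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
```

Trying several values of m (as numNysParGuesses > 1 does) lets cross-validation pick a rank that actually captures the kernel's spectrum.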
Add minimum rank option to incremental Nyström and incremental random features
#32 opened by raffaello-camoriano
Fix mapParRangeSamples limit
#28 opened by raffaello-camoriano
Possible bug in incrementalrfrls
#27 opened by raffaello-camoriano
Add comparison between maximum-size KRLS on part of the dataset and DACKRLS, incremental NKRLS and RFKRLS on the full dataset
#18 opened by raffaello-camoriano
Add rebalancing methods
#10 opened by raffaello-camoriano
Add support for generating smaller datasets with a subset of classes in the constructor of Cifar10
#22 opened by raffaello-camoriano
Add full training, testing and validation performance matrix saving options to algorithms
#14 opened by raffaello-camoriano
Retraining to be fixed in nrls
#9 opened by raffaello-camoriano