Best practices
0 votes
0 replies
39 views

I am training WGAN-GP on the EuroSAT dataset, split into train/val/test sets of 18900/4050/4050 images. Since FID scores are widely used for GANs in image generation, I based my hyperparameter search on ...
Advice
0 votes
0 replies
37 views

I’m trying to reproduce an ML pipeline (built in SAS Viya Model Studio, but the question is general) consisting of: Robust PCA with automatic/optimal component selection, XGBoost with auto-tuned ...
0 votes
0 answers
53 views

I’m running Optuna to tune hyperparameters for a TabM regression model (10 trials) on Kaggle (GPU: Tesla P100) to minimize RMSE. The optimization runs fine — all trials complete — but right after ...
0 votes
0 answers
136 views

I am using Optuna for hyperparameter tuning. I get messages as shown below: Trial 15 finished with value: 6.226334123011727 and parameters: {'iterations': 1100, 'learning_rate': 0.04262148853587423, '...
1 vote
0 answers
88 views

I used TPESampler and set it as follows while optimizing with optuna: sampler=optuna.samplers.TPESampler(multivariate=True, n_startup_trials=10, seed=None). But in the 10 startup_trials process, it ...
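For context on the startup phase: Optuna documents that TPESampler falls back to independent random sampling for its first n_startup_trials trials before the TPE model takes over. A stdlib-only sketch of that two-phase behavior (the sampler class and objective below are hypothetical stand-ins for illustration, not Optuna's API):

```python
import random

class ToyTwoPhaseSampler:
    """Random sampling for the first n_startup_trials, then sampling
    near the best point seen so far (a crude stand-in for TPE)."""

    def __init__(self, n_startup_trials=10, seed=None):
        self.n_startup_trials = n_startup_trials
        self.rng = random.Random(seed)
        self.history = []  # (x, value) pairs

    def suggest(self):
        if len(self.history) < self.n_startup_trials:
            # Startup phase: independent uniform random draws.
            return self.rng.uniform(-10.0, 10.0)
        # Model phase: perturb the best observed point.
        best_x, _ = min(self.history, key=lambda t: t[1])
        return best_x + self.rng.gauss(0.0, 1.0)

    def tell(self, x, value):
        self.history.append((x, value))

def objective(x):
    return (x - 3.0) ** 2  # minimum at x = 3

sampler = ToyTwoPhaseSampler(n_startup_trials=10, seed=0)
for _ in range(50):
    x = sampler.suggest()
    sampler.tell(x, objective(x))

best_x, best_val = min(sampler.history, key=lambda t: t[1])
```

Because the first ten suggestions come from the uniform draws alone, any apparent ignoring of the search model during those trials is expected behavior.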
0 votes
0 answers
39 views

My impression is that every trial is run for one step. Then some trials are pruned and the remaining continue for another step and so on. However, the logs show: Trial 0 completed Trial 1 completed ...
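The rung-based scheme described above (run every trial one step, prune the worst, repeat) can be sketched in plain Python; this is an illustrative toy of successive halving, not Optuna's actual scheduler:

```python
def successive_halving(trials, n_steps=3, keep_fraction=0.5):
    """trials: dict mapping trial id -> step function returning a loss.
    After each step, only the best-performing fraction continues."""
    active = dict(trials)
    losses = {}
    pruned_at = {}
    for step in range(n_steps):
        # Run every still-active trial for one more step.
        for tid, step_fn in active.items():
            losses[tid] = step_fn(step)
        if step == n_steps - 1:
            break
        # Keep the best fraction, prune the rest.
        ranked = sorted(active, key=lambda tid: losses[tid])
        n_keep = max(1, int(len(ranked) * keep_fraction))
        for tid in ranked[n_keep:]:
            pruned_at[tid] = step
        active = {tid: trials[tid] for tid in ranked[:n_keep]}
    return losses, pruned_at

# Toy trials whose loss shrinks at a rate set by their "hyperparameter".
rates = {0: 0.9, 1: 0.5, 2: 0.7, 3: 0.2}
trials = {tid: (lambda step, r=r: r ** (step + 1)) for tid, r in rates.items()}
losses, pruned_at = successive_halving(trials)
```

With these rates, trials 0 and 2 are pruned after the first step, trial 1 after the second, and trial 3 survives to the end. Note that Optuna's default pruner behaves differently (trials run sequentially and each decides per step whether to stop), which may explain logs showing trials completing one after another.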
1 vote
0 answers
95 views

I have a dataframe named hyperparam_df which looks like the following: repo_name file_name \ 0 DeepCoMP deepcomp/util/simulation.py ...
1 vote
0 answers
1k views

I am working on a binary classification problem using a neural network with TensorFlow/Keras and scikit-learn's GridSearchCV for hyperparameter tuning. However, I am encountering an AttributeError ...
0 votes
0 answers
145 views

This problem has been bothering me for a long time. I am using optuna for automatic parameter tuning of deep learning models, and the objective function returns the average AUC of five folds. Unable ...
1 vote
0 answers
72 views

I am trying to apply deep learning for regression (6 independent variables and 1 dependent variable). Similar problems that have not been solved: Error while running Bayesian for finding best ...
2 votes
0 answers
271 views

I'm using Optuna to find the best values for training my machine learning model. Using the Optuna dashboard, I can see a chart showing a list of my hyperparameters and their importance. It's just the list ...
1 vote
0 answers
576 views

Could I ask you for help? I am fine-tuning the LLM Llama3 8b (with LoRA) for text classification. I am using the Trainer from Huggingface. I am looking for the optimal ...
1 vote
0 answers
85 views

I have been trying to import KerasRegressor for hyperparameter tuning of an LSTM time series model to improve its performance, on both HPC Linux and Windows. For this, I installed tensorflow using ...
2 votes
0 answers
356 views

I am performing hyperparameter optimization with Optuna (from within rl-zoo) and have some questions about parallelization. In the docs, it is recommended to use process based (-> distributed) ...
0 votes
0 answers
147 views

I am trying to tune model hyperparameters using scikit-learn's HalvingGridSearchCV class and the iterations it uses do not appear correct to me. I am using the default min_resources="exhaust"...
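The iteration counts can be checked against the documented successive-halving arithmetic: with a given factor, each rung keeps roughly 1/factor of the candidates and multiplies per-candidate resources by factor, and min_resources="exhaust" picks the starting budget so the last rung lands close to max_resources. A sketch of that arithmetic (illustrative only, not scikit-learn's internal code, which also caps iterations by available resources):

```python
import math

def halving_schedule(n_candidates, max_resources, factor=3):
    """Rough sketch of the successive-halving schedule behind
    HalvingGridSearchCV with min_resources='exhaust'."""
    # Count the rungs needed to reduce the field to one candidate,
    # e.g. 9 -> 3 -> 1 is three rungs with factor=3.
    n_iterations, n = 1, n_candidates
    while n > 1:
        n = math.ceil(n / factor)
        n_iterations += 1
    # 'exhaust': start with enough resources that the final rung
    # lands as close to max_resources as possible.
    min_resources = max_resources // factor ** (n_iterations - 1)
    schedule, n, r = [], n_candidates, min_resources
    for _ in range(n_iterations):
        schedule.append((n, r))  # (candidates evaluated, resources each)
        n = math.ceil(n / factor)
        r *= factor
    return schedule
```

For example, 9 candidates with 1000 samples and factor 3 gives three rungs, starting each candidate at 111 samples so the final rung runs at 999. Comparing such a hand-computed schedule against the search's n_resources_ attribute is a quick way to see whether the observed iterations are actually wrong.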
