
Questions tagged [loss-function]

A function used to quantify the difference between observed data and predicted values according to a model. Minimization of loss functions is a way to estimate the parameters of the model.

3 votes · 2 answers · 91 views

I’m working on a MarTech use case (predicting customer conversions for a certain product). I’m not used to working in this domain, so I’m looking for some critical questions about my setup. Context: ...
Henri • 133
3 votes · 1 answer · 96 views

I'm working with a custom YOLO-like architecture implemented in TensorFlow/Keras. While pretraining on the COCO dataset works, I plan to fine-tune the model on a highly imbalanced dataset. ...
chhu • 141
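A common loss for fine-tuning a detector on highly imbalanced data is focal loss, which down-weights easy examples so the rare class is not drowned out. A plain-NumPy sketch (not necessarily what the asker plans; `gamma` and `alpha` defaults are illustrative):

```python
import numpy as np

def focal_loss(y_true, p_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: scales cross-entropy by (1 - p_t)^gamma,
    so confidently-correct (easy) examples contribute little."""
    p = np.clip(p_pred, eps, 1 - eps)
    p_t = np.where(y_true == 1, p, 1 - p)       # prob. of the true class
    a_t = np.where(y_true == 1, alpha, 1 - alpha)
    return float(np.mean(-a_t * (1 - p_t) ** gamma * np.log(p_t)))

# An easy example (p close to the label) is penalized far less
# than a hard one, which is the point of the (1 - p_t)^gamma factor.
easy = focal_loss(np.array([1]), np.array([0.9]))
hard = focal_loss(np.array([1]), np.array([0.1]))
```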
3 votes · 1 answer · 90 views

I've come across a loss regularizing function that uses population counts (i.e., bits that are one, Hamming weight) of activations: $$ L_\mathrm{reg} = H(\max(\lfloor x \rceil, 0)), $$ where $x$ is an ...
Gaslight Deceive Subvert
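Reading $\lfloor x \rceil$ as round-to-nearest, the regularizer clamps rounded activations at zero and sums the set bits of the resulting integers. A small NumPy sketch of that reading (illustrative only):

```python
import numpy as np

def popcount_regularizer(x):
    """Sketch of L_reg = H(max(round(x), 0)): round activations,
    clamp negatives to zero, then sum the population counts
    (Hamming weights) of the resulting integers."""
    q = np.maximum(np.rint(x).astype(np.int64), 0)
    # bin(v).count("1") is the popcount of one non-negative integer.
    return sum(bin(int(v)).count("1") for v in q.ravel())

x = np.array([3.2, -1.5, 0.4, 7.0])  # rounds/clamps to 3, 0, 0, 7
# popcounts: 3 -> 2 set bits, 0 -> 0, 0 -> 0, 7 -> 3 set bits
```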
2 votes · 2 answers · 129 views

As an intro: group-based splitting is splitting data into Train / Test (Val) sets by some attribute like patient_id, item_id, or similar, to ensure that the same person ...
Michael D • 209
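Scikit-learn ships group-aware splitters for exactly this; a minimal sketch with hypothetical patient data, using `GroupShuffleSplit` so no patient_id straddles the split:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# Hypothetical data: 8 samples from 4 patients (two samples each).
X = np.arange(8).reshape(-1, 1)
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])
groups = np.array([1, 1, 2, 2, 3, 3, 4, 4])  # patient_id per sample

gss = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(gss.split(X, y, groups=groups))

# No patient_id appears on both sides of the split.
assert set(groups[train_idx]).isdisjoint(set(groups[test_idx]))
```

`GroupKFold` and `StratifiedGroupKFold` follow the same `groups=` convention for cross-validation.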
6 votes · 1 answer · 128 views

I want to download PDF files from the website "https://register.awmf.org/de/start", but my code didn't find any PDF links, although there are links to PDF files (just indirect ones). I want to download all ...
Ward Khedr
0 votes · 0 answers · 34 views

I tried modifying the reconstruction loss so that values pushed out of bounds do not contribute to the loss, and it works as expected in TensorFlow after training an autoencoder. However, ...
zvxayr • 1
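One framework-agnostic reading of "out-of-bounds values do not contribute" is to mask those positions out of a mean-squared error and average only over the rest. A NumPy sketch (the bounds and function name are hypothetical, not the asker's code):

```python
import numpy as np

def masked_mse(y_true, y_pred, low=0.0, high=1.0):
    """MSE that ignores targets outside [low, high]:
    out-of-bounds positions contribute zero to the loss,
    and the mean is taken over in-bounds positions only."""
    mask = (y_true >= low) & (y_true <= high)
    err = np.where(mask, (y_true - y_pred) ** 2, 0.0)
    return float(err.sum() / np.maximum(mask.sum(), 1))

y_true = np.array([0.5, 1.7, -0.2, 0.25])  # middle two are out of bounds
y_pred = np.array([0.5, 0.0,  0.0, 0.75])
# only indices 0 and 3 count: (0.0**2 + 0.5**2) / 2 = 0.125
```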
1 vote · 0 answers · 53 views

I've been trying to aggregate a normal CNN loss with a loss that quantifies how well we can cluster the second-to-last-layer embeddings (i.e. feed the embeddings to a 2D Self-Organizing Map (SOM) and ...
catalyst
6 votes · 2 answers · 90 views

When training a deep model for binary classification, at each training step the model receives a batch (e.g. of size 32 samples). Let's assume that in each training batch there are ...
user3668129
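One standard way to handle skewed batches is to reweight the binary cross-entropy by the inverse class frequency within the current batch. A NumPy sketch (the weighting scheme is one common choice, not necessarily the asker's):

```python
import numpy as np

def weighted_bce(y_true, p_pred, eps=1e-7):
    """Binary cross-entropy with per-batch class weights:
    each class is reweighted by the inverse of its frequency
    in the batch, so a handful of positives is not drowned out."""
    p = np.clip(p_pred, eps, 1 - eps)
    n = len(y_true)
    n_pos = max(int(y_true.sum()), 1)
    n_neg = max(n - int(y_true.sum()), 1)
    w = np.where(y_true == 1, n / (2 * n_pos), n / (2 * n_neg))
    ce = -(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
    return float(np.mean(w * ce))
```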
3 votes · 1 answer · 143 views

I am training a CNN to regress 4 targets related to a given image. Within the image is a point of interest whose position can be defined by phi and theta (corresponding to x and y of a normal ...
Jack Stethem
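For angular targets like phi and theta, plain MSE is misleading at the wrap-around (e.g. $-\pi$ vs $+\pi$). One common remedy (a sketch, not necessarily the asker's setup) is to compare angles through their sine and cosine, i.e. as points on the unit circle:

```python
import numpy as np

def angular_loss(theta_true, theta_pred):
    """Squared chord distance on the unit circle: zero whenever the
    angles differ by a multiple of 2*pi, unlike plain MSE on theta."""
    d_sin = np.sin(theta_true) - np.sin(theta_pred)
    d_cos = np.cos(theta_true) - np.cos(theta_pred)
    return float(np.mean(d_sin ** 2 + d_cos ** 2))

# Wrap-around: -pi and +pi are the same direction, so the loss is ~0.
assert angular_loss(np.array([np.pi]), np.array([-np.pi])) < 1e-12
```

Equivalently, the network can output (sin, cos) pairs directly and the targets can be encoded the same way.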
5 votes · 2 answers · 702 views

I have two loss functions $\mathcal{L}_1$ and $\mathcal{L}_2$ to train my model. The model is predominantly a classification model. $\mathcal{L}_1$ and $\mathcal{L}_2$ are two variants of ...
Aleph • 205
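The simplest way to train on two losses is a weighted sum; when the two live on different scales, each can first be divided by a typical magnitude (e.g. a running mean). A minimal sketch with illustrative hyperparameters:

```python
def combined_loss(l1, l2, alpha=0.5, scale1=1.0, scale2=1.0):
    """Convex combination of two losses. alpha balances them;
    scale1/scale2 (e.g. running means of each loss) put the two
    terms on a comparable scale. All defaults are illustrative."""
    return alpha * (l1 / scale1) + (1.0 - alpha) * (l2 / scale2)

# With equal scales, this is a plain weighted sum:
assert combined_loss(2.0, 4.0, alpha=0.25) == 3.5
```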
5 votes · 1 answer · 98 views

I am generally trying to take costs into account in learning. The setup is as follows: a statistical learning problem with the usual X and y, where y is imbalanced (roughly 1% ones). Scikit-learn ...
Lucas Morin • 3,274
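In scikit-learn, asymmetric misclassification costs are usually injected through `class_weight` (or per-sample via `sample_weight` in `fit`). A sketch on synthetic imbalanced data (the 20:1 cost ratio and data generation are hypothetical):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))
# Imbalanced labels (~5% positives here) driven by the first feature.
y = (X[:, 0] + rng.normal(scale=0.5, size=n) > 1.8).astype(int)

# class_weight encodes the implied misclassification costs:
# here a false negative is 20x as costly as a false positive.
clf = LogisticRegression(class_weight={0: 1, 1: 20}).fit(X, y)
plain = LogisticRegression().fit(X, y)

# The cost-sensitive model flags more samples as positive.
assert clf.predict(X).sum() >= plain.predict(X).sum()
```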
1 vote · 0 answers · 57 views

I am currently tackling a semantic segmentation problem where, for each sample, my goal is to segment two masks corresponding to two objects. Notably, object two is typically located inside object one,...
Ahmed Mohamed
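One way to encode "object two lies inside object one" is a soft containment penalty added to the usual segmentation loss: penalize any inner-mask probability mass that falls outside the outer mask. A NumPy sketch (names and the penalty form are illustrative):

```python
import numpy as np

def containment_penalty(mask_inner, mask_outer):
    """Mean of the inner-mask probability that leaks outside the
    outer mask; add this (weighted) to the base segmentation loss."""
    violation = np.clip(mask_inner - mask_outer, 0.0, None)
    return float(violation.mean())

outer = np.array([[1.0, 1.0], [0.0, 0.0]])
inner = np.array([[1.0, 0.0], [1.0, 0.0]])  # bottom-left pixel leaks out
assert containment_penalty(inner, outer) == 0.25
```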
3 votes · 1 answer · 123 views

I'm wondering which activation function will be easier to train with (better accuracy / smaller loss) for a multiclass classification problem: softmax or sigmoid. According to: https://...
user3668129
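The usual distinction: softmax produces competing probabilities that sum to 1 (mutually exclusive classes, paired with categorical cross-entropy), while sigmoid scores each class independently (multi-label, paired with per-class binary cross-entropy). A small NumPy illustration:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([2.0, 1.0, 0.1])
p_soft = softmax(z)   # competing probabilities, sum to exactly 1
p_sig = sigmoid(z)    # independent per-class probabilities

assert abs(p_soft.sum() - 1.0) < 1e-12
assert p_sig.sum() > 1.0  # sigmoid outputs need not sum to 1
```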
0 votes · 1 answer · 117 views

I define a classification problem as a problem of calculating a function $h$ that approximates a function $f$ that classifies data. The approximation is calculated by taking a set of training samples ...
Leandro • 25
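The setup described is usually formalized as empirical risk minimization; a sketch, with $\ell$ a chosen loss and $\mathcal{H}$ the hypothesis class:

$$ \hat{h} = \arg\min_{h \in \mathcal{H}} \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(h(x_i),\, f(x_i)\bigr), $$

i.e. $\hat{h}$ minimizes the average loss over the $n$ training samples $(x_i, f(x_i))$; with zero-one loss this simply counts misclassifications on the training set.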
0 votes · 1 answer · 102 views

I’m very new to PyTorch (and ML in general), so I’m having difficulty understanding what is going on with respect to a custom loss/cost function I’m looking at. I understand what’s going on in the function, but I ...
user3460324
