0 votes
0 answers
37 views

I am new to implementing Grad-CAM and having trouble with it. I have created a model: img_shape = (256, 256, 3) base_model = ResNet50(include_top=False, ...
Arcturus
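A minimal Grad-CAM sketch of the usual pattern: expose the last convolutional feature maps alongside the logits, then weight the maps by the averaged gradients of the class score. The tiny stand-in CNN, the layer name `last_conv`, and the 8x8 input size are illustrative assumptions, not taken from the question (the real ResNet50 would be used the same way).

```python
import tensorflow as tf

# Tiny stand-in CNN; a real model would be ResNet50(include_top=False, ...)
inputs = tf.keras.Input(shape=(8, 8, 3))
conv = tf.keras.layers.Conv2D(4, 3, padding="same", name="last_conv")(inputs)
x = tf.keras.layers.GlobalAveragePooling2D()(conv)
outputs = tf.keras.layers.Dense(2)(x)
model = tf.keras.Model(inputs, [conv, outputs])  # expose conv maps and logits

img = tf.random.normal((1, 8, 8, 3))
with tf.GradientTape() as tape:
    conv_maps, logits = model(img)
    top_class = tf.argmax(logits[0])
    score = tf.gather(logits, top_class, axis=1)  # score of the top class

# Gradient of the class score w.r.t. the conv feature maps
grads = tape.gradient(score, conv_maps)
weights = tf.reduce_mean(grads, axis=(1, 2))         # (1, channels)
cam = tf.einsum("bijc,bc->bij", conv_maps, weights)  # weighted channel sum
cam = tf.nn.relu(cam)                                # keep positive evidence
print(cam.shape)  # (1, 8, 8)
```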
0 votes
0 answers
58 views

I'm experimenting with gradients of individual losses, but I can't get the gradient of the single loss_PDE loss to run with the unconnected_gradients argument included. It runs if I delete the ...
Cooper E.
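A quick sketch of what `unconnected_gradients` does, with made-up scalar variables standing in for the PDE loss: a target the loss never touches gets `None` by default, or a zero tensor with `tf.UnconnectedGradients.ZERO`.

```python
import tensorflow as tf

w1 = tf.Variable(2.0)
w2 = tf.Variable(3.0)  # never used by the loss below

with tf.GradientTape(persistent=True) as tape:
    loss_pde = w1 ** 2  # depends only on w1

g_default = tape.gradient(loss_pde, [w1, w2])
g_zero = tape.gradient(loss_pde, [w1, w2],
                       unconnected_gradients=tf.UnconnectedGradients.ZERO)
del tape  # release the persistent tape

print(g_default[1])       # None: w2 is unconnected
print(g_zero[1].numpy())  # 0.0: same case, but filled with zeros
```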
0 votes
1 answer
61 views

I have a model that has a GRU implementation inside and processes audio samples. In each forward pass I process a single sample of an audio file. To imitate the GRU behavior correctly, I have returned ...
Zahra Kokhazad
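One common way to imitate streaming GRU behavior is to return the state from each call and feed it back as `initial_state` on the next. A minimal sketch (layer size and input shapes are illustrative):

```python
import tensorflow as tf

gru = tf.keras.layers.GRU(8, return_state=True)

state = None
for _ in range(3):                        # three single-sample steps
    sample = tf.random.normal((1, 1, 4))  # (batch, time, features)
    out, state = gru(sample, initial_state=state)

print(out.shape, state.shape)  # (1, 8) (1, 8)
```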
0 votes
1 answer
90 views

I'm currently working with an autoencoder, hoping to test its accuracy vs. PCA. My tutor asked me to add a custom loss function that involves the derivatives of the decoder output with respect to the ...
Jonathan Jesus Calderon Rivera
0 votes
1 answer
98 views

I am trying to make a model which solves a partial differential equation (PDE). The problem is, it requires making a loss function which takes as its parameters: inputs given to the model, predictions of ...
Sourabh
0 votes
0 answers
44 views

I am manually trying to compute the gradient of how each one of the output features changes with each of the trainable parameters of my model. The model consists of several CNN layers and a last dense ...
Daniël timmermans
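Instead of looping over output features manually, `tape.jacobian` returns the derivative of every output element with respect to every parameter element in one call. A tiny sketch with a single weight matrix standing in for the CNN:

```python
import tensorflow as tf

w = tf.Variable(tf.ones((3, 2)))
x = tf.constant([[1.0, 2.0, 3.0]])

with tf.GradientTape() as tape:
    y = x @ w                 # (1, 2): two output features

jac = tape.jacobian(y, w)     # shape (1, 2, 3, 2): d y[b, j] / d w[i, k]
print(jac.shape)
```

Here `jac[0, j, i, k]` is `x[0, i]` when `j == k` and zero otherwise, which matches the linear layer's Jacobian.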
0 votes
1 answer
65 views

Overview My model is an encoder that has input Z and output x. I'm trying to use a total_loss that has both traditional supervised learning and regularization term(s). I have additional functions (...
tsf44
1 vote
0 answers
40 views

Problem I'm having trouble correctly adding physics-informed losses to my training code for my neural network. Background I have an encoder that takes an input curve, X(w), where w is an independent ...
tsf44
2 votes
0 answers
83 views

I was wondering what the output_gradients argument does in the gradient function of a GradientTape object in TensorFlow. According to https://www.tensorflow.org/api_docs/python/tf/GradientTape#...
Alexander Janssen
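In short, `output_gradients` supplies the upstream vector of the vector-Jacobian product: `tape.gradient(y, x, output_gradients=v)` returns `v * dy/dx` instead of assuming `v = 1`. A scalar sketch:

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2

# dy/dx = 6.0; the supplied upstream gradient 2.0 scales it
g = tape.gradient(y, x, output_gradients=tf.constant(2.0))
print(g.numpy())  # 12.0
```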
0 votes
0 answers
129 views

I am attempting to code a PINN (physics-informed neural network) that has a custom loss function based on partial differential equations that govern fluid flow. I have included a screenshot of the ...
Felix Spiers
0 votes
0 answers
37 views

Following is the code I'm trying to implement for meta-learning using the MAML algorithm on a specific dataset. The inner loop works well, but I don't know why the gradients in the outer loop are None. def maml(...
Pranav Belhekar
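A common cause of None outer-loop gradients in MAML is applying the inner update with `.assign()`, which breaks the differentiable path from the meta-parameters to the adapted loss. Keeping the inner update functional preserves it. A minimal second-order sketch with a made-up scalar model (not the asker's `maml` function):

```python
import tensorflow as tf

w = tf.Variable(1.0)
x, y = tf.constant(2.0), tf.constant(3.0)

with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        inner_loss = (w * x - y) ** 2
    g = inner.gradient(inner_loss, w)
    w_adapted = w - 0.1 * g              # functional update, stays on the tape
    outer_loss = (w_adapted * x - y) ** 2

meta_grad = outer.gradient(outer_loss, w)  # second-order gradient, not None
```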
1 vote
0 answers
85 views

In the code I just simulate gradient backpropagation with respect to a fictional randomly simulated image classification dataset. The scenario is : A dataset which gives (images,labels) pair. A model ...
ujjwalnur's user avatar
2 votes
1 answer
303 views

I am using the tape.gradient method for optimising some neural networks. It works as expected but keeps giving this warning when I calculate the gradient using tape.gradient multiple times in a single ...
Prabhat
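Calling `tape.gradient` more than once requires `persistent=True`, and the calls are cheapest outside the tape's context (calling it inside the `with` block is what typically triggers the efficiency warning). A minimal sketch with made-up scalars:

```python
import tensorflow as tf

x = tf.Variable(2.0)
with tf.GradientTape(persistent=True) as tape:
    y = x ** 2
    z = x ** 3

# Multiple gradient calls, made outside the tape's context
dy = tape.gradient(y, x)  # 4.0
dz = tape.gradient(z, x)  # 12.0
del tape  # release the tape's resources when done
```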
1 vote
2 answers
346 views

I'm getting a memory leak and I believe it to be linked to the following warning: WARNING:tensorflow:6 out of the last 6 calls to <function _BaseOptimizer._update_step_xla at 0x7fa9f8074c20> ...
Bryan Carty
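That "6 out of the last 6 calls ... triggered tf.function retracing" warning usually means a `tf.function` (often a freshly created optimizer's update step) is traced again on every call, and each trace holds memory. A sketch of the retracing mechanics: the Python body runs only while tracing, tensors of the same shape/dtype reuse a trace, and Python scalars force a new trace per value. (The counter trick here is illustrative, not from the question.)

```python
import tensorflow as tf

trace_count = 0

@tf.function
def double(x):
    global trace_count
    trace_count += 1      # runs only while tracing, not on every call
    return x * 2

double(tf.constant(1.0))
double(tf.constant(2.0))  # same signature -> no retrace
print(trace_count)        # 1
double(1.0)               # Python float: new signature -> retrace
double(2.0)               # another Python float -> retrace again
print(trace_count)        # 3
```

The same mechanism is why an optimizer created inside a loop retraces its update step each iteration; creating it once, outside the loop, avoids the leak.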
0 votes
0 answers
45 views

I wanted to get the gradients from the gradient tape, so in TF version 1 I used something like: grads = model.optimizer.get_gradients(model.total_loss, model.trainable_weights) symb_inputs = (model....
Shashank Priyadarshi
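In TF2 the symbolic `optimizer.get_gradients` route is replaced by computing gradients eagerly with a `GradientTape`. A minimal sketch with a made-up model and data:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
x = tf.random.normal((4, 3))
y = tf.random.normal((4, 1))

with tf.GradientTape() as tape:
    pred = model(x)
    loss = tf.reduce_mean((pred - y) ** 2)

# TF2 equivalent of get_gradients(total_loss, trainable_weights)
grads = tape.gradient(loss, model.trainable_weights)
```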
