
In code for deep learning models in TensorFlow/Keras, I see that reshaping the data (a NumPy ndarray, i.e. a tensor) is a very frequent operation.

It is used to fit different components of a model together.

Why can I reshape data and assume everything will be fine, i.e. that stochastic gradient descent and backprop will still get the job done?

Does this mean that while training a model I can make whatever kind of "improvisations" I like, and just be happy as long as the end result comes out right?


1 Answer


Gradient descent (and its variants) can be applied to any operation that is differentiable. A reshape is just a reindexing: it converts, for example, a 4-dimensional tensor into a 2-dimensional matrix without changing any of the values, so that built-in matrix-multiplication routines can be used. Reshaping does not affect the gradient calculation, because the gradient is computed with respect to each element separately; the elements are the same before and after, only their arrangement changes.
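A minimal NumPy sketch of this point (the shapes here are made up for illustration): reshaping a 4-D batch into a 2-D matrix moves no values, and an elementwise gradient computed in one shape can simply be reshaped back to the other.

```python
import numpy as np

# A toy 4-D batch: (batch, height, width, channels).
x = np.arange(8 * 2 * 2 * 3, dtype=np.float64).reshape(8, 2, 2, 3)

# Flatten each example into a row so it could feed a dense layer:
# (8, 2, 2, 3) -> (8, 12). Only the indexing changes, not the values.
x2d = x.reshape(8, -1)
assert x2d.shape == (8, 12)
assert np.array_equal(x2d.ravel(), x.ravel())  # same elements, same order

# Gradients are per-element: for loss = sum(x**2), d(loss)/dx = 2*x
# elementwise, so the 2-D gradient reshaped back equals the 4-D gradient.
grad_2d = 2 * x2d
grad_4d = grad_2d.reshape(x.shape)
assert np.array_equal(grad_4d, 2 * x)
```

This is why frameworks can insert `Reshape`/`Flatten` layers freely: backprop through a reshape is itself just a reshape of the incoming gradient.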

