  • By "tie" do you mean there are two Dense layers with exactly the same weights? If that's the case then why don't you use a single Dense layer and apply it in different parts of your model? Commented Dec 12, 2018 at 21:05
  • Sorry about that, I have updated the question to show what I mean by "tying" the weights. Unfortunately, it is not as simple as using the same layer since the weight matrix has to be transposed. Commented Dec 12, 2018 at 21:19
  • I can't test it, but I am quite confident that your approach is correct (although I am not sure whether self._trainable_weights.append(self.kernel) is strictly necessary, since the weights self.tied_to.kernel are in theory already trainable). I would suggest you check the weights after training and make sure that they are the same. You could also visualize the computational graph with TensorBoard. Commented Dec 13, 2018 at 7:22
  • @JamesMchugh I think you should not use self._trainable_weights.append(self.kernel) at all, since these weights are not trainable from the viewpoint of the custom Dense layer. Either remove that line entirely, or use self._non_trainable_weights.append(self.kernel) instead so that you can still access the weights from the custom Dense layer independently (i.e. using the get_weights() method). Commented Dec 13, 2018 at 10:29
  • 1
    For anyone interested, the problem was that by using k.variable(k.transpose(self.kernel)), I broke the tie. I had to use k.transpose(self.kernel) instead. However, this does cause some problems when trying to use autoencoder.load_weights(file) since self.kernel is a tensor and does not have the assign method. Commented Dec 14, 2018 at 21:18
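For readers landing here, the fix described in the last comment can be sketched as follows. This is a minimal illustration with tensorflow.keras, not the asker's original code: the TiedDense name, its tied_to argument, and the toy autoencoder are all made up for the example. The key point from the thread is that the decoder must use the transpose of the encoder's kernel directly; wrapping it in a new variable (k.variable(k.transpose(self.kernel))) copies the values once and breaks the tie.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers


class TiedDense(layers.Layer):
    """Decoder layer whose kernel is the transpose of another Dense layer's kernel.

    Illustrative sketch only. The tied kernel is owned (and trained) by the
    encoder layer, so this layer registers only its own bias as a weight.
    """

    def __init__(self, tied_to, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.tied_to = tied_to  # an already-built Dense layer
        self.activation = tf.keras.activations.get(activation)

    def build(self, input_shape):
        # Only the bias belongs to this layer; do NOT copy the tied kernel
        # into a new variable here, or the tie is broken.
        self.bias = self.add_weight(
            name="bias",
            shape=(self.tied_to.kernel.shape[0],),
            initializer="zeros",
        )
        super().build(input_shape)

    def call(self, inputs):
        # tf.transpose reads the live kernel on every call, so gradients
        # flow back to the encoder's weights.
        output = tf.matmul(inputs, tf.transpose(self.tied_to.kernel))
        output = tf.nn.bias_add(output, self.bias)
        return self.activation(output)


# Tiny tied-weight autoencoder.
inputs = layers.Input(shape=(8,))
encoder = layers.Dense(3, activation="relu")
decoder = TiedDense(encoder)
outputs = decoder(encoder(inputs))
autoencoder = tf.keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

x = np.random.rand(16, 8).astype("float32")
autoencoder.fit(x, x, epochs=2, verbose=0)

# The decoder has no kernel of its own: recomputing its output from the
# encoder's (trained) kernel reproduces the model's output.
hidden = tf.nn.relu(tf.matmul(x, encoder.kernel) + encoder.bias)
manual = tf.matmul(hidden, tf.transpose(encoder.kernel)) + decoder.bias
assert np.allclose(autoencoder(x).numpy(), manual.numpy(), atol=1e-5)
```

Because the transpose is recomputed inside call() rather than stored, this sketch also sidesteps the load_weights issue mentioned above: no separate kernel tensor exists on the decoder that would need an assign method.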