Does it make sense to use batch normalization in deep (stacked) or sparse auto-encoders?

Does it make sense to use batch normalization in deep (stacked) or sparse auto-encoders? 

I cannot find any resources for that. Is it safe to assume that, since it works for other DNNs, it will also make sense to use it and will offer benefits on training AEs?
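For concreteness, here is a minimal sketch of where batch-norm layers would sit in a stacked auto-encoder. This assumes PyTorch; the layer sizes and activations are arbitrary illustrative choices, not something prescribed by any reference:

```python
import torch
import torch.nn as nn

class BNAutoencoder(nn.Module):
    """A small stacked auto-encoder with BatchNorm1d after each hidden layer.

    The sizes (784 -> 256 -> 64 -> 16) are illustrative assumptions only.
    """
    def __init__(self, input_dim=784, hidden_dims=(256, 64), code_dim=16):
        super().__init__()
        # Encoder: Linear -> BatchNorm -> ReLU for each hidden layer
        enc, prev = [], input_dim
        for h in hidden_dims:
            enc += [nn.Linear(prev, h), nn.BatchNorm1d(h), nn.ReLU()]
            prev = h
        enc.append(nn.Linear(prev, code_dim))
        self.encoder = nn.Sequential(*enc)

        # Decoder mirrors the encoder, also with BatchNorm on hidden layers
        dec, prev = [], code_dim
        for h in reversed(hidden_dims):
            dec += [nn.Linear(prev, h), nn.BatchNorm1d(h), nn.ReLU()]
            prev = h
        dec.append(nn.Linear(prev, input_dim))
        self.decoder = nn.Sequential(*dec)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = BNAutoencoder()
x = torch.randn(32, 784)   # a batch of 32 flattened inputs
recon = model(x)           # reconstruction has the same shape as the input
```

Note that `BatchNorm1d` uses per-batch statistics during training, so with this placement the batch size must be greater than 1 when training, and `model.eval()` should be called before single-sample inference.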
