All Questions
9 questions
-2 votes · 1 answer · 2k views
Evaluating the performance of variational autoencoder on unlabeled data
I've designed a variational autoencoder (VAE) that clusters sequential time series data.
To evaluate the VAE's performance on labeled data, I first run KMeans on the raw data and compare the ...
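The comparison this question describes — scoring a clustering against known labels — can be sketched with a simple purity metric; this is a minimal numpy illustration (the metric choice and the toy arrays are assumptions, not taken from the question):

```python
import numpy as np

def cluster_purity(pred, labels):
    # For each predicted cluster, count its most common true label,
    # then divide the summed counts by the number of samples.
    total = 0
    for c in np.unique(pred):
        members = labels[pred == c]
        total += np.bincount(members).max()
    return total / len(labels)

# Toy example: two predicted clusters vs. ground-truth labels.
pred = np.array([0, 0, 1, 1, 1])
labels = np.array([0, 0, 1, 1, 0])
print(cluster_purity(pred, labels))  # 0.8
```

The same function can score KMeans run on the raw data and KMeans run on the VAE's latent codes, so the two clusterings are compared on equal footing.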
2 votes · 1 answer · 1k views
Why flatten last encoder layer in a convolutional VAE?
I am quite new to the deep learning game, and I was wondering why we flatten the last layer of the encoder in a VAE and then feed the flattened output to a linear layer, which then approximates a ...
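The flatten-then-linear step the question asks about can be sketched shape-by-shape in plain numpy; the feature-map size, latent dimension, and weight initialization here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed last encoder feature map: 4x4 spatial grid with 8 channels.
feature_map = rng.standard_normal((4, 4, 8))
flat = feature_map.reshape(-1)                 # shape (128,)

# The linear layer maps the flat vector to the parameters of q(z|x).
latent_dim = 2
W_mu = rng.standard_normal((flat.size, latent_dim)) * 0.01
W_logvar = rng.standard_normal((flat.size, latent_dim)) * 0.01

mu = flat @ W_mu                               # mean of q(z|x)
logvar = flat @ W_logvar                       # log-variance of q(z|x)

# Reparameterization trick: z = mu + sigma * eps.
eps = rng.standard_normal(latent_dim)
z = mu + np.exp(0.5 * logvar) * eps
print(z.shape)  # (2,)
```

Flattening is needed because the dense layer that produces mu and logvar expects a vector, not a multi-channel spatial grid.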
0 votes · 0 answers · 37 views
Getting loss (binary_crossentropy) stagnated around 0.601 for this autoencoder architecture
I am working on an unsupervised image classification problem; the dataset consists of around 4,700 photos of carnivores. I thought of achieving this task by constructing an autoencoder and getting the ...
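A stagnating binary cross-entropy is often a sign the decoder has collapsed to a constant output; this numpy sketch (the toy pixel values are assumptions) shows the loss such a collapse produces:

```python
import numpy as np

def bce(y_true, y_pred, eps=1e-7):
    # Element-wise binary cross-entropy, averaged over all pixels.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# If the decoder always outputs the mean pixel value (here 0.5),
# the loss sticks at a constant floor regardless of the input.
pixels = np.array([0.1, 0.9, 0.3, 0.7])
constant = np.full_like(pixels, pixels.mean())
print(round(float(bce(pixels, constant)), 4))  # 0.6931
```

Comparing the stagnation value against this constant-output floor is one quick diagnostic for whether the autoencoder is learning anything at all.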
2 votes · 2 answers · 780 views
Anomaly Detection with an Autoencoder using an unlabelled Dataset (How to construct the input data)
I am new to the deep learning field, and I would like to ask about using an unlabeled dataset for anomaly detection with an autoencoder. My confusion starts with a few questions below:
1) Some posts are saying ...
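The usual recipe behind questions like this is to train the autoencoder on (mostly) normal data and flag high reconstruction error at test time; here is a minimal numpy sketch of the thresholding step, assuming the reconstruction errors have already been computed:

```python
import numpy as np

# Hypothetical reconstruction errors on the (unlabeled, mostly normal)
# training set: normal samples reconstruct well, anomalies poorly.
train_errors = np.array([0.01, 0.02, 0.015, 0.03, 0.025])

# A common heuristic: flag anything above a high percentile of the
# training-set errors as anomalous.
threshold = np.percentile(train_errors, 95)

test_errors = np.array([0.02, 0.9, 0.01])
is_anomaly = test_errors > threshold
print(is_anomaly)  # [False  True False]
```

The percentile (95 here) is a tunable assumption; with a fully unlabeled set it is typically chosen from domain knowledge about the expected contamination rate.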
0 votes · 1 answer · 1k views
Unsupervised Convolutional Autoencoder is always giving blank output
I'm trying to train an autoencoder on unlabeled images. I have about 300 training images and 100 validation images. But when I input an unseen image to the trained autoencoder, it gives ...
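One frequent cause of all-blank reconstructions (offered here only as a hedged guess, since the question is truncated) is a range mismatch: feeding 0-255 pixel values into a decoder that ends in a sigmoid, which can only emit values in [0, 1]. A one-line numpy check:

```python
import numpy as np

# Raw 8-bit pixel values span 0-255; a sigmoid decoder outputs [0, 1].
images = np.array([[0, 128, 255]], dtype=np.float32)

# Scaling inputs to [0, 1] puts input and output on the same range.
scaled = images / 255.0
print(scaled.min(), scaled.max())  # 0.0 1.0
```

With only 300 training images, heavy overfitting or an over-large bottleneck are other plausible culprits worth ruling out.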
1 vote · 0 answers · 841 views
Attempt to train keras VAE on unlabeled images (input=target) using ImageDataGenerator and vae.fit_generator fails when checking model target
I am trying to adapt the keras VAE template variational_autoencoder_deconv.py found here for a non-MNIST unlabeled dataset. I am using 38,585 256x256 pixel training images and 5,000 validation images, ...
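The "checking model target" error this question hits usually means the generator is not yielding (input, target) pairs; for an autoencoder the target is the input itself. A minimal numpy sketch of such a generator (the shapes and batch size are assumptions matching the question's 256x256 images):

```python
import numpy as np

def autoencoder_batches(images, batch_size=32):
    # For an autoencoder the target equals the input, so each batch
    # is yielded as an (x, x) pair rather than x alone.
    while True:
        idx = np.random.choice(len(images), batch_size, replace=False)
        batch = images[idx]
        yield batch, batch

# Toy stand-in for the 256x256 RGB training images.
data = np.zeros((100, 256, 256, 3), dtype=np.float32)
x, y = next(autoencoder_batches(data, batch_size=4))
print(x.shape, y.shape)  # (4, 256, 256, 3) (4, 256, 256, 3)
```

Keras' ImageDataGenerator, by contrast, defaults to yielding (image, class_label) pairs, which is exactly the target mismatch the VAE complains about.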
1 vote · 1 answer · 1k views
Python TensorFlow: using dropout on the input layer
I am using Python with TensorFlow and looking for the proper way to mask part of the input while training a denoising autoencoder on MNIST data.
I tried using dropout on the input layer, the same way as I am ...
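The masking noise a denoising autoencoder needs can be sketched framework-free in numpy; the corruption probability and MNIST-sized shapes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(x, drop_prob=0.3):
    # Masking noise for a denoising autoencoder: zero out a random
    # fraction of the inputs. The training target stays the clean x.
    mask = rng.random(x.shape) >= drop_prob
    return x * mask

x = np.ones((2, 784), dtype=np.float32)   # a batch of MNIST-sized vectors
noisy = corrupt(x)
print(float(noisy.mean()))                # fraction of inputs kept, ~0.7
```

Note one difference from plain dropout: standard dropout layers rescale the surviving units by 1/(1-p) at train time, whereas denoising-autoencoder masking typically zeroes inputs without rescaling.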
1 vote · 0 answers · 823 views
TensorFlow network having low epoch loss but still getting low accuracy
So I'm trying to build a deep net with stacked autoencoders, training the network on the MNIST dataset. I first pre-trained the model (layer-wise) and then did normal backprop for fine-tuning. The ...
4 votes · 1 answer · 3k views
How to train and fine-tune fully unsupervised deep neural networks?
In scenario 1, I had a multi-layer sparse autoencoder that tries to reproduce my input, so all my layers are trained together from randomly initialized weights. Without a supervised layer, on my data this ...