  1. Why my autoencoder model is not learning? - Stack Overflow

    Apr 15, 2020 · If you want to create an autoencoder you need to understand that you're going to reverse the process after encoding. That means that if you have three convolutional layers with …
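
    As a rough illustration of that mirroring idea, here is a minimal tf.keras sketch (input size, filter counts, and layer choices are assumptions, not taken from the question): each Conv2D/MaxPooling2D step in the encoder gets a matching Conv2D/UpSampling2D step in the decoder so the output recovers the input shape.

```python
from tensorflow.keras import layers, Model

inp = layers.Input(shape=(28, 28, 1))                              # assumed input size
# encoder: three conv blocks, each halving the spatial resolution
x = layers.Conv2D(32, 3, activation='relu', padding='same')(inp)
x = layers.MaxPooling2D(2)(x)                                      # 28 -> 14
x = layers.Conv2D(16, 3, activation='relu', padding='same')(x)
x = layers.MaxPooling2D(2)(x)                                      # 14 -> 7
x = layers.Conv2D(8, 3, activation='relu', padding='same')(x)
encoded = layers.MaxPooling2D(2, padding='same')(x)                # 7 -> 4

# decoder: mirror the encoder, upsampling back to the input size
x = layers.Conv2D(8, 3, activation='relu', padding='same')(encoded)
x = layers.UpSampling2D(2)(x)                                      # 4 -> 8
x = layers.Conv2D(16, 3, activation='relu', padding='same')(x)
x = layers.UpSampling2D(2)(x)                                      # 8 -> 16
x = layers.Conv2D(32, 3, activation='relu')(x)                     # 'valid' padding: 16 -> 14
x = layers.UpSampling2D(2)(x)                                      # 14 -> 28
decoded = layers.Conv2D(1, 3, activation='sigmoid', padding='same')(x)

autoencoder = Model(inp, decoded)
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
```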

  2. Image generation using autoencoder vs. variational autoencoder

    Sep 17, 2021 · I think that the autoencoder (AE) generates the same new images every time we run the model because it maps the input image to a single point in the latent space. On the …
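
    A hedged sketch of that contrast (layer sizes and the latent dimension are assumptions): a plain AE encoder is deterministic and always maps an input to the same latent point, while a VAE encoder produces a mean and log-variance and samples from them, so repeated runs can decode to different images.

```python
import tensorflow as tf
from tensorflow.keras import layers

latent_dim = 2                                       # assumed latent size

# plain AE encoder: deterministic, so the same input always maps to the same point z
ae_encoder = tf.keras.Sequential([
    layers.Dense(64, activation='relu'),
    layers.Dense(latent_dim),
])

# VAE encoder: outputs a distribution (mean, log-variance) and samples from it,
# so decoding the same input can give a different image on every run
class Sampling(layers.Layer):
    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(shape=tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps        # reparameterization trick

hidden = layers.Dense(64, activation='relu')
to_mean, to_log_var, sample = layers.Dense(latent_dim), layers.Dense(latent_dim), Sampling()

def vae_encode(x):
    h = hidden(x)
    return sample([to_mean(h), to_log_var(h)])               # a fresh sample on every call
```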

  3. python - Reducing Losses of Autoencoder - Stack Overflow

    May 26, 2020 · I am currently trying to train an autoencoder that compresses an array of 128 integer variables down to a representation of length 64. The array contains 128 …
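
    For the 128-to-64 compression described here, a minimal dense autoencoder might look like the sketch below (activations, optimizer, and loss are assumptions; the integer inputs would normally be scaled before training).

```python
from tensorflow.keras import layers, Model

inp = layers.Input(shape=(128,))                      # the 128 input variables, assumed scaled
encoded = layers.Dense(64, activation='relu')(inp)    # compressed representation of length 64
decoded = layers.Dense(128)(encoded)                  # linear output to reconstruct the array

autoencoder = Model(inp, decoded)
autoencoder.compile(optimizer='adam', loss='mse')
# autoencoder.fit(x_train, x_train, epochs=50, batch_size=32)   # x_train: hypothetical data
```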

  4. python - LSTM Autoencoder - Stack Overflow

    Jun 20, 2017 · I'm trying to build an LSTM autoencoder with the goal of getting a fixed-sized vector from a sequence, which represents the sequence as well as possible. This autoencoder …
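
    A hedged tf.keras sketch of that setup (sequence length, feature count, and vector size are assumptions): the encoder LSTM's final state is the fixed-size vector, and RepeatVector feeds that vector to a decoder LSTM that tries to reconstruct the whole sequence from it.

```python
from tensorflow.keras import layers, Model

timesteps, n_features, latent_dim = 30, 1, 16         # assumed dimensions

inp = layers.Input(shape=(timesteps, n_features))
# encoder: the LSTM's last hidden state is the fixed-size summary of the sequence
encoded = layers.LSTM(latent_dim)(inp)
# decoder: repeat that summary once per timestep and reconstruct the sequence from it
x = layers.RepeatVector(timesteps)(encoded)
x = layers.LSTM(latent_dim, return_sequences=True)(x)
decoded = layers.TimeDistributed(layers.Dense(n_features))(x)

autoencoder = Model(inp, decoded)
autoencoder.compile(optimizer='adam', loss='mse')

encoder = Model(inp, encoded)                         # extracts the fixed-size vectors after training
```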

  5. How does binary cross entropy loss work on autoencoders?

    Sep 21, 2018 · Note that in the case of input values in range [0,1] you can use binary_crossentropy, as it is usually used (e.g. Keras autoencoder tutorial and this paper). …
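
    To make the loss itself concrete, here is a small NumPy sketch of the per-element binary cross-entropy an autoencoder minimizes when targets lie in [0, 1] (the numbers are made up for illustration).

```python
import numpy as np

def bce(x, x_hat, eps=1e-7):
    # mean binary cross-entropy between targets x in [0, 1] and reconstructions x_hat
    x_hat = np.clip(x_hat, eps, 1 - eps)
    return -np.mean(x * np.log(x_hat) + (1 - x) * np.log(1 - x_hat))

x     = np.array([0.0, 0.2, 0.9, 1.0])    # original pixel values (not necessarily binary)
x_hat = np.array([0.1, 0.3, 0.8, 0.9])    # sigmoid outputs of the decoder
print(bce(x, x_hat))                      # decreases as x_hat approaches x
```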

  6. python 2.7 - keras autoencoder vs PCA - Stack Overflow

    For this reason, one way to evaluate an autoencoder's efficacy in dimensionality reduction is cutting the output of the middle hidden layer and comparing the accuracy/performance of your desired …
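
    A hedged sketch of "cutting the output of the middle hidden layer" in tf.keras (layer sizes are assumptions): after training, build a second Model that ends at the bottleneck and use its outputs as the reduced features to compare against PCA.

```python
from tensorflow.keras import layers, Model

inp = layers.Input(shape=(784,))                                   # assumed input dimension
h = layers.Dense(128, activation='relu')(inp)
bottleneck = layers.Dense(32, activation='relu', name='bottleneck')(h)
h = layers.Dense(128, activation='relu')(bottleneck)
out = layers.Dense(784, activation='sigmoid')(h)

autoencoder = Model(inp, out)
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
# ... train the autoencoder, then "cut" it at the middle hidden layer:
encoder = Model(inp, bottleneck)
# x_reduced = encoder.predict(x)   # compare a downstream model on these features vs. PCA features
```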

  7. What is an autoencoder? - Data Science Stack Exchange

    Aug 17, 2020 · The autoencoder then works by storing inputs in terms of where they lie on the linear image of . Observe that absent the non-linear activation functions, an autoencoder …
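
    A hedged numeric sketch of the linear case (dimensions are made up): with no activation functions the whole autoencoder collapses to a single low-rank linear map, and every reconstruction lies on the linear image (column space) of the decoder matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 2                          # data dimension and latent dimension, both assumed

E = rng.normal(size=(k, d))          # linear "encoder"
D = rng.normal(size=(d, k))          # linear "decoder"

x = rng.normal(size=d)
x_hat = D @ (E @ x)                  # no activations: the whole model is the single map D @ E

# every reconstruction lies in the column space (linear image) of D, a k-dimensional subspace
print(np.linalg.matrix_rank(D @ E))  # at most k, here 2
```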

  8. python - Training autoencoder for variant length time series ...

    Training autoencoder for variant length time series - Tensorflow

  9. Variational Autoencoders: MSE vs BCE - Stack Overflow

    I'm working with a Variational Autoencoder and I have seen that there are people who use MSE Loss and some people who use BCE Loss. Does anyone know if one is more correct …
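
    A hedged sketch of the two reconstruction-loss options being compared (inputs are assumed to be flattened images in [0, 1]; the KL term of the VAE objective is omitted for brevity).

```python
import tensorflow as tf

def reconstruction_loss(x, x_hat, kind='bce'):
    # x, x_hat: batches flattened to [batch, pixels], values in [0, 1]
    if kind == 'bce':
        per_sample = tf.keras.losses.binary_crossentropy(x, x_hat)   # mean over the pixel axis
        return tf.reduce_mean(per_sample)
    return tf.reduce_mean(tf.square(x - x_hat))                      # the MSE alternative
```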

  10. machine learning - Is there any sense to use autoencoder for …

    Dec 9, 2016 · Thank you! A network pre-trained using an RBM or autoencoder on lots of unlabeled data allows faster fine-tuning than any other weight initialization. I.e., if I want to train a network …
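
    A hedged tf.keras sketch of that pre-training idea (layer sizes and the 10-class head are assumptions): first fit an autoencoder on unlabeled data, then reuse the trained encoder layers as the initialization of a supervised classifier and fine-tune.

```python
from tensorflow.keras import layers, Model

inp = layers.Input(shape=(784,))                               # assumed input dimension
h = layers.Dense(256, activation='relu')(inp)
code = layers.Dense(64, activation='relu')(h)                  # encoder output
h = layers.Dense(256, activation='relu')(code)
out = layers.Dense(784, activation='sigmoid')(h)

autoencoder = Model(inp, out)
autoencoder.compile(optimizer='adam', loss='mse')
# autoencoder.fit(x_unlabeled, x_unlabeled, epochs=10)         # unsupervised pre-training

# the classifier below shares the encoder layers, so it starts from the pre-trained weights
clf_out = layers.Dense(10, activation='softmax')(code)         # assumed 10 classes
classifier = Model(inp, clf_out)
classifier.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# classifier.fit(x_labeled, y_labeled, epochs=5)               # fine-tuning
```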