Hello! First of all, thank you for posting this work; it has helped my understanding of TCNs and autoencoders a lot. This is what I'm trying to learn these days!
Thanks for the compliment! I'm glad it helped.

We apply the `latent_loss` (line 129 in [17]) by using the Keras `add_loss` method (line 138 in [17]). There is a good explanation of why Keras has this `add_loss` method in this Stack Overflow post: https://stackoverflow.com/a/52683522/9214620

Aurélien Geron has an easier-to-understand implementation of a VAE in his Jupyter notebook, and it's worth looking at. His explanation in the book is good too: https://github.com/ageron/handson-ml2/blob/master/17_autoencoders_and_gans.ipynb
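In case it helps, here is a minimal NumPy sketch of the latent (KL-divergence) loss term that gets attached to the model with Keras's `add_loss`. The function and variable names here are my own illustration (roughly following Geron's notebook convention), not the exact code from the repo:

```python
import numpy as np

def latent_loss(codings_mean, codings_log_var):
    """KL divergence between the encoder's Gaussian posterior
    N(mean, exp(log_var)) and the standard-normal prior, summed over
    latent dimensions and averaged over the batch."""
    kl_per_sample = -0.5 * np.sum(
        1.0 + codings_log_var - np.exp(codings_log_var) - np.square(codings_mean),
        axis=-1,
    )
    return float(np.mean(kl_per_sample))

# A posterior that already matches the prior (mean 0, log-variance 0)
# incurs zero KL cost; any deviation makes the loss positive.
print(latent_loss(np.zeros((4, 2)), np.zeros((4, 2))))
print(latent_loss(np.ones((4, 2)), np.zeros((4, 2))))
```

In the Keras model this quantity is computed as a tensor and passed to `model.add_loss(...)`, so it is added to the reconstruction loss automatically during training; the Stack Overflow answer linked above explains why that mechanism exists.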
Let me know if you have any other questions. Cheers!