Today's article comes from the journal Machine Learning and Knowledge Extraction. The authors are Nassar et al. from Hamad Bin Khalifa University in Qatar. In this paper they evaluate two replacements for KL divergence within t-SNE: Max-Flipped KL Divergence (KLmax) and KL-Wasserstein Loss.
DOI: 10.3390/make8020047
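For context, the quantity being replaced is the standard t-SNE objective: the KL divergence between the high-dimensional pairwise affinities P and the low-dimensional affinities Q. Here is a minimal NumPy sketch of that baseline loss; the function name and the clipping constant are illustrative choices, not from the paper, and the paper's KLmax and KL-Wasserstein variants are not implemented here.

```python
import numpy as np

def kl_divergence(P, Q, eps=1e-12):
    """Baseline t-SNE loss: KL(P || Q) = sum_ij p_ij * log(p_ij / q_ij).

    P and Q are pairwise affinity matrices (each summing to 1) for the
    high- and low-dimensional embeddings. Entries are clipped to eps to
    avoid log(0); this is a common numerical convenience, not part of
    the paper's formulation.
    """
    P = np.clip(P, eps, None)
    Q = np.clip(Q, eps, None)
    return float(np.sum(P * np.log(P / Q)))

# Toy check: identical affinity matrices give zero divergence.
P = np.array([[0.0, 0.5], [0.5, 0.0]])
Q = np.array([[0.0, 0.5], [0.5, 0.0]])
print(kl_divergence(P, Q))  # → 0.0
```

A notable property of this loss, which motivates alternatives like those in the paper, is its asymmetry: KL(P || Q) penalizes placing nearby points far apart much more heavily than the reverse.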