Spark library for generalized K-Means clustering. Supports general Bregman divergences. Suitable for clustering probabilistic data, time series data, high-dimensional data, and very large data.
Updated Jan 19, 2024 - HTML
Trending algorithm based on the article "Trending at Instagram"
Maximum entropy and minimum divergence models in Python
Methods for computational information geometry
Decoupled Kullback-Leibler Divergence Loss (DKL), NeurIPS 2024
Kullback-Leibler projections for Bayesian model selection in Python
[CVPR 2023] Modeling Inter-Class and Intra-Class Constraints in Novel Class Discovery
Code for Variable Selection in Black Box Methods with RelATive cEntrality (RATE) Measures
Experiments with the three PPO algorithms (PPO, clipped PPO, and PPO with KL penalty) proposed by John Schulman et al., run on the 'CartPole-v1' environment.
[Python] Comparison of empirical probability distributions. Integral probability metrics (e.g. Kantorovich metric). f-divergences (e.g. Kullback-Leibler). Application to the Choquet integral.
PyTorch implementations of the beta divergence loss.
Code, data, and tutorials for "Sense organ control in moths to moles is a gamble on information through motion"
🐍 🔬 Fast Python implementation of various Kullback-Leibler divergences for 1D and 2D parametric distributions. Also provides optimized code for kl-UCB indexes
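The closed-form KL divergence between two univariate Gaussians is the simplest example of the parametric formulas such libraries implement. A minimal sketch, not the repository's API (the function name is illustrative):

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL(N(mu1, sigma1^2) || N(mu2, sigma2^2))
    for univariate Gaussians with standard deviations sigma1, sigma2 > 0."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL of a distribution with itself is zero
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # → 0.0
```

Note that the divergence is asymmetric: swapping the two Gaussians generally gives a different value.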
Non-Negative Matrix Factorization for Gene Expression Clustering
Basic GANs with a variety of loss functions, as an exercise for my thesis with Prof. Randy Paffenroth: KL, reverse-KL, JS, and Wasserstein GAN.
This project implements in Python some common statistical methods used in data analysis, including entropy, mutual information, the Kolmogorov–Smirnov test, Kullback-Leibler divergence (KLD), and A/B tests (Mann-Whitney U and t-tests).
Can we identify key events in a war by analyzing raw text from news stories?
💫 Fast Julia implementation of various Kullback-Leibler divergences for 1D parametric distributions. 🏋 Also provides optimized code for kl-UCB indexes
Using entities from NER on GOV.UK content to power personalisation.
Particle Filter tracker and square-shape detection