rmaestre / Mutual-Information
In probability theory and information theory, the mutual information of two random variables is a quantity that measures the mutual dependence between them. This script computes mutual information over discrete random variables.
☆80 · Updated 13 years ago
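For reference, a minimal sketch of a plug-in mutual information estimate for two discrete samples. This is not the repository's code; the function name, the natural-log base, and the toy data are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats from two aligned discrete samples."""
    x = np.asarray(x)
    y = np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))  # joint probability p(x, y)
            p_x = np.mean(x == xv)                 # marginal p(x)
            p_y = np.mean(y == yv)                 # marginal p(y)
            if p_xy > 0:                           # skip zero cells (0 * log 0 = 0)
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

# Example: two partially dependent discrete variables
x = [0, 0, 1, 1, 2, 2, 0, 1]
y = [0, 0, 1, 1, 1, 1, 0, 1]
print(mutual_information(x, y))
```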
Alternatives and similar repositories for Mutual-Information
Users interested in Mutual-Information are comparing it to the libraries listed below.
- Mutual Information functions for C and MATLAB ☆144 · Updated 7 years ago
- Python toolbox for nonnegative matrix factorization ☆116 · Updated 7 years ago
- Generalized Canonical Correlation Analysis ☆51 · Updated 8 years ago
- Relevance Vector Machine implementation using the scikit-learn API ☆237 · Updated 5 months ago
- Regularized kernel canonical correlation analysis in Python ☆258 · Updated 4 years ago
- A Python multiple kernel learning library ☆57 · Updated 6 years ago
- Multi-task learning via Structural Regularization ☆135 · Updated 4 years ago
- Bayesian Networks in Python