brew.selection.dynamic package
Submodules
brew.selection.dynamic.base module
brew.selection.dynamic.knora module
class brew.selection.dynamic.knora.KNORA_ELIMINATE(Xval, yval, K=5, weighted=False, knn=None, v2007=False)
Bases: brew.selection.dynamic.knora.KNORA
K-nearest-oracles Eliminate.
The KNORA Eliminate method shrinks the neighborhood until it finds an ensemble of classifiers that correctly classifies every remaining neighbor.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `weighted` : bool (has no effect in KNORA_ELIMINATE) – Whether the votes of the selected classifiers are weighted.
Examples
>>> from brew.selection.dynamic.knora import KNORA_ELIMINATE
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>>
>>> dt = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=dt, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> ke = KNORA_ELIMINATE(X, y, K=5)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=ke)
>>> clf.predict([-1.1, -0.5])
[1]
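The core of the Eliminate rule is a loop that shrinks the neighborhood until oracles appear. The sketch below is a minimal illustration under stated assumptions, not brew's implementation: knora_eliminate_select is a hypothetical helper, Xval and yval are numpy arrays, and the pool holds fitted sklearn-style classifiers with a predict method.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def knora_eliminate_select(classifiers, Xval, yval, x, K=5):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    for k in range(K, 0, -1):  # progressively shrink the neighborhood
        neighbors = idx[:k]
        oracles = [c for c in classifiers
                   if np.all(c.predict(Xval[neighbors]) == yval[neighbors])]
        if oracles:  # every remaining neighbor is classified correctly
            return oracles
    return list(classifiers)  # no oracle at all: fall back to the whole pool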
See also
- brew.selection.dynamic.knora.KNORA_UNION : KNORA Union.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
References
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
Britto, Alceu S., Robert Sabourin, and Luiz ES Oliveira. “Dynamic selection of classifiers—A comprehensive review.” Pattern Recognition 47.11 (2014): 3665-3680.
Hung-Ren Ko, A., Robert Sabourin, and A. de Souza Britto. “K-nearest oracle for dynamic ensemble selection.” Document Analysis and Recognition, 2007. ICDAR 2007. Ninth International Conference on. Vol. 1. IEEE, 2007.
class brew.selection.dynamic.knora.KNORA_UNION(Xval, yval, K=5, weighted=False, knn=None)
Bases: brew.selection.dynamic.knora.KNORA
K-nearest-oracles Union.
The KNORA Union selects every classifier that correctly classifies at least one of the K nearest neighbors; each selected classifier casts one vote for every neighbor it classifies correctly.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `weighted` : bool – Whether the votes of the selected classifiers are weighted.
Examples
>>> from brew.selection.dynamic.knora import KNORA_UNION
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>>
>>> dt = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=dt, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> ku = KNORA_UNION(X, y, K=5)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=ku)
>>> clf.predict([-1.1, -0.5])
[1]
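For contrast with Eliminate, the union rule admits a short sketch as well; knora_union_select is a hypothetical helper under the same assumptions as above, not brew's API.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def knora_union_select(classifiers, Xval, yval, x, K=5):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    selected, votes = [], []
    for c in classifiers:
        hits = int(np.sum(c.predict(Xval[idx]) == yval[idx]))
        if hits > 0:  # oracle for at least one neighbor
            selected.append(c)
            votes.append(hits)  # one vote per correctly labeled neighbor
    return selected, votes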
See also
- brew.selection.dynamic.knora.KNORA_ELIMINATE : KNORA Eliminate.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
References
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
Britto, Alceu S., Robert Sabourin, and Luiz ES Oliveira. “Dynamic selection of classifiers—A comprehensive review.” Pattern Recognition 47.11 (2014): 3665-3680.
Hung-Ren Ko, A., Robert Sabourin, and A. de Souza Britto. “K-nearest oracle for dynamic ensemble selection.” Document Analysis and Recognition, 2007. ICDAR 2007. Ninth International Conference on. Vol. 1. IEEE, 2007.
brew.selection.dynamic.lca module
class brew.selection.dynamic.lca.LCA2(Xval, yval, K=5, weighted=False, knn=None)
Bases: brew.selection.dynamic.base.DCS
Local Class Accuracy.
The Local Class Accuracy method selects the best classifier for a sample using its K nearest neighbors in the validation set, estimating each classifier's accuracy on the neighbors that belong to the class the classifier assigns to the sample.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
Examples
>>> from brew.selection.dynamic.lca import LCA
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> lca = LCA(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=lca)
>>> clf.predict([-1.1, -0.5])
[1]
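The selection rule can be sketched as below. This is a minimal illustration of the usual local-class-accuracy formulation, not brew's implementation; lca_select is a hypothetical helper, and numpy arrays plus sklearn-style classifiers are assumed.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def lca_select(classifiers, Xval, yval, x, K=5):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    best, best_acc = None, -1.0
    for c in classifiers:
        assigned = c.predict([x])[0]  # class the classifier gives the query
        mask = yval[idx] == assigned  # neighbors belonging to that class
        if not mask.any():
            continue
        acc = np.mean(c.predict(Xval[idx][mask]) == assigned)
        if acc > best_acc:
            best, best_acc = c, acc
    return best if best is not None else classifiers[0]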
See also
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
References
Woods, Kevin, Kevin Bowyer, and W. Philip Kegelmeyer Jr. “Combination of multiple classifiers using local accuracy estimates.” Computer Vision and Pattern Recognition, 1996. Proceedings CVPR'96, 1996 IEEE Computer Society Conference on. IEEE, 1996.
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
brew.selection.dynamic.ola module
class brew.selection.dynamic.ola.OLA2(Xval, yval, K=5, weighted=False, knn=None)
Bases: brew.selection.dynamic.base.DCS
Overall Local Accuracy.
The Overall Local Accuracy method selects the classifier with the highest accuracy on the sample's K nearest neighbors in the validation set.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
Examples
>>> from brew.selection.dynamic.ola import OLA
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> ola = OLA(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=ola)
>>> clf.predict([-1.1, -0.5])
[1]
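OLA reduces to a single accuracy ranking over the neighborhood; the sketch below is a minimal illustration under the same assumptions as above (ola_select is a hypothetical helper, not brew's API).

import numpy as np
from sklearn.neighbors import NearestNeighbors

def ola_select(classifiers, Xval, yval, x, K=5):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    # plain accuracy over all K neighbors, regardless of class
    accs = [np.mean(c.predict(Xval[idx]) == yval[idx]) for c in classifiers]
    return classifiers[int(np.argmax(accs))]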
See also
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
References
Woods, Kevin, Kevin Bowyer, and W. Philip Kegelmeyer Jr. “Combination of multiple classifiers using local accuracy estimates.” Computer Vision and Pattern Recognition, 1996. Proceedings CVPR'96, 1996 IEEE Computer Society Conference on. IEEE, 1996.
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
Module contents
class brew.selection.dynamic.APriori(Xval, yval, K=5, weighted=False, knn=None, threshold=0.1)
Bases: brew.selection.dynamic.probabilistic.Probabilistic
A Priori Classifier Selection.
The A Priori method is a dynamic classifier selection technique that uses probability-based measures to select the best classifier for each test sample.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `threshold` : float, default = 0.1 – Threshold used to decide whether a single best classifier exists.
Examples
>>> from brew.selection.dynamic.probabilistic import APriori
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier as Tree
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = Tree(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> apriori = APriori(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=apriori)
>>> clf.predict([-1.1, -0.5])
[1]
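One common formulation of the a priori measure is the distance-weighted probability of the correct class over the neighborhood; the sketch below illustrates that formulation and is not brew's implementation. a_priori_select is a hypothetical helper, and classifiers are assumed to expose predict_proba and classes_.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def a_priori_select(classifiers, Xval, yval, x, K=5, threshold=0.1):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    dist, idx = nn.kneighbors([x])
    dist, idx = dist[0], idx[0]
    w = 1.0 / (dist + 1e-8)  # closer neighbors count more
    scores = []
    for c in classifiers:
        proba = c.predict_proba(Xval[idx])  # shape (K, n_classes)
        cols = [list(c.classes_).index(label) for label in yval[idx]]
        correct = proba[np.arange(len(idx)), cols]  # prob. of each true label
        scores.append(np.sum(w * correct) / np.sum(w))
    order = np.argsort(scores)[::-1]
    if scores[order[0]] - scores[order[1]] > threshold:
        return classifiers[order[0]]  # a clearly best single classifier
    return None  # no clear winner: caller falls back to the whole ensemble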
See also
- brew.selection.dynamic.probabilistic.APosteriori : A Posteriori DCS.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
References
Giacinto, Giorgio, and Fabio Roli. “Methods for dynamic classifier selection.” Image Analysis and Processing, 1999. Proceedings. International Conference on. IEEE, 1999.
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
class brew.selection.dynamic.APosteriori(Xval, yval, K=5, weighted=False, knn=None, threshold=0.1)
Bases: brew.selection.dynamic.probabilistic.Probabilistic
A Posteriori Classifier Selection.
The A Posteriori method is a dynamic classifier selection technique that uses probability-based measures, conditioned on the class each classifier assigns to the test sample, to select the best classifier.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `threshold` : float, default = 0.1 – Threshold used to decide whether a single best classifier exists.
Examples
>>> from brew.selection.dynamic.probabilistic import APosteriori
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier as Tree
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = Tree(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> aposteriori = APosteriori(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=aposteriori)
>>> clf.predict([-1.1, -0.5])
[1]
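The a posteriori measure additionally conditions on the class the classifier assigns to the query. The sketch below follows one common reading of the formulation in Giacinto and Roli (1999) and is not brew's implementation; a_posteriori_select is a hypothetical helper.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def a_posteriori_select(classifiers, Xval, yval, x, K=5, threshold=0.1):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    dist, idx = nn.kneighbors([x])
    dist, idx = dist[0], idx[0]
    w = 1.0 / (dist + 1e-8)
    scores = []
    for c in classifiers:
        assigned = c.predict([x])[0]  # class given to the query
        col = list(c.classes_).index(assigned)
        proba = c.predict_proba(Xval[idx])[:, col]
        mask = yval[idx] == assigned  # neighbors of the assigned class
        denom = np.sum(w * proba)
        scores.append(np.sum(w[mask] * proba[mask]) / denom if denom > 0 else 0.0)
    order = np.argsort(scores)[::-1]
    if scores[order[0]] - scores[order[1]] > threshold:
        return classifiers[order[0]]
    return None  # no clear winner: caller falls back to the whole ensemble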
See also
- brew.selection.dynamic.probabilistic.APriori : A Priori DCS.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
References
Giacinto, Giorgio, and Fabio Roli. “Methods for dynamic classifier selection.” Image Analysis and Processing, 1999. Proceedings. International Conference on. IEEE, 1999.
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
class brew.selection.dynamic.KNORA_UNION(Xval, yval, K=5, weighted=False, knn=None)
Bases: brew.selection.dynamic.knora.KNORA
K-nearest-oracles Union.
The KNORA Union selects every classifier that correctly classifies at least one of the K nearest neighbors; each selected classifier casts one vote for every neighbor it classifies correctly.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `weighted` : bool – Whether the votes of the selected classifiers are weighted.
Examples
>>> from brew.selection.dynamic.knora import KNORA_UNION
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>>
>>> dt = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=dt, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> ku = KNORA_UNION(X, y, K=5)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=ku)
>>> clf.predict([-1.1, -0.5])
[1]
See also
- brew.selection.dynamic.knora.KNORA_ELIMINATE : KNORA Eliminate.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
References
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
Britto, Alceu S., Robert Sabourin, and Luiz ES Oliveira. “Dynamic selection of classifiers—A comprehensive review.” Pattern Recognition 47.11 (2014): 3665-3680.
Hung-Ren Ko, A., Robert Sabourin, and A. de Souza Britto. “K-nearest oracle for dynamic ensemble selection.” Document Analysis and Recognition, 2007. ICDAR 2007. Ninth International Conference on. Vol. 1. IEEE, 2007.
class brew.selection.dynamic.KNORA_ELIMINATE(Xval, yval, K=5, weighted=False, knn=None, v2007=False)
Bases: brew.selection.dynamic.knora.KNORA
K-nearest-oracles Eliminate.
The KNORA Eliminate method shrinks the neighborhood until it finds an ensemble of classifiers that correctly classifies every remaining neighbor.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
- `weighted` : bool (has no effect in KNORA_ELIMINATE) – Whether the votes of the selected classifiers are weighted.
Examples
>>> from brew.selection.dynamic.knora import KNORA_ELIMINATE
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>>
>>> dt = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=dt, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> ke = KNORA_ELIMINATE(X, y, K=5)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=ke)
>>> clf.predict([-1.1, -0.5])
[1]
See also
- brew.selection.dynamic.knora.KNORA_UNION : KNORA Union.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
References
Ko, Albert HR, Robert Sabourin, and Alceu Souza Britto Jr. “From dynamic classifier selection to dynamic ensemble selection.” Pattern Recognition 41.5 (2008): 1718-1731.
Britto, Alceu S., Robert Sabourin, and Luiz ES Oliveira. “Dynamic selection of classifiers—A comprehensive review.” Pattern Recognition 47.11 (2014): 3665-3680.
Hung-Ren Ko, A., Robert Sabourin, and A. de Souza Britto. “K-nearest oracle for dynamic ensemble selection.” Document Analysis and Recognition, 2007. ICDAR 2007. Ninth International Conference on. Vol. 1. IEEE, 2007.
class brew.selection.dynamic.MCB(Xval, yval, K=5, weighted=False, knn=None, similarity_threshold=0.7, significance_threshold=0.3)
Bases: brew.selection.dynamic.base.DCS
Multiple Classifier Behavior.
The Multiple Classifier Behavior (MCB) method selects the best classifier using the similarity between the ensemble's classifications of the test sample and its classifications of the sample's K nearest neighbors in the validation set.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
Examples
>>> from brew.selection.dynamic.mcb import MCB
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> mcb = MCB(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=mcb)
>>> clf.predict([-1.1, -0.5])
[1]
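The MCB rule can be sketched as follows: describe each point by the vector of all classifiers' outputs, keep only neighbors whose output pattern is similar enough to the query's, then select the locally most accurate classifier if its lead is significant. This is a minimal illustration under assumed threshold semantics, not brew's implementation; mcb_select is a hypothetical helper.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def mcb_select(classifiers, Xval, yval, x, K=5,
               similarity_threshold=0.7, significance_threshold=0.3):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    preds = np.array([c.predict(Xval[idx]) for c in classifiers])  # (n_clf, K)
    bx = np.array([c.predict([x])[0] for c in classifiers])  # query behavior
    sim = np.mean(preds == bx[:, None], axis=0)  # per-neighbor agreement
    keep = idx[sim >= similarity_threshold]
    keep = keep if len(keep) else idx  # fall back to all K neighbors
    accs = np.array([np.mean(c.predict(Xval[keep]) == yval[keep])
                     for c in classifiers])
    best = int(np.argmax(accs))
    rest = np.delete(accs, best)
    if accs[best] - rest.max() > significance_threshold:
        return classifiers[best]  # significantly better than the rest
    return None  # no significant winner: use the whole ensemble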
See also
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
References
Giacinto, Giorgio, and Fabio Roli. “Dynamic classifier selection based on multiple classifier behaviour.” Pattern Recognition 34.9 (2001): 1879-1881.
class brew.selection.dynamic.DSKNN(Xval, yval, K=5, weighted=False, knn=None, n_1=0.7, n_2=0.3)
Bases: brew.selection.dynamic.base.DCS
DS-KNN
The DS-KNN selects an ensemble of classifiers based on their accuracy and diversity in the neighborhood of the test sample.
- `Xval` : array-like, shape = [n_samples, n_features] – Validation set.
- `yval` : array-like, shape = [n_samples] – Labels of the validation set.
- `knn` : sklearn KNeighborsClassifier – Classifier used to find the neighborhood.
Examples
>>> from brew.selection.dynamic import DSKNN
>>> from brew.generation.bagging import Bagging
>>> from brew.base import EnsembleClassifier
>>>
>>> from sklearn.tree import DecisionTreeClassifier
>>> import numpy as np
>>>
>>> X = np.array([[-1, 0], [-0.8, 1], [-0.8, -1], [-0.5, 0],
...               [0.5, 0], [1, 0], [0.8, 1], [0.8, -1]])
>>> y = np.array([1, 1, 1, 2, 1, 2, 2, 2])
>>> tree = DecisionTreeClassifier(max_depth=1, min_samples_leaf=1)
>>> bag = Bagging(base_classifier=tree, n_classifiers=10)
>>> bag.fit(X, y)
>>>
>>> sel = DSKNN(X, y, K=3)
>>>
>>> clf = EnsembleClassifier(bag.ensemble, selector=sel)
>>> clf.predict([-1.1, -0.5])
[1]
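The two-stage ranking can be sketched as below: keep the fraction n_1 of the pool that is most accurate on the neighborhood, then keep the fraction n_2 that is most diverse. Diversity is approximated here by mean disagreement, which may differ from the measure used in the paper; dsknn_select is a hypothetical helper, not brew's implementation.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def dsknn_select(classifiers, Xval, yval, x, K=5, n_1=0.7, n_2=0.3):
    # hypothetical helper, not part of brew
    nn = NearestNeighbors(n_neighbors=K).fit(Xval)
    idx = nn.kneighbors([x], return_distance=False)[0]
    preds = np.array([c.predict(Xval[idx]) for c in classifiers])  # (n_clf, K)
    accs = np.mean(preds == yval[idx], axis=1)
    n_acc = max(1, int(round(n_1 * len(classifiers))))
    top = np.argsort(accs)[::-1][:n_acc]  # most accurate fraction n_1
    # mean disagreement of each survivor with the other survivors
    div = np.array([np.mean(preds[top] != preds[i]) for i in top])
    n_div = max(1, int(round(n_2 * len(classifiers))))
    chosen = top[np.argsort(div)[::-1][:n_div]]  # most diverse fraction n_2
    return [classifiers[i] for i in chosen]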
See also
- brew.selection.dynamic.ola.OLA : Overall Local Accuracy.
- brew.selection.dynamic.lca.LCA : Local Class Accuracy.
References
Santana, Alixandre, et al. “A dynamic classifier selection method to build ensembles using accuracy and diversity.” 2006 Ninth Brazilian Symposium on Neural Networks (SBRN‘06). IEEE, 2006.