In ensemble learning, majority voting is a powerful yet elegantly simple technique for combining predictions from multiple machine learning models. Rather than relying on a single classifier's judgment, this approach harnesses the collective wisdom of multiple diverse models to produce more robust and accurate predictions.
The core principle is democratic: each classifier casts a "vote" for what it believes to be the correct class, and the class receiving the most votes wins. This aggregation strategy leverages a fundamental insight from statistical learning theory—when individual classifiers make independent errors, combining their predictions often cancels out individual mistakes, leading to superior overall performance.
The Algorithm: Given k classifiers and n samples, you receive a 2D prediction matrix where each row i holds the predictions of classifier i, so predictions[i][j] is classifier i's predicted class for sample j.
For each sample (column), you must count how many votes each class received and select the class with the most votes as the ensemble prediction.
Handling Ties: In scenarios where multiple classes receive the same maximum number of votes (a tie), return the smallest class label among the tied classes. This deterministic tie-breaking rule ensures consistent, reproducible results across runs.
Your Task: Write a Python function that implements this majority voting ensemble strategy. Given a matrix of classifier predictions, return a list containing the consensus prediction for each sample.
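The task above can be sketched as follows. The function name `majority_vote` and the use of `collections.Counter` are illustrative choices, not part of the problem statement:

```python
from collections import Counter

def majority_vote(predictions):
    """Return the majority-vote class for each sample (column).

    predictions[i][j] is classifier i's predicted class for sample j.
    Ties are broken by choosing the smallest class label.
    """
    n_samples = len(predictions[0])
    result = []
    for j in range(n_samples):
        # Gather column j's votes across all classifiers
        votes = Counter(row[j] for row in predictions)
        top = max(votes.values())
        # Smallest label among the classes with the maximum vote count
        result.append(min(c for c, v in votes.items() if v == top))
    return result
```

The explicit `min` over the tied classes implements the deterministic tie-break; relying on `Counter.most_common` alone would not, since its ordering among ties follows insertion order rather than label value.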
Example 1:
predictions = [[0, 1, 2], [0, 2, 2], [1, 1, 2]]
Expected output: [0, 1, 2]
Sample 0 (column 0): Classifiers voted [0, 0, 1] • Class 0 receives 2 votes • Class 1 receives 1 vote • Winner: Class 0 (most votes)
Sample 1 (column 1): Classifiers voted [1, 2, 1] • Class 1 receives 2 votes • Class 2 receives 1 vote • Winner: Class 1 (most votes)
Sample 2 (column 2): Classifiers voted [2, 2, 2] • Class 2 receives 3 votes (unanimous) • Winner: Class 2 (all classifiers agree)
The final ensemble predictions are [0, 1, 2].
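The vote-counting step for Sample 0 above can be reproduced with `collections.Counter` (a sketch; the variable names are illustrative):

```python
from collections import Counter

votes = Counter([0, 0, 1])   # ballots cast for Sample 0
print(votes[0], votes[1])    # Class 0: 2 votes, Class 1: 1 vote
winner, count = votes.most_common(1)[0]
print(winner)                # Class 0 wins
```

Note that `most_common` is only safe here because there is no tie; its ordering among equal counts follows insertion order, so the tie rule from Example 2 needs explicit handling.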
Example 2:
predictions = [[0, 1], [1, 0], [2, 2]]
Expected output: [0, 0]
Sample 0 (column 0): Classifiers voted [0, 1, 2] • Class 0 receives 1 vote • Class 1 receives 1 vote • Class 2 receives 1 vote • Three-way tie! Return smallest class: Class 0
Sample 1 (column 1): Classifiers voted [1, 0, 2] • Class 0 receives 1 vote • Class 1 receives 1 vote • Class 2 receives 1 vote • Three-way tie! Return smallest class: Class 0
The final ensemble predictions are [0, 0], demonstrating tie-breaking by selecting the minimum class label.
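One way to encode the smallest-label tie-break for a single column is sketched below (using Sample 0 from this example; variable names are illustrative):

```python
from collections import Counter

column = [0, 1, 2]                  # Sample 0's votes: a three-way tie
counts = Counter(column)            # every class has exactly 1 vote
top = max(counts.values())
# min over the tied classes yields the smallest label
winner = min(c for c, v in counts.items() if v == top)
print(winner)                       # 0
```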
Example 3:
predictions = [[1, 2, 3], [1, 2, 3], [1, 2, 3]]
Expected output: [1, 2, 3]
This example demonstrates perfect classifier agreement:
Sample 0: All 3 classifiers predict class 1 → Winner: Class 1
Sample 1: All 3 classifiers predict class 2 → Winner: Class 2
Sample 2: All 3 classifiers predict class 3 → Winner: Class 3
When all classifiers are unanimous, the ensemble output simply reflects this consensus. This scenario often occurs when the samples are easily separable, representing high-confidence predictions.
The final ensemble predictions are [1, 2, 3].
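If the class labels are non-negative integers, the whole procedure can also be vectorized with NumPy. `np.argmax` over a `np.bincount` returns the first (i.e. smallest) index among tied counts, which happens to match the required tie-break. This is an optional sketch, not part of the problem statement:

```python
import numpy as np

def majority_vote_np(predictions):
    # Shape (k classifiers, n samples); assumes non-negative integer labels
    P = np.asarray(predictions)
    # bincount tallies votes per class; argmax picks the smallest label on ties
    return [int(np.bincount(P[:, j]).argmax()) for j in range(P.shape[1])]
```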
Constraints