Enhancing Machine Learning through Advanced Optimized Parallelized Model Aggregation: A novel theory of Optimized Parallelized Ensemble Learning (OPEL)

Tracking #: 912-1892

Authors:
Jephter Pelekamoyo (ORCID: https://orcid.org/0000-0003-4150-8997)


Submission Type: 

Research Paper

Abstract: 

This study presents a novel 'Optimized Parallelized Ensemble Learning' (OPEL) theory, a parallelized multi-model ensemble learning framework that optimizes computational efficiency, speed, and model accuracy. This is accomplished by formulating theoretical mathematical models that guide model selection, weighting, and parallel execution strategies, and by utilizing performance metrics such as the Matthews correlation coefficient to select top-performing models. Experimental simulations on real-world datasets demonstrated significant reductions in computation time and improvements in model accuracy compared to conventional ensemble methods, and a paired t-test confirms the statistical significance of these improvements, highlighting OPEL's potential in distributed and resource-constrained environments. The paper introduces three key innovations: the Parallelized Model Execution (PME) approach, Consensus-Based Model Selection (CMS), and the Optimized Parallel Voting Mechanism (OPVM), each contributing to reduced computation time and improved model performance. The study demonstrates significant gains in computational speed and accuracy through parallelization and advanced voting techniques, with the time-complexity reduction bounded by Amdahl's Law. The proposed ensemble learning framework is validated as both computationally efficient and robust for diverse, large-scale AI applications.
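As context for the techniques the abstract names, the following is a minimal, illustrative sketch (Python with scikit-learn) of what an OPEL-style pipeline could look like: base models trained in parallel, the top performers selected by Matthews correlation coefficient (MCC), and an MCC-weighted vote combining them. The estimator choices, the top_k cutoff, and the weighting scheme are assumptions for illustration, not details taken from the paper. Under Amdahl's Law, the speedup such a pipeline can achieve is S(n) = 1 / ((1 - p) + p/n), where p is the parallelizable fraction of the workload and n the number of workers.

# Illustrative OPEL-style pipeline (not the paper's implementation):
# parallel training, MCC-based selection, MCC-weighted voting.
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def fit_and_score(model, X_tr, y_tr, X_val, y_val):
    """Train one base model and score it with MCC on held-out data."""
    model.fit(X_tr, y_tr)
    return model, matthews_corrcoef(y_val, model.predict(X_val))


if __name__ == "__main__":
    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    base_models = [
        LogisticRegression(max_iter=1000),
        DecisionTreeClassifier(random_state=0),
        RandomForestClassifier(n_estimators=100, random_state=0),
    ]
    n = len(base_models)

    # Parallelized Model Execution: each base model trains in its own process.
    with ProcessPoolExecutor() as pool:
        scored = list(pool.map(fit_and_score, base_models,
                               [X_tr] * n, [y_tr] * n,
                               [X_val] * n, [y_val] * n))

    # Consensus-Based Model Selection: keep the top_k models by MCC
    # (top_k = 2 is an arbitrary illustrative cutoff).
    top_k = 2
    scored.sort(key=lambda ms: ms[1], reverse=True)
    selected = scored[:top_k]

    # Voting step (sketched serially here): MCC-weighted majority vote,
    # assuming at least one selected model has a positive MCC.
    weights = np.array([max(mcc, 0.0) for _, mcc in selected])
    votes = np.stack([m.predict(X_val) for m, _ in selected])  # (k, n_samples)
    ensemble_pred = ((weights @ votes) / weights.sum() >= 0.5).astype(int)

    print("Ensemble MCC:", matthews_corrcoef(y_val, ensemble_pred))

The weighting scheme here (clipping negative MCC scores to zero before averaging votes) is one plausible choice; the paper's actual CMS and OPVM formulations may differ.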

Manuscript: 

Tags: 

  • Reviewed

Data repository URLs: 

none

Date of Submission: 

Friday, April 11, 2025

Date of Decision: 

Monday, April 14, 2025


Nanopublication URLs:

Decision: 

Reject (Pre-Screening)


2 Comments

Preprint Copy

Please note that a similar copy of this paper exists as a preprint; the URL is below:

https://easychair.org/publications/preprint/fzhq

I am also asking that, if there is any issue with the submission or the paper, you allow me to make adjustments rather than rejecting it.

Having a preprint out there

Having a preprint out there is fine. Just make sure you clearly indicate in the manuscript how the work builds on previously published papers, specifically what is new and what has already been reported elsewhere (preprints excepted).

The manuscript will also need to come with a data repository URL, so I will desk-reject this one, but you can submit an improved version.