Raul Ramos Pollan, Miguel Angel Guevara López, Naimy Gonzales de Posada
FUTURE COMPUTING 2011: The Third International Conference on Future Computational Technologies and Applications: 75-81.
September 25-30, 2011. Rome, Italy. ISBN: 978-1-61208-154-0
This paper proposes a method to adapt existing multilayer perceptron (MLP) training algorithms to optimize the area under the receiver operating characteristic curve (AUC) in binary classification tasks. It is known that error rate minimization does not necessarily yield optimal AUC. Rather than developing new MLP training algorithms for AUC optimization, we reuse the vast experience encoded in existing algorithms by replacing the error metric that guides their training process with a newly defined AUC loss function, leaving their core logic unmodified.
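The substitution described above can be illustrated with a short sketch. The function below is a hypothetical example (not the paper's actual implementation): it computes a naive pairwise AUC over a set of classifier scores and returns 1 − AUC, a quantity that an existing trainer could minimize in place of mean squared error or error rate.

```python
def auc_loss(scores, labels):
    """Hypothetical AUC-based loss: 1 - AUC, computed pairwise.

    scores: classifier outputs (higher = more positive).
    labels: 0/1 ground truth for each score.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # AUC equals the fraction of (positive, negative) pairs the classifier
    # orders correctly; ties count as half a correct pair.
    correct = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return 1.0 - correct / (len(pos) * len(neg))
```

Because this loss depends only on the ranking of scores, minimizing it directly targets the AUC rather than per-sample error.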
The new method was evaluated over 2000 MLP configurations, using four different training algorithms (backpropagation, resilient propagation, simulated annealing and genetic algorithms) on 12 binary datasets from the UCI repository. AUC improved by 5.86% on average and, in addition, the proposed definition preserves interesting properties of other error metrics. An efficient AUC calculation procedure was also developed to keep the method computationally affordable.
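The efficiency concern is real: the pairwise definition of AUC costs O(n·m) comparisons for n positives and m negatives. A standard remedy, sketched below as an assumption rather than the paper's exact procedure, is the rank-sum (Mann-Whitney U) formulation, which needs only one O(n log n) sort. For brevity this sketch assumes all scores are distinct (ties would require averaged ranks).

```python
def fast_auc(scores, labels):
    """AUC via the Mann-Whitney U statistic: one sort instead of all pairs.

    Assumes distinct scores; tied scores would need averaged ranks.
    """
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    # Sum of 1-based ranks of the positive examples in score order.
    rank_sum = sum(rank + 1 for rank, i in enumerate(order) if labels[i] == 1)
    u = rank_sum - n_pos * (n_pos + 1) / 2  # Mann-Whitney U for positives
    return u / (n_pos * n_neg)
```

Evaluating AUC inside a training loop (once per epoch or per candidate solution) makes this asymptotic improvement the difference between an affordable and an impractical method on larger datasets.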