A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling.

Bryant B, Sari-Sarraf H, Long LR, Antani SK
Proceedings of the 4th IEEE/ACM International Conference on Big Data Computing, Applications and Technologies (BDCAT 2017), Austin, Texas, USA, December 2017, pp. 267-268. DOI: https://doi.org/10.1145/3148055.3149206
Abstract: 

AGEL-SVM is an extension of the kernel Support Vector Machine (SVM) designed for distributed computing via Approximate Global, Exhaustive Local (AGEL) sampling. The dual form of the SVM is typically solved with sequential minimal optimization (SMO), which iterates quickly when the full kernel matrix fits in a computer's memory. AGEL-SVM partitions the feature space into subproblems whose per-partition kernel matrices fit in memory by approximating the data outside each partition. AGEL-SVM achieves Cohen's Kappa and accuracy metrics similar to those of the underlying SMO implementation, and its training times decreased greatly when running on a 128-worker MATLAB pool on Amazon EC2. Predictor evaluation is also faster because each partition has fewer support vectors.
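The partition-then-approximate idea from the abstract can be illustrated with a minimal sketch. This is not the authors' code (which used MATLAB): it assumes a KMeans partitioning of the feature space, trains one scikit-learn RBF SVM per partition on all local points ("exhaustive local") plus a small random subsample of the points outside the partition ("approximate global"), and routes each query to its nearest partition's model. The partition count, subsampling fraction, and kernel settings are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

n_parts = 4          # number of feature-space partitions (illustrative)
global_frac = 0.05   # fraction of outside points kept as the "approximate global" sample

# Partition the feature space so each subproblem's kernel matrix stays small.
parts = KMeans(n_clusters=n_parts, n_init=10, random_state=0).fit(X)

models = []
for k in range(n_parts):
    local = parts.labels_ == k
    outside = np.flatnonzero(~local)
    # Exhaustive local points + a small random subsample of everything else.
    approx = rng.choice(outside, size=int(global_frac * outside.size), replace=False)
    idx = np.concatenate([np.flatnonzero(local), approx])
    models.append(SVC(kernel="rbf", C=1.0).fit(X[idx], y[idx]))

def predict(X_new):
    # Route each point to the SVM of its nearest partition centroid.
    owner = parts.predict(X_new)
    out = np.empty(len(X_new), dtype=y.dtype)
    for k in range(n_parts):
        mask = owner == k
        if mask.any():
            out[mask] = models[k].predict(X_new[mask])
    return out

acc = (predict(X) == y).mean()
```

In a distributed setting, each loop iteration is independent and can run on its own worker (the paper's 128-worker MATLAB pool), and each partition's smaller training set yields fewer support vectors, which is why prediction speeds up as well.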