In this paper, a novel method for reducing high-order systems to first-order plus time-delay (FOPTD) forms is introduced. For this purpose, Support Vector Machines, a popular learning algorithm, are employed: the three parameters of the FOPTD model (the gain, the time constant, and the time delay) are estimated by three parallel support vector regression machines. Satisfactory performance is obtained in simulations.
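To make the estimation target concrete, an FOPTD model has the form G(s) = K e^(-θs) / (τs + 1), so the three quantities to be learned are the gain K, the time constant τ, and the time delay θ. The sketch below is an illustrative assumption rather than the paper's implementation: it uses scikit-learn's SVR to realize three parallel regression machines, each mapping a feature vector describing a high-order system to one FOPTD parameter. The feature representation (here a placeholder random vector standing in for, e.g., sampled step-response values), the training targets, and all hyperparameters are hypothetical.

```python
# Minimal sketch of the three-parallel-SVR architecture; NOT the authors'
# implementation. Features, targets, and hyperparameters are placeholders.
import numpy as np
from sklearn.svm import SVR

# Hypothetical training data: each row is a feature vector extracted from a
# high-order system (e.g., sampled step response); the targets are the FOPTD
# parameters K (gain), tau (time constant), and theta (time delay).
rng = np.random.default_rng(0)
X_train = rng.standard_normal((200, 50))   # 200 systems, 50 features each
y_K     = rng.uniform(0.5, 2.0, 200)       # placeholder gain targets
y_tau   = rng.uniform(1.0, 10.0, 200)      # placeholder time-constant targets
y_theta = rng.uniform(0.1, 5.0, 200)       # placeholder time-delay targets

# One independent SVR machine per parameter, operating in parallel.
svr_K     = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_K)
svr_tau   = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_tau)
svr_theta = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_theta)

# Reduce a new high-order system: predict its three FOPTD parameters.
x_new = rng.standard_normal((1, 50))
K_hat     = svr_K.predict(x_new)[0]
tau_hat   = svr_tau.predict(x_new)[0]
theta_hat = svr_theta.predict(x_new)[0]
print(f"G(s) = {K_hat:.2f} * exp(-{theta_hat:.2f} s) / ({tau_hat:.2f} s + 1)")
```

Because the three machines are independent, each kernel and regularization setting can be tuned separately for its own parameter, which is one practical motivation for the parallel structure.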