Original Articles

Vol. 25 No. 1 (2025): ELECTRICA

Artificial Intelligence for IT Operations–Based Performance Optimization in Desktop Virtualization

Main Article Content

Mehmet Akif Özdemir
Hikmetcan Özcan

Abstract

This study investigates artificial intelligence for IT operations (AIOps)–based performance optimization for
virtual desktop infrastructure (VDI) by integrating automatic resource allocation, dynamic load balancing, and
real-time performance monitoring. Using production telemetry from a corporate VDI cluster, we trained and
compared seven machine-learning models (XGBoost, Gradient Boosting, Random Forest, LightGBM, support
vector regression (SVR), k-nearest neighbors (KNN), and Decision Tree) to predict central processing unit (CPU),
memory, and response-time dynamics. Gradient Boosting and XGBoost achieved the lowest root mean squared
error (RMSE, ≤ 3.9 MHz) and the highest R-squared (R², ≥ 0.74), whereas SVR and KNN underperformed, likely due
to data sparsity and high dimensionality. The proposed AIOps pipeline reduces mean response time by 27%,
memory consumption by 18%, and halves manual incident tickets, enabling proactive capacity management.
These findings demonstrate the practical viability of AIOps for holistic, real-time VDI performance governance.
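The model comparison described above can be sketched as follows. This is an illustrative example, not the authors' code: it uses synthetic, hypothetical VDI-style telemetry (invented feature and target definitions) and scikit-learn implementations of a subset of the listed models, evaluating each with RMSE and R² as in the study.

```python
# Illustrative sketch only: comparing regressors on synthetic telemetry.
# The features, target, and data-generating process are assumptions, not
# the study's production VDI data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
# Hypothetical features: session count, memory load, I/O rate, time of day.
X = rng.uniform(0, 1, size=(500, 4))
# Hypothetical target: CPU frequency demand (MHz) with noise.
y = 2000 + 800 * X[:, 0] + 300 * X[:, 1] ** 2 + rng.normal(0, 50, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "GradientBoosting": GradientBoostingRegressor(random_state=0),
    "RandomForest": RandomForestRegressor(random_state=0),
    "SVR": SVR(),
    "KNN": KNeighborsRegressor(),
}

scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    scores[name] = (rmse, r2_score(y_te, pred))

for name, (rmse, r2) in scores.items():
    print(f"{name}: RMSE={rmse:.1f}, R2={r2:.3f}")
```

On data with nonlinear structure like this, the boosted and bagged tree ensembles typically score best, consistent with the abstract's ranking; XGBoost and LightGBM are omitted here to keep the sketch dependent only on scikit-learn.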

Cite this article as: M. A. Özdemir and H. Özcan, "Artificial intelligence for IT operations–based performance optimization in desktop virtualization," Electrica, 2025, 25, 0032, doi: 10.5152/electrica.2025.25032.

Article Details