Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/17067
Title: | PARTICLE SWARM OPTIMIZATION AND SUPPORT VECTOR REGRESSION HYBRID APPROACH BASED FORECASTING |
Authors: | CHAUHAN, BHAVESH KUMAR |
Keywords: | PARTICLE SWARM OPTIMIZATION; SUPPORT VECTOR REGRESSION; FORECASTING; SVM |
Issue Date: | Aug-2018 |
Series/Report no.: | TD-4775; |
Abstract: | Forecasts are made for time periods of varying duration. Short-term load forecasting has a lead time ranging from one hour to one week, and the forecasted load is an integrated load for the chosen time step. Short-term load demand in its basic form is a statistical problem, wherein the load demand is known to vary with time (time of day, week, month), causal variables (temperature and other weather conditions), and social variables (consumers' usage habits). The support vector machine (SVM), proposed by Vapnik and co-workers, is a powerful machine learning method based on statistical learning theory (SLT). SVM replaces the empirical risk minimization (ERM) principle, generally employed in traditional artificial neural networks (ANNs), with the structural risk minimization (SRM) principle. The key idea of SRM is to minimize an upper bound on the generalization error rather than the training error. Under this principle, training an SVM amounts to solving a linearly constrained quadratic programming problem, so its solution is always unique and globally optimal. SVM was originally developed for classification problems and achieved good performance. With the introduction of Vapnik's ε-insensitive loss function, SVM was extended to regression problems, giving support vector regression (SVR). Two key features in the implementation of SVR are quadratic programming and kernel functions. The SVR parameters are obtained by solving a quadratic programming problem with linear equality and inequality constraints, while the flexibility of kernel functions allows the technique to search a wide solution space. SVR avoids underfitting and overfitting of the training data by minimizing the regularization term as well as the training error. Typical kernel functions include the linear, polynomial, and Gaussian kernels. Choosing a suitable formulation for the SVR method is a problem in itself, so different kernel types are tested before selecting the best one; in this study the Gaussian kernel gives better results than the other types. A method based on SVR is proposed to improve load prediction accuracy. To ensure the generalization performance of the SVR model, particle swarm optimization (PSO) is used to obtain the optimal SVR parameters, giving the particle swarm optimization based support vector regression (PSO-SVR) model. Compared with regression, time series, and feed-forward neural network (FFNN) models, the proposed model achieves the lowest mean absolute percentage error and thus shows potential for load forecasting. |
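The abstract describes an SVR with a Gaussian (RBF) kernel whose hyperparameters are tuned by PSO and evaluated by mean absolute percentage error (MAPE). The sketch below is a minimal illustration of that idea, not the thesis implementation: it assumes synthetic hourly load data, lag-based features, scikit-learn's SVR, and a basic global-best PSO over C, ε, and the kernel width γ; the data, feature construction, search ranges, and swarm settings are all illustrative assumptions.

```python
# Hedged sketch of a PSO-SVR workflow for short-term load forecasting.
# Assumptions (not from the thesis): synthetic hourly load, 24-hour lag features,
# an RBF-kernel SVR from scikit-learn, and a plain global-best PSO that tunes
# (log10 C, log10 epsilon, log10 gamma) by validation MAPE.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic hourly "load": daily cycle plus noise (stand-in for real demand data).
t = np.arange(24 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)

# Lagged features: predict the next hour from the previous 24 hours.
lags = 24
X = np.array([load[i:i + lags] for i in range(load.size - lags)])
y = load[lags:]
split = int(0.8 * len(y))
X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

def mape(params):
    """Validation MAPE (%) of an RBF-SVR with params = (log10 C, log10 eps, log10 gamma)."""
    C, eps, gamma = 10.0 ** params
    svr = SVR(kernel="rbf", C=C, epsilon=eps, gamma=gamma).fit(X_tr, y_tr)
    pred = svr.predict(X_va)
    return np.mean(np.abs((y_va - pred) / y_va)) * 100

# Global-best PSO over the three log-scaled hyperparameters.
lo, hi = np.array([-1.0, -3.0, -5.0]), np.array([3.0, 0.0, -1.0])
n_particles, n_iters = 12, 20
pos = rng.uniform(lo, hi, (n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mape(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 3))
    # Inertia 0.7 with cognitive/social coefficients 1.5 (illustrative settings).
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([mape(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best log10(C, eps, gamma):", gbest, "| validation MAPE: %.2f%%" % pbest_f.min())
```

In this arrangement the PSO fitness is simply the SVR's out-of-sample MAPE, so the swarm converges toward hyperparameters that generalize rather than ones that merely fit the training data, which mirrors the abstract's argument for pairing PSO with SVR.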
URI: | http://dspace.dtu.ac.in:8080/jspui/handle/repository/17067 |
Appears in Collections: | M.E./M.Tech. Electrical Engineering |
Files in This Item:
File | Description | Size | Format
---|---|---|---
2K14_C&I_501_Preamble (1).pdf | | 441.89 kB | Adobe PDF
2K14_C&I_501_thesis (2).pdf | | 1.74 MB | Adobe PDF