Table 2 Results of five experiments under different models

The balanced sample set consisted of 1 394 positive samples and 1 394 negative samples. The former were the known abnormal users, and the latter were randomly selected from the 8 562 normal users. The balanced sample set was divided into five parts, four of which served as the training set and one as the test set, and the experiments were conducted five times under each recognition model. The KNN classifier took K=50. The BP neural network had 8 hidden-layer units and was trained with the QuickProp algorithm, whose parameters were 0.1, 2, 0.000 1 and 0.1, with a maximum of 1 000 iterations. The XGBoost parameters fall into three kinds: general parameters, booster parameters and learning-target parameters. In this experiment, the booster shrinkage step size (eta) was set to 0.01 to prevent overfitting, the maximum number of iterations (nrounds) was 1 500, the minimum sample weight of child nodes (min_child_weight) was 10, and the feature-sampling ratio (colsample_bytree) was 0.8. Among the learning-target parameters, the objective function was binary logistic regression (binary:logistic) and the evaluation metric was the mean average precision (map). The remaining parameters kept their default values. The results, averaged over the five experiments, are shown in Table 2.
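The sketch below illustrates how the XGBoost part of this setup could be reproduced, assuming the Python xgboost and scikit-learn APIs. The feature matrices, feature dimension, and random seed are placeholders, not values from the paper; only the sample counts, split scheme, and the listed XGBoost parameters come from the text above.

```python
# Minimal sketch of the XGBoost experiment described above (hypothetical data).
import numpy as np
import xgboost as xgb
from sklearn.model_selection import StratifiedKFold

# Balanced sample set: 1 394 abnormal (positive) users plus 1 394 users drawn
# at random from the 8 562 normal users. Random arrays stand in for the real
# feature matrices; the feature dimension 20 is illustrative.
rng = np.random.default_rng(0)
X_pos = rng.normal(size=(1394, 20))          # abnormal users (placeholder features)
X_neg_all = rng.normal(size=(8562, 20))      # normal users (placeholder features)
X_neg = X_neg_all[rng.choice(8562, size=1394, replace=False)]
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(1394), np.zeros(1394)])

# Booster and learning-target parameters as reported in the text;
# all other parameters keep xgboost's defaults.
params = {
    "objective": "binary:logistic",   # binary logistic regression
    "eta": 0.01,                      # shrinkage step size, to prevent overfitting
    "min_child_weight": 10,           # minimum sample weight of child nodes
    "colsample_bytree": 0.8,          # feature-sampling ratio per tree
    "eval_metric": "map",             # mean average precision
}

# Five experiments: the balanced set is split into five parts,
# four for training and one for testing in each run.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracies = []
for train_idx, test_idx in skf.split(X, y):
    dtrain = xgb.DMatrix(X[train_idx], label=y[train_idx])
    dtest = xgb.DMatrix(X[test_idx], label=y[test_idx])
    booster = xgb.train(params, dtrain, num_boost_round=1500)  # nrounds = 1 500
    pred = (booster.predict(dtest) > 0.5).astype(int)
    accuracies.append((pred == y[test_idx]).mean())

print("mean accuracy over five experiments:", np.mean(accuracies))
```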