Table 3 Results of five experiments under different models
The imbalanced sample set consisted of 1,394 positive samples and 8,562 negative samples; the former were the known abnormal users, and the latter were all normal users. The sample set was divided into five parts, four of which served as the training set and one as the test set, and the experiments were conducted five times under each recognition model. The KNN classifier used K = 100. The BP neural network had 5 hidden-layer units and was trained with the QuickProp algorithm, whose parameters were 0.1, 2, 0.0001, and 0.1, with a maximum of 100 iterations. The XGBoost parameters fall into three groups: general parameters, booster parameters, and learning-target parameters. In this experiment, the booster shrinkage step size (eta) was set to 0.01 to prevent overfitting, the maximum number of iterations (nrounds) was 1,500, the minimum sample weight of child nodes (min_child_weight) was 10, and the feature-sampling ratio (colsample_bytree) was 0.8. Among the learning-target parameters, the objective function was binary logistic regression (binary:logistic) and the evaluation metric was mean average precision (map). All remaining parameters kept their default values. The results are shown in Table 3; the reported values are averages over the five experiments. A minimal sketch of this training configuration is given below.
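
The following sketch illustrates how the stated XGBoost configuration and five-fold split could be set up in Python with the xgboost and scikit-learn packages. The feature matrix here is a synthetic placeholder (20 random features), since the actual user features are not described in this section; only the parameter values and the sample counts come from the text.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import StratifiedKFold

# Placeholder data: 1,394 abnormal (positive) and 8,562 normal (negative) users.
# The 20 random features are an assumption; the real features are not given here.
rng = np.random.default_rng(0)
X = rng.normal(size=(1394 + 8562, 20))
y = np.concatenate([np.ones(1394), np.zeros(8562)])

# Booster and learning-target parameters listed in the text; all others keep defaults.
params = {
    "eta": 0.01,                     # shrinkage step size, to limit overfitting
    "min_child_weight": 10,          # minimum sample weight of child nodes
    "colsample_bytree": 0.8,         # feature-sampling ratio per tree
    "objective": "binary:logistic",  # binary logistic regression
    "eval_metric": "map",            # mean average precision
}

# Five-part split: four parts for training, one for testing, repeated five times.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in skf.split(X, y):
    dtrain = xgb.DMatrix(X[train_idx], label=y[train_idx])
    dtest = xgb.DMatrix(X[test_idx], label=y[test_idx])
    model = xgb.train(params, dtrain, num_boost_round=1500,
                      evals=[(dtest, "test")], verbose_eval=False)
```

The reported result would then be the average of the evaluation metric over the five folds, matching the averaging described above.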