Table 2  Results of five experiments under different models
The balanced sample set consisted of 1 394 positive samples and 1 394 negative samples. The positive samples were the known abnormal users, and the negative samples were drawn at random from the 8 562 normal users. The balanced set was divided into five equal parts; in each run, four parts served as the training set and the remaining part as the test set, and the experiment was repeated five times for each recognition model. The KNN classifier used K = 50. The BP neural network had 8 hidden-layer units and was trained with QuickProp; the algorithm parameters were 0.1, 2, 0.000 1 and 0.1, and the maximum number of iterations was 1 000. The XGBoost parameters fall into three groups: general parameters, booster parameters and learning-objective parameters. Among the booster parameters, the shrinkage step size (eta) was set to 0.01 to prevent overfitting, the maximum number of iterations (nrounds) was 1 500, the minimum sample weight of child nodes (min_child_weight) was 10, and the feature-sampling ratio (colsample_bytree) was 0.8. Among the learning-objective parameters, the objective function was binary logistic regression (binary:logistic) and the evaluation metric was mean average precision (map). All remaining parameters kept their default values. The results, averaged over the five runs, are shown in Table 2.
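The evaluation protocol above can be sketched as follows. This is a minimal, standard-library-only illustration, not the authors' code: the data themselves are not reproduced, only the five-fold split over the balanced set and the XGBoost settings quoted in the text (parameter names follow the R-style interface the paper appears to use; everything not listed keeps library defaults).

```python
# Sketch of the five-fold protocol: the balanced set (1 394 positive +
# 1 394 negative samples) is split into five parts; each run trains on
# four parts and tests on the fifth, and reported values average the runs.

def five_fold_indices(n_samples, n_folds=5):
    """Yield (train, test) index lists for each of the n_folds runs."""
    fold = n_samples // n_folds
    idx = list(range(n_samples))
    for k in range(n_folds):
        test = idx[k * fold:(k + 1) * fold]
        train = idx[:k * fold] + idx[(k + 1) * fold:]
        yield train, test

n = 1394 + 1394  # positives + negatives of the balanced sample set

# XGBoost settings quoted in the text; parameter names as given there:
xgboost_params = {
    "eta": 0.01,                     # shrinkage step, limits overfitting
    "nrounds": 1500,                 # maximum number of boosting iterations
    "min_child_weight": 10,          # minimum child-node sample weight
    "colsample_bytree": 0.8,         # feature-sampling ratio per tree
    "objective": "binary:logistic",  # binary logistic regression
    "eval_metric": "map",            # mean average precision
}

folds = list(five_fold_indices(n))
# Each test fold holds one fifth of the data; train and test never overlap.
assert len(folds) == 5
assert all(set(tr).isdisjoint(te) for tr, te in folds)
```

In each of the five runs, a model (KNN, BP network, or XGBoost) would be fitted on the training indices and scored on the test indices, with the per-run scores averaged to produce the entries of Table 2.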
[Table drawn 2018-12-01 by Song Xiaoyu, Sun Xiangyang and Zhao Yang, School of Electronic and Information Engineering, Lanzhou Jiaotong University.]