
    Machine Learning (II): How to Reach the Top 2% on Kaggle

    This is an original article. If you repost it, please be sure to place the following note at the top:
    reposted from Jishu Shijie (技术世界); original link: http://www.luozeyang.com/ml/classification/

    Abstract

    This article details how to reach the top 2% of the leaderboard in Kaggle's Titanic survival prediction competition, a classification problem, through data preview, exploratory data analysis, missing-value imputation, removal of correlated features, and derivation of new features.

    Competition overview

    Titanic survival prediction is one of the Kaggle competitions with the largest number of participants. Entrants analyze the training set to learn which kinds of passengers were more likely to survive, and then predict whether each passenger in the test set survived.

    The task is a binary classification problem.

    How to reach the top 2%

    Loading the data

    Before loading the data, load all of the R libraries that will be used later:

    library(readr)            # File read / write
    library(ggplot2)          # Data visualization
    library(ggthemes)         # Data visualization
    library(scales)           # Data visualization
    library(plyr)             # Data manipulation (load before dplyr)
    library(stringr)          # String manipulation
    library(InformationValue) # IV / WOE calculation
    library(MLmetrics)        # Machine learning metrics, e.g. Recall, Precision, Accuracy, AUC
    library(rpart)            # Decision tree utils
    library(randomForest)     # Random Forest
    library(dplyr)            # Data manipulation
    library(e1071)            # SVM
    library(Amelia)           # Missing value utils
    library(party)            # Conditional inference trees
    library(gbm)              # AdaBoost
    library(class)            # KNN

    Load the training data and the test data into data.frames named train and test:

    train <- read_csv("train.csv")
    test <- read_csv("test.csv")

    The same transformations must later be applied to both the training and test data. To avoid duplicated work and inconsistencies, and in particular to avoid problems with previously unseen levels of categorical variables, it is recommended to merge the two data sets and transform them together.

    data <- bind_rows(train, test)
    train.row <- 1:nrow(train)
    test.row <- (1 + nrow(train)):(nrow(train) + nrow(test))

    Data preview

    First, take a look at the data:

    str(data)
    ## Classes 'tbl_df', 'tbl' and 'data.frame':    1309 obs. of  12 variables:
    ##  $ PassengerId: int  1 2 3 4 5 6 7 8 9 10 ...
    ##  $ Survived   : int  0 1 1 1 0 0 0 0 1 1 ...
    ##  $ Pclass     : int  3 1 3 1 3 3 1 3 3 2 ...
    ##  $ Name       : chr  "Braund, Mr. Owen Harris" "Cumings, Mrs. John Bradley (Florence Briggs Thayer)" "Heikkinen, Miss. Laina" "Futrelle, Mrs. Jacques Heath (Lily May Peel)" ...
    ##  $ Sex        : chr  "male" "female" "female" "female" ...
    ##  $ Age        : num  22 38 26 35 35 NA 54 2 27 14 ...
    ##  $ SibSp      : int  1 1 0 1 0 0 0 3 0 1 ...
    ##  $ Parch      : int  0 0 0 0 0 0 0 1 2 0 ...
    ##  $ Ticket     : chr  "A/5 21171" "PC 17599" "STON/O2. 3101282" "113803" ...
    ##  $ Fare       : num  7.25 71.28 7.92 53.1 8.05 ...
    ##  $ Cabin      : chr  NA "C85" NA "C123" ...
    ##  $ Embarked   : chr  "S" "C" "S" "S" ...
    

    As shown above, the data set contains 12 variables and 1309 observations, of which 891 are training data and 418 are test data.

    • PassengerId: integer; the passenger's ID. It is simply an incrementing index and does not help prediction
    • Survived: integer; whether the passenger survived, where 0 means perished and 1 means survived. It is convenient to convert it to a factor
    • Pclass: integer; the passenger's socio-economic status, where 1 means Upper, 2 means Middle and 3 means Lower
    • Name: character; besides the surname and given name, it contains Western-style titles such as Mr., Mrs. and Dr.
    • Sex: character; the passenger's sex, suitable for conversion to a factor
    • Age: numeric; the passenger's age, with missing values
    • SibSp: integer; the number of siblings and spouses aboard (Sib stands for Sibling, Sp for Spouse)
    • Parch: integer; the number of parents and children aboard (Par stands for Parent, Ch for Child)
    • Ticket: character; the passenger's ticket number
    • Fare: numeric; the ticket fare
    • Cabin: character; the passenger's cabin, with missing values
    • Embarked: character; the port of embarkation, suitable for conversion to a factor

    Exploratory data analysis

    The higher a passenger's class, the higher the survival rate

    Before examining the first variable, Pclass, convert the target variable Survived to a factor:

    data$Survived <- factor(data$Survived)

    The number of survivors and victims in each Pclass can then be counted and plotted as follows:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Pclass, y = ..count.., fill=Survived)) + 
    geom_bar(stat = "count", position='dodge') +
    xlab('Pclass') +
    ylab('Count') +
    ggtitle('How Pclass impact survivor') +
    scale_fill_manual(values=c("#FF0000", "#00FF00")) +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that most passengers with Pclass = 1 survived, roughly half of those with Pclass = 2 survived, and fewer than 25% of those with Pclass = 3 survived.

    To quantify the predictive value of Pclass, compute its WOE (Weight of Evidence) and IV (Information Value) as shown below. The results give Pclass an IV of 0.5, rated "Highly Predictive", so Pclass can tentatively be kept as one of the model's feature variables.

    WOETable(X=factor(data$Pclass[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL     PCT_G     PCT_B        WOE         IV
    ## 1   1   136   80   216 0.3976608 0.1457195  1.0039160 0.25292792
    ## 2   2    87   97   184 0.2543860 0.1766849  0.3644848 0.02832087
    ## 3   3   119  372   491 0.3479532 0.6775956 -0.6664827 0.21970095
    
    IV(X=factor(data$Pclass[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.5009497
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Survival rates differ by Title

    Passenger names repeat far too rarely to be used directly, but they contain culturally meaningful titles such as Mr., Mrs. and Dr., which can be extracted.

    This article extracts each passenger's Title from the name as follows:

    data$Title <- sapply(data$Name, FUN=function(x) {strsplit(x, split='[,.]')[[1]][2]}) # text between ',' and '.'
    data$Title <- sub(' ', '', data$Title)                                               # drop the leading space
    data$Title[data$Title %in% c('Mme', 'Mlle')] <- 'Mlle'                               # merge equivalent or rare titles
    data$Title[data$Title %in% c('Capt', 'Don', 'Major', 'Sir')] <- 'Sir'
    data$Title[data$Title %in% c('Dona', 'Lady', 'the Countess', 'Jonkheer')] <- 'Lady'
    data$Title <- factor(data$Title)

    After extracting the Title, count the survivors and victims for each Title:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Title, y = ..count.., fill=Survived)) + 
    geom_bar(stat = "count", position='stack') +
    xlab('Title') +
    ylab('Count') +
    ggtitle('How Title impact survivor') +
    scale_fill_discrete(name="Survived", breaks=c(0, 1), labels=c("Perish", "Survived")) +
    geom_text(stat = "count", aes(label = ..count..), position=position_stack(vjust = 0.5)) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that passengers titled Mr had a very low survival rate, while passengers titled Mrs or Miss had a very high one. WOE and IV are again used to quantify whether Title is useful for prediction. The computed IV is 1.487853 and "Highly Predictive", so Title can tentatively be kept as a feature variable of the prediction model.

    WOETable(X=data$Title[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##       CAT GOODS BADS TOTAL       PCT_G       PCT_B         WOE            IV
    ## 1     Col     1    1     2 0.002873563 0.001808318  0.46315552  4.933741e-04
    ## 2      Dr     3    4     7 0.008620690 0.007233273  0.17547345  2.434548e-04
    ## 3    Lady     2    1     3 0.005747126 0.001808318  1.15630270  4.554455e-03
    ## 4  Master    23   17    40 0.066091954 0.030741410  0.76543639  2.705859e-02
    ## 5    Miss   127   55   182 0.364942529 0.099457505  1.30000942  3.451330e-01
    ## 6    Mlle     3    3     3 0.008620690 0.005424955  0.46315552  1.480122e-03
    ## 7      Mr    81  436   517 0.232758621 0.788426763 -1.22003757  6.779360e-01
    ## 8     Mrs    99   26   125 0.284482759 0.047016275  1.80017883  4.274821e-01
    ## 9      Ms     1    1     1 0.002873563 0.001808318  0.46315552  4.933741e-04
    ## 10    Rev     6    6     6 0.017241379 0.010849910  0.46315552  2.960244e-03
    ## 11    Sir     2    3     5 0.005747126 0.005424955  0.05769041  1.858622e-05
    
    IV(X=data$Title[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 1.487853
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Women survived at a far higher rate than men

    For the Sex variable, the background of the Titanic's sinking, in which the "women and children first" rule governed the evacuation, suggests that Sex should help predict whether a passenger survived.

    The data below confirm this: most women survived (233/(233+81) = 74.20%), while only a small fraction of men did (109/(109+468) = 18.89%).

    data$Sex <- as.factor(data$Sex)
    ggplot(data = data[1:nrow(train),], mapping = aes(x = Sex, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Sex') +
    ylab('Count') +
    ggtitle('How Sex impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")
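
    The survival rates quoted above can also be read directly from a contingency table; a minimal sketch, not part of the original workflow:

    table(data$Sex[1:nrow(train)], data$Survived[1:nrow(train)])                          # raw counts
    prop.table(table(data$Sex[1:nrow(train)], data$Survived[1:nrow(train)]), margin = 1)  # per-sex survival rates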

    Computing WOE and IV shows that Sex has an IV of 1.34 and is "Highly Predictive", so Sex can tentatively be kept as a feature variable.

    WOETable(X=data$Sex[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##      CAT GOODS BADS TOTAL     PCT_G    PCT_B        WOE        IV
    ## 1 female   233   81   314 0.6812865 0.147541  1.5298770 0.8165651
    ## 2   male   109  468   577 0.3187135 0.852459 -0.9838327 0.5251163
    
    IV(X=data$Sex[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 1.341681
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Minors survived at a higher rate than adults

    Given the "women and children first" rule, minors should have been more likely to survive. As the chart below shows, among passengers with Age < 18 the survivors indeed outnumber the victims, while among young and middle-aged adults the victims far outnumber the survivors.

    ggplot(data = data[(!is.na(data$Age)) & row(data[, 'Age']) <= 891, ], aes(x = Age, color=Survived)) + 
    geom_line(stat = 'bin', binwidth=5) +
    labs(title = "How Age impact survivor", x = "Age", y = "Count", fill = "Survived")
    

    Passengers with a moderate number of siblings and spouses were more likely to survive

    For the SibSp variable, count the survivors and victims separately:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = SibSp, y = ..count.., fill=Survived)) + 
    geom_bar(stat = 'count', position='dodge') +
    labs(title = "How SibSp impact survivor", x = "SibSp", y = "Count", fill = "Survived") +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that passengers with SibSp = 0 had a survival rate below 1/3, those with SibSp of 1 or 2 had survival rates above 50%, and those with SibSp of 3 or more had very low survival rates. WOE and IV quantify SibSp's contribution to prediction: the IV is 0.1448994 and "Highly Predictive".

    WOETable(X=as.factor(data$SibSp[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G       PCT_B        WOE          IV
    ## 1   0   210  398   608 0.593220339 0.724954463 -0.2005429 0.026418349
    ## 2   1   112   97   209 0.316384181 0.176684882  0.5825894 0.081387334
    ## 3   2    13   15    28 0.036723164 0.027322404  0.2957007 0.002779811
    ## 4   3     4   12    16 0.011299435 0.021857923 -0.6598108 0.006966604
    ## 5   4     3   15    18 0.008474576 0.027322404 -1.1706364 0.022063953
    ## 6   5     5    5     5 0.014124294 0.009107468  0.4388015 0.002201391
    ## 7   8     7    7     7 0.019774011 0.012750455  0.4388015 0.003081947
    
    IV(X=as.factor(data$SibSp[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1448994
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers with 1 to 3 parents or children were more likely to survive

    For the Parch variable, count the survivors and victims separately:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Parch, y = ..count.., fill=Survived)) + 
    geom_bar(stat = 'count', position='dodge') +
    labs(title = "How Parch impact survivor", x = "Parch", y = "Count", fill = "Survived") +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that passengers with Parch = 0 had a survival rate below 1/3, those with Parch of 1 to 3 had survival rates above 50%, and those with Parch of 4 or more had very low survival rates. WOE and IV quantify Parch's contribution to prediction: the IV is 0.1166611 and "Highly Predictive".

    WOETable(X=as.factor(data$Parch[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G       PCT_B        WOE          IV
    ## 1   0   233  445   678 0.671469741 0.810564663 -0.1882622 0.026186312
    ## 2   1    65   53   118 0.187319885 0.096539162  0.6628690 0.060175728
    ## 3   2    40   40    80 0.115273775 0.072859745  0.4587737 0.019458440
    ## 4   3     3    2     5 0.008645533 0.003642987  0.8642388 0.004323394
    ## 5   4     4    4     4 0.011527378 0.007285974  0.4587737 0.001945844
    ## 6   5     1    4     5 0.002881844 0.007285974 -0.9275207 0.004084922
    ## 7   6     1    1     1 0.002881844 0.001821494  0.4587737 0.000486461
    
    IV(X=as.factor(data$Parch[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1166611
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers with a FamilySize of 2 to 4 were more likely to survive

    Both SibSp and Parch show that passengers travelling with no relatives had low survival rates, passengers with a few relatives had survival rates above 50%, and passengers with many relatives again had low survival rates. It is therefore worth adding SibSp and Parch together (plus one for the passenger himself, as in the code below) to derive a new variable, FamilySize.

    data$FamilySize <- data$SibSp + data$Parch + 1
    ggplot(data = data[1:nrow(train),], mapping = aes(x = FamilySize, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('FamilySize') +
    ylab('Count') +
    ggtitle('How FamilySize impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    Computing the WOE and IV of FamilySize gives an IV of 0.3497672, "Highly Predictive". Since the IV of the derived variable FamilySize is higher than that of either SibSp or Parch, FamilySize can be used as a feature variable.

    WOETable(X=as.factor(data$FamilySize[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G      PCT_B        WOE           IV
    ## 1   1   163  374   537 0.459154930 0.68123862 -0.3945249 0.0876175539
    ## 2   2    89   72   161 0.250704225 0.13114754  0.6479509 0.0774668616
    ## 3   3    59   43   102 0.166197183 0.07832423  0.7523180 0.0661084057
    ## 4   4    21    8    29 0.059154930 0.01457195  1.4010615 0.0624634998
    ## 5   5     3   12    15 0.008450704 0.02185792 -0.9503137 0.0127410643
    ## 6   6     3   19    22 0.008450704 0.03460838 -1.4098460 0.0368782940
    ## 7   7     4    8    12 0.011267606 0.01457195 -0.2571665 0.0008497665
    ## 8   8     6    6     6 0.016901408 0.01092896  0.4359807 0.0026038712
    ## 9  11     7    7     7 0.019718310 0.01275046  0.4359807 0.0030378497
    
    IV(X=as.factor(data$FamilySize[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.3497672
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers sharing a ticket number survived at a higher rate

    The Ticket variable repeats too rarely to be used directly. First count the number of passengers on each ticket:

    ticket.count <- aggregate(data$Ticket, by = list(data$Ticket), function(x) sum(!is.na(x)))

    Here is a hypothesis: passengers with the same ticket number belong to one family and were likely to survive or perish together. Split all passengers by Ticket into two groups, one holding a unique ticket number and the other sharing a ticket number with others, and count the survivors and victims in each group:

    data$TicketCount <- apply(data, 1, function(x) ticket.count[which(ticket.count[, 1] == x['Ticket']), 2])
    data$TicketCount <- factor(sapply(data$TicketCount, function(x) ifelse(x > 1, 'Share', 'Unique')))
    ggplot(data = data[1:nrow(train),], mapping = aes(x = TicketCount, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('TicketCount') +
    ylab('Count') +
    ggtitle('How TicketCount impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that only 130/(130+351) = 27% of the passengers who did not share a ticket number survived, versus 212/(212+198) = 51.7% of those who did. The WOE and IV of TicketCount are computed below: the IV is 0.2751882 and "Highly Predictive".

    WOETable(X=data$TicketCount[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##      CAT GOODS BADS TOTAL    PCT_G     PCT_B        WOE        IV
    ## 1  Share   212  198   410 0.619883 0.3606557  0.5416069 0.1403993
    ## 2 Unique   130  351   481 0.380117 0.6393443 -0.5199641 0.1347889
    
    IV(X=data$TicketCount[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 0.2751882
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    The higher the fare paid, the higher the survival rate

    For the Fare variable, the chart below shows that the survival rate rises as Fare increases.

    ggplot(data = data[(!is.na(data$Fare)) & row(data[, 'Fare']) <= 891, ], aes(x = Fare, color=Survived)) + 
    geom_line(stat = 'bin', binwidth=10) +
    labs(title = "How Fare impact survivor", x = "Fare", y = "Count", fill = "Survived")

    Survival rates differ by cabin

    The Cabin variable starts with a letter followed by digits. Here is a hypothesis: the letter denotes an area of the ship and the digits a position within that area, much like a train ticket carrying both a carriage number and a seat number. The first letter of Cabin can therefore be extracted, and the survival rate computed for each leading letter.

    ggplot(data[1:nrow(train), ], mapping = aes(x = as.factor(sapply(data$Cabin[1:nrow(train)], function(x) str_sub(x, start = 1, end = 1))), y = ..count.., fill = Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Cabin') +
    ylab('Count') +
    ggtitle('How Cabin impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that passengers whose cabin letter is B, C, D, E or F had survival rates above 50%, while passengers in the other cabins had survival rates far below 50%. The WOE and IV of the cabin variable are computed below: Cabin's IV is 0.1866526 and "Highly Predictive".

    data$Cabin <- sapply(data$Cabin, function(x) str_sub(x, start = 1, end = 1))
    WOETable(X=as.factor(data$Cabin[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL      PCT_G      PCT_B        WOE          IV
    ## 1   A     7    8    15 0.05109489 0.11764706 -0.8340046 0.055504815
    ## 2   B    35   12    47 0.25547445 0.17647059  0.3699682 0.029228917
    ## 3   C    35   24    59 0.25547445 0.35294118 -0.3231790 0.031499197
    ## 4   D    25    8    33 0.18248175 0.11764706  0.4389611 0.028459906
    ## 5   E    24    8    32 0.17518248 0.11764706  0.3981391 0.022907100
    ## 6   F     8    5    13 0.05839416 0.07352941 -0.2304696 0.003488215
    ## 7   G     2    2     4 0.01459854 0.02941176 -0.7004732 0.010376267
    ## 8   T     1    1     1 0.00729927 0.01470588 -0.7004732 0.005188134
    
    IV(X=as.factor(data$Cabin[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1866526
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers who embarked at S had a lower survival rate

    The Embarked variable is the port of embarkation. Compute the survival rate for each port to judge whether Embarked can help predict survival:

    ggplot(data[1:nrow(train), ], mapping = aes(x = Embarked, y = ..count.., fill = Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Embarked') +
    ylab('Count') +
    ggtitle('How Embarked impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart shows that passengers who embarked at S had a survival rate of only 217/(217+427) = 33.7%, while those who embarked at C, or whose port is NA, had survival rates above 50%. This preliminarily suggests that Embarked can be used to predict survival. Its WOE and IV are computed below.

    WOETable(X=as.factor(data$Embarked[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL      PCT_G     PCT_B        WOE           IV
    ## 1   C    93   75   168 0.27352941 0.1366120  0.6942642 9.505684e-02
    ## 2   Q    30   47    77 0.08823529 0.0856102  0.0302026 7.928467e-05
    ## 3   S   217  427   644 0.63823529 0.7777778 -0.1977338 2.759227e-02
    
    IV(X=as.factor(data$Embarked[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1227284
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    The computation above gives an IV of 0.1227284, rated "Highly Predictive".

    Filling in missing values

    List all the missing data:

    attach(data)
    missing <- list(Pclass=nrow(data[is.na(Pclass), ]))
    missing$Name <- nrow(data[is.na(Name), ])
    missing$Sex <- nrow(data[is.na(Sex), ])
    missing$Age <- nrow(data[is.na(Age), ])
    missing$SibSp <- nrow(data[is.na(SibSp), ])
    missing$Parch <- nrow(data[is.na(Parch), ])
    missing$Ticket <- nrow(data[is.na(Ticket), ])
    missing$Fare <- nrow(data[is.na(Fare), ])
    missing$Cabin <- nrow(data[is.na(Cabin), ])
    missing$Embarked <- nrow(data[is.na(Embarked), ])
    for (name in names(missing)) {
      if (missing[[name]][1] > 0) {
        print(paste('', name, ' miss ', missing[[name]][1], ' values', sep = ''))
      }
    }
    detach(data)
    ## [1] "Age miss 263 values"
    ## [1] "Fare miss 1 values"
    ## [1] "Cabin miss 1014 values"
    ## [1] "Embarked miss 2 values"
    

    Predicting passenger age

    263 passengers are missing Age, which is too many to fill with the median or the mean. Such gaps are usually filled either by predicting the value from other variables or by assigning a default value; here the missing ages are predicted from the other variables.

    age.model <- rpart(Age ~ Pclass + Sex + SibSp + Parch + Fare + Embarked + Title + FamilySize, data=data[!is.na(data$Age), ], method='anova')
    data$Age[is.na(data$Age)] <- predict(age.model, data[is.na(data$Age), ])

    Filling the missing Embarked values using the fare median

    The data below show that both passengers with a missing Embarked value have Pclass = 1 and Fare = 80.

    data[is.na(data$Embarked), c('PassengerId', 'Pclass', 'Fare', 'Embarked')]
    ## # A tibble: 2 × 4
    ##   PassengerId Pclass  Fare Embarked
    ##         <int>  <int> <dbl>    <chr>
    ## 1          62      1    80     <NA>
    ## 2         830      1    80     <NA>
    

    As the chart below shows, the median Fare of passengers with Embarked = C and Pclass = 1 is 80.

    ggplot(data[!is.na(data$Embarked),], aes(x=Embarked, y=Fare, fill=factor(Pclass))) +
    geom_boxplot() +
    geom_hline(aes(yintercept=80), color='red', linetype='dashed', lwd=2) +
    scale_y_continuous(labels=dollar_format()) + theme_few()

    Fare median value of each Embarked and Pclass

    The missing Embarked values can therefore be set to 'C'.

    data$Embarked[is.na(data$Embarked)] <- 'C'
    data$Embarked <- as.factor(data$Embarked)

    Filling the single missing Fare value with the median

    Since very few records are missing Fare, the gap can simply be filled with the mean or the median; here the median Fare is used.

    data$Fare[is.na(data$Fare)] <- median(data$Fare, na.rm=TRUE)

    Setting missing Cabin values to a default

    Too many records are missing Cabin for the median or the mean to be appropriate, and such gaps are usually filled either by predicting from other variables or by assigning a default value. Cabin is hard to predict from the other variables, and in the previous section its IV was already fairly high when NA was treated as its own category, so the missing Cabin values are simply set to a default here.

    data$Cabin <- as.factor(sapply(data$Cabin, function(x) ifelse(is.na(x), 'X', str_sub(x, start = 1, end = 1))))

    Training the model

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + SibSp + Parch + FamilySize + TicketCount + Fare + Cabin + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))

    Cross-validation

    Normally the training data should be split into a training part and a validation part, or k-fold cross-validation should be used. This article instead trains on all of the training data and then draws a random 30% sample for validation.

    cv.summarize <- function(data.true, data.predict) {
      print(paste('Recall:', Recall(data.true, data.predict)))
      print(paste('Precision:', Precision(data.true, data.predict)))
      print(paste('Accuracy:', Accuracy(data.predict, data.true)))
      print(paste('AUC:', AUC(data.predict, data.true)))
    }
    set.seed(415)
    cv.test.sample <- sample(1:nrow(train), as.integer(0.3 * nrow(train)), replace = TRUE)
    cv.test <- data[cv.test.sample,]
    cv.prediction <- predict(model, cv.test, OOB=TRUE, type = "response")
    cv.summarize(cv.test$Survived, cv.prediction)
    ## [1] "Recall: 0.947976878612717"
    ## [1] "Precision: 0.841025641025641"
    ## [1] "Accuracy: 0.850187265917603"
    ## [1] "AUC: 0.809094822285082"
    

    Prediction

    predict.result <- predict(model, data[(1+nrow(train)):(nrow(data)), ], OOB=TRUE, type = "response")
    output <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(output, file = "cit1.csv", row.names = FALSE)

    This model scores 0.80383 on Kaggle, ranking 992nd, i.e. within the top 992/6292 = 15.8%.

    Tuning

    Removing correlated features

    Since FamilySize already combines the information in SibSp and Parch, try removing SibSp and Parch from the feature variables.

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + FamilySize + TicketCount + Fare + Cabin + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit2.csv", row.names = FALSE)

    This model still scores 0.80383 on Kaggle.

    Removing Cabin, which has a low IV

    Since Cabin's IV is relatively low, consider removing it from the model.

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + FamilySize + TicketCount + Fare + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit3.csv", row.names = FALSE)

    This model still scores 0.80383 on Kaggle.

    Adding derived features

    The Title variable was derived from Name above. For the reasons below, a passenger's surname may also carry some predictive power:

    • In some Western countries, given names repeat frequently while surnames repeat rarely, so a surname is fairly distinctive
    • In some countries, a surname carries some information about social standing
    • Passengers with the same surname are likely to belong to one family (again because surnames repeat rarely), and families tended to survive or perish together

    A surname that appears only once cannot appear in both the training and the test set, so it has no distinguishing or predictive value; all such surnames are therefore renamed 'Small'.

    data$Surname <- sapply(data$Name, FUN=function(x) {strsplit(x, split='[,.]')[[1]][1]})
    data$FamilyID <- paste(as.character(data$FamilySize), data$Surname, sep="")
    data$FamilyID[data$FamilySize <= 2] <- 'Small'
    # Delete erroneous family IDs
    famIDs <- data.frame(table(data$FamilyID))
    famIDs <- famIDs[famIDs$Freq <= 2,]
    data$FamilyID[data$FamilyID %in% famIDs$Var1] <- 'Small'
    # Convert to a factor
    data$FamilyID <- factor(data$FamilyID)
    set.seed(415)
    model <- cforest(as.factor(Survived) ~ Pclass + Sex + Age + Fare + Embarked + Title + FamilySize + FamilyID + TicketCount, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit4.csv", row.names = FALSE)

    This model scores 0.82297 on Kaggle, ranking 207th, i.e. within the top 207/6292 = 3.3%.

    Other tweaks

    Experiments show that filling the missing Embarked values with the most frequent port, S, rather than with C, improves the score somewhat. The theoretical justification is weak, and the improvement is only on the public leaderboard rather than the final standing, so it does not prove that this approach beats the alternatives. It is therefore not recommended here, but offered as one possible idea for reference.

    data$Embarked[c(62,830)] = "S"
    data$Embarked <- factor(data$Embarked)

    set.seed(415)
    model <- cforest(as.factor(Survived) ~ Pclass + Sex + Age + Fare + Embarked + Title + FamilySize + FamilyID + TicketCount, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit5.csv", row.names = FALSE)

    This model scores 0.82775 on Kaggle, ranking 114th, i.e. within the top 114/6292 = 1.8%.
    Kaggle ranking: top 2%

    Summary

    This article has detailed how to reach the top 2% in Kaggle's Titanic survival prediction competition, a classification problem, through data preview, exploratory data analysis, missing-value imputation, removal of correlated features, and derivation of new features.

    The Machine Learning series

    Guo Jun (Jason)
    Follow the author's WeChat official account 大數據架構 (Big Data Architecture)