
    Machine Learning (2): How to Rank in the Top 2% on Kaggle

    This is an original article. If you repost it, please keep the following notice at the top of the article.
    This article is republished from the Technical World blog; original link: http://www.luozeyang.com/ml/classification/

    Abstract

    This article describes in detail how to reach a top 2% ranking in Kaggle's Titanic survival prediction competition, a classification problem, through data preview, exploratory data analysis, missing-value imputation, removal of correlated features, and derivation of new features.

    Competition Overview

    Titanic survival prediction is one of the competitions with the most participants on Kaggle. It asks competitors to use the training set to analyze what kinds of people were more likely to survive, and to predict whether each passenger in the test set survived.

    This is a binary classification problem.

    How to Achieve a Top 2% Ranking

    Loading the Data

    Before loading the data, load all the R libraries that will be used later:

    library(readr) # File read / write
    library(ggplot2) # Data visualization
    library(ggthemes) # Data visualization
    library(scales) # Data visualization
    library(plyr)
    library(stringr) # String manipulation
    library(InformationValue) # IV / WOE calculation
    library(MLmetrics) # Machine learning metrics, e.g. Recall, Precision, Accuracy, AUC
    library(rpart) # Decision tree utils
    library(randomForest) # Random Forest
    library(dplyr) # Data manipulation
    library(e1071) # SVM
    library(Amelia) # Missing value utils
    library(party) # Conditional inference trees
    library(gbm) # AdaBoost
    library(class) # KNN
    library(scales)

    Load the training data and test data into data.frames named train and test:

    train <- read_csv("train.csv")
    test <- read_csv("test.csv")

    Since the same transformations must later be applied to both the training and the test data, it is recommended to combine the two sets and process them together. This avoids duplicated work and inconsistencies, and in particular avoids the problem of categorical variables having new levels that appear only in the test set.

    data <- bind_rows(train, test)
    train.row <- 1:nrow(train)
    test.row <- (1 + nrow(train)):(nrow(train) + nrow(test))

    Data Preview

    First, take a look at the data:

    str(data)
    ## Classes 'tbl_df', 'tbl' and 'data.frame':    1309 obs. of  12 variables:
    ##  $ PassengerId: int  1 2 3 4 5 6 7 8 9 10 ...
    ##  $ Survived   : int  0 1 1 1 0 0 0 0 1 1 ...
    ##  $ Pclass     : int  3 1 3 1 3 3 1 3 3 2 ...
    ##  $ Name       : chr  "Braund, Mr. Owen Harris" "Cumings, Mrs. John Bradley (Florence Briggs Thayer)" "Heikkinen, Miss. Laina" "Futrelle, Mrs. Jacques Heath (Lily May Peel)" ...
    ##  $ Sex        : chr  "male" "female" "female" "female" ...
    ##  $ Age        : num  22 38 26 35 35 NA 54 2 27 14 ...
    ##  $ SibSp      : int  1 1 0 1 0 0 0 3 0 1 ...
    ##  $ Parch      : int  0 0 0 0 0 0 0 1 2 0 ...
    ##  $ Ticket     : chr  "A/5 21171" "PC 17599" "STON/O2. 3101282" "113803" ...
    ##  $ Fare       : num  7.25 71.28 7.92 53.1 8.05 ...
    ##  $ Cabin      : chr  NA "C85" NA "C123" ...
    ##  $ Embarked   : chr  "S" "C" "S" "S" ...
    

    As shown above, the data set contains 12 variables and 1309 observations, of which 891 are training data and 418 are test data.

    • PassengerId: integer; the passenger ID. A monotonically increasing identifier with no predictive value
    • Survived: integer; whether the passenger survived (0 = perished, 1 = survived). Easier to work with once converted to a factor
    • Pclass: integer; the passenger's socio-economic class (1 = Upper, 2 = Middle, 3 = Lower)
    • Name: character; besides the surname and given name, it contains titles such as Mr., Mrs. and Dr. that carry Western cultural information
    • Sex: character; the passenger's sex. Suitable for conversion to a factor
    • Age: numeric; the passenger's age. Has missing values
    • SibSp: integer; the number of siblings and spouses aboard (Sib = sibling, Sp = spouse)
    • Parch: integer; the number of parents and children aboard (Par = parent, Ch = child)
    • Ticket: character; the passenger's ticket number
    • Fare: numeric; the ticket fare
    • Cabin: character; the passenger's cabin. Has missing values
    • Embarked: character; the port of embarkation. Suitable for conversion to a factor

    Exploratory Data Analysis

    The higher a passenger's social class, the higher the survival rate

    Before analyzing the first variable, Pclass, convert the outcome Survived into a factor so it can be used for grouping and plotting.

    data$Survived <- factor(data$Survived)

    The number of survivors and victims within each Pclass can then be counted and plotted as follows:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Pclass, y = ..count.., fill=Survived)) + 
    geom_bar(stat = "count", position='dodge') +
    xlab('Pclass') +
    ylab('Count') +
    ggtitle('How Pclass impact survivor') +
    scale_fill_manual(values=c("#FF0000", "#00FF00")) +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart above shows, most passengers with Pclass=1 survived, nearly half of those with Pclass=2 survived, while fewer than 25% of those with Pclass=3 survived.

    To quantify the predictive value of Pclass, compute its WOE (Weight of Evidence) and IV (Information Value) as follows. The results show that the IV of Pclass is 0.5 and it is rated "Highly Predictive", so Pclass can tentatively be used as one of the model's feature variables.

    WOETable(X=factor(data$Pclass[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL     PCT_G     PCT_B        WOE         IV
    ## 1   1   136   80   216 0.3976608 0.1457195  1.0039160 0.25292792
    ## 2   2    87   97   184 0.2543860 0.1766849  0.3644848 0.02832087
    ## 3   3   119  372   491 0.3479532 0.6775956 -0.6664827 0.21970095
    
    IV(X=factor(data$Pclass[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.5009497
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Survival rates differ across passenger Titles

    Passenger names are almost all unique, so Name is not suitable for direct use. However, names contain culturally meaningful titles such as Mr., Mrs. and Dr., which can be extracted.

    This article extracts each passenger's Title from the Name as follows:

    data$Title <- sapply(data$Name, FUN=function(x) {strsplit(x, split='[,.]')[[1]][2]})
    data$Title <- sub(' ', '', data$Title)
    data$Title[data$Title %in% c('Mme', 'Mlle')] <- 'Mlle'
    data$Title[data$Title %in% c('Capt', 'Don', 'Major', 'Sir')] <- 'Sir'
    data$Title[data$Title %in% c('Dona', 'Lady', 'the Countess', 'Jonkheer')] <- 'Lady'
    data$Title <- factor(data$Title)

    After extracting the Titles, count the survivors and victims for each Title:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Title, y = ..count.., fill=Survived)) + 
    geom_bar(stat = "count", position='stack') +
    xlab('Title') +
    ylab('Count') +
    ggtitle('How Title impact survivor') +
    scale_fill_discrete(name="Survived", breaks=c(0, 1), labels=c("Perish", "Survived")) +
    geom_text(stat = "count", aes(label = ..count..), position=position_stack(vjust = 0.5)) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    The chart above shows that passengers with the Title Mr had a very low survival rate, while those with Mrs or Miss had a very high one. WOE and IV are used again to quantify whether Title is useful for the final prediction. The calculation gives an IV of 1.487853, rated "Highly Predictive", so Title can tentatively be included as a feature variable in the model.

    WOETable(X=data$Title[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##       CAT GOODS BADS TOTAL       PCT_G       PCT_B         WOE            IV
    ## 1     Col     1    1     2 0.002873563 0.001808318  0.46315552  4.933741e-04
    ## 2      Dr     3    4     7 0.008620690 0.007233273  0.17547345  2.434548e-04
    ## 3    Lady     2    1     3 0.005747126 0.001808318  1.15630270  4.554455e-03
    ## 4  Master    23   17    40 0.066091954 0.030741410  0.76543639  2.705859e-02
    ## 5    Miss   127   55   182 0.364942529 0.099457505  1.30000942  3.451330e-01
    ## 6    Mlle     3    3     3 0.008620690 0.005424955  0.46315552  1.480122e-03
    ## 7      Mr    81  436   517 0.232758621 0.788426763 -1.22003757  6.779360e-01
    ## 8     Mrs    99   26   125 0.284482759 0.047016275  1.80017883  4.274821e-01
    ## 9      Ms     1    1     1 0.002873563 0.001808318  0.46315552  4.933741e-04
    ## 10    Rev     6    6     6 0.017241379 0.010849910  0.46315552  2.960244e-03
    ## 11    Sir     2    3     5 0.005747126 0.005424955  0.05769041  1.858622e-05
    
    IV(X=data$Title[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 1.487853
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Women survived at a much higher rate than men

    For the Sex variable, the historical background of the Titanic's sinking tells us that evacuation followed the "women and children first" rule, which suggests that Sex should help predict survival.

    The following data confirm this guess: most women survived (233/(233+81) = 74.20%), while only a small fraction of men did (109/(109+468) = 22.85%).
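    These percentages can be checked directly with a simple cross-tabulation on the training rows before plotting (a quick sketch using the objects defined above):

    # Counts of perished (0) and survived (1) passengers by sex on the training rows
    table(data$Sex[1:nrow(train)], data$Survived[1:nrow(train)])
    # Per-sex survival rates: each row sums to 1
    prop.table(table(data$Sex[1:nrow(train)], data$Survived[1:nrow(train)]), margin = 1)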

    data$Sex <- as.factor(data$Sex)
    ggplot(data = data[1:nrow(train),], mapping = aes(x = Sex, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Sex') +
    ylab('Count') +
    ggtitle('How Sex impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    Computing WOE and IV shows that the IV of Sex is 1.34 and "Highly Predictive", so Sex can tentatively be kept as a feature variable.

    WOETable(X=data$Sex[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##      CAT GOODS BADS TOTAL     PCT_G    PCT_B        WOE        IV
    ## 1 female   233   81   314 0.6812865 0.147541  1.5298770 0.8165651
    ## 2   male   109  468   577 0.3187135 0.852459 -0.9838327 0.5251163
    
    IV(X=data$Sex[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 1.341681
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Minors survived at a higher rate than adults

    Given the "women and children first" rule, minors should have had a better chance of surviving. As the chart below shows, among passengers with Age < 18 the number of survivors is indeed higher than the number of victims, while among young and middle-aged adults the number of victims far exceeds the number of survivors.

    ggplot(data = data[!is.na(data$Age) & seq_len(nrow(data)) <= nrow(train), ], aes(x = Age, color=Survived)) + 
    geom_line(stat = 'bin', binwidth=5) +
    labs(title = "How Age impact survivor", x = "Age", y = "Count", color = "Survived")
    

    Passengers with a moderate number of siblings and spouses were more likely to survive

    For the SibSp variable, count the survivors and victims for each value:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = SibSp, y = ..count.., fill=Survived)) + 
    geom_bar(stat = 'count', position='dodge') +
    labs(title = "How SibSp impact survivor", x = "Sibsp", y = "Count", fill = "Survived") +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart shows, passengers with SibSp = 0 had a survival rate below 1/3; passengers with SibSp = 1 or 2 had a survival rate around 50% or higher; passengers with SibSp >= 3 had a very low survival rate. WOE and IV quantify SibSp's contribution to the prediction: the IV is 0.1448994, rated "Highly Predictive".

    WOETable(X=as.factor(data$SibSp[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G       PCT_B        WOE          IV
    ## 1   0   210  398   608 0.593220339 0.724954463 -0.2005429 0.026418349
    ## 2   1   112   97   209 0.316384181 0.176684882  0.5825894 0.081387334
    ## 3   2    13   15    28 0.036723164 0.027322404  0.2957007 0.002779811
    ## 4   3     4   12    16 0.011299435 0.021857923 -0.6598108 0.006966604
    ## 5   4     3   15    18 0.008474576 0.027322404 -1.1706364 0.022063953
    ## 6   5     5    5     5 0.014124294 0.009107468  0.4388015 0.002201391
    ## 7   8     7    7     7 0.019774011 0.012750455  0.4388015 0.003081947
    
    IV(X=as.factor(data$SibSp[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1448994
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers with 1 to 3 parents or children were more likely to survive

    For the Parch variable, count the survivors and victims for each value:

    ggplot(data = data[1:nrow(train),], mapping = aes(x = Parch, y = ..count.., fill=Survived)) + 
    geom_bar(stat = 'count', position='dodge') +
    labs(title = "How Parch impact survivor", x = "Parch", y = "Count", fill = "Survived") +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart shows, passengers with Parch = 0 had a survival rate below 1/3; passengers with Parch between 1 and 3 had a survival rate of 50% or higher; passengers with Parch >= 4 had a very low survival rate. WOE and IV quantify Parch's contribution to the prediction: the IV is 0.1166611, rated "Highly Predictive".

    WOETable(X=as.factor(data$Parch[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G       PCT_B        WOE          IV
    ## 1   0   233  445   678 0.671469741 0.810564663 -0.1882622 0.026186312
    ## 2   1    65   53   118 0.187319885 0.096539162  0.6628690 0.060175728
    ## 3   2    40   40    80 0.115273775 0.072859745  0.4587737 0.019458440
    ## 4   3     3    2     5 0.008645533 0.003642987  0.8642388 0.004323394
    ## 5   4     4    4     4 0.011527378 0.007285974  0.4587737 0.001945844
    ## 6   5     1    4     5 0.002881844 0.007285974 -0.9275207 0.004084922
    ## 7   6     1    1     1 0.002881844 0.001821494  0.4587737 0.000486461
    
    IV(X=as.factor(data$Parch[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1166611
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers with a FamilySize of 2 to 4 were more likely to survive

    Both SibSp and Parch show the same pattern: passengers travelling without relatives had a low survival rate, passengers with a few relatives had a survival rate above 50%, and passengers with many relatives again had a low survival rate. This suggests adding SibSp and Parch together (plus one for the passenger) to derive a new variable, FamilySize.

    data$FamilySize <- data$SibSp + data$Parch + 1
    ggplot(data = data[1:nrow(train),], mapping = aes(x = FamilySize, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('FamilySize') +
    ylab('Count') +
    ggtitle('How FamilySize impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    Computing the WOE and IV of FamilySize gives an IV of 0.3497672, rated "Highly Predictive". The IV of the derived variable FamilySize is higher than that of either SibSp or Parch, so FamilySize can be used as a feature variable.

    WOETable(X=as.factor(data$FamilySize[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL       PCT_G      PCT_B        WOE           IV
    ## 1   1   163  374   537 0.459154930 0.68123862 -0.3945249 0.0876175539
    ## 2   2    89   72   161 0.250704225 0.13114754  0.6479509 0.0774668616
    ## 3   3    59   43   102 0.166197183 0.07832423  0.7523180 0.0661084057
    ## 4   4    21    8    29 0.059154930 0.01457195  1.4010615 0.0624634998
    ## 5   5     3   12    15 0.008450704 0.02185792 -0.9503137 0.0127410643
    ## 6   6     3   19    22 0.008450704 0.03460838 -1.4098460 0.0368782940
    ## 7   7     4    8    12 0.011267606 0.01457195 -0.2571665 0.0008497665
    ## 8   8     6    6     6 0.016901408 0.01092896  0.4359807 0.0026038712
    ## 9  11     7    7     7 0.019718310 0.01275046  0.4359807 0.0030378497
    
    IV(X=as.factor(data$FamilySize[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.3497672
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers sharing a ticket number survived at a higher rate

    The Ticket variable has very few repeated values and cannot be used directly. First, count how many passengers share each ticket:

    ticket.count <- aggregate(data$Ticket, by = list(data$Ticket), function(x) sum(!is.na(x)))

    The guess here is that passengers with the same ticket number belong to one family and were likely to survive or perish together. Split all passengers into two groups, those with a unique ticket number and those sharing a ticket number with others, and count the survivors and victims in each group:

    data$TicketCount <- apply(data, 1, function(x) ticket.count[which(ticket.count[, 1] == x['Ticket']), 2])
    data$TicketCount <- factor(sapply(data$TicketCount, function(x) ifelse(x > 1, 'Share', 'Unique')))
    ggplot(data = data[1:nrow(train),], mapping = aes(x = TicketCount, y = ..count.., fill=Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('TicketCount') +
    ylab('Count') +
    ggtitle('How TicketCount impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart shows, only 130/(130+351) = 27% of passengers with a unique ticket number survived, while 212/(212+198) = 51.7% of passengers sharing a ticket number survived. The WOE and IV of TicketCount are computed below; the IV is 0.2751882, rated "Highly Predictive".

    WOETable(X=data$TicketCount[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ##      CAT GOODS BADS TOTAL    PCT_G     PCT_B        WOE        IV
    ## 1  Share   212  198   410 0.619883 0.3606557  0.5416069 0.1403993
    ## 2 Unique   130  351   481 0.380117 0.6393443 -0.5199641 0.1347889
    
    IV(X=data$TicketCount[1:nrow(train)], Y=data$Survived[1:nrow(train)])
    ## [1] 0.2751882
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    The higher the fare paid, the higher the survival rate

    For the Fare variable, the chart below shows that the larger the Fare, the higher the survival rate.

    ggplot(data = data[!is.na(data$Fare) & seq_len(nrow(data)) <= nrow(train), ], aes(x = Fare, color=Survived)) + 
    geom_line(stat = 'bin', binwidth=10) +
    labs(title = "How Fare impact survivor", x = "Fare", y = "Count", color = "Survived")

    Survival rates differ across cabins

    Cabin values start with a letter followed by digits. The guess here is that the letter denotes a section of the ship and the number a position within that section, much like a train ticket carries both a carriage number and a seat number. So we can extract the first letter of Cabin and compare survival rates across these letter groups:

    ggplot(data[1:nrow(train), ], mapping = aes(x = as.factor(sapply(data$Cabin[1:nrow(train)], function(x) str_sub(x, start = 1, end = 1))), y = ..count.., fill = Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Cabin') +
    ylab('Count') +
    ggtitle('How Cabin impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart shows, passengers whose cabin letter is B, C, D, E or F all had survival rates above 50%, while passengers in the other cabins had survival rates far below 50%. The WOE and IV of the cabin variable are computed below: the IV of Cabin is 0.1866526, rated "Highly Predictive".

    data$Cabin <- sapply(data$Cabin, function(x) str_sub(x, start = 1, end = 1))
    WOETable(X=as.factor(data$Cabin[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL      PCT_G      PCT_B        WOE          IV
    ## 1   A     7    8    15 0.05109489 0.11764706 -0.8340046 0.055504815
    ## 2   B    35   12    47 0.25547445 0.17647059  0.3699682 0.029228917
    ## 3   C    35   24    59 0.25547445 0.35294118 -0.3231790 0.031499197
    ## 4   D    25    8    33 0.18248175 0.11764706  0.4389611 0.028459906
    ## 5   E    24    8    32 0.17518248 0.11764706  0.3981391 0.022907100
    ## 6   F     8    5    13 0.05839416 0.07352941 -0.2304696 0.003488215
    ## 7   G     2    2     4 0.01459854 0.02941176 -0.7004732 0.010376267
    ## 8   T     1    1     1 0.00729927 0.01470588 -0.7004732 0.005188134
    
    IV(X=as.factor(data$Cabin[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1866526
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    Passengers who embarked at S had a lower survival rate

    Embarked denotes the port of embarkation. By comparing the survival rates of passengers boarding at different ports, we can judge whether Embarked is useful for predicting survival.

    ggplot(data[1:nrow(train), ], mapping = aes(x = Embarked, y = ..count.., fill = Survived)) +
    geom_bar(stat = 'count', position='dodge') +
    xlab('Embarked') +
    ylab('Count') +
    ggtitle('How Embarked impact survivor') +
    geom_text(stat = "count", aes(label = ..count..), position=position_dodge(width=1), vjust=-0.5) +
    theme(plot.title = element_text(hjust = 0.5), legend.position="bottom")

    As the chart shows, passengers with Embarked = S had a survival rate of only 217/(217+427) = 33.7%, while passengers with Embarked = C or with a missing Embarked both had survival rates above 50%. Tentatively, Embarked looks useful for predicting survival. Its WOE and IV are computed below.

    WOETable(X=as.factor(data$Embarked[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ##   CAT GOODS BADS TOTAL      PCT_G     PCT_B        WOE           IV
    ## 1   C    93   75   168 0.27352941 0.1366120  0.6942642 9.505684e-02
    ## 2   Q    30   47    77 0.08823529 0.0856102  0.0302026 7.928467e-05
    ## 3   S   217  427   644 0.63823529 0.7777778 -0.1977338 2.759227e-02
    
    IV(X=as.factor(data$Embarked[1:nrow(train)]), Y=data$Survived[1:nrow(train)])
    ## [1] 0.1227284
    ## attr(,"howgood")
    ## [1] "Highly Predictive"
    

    The calculation above gives an IV of 0.1227284, rated "Highly Predictive".

    Filling Missing Values

    List all variables with missing data:

    attach(data)
    missing <- list(Pclass=nrow(data[is.na(Pclass), ]))
    missing$Name <- nrow(data[is.na(Name), ])
    missing$Sex <- nrow(data[is.na(Sex), ])
    missing$Age <- nrow(data[is.na(Age), ])
    missing$SibSp <- nrow(data[is.na(SibSp), ])
    missing$Parch <- nrow(data[is.na(Parch), ])
    missing$Ticket <- nrow(data[is.na(Ticket), ])
    missing$Fare <- nrow(data[is.na(Fare), ])
    missing$Cabin <- nrow(data[is.na(Cabin), ])
    missing$Embarked <- nrow(data[is.na(Embarked), ])
    for (name in names(missing)) {
    if (missing[[name]][1] > 0) {
    print(paste('', name, ' miss ', missing[[name]][1], ' values', sep = ''))
    }
    }
    detach(data)
    ## [1] "Age miss 263 values"
    ## [1] "Fare miss 1 values"
    ## [1] "Cabin miss 1014 values"
    ## [1] "Embarked miss 2 values"
    

    Predicting passenger Age

    263 passengers are missing Age, which is too many to fill with the median or the mean. Missing values of this kind are usually filled either by predicting them from other variables or by setting them to a default value; here the missing ages are predicted from the other variables.

    age.model <- rpart(Age ~ Pclass + Sex + SibSp + Parch + Fare + Embarked + Title + FamilySize, data=data[!is.na(data$Age), ], method='anova')
    data$Age[is.na(data$Age)] <- predict(age.model, data[is.na(data$Age), ])
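    A quick sanity check after the imputation (not part of the original pipeline) is to confirm that no missing ages remain and that the filled-in values stay in a plausible range:

    sum(is.na(data$Age))   # should now be 0
    summary(data$Age)      # min/max should remain plausible (no negative ages, etc.)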

    Filling the missing Embarked values using the Fare median

    As the data below show, both passengers missing Embarked have Pclass = 1 and Fare = 80.

    data[is.na(data$Embarked), c('PassengerId', 'Pclass', 'Fare', 'Embarked')]
    ## # A tibble: 2 × 4
    ##   PassengerId Pclass  Fare Embarked
    ##         <int>  <int> <dbl>    <chr>
    ## 1          62      1    80     <NA>
    ## 2         830      1    80     <NA>
    

    As the chart below shows, the median Fare of passengers with Embarked = C and Pclass = 1 is 80.

    ggplot(data[!is.na(data$Embarked),], aes(x=Embarked, y=Fare, fill=factor(Pclass))) +
    geom_boxplot() +
    geom_hline(aes(yintercept=80), color='red', linetype='dashed', lwd=2) +
    scale_y_continuous(labels=dollar_format()) + theme_few()

    Fare median value of each Embarked and Pclass

    Therefore the missing Embarked values can be set to 'C'.

    data$Embarked[is.na(data$Embarked)] <- 'C'
    data$Embarked <- as.factor(data$Embarked)

    Filling the single missing Fare value with the median

    Since only one record is missing Fare, it can simply be filled with the mean or the median. Here the median Fare across all passengers is used.

    data$Fare[is.na(data$Fare)] <- median(data$Fare, na.rm=TRUE)
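    Since Fare depends strongly on Pclass, an alternative to the overall median used above, shown here only as a sketch, is to fill the single missing Fare with the median fare of that passenger's own class:

    # Alternative fill (use instead of the line above): median Fare within the same Pclass
    missing.fare <- which(is.na(data$Fare))    # exactly one row is missing Fare
    data$Fare[missing.fare] <- median(data$Fare[data$Pclass == data$Pclass[missing.fare]], na.rm = TRUE)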

    Setting missing Cabin values to a default

    A large number of records are missing Cabin, so median or mean imputation does not apply; such gaps are usually filled either by predicting from other variables or by assigning a default value. Cabin is hard to predict from the other variables, and in the previous section its IV was already reasonably high when NA was treated as its own category, so here the missing Cabin values are simply set to a default value.

    data$Cabin <- as.factor(sapply(data$Cabin, function(x) ifelse(is.na(x), 'X', str_sub(x, start = 1, end = 1))))

    Training the Model

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + SibSp + Parch + FamilySize + TicketCount + Fare + Cabin + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
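    Before validating, it can be useful to check which variables the conditional forest actually relies on. The party package provides varimp() for cforest models; note that this can take a while with 2000 trees:

    importance <- varimp(model)          # permutation-based variable importance
    sort(importance, decreasing = TRUE)  # higher values indicate more useful features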

    Cross-Validation

    Normally the training data should be split into two parts, one for training and one for validation, or k-fold cross-validation should be used. This article instead trains on all of the training data and then randomly samples 30% of it for validation (a more rigorous k-fold sketch is given after the output below).

    cv.summarize <- function(data.true, data.predict) {
    print(paste('Recall:', Recall(data.true, data.predict)))
    print(paste('Precision:', Precision(data.true, data.predict)))
    print(paste('Accuracy:', Accuracy(data.predict, data.true)))
    print(paste('AUC:', AUC(data.predict, data.true)))
    }
    set.seed(415)
    cv.test.sample <- sample(1:nrow(train), as.integer(0.3 * nrow(train)), replace = TRUE)
    cv.test <- data[cv.test.sample,]
    cv.prediction <- predict(model, cv.test, OOB=TRUE, type = "response")
    cv.summarize(cv.test$Survived, cv.prediction)
    ## [1] "Recall: 0.947976878612717"
    ## [1] "Precision: 0.841025641025641"
    ## [1] "Accuracy: 0.850187265917603"
    ## [1] "AUC: 0.809094822285082"
    

    Prediction

    predict.result <- predict(model, data[(1+nrow(train)):(nrow(data)), ], OOB=TRUE, type = "response")
    output <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(output, file = "cit1.csv", row.names = FALSE)

    This model scores 0.80383 on Kaggle, ranking 992nd, i.e. the top 992/6292 = 15.8%.

    Tuning

    Removing correlated features

    Since FamilySize already combines the information in SibSp and Parch, we can try removing SibSp and Parch from the feature set.

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + FamilySize + TicketCount + Fare + Cabin + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit2.csv", row.names = FALSE)

    This model still scores 0.80383 on Kaggle.

    Removing Cabin, which has a low IV

    Since Cabin has a relatively low IV, we can consider removing it from the model.

    set.seed(415)
    model <- cforest(Survived ~ Pclass + Title + Sex + Age + FamilySize + TicketCount + Fare + Embarked, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit3.csv", row.names = FALSE)

    This model still scores 0.80383 on Kaggle.

    Adding derived features

    The Title variable was derived from Name above. For the following reasons, a passenger's surname may also carry some predictive power:

    • In some Western countries given names repeat often while surnames rarely do, so a surname has some identifying power
    • In some countries a surname also signals social standing
    • Passengers sharing a surname are likely to be one family (again because surnames rarely repeat), and family members tended to survive or perish together

    A surname that appears only once cannot appear in both the training set and the test set, so it has no identifying or predictive value; all such surnames are therefore grouped together as 'Small'.

    data$Surname <- sapply(data$Name, FUN=function(x) {strsplit(x, split='[,.]')[[1]][1]})
    data$FamilyID <- paste(as.character(data$FamilySize), data$Surname, sep="")
    data$FamilyID[data$FamilySize <= 2] <- 'Small'
    # Delete erroneous family IDs
    famIDs <- data.frame(table(data$FamilyID))
    famIDs <- famIDs[famIDs$Freq <= 2,]
    data$FamilyID[data$FamilyID %in% famIDs$Var1] <- 'Small'
    # Convert to a factor
    data$FamilyID <- factor(data$FamilyID)
    set.seed(415)
    model <- cforest(as.factor(Survived) ~ Pclass + Sex + Age + Fare + Embarked + Title + FamilySize + FamilyID + TicketCount, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit4.csv", row.names = FALSE)

    This model scores 0.82297 on Kaggle, ranking 207th, i.e. the top 207/6292 = 3.3%.

    Other

    Experimentally, filling the missing Embarked values with S, the most common port, instead of C improves the score. The theoretical justification is weak, and the score is only the Public leaderboard score rather than the final one, so this does not prove the approach is better than others. This article therefore does not recommend it, but presents it as one possible idea for reference.
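    For reference, S is indeed by far the most common embarkation port in the raw training data, which is the only justification for this fill (a quick check against the untouched train set):

    sort(table(train$Embarked), decreasing = TRUE)
    # S: 644, C: 168, Q: 77 (cf. the WOETable totals above)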

    data$Embarked[c(62,830)] = "S"
    data$Embarked <- factor(data$Embarked)

    set.seed(415)
    model <- cforest(as.factor(Survived) ~ Pclass + Sex + Age + Fare + Embarked + Title + FamilySize + FamilyID + TicketCount, data = data[train.row, ], controls=cforest_unbiased(ntree=2000, mtry=3))
    predict.result <- predict(model, data[test.row, ], OOB=TRUE, type = "response")
    submit <- data.frame(PassengerId = test$PassengerId, Survived = predict.result)
    write.csv(submit, file = "cit5.csv", row.names = FALSE)

    This model scores 0.82775 on Kaggle, ranking 114th, i.e. the top 114/6292 = 1.8%.
    (Figure: Kaggle leaderboard showing the top 2% ranking)

    Summary

    This article has described in detail how to reach a top 2% ranking in Kaggle's Titanic survival prediction competition, a classification problem, through data preview, exploratory data analysis, missing-value imputation, removal of correlated features, and derivation of new features.
