Train Random Forest with Caret Package in R

1. No Cross Validation / Bootstrapping
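The snippets throughout this post reference data frames dev and mydata without defining them. A minimal, purely illustrative setup using caret's twoClassSim() simulator (the names, and renaming the simulated Class column to Y, are assumptions made only so the code below runs as written):

library(caret)
library(randomForest)

# Simulated two-class data for illustration only; the target is renamed to Y
# and moved to the first column so that dev[, 1] is the outcome
set.seed(1)
mydata <- twoClassSim(500)
names(mydata)[names(mydata) == "Class"] <- "Y"
dev <- mydata[, c("Y", setdiff(names(mydata), "Y"))]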
# Find the optimal mtry via out-of-bag (OOB) error; the target is assumed to be in the first column of dev
mtry <- tuneRF(dev[, -1], dev[, 1], ntreeTry = 500, stepFactor = 1.5, improve = 0.01, trace = TRUE, plot = TRUE)

# Keep the mtry value with the lowest OOB error
best.m <- mtry[mtry[, 2] == min(mtry[, 2]), 1]

# Fit a single random forest with that mtry and no resampling
set.seed(825)
trained1 <- train(dev[, -1], dev[, 1], method = "rf", ntree = 100, tuneGrid = data.frame(mtry = best.m), trControl = trainControl(method = "none"), importance = TRUE)
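A quick usage sketch for the fitted model; val below is a hypothetical hold-out data frame laid out like dev:

# Predicted classes and class probabilities on the hold-out set
pred_class <- predict(trained1, newdata = val[, -1])
pred_prob <- predict(trained1, newdata = val[, -1], type = "prob")

# Variable importance of the fitted forest
varImp(trained1)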

2. Cross Validation with Manual Fine Tuning
# Candidate mtry values around the default sqrt(number of predictors)
sqtmtry <- round(sqrt(ncol(mydata) - 1))
rfGrid <- expand.grid(mtry = c(round(sqtmtry / 2), sqtmtry, 2 * sqtmtry))

# 3-fold cross-validation tuned on ROC; the target Y must be a factor whose levels are valid R names
ctrl <- trainControl(method = "cv", number = 3, classProbs = TRUE, summaryFunction = twoClassSummary)

set.seed(2)
trained2 <- train(Y ~ ., data = mydata, method = "rf", ntree = 500, tuneGrid = rfGrid, metric = "ROC", trControl = ctrl, importance = TRUE)
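The cross-validated results can be inspected on the returned train object, for example:

# Resampled ROC, sensitivity and specificity for each candidate mtry, and the selected value
print(trained2)
trained2$bestTune

# Tuning profile: ROC versus mtry
plot(trained2)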
3. Cross Validation with Automatic Fine Tuning

# Let caret choose 10 candidate mtry values automatically
set.seed(2)
trained3 <- train(Y ~ ., data = mydata, method = "rf", ntree = 500, tuneLength = 10, metric = "ROC", trControl = ctrl, importance = TRUE)
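Because trained2 and trained3 were fit with the same seed and the same trControl object, their resampled ROC values can be compared with caret's resamples(); a short sketch:

# Collect and summarise the resampling results of both tuned forests
cv_results <- resamples(list(manual_grid = trained2, auto_tune = trained3))
summary(cv_results)
bwplot(cv_results, metric = "ROC")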

4. Bootstrapping (Repeated Sampling with Replacement)

set.seed(2)
tuned <- train(dev[, -1], dev[, 1], method = "rf", ntree = 10)

Note: by default, train() resamples with the bootstrap and creates 25 resamples.

To use 10 bootstrap resamples instead:
tuned <- train(dev[, -1], dev[, 1], method = "rf", ntree = 10, trControl = trainControl(method = "boot", number = 10))
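A short sketch for checking what the bootstrap run produced, using standard components of caret's train object:

# Aggregated performance across the bootstrap resamples and per-resample detail
tuned$results
tuned$resample

# The underlying randomForest fit
tuned$finalModel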

