varImp does not recognize gradient boosting model produced by caret when I apply weights to the target

I fitted a gradient boosting model with the caret package and applied weights to the target variable. When I call varImp on the model, it returns the error:

Error in xgb.model.dt.tree(feature_names = feature_names, text = model_text_dump, : Non-tree model detected! This function can only be used with tree models.

If I don't apply the weights, varImp works normally. Why doesn't it recognize my tree?

My code:

library(caret)

set.seed(123)

# Class weights: inverse-frequency weights, scaled so each class contributes 0.5 in total
model_weights <- ifelse(modelo_df_sseg$FATALIDADES == 1,
                        yes = (1 / table(modelo_df_sseg$FATALIDADES)[2]) * 0.5,
                        no  = (1 / table(modelo_df_sseg$FATALIDADES)[1]) * 0.5)

model <- train(
  as.factor(FATALIDADES) ~ .,
  data = modelo_df_sseg,
  method = "xgbTree",
  trControl = trainControl("cv", number = 10),
  weights = model_weights
)
varImp(model)
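
For reference, a minimal diagnostic sketch, assuming varImp() for xgbTree models delegates to xgboost::xgb.importance() (which in turn calls the xgb.model.dt.tree() named in the error). Running it directly on the fitted booster should reproduce the same message:

library(xgboost)

# Assumption: model$finalModel is the xgb.Booster that caret stored after training.
# If the same "Non-tree model detected!" error appears here, the booster itself is
# the problem, not varImp().
xgb.importance(model = model$finalModel)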

EDIT 04/09/2020

On the English Stack Overflow I was advised to use wts instead of weights, but now the error message I get is:

Error in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : formal argument "wts" matched by multiple actual arguments

Here is code you can test on your own machine:

library(caret)
library(carData)

set.seed(123)

basex <- Arrests

model_weights <- ifelse(basex$released == "Yes",
                        yes = (1 / table(basex$released)[2]) * 0.5,
                        no  = (1 / table(basex$released)[1]) * 0.5)

y <- basex$released
x <- basex
tc <- trainControl("cv", number = 10)
mtd <- "xgbTree"

model <- train(
  x,
  y,
  method = mtd,
  trControl = tc,
  wts = model_weights,
  verbose = TRUE
)
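
A plausible reading of this new error, assuming train() already forwards its own wts argument to nominalTrainWorkflow() internally: a wts supplied through ... then matches the same formal argument twice. The user-facing argument documented by caret is weights, which can be checked against the formals of train.default:

# Hypothetical check of the argument names train() actually accepts
"weights" %in% names(formals(caret:::train.default))  # expected TRUE
"wts"     %in% names(formals(caret:::train.default))  # expected FALSE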
  • You need to share all the code and the dataset so that whoever is testing can reproduce the error. But the error described says you are passing the function something it does not expect; the function expects a tree model.

  • I also posted a version of this question on the English Stack Overflow and was told to use wts instead of weights. I will update my question with an example dataset you can test on your own machine.

1 answer

Solution found on the English Stack Overflow. Here is the example code:

library(caret)
library(carData)  # provides the Arrests data set
library(dplyr)    # for %>% and filter() used below

set.seed(123)

basex <- Arrests

# Class weights: the majority class ("Yes") gets the smaller weight
model_weights <- ifelse(basex$released == "Yes",
                        table(basex$released)[1] / nrow(basex),
                        table(basex$released)[2] / nrow(basex))

dummies <- dummyVars(released ~ ., data = basex)
x <- predict(dummies, newdata = basex)
y <- basex$released
folds <- createFolds(basex$released, 10)
tc <- trainControl(method = "cv",
                   number = 10,
                   summaryFunction = twoClassSummary,
                   index = folds, #predefined folds
                   classProbs = TRUE) #needed for twoClassSummary

mtd <- "xgbTree"

model <- train(x = x, 
               y = y, 
               method = mtd,
               trControl = tc, 
               weights = model_weights,
               verbose = TRUE,
               metric = "ROC")

varImp(model)

# Best tuning result according to cross-validated ROC
model$results %>%
  filter(ROC == max(ROC))
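
As a follow-up, caret also ships a plot method for the object varImp() returns, which gives a quick look at the strongest predictors (the top argument is optional):

# Plot the variable importances computed above; show only the 10 largest
plot(varImp(model), top = 10)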
