@pagal_guy wrote:
Hello,
I am using the following parameters:
    parm = {
        'objective': "binary:logistic",
        'booster': "gbtree",
        'eval_metric': "error",
        'max_depth': 8,
        'min_child_weight': 50,
        'eta': 0.2,
        'seed': 88888
    }
and then I am running the following code:
    from sklearn.cross_validation import KFold
    from sklearn import metrics
    import xgboost as xgb
    import numpy as np

    a = np.array([])
    kfolds = KFold(train_X.shape[0], n_folds=6)
    for dev_index, val_index in kfolds:
        dev_X, val_X = train_X[dev_index, :], train_X[val_index, :]
        dev_y, val_y = train_y[dev_index], train_y[val_index]
        dtrain = xgb.DMatrix(dev_X, label=dev_y)
        dtest = xgb.DMatrix(val_X)
        bst = xgb.train(plst, dtrain, num_rounds)
        ypred_bst = bst.predict(dtest, ntree_limit=bst.best_iteration)
        score = metrics.confusion_matrix(val_y, ypred_bst)  # <-- this line raises the error
        res = (score[0][0] + score[1][1]) * 1.0 / sum(sum(score))
        a = np.append(a, [res])
        print "Accuracy = %.7f" % (res)
    print "Overall Mean Accuracy = %.7f" % (np.mean(a))
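(plst and num_rounds come from my earlier setup, which I have not pasted; roughly they look like the sketch below, where the round count is just a placeholder value and plst is the usual list-of-pairs form of the parameter dict -- xgb.train also accepts the dict itself.)

    # plst is the list-of-pairs form of the parameter dict above;
    # xgb.train also accepts the dict directly
    plst = list(parm.items())
    num_rounds = 200  # placeholder number of boosting rounds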
However, this is giving an error at the score = metrics.confusion_matrix(...) line.
I think this is because the binary:logistic objective returns probabilities rather than class labels. How can I specify that class labels should be returned?
Can someone please help me with this?
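I suspect the fix is something like the sketch below (assuming a 0.5 decision threshold), i.e. converting the predicted probabilities into hard 0/1 labels before calling confusion_matrix, but I would like to confirm:

    import numpy as np
    from sklearn import metrics

    # ypred_bst holds the probabilities returned by bst.predict() above;
    # thresholding at 0.5 (an assumed cut-off) turns them into class labels
    ypred_labels = (ypred_bst > 0.5).astype(int)
    score = metrics.confusion_matrix(val_y, ypred_labels)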