Just some tips:
options(error=recover): tells R to launch a debugging session when an error occurs, and lets you choose which call frame to inspect
options(show.error.locations=TRUE): makes R show the source line number where the error occurred
Something else:
use traceback() to locate where the last error came from, then use browser() inside the function to step through it again and check what is wrong.
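A minimal sketch of how these fit together (the function bad_log() and the bad input are made up for illustration):
## hypothetical example illustrating the options above
options(show.error.locations=TRUE)    # errors now report the offending source line
options(error=recover)                # on error, choose a call frame to inspect interactively
bad_log=function(x) log(x)+sqrt(x)
# bad_log("a")     # uncommenting this raises an error; recover() then lists the call frames
# traceback()      # after an error, traceback() prints the call stack that produced it
options(error=NULL)                   # restore the default error handling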
Monday, December 30, 2013
Tuesday, December 24, 2013
A note about what the output of predict.lm(lm_model, test_data, type="terms") is:
## first, what type="terms" means: if type="terms" is selected, a matrix of predictions
## on the additive scale is produced, each column giving the deviations from the overall mean
## (of the original data's response, on the additive scale), which is given by the attribute "constant".
set.seed(9999)
x1=rnorm(10)
x2=rnorm(10)
y=rnorm(10)
lmm=lm(y~x1+x2)
predlm=predict(lmm, newdata=data.frame(x1, x2), type="terms")
## each term column is beta_j*(x_j - mean(x_j)); both checks below give (numerically) zero vectors
coef(lmm)[1]+coef(lmm)[2]*x1+mean(coef(lmm)[3]*x2)-mean(y)-predlm[,1]
coef(lmm)[1]+coef(lmm)[3]*x2+mean(coef(lmm)[2]*x1)-mean(y)-predlm[,2]
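To tie this back to the "constant" attribute mentioned above, a quick sanity check on the same fitted model (for an OLS fit with an intercept, the constant is the mean of the fitted values, i.e. mean(y) here):
attr(predlm, "constant")    ## the overall mean that the term columns are centered on
mean(fitted(lmm))           ## equals mean(y) and matches the constant
rowSums(predlm)+attr(predlm, "constant")-fitted(lmm)    ## zero vector: terms + constant = fitted values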
Monday, December 23, 2013
R: Calculate ROC and Plot ROC
library(ROCR)
library(Hmisc)
## calculate AUC from the package ROCR and compare with it from Hmisc
# method 1: from ROCR
data(ROCR.simple)
pred=prediction(ROCR.simple$predictions, ROCR.simple$labels)
perf=performance(pred, 'tpr', 'fpr')    # true positive rate vs false positive rate
plot(perf, colorize=TRUE)               # ROC curve, colored by cutoff
perf2=performance(pred, 'auc')
auc=unlist(slot(perf2, 'y.values')) # this is the AUC
# method 2: from Hmisc
rcorrstat=rcorr.cens(ROCR.simple$predictions, ROCR.simple$labels)
rcorrstat[1]    # 1st element (C Index) is the AUC; 2nd (Dxy) is Somers' D, i.e. the Gini coefficient / accuracy ratio = 2*AUC - 1
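As a cross-check on both packages, the AUC can also be computed directly with the rank-based (Mann-Whitney) formula; a short sketch on the same ROCR.simple data:
# method 3: rank-based (Mann-Whitney) formula, as a sanity check
scores=ROCR.simple$predictions
labs=ROCR.simple$labels
n1=sum(labs==1); n0=sum(labs==0)
auc_manual=(sum(rank(scores)[labs==1])-n1*(n1+1)/2)/(n1*n0)
auc_manual    # should agree with auc above and with rcorrstat[1]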