Weka: backwards search with delete

paper:

Mark Hall, Eibe Frank: Combining Naive Bayes and Decision Tables. In: Proceedings of the 21st International Florida Artificial Intelligence Research Society Conference (FLAIRS), 2008.

code:

// best_group initially contains all attributes
// main search loop
boolean done = false;
boolean addone = false;
boolean z;
boolean deleted = false;

while (!done) {
  temp_group = (BitSet) best_group.clone();
  temp_best = best_merit;
  done = true;
  addone = false;

  for (i = 0; i < numAttribs; i++) {
    z = ((i != classIndex) && (temp_group.get(i)));
    if (z) {
      temp_group.clear(i);
      // TODO: this is the core. If the candidate subset scores higher when
      // attribute i is absent from the training data than when it is present,
      // the attribute is deleted from the "data" permanently (in practice it
      // is simply ignored during evaluation).
      temp_merit = ((SubsetEvaluator) eval).evaluateSubset(temp_group);
      temp_merit_delete = ((EvalWithDelete) eval).evaluateSubsetDelete(temp_group, i);

      boolean deleteBetter = false;
      if (temp_merit_delete >= temp_merit) {
        temp_merit = temp_merit_delete;
        // deletion may be the better move; the remaining conditions for
        // actually deleting are checked further down
        deleteBetter = true;
      }

      z = (temp_merit >= temp_best);
      if (z) { // must also be at least as good as the current best
        temp_best = temp_merit;
        temp_index = i;
        addone = true;
        done = false;
        if (deleteBetter) {
          deleted = true;
        } else {
          deleted = false;
        }
      }
      // unset this addition/deletion
      temp_group.set(i);
    } // end if (z)
  } // end for (i = 0; ...)

  if (addone) {
    // If the merit with the attribute deleted is at least as good as without
    // deletion, and it also beats the current best merit, delete the
    // attribute: remove it permanently from best_group and from the data.
    best_group.clear(temp_index);
    best_merit = temp_best;
    if (deleted) {
      ((EvalWithDelete) eval).getDeletedList().set(temp_index);
    }
  }
} // end while (!done)

return attributeList(best_group);
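The loop above can be exercised end to end with a toy evaluator. The sketch below is a minimal, self-contained mirror of the control flow, not Weka's actual merit computation: `ToyEvaluator`, its per-attribute weights, and the small "delete bonus" are all made-up assumptions, standing in for Weka's `SubsetEvaluator`/`EvalWithDelete` pair.

```java
import java.util.BitSet;

public class BackwardsWithDelete {

    // Hypothetical stand-in for Weka's SubsetEvaluator/EvalWithDelete pair.
    static class ToyEvaluator {
        // Per-attribute contribution to merit; attribute 2 actively hurts.
        final double[] weight = {0.30, 0.25, -0.20, 0.15};
        final BitSet deleted = new BitSet();   // permanently deleted attributes

        // Merit of a subset: sum of weights of its attributes,
        // ignoring anything on the permanently-deleted list.
        double evaluateSubset(BitSet group) {
            double merit = 0.0;
            for (int i = group.nextSetBit(0); i >= 0; i = group.nextSetBit(i + 1)) {
                if (!deleted.get(i)) merit += weight[i];
            }
            return merit;
        }

        // Merit if attribute i were also deleted from the data. In this toy
        // model, deletion gives a small bonus for harmful attributes.
        double evaluateSubsetDelete(BitSet group, int i) {
            double merit = evaluateSubset(group);
            return weight[i] < 0 ? merit + 0.05 : merit - 0.05;
        }
    }

    public static BitSet search(ToyEvaluator eval, int numAttribs) {
        BitSet bestGroup = new BitSet();
        bestGroup.set(0, numAttribs);            // start with all attributes
        double bestMerit = eval.evaluateSubset(bestGroup);

        boolean done = false;
        while (!done) {
            done = true;
            BitSet tempGroup = (BitSet) bestGroup.clone();
            double tempBest = bestMerit;
            int tempIndex = -1;
            boolean addone = false, deleted = false;

            for (int i = 0; i < numAttribs; i++) {
                if (!tempGroup.get(i)) continue;
                tempGroup.clear(i);              // trial removal of attribute i
                double merit = eval.evaluateSubset(tempGroup);
                double meritDelete = eval.evaluateSubsetDelete(tempGroup, i);
                boolean deleteBetter = meritDelete >= merit;
                if (deleteBetter) merit = meritDelete;
                if (merit >= tempBest) {         // must beat the current best
                    tempBest = merit;
                    tempIndex = i;
                    addone = true;
                    done = false;
                    deleted = deleteBetter;
                }
                tempGroup.set(i);                // undo the trial removal
            }
            if (addone) {
                bestGroup.clear(tempIndex);
                bestMerit = tempBest;
                if (deleted) eval.deleted.set(tempIndex); // permanent deletion
            }
        }
        return bestGroup;
    }

    public static void main(String[] args) {
        ToyEvaluator eval = new ToyEvaluator();
        BitSet result = search(eval, 4);
        // prints: kept = {0, 1, 3}, deleted = {2}
        System.out.println("kept = " + result + ", deleted = " + eval.deleted);
    }
}
```

With these weights, one pass of the loop finds that removing attribute 2 helps, that deleting it from the data is even better, and so it is both dropped from the subset and placed on the evaluator's deleted list; no further removal then improves the merit.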
