knnB {MLInterfaces}    R Documentation

An interface to various machine learning methods for ExpressionSets

Description

This document describes a family of wrappers of calls to machine learning classifiers distributed through various R packages. This particular document concerns the classifiers for which training-vs-test set application makes sense.

For example, knnB is a wrapper for a call to knn for objects of class ExpressionSet. These interfaces, of the form [f]B, provide a common calling sequence and a common return value for the machine learning code in function [f].

For details on the additional arguments that may be passed to any covered machine learning function f, check the manual page for that function. This will require loading the package in which f is found.

Usage

knnB(exprObj, classifLab, trainInd, k = 1, l = 1, prob = TRUE,
  use.all = TRUE, metric = "euclidean") 
#
# for such functions as nnetB, use the same first three
# parameters, and then add optional parameters from the nnet API
#

Arguments

exprObj An instance of the ExpressionSet class.
classifLab The name of the phenotype variable to use for classification.
trainInd Integer vector of indices identifying the samples of exprObj that form the training set.
k The number of nearest neighbors.
l See knn for a complete description.
prob See knn for a complete description.
use.all See knn for a complete description.
metric See knn for a complete description.

Details

See knn for a complete description of parameters to and details of the k-nearest neighbor procedure in the class package.

For other interfaces, such as ldaB, nnetB, rpartB, gbmB, randomForestB, and so on, see the usage note above and also see the man pages for those functions. For each of these functions you will need to attach the appropriate package in order to examine the man page.

The MLearn interface is a more unified approach but is still maturing.

Value

An object of class "classifOutput". This class unifies the representation of results from machine learning algorithms implemented by different designers.
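The returned object can be inspected with standard S4 tools. A minimal sketch, assuming smallG has been prepared as in the Examples section below (specific slot names are not enumerated here, so generic inspection via str is used):

res <- knnB( smallG, "ALL.AML", 1:40 )
res       # the show method prints a summary of the classification result
str(res)  # generic inspection of the slots of the S4 "classifOutput" object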

Author(s)

Jess Mar, VJ Carey <stvjc@channing.harvard.edu>

See Also

xval for information on how to obtain various cross-validated fits, and MLearn for a less fragmented implementation of the interface.
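A minimal cross-validation sketch using xval with the leave-one-out method (argument names and the form of the return value follow the xval man page; verify them against your installed version of MLInterfaces):

library(golubEsets)
data(Golub_Merge)
smallG <- Golub_Merge[1:60,]
# leave-one-out cross-validation of the knnB classifier
lk <- xval( smallG, "ALL.AML", knnB, xvalMethod="LOO", group=as.integer(0) )
# compare cross-validated predictions with the true labels
table(lk, smallG$ALL.AML)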

Examples

# access and trim an ExpressionSet
library(golubEsets)
data(Golub_Merge)
smallG <- Golub_Merge[1:60,]
# set a PRNG seed for reproducibility
set.seed(1234) # needed for nnet initialization
# now run the classifiers
knnB( smallG, "ALL.AML", 1:40 )
nnetB( smallG, "ALL.AML", 1:40, size=5, decay=.01 )
lvq1B( smallG, "ALL.AML", 1:40 )
naiveBayesB( smallG, "ALL.AML", 1:40 )
svmB( smallG, "ALL.AML", 1:40 )
baggingB( smallG, "ALL.AML", 1:40 )
ipredknnB( smallG, "ALL.AML", 1:40 )
sldaB( smallG, "ALL.AML", 1:40 )
ldaB( smallG, "ALL.AML", 1:40 )
qdaB( smallG[1:10,], "ALL.AML", 1:40 )
pamrB( smallG, "ALL.AML", 1:40 )
rpartB( smallG, "ALL.AML", 1:35 )
randomForestB( smallG, "ALL.AML", 1:35 )
gbmB( smallG, "ALL.AML", 1:40, n.minobsinnode=3 , n.trees=6000)
stat.diag.daB( smallG, "ALL.AML", 1:40 )

[Package MLInterfaces version 1.10.3 Index]