MINND

Package

weka.classifiers.mi

Synopsis

Multiple-Instance Nearest Neighbour with Distribution learner.

It uses gradient descent to find a weight for each dimension of each exemplar, starting from 1.0. To avoid overfitting, it uses a mean-square function (i.e. the Euclidean distance) while searching for the weights.
It then uses these weights to cleanse the training data, after which it searches for the weights again, this time starting from the weights found in the first pass.
Finally, it uses the most recent weights to cleanse the test exemplar and finds the nearest neighbours of the test exemplar using a partly-weighted Kullback distance; the variances used in the Kullback distance, however, are those computed before cleansing.

For more information see:

Xin Xu (2001). A nearest distribution approach to multiple-instance learning. Hamilton, NZ.
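As a usage sketch, the following Java fragment loads a multi-instance dataset and cross-validates MINND using WEKA's standard API. The ARFF path is a placeholder, and the setNumNeighbours setter name is assumed from the numNeighbours option listed below rather than taken from this page.

    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.mi.MINND;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class MINNDExample {
        public static void main(String[] args) throws Exception {
            // Load a multi-instance ARFF file (bag-id attribute, relational
            // bag attribute, class attribute); the path is a placeholder.
            Instances data = DataSource.read("musk1.arff");
            data.setClassIndex(data.numAttributes() - 1);

            MINND minnd = new MINND();
            minnd.setNumNeighbours(1);  // assumed setter for the numNeighbours option

            // Estimate performance with 10-fold cross-validation.
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(minnd, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }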

Options

The table below describes the options available for MINND; a short configuration sketch follows the table.

Option              Description
debug               If set to true, the classifier may output additional
                    information to the console.
numNeighbours       The number of nearest neighbours used to estimate the
                    class prediction of test bags.
numTestingNoises    The number of nearest-neighbour instances used when
                    selecting noisy instances in the test data.
numTrainingNoises   The number of nearest-neighbour instances used when
                    selecting noisy instances in the training data.
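A minimal sketch of setting these options programmatically is shown below. It assumes the setter names follow WEKA's usual get/set naming for the displayed option names (e.g. setNumNeighbours for numNeighbours); only setDebug and getOptions are part of the generic classifier interface.

    import weka.classifiers.mi.MINND;
    import weka.core.Utils;

    public class MINNDOptionsDemo {
        public static void main(String[] args) throws Exception {
            MINND minnd = new MINND();
            minnd.setDebug(true);           // debug
            minnd.setNumNeighbours(3);      // numNeighbours (assumed setter name)
            minnd.setNumTestingNoises(1);   // numTestingNoises (assumed setter name)
            minnd.setNumTrainingNoises(1);  // numTrainingNoises (assumed setter name)

            // Echo the equivalent command-line option string for this configuration.
            System.out.println(Utils.joinOptions(minnd.getOptions()));
        }
    }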

Capabilities

The table below describes the capabilities of MINND; a sketch showing how to check them against a dataset follows the table.

Capability            Supported
Class                 Missing class values, Binary class, Nominal class
Attributes            Unary attributes, Binary attributes, Empty nominal
                      attributes, Relational attributes, Missing values,
                      Nominal attributes
Other                 Only multi-Instance data
Min # of instances    1
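A brief sketch of verifying these capabilities against a dataset before training, using WEKA's standard Capabilities API; the ARFF path is a placeholder.

    import weka.classifiers.mi.MINND;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class MINNDCapabilityCheck {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("musk1.arff");  // placeholder path
            data.setClassIndex(data.numAttributes() - 1);

            // Throws an exception with an explanatory message if the data
            // violates any of the capabilities listed above (e.g. it is not
            // in multi-instance format, or the class is numeric).
            new MINND().getCapabilities().testWithFail(data);
            System.out.println("Dataset is compatible with MINND.");
        }
    }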