### Abstract

In this paper, we analyze probability density function estimation when the number of training samples is limited, assuming normal distributions. As the dimensionality of the data increases, classifier performance suffers when the number of training samples is inadequate. This problem is becoming more acute as high-dimensional data such as hyperspectral images become widely available. The key step in designing a classifier is estimating probability density functions, which, in the case of the Gaussian maximum-likelihood (ML) classifier, are completely determined by mean vectors and covariance matrices. We provide an in-depth analysis of probability density function estimation as a function of the number of training samples, and offer a guideline for choosing the dimensionality of the data for a given set of training samples.
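The small-sample problem the abstract describes can be illustrated concretely. The sketch below (not from the paper; a minimal illustration using NumPy) estimates the mean vector and sample covariance of Gaussian data and shows why the Gaussian ML density breaks down in high dimensions: with n samples in d dimensions, the sample covariance has rank at most n − 1, so it is singular whenever n ≤ d.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_log_pdf(x, mean, cov):
    """Multivariate normal log-density (assumes cov is nonsingular)."""
    d = mean.size
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

# With n samples in d dimensions, the sample covariance has rank at
# most n - 1, so it is singular whenever n <= d -- the core
# small-sample problem analyzed in the paper.
for n, d in [(5, 3), (3, 5)]:
    samples = rng.standard_normal((n, d))
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)   # unbiased sample covariance
    rank = np.linalg.matrix_rank(cov)
    print(f"n={n}, d={d}: covariance rank {rank} (full rank needs {d})")
```

When the covariance estimate is singular, its log-determinant diverges and the ML density cannot be evaluated, which is why the ratio of training samples to dimensionality matters when choosing the dimensionality of the data.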

Original language | English
---|---
Pages | 458-462
Number of pages | 5
Publication status | Published - 2004 Dec 27
Event | Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging - Kauai, HI, United States. Duration: 2004 Aug 17 → 2004 Aug 19

### Other

Other | Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging
---|---
Country | United States
City | Kauai, HI
Period | 04/8/17 → 04/8/19

### All Science Journal Classification (ASJC) codes

- Engineering (all)

### Cite this

*Estimation of probability density functions from limited training samples*. 458-462. Paper presented at Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging, Kauai, HI, United States.

**Estimation of probability density functions from limited training samples.** / Lee, Chulhee; Choi, Euisun.

Research output: Contribution to conference › Paper

TY - CONF

T1 - Estimation of probability density functions from limited training samples

AU - Lee, Chulhee

AU - Choi, Euisun

PY - 2004/12/27

Y1 - 2004/12/27


UR - http://www.scopus.com/inward/record.url?scp=10444277337&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=10444277337&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:10444277337

SP - 458

EP - 462

ER -