Information measure in terms of the hazard function and its estimate

Research output: Contribution to journal › Article › peer-review


Abstract

It is well known that some information measures, including Fisher information and entropy, can be represented in terms of the hazard function. In this paper, we provide representations of further information measures, including the quantal Fisher information and the quantal Kullback-Leibler (KL) information, in terms of the hazard function and the reverse hazard function. We also provide some estimators of the quantal KL information, which include the Anderson-Darling test statistic, and compare their performance.
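As background for the representation mentioned in the abstract (stated here as a well-known identity, not quoted from the paper): for an absolutely continuous lifetime X with density f, distribution function F, hazard function \lambda(x) = f(x)/(1-F(x)) and cumulative hazard \Lambda(x) = -\log(1-F(x)), one has f(x) = \lambda(x) e^{-\Lambda(x)} and \Lambda(X) \sim \mathrm{Exp}(1), so the differential entropy can be written as

    H(X) = -\int f(x)\,\log f(x)\,dx = \mathbb{E}[\Lambda(X)] - \mathbb{E}[\log \lambda(X)] = 1 - \mathbb{E}[\log \lambda(X)].

The abstract also names the Anderson-Darling test statistic among the estimators of the quantal KL information. The following is a minimal sketch of the classical Anderson-Darling statistic for a fully specified null distribution, offered only as an illustration; the paper's quantal-KL-based estimators are defined in the article itself, and the function name and the normal null below are assumptions made for this example.

import numpy as np
from scipy.stats import norm

def anderson_darling(x, cdf):
    """Classical A^2 statistic of the sample x against a fully specified null CDF."""
    u = np.sort(cdf(np.asarray(x)))      # order statistics U_(1) <= ... <= U_(n)
    u = np.clip(u, 1e-12, 1 - 1e-12)     # guard the logarithms against 0 and 1
    n = len(u)
    i = np.arange(1, n + 1)
    s = np.sum((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))
    return -n - s / n

# Example: a standard normal sample tested against the (known-parameter) normal null.
rng = np.random.default_rng(0)
sample = rng.normal(size=200)
print(anderson_darling(sample, norm.cdf))

A correctly specified null CDF keeps A^2 small (roughly of order one), while a misspecified null inflates it; this is the qualitative behaviour one would expect from any divergence-type estimator built on it.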

Original language: English
Article number: 298
Pages (from-to): 1-11
Number of pages: 11
Journal: Entropy
Volume: 23
Issue number: 3
DOIs
Publication status: Published - 2021 Mar

Bibliographical note

Funding Information:
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2018R1D1A1B07042581).

Publisher Copyright:
© 2021 by the author. Licensee MDPI, Basel, Switzerland.

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • Electrical and Electronic Engineering
