    • Fulltext

       

      Permanent link:
      https://www.ias.ac.in/article/fulltext/pram/084/03/0365-0372

    • Keywords

       

      Complexity; Lempel–Ziv complexity; Shannon’s entropy; approximate entropy; logistic map.

    • Abstract

       

      ‘Complexity’ has several definitions across diverse fields, and complexity measures capture particular aspects of the nature of a signal. Such measures are used to analyse and classify signals and serve as diagnostic tools for distinguishing between periodic, quasiperiodic, chaotic and random signals. Lempel–Ziv (LZ) complexity and approximate entropy (ApEn) are two such popular complexity measures that are also widely used for characterizing biological signals. In this paper, we compare the utility of ApEn, LZ complexity and Shannon’s entropy in characterizing data from a nonlinear chaotic map (the logistic map). We show that the LZ and ApEn complexity measures characterize the data correctly for sequences as short as 20 in length, whereas Shannon’s entropy fails for lengths less than 50. For noisy sequences with 10% uniform noise, Shannon’s entropy works only for lengths greater than 200, while LZ and ApEn succeed for sequences of lengths greater than 30 and 20, respectively.
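
      Below is a minimal sketch in Python of the kind of computation described above, assuming the logistic-map output is binarized about a 0.5 threshold before the measures are computed; the map parameter, initial condition, sequence length and threshold are illustrative assumptions rather than the paper’s settings, and ApEn is omitted for brevity.

      import math

      def logistic_map(x0, r, n):
          """Iterate x_{k+1} = r * x_k * (1 - x_k) and return n values."""
          xs, x = [], x0
          for _ in range(n):
              x = r * x * (1 - x)
              xs.append(x)
          return xs

      def binarize(xs, threshold=0.5):
          """Map each value to '1' or '0' by comparison with a threshold (illustrative choice)."""
          return ''.join('1' if x >= threshold else '0' for x in xs)

      def shannon_entropy(s):
          """Shannon entropy (bits/symbol) of the symbol distribution in s."""
          n = len(s)
          return -sum((s.count(c) / n) * math.log2(s.count(c) / n) for c in set(s))

      def lz_complexity(s):
          """Number of phrases in the Lempel–Ziv (1976) parsing of s."""
          i, c, n = 0, 0, len(s)
          while i < n:
              l = 1
              # extend the current phrase while it can still be copied from earlier content
              while i + l <= n and s[i:i + l] in s[:i + l - 1]:
                  l += 1
              c += 1
              i += l
          return c

      if __name__ == "__main__":
          seq = binarize(logistic_map(x0=0.1, r=4.0, n=20))   # chaotic regime, length 20
          print("sequence        :", seq)
          print("Shannon entropy :", round(shannon_entropy(seq), 3))
          print("LZ complexity   :", lz_complexity(seq))

      Binarizing about a threshold is one common way to obtain a symbolic sequence for LZ complexity and Shannon’s entropy on real-valued data; other symbolizations are possible and the paper’s exact preprocessing may differ.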

    • Author Affiliations

       

      Karthi Balasubramanian¹, Silpa S Nair¹, Nithin Nagaraj¹

      1. Department of Electronics and Communication Engineering, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Amritapuri Campus, Clappana 690 525, India