P S Moharir
Articles written in Journal of Earth System Science
Volume 92 Issue 3 November 1983 pp 223-237
Whether earthquake occurrences follow a Poisson process model is a widely debated issue. The Poisson process model has great conceptual appeal, and those who rejected it under the pressure of empirical evidence have tried to restore it by identifying main events and suppressing foreshocks and aftershocks. The approach here is to estimate the density functions for the waiting times of future earthquakes. For this purpose, the notion of the Gram-Charlier series, a standard method for the estimation of density functions, has been extended, based on the orthogonality properties of certain polynomials such as the Laguerre and Legendre polynomials. It is argued that it is best to estimate density functions in the context of a particular null hypothesis. Using the results of estimation, a simple test has been designed to establish that earthquakes do not occur as independent events, thus violating one of the postulates of the Poisson process model. Both methodological and utilitarian aspects are dealt with.
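As a sketch of the underlying idea, a density for nonnegative waiting times can be expanded in Laguerre polynomials, which are orthonormal with respect to the weight e^(-t); by orthogonality, each series coefficient is just the sample mean of the corresponding polynomial. The sample, truncation order, and library routines below are illustrative assumptions, not the paper's actual computation:

```python
import numpy as np
from numpy.polynomial import laguerre

rng = np.random.default_rng(0)
# Hypothetical sample of waiting times; an exponential sample is what the
# Poisson-process null hypothesis would produce.
sample = rng.exponential(scale=1.0, size=5000)

K = 6  # truncation order of the series (an arbitrary illustrative choice)

# c_k = E[L_k(T)], estimated by the sample mean; this uses the orthonormality
# of the Laguerre polynomials with respect to the weight e^(-t).
coeffs = np.array([
    laguerre.lagval(sample, np.eye(K + 1)[k]).mean() for k in range(K + 1)
])

def density_estimate(t):
    """Laguerre-series density estimate f(t) ~ e^(-t) * sum_k c_k L_k(t)."""
    return np.exp(-t) * laguerre.lagval(t, coeffs)
```

Under the exponential null all coefficients beyond c_0 = 1 should be near zero, so departures of the estimated coefficients from (1, 0, 0, ...) carry the evidence against the null.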
Volume 92 Issue 3 November 1983 pp 261-281
Very little work has been done on generating alternatives to the Poisson process model. The work reported here deals with alternatives to the Poisson process model for earthquakes and checks them against empirical data using the apparatus of statistical hypothesis testing. The strategy used here for generating hypotheses is to compound the Poisson process: the parameter of the Poisson process is replaced by a random variable having a prescribed density function. The density functions used are gamma, chi and extended (gamma/chi). The original distribution is then averaged out with respect to these density functions. For the compound Poisson processes, the waiting-time distributions for future events are derived. As the parameters of the various statistical models for earthquake occurrences are not known, the problem is basically one of composite hypothesis testing. One way of designing a test is to estimate these parameters and use them as true values. Moment matching is used here to estimate the parameters. The results of hypothesis testing using data from Hindukush and North East India are presented.
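A minimal sketch of the compounding idea, assuming a gamma density for the Poisson rate (the other densities follow the same pattern): averaging the exponential waiting time over a gamma-distributed rate yields a Lomax (Pareto II) waiting-time law, which the simulation below checks empirically. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 1.0  # hypothetical gamma shape and rate for the Poisson rate

# Draw a rate for each realization, then the first waiting time given that rate.
rates = rng.gamma(shape=alpha, scale=1.0 / beta, size=100_000)
waits = rng.exponential(scale=1.0 / rates)

# Averaging P(T > t | rate) = exp(-rate * t) over the gamma density gives the
# Lomax (Pareto II) survival function P(T > t) = (beta / (beta + t))**alpha.
t = 1.0
empirical = (waits > t).mean()
analytic = (beta / (beta + t)) ** alpha
```

The heavier-than-exponential tail of the compounded waiting time is exactly the kind of departure from the plain Poisson model that the hypothesis tests are designed to detect.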
Volume 99 Issue 4 December 1990 pp 473-514
The problem of inversion of potential field data is a challenging one because of the difficulty in obtaining a unique solution. This paper identifies various types of nonuniqueness and argues that it is neither possible nor necessary to remove all categories of nonuniqueness. Some types of nonuniqueness are due to human limitations and choice, and these will always persist.
Listing all the solutions, imposing additional constraints on the acceptable solutions, a priori idealization, use of a priori or supplementary information, characterizing what is common to all the solutions, obtaining extremal solutions, seeking a distribution of all possible solutions, etc. are various responses in the face of nonuniqueness. It is shown that all these techniques merely change the form of the nonuniqueness. Some algorithms for obtaining the global minimum of the objective function are also discussed.
The conceptual commonality underlying seemingly different approaches and the possibility of nonunique interpretations of the same numerical results due to different axiomatic contexts are both elucidated.
Volume 100 Issue 4 December 1991 pp 309-319
Drawing an inference about an empirical situation requires posing a problem or asking a question, collecting relevant data and choosing methods to analyse them. Though ideally, under the spell of the muse of objectivity (but unrealistically), one imagines that the inference should be determined by the objective data alone, it can in fact depend on various aspects of the entire process that precedes it. This is brought out by two examples from the earth science literature: seismicity classification based on a pattern recognition algorithm, and early palaeomagnetic studies. The examples are deliberately chosen from disjoint fields to make clear that the objective is not to contribute to any chosen field of earth science, but to make essentially methodological points. These are: (i) no inference is the culmination of a linear or sequential process, but is a part of it; (ii) therefore, one can and should return from the inference to the preceding process and revise it; (iii) all the intermediate and secondary inferences, all the disagreements between what was expected a priori and what happened on the way to the inferences, and any symptoms of the effects of the format of the questions on the inferences should be critically scrutinized; (iv) the total exercise should be viewed not only as one of drawing inferences about the external world, but also of learning the process of doing so. The latter experience is generally transferable to other problem situations.
Volume 101 Issue 1 March 1992 pp 1-11
Parasnis has observed in a presidential address that geophysics is not a Popperian science in any major way: hypotheses are not consciously put forth in a falsifiable format, and much of the effort goes into seeking supporting evidence for favoured hypotheses. Parker evolved a parameter-extremization strategy, initially to tackle the problem of non-uniqueness in geophysical inference. Later he based a hypothesis-testing proposal on it, which is refreshingly Popperian. It has not been adopted widely, partly because it requires global extrema, not local ones, and this has been regarded as a problem without a solution. Attention is drawn to the tunnelling algorithm, which solves the global optimization problem successfully, makes Parker’s Popperian proposal practical, and extends the range of Popperian geophysics.
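The tunnelling idea can be sketched in one dimension: descend to a local minimum, then minimize a "tunnelling function" that is non-positive only where the objective is at least as low as the current minimum, and descend again from any such point. The toy objective, the crude gradient descent, and the grid search below are illustrative simplifications, not the actual algorithm:

```python
import numpy as np

def f(x):
    # Toy multimodal objective: a shallow local minimum near x ~ +1.35
    # and the global minimum near x ~ -1.47.
    return x**4 - 4 * x**2 + x

def local_minimize(f, x, step=1e-3, iters=20000):
    """Crude gradient descent with a numerical derivative (illustration only)."""
    h = 1e-6
    for _ in range(iters):
        g = (f(x + h) - f(x - h)) / (2 * h)
        x -= step * g
    return x

def tunnel(f, x_star, f_star, lam=1.0, span=3.0, n=2001):
    """Tunnelling phase: look for a point, away from x_star, where the
    tunnelling function T(x) = (f(x) - f_star) / |x - x_star|**(2*lam)
    is non-positive, i.e. where f is at least as low as f_star."""
    xs = np.linspace(x_star - span, x_star + span, n)
    xs = xs[np.abs(xs - x_star) > 1e-3]  # exclude the pole at x_star
    T = (f(xs) - f_star) / np.abs(xs - x_star) ** (2 * lam)
    i = np.argmin(T)
    return xs[i] if T[i] <= 0 else None

x = local_minimize(f, 1.0)        # lands in the shallow local minimum
x_new = tunnel(f, x, f(x))        # tunnel into a basin at least as deep
if x_new is not None:
    x = local_minimize(f, x_new)  # descend to the global minimum
```

The key property for the Popperian reading is the stopping rule: when the tunnelling phase finds no non-positive point, the current minimum is certified global, and the extremal-parameter test can be applied with confidence.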
Volume 108 Issue 4 December 1999 pp 223-231
A method of representing surfaces and volumes by a set of geometric points and a small set of auxiliary parameters, based on a generalization of Bernoulli’s notion of lemniscates, is introduced. It provides for easy generation and modification of surfaces and volumes, which can be connected, disjoint, or even have very irregular boundaries. This allows geophysical inversion problems to be solved without constraining the anomalous volumes to ideal or simple forms. This is illustrated by the example of joint inversion of gravity and magnetic data sets attributable to two-dimensional anomalous bodies. A nonlocal optimization algorithm called
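A minimal sketch of the lemniscate idea: a region is described by a few foci and a level, as the set of points where the product of distances to the foci stays below the level (Bernoulli's figure-eight is the two-focus case). The foci and level below are illustrative choices, not the paper's parameterization:

```python
import numpy as np

# The curve prod_i |z - z_i| = c**n generalizes Bernoulli's lemniscate
# (n = 2, foci at +/-a, level c = a gives the figure-eight).
foci = np.array([1.0 + 0.0j, -1.0 + 0.0j])  # hypothetical focal points
c = 1.0                                      # hypothetical level

def inside(z):
    """True where the product of distances to the foci is below the level,
    i.e. z lies inside the (possibly disconnected) lemniscate region."""
    z = np.asarray(z, dtype=complex)
    prod = np.ones_like(z, dtype=float)
    for focus in foci:
        prod *= np.abs(z - focus)
    return prod < c ** len(foci)
```

Moving a focus or changing the level reshapes, merges, or splits the region, which is what makes the representation convenient for perturbing trial bodies during inversion.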
Volume 108 Issue 4 December 1999 pp 269-275
Nonlinear, nonlocal and adaptive optimization algorithms, now readily available, require, when applied to parameter estimation problems, that the data to be inverted not be very noisy. If they are, the algorithm tends to fit the noise rather than smooth it out. Here, the use of Bernstein polynomials is proposed to filter noise out before inversion with a sophisticated optimization algorithm, and their properties are described. Inversion of gravity and magnetic data for basement depth estimation, singly and jointly, and with and without Bernstein preprocessing, is conducted to illustrate that inversion of Bernstein-preprocessed gravity data alone may be slightly superior to joint inversion of the gravity and magnetic data.
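A sketch of Bernstein prefiltering, assuming equispaced samples on [0, 1]: evaluating the Bernstein polynomial of the noisy samples averages neighbouring values with binomial weights, suppressing high-frequency noise at the cost of a small smoothing bias. The signal, noise level, and degree below are illustrative assumptions:

```python
import numpy as np
from math import comb

def bernstein_smooth(y, x):
    """Evaluate the Bernstein polynomial of the samples y (assumed taken at
    the equispaced nodes k/n on [0, 1]) at the points x. The binomial basis
    averages neighbouring samples, filtering out high-frequency noise."""
    n = len(y) - 1
    x = np.asarray(x, dtype=float)
    basis = np.array([comb(n, k) * x**k * (1 - x)**(n - k)
                      for k in range(n + 1)])
    return y @ basis

rng = np.random.default_rng(2)
n = 40
xs = np.linspace(0.0, 1.0, n + 1)
truth = np.sin(2 * np.pi * xs)                     # hypothetical clean anomaly
noisy = truth + rng.normal(0.0, 0.3, size=n + 1)   # noisy observations
smooth = bernstein_smooth(noisy, xs)
```

The prefiltered profile, rather than the raw one, is then handed to the optimization-based inversion, so the algorithm fits the anomaly instead of the noise.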