Parametric analysis
Parametric analysis is a procedure that relies on the assumption that the distribution of the responses we are measuring, once any fixed parts have been taken into account, follows a given probability distribution. We then make statistical inferences about the parameters that characterise this distribution. For example, we may conclude that the response we are measuring is normally distributed (a distribution defined by its mean and variance, which we estimate with the sample mean and sample variance). Under this assumption we can evaluate the differences between the sample means by comparing the size of these differences to the sample variance. However, we need not restrict ourselves to the normal distribution[1].
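As an illustration, the sketch below (not taken from the cited source; the group sizes, means and standard deviations are invented) shows this kind of parametric inference: two groups of responses are treated as samples from normal distributions, and the difference in their sample means is judged against the sample variance with a two-sample t-test.

```python
# Minimal sketch of a parametric comparison of two sample means,
# assuming normally distributed responses (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=10.0, scale=2.0, size=30)   # simulated control responses
treated = rng.normal(loc=11.5, scale=2.0, size=30)   # simulated treated responses

# The t statistic scales the difference in sample means by an estimate
# of the common variance, exploiting the normality assumption.
t_stat, p_value = stats.ttest_ind(control, treated)
print(f"difference in means = {treated.mean() - control.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```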
Parametric analysis also refers to a process whereby the output of a system is observed while a single test parameter is varied over a range and all other test parameters are held fixed. It helps the modeller to understand how the problem solution changes as a function of the individual input parameters. The overall process is referred to as sensitivity analysis, because it shows how sensitive the output of the model is to changes in its inputs[2].
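A minimal sketch of this second sense is given below, using a hypothetical profit model (the model, parameter names and values are illustrative assumptions, not from the cited source): one input is swept over a range while the others are held fixed, and the output is recorded for each value.

```python
# Illustrative parameter sweep: vary a single input while holding the
# other inputs fixed and observe how the model output responds.
import numpy as np

def model_output(demand: float, price: float, unit_cost: float) -> float:
    """Toy profit model used purely for illustration."""
    return demand * (price - unit_cost)

fixed_price, fixed_unit_cost = 5.0, 3.0      # parameters held fixed
demand_values = np.linspace(50, 150, 11)     # the single parameter being varied

for demand in demand_values:
    profit = model_output(demand, fixed_price, fixed_unit_cost)
    print(f"demand = {demand:6.1f} -> profit = {profit:7.1f}")
```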
As well as making distributional assumptions, when carrying out a parametric analysis we may also need to assume the following (a sketch of how some of these assumptions can be checked follows this list)[3]:
- The variability is the same across all groups (homogeneity of variance);
- The responses are numeric and continuous;
- The observations are independent;
- There are no outlying observations unduly affecting the results;
- The results behave in an additive way.
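The sketch below checks two of these assumptions on simulated data (the data and the choice of tests are illustrative assumptions, not prescribed by the cited source): the Shapiro-Wilk test for normality and Levene's test for homogeneity of variance.

```python
# Illustrative checks of two assumptions: normality of each group
# (Shapiro-Wilk) and homogeneity of variance across groups (Levene).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, size=40)   # simulated responses, group A
group_b = rng.normal(12.0, 2.0, size=40)   # simulated responses, group B

# Shapiro-Wilk: a small p-value suggests a departure from normality.
for name, sample in (("A", group_a), ("B", group_b)):
    w_stat, p = stats.shapiro(sample)
    print(f"group {name}: Shapiro-Wilk W = {w_stat:.3f}, p = {p:.3f}")

# Levene: a small p-value suggests the group variances differ.
lev_stat, p = stats.levene(group_a, group_b)
print(f"Levene statistic = {lev_stat:.3f}, p = {p:.3f}")
```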
In human growth research, for example, the main objective of parametric analysis is to describe or predict growth, or differences in growth, as a function of chronological age[4].
Parametric tests
Several types of test provide a set of tools for analysing data. Among others, these include[5]:
- t-test - This test should be used if you have a single factor at two levels (for example, a treatment and a control). We do not advise using the t-test in more complicated situations.
- One-way ANOVA - This test is suitable if your experiment consists of a single factor at more than two levels (for example, three doses of a test compound and a control). The ANOVA provides an overall test of whether the factor-level means differ. If you need to make pairwise comparisons between the individual factor-level means, then you need to use "post hoc" tests, multiple comparison methods or planned comparisons. However, you should note that all of these tests - the overall test and any pairwise tests - use an estimate of the variability obtained from all of the data. This is a more stable and reproducible estimate of the variability because all of the data have been used to calculate it. A sketch of both tests follows this list.
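The sketch below runs both tests on simulated data (the group sizes, means and variances are illustrative assumptions): a two-sample t-test for a single factor at two levels, and a one-way ANOVA for a single factor at three levels.

```python
# Minimal sketch: t-test for two levels, one-way ANOVA for three levels
# of a single factor (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(10.0, 2.0, size=20)   # control group
dose_1 = rng.normal(11.0, 2.0, size=20)    # low dose
dose_2 = rng.normal(12.5, 2.0, size=20)    # high dose

# Single factor at two levels: control vs. one treatment.
t_stat, p_t = stats.ttest_ind(control, dose_1)
print(f"t-test: t = {t_stat:.2f}, p = {p_t:.4f}")

# Single factor at more than two levels: the overall ANOVA test.
# Pairwise "post hoc" comparisons (e.g. Tukey's HSD) would follow if the
# overall test indicates that the factor-level means differ.
f_stat, p_f = stats.f_oneway(control, dose_1, dose_2)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_f:.4f}")
```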
Non-parametric analysis
When addressing sample size, there are two general approaches to the underlying statistics[6]:
- the parametric - this approach assumes that the functional form of the frequency distribution is known and is concerned with testing hypotheses about parameters of the distribution, or estimating the parameters.
- the non-parametric - this approach does not assume the form of the frequency distribution (i.e. it uses distribution-free statistics).
"Parametric analysis is the most powerful. Non-parametric analysis in the most flexible"[7].
Ordinarily, non-parametric procedures are less powerful than the equivalent parametric approaches when the assumptions of the latter are valid. The assumptions provide the parametric approach with extra information, which the non-parametric approach must instead recover from the data, for example by relabelling (permuting) the observations to construct the null distribution that the parametric approach simply assumes. The more relabellings that are possible, the better the potential of the non-parametric approach relative to the parametric approach. Nevertheless, if the assumptions needed for a parametric analysis are not tenable, a non-parametric approach becomes the only valid method of analysis[8]. A sketch of the relabelling idea is given below.
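The following sketch illustrates relabelling with a simple two-sample permutation test on simulated data (the data and the number of permutations are illustrative assumptions): the null distribution of the difference in means is built from the data by repeatedly shuffling the group labels, rather than assumed from a parametric model.

```python
# Illustrative permutation ("relabelling") test: build the null distribution
# of the difference in means from the data instead of assuming it.
import numpy as np

rng = np.random.default_rng(7)
group_a = rng.normal(10.0, 2.0, size=25)   # simulated responses, group A
group_b = rng.normal(11.0, 2.0, size=25)   # simulated responses, group B

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])
n_a = len(group_a)

n_permutations = 10_000
extreme = 0
for _ in range(n_permutations):
    rng.shuffle(pooled)                                  # relabel the observations
    diff = pooled[n_a:].mean() - pooled[:n_a].mean()
    if abs(diff) >= abs(observed):
        extreme += 1

print(f"observed difference = {observed:.2f}")
print(f"permutation p-value = {extreme / n_permutations:.4f}")
```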
Examples of Parametric analysis
- One example of parametric analysis is regression analysis, which is used to quantify the relationship between two or more variables. Regression analysis allows us to determine the effect of one variable on another by measuring how much of the variation in the dependent variable can be attributed to the independent variable (a sketch follows this list).
- Another example of parametric analysis is analysis of variance (ANOVA). ANOVA is used to compare the means of two or more groups defined by an experimental factor. It is commonly used to evaluate the differences between group means and is often used to analyse data from experiments.
- Another example of parametric analysis is factor analysis, which is used to identify the underlying factors or components that explain the variation in a set of observed variables. Factor analysis is useful in situations where the independent variables are not directly observable, and can help identify the relationships between different variables.
- Finally, parametric analysis can also be used to assess the reliability of a measurement. This is done by comparing the observed variance in a measurement with the expected variance, or the estimated variance that would be expected based on the assumed probability distribution. This can be used to determine if the measurement is reliable enough to be used for statistical inference.
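As a concrete illustration of the regression example above, the sketch below fits a simple linear regression to simulated data (the slope, intercept and noise level are invented) and reports the share of variation in the dependent variable attributed to the independent variable.

```python
# Minimal sketch of simple linear regression on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 10.0, size=50)                  # independent variable
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.5, size=50)    # dependent variable with noise

result = stats.linregress(x, y)
r_squared = result.rvalue ** 2                       # share of variation explained
print(f"slope = {result.slope:.2f}, intercept = {result.intercept:.2f}")
print(f"R^2 = {r_squared:.3f}")
```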
Advantages of Parametric analysis
- Parametric analysis can be used to measure the effects of various treatments or conditions on a sample. By making inferences about the parameters of the distribution, we can gain insights into how different treatments and conditions affect the responses of the sample.
- Parametric analysis is often easier to interpret than non-parametric methods, because the results are expressed in terms of a small number of distribution parameters. This makes it a powerful approach to data analysis and interpretation.
- Parametric analysis is also more efficient than non-parametric methods when its assumptions hold. The distributional assumptions supply additional information, so parametric analysis can provide more precise results from fewer samples.
- For the same reason, parametric analysis tends to give more reliable and reproducible results than non-parametric methods, provided that its assumptions are satisfied.
Limitations of Parametric analysis
- Parametric analysis is limited in its ability to account for non-normally distributed data when it assumes that the data are normally distributed.
- Parametric analysis is limited in its robustness to outliers, as it is based on the assumptions that the data are normally distributed and that the mean is a reliable summary of the data.
- Parametric analysis is limited in its ability to accurately describe the relationships between variables when it only takes linear relationships into account.
- Parametric analysis is limited in its ability to identify trends and patterns in data, as it may fail to recognise non-linear relationships between variables.
- Parametric analysis is limited in its ability to identify underlying structures and causal relationships in data, as it is based on a limited set of assumptions.
Other approaches related to Parametric analysis
- Non-parametric analysis: a method of data analysis that does not make assumptions regarding the underlying population distributions. This allows us to make inferences about our data without relying on the usual assumptions about the underlying population.
- Regression analysis: a statistical technique used to predict the value of a dependent variable based on the values of one or more independent variables.
- Factor analysis: a statistical technique used to identify underlying factors that explain the relationships between a set of variables.
- Structural equation modelling: a technique used to assess the causal relationship between variables through the estimation of models that represent the relationship between the variables.
In summary, parametric analysis is one of several approaches to data analysis; it makes assumptions about the underlying population distributions. Other approaches to data analysis include non-parametric analysis, regression analysis, factor analysis, and structural equation modelling. Each of these techniques has its own advantages and disadvantages, and the choice of which approach to use depends on the nature of the data and the research question.
Footnotes
- ↑ S. T. Bate, R. A. Clark, 2014, page 151
- ↑ D. L. Giadrosich, 1995, page 78
- ↑ S. T. Bate, R. A. Clark, 2014, page 152
- ↑ R. C. Hauspie, N. Cameron, L. Molinari, 2004, page 234
- ↑ S. T. Bate, R. A. Clark, 2014, page 151
- ↑ D. L. Giadrosich, 1995, page 121
- ↑ D. W. Scott, 2015
- ↑ W. D. Penny, K. J. Friston, J. T. Ashburner, S. J. Kiebel, T. E. Nichols, 2011, pages 260,261
Parametric analysis — recommended articles
Three-Way ANOVA — Correlational study — Logistic regression model — Descriptive model — Analysis of variance — Principal component analysis — Attribute control chart — Types of indicators — Box diagram
References
- Bate S. T., Clark R. A., (2014), The Design and Statistical Analysis of Animal Experiments, Cambridge University Press, Cambridge, England
- Giadrosich D. L., (1995), Operations Research Analysis in Test and Evaluation, American Institute of Aeronautics and Astronautics, Reston, Virginia, United States of America
- Hauspie R. C., Cameron N., Molinari L., (2004), Methods in Human Growth Research, Cambridge University Press, Cambridge, England
- Penny W. D., Friston K. J., Ashburner J. T., Kiebel S. J., Nichols T. E., (2011), Statistical Parametric Mapping: The Analysis of Functional Brain Images, Elsevier, St. Louis, Missouri, United States of America
- Scott D. W., (2015), Multivariate Density Estimation: Theory, Practice, and Visualization, John Wiley & Sons, Hoboken, New Jersey, United States of America
Author: Monika Mendak