Markov analysis is a method for predicting the behaviour of a random variable based on its current state. It is used to forecast the value of a variable whose future value depends only on its current position or condition, not on the earlier events that brought the variable to that position or condition.
The method is named after the Russian mathematician Andrei Andreyevich Markov, a pioneer in the study of stochastic processes, i.e. processes governed by chance. He first applied this approach to predict the behaviour of gas particles trapped in a closed container. The same analysis is also used to predict the behaviour of groups of people; it is, in essence, a method of forecasting random variables.
In other words, it is a statistical technique for predicting the future behaviour of a variable or system whose next state depends only on its current state, not on its state or behaviour at any time in the past.
Markov analysis is a probabilistic technique, but it does not recommend a decision. It provides probabilistic information about the decision-making situation, which helps the decision-maker to choose; it is therefore a descriptive technique, not an optimisation technique.
Some methods of forecasting share and option prices also rely on Markov analysis. The technique can likewise be used to predict the proportion of a company's receivables that will become bad debts. Companies also use it to estimate future market share and to forecast the brand loyalty of existing customers.
Markov analysis is applied to systems that, over time, show probabilistic movement from one state to another. For example, it can be used to determine the probability that a machine that works one day will break down the next.
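The working/broken machine mentioned above can be sketched as a two-state Markov chain. The transition probabilities below are hypothetical, chosen only to illustrate how today's state determines tomorrow's probabilities:

```python
# Two-state Markov chain: a machine is either "working" or "broken".
# The transition probabilities are hypothetical, for illustration only.
P = {
    "working": {"working": 0.9, "broken": 0.1},
    "broken":  {"working": 0.5, "broken": 0.5},
}

def next_day(dist):
    """Given today's state probabilities, return tomorrow's."""
    out = {state: 0.0 for state in P}
    for state, p in dist.items():
        for nxt, q in P[state].items():
            out[nxt] += p * q
    return out

today = {"working": 1.0, "broken": 0.0}   # the machine works today
print(next_day(today))                     # {'working': 0.9, 'broken': 0.1}
```

Note that the calculation uses only today's distribution, never the machine's earlier history, which is exactly the Markov property described above.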
Brand-switching analysis shows the probability of customers moving from one brand to another. Markov analysis can be applied to many decision-making situations, but one of its most popular applications is this analysis of customer brand switching, a marketing application that focuses on customer loyalty to a store or brand.
A textbook example concerns a community with two petrol stations, Petroco and National. Petroco carried out a survey to check customer loyalty. It turned out that customers were not entirely loyal: under the influence of attractive services or advertising they were willing to change brands. A customer who bought petrol from Petroco in one month would not necessarily buy from the same company the next month.
A decision tree is a logical tool for such an analysis, but it is time-consuming and cumbersome: if Petroco wanted to know the probability that a customer who bought from it in a given month would return the next month, it would be necessary to build a large decision tree. (Bernard W. Taylor, 2006, Introduction to Management Science (10th Edition), Module F.)
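Instead of enumerating a decision tree branch by branch, the same month-to-month question can be answered by repeatedly multiplying a probability vector by a transition matrix. The figures below are illustrative assumptions, not values from the cited text:

```python
# Brand-switching sketch for a Petroco/National-style example.
# Transition probabilities are assumed for illustration; row = brand
# bought this month, column = brand bought next month. Order: [Petroco, National].
P = [
    [0.6, 0.4],   # a Petroco customer stays with probability 0.6
    [0.2, 0.8],   # a National customer switches to Petroco with probability 0.2
]

def step(dist, P):
    """One month of brand switching: vector-matrix multiplication."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# A customer who bought from Petroco this month:
dist = [1.0, 0.0]
for month in range(1, 4):
    dist = step(dist, P)
    print(f"month {month}: Petroco {dist[0]:.3f}, National {dist[1]:.3f}")
```

Each extra month is one more multiplication rather than a whole new layer of tree branches, which is why the matrix formulation scales where the decision tree becomes unwieldy.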
Analysis in use
Markov analysis carried out with the help of computer programs provides a flexible and convenient way of modelling long-term scenarios. Analysts new to such programs should be aware of the potential pitfalls of using them and must keep in mind the simplifying assumptions that the Markov convention imposes. (David Naimark, MD, Murray D. Krahn, Gary Naglie, 1997, Primer on Medical Decision Analysis: Part 5—Working with Markov Processes.)
Markov analysis is also suitable for studying changes in farm size classes. Most earlier Markov analyses of American agriculture used imputed data, because the method's data requirements are strict; the cited study applies Markov analysis to a unique data set. (Clark Edwards, Matthew G. Smith, and R. Neal Peterson, 1990, The Changing Distribution of Farms by Size: A Markov Analysis.)
The most obvious information obtained from Markov analysis is the probability of the system being in a particular state at some future period of time. This is the same type of information that can be obtained from a decision tree.
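Beyond a fixed future period, the same machinery yields the long-run (steady-state) probabilities, the distribution that no longer changes from one period to the next. A minimal sketch, again with assumed transition probabilities:

```python
# Steady-state probabilities of a Markov chain, found by iterating the
# transition step until the distribution stops changing.
# The matrix below is an assumed two-state example.
P = [
    [0.6, 0.4],
    [0.2, 0.8],
]

def steady_state(P, tol=1e-12):
    """Iterate dist -> dist * P until it converges."""
    n = len(P)
    dist = [1.0 / n] * n                      # start from a uniform distribution
    while True:
        nxt = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(dist, nxt)) < tol:
            return nxt
        dist = nxt

print(steady_state(P))   # approximately [0.333, 0.667] for this matrix
```

For this assumed matrix the long-run shares are 1/3 and 2/3 regardless of which state the system started in, which is the sense in which the chain "forgets" its initial condition.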
- Bernard W. Taylor, 2006, Markov Analysis, Introduction to Management Science (10th Edition), Module F.
- Clark Edwards, Matthew G. Smith, and R. Neal Peterson, 1990, The Changing Distribution of Farms by Size: A Markov Analysis.
- Peter S. Liapis, Incorporating Inputs in the Static World Policy Simulation Model.
- David Naimark, MD, Murray D. Krahn, Gary Naglie, 1997, Primer on Medical Decision Analysis: Part 5—Working with Markov Processes.
Author: Aleksandra Jasińska