Musings on Markov Chains

Sheldon Cooper
2 min read · Apr 22, 2021


Mathematics has never been my strongest suit. I failed it right up until university, when I decided to major in maths (more on that later). But instead of rambling about the beautiful yet trying experience of learning to love maths, I’d like to muse about something specific: Markov chains.

[edit: this is an unfinished story. i’ll come back to it]

Matrices are a huge and deeply convoluted chapter, and Markov chains are a subtopic of it. While it’s important to be solid on the basics (e.g. knowing the difference between an augmented matrix and a scalar matrix), there’s no need for that here.

Markov chains answer a whole slew of questions and, generally, just make maths feel more applicable and salient: how do people model the stock market? How do we estimate the market share of different companies within the same market? How can we describe tomorrow’s weather given today’s? How do we quantify the effect of watching a ten-second video on the ads our browser shows us in the future?

Mathematicians answer these questions with stochastic modelling, which describes future outcomes using random variables.

A state matrix (represented by Xt) is a matrix or vector that describes the state of the system in a particular time period t.

In stock market analysis, Xt might represent whether the market is up or down on a given day t:

  • If the stock market is up, it can be illustrated as: Xt = [ 1 0 ]
  • If the stock market is down, it can be illustrated as: Xt = [ 0 1 ]
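As a minimal sketch of how these state vectors get used, here they are in Python with NumPy, together with one step of a Markov chain. The transition probabilities below are made up purely for illustration, not taken from any real market data:

```python
import numpy as np

# Row vectors for the two possible states of the market on day t
up = np.array([[1.0, 0.0]])    # Xt when the market is up
down = np.array([[0.0, 1.0]])  # Xt when the market is down

# Hypothetical transition matrix P: row i holds the probabilities of
# moving from state i today to each state tomorrow (rows sum to 1)
P = np.array([[0.7, 0.3],   # up today   -> 70% up, 30% down tomorrow
              [0.4, 0.6]])  # down today -> 40% up, 60% down tomorrow

# One step of the chain: X(t+1) = X(t) @ P
print(up @ P)    # [[0.7 0.3]] - tomorrow's probabilities if today is up
print(down @ P)  # [[0.4 0.6]] - tomorrow's probabilities if today is down
```

The state vector stays a probability vector after each step: its entries are non-negative and sum to 1.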

In operations research, Xt might represent the market share each company holds in a particular period of time:

  • Xt = [ 0.15 0.2 0.65 ]

0.15 — represents the 15% market share of Company A

0.2 — represents the 20% market share of Company B

0.65 — represents the 65% market share of Company C
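The market-share vector works the same way. As a rough sketch (the switching matrix below is again an assumption about how customers move between the three companies each period, not real data), repeated multiplication shows how the shares could evolve over a few periods:

```python
import numpy as np

# Current market shares of Company A, B, and C
X = np.array([0.15, 0.20, 0.65])

# Hypothetical switching behaviour per period: entry (i, j) is the
# fraction of company i's customers that move to company j
P = np.array([[0.80, 0.10, 0.10],
              [0.05, 0.90, 0.05],
              [0.10, 0.10, 0.80]])

# Project the shares a few periods ahead: X(t+1) = X(t) @ P
for t in range(1, 4):
    X = X @ P
    print(f"period {t}: {np.round(X, 3)}")
```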
