Here are a few things I have found interesting from my first term as a PhD student at Reading University.

I started with the intention of quickly writing a short blog post, but that proved too difficult...

__Parameter estimation for the Ising model__

The Ising model comes from statistical physics, and describes how neighbouring atoms interact with each other to produce a lattice of spin states. Each atom is a variable in the model, leading to a complex multivariate model. For example, a 10x10 lattice is associated with a joint probability mass function over 100 variables with a total of 2^100 states (assuming each atom has 2 spin states). The combination of high dimension (the number of variables) and the dependencies between variables makes it challenging to analyse or compute anything of interest. Monte Carlo methods are needed to approximate the integrals that arise in Bayesian inference for this model.
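As a concrete illustration, here is a minimal sketch of drawing approximate samples from an Ising lattice with the Metropolis algorithm (the lattice size, interaction strength `beta`, and number of sweeps are illustrative choices of mine, not taken from any particular analysis):

```python
import numpy as np

rng = np.random.default_rng(0)

def ising_metropolis(n=10, beta=0.4, sweeps=200):
    """Metropolis sampling for an n x n Ising lattice of +/-1 spins."""
    spins = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps):
        for _ in range(n * n):
            i, j = rng.integers(n), rng.integers(n)
            # Sum of the four nearest neighbours (periodic boundary).
            nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
                  + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
            # Energy change from flipping spin (i, j); accept the flip
            # with probability min(1, exp(-dE)).
            dE = 2 * beta * spins[i, j] * nb
            if rng.random() < np.exp(-dE):
                spins[i, j] *= -1
    return spins

lattice = ising_metropolis()
```

The point of the sketch is that we never evaluate the 2^100-state probability mass function directly; we only ever compare a spin with its four neighbours.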

__Meeting people working in Neuroscience at Reading University__

I have found out more about research in the Systems Engineering department by meeting Ingo Bojak and Slawek Nasuto. Among other things, they work with neural field and neural mass models. Their models include spatial interaction, stochasticity, and uncertain parameter values. The combination of these features means that the challenges of statistical inference may be the same as for the Ising model: high dimension and dependencies between variables.

__Introduction to Neuroscience course__

I am sitting in on an Introduction to Neuroscience undergraduate course to improve my subject knowledge in this area. I have enjoyed learning random facts about the brain. Just to pick out one area, the story of Phineas Gage is fascinating both at a superficial level (how could someone survive an iron bar going through their brain!?) and at a deeper level (how and why did Gage's mind change after the accident?). The story is well told in Descartes' Error by Antonio Damasio.

As well as learning random facts, the good thing about taking a course is seeing how different things link together. I have found the concepts of synaptic plasticity and receptive fields quite interesting. Synaptic plasticity is the process by which connections between neurons change over time, and it underpins learning. Receptive fields are representations of how neurons respond to a range of different stimuli, and they are important for understanding how physical stimuli are transformed into perception. These concepts are fundamental for understanding a wide range of brain functions: hearing, vision, bodily sensation, etc.
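Synaptic plasticity is often summarised by the Hebbian slogan "cells that fire together wire together". As a toy sketch of that idea (the learning rate and the normalisation step are illustrative choices of mine, not a model from the course), a Hebbian weight update between a layer of presynaptic and postsynaptic neurons might look like:

```python
import numpy as np

rng = np.random.default_rng(2)

def hebbian_step(w, pre, post, lr=0.01):
    """Strengthen each connection in proportion to the product of
    presynaptic and postsynaptic activity, then renormalise so the
    weights stay bounded."""
    w = w + lr * np.outer(post, pre)
    return w / np.linalg.norm(w)

w = rng.normal(size=(3, 5))          # 5 presynaptic -> 3 postsynaptic
w /= np.linalg.norm(w)
pre = rng.random(5)                  # presynaptic firing rates
post = rng.random(3)                 # postsynaptic firing rates
for _ in range(100):
    w = hebbian_step(w, pre, post)
```

Connections between co-active neurons grow relative to the rest, which is the minimal version of "the connections change over time, and that underpins learning".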

__RSS Read Paper on SQMC__

I travelled to London for an event held by the Royal Statistical Society on Sequential Quasi-Monte Carlo (SQMC), a new method developed by Mathieu Gerber and Nicolas Chopin (a relative of the great Frédéric, or so I am told...). Only a handful of papers each year are read before the society, so they are usually fairly significant advances. This particular paper was presented as an advance on Sequential Monte Carlo (SMC). SMC methods are useful for statistical inference in time series or dynamic models (e.g. Hidden Markov Models). SQMC targets the same problems but replaces the random numbers with quasi-random (low-discrepancy) sequences, which gives a faster convergence rate, so it can be far more accurate than SMC for the same computational budget.
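To make the SMC baseline concrete, here is a minimal bootstrap particle filter for a toy linear-Gaussian hidden Markov model (the model, parameters, and particle count are my own illustrative choices; SQMC replaces the random propagation and resampling below with carefully ordered quasi-random points):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy hidden Markov model: x_t = 0.9 * x_{t-1} + noise, y_t = x_t + noise.
T = 50
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

def bootstrap_filter(y, n_particles=500):
    """Plain SMC: propagate particles through the transition density,
    weight them by the observation likelihood, and resample."""
    particles = rng.normal(size=n_particles)
    means = []
    for obs in y:
        # Propagate through the AR(1) transition.
        particles = 0.9 * particles + rng.normal(size=n_particles)
        # Weight by the Gaussian observation likelihood (up to a constant).
        logw = -0.5 * (obs - particles) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        means.append(particles.mean())
    return np.array(means)

est = bootstrap_filter(y)   # filtering means, one per time step
```

Every step above uses i.i.d. random draws; the SQMC idea is that spreading those draws more evenly than chance reduces the approximation error.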

__APTS__

The Academy for PhD Training in Statistics (APTS) held a week-long course in Cambridge where I learnt about Statistical Inference and Statistical Computing. There were 50-100 other first-year PhD students in Statistics there from all over the UK and Ireland.

It was nice to hear from statisticians who work in application areas different from mine. I enjoyed finding out about statistical modelling of volcanoes and of the Antarctic ice sheet.

It was also interesting to hear how academic statisticians have worked with government, e.g. through the MRC Biostatistics Unit and the Cabinet Office's National Risk Register.