Quantum computing is an interesting application of the principles of quantum mechanics that has the potential to revolutionize computing and computer science, and it dominated my interest during my first few years studying physics. This paper, written for an introductory quantum mechanics course at Swarthmore College taught by John Townsend, is intended as a brief introduction to the field of quantum computing and should be accessible to anyone with a familiarity with basic linear algebra and wave mechanics. PDF

Large deviation theory is the study of random variables $X_n$ whose probability densities follow a decaying exponential of the form

$p(x) \approx e^{- n I(x) }$

as $n$ grows large, where $I(x)$ is known as the rate function.  This naturally leads to a generalization of both the law of large numbers and the central limit theorem, and forms a natural framework upon which a rigorous theory of equilibrium statistical mechanics can be built.  In addition, large deviation theory has powerful applications even to traditionally “non-equilibrium” situations.  This paper, which was written as part of the senior comps requirement for my math major, provides a basic (although rigorous) introduction to large deviation theory, as well as its application to a simple class of stochastic differential equations.  It should be accessible to anyone who has taken an introductory course in analysis. PDF
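As a rough numerical illustration (not taken from the paper), one can check the scaling $p(x) \approx e^{-nI(x)}$ for the sample mean of $n$ fair coin flips, where Cramér’s theorem gives the rate function $I(a) = a\ln(a/p) + (1-a)\ln\big((1-a)/(1-p)\big)$.  The parameters $n$, the thresholds, and the trial count below are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5            # fair coin
n = 100            # flips per sample
trials = 200_000   # Monte Carlo samples

# Sample means of n Bernoulli(p) flips.
means = rng.binomial(n, p, size=trials) / n

# Compare the empirical estimate of -log P(mean >= a) / n with the
# rate function I(a).  Agreement is only asymptotic in n: at moderate n
# a subexponential prefactor makes the empirical value noticeably larger.
for a in (0.6, 0.65):
    prob = np.mean(means >= a)
    emp = -np.log(prob) / n
    I = a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))
    print(f"a={a}: empirical {emp:.4f}, rate function I(a) = {I:.4f}")
```

Raising $n$ (and the trial count with it) tightens the match, which is exactly the sense in which the exponential estimate above is meant.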

The concept of fluctuations is a fundamental part of statistical mechanics, yet many introductory textbooks are quite vague about the difference between statistical uncertainty and temporal fluctuations. This can lead to much confusion for a student who is new to the subject (it certainly did for me). This paper clarifies the distinction between these two kinds of fluctuations and gives sufficient conditions for their equality. PDF
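A toy model (my own sketch, not from the paper) makes the distinction concrete: for an ergodic two-state Markov chain, the temporal fluctuation along one long trajectory and the statistical uncertainty across an ensemble of independently prepared copies average out to the same stationary value.  The switching probabilities below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-state Markov chain on {0, 1} with switching probabilities q01, q10.
# Its stationary probability of being in state 1 is q01 / (q01 + q10).
q01, q10 = 0.3, 0.1
pi1 = q01 / (q01 + q10)

# Temporal average: one long trajectory, averaged over time.
T = 100_000
s = 0
traj = np.empty(T)
for t in range(T):
    if s == 0:
        s = 1 if rng.random() < q01 else 0
    else:
        s = 0 if rng.random() < q10 else 1
    traj[t] = s
time_avg = traj.mean()

# Statistical (ensemble) average: many independent chains, each relaxed
# to equilibrium and then observed once.
chains, burn = 10_000, 100
states = np.zeros(chains, dtype=int)
for _ in range(burn):
    flip = rng.random(chains)
    states = np.where(states == 0,
                      (flip < q01).astype(int),
                      (flip >= q10).astype(int))
ensemble_avg = states.mean()

print(f"time average     {time_avg:.3f}")
print(f"ensemble average {ensemble_avg:.3f}")
print(f"stationary       {pi1:.3f}")
```

For a non-ergodic system the two averages can disagree, which is one way the sufficient conditions discussed in the paper can fail.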

Statistical mechanics is an interesting subject.  At least in the (equilibrium) classical case, it is ostensibly concerned with deterministic systems, and yet these systems are modeled by probability distributions with great success.  How and why is this possible?  Can these techniques be extended to other domains?  A particularly interesting approach uses E.T. Jaynes’ maximum entropy (MaxEnt) formalism, which attempts to derive the required probability distributions by searching for the most “ignorant” or “uncertain” distribution subject to the constraints of the problem.  In this way, we see a funny connection between complexity, ignorance, and randomness.  This paper, which was part of a final project for Michael Brown’s statistical mechanics seminar at Swarthmore College, gives an introduction to the MaxEnt approach in nonequilibrium statistical mechanics, including a derivation of the Evans-Searles Fluctuation Theorem. PDF
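To see MaxEnt in miniature (this example is Jaynes’ classic “Brandeis dice” problem, not something from the paper): maximizing entropy on a die’s six faces subject only to a prescribed mean gives an exponential-family distribution $p_k \propto e^{\lambda k}$, with the Lagrange multiplier $\lambda$ fixed by the constraint.  A simple bisection finds it:

```python
import numpy as np

faces = np.arange(1, 7)
target_mean = 4.5   # Jaynes' classic constraint: a die with mean 4.5

def mean_for(lam):
    # MaxEnt solution subject to a mean constraint: p_k proportional
    # to exp(lam * k).  Returns the mean of that distribution.
    w = np.exp(lam * faces)
    return (w / w.sum()) @ faces

# mean_for is increasing in lam, so bisect for the Lagrange multiplier.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

p = np.exp(lam * faces)
p /= p.sum()
print("lambda =", round(lam, 4))
print("p      =", p.round(4))
print("mean   =", round(p @ faces, 6))
```

The result skews probability toward the high faces just enough to meet the constraint while staying as “ignorant” as possible about everything else, which is the whole spirit of the formalism.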

I’m in the process of moving from my old website, so it may take a while before everything is updated on the new site. For the time being, you can still access the old website here.

An introduction to the theory behind the popular Markov Chain Monte Carlo sampling algorithm and the Simulated Annealing optimization algorithm. This paper, which was written as a final project for Aimee Johnson’s probability seminar at Swarthmore College, should be accessible to anyone with a basic familiarity with probability.  PDF
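The core idea is short enough to sketch here (a minimal random-walk Metropolis sampler of my own, not the paper’s code): propose a random step and accept it with probability $\min(1, \pi(x')/\pi(x))$, which requires only an unnormalized target density.  The standard-normal target and step size below are arbitrary demo choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized log-density of a standard normal; any target works.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0):
    # Random-walk Metropolis: propose x' = x + step * N(0, 1), accept
    # with probability min(1, target(x') / target(x)).
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

samples = metropolis(50_000)
burned = samples[1000:]          # discard burn-in
print("sample mean    ", round(burned.mean(), 3))   # near 0
print("sample variance", round(burned.var(), 3))    # near 1
```

Simulated annealing reuses the same accept/reject machinery, but targets $e^{-f(x)/T}$ while gradually lowering the temperature $T$ so the chain settles into low-cost states.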

My first class in stochastic processes was taught before we saw a rigorous definition of random variables or their convergence.  This was pretty confusing for me, and so for the first week of presentations I chose to do a “semi-formal” introduction to the concept of random variables (i.e., a step above handwaving but without the formal apparatus of measure theory) suitable for a physics or mathematics student looking for a slightly more formal definition than that given in most introductory textbooks. The various forms of convergence for random variables are introduced, as are the strong and weak versions of the law of large numbers. PDF
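Both versions of the law of large numbers are easy to watch numerically (a quick demo of my own, not part of the presentation): the strong law says the running mean converges along almost every sample path, while the weak law says the probability of a fixed-size deviation shrinks with $n$.  The die-roll setup and thresholds below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

# Strong LLN: along a single sample path, the running mean of i.i.d.
# die rolls settles toward E[X] = 3.5.
rolls = rng.integers(1, 7, size=100_000)
running_mean = np.cumsum(rolls) / np.arange(1, rolls.size + 1)
print("running mean at n=100000:", round(running_mean[-1], 4))

# Weak LLN: across many independent paths, the fraction whose mean
# deviates from 3.5 by more than 0.25 shrinks as n grows
# (convergence in probability).
paths = rng.integers(1, 7, size=(2000, 1000))
cummeans = np.cumsum(paths, axis=1) / np.arange(1, 1001)
fracs = [float(np.mean(np.abs(cummeans[:, n - 1] - 3.5) > 0.25))
         for n in (10, 100, 1000)]
print("P(|mean - 3.5| > 0.25) at n = 10, 100, 1000:", fracs)
```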