### Zé Vinícius

Intern at NASA Ames Research Center, Kepler/K2 Mission

# Sum of Poisson and Gaussian Random Variables

In this short blog post I am going to derive the probability density function of the sum of a Poisson random variable and a Gaussian random variable.

This model appears in many practical scenarios, especially in imaging, where a photon noise component (usually Poisson distributed) gets combined with a thermal noise component (usually assumed to be Gaussian distributed).

Consider an experiment that outputs $Z_i = Y_i + X_i,~i=1, 2, ..., n$. Assume that $Y^{n} \triangleq \{Y_i\}_{i=1}^{n}$ is a sequence of independent but not necessarily identically distributed Poisson random variables, each of which has mean $\lambda_i$. Assume further that $X^{n} \triangleq \{X_i\}_{i=1}^{n}$ is a sequence of iid Gaussian random variables with zero mean and variance $\sigma^2$, $\sigma^2 > 0$.
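As a quick sanity check, the model above is easy to simulate. A minimal sketch using NumPy; the values of $n$, $\lambda_i$, and $\sigma$ here are arbitrary choices, not from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000
lam = rng.uniform(1.0, 10.0, size=n)  # arbitrary Poisson means lambda_i
sigma = 0.5                           # Gaussian noise standard deviation

y = rng.poisson(lam)                  # Y_i ~ Poisson(lambda_i), independent
x = rng.normal(0.0, sigma, size=n)    # X_i ~ N(0, sigma^2), iid
z = y + x                             # observed data Z_i = Y_i + X_i
```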

The first step toward deriving the likelihood function of $Z^{n}$ is to obtain the pdf of each $Z_i$. Since $Z_i$ is the sum of a Poisson random variable and a Gaussian random variable, we could convolve their distributions to get the pdf of $Z_i$. However, let’s try a different approach.

Note that, conditional on $Y_i = y_i$, $Z_i$ follows a Gaussian distribution with mean $y_i$ and variance $\sigma^2$, i.e.,

$$p(z_i \mid y_i) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(z_i - y_i)^2}{2\sigma^2}\right).$$

Now, we can use the Law of Total Probability to derive $p(z_i)$ as follows:

$$p(z_i) = \sum_{y_i=0}^{\infty} p(z_i \mid y_i)\,p(y_i) = \sum_{y_i=0}^{\infty} \frac{\lambda_i^{y_i} e^{-\lambda_i}}{y_i!}\cdot\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(z_i - y_i)^2}{2\sigma^2}\right).$$
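In practice, the infinite sum over the Poisson support has to be truncated. Since the Poisson tail decays quickly, truncating several standard deviations above $\lambda_i$ loses essentially no mass. A minimal sketch of evaluating $p(z_i)$ this way (the truncation rule is a heuristic of mine, not from the post):

```python
import numpy as np
from scipy.stats import norm, poisson

def pdf_z(z, lam, sigma, k_max=None):
    """Truncated-sum evaluation of p(z) for Z = Poisson(lam) + N(0, sigma^2)."""
    if k_max is None:
        # heuristic truncation: cover the bulk of the Poisson mass
        k_max = int(lam + 10.0 * np.sqrt(lam) + 10.0)
    k = np.arange(k_max + 1)
    # sum over k of P(Y = k) * Gaussian density of z with mean k
    return np.sum(poisson.pmf(k, lam) * norm.pdf(z, loc=k, scale=sigma))
```

The result is a Poisson mixture of Gaussians, so it should integrate to one over the real line, which is an easy check on the implementation.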

Using the fact that $Z_i,~i=1, 2, \ldots, n$, are independent random variables, the pdf of $Z^n$ follows as

$$p(z^n) = \prod_{i=1}^{n} p(z_i) = \prod_{i=1}^{n}\sum_{y_i=0}^{\infty} \frac{\lambda_i^{y_i} e^{-\lambda_i}}{y_i!}\cdot\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(z_i - y_i)^2}{2\sigma^2}\right),$$

and the log-likelihood can be written as

$$\log p(z^n) = \sum_{i=1}^{n}\log\sum_{y_i=0}^{\infty} \frac{\lambda_i^{y_i} e^{-\lambda_i}}{y_i!}\cdot\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(z_i - y_i)^2}{2\sigma^2}\right).$$
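One straightforward step toward making this computable is to truncate each inner sum and evaluate it in log space with `scipy.special.logsumexp`, which avoids underflow of the Poisson and Gaussian terms when $\lambda_i$ is large. A sketch under those assumptions, with a heuristic truncation of the Poisson support:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm, poisson

def log_likelihood(z, lam, sigma):
    """Log-likelihood of observations z under Z_i = Poisson(lam_i) + N(0, sigma^2)."""
    total = 0.0
    for zi, li in zip(z, lam):
        k_max = int(li + 10.0 * np.sqrt(li) + 10.0)  # heuristic truncation
        k = np.arange(k_max + 1)
        # stable log of sum_k exp( log P(Y=k) + log N(zi; k, sigma^2) )
        total += logsumexp(poisson.logpmf(k, li) + norm.logpdf(zi, loc=k, scale=sigma))
    return total
```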

Any suggestions on how to make this likelihood computationally tractable? Maybe via approximation theory?