# PDF of Sum of Independent Exponential and Gaussian Random Variable

In this article, we explore the computation of the probability density function (PDF) for the sum of independent Exponential and Gaussian random variables. Let’s consider $Y$, the sum of independent $X$ (Exponential) and $N$ (Gaussian) random variables:

\begin{align} Y &= X + N \end{align}

When dealing with the sum of independent random variables, the PDF can be computed through convolution of their corresponding PDFs. The step-by-step process to derive the closed-form expression is as follows:
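Explicitly, with $X$ exponential with mean $\beta$ and $N$ zero-mean Gaussian with variance $\sigma^2$, the component densities and the convolution are:

\begin{align} f_X(x) &= \frac{1}{\beta}e^{-\frac{x}{\beta}}, \quad x \ge 0,\\ f_N(n) &= \frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{n^2}{2\sigma^{2}}},\\ f_Y(y) &= \int_{-\infty}^{\infty} f_X(t)\, f_N(y-t)\, dt = \int_{0}^{\infty} f_X(t)\, f_N(y-t)\, dt \end{align}

The lower limit of integration becomes $0$ because $f_X(t)$ vanishes for $t < 0$.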


\begin{align} f_Y(y) &= \int_{0}^{\infty} f_X(t)\, f_N(y-t)\, dt\\ &= \frac{1}{\beta}\frac{1}{\sqrt{2\pi\sigma^{2}}}\int_{0}^{\infty}e^{-\frac{t}{\beta} - \frac{(y-t)^2}{2\sigma^{2}}}\, dt\\ &= \frac{1}{\beta}\frac{1}{\sqrt{2\pi\sigma^{2}}}\int_{0}^{\infty}e^{-\frac{1}{2\sigma^{2}}\left(t^2 - 2t\left(y - \frac{\sigma^{2}}{\beta}\right) + y^2\right)}\, dt\\ &= \frac{1}{\beta}\frac{1}{\sqrt{2\pi\sigma^{2}}}\int_{0}^{\infty}e^{-\frac{1}{2\sigma^{2}}\left(\left(t - m\right)^2 - m^2 + y^2\right)}\, dt\\ &= \frac{1}{\beta}e^{\frac{m^2 - y^2}{2\sigma^{2}}}\int_{0}^{\infty}\frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{\left(t - m\right)^2}{2\sigma^{2}}}\, dt\\ &= \frac{1}{\beta}e^{\frac{m^2 - y^2}{2\sigma^{2}}}\,{\rm Q}\left(-\frac{m}{\sigma}\right) \end{align}

Here, $m = y - \frac{\sigma^{2}}{\beta}$.
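The last step uses the substitution $u = (t-m)/\sigma$ to express the truncated Gaussian integral through the Q-function:

\begin{align} \int_{0}^{\infty}\frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{(t-m)^2}{2\sigma^{2}}}\, dt = \int_{-\frac{m}{\sigma}}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{u^2}{2}}\, du = {\rm Q}\left(-\frac{m}{\sigma}\right) \end{align}

where ${\rm Q}(x) = \int_{x}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{u^2}{2}}\, du$ is the Gaussian tail function.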

Therefore, the final PDF of $Y$ is:

\begin{align} f_Y(y) &= \frac{1}{\beta}e^{-\frac{\left(y^2 - m^2\right)}{2\sigma^{2}}}{\rm Q}\left(-\frac{m}{\sigma}\right) \end{align}

In the final PDF expression, a non-zero mean $\mu$ of the Gaussian random variable can be incorporated by replacing $y$ with $y-\mu$. This makes the expression applicable to Gaussian variables with arbitrary means.
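As a quick numerical cross-check of the closed form, here is a minimal Python sketch (NumPy/SciPy assumed; `exp_gauss_pdf` and `conv_pdf` are illustrative helper names, with the same example parameters $\beta = 2$, $\sigma^2 = 4$ used in the MATLAB script below). It verifies that the closed-form PDF integrates to one and agrees with the defining convolution integral:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def exp_gauss_pdf(y, beta=2.0, sigma=2.0, mu=0.0):
    """Closed-form PDF of Y = X + N, X ~ Exp(mean beta), N ~ N(mu, sigma^2)."""
    z = y - mu                      # shift accounts for a non-zero Gaussian mean
    m = z - sigma**2 / beta
    # Q(x) = P(Z > x) for standard normal Z; SciPy exposes it as the survival function
    return (1.0 / beta) * np.exp((m**2 - z**2) / (2.0 * sigma**2)) * norm.sf(-m / sigma)

def conv_pdf(y, beta=2.0, sigma=2.0):
    """Direct numerical evaluation of the convolution integral, for comparison."""
    integrand = lambda t: (1.0 / beta) * np.exp(-t / beta) * norm.pdf(y - t, scale=sigma)
    val, _ = quad(integrand, 0, np.inf)
    return val

# Sanity check 1: the closed-form density should integrate to 1
total, _ = quad(exp_gauss_pdf, -50, 50)
print(total)

# Sanity check 2: agreement with the convolution integral at a sample point
print(exp_gauss_pdf(1.5), conv_pdf(1.5))
```

Both checks should agree to within numerical quadrature tolerance.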

To validate this expression, the MATLAB code provided below generates random variables and plots the theoretical PDF against the histogram of the generated samples. The plot demonstrates a close match between the theoretical and empirical distributions.

```matlab
% Author: Zakir Hussain Shaik
% Date: 02-Nov-2023

clc;
clear;
close all;

% Number of samples
M = 5e7;

% Generate exponential random variables with mean beta
beta = 2;
x = exprnd(beta,M,1);

% Generate Gaussian random variables with mean nMean and variance nVar
nMean = 0;
nVar = 4;
n = nMean + sqrt(nVar)*randn(M,1);

% Add the independent RVs: Exponential and Gaussian
y = x + n;

% Theoretical expression of the PDF
r = -20:0.01:20;
m = (r-nMean) - nVar/beta;

f = (1/beta)*exp(0.5*(m.^2 - (r-nMean).^2)/nVar).*qfunc(-m/sqrt(nVar));

% Plot: empirical histogram vs. theoretical PDF
figure;
histogram(y,'Normalization','pdf','DisplayStyle','stairs'); hold on;
plot(r,f,'*r','MarkerIndices',1:50:length(f));
legend('Histogram','Theoretical','Location','best');
xlabel('$y$','Interpreter','latex');
ylabel('$f_Y(y)$','Interpreter','latex');
title('PDF: $Y = X + N$','Interpreter','latex');
grid on;
xlim([min(r),max(r)]);
```


Results/Plots using MATLAB:

This plot compares the theoretical PDF (red markers) with the histogram of the generated samples, validating the derived expression for the sum of independent Exponential and Gaussian random variables.