I don't think there is any trick that will give you a shortcut to the MLE in general for this problem. I am happy to be corrected on that, though.
For $n=2$ you can get a nice formula, and that should extend to formulas for $n=3$ or $4$ (perhaps with some checking of cases), but the approach I see for these would leave us trying to solve a quintic at $n=5$, so unless the problem has special structure I don't see, it wouldn't work for $n$ larger than $4$.
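To sketch the $n=2$ case: write $f_1$, $f_2$ for the two known component densities, so the mixture density at observation $x_i$ is $p\,f_1(x_i) + (1-p)f_2(x_i)$, and set $g_i = f_1(x_i) - f_2(x_i)$, $h_i = f_2(x_i)$. The score equation is

$$\frac{\partial \log L}{\partial p} = \sum_{i=1}^{n} \frac{g_i}{h_i + p\,g_i} = 0\,.$$

For $n=2$, clearing denominators gives the linear equation $g_1(h_2 + p\,g_2) + g_2(h_1 + p\,g_1) = 0$, so

$$\hat{p} = -\frac{g_1 h_2 + g_2 h_1}{2\, g_1 g_2}\,,$$

truncated to $[0,1]$ if it falls outside.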
In general with mixtures (this is a one-parameter mixture) you don't have some fancy shortcut, and the likelihood is often ill-conditioned, at least in parts of the parameter space, to boot, often making numerical optimization tricky, particularly if you're also estimating parameters of the components (indeed, for some parts of the space the model is degenerate).
In this case, we have mixtures of exponentials, which are sometimes difficult to fit. You may occasionally find the likelihood function is not that nice (e.g. pretty flat near the peak) even with plenty of data.
Fortunately in this case, with two components and just the mixing parameter to estimate, it's relatively nice and numerical approaches should work in most situations; indeed, since the density is linear in $p$, $\log L$ is concave in $p$, so there's a single peak. You can even plot $\log L$ vs $p$ for any given data, if you like (and I'd encourage that if you have data to work with).
Here's the log-likelihood function for three different (made up) small data sets, which have $\hat{p} = 0$, $1$, and $\sim 0.18$ respectively.
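As a concrete sketch of producing that kind of profile (in Python rather than R; the component rates $\lambda_1 = 1$ and $\lambda_2 = 5$ and the simulated data are made up purely for illustration):

```python
import numpy as np

# Hypothetical known component rates (assumptions for illustration only)
LAM1, LAM2 = 1.0, 5.0

def loglik(p, x):
    """Log-likelihood of mixing weight p for the mixture
    p*Exp(LAM1) + (1-p)*Exp(LAM2), evaluated at data x."""
    f1 = LAM1 * np.exp(-LAM1 * x)
    f2 = LAM2 * np.exp(-LAM2 * x)
    return np.sum(np.log(p * f1 + (1 - p) * f2))

# Simulate a small made-up sample with true mixing weight 0.3
rng = np.random.default_rng(0)
n, p_true = 50, 0.3
comp = rng.random(n) < p_true
x = np.where(comp, rng.exponential(1 / LAM1, n), rng.exponential(1 / LAM2, n))

# Evaluate log L on a grid over [0, 1]; plotting grid vs ll gives the
# kind of picture described above
grid = np.linspace(0.0, 1.0, 501)
ll = np.array([loglik(p, x) for p in grid])
p_hat = grid[np.argmax(ll)]
```

Passing `grid` and `ll` to any plotting routine (e.g. `matplotlib.pyplot.plot`) then shows at a glance whether the maximum is interior or at a boundary.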

If you use R, the built-in function `optimize` (it's in stats ... thanks Thomas for correcting my mistake) should be adequate for optimizing the log-likelihood for most data sets on this particular problem. Even so, I'd still suggest plotting it.
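If you're not in R, the same one-dimensional bounded optimization can be done with scipy's `minimize_scalar` (again a sketch: the rates and simulated data below are assumptions, not from the question):

```python
import numpy as np
from scipy.optimize import minimize_scalar

LAM1, LAM2 = 1.0, 5.0  # hypothetical known component rates

def negloglik(p, x):
    """Negative log-likelihood (minimize_scalar minimizes)."""
    f1 = LAM1 * np.exp(-LAM1 * x)
    f2 = LAM2 * np.exp(-LAM2 * x)
    return -np.sum(np.log(p * f1 + (1 - p) * f2))

# Made-up sample with true mixing weight 0.4
rng = np.random.default_rng(1)
n, p_true = 200, 0.4
comp = rng.random(n) < p_true
x = np.where(comp, rng.exponential(1 / LAM1, n), rng.exponential(1 / LAM2, n))

# Bounded scalar optimization over p in [0, 1], analogous to
# optimize(f, c(0, 1)) in R
res = minimize_scalar(negloglik, bounds=(0.0, 1.0), args=(x,),
                      method="bounded")
p_hat = res.x
```

Like R's `optimize`, this uses a derivative-free search over an interval, so it only needs the log-likelihood itself.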
A warning: take care in applying standard asymptotic inference (Wald intervals, likelihood-ratio tests) if the parameter estimate hits the bounds ($\hat{p} = 0$ or $1$), since the usual regularity conditions fail there.