# conorjh

## A perceptron for finding a hyper-exponential distribution.

Recently I have been looking at some data, jitter data for spike trains, which may have a hyper-exponential distribution:

$p(t)=\sum_i p_i s_ie^{-s_i t}$

The idea is that there is a probability $p_i$ of event type $i$, which is in turn exponentially distributed with rate $s_i$; the $p_i$s sum to one. It is hard, even with a ton of data, to fit the parameters, so I thought I might try using a perceptron to do this. I started by changing the sum to an integral, so
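To make the setup concrete, here is a minimal sketch of drawing samples from a hyper-exponential distribution; the particular weights and rates are illustrative choices of mine, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: three event types with mixture probabilities
# p_i (summing to one) and exponential rates s_i.
p = np.array([0.5, 0.3, 0.2])
s = np.array([1.0, 5.0, 20.0])

def sample_hyperexponential(n):
    """Draw n samples: pick an event type i with probability p_i,
    then draw from an exponential with rate s_i (scale 1/s_i)."""
    types = rng.choice(len(p), size=n, p=p)
    return rng.exponential(scale=1.0 / s[types])

t = sample_hyperexponential(10000)
# The sample mean should be close to the mixture mean, sum_i p_i / s_i.
print(t.mean(), np.sum(p / s))
```

The mean check is a quick sanity test: for this mixture the expected value is $\sum_i p_i/s_i$.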

$p(t)=\int_0^\infty f(s) s e^{-st} ds$

which looks a lot like a Laplace transform, though it is hard to know what to do with that. Integrating $p(t)$ over an interval $[t_1,t_2]$ gives

$P(t_1\le t \le t_2)=\int_0^\infty f(s) \left[e^{-st_1}-e^{-st_2}\right] ds$

I imagine a situation where $f(s)$ is compactly supported and can be sensibly discretized, $f_i=f(s_i)$. Treating the terms in square brackets as the inputs at the input nodes and the corresponding $f_i$s as the weights, the predicted $P(t_1\le t\le t_2)$ is the output. The matching data value is the fraction of points falling in the interval: with $a$ and $b$ the indices of the first and last data points inside $[t_1,t_2]$, the interval ends were placed by interpolation with the $(a-1)$th and $(b+1)$th points, and

$p=P(t_1\le t\le t_2) = \sum_i f_i \left[e^{-s_it_1}-e^{-s_it_2}\right] \delta s$
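This forward pass can be sketched as follows; the grid, spacing $\delta s$, and test weight profile are my assumptions for illustration.

```python
import numpy as np

# Discretise f on a grid of rates s with spacing ds (names are mine).
s = np.linspace(0.1, 30.0, 300)   # compact support for f(s)
ds = s[1] - s[0]
f = np.exp(-(s - 5.0) ** 2)       # arbitrary test weight profile
f /= np.sum(f * ds)               # normalise so total probability is 1

def predicted_P(f, t1, t2):
    """Perceptron output: predicted P(t1 <= t <= t2). The inputs at
    the input nodes are the bracketed terms exp(-s*t1) - exp(-s*t2)."""
    inputs = np.exp(-s * t1) - np.exp(-s * t2)
    return np.sum(f * inputs * ds)

# Probabilities over a partition of [0, infinity) should sum to one,
# since the bracketed terms telescope across adjacent intervals.
edges = np.concatenate([np.linspace(0.0, 5.0, 51), [np.inf]])
total = sum(predicted_P(f, a, b) for a, b in zip(edges[:-1], edges[1:]))
print(total)  # close to 1
```

The telescoping check is a useful way to confirm the discretization and normalisation are consistent before any training happens.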

was calculated. The error is now

$E=p-\frac{b-a}{n}$

with $n$ the number of points. The learning rule is applied

$f_i\leftarrow f_i-\eta E \left[e^{-s_it_1}-e^{-s_it_2}\right] \delta s$
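Putting the pieces together, here is a minimal sketch of the whole training loop. The synthetic data, rate grid, learning rate, and the choice of intervals (between every 100th order statistic) are all my assumptions; the post does not fully specify these details.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a known two-rate hyper-exponential; weights and
# rates are illustrative choices.
p_true, s_true = np.array([0.6, 0.4]), np.array([2.0, 10.0])
types = rng.choice(2, size=5000, p=p_true)
t = np.sort(rng.exponential(1.0 / s_true[types]))
n = len(t)

s = np.linspace(0.1, 20.0, 100)            # discretised rate axis
ds = s[1] - s[0]
f = np.full_like(s, 1.0 / (s[-1] - s[0]))  # flat initial guess for f(s)
eta = 0.1                                  # learning rate

for _ in range(100):
    # Sweep intervals between every 100th order statistic; the data
    # estimate of each interval's probability is (b - a)/n.
    for a in range(0, n - 100, 100):
        b = a + 100
        inputs = (np.exp(-s * t[a]) - np.exp(-s * t[b])) * ds
        E = np.sum(f * inputs) - (b - a) / n   # error on this interval
        f -= eta * E * inputs                  # the learning rule

# Worst remaining interval error after training.
worst = max(abs(np.sum(f * (np.exp(-s * t[a]) - np.exp(-s * t[a + 100])) * ds)
                - 100 / n) for a in range(0, n - 100, 100))
print(worst)
```

Note that nothing constrains $f$ to stay non-negative or to resemble the generating mixture; the update only pushes the interval probabilities toward the data, which is consistent with what the post observes below.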

Evolve until happy.

It didn't really work: starting with some known $f(s)$, it evolves until the error is small and the predicted $p(t)$ looks a lot like the real one, but the recovered $f(s)$ doesn't look much like the one used to generate the data. The lesson seems to be that there are lots of ways to produce more-or-less the same distribution.

The code is at

https://sourceforge.net/p/percepthypexp/