Hacker News

The max entropy distribution over (-oo,+oo) is usually taken to be either Gaussian or Laplace (two sided exponential), depending on moment conditions. There is no such thing as a uniform distribution over the real line.
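The moment-condition point above can be checked numerically with the closed-form differential entropies (a quick sketch, using only the standard formulas h = ½·ln(2πeσ²) for the Gaussian and h = 1 + ln(2b) for the Laplace):

```python
import math

# Matched variance (sigma^2 = 1): the Gaussian has the higher differential entropy,
# since it is maxent under a variance constraint.
h_gauss = 0.5 * math.log(2 * math.pi * math.e)   # ~1.419
b = 1 / math.sqrt(2)                             # Laplace scale so that variance 2b^2 = 1
h_laplace_var = 1 + math.log(2 * b)              # ~1.347

# Matched mean absolute deviation (E|X| = 1): now the Laplace wins,
# since it is maxent under an E|X| constraint.
h_laplace_mad = 1 + math.log(2 * 1.0)            # Laplace with b = E|X| = 1
sigma = math.sqrt(math.pi / 2)                   # Gaussian with E|X| = sigma*sqrt(2/pi) = 1
h_gauss_mad = 0.5 * math.log(2 * math.pi * math.e * sigma**2)

print(h_gauss > h_laplace_var)      # Gaussian wins at matched variance
print(h_laplace_mad > h_gauss_mad)  # Laplace wins at matched mean absolute deviation
```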


Of course there is. It's an improper prior (it doesn't integrate to one), see link above. Theoretically the Gaussian is only maximum entropy given a known mean and stddev. An infinite uniform is maxent without any knowledge of the mean and stddev.

In practice though you can use a wide Gaussian if it is more mathematically convenient and get about the same results.
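One way to see the "wide Gaussian in practice" point: for inferring a normal mean with known sigma, the standard conjugate-normal update under a very wide Gaussian prior lands almost exactly on the flat-prior answer (the sample mean). A minimal sketch; the prior width `tau2 = 1e6` is an illustrative choice, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=1.0, size=50)  # data with unknown mean, known sigma = 1
n, sigma2 = len(x), 1.0

# Flat (improper) prior on the mean: the posterior mean is just the sample mean.
flat_post_mean = x.mean()

# Wide Gaussian prior N(0, tau^2): textbook conjugate-normal update.
tau2 = 1e6
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
gauss_post_mean = post_var * (x.sum() / sigma2)

print(abs(flat_post_mean - gauss_post_mean))  # negligible difference
```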


The entropy of any improper prior diverges. There is no point in arguing over which one maximizes entropy; they all do, vacuously. Max entropy is not useful for discriminating amongst probability distributions unless you require them to actually be probability distributions.
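The divergence is easy to see by taking a limit of proper uniforms: the differential entropy of Uniform(-L, L) is log(2L), which grows without bound as L goes to oo. A one-liner check:

```python
import math

# Differential entropy of Uniform(-L, L) is log(2L); it increases without bound,
# so the flat "prior" over the whole real line has infinite entropy in the limit.
entropies = [math.log(2 * L) for L in (1.0, 1e3, 1e6, 1e9)]
print(entropies)
```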


Side note: great how you spell infinity. I've never seen this before. So simple and smart.


Came here to say the same thing. I certainly haven't seen infinity depicted this way before, but it's a nice way. I'm going to use it myself.


It's used as the symbol for infinity in Sympy, and probably other (and older) computer algebra systems.
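For what it's worth, a quick check in SymPy (assuming it's installed), where `oo` is the built-in infinity object:

```python
from sympy import oo, integrate, exp, symbols

print(oo > 10**100)  # oo compares as larger than any finite number

# oo works as an integration bound, e.g. the Gaussian integral over (-oo, +oo):
x = symbols('x')
g = integrate(exp(-x**2), (x, -oo, oo))
print(g)  # sqrt(pi)
```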



