[Short-answer question] We wish to estimate the pdf of X with a function p(x) that maximizes the entropy. It is known from measurements that E[X] = μ and Var[X] = σ². Find the maximum entropy estimate of the pdf of X.
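For reference, a hedged sketch of the standard calculus-of-variations argument (the multipliers λ₀, λ₁, λ₂ below are introduced only for this sketch); it yields the Gaussian with the measured mean and variance:

```latex
\begin{align*}
&\text{maximize } h(p) = -\int p(x)\ln p(x)\,dx
  \quad\text{s.t.}\quad \int p(x)\,dx = 1,\;
  \int x\,p(x)\,dx = \mu,\; \int (x-\mu)^2 p(x)\,dx = \sigma^2 .\\
&\text{Setting the variation of the Lagrangian to zero gives }
  p(x) = e^{\lambda_0 - 1 + \lambda_1 x + \lambda_2 (x-\mu)^2},\\
&\text{and matching the three constraints forces } \lambda_1 = 0 \text{ and}\\
&\qquad p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)},
  \qquad\text{i.e. } X \sim \mathcal{N}(\mu,\sigma^2).
\end{align*}
```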
[Short-answer question] If random variables X and Y satisfy: P{X=-2, Y=-1}=1/6, P{X=2, Y=-1}=1/6, P{X=-2, Y=0}=1/3, P{X=2, Y=0}=1/6, P{X=-2, Y=1}=1/24, P{X=2, Y=1}=1/8. Then, the joint entropy of X and Y is ____________, the co...
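The statement above is truncated after "the co...", so only the joint-entropy blank can be addressed here. A minimal Python sketch that computes H(X,Y) in bits from the given joint pmf (the dictionary layout is just one convenient encoding of the table):

```python
from math import log2

# Joint pmf from the problem statement: keys are (x, y), values are probabilities.
joint = {
    (-2, -1): 1/6,  (2, -1): 1/6,
    (-2,  0): 1/3,  (2,  0): 1/6,
    (-2,  1): 1/24, (2,  1): 1/8,
}

# Sanity check: the six probabilities sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Joint entropy H(X,Y) = -sum_{x,y} p(x,y) log2 p(x,y), in bits.
h_xy = -sum(p * log2(p) for p in joint.values())
print(f"H(X,Y) = {h_xy:.4f} bits")
```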
[Short-answer question] Entropy of functions of a random variable. Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps: H(X, g(...
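The chain of steps is cut off above; as a sketch, the standard argument this classic exercise asks one to justify combines two applications of the chain rule:

```latex
\begin{align*}
H(X, g(X)) &= H(X) + H(g(X) \mid X)       && \text{(chain rule)} \\
           &= H(X)                         && (g(X)\text{ is determined by }X,\text{ so } H(g(X)\mid X)=0) \\
H(X, g(X)) &= H(g(X)) + H(X \mid g(X))     && \text{(chain rule)} \\
           &\ge H(g(X))                    && (H(X \mid g(X)) \ge 0)
\end{align*}
```

Combining the two chains gives H(g(X)) ≤ H(X).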