
# Information Theory | ENTROPY RATE


## ENTROPY RATE

If we have a sequence of $n$ random variables, a natural question to ask is: How does the entropy of the sequence grow with $n$ ? We define the entropy rate as this rate of growth as follows.
Definition The entropy rate of a stochastic process $\left\{X_i\right\}$ is defined by
$$H(\mathcal{X})=\lim _{n \rightarrow \infty} \frac{1}{n} H\left(X_1, X_2, \ldots, X_n\right)$$
when the limit exists.
We now consider some simple examples of stochastic processes and their corresponding entropy rates.

1. Typewriter.
Consider the case of a typewriter that has $m$ equally likely output letters. The typewriter can produce $m^n$ sequences of length $n$, all of them equally likely. Hence $H\left(X_1, X_2, \ldots, X_n\right)=\log m^n$ and the entropy rate is $H(\mathcal{X})=\log m$ bits per symbol.
2. $X_1, X_2, \ldots$ are i.i.d. random variables. Then
$$H(\mathcal{X})=\lim \frac{H\left(X_1, X_2, \ldots, X_n\right)}{n}=\lim \frac{n H\left(X_1\right)}{n}=H\left(X_1\right),$$
which is what one would expect for the entropy rate per symbol.
3. Sequence of independent but not identically distributed random variables. In this case,
$$H\left(X_1, X_2, \ldots, X_n\right)=\sum_{i=1}^n H\left(X_i\right)$$
but the $H\left(X_i\right)$'s are not all equal. We can choose a sequence of distributions on $X_1, X_2, \ldots$ such that the limit of $\frac{1}{n} \sum H\left(X_i\right)$ does not exist. An example of such a sequence is a random binary sequence

where $p_i=P\left(X_i=1\right)$ is not constant but a function of $i$, chosen carefully so that the limit defining the entropy rate does not exist. For example, let
$$p_i= \begin{cases}0.5 & \text { if } 2 k<\log \log i \leq 2 k+1, \\ 0 & \text { if } 2 k+1<\log \log i \leq 2 k+2\end{cases}$$
for $k=0,1,2, \ldots$
Then there are arbitrarily long stretches where $H\left(X_i\right)=1$, followed by exponentially longer segments where $H\left(X_i\right)=0$. Hence, the running average of the $H\left(X_i\right)$ will oscillate between 0 and 1 and will not have a limit. Thus, $H(\mathcal{X})$ is not defined for this process.
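The oscillation in example 3 can be illustrated numerically. Rather than reproduce the doubly-logarithmic construction exactly (which requires astronomically large $i$), the sketch below builds a sequence of per-symbol entropies out of alternating blocks of 1's and 0's whose lengths grow geometrically; the block-length ratio of 4 is an illustrative choice, not taken from the text. Each new block dominates everything before it, so the running average keeps swinging and never settles to a limit:

```python
import itertools

# Build a sequence of per-symbol entropies H(X_i): alternating blocks of
# 1's and 0's with geometrically growing lengths (ratio 4 is an
# illustrative choice mimicking the doubly-logarithmic construction).
h_values = []
block_len, bit = 1, 1
while len(h_values) < 10000:
    h_values.extend([bit] * block_len)
    block_len *= 4   # each block dwarfs everything that came before it
    bit = 1 - bit    # alternate between H(X_i) = 1 and H(X_i) = 0

# Running average (1/n) * sum_{i <= n} H(X_i)
running = list(itertools.accumulate(h_values))
averages = [s / (k + 1) for k, s in enumerate(running)]

# The running average swings between roughly 0.2 and 0.8 and does not
# converge; with slower-growing blocks it would swing between 0 and 1.
print(min(averages[100:]), max(averages[100:]))
```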
We can also define a related quantity for entropy rate:
$$H^{\prime}(\mathcal{X})=\lim _{n \rightarrow \infty} H\left(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1\right)$$
when the limit exists.
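For a stationary Markov chain, this conditional-entropy limit $H^{\prime}(\mathcal{X})$ reduces to the entropy of the next step averaged over the stationary distribution, $\sum_i \mu_i H(P_{i\cdot})$, a standard fact. A minimal sketch for a two-state chain (the transition matrix is an arbitrary illustrative choice):

```python
import math

def h_bits(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two-state stationary Markov chain; the transition matrix is an
# illustrative choice, not taken from the text.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution of a 2-state chain: mu_0 = P_10 / (P_01 + P_10)
mu0 = P[1][0] / (P[0][1] + P[1][0])
mu = [mu0, 1 - mu0]

# H'(X) = sum_i mu_i * H(P_i.) for a stationary Markov chain
h_rate = sum(mu[i] * h_bits(P[i]) for i in range(2))
print(h_rate)  # entropy rate in bits per symbol
```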

## EXAMPLE: ENTROPY RATE OF A RANDOM WALK ON A WEIGHTED GRAPH

As an example of a stochastic process, let us consider a random walk on a connected graph (Figure 4.2). Consider a graph with $m$ nodes labeled $\{1,2, \ldots, m\}$, with weight $W_{i j} \geq 0$ on the edge joining node $i$ to node $j$. (The graph is assumed to be undirected, so that $W_{i j}=W_{j i}$; we set $W_{i j}=0$ if there is no edge joining nodes $i$ and $j$.)

A particle walks randomly from node to node in this graph. The random walk $\left\{X_n\right\}$, $X_n \in\{1,2, \ldots, m\}$, is a sequence of vertices of the graph. Given $X_n=i$, the next vertex $j$ is chosen from among the nodes connected to node $i$ with probability proportional to the weight of the edge connecting $i$ to $j$. Thus, $P_{i j}=W_{i j} / \sum_k W_{i k}$.
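The transition rule $P_{i j}=W_{i j} / \sum_k W_{i k}$ is a one-liner in code. The weight matrix below is an arbitrary symmetric example, not one from the text:

```python
# Symmetric weight matrix W_ij (illustrative choice); W_ij = 0 means
# no edge between nodes i and j.
W = [
    [0, 1, 2],
    [1, 0, 3],
    [2, 3, 0],
]

# Transition probabilities: P_ij = W_ij / sum_k W_ik
P = [[w / sum(row) for w in row] for row in W]

for row in P:
    print(row)  # each row sums to 1
```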

In this case, the stationary distribution has a surprisingly simple form, which we will guess and verify. The stationary distribution for this Markov chain assigns probability to node $i$ proportional to the total weight of the edges emanating from node $i$. Let
$$W_i=\sum_j W_{i j}$$
be the total weight of edges emanating from node $i$, and let
$$W=\sum_{i, j: j>i} W_{i j}$$
be the sum of the weights of all the edges. Then $\sum_i W_i=2 W$, since each edge weight $W_{i j}$ is counted twice in the sum.
We now guess that the stationary distribution is
$$\mu_i=\frac{W_i}{2 W}$$
We verify that this is the stationary distribution by checking that $\mu P=\mu$. Here
$$\begin{aligned} \sum_i \mu_i P_{i j} & =\sum_i \frac{W_i}{2 W} \frac{W_{i j}}{W_i} \\ & =\sum_i \frac{1}{2 W} W_{i j} \\ & =\frac{W_j}{2 W} \\ & =\mu_j . \end{aligned}$$
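The guess-and-verify argument above can be checked numerically. The sketch below uses a small symmetric weight matrix (an arbitrary illustrative choice) to confirm that $\mu_i = W_i / 2W$ satisfies $\mu P = \mu$:

```python
# Symmetric weight matrix (illustrative choice, not from the text).
W = [
    [0, 1, 2],
    [1, 0, 3],
    [2, 3, 0],
]
m = len(W)

W_i = [sum(W[i]) for i in range(m)]            # total weight at node i
total = sum(W_i) / 2                            # W = sum of all edge weights
mu = [W_i[i] / (2 * total) for i in range(m)]   # guessed stationary dist.

# Transition matrix P_ij = W_ij / W_i
P = [[W[i][j] / W_i[i] for j in range(m)] for i in range(m)]

# (mu P)_j = sum_i mu_i P_ij should reproduce mu_j
muP = [sum(mu[i] * P[i][j] for i in range(m)) for j in range(m)]
print(mu)
print(muP)  # identical to mu up to floating-point error
```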

