
# Time-Series Analysis | The classical multiple regression model


## The classical multiple regression model

In a multiple regression model, a response variable $Y$ is related to $k$ predictor variables, $X_1, X_2, \ldots, X_k$, as follows,
$$Y=\beta_0+\beta_1 X_1+\cdots+\beta_k X_k+\xi,$$

where $\xi$ is assumed to be uncorrelated white noise, often taken as i.i.d. $N\left(0, \sigma^2\right)$. When time series data are used to fit a multiple regression model, we often write Eq. (3.1) as
\begin{aligned} Y_t & =\beta_0+\beta_1 X_{1, t}+\cdots+\beta_k X_{k, t}+\xi_t \\ & =\mathbf{X}_t^{\prime} \boldsymbol{\beta}+\xi_t \end{aligned}
where $t$ refers to time,
\begin{aligned} \mathbf{X}_t^{\prime} & =\left[1, X_{1, t}, X_{2, t}, \ldots, X_{k, t}\right] \\ \boldsymbol{\beta} & =\left[\beta_0, \beta_1, \beta_2, \ldots, \beta_k\right]^{\prime} \end{aligned}
and in time series regression $\xi_t$ is normally assumed to follow a time series model such as $\operatorname{AR}(p)$.
When we have time series data from time $t=1$ to $t=n$, we can present Eq. (3.2) in the matrix form,
$$\underset{(n \times 1)}{\mathbf{Y}}=\underset{(n \times(k+1))}{\mathbf{X}} \underset{((k+1) \times 1)}{\boldsymbol{\beta}}+\underset{(n \times 1)}{\boldsymbol{\xi}}$$
where
$$\mathbf{Y}=\left[\begin{array}{c} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{array}\right], \quad \mathbf{X}=\left[\begin{array}{c} \mathbf{X}_1^{\prime} \\ \mathbf{X}_2^{\prime} \\ \vdots \\ \mathbf{X}_n^{\prime} \end{array}\right]=\left[\begin{array}{ccccc} 1 & X_{1,1} & X_{2,1} & \cdots & X_{k, 1} \\ 1 & X_{1,2} & X_{2,2} & \cdots & X_{k, 2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & X_{1, n} & X_{2, n} & \cdots & X_{k, n} \end{array}\right], \quad \boldsymbol{\beta}=\left[\begin{array}{c} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{array}\right], \quad \boldsymbol{\xi}=\left[\begin{array}{c} \xi_1 \\ \xi_2 \\ \vdots \\ \xi_n \end{array}\right],$$
and $\boldsymbol{\xi}$ follows an $n$-dimensional multivariate normal distribution $N(\mathbf{0}, \boldsymbol{\Sigma})$. Given $\boldsymbol{\Sigma}$, the generalized least squares (GLS) estimator
$$\hat{\boldsymbol{\beta}}=\left(\mathbf{X}^{\prime} \mathbf{\Sigma}^{-1} \mathbf{X}\right)^{-1} \mathbf{X}^{\prime} \mathbf{\Sigma}^{-1} \mathbf{Y}$$
has the smallest possible variance among all unbiased estimators $\widetilde{\boldsymbol{\beta}}$ of $\boldsymbol{\beta}$ for any linear combination of the form
$$\boldsymbol{c}^{\prime} \tilde{\boldsymbol{\beta}}=c_0 \tilde{\beta}_0+c_1 \tilde{\beta}_1+\cdots+c_k \tilde{\beta}_k .$$
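The GLS estimator above can be computed directly with NumPy. The sketch below is illustrative and not from the text: it assumes AR(1) errors $\xi_t=\phi\,\xi_{t-1}+e_t$, whose stationary covariance matrix has entries $\Sigma_{s,t}=\sigma^2\phi^{|s-t|}/(1-\phi^2)$, and all parameter values ($n$, $k$, $\phi$, $\sigma$, the true coefficients) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 2

# Design matrix with an intercept column, as in Eq. (3.2).
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])

# AR(1) errors: xi_t = phi * xi_{t-1} + e_t (stationary start for xi_0).
phi, sigma = 0.7, 1.0
e = rng.normal(scale=sigma, size=n)
xi = np.zeros(n)
xi[0] = e[0] / np.sqrt(1 - phi**2)
for t in range(1, n):
    xi[t] = phi * xi[t - 1] + e[t]
y = X @ beta_true + xi

# Stationary AR(1) covariance: Sigma_{s,t} = sigma^2 phi^{|s-t|} / (1 - phi^2).
idx = np.arange(n)
Sigma = (sigma**2 / (1 - phi**2)) * phi ** np.abs(idx[:, None] - idx[None, :])

# GLS: beta_hat = (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y,
# using solve() instead of forming Sigma^{-1} explicitly.
Si_X = np.linalg.solve(Sigma, X)          # Sigma^{-1} X
beta_gls = np.linalg.solve(X.T @ Si_X, Si_X.T @ y)
print(beta_gls)  # should lie near beta_true
```

In practice $\boldsymbol{\Sigma}$ is unknown; feasible GLS replaces it with an estimate, for instance by fitting an AR model to the OLS residuals and iterating.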

## Multivariate multiple regression model

Now, suppose that instead of one response variable in Eq. (3.2), we have $m$ response time series variables related to these $k$ predictor time series variables, that is,
\begin{aligned} Y_{1, t} & =\beta_{1,0}+\beta_{1,1} X_{1, t}+\cdots+\beta_{1, k} X_{k, t}+\xi_{1, t}=\mathbf{X}_t^{\prime} \boldsymbol{\beta}_{(1)}+\xi_{1, t} \\ Y_{2, t} & =\beta_{2,0}+\beta_{2,1} X_{1, t}+\cdots+\beta_{2, k} X_{k, t}+\xi_{2, t}=\mathbf{X}_t^{\prime} \boldsymbol{\beta}_{(2)}+\xi_{2, t} \\ & \vdots \\ Y_{m, t} & =\beta_{m, 0}+\beta_{m, 1} X_{1, t}+\cdots+\beta_{m, k} X_{k, t}+\xi_{m, t}=\mathbf{X}_t^{\prime} \boldsymbol{\beta}_{(m)}+\xi_{m, t}, \end{aligned}
or
$$\mathbf{Y}_t^{\prime}=\mathbf{X}_t^{\prime}\left[\boldsymbol{\beta}_{(1)}, \boldsymbol{\beta}_{(2)}, \ldots, \boldsymbol{\beta}_{(m)}\right]+\boldsymbol{\xi}_t^{\prime}$$
where
\begin{aligned} \mathbf{Y}_t^{\prime} & =\left[Y_{1, t}, Y_{2, t}, \ldots, Y_{m, t}\right], \\ \mathbf{X}_t^{\prime} & =\left[1, X_{1, t}, X_{2, t}, \ldots, X_{k, t}\right], \\ \boldsymbol{\beta}_{(i)} & =\left[\beta_{i, 0}, \beta_{i, 1}, \ldots, \beta_{i, k}\right]^{\prime}, \quad i=1,2, \ldots, m, \end{aligned}
and
$$\boldsymbol{\xi}_t^{\prime}=\left[\xi_{1, t}, \xi_{2, t}, \ldots, \xi_{m, t}\right].$$
For $i=1,2, \ldots, m$ and time $t=1$ to $t=n$, let
$$\mathbf{Y}_{(i)}=\left[\begin{array}{c} Y_{i, 1} \\ Y_{i, 2} \\ \vdots \\ Y_{i, n} \end{array}\right], \quad \mathbf{X}=\left[\begin{array}{c} \mathbf{X}_1^{\prime} \\ \mathbf{X}_2^{\prime} \\ \vdots \\ \mathbf{X}_n^{\prime} \end{array}\right]=\left[\begin{array}{ccccc} 1 & X_{1,1} & X_{2,1} & \cdots & X_{k, 1} \\ 1 & X_{1,2} & X_{2,2} & \cdots & X_{k, 2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & X_{1, n} & X_{2, n} & \cdots & X_{k, n} \end{array}\right], \quad \text { and } \quad \boldsymbol{\xi}_{(i)}=\left[\begin{array}{c} \xi_{i, 1} \\ \xi_{i, 2} \\ \vdots \\ \xi_{i, n} \end{array}\right]$$
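Because all $m$ responses share the same design matrix $\mathbf{X}$, each $\boldsymbol{\beta}_{(i)}$ can be estimated equation by equation, which is equivalent to one matrix least-squares fit with the responses stacked as columns. A minimal sketch, with all dimensions and parameter values invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 150, 2, 3

# Common design matrix X (n x (k+1)), shared by all m response series.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])

# B stacks beta_(1), ..., beta_(m) as columns, one column per response.
B_true = rng.normal(size=(k + 1, m))
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))  # n x m response matrix

# Least squares column by column in a single call:
# B_hat = (X'X)^{-1} X'Y, computed stably via lstsq.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B_hat.shape)  # (k+1, m)
```

With i.i.d. errors this coincides with OLS applied separately to each $\mathbf{Y}_{(i)}$; when the $\xi_{i,t}$ are cross-correlated or serially dependent, a seemingly-unrelated-regressions or GLS treatment is needed instead.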

