Course: ELEC5300
Author: xlidq [2017 intake, CPEG]
Posted: 2019-12-19 17:21:17
Term: 2019 Fall
Instructor: MOW, Wai Ho
My impression of the professor: pretty good; the lectures are quite clear
Grading scheme: assignment 15% + midterm 35% + final 50% (this sem's breakdown isn't a useful reference)
Grade in this course: generous / fairly good (same caveat as above)
Content: It essentially picks up where ELEC2600 left off. Although the title says "random processes", I'd say the content is closer to statistical signal processing, and it differs quite a bit from the random processes course on the MATH side: MATH3425 mainly covers models of various random processes, whereas this course is about using statistical properties to estimate and predict signals. I've attached the syllabus below. The course mainly teaches you properties of random processes, with the emphasis on how to apply them; it's basically various filters used over and over, which matters a lot for communications and is also quite useful for signal processing. Once the course has covered the Wiener filter, you can derive the 2D version yourself, which helps with understanding the corresponding parts of image processing.
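To make the "filters used over and over" point concrete, here is a minimal sketch (my own, not course material) of a discrete-time FIR Wiener filter: it estimates a Gauss-Markov (AR(1)) signal from a noisy observation by solving the normal equations that follow from the orthogonality principle. The AR coefficient, filter length, and noise level are arbitrary choices for illustration.

```python
# Minimal FIR Wiener filter sketch: estimate an AR(1) (Gauss-Markov) signal d[n]
# from a noisy observation x[n] = d[n] + v[n] by solving the normal equations
# R_x h = r_dx (orthogonality principle). Parameters are arbitrary, for illustration.
import numpy as np

rng = np.random.default_rng(0)
N, L, a, sigma_v = 100_000, 10, 0.9, 1.0   # samples, filter length, AR pole, noise std

# Desired signal: d[n] = a*d[n-1] + w[n], driven by white Gaussian noise
w = rng.standard_normal(N)
d = np.zeros(N)
for n in range(1, N):
    d[n] = a * d[n - 1] + w[n]

x = d + sigma_v * rng.standard_normal(N)    # noisy observation

def xcorr(u, v, lag):
    # sample estimate of E[u[n+lag] * v[n]] for lag >= 0
    return np.mean(u[lag:N] * v[:N - lag])

r_x  = np.array([xcorr(x, x, k) for k in range(L)])   # autocorrelation of x
r_dx = np.array([xcorr(d, x, k) for k in range(L)])   # cross-correlation E[d[n] x[n-k]]

R_x = np.array([[r_x[abs(i - j)] for j in range(L)] for i in range(L)])  # Toeplitz
h = np.linalg.solve(R_x, r_dx)              # Wiener-Hopf normal equations

d_hat = np.convolve(x, h)[:N]               # causal FIR estimate of d[n]
print("MSE of raw observation:", np.mean((d - x) ** 2))
print("MSE of Wiener estimate:", np.mean((d - d_hat) ** 2))
```

The same normal-equation structure is what carries over to the 2D case for images, with 2D autocorrelations (or power spectra) in place of the 1D ones.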
I don't think it's very hard; this course used to be 4000-level. Judging from my experience in class, the prerequisites are effectively ELEC2100 and ELEC2600, since Laplace and Z transforms get used heavily and the lectures assume you're already comfortable with signals and systems. Some ELEC3100 material also comes up, but it's not a big deal; just self-study it when you run into it. According to the PhD students, this counts as one of the easier PG courses, and I saw quite a few UGs I know taking it, mostly EE and CPEG. Overall, if you want to learn a bit more probability but don't want to jump straight to the MATH version, this is a good stepping stone.
The biggest problem with this course is the 6 pm to 9 pm slot: three hours straight, and I never once stayed fully awake through a whole lecture. So you really have to do the reading yourself afterwards, because in class you'll mostly be half asleep.
As for the grade: it's an ELEC course and a PG course, need I say more.
Useful, easy to grind a good grade in, and not that hard.
syllabus:
Lecture 1: Review of Probability Theory and Random Variables
· Review of Basic Probability
o Random Experiments
o Axioms of Probability
o Conditional Probability and Independence
· Single Random Variables (RVs)
o Definition and equivalent events
o Specifying a RV (CDF, PDF, PMF, Characteristic Function)
o Expectation of a RV and functions of a RV, Moments
o Mean as the minimum mean squared error estimate
· Multiple RVs
o Joint distribution and density functions
o Conditional density and independence
o Expectation and joint moments
o Correlation and Covariance
Lecture 2: Transformations, Random processes, Convergence
· Affine Functions of RVs
o Density
o Mean and variance
· Jointly Gaussian RVs
o Definition
o Properties
· Random Processes and Sequences
o Definitions and interpretations
o Convergence of random sequences
o Specifying RPs by joint distributions/densities
o Mean, autocorrelation and covariance functions
o Examples: IID and Gaussian RPs
Lecture 3: Stationary random processes, Power spectral density
· Stationary and Wide-Sense Stationary RPs
o Properties of the autocorrelation of WSS RPs
· Ergodicity
· Multiple Random Processes
o Cross correlation function
· Power Spectral Density
o Cross Power Spectral Density
· Important Random Processes
o Continuous Time White Noise
o Bandlimited White Noise
o Gauss-Markov Process
o Random Telegraph Signal
o Wiener Process
Lecture 4: Response of Linear Systems to Random Inputs
· Continuous time linear systems: a review
· Filtering WSS Random Processes
o Mean, Cross and Autocorrelations, Power Spectral Density
o Generating the Gauss-Markov Process
o Filtering the Gauss-Markov and Bandlimited White Noise Processes
o Regular Processes
o Spectral Factorization
o Noise equivalent bandwidth
· Transient analysis of linear systems
o Response to initial conditions and input
o Wiener process
Lecture 5: Estimation and optimal filtering
· Minimum mean squared error estimation
o Linear estimation from single variable data
o Orthogonality principle
o Estimation from multiple variable data
o Whitening viewpoint
o Optimal nonlinear estimator
o Generalized orthogonality principle
o Estimating Gaussian Random Variables
· Optimal linear filtering
o Parameter optimization
o Continuous-time Wiener filtering
o Orthogonality principle
§ Non-causal filter
§ Causal filter for white data
§ Causal filter for colored data
Lecture 6: Discrete-time Wiener Filtering
· Discrete-time linear filtering of random processes
· Discrete-time regular processes
· Discrete-time Wiener filtering
· Comparison of estimation from N random variables with continuous/discrete-time Wiener filtering
Lecture 7: Parameter Estimation
· Maximum-likelihood estimation
o Introduction
o Single parameters
o Multiple parameters
· Properties of Estimates
o Biasedness and convergence
o Cramer-Rao bound
§ Proof
§ Alternative form and example
Lecture 8: Parameter Estimation (cont)
· Estimating the autocorrelation
· Estimating the power spectral density
· Bayesian Estimation
o Motivation and the Conjugate Prior
o Bernoulli distribution
o Mean of the Gaussian
o Mean and Variance of the Gaussian
· Exponential family of distributions
Lecture 9: Principal Component Analysis and Deterministic Least Squares (Outside Exam Scope)
· Principal Component Analysis
o Motivation
o Maximum variance formulation
§ Constrained Optimization
o Minimum error formulation
o Applications
· Deterministic Least Squares
o Problem formulation and solution
o Properties of solution
o Dealing with correlated noise
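Since Lecture 9 is outside the exam scope anyway, here is a minimal sketch (my own illustration, not course material) of PCA in the maximum-variance / minimum-error framing from that lecture: the principal components are the eigenvectors of the sample covariance matrix sorted by eigenvalue, and projecting onto the top few gives the minimum-error low-dimensional reconstruction. The data below is synthetic.

```python
# Minimal PCA sketch: principal components = eigenvectors of the sample covariance
# matrix, ordered by the variance (eigenvalue) they capture. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
X = rng.standard_normal((500, 3)) @ A.T        # 500 correlated 3-D points

Xc = X - X.mean(axis=0)                        # centre the data
C = (Xc.T @ Xc) / (len(Xc) - 1)                # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)           # symmetric eigendecomposition
order = np.argsort(eigvals)[::-1]              # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2                                          # keep the top-k principal components
Z = Xc @ eigvecs[:, :k]                        # maximum-variance projection (scores)
X_rec = Z @ eigvecs[:, :k].T + X.mean(axis=0)  # minimum-error reconstruction

print("fraction of variance captured:", eigvals[:k].sum() / eigvals.sum())
print("reconstruction MSE:", np.mean((X - X_rec) ** 2))
```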