Whittle likelihood

In statistics, Whittle likelihood is an approximation to the likelihood function of a stationary Gaussian time series. It is named after the mathematician and statistician Peter Whittle, who introduced it in his PhD thesis in 1951. It is commonly used in time series analysis and signal processing for parameter estimation and signal detection. In a stationary Gaussian time series model, the likelihood function is (as usual in Gaussian models) a function of the associated mean and covariance parameters.
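The Whittle approximation replaces the exact Gaussian likelihood with a sum over the periodogram and the model's spectral density at the Fourier frequencies. Below is a minimal Python sketch under an assumed AR(1) spectral density; the AR(1) choice, the crude grid search, and the exact set of frequencies summed over are illustrative assumptions, and periodogram normalisation conventions vary across references.

```python
import numpy as np

def periodogram(x):
    """Periodogram at the positive Fourier frequencies 2*pi*k/n."""
    n = len(x)
    k = np.arange(1, (n - 1) // 2 + 1)          # exclude frequency 0
    lam = 2 * np.pi * k / n
    dft = np.fft.fft(x - x.mean())[k]
    return lam, np.abs(dft) ** 2 / (2 * np.pi * n)

def ar1_spectral_density(lam, phi, sigma2):
    """Spectral density of an AR(1) process x_t = phi*x_{t-1} + e_t."""
    return sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * lam)) ** 2)

def whittle_loglik(params, x):
    """Whittle approximation to the Gaussian log-likelihood (up to a constant)."""
    phi, sigma2 = params
    lam, I = periodogram(x)
    f = ar1_spectral_density(lam, phi, sigma2)
    return -np.sum(np.log(f) + I / f)

# Simulate an AR(1) series and maximise over a crude grid, just to illustrate.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + rng.normal()

grid = [(phi, s2) for phi in np.linspace(-0.9, 0.9, 37)
                  for s2 in np.linspace(0.5, 2.0, 16)]
best = max(grid, key=lambda p: whittle_loglik(p, x))
print("Whittle estimates (phi, sigma2):", best)
```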
Gretl

gretl is an open-source statistical package, mainly for econometrics. The name is an acronym for Gnu Regression, Econometrics and Time-series Library. It has both a graphical user interface (GUI) and a command-line interface. It is written in C, uses GTK+ as its widget toolkit for creating the GUI, and calls gnuplot for generating graphs. The native scripting language of gretl is known as hansl (see below); gretl can also be used together with TRAMO/SEATS, R, Stata, Python, Octave, Ox and Julia.
Partial autocorrelation function

In time series analysis, the partial autocorrelation function (PACF) gives the partial correlation of a stationary time series with its own lagged values, regressed against the values of the time series at all shorter lags. It contrasts with the autocorrelation function, which does not control for other lags. This function plays an important role in data analysis aimed at identifying the extent of the lag in an autoregressive (AR) model.
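Because the lag-k partial autocorrelation is the coefficient on the k-th lag when the series is regressed against its k previous values, it can be computed directly by ordinary least squares. The sketch below is a minimal numpy illustration of that definition (in practice a Durbin-Levinson recursion or a library routine such as statsmodels' pacf would be used); the AR(1) example data are made up.

```python
import numpy as np

def pacf_by_regression(x, max_lag):
    """Partial autocorrelations: for each lag k, regress x_t on its k previous
    values and keep the coefficient on the k-th lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    pacf = []
    for k in range(1, max_lag + 1):
        Y = x[k:]                                                       # x_t
        X = np.column_stack([x[k - j: n - j] for j in range(1, k + 1)])  # lags 1..k
        coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
        pacf.append(coeffs[-1])                                         # coefficient on lag k
    return np.array(pacf)

# For an AR(1) series the PACF should cut off after lag 1.
rng = np.random.default_rng(1)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.7 * x[t - 1] + rng.normal()
print(np.round(pacf_by_regression(x, 5), 3))
```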
Exponential smoothing

Exponential smoothing is a rule-of-thumb technique for smoothing time series data using the exponential window function. Whereas in the simple moving average the past observations are weighted equally, exponential functions are used to assign exponentially decreasing weights over time. It is an easily learned and easily applied procedure for producing smoothed values or forecasts based on prior assumptions by the user, such as seasonality. Exponential smoothing is often used for the analysis of time-series data.
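In the simplest (non-seasonal) case, each smoothed value is a weighted average of the current observation and the previous smoothed value, which is exactly what produces the exponentially decaying weights. A minimal sketch, assuming a fixed smoothing factor alpha and the common choice of initialising with the first observation:

```python
import numpy as np

def simple_exponential_smoothing(x, alpha):
    """Simple exponential smoothing: s_t = alpha*x_t + (1 - alpha)*s_{t-1}.

    Past observations receive weights alpha, alpha*(1-alpha), alpha*(1-alpha)^2, ...
    so the weights decay exponentially with age."""
    s = np.empty(len(x))
    s[0] = x[0]                      # a common choice of initial level
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

x = np.array([3.0, 10.0, 12.0, 13.0, 12.0, 10.0, 12.0])
print(simple_exponential_smoothing(x, alpha=0.5))
```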
EViews

EViews is a statistical package for Windows, used mainly for time-series oriented econometric analysis. It is developed by Quantitative Micro Software (QMS), now a part of IHS. Version 1.0 was released in March 1994, and replaced MicroTSP. The TSP software and programming language had been originally developed by Robert Hall in 1965. The current version of EViews is 13, released in August 2022. EViews can be used for general statistical analysis and econometric analyses, such as cross-section and panel data analysis, and time series estimation and forecasting.
Fractional Fourier transform

In mathematics, in the area of harmonic analysis, the fractional Fourier transform (FRFT) is a family of linear transformations generalizing the Fourier transform. It can be thought of as the Fourier transform to the n-th power, where n need not be an integer; thus, it can transform a function to any intermediate domain between time and frequency. Its applications range from filter design and signal analysis to phase retrieval and pattern recognition.
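The "Fourier transform to the n-th power" description can be made precise through the index-additivity of the transform. The relations below use one common convention in which order n corresponds to the angle α = nπ/2; other normalisations appear in the literature.

```latex
% Index additivity of the fractional Fourier transform, writing
% \mathcal{F}_\alpha for the transform of angle \alpha = n\pi/2 at order n:
\mathcal{F}_{\alpha}\circ\mathcal{F}_{\beta} = \mathcal{F}_{\alpha+\beta},
\qquad \mathcal{F}_{0} = \operatorname{Id},
\qquad \mathcal{F}_{\pi/2} = \mathcal{F}\ \text{(the ordinary Fourier transform)},
\qquad (\mathcal{F}_{\pi}f)(x) = f(-x).
```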
Heteroskedasticity-consistent standard errors

The topic of heteroskedasticity-consistent (HC) standard errors arises in statistics and econometrics in the context of linear regression and time series analysis. They are also known as heteroskedasticity-robust standard errors (or simply robust standard errors), or as Eicker–Huber–White standard errors (also Huber–White standard errors or White standard errors), in recognition of the contributions of Friedhelm Eicker, Peter J. Huber, and Halbert White.
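As an illustration, White's HC0 estimator keeps the ordinary OLS coefficients but replaces the usual covariance formula with a "sandwich" built from the squared residuals. The numpy sketch below is a minimal illustration, not a production implementation (libraries such as statsmodels expose these as robust covariance options); the toy data and variable names are made up.

```python
import numpy as np

def hc0_standard_errors(X, y):
    """OLS with White's HC0 'sandwich' covariance:
    (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y                 # ordinary OLS coefficients
    resid = y - X @ beta
    meat = X.T @ (resid[:, None] ** 2 * X)   # X' diag(e^2) X
    cov = XtX_inv @ meat @ XtX_inv           # sandwich
    return beta, np.sqrt(np.diag(cov))

# Heteroskedastic toy data: the error variance grows with the regressor.
rng = np.random.default_rng(2)
n = 500
z = rng.uniform(0, 3, n)
X = np.column_stack([np.ones(n), z])         # intercept + one regressor
y = 1.0 + 2.0 * z + rng.normal(scale=z)      # non-constant error variance
beta, se = hc0_standard_errors(X, y)
print("coefficients:", beta)
print("HC0 standard errors:", se)
```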
Sinusoidal model

In statistics, signal processing, and time series analysis, a sinusoidal model is used to approximate a sequence Yi by a sine function:

Yi = C + α sin(ω Ti + φ) + Ei

where C is a constant defining a mean level, α is an amplitude for the sine, ω is the angular frequency, Ti is a time variable, φ is the phase shift, and Ei is the error sequence. This sinusoidal model can be fit using nonlinear least squares; to obtain a good fit, routines may require good starting values for the unknown parameters.
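A minimal fitting sketch using scipy's curve_fit (nonlinear least squares); the synthetic data and the starting values in p0 are illustrative assumptions, and a rough guess of the frequency is usually the critical starting value.

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, C, alpha, omega, phi):
    """Y = C + alpha*sin(omega*t + phi); the error term is the residual."""
    return C + alpha * np.sin(omega * t + phi)

# Synthetic data from a known sinusoid plus noise.
rng = np.random.default_rng(3)
t = np.linspace(0, 10, 200)
y = 2.0 + 1.5 * np.sin(1.2 * t + 0.4) + rng.normal(scale=0.3, size=t.size)

# Nonlinear least squares; p0 supplies the starting values mentioned above.
p0 = [y.mean(), y.std() * np.sqrt(2), 1.0, 0.0]
params, cov = curve_fit(sinusoid, t, y, p0=p0)
print("C, alpha, omega, phi:", np.round(params, 3))
```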
Granger causality

The Granger causality test is a statistical hypothesis test for determining whether one time series is useful in forecasting another, first proposed in 1969. Ordinarily, regressions reflect "mere" correlations, but Clive Granger argued that causality in economics could be tested for by measuring the ability to predict the future values of a time series using prior values of another time series.
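In its simplest form the test compares a regression of y on its own lags with one that also includes lags of x, via an F-test on the added coefficients. The sketch below implements that comparison directly in numpy/scipy under an assumed lag order p; statsmodels also ships a ready-made grangercausalitytests routine. The simulated data, in which x leads y by one period, are made up.

```python
import numpy as np
from scipy import stats

def granger_f_test(y, x, p):
    """F-test of whether p lags of x improve a regression of y on its own p lags."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    T = len(y)
    Y = y[p:]
    own_lags = np.column_stack([y[p - j: T - j] for j in range(1, p + 1)])
    x_lags = np.column_stack([x[p - j: T - j] for j in range(1, p + 1)])
    ones = np.ones((T - p, 1))

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        r = Y - X @ beta
        return r @ r

    rss_r = rss(np.hstack([ones, own_lags]))            # restricted: own lags only
    rss_u = rss(np.hstack([ones, own_lags, x_lags]))    # unrestricted: + lags of x
    df_num, df_den = p, (T - p) - (2 * p + 1)
    F = ((rss_r - rss_u) / df_num) / (rss_u / df_den)
    return F, 1 - stats.f.cdf(F, df_num, df_den)

# x leads y by one period, so lags of x should help predict y.
rng = np.random.default_rng(4)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
print(granger_f_test(y, x, p=2))
```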
Cross-sectional regression

In statistics and econometrics, a cross-sectional regression is a type of regression in which the explained and explanatory variables are all associated with the same single period or point in time. This type of cross-sectional analysis is in contrast to a time-series regression or longitudinal regression, in which the variables are considered to be associated with a sequence of points in time. For example, in economics a regression to explain and predict money demand (how much people choose to hold in the form of the most liquid assets) could be conducted with either cross-sectional or time-series data.
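As a toy illustration of the money-demand example, the sketch below runs an ordinary least-squares regression on synthetic cross-sectional data in which every household is observed at the same point in time; the variable names and coefficients are invented for illustration.

```python
import numpy as np

# Each row is one household observed at the same point in time; there is no
# time dimension, in contrast to a time-series regression.
rng = np.random.default_rng(5)
n = 200
income = rng.lognormal(mean=10, sigma=0.5, size=n)          # hypothetical regressor
rate_exposure = rng.uniform(0, 1, size=n)                   # hypothetical regressor
money_held = 0.3 * income - 500 * rate_exposure + rng.normal(scale=200, size=n)

X = np.column_stack([np.ones(n), income, rate_exposure])
beta, *_ = np.linalg.lstsq(X, money_held, rcond=None)
print("intercept, income effect, rate-exposure effect:", np.round(beta, 3))
```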