Summary: | "Anjali Samani (CircleUp) explains two simple frameworks for evaluating a dataset's candidacy for smoothing and for quantitatively determining the optimal imputation strategy and the number of consecutive missing values that can be imputed without material degradation in signal quality. Extracting meaningful signals from alternative data requires denoising and imputation to produce clean, complete time series. There are many ways to smooth a noisy series and impute missing values, each with relative strengths and weaknesses. Smoothing removes noise and makes patterns and trends easier to identify. It can, however, make a series appear less volatile than it is and may mask the very patterns you're seeking to identify. So you have to know when a series should and shouldn't be smoothed and, if it is smoothed, what type of smoothing to apply. Similarly, missing observations in a time series can be imputed in many ways, covered in detail in both the academic and practitioner literature. The cause of the missing values and how the data will be used in downstream applications often inform the most appropriate imputation strategy. When several strategies remain viable, however, you need an objective way to choose between them and to identify how many consecutive missing values can be safely imputed. This session is from the 2019 O'Reilly Strata Conference in New York, NY."--Resource description page
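
As a minimal illustration of the ideas described in the abstract (not the speaker's frameworks), the sketch below uses pandas to impute short gaps in a noisy time series while capping how many consecutive missing values are filled, and then smooths the result with a rolling mean. All names, the window size, and the gap limit are assumed for illustration.

    # Minimal sketch, assuming synthetic data; not the frameworks from the session.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    idx = pd.date_range("2019-01-01", periods=120, freq="D")
    signal = np.sin(np.linspace(0, 6 * np.pi, len(idx)))           # underlying pattern
    noisy = pd.Series(signal + rng.normal(0, 0.3, len(idx)), idx)  # observed, noisy series

    # Knock out some observations to simulate missing data.
    noisy.iloc[[10, 11, 12, 40, 70, 71]] = np.nan

    # Impute by interpolation, filling at most `max_gap` consecutive missing
    # values per gap; anything beyond that in a longer run stays NaN rather
    # than being bridged with a potentially false trend.
    max_gap = 2
    imputed = noisy.interpolate(method="time", limit=max_gap, limit_area="inside")

    # Smooth with a centered rolling mean; the window size trades noise
    # reduction against the risk of masking genuine short-lived patterns.
    smoothed = imputed.rolling(window=7, center=True, min_periods=1).mean()

    print(smoothed.describe())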