This is how I thought of it at first:
The Double-3: Take a zero-phase 3-year moving average of a zero-phase 3-year moving average of some economic data. I think this amounts to assigning weights to the values: heaviest at the central value, lightest at the values two years before or after.
When I looked at it in Excel, a Double-5 gave a result more like what I had in mind:
Looking at any point on a trend line, the actual value at that point should carry more significance than its neighbors, and more distant neighbors should carry less still. I think this is a decent way to generate a trend line for an economic dataset.
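Here is a minimal sketch of the idea (Python with pandas, standing in for the Excel work; the data values are made up): a centered 3-year average applied twice matches a single pass with 1-2-3-2-1 weights.

import pandas as pd

# Hypothetical annual series standing in for the economic data.
data = pd.Series([100.0, 104, 103, 108, 112, 111, 115, 120],
                 index=range(1960, 1968))

# Zero-phase (centered) 3-year moving average, applied once and then again.
ma3 = data.rolling(3, center=True).mean()
double3 = ma3.rolling(3, center=True).mean()

# The same result in one pass, using the triangular weights 1-2-3-2-1 (in ninths).
weights = [1/9, 2/9, 3/9, 2/9, 1/9]
triangle = data.rolling(5, center=True).apply(lambda w: (w * weights).sum(), raw=True)

print(pd.DataFrame({"double3": double3, "triangle": triangle}))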
6 comments:
This gives you double smoothing, but no current smoothed data. How do you intend to use it?
Cheers
JzB
Try this.
JzB
Good post, Jazz.
I intend to use the data to obtain a smoother picture of past trends. I do not really need "current" data. I am studying the past. I am not making investment decisions in the current moment.
A 3-year moving average contains one-third of each of the values (let us say) for 1961, 1962, and 1963. These values are equally weighted. This is true regardless of whether the point is plotted at 1962 (for a zero-phase picture) or at 1963 (for a causal picture).
A Double-Three moving average contains one-ninth of each of the values for 1961, 1962, and 1963; for 1962, 1963, and 1964; and for 1963, 1964, and 1965. In sum: the Double-Three contains one-ninth of the 1961 and 1965 values, two-ninths of the 1962 and 1964 values, and three-ninths of the 1963 value. In other words, the Double-Three uses weighted data.
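A quick way to check those ninths (a sketch in Python with numpy, not part of the original spreadsheet): convolving the flat one-third weights with themselves produces exactly the 1-2-3-2-1 pattern described above.

import numpy as np

flat = np.array([1, 1, 1]) / 3     # the unweighted 3-year average
double = np.convolve(flat, flat)   # the weights of the Double-Three
print(double)                      # [0.1111 0.2222 0.3333 0.2222 0.1111]
print(double * 9)                  # [1. 2. 3. 2. 1.], i.e. ninths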
Hi Art,
You have progressed from a rectangular filter to a triangle filter.
-jim
Jim,
I have heard of the Hodrick-Prescott filter, which I think is supposed to strip out minor variations (small business cycles, maybe) and provide a picture of longer-term trends (long-wave cycles, maybe). But it isn't intuitive for me, and what I found on the internet required more work than I have yet put into it.
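For reference, a minimal sketch of that filter using statsmodels (the library and the lamb=6.25 setting, a common choice for annual data, are assumptions, not something from this exchange):

import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

# Hypothetical annual series standing in for the economic data.
data = pd.Series([100.0, 104, 103, 108, 112, 111, 115, 120],
                 index=range(1960, 1968))

cycle, trend = hpfilter(data, lamb=6.25)  # splits the series into cycle + trend
print(trend)  # the longer-term trend, with the minor variations stripped out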
What I am looking for is a way to determine longer-term trends objectively, I guess to see the forest despite the trees.
Got any advice or links for me?
Thanks... and have a good long weekend.
Hi Art,
The late response is because I was too busy eating turkey yesterday.
My familiarity with filters is mostly in the area of audio and image processing, not statistics.
I think what you are doing is about as good as it will get if your goal is to look at the smoothed data and find long-term peaks and troughs. You don't want the peaks shifted in time by the filtering, so you don't want a causal filter.
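A small illustration of that point (a Python/pandas sketch with made-up data): a centered window keeps a peak where it is, while a trailing (causal) window drags it later in time.

import pandas as pd

data = pd.Series([0.0, 1, 4, 9, 4, 1, 0], index=range(1960, 1967))

centered = data.rolling(3, center=True).mean()  # zero-phase: peak stays at 1963
trailing = data.rolling(3).mean()               # causal: peak shifts to 1964

print(centered.idxmax(), trailing.idxmax())     # 1963 1964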
If you want to get a little more fancy, you might view this as a downsampling and reconstruction problem. The basic point behind a filter designed for downsampling is that you want to remove the high-frequency content that can't be supported at the lower sample rate. You aren't actually interested in a lower sample rate, but the low-pass filters designed for that purpose will be the best filters for removing high-frequency content above a chosen cutoff.
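A sketch of what that might look like (scipy assumed; the 15-tap filter and the cutoff of one cycle per eight years are arbitrary illustrations, not recommendations): design a low-pass FIR filter and run it forward and backward with filtfilt so the result stays zero-phase.

import numpy as np
from scipy import signal

# Hypothetical annual data: a rising trend plus noise.
rng = np.random.default_rng(0)
years = np.arange(1950, 2010)
data = np.linspace(100, 200, years.size) + rng.normal(0, 5, years.size)

# Remove frequency content above about one cycle per eight years.
taps = signal.firwin(numtaps=15, cutoff=1/8, fs=1.0)
trend = signal.filtfilt(taps, [1.0], data)  # forward-backward pass: zero phase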
However, for your purposes that is likely to be overkill.