Tuesday, September 30, 2014

A Statistical Discrepancy

If you go to FRED for the default graph of GDP, the title reads Gross Domestic Product and the left border reads (Billions of Dollars).

If you then change the "Units" for the graph to "Percent Change from Year Ago" the left border changes to read (Percent Change from Year Ago). But the title still says Gross Domestic Product.

It's a pretty minor point. But it makes sense: GDP is GDP, whether you look at it in billions of dollars or as percent change values. That's probably obvious. I just want to make sure it's obvious.

At Vox last May, Yglesias pondered differences between Gross Domestic Product and Gross Domestic Income:
This morning we learned that GDP shrank at a 1 percent annual rate and GDI shrank at a 2.3 percent annual rate. Disturbing. And you may be even more disturbed when you learn that GDP and GDI are the exact same thing.

Check out any economics textbook and they will tell you that GDI = GDP. And it does. By definition. And yet when the government sets about to measure GDP and GDI they are never equal.

In theory, these two series should add up to the same thing. In practice, data sources are always imperfect and they don't add up to the same thing. The difference is known as the statistical discrepancy.

That's it: The difference is known as the statistical discrepancy.
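
The arithmetic behind the statistical discrepancy is as simple as it sounds. Here's a minimal sketch; the dollar figures are made up for illustration, not actual BEA numbers:

```python
# GDP (the expenditure side) and GDI (the income side) measure the same
# total, but imperfect source data make the two estimates differ.
gdp = 17_000.0  # billions of dollars, hypothetical
gdi = 16_850.0  # billions of dollars, hypothetical

# The gap between the two estimates is the "statistical discrepancy".
statistical_discrepancy = gdp - gdi
print(statistical_discrepancy)  # 150.0
```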

But today's topic is TFP, not GDP.

Back in June I wrote:

I don't know how they figure Total Factor Productivity. Maybe it's an error term. Maybe there's a discrepancy in the calculation and they use TFP to reduce or eliminate the discrepancy.

Back in June I got the idea that TFP is "an error term". I quoted Simplicitus:

Total factor productivity is the otherwise unexplained productivity left over after all the individual factors (labor, capital, etc.) are taken out.

and I said

See? That's why I say TFP seems like an error term, a correction value, an adjustment to make the answer come out right.

I quoted Diego Comin:

Total Factor Productivity (TFP) is the portion of output not explained by the amount of inputs used in production.

and I said

This is starting to get familiar... the growth of labor and capital do not fully account for the growth of output. The difference ... is called TFP, Total Factor Productivity.

So that's pretty much how I came to think of Total Factor Productivity as an error term. It is the measure of a discrepancy.

There is a statistical discrepancy between GDP and Gross Domestic Income. They call it a statistical discrepancy.

There is a statistical discrepancy between GDP growth and the growth predicted by aggregate production functions like the Solow growth model. They call it "Total Factor Productivity".

With GDP and GDI, they don't take the statistical discrepancy, give it a name like "Total Income Productivity", and dress it up as something other than a discrepancy. They just call it a discrepancy.

But with GDP growth it's different. They take labor and capital and fit them to an equation, and compare the result to GDP growth. And when it comes out on the low side, they say there must be something else involved, and they give it the name "Total Factor Productivity".

They explain it by saying Oh, well, that's the increase in labor efficiency. And yes, it makes good sense. But it's just a story. It's not empirical. It's not based on anything. And it doesn't account for the effects of things like the growing accumulation of private debt.

TFP cannot be measured directly. Instead it is a residual ... a mismatch in the calculation ... a statistical discrepancy ... a big one.

This is what fascinates me about TFP: It's a black box.

I want to open up the box and see what's in there.

Monday, September 29, 2014

Hopscotch Graphs

The other day I looked at a recent Noahpinion post that used an old David Beckworth graph showing Total Factor Productivity data from John Fernald.

Fernald's TFP (left) looks significantly unlike FRED's TFP (right):

 Graph #1: Fernald's TFP (Beckworth)

 Graph #2: FRED's TFP

To improve the comparison, I overlaid the one graph on top of the other:

 Graph #3: Beckworth's TFP (blue) Overlaid on FRED TFP (red)

Then I kept looking, and found another TFP graph of Fernald's data from Noah. So now I want to repeat the above layout of graphs, but using Noah's graph in place of Beckworth's:

 Graph #4: Fernald's TFP (Noahpinion)

 Graph #5: FRED's TFP (again)

Again, the overlay:

 Graph #6: Noah's TFP (blue, green, dingy red) Overlaid on FRED TFP (bright red)
On Graph #3 I adjusted the "overlay" graph up-and-down to get a good fit in the early years. On Graph #6 I adjusted the overlay to get a good fit in the later years. (Noah's graph is taller, and already sits a lot higher than the FRED graph. So I didn't want to move it up more to make the early years align.)

But you can see on Graph #6 that the years from 1980 to maybe 2006 are a good match. The red lines follow the same general trend. Also in the early years, until maybe 1966, the red lines seem to share a trend.

I'm wondering now if maybe the FRED data is calculated in a way similar to Fernald's red line, which shows the TFP for "durables".

One more comparison graph, a straightforward one this time, two FRED series:

 Graph #7: Total Factor Productivity (red) and Multifactor Productivity (blue)
Too bad that blue line got such a late start.

Sunday, September 28, 2014

Garbage in, garbage out

Continuing yesterday's search of this site for Total Factor Productivity . . .

After the Van Zandweghe post in March 2014, my next mention of "Total Factor Productivity" came in June with Oh! I've seen this shape before.... That was my reaction when I first looked at FRED's version of TFP: Where have I seen this before?

Just a week later, in Uptrend to 1966, then stable to 1982, then uptrend again I found the matching shape. The match was SRW's Real GDP relative to the size of the labor force, at Interfluidity.

If productivity is output per worker, then maybe it makes sense that the graphs are similar. But TFP isn't "output per worker" so I'm not clear on this.

In any case, I ended that post "wondering now how they calculate TFP."

Before long I was looking at the Solow growth model.

//

From Roberto Chang of Rutgers:
In the original Solow model, time is continuous and the horizon is infinite. Without loss of generality assume that time is indexed by t in [0,∞).

At each point in time, there is only one final good. The good is produced via the aggregate production function:

Y=F(K,AL)

Here Y, A, K, and L denote output, labor productivity, capital, and labor, and are functions of time (i.e. we should really write Y(t) and so on, but we omit time arguments when not needed). The product AL is called effective labor.

Once you assume the existence of the aggregate production function Y=F(K,AL), it is clear that the growth of output can be due to growth of A, K, or L.

I like that one: If we assume the program we wrote for the Holodeck accurately mirrors reality, then we can assume the results the Holodeck shows us are correct.

It always cracked me up that on Star Trek: Next Gen, when they ran into a problem they couldn't solve, they would write a computer program and let the Holodeck solve the problem. Obviously, if you cannot determine the calculation that will solve the problem, you cannot write a computer program to run that calculation for you.

On the other hand, if you know the answer you want to get, you can write the code so it gives you that answer. Then you call your code a "model" and you call yourself an economist. Roberto Chang lays this out clearly:

Once you assume the existence of the aggregate production function Y=F(K,AL), it is clear that the growth of output can [only] be due to growth of A, K, or L.
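
Chang's point can be put in growth-accounting terms. With his production function Y = F(K, AL) in its common Cobb-Douglas form, Y = K^α (AL)^(1−α), output growth decomposes into the growth of A, K, and L and nothing else. A minimal sketch, with made-up growth rates (only the 0.371 capital share comes from this post; the rest are hypothetical):

```python
# Chang's production function with labor-augmenting productivity A:
#   Y = K**alpha * (A * L)**(1 - alpha)
# In growth rates (log differences):
#   gY = alpha*gK + (1 - alpha)*(gA + gL)
alpha = 0.371  # capital's cost share (the U.S. figure quoted later in this post)
gK = 0.03      # hypothetical growth rate of capital
gL = 0.01      # hypothetical growth rate of labor
gA = 0.012     # hypothetical growth rate of labor productivity

gY = alpha * gK + (1 - alpha) * (gA + gL)
print(round(gY, 4))  # 0.025
```

Once the function is assumed, any output growth the data show must be attributed to some combination of gA, gK, and gL -- which is exactly Chang's point.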

//

Right or wrong, the answer we get is determined by the calculation we give the computer. Still, I am fascinated by the Solow model and want to understand it.

True confession: Usually when I'm reading a PDF and I get to a formula I just completely stop. (It even happened with Mason & Jayadev's Fisher Dynamics paper.) For an old math major like me, that's embarrassing. But the 6-page SolowProjectNotes PDF gave me no trouble at all. Things are well explained:

... for each time period t, Yt is gross domestic product (GDP), Kt is the capital stock, Lt is the size of the labor force, and At is labor productivity. The parameter α is the "capital cost share" we discussed in class, δ is the rate at which capital depreciates, and s is the savings rate for a country. We said labor grew at a rate n and labor productivity grew at a rate x ...

We next discuss how to calculate/estimate the parameters in our model...

We begin with the simple calculations first.

Some of the numbers come right out of a table. And the additional explanation you want is at hand:

α is the share of production costs paid to (owners of) capital, and 1 − α is the share of production costs paid to labor. For example, for the U.S., on average 62.9 percent of GDP was paid to labor and the rest went to owners of capital: i.e., α = 0.371 and 1 − α = 0.629. For Zambia, α = 0.533 and 1 − α = 0.467.

For me, even expanding the common concept the share of production costs paid to capital to

the share of production costs paid to (owners of) capital

helps a lot. Highly recommended reading...
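
The cost-share arithmetic from the quote above, as a minimal sketch using the PDF's own numbers:

```python
# alpha is the share of production costs paid to (owners of) capital;
# labor gets the rest, 1 - alpha.
us_labor_share = 0.629          # 62.9 percent of U.S. GDP paid to labor
us_alpha = 1 - us_labor_share
print(round(us_alpha, 3))       # 0.371

zambia_labor_share = 0.467
zambia_alpha = 1 - zambia_labor_share
print(round(zambia_alpha, 3))   # 0.533
```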

But remember: Back in June I wrote:

spent most of Saturday going through the PDF, gathering data and entering calculations in a spreadsheet. It's one of those things where you have to put a lot of hours into it before you can make a graph to see if you're anywhere near right. I have patience for that kind of work, but it was most of Saturday.

Good luck with it.

//

By the way: The A in Roberto Chang's formula and the At in the SolowGrowthModel PDF both stand for "Labor Productivity". I think Labor Productivity is related to, but not the same as, Total Factor Productivity. I can't explain the difference yet. Couple days, maybe.

Saturday, September 27, 2014

When the moment is right

The earliest mention of "Total Factor Productivity" I find on this blog occurs in Suddenly remembered where I caught Noah considering the relation between TFP and unemployment. I wrote:

"TFP" being "Total Factor Productivity" which doesn't matter at the moment.

I think that's funny because it was three years ago, almost to the day. And now, finally, I'm looking at TFP. The moment is right, I guess.

//

I find the phrase Total Factor Productivity a few times after that, but nothing catches my eye (at the moment) until 26 March 2014: The relation between productivity and growth. In that post I quoted Willem Van Zandweghe of the Kansas City Fed:

In recent years, the U.S. economy has undergone a change in the behavior of productivity over the business cycle. Until the mid-1980s, productivity growth rose and fell with output growth. But since then the relationship between these two variables has weakened, and they have even moved in different directions.

I did some graphs comparing RGDP and TFP, then. But what strikes me just now is the contrast of Van Zandweghe's remark to that of Noah in the earlier post:

The recessions of the early 1980s were very short and "V-shaped," despite the extremely low rate of TFP growth at the time... By contrast, the early-2000s recession, though shallow, was much longer and "U-shaped", despite the recovery in TFP growth.

When I came upon that quote this morning I got stuck on the word "despite". Noah's use of that word implies that things that shouldn't have coincided did coincide. Short recessions coincided with low TFP growth. Long recessions coincided with high TFP growth. I don't object to the coincidences. I object to Noah trying to shove "despite" down my throat before I've considered whether it's appropriate to say "despite" or whether it would be better to say "which makes sense because of". To wit:

The recessions of the early 1980s were very short and "V-shaped," which makes sense because of the extremely low rate of TFP growth at the time... By contrast, the early-2000s recession, though shallow, was much longer and "U-shaped", which makes sense because of the recovery in TFP growth.

I don't think it makes sense to say "despite". In a time of high productivity -- lots of output per worker -- you lose a lot of output with each unemployed worker. So I think it makes sense that severe recessions and high TFP go together. And in a time of low productivity, losing a worker means less loss of output. So I think it makes sense that mild recessions and low TFP go together.

I don't think it makes sense to say "despite".

//

Noah is implicitly arguing that there is a particular relation between productivity growth and economic growth. Van Zandweghe of Kansas City is arguing that there *was* a particular relation, which then changed, and now there is a different particular relation.

I couldn't see the change Van Zandweghe described. Nor can I see the relation Noah described.

Friday, September 26, 2014

I have to look into this more

Like pulling teeth, but pushing: to get ideas into my brain.

If I understand it right, "Total Factor Productivity" is a discrepancy. When you try to explain growth, you say: Okay, we've got labor, and we've got capital, and we take and mix them together in a big bowl and look at what we've got. You put another bowl next to it, that has GDP growth in it. And you compare the two.

What you see is that the GDP growth bowl is pretty close to full, but our labor-plus-capital bowl is half empty. So we have to add something to the labor-and-capital mix, until the one bowl is just as full as the other. And then we'll say we understand economic growth.

The thing we add to the bowl is called "Total Factor Productivity". The amount we have to add is the amount that makes the bowls equal. That's why I say TFP is a discrepancy. For every spoonful that our bowl is low, we add a spoonful of TFP.

//

So... the amount of TFP depends on the size of the discrepancy. Or take the bowls away and look at graphs, and say the value of TFP depends on how much we have to push up our calculation-of-growth line to make it match the official estimate of GDP growth. Something like that.

So the TFP number is the number that gives us the answer we were looking for, the answer that matches the GDP number. I'm not saying I'm doing sophisticated analysis here. I'm just saying it's what I understand.

But you see what I'm saying: If GDP is a line on a graph, and our labor-and-capital calculation is a lower line on that graph, then Total Factor Productivity has to be the number that fills the space between the two lines. There's not a lot of wiggle room.

//

So, assuming that's correct, how can there be two different measurements of Total Factor Productivity? There can't be. If we know what the GDP number is, and the labor number, and the capital number, then TFP has to be some particular number. If GDP is 10, and labor is 5 and capital is 3, and if our formula tells us that 10 = 5 + 3 + X, then X has to be 2.
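
The same arithmetic, as a minimal sketch. The additive version is this post's toy example; in the actual Solow framework the residual is multiplicative, A = Y / (K^α L^(1−α)), but the logic is identical -- given the other numbers, the residual is pinned down. The levels and the capital share below are hypothetical:

```python
# The toy example: TFP is whatever fills the gap.
gdp = 10
labor = 5
capital = 3
tfp = gdp - (labor + capital)  # 10 = 5 + 3 + X, so X must be 2
print(tfp)  # 2

# The Solow-style residual works the same way, multiplicatively:
#   A = Y / (K**alpha * L**(1 - alpha))
alpha = 0.371                 # hypothetical capital share
Y, K, L = 100.0, 300.0, 50.0  # hypothetical output, capital, labor
A = Y / (K**alpha * L**(1 - alpha))
print(round(A, 3))
```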

Like X, TFP has to be some particular value to work with the other numbers, which are givens. But remember the overlay graph I showed the other day?

 Graph #1: Beckworth's TFP (blue) Overlaid on FRED TFP (red)

We've got two different measures of Total Factor Productivity.

What I think is, the numbers Beckworth used are figured a different way. They're not just a measure of the discrepancy. There's more economic thought behind those numbers. I have to look into this more.

Thursday, September 25, 2014

Empty economics at Liberty Street

At Liberty Street Economics: Developing a Narrative: The Great Recession and Its Aftermath, by Andrea Tambalotti and Argia Sbordone:
What precipitated the U.S. economy into the worst recession since the Great Depression? And what headwinds are holding back the recovery?
...
As it turns out, only four of these shocks account for the bulk of the movements in GDP and inflation since the Great Recession.

•  A shock to total factor productivity (TFP)...
•  A financial (or spread) shock...
•  A shock to investment demand...
•  Shocks to monetary policy...

They show two graphs, then continue:
The first feature of this decomposition that we want to highlight is the paramount importance of spread shocks (in purple) during the recession. Starting at the end of 2007, the economy experiences a sequence of large shocks to credit spreads, driven by an increase in the perceived riskiness of borrowers. This progressive increase in risk is accompanied by deteriorating credit conditions, culminating in two spikes in spreads in the third and fourth quarters of 2008 with the failure of Lehman Brothers. These abrupt increases in the cost of credit account for a decline in quarterly GDP growth of more than 5 percentage points (annualized), which is about half of the total drop in output growth at the nadir of the recession. Inflation is also significantly affected by these shocks, which account for about three quarters of the decline in inflation in the second half of 2008.

The other half of the decline in GDP growth in 2008 stems from significant declines in TFP...

Half of the decline they attribute to financial "shocks" starting at the end of 2007.

Well, they got the "financial" part right. But "shocks" doesn't explain anything. And if you want to know what happened and why, you have to go back before 2007. You have to watch the economy develop for 60 years, and watch the growth of finance over that era.

The other half of the decline they attribute to Total Factor Productivity. But as we now know, TFP is just the part of growth that we cannot explain.

Wednesday, September 24, 2014

Let me repeat that

From Total Fudge-Factor Productivity:
Total Factor Productivity is a fudge factor.

TFP is an adjustment to make the answer come out right ... because the growth of labor and capital do not fully account for the growth of output.

Diego Comin writes: "Total Factor Productivity (TFP) is the portion of output not explained by the amount of inputs used in production."

Jazzbumpa quotes Wikipedia: "TFP cannot be measured directly. Instead it is a residual, often called the Solow residual, which accounts for effects in total output not caused by inputs."

Total Factor Productivity is "a residual". We use it to explain the part of growth that we can't explain. TFP is like the ether or Einstein's cosmological constant.

TFP is not an explanation. It is a fudge factor.

Tuesday, September 23, 2014

Just right

I'm always on the lookout for something said "just right" that will help me understand a concept or help me convey an idea. Funny thing, though, sometimes the thing that's just right is the perfect expression of a wrong idea. I think it happens most often when somebody takes an idea that was wrong to begin with, and dumbs it down to make it clear.

We tend to accept as true the things we've been taught. Not all of it of course, but the bigger part. It's natural. We're spoon-fed practically from birth, and much of what we know is so terribly familiar to us that we never doubt the truth of it.

Anyway, while looking up "inflation adjusting debt" the other day, I came across U.S. Public Debt Since 1940 - Adjusted for Inflation, an old (2008) post by John Buck at Economic Perspectives:
Here is the U.S. public debt since 1940 adjusted for inflation:

Adjusting the public debt for inflation provides a good account of federal government borrowing. The public debt increased in the 1940s to finance World War II. The public debt remained fairly constant from the late 1940s through 1981.

"Adjusting the public debt for inflation," Buck says, "provides a good account of federal government borrowing."

No it doesn't. If you want to see borrowing you should look at borrowing, not debt:

 Graph #2: We were borrowing more by the mid-1970s than we were during World War Two
Federal deficits increased all through the 1950s and 1960s and 1970s.

 Graph #3: But the 1970s don't look so bad when deficits are adjusted for inflation

Inflation adjustment makes it look like "public debt remained fairly constant from the late 1940s through 1981". Inflation adjustment hides the fact that the public debt was less constant than it appears on Buck's graph.

 Graph #4: Gross Federal Debt  1940-1981

Monday, September 22, 2014

Total Fudge-Factor Productivity

Noah links to an old Beckworth where Beckworth ponders Tyler Cowen's Great Stagnation. Here's Beckworth:
To see just how marked this TFP decline was after 1973 and whether the recent TFP gains make up for any of the loss, I went to the data.  Below is a figure constructed using the quarterly TFP series of John Fernald at the San Francisco Fed. (Click on figure to enlarge.)

I want to go to the data, too. Because when I first laid eyes on Beckworth's graph, I didn't think it showed Total Factor Productivity. I thought it was another one of those slowdown-of-income-growth graphs that I was looking at recently. Total Factor Productivity graphs have two changes in the trend-line, not one:
 Graph #2: Total Factor Productivity (FRED)
The FRED graph shows Total Factor Productivity in red, in "natural log" values, as Beckworth did in his graph.

On the FRED graph, the trend changes once around 1966, and again around 1982. On Beckworth's graph, the trend changes once around 1973, and that's it. I think that's odd. But I was not the first to notice. In comments on Beckworth's post, João Marcus wrote:
Leonard Nakamura of the Philly Fed has done work in that area. Here´s an article from 1997.
His Figure 1 more or less replicates the figure in the post. I remember that in 1999 I saw a revision of Figure 1. It changes significantly. TFP (MFP) flattens between 1965-81 and then takes off again.

João Marcus sees the same mismatch that I see in the two TFP graphs.

Numbers aside, I do like Beckworth's graph. I like it because it shows a great stagnation that began long ago and continues pretty much to the current moment. I've seen that stagnation in the "slowdown of income growth" graphs. I've seen it in inflation-adjusted GDP. So I think Beckworth's graph shows a slowdown that really happened -- whether or not it really happened in TFP.

But I do have a couple problems with his graph. For one, I don't like his trend lines. Here's my problem: If you start at the 1947 end of the graph, the blue line and the trend line follow the same path pretty well. However, if you start at the 2007 end and work backwards, you want to put the change-in-trend kink at 1966, not 1973 where Beckworth has it. This happens a lot when you're eyeballing trend lines: Starting from different ends of the graph, you end up seeing different trends.

So I can agree with Beckworth that there was a "great stagnation" but I don't necessarily agree with him on the start-date of it. And that's kind of a big deal. Putting Beckworth's turning point in question just might raise doubt about Tyler Cowen's "low-hanging fruit" story.

There's another thing that bothers me about Beckworth's graph. Here are his thoughts on Cowen's stagnation, after pondering his graph:
Okay, I am impressed and far less skeptical of the Great Stagnation theory. In my previous post I argued that Cowen failed to appreciate how dramatically our lives have changed since the advent of the internet and faster computing.  Now I am thinking these gains are but a faint shadow of what they could have been had TFP continued to grow at its 1947-1973 trend.  The "good old days" really were better in terms of TFP growth.

Oh -- I certainly agree that our economy performed better before 1973 than after. But Beckworth is at a turning point -- deciding whether to accept the view that our economy performed better before 1973 than after, based on a graph of Total Factor Productivity. And the TFP numbers are in doubt.

Set the numbers aside, and TFP is still not a good basis for a decision like Beckworth's -- because Total Factor Productivity is a fudge factor.

TFP is an adjustment to make the answer come out right ... because the growth of labor and capital do not fully account for the growth of output.

Diego Comin writes: "Total Factor Productivity (TFP) is the portion of output not explained by the amount of inputs used in production."

Jazzbumpa quotes Wikipedia: "TFP cannot be measured directly. Instead it is a residual, often called the Solow residual, which accounts for effects in total output not caused by inputs."

Total Factor Productivity is "a residual". We use it to explain the part of growth that we can't explain. TFP is like the ether or Einstein's cosmological constant.

TFP is not an explanation. It is a fudge factor. Total Factor Productivity is not a sound basis upon which to make the kind of judgement that Beckworth is making. He'd be better off looking at slowing income growth or inflation-adjusted GDP.

I took Beckworth's JPG, saved it as a PNG, and used PAINT.NET to erase the background. Then I overlaid it on the FRED graph of Total Factor Productivity:

 Graph #3: Beckworth's TFP (blue) Overlaid on FRED TFP (red)

For the overlay, I matched up the early years, and let the rest of it blow in the wind. You can see that the red FRED numbers are obviously lower than Beckworth's blue during the 1970s. But after about 1982 the red version rises sooner and more quickly than the blue. FRED's red line even comes pretty near to getting back to Beckworth's 1947-1973 trend line.

Which numbers are less iffy, I really can't say.

Sunday, September 21, 2014

Frances, in the moment

From Frances Coppola's Ultra-liquidity:
We already know that post-crisis, monetary policy can no longer use reserve scarcity to create the illusion of money drought...

If ultra-liquidity is here to stay, as seems likely, we must either re-unify monetary and fiscal policy or resign ourselves to central bank impotence.

From BIS central bank warns against 'illusion of permanent liquidity':
Loose monetary policies have created an "illusion of permanent liquidity" ...

"The longer the music plays and the louder it gets, the more deafening is the silence that follows," Claudio Borio, who heads the BIS's monetary and economic unit, told reporters.

"Markets will not be liquid when that liquidity is needed most," he warned, urging "sound prudential policies (and) extra prudence on the part of market participants themselves".

Saturday, September 20, 2014

Challenge and Response

In a growing civilization a challenge meets with a successful response which proceeds to generate another and a different challenge which meets with another successful response. There is no term to this process of growth unless and until a challenge arises which the civilization in question fails to meet--a tragic event which means a cessation of growth ...

Think of Toynbee's "growth" as economic growth. The great depressions of the capitalist era are our "correlative rhythm". The challenge is rising cost, due to the growth of finance. We fail to meet the challenge, thinking we need only to choke off inflation. But our response only makes the problem worse.

Marcus Nunes writes:
On becoming chairman of the Fed, Volker challenged the Keynesian orthodoxy which held that the high unemployment high inflation combination of the 1970´s demonstrated that inflation arose from cost-push and supply shocks – a situation dubbed “stagflation”.

Volker´s challenge placed inflation as the FOMC´s top priority. He also brought to the fore of policy discussions the ideas developed during the previous 12 years – since Friedman´s address to the 1967 AEA meetings – on the importance of inflation expectations.

To Volker, the policy adopted by the FOMC “rests on a simple premise, documented by centuries of experience, that the inflation process is ultimately related to excessive growth in money and credit”.

This view, an overhaul of Fed doctrine, implicitly accepts that rising inflation is caused by “demand-pull” or excess aggregate demand or nominal spending.

I summarized those remarks thus:

After the change in doctrine, policymakers held that inflation is related to the quantity of money (or, to the growth of the quantity of money) and, excuses be damned, that they could control inflation by controlling the quantity of money.

With the change in doctrine, policymakers denied that there was a problem of rising costs. They thought they could stop inflation by limiting the growth of money -- or, you know, by fiddling with interest rates. They chose to see inflation as a monetary phenomenon rather than a cost phenomenon.

"Neoliberalism" is corporatization, privatization, and deregulation, according to Lorenzo, plus

the adoption of inflation-targeting by central banks, as a way of operationalising their responsibility for inflation as a monetary phenomenon.

But the root cause of inflation in our time is not money growth. The root cause is rising cost. In the 1960s and '70s, perhaps policymakers goosed the quantity of money to compensate for rising costs. Since Volcker, policymakers have quashed money and income growth, and simply ignored the problem of rising costs. As a result, they have forced down not only inflation, but also economic growth. Yet rising cost remains a problem. Policymakers have therefore given up on the idea of stable prices, and have accepted instead that they should aim for stable inflation.

Rising cost, driven by the growth of finance, is the challenge that our civilization fails to meet. Our response is to pretend that the problem lies elsewhere. The failure of policymakers to address the problem of rising cost means a cessation of growth, and what Toynbee has called a breakdown. The challenge has not been met, but it nonetheless continues to present itself.

// Wesley changed my life

Friday, September 19, 2014

Miscellaneous Toynbee

Excerpts from D.C. Somervell's two-volume abridgement of Arnold J. Toynbee's A Study of History

Challenge and Response

In a growing civilization a challenge meets with a successful response which proceeds to generate another and a different challenge which meets with another successful response. There is no term to this process of growth unless and until a challenge arises which the civilization in question fails to meet--a tragic event which means a cessation of growth and what we have called a breakdown. Here the correlative rhythm begins. The challenge has not been met, but it nonetheless continues to present itself. A second convulsive effort is made to meet it, and, if this succeeds, growth will of course be resumed.

Charm and Force

The radiation of any civilization may be analysed into three elements--economic, political and cultural--and, so long as a society is in a state of growth, all three elements seem to be radiated with equal power or, to speak in human rather than physical terms, to exercise an equal charm. But, as soon as the civilization has ceased to grow, the charm of its culture evaporates. Its powers of economic and political radiation may, and indeed probably will, continue to grow faster than ever, for a successful cultivation of the pseudo-religions of Mammon and Mars and Moloch is eminently characteristic of broken-down civilizations. But, since the cultural element is the essence of a civilization and the economic and political elements are relatively trivial manifestations of the life that it has in it, it follows that the most spectacular triumphs of economic and political radiation are imperfect and precarious.

In the contest that now ensues the broken-down civilization radiates force instead of attracting mimesis.

The Genesis of Civilization

The differentiation takes place within the body of the antecedent civilization, when that civilization begins to lose the creative power through which, in its period of growth, it had at one time inspired a voluntary allegiance in the hearts of the people below its surface or beyond its borders. When this happens, the ailing civilization pays the penalty for its failing vitality by being disintegrated into a dominant minority, which rules with increasing oppressiveness but no longer leads, and a proletariat (internal and external) which responds to this challenge by becoming conscious that it has a soul of its own and by making up its mind to save its soul alive.

The dominant minority's will to repress evokes in the proletariat a will to secede; and a conflict between these two wills continues while the declining civilization verges toward its fall, until, when it is in articulo mortis, the proletariat at length breaks free from what was once its spiritual home but has now become a prison-house and finally a City of Destruction.... The secession of the proletariat is the dynamic act, in response to the challenge, through which the change from Yin to Yang is brought about; and in this dynamic separation the 'affiliated' civilization is born.

The Nature of the Breakdown

The nature of the breakdown can be summed up in three points: a failure of creative power in the creative minority, which henceforth becomes a merely 'dominant' minority; an answering withdrawal of allegiance and mimesis on the part of the majority; a consequent loss of social unity in the society as a whole.

Like a Shark

... when a frontier between a more highly and a less highly civilized society ceases to advance, the balance does not settle down to a stable equilibrium but inclines, with the passage of time, in the more backward society's favour.

Toynbee's observations about civilization in decline seem to describe our world.

Thursday, September 18, 2014

This is my short and short-attention-span review of Noah Smith's full 5-part review of Big Ideas in Macroeconomics by Karthik Athreya

Dumb luck, maybe. At the top of Noah's sidebar, a link to his review of the book Big Ideas in Macroeconomics. From its title, the book sounds like it might be a useful summary of the big ideas. It isn't. Or at least, I couldn't tell if it is, from as much as I read of Noah's post.

But he did irk me, Noah did. So here we are.

Noah writes:
Deep within my cultural memory is buried a legend - the legend of the Scholastics. The legend goes like this: At the dawn of the modern age, when European rationalists and scientists began to unleash an explosion of creativity and free thought, there were a tribe of very smart, very learned people called the Scholastics, who devoted all their mental powers to defending the old Medieval understanding of the Universe. They produced exhaustive treatises defending old dogmas, and honed their logical thinking to a fine edge, but in the end they could not stand in the way of progress and were swept away. Deep within my cultural memory lies the boyish fantasy of confronting and defeating a Scholastic in an intellectual confrontation, in the name of a new scientific revolution. The 14-year-old in me still wants to be a fictionalized, Hollywood-ized version of Descartes, Galileo, or Francis Bacon, fighting for rationality, enlightenment, etc. etc.

Yeah, I'm no history buff, but I think Noah has the timeline jumbled. The Scholastics were in the 1200s and 1300s. Occam (of Occam's Razor) and Thomas Aquinas and them. The "dawn of the modern age" came a little later, with Descartes in the 1500s, and after.

Oh, crap. Noah's Wikipedia link dates Scholasticism at "about 1100 to 1700".

Oh, yeah, but: Under "Early Scholasticism" they say Charlemagne "attracted scholars" and "By decree in AD 787, he established schools in every abbey in his empire." Wow... he set a standard that lasted more than a thousand years. Only now are we trying to think it might be better to "privatize" schools. We, in our dark-age thinking. Oh, well.

Under "High Scholasticism" they say:

The 13th and early 14th centuries are generally seen as the high period of scholasticism.

Yeah, see? The 1200s and 1300s.

Then, under "Late Scholasticism" there is nothing but a link to some other article! I was right. Noah has things jumbled. Or maybe he's just focused on the dregs of Scholasticism, the unimportant part. Sheesh.

Descartes, it turns out, was a little later than I thought: the first half of the 1600s. But what's interesting, I think, is that when Descartes decided to throw away everything he knew and start from scratch, the stuff he was throwing away was mostly Scholasticism. What he was throwing away was the stuff Wikipedia describes as "a method of critical thought" and "a program of employing that method in articulating and defending dogma".

Interesting fellow, Descartes.

So, Noah:
But anyway, this is the bias I have to overcome when thinking about Big Ideas in Macroeconomics. It definitely has the feel of a Scholastic apologia. The book is clearly intended as a response to the outside criticisms of academic macroeconomics that have proliferated since the 2008 crisis and recession. Some of the critics are bloggers like Paul Krugman or writers like John Quiggin, who criticize macro in the public sphere; others are economists like Dan Hamermesh, who had this to say in 2011:

The economics profession is not in disrepute. Macroeconomics is in disrepute. The micro stuff that people like myself and most of us do has contributed tremendously and continues to contribute. Our thoughts have had enormous influence. It just happens that macroeconomics, firstly, has been done terribly and, secondly, in terms of academic macroeconomics, these guys are absolutely useless, most of them. Ask your brother-in-law. I’m sure he thinks, as do 90% of us, that most of what the macro guys do in academia is just worthless rubbish. Worthless, useless, uninteresting rubbish, catering to a very few people in their own little cliques.

Ouch...

A good number of people within the macro field agree, of course. But not all...

... Athreya seems to believe that most of macro's critics just don't know enough about the field...

But does this mean "outside" criticisms of macro should be discarded, just because they come from outside? Athreya does acknowledge, two pages later, that this process can lead to "capture" of critics - if the only legitimate critics of something are people who do it for a living, then the set of potential legitimate critics is pre-selected for people who will not want to be critics. (Athreya fails to mention a second, more cynical kind of "capture", which is that internal critics of a field are automatically "pissing where they sleep".)

This matters for society, because they're the ones who pay macroeconomists' salaries. Granted, it's not a large burden - if there are 3000 macroeconomics profs and govt. researchers in America and they earn an average of $150,000 each, then that's less than half a billion dollars total each year (the cost of two or three F-35s), for a field that has a shot at preventing trillions of dollars in lost output.

It's all gossip. Sure, in the end it all comes down to money. Of course. But Noah's discussion is not about the economy. It's about personalities and disagreements and zingers. It's all gossip.

It's the money that's important. The people who have money know it. The people who don't have money know it. So, why is Noah writing about economics? Why isn't he writing about the economy?

Noah, you need to throw away everything you know, and start from scratch. Don't worry, it'll be no great loss. You'll be giving up nothing except a method of critical thought and a program of employing that method in articulating and defending dogma. Throw it away, Noah.

It's what you wanted since you were 14.

Wednesday, September 17, 2014

Not perfect, but it's close.

The red line on this graph shows Real GDP (the percent change in Real GDP, actually), shifted down one and a half percentage points:

Suppose you wanted to draw a trend line through the red line, a smoothed, gradually sloping line that provides some idea of how the "average" value of Real GDP had changed over time.

I think it would look a lot like the blue line on the graph.

The blue line is a measure of nominal GDP relative to total credit market debt. The downtrend in real economic growth looks like a result of accumulating debt.

Tuesday, September 16, 2014

It didn't reduce private debt

From Sumner's post to Glasner...

David Glasner:

In a monetary economy suffering from debt deflation, one would certainly want to use monetary policy to alleviate the debt burden, but using monetary policy to alleviate the debt burden is different from using monetary policy to eliminate an excess demand for money.

I get this.

I don't really know what the "demand for money" is. I think maybe that's the problem Quantitative Easing was designed to solve. I don't know. But I do know that Quantitative Easing did nothing to alleviate the debt burden.

Scott Sumner:

As far as I know the demand for money is usually defined as either M/P or the Cambridge K. In either case, a debt crisis might raise the demand for money, and cause a recession if the supply of money is fixed. Or the Fed could adjust the supply of money to offset the change in the demand for money, and this would prevent any change in AD, P, and NGDP.

I don't get this.

Or maybe I do. A debt crisis might raise the demand for money, because creditors worry about loans going bad. So they lend less. They hang on to more of their money. Their "demand for money" goes up.

So then, Sumner is saying that if there is a debt crisis and the demand for money goes up, then to avoid a recession the Fed should "adjust the supply of money" to compensate for the increased demand for money. But isn't that what the Fed was doing for the last five years? Or trying to do?

It didn't work. But even if it *could* work, it wouldn't alleviate the debt burden.

Actually, that's why the Fed's plan didn't work: It didn't reduce private debt.

So I think Glasner has it right, and Sumner has it wrong.

Monday, September 15, 2014

Prime less inflation

Showed this Quandl graph the other day, the Real interest rate:

 Graph #1: Real Interest Rate SOURCE: Quandl
As I discovered when I tried to duplicate their graph at FRED, they don't specify the interest rate used in their calculation. I took a shot and guessed the prime rate. For inflation I subtracted the GDP Deflator, which they specified, using percent change from year ago of the Deflator, which they didn't specify.

Then, as my graph turned out much more jiggy than theirs, I specified annual frequency for the data on my FRED graph:

 Graph #2: The Prime Rate less Inflation, Annual Data
That's a pretty good match for the Quandl data. Theirs starts in the early 1960s, and lacks the rounded tops the FRED graph shows before 1960. And my FRED numbers may be a little higher -- but not much. Anyway the trends are similar, and that's the main thing.

Here's the jiggy one:

 Graph #3: The Prime Rate less Inflation, Quarterly Data
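For anyone who wants to check the arithmetic, here's a little Python sketch of the calculation as I guessed it. Quandl doesn't spell out the method, so the prime rate and the year-ago change of the deflator are my assumptions, and the numbers below are made up:

```python
# My guess at the calculation: real rate = prime rate minus
# percent change from year ago of the GDP deflator (quarterly data).

def pct_change_from_year_ago(series):
    """Percent change from four quarters earlier; None for the first year."""
    return [None] * 4 + [
        100.0 * (series[i] / series[i - 4] - 1.0)
        for i in range(4, len(series))
    ]

def real_rate(prime, deflator):
    """Prime rate less deflator inflation, quarter by quarter."""
    inflation = pct_change_from_year_ago(deflator)
    return [None if f is None else p - f for p, f in zip(prime, inflation)]

# Made-up numbers: a deflator rising about 2% a year, prime rate steady at 5%.
deflator = [100.0, 100.5, 101.0, 101.5, 102.0, 102.5, 103.0, 103.5]
prime = [5.0] * len(deflator)
print(real_rate(prime, deflator))  # roughly 3.0 once a year-ago value exists
```

With real data you'd feed in the FRED series instead of these made-up lists, but the subtraction is the same.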

Sunday, September 14, 2014

Hoodwinked by the indexing

Back in February I came across this graph at The Current Moment:

 Graph #1
Also this one, which they called "Doug Henwood's Graph":

 Graph #2
Both graphs show productivity. The one graph compares productivity to compensation, the other compares it to wages. Both wages and compensation lag productivity. But compensation increases a lot more than wages, because compensation (as the source site explains) is wages plus benefits.

I find it useful when people provide background information like that. But that's not what caught my eye. It's the difference in separation points that caught my eye: On Graph #1 the "wages" line breaks away from the productivity line suddenly, just around 1974. On Graph #2 the "compensation" line breaks away from the productivity line gradually, but also much earlier -- possibly as early as 1960. This is the kind of thing that fascinates me.

I did notice that the first graph is indexed "relative to 1970" while the second is "rebased to 1960". In other words, the lines on Graph #1 cross at 1970 and the lines on Graph #2 cross at 1960. The indexing (or the choice of a "base year") influences the location of the break-away point on the graphs, and affects the impression the graphs give us. I think we are hoodwinked by the indexing.

On Graph #1 in particular, between 1955 and 1975, it's pretty clear that the orange line is going up faster than the red line. If you could take the whole red line and just move it down a little bit, you could make the two lines touch at just one point. That point would be about 1956. And if you looked at that graph, you would see real wages falling behind productivity since 1956.

You should find that disturbing.

The second graph, Henwood's graph, seems to show compensation falling behind productivity since about 1960. But it doesn't show earlier data, so it leaves the door open. 1956 is not shown to be wrong.

I followed the Current Moment link back to Doug Henwood's and liked what I found there. I emailed Henwood:
Hi. I recently came upon an old page (2001) at Left Business Observer

I'd like to use your "productivity and compensation" graph on my blog. The page says I should get permission first, so I'm asking.

All the similar graphs that I've seen focus on the separation beginning mid-1970s. Your graph shows the separation beginning much earlier. It was eye-opening for me.

"Heavens," Henwood replied. "I have much more recent versions of that - let me find one for you tomorrow." He did, too:

 Graph #3
"Here's the latest," he wrote last February. "Haven't updated it since November, but it's through the third quarter of last year. 'Compensation' includes fringe benefits - since much of that is health insurance, much of the real value is eaten up by medical inflation. Direct pay is pay without fringe benefits. All are inflation-adjusted and indexed so the base year = 100."

The three lines on this graph combine the three different series shown on the first two graphs above. In addition, on Graph #3 the series are indexed to 1964. And -- don't you know it! -- the break-away point this time looks to be 1964, or before.

A little over a week ago -- it seems much longer -- I was reading a discussion at Reddit. The topic was "What might actually be holding back workers’ wages".

Somebody blamed Reaganomics. One guy, I'll call him Joe, rejected that idea: "Your idiotic blaming of Reaganomics would be dependent on Reaganomics traveling back in time a decade and starting the trend in the mid 70's," Joe said.

I love it. That's one of my themes: You can't just blame the guy you don't like, especially if the things you're blaming him for happened before his turn at bat. Reminds me of a Mike Kimel quote that Jazzbumpa has in his sidebar:

#15 Time moves in a single direction.

No doubt.

Anyway, Joe provided a link to a "productivity and real wages" graph -- the same graph from The Current Moment that I have as Graph #1 above. Small world.

I complimented Joe on the graph. But then I went off-topic, so much that Joe had to disagree with me. I pointed out that the graph is indexed "relative to 1970". I said: "If the graph was indexed relative to 1956 we would see the slowing of wages begin in the mid-1950s and then accelerate in the mid-1970s."

Joe replied: "That is not how the graph would change changing the index year at all. Here is one indexed to 1947. There is still a trend starting at 1970 where wages and output are decoupled."

He showed this graph:

 Graph #4: http://www.econdataus.com/wagegap12.png

Yeah, the red and blue lines are comparable to those in the graphs above. And yeah, the base year is 1947 this time.

But this graph has a third line -- the purple line -- that shows the ratio of the red and blue. Nice! I was going to get to showing a ratio.

The purple line shows how productivity ("hourly output") is changing relative to wages. And if you look at the purple line you can see it starts going up around 1956 or 1957. That means productivity started gaining on wages around 1956 or 1957.

Wages started falling behind productivity around 1956 or 1957. So if you want to blame Reaganomics... or if you want to blame Jimmy Carter or Gerald Ford or Richard Nixon or Lyndon Johnson -- or John F. Kennedy for that matter -- to do it you will have to make time go in the wrong direction. You'll be breaking Rule #15.

The failure of wages and compensation to keep up with productivity is a problem that began in the 1950s.

Yesterday, at The State of Working America I found this graph from the Economic Policy Institute:

It is similar to the graphs above. The dark blue "productivity" line goes up faster than the light blue "compensation" line since the mid-1950s. But the indexing has the two lines tangled together so you don't notice compensation falling behind until the mid-1970s.

This graph is important because it comes with this data. (Excel XLSX, 37KB) So of course I took the file, uploaded it to Zoho, and customized it for on-line use.

Change the 4-digit year value in the yellow cell to "rebase" the two lines to the year of your choice. The graph runs from 1948 to 2013. You can use any year in that range. 1956 is about right.
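The "rebase" arithmetic, by the way, is simple: divide every value in a series by its value at the base year and multiply by 100, so the line passes through 100 at the base year. I assume the spreadsheet does the same thing in a formula. A Python sketch, with made-up years and values:

```python
# Rebasing an indexed series: divide by the base-year value, times 100.
# The years and values below are made up for illustration.

def rebase(years, values, base_year):
    """Re-index `values` so the entry for `base_year` equals 100."""
    base = values[years.index(base_year)]
    return [100.0 * v / base for v in values]

years = [1948, 1956, 1970, 2013]
productivity = [50.0, 70.0, 110.0, 250.0]

print(rebase(years, productivity, 1956))  # the 1956 entry is exactly 100.0
```

Rebase two series this way and they are forced to cross at the base year, which is exactly how the choice of base year moves the apparent "break-away" point around.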

Saturday, September 13, 2014

He's kidding, right?

Benoît Cœuré of the ECB asks:

"Why is the effect of finance on growth non-linear?"

He's kidding, right?

You have to take the verb "finance" and consider what it means. It means to borrow money. Borrowing money is good for economic growth. A dollar borrowed is a dollar spent. The numbers go up. You might even want to conclude there is a one-for-one relation between borrowing and spending, or between borrowing and growth. But it's a bit early yet for conclusions.

Borrowing money is good for economic growth. But when you borrow a dollar, you create a dollar of debt. (Think of debt as the record of money borrowed.) Debt has to be repaid. And just as borrowing puts money into the economy, repayment of debt takes money out. You know this. Even Benoît Cœuré knows this.

So, borrowing money is good for growth. But paying down debt is bad for growth. It's yin and yang. It's shadow and light. It's the two sides of the coin.

The effect of finance on growth goes something like this: The change in growth is equal to the amount of new borrowing, less principal repaid, less interest paid, plus interest withdrawn and spent into the economy.

And then you have to factor in the fact that spending recycles income, and the fact that not all spending contributes to growth, and the fact that debt accumulates.

It becomes too complicated for me to put a number on it. But you can easily see it's not a simple linear relation.
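To make the idea concrete, here's a little Python sketch of the relation described above. All the names and numbers are mine, made up for illustration; the point is only that as debt accumulates, interest eats into the contribution new borrowing makes to spending:

```python
# The relation sketched above: the net contribution of finance to spending
# in a period is new borrowing, less principal repaid, less interest paid,
# plus interest withdrawn and spent back into the economy.

def finance_contribution(new_borrowing, principal_repaid,
                         interest_paid, interest_respent):
    return new_borrowing - principal_repaid - interest_paid + interest_respent

debt = 0.0
rate = 0.05  # an assumed interest rate
for year in range(1, 6):
    borrowing = 100.0  # the same new borrowing every year
    interest = rate * debt
    print(year, finance_contribution(borrowing, 0.0, interest, 0.0))
    debt += borrowing
# contribution falls: 100.0, 95.0, 90.0, 85.0, 80.0
```

Even in this stripped-down version, where borrowing never changes, the contribution to spending shrinks every year because the accumulated debt carries a growing interest cost.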

I think you already knew this. But maybe Benoît Cœuré didn't know.

The one-for-one relation between borrowing and economic growth is complicated by the cost of debt. That's why the relation is non-linear.

I think that answers Benoît Cœuré's question.

Friday, September 12, 2014

The Bankers' Solution

Notes on the speech by Benoît Cœuré of the ECB.
The crisis has made us understand that the size of the financial sector can exacerbate the trade-off between economic efficiency and financial stability. While finance per se is necessary for growth, an oversized financial industry can be detrimental to real economic activity. Of course, the question of what constitutes an “oversized” financial sector is a complex one...

I have written:

Would you wait until adding to total debt decreases GDP to say there is a harmful effect? Or would you say what I say: If adding to total debt produces a smaller increase in GDP than it formerly did, then harm has already been done.

The question is really not complex. If the next dollar of debt increases GDP less than the previous dollar did, then you are on the wrong side of the Laffer Limit:

If you watch an economy using credit for growth, using more credit increases growth up to a point; beyond the Laffer Limit it starts ruining the economy.

Benoît Cœuré:
As regards the link between finance and growth, for a long time, it was common to say that finance affects growth in a positive, monotonic way, as if more is always better. Recent empirical evidence, however, has qualified this conventional wisdom. The analysis has now been refined to show that, beyond a threshold size, the effect of finance on long-term economic growth can weaken and even become negative.

Yep.

Why is the effect of finance on growth non-linear?

Wrong question.

Cœuré, the banker:
In the wake of the financial crisis, the global regulatory architecture has evolved to meet the challenge posed by the financial sector’s potential to generate economic distress. For one, we have proceeded towards a more integrated governance of the banking sector in Europe. The first components of a genuine banking union – the Single Supervisory Mechanism and the Single Resolution Mechanism – are already being implemented. By aiming to make the banking activities conducted in Europe safer, the banking union implicitly touches on some of the aspects of “oversized finance” in general, and “overbanking” in particular...

We believe that we have learned our lesson...

The academic literature, as well as our own experience in the crisis, has proven that the size of the financial sector can exacerbate the trade-off between economic efficiency and financial stability. I hope that the regulatory reforms that are now being enacted and the recent changes in the European supervisory architecture, combined with the renewed push for a single market for capital in Europe, will go some way in addressing this trade-off.

No.

There is no "trade-off between economic efficiency and financial stability". If Benoît Cœuré thinks there is, then Benoît Cœuré fails to understand the Laffer Limit.

The banker's solution makes finance bigger. My solution makes finance smaller.

We must not allow bankers to design the future.

Wednesday, September 10, 2014

A fly in the ointment

You know the economy is screwed up. That's why you're here. It's what you do on the Internet, think about the economy. Me, too.

You know we need to fix it. But everything has an economic component, a money component. So the conversation can go anywhere. Makes it hard to focus. Makes it hard to know what's relevant. That's unfortunate.

You know our world is full of well-written explanations and well-accepted memes, not all of which are correct. Not all of which can be correct. So it goes.

Maybe you are reading Frances Coppola's Ultra-Liquidity at Pieria. Well-written, certainly seems right.

Coppola:

All assets can be regarded as “money” to a greater or lesser extent: the extent to which assets have “moneyness” is really a matter of liquidity.

Sure. The concept of "near money" isn't new. But I guess Coppola's point is that liquidity is different now. Liquidity is "ultra".

She gives some excellent examples, beginning with her stay at Lindau. Using a VISA debit card she could "pay for a meal in Euros directly from my sterling account":

Because of technological changes, what was formerly an illiquid asset in Germany (sterling) has become transparently liquid.

Coppola says we could easily do the same thing with "a smartcard or a smartphone app". This helps me understand a little of what Steve Roth has been writing about.

Me? I don't use smartcards or smartphone apps or a VISA debit card.

It was interesting, I thought, that Coppola said this:

... suppose that instead of a sterling bank account, a smartcard or a smartphone app enabled me to pay a bill in Euros directly from my holdings of UK gilts? This is not as unlikely as it sounds. It would actually be two transactions...

Yeah: It's two transactions. One of them pays for dinner. The other pays for the currency swap: a financial transaction. Coppola's significant fact is that "This pair of transactions in today's liquid markets could be done instantaneously."

My significant fact is that financial transactions increase costs.

Coppola makes a good point:

ANYTHING that can be used to settle transactions really should be regarded as money: and as technology increases liquidity in all asset classes, so they become easier to use for transaction settlement and therefore more money-like.

Okay. This is a significant change: We now have technological liquidity. Like gunpowder and the bomb, it probably won't be going away.

But it doesn't come free, either. And the more financial transactions we participate in, the greater the cost of finance. This is true for any given rate of interest. It is true even if the transactions occur "instantaneously".

I see financial cost as a problem. My solution is the same, always: reduce the size of finance. Rely more on government-issue money and less on bank-issue.

The cost of finance doesn't seem to cross Frances Coppola's mind.

The article is about the increase in the liquidity of non-monetary financial assets, a change that has turned them into money. But as Coppola points out, ultra-liquidity is not always reliable:

CDOs were highly liquid near-money assets prior to the financial crisis. Now, they are about as liquid as flies in amber, and worth considerably less.

Again, Coppola:

No wonder yield-hunters go for intrinsically illiquid assets such as property, or assets that carry significant credit risk such as junk bonds. They can't get any sort of return on safer and/or more liquid assets. And the more technology improves market liquidity – and the more regulators push for improvements to market liquidity – the lower returns will be across all classes of financial asset.

And the more regulators push for improvements to market liquidity, she says.

It is policy to "push for improvements" in liquidity. By "improvements" Coppola means "increases" in liquidity. Policy increases liquidity, Coppola says, even though it drives investors into junk.

In the conclusion Coppola writes:

Above all, though, we need to rethink how we do monetary policy... Liquidity is becoming to all intents and purposes free. The question for policy makers now is how to influence the returns earned on illiquid investments such as property.

In other words, she accepts this brave new world and wants to find a way to deal with it. I think that's the wrong approach. I think Coppola, like the regulators, is heading for the wrong goal. We want to minimize finance, not maximize it.

Tuesday, September 9, 2014

The limits of limits

It's four years old, but I am fascinated by Thomas Palley's The Limits of Minsky’s Financial Instability Hypothesis as an Explanation of the Crisis. Minsky makes a good point, Palley says, but "his theory only provides a partial and incomplete account of the current crisis."

I'm still reading. But according to Palley, Tim Geithner and Larry Summers -- the Treasury Secretary and the President's chief economic counselor when the piece was written -- held the view that "financial excess was the only problem, and normal growth will return once that problem is remedied." Palley disagrees, and pushes the "roots of the crisis" back to the late 1970s, early '80s:

By giving free rein to the Minsky mechanisms of financial innovation, financial deregulation, regulatory escape, and increased appetite for financial risk, policymakers (like former Federal Reserve Chairman Alan Greenspan) extended the life of the neoliberal model. The sting in the tail was that this made the crisis deeper and more abrupt when financial markets eventually reached their limits. Without financial innovation and financial deregulation the neoliberal model would have got stuck in stagnation a decade earlier, but it would have been stagnation without the pyrotechnics of financial crisis.

When Palley goes back in time to the root of the crisis, he finds people doing what Minsky said they'd be doing. So I haven't figured out why Palley says Minsky's theory provides only a partial explanation. But like I said, I'm still reading.

"The sting in the tail", Palley says, was that the long duration of the policy made the crisis worse "when financial markets eventually reached their limits." Sure: Preventing small forest fires makes the big ones bigger. But I have trouble with the words "reaching the limits".

Palley doesn't say it, but to me the word "limits" implies we might somehow be able to calculate those limits, and that way avoid a crisis the next time. This brings to mind a calculation by Richard Vague, recently referenced by Steve Keen:

This recovery is starting from an unprecedented level of private debt: whereas the last post-recession recovery in America began from a debt level of 115 per cent of GDP, this one is commencing from 155 per cent (see Figure 2). If it only reaches the level of the previous peak (177 per cent) in the next five years, it will have fulfilled Richard Vague’s empirical rule of thumb for the cause of an economic crisis (a private debt to GDP ratio above 150 per cent, and an 18 per cent increase in that ratio over 5 or less years).
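Vague's rule of thumb, as quoted, is simple enough to put in code. A sketch; I'm assuming the "18 per cent increase" means 18 percentage points of the ratio, within the previous 5 years, since the quote doesn't say exactly:

```python
# Richard Vague's rule of thumb as quoted above, sketched in code.
# Assumption: "18 per cent increase" = 18 percentage points of the ratio.

def vague_flag(debt_to_gdp_by_year):
    """debt_to_gdp_by_year: list of (year, private debt as % of GDP),
    one entry per year. Returns the first year the rule is met, else None."""
    for i, (year, ratio) in enumerate(debt_to_gdp_by_year):
        if ratio <= 150.0:
            continue
        # look back up to 5 years for an 18-point rise in the ratio
        for j in range(max(0, i - 5), i):
            if ratio - debt_to_gdp_by_year[j][1] >= 18.0:
                return year
    return None

series = [(2002, 140.0), (2003, 146.0), (2004, 153.0),
          (2005, 161.0), (2006, 170.0), (2007, 177.0)]
print(vague_flag(series))  # 2005: above 150, and up 21 points since 2002
```

The made-up series shows how mechanical the rule is: one threshold, one look-back window, one flag.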

I think it conveys the wrong idea to say a crisis occurs when markets reach their limits. It implies there is a number, 18 or 5 or maybe 42, a magic number that when you reach it all hell breaks loose.

I didn't read Minsky, so maybe I have this wrong. (If you think I'm wrong, quote Minsky to me. Don't quote somebody else.) But here's what I'm pretty sure Minsky said: Financial instability increases.

Until what Thomas Palley calls Minsky's "thwarting institutions" are restored, instability increases. As instability increases, the economy becomes increasingly fragile. Eventually, the weight of a feather is enough to bring it down. An analogy that comes to mind: a bicycle going slower and slower, wobbling and righting itself once, wobbling again, then falling over in a sudden crash.

If the bicycle kept going fast enough, it might have stayed upright. If the economy didn't grow increasingly fragile, it might have withstood the shock that brought it down. It's not like there's a limit-line on the pavement and the bicycle will fall when it reaches the limit.

I think a shock that would scarcely be noticed in a strong economy is enough to topple a weak one. You don't calculate the "limits" at which an economy comes crashing down. You figure out what's making the economy weak, and you calculate how best to strengthen it.

Monday, September 8, 2014

What's wrong with this picture?

This picture. The FRED Blog graph we looked at yesterday:

 Graph #1: Quantity of Currency in Circulation Relative to GDP
What's wrong with it?

Here, let me ask the question this way: What stands out?

I'll give you a hint:

 Graph #2: Same graph as Above, Except it Starts at 1960
There is an anomaly on this graph, a decade that's not like all the others. See it?

Here's another hint:
 Graph #3: Same graph as Above, Except it Starts at 1995
You can see in all three graphs above, regardless of whether the general trend is upward or downward, the steps are always tiny. Perhaps the better word is "brief".

The graph is a plot of quarterly values. That means every three months there is a different value. And you can see easily that every three months, the line is just a little bit different than it was before.

Even the seriously large changes of the early 1950s, on Graph #1, occurred during very brief periods of time.

The anomaly is that the brief steps didn't happen in the years leading up to the Global Financial Crisis and the Great Recession of 2009.

The random walk uphill, which began in the mid-1980s, was interrupted by a purposeful walk downhill from 2003 to 2008.

A purposeful walk is an act of policy. The uninterrupted five-year decline visible on these graphs was an act of policy. It can be nothing else.

Actually, I showed you that decline before. It is visible in the growth rate of Base Money. As with the graphs above, the decline occurs during the decade before the Great Recession:

 Graph #4, from mine of 14 October 2011. See also mine of 6 March 2014.

It also occurred during the decade before the Great Depression.

You know, I wasn't gonna beat you over the head with this. I was gonna let it go. But then I bumped into Scott Sumner's Four things I believe. And Sumner's Thing One is this:

1. The Great Recession was caused by tight money at the Fed, and other major central banks. Period. End of Story.

I hate to say it, but he's right. But he didn't show the graphs, so I did.

Period. End of Story.