The contradictory joys of being the US Treasury Secretary

Tim Geithner, speaking at the start of the G-20 meeting in Pittsburgh:

Sept. 25 (Bloomberg) — Treasury Secretary Timothy Geithner said he sees a “strong consensus” among Group of 20 nations to reduce reliance on exports for growth and defended the dollar’s role as the world’s reserve currency.

“A strong dollar is very important in the United States,” Geithner said in response to a question at a press conference yesterday in Pittsburgh, where G-20 leaders began two days of talks.

Tim Geithner, speaking in Tokyo while joining the US President on a tour of Asian capitals:

Nov. 11 (Bloomberg) — U.S. Treasury Secretary Timothy Geithner said a strong dollar is in the nation’s interest and the government recognizes the importance it plays in the global financial system.

“I believe deeply that it’s very important to the United States, to the economic health of the United States, that we maintain a strong dollar,” Geithner told reporters in Tokyo today.
[…]
Geithner said U.S. efforts to boost exports aren’t in conflict with the “strong-dollar” policy. “I don’t think there’s any contradiction between the policies,” he said.

Which is hilarious.

There is no objective standard for currency strength [1].  A “strong (US) dollar” is a dollar strong relative to other currencies, so it’s equivalent to saying “weak non-US-dollar currencies”.  But when the US dollar is up and other currencies are down, that means that the US will import more (and export less), while the other countries will export more (and import less), which is the exact opposite of the re-balancing efforts.

The only way to reconcile what Geithner’s saying with the laws of mathematics is to suppose that his “strong dollar” statements are political, relating only to the nominal exchange rate, and to observe that trade is driven by the real exchange rate.  But that then means that he’s calling for a stable nominal exchange rate combined with either deflation in the USA or inflation in other countries.
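To make the nominal-versus-real distinction concrete, here is a toy calculation (the numbers are made up; the formula is just the textbook definition, real = nominal × P-domestic / P-foreign):

```python
# Toy numbers, not data.  The real exchange rate adjusts the nominal rate
# for relative price levels: real = nominal * (P_domestic / P_foreign).
def real_exchange_rate(nominal, p_domestic, p_foreign):
    return nominal * p_domestic / p_foreign

# A "strong" (stable) nominal dollar...
base = real_exchange_rate(1.00, 100.0, 100.0)                # 1.00

# ...can still depreciate in real terms if the US experiences deflation
# (or other countries experience inflation) while the nominal rate holds.
after_us_deflation = real_exchange_rate(1.00, 95.0, 100.0)   # 0.95

print(base, after_us_deflation)
```

A stable nominal rate plus 5% US deflation delivers a 5% real depreciation, which is exactly the combination described above.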

Assuming my previous paragraph is true, 10 points to the person who can see the potential conspiracy theory [2] implication of Nouriel Roubini’s recent observation that the US holding their interest rates at zero is fueling “the mother of all carry trades” [Financial Times, RGE Monitor].

Hint:  If you go for the conspiracy theory, this story would make you think it was working.

Nov. 13 (Bloomberg) — Brazil, South Korea and Russia are losing the battle among developing nations to reduce gains in their currencies and keep exports competitive as the demand for their financial assets, driven by the slumping dollar, is proving more than central banks can handle.
[…]
Governments are amassing record foreign-exchange reserves as they direct central banks to buy dollars in an attempt to stem the greenback’s slide and keep their currencies from appreciating too fast and making their exports too expensive.
[…]
“It looked for a while like the Bank of Korea was trying to defend 1,200, but it looks like they’ve given up and are just trying to slow the advance,” said Collin Crownover, head of currency management in London at State Street Global Advisors.

The answer to follow …

Update: The answer is in my next post.

[1] There better not be any gold bugs in the audience.  Don’t make me come over there and hurt you.

[2] Okay, not a conspiracy theory; just a behind-the-scenes-while-completely-in-the-open strategy of international power struggles.


Not raising the minimum wage with inflation will make your country fat

Via Greg Mankiw, here is a new working paper by David O. Meltzer and Zhuo Chen: “The Impact of Minimum Wage Rates on Body Weight in the United States“. The abstract:

Growing consumption of increasingly less expensive food, and especially “fast food”, has been cited as a potential cause of increasing rate of obesity in the United States over the past several decades. Because the real minimum wage in the United States has declined by as much as half over 1968-2007 and because minimum wage labor is a major contributor to the cost of food away from home we hypothesized that changes in the minimum wage would be associated with changes in bodyweight over this period. To examine this, we use data from the Behavioral Risk Factor Surveillance System from 1984-2006 to test whether variation in the real minimum wage was associated with changes in body mass index (BMI). We also examine whether this association varied by gender, education and income, and used quantile regression to test whether the association varied over the BMI distribution. We also estimate the fraction of the increase in BMI since 1970 attributable to minimum wage declines. We find that a $1 decrease in the real minimum wage was associated with a 0.06 increase in BMI. This relationship was significant across gender and income groups and largest among the highest percentiles of the BMI distribution. Real minimum wage decreases can explain 10% of the change in BMI since 1970. We conclude that the declining real minimum wage rates has contributed to the increasing rate of overweight and obesity in the United States. Studies to clarify the mechanism by which minimum wages may affect obesity might help determine appropriate policy responses.

Emphasis is mine.  There is an obvious candidate for the mechanism:

  1. Minimum wages, in real terms, have been falling in the USA over the last 40 years.
  2. Minimum-wage labour is a significant proportion of the cost of “food away from home” (often, but not only, fast food).
  3. Therefore the real cost of producing “food away from home” has fallen.
  4. Therefore the relative price of “food away from home” has fallen.
  5. Therefore people eat “food away from home” more frequently and “food at home” less frequently.
  6. Typical “food away from home” has, at the least, more calories than “food at home”.
  7. Therefore, holding the amount of exercise constant,  obesity rates increased.

Update: The magnitude of the effect for items 2–7 will probably be greater for fast food than for regular restaurant food, because minimum-wage labour will almost certainly comprise a larger fraction of costs for a fast-food outlet than it will for a fancy restaurant.
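For a sense of magnitude, here is the back-of-envelope arithmetic implied by the abstract (the dollar size of the real decline is my assumption, not the paper's):

```python
effect_per_dollar = 0.06     # from the abstract: +0.06 BMI per $1 fall
# The abstract says the real minimum wage fell "by as much as half" over
# 1968-2007; putting that at roughly $4 (say, ~$8 halving to ~$4 in real
# terms) is my assumption, for illustration only.
assumed_real_decline = 4.0

implied_bmi_rise = effect_per_dollar * assumed_real_decline
print(implied_bmi_rise)   # 0.24
```

A quarter of a BMI point is small for any individual, but shifted across the whole population it is consistent with the paper's claim of explaining 10% of the rise in BMI since 1970.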

Variation in US unemployment

The NY Times brings us another wonderful graphic.  As of September 2009, white women aged 25 to 34 with a college degree had an unemployment rate of just 3.6%, while black men aged 18 to 24 without a high-school diploma had an unemployment rate of 48.5%.  Change that last group to white men aged 18 to 24 without a high-school diploma and it falls to 25.6%.

The likelihood-ratio threshold is the shadow price of statistical power

Cosma Shalizi, an associate professor in statistics at Carnegie Mellon University, gives an interpretation of the likelihood-ratio threshold in an LR test: It’s the shadow price of statistical power:

[…]

Suppose we know the probability density of the noise p and that of the signal is q. The Neyman-Pearson lemma, as many though not all schoolchildren know, says that then, among all tests of a given size s, the one with the smallest miss probability, or highest power, has the form “say ‘signal’ if q(x)/p(x) > t(s), otherwise say ‘noise’,” and that the threshold t varies inversely with s. The quantity q(x)/p(x) is the likelihood ratio; the Neyman-Pearson lemma says that to maximize power, we should say “signal” if it’s sufficiently more likely than noise.

The likelihood ratio indicates how different the two distributions — the two hypotheses — are at x, the data-point we observed. It makes sense that the outcome of the hypothesis test should depend on this sort of discrepancy between the hypotheses. But why the ratio, rather than, say, the difference q(x) – p(x), or a signed squared difference, etc.? Can we make this intuitive?

Start with the fact that we have an optimization problem under a constraint. Call the region where we proclaim “signal” R. We want to maximize its probability when we are seeing a signal, Q(R), while constraining the false-alarm probability, P(R) = s. Lagrange tells us that the way to do this is to minimize Q(R) – t[P(R) – s] over R and t jointly. So far the usual story; the next turn is usually “as you remember from the calculus of variations…”

Rather than actually doing math, let’s think like economists. Picking the set R gives us a certain benefit, in the form of the power Q(R), and a cost, tP(R). (The ts term is the same for all R.) Economists, of course, tell us to equate marginal costs and benefits. What is the marginal benefit of expanding R to include a small neighborhood around the point x? Just, by the definition of “probability density”, q(x). The marginal cost is likewise tp(x). We should include x in R if q(x) > tp(x), or q(x)/p(x) > t. The boundary of R is where marginal benefit equals marginal cost, and that is why we need the likelihood ratio and not the likelihood difference, or anything else. (Except for a monotone transformation of the ratio, e.g. the log ratio.) The likelihood ratio threshold t is, in fact, the shadow price of statistical power.

It seems sensible to me.
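It is also easy to check numerically.  Here is a minimal sketch (my toy example, not Shalizi's) with Gaussian noise and signal: the marginal gain in power from relaxing the size constraint should equal the likelihood-ratio threshold at the boundary, which is exactly the shadow-price claim.

```python
from statistics import NormalDist

noise, signal = NormalDist(0, 1), NormalDist(1, 1)   # densities p and q

def power_at_size(s):
    # For equal-variance Gaussians the likelihood ratio q(x)/p(x) is
    # increasing in x, so the optimal size-s test says "signal" when x
    # exceeds the (1 - s) quantile of the noise distribution.
    c = noise.inv_cdf(1 - s)
    return 1 - signal.cdf(c)

s = 0.05
c = noise.inv_cdf(1 - s)
power = power_at_size(s)
t = signal.pdf(c) / noise.pdf(c)   # likelihood-ratio threshold at the boundary

# Shadow-price check: the marginal gain in power from relaxing the size
# constraint should equal t.
ds = 1e-6
marginal = (power_at_size(s + ds) - power_at_size(s)) / ds
print(round(t, 3), round(marginal, 3))   # the two numbers agree
```

At size 5% the test has power of about 26%, and buying a tiny bit more size raises power at a rate of roughly 3.14 extra units of power per unit of size, which is precisely the threshold t.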

Who has more information, the Central Bank or the Private Sector?

A friend pointed me to this paper:

Svensson, Lars E. O. and Michael Woodford. “Indicator Variables For Optimal Policy,” Journal of Monetary Economics, 2003, v50(3,Apr), 691-720.

You can get the NBER working paper (w8255) here.  The abstract:

The optimal weights on indicators in models with partial information about the state of the economy and forward-looking variables are derived and interpreted, both for equilibria under discretion and under commitment. The private sector is assumed to have information about the state of the economy that the policymaker does not possess. Certainty-equivalence is shown to apply, in the sense that optimal policy reactions to optimally estimated states of the economy are independent of the degree of uncertainty. The usual separation principle does not hold, since the estimation of the state of the economy is not independent of optimization and is in general quite complex. We present a general characterization of optimal filtering and control in settings of this kind, and discuss an application of our methods to the problem of the optimal use of ‘real-time’ macroeconomic data in the conduct of monetary policy. [Emphasis added by John Barrdear]

The sentence I’ve highlighted is interesting.  As written in the abstract, it’s probably true.  Here’s a paragraph from page two that expands the thought:

One may or may not believe that central banks typically possess less information about the state of the economy than does the private sector. However, there is at least one important argument for the appeal of this assumption. This is that it is the only case in which it is intellectually coherent to assume a common information set for all members of the private sector, so that the model’s equations can be expressed in terms of aggregative equations that refer to only a single “private sector information set,” while at the same time these model equations are treated as structural, and hence invariant under the alternative policies that are considered in the central bank’s optimization problem. It does not make sense that any state variables should matter for the determination of economically relevant quantities (that is, relevant to the central bank’s objectives), if they are not known to anyone in the private sector. But if all private agents are to have a common information set, they must then have full information about the relevant state variables. It does not follow from this reasoning, of course, that it is more accurate to assume that all private agents have superior information to that of the central bank; it follows only that this case is one in which the complications resulting from partial information are especially tractable. The development of methods for characterizing optimal policy when different private agents have different information sets remains an important topic for further research.

Here’s my attempt as paraphrasing Svensson and Woodford in point form:

  1. The real economy is the sum of private agents (plus the government, but ignore that)
  2. Complete information is thus, by definition, knowledge of every individual agent
  3. If we assume that everybody knows about themselves (at least), then the union of all private information sets must equal complete information
  4. The Central Bank observes only a sample of private agents
  5. That is, the Central Bank information set is a subset of the union of all private information sets. The Central Bank’s information cannot be greater than the union of all private information sets.
  6. One strategy in simplifying the Central Bank’s problem is to assume that private agents are symmetric in information (i.e. they have a common information set).  In that case, we’d say that the Central Bank cannot have more information than the representative private sector agent. [See note 1 below]
  7. Important future research will involve relaxing the assumption in (6) and instead allowing asymmetric information across different private agents.  In that world, the Central Bank might have more information than any given private agent, but still less than the union of all private agents.
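Points 3 to 6 are really just set operations.  A toy sketch (the agent names and signals are invented for illustration):

```python
# Invented agents and signals, purely for illustration.
agents = {
    "agent_1": {"own_income_1", "local_prices_1"},
    "agent_2": {"own_income_2", "local_prices_2"},
    "agent_3": {"own_income_3", "local_prices_3"},
}

# Point 3: if everybody knows at least their own state, the union of all
# private information sets is complete information.
complete_information = set().union(*agents.values())

# Points 4-5: the Central Bank surveys only some agents, so its
# information set is a subset of that union.
central_bank = agents["agent_1"] | agents["agent_2"]
assert central_bank <= complete_information

# Point 6: under symmetric information every agent holds the common
# (here: complete) set, so the CB cannot know more than any one agent.
symmetric_agent = complete_information
assert central_bank <= symmetric_agent

print(len(central_bank), len(complete_information))
```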

Svensson and Woodford then go on to consider a world where the Central Bank’s information set is smaller than (i.e. is a subset of) the Private Sector’s common information set.

But that doesn’t really make sense to me.

If private agents share a common information set, it seems silly to suppose that the Central Bank has less information than the Private Sector, for the simple reason that the mechanism of creating the common information set – commonly observable prices that are sufficient statistics of private signals – is also available to the Central Bank.

In that situation, it seems more plausible to me to argue that the CB has more information than the Private Sector, provided that their staff aren’t quietly acting on the information on the side.  It would also be consistent with observed history: the Private Sector pays ridiculous amounts of attention to every word uttered by the Central Bank (because the Central Bank has the one private signal that isn’t assimilated into the price).

Note 1: To arrive at all private agents sharing a common information set, you require something like the EMH (in fact, I can’t think how you could get there without the EMH).  A common information set emerges from a commonly observable sufficient statistic of all private information.  Prices are that statistic.
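A toy version of that note (my construction, not Svensson and Woodford's): each agent gets a noisy private signal of some fundamental, and a "price" equal to the average of those signals is a commonly observable statistic available to everyone, including the Central Bank.

```python
import random

random.seed(0)                     # reproducible toy example
v = 10.0                           # the unobserved fundamental
signals = [v + random.gauss(0, 1) for _ in range(10_000)]

# The "price" aggregates everyone's private signals into one commonly
# observable statistic...
price = sum(signals) / len(signals)

# ...and anyone who can see it -- private agent or Central Bank -- ends
# up with essentially the same estimate of v.
print(round(price, 2))
```

Which is the point made above: the mechanism that creates the common information set is just as available to the Central Bank as it is to any private agent.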

Restarting running: 100 days in

Running_30Oct2009

Between 1999 and 2008 inclusive, the best I ever managed in a single block was a pathetic 17 runs over 41 days.  In 1998 I did manage 37 runs in a “block” but it was haphazard, with several two-week breaks and a couple of spurts of 5 runs per week that were, frankly, dangerous.  It took me 128 days (18 and a half weeks) to get through those 37.  I should reach 37 this time after 103 days (14 and a half weeks).  I attribute my sticking with it this time to:

  1. Running shorter distances than I have in the past
  2. Resisting the urge to increase my speed too quickly
  3. Never running more than three times per week
  4. This

Media bias and people who are WAAAY out on the political spectrum

Andrew Sullivan points to this research by Pew on how Americans view the bias of the major television networks.  It’s nicely summarised in this diagram (from Pew):

Public perceptions of news network ideology

Andrew makes the obvious and easy comment bashing on Fox:

Clearly the public understands that the network MSM is skewed to the left. But there’s a difference of magnitude between that assessment and that of Fox. Quite simply, most Americans see Fox for what it is: an appendage of a political operation, not a journalistic one. Its absurd distortions, its relentless attacks on Obama from the very start, its hideously shrill hosts, and its tawdry, inflammatory chat all put it in a class by itself.

Personally, I don’t necessarily agree that the MSM is, on average, biased to the left (although maybe that’s just my internal biases talking).  I’ll get to that in a moment, but first …

14% of respondents consider Fox News to be mostly liberal in its bias!  That’s almost one in seven.  Just how far out in the political spectrum are those people?  What would Fox need to do to convince them that it was neutral?  Actively promote the KKK?

Back to perceptions of bias.  Here is another graphical illustration of the Pew Research data:

US Perceptions of MSM Bias (Pew)

It seems safe to assume that anybody who thinks Fox News is liberal will consider the rest liberal as well, so that explains a large fraction of the “liberal” responses for the rest.  So, excluding the people who are personally so conservative as to consider Fox News to have a pro-liberal bias, this is what it looks like:

US Perceptions of MSM Bias (excl. people who think Fox is liberal)

In other words, when we restrict our attention to people who are not insane [1], the American public agrees with me: by and large, the non-Fox networks are pretty evenly balanced, although MSNBC is pro-liberal.

[1] Okay, they may not be insane.  I have no evidence that any larger fraction of them are insane than in the rest of the population.  But they do strike me as having some pretty whacky personal beliefs.

The death throes of US newspapers?

Via Megan McArdle’s excellent commentary, I discovered the Mon-Fri daily circulation figures for the top 25 newspapers in the USA.  Megan’s words:

I think we’re witnessing the end of the newspaper business, full stop, not the end of the newspaper business as we know it. The economics just aren’t there. At some point, industries enter a death spiral: too few consumers raises their average costs, meaning they eventually have to pass price increases onto their customers. That drives more customers away. Rinse and repeat . . .

[…]

The numbers seem to confirm something I’ve thought for a while: we’re eventually going to end up with a few national papers, a la Britain, rather than local dailies. The Wall Street Journal, the Washington Post, and the New York Times (sorry, conservatives!) are weathering the downturn better than most, and it’s not surprising: business, politics, and national upper-middlebrow culture. But in 25 years, will any of them still be printing their product on the pulped up remains of dead trees? It doesn’t seem all that likely.

For those of you that like your information in pictorial form, here it is:

First, the data.  Look at the Mean/Median/Weighted Mean figures.  That really is an horrific collapse in sales.

US_Newspaper_circulation_data

Second, the distribution (click on the image for a full-sized version):

US_Newspaper_circulation_distribution

Finally, a scatter plot of year-over-year change against the latest circulation figures (click on the image for a full-sized version):

US_Newspaper_circulation_scatterplot

As Megan alluded in the second paragraph I quoted, there appears to be a weak relationship between the size of the paper and the declines they’ve suffered, with the bigger papers holding up better.  The USA Today is the clear exception to that idea.  Indeed, if the USA Today is excluded from the (already very small!) sample, the R^2 becomes 30%.

To really appreciate just how devastating those numbers are, you need to combine them with advertising figures.  Since newspapers take revenue from both sales (circulation) and advertising, the fact that advertising revenue has also collapsed, as it always does in a recession, means that newspapers have taken not just one but two knives to the chest.

Here’s advertising expenditure in newspapers over recent years, taken from here:

Year    Expenditure ($ millions)    Year-over-year % change
2005    47,408                      —
2006    46,611                      -1.7%
2007    42,209                      -9.2%
2008    34,740                      -17.7%

Which is ugly.  Remember, also, that this expenditure is nominal.  Adjusted for inflation, the figures will be worse.
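To put rough numbers on that inflation adjustment, here is a sketch deflating the 2008 figure into 2005 dollars (the CPI values are approximate assumptions of mine, not from the table):

```python
# Nominal figures from the table; CPI values are rough assumptions
# (approximate US CPI-U annual averages), used for illustration only.
nominal = {2005: 47_408, 2008: 34_740}    # $ millions
cpi = {2005: 195.3, 2008: 215.3}          # assumption

real_2008_in_2005_dollars = nominal[2008] * cpi[2005] / cpi[2008]
nominal_fall = 1 - nominal[2008] / nominal[2005]
real_fall = 1 - real_2008_in_2005_dollars / nominal[2005]

# The real decline (roughly a third) is steeper than the nominal one.
print(f"nominal: -{nominal_fall:.1%}, real: -{real_fall:.1%}")
```

On these assumptions the 2005-2008 fall is about 27% in nominal terms but closer to a third in real terms.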

So what do you do when your ad sales and your circulation figures both fall by over 15%?  Oh, and you can’t really cut costs any more because, as Megan says:

For twenty years, newspapers have been trying to slow the process with increasingly desperate cost cutting, but almost all are at the end of that rope; they can’t cut their newsroom or production staff any further and still put out a newspaper. There just aren’t enough customers who are willing to pay for their product what it costs to produce it.

Which, in economics speak, means that the newspaper business has a large fixed cost component that isn’t particularly variable even in the long run.

Tyler Cowen, in an excellent post that demonstrates precisely why I read him daily, says:

I believe with p = 0.6 that the world is in for a “great disruption.”  It has come to MSM first but it will not end there.  In the longer run I am optimistic about the results of this change — computers will free up lots of human labor — but in the meantime it will have drastic implications for income redistribution, across both individuals and across economic sectors.  For a core metaphor, the internet displacing paid journalism and classified ads is a good place to start.  The value of newspapers has been sucked into Google.

[…] Once The Great Disruption becomes more evident, entertainment will be very very cheap.

Which may well be true, but will be cold comfort for all of those traditional journalists out there.

“L’Heure espagnole” and “Gianni Schicchi”

Last night Dani and I went to the Royal Opera thanks to the glories of Student Standby tickets:  £10 each!

It’s luck of the draw for where you end up sitting.  Last night we were in the nose-bleeds, but at the ROH, even there you get a perfect view and no acoustic trade-off that my untrained ears can notice.  We’ve previously managed to get seats that would ordinarily cost hundreds of pounds.

The Royal Opera - L'Heure espagnole

We saw two one-act comedic operettas: Ravel’s L’Heure espagnole (the poor-quality photo above is from that – Look!  Giant breasts at the opera!) and Puccini’s Gianni Schicchi.  Freakin’ hilarious.

One in 20 Australians play the pokies WEEKLY

Stephen Lunn, writing at The Australian, channels the Productivity Commission’s recent report:

[The Productivity Commission] finds the legal ban in Australia on online gaming is a failure, with betting traffic heading to overseas sites that offer little in the way of consumer protection.

In its draft report on gambling, the first in-depth national look at Australia’s gambling industry in a decade, the commission finds that gamblers are losing $18 billion a year, of which $12 billion is lost on gaming machines.

It estimates that around 5 per cent of adults play weekly or more on gaming machines, and 15 per cent of those, or around 125,000 people, are problem gamblers.

Productivity commissioner Gary Banks says “a large number of people have problems with their gambling (and) it is vital that they are given a tool to achieve greater control”.

The commission recommends the reduction in the amount that can be lost on a gaming machine from its current upper limit of $1200 an hour to $120 per hour, and giving people a choice when they sit down on how much they spend, using the latest technologies.

[Emphasis added by John Barrdear]
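As a quick sanity check, the quoted percentages pin down the implied number of adults (my arithmetic, not the Commission's):

```python
# All three quoted numbers, taken at face value.
problem_gamblers = 125_000        # "around 125,000 people"
share_of_weekly = 0.15            # "15 per cent of those"
weekly_share_of_adults = 0.05     # "around 5 per cent of adults"

weekly_players = problem_gamblers / share_of_weekly
implied_adults = weekly_players / weekly_share_of_adults
print(round(weekly_players), round(implied_adults))
```

That implies roughly 830,000 weekly players and about 16.7 million adults, which is at least the right ballpark for Australia's adult population, so the three figures hang together.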

If we assume that state governments and pubs don’t want to get rid of pokies because they’re so dependent on the revenues, then surely the only serious hope for enacting this would be for it to be a federal law.

Lifting the ban on online gambling and permitting pokies but limiting the loss rate seem sensible ideas to me – they leave people with the freedom to gamble if they wish, but limit the loss to largely one of time rather than having the option of putting the house down.

Of course, the softest still-ultimately-effective policy would be to simply hold the upper limit on loss rates constant while letting the minimum wage and welfare benefits rise with inflation, so that the limit falls both in real terms (relative to the cost of living) and relative to household income.
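The arithmetic behind that last idea: hold the cap fixed in nominal terms and let inflation erode it.  A sketch with an assumed inflation rate (the 2.5% figure is illustrative, not a forecast):

```python
cap = 120.0          # the proposed $120/hour limit, held nominally fixed
inflation = 0.025    # assumed annual inflation rate, illustrative only
years = 10

# Real value of the unchanged nominal cap a decade later:
real_value = cap / (1 + inflation) ** years
print(round(real_value, 2))
```

At that rate the cap quietly loses over a fifth of its real value in ten years, with no politically painful announcement required.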