Do job destruction shocks matter in the theory of unemployment?


Melvyn G. Coles, University of Essex
Ali Moghaddasi Kelishomi, University of Warwick

December 2016

Abstract. This paper examines whether the wide variation in unemployment over the cycle is due to large variations in job destruction rates or to large variations in job finding rates. When vacancies evolve as a stock variable, we show that job destruction shocks in the Diamond/Mortensen/Pissarides framework generate Beveridge curve dynamics. By calibrating the job destruction process directly to that identified in Shimer (2005), and using SMM to identify the job creation process which best fits a set of target moments, this framework yields a near-perfect match to the Shimer (2005) data. Four calibration tests identify important difficulties faced by the (standard) DMP approach. The results support the more traditional view of unemployment: that unemployment volatility is largely driven by job destruction shocks.

JEL Codes: E24, E32, J41, J63, J64.
Keywords: search, job destruction, ins-and-outs of unemployment.

Melvyn Coles acknowledges research funding by the UK Economic and Social Research Council (ESRC), award ref. ES/I037628/1. This paper is an extensively rewritten version of our working paper New Business Start-ups and the Business Cycle, which it supersedes.

1 Introduction

What explains the wide variation in unemployment over the cycle - is it due to large variations in job destruction rates or to large variations in the job finding rates of the unemployed, the so-called ins and outs of unemployment? This important issue has a long history (see Darby et al (1986), Davis and Haltiwanger (1992), Shimer (2012), Elsby et al (2009), Fujita and Ramey (2009)) and has a major impact on how we understand unemployment dynamics over the cycle. The prevailing view is that job destruction shocks play a minor role in explaining unemployment volatility because (A) Shimer (2005) and Hall (2005) argue that job destruction shocks are small and nearly acyclic,1 (B) it is believed that large job destruction shocks generate a counterfactual positive correlation between unemployment and vacancies, and (C) the results in Shimer (2012) suggest that only 24% of the variation in unemployment is due to variations in job separation rates, the rest being due to variations in worker job finding rates.

In this paper we consider a seemingly minor change to one of the primitives of the Diamond/Mortensen/Pissarides [DMP] framework: we allow vacancies [unfilled jobs] to evolve as a stock variable. We show how this simple variation provides a near-perfect match to the Shimer (2005) data. We also identify 4 calibration tests which establish both the excellent fit of this approach and the difficulties faced by the standard matching approach. Most surprisingly, our results support the more traditional view of unemployment as described by Davis and Haltiwanger (1992) and Mortensen and Pissarides (1994): that unemployment volatility is largely driven by job destruction shocks.

The power of our approach is most readily demonstrated by considering the aggregate labor market dynamics of the US economy following the 2008/9 Great Recession. Using CPS data, Figure 1 describes [seasonally adjusted] total hires and total job separations.2 It also plots [non-farm] layoffs taken instead from JOLTS (a time series which has been available since 2001). Figure 1 reveals the unprecedentedly large spike in layoffs across 2008/9. As explained in Shimer (2005), the standard DMP approach with free entry predicts vacancies increase and hires surge following a job destruction shock. This did not happen. Instead this demonstrably large job destruction shock generated Beveridge curve dynamics: the stock of vacancies fell steeply as unemployment increased.

1 Though see the recent empirical survey by Elsby et al (2015) for an alternative interpretation.
2 Series are constructed by the BLS from the Current Population Survey (CPS).

[Figure 1: U.S. Job Turnover. Quarterly series: Hires, Separations, and Layoffs from JOLTS.]3

Figure 2 describes in greater detail the evolution of the US labor market across and subsequent to the 2008/9 layoff spike. Using the Shimer (2005) methodology, it describes unemployment, market tightness [the vacancy/unemployment ratio], productivity and layoffs, each measured as log deviations from trend using an HP filter with smoothing parameter 10^5. At the start of the layoff spike in Figure 2, market tightness was slightly above trend and unemployment slightly below. The surge in layoffs coincided with steeply increasing unemployment, a steep fall in vacancies [not graphed] and an even steeper fall in market tightness [graphed]. Yet over the entire time period, productivity was positively correlated with unemployment, and so free entry with small surplus implies market tightness should have been positively correlated with unemployment. Figure 2 shows this is counterfactual. Furthermore, with high unemployment and above-trend productivity, the vacancy stock and gross hires should all have been well above trend from 2010 onwards. Figure 1 demonstrates that hire flows merely reverted to trend. It is thus difficult to rationalise the post-2008 evolution of the U.S. economy using the free entry/small surplus approach.

3 Hires is the flow of workers from unemployment and non-participation into employment; Job Separations is the flow of workers from employment into unemployment and non-participation (all in thousands).
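For replication purposes, the following is a minimal sketch (our construction, not the authors' code) of HP filtering a series with smoothing parameter λ = 10^5: the trend solves the standard penalised least squares problem, whose first-order condition is (I + λD'D)τ = y, where D is the second-difference matrix.

```python
import numpy as np

def hp_filter(y, lam=1e5):
    """Return (trend, cycle) for series y via the Hodrick-Prescott filter.

    Solves min_tau sum (y - tau)^2 + lam * sum (second differences of tau)^2,
    whose first-order condition is (I + lam * D'D) tau = y.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # D is the (n-2) x n second-difference matrix.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend

# Example: log deviations from trend, on a synthetic log unemployment series.
rng = np.random.default_rng(0)
log_u = 2.0 + np.cumsum(rng.normal(0.0, 0.02, 200))
trend, cycle = hp_filter(log_u, lam=1e5)
print(np.round(cycle[:4], 4))
```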

[Figure 2: U.S. Labor Market Indicators. Quarterly series: Unemployment, Productivity, Market Tightness, and Layoffs.]4

Allowing vacancies to evolve as a stock variable is important for, consistent with Figure 2, job destruction shocks then generate Beveridge curve dynamics.5 This different assumption is a game-changer because it is then no longer necessary to impose zero (or sufficiently small) job destruction shocks in order to generate Beveridge curve dynamics. By instead calibrating the job separation process directly to that identified by Shimer (2005), and using SMM to estimate the job creation process which best fits a set of target moments also taken from Shimer (2005), it becomes an open question how important job destruction and job creation shocks are in explaining the volatility and persistence of unemployment.

The insights obtained are important for several reasons. First, Figure 6 in the text describes the impulse response of the economy to a job destruction shock. That impulse response function has the same structure as that demonstrated in Figure 2. In this way we identify a new, dynamically consistent theory of equilibrium unemployment dynamics.

4 Series are quarterly deviations from HP trends (λ = 10^5). Productivity is BLS output per worker from Major Sector Productivity and Costs, unemployment is constructed by the BLS from the CPS, vacancies used in market tightness are job openings from JOLTS, and layoffs are also from JOLTS (non-farm business).
5 The free entry approach instead implies vacancies evolve as a jump variable. Sniekers (2016) shows that the (U,V) dynamics in the standard approach exhibit too-high frequencies and how external economies of scale can generate long, low frequency Beveridge cycles.

Yet even though the model finds unemployment volatility is driven by job destruction shocks, the results are also fully consistent with the Shimer (2012) decomposition of the ins and outs of unemployment. That paper can be interpreted as supposing unemployment U_t is well proxied by a reduced form relationship U_t = U(θ_t, δ_t), where θ_t describes market tightness and δ_t describes job separation rates.6 But we might invert this statistical relationship to obtain instead a theory of equilibrium market tightness: θ_t = θ(U_t; p_t, δ_t), where p_t additionally describes aggregate productivity. Our first calibration test shows, for the Shimer (2005) data, that market tightness is highly negatively correlated with unemployment, is more weakly negatively correlated with job separation rates, and that productivity has only a marginal positive impact on market tightness. These correlations, of course, conform precisely to those identified in Shimer (2012). The crucial insight, however, is that conditional on (p_t, δ_t), the standard free entry approach implies equilibrium market tightness θ(.) is orthogonal to unemployment.7 Our first calibration test strongly rejects this hypothesis and Figure 2 clearly demonstrates the case. When vacancies evolve as a stock variable, however, a job destruction shock generates a market tightness process which is indeed strongly negatively correlated with unemployment. Furthermore, it is this property of the model - job finding rates are low precisely when unemployment is high - which generates the unemployment persistence that is otherwise missing in the standard DMP approach.

Our equilibrium framework is simple, transparent and parsimonious. To allow vacancies to evolve as a stock variable, we must drop the free entry assumption. We instead adopt a (two-parameter) job creation process (analogous to the coconut model of Diamond (1982)) which admits perfectly elastic vacancy creation (analogous to the free entry assumption) as a special (limiting) case.8 We do not argue whether job destruction shocks are small or large; rather we simply calibrate the exogenous job destruction process to the job separation process inferred in Shimer (2005). We do not adopt the small surplus approach. Instead parameter values are taken from Mortensen and Nagypal (2007). We then use SMM to identify the job creation process which best fits a set of target moments taken from Shimer (2005).

6 Formally Shimer (2012) considers U_t = U(f_t, δ_t), where f_t describes worker job finding rates, but matching then implies f_t = m(θ_t).
7 This is not true when there is also on-the-job search; e.g. Menzio and Shi (2011), Lise and Robin (2016). We discuss this literature further in the Conclusion.
8 See also Diamond (1982), Diamond and Fudenberg (1989), Caballero and Hammour (1994), and Fujita and Ramey (2005, 2007) for related work.

We not only find the resulting fit is near perfect; 4 calibration tests directly establish the data-relevance of this framework and the difficulties faced by the standard approach.

The model identifies the following structural interpretation for the dynamics described by Figure 2. Following a job destruction shock, vacancies fall because many newly laid-off workers take some of those unfilled jobs.9 Oversampling by the pulse of laid-off workers thus (partially) depletes the vacancy stock. Of course a free entry assumption implies all such vacancies filled are instantaneously replaced. But with a less than perfectly elastic vacancy creation process, the stock of unfilled jobs falls as the increasing stock of unemployed job seekers chases those jobs. The rising unemployment stock along with a declining vacancy stock together imply plummeting job finding rates. The vacancy stock does not begin to recover until around 10 months after the spike for, in the intervening period, the many unemployed workers quickly fill new vacancies as they come onto the market. Only once unemployment begins to fall does the vacancy stock begin to recover, and it takes unemployment and vacancies around 3 years to return to their long run values. The calibrated model finds that job destruction shocks drive unemployment volatility, while a relatively smooth job creation process explains the high serial persistence of unemployment.

Section 2 describes the model and Section 3 characterises its (Markov) equilibrium. Section 4 describes the calibration exercise and results. Section 5 uses the calibrated model to consider the impact of the Great Recession on US labour market outcomes, using layoffs [taken from JOLTS] as an explicit measure of job destruction shocks. Section 6 concludes and Section 7 contains the Data Appendix.

2 Model

We use a conventional equilibrium unemployment framework with discrete time and an infinite time horizon; e.g. Pissarides (2000). All firms and all workers are equally productive, all firms pay the same (Nash bargained) wage, and each worker-firm match survives until it is hit by a job destruction shock.

9 According to the model some unfilled jobs are also killed by the job destruction shock, but this effect is minor: the impulse response function when vacancies are not subject to job destruction shocks is qualitatively identical.

The only difference is that vacancies evolve as a stock variable with a less than perfectly elastic new vacancy creation process. Without free entry, however, the stocks of unemployment and vacancies become relevant aggregate state variables.

There is a finite, fixed measure F > 0 of firms who create vacancies. In every period, each firm has one new (independent) business opportunity. Given that opportunity, the firm compares its investment cost x against its expected return. Its expected return depends on the state of the aggregate economy at time t, denoted Ω_t, which is described in detail below. We let J_t = J(Ω_t) denote the expected return of a business opportunity in state Ω_t. The investment cost x is an idiosyncratic random draw from an exogenous cost distribution H. For tractability we assume this investment cost captures all of the idiosyncratic features associated with any given business venture - in other words, highly profitable opportunities correspond to low realised values of x. Should the firm decide to invest, it pays the upfront cost x and then holds an unfilled job with expected value J_t; i.e. each new investment generates one new vacancy. Following Diamond (1982), each firm invests in its business opportunity if and only if it has positive value; i.e. when x ≤ J_t. This requires no recall of a business opportunity should the firm not immediately invest in it. As investment occurs whenever x ≤ J_t then, at the aggregate level, i_t = F H(J_t) describes total period t new vacancy creation.

To describe how a firm fills a vacancy, we adopt the standard matching framework (without free entry). There is a unit measure of infinitely lived workers. All workers and firms are risk neutral and have the same discount factor 0 < β < 1. Workers switch between being employed and unemployed depending on their realised labour market outcomes. Each period is characterised by the number v_t of vacancies (currently unfilled jobs) and the number u_t of unemployed workers (so that 1 - u_t describes the number employed). The hiring process is frictional: the number m_t of new job-worker matches in period t is described by a matching function m_t = m(u_t, v_t), where m(.) is positive, increasing, concave and homogeneous of degree one. While unemployed, a job seeker enjoys per period payoff z > 0. In period t, each job-worker match produces the same market output p = p_t, where aggregate productivity p_t evolves according to an exogenous AR(1) process (described below). Job destruction is also an exogenous, stochastic process: δ_t describes the probability that any given job, either filled or unfilled, is destroyed.

In the event of a filled job being destroyed, the worker becomes unemployed and the job's continuation payoff is zero. This job destruction parameter, δ_t, also evolves according to an exogenous AR(1) process (described below).

We next describe the sequence of events within each period t. Each period has 5 stages:

Stage I [new realisations]: given (p_{t-1}, δ_{t-1}) from the previous period, new values of p_t, δ_t are realised according to

$$\ln p_t = \rho_p \ln p_{t-1} + \varepsilon_t$$
$$\ln \delta_t = \rho_\delta \ln \delta_{t-1} + (1 - \rho_\delta) \ln \bar{\delta} + \eta_t$$

where (ε_t, η_t) are white noise innovations drawn from the Normal distribution with mean zero and covariance matrix Σ, δ̄ > 0 is the long-run average job destruction rate, while long-run productivity is normalised to one;

Stage II [bargaining and production]: the wage w_t is determined by Nash bargaining. Production takes place so that a job match yields one period profit p_t - w_t while the employed worker enjoys payoff w_t. Each unemployed worker enjoys payoff z;

Stage III [vacancy investment]: firms invest in new vacancies i_t;

Stage IV [matching]: let u_t, v_t denote the stocks of unemployed job seekers and vacancies at the start of this stage. Matching takes place so that m_t = m(u_t, v_t) describes the total number of new matches;

Stage V [job destruction]: each vacancy and each filled job is independently destroyed with probability δ_t.

3 Markov Dynamics and Equilibrium.

This section describes the (Markov) equilibrium dynamics. Because u_t is defined as the number unemployed in period t immediately prior to the matching stage (stage IV), u_t evolves according to:

$$u_t = u_{t-1} + \delta_{t-1}(1 - u_{t-1}) - (1 - \delta_{t-1})m_{t-1} \qquad (1)$$

where m_{t-1} = m(u_{t-1}, v_{t-1}). The second term describes the stock of employed workers in period t-1 who become unemployed through a job destruction shock. The last term describes the match outflow, where such matches are also subject to the period t-1 job destruction shock.
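As a concrete illustration of Stage I, here is a minimal simulation sketch (our construction) of the joint AR(1) process for (p_t, δ_t) with correlated Normal innovations; all numerical values are placeholders rather than the calibrated parameters reported below.

```python
import numpy as np

def simulate_shocks(T, rho_p, rho_d, Sigma, delta_bar, seed=0):
    """Simulate {p_t, delta_t} from the Stage I laws of motion:
       ln p_t     = rho_p * ln p_{t-1} + eps_t
       ln delta_t = rho_d * ln delta_{t-1} + (1 - rho_d) * ln(delta_bar) + eta_t
    with (eps_t, eta_t) jointly Normal, mean zero, covariance Sigma."""
    rng = np.random.default_rng(seed)
    innov = rng.multivariate_normal([0.0, 0.0], Sigma, size=T)
    ln_p = np.zeros(T)                     # long-run productivity normalised to one
    ln_d = np.full(T, np.log(delta_bar))   # start at the long-run separation rate
    for t in range(1, T):
        ln_p[t] = rho_p * ln_p[t - 1] + innov[t, 0]
        ln_d[t] = rho_d * ln_d[t - 1] + (1 - rho_d) * np.log(delta_bar) + innov[t, 1]
    return np.exp(ln_p), np.exp(ln_d)

# Placeholder monthly parameters; the innovations are negatively correlated.
sd_p, sd_d, corr = 0.007, 0.05, -0.4
Sigma = np.array([[sd_p**2, corr * sd_p * sd_d],
                  [corr * sd_p * sd_d, sd_d**2]])
p, delta = simulate_shocks(T=600, rho_p=0.95, rho_d=0.85, Sigma=Sigma, delta_bar=0.034)
```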

The vacancy stock dynamics are given by

$$v_t = (1 - \delta_{t-1})[v_{t-1} - m_{t-1}] + i_t, \qquad (2)$$

where the first term describes those vacancies which survive (unfilled) from the previous matching event, while i_t describes new vacancy creation.

To determine equilibrium new vacancy creation i_t we restrict attention to Markov equilibria. Once (p_t, δ_t) are realised, define the intermediate stock of vacancies ṽ_t = (1 - δ_{t-1})[v_{t-1} - m_{t-1}], which is the number of surviving vacancies carried over from the previous matching event. When bargaining occurs in stage II, let Ω_t = {p_t, δ_t, u_t, ṽ_t} denote the corresponding state space. As described below, any standard Nash bargaining procedure yields a wage rule of the form w_t = w^N(Ω_t). Stage III then determines optimal investment i_t = i(Ω_t). As the matching and separation dynamics ensure Ω_t evolves as a first order Markov process, Ω_t is indeed a sufficient statistic for optimal decision making in period t.

We next characterise the Bellman equations describing optimal behaviour. In period t and at the start of stage II with state vector Ω_t (i.e. prior to production and matching but after new p_t, δ_t have been realised) let:

J_t = J(Ω_t) denote the expected value of a vacancy;
J^F_t = J^F(Ω_t) denote the expected value of a filled job;
V^U_t = V^U(Ω_t) denote the worker's expected value of unemployment;
V^E_t = V^E(Ω_t) denote the worker's expected value of employment.

Let E[· | Ω_t] denote the expectations operator given period t state vector Ω_t. The timing of the model implies the value functions J_t, J^F_t are defined recursively by:

$$J_t = -c + \beta(1 - \delta_t)E\left[\frac{m(u_t, v_t)}{v_t}J^F_{t+1} + \left(1 - \frac{m(u_t, v_t)}{v_t}\right)J_{t+1} \,\Big|\, \Omega_t\right] \qquad (3)$$

$$J^F_t = p_t - w_t + \beta(1 - \delta_t)E\left[J^F_{t+1} \,\big|\, \Omega_t\right]. \qquad (4)$$
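To fix the timing, here is a minimal sketch (our construction) of the laws of motion (1)-(2) under a Cobb-Douglas matching function, the functional form adopted in the calibration below. New vacancy creation i_t is held at a constant placeholder level, since its equilibrium determination requires solving the Bellman equations above for J_t; the parameter values are likewise placeholders.

```python
import numpy as np

def step(u, v, delta, i_next, A=0.5, gamma=0.6):
    """One period of the stock dynamics, equations (1) and (2).

    u, v   : unemployment and vacancies at the matching stage of period t-1
    delta  : job destruction probability delta_{t-1}
    i_next : new vacancy creation i_t
    """
    m = min(A * u**gamma * v**(1 - gamma), u, v)   # matches cannot exceed stocks
    u_next = u + delta * (1 - u) - (1 - delta) * m   # equation (1)
    v_next = (1 - delta) * (v - m) + i_next          # equation (2)
    return u_next, v_next

u, v = 0.07, 0.04
for _ in range(12):            # one year with a constant placeholder inflow
    u, v = step(u, v, delta=0.034, i_next=0.03)
print(round(u, 4), round(v, 4))
```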

The worker value functions are also defined recursively:

$$V^U_t = z + \beta E\left[V^U_{t+1} + (1 - \delta_t)\frac{m(u_t, v_t)}{u_t}\left(V^E_{t+1} - V^U_{t+1}\right) \Big|\, \Omega_t\right] \qquad (5)$$

$$V^E_t = w_t + \beta E\left[V^E_{t+1} + \delta_t\left(V^U_{t+1} - V^E_{t+1}\right) \Big|\, \Omega_t\right]. \qquad (6)$$

Because firms invest if and only if the business opportunity has cost x ≤ J_t, equilibrium new vacancy creation i_t = i(Ω_t) satisfies

$$i_t = F H(J_t), \qquad (7)$$

where J_t = J(Ω_t). Assuming workers have bargaining power φ ∈ [0, 1], the axiomatic Nash bargaining approach closes the model with

$$(1 - \varphi)\left[V^E_t - V^U_t\right] = \varphi\left[J^F_t - J_t\right].$$

Using the above equations, this condition determines the equilibrium wage w_t = w(Ω_t). The above thus yields a system of autonomous, first order difference equations determining (i) the evolution of Ω_t and (ii) the equilibrium value functions with corresponding investment rule i_t = i(Ω_t).

4 Calibration and Comparative Dynamics.

In what follows we first calibrate the model to the data considered in Shimer (2005). We then report 4 different calibration tests, one of which considers how well the model predicts out of sample; i.e. we re-examine the evolution of the US economy following the 2008/9 Great Recession.

As the framework is so standard, we adopt the calibration parameters described in Mortensen and Nagypal (2007). Specifically we assume each period corresponds to one month and a standard Cobb-Douglas matching function m = Au^γ v^{1-γ}. Table 1 describes the corresponding Mortensen/Nagypal parameter values. Note the Hosios condition is satisfied and so bargaining is efficient. As the productivity process for p_t implies its (long run) mean value equals one, surplus (1 - z)/z = 43% is large. The monthly discount factor implies an annual discount rate of 4%.

We next calibrate the stochastic process for {p_t, δ_t}. Rather than make an ad hoc restriction to no aggregate job destruction shocks, we calibrate this process to the findings in Shimer (2005).

Table 1: Mortensen/Nagypal Parameters

Parameter                                        Value
γ    elasticity parameter on matching function   0.6
φ    worker bargaining power                     0.6
z    outside value of leisure                    0.7
β    monthly discount factor                     0.9967

Figure 3 describes the magnitude of log-deviations in (i) job separation rates and (ii) labor productivity as computed for the Shimer (2005) data at business cycle frequencies.10

[Figure 3: U.S. Separation Rates and Labor Productivity. Quarterly series: Separation Rates and Labor Productivity.]11

As these data are only recorded quarterly while the model adopts a monthly time structure, we choose the autocorrelation parameters ρ_p, ρ_δ and covariance matrix Σ so that the implied process (p_t, δ_t), when reported at quarterly intervals, matches the first order autocorrelation and cross correlation implied by the data. Doing this yields:

10 We describe how Shimer (2005) measures the separation rate in the Data Appendix.
11 Both series are in logs as deviations from an HP trend with smoothing parameter 10^5.

Table 2: (p_t, δ_t) Stochastic Process

Parameter                                  Value
ρ_p     productivity autocorrelation        –
ρ_δ     separation autocorrelation          –
σ_p     st. dev. productivity shocks        –
σ_δ     st. dev. separation shocks          –
ρ_pδ    cross correlation                   –

As demonstrated in Figure 3 and consistent with the insights of Mortensen and Pissarides (1994), the job destruction innovations for this time period are strongly negatively correlated with the productivity innovations. The [exogenous] job separation process has much greater variance but is less persistent than the productivity process.

When calibrating a DMP framework, it is often found that the advertising cost c must be unrealistically large. This is typically explained by arguing that it also reflects previously sunk job creation investments. Here we take the converse case: we instead presume small advertising costs (c = 0), so that all job creation costs are tied to the ex-ante investment decision with x ~ H(.). Given the vacancy creation rule i_t = F H(J_t), we adopt the simplest, most parsimonious functional form

$$i_t = F J_t^{\xi} \qquad (8)$$

so that ξ describes the elasticity of new vacancy creation with respect to vacancy value. ξ = ∞ describes infinitely elastic new vacancy creation (analogous to the free entry case), while ξ = 0 implies perfectly inelastic (fixed) new vacancy creation.

The framework is further calibrated to fit the long run turnover means. To ensure comparability of results, we follow Shimer (2005), who argues that (i) the mean job separation probability should equal 3.4% per month, and (ii) the average duration of an unemployment spell is 2.2 months, so that the long run unemployment rate equals u = 7%. We also note the average duration of a vacancy is around 3 weeks (Blanchard and Diamond (1989)). The first restriction ties down δ̄ = 0.034 [the mean monthly job separation probability]. Depending on the choice of ξ, the latter two restrictions tie down A [the scale parameter on the matching function] and F [the scale parameter on the vacancy creation rule]. For example, the choice ξ = 1 [the distribution of investment costs is uniform] pins down corresponding values for A and F.
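The mapping from monthly AR(1) parameters to the quarterly moments targeted in Table 2 has no convenient closed form once time-aggregation is involved, so a simulation check along the following lines (our construction, with placeholder values) is one way to verify that a candidate (ρ_p, ρ_δ, Σ) reproduces the quarterly autocorrelations:

```python
import numpy as np

def quarterly_autocorr(x_monthly):
    """Aggregate a monthly series to quarterly averages and return
    its first-order autocorrelation."""
    T = len(x_monthly) - len(x_monthly) % 3
    q = x_monthly[:T].reshape(-1, 3).mean(axis=1)
    return np.corrcoef(q[:-1], q[1:])[0, 1]

# Placeholder monthly AR(1) for ln(delta_t); check its quarterly persistence.
rng = np.random.default_rng(1)
rho, T = 0.9, 120_000
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal(0.0, 0.05)
print(quarterly_autocorr(x))   # adjust rho until this matches the data target
```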

This leaves us with one free parameter, ξ, the elasticity of the job creation process. We estimate ξ using SMM, where for each chosen value of ξ we first update the parameter values (A, F) so that the generated data are consistent with the long run turnover means. Based on the data series for vacancies, unemployment, and market tightness in deviations from their HP filter trends, as used to generate Table 1 in Shimer (2005), the target moments are the standard deviations and correlations of unemployment, vacancies and market tightness. The estimation method uses the simulated method of moments as described in Ruge-Murcia (2012) with a Newey-West diagonal weighting matrix. The system is overidentified: we use 6 target moments to identify ξ.

4.1 Results

Column 1 [labelled Data] in Table 3 records the 6 data targets, which include the correlation of vacancies with unemployment [the Beveridge curve (BC)]. The remaining columns describe the corresponding statistics using model generated data. We begin with the final column, H/M, which for comparison records the model fit when using the calibration parameters described in Hagedorn and Manovskii (2008); i.e. the small surplus case with free entry. As is well known, it yields the Beveridge curve [BC] - vacancies and unemployment are strongly negatively correlated - but does not generate sufficient volatility in unemployment, vacancies and market tightness. The column H/M with JD augments the H/M specification with the above job destruction process. Adding separation shocks yields greater unemployment volatility [to around 3/4 of that observed in the data] but, consistent with the arguments in Shimer (2005), reduces the strong negative correlation between unemployment and vacancies.

The ξ = 1 column describes the results when the vacancy creation process is assumed unit elastic. For that choice, the model generates the right correlated behaviour but there is too little volatility. To fit the volatility targets, SMM identifies that the job creation process must be less than unit elastic; i.e. the estimated ξ = 0.265 < 1. Although ξ = 0.265 slightly overstates the negative correlation of unemployment and vacancies, the fit is otherwise near-perfect. The Data Appendix includes the full set of data moments as reported in Table 1 of Shimer (2005) and the corresponding table for the model with ξ = 0.265.
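Schematically, the SMM step described above minimises a weighted distance between the six target moments and their simulated counterparts. The sketch below is our stylised construction: `simulate_model` is a hypothetical stand-in for the full equilibrium solution (which re-calibrates (A, F) for each ξ), and the data moments shown are placeholders rather than the Table 3 entries.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def moments(u, v):
    """Six target moments: std devs and cross correlations of u, v, theta."""
    theta = v / u
    logs = [np.log(x) for x in (u, v, theta)]
    sds = [np.std(x) for x in logs]
    cors = [np.corrcoef(logs[i], logs[j])[0, 1] for i, j in ((1, 0), (2, 0), (2, 1))]
    return np.array(sds + cors)

def smm_objective(xi, data_moments, W, simulate_model):
    """Weighted SMM distance for a candidate elasticity xi."""
    u_sim, v_sim = simulate_model(xi)        # hypothetical model solver
    g = moments(u_sim, v_sim) - data_moments
    return g @ W @ g

# Placeholder data moments and an identity weighting matrix, for illustration only.
data_m = np.array([0.19, 0.20, 0.38, -0.89, -0.97, 0.97])
W = np.eye(6)
# res = minimize_scalar(smm_objective, bounds=(0.01, 5.0), method="bounded",
#                       args=(data_m, W, simulate_model))
```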

Table 3: Simulation Results

                      Data    ξ = 0.265    ξ = 1    H/M with JD    H/M
Standard Deviations
σ_u                    –          –          –           –          –
σ_v                    –          –          –           –          –
σ_θ                    –          –          –           –          –
Cross Correlations
corr(v, u) [BC]        –          –          –           –          –
corr(θ, u)             –          –          –           –          –
corr(θ, v)             –          –          –           –          –

Note: σ_x is the standard deviation of x, and corr(x, y) the cross correlation between x and y. Column 2 contains the quarterly moments from Table 1 of Shimer (2005). Column 3 reports the statistics from the estimated model. Column 4 is the simulated model with ξ = 1, column 5 the free entry model with the H/M calibration and job destruction shocks, and the last column the H/M calibration without job destruction shocks. To calculate the quarterly moments, models are first simulated at monthly frequency and then aggregated.

The rest of the paper now considers 4 calibration tests which identify why this framework yields such an improved fit.

4.2 Calibration Test 1: Market Tightness Dynamics

Market tightness θ_t = V_t/U_t determines how worker job finding rates vary over the cycle. The free entry approach yields a particularly useful simplification: equilibrium market tightness θ_t = θ(p_t, δ_t) is independent of unemployment U_t. A typical calibration approach identifies those assumptions whereby the elasticities of the model-generated θ(p_t, δ_t) match those found in the data. But conditional on (p_t, δ_t), the obvious empirical issue is whether market tightness θ(.) is indeed orthogonal to unemployment. A simple calibration test thus asks whether the model generated market tightness dynamics are consistent with reduced form estimates of θ(.). Column 1 [Data] in Table 4 reports the results of estimating the reduced form, log-linear statistical relationship

$$\log \theta_t = \alpha_0 + \alpha_1 \log p_t + \alpha_2 \log \delta_t + \alpha_3 \log U_{t-1}, \qquad (9)$$

on Shimer (2005) HP filtered data where, because market tightness is measured as V_t/U_t, we mitigate simultaneity issues by using last period's U_{t-1} as the conditioning variable.12 Estimated t-statistics are reported in brackets.

Table 4: Reduced Form Market Tightness Dynamics

Parameters            Data         ξ = 0.265      ξ = 1          H/M with JD
α_1 [productivity]    – (1.98)     0.96 (80.0)    2.07 (188)     13.0 (3823)
α_2 [JD δ_t]          – (-10.4)    – (-240)       – (196)        – (375)
α_3 [unemployment]    – (-26.0)    – (-1620)      – (-1114)      – (-1.4)

Note: Estimation results for reduced form equation (9), using Shimer (2005) data (column 2) and model generated data (columns 3-5). t-statistics are reported in brackets.

According to the data (column 1), market tightness is positively correlated with productivity and negatively correlated with job destruction shocks. But productivity shocks are barely significant [a t-statistic equal to 1.98] while market tightness is very strongly (negatively) correlated with unemployment [a t-statistic equal to -26]. Figure 2 in the Introduction clearly supports this view of the data. The final column [H/M with JD] reports the corresponding results for the free entry/small surplus approach, augmented with the above job destruction process. This framework instead implies the opposite scenario: small surplus implies market tightness is driven by productivity shocks and, conditional on (p_t, δ_t), market tightness is instead orthogonal to unemployment.

Although not a perfect match, ξ = 0.265 yields parameter estimates α_i which are broadly consistent with those identified on the data: market tightness is most highly [negatively] correlated with unemployment, though productivity and job destruction shocks also play significant roles. Surprisingly for the insights that follow, the data suggest job destruction shocks have an even greater impact on market tightness than that implied by the model.

12 Instead using log U_t finds the estimated productivity effects (α_1) become negative and insignificant, and there is an even stronger negative correlation between unemployment and measured market tightness.
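Equation (9) can be estimated by ordinary least squares on the filtered series. The sketch below is our construction, run on synthetic data in place of the Shimer (2005) series:

```python
import numpy as np

def estimate_eq9(log_theta, log_p, log_delta, log_U_lag):
    """OLS estimates and t-statistics for equation (9)."""
    X = np.column_stack([np.ones_like(log_p), log_p, log_delta, log_U_lag])
    beta, *_ = np.linalg.lstsq(X, log_theta, rcond=None)
    resid = log_theta - X @ beta
    dof = len(log_theta) - X.shape[1]
    s2 = resid @ resid / dof
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return beta, beta / se

# Synthetic illustration only.
rng = np.random.default_rng(2)
n = 200
lp, ld, lU = rng.normal(size=(3, n))
lth = 0.5 * lp - 0.7 * ld - 2.0 * lU + rng.normal(0.0, 0.1, n)
coef, tstats = estimate_eq9(lth, lp, ld, lU)
print(np.round(coef, 2), np.round(tstats, 1))
```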

4.3 Calibration Test 2: Serial Persistence.

An important criticism due to Shimer (2005) is that the standard search and matching framework generates insufficient persistence. Column 1 [Data] in Table 5 describes the serial autocorrelation of unemployment, vacancies and market tightness according to the (HP filtered) data. The remaining columns describe the corresponding parameter estimates based on model generated data.

Table 5: Estimated Serial Persistence [Quarterly Frequencies]

                     Data    ξ = 0.265    ξ = 1    H/M with JD
unemployment          –          –          –           –
vacancies             –          –          –           –
market tightness      –          –          –           –

Note: The Data column reports the autocorrelation of unemployment, vacancies, and market tightness from Table 1 of Shimer (2005). The second column contains the autocorrelations of these variables from the estimated model (not targeted in estimation); the rest are for the simulated model with ξ = 1 and the H/M calibration with job destruction shocks.

Recall that (p_t, δ_t) are exogenous stochastic processes. Measured at quarterly frequencies, the serial persistence parameters are ρ_p = 0.87 for productivity and ρ_δ = 0.73 for job destruction. The data (column 1) find that the endogenous variables unemployment, vacancies and market tightness follow much more persistent processes. H/M with JD does not generate sufficient persistence. The reason is very simple: θ = θ(p_t, δ_t), and a small surplus assumption implies market tightness is driven by productivity shocks, which have serial persistence ρ_p = 0.87, while market tightness is orthogonal to unemployment. With no feedback from unemployment to market tightness, Column 4 shows the free entry approach does not create any additional persistence: H/M with JD generates an unemployment process whose persistence is the same as that of the underlying productivity process, and so is too low.

With ξ = 0.265, Column 2 demonstrates that the serial persistence of (U_t, V_t, θ_t) is a near-perfect match to the data. This occurs because market tightness θ = θ(U, .) is instead strongly, negatively correlated with U, so that periods of high unemployment are characterised by below trend job finding rates which, in turn, increase the persistence of the high unemployment phase. Of course the equilibrium degree of persistence ρ_u is an endogenous outcome which depends on the propagation mechanism implied by the model. Before examining that propagation mechanism (see Section 5), we report our third calibration test.

4.4 Calibration Test 3: The Ins and Outs of Unemployment.

Shimer (2012) argues that the variation in unemployment is more highly correlated with variations in worker job finding rates than with job separation rates. The argument begins by noting that steady state unemployment satisfies

$$u = \frac{x}{x + f}$$

where x is the (steady state) exit rate of employed workers into unemployment and f the rate at which unemployed workers become employed. It turns out, for aggregate data, that the unemployment proxy

$$u^P_t = \frac{x_t}{x_t + f_t},$$

where x_t is the period t exit rate and f_t the job finding rate, is a surprisingly good approximation for actual unemployment u_t. Assuming this is a valid proxy, this variable can then be further decomposed into job separation effects (variations in x_t) and job finding effects (variations in f_t). For example, putting x_t = x̄, the sequence x̄/(x̄ + f_t) describes the variation in u^P_t due solely to variations in f_t. Similarly, x_t/(x_t + f̄) describes the variation in u^P_t due to variations in x_t. Shimer (2012) defines the contribution of the job finding rate to variations in unemployment as the covariance of u_t and x̄/(x̄ + f_t) divided by the variance of u_t. Column 1, Table 1 in Shimer (2012) reports that 77% of the variation in unemployment is explained by variations in the job finding rate f_t, while variations in the job separation rate x_t instead explain only 24%.13
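In code, the decomposition amounts to two covariance ratios. The following is a minimal sketch under our reading of Shimer (2012), using synthetic (not CPS) series:

```python
import numpy as np

def ins_outs_decomposition(u, x, f):
    """Shimer (2012) variance decomposition of unemployment.

    beta_f: contribution of job finding variation, cov(u, xbar/(xbar+f)) / var(u)
    beta_x: contribution of separation variation,  cov(u, x/(x+fbar))   / var(u)
    """
    xbar, fbar = x.mean(), f.mean()
    u_f = xbar / (xbar + f)      # variation due solely to f_t
    u_x = x / (x + fbar)         # variation due solely to x_t
    var_u = u.var()
    beta_f = np.cov(u, u_f)[0, 1] / var_u
    beta_x = np.cov(u, u_x)[0, 1] / var_u
    return beta_f, beta_x

# Synthetic illustration with plausible monthly magnitudes.
rng = np.random.default_rng(3)
x = 0.034 + 0.006 * rng.standard_normal(300)
f = 0.45 + 0.08 * rng.standard_normal(300)
u = x / (x + f)
print(ins_outs_decomposition(u, x, f))
```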

In this way it is argued that job separation shocks play a minor role in explaining unemployment volatility.

We repeat this decomposition on model-generated data with ξ = 0.265.14 For these data, we also find the proxy variable u^P_t = x_t/(x_t + f_t) is very highly correlated with u_t. Computing the same statistics, we also find that job finding variations, x̄/(x̄ + f_t), explain 77% of the unemployment variation, though job separation variations x_t/(x_t + f̄) explain a slightly smaller 21%. The simulated data are thus entirely consistent with the Shimer (2012) decomposition. Nevertheless, as we now show, volatility in the model is driven by job destruction shocks.

5 How Important are Job Destruction Shocks in Explaining Unemployment Variation?

The above has established that our framework not only provides a near perfect match to the Shimer (2005) data, it is entirely consistent with the Shimer (2012) decomposition of the ins and outs of unemployment. The interesting question then is how important are job destruction shocks in explaining unemployment volatility? The answer is: most of it. Assuming no job destruction shocks (δ_t = δ̄), recalibrating the productivity process appropriately and re-estimating ξ only generates volatility σ_u = 0.05, a quarter of that observed in the data.15 Of course this should not be surprising given that productivity shocks are small and the Mortensen and Nagypal (2007) parameterization does not assume small surplus. Given standard arguments, however, the puzzle is: how can a DMP framework, where unemployment volatility is largely driven by job destruction shocks, be consistent with the Beveridge curve and the Shimer (2012) decomposition?

Consider Figures 4 and 5, which describe the impulse response functions of unemployment and vacancies to a single separation innovation at date zero (holding productivity fixed, p_t = 1).

13 See Figures 4 and 6 in Elsby et al (2009) and Table 1 in Fujita and Ramey (2009) for alternative estimates.
14 With x_t ≡ δ_t and f_t ≡ (1 - δ_t)m(θ_t).
15 With no job destruction shocks and to maximise volatility, SMM finds ξ is arbitrarily large; i.e. it rediscovers the free entry assumption. H/M then raises volatility further by assuming small surplus.

[Figure 4: Impulse Response of Unemployment to a Separation Shock. Series: unemployment with ξ = 0.265 and unemployment under H/M with JD; horizontal axis in months.]

The impulse response function labeled [H/M with JD] describes the impulse response of unemployment for the case of free entry and small surplus. That shock yields a relatively small increase in unemployment which quickly recovers to its steady state value. The impulse response for ξ = 0.265 instead generates a much higher unemployment peak and much greater persistence. Figure 5, which describes the corresponding impulse response function for vacancies, reveals why.

[Figure 5: Impulse Response of Vacancies to a Separation Shock. Series: vacancies with ξ = 0.265 and vacancies under H/M with JD; horizontal axis in months.]

Free entry with small surplus [H/M with JD] implies vacancies increase given a rise in unemployment. This vacancy response ensures unemployment quickly recovers to its long run steady state value and the model demonstrates no added persistence (the persistence observed in this impulse response function is due to the separation process being an AR(1) process with ρ_δ = 0.73).
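Impulse responses like those in Figures 4 and 5 can be traced out by comparing a shocked simulation path against a baseline path. The sketch below is our construction: it reuses the laws of motion (1)-(2) with a constant placeholder vacancy inflow rather than the equilibrium rule i_t = F J_t^ξ, so it illustrates the mechanics of the exercise, not the paper's calibrated responses.

```python
import numpy as np

def irf_separation_shock(T=48, delta_bar=0.034, rho_d=0.73,
                         shock=0.5, A=0.5, gamma=0.6, i_const=0.03):
    """Percent deviations of (u, v) after a one-off separation innovation.

    The shocked path raises ln(delta) by `shock` at t=0 and lets it decay
    at rate rho_d; vacancy creation is held at a constant placeholder level.
    """
    def simulate(d_path):
        u, v = 0.07, 0.04
        us, vs = [], []
        for d in d_path:
            m = min(A * u**gamma * v**(1 - gamma), u, v)
            u, v = u + d * (1 - u) - (1 - d) * m, (1 - d) * (v - m) + i_const
            us.append(u); vs.append(v)
        return np.array(us), np.array(vs)

    base = np.full(T, delta_bar)
    shocked = delta_bar * np.exp(shock * rho_d ** np.arange(T))
    u0, v0 = simulate(base)
    u1, v1 = simulate(shocked)
    return 100 * (u1 / u0 - 1), 100 * (v1 / v0 - 1)

du, dv = irf_separation_shock()
print(np.round(du[:3], 2), np.round(dv[:3], 2))
```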

This adjustment process also implies unemployment and vacancies covary positively, and so is inconsistent with the Beveridge curve. To avoid this outcome, the standard approach typically imposes zero job destruction shocks.

In contrast, with ξ = 0.265, Figure 5 demonstrates that the vacancy stock instead falls as unemployment increases.16 The job destruction shock not only destroys some vacancies, it generates a rising tide of unemployed workers, some of whom quickly re-match with the existing vacancy stock. Of course free entry (perfectly elastic vacancy creation) would imply all such vacancies are instantaneously replaced. But without perfectly elastic vacancy creation, oversampling of the vacancy stock by newly laid-off workers causes the vacancy stock to fall. Unemployed worker job finding rates then plummet as the increasing number of unemployed workers chase the relatively few vacancies which remain in the vacancy stock. In this way, the Beveridge curve naturally arises given a job destruction shock.

5.1 Calibration Test 4: The 2008/9 Great Recession.

Figure 2 in the Introduction describes how (i) unemployment U_t, (ii) market tightness θ_t = V_t/U_t, (iii) layoffs and (iv) productivity evolved across and subsequent to the layoff spike of 2008/9. With ξ = 0.265, Figure 6 describes the impulse response of these same variables following a single [unfavourable] separation shock. The separation process has serial persistence ρ_δ = 0.73 and is much less persistent than the unemployment process it generates. The propagation mechanism yields a market tightness process which is highly [negatively] correlated with unemployment, and it is this property of the model - that job finding rates are negatively correlated with unemployment - which yields such high serial persistence.

16 The initial iterations in Figure 5 are affected by the assumed timing of the model: unemployment and vacancies are measured at stage III just prior to matching, while job destruction occurs at stage V. For example, with free entry, a higher job destruction rate δ_t (which is known at stage I) reduces stage III market tightness. Because stage III unemployment is on trend for the first iteration, lower market tightness then implies vacancies fall below trend for the first iteration (but subsequently increase as unemployment increases). Conversely, for the case ξ = 0.265, new vacancy creation i_t is always above trend (which ensures the unemployment stock eventually returns to trend). For the first two iterations the stock of vacancies (measured at stage III prior to job destruction) is slightly above trend. But steeply increasing unemployment and oversampling then cause the vacancy stock to plummet, where the vacancy stock begins to recover only when unemployment falls below its peak.

The underlying reason, according to the model of course, is that the job creation process is relatively inelastic [as opposed to infinitely elastic when there is free entry].

Table 6: Simulation Results for JOLTS Calibration

                      Data    ξ = 0    ξ = 0.265
Standard Deviations
σ_u                    –        –          –
σ_v                    –        –          –
σ_θ                    –        –          –
Cross Correlations
corr(v, u) [BC]        –        –          –
corr(θ, u)             –        –          –
corr(θ, v)             –        –          –

Note: In the Data column, u is unemployment constructed by the BLS from the CPS, v is job openings from JOLTS, and θ = v/u. The data are quarterly averages over the sample period. All statistics are calculated for HP filtered (λ = 10^5) series. σ_x is the standard deviation of x and corr(x, y) the cross correlation between x and y.

The interesting calibration test, then, is to identify the extent to which this framework is consistent with the data over this later period, but instead using the JOLTS layoff series as a more direct measure of the job destruction process. Although the layoff spike may not be consistent with a stationary AR(1) process, we repeat the above methodology: (p_t, δ_t) are assumed to follow a joint AR(1) process and we reset the autocorrelation parameters ρ_p, ρ_δ and covariance matrix Σ to match the data. Doing this implies ρ_p = 0.95 and ρ_δ = 0.89, with σ_p, σ_δ reset accordingly and cross-correlation ρ_pδ = 0.38 at monthly frequencies.

Column 1 [Data] in Table 6 reports the updated 6 targets. Column 3 records the corresponding statistics using model generated data with ξ = 0.265 as previously estimated. Not surprisingly, given the close match of the impulse response function [Figure 6] to the data [Figure 2], this specification provides a near perfect fit of the correlations [and persistence] of unemployment, vacancies and market tightness. For this time period, however, ξ = 0.265 yields too little unemployment volatility.

[Figure 6: Impulse Response of U_t and θ_t to a Separation Shock, δ_t, with ξ = 0.265. Series: Separation, Unemployment, Market Tightness; horizontal axis in months.]

Because more inelastic vacancy creation rates slow down the recovery rate of the economy following an unfavourable layoff shock, SMM instead estimates the polar case ξ = 0: perfectly inelastic vacancy creation rates. This in part reflects Figure 1 in the Introduction, which shows that hires merely reverted to trend following the layoff spike - the so-called jobless recovery. Yet even with this polar case, the model does not generate the large observed volatility in unemployment.

6 Conclusion.

When vacancies evolve as a stock variable, job destruction shocks in the Diamond/Mortensen/Pissarides framework generate Beveridge curve dynamics. This fundamentally changes the calibration strategy, for it is no longer necessary to impose zero (or sufficiently small) aggregate job destruction shocks to be consistent with the Beveridge curve. Instead, by calibrating the job destruction process directly to that identified in Shimer (2005) and using SMM to identify the job creation process which best fits a set of target moments, this framework identifies both a near perfect fit of the Shimer (2005) data and a clear dynamic structure. In contrast to current methodology, the results find that unemployment volatility is driven by the job destruction process. To explain why unemployment is so persistent, the results find the job creation process is instead relatively inelastic. The results are fully consistent with unemployed worker job finding rates (market tightness) being highly volatile and strongly negatively correlated with unemployment.

The paper identifies important new avenues for future research: to explain and re-assess the properties of job destruction and job creation over the cycle.

Mortensen and Pissarides (1994) has already identified how aggregate productivity shocks can endogenously generate large job destruction shocks. Given recent events, however, it would seem reasonable to allow that financial [or credit] shocks might generate large job destruction outcomes; e.g. Jermann and Quadrini (2012), Chodorow-Reich (2014), Boeri et al (2015). For example, Bentolila et al (2015) show (for Spain) that firms with credit relationships tied to banks facing severe liquidity problems were much more likely to downsize or go out of business. A standard Cholesky decomposition on our calibrated model (with ξ = 0.265) finds that pure productivity shocks generate 47% of the unconditional variance of unemployment (mainly through their impact on the job destruction channel). Conversely, pure job destruction shocks, orthogonal to productivity shocks, are instead responsible for 53% of observed unemployment volatility.17

The smooth job creation process is modelled here as a frictional search process for new business opportunities: it takes time for entrepreneurs to identify and implement new technologies and so create new jobs. Although highly stylised, an early version of this paper shows that new business start-up activity is indeed largely acyclical [Coles and Moghaddasi (2011)]. There are, however, other macro-mechanisms which potentially explain why job creation rates are relatively inelastic. For example, because productivity shocks are small then, without a small surplus assumption, job creation rates are automatically relatively inelastic. Also, if financial shocks are important, then a credit crisis which triggers large job destruction rates is likely to suppress job creation rates in the subsequent high unemployment phase. Standard macro-arguments such as aggregate demand externalities [so that demand is low when unemployment is high] or sticky wages [so that wages do not adjust in the recession] might also explain why job creation rates do not surge following a job destruction shock.18

On-the-job search also potentially explains why economic recovery is slow for, on average, around 40% of jobs filled are taken by a worker who is already employed. Thus around 40% of new jobs created are matched by a job destroyed elsewhere and there is zero net job creation. Indeed Figure 1 demonstrates that when one considers total job separations (layoffs plus job-to-job quits), total separations closely track total hires, to the point that period-to-period changes in unemployment are small relative to gross hire and job separation flows.

17 There is some work on the impact of financial market failures on frictional labor markets, though with free entry as a maintained assumption; e.g. Hall (2014) and Eckstein et al (2014).
18 See also Mitman and Rabinovich (2016), who argue that because US policy increases the duration of unemployment benefits in a recession, this reduces surplus and so reduces job creation rates.

Extending the on-the-job search approach with endogenous hiring to a stochastic, business cycle framework is complex, but there has been important recent progress; e.g. Menzio and Shi (2011), Robin (2011), Moscarini and Postel-Vinay (2013, 2016), Coles and Mortensen (2016), Lise and Robin (2016).

Finally, it should be noted that temporary layoffs are an important phenomenon in the U.S.; e.g. Feldstein (1976), Fujita and Moscarini (2015). Although this phenomenon does not imply the layoff shock of 2008/9 was a small event, undoubtedly some laid-off workers were eventually re-employed by their previous employer once economic conditions improved. This process is an example of search with memory, a process first considered in Carrillo-Tudela et al (2011), where Carrillo-Tudela and Smith (2017) show that many laid-off workers indeed become re-employed with a previous employer. These issues are not inconsistent with our framework, for the implied density of hiring costs, F H'(c) = Fξc^{ξ-1}, is steeply decreasing in c. Just as the hiring cost of replacing a worker who quits may be relatively small, it may also be relatively costless to re-employ previously laid-off employees in a recovery. But once such low cost employment opportunities have been exhausted, further expansion opportunities through the profitable investment in and implementation of new technologies would not appear limitless.

Appendices

A Job Separation Measures

Shimer (2005, 2012) uses data on employment, short term unemployment, and the hiring rate to measure the separation rate. To correct for time-aggregation bias he uses the formula

$$u^s_{t+1} = s_t e_t \left(1 - \tfrac{1}{2} f_t\right)$$

where u^s_t is the number of workers unemployed for less than one month, e_t is the number of employed workers in month t, and f_t is the computed job finding rate in a two-state labor market (no transitions into or out of non-participation). An unemployed worker who just loses his job has half a month, on average, to find a job.
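As a minimal illustration (our construction), the formula above can be inverted to recover s_t from counts of short-term unemployment, employment and the job finding rate:

```python
def separation_rate(u_short_next, e, f):
    """Invert u^s_{t+1} = s_t * e_t * (1 - f_t / 2) for s_t.

    u_short_next : workers unemployed less than one month in t+1
    e            : employment in month t
    f            : month-t job finding rate
    """
    return u_short_next / (e * (1.0 - 0.5 * f))

# Example with illustrative magnitudes (thousands of workers).
print(round(separation_rate(u_short_next=3500, e=140_000, f=0.45), 4))
```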

To correct for aggregation bias, Shimer (2012) computes continuous time separation rates according to

$$u_{t+1} = \frac{\left(1 - e^{-(f_t + s_t)}\right)s_t}{f_t + s_t}(u_t + e_t) + e^{-(f_t + s_t)}u_t.$$

Alternatively, Elsby et al (2009) impute discrete time, weekly hazard rates which are fairly similar to Shimer's continuous time measures. In a similar exercise, Fujita and Ramey (2009) reconstruct CPS gross flows data but control for margin error.

B Complete Results

To make the comparison easier, below we report Table 1 in Shimer (2005) and our SMM results.

Table 7: Summary Statistics, Quarterly U.S. Data, from Shimer (2005), page 28.

                             u     v    v/u    f     s     p
Standard Deviations          –     –     –     –     –     –
Quarterly autocorrelation    –     –     –     –     –     –
Correlation matrix
  u                          1     –     –     –     –     –
  v                                1     –     –     –     –
  v/u                                    1     –     –     –
  f                                            1     –     –
  s                                                  1     –
  p                                                        1