By Robert Hoekstra, Technical Expert, Program Analyst, Performance Management and Data Reporting Unit, Office of Trade Adjustment Assistance


In January and February, we completed our first round of the TAA Administrative Collection of States (TAAACS) after a long process of drafting, revising, and clearance, with the goal of enabling discrete, quantifiable comparisons between states.  The initial collection prompted numerous conversations with states, which helped us understand how states were interpreting the questions and refine their meanings.

Once we received the individual sheets and reviewed them for completeness and consistency, I wrote a quick macro to pull the data into one aggregated dataset for analysis and then integrated it with our participant dataset.  For the analysis of staffing levels, I also added a population count so that we could compare relative state sizes.
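The aggregation step can be sketched in Python (the actual work was a spreadsheet macro; the state codes, question IDs, and values below are hypothetical placeholders, not TAAACS data):

```python
# Illustrative sketch: pivot per-state question/response pairs into one
# aggregated record per state, ready to join against a participant dataset.
# State codes, question IDs, and values are hypothetical.
state_sheets = {
    "AL": [("Q1_staff_count", "12"), ("Q2_structure", "centralized")],
    "WY": [("Q1_staff_count", "3"), ("Q2_structure", "decentralized")],
}

def aggregate(sheets):
    """Build one row per state, one column per question."""
    records = []
    for state, rows in sheets.items():
        record = {"state": state}
        record.update(dict(rows))  # each question becomes a column
        records.append(record)
    return records

dataset = aggregate(state_sheets)
```

The same row-per-state shape makes it straightforward to merge on the state field with participant-level summaries or a population count.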

Some of the initial breakdowns I had envisioned as useful turned out not to be, because states were not as dissimilar as I had thought.  For example, I was very curious to see how centralized and decentralized states might differ, but with only five states indicating they were decentralized, the analysis wasn't meaningful.  Many of the items in the collection involve rankings where blanks indicated that a category didn't apply; these needed to be translated into meaningful categories (primary, secondary, not used), or a value needed to be filled in for non-uses when computing average values.  There are also a number of categories where we provided "mixed" or "other" options, which let states that don't cleanly fit into a category respond but also complicate the analysis.

The biggest issue, though, was the multitude of ways to look at the data: do we just want to talk about differences between states?  Do we want to tie the data to performance, participation levels, or TAADI?  In the end, the analysis included places where the relationship had a logical tie and we actually saw some difference.  Many things were not related; for example, we saw no difference in performance based on state expertise and none based on system age (which may just be too broad a measure).  I also included many small notes in the discussion beyond what was in the slides.  Some of the highlights included:

  • Some states are significant outliers in their staffing numbers, or serve a disproportionate number of trade-affected workers relative to their population;
  • Case manager expertise is related to higher TAADI results on case management measures;
  • Determining eligibility with wage records but without employer identification records is associated with twice the rate of needing to amend certifications;
  • TAA units determining eligibility are tied to higher notification acknowledgement rates, higher rapid response rates, and lower training rates as compared to TRA units;
  • Not having a common exit policy is related to more reporting staff, more systems, and co-enrollment reporting not backed up by services;
  • Co-enrollment policies matter a great deal: requiring a referral yields a 32% higher co-enrollment rate than letting case managers decide when a referral is needed;
  • States that file petitions based on WARNs and other identified layoffs reach workers earlier and see much higher take-up rates than states that identify workers only when they seek services;
  • Using intermediaries (such as unions or employers) and phone calls as primary eligibility notification methods is associated with the highest acknowledgement and application rates, with email, texts, and newspapers showing good results as secondary methods;
  • Having Rapid Response in the same department as TAA improves Rapid Response rates by nearly 10%;
  • Having fiscal units in the same department (33 states) improves expenditure matching rates (70% vs. 30% on TAADI Training Expenditures);
  • Integration with WIOA Title 1 is associated with higher training and employment rates;
  • Discussing Job Search and Relocation at training completion substantially increases utilization.
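The blank-to-category translation described earlier for ranking items can be sketched as follows (the ranking item, state codes, ranks, and fill rule below are illustrative assumptions, not actual collection data):

```python
# Hypothetical ranking item: states ranked a notification method 1..N,
# leaving a blank (None) when the method was not used.
phone_rank_by_state = {"AL": 1, "GA": 2, "WY": None, "TX": 1}

def to_category(rank):
    # Translate a raw rank into the categories used for comparison.
    if rank is None:
        return "not used"
    return "primary" if rank == 1 else "secondary"

categories = {s: to_category(r) for s, r in phone_rank_by_state.items()}

# When averaging ranks, blanks need a filled-in value (here one worse than
# the worst observed rank) so non-use pulls the average toward "worse"
# rather than being silently dropped.
observed = [r for r in phone_rank_by_state.values() if r is not None]
fill = max(observed) + 1
avg_rank = sum(r if r is not None else fill
               for r in phone_rank_by_state.values()) / len(phone_rank_by_state)
```

The choice of fill value is itself an analytical decision; any constant worse than the observed ranks preserves the ordering but shifts the averages.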

I am very curious about what states want to see more of and encourage them to share on the Discussion forum.  I know we need to dig deeper into how state processes for obtaining worker lists and notifying workers translate into take-up rates, and to better understand how making the TAA Program more accessible (such as through participant portals) relates to participant utilization and outcomes.

I also plan to integrate county data on where participants reside and to format the data for analysis using multivariate regression techniques as well as machine learning, to best identify which practices are tied to better participant outcomes.
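A minimal sketch of what that regression formatting might look like, using NumPy with entirely synthetic placeholder data (the practice indicators, coefficients, and sample size are illustrative assumptions, not findings):

```python
import numpy as np

# Synthetic example: regress a participant outcome rate on binary
# practice indicators, one row per state. All numbers are placeholders.
rng = np.random.default_rng(0)
n_states = 40
X = np.column_stack([
    np.ones(n_states),                 # intercept
    rng.integers(0, 2, n_states),      # e.g., referral-required policy (0/1)
    rng.integers(0, 2, n_states),      # e.g., Rapid Response co-located (0/1)
]).astype(float)
true_beta = np.array([0.50, 0.10, 0.05])          # made-up effects
y = X @ true_beta + rng.normal(0, 0.02, n_states)  # simulated outcome rate

# Ordinary least squares estimates of the practice effects.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

In practice the design matrix would come from the aggregated state dataset joined with participant outcomes, and machine-learning variants would swap the OLS fit for a different estimator over the same state-by-practice layout.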

My appreciation for states in this process cannot be overstated.  Because of your help and engagement with this process, we all got the opportunity to learn and rethink some of our assumptions about the program.