
Fourteen percent vacancy rate! Ninth biggest biotech market! Fastest growing!
We see these headlines every day, and data is important to any AECRE firm. Sources abound, from quick hits to detailed reports. But should you believe what you see?
I say be a skeptic. Filter everything you hear.
Here are my top 11 reasons why, in no particular order.
1. Everything you read is past tense. Today's report might cover a study from October that used data from July about a year ending in 2022. This is history. Even projections are based on old data, and even "real time" info is old an hour later.
2. Comparisons can be apples and oranges. Those "Downtown Recovery" stats about cities' activity levels vs. 2019 make a valiant effort to use parallel areas, but they aren't perfect. This impacts how cities score and rank. Seattle's map includes Pioneer Square but not the core Amazon towers.
3. Sometimes data has unclear parameters. Is that office vacancy rate defined geographically, and is it clear whether sublease space is included? What about flex space?
4. Even if the parameters are clear, the data might not match. Sometimes the more tightly the topic is defined, the harder it is to find information that aligns. This frequently shows in news and reports. Maybe the census tracts or block groups didn't quite match the intended geography.
5. Headlines often differ from the story. A great article about a great study can be tainted by a clickbait headline. That "busiest" sector might just be fastest growing, for example. Read carefully, and download the study too.
6. Any trend from 2020 to 2023 has to be considered temporary and/or uncertain. The common flight to remote mountain towns is old news, for starters, and has often reversed. We still don't know what's "normal" with office demand or business travel.
7. Sometimes the data isn't relevant to the point. It's been written that San Antonio deserves more pro sports because it's the "seventh largest US city." But that's due to mass annexation of suburban areas. It's the 24th largest extended metro (CSA) and 31st media market (or it was recently).
8. The same topic can be looked at in different ways for different results. Consider the "tech workforce." One study might count every person at every tech company, including the kitchen staff. Another might count jobs using employment data, only including the techiest titles. Some might include life science, digital graphics, or aerospace engineering. This is why the results vary.
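To see how much the definition drives the total, here's a minimal sketch with entirely made-up companies, titles, and headcounts (every name and number below is hypothetical, just to illustrate the point):

```python
# Hypothetical payroll records: (company, title). All data invented.
jobs = [
    ("AcmeSoft", "Software Engineer"),
    ("AcmeSoft", "Data Scientist"),
    ("AcmeSoft", "Line Cook"),
    ("AcmeSoft", "Recruiter"),
    ("AcmeSoft", "Office Manager"),
    ("BioGenix", "Software Engineer"),
    ("BioGenix", "Lab Technician"),
    ("CityBank", "Software Engineer"),
]

# Definition A: count every person at a "tech company," kitchen staff included.
TECH_COMPANIES = {"AcmeSoft"}
by_company = sum(1 for company, _ in jobs if company in TECH_COMPANIES)

# Definition B: count only the "techiest" job titles, at any employer.
TECH_TITLES = {"Software Engineer", "Data Scientist"}
by_title = sum(1 for _, title in jobs if title in TECH_TITLES)

print("By company:", by_company)  # 5 — includes the line cook and recruiter
print("By title:", by_title)      # 4 — includes the bank's engineer
```

Same eight records, two defensible "tech workforce" numbers. Neither study is lying; they're answering different questions.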
9. The data might be incomplete. Brokerage reports can represent a fraction of the actual commercial inventory, particularly for retail and hotels. Some apartment tallies omit buildings under 50 units. Literally nobody has a list of every building or every tenant. (Imagine your marketing database, but PMs can opt out and it's for the whole country!)
10. Any second-hand data carries a risk of human error. We can all read from the wrong column, add poorly, or misunderstand the point. I'm as skeptical of my own data as anyone else's. So double-check when possible, and trust your instincts if something looks wrong.
11. Many sources are opinion pieces. They're trying to persuade you or rile you up, and their information might be cherry picked or false. This is true of a lot of "news" media, industry clickbait, and anything produced by an advocacy group.
There's a lot of good stuff out there. But tread—and read—carefully!