
Burning Questions: How Did Central Track Come Up With Its List of the 50 Best Burgers in Dallas?

Stackhouse burger (photo by Kevin Marple)

Last week, Central Track published a list of the 50 Best Burgers in Dallas. The post went viral via Twitter and Facebook, and everyone was touting its sincerity. Sweetness. I bookmarked it for later.

It’s a genius idea to make a list from everyone else’s list… as long as you do it right. When I got a good look at the list on Friday, something seemed off about it, but I couldn’t put my finger on what was what. (Could’ve been that my brain was still lost in July 4 or something.) I’ll admit I was a lazy sonovagun on Friday. I couldn’t deduce how Central Track assigned points to each burger joint, so I gave up after thirty minutes. I managed to send one email out to Central Track before I called it quits. No word yet on how the team did it. So, smart SideDishers, let’s see if we can solve this mystery together. How did Central Track come up with a list of the 50 Best Burgers in Dallas? Was it arbitrary? Or was it scientific? I lean toward the former.

Let’s investigate.

Exhibit A: A vague explanation of the vague methodology at the top of the article.

“Over the past few weeks, we’ve sought out as many of these top burger lists and “Best Of” honors in Dallas as we could find and started compiling them into a weighted list, assigning more value to the burger joints at the top of each list than the ones in the middle of the pack. And more to the middle-of-the-pack ones than the bottom-dwellers. And so on, and so on.

Our opinion on the matter isn’t included at all in this equation. Sure, we love a delectable, juicy burger as much as everyone else. But, considering the sheer number of lists already put out by the likes of D Magazine, the Fort Worth Star-Telegram, the Dallas Observer and others (14 lists in total made it into our formula, and you can find links to each of them at the bottom of this post), we just didn’t feel it necessary to try to get our own voice to rise above the chorus. Instead, we wanted to come up with a consensus — or as close to one as we could.”

Burning Question #1: How were points assigned to burger joints in lists that are unranked? Six out of the 14 source lists don’t decrescendo from “best to worst” burgers. This includes the Dallas Observer’s, D Magazine’s, CultureMap’s, CraveDFW’s, Eater’s, and Metro’s.

Burning Question #2: How did Central Track account for older lists that leave out newer joints, like Hopdoddy? The Texas Monthly list from April 2009 doesn’t include Maple & Motor and Hopdoddy because they didn’t exist in Dallas back then. This means newer burger joints are at a disadvantage and naturally assigned fewer points. (Other lists that predate Hopdoddy include the Dallas Observer’s, D Magazine’s, and the Fort Worth Star-Telegram’s.)

MOST Burning Question #3: HOW DID CENTRAL TRACK COME UP WITH THIS LIST?? What was the experimental setup??

I need an explanation, please.


[Update 3:46 p.m.] Pete Freedman has sent me all the info I needed. If you want to see the actual spreadsheet, you can email him at pete@centraltrack.com. I’ve been emailing him back and forth for the past couple of hours to get to the bottom of this list, and here’s the exchange:

Rules for Unordered Lists:

  • For lists that mention 5 or fewer places (the most selective): Each burger place will get 30 points.
  • For lists of 6-10 places (somewhat selective): Each burger place will get 25 points.
  • For lists of 11-15 places (less selective): Each burger place will get 20 points.

Rules for Ordered Lists:

  • The number one spot got 32 points (that’s the length of our longest list), number two got 31, and so on down to the last entry on that list.
  • Any honorable mentions left after the numbered list all got the same amount, which cannot be higher than 20. If the numbered portion of the list is longer than 20 entries, the honorable mentions instead get the next value down from the last numbered burger’s score.
  • Exceptions and clarifications:

– D Magazine’s list: The selections were treated like an unordered list. The Best D pick got 30 points, and the Reader’s Choice pick also got 30 points.

– Texas Monthly: Since this list looks at all of Texas and we only looked at DFW locations, the list was treated as an ordered list out of 7 (the number of DFW locations).
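For anyone who wants to play along at home, here’s a rough Python reconstruction of the rules above. The point values and tier cutoffs come straight from Pete’s email; the function names, the data shapes, and my reading of the ambiguous honorable-mention rule are my own, so treat this as a sketch rather than Central Track’s actual spreadsheet.

```python
from collections import Counter

def score_unordered(places):
    """Every place on an unranked list gets the same flat score,
    based on how selective the list is (per Pete's tiers)."""
    n = len(places)
    if n <= 5:
        points = 30   # most selective
    elif n <= 10:
        points = 25   # somewhat selective
    else:
        points = 20   # less selective (11-15 places)
    return {place: points for place in places}

def score_ordered(ranked, honorable_mentions=()):
    """#1 gets 32 points (the length of the longest source list),
    #2 gets 31, and so on down the ranking."""
    scores = {place: 32 - i for i, place in enumerate(ranked)}
    # My reading of the honorable-mention rule: a flat score,
    # capped at 20 and never above the last ranked score.
    hm_points = min(20, 32 - len(ranked))
    for place in honorable_mentions:
        scores[place] = hm_points
    return scores

def aggregate(all_list_scores):
    """Sum each joint's points across every source list,
    then sort from highest total to lowest."""
    totals = Counter()
    for scores in all_list_scores:
        totals.update(scores)
    return totals.most_common()
```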

My response to his system:

Hi Pete,

Thanks for giving me access to the spreadsheet.

I’ve looked it over and I still have two main issues with the methodology, even though I do think it is close.

1. The arbitrary point system. Why 30 points for an exclusive list, then 25 and 20? The ordered scoring is just as arbitrary. Why should the #1 on a 32-item list have the same point value as the #1 on a 10-item list?
2. The point system is not normalized. Normalizing the points across the ordered lists, instead of assigning fixed values based on absolute positions, would make the aggregate list more meaningful. You would also normalize each unordered list by its size, so that every list hands out the same total number of points.

  • For example, say you have two separate lists of 10 items, one ranked and one unranked. The ranked list would give a different score to each item, but the sum of the ranked scores would still equal the sum of the unranked list’s scores.

Does that make sense? It’d be interesting to see how the list would change if you normalized the point system.

-Carol
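For the mathematically inclined, here’s a minimal sketch of the normalization I was proposing. The 100-point budget per list is arbitrary and entirely my invention; the point is only that a ranked list and an unranked list of the same length hand out the same total number of points.

```python
def normalize_unordered(places, budget=100.0):
    """Every place on an unranked list splits the budget evenly."""
    share = budget / len(places)
    return {place: share for place in places}

def normalize_ordered(ranked, budget=100.0):
    """A ranked list splits the same budget on a linear slide:
    #1 gets the biggest share, the last spot the smallest,
    but the shares still sum to the budget."""
    n = len(ranked)
    weights = [n - i for i in range(n)]   # n, n-1, ..., 1
    total = sum(weights)                  # n * (n + 1) / 2
    return {place: budget * w / total for place, w in zip(ranked, weights)}

# Two ten-item lists now award 100 points each, ranked or not:
unranked = normalize_unordered([f"joint_{i}" for i in range(10)])
ranked = normalize_ordered([f"joint_{i}" for i in range(10)])
assert abs(sum(unranked.values()) - sum(ranked.values())) < 1e-9
```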

Pete responded and said that his numbers are, in fact, normalized.

“A publication that chooses not to rank its burger selections is, in effect, saying they’re all ‘the best.’ Given that a top place in a list of Top 30 would get 32 points, the 30 points for an unranked Top 5 actually IS a pretty spot-on average. Same with unranked Top 10 lists getting 25 points each, etc.

I guess my point is that we very much did the math here. It’s not some arbitrary system. Which answers your main question, no? Is the rest not semantics? This is just the way we figured was as fair as possible to do it, given the varied source material. And the math checks out with the formula used, does it not?”
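For what it’s worth, his averages mostly check out, assuming (as the rules above say) the ranked scale runs from 32 downward regardless of list length: a ranked Top 5 would score 32 + 31 + 30 + 29 + 28 = 150 points, an average of exactly 30, which matches the flat 30 for unranked five-item lists. A ranked Top 10 runs 32 down to 23, for 275 points and an average of 27.5, so the flat 25 for unranked ten-item lists is close but not exact. The shorter the list, the more “spot-on” the average.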

It’s fair to say that we have agreed to disagree. And now my brain hurts, because I’ve done more math today than I have in the last four years. Burgers, man. Gotta count ‘em all.

19 comments on “Burning Questions: How Did Central Track Come Up With Its List of the 50 Best Burgers in Dallas?”

  1. Carol:

    First of all, thanks so much for the link! That said, I just triple-checked my email and I see no note from you.

    To answer your larger question: The list is not arbitrary at all.

    To answer your other ones…

    1. We assigned point values to every joint that appeared on each list, giving the ones that ranked higher on the ranked lists more points (and the ones that ranked lower fewer). We assigned flat point values to non-ranked lists.

    2. We didn’t. But the fact that those places ranked highly despite not appearing on earlier lists speaks volumes about the love they’ve received so far, no?

    3. I think the above explains it pretty outright, but if you’d like to know more, you’re welcome to contact me directly at pete [at] centraltrack [dot] com.

    Thanks again for the interest!

  2. The list was possibly figured the way Nichols figured her bogus top 100 restaurants.

  3. Your sources are cited, but you need to “show your work” as far as your calculations are concerned.

  4. This is a lot more than 50 burgers because there are many point ties. I didn’t count, but it’s got to be about 60-70 best burgers.

  5. Thanks, Pete. I sent an email to Jessica Petrocchi on Friday. And I agree with the math. I’m still puzzled by the numbers.

  6. Check your email, Carol. Just sent you our rating rules and spreadsheet work.

  7. No. The locations that tied in points were all listed with the same rank, but the next rank that was listed accounts for the rank “slots” of all the locations in the tie. See: there are three that tied for #46, but the next rank listed is #49. (So the #46 locations are really #46, #47 & #48.)
    Don’t know about the rest of their math, but CT did count from #1 to #50 correctly…

  8. That’s a laugh. Not a single one of these places has ever run an ad with us.

  9. JESUS CHRIST, BUD: HOPDODDY IS NO. 9 ON THE LIST!

    I’m honestly beginning to wonder if you read things before you take jabs at them.

  10. As a student of statistics, I believe this whole thing is methodologically worthless.
    Variables:
    1. Workers change.
    2. Cooks change.
    3. The time of year has an impact on the ingredients.
    4. What was ordered, and when.
    5. We don’t have a thousand people each ordering every burger; rather, we have maybe one person ordering one from perhaps dozens to choose from.

    I think such lists are about as valid as throwing darts. Or, better: line up 100 dogs, turn them loose, and see which of 100 hamburgers is eaten first.

    When it comes to smell and meat, I trust dogs.

  11. This reminds me of the list that Del Frisco used to run about the top steakhouses in the U.S. They were always listed in the top 5.
    I would imagine that Maple and Motor had something to do with the compilation…

  12. Here is my problem with lists like this, and it is a big one: how do you account for the fact that many people are not trying all the burger joints? Unless all the people voting have tried all the burger joints, how can they rank one as the best? It is really just the best of the ones they have tried, which makes it really a list of the burgers with access to the most people.

    For example, one of my favorite burgers in town never makes it onto any list: it is at Louie’s. Most people go for the pizza, but man, they have a good burger. I think it gives Angry Dog a run for the money, but Angry Dog still probably wins.

    Anyway, I don’t put much stock in these kinds of lists for that reason. Some people just make it a popularity contest, and that is fine; just don’t call it the best burger. Call it the most popular burger joint.

  13. This is stupid. Who cares how they came up with their point system and the results of the list? It’s no more subjective than any other review or ranking. To me, a good burger has juice running all over my face and a well-seasoned patty. To my wife, it’s a 93/7 patty with little seasoning but great toppings. Everyone’s opinion is different; everyone’s list is different. Why argue about it? Certainly not everything that comes from D makes a ton of sense. And you argue that Maple and Motor and Hopdoddy are at a disadvantage, yet they are listed #1 and #9 respectively. Not much of a disadvantage.