2017 Week 3, Game 1 – The Bad Beat

God bless the internet. Today I get to paraphrase one of my favorite lines of all-time, courtesy of Brian Murphy (now of, as best as I can tell, KNBR). I give you an O/U of 79 points before the Rams-49ers kickoff Thursday night.

“You wake up [Friday] morning wearing a barrel, and signing over your mortgage to me, thank you very much.”

I’m sure I’m not the only person who wonders whether their actions did, in fact, affect something totally out of their control, especially when it comes to football. By posting the Crowd prediction (Rams by 4, total of 34) to Twitter before the Rams-49ers game, I am nearly convinced that I turned what people expected to be a low- to medium-scoring affair into a barnburner.

Nevertheless, there are two quick takeaways:

  1. Crowdsourcing the scores is about the percentages rather than any individual game.
  2. Crowdsourcing is hard to do for outliers.


The crowdsourcing-scores concept is in line with any other gambling strategy. Blackjack and craps strategies are both about maximizing odds against the house over a number of rounds. If you follow the strategy, the percentages will normalize over time, but there’s no guarantee on any given hand. If you’ve ever read “Bringing Down the House”, you’ll be familiar with the story of one of the players losing over $100,000 on a single hand even though he played perfect strategy. Over time, the team came out well ahead, but on that one hand, luck worked against them.

For a game like Thursday’s, in which the 49ers hit a backdoor cover against a spread of only 2.5 by scoring 19 points in the 4th quarter, it can be doubly frustrating: the win for the crowd seemed well in hand, and it can feel like a win is being snatched from your fingers. All we can say is that we expect the crowd to be right more often over time, so stick with us.


It would definitely be an understatement to say that Thursday night’s game was unexpected. The average score for all games through Week 2 was in the mid-40s, and the 49ers had scored a total of 12 points. Thursday’s game nearly doubled the average score, and the 49ers tripled their season points total in a single game.

In Week 2, the crowd went 6 for 9 on the Over/Under when the delta between the O/U line and the crowd prediction was greater than 5. Thursday was the first of 8 games this week with that kind of delta, so we’re hopeful we can expect 5 of the remaining 7 to come through.

In the meantime, does anyone have a barrel I can borrow?


2017 Week 2, Game 1 – 3-for-3 Thursday

The crowd is off to a great start this week! The crowd predicted the winner (the underdog), the spread, and the total.

On the other hand, we have only had two predictions so far, so this could be random chance, but sometimes you take your wins when they come.

One follow-up from the KC-New England game last week that I thought was worth mentioning was about decentralization and independence. I browsed the Patriots subreddit and found a thread fielding predictions for the game. All of the predictions (about 20 before I stopped counting) picked the Patriots, and the average score was 31-18. That would have been good enough for the over/under, but it was wrong straight up and against the spread.

Of course, when you have a forum of like-minded people, this is exactly what you’d expect. It would be rare for someone to post a close score, let alone the Patriots losing, unless they were particularly brave (or a troll who enjoys tweaking a group of people). So decentralization and independence are critical to ensuring that the aggregated predictions reflect crowd wisdom rather than herd mentality.



A Quick Recap of the Theory

With the season just about to start, we wanted to recap how we think the Wisdom of the Crowd can work with NFL betting. (A more detailed post is here.)

TL;DR: Vegas’ goal is to match betting lines to public sentiment, not to predict final outcomes. We believe that the crowd can identify when public sentiment differs significantly from the likely final outcome, so we can take advantage of value in the betting lines.

Thanks for visiting Crowdsourced Scores! Please get your Week 1 predictions in now!

What is the Wisdom of the Crowd?

The Wisdom of the Crowd is the theory that, over time, the crowd is a more effective predictor than any single individual. For the crowd wisdom to be “valid”, the crowd has to be:

  • Diverse: the crowd has a significant variety of opinion, so that biases are normalized.
  • Independent: each individual develops their opinion on their own, without being influenced by anyone else.
  • Decentralized: the crowd is formed by people from different backgrounds and experiences.
  • Aggregated: the individual opinions can be combined appropriately.

In short, the crowd needs to have a diverse group of individuals who are able to generate their opinions on their own.

How can the Wisdom of the Crowd Work for Sports Scores?

We believe that sports fans satisfy the three core components of a crowd: diversity, independence, and decentralization. Crowdsourced Scores does the work of aggregating. Each individual in the crowd is their own supercomputer, processing all of the news and information out there and putting together a prediction based on their own calculations, and we aggregate the results.

How does this differ from what Vegas does? Doesn’t Vegas move the line based on the crowd behavior?

The main difference between us and Vegas is that Vegas is interested in matching the perception of each side, while we are interested in identifying where the public perception may be inaccurate. Most importantly, Vegas wants to balance the money. If one side has a heavy amount of money riding on it, the book is in serious danger if that side wins. A perfect 50/50 split BEFORE THE GAME is Vegas’ goal. They are not interested in matching the final outcome; they are interested in perfectly matching public perception to keep money flowing evenly on both sides.

Additionally, Vegas moves the line based on dollar values rather than on the gross volume of bets. One $10,000 bet on the favorite would require 100 $100 bets on the underdog to match it, so the line may move to encourage bets on the underdog even though only one person has bet on the favorite.
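As a quick sketch (with hypothetical bet amounts), here is what a book’s exposure looks like when it is measured in dollars rather than tickets:

```python
def book_exposure(bets):
    """Sum the dollars wagered on each side of a line.

    `bets` is a list of (side, dollars) tuples, e.g. ("favorite", 10000).
    """
    totals = {"favorite": 0, "underdog": 0}
    for side, dollars in bets:
        totals[side] += dollars
    return totals

# One big bet on the favorite...
bets = [("favorite", 10_000)]
# ...takes one hundred $100 bets on the underdog to match.
bets += [("underdog", 100)] * 100

totals = book_exposure(bets)
print(totals)  # dollars are even, even though the ticket count is 1 vs. 100
```

Measured by tickets, the underdog looks wildly popular; measured by dollars, which is the book’s actual exposure, the line is perfectly balanced.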

What we are trying to do is identify value in the line. In the stock market, this is called the margin of safety. If the crowd predicts a significant deviation from the Vegas line, we want to see if that prediction has value (we believe that it does).
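A minimal sketch of that screening rule, using the 5-point delta from the Week 2 recap above as the threshold (the totals here are illustrative, not real lines):

```python
def has_value(vegas_line, crowd_line, threshold=5.0):
    """Flag a line when the crowd's prediction deviates from the
    Vegas line by more than `threshold` points."""
    return abs(crowd_line - vegas_line) > threshold

# Hypothetical totals: the crowd sees a much lower-scoring game than Vegas.
print(has_value(44.0, 34.0))  # True: a 10-point delta is worth a look
print(has_value(44.0, 41.0))  # False: too close to the line to act on
```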

What about Who Picked Whom?

Who-picked-whom seems like it is the same as crowdsourcing predictions. If 70% of users are picking the underdog, how is that different from what Crowdsourced Scores is doing?

The short answer is that Who-Picked-Whom is a binary question that is limited by the same lack of detail as the betting lines. Do you believe that the underdog will simply cover the spread or win outright? Do you believe that the favorite will not only cover the spread but beat the tar out of the underdog?

In this situation, all choices fall into one of two buckets, and they are not distinguished at all. Therefore, it may tell you where the public sentiment is, but it doesn’t tell you if there is any value in the line. Additionally, the public’s picks hit at roughly 50% season over season.

We’re aiming to top that.

By aggregating score predictions, we are adding fidelity that will identify not only how many users choose one side or the other, but also how much each side believes its pick will win by. This will help identify when the crowd sees value in a given betting line and inform bettors.
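A minimal sketch of that aggregation, with hypothetical predictions from five crowd members (home score listed first):

```python
from statistics import mean

def aggregate_predictions(predictions):
    """Turn individual (home, away) score predictions into a crowd
    spread (home minus away) and a crowd total (home plus away)."""
    spreads = [home - away for home, away in predictions]
    totals = [home + away for home, away in predictions]
    return mean(spreads), mean(totals)

preds = [(24, 20), (27, 17), (21, 21), (30, 24), (23, 13)]
spread, total = aggregate_predictions(preds)
print(spread, total)  # crowd says home by 6, with a total of 44
```

Unlike a binary pick, the averaged spread and total can then be compared directly against the Vegas line to look for a meaningful deviation.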

Best of luck!