Thanks to some dedicated volunteers, we have seen our first set of results come through. While it would be nice to draw some conclusions, there were only 4 submissions, so we’re still likely in the realm of random chance. Still, a couple of things: There was some very useful feedback that Paul and I are very grateful…… Continue reading The First Week Results
This is a follow-up to our post on how crowdsourcing scores can work. In that post, we talked about how who-picked-whom is a crude metric for crowdsourcing predictions; its main gap is that it doesn’t capture margin of victory, which is needed for a more accurate crowdsourced prediction. As a result, even if the majority thinks that…… Continue reading Crowdsourcing Spread vs. Margin of Victory
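A minimal sketch of the distinction the post draws, using invented numbers: a who-picked-whom tally only says which team more people chose, while averaging the predicted scores yields a crowdsourced margin of victory as well.

```python
# Hypothetical predictions for a Team A vs. Team B game.
# All names and scores here are invented for illustration.
predictions = [
    {"a": 24, "b": 21},
    {"a": 27, "b": 20},
    {"a": 17, "b": 23},
    {"a": 21, "b": 20},
    {"a": 28, "b": 10},
]

# Who-picked-whom: count how many predictors picked Team A to win.
picks_a = sum(p["a"] > p["b"] for p in predictions)
majority = "A" if picks_a > len(predictions) / 2 else "B"

# Averaging the predicted scores gives a margin, not just a winner.
avg_a = sum(p["a"] for p in predictions) / len(predictions)
avg_b = sum(p["b"] for p in predictions) / len(predictions)
margin = avg_a - avg_b  # positive means the crowd expects A to win by this much
```

Here the majority metric only reports that 4 of 5 predictors took Team A, while the averaged scores add that the crowd expects A to win by about 4.6 points.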
I listened to a couple of podcasts this week and, combined with my own experience, the limited consensus is that Week 3 was a rough week for most people. The Sports Gambling Podcast hypothesized that it was because our conclusions are formed but are based on iffy data. You have 2 weeks of data, and in…… Continue reading Crowdsourced Scores as a Leading Indicator
One of the earliest forms of pushback that we received regarding the idea of crowdsourcing game scores was simply that there would not be enough expertise available to predict accurately. “There are too many variables.” “No one will ever have enough information.” “You’ll never get it right.” To be honest, when we heard such gut-reaction pushback, it…… Continue reading How Can Crowdsourcing Scores Work?
We are very excited to have you join our little experiment! Alright, so what is this all about? Crowdsourced Scores is based on the theory that an average prediction across a group of individuals will perform better than any individual over time, known as the Wisdom of the Crowds. Our goal is to collect…… Continue reading Welcome to Crowdsourced Scores!