Posts Tagged ‘crowdsourcing’
What do Charlton Heston and crowdsourcing have in common?
Seems Microsoft is attempting to “integrate human expertise permanently into our writing tools” with a plug-in that leverages crowdsourced labor via Mechanical Turk to create a better spelling and grammar checker.
Mechanical Turk is an on-demand workforce platform that leverages the crowd (anyone, anywhere, who’s interested and connected to the Internet) to complete routine, time-consuming tasks that are difficult for computers but easy for humans – commonly referred to as human intelligence tasks, or HITs.
Turkers – as the workers on the site are called – are paid nominal fees to complete HITs. Often one group completes a task while a second group verifies the quality of that work.
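That complete-then-verify flow can be sketched as a simple majority vote. This is a toy simulation with hypothetical function names, not the real Mechanical Turk API:

```python
from collections import Counter

def run_hit(task, workers, verifiers, quorum=2):
    """Toy HIT: one group of workers submits answers, a second group
    votes on each distinct answer. Returns the answer approved by at
    least `quorum` verifiers, or None if nothing clears the bar.
    (Illustrative only -- not the actual Mechanical Turk API.)
    """
    submissions = [w(task) for w in workers]      # completion group
    approvals = Counter()
    for answer in set(submissions):
        # verification group: each verifier returns 1 (approve) or 0 (reject)
        approvals[answer] = sum(v(task, answer) for v in verifiers)
    best_answer, votes = approvals.most_common(1)[0]
    return best_answer if votes >= quorum else None
```

The point of the second group is quality control: a single worker’s answer is cheap but unreliable, while agreement among independent verifiers is a decent proxy for correctness.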
How does it work?
Soylent is an add-on that leverages Mechanical Turk to copy-edit your document. Currently in Beta, Soylent attempts to “embed human knowledge into a word processor.”
Soylent uses a program design pattern called “Find-Fix-Verify” that splits a task into smaller subtasks completed in stages. The theory is that this decreases cost while increasing quality.
Features include:
- Shortn – Turkers cut out extra words and shorten your manuscript
- Crowdproof – leverages the crowd to check spelling and grammar and provide suggestions about style
- The Human Macro – allows you to describe the types of changes you want (e.g., change all to past tense), then turns the crowd loose on your document
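The three Find-Fix-Verify stages can be sketched in a few lines. The function names and voting rules below are illustrative assumptions, not Soylent’s actual implementation:

```python
from collections import Counter

def find(text, workers):
    # Stage 1 (Find): each worker flags spans they think need editing;
    # keep only spans flagged by at least two workers.
    votes = Counter(span for w in workers for span in w(text))
    return [span for span, n in votes.items() if n >= 2]

def fix(spans, workers):
    # Stage 2 (Fix): independent workers each propose a rewrite
    # for every flagged span.
    return {span: [w(span) for w in spans and workers] for span in spans}

def verify(candidates, workers):
    # Stage 3 (Verify): a separate group votes on the proposed fixes;
    # the majority winner is kept for each span.
    chosen = {}
    for span, fixes in candidates.items():
        scores = {f: sum(w(span, f) for w in workers) for f in fixes}
        chosen[span] = max(scores, key=scores.get)
    return chosen
```

Splitting the work this way means no single Turker can ruin the result: sloppy finds are filtered by the vote threshold, and bad fixes are filtered by the verification stage.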
Ask not what you can do for the people, but what the people can do for you.
While Clippy came free with Microsoft Word, Soylent requires paying crowdworkers. Costs are described as “small” or “just a few cents,” which makes real costs hard to estimate; the creators say it works out to about $1.50 per paragraph.
Are you game?
Check out the short YouTube video.
You can join the Beta to see what the people can do for you. If you do, let me know what you think.
Crowdsourcing #1 for 2012
Posted August 2, 2011
As a researcher studying crowdsourcing, I was excited to see that Haydn Shaughnessy of Forbes magazine predicts that crowdsourcing will be top of mind for companies in 2012. While I agree that crowdsourcing examples are on the rise, I’m not sure I agree that crowdsourcing is a “fail safe” option that is as “mature” as Haydn suggests.
We’ve only begun to examine the economic impacts of crowdsourcing initiatives on the corporate bottom line. Some studies are finding that turning to the crowd has reduced cost and time for product innovation and problem solving, improved quality, and increased market acceptance of new products. In fact, TopCoder, a site that runs contests for developing complex software applications, reports that projects typically requiring over a year of development have been completed in slightly over five months. Additionally, TopCoder programs average 0.98 bugs per thousand lines of code, significantly better than the industry standard of six per thousand. These initial findings are promising, but more research is needed to determine the true benefits to corporations.
While potentially more economical than traditional innovation methods, crowdsourcing does not come without costs. It is not a “build it and they will come” solution. Success requires defined business goals, an understanding of crowd dynamics as well as collaborative technologies. Additionally, those who are getting the crowd to participate are often finding it difficult to sort through and evaluate all the information and ideas that are generated.
One of the biggest hurdles is organizational culture. I saw a similar issue when working with companies to leverage social media for marketing initiatives. Success at leveraging the crowd requires an organizational culture that embraces open methods from the top down and is willing to give up some control. Exposing yourself and your company to the crowd can be scary and isn’t without risk. Lawyers raise concerns about leakage of trade secrets and issues related to intellectual property. Employees may feel they are becoming obsolete and fear for their jobs. And, executives may pull the plug when they encounter negative feedback or comments from customers.
Every day there are new and different uses of the crowd for innovation. While companies like P&G and intermediaries like InnoCentive seem to have it down, most are only beginning to experiment with leveraging the crowd for innovation. I do agree that crowdsourcing may be an excellent opportunity for companies to supplement or even replace their current innovation initiatives – saving money and time in the process. But currently we have only scant evidence of how companies can best extract value from the crowd.
(Cartoon (c) Geek and Poke, 2009)
Part of the reason I decided to go back to get my PhD was to study how social media is changing how businesses connect with customers and build brands. I’m currently focusing on “crowdsourcing.” Crowdsourcing is basically an open call to the “crowd” to participate in an activity typically completed by employees or paid consultants/contractors.
There are tons of different types of crowdsourcing sites, and researchers are only beginning to examine the different uses of the crowd. I wrote a post about the crowdsourcing site eBird a while back. Here’s another example of a crowdsourced site, this one for crowdsourced software development.
TopCoder.com connects companies with programmers in the crowd to collectively build complex programs.
Here’s how it works:
- Clients specify requirements, timelines, and budgets and the crowd competes to see who can produce the best code in the allotted timeframe.
- Qualified reviewers evaluate weekly submissions, scores are posted for everyone to see, and a winner is selected.
- After all modules are complete, a new contest is held to assemble the modules into the final program.
- Winners are paid a pre-defined fee, and the second-place coder receives half the winner’s amount.
- Winners turn over code (and all rights to it) to the paying company.
You’d think that throwing out a programming challenge to an undefined group of people without set standards or guidelines would result in pretty “iffy” code. But what’s interesting is that TopCoder code actually exceeds the industry standard for quality. TopCoder reports an average of 0.98 errors per 1,000 lines of code, compared to the industry average of 6 per 1,000.
TopCoder manages to create complex programs in less time, at less cost, and at a higher quality than typical of internal development teams. Is this the end of internal software development teams?