So you want to start usability testing, but you don’t know where to start finding testers. Have you tried Taskrabbit?
You know the drill. If you want to find people for usability tests, you either go on Craigslist, pay someone to find and vet testers, or count yourself lucky enough to have your own pool of willing testers you can reach out to.
Our New York studio went through a season of a project where we recruited entirely from Taskrabbit to conduct weekly usability tests and interviews. My teammate Adam Brodowski wrote about how weekly user tests affected our design process, but here I’ll talk about what we learned using Taskrabbit alone to recruit for usability tests. I’m no expert in recruiting, but here are a few things we learned.
Using a recruiting vendor was quickly ruled out for this phase of the project because of cost (although vendors have worked well for other projects). We’ve also generally heard good things about UserTesting.com, but mixed reviews from anyone who relied solely on Craigslist for recruiting.
We’d known Taskrabbit as a service for outsourcing errands, not as a recruiting tool. It was something we tried mostly because we were intrigued by its novelty.
The first thing I noticed about hiring usability testers from Taskrabbit was how quick it was. If I posted a task Monday afternoon, I could count on having multiple candidates by Tuesday morning. Sometimes I could even book a tester within an hour of posting a task. That was a sharp contrast to my teammates’ experiences of needing weeks to book testers.
Taskrabbit is surprisingly affordable. The people close to this project were used to paying somewhere between $50 and $100 per usability test when recruiting through Craigslist or a vendor. Taskrabbits, however, were bidding at half that or less. We were paying around $25 per usability test, sometimes as low as $15. That was much more affordable than what we were used to, and it was nice not having to go buy Amazon gift cards.
I’ve heard a general rule that of all the usability tests you book, you should expect a quarter of the participants not to show up. But the nice thing about recruiting through Taskrabbit was that no-shows rarely happened, if at all. We conducted weekly tests all summer, and I can recall maybe one instance where a tester didn’t show up.
We landed on Wednesday afternoons for our weekly user tests and interviews. The design team liked claiming the middle of the week for user tests. We could use the first half of the week to prepare and then the next two days to make changes from what we learned.
Going over user tests Wednesday afternoon and Thursday meant that the insights were fresh in our minds rather than testing Friday afternoon but waiting till Monday to review.
One thing I know we’ll do better is relaying user insights to the design team and client. Notes and recorded videos are great for reference, until you realize that no one actually has the time to watch them. Sometimes the designers on the team were too busy on Wednesday afternoons to sit in on interviews.
All the effort of conducting a usability test well isn’t worth much if you interview the wrong person. The biggest downside to relying on Taskrabbit was the risk that all of our insights would be skewed. It helped us find specific people quickly, such as smokers who’ve tried quitting, Spotify subscribers, or regular shoppers at Old Navy. The issue is that we were designing products for people who had probably never heard of Taskrabbit, let alone would be willing to work as a tasker.
We kept this in mind during each interview to check ourselves against skewed insights. A nice workaround was asking the tasker to bring a friend or significant other along. That way we spoke with one person who taskrabbits on the side and one who doesn’t. (Made By Many has written in the past about the benefits of interviewing pairs.)
Alongside that workaround, we always checked our user test insights against other research techniques such as surveys, field research, and A/B testing. Even so, there were some things we could only learn by showing our work to people every week.
Would we use Taskrabbit again for recruiting for user tests?
Alas, Taskrabbit’s latest redesign meant that our season of weekly user tests was short-lived. The service took away the ability to screen taskers ahead of time, so tasks felt randomly assigned rather than letting us choose the right person for each test. It was incredibly useful for a season, and maybe we’ll find a use for it again in the future. The experience taught me to keep my eyes open for creative ways of recruiting for user testing.
If you have any other alternatives for recruiting research participants, we’d love to hear them!