Remote Testing > Has anyone tried any of these options?

4 Sep 2010 - 10:34pm
leslie242
2010

I am researching remote testing for some usability studies we'd like to run at work and came across a couple different options that look promising. Has anyone tried these services?

 *   http://www.utetool.com/
 *   http://usabilla.com
 *   http://loop11.com
 *   http://www.openhallway.com/

Any experiences or advice you have to offer would be much appreciated—cheers!

 

Comments

5 Sep 2010 - 5:49pm
Amy Silvers
2007

I've used Usabilla with good results. I've looked closely at Loop11 and will probably use it at some point in the future, but for my most recent need I didn't have a live URL to test, so I went with Usabilla because it works with static JPEGs.

What I liked about Usabilla:

  • ease of setup (could be a little more streamlined, but still very easy to learn)
  • low cost
  • ability for users to add notes and comments on specific page elements
  • integration with Wufoo and Google Docs makes it easy to ask follow-up questions
  • very good customer support

 

What wasn't so great:

  • customer support is great, but because it's a small startup, response time can be a little slow
  • only 100 participants per test on the standard plan

 

I have mixed feelings about unmoderated testing in general, but as far as the specific tools go, I think Usabilla is worth considering. Overall, it was an excellent tool for gathering relatively large amounts of quick data, and the ability to capture user comments and ask survey questions at the end of the test was especially helpful.

BTW, I recommend the decision tree graphic that Clearleft put together recently to help in choosing a remote testing tool or service: http://www.flickr.com/photos/clearleft/4931570875/

 

5 Sep 2010 - 10:33pm
Boltron
2010

I've used them all except UTE, and they each serve a different need. Automated static testing (Usabilla) is awesome for prototypes or single tasks; Loop11 does automated live testing of an existing site or functional prototype where you also want success rates and task-based data. Optimal Workshop's tools span the gamut of conceptual testing, which includes card sorting, static testing of single images like Usabilla, and a navigation testing tool.

So it depends more on your study design than on the merits of each of these tools, but they are all pretty cool for what they do. Just don't forget to test the test and do some casual moderated sessions with whatever unmoderated tool you use. I try to keep descriptions and reviews of all these tools at http://remoteusability.com but it's a bit out of date at the moment.

- Nate

   ..  nate bolt | ceo
   ..  http://boltpeters.com

 

6 Sep 2010 - 11:35am
Dana Chisnell
2008

Great summary, Nate (@Boltron). But then, you are the go-to guy for information about remote research.

@leslie242, you should check out Nate's book with Tony Tulathimutte, _Remote Research_. It's excellent and will answer all of your questions (and more).

Dana

21 Sep 2010 - 10:40am
leslie242
2010

This is great feedback! And yes, Nate's book is what initially sparked the research. I am currently about halfway through, and it's very interesting stuff.

I will definitely let everyone know which way we end up going and the results.

Thanks again


22 Sep 2010 - 2:30pm
Alfonso de la Nuez
2009

Hi Leslie, check us out as well! 

www.userzoom.com

Best,

Alfonso

4 Oct 2010 - 6:51pm
Leslie_Chacon
2010

Hi Leslie,

I am not sure which tool you decided to use, but I thought I would add my experience using and analyzing results from Usabilla and Loop11.

Usabilla is a great tool for mock-up images or static wireframes. Recommendations:

  • Create your own questions/tasks about the page rather than relying on the standard questions that are there.
  • If you do use the standard questions, limit responses to one per participant so your results don't get watered down.
  • If you use the notes feature, try to highlight that in your task question.


Downsides:

  • I didn't like that I could only recruit 100 participants.
  • My sample vendor had a tough time programming the respondents.
  • Reporting was easy enough, but there was no way to do segmentation or any other type of analysis.


With Loop11 you can use pretty much anything that is clickable. Recommendations:

  • Add screener questions before the scenarios and start doing segmentation based on your task responses.
  • Segment your metrics (completion rate, error rate) by gender, computer skill level, or whatever you come up with. Pivot tables will be your best bet; see the sketch after this list.
  • Add follow-up questions (radio buttons or a comment box) for every task, or rating scales. Comment boxes are my favorite; people don't hold back. ;)
  • Integration with your sample vendor is easy now, but you can also add a comment box for their respondent IDs.

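For what it's worth, here is a minimal sketch of that kind of segmentation in Python with pandas, assuming a hypothetical export of task results. The column names (participant_id, gender, skill_level, task, completed) are placeholders for illustration, not Loop11's actual export schema.

    import pandas as pd

    # Hypothetical task results; in practice you'd load your tool's export,
    # e.g. results = pd.read_csv("loop11_export.csv") (file name is illustrative).
    results = pd.DataFrame({
        "participant_id": [1, 2, 3, 4, 5, 6],
        "gender": ["F", "M", "F", "M", "F", "M"],
        "skill_level": ["novice", "expert", "expert", "novice", "novice", "expert"],
        "task": ["find_pricing"] * 3 + ["signup"] * 3,
        "completed": [1, 0, 1, 1, 0, 1],  # 1 = task success, 0 = failure
    })

    # Completion rate segmented by skill level and task: the same cut you
    # would make with a spreadsheet pivot table.
    pivot = results.pivot_table(
        index="skill_level",
        columns="task",
        values="completed",
        aggfunc="mean",
    )
    print(pivot)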

I wrote a blog post on the new updates, which you can read here: http://www.userfirst.com/our-blog/2010/06/28/unmoderated-usability-study/

Please let me know if you have any questions or comments.

Cheers!

Leslie Chacon

9 Nov 2010 - 9:55am
leslie242
2010

You guys are awesome! I've narrowed it down to either Loop11 or Usabilla. However, based on all the advice you all have to offer, I am leaning towards Loop11.

We're hoping to test at the start of December...will share the results of our experience.

Thanks again guys--all your thoughts have truly been helpful :)
