Category: Meetings

Recap: Fast & Cheap UX Research

At our August 2013 meeting, Refresh Detroit welcomed Jodi Bollaert and Megan Schwarz of Team Detroit, who shared their insights on fast and inexpensive user experience research.

Attendees included web designers, developers and user experience professionals who came from metro Detroit and Ann Arbor to learn about online and in-person research methods to improve their projects.

Thank you, Team Detroit, for the refreshments, for hosting the meeting, and for your support of Refresh Detroit events. It was our first time meeting at Team Detroit, and it was a great venue!

Notes from the Presentation

Before beginning user experience research, ask what you want to learn. With over 100 user experience tools available, how do you choose the best one? It helps to identify:

  • Who is your target audience?
  • What will you test?
  • What’s your budget?

Challenges we face today with user experience research: there is three times as much to test (desktop, mobile, tablet), and project lifecycles are short. Traditional testing can take up to two weeks to produce a report. Web-based tools (like UserTesting.com) enable teams to do research at lower cost, in less time, and with fewer resources.

Why use UserTesting.com?

  • You need findings quickly (within hours)
  • The test can be completed in 15 minutes
  • Your audience can easily be recruited online
  • The site is available online
  • You have the resources to run the tests, review the videos, take notes, and report
  • Budget is a concern

Using unmoderated testing with UserTesting.com, you’ll be able to:

  • Test five people
  • Do remote research
  • Only spend 15 minutes
  • Get results in an hour
  • Keep cost to ~$250 on average

For mobile testing, UserTesting.com sends cameras to participants.

Analysis and Reporting

  • The design and development team gets the results within hours
  • They developed a UserTesting report template focused on actionable findings, including a few video highlights to underscore key themes
  • They deliver the report in person (usually a few days later, since it can take three to four days to review five videos and create the report)

Team-based Analysis

  • Offer chocolate and caffeine to encourage team members to attend
  • Watch videos together
  • Practice active observation
  • Each team member documents key insights on sticky notes, one insight per note
  • Post the sticky notes on a wall and work together to sort them into themes

Constraints/Lessons Learned

  • Avoid leading questions
  • Run a pilot test with one participant before launching the full study
    • Check that the duration is about 15 minutes
    • Ensure your directions and questions are understood
  • Participant no good? UserTesting.com will quickly find another one for you

Even Cheaper Tools

5 Second Test

  • Offered by Usability Hub
  • Find out what users recall about your design
  • Free with Karma Points (which you get by participating in tests) or monthly subscription pricing
  • Easy to set up
    • Upload screenshots
    • Add brief instructions
    • Use default questions or customize for your needs
  • Example shared: Team Detroit home page
    • The 5 second test showed that users didn’t know what Team Detroit was about (was it about cars? about Detroit?)
    • Updated home page to clearly identify what Team Detroit does

Click Test (Usability Hub)

Card Sorting

  • Method to help you organize content on a site
  • Example: videos for a Ford site. Users were asked to group the content and then write labels once all the cards were grouped (a rough analysis sketch follows this list).
  • Card sorting resource from Boxes and Arrows
  • Consider Websort for online card sorting. Free for under 10 users
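
Card sort results are usually analyzed by counting how often participants placed two cards in the same group. As a rough illustration (our sketch, not part of the presentation), the following Python snippet builds that co-occurrence count from hypothetical sort data; the card names are made up.

    from itertools import combinations
    from collections import Counter

    # Hypothetical results: each participant's sort is a list of groups,
    # and each group is a list of card names.
    sorts = [
        [["Oil change video", "Tire rotation video"], ["Sync how-to", "Bluetooth pairing"]],
        [["Oil change video", "Tire rotation video", "Bluetooth pairing"], ["Sync how-to"]],
    ]

    # Count how often each pair of cards ended up in the same group.
    together = Counter()
    for participant in sorts:
        for group in participant:
            for pair in combinations(sorted(group), 2):
                together[pair] += 1

    # Pairs grouped together by the most participants are the strongest
    # candidates to share a label or navigation category.
    for (a, b), count in together.most_common():
        print(f"{a} + {b}: grouped together by {count} of {len(sorts)} participants")

Online card sorting tools typically do this aggregation for you; the point is simply that the raw data is groupings, and the analysis is counting agreement.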

User Interviews

  • Qualitative research method
  • One-on-one conversations
  • Identify what your users’ needs are and why

Summary

  • Do-it-yourself user experience research will be used more often, but won’t replace traditional research
  • Talk to project teams. Find out their pain points. Get them involved in planning and observation.
  • Test early, test often
  • Test up to launch, test after launch
  • Share results when you get them (don’t wait)
  • Document findings & facilitate next steps
  • Be careful what you wish for! People get excited and want to do more. You’ll soon discover you need more bandwidth.
  • Just do it!

We’re hoping to have a link to Jodi and Megan’s slides soon. We’ll update this post when the slides are published online.

Recap: 25 User Experience (UX) Lessons from Hollywood

In January we were lucky enough to have Steve Tengler give a talk at a joint meeting of Refresh Detroit and the Michigan Usability Professionals’ Association.
His talk built on his ongoing series of articles exploring user experience lessons from actors.


Although Steve couldn’t share his slides, here are the 25 UX lessons from Hollywood that he demonstrated.

The 25 UX Lessons From Hollywood:

  1. Rain Man: Social Media Ratings of UX Can be Powerful (“Kmart Sucks!”)
  2. Mission Impossible: Arrange your User Interface Around Urgent Tasks
  3. Minority Report: Design Your System with a Multimodal Interface
  4. Top Gun: Design for Human Error Upfront
  5. Risky Business: Style Captures Attention
  6. The Green Mile: Task Completion Doesn’t Automatically Equate To Success
  7. Cast Away: Fictional Personas Can Bring Sanity to the Project
  8. Bosom Buddies: Re-Skinning Can Allow Financially-Advantageous Reuse
  9. Da Vinci Code: Complicated Interfaces Have Their Purposes Too
  10. Forrest Gump: Exceeding Expectations Makes You Memorable!
  11. Jurassic Park: The Details Are Surrounded by Dung
  12. October Sky: Have Faith in Iterative Testing
  13. Recount: Statistics Can Be Both Powerful and Dangerous
  14. Blue Velvet: Investigations Can Go Too Far
  15. Meet The Fockers: Understand the Financials of Personalization
  16. Pirates of the Caribbean: It’s Not About the Ship You Rode In On
  17. Edward Scissorhands: Plan Ahead for Assimilation
  18. Alice in Wonderland: Flexibility on Size Helps Win the Battle
  19. What’s Eating Gilbert Grape: Design for What Your Customer Wants
  20. Alice In Wonderland: Tremendous Flexibility Can Lead to User Satisfaction
  21. Bruce Almighty: Silent Analytics Can Help Tailor Your UX
  22. Man on the Moon: Know the Business Side of your Business
  23. Mr. Popper’s Penguins: Watch Out for the X+1 Factor
  24. Horton Hears a Who: Don’t Forget The Minority Might be a Captured Market
  25. Batman Returns: Be Flexible on Emerging HMI

If you have a chance to see Steve talk, take it. He is engaging and has great insight into users’ needs. We were grateful to have him and invite him back any time.

Recap: The Tricky Business of Website Testing

You might think software testing would be a bit of a dry topic, but Christopher Martello’s talk at the November Refresh Detroit meeting was far from it. Chris gave us a look at what website testing is, who does it, why they do it, and numerous excellent resources that everyone can use for testing.

Chris has a lot of experience to draw from. He has been doing software testing for about 12 years. He’s done manual testing, automated testing, been a team leader and a project coordinator. Currently he’s a Build & Deployment Coordinator.

He started with an explanation of why we need testing and what makes testers tick. Testing identifies issues that get missed during development. Testers often sit between the developers and the business analysts, making sure the functionality is in balance with the requirements. They can function as the gatekeeper to production, determining when a build is good enough for a release. “The most compelling factor for me as a tester is I want to make sure the customers … have a good experience,” said Chris.

Software testers are the type of people who like to find defects and issues. They identify the intended and unintended ways users can go through an application. They also make sure all the intended scenarios work. They incorporate quality throughout the process. It’s important that IT finds the bugs. “You don’t want the user to find the bugs”.

The testing process consists of planning tests, running through them, and reporting the issues you discover. Chris discussed how testing is one part of the overall quality assurance process for your website or application. “It helps build confidence in your application, the team can say it has been tested.” Chris briefly touched on some of the common terms and testing methods used in software testing, including functional testing, automated testing, and load and performance testing.

He then discussed how NOT to test, which included:

  • Letting the users do the testing
  • Saying “I’ll test later”
  • Saying “Works on my computer.”

He discussed other types of testing, including standards testing and browser testing.
Standards testing involves determining whether a website or application meets HTML, CSS, accessibility, security, and the evolving mobile standards.
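
Some of these checks can be scripted. As a hedged example (not something Chris specifically covered), the Python sketch below POSTs a page to the W3C’s Nu HTML Checker and prints any messages it returns; the sample markup is made up, and a self-hosted checker would use the same interface at a different URL.

    import json
    import urllib.request

    # Public W3C Nu HTML Checker endpoint; "out=json" asks for JSON results.
    CHECKER = "https://validator.w3.org/nu/?out=json"

    def check_html(html: str) -> list:
        """POST an HTML document to the checker and return its messages."""
        request = urllib.request.Request(
            CHECKER,
            data=html.encode("utf-8"),
            headers={
                "Content-Type": "text/html; charset=utf-8",
                # A descriptive User-Agent is polite and avoids some request filtering.
                "User-Agent": "standards-check-sketch",
            },
        )
        with urllib.request.urlopen(request) as response:
            return json.loads(response.read())["messages"]

    # Made-up page with an img missing its alt attribute, so the checker has
    # something to report.
    for message in check_html("<!DOCTYPE html><title>Test</title><img src=x.png>"):
        print(message.get("type"), "-", message.get("message"))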

Browser testing is a very important aspect of website testing. Unfortunately, it’s difficult to cover the ever-increasing number of web browser and operating system versions. Graded presentation standards are typically developed to determine which browsers your web application works on and to what level. You can base your standards on your own internal application logs and on browser statistics found on sites like StatCounter’s Global Stats. “You should probably be tailoring your browser standards to what your audience is,” said Chris.
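
As a rough sketch of what basing your standards on your own logs can look like (our illustration, not Chris’s), the snippet below tallies coarse browser families from the user-agent strings in a combined-format access log. The log path is hypothetical, and a real analytics tool would parse user agents far more carefully.

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # hypothetical Apache/Nginx combined-format log

    # Very coarse buckets; order matters because Chrome's user agent also
    # contains "Safari", so more specific tokens are checked first.
    FAMILIES = ["MSIE", "Trident", "Firefox", "Chrome", "Safari", "Opera"]

    def browser_family(user_agent: str) -> str:
        for family in FAMILIES:
            if family in user_agent:
                return family
        return "Other"

    counts = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            # In the combined log format the user agent is the last quoted field.
            quoted = re.findall(r'"([^"]*)"', line)
            if quoted:
                counts[browser_family(quoted[-1])] += 1

    total = sum(counts.values()) or 1
    for family, count in counts.most_common():
        print(f"{family}: {count / total:.1%} of requests")

Shares like these, combined with the public statistics, are what feed a graded browser support table.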

He went on to discuss the specialized craft of load and performance testing. “It’s almost like designing a very complex scientific experiment,” Chris explained. It’s important for identifying bottlenecks and break points in your application, but it can take a great deal of time to set up.
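
None of the heavyweight tools are needed to see the basic idea. Below is a deliberately naive sketch (not one of the tools Chris covered) that fires concurrent requests at a hypothetical staging URL and reports response-time percentiles, which is roughly where hunting for bottlenecks and break points starts. Only point something like this at a site you own.

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "https://staging.example.com/"  # hypothetical target
    REQUESTS = 50
    CONCURRENCY = 10

    def timed_request(_):
        """Fetch the URL once and return the elapsed time in seconds."""
        start = time.perf_counter()
        with urlopen(URL, timeout=30) as response:
            response.read()
        return time.perf_counter() - start

    # Run the requests on a small thread pool and sort the timings.
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        timings = sorted(pool.map(timed_request, range(REQUESTS)))

    print(f"median:   {timings[len(timings) // 2]:.2f}s")
    print(f"95th pct: {timings[int(len(timings) * 0.95)]:.2f}s")
    print(f"slowest:  {timings[-1]:.2f}s")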

There’s also manual testing. “You got to have some eyeballs and some hands on” to test whether your application meets the requirements and has the intended functionality. A few examples Chris gave of manual tests included proofreading the site’s fine print, checking that the correct phone number is listed, and making sure a form’s error validation is working correctly.

Chris then provided a list of tools and services that you can use for testing. I’ve listed some of them here, but for a complete list I recommend you check out his slides.

  • SnagIt – A handy screenshot tool.
  • IETester – A way to test multiple versions of Internet Explorer on one machine (free)
  • BrowserStack – A complete cross browser testing service that creates virtual machines for you.
  • Xenu Link Sleuth – For testing links on your site. (free)
  • SQA Forums – Software Quality Assurance forums
  • Cacoo – Free online drawing and wireframe sharing

Chris also suggested some Firefox add-ons for us to check out.

Automation is where it’s at in software testing. According to Chris, “if you execute a test case more than three times then you should automate it.” It can save a tremendous amount of time. The downside is that you don’t always get the tool support, budget, and training needed. While expensive, Chris likes the HP QuickTest Pro testing tool, and he mentioned several other automated testing tools as well.
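
To make the “automate the repetitive checks” advice concrete, here is a minimal sketch using Selenium WebDriver, a free browser-automation tool (our example, not one Chris specifically endorsed); the URL, field names, and expected error text are all hypothetical.

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    URL = "https://staging.example.com/contact"  # hypothetical form page

    driver = webdriver.Firefox()
    try:
        driver.get(URL)

        # Submit the form with an invalid email address...
        driver.find_element(By.NAME, "email").send_keys("not-an-email")
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

        # ...and assert that the validation message appears.
        error = driver.find_element(By.CSS_SELECTOR, ".error").text
        assert "valid email" in error.lower(), f"unexpected error text: {error!r}"
        print("PASS: email validation message shown")
    finally:
        driver.quit()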

Two load and performance testing tools Chris mentioned were Rational Performance Tester and HP LoadRunner.

Chris also mentioned some issue and defect tracking tools.

Chris wrapped up the talk with demos of some of the tools he discussed and answered questions from the audience. Throughout the presentation he provided examples, interesting anecdotes, and some fun QA jokes. Refresh Detroit would like to thank Chris for his excellent presentation.

Below are Chris’ slides: