Sprint by dscout
Video product feedback from 50 new users in 24 hours
dscout Sprint delivers video product feedback from 50 new users in 24 hours. Signing up is ridiculously easy: give us a link and write a short teaser for your product. We share that link with the 40,000 people who use the dscout mobile app. Users opt in to reviewing your product, click your link to try it, and then record a 30-second clip saying whether or not you lived up to their expectations.
We package these videos together, complete with transcripts, demographic data for each user, and a "before" and "after" rating, so you can measure whether your product met, exceeded, or fell short of your newest users' expectations.
1. Can testers only record a video after they've used the app, or also while they are using it? I imagine the latter would be especially useful.
2. Where do the scouts come from and how do you make sure their feedback is relevant?
1. After they've used the product or prototype. Recording while someone uses a product is more about usability testing. We care more about whether someone would use the product, and why.
2. We have approximately 40k scouts on dscout. They are either invited to do research on other products or find our app by searching the App Store or Play Store. Their feedback is filtered to make sure the content of their videos is relevant.
1. Cool! I guess that makes finding the right scouts even more important.
2. When a scout states they wouldn't use the app, how do you determine whether it's a fault with the app or just a bad fit for that scout? (I guess this is more of a general customer development question that doesn't just apply to dscout, but I'm curious to hear your answer.)
We currently have representative videos of a product..