Discogs, a comprehensive database of audio recordings, also serves as a marketplace for physical media, accessible through its standalone mobile app and the mobile version of its website.
The current search functionality on both the website and the app is limited to a text input field and a barcode scanner. To make music searches more efficient and to streamline sellers' inventory updates, it would be beneficial to introduce additional search methods.
While most users prefer text input, our research uncovered surprising interest in alternative methods, especially for music searches.
93% of respondents have found themselves wanting to identify music they hear, and for 40% of them this happens often.
90% of respondents either already use a search-by-sound feature or would like to use one.
While over half of respondents don't use image search or text scanning, 33% are keen to try these options.
Shazam is the favored choice for many, thanks to its one-click simplicity. However, it performed worse than the Google widget in field testing.
The results indicate that new Discogs users highly value and expect familiar search methods, similar to those found in other apps and widgets.
I knew that music enthusiasts already use these search features. However, since Discogs specializes in physical media, I needed to understand the specific needs of its buyers. To gain deeper insight, I took a two-part approach: conducting three interviews and crafting personas that synthesize both the interview and survey findings.
These were the main questions I was seeking answers to:
Based on these findings, I made the following decisions:
To help map the integration into the existing app structure, I created a user flow showing how the functions will interrelate and complement each other.
For simplicity and clarity, this user flow omits the text input and browsing/sorting methods and reflects only the non-traditional search methods, two of which are new:
I've crafted a system of components and wireframes for the upcoming prototype, encompassing all three non-traditional search methods.
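To make the flow easier to discuss with engineers, here is a minimal TypeScript sketch of how the three non-traditional methods could be modeled behind a single search entry point. The type names and placeholder backends are illustrative assumptions, not part of Discogs' actual codebase.

```typescript
// Hypothetical model of the three non-traditional search methods
// (search by sound, image search, text scanning). Names and shapes
// are illustrative only.
type SearchMethod = "audio" | "image" | "text-scan";

interface SearchRequest {
  method: SearchMethod;
  payload: Blob;     // captured audio clip, cover photo, or scanned frame
  startedAt: number; // drives the progress timer shown in the UI
}

// All methods resolve to the same result shape, so the results screen
// stays identical regardless of how the search was initiated.
interface ReleaseResult {
  releaseId: number;
  title: string;
  artist: string;
}

// Placeholder backends, assumed for illustration.
declare function recognizeAudio(clip: Blob): Promise<ReleaseResult[]>;
declare function recognizeCover(photo: Blob): Promise<ReleaseResult[]>;
declare function extractTextAndSearch(photo: Blob): Promise<ReleaseResult[]>;

async function runSearch(req: SearchRequest): Promise<ReleaseResult[]> {
  switch (req.method) {
    case "audio":
      return recognizeAudio(req.payload);
    case "image":
      return recognizeCover(req.payload);
    case "text-scan":
      return extractTextAndSearch(req.payload);
  }
}
```

Funneling every method through one request shape is what lets later fixes, such as a shared progress timer and navigation that stays available at every stage, apply to all three searches at once.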
Testers showed some uncertainty about the stages of the search process and were unsure whether they needed to press anything to proceed.
In response, I added a progress timer to all three search options.
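As a rough illustration of that fix, the timer can be one small shared utility reused by all three methods, with only the accompanying icon changing per method. The callback signature and the 15-second cap below are assumptions for the sketch, not the shipped behavior.

```typescript
// Minimal sketch of a shared progress timer: one implementation backs the
// audio, image, and text-scan screens so the flow feels consistent.
function startProgressTimer(
  onTick: (elapsedSeconds: number) => void,
  maxSeconds = 15, // assumed cap, purely illustrative
): () => void {
  const startedAt = Date.now();

  const interval = setInterval(() => {
    const elapsed = Math.floor((Date.now() - startedAt) / 1000);
    onTick(Math.min(elapsed, maxSeconds));
    if (elapsed >= maxSeconds) {
      clearInterval(interval); // stop automatically at the cap
    }
  }, 250);

  // The caller cancels early when results arrive or the user backs out.
  return () => clearInterval(interval);
}

// Usage: the search screen renders the elapsed time next to the icon of
// whichever method is active.
const stopTimer = startProgressTimer((s) => console.log(`Searching… ${s}s`));
```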
If testers accidentally selected the wrong search method, they had no way to correct the mistake other than going back to the search home page.
The navigation menu is now accessible at any stage of the search process.
The scan options had distinct UI differences, which made the camera-usage choices less clear.
To enhance consistency, I unified the visual elements and introduced a timer that matches the selected method's icon.
Due to visual inconsistencies, testers were unsure whether some options were available only in certain searches.
I fine-tuned the visual elements to keep the design sleek and visually coherent.