Featured work, LivePerson

Taxonomy Annotator

An internal web app for text annotation that improved the speed and accuracy of data labeling.

Platform

Web

Timeline

8 weeks

Domain/Topics

Machine Learning

Core Team

Designer (me), 3 Engineers

Results

86%

faster to annotate messages

10/10

Satisfaction, Survey of 8 users

The need for efficient text classification

Classifying consumer messages for sentiment and intent in spreadsheets was slow and error-prone, and it was hindering the training of our machine learning models. My team and I decided that an internal annotation tool could offer not just efficiency gains but a significant reduction in errors.

The vision

My goal was to design a web app to enable the team to classify data more efficiently, improving classification speed and reducing human error. This change would ease user fatigue, enhance data quality, and accelerate our ML models' training.

Successful launch and impact

Launched successfully in two months, the product is now used by data scientists and annotators, delivering a marked improvement in task completion time and overall turnaround.

This is much better to use than working with spreadsheets. Reading the text is much easier, I make fewer mistakes, and I'm much faster at annotation.

Lar, Insights Manager

How I discovered pain points

Methodologies

  1. Market Research: Examined the existing methods of annotation

  2. Competitive Analysis: Assessed similar tools like Prodigy to understand the competitive landscape and identify opportunities

  3. Discovery Interviews: Collaborated with annotation, data-science, and engineering partners to uncover pain points and technical constraints

  4. Usability Testing: Conducted moderated tests on prototypes to identify user expectations and pain points


After annotating for a while... I get tired and I end up misclicking a lot

Discovery interview participant

Key insights

  • Users needed easy-to-use keyboard shortcuts for efficiency

  • A clear need for reduced error rates through more conspicuous text and examples

  • Unexpected findings: The UI elements' placement led to frustration and errors

Designing solutions to meet user needs

I designed with several goals in mind:

  1. Easy-to-use keyboard shortcuts to increase workflow efficiency

  2. Reduced error rates through more conspicuous text, examples, and noise reduction

  3. Modular components to work with various annotation levels

  4. Support for different permission levels

Implementation and results

Faster, easier, and more accurate

The product launched internally to great success and is actively used by around a dozen people to annotate data. Initial tests show:

  • 86% faster message classification

  • 26% faster task completion times

  • 15% decrease in error rates¹

Overall, this project was more than a success; it led to tangible cost savings and performance improvements that redefined our approach to text annotation.

¹ Error rates were determined by cross-referencing annotation results between multiple users and identifying anomalies
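As a rough illustration of that cross-referencing idea (the actual pipeline isn't shown here, and the function and data names below are hypothetical), one simple way to surface anomalies is to compare each annotator's label against the majority label for the same message:

```python
from collections import Counter

def flag_anomalies(annotations):
    """Flag labels that disagree with the majority label for a message.

    annotations: dict mapping message_id -> {annotator: label}
    Returns a list of (message_id, annotator, label, majority_label).
    """
    anomalies = []
    for message_id, labels in annotations.items():
        counts = Counter(labels.values())
        majority_label, majority_count = counts.most_common(1)[0]
        # Only trust the consensus when it is an actual majority,
        # not a tie among annotators.
        if majority_count <= len(labels) / 2:
            continue
        for annotator, label in labels.items():
            if label != majority_label:
                anomalies.append((message_id, annotator, label, majority_label))
    return anomalies

# Example: three annotators, one disagreement on msg-2.
data = {
    "msg-1": {"a": "positive", "b": "positive", "c": "positive"},
    "msg-2": {"a": "negative", "b": "negative", "c": "neutral"},
}
print(flag_anomalies(data))  # [('msg-2', 'c', 'neutral', 'negative')]
```

Flagged disagreements like these can then be reviewed by hand, which is how a per-annotator error rate can be estimated without a separate ground-truth set.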

© 2023 Charles Wu