The following describes how we applied the prototype building and refining phase of the Natively Adaptive Interfaces (NAI) approach to a specific use case: making online videos accessible to blind and low-vision (B/LV) users.
Select the ideas to prototype.
- Developed prototypes of the following features, in order:
  - Playback control
  - Information seeking and retrieval
  - Adaptable audio descriptions
  - Adaptable preferences
  - Personalized guidance
Determine the type of prototype needed.
We started with low-fidelity prototypes for most features, but built medium-fidelity prototypes for the adaptable audio descriptions and the information seeking and retrieval features. Medium fidelity was necessary for these two because only working features could elicit meaningful feedback.
Build just enough.
Followed this guidance when building prototypes, which kept the builds lean and enabled rapid prototyping and learning.
Co-create these prototypes.
Circled back to 20 co-design participants for a second round of feedback, this time with our prototypes in hand.
Simulate user interactions.
When feedback sessions with participants were cancelled or gaps arose in the schedule, each team member ran simulated tests of the features. This added valuable supplementary data points.
Iterate.
Multiple iterations were created for each feature.