Autodesk Screencast is a tool that lets customers record their screens to help others learn how to use Autodesk software. The recordings can be viewed in the Screencast video player, which also lets viewers search and interact with the video. During our summer 2017 internship we redesigned Screencast to make it accessible to customers who are deaf or hard of hearing. We accomplished that goal, simplified the user interface, and added business value to the product.
Screencast is a great learning resource for Autodesk customers, but it is not accessible to those who are deaf or hard of hearing. Automatic video transcription is needed so that Screencast creators can easily make their recordings accessible to these customers.
To get familiar with the product, we started with a heuristic evaluation and reviewed the research the team had already documented. We also went through the data available from Google Analytics and the Autodesk Community Forums.
We created a stakeholder map to get insights into the following questions:
Next, we conducted a competitive analysis across three dimensions: recorders, transcription services, and players with transcripts. This gave us a better understanding of the problems we might face and how other solutions have tackled them.
To better understand our target population and their needs, we conducted interviews with Screencast users. The participants varied in their experience with the app, and the group included both customers who were hard of hearing and customers who were not.
To get value out of the data we collected, we synthesized it with an affinity map, using the LUMA Institute's Rose, Thorn, Bud exercise. This helped us visualize the data, draw conclusions, and set a direction for our next steps.
After getting a better understanding of our product, our customers, and possible transcription services, we brainstormed potential solutions.
We then brainstormed and sketched our ideas, which gave us more developed concepts to discuss. We reviewed each other's work and suggested changes.
We then created high-fidelity prototypes in Framer, which we chose for its ability to produce highly interactive prototypes. We believed that level of interactivity was essential for effectively testing a video player concept.
Similarly, we created interactive prototypes for the desktop components of Screencast: the video recorder and editor.
We prototyped the web video player, which learners use to watch instructional videos.
We tested our prototypes with six customers, two of whom were deaf or hard of hearing. The tests gave us insights into what worked and what could be improved in our design, which led us to continue refining our prototypes. Each test session included the following steps:
Based on the feedback, we refined our prototypes and conducted a second round of usability testing to validate our changes. The second round showed improved quality and faster task completion.
Finally, we produced specification documents that we shared with our team. We presented our work to the team and to our whole Global Customer Service & Operations division, and we shared our interactive prototype to clarify how the interactions would work.
We redesigned Screencast to make it more accessible and more useful to our customers. Here are the improvements over the old design: