Workflow and documentation are among the most important things for designers. Being able to communicate an idea through an artefact, and to keep that artefact up to date, ensures a team stays on the same page while working towards a shared goal.
Originally published on The Guardian UK – February 2019 https://www.theguardian.com/info/2019/feb/07/exploring-your-ideas-why-prototyping-is-so-important
Here at the Voice Lab, we’ve learned that creating this shared understanding is absolutely critical while building voice-first and multimodal projects, which can be difficult to visualise internally. We’ve found that, during the early stages of a concept, dynamic prototyping is more effective than traditional flow charts at facilitating discussion and helping us progress faster.
Everyone has a different answer to why they need to prototype. From a designer’s point of view, a prototyping tool allows you to express your ideas without anyone else editing them. Whether that’s to validate an idea with user testing or to get an idea signed off by a client or stakeholder, this can be an important step in the design workflow. What you need from a prototype can also vary wildly, which is why there are myriad solutions for getting the task done.
During our first project, we found creating traditional user flows quite slow and repetitive. Keeping everything paper-based made iterating and responding quickly to feedback difficult. So, for our second project, we researched various tools that might help us better communicate and document our concept.
Dynamic prototyping with Voiceflow
In late December 2018, we stumbled across Voiceflow. When we started to explore the platform, it was the community that drew us in: there were lots of videos and documentation explaining how to create functional user flows. Even though the platform only supported Alexa and our focus is on Google Assistant, we felt it was worth investing some time to see what the tool was all about.
We had recently launched Year In Review for Google Assistant and were between projects, so we started going through the video tutorials. Once we had worked through all the theory examples, we needed to build something real to properly test the tool out.
What we had to hand were the Year In Review project assets, such as audio files and graphics. So we thought: could we make a fully functional port of a Google Action to Alexa? We felt this would be a good challenge for the tool.
We started inputting the content into the user flows and quickly got comfortable with the platform. Completing the full prototype took three or four days. We were pleased by how quickly we were able to complete the port, and the ability to test the prototype through an Amazon developer account opened up the possibility of conducting future user testing without writing any code.
But we wanted to test Voiceflow further and give it a real challenge, so next we tried to reproduce an Alexa skill with more technical components like variables, API requests and even permissions for the user’s phone number.
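To give a sense of what that second test involved, here is a minimal, hypothetical sketch of the kind of fulfilment code a phone-number permission flow normally requires, assuming the ASK SDK v2 for Node.js; the intent name and spoken wording are our own placeholders, not Voiceflow’s output:

```typescript
import * as Alexa from 'ask-sdk-core';

// Hypothetical handler that reads the user's mobile number from the
// Alexa customer profile API (requires the mobile_number:read permission).
const PhoneNumberIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
      && Alexa.getIntentName(handlerInput.requestEnvelope) === 'PhoneNumberIntent';
  },
  async handle(handlerInput) {
    try {
      const client = handlerInput.serviceClientFactory!.getUpsServiceClient();
      const { countryCode, phoneNumber } = await client.getProfileMobileNumber();
      return handlerInput.responseBuilder
        .speak(`Thanks, we'll use ${countryCode} ${phoneNumber}.`)
        .getResponse();
    } catch (error) {
      // If permission hasn't been granted, prompt the user via a consent card.
      return handlerInput.responseBuilder
        .speak('Please grant access to your phone number in the Alexa app.')
        .withAskForPermissionsConsentCard(['alexa::profile:mobile_number:read'])
        .getResponse();
    }
  },
};

export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(PhoneNumberIntentHandler)
  .withApiClient(new Alexa.DefaultApiClient())
  .lambda();
```

Being able to reproduce this kind of behaviour by connecting blocks on a canvas, rather than writing and deploying code like this, is what made it feel like a fair benchmark for the tool.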
Impressed with its capabilities, we decided to try integrating dynamic prototyping into our workflow for the team’s next project.
Room for improvement
Voiceflow has really helped us to imagine what these devices and platforms are capable of more quickly than a traditional digital or paper workflow. The simple act of connecting blocks and variables while seeing the real outcomes breeds confidence that the team is building a shared understanding of the concept. While the tool has been very helpful, there are areas it could improve on. Here are a few examples:
- We’d like to see better sharing options and tools for collaboration. At the moment, you can create and upload a prototype to the Amazon developer console, but only the account owner can use it. So, for example, you can’t simply hand the prototype to a UX researcher and let them carry out a testing session.
- We’d like to see Google Assistant integration with support for Google’s unique SSML tags, such as the parallel (par) tag; a sketch of what this markup involves follows this list. This would really help us design for and understand the nuanced differences between the two platforms.
- Better tools for visual and multimodal design. Currently we have simple “card” and “display” access for APL, but most of it needs to be hand-coded in JSON (a short example follows this list), and for a designer that isn’t an ideal solution.
- Better return journey support plus an in-house database solution. This would allow us to create more powerful interactive prototypes.
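To illustrate the SSML point above: Google Assistant accepts a parallel (par) element that layers speech over audio, which Alexa’s SSML has no equivalent for. The sketch below is a hypothetical fulfilment snippet using the actions-on-google Node.js library; the intent name and audio URL are placeholders.

```typescript
import { dialogflow } from 'actions-on-google';

const app = dialogflow();

// Hypothetical welcome intent: the <par> element plays the spoken greeting
// and a background audio track at the same time, which is exactly the kind
// of platform-specific behaviour we'd like to prototype visually.
app.intent('Default Welcome Intent', (conv) => {
  conv.ask(`<speak>
    <par>
      <media xml:id="greeting" begin="0s">
        <speak>Welcome back to your year in review.</speak>
      </media>
      <media begin="0s" soundLevel="-10dB">
        <audio src="https://example.com/ambient-loop.mp3"/>
      </media>
    </par>
  </speak>`);
});

// The app would then be exposed as an HTTPS fulfilment endpoint,
// for example via a Cloud Function.
export { app };
```

And to show what “hand-coded in JSON” means in practice for APL, here is a hypothetical Alexa handler (again a sketch assuming the ASK SDK v2 for Node.js) that renders a single line of centred text; every visual detail has to be spelled out by hand:

```typescript
import * as Alexa from 'ask-sdk-core';

// Hypothetical launch handler attaching a hand-authored APL document.
const LaunchRequestHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'LaunchRequest';
  },
  handle(handlerInput) {
    return handlerInput.responseBuilder
      .speak('Here is your year in review.')
      .addDirective({
        type: 'Alexa.Presentation.APL.RenderDocument',
        token: 'yearInReviewToken', // placeholder token
        document: {
          // A hand-written APL document for one line of centred text.
          type: 'APL',
          version: '1.4',
          mainTemplate: {
            items: [
              {
                type: 'Text',
                text: 'Your Year In Review',
                fontSize: '40dp',
                textAlign: 'center',
              },
            ],
          },
        },
      })
      .getResponse();
  },
};

export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(LaunchRequestHandler)
  .lambda();
```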
Ultimately, we are still producing final user journeys in a more traditional format, but we are doing so with much shorter feedback cycles and more confidence that the concept is being communicated effectively. We highly recommend trying out more dynamic prototyping tools to help speed up and improve your workflow. There are a number of other interesting tools out there, such as Invocable and Twine. We suggest your team try them and figure out which one works for you.
We think these solutions are a great addition to the landscape, and we very much look forward to seeing more tools like this that support the design process for voice-enabled digital assistants.