Learning how to balance design and training

January 28, 2016

The USABLE team is halfway through our Tool Feedback Training process, and the first two TFTs were hugely informative. We worked with groups doing amazing work while facing widely different challenges, including censorship, targeted malware, and situations where digital security problems can become physical security ones.

One of the concerns we had going into this process was how to balance digital security training for the participants with human-centered design exercises, which is why we spent so much time working through this during our Security and Design Workshop.

In both trainings, however, the participants not only responded enthusiastically to the design exercises we conducted, but expressed ongoing interest in applying design thinking to their project work. The design-centric agenda items have meshed well with the digital security curricula: engaging participants in the design process, creating user personas, and supporting the tool developers in coming up with prototyped design changes. We'll post soon on building user testing directly into the training process.

The dSchool's Wallet Exercise has been very useful as a short, fun way to get into the mindset of design, as it leads you through a series of thought, listening, and feedback exercises with a partner. It, along with many of the other design exercises, creates a great icebreaker effect by putting everyone on the same level as they work through a problem.

We've updated the TFT schedule to better reflect how these first TFTs went: you'll see a back-and-forth between threat modeling, design exercises, and tool trainings.

  • Project Overview and Introduction to Design
  • Experience sharing: "A Day in the Life"
  • Threat modeling introduction
  • Digital security overview
  • Design exercises: "The Wallet Project"
  • Persona creation and discussions about risks and tools
  • Digital security tool training with participant observation feedback
  • Defining expectations and requirements for digital security tools
  • Digital security tool training with user testing observation and participant feedback
  • Further digital security training (depending on scope and complexity of tools)
  • Review tool experience from a design perspective
  • Open discussion around digital security needs
  • Roadmap and prioritization of challenges
  • Finalize user personas

Indeed, integrating the practice of thinking through how a tool works, and how it could be improved, deepened both TFTs. It opened discussions around how the tools interact with the risks each group faced, and which features mattered most when choosing among tools. The second group even built their "top 10" (well, 14) ranked list of features for secure communication tools:

  1. Usable; fast and easy to set up
  2. Cross-platform, mobile
  3. Works on slow connections
  4. Secure by default
  5. Tool is consistently updated
  6. Sync across devices?
  7. Provider cannot read messages / metadata
  8. File transfer
  9. Offline support
  10. Open code
  11. Code audits
  12. Perfect Forward Secrecy
  13. Documentation
  14. Localization

A good reminder that usability is security.