Announcing the 2020-2021 UX Feedback Collection Funding Pool


Following AXIS, we are inviting trainers, auditors, and other experts to apply for a modest amount of funding to implement UX feedback collection in their local communities.

Applicants will be expected to organize user engagements, collect feedback directly from at-risk users, and then synthesize and share that feedback with open source privacy and security tool teams. The goal of this funding is to support stronger feedback loops between tool teams and their target users, and ultimately to make tools more usable for those who need them most.

Whether collecting tool-specific feedback or developing general resources such as user personas or organizational archetypes, applicants will be providing valuable insights to open source tool teams wishing to implement usability and accessibility improvements.

The amount of funding provided will vary based on the specific circumstances and proposed activities, but we ask that you do not submit budget proposals over $5,000 USD. If you are able to implement your project on a smaller budget, please do so, as this will free up resources for fellow AXIS participants. All user feedback engagements must be implemented before May 31, 2021.

DEADLINE TO APPLY

The application will close on November 15, 2020.

WHAT WILL WE FUND?

  • Digital security trainings (with integrated feedback collection)
  • Organizational audits (with integrated feedback collection)
  • User engagements
  • Trainer/auditor/designer/user-experience expert meet-ups
  • Developer engagements

Engagements can be in-person, virtual, or a combination of both!

Budgets can include costs such as:

  • Funding for you as a trainer, auditor, or facilitator
  • Funding for a co-facilitator/observer to help collect feedback
  • Funding for participants or incentives (stipends, licenses, etc.)
  • Materials (printing, design, etc.)
  • Costs associated with in-person events (venue, catering, travel, etc.)
  • Costs associated with virtual events (data, internet, etc.)

HOW WILL APPLICATIONS BE SCORED?

Applications will be reviewed by a three-person technical committee based on the criteria provided below.

1. Proposal reflects the goals of the funding pool:

  • Increases the amount of quality user feedback provided to open source tool developers, elevating the needs of at-risk communities;
  • Provides digital security training or support to at-risk end-users and organizations;
  • Generates and publishes documentation to support feedback loops, such as personas, archetypes, resources for trainers or auditors, or documentation of usability roadblocks in the open source world;
  • Creates and maintains positive, mutually respectful feedback loops between developer and at-risk communities.

2. Proposal is relevant to the local community:

  • Responds to security needs of a particular community;
  • Outlines how tools selected are relevant for the local users or participants;
  • Highlights long-term engagement or any plans for follow-up with the target community.

3. Clear outputs from the activity/engagement:

  • Explains how user experience feedback will be collected from the target group;
  • Includes a simple work plan and timeline explaining when the activities will take place;
  • Outlines how feedback will be documented and converted to a format that is accessible for developer(s), including target numbers for each output (# of people or organizations engaged/trained, # of personas or archetypes developed, # of pieces of tool-specific feedback collected, etc.);
  • Establishes a clear plan for connecting with the developer or tool team to share feedback (Internews can provide support as needed when connecting with tool teams).

4. Includes ongoing engagement plan with an open source developer(s):

  • Includes a specific plan to connect with developer(s), such as Jitsi Meet calls, email, GitHub issue submissions, etc.;
  • Outlines a plan to follow up with or stay connected to developer(s) after the engagement.

5. Cost effectiveness:

  • Provides reasonable cost estimates that have a clear justification;
  • Highlights any additional funding that may complement this work.

6. Risk mitigation:

  • Demonstrates a record of safely operating in hostile digital environments or engaging with at-risk users;
  • Provides clear justification and mitigation strategies for any proposed in-person engagements (in light of the COVID-19 pandemic).

HOW TO APPLY?

We’re afraid that applications are now closed.