In the fall of 2020, Scoop pivoted its product focus to build tools that support companies choosing to employ both on-site and remote workers.
We were moving fast to save our business after COVID caused our carpooling product to lose most of its revenue.
I wanted to make sure that as we moved fast, we didn't break things, especially because our product roadmap included many ideas that required collecting personal data from remote workers.
"When we move fast and break things and those things get bigger and bigger, the rubble falls everywhere, destroying communities and the rising dust blots out the sun."
- Mike Monteiro
I interviewed internal stakeholders and former Scoop employees, and researched trust and safety standards. I then developed a set of trust and safety tenets to help us quickly make product decisions that aligned with our values and to guard against mistakes that would be difficult to roll back. I also created opportunities for everyone at the company to provide feedback.
Background
By agreeing upfront on trust and safety principles for our team, we can quickly make product and design decisions that align with our values and guard against mistakes that are difficult to roll back. In developing these tenets, we consider our ability to deliver value to our users, including employees, employers, and buyers. We also aim to hold ourselves to high ethical standards, committing to doing no harm to our users and strictly maintaining legal business practices.
This information pertains to user experiences. Data handling and technical implementation best practices are addressed elsewhere.
The tenets
1. Collect the minimum amount of personal data needed to do the task at hand.
Don’t collect data for undefined future use. Evaluate the level of risk associated with collecting each type of personal data. High-risk data, like location information, must be justified by high user value, typically for both the employee and the employer.
2. Tell people what we’re doing with the data.
Tell the whole truth. Trust is built when motivations are clear. With sensitive user data, it’s important to ask permission and inform users up front, rather than having to ask for forgiveness after the fact. Tell users who will be able to view their data, what data will be viewable, and what it’s for. If location data is used for contact tracing that could save a life, users will be less likely to object to sharing their location.
3. Embrace the principle of least privilege: show the minimum amount of data to the most limited people.
Personal information should be accessible only by the minimum number of people who need it to do their jobs. This applies both to Scoop employees handling PII and to employers viewing employee data. Company policies and features like tiered administration allow for controlled access.
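Tiered administration can be sketched as role-based, field-level visibility: each admin tier sees only the profile fields its job requires. This is a minimal illustration; the role names and fields below are hypothetical, not Scoop's actual schema.

```python
# Hypothetical role tiers mapped to the profile fields each may view.
ROLE_VISIBLE_FIELDS = {
    "employee":    {"name", "team", "work_location"},
    "team_admin":  {"name", "team", "work_location", "schedule"},
    "super_admin": {"name", "team", "work_location", "schedule", "contact_info"},
}

def visible_profile(profile: dict, viewer_role: str) -> dict:
    """Return only the fields the viewer's role is permitted to see."""
    allowed = ROLE_VISIBLE_FIELDS.get(viewer_role, set())  # unknown roles see nothing
    return {field: value for field, value in profile.items() if field in allowed}

profile = {
    "name": "Alex",
    "team": "Design",
    "work_location": "HQ",
    "schedule": "Mon/Wed on-site",
    "contact_info": "alex@example.com",
}

# An employee viewer gets only name, team, and work_location;
# contact_info stays restricted to the highest tier.
print(visible_profile(profile, "employee"))
```

Defaulting unknown roles to an empty set means access must be granted explicitly, which is the least-privilege posture the tenet describes.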
4. If data is non-essential, give employees the option to use an alternative or to opt out of sharing.
Data is personal and messy. If it’s not critical, we should make sharing it optional. If there is another approach, we should pursue or enable it. For example, if an employee suspects that they may be discriminated against for indicating they need childcare flexibility, we should enable them to opt out of sharing that information.
5. Give users control over communications.
Access to users is a privilege that’s easily abused. Users should have visibility into and control over the ways in which we can contact them and why. Only business-essential communications, like terms of service updates, should be outside the user’s control.
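One way to encode this tenet is an opt-in preference check where only a small, explicit set of business-essential categories bypasses user control. This is a sketch under assumed category names, not Scoop's actual notification system.

```python
# Hypothetical business-essential categories that cannot be disabled.
MANDATORY_CATEGORIES = {"terms_of_service", "security_alerts"}

def can_send(category: str, user_prefs: dict) -> bool:
    """A message may be sent only if its category is business-essential
    or the user has explicitly opted in. Unknown categories default to
    opted out, so new message types can't silently reach users."""
    if category in MANDATORY_CATEGORIES:
        return True
    return user_prefs.get(category, False)

prefs = {"product_updates": True, "marketing": False}

can_send("terms_of_service", prefs)  # always allowed
can_send("marketing", prefs)         # user opted out, blocked
```

Defaulting unlisted categories to `False` puts the burden on the product to ask permission up front rather than forgiveness after the fact, matching tenet 2.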
6. Where there are potential data inaccuracies, clearly communicate to users what we’re measuring.
Explain how data is being collected and how it might contain errors. Don’t present data as 100% accurate if it isn’t. For example: “Scoop will only access your location when you are at or near your workplace, and will stop using it once your arrival is confirmed.”
7. Where concepts are difficult to measure and susceptible to bias, don’t make things worse.
We shouldn’t shy away from complex topics like measuring productivity, but we’ll need to be diligent in understanding how it’s measured already, and go out of our way not to accelerate bias. Where possible, we should work to add clarity to the system and counteract historical systemic biases.
8. Look for ways to reduce the harm that can be done with our product, even if we can’t eliminate all potential harm.
There are easy wins that limit harm like blocking certain words in free text entry, or limiting the number of locations that can be tracked without admin approval. Let’s not let perfect be the enemy of good.
9. Understand our role, don’t overstep.
Some issues need to be handled by a company’s HR department. We can reduce opportunities for issues like harassment between employees, but when harassment occurs, a software feature like colleague blocking would not be a reasonable or comprehensive response.