Member Accountability
Design
Three key safety features will be built into the platform: accountability, informed choice, and education. While we will use moderators where necessary, active policing of all members by a select few individuals leads to all kinds of issues and isn't scalable. Rather, we will empower the community with powerful tools to moderate itself.
We cannot prevent bad users from joining the platform, but we can hold everyone accountable for their actions. Sexually predatory users, hosts who make inappropriate sexual advances on their guests, people who send creepy messages, and others who make members feel uncomfortable or unsafe will be held accountable by having that behavior reflected in some way on their public record.
Theoretically, this is already a feature of Couchsurfing™ in the form of negative references. However, negative references are so stigmatized and so rarely used that they are not effective. Most users have 100% positive references, so it is hard to tell from references alone whether an experience with any given member will be safe.
By redesigning the review system, we make users more likely to leave negative reviews, while not letting one negative experience or mistake end a member's ability to take part in couch surfing. Members' scores will mostly fall in the 60-80% range, which will make leaving negative reviews less of a 'punishment'. Asking a couple more questions will also produce more nuanced answers than a simple positive/negative rating. For instance, there may be situations where a host makes sexual suggestions to their guest, who brushes it off but is made to feel uncomfortable. That guest may say they would still stay with that host overall, while still signaling that discomfort. The set of questions is now a bit broader, but remains simple. Users can leave these reviews anonymously and will also be prompted for a public reference that is not attached to a positive or negative rating. This will give high scores to great members, whom people feel safe around and have a good time with. We will look into implementing techniques from differential privacy to make it much harder for receivers of negative feedback to know which particular member wrote that feedback, while making it resistant to misuse.
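To give a rough sense of the differential privacy idea, here is a minimal sketch using randomized response. The function names, the epsilon value, and the sample sizes are illustrative assumptions, not the platform's actual design: each reviewer's negative/positive answer is randomly flipped with some probability before being stored, so no single stored answer identifies its author with certainty, while the overall rate of negative experiences can still be estimated.

```python
import math
import random

# A minimal sketch, assuming a simple local differential privacy scheme
# (randomized response). Function names and the epsilon value are
# illustrative assumptions, not the platform's actual API or parameters.

def randomize_answer(is_negative: bool, epsilon: float = 1.0) -> bool:
    """Record the reviewer's true answer with probability p, otherwise flip it.

    With p = e^eps / (1 + e^eps), no single recorded answer reveals the true
    answer with certainty, yet the aggregate remains estimable.
    """
    p_truth = math.exp(epsilon) / (1 + math.exp(epsilon))
    return is_negative if random.random() < p_truth else not is_negative

def estimate_negative_rate(recorded: list[bool], epsilon: float = 1.0) -> float:
    """Unbiased estimate of the true fraction of negative answers,
    correcting for the deliberate noise."""
    p = math.exp(epsilon) / (1 + math.exp(epsilon))
    observed = sum(recorded) / len(recorded)
    # E[observed] = p * true_rate + (1 - p) * (1 - true_rate)
    return (observed - (1 - p)) / (2 * p - 1)

# Example: 200 reviews, 30 of which were genuinely negative (15%).
true_answers = [i < 30 for i in range(200)]
recorded = [randomize_answer(a) for a in true_answers]
print(round(estimate_negative_rate(recorded), 2))  # roughly 0.15, noisy at this sample size
```

The trade-off is tunable: a smaller epsilon gives each reviewer stronger deniability but makes the aggregate estimate noisier, which is part of what we would need to evaluate before committing to this approach.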
Given that a member's score now accurately reflects their actions and holds them accountable, people can make informed choices. Everyone has different tolerances for what makes them feel uncomfortable or unsafe, so each person can decide which members they interact with by filtering on score. Further, there will be the ability to filter by sub-community: for instance, women may want to see the community standing of a male host just amongst other female surfers. This way, predatory hosts can be filtered out of the community.
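To make the filtering idea concrete, here is a minimal sketch of computing a member's standing within a sub-community and filtering hosts against a surfer's own threshold. The data model and names (Review, community_standing, filter_hosts) are assumptions for illustration, not the platform's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative data model only; field names, the gender field, and the
# score range are assumptions, not the platform's actual schema.

@dataclass
class Review:
    reviewer_id: int
    subject_id: int
    reviewer_gender: str   # e.g. "woman", "man", "non-binary"
    safety_score: float    # 0.0-1.0, derived from the review questions

def community_standing(reviews: list[Review], subject_id: int,
                       reviewer_gender: Optional[str] = None) -> Optional[float]:
    """Average safety score for a member, optionally restricted to a
    sub-community of reviewers (e.g. only reviews written by women)."""
    relevant = [r.safety_score for r in reviews
                if r.subject_id == subject_id
                and (reviewer_gender is None or r.reviewer_gender == reviewer_gender)]
    return sum(relevant) / len(relevant) if relevant else None

def filter_hosts(reviews: list[Review], host_ids: list[int], min_score: float,
                 reviewer_gender: Optional[str] = None) -> list[int]:
    """Keep only hosts whose (sub-community) standing meets the surfer's threshold."""
    return [h for h in host_ids
            if (s := community_standing(reviews, h, reviewer_gender)) is not None
            and s >= min_score]
```

In practice this would be a database query rather than an in-memory scan, but the principle is the same: the score a surfer sees can be computed relative to the sub-community they trust most.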
There will still need to be governance to deal with cases such as explicit harassment and assault. To prevent such users from re-entering the platform by creating a new account, we will introduce a better verification system.
Similarly, if you receive a creepy message, you will be able to flag it, which will reduce the sender's score. You will also be able to filter who can send you messages by score, keeping most creeps out of your inbox.
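As a sketch of how this could work, the snippet below gates incoming messages on the sender's score and applies a small penalty when a message is flagged. The threshold and penalty values are illustrative assumptions; the real values would need careful tuning and moderation safeguards.

```python
# Hypothetical sketch of message gating and flag penalties; the penalty size
# and thresholds are illustrative choices, not decided platform values.

FLAG_PENALTY = 0.05  # fraction of score removed per upheld flag (assumption)

def apply_message_flag(sender_score: float) -> float:
    """Reduce a sender's score when one of their messages is flagged."""
    return max(0.0, sender_score - FLAG_PENALTY)

def can_message(sender_score: float, recipient_min_score: float) -> bool:
    """Each recipient sets their own minimum score for incoming messages."""
    return sender_score >= recipient_min_score

# Example: a recipient who only accepts messages from members scoring 0.6+.
print(can_message(sender_score=0.72, recipient_min_score=0.6))  # True
print(can_message(sender_score=0.45, recipient_min_score=0.6))  # False
```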
Education will be important, and we are exploring how this will be implemented. The key messages to get across prominently are that the platform is not a dating or hook-up app, and that people should be aware of the vulnerabilities and power imbalances created when someone receives hospitality from another person. One way of doing this is to remind users of these issues frequently, for example by having each new user tick a box with a few short reminders about how to be a good couch surfer, or by displaying messages throughout the app.
Similarly, if users join just for free accommodation and take advantage of hosts' hospitality without properly engaging, this will be reflected in their score, so hosts can filter them out if they repeatedly act that way.