Design Challenge: Sometimes strangers who have gone through similar situations can give the best advice when we’re going through rough times in our lives. How can you design an environment where people who are seeking help feel safe to connect and support each other?
Four weeks (2020)
Many people who are going through challenging times don't have an outlet to reach out and connect with others, whether due to personal circumstances or external factors. Additionally, existing platforms that attempt to offer supportive communities online are difficult for many people to join and navigate because of fear of judgement, doxxing, and general discomfort. How can we design a space that fosters trust and openness in a digital environment?
Aure is an app concept that provides a safe space for people to share their experiences with each other anonymously. My goal was to create an approachable, calming experience that is easy to learn, minimizes emotional labor, and welcomes people of all ages and backgrounds: a light in the dark.
Before I started solving anything, I broke the issue at hand down to its root causes. The need for community and open conversation between people who share the same experiences is currently unmet, leaving people to endure difficult times without the much-needed help, support, and guidance of peers. We want to make it easier for people to seek help, connect, and support one another. This raised the core questions:
With these main points of inquiry in mind, I interviewed ten different people about their experiences with going through challenging times, as well as their approaches to healing and consolation.
Select Research Questions
Amongst the people I interviewed, a common thread was how they decided whether or not to trust somebody with their thoughts and experiences, no matter who the recipient was. Whether it was a therapist, a close confidant, or even a person they met online, they were only comfortable opening up if that person could be presumed to be unbiased and non-judgmental.
Fear of Bias + Judgement
Comfort + Bonding
Having sought out online communities for support myself, I found it important to analyze how other platforms are trying to meet this demand, and to note the problems and solutions that have emerged as those platforms have been appropriated for such a sensitive need.
One of the most important takeaways from studying how other platforms host support communities is the way a platform enables certain types of relationships and dynamics to form. Specifically, I focused on whether an application allows users to direct message one another, and how that capability can alter the course of a well-meaning supportive dynamic and consequently affect a user's behavior and emotional wellbeing.
The key finding in this area was the decision of some Reddit sub-communities to discourage the use of DMs in mental health support groups. People who are struggling with serious mental health issues often (justifiably) have a low tolerance for disappointment and a high level of ever-changing emotional need. Unless the helper can make a total commitment to be there for them in every way, for as long as necessary, offering a personal inbox as a resource is likely to do more harm than good. No individual, let alone a single app user without proper training in therapy or social work, is equipped to single-handedly shoulder someone else's struggles. It's much safer and healthier for users to develop a relationship with the community as a whole, ensuring that someone is available to reply at any time and preventing compassion fatigue in any one person.
We want to make it easier for people to speak and connect with others who've gone through similar experiences. The underlying problem is that people aren't comfortable sharing their identities and struggles with strangers online. As a result, they have no choice but to bear the weight of their problems alone, without the much-needed aid of others who've been through the same things. Additionally, current platforms that try to meet this demand for empathetic online communities inadvertently foster dynamics that can be more harmful than beneficial to users.
From these statements and the traits observed in my interviews, I created two personas: Phoebe, a teenage student apprehensive about confiding in the web in such a personal manner, and Cameron, an adult seeking help on the internet for the first time.
With both Phoebe's and Cameron's needs in mind, which together span an audience diverse in age, familiarity with tech, and experience with the struggles that would prompt them to approach the application, I created three goals:
I started off by brainstorming possible screen layouts and user flows. While it was useful to draw inspiration from the structure of existing platforms with community forums, like Reddit, no popular app specifically targets an audience looking for people to connect with and talk to about their experiences with dark times. The following are my sketches and annotations, delineating my thought process and iterations.
From the sketches, I narrowed down the screens that would require the most focus and iteration: the Onboarding process, the environment a user is introduced to (via the Home Screen), and the Chat Room functionality. These would be integral to meeting the goals of this design challenge.
From this discussion I created a set of low-fidelity wireframes. I focused on the key screens in a typical user flow, fleshing out the architecture of each screen with special attention to the interface's written prompts and the instructions that would guide a user. These screens functioned as a bare-bones prototype of the entire app, ready for user testing.
With the main functionality down, I showed this low-fidelity prototype to three users and observed their interactions. Their verbal feedback during and after the testing provided me with the following insights:
Having users test the low-fidelity wireframes proved extremely useful in understanding users' specific needs. From their feedback I decided to add another feature to the chat rooms, addressing a need I had initially overlooked: the ability to choose the type of people you speak to.
With the structure of my app completed, I got to work designing the visual style and elements of the interface. I chose a calming color palette of cool blues and soft peaches, echoing the colors of a sunset or a scene in nature. It was important to maintain a neutral and passive visual presence so as not to impose any emotions or schemas onto a user. Aure is meant to be a safe space, away from situations and applications that can induce stress and agitation.
With a solid foundation laid out, I created a set of high-fidelity wireframes to fully visualize the final product. I paid extra attention to the breathability of the interface: how neat, uncluttered, and aesthetically calming each screen was, so users would have space to think instead of being overloaded with visual noise.
The onboarding process, a user's first impression of the app, was designed to be friendly and welcoming. The copywriting of each registration screen is carefully considered, with attention to how different people in need would perceive and react to subtle shifts in tone and wording. Users are asked to choose a username, which is the only mode of identification on the application and ensures the user's anonymity while interacting with others.
The Search button of the bottom menu bar brings up the Search page, where users can search for communities and tags to locate threads and groups they may be interested in joining. Individual posts in communities can also have tags for searchability and ease of categorization.
Users can choose in their settings whether to opt in to Advanced Chat Matching, which uses their age, gender, and the communities they are part of as data points to inform chat room placement. People with Advanced Chat Matching activated are placed in chat rooms with people who share similar traits, offering a point of conversation and bonding that aids the formation of empathy, trust, and open conversation.
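To make the idea concrete, here is a minimal sketch of how a matching scheme like this could work. This is purely illustrative: the names (`User`, `similarity`, `pick_room`), the age brackets, and the equal weighting of the three signals are all my assumptions, not Aure's actual design.

```python
# Illustrative sketch of an Advanced Chat Matching scheme: opted-in users
# are placed in the room whose members are most similar to them across
# age bracket, gender, and shared communities. All names and weights here
# are hypothetical.
from dataclasses import dataclass, field


@dataclass
class User:
    username: str                     # the only identifier, preserving anonymity
    age_bracket: str                  # e.g. "13-17", "18-24", "25-34"
    gender: str
    communities: frozenset = field(default_factory=frozenset)


def similarity(a: User, b: User) -> float:
    """Score how alike two users are across the three matching signals."""
    score = 0.0
    if a.age_bracket == b.age_bracket:
        score += 1.0
    if a.gender == b.gender:
        score += 1.0
    union = a.communities | b.communities
    if union:
        # Jaccard overlap of the communities the two users belong to
        score += len(a.communities & b.communities) / len(union)
    return score


def pick_room(user: User, rooms: dict) -> str:
    """Place an opted-in user in the room whose members score highest on average."""
    best_room, best_score = None, -1.0
    for name, members in rooms.items():
        avg = sum(similarity(user, m) for m in members) / len(members)
        if avg > best_score:
            best_room, best_score = name, avg
    return best_room
```

For example, a newcomer in the "18-24" bracket who belongs to a grief-support community would land in a room whose members share that bracket and community rather than in an unrelated one. A real system would need more signals and safeguards, but the core idea is simply scoring trait overlap.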
With the intention of curating a safe space where users could open up emotionally, especially when some of them may be high risk, I had to pay much more careful attention to the exact tone and likely reception of the text in my designs. While I was used to making text "fun and quirky" and personable, it was a whole different experience making sure phrases were worded precisely in order to make people feel safe.
This project's topic was particularly close to home for me, as a person who has often struggled to find community around specific topics that need healing. It was really helpful to re-examine the spaces and social media I'm currently a part of, and to systematically identify why they don't work for people like me and what needs to change.