No technology company is safe from the potential misuse of its product or service by bad actors. Least of all social networks that deliberately try to attract users who have been banned or shunned on other platforms for their disruptive and counterproductive behaviour. We’re talking about Parler, a so-called alt-tech platform whose star fell as quickly as it rose in 2020.
Initially, the platform proclaimed itself a “safe space” (or rather, an echo chamber) where users could express their opinions without regulation or censorship. But Parler fell victim to the social risks that have sealed the fate of many other alternative social networks across the political and ideological spectrum: anti-social behaviour and hijacking by bad actors.
While Parler, a hotbed for xenophobia, antisemitism and right-wing extremism, might not be a prime example of technology acting responsibly towards society, there is a lot to be learned from its struggle to maintain trust and confidence in its product.
Parler is imploding. pic.twitter.com/tmfk52666q
— Parlertakes (@parlertakes) December 14, 2020
Sounds interesting, but what exactly is Parler?
Parler is “a non-biased free speech-driven entity” founded by two computer scientists, John Matze and Jared Thomson. It received its funding from angel investors like the prominent conservative donor Rebekah Mercer, as it doesn’t display any ads and is not yet profitable.
From a UX standpoint, it’s similar to Twitter, with posts that can be shared (or “echoed”), rated and commented on. Its mission statement to “enable productive, polite discourse among people with differing interests, life experiences, and viewpoints” stands in strong contrast to Parler’s active users, who are overwhelmingly right-leaning and to some extent radicalised.
The network is most popular in the Americas, specifically the U.S. and Brazil. However, it had initially caught on in the Middle East and in Europe too. At one point, conservative members of the British parliament rubbed shoulders with anti-government extremists and QAnon conspiracy theorists.
Parler is often mentioned in the same breath as other alt-tech platforms that cater to right-wing communities such as 4chan.
Why did it become so popular then?
The emergence of alternative social media networks is fuelled by users’ increasing dissatisfaction with existing platforms. The messaging app Telegram gained popularity by explicitly emphasising its focus on privacy and data encryption features that WhatsApp is reportedly struggling to implement.
In a similar way, Parler saw a steep increase in user registrations after Twitter changed its policy towards the tweets of the current U.S. President Donald Trump. While labelling disputed claims seems like a drop in the ocean in the fight against misinformation to some, to others it was enough to boycott the platform and switch to Parler – at least temporarily.
Its community guidelines are fairly short. They emphasise Parler’s minimal involvement in monitoring and removing content – something that clearly appealed to an audience disgruntled by what they perceived as censorship by bigger tech companies.
The chronological layout of users’ feeds was seen as a positive contrast to the sometimes opaque algorithms of other platforms, which favour certain posts without any real explanation. Creators on platforms such as Instagram accuse the tech giant of bias and censorship every time a new algorithm update is rolled out. By displaying posts in the exact order they were published, Parler tried to circumvent this issue.
I can’t get past the captcha! Anyone know a fix? Beyond frustrating..I’ve tried on iPhone, pc and the app ?
A quote from a disgruntled Parler user who struggles with some of its features. Source: “Who Is Moving To Parler?” Facebook group
So, where did it go wrong?
Firstly, proclaiming itself a safe haven for individuals who have been banned from conventional social networking sites for breaching community guidelines set the tone for the type of discourse to come.
Secondly, Parler fell victim to something tech2impact and The Institute of Technological Ethics describe as “social risks”: the consequences of bad actors using technology in unintended and harmful ways. There are two key social risks that lead users to turn away from alternative social networks, including Parler.
Anti-social behaviour
One of the main reasons Parler has taken a hit with its user base is the series of reports of high-profile QAnon figures being impersonated on the platform. QAnon is a (disproven) far-right conspiracy theory, driven by anonymous leaders who post regular “updates” on fringe websites. An administrator of one of these websites has publicly spoken out about Parler after somebody created a verified profile pretending to be him.
Another example was the fake profile of right-wing English media personality Katie Hopkins. After she was banned from Twitter for violating its policies, an account claiming to be Hopkins was set up and quickly verified on Parler. This account went on to raise $500 for a supposed lawsuit against Twitter that was, of course, fabricated. The CEO of Parler ended up publicly apologising and verifying the real Katie Hopkins himself, but by that point users had already grown suspicious of the platform.
Bad actors & hijacking
Letting misinformation and conspiracy theories spread unchecked might have been one of Parler’s core USPs. But the same practices that were put in place to “ensure free speech” (i.e. using volunteers rather than algorithms to filter content) left Parler susceptible to third parties utilising the platform for their own gain.
Reportedly, the platform is an easy target for spam: pornographic material posted by actors using the site to promote their own services. Mechanisms that should disable the auto-play of videos do not always seem to work, leaving users confronted with graphic imagery they might not want to see.
Another example:
MeWe, a different alt-tech network, is currently grappling with gun and drug dealers hijacking its platform. Its mission statement was to give “the power of the internet back to the people”.
Hey @GregAbbott_TX! This dude is saying stuff on Parler. ✌️ pic.twitter.com/guvMdDJMQW
— Parlertakes (@parlertakes) December 14, 2020
With that being said, what are the implications of all this for tech founders?
Technology founders who set out to create alternative social media platforms alongside the quasi-monopolists Facebook, YouTube and Twitter need to be extremely careful not to create something even more harmful. Building an environment that adds value rather than harm to their users’ lives can be a tightrope walk.
The implications of social risks are detrimental to users’ wellbeing, self-esteem and mental health. While bigger companies like Apple and Microsoft have their fair share of controversies in this regard, their operations are more robust thanks to their size and experience. A startup’s reputation and performance, on the other hand, can be heavily affected by even a single incident.
Without taking social risks into account, tech companies are vulnerable to anti-social behaviour and bad actors that can taint the experience of their users. In Parler’s case, these and other factors are leading to the erosion of meaningful interaction and a drastic slowdown of its growth (which definitely isn’t a bad thing, all things considered).
But regardless of Parler’s twisted business model and mission statement from the get-go, there are valuable lessons here for startup founders about the need to plan ahead for social risks of all sorts.
If you want to find out more about your startup’s status quo when it comes to social risks and other topics such as business ethics or socio-cultural understanding, take the Responsible Technology Assessment here – it only takes 15 minutes!