Pixstory Striving to Address the Need for User Safety On Social Media

November 23, 2021 by Dan McCue
Appu Esthose Suresh, founder and chief executive of Pixstory.

WASHINGTON – Social media, once a venue for pet pictures and catching up with friends from high school, has turned toxic in many cases. And it’s almost impossible to escape the most divisive of political commentary.

Among those trying to address the issue is Appu Esthose Suresh, founder and chief executive of Pixstory, a social media platform he’s promising will place people above profits and keep them there.

“What we’re trying to do is show you can successfully challenge the existing business model that until now has rewarded misinformation and bad behavior,” Suresh said during a recent Zoom call with The Well News.

His ideas on policy and where the social media sector should be heading may well be a road map for legislators grappling with data privacy and bad actors.

TWN: I’ll start with the obvious question: I’ve got my Twitter and my Facebook, my Instagram and my TikTok … everywhere you turn, somebody’s curating one of these accounts. Why does the world need another social media platform?

AES: Well, I would argue you need a new one for all the reasons people are now criticizing the existing social media platforms. What you need is a social media platform that prioritizes user safety and ethics over everything else.

It is my own personal opinion that the two largest threats to humanity right now are climate change and hate and polarization through misinformation in the internet space.

And I think we need to approach trying to solve these problems in different ways than we have in the past. When it comes to the things that have gone wrong with social media, the founders of the existing social media platforms look at everything from the perspective of technology.

Their assumption is that technology, which has been the enabler of the problem, will also be the solution. I don’t think that’s right. Our thing, with Pixstory, is that we need to redefine the business model so that it doesn’t profit from hate or disinformation, and in fact disincentivizes hate and disinformation.

What we’re saying is you need to build a whole new ecosystem that rewards decent exchanges between people. Our business model is: good ethics is good business.

And one thing you’ll notice on our platform is that there is no instant validation of a story on our site. There are no likes, there are no dislikes. We take a strong stand against hateful content. We have zero tolerance for that. And we have created a very open, transparent ecosystem where people won’t feel like they are being victimized.

TWN: Okay, let me make a counterargument. Say I started a social media platform 15 or 20 years ago, I cared and was conscientious about it, but I just didn’t anticipate the bad actors and other things we encounter today. And now these web pages are highly complex systems, ecosystems as you call them. Are you, in effect, trying to capitalize on the ire the Facebooks of the world have inspired, and doing things today to safeguard users that weren’t available or thought of when they started out?

AES: That’s a very, very good question. Pixstory was in development for a long time before we launched it earlier this year. It wasn’t like we woke up six months ago when there was a big debate around social media and technology and said, let’s get into this, there’s an opportunity here. The ideas behind Pixstory had been brewing for some time.

But I do think one of the disadvantages that some of the bigger, older tech and social media platforms have run into is that they started out connecting people and then evolved into something else.

Now, in a way you’re correct. We are coming along at a time when we can say we don’t want to adopt that business model. I’ll even give them the benefit of the doubt and say they didn’t anticipate their platforms becoming what they are today. But they made the decisions they did about what to do, when and where.

They prioritized interactions, even if it meant marketing user information, and even if it came with a host of other problems.

What we are striving to do is combine tech design and a moderation scheme. By which I mean we’ve consciously made a trade-off: with our business model, you can’t have astronomical profits. You can’t have unlimited growth for the sake of growth. User safety has to be the priority.

TWN: You’re coming to market at a time when there is a lot of scrutiny of this industry from Congress. The rules of the road for social media, in theory, could change at any time. How scary is that for you?

AES: I feel exactly the opposite. I think when the regulatory changes come down, we’re going to benefit by being an early adopter or early qualifier, if you will, under those regulations.

I had the opportunity to meet with all the high-ranking officials in the European Union, which is actually leading the way when it comes to rules and regulations for this space, and one of them said to me, “It sounds like what you’re doing is going to be the next big thing.” Now, I’m not saying that to brag, and it’s no exaggeration. That’s a quote from an actual conversation.

And I think, because our starting point is safety and privacy and respect for data protection, we’ll do very well with respect to the regulatory shocks other companies will be grappling with for the next five to seven years. In a sense, I would say we are kind of like electric vehicles. It has been harder for the larger car companies to fully embrace electric vehicles and to make that transition.

One of my favorite stories about this kind of thing involved Preston Tucker. In the late 1940s, owning a car had become a middle-class aspiration in the United States. But the more cars that were sold, the more fatal accidents there were.

Why?

Because nobody factored in safety measures. The big auto manufacturers knew about the problems, but didn’t want to invest in safety. Can you imagine building a car today without shatterproof glass and other safety features we now take for granted?

Well, Preston Tucker tried to build safety into his car, the Torpedo, by incorporating safety features into the design and configuration of the headlights and so on. At the time, the industry was able to block those innovations and Tucker wasn’t able to fulfill his dream. But now, decades later, everyone is doing what Tucker wanted to do.

TWN: You mentioned before that your dream had been in the works for some time … tell me about the roll out?

AES: We started working on this in 2017, and our public beta launch was this past March. And we were lucky to get early support from people in the United States like the basketball star Dwight Howard, and we’ve been attracting attention from a lot of other heavy hitters — public officials, sports stars and celebrities that have similar concerns.

TWN: But how did you actually roll it out? I mean, you can’t exactly go on Twitter and say, “The heck with these guys, come over to our space”?

AES: You reach out to people. For instance, we’ve been successful in getting some strategic partnerships with some of the top football clubs in the world. And that’s because footballers were facing unfettered hate and racism, had been protesting, and yet nothing was changing.

So our approach was: Hey, we are doing exactly what you’re asking the industry to do, so let’s collaborate. Let’s build this together, so that what we build here also builds a large coalition of people and alliances that will make a safer internet a reality.

TWN: Right …

AES: It’s the same thing with climate change. People have been protesting for years and nothing was happening. Nothing was changing. But as soon as you actually started engaging people in other countries, and educating them about the fact that together we can address climate change, you began to see movement on the issue. It’s the same thing here.

TWN: Let’s talk about user experience. I happen to have your website open on my browser. And … your page looks a lot different from Twitter or Facebook. There’s a lot of white space here, things like that. It seems to function a bit differently than other social media sites do. And as much as people complain about their current alternatives, are you afraid at all that they’ll resist switching to the unfamiliar?

AES: What we’ve tried to do is incorporate some checks and balances right into the design. And what we created is a storytelling platform. We encourage people to tell stories that are near and dear to their hearts, but there are a couple more steps now, in terms of the user journey.

And we think that makes for a more thoughtful process, because it gives you a moment to pause before you react and really think about what you are trying to say.

At the same time, we’ve gotten away from your validation being how many followers you have. Instead, your visibility is a function of how reliable you are. Every time you post content that is devoid of hate or active misinformation, our ecosystem takes notice of that and raises your visibility. 

And if you make a mistake, a bona fide mistake, you have a chance to take the post down and edit it. And it’s all good. And if you don’t take it down, then we will, which means there’s always a disincentive against being hateful or spreading misinformation. And that’s how you incentivize a change in user behavior.
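Suresh doesn’t describe how Pixstory actually computes that visibility signal, so the following is only a minimal illustrative sketch of the incentive he outlines: a reliability score that rises with clean posts and falls with flagged ones, and reach that scales with that score. Every function name, weight, and threshold below is an assumption for illustration, not Pixstory’s implementation.

```python
# Hypothetical sketch (not Pixstory's actual code): reliability rises with
# clean posts, falls with flagged ones, and visibility scales with it
# rather than with follower counts.

def update_reliability(score: float, post_was_clean: bool) -> float:
    """Nudge a user's reliability score after each moderated post."""
    if post_was_clean:
        return min(1.0, score + 0.02)   # small reward for content free of hate/misinformation
    return max(0.0, score - 0.10)       # sharper penalty for flagged content

def visibility_rank(reliability: float, base_reach: int) -> int:
    """Scale how widely a post is surfaced by the author's reliability."""
    return int(base_reach * reliability)

# Example: three clean posts followed by one flagged post.
score = 0.5
for clean in [True, True, True, False]:
    score = update_reliability(score, clean)
print(visibility_rank(score, base_reach=1000))
```

The one design point the sketch illustrates is that the penalty for a flagged post outweighs the reward for any single clean one, which is the behavioral disincentive Suresh describes.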

I mean, we all know that people tend to be hateful or post controversial stuff and spread misinformation because that attracts attention, and then the algorithms of the particular platform you’re on amplify those things.

I mean, there’s a reason why, on a very prominent social media platform, an angry emoji gets five times the weight of any other response.

TWN: I’m looking at some of the content here on Pixstory. I see one item here is a picture and a caption. Another is more like a blog post or news story. How long does it take for you to check individual posts to see if they meet your standards and is the process different depending on the nature and form of the post?

AES: Well, it begins with your users knowing there’s a standard in place. We tell you upfront we don’t want explicit material or material that violates our community safety standards. Then, as soon as you post something, we’re checking it on the back end, specifically looking to see whether it’s hateful or contains misinformation. If something registers as extremely hateful, we take it down.

Now, we also realize that no system is perfect at capturing everything, so we also allow users to report content that they find hateful.

The other part of this, of course, is, what if something isn’t hateful, but it is controversial? The way we address those cases is to try to add value to that conversation. It’s why we don’t have a simple “like” or “dislike” function. Instead, we have “support” and “challenge.”

Perhaps someone is seeing exactly the same situation that you are posting, but has a completely different perspective on it because they live in another country or belong to another religion. In other words, we’ve designed our platform to encourage people, in the end, to come away from an interaction not with anger, but with a sense that, “Okay, everybody is entitled to their own opinion.” And I think that’s interesting.
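The interview gives only a high-level picture of that moderation flow: an automated check when a post is submitted, takedowns for extremely hateful content, user reports as a backstop, and “support”/“challenge” reactions in place of likes. As a rough, hypothetical sketch of that flow (the classifier, thresholds, and data model below are stand-ins, not Pixstory’s actual system):

```python
# Hypothetical sketch of the moderation flow Suresh describes; the classifier,
# thresholds, and data model are illustrative stand-ins.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    supports: int = 0
    challenges: int = 0
    reports: list = field(default_factory=list)
    visible: bool = True

def hate_score(text: str) -> float:
    """Stand-in for an automated hate/misinformation classifier (0.0 to 1.0)."""
    flagged_terms = {"hateful-example"}           # placeholder heuristic
    return 1.0 if any(t in text.lower() for t in flagged_terms) else 0.0

def submit(post: Post, takedown_threshold: float = 0.9) -> Post:
    """Back-end check at posting time: extremely hateful content is taken down."""
    if hate_score(post.text) >= takedown_threshold:
        post.visible = False
    return post

def report(post: Post, reporter: str, review_threshold: int = 3) -> None:
    """User reports act as a backstop; enough reports pull the post pending review."""
    post.reports.append(reporter)
    if len(post.reports) >= review_threshold:
        post.visible = False   # stand-in for routing to manual review

def react(post: Post, supportive: bool) -> None:
    """'Support' or 'challenge' instead of a bare like/dislike."""
    if supportive:
        post.supports += 1
    else:
        post.challenges += 1

# Example: a post passes the automated check, then gathers reactions.
p = submit(Post(author="user123", text="A story about my neighborhood."))
react(p, supportive=True)
react(p, supportive=False)
```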

TWN: How do you see the future playing out for social media? 

AES: So, I’m going to bite the bullet as I answer that. I think social media is going to become something like the oil and gas or pharmaceutical industry in that there will be a high barrier to entry and the major players will continue to run things the way they want.

But there will also be higher checks and balances. Regulation is definitely coming. What I fear is that it will take many years to take effect because it’s going to be a process. As you’ve seen time and again in Washington, even as a law is being passed, it’s being challenged. Things drag on and on.

And my concern, my fear, is that the world doesn’t have that kind of time when it comes to the danger associated with what’s going on right now on social media. We need to intervene now.

Which is why I think it is incumbent upon newcomers to the space to first, not be intimidated by the size of the big players — the big players were once small players too — and second, to reimagine the homepage and how it functions. Find best practices and implement them. Look at the design of your platform. Look at how to build moderation into it.

I think the drive for more social responsibility from our social media platforms will continue to grow, in one form or another. And I think there will be more major social media platforms to come.

Is Pixstory going to compete with Big Tech? Probably not, but I think we will complement what they are doing, without the obsession with astronomical growth and unlimited profits. And I think that while newcomers to the sector will experience some initial inertia, over time the number of new, innovative, socially responsible social media platforms will grow significantly.
