
IMPROVING SOCIAL MEDIA
The Purpose of this Report
It is time for greater action toward improving social media.

This report, Improving Social Media: The People, Organizations and Ideas for
a Better Tech Future is directed at policymakers and social media platforms
with the express purpose of creating a more holistic, collective approach
to improving social media. It also foregrounds the diverse ecosystem that
affects and is affected by social media and aims to promote a culture of
knowledge-sharing and collaboration that is necessary for this proactive
approach.

We fully recognize the complexity of improving social media and how it
interacts with economic, social, and historical influences. That is precisely
why the team that developed this report includes a diverse group of 100
individuals from a wide variety of backgrounds and perspectives. This
report also includes more than forty interviews with an expansive range of
experts and thought leaders from industry, academia, policy, research,
startups, and advocacy organizations.

The aim of this report is to create a more comprehensive, proactive
approach that addresses the complex sociotechnical issues related to
social media.

Collaborative Connected Collective


We know that there is a problem with the state of social media and how it
affects our civil liberties, wellbeing, and information ecosystem. Solving
these problems will require a far more thorough approach that
demonstrates cross-sector understanding of the interlocking roles of
platforms, users, policymakers, tech workers, news media, advertisers,
funders, and the Knowledge Base that informs the full ecosystem.

Social media dramatically impacts our future. This report will help us move
toward co-creating a tech future that we want to live in.



Table of Contents

Our Holistic Approach
6 Welcome, All Tech Is Human
7 Interlocking Roles
8 Explaining the Roles
9-11 Explaining Our Process

Issues & Grid Approach
12-14 Overview of the Issues
15-16 Grid Approach in Action

Community Interviews
17-108 Diverse Range of 42 Interviews

Organizations & Resources
109-120 Listing of Organizations & Resources

In Conclusion
122 Contributors
123 Next Steps
124 In Summary

Stay in Touch
125 All Tech Is Human
125 The Bridge (Community Partner)
126 Contact Info


"For my part, I would be encouraged to see more people involved with a deep and practical
understanding of human psychology and the ways to support our most vulnerable populations."
-Dona Bellow, Responsible Innovation Manager, Facebook

"The conundrum is not exactly whether platforms moderate too much or too little. it is more about
whether platform policies around moderation are guided by the kinds of considerations that social
media companies operating in democratic, decidedly anti-racist societies should prioritize."
-Oumou Ly, Staff Fellow, Berkman Klein Center for Internet and Society

"While I'd much rather use the carrot, there are times the stick is definitely needed. And, as expressed
through our commitment to safety by design, we absolutely believe that industry has to do better in
making their platforms safer, more secure and that they need to be both more transparent and
accountable for harms that take place on their platforms."
-Julie Inman-Grant, eSafety Commissioner of Australia

"This cannot just be left to technologists to solve and lawyers to debate. Social media has a
profound impact on all of society, and we are all stakeholders in the ultimate solutions."
-Yael Eisenstat, Democracy activist; former Facebook elections integrity head; former diplomat,
intel officer and White House advisor



"The scale of the problems of platform management, especially content moderation, but also
cross-cultural complexities, push platforms to look for automated, AI-driven solutions that lack
nuance and often punish more users who do not look like a platform's imagined average user."
-Amanda Lenhart, Program Director, Health + Data, Data & Society Research Institute



Let's improve social media.
No matter what your background is, most of us agree that the current state of social
media is neither ideal nor sustainable for a healthy democracy. The attack on the US
Capitol on January 6th was a wake-up call that democracy depends on a shared
reality and understanding of truth. This tragedy brought focus to the immense role
that social media plays in influencing the information ecosystem and, by extension,
human behavior.

It is quite clear that there is a problem; what has been more difficult, however, has
been the transition from this "awareness stage" to the far more challenging stage of
working out solutions and improvement. This involves a far greater degree of
participation across a diverse range of groups to ensure that we are truly co-
creating our tech future. This also involves approaching the issues of social media in
a more collective, holistic fashion.

I hope that you find our report on Improving Social Media to be a valuable resource.
Our aim is to provide a more thorough understanding of the complex issues facing
social media, while highlighting a range of ideas for improving social media and
showcasing a variety of passionate people and organizations working toward this
goal. By involving such a wide range of individuals and organizations to make this
report, our intention is also to promote the knowledge-sharing and collaboration
that is necessary in order to tackle such an intractable problem. We need more
voices, more perspectives, more participation.

If you are new to our organization, welcome. The mission of All Tech Is Human is to
build the Responsible Tech pipeline. I believe that we can improve our tech future by
dramatically changing who is involved in it: making sure that the pipeline is diverse,
multidisciplinary, and aligned with the public interest. Our organization has been
actively building a broad community across civil society, government, and industry
since 2018. It's this type of diversity and collaboration that seems utterly necessary
as we work toward improving social media.

Let's co-create a better tech future,

DAVID RYAN POLGAR


Founder of All Tech Is Human
David@AllTechIsHuman.org
The Roles
PLATFORMS
To provide and actively evolve toward an environment conducive to shared
truth, harassment-free communication, and the overall health of democracy;
proactively consider not only the way a platform may be used, but also the
way it can potentially be misused and create adverse societal impact.

KNOWLEDGE BASE
To inform and influence every aspect of the overall social media ecosystem;
develop a culture of knowledge-sharing and collaboration to increase overall
quality and ability to effect change; consisting of researchers, academics,
advocates, and activists.

TECH WORKERS
To build and maintain awareness of ethical considerations in technology;
enrich the definition of product success to include user wellbeing; utilize an
ethics-by-design approach to proactively plan for ethics within the entire
product development lifecycle to shape better design decisions.

POLICYMAKERS
To recognize that social media is a dynamic landscape and will require
ongoing monitoring and regulatory oversight/guidance; consider the differing
needs and experiences within the population; connect multiple stakeholders
in creating legislation and regulation.

USERS
To be educated on the basic structural elements of social media and tech design;
embrace the power they have to effect change online while also learning about
the ways in which their power is mediated through platforms.

ADVERTISERS
To hold platforms accountable as business partners and through monetary
pressure; have brands advocate for and ally with their consumers.

NEWS MEDIA
To offer accountability through journalism that educates the general public and
adequately exposes the mechanisms of social media.

FUNDERS
To vet startups for sound privacy and safety practices and features; alter the
future of social media through their investment decisions.




PEOPLE

There is a diverse range of individuals and organizations focused on improving social media, cutting across civil society, government, and industry. This stands in contrast with the typical media narrative that paints "improving social media" in extremely broad strokes, with individuals either for or against a social media platform.

In actuality, the movement for improving social media is made up of people both inside and outside of major platforms and from a variety of disciplines and deeply held beliefs. This means that we are not looking for a technical fix that will solve all of our problems, but instead trying to understand the complexity and nuance of issues regarding social media. In making this report, we intentionally sought out a diverse range of interviewees and collaborators.

ORGANIZATIONS

Our report aims to highlight many of the organizations that are focused on improving social media. Most of these organizations are addressing specific topics, such as decreasing misinformation, increasing media literacy, or reimagining digital public spaces. Examining the range of organizations, three approaches to improving social media become apparent: Prevention (preventing or limiting negative media impacts), Intervention (regulating, fixing environments/incentives, and protecting users post-harm), and Reinvention (reimagining digital spaces and business models).

By uniting a wide range of individuals and organizations, we intend to encourage a culture of knowledge-sharing and collaboration among a cross-section of groups. This is a recognition that in order to improve social media we need to move away from thinking of it as a "tech issue," and instead appreciate how it is intertwined with messy social, economic, and historical underpinnings.

IDEAS

This means, unfortunately, that there is no silver bullet to be found. There are, however, significant opportunities to improve the current state of social media by tapping into the vast amount of work currently taking place and promoting greater collaboration. Instead of a technical fix, we view this as a cultural change toward collective action.

Our report features the ideas and perspectives of around forty experts, showcasing the diverse range of concerns and proposed "solutions." The team that developed this report is composed of one hundred individuals who illuminated the complexity of the issue.



To build a roadmap toward a better social media future, we first need to decide what we want that future to look like. A major point of contention our report discovered is that there is no widely agreed-upon structure for our ideal social media future. As showcased in our profile interviews, one person's vision for our social media future may be deeply at odds with another person's values in terms of privacy and expression. By bringing this to the forefront of conversation, we can more aptly make decisions that are cognizant of trade-offs and competing interests.

We are entering the politics of technology, which means that we are not looking for the "right answer" but rather a course of action that is best for our individual civil liberties and the public interest.

There is no perfect solution: a movement toward greater content moderation brings accusations of overreach, lack of transparency, and uncomfortable questions about power concentrated in unelected officials. Lessening that moderation, however, may further pollute an information ecosystem that already strains civil discourse and democracy. A push toward more private communication is desirable for privacy advocates and political dissidents—but also for radical movements and the sharing of abusive content that thrives in secrecy. Moving toward a blockchain future may offer greater individual ownership of data, but its lack of centralization may also weaken the ability to maintain a vibrant and safe environment.

While there may be disagreements about the type of platform underlying an ideal social media future, certain common attributes are more widely agreed upon. Understanding these common attributes can help guide us toward a better course of action as we collectively improve social media. The most common attribute involves a social media future that is more respectful of user data and overall freedom of mind, which may involve moving away from the traditional ad-based business model reliant on maximizing attention.

The ideal social media future is more cognizant of harms to vulnerable groups and actively seeks to incorporate these concerns into its structure. This social media future is also better aligned with the interests of democracy, which emphasizes depolarization and a semblance of shared truth. Our ideal social media future allows for meaningful connection and communication across boundaries, which necessitates reconciling our current user/business relationship with our more public-interest expectations.
Our theory of change at All Tech Is Human is based on having a vibrant underlying Knowledge Base (researchers, academics, activists, advocates) that informs and influences all of the power nodes that can create meaningful change toward improving social media. This requires that we move away from our traditional hierarchical approach to social media issues and adopt a more neural approach that considers the various interlocking parts.

As a case in point, consider the recent attention regarding Section 230, which generally grants platforms immunity for hosting user-generated content and provides the right, but not the responsibility, to moderate content. Even though Section 230 has been widely debated in academic circles for years, it wasn't until a confluence of factors came together that it generated growing news stories, user concerns, an advertising pushback, and enhanced pressure from policymakers. Given that the most consequential law facing social media was developed in 1996, during the era of floppy disks and prior to the founding of the major tech platforms, our goal is to massively accelerate society's ability to consider the impact of technology. Our three-pronged approach to improving social media is about moving from reactive to proactive action.

First, there is room for massive improvement in the overall quality of the underlying Knowledge Base that informs the entire ecosystem. This requires breaking down current silos and developing a culture of collaboration and knowledge-sharing. The very process of developing this report made clear that there is a wealth of work that is underutilized. A more vibrant Knowledge Base with an embedded culture of collaboration increases the depth of knowledge by allowing participants to build on existing work.

The second prong is based on building better pathways between the Knowledge Base and all of the power nodes. The coming years will likely feature important policy and regulatory decisions that depend on policymakers having access to quality expertise that considers all angles of an issue. Likewise, a stronger connection between the Knowledge Base and the news media can push us toward a better understanding of complex issues.

The third prong moves us from playing checkers to playing 3D chess: approaching the issues of social media in a holistic, collective way that simultaneously considers all actors and effects.


An Overview of the Issues
By now, the many issues surrounding social media platforms have been well articulated by journalists,
academics, and civil society organizations.

At the root of these issues is the platforms' fundamental business model, which is built on generating
advertising revenue from user data and attention—enabling targeted advertising at a large scale. Labeled
“surveillance capitalism,” this system harvests and leverages personal user data for precision
marketing, maximal engagement, and profit. In a Faustian bargain, users give up information about
themselves and their viewing preferences in exchange for “free” access to the content and
connections these platforms provide. In turn, the platforms use our engagement patterns to
determine what we will see.

On social media, our natural tendency to click and share content that triggers intense emotions,
reaffirms our preconceptions and cognitive biases, and signals our group affiliations is coupled with a
business model of opaque algorithms designed to learn from these choices and continuously feed us
what they think we want to see. The outcome is a feedback loop that fragments online spaces into
silos of like-minded content shaped by user profiles and choices.
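
The dynamics of that loop can be made concrete with a toy simulation. The sketch below is purely illustrative (the topics, click probabilities, exploration rate, and learning rate are all invented assumptions, not any platform's actual ranking system), but it shows how a feed that merely learns which content gets clicked will steadily crowd out everything except the most emotionally provocative material.

```python
import random

# Toy sketch of an engagement-optimizing feed (a simple epsilon-greedy
# bandit). All numbers are invented for illustration; this is NOT any
# real platform's algorithm.
random.seed(1)

TOPICS = ["local news", "science", "sports", "outrage bait"]

# The simulated user clicks emotionally charged content far more often.
CLICK_PROB = {"local news": 0.25, "science": 0.20,
              "sports": 0.30, "outrage bait": 0.85}

# The ranker starts ignorant and learns a click-rate estimate per topic.
estimate = {t: 0.5 for t in TOPICS}
impressions = {t: 0 for t in TOPICS}

for _ in range(10_000):
    # Mostly show whatever currently looks most engaging,
    # with 5% random exploration.
    if random.random() < 0.05:
        topic = random.choice(TOPICS)
    else:
        topic = max(TOPICS, key=estimate.get)
    impressions[topic] += 1

    # The user's click (or non-click) feeds back into the estimate.
    clicked = random.random() < CLICK_PROB[topic]
    estimate[topic] += 0.05 * (clicked - estimate[topic])

total = sum(impressions.values())
for t in TOPICS:
    print(f"{t:>12}: {impressions[t] / total:6.1%} of the feed")
```

Run it and "outrage bait" ends up dominating the feed even though the user also enjoys the other topics: the loop converges on whatever maximizes clicks, the "junk food" dynamic described below.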

Designed to optimize for attention and sharing, our digital public spaces have become dominated by
emotion-triggering content, affiliation and “in group” signaling, knee-jerk reactions, and outrage. The
fact that this content is not limited to personal or entertainment posts, but also includes posts
intended to inform or mislead, exacerbates the problem. By optimizing for clicks and
sharing, platforms are doing more than simply “giving us what we want”—they are, by their very
design, fostering an environment that incentivizes the spread of "junk food" informational content
over potentially less engaging but more “nutritious” or healthy information.

Ultimately, social media platforms are not the neutral and value-free channels for content that they
have often been positioned as. The algorithmic processes that determine what we see are not
transparent, and users have little insight into how or why they are served the content they receive,
how it differs from what appears on someone else’s screen, or how their digital data is being used.

In the end, these two key elements—platform business models and human nature—work together to
create the flawed and painfully polarized social and informational digital ecosystem we see today.

Potential solutions to addressing the problems of our online information ecosystems have placed the onus of responsibility on individuals (in the form of media literacy education), on the platforms themselves (content moderation), on the government (regulation), or on potential third-party oversight groups. Each of these has its own benefits and challenges. Most recently, some organizations are shifting the focus from preventing harm to envisioning the kinds of online public spaces that foster positive and healthy interactions.

The current approach to mitigating online harms is through content policies and content moderation, provided by the platforms themselves. These platforms have continually refined their policies outlining what is acceptable, what is not, and why. These rules are, unfortunately, applied unequally across people and countries, and in a non-transparent manner, which in itself is a problem. Improving the transparency surrounding how content policies are applied and content choices are made is generally advocated, although difficult to enforce. In addition, the scale of these platforms, and the nuance and social and cultural complexity of the content itself, are overwhelming and inhibit effective human or AI moderation. It's an exhausting and Sisyphean game of "whack-a-mole," where content that is banned or removed for whatever reason can simply resurface under another name or with clever editing, coding, or coded wordplay. And finally, it is questionable whether we, as citizens and consumers, want to give these platforms, and the people who run them, even more gatekeeping power over what we see and share.

Government regulation is another potential solution, proposed more and more frequently. However, giving governments additional power over permissible content and speech is problematic in its own right. The thinking around human rights principles, especially for business, has laid some useful groundwork and offers a way to navigate the tensions surrounding freedom of speech, privacy, online harms, and democratic discourse. Designed to protect the vulnerable, human rights principles emphasize essential values such as transparency (especially around standards, operations, data use, and decision-making), accountability (including impact assessments) and access to remedy and redress. As the UN Special Rapporteur on freedom of expression argues, leveraging human rights principles can help both governments and companies balance freedom of expression with user protection.

Another potentially effective solution is to outsource content policy, moderation and oversight to expert third parties or "social media councils." These multi-stakeholder councils should include input from government, industry, and civil society members to help shape the difficult content moderation decisions necessary for fostering healthy online spaces.

Finally, people are moving beyond remedies that try to solve current problems, to reconsidering the structure of social media platforms themselves. Some thinkers in academia and beyond are beginning to imagine the qualities, shape, and structure of flourishing online public spaces or digital public infrastructure that optimize for and foster healthy discourse and interconnection. Lessons can be learned from successful communities and public spaces, both offline and on, such as Wikipedia as well as public parks. Engaging users in the process of design and moderation is fundamental to improving social media.

Our Grid Approach examines issues related to social media by considering the interlocking roles of platforms, users, tech workers, policymakers, news media, advertisers, funders, and the Knowledge Base.




CONNECTED ISSUES

ONLINE MEDIA

Algorithmic Recommendation Engines (Rabbit Holes)
Attention Economy
Brand Safety
Content Moderation / Commercial Content Moderation
Cyberbullying / Online Harassment
Deepfakes
Digital Civility
Digital Identity
Digital Wellness
Disinformation and Misinformation
Echo Chambers and Filter Bubbles
Governmental Relations - Government Regulation (see Section 230)
GDPR and CCPA
Interoperability
Marketplace Integrity
Media Literacy
Oversight Panels and Boards
Personalization
Platforms’ Business Model
Policy Development and Enforcement
Polarization Online
Privacy
Section 230
Surveillance Capitalism
Synthetic and Manipulated Media

GOVERNANCE & ETHICS

Accountability
AI Bias
AI Explainability & Interpretability
AI in Governance
Algorithmic Auditing
Algorithmic Bias
Algorithmic Harm
Auditing and Verification
Cybersecurity
Data Agency and Data Provenance
Data Governance
Data Trusts
Ethics Theater
Ethics Washing
Fairness in AI/ML
Privacy
Risk Assessments
Role and Responsibilities of the Public Sector
Terms of Service / Community Guidelines
Transparency in AI/ML
Tech Equity
Tech Ethics

DESIGN, INCLUSION, EQUITY, POWER

Accessibility
Affinity Groups
Age Gates
Anti-Racist Technology
Cultural Intelligence
Data Access for academic researchers
Data Literacy
Data Minimization
Data Transparency
Digital Citizenship (aka Cyber Citizenship)
Digital Colonialism
Digital Divide
Digital Human Rights
Diversity, Equity, Inclusion, Belonging
Ethics by Design
Ethically Aligned Design
Freedom of Expression
Friction
Human-Centered Design
Inclusive Design
Intersectionality
Privacy by Design
Power Asymmetries
Safety by Design
Surveillance Economy
Stakeholder Engagement
Workers' Rights in the Digital Economy
Grid Approach

SCENARIO: A polluted information ecosystem; viral misinformation leading to real-world violence.

PLATFORMS: Partner with experts to optimize algorithmic detection & removal; employ or contract with media literacy experts.

USERS: Demand universal media literacy education; recognize their role influencing and being influenced by the information ecosystem; demand greater political movement.

KNOWLEDGE BASE: Engage in research in effective media literacy education; increase research around effective ways to reduce early misinfo + superspreaders.

TECH WORKERS: Recognize and address bias in writing code, content moderation, workplaces, corporate policy & business practices; and lobby for legislation that supports the above.

POLICYMAKERS: Consider writing and passing universal media literacy legislation; better define the role of social media platforms as content moderators.

NEWS MEDIA: Make public their ethical standards (vis-à-vis information, misinformation, disinformation & advertising); educate social media platforms that publish or host user content sharing their content about information ethics.

ADVERTISERS: Require that the places where their ads are placed take action against mis/disinformation, and that systems which place their ads establish the same requirement.

FUNDERS: Just as investors are now consciously funding innovators working to improve people's wellness and mental health, do the same with the eradication of mis/disinformation.



Grid Approach

SCENARIO: Concern that the current structure of our digital spaces can have an adverse impact on mental wellbeing.

PLATFORMS: Prioritize user wellbeing at all levels of their business; incorporate positive environment changes that may run counter to the current business model.

USERS: Demand and spend time with digital products that support their physical and psychosocial wellbeing; reject products that are harmful to themselves or others.

KNOWLEDGE BASE: Publish and keep up with research on wellbeing in the digital age as it unfolds, and provide educational tools and resources based on it for multiple audiences.

TECH WORKERS: Continue to develop products and features that support users' care for themselves and each other; build in ongoing education in what supports user wellbeing for a diversity of users.

POLICYMAKERS: Ensure that regulation is backed by research; consider incentives for platforms that prioritize wellbeing.

NEWS MEDIA: With sourcing, seek out peer-reviewed research; commit to showcasing the complexity of user wellbeing and its influence, avoiding sensationalism.

ADVERTISERS: Co-create a formula or standard for media entities that represent wellbeing for customers and society.

FUNDERS: Consciously invest in startups that support people's wellbeing, and help to diversify the business models startups and other innovators use.




Community
Interviews
Hear from a broad range of leaders about
their role in improving social media

Some interviews have been lightly edited to improve consistency and readability.

AllTechIsHuman.org | ImprovingSocialMedia.com



LEARNING FROM THE COMMUNITY

Dona Bellow
Responsible Innovation Manager, Facebook

Tell us about your role:

I closely partner with product teams across the Facebook family of apps to help surface and address potential negative impacts to society in all that we build, early in the development process. My team develops frameworks and methodologies to help teams build responsibly.

Tell us about your career path and how it led you to your work’s focus:

I have a legal background and completed my education in International Human Rights Law in France. I started working in tech about 8 years ago, at Google, where I joined the Legal Online Operations team, which managed legal requests for content removal and operationalized Google's internal legal policies. During my time there, I co-developed a program supporting product teams in surfacing and mitigating abuse-related risks through integrated anti-abuse systems. Following that experience, I worked at Airbnb and Twitter in various policy-related roles; however, what led me to join my current role was the belief that policy considerations often come in at the backend of product development, when there is little opportunity to influence how things are built—I wanted to support teams in considering risks at the beginning of their process and building mitigations directly into the product.

In your opinion, what are the biggest issues facing social media?

There are many – from moderation to trying to solve deep societal/human problems with technology. The scale at which platforms have to operate – while considering cultural and contextual nuances in regard to safety, equity and other user experience concerns – is a complicated challenge. Meanwhile, society is shifting, and people are demanding more accountability from the systems they evolve in – that includes governments but also corporations and social media platforms. In my personal opinion, a critical challenge of the years to come is going to be around rethinking what social media platforms are meant to accomplish (primary purpose) and the way they operate (business models), in order to rebuild trust and ensure the viability of existing and new platforms.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am excited about the growing focus on product equity and building social media experiences on the basis of our most vulnerable populations. I heard someone mention before, "If you manage to build a social experience that supports the user needs, safety and privacy of a Black trans disabled woman, you've created an experience that will benefit the entire population."

Other interesting ideas: rethinking the role and scope of social media platforms and whether they should be more centered around small communities; exploring decentralized autonomous orgs and moderation models; experimenting with non-ad-based business models; and implementing usage limits to prompt people to disconnect.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

I wonder if it is even possible to single out a "main" area of improvement across these issues, because of their
interdependency. For example, increased digital literacy for internet citizens does not exist in isolation of platforms'
responsibility to provide accessible tools, transparent use and controls. Meanwhile, citizens also have a role in defining
socially acceptable outcomes and demanding that their governments implement structural social changes that could
support better digital norms. None of these groups, on their own, hold the keys to "fixing" social media and the internet,
so we need to establish what a universal system of accountability would look like.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

I think we can all agree that having as multidisciplinary an approach as possible is the way to go. Since I have been in this
industry, I have worked with such a wide array of professionals (from veterans to former teachers to economics experts,
etc.). For my part, I would be encouraged to see more people involved with a deep and practical understanding of human
psychology and the ways to support our most vulnerable populations — maybe more therapists, social workers.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I may be a cynic in thinking that this may remain a conundrum forever, or at least for as long as there isn't a universal
agreement on what "too much or too little" means. There is an expectation, there, for platforms to be the arbiter of right
vs wrong, when those very notions are not agreed upon across our communities and global societies, which creates a
"losing side" for every one of those complex policy enforcement decisions. Where I think platforms have an opportunity
is in establishing further transparency on what the rules are, how enforcement happens, and being prepared to evolve
those rules as social norms are shifting: collaborating with governments to establish standards on these parameters may
be a good approach to this problem.

What makes you optimistic that we, as a society, will be able to improve social media?

What I am observing is that we're starting to shift social norms in terms of how much we expect from our communities
and governments, as well as the corporations whose products impact us on a day-to-day basis. I do think that people are
slowly building a better understanding of how their information is collected and used on social media, and what that
means is that they are also building new expectations, demanding more transparency and establishing informal channels
for accountability. This mounting social pressure is coming from all of us, including the professionals who research, build
and create rules for this technology, and I am very encouraged by all the conversations I see happening in industry. The
progressive expansion of these conversations to multidisciplinary groups, impacted stakeholders and culturally diverse
voices is what makes me optimistic.

"For my part, I would be encouraged to see more people involved


with a deep and practical understanding of human psychology and
the ways to support our most vulnerable populations."

-Dona Bellow, Responsible Innovation Manager, Facebook



LEARNING FROM THE COMMUNITY

Rana Sarkar
Consul General of Canada in San Francisco/Silicon Valley

Tell us about your role:

In 2017, I was appointed as Canada’s Consul General to San Francisco and Silicon Valley by Prime Minister Justin Trudeau to focus on better connecting Canada to Silicon Valley. In this capacity, I work to ensure that Canada is part of the fast-evolving global technology governance conversation. In addition to the work of traditional diplomacy, our focus is to better understand and contribute to an evolving “co-literacy” between tech creators, operators, governments and the public on how tech, norms and public interest intersect. We engage with a cross walk of leaders in the venture community, tech firms and governments (including in Canada) on a range of tech policy considerations from governance, diversity, climate, inclusion and human rights to the future of work. We also work closely with governments and tech companies in Canada to help build their fluency and opportunities in the corridor. My team also manages several tech accelerator programs to support Canadian founders and startups and works to attract critical talent and investment back to Canada.

Tell us about your career path and how it led you to your work’s focus:

I have been an advisor, entrepreneur, investor, operator and now diplomat with meaningful sidelines in public policy, politics and academia. I have spent the past 25 years working across North America, Europe and Asia, almost always with a tech, innovation and cross-border focus. In this time, I have been lucky to be engaged in rich global conversations and on site for the creation of new industries, trends and public policy shifts. This network and wide-angle perspective helps me every day to build bridges between Canada, Silicon Valley and the rest of the world and to spot the interconnections, talent and white space at a key moment. At its core, my current work tracks shifting nexuses of power in innovation with the aim of making networks smarter to unlock value and solve big problems. I am energized by the possibilities of this moment, but also hyper-attentive to its dangers. The new digital Bretton Woods is being written now. Along with colleagues in Silicon Valley and around the world, we are exploring how best to evolve diplomatic tools to shape this conversation.

In your opinion, what are the biggest issues facing social media?

Social media is at a turning point, and we’re likely to see changes in response to a number of forces, including regulators/authorities, consumers and employees. The proper pricing (and impending regulation) of their unintended externalities and the social harms caused by data exhaust and “rage-and-engage” business models are at the heart of the challenge facing social media. Social media companies must become more “pro-human” and fix design and business models that promote harms and infodemics, while remaining transparent and accountable along the way. Antitrust competition authorities, privacy and security authorities worldwide have started to act. Tech companies also are beginning to lose the faith of their employee and user bases, which is where it really hurts.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

There is wide agreement that there is no silver bullet here – but perhaps silver buckshot. I am encouraged by the efforts of social media platforms to finally restrict or slow the spread of misinformation and disinformation. Building features into the platforms takes some of the pressure off users, while also encouraging more thoughtful discourse online. Product innovation away from “enrage-and-engage” to pro-social forms is also good news. Consumers, employees and governments are all pushing in this direction. Governments are getting better at tracking, naming and eventually pricing harms; for instance, there is terrific work being done by Global Affairs Canada’s Digital Inclusion Lab. The lab’s most recent work consists of social media data analysis that reveals how hate and discrimination are being deployed against targeted groups online. Governments, regulators and activists are learning to act together and across their own silos to granulate and create the scale effects necessary to incent industry behavior. There is huge pent-up consumer demand for better, which we can never forget, and if incumbents do not fill it, others will.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

There are a number of Canadian organizations working to improve social media and online platform governance.

The Citizen Lab at the University of Toronto operates at the intersection of tech, human rights and global security and is
a global leader in conducting research on digital espionage, internet filtering and the impact on freedom of expression
online and privacy, security and information controls on applications.

The Centre for International Governance Innovation, also in the research field, is a think tank that has put a renewed
focus on pressing digital and technology issues such as platform governance, internet governance and big data.

Another organization to watch is the Centre for Media Technology and Democracy at McGill University, which has
remarkable researchers bridging the Canadian and global conversations.

In the non-profit space, OpenMedia aims to maintain a safe, free and open internet, while MediaSmarts helps children
and youth develop critical thinking skills to engage with the media through digital literacy programs. These organizations
operate at the community level to inform Canadians of all ages about the role of digital technologies in society and
actions that the government can take to improve internet and social media standards.

How does social media look different five years from now?

Ad-supported data extractive businesses won’t fade easily and will remain the focus of scrutiny from governments,
consumers and employees. But I suspect we’ll see the growth of additional small, niche and decentralized platforms.
Blockchain and ledger systems have done a great job of embedding accountability in some burgeoning social platforms
and marketplaces, but have yet to see scalable social use cases.

In the next five years, we can expect to see a further consumer migration to video and audio rather than text-based
platforms. These will provide even closer and more intimate communication between users and growing “intentional
communities.” Clubhouse is already part of this trend, and we’ll likely see other innovative products in this space,
including the long-promised AR and properly digitally native applications. Given pervasive social media exhaustion,
particularly with younger users, I expect to see a spate of new ventures focused on more algorithmically “pro human”
platforms that amplify strengths rather than weaknesses of cognition. Gaming might lead the way here, with business
models based on subscription and tokening versus ads. This also reflects a step change in digital norms. Additionally, I
expect to see more variety globally, given the splintering of digital norms.

Connect with Rana Sarkar at @RanaSarkar_



LEARNING FROM THE COMMUNITY

Oumou Ly
Staff Fellow, Berkman Klein Center for Internet and Society

Tell us about your role:

I work as a fellow at the Berkman Klein Center for Internet and Society at Harvard University. In my current role, I contribute to project work and programming on the Center's Assembly: Disinformation program and help to set the substantive direction of the program through creation of new areas of inquiry for discussion on the Assembly Forum. I also regularly provide media commentary on issues at the nexus of cybersecurity, information security and technology; author long-form publications; and contribute to white papers. Finally, I host a web series called the Breakdown, on which I interview experts across the disinformation space on particularly salient challenges in mitigating and countering disinformation.

How do we ensure safety, privacy and freedom of expression all at the same time?

This is difficult to do, primarily because freedom of expression has come to be very broadly defined. On one hand, vocal critics contend that rights to free speech are provisioned under the First Amendment, and that the precise ways this should play out online mirror the ways we've seen it play out in offline fora.

At the same time, this definition of free speech is often used to justify insufficient moderation around incitement to violence, hate speech and disinformation. Congressional oversight of platforms' action against Donald Trump's social media accounts illustrated how a First Amendment-derived definition of free speech can be abused. If we're to decide free speech should mean that any attempt by government to preserve the integrity of true information and/or protect minority groups is unacceptable and/or actionable under the First Amendment, then safety, privacy and security cannot co-exist. The best way to ensure these can co-exist is to have a dialogue on what we believe should constitute free speech, and that dialogue should interrogate our assumptions about the connections between speech, truth and democracy.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

Each of these areas could benefit from significant improvement. Most immediately, more clearly delineating the responsibilities of government, media and platforms can help to slow the harmful effects of disinformation and social media at scale.

What people and organizations do you feel are doing a good job toward improving social media?

I think Joan Donovan of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School does insightful and incisive writing and analysis on these issues.

What do you see as the risk of doing nothing to address the shortcomings of social media?

It is hard to overstate how harmful doing nothing would be. First, disinformation is a major harmful side effect of social
media at its current scale. As the January 6 attacks on the Capitol showed, disinformation has a massive,
incontrovertible, corrosive impact on democracy. Moreover, when disinformation takes hold in a democratic society, it
works to both expose and accelerate the decay of democratic institutions. In this way, to do nothing is to accelerate
major democratic collapse.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Several: computer scientists and data design engineers, statisticians, information scientists, foreign policy and
international relations experts, psychologists, educators, political philosophers, policy experts and sociologists all belong
at the table, to name a few.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

The conundrum is not exactly whether platforms moderate too much or too little; it is more about whether platform
policies around moderation are guided by the kinds of considerations that social media companies operating in
democratic, decidedly anti-racist societies should prioritize. It is about how often or not often platforms make important
moderation decisions informed by these considerations. At this time, in the United States at least, moderation decisions
are not guided by these considerations to a sufficient extent. One difficulty with changing the current state of play is that
while we (all) tacitly agree that the responsibility of taking moderation action should remain the domain of platforms
(and not the government), platforms often moderate in a way that minimizes their exposure to certain risk, and this does
not always yield moderation outcomes that protect democratic interests. It is critical that our thinking on this question
takes into account the structural factors that create these kinds of trade-offs.

What makes you optimistic that we, as a society, will be able to improve social media?

I have been heartened by the significant academic and civil society efforts toward workable solutions, as well as their
readiness to so vocally hold the powerful accountable when they've failed to defend democratic interests. I know that,
when mobilized effectively, efforts of this kind have the potential to effect sweeping change in both government and in
private industry. What makes me optimistic is the groundswell of time and expertise that's been dedicated to creating a
more equitable internet. Those efforts, given time, will translate to social media in particular.

Connect with Oumou Ly @oumoubly

"The conundrum is not exactly whether platforms moderate too


much or too little. it is more about whether platform policies
around moderation are guided by the kinds of considerations that
social media companies operating in democratic, decidedly anti-
racist societies should prioritize."

-Oumou Ly, Staff Fellow, Berkman Klein Center for Internet and Society



LEARNING FROM THE COMMUNITY

Yael Eisenstat
Democracy activist; former Facebook elections integrity head; former diplomat, intel officer and White House advisor

Tell us about your role:

I am a democracy activist and strategist focused on the intersection of tech, democracy and policy. My work strives to bridge the divide between government and tech, to help foster a healthier information ecosystem. I have spent 20 years working around the globe on democracy and security issues as a CIA officer, a White House advisor, the Global Head of Elections Integrity Operations for political advertising at Facebook, a diplomat, a corporate social responsibility strategist at ExxonMobil and the head of a global risk firm. As a Visiting Fellow at Cornell Tech's Digital Life Initiative, I focus on technology's effects on discourse and democracy and teach a multi-university course on Tech, Media and Democracy.

Tell us about your career path and how it led you to your work’s focus:

After spending 18 years in the national security and global affairs world – from countering extremism abroad to being a national security advisor at the White House – I began to view the breakdown of civil discourse here at home as the biggest threat to democracy. I became increasingly concerned with how the Internet was contributing to political polarization, hate and division. I set out to both publicly sound alarm bells and to see what role I could play in helping reverse this course.

This led me to Facebook, where I was hired to head the company’s new Global Elections Integrity Operations team for political advertising. Realizing I was not going to change the company from within, I am now a public advocate for transparency and accountability in tech, particularly where the real-world consequences affect democracy and societies around the world.

In your opinion, what are the biggest issues facing social media?

We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that “compromise” has become a dirty word. Right now, social media companies, like Facebook, profit from segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking strong emotions to keep us engaged, often incentivizing inflammatory, polarizing voices, to the point where finding common ground feels impossible. Unless they are willing to reconsider how the entire machine is designed and monetized, no amount of "whack-a-mole" content moderation will fix the divisive nature of the biggest social media platforms today. They will never truly address how the platform is contributing to hate, division and radicalization. But that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and business model.

In your opinion, what area do you think needs the most improvement?

Every one of these is part of the larger puzzle. There is no one magical solution – we need a whole-of-society approach. I
focus on the government's role in defining responsibility and accountability for the externalities and threats to society
caused by current social media business models. This goes hand-in-hand with civic education, media literacy, public
awareness and healthier media in general.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

Civil Rights leaders, academics, journalists, advertisers, legislators, employees and activists all play a critical role in this
movement. Many organizations help educate the public, raise awareness and push the government to step up and
address these issues. Every one of these voices is important. This cannot just be left to technologists to solve and lawyers
to debate. Social media has a profound impact on all of society, and we are all stakeholders in the ultimate solutions.

What do you see as the risk of doing nothing to address the shortcomings of social media?

We have already seen the risks that I (and so many others) have been trying to highlight for years play out: People are
using social media tools, exactly as they were designed, to sow division, hatred and distrust. We saw where that can lead
when followers of conspiracy theories tried to launch an insurrection at the U.S. Capitol.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

This cannot be left to just technologists to fix. In addition to the need for racial, socio-economic, religious and geographic
diversity, this will require true diversity of thought, experience and background to fix. If we desire to create a healthier,
more equitable information ecosystem, the people who are most affected by the negative side of social media must be
incorporated into the decision-making processes moving forward.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I do not believe the government should be regulating what speech is ok and what speech should be taken down, except
where it breaks the law. But I do think government should figure out how to regulate the tools the platforms use (and sell
to advertisers) for curating, recommending, amplifying and targeting. And that comes down to the fact that there is no
transparency into how those tools work. By insisting on real transparency around what these recommendation engines
are doing, how the curation, amplification, and targeting are happening, we could separate the idea that Facebook
shouldn’t be responsible for what a user posts from their responsibility for how their own tools treat that content. I want
us to hold the companies accountable not for the fact that someone posts misinformation or extreme rhetoric, but for
how their recommendation engines spread it, how their algorithms steer people towards it, and how their tools are used
to target people with it.

Connect with Yael Eisenstat @YaelEisenstat | Watch Yael's TED talk

"This cannot just be left to technologists to solve and lawyers to


debate. Social media has a profound impact on all of society, and
we are all stakeholders in the ultimate solutions."

-Yael Eisenstat, Democracy activist; former Facebook elections integrity head



how it led you to your work’s focus:
LEARNING FROM THE COMMUNITY
Prior to Cambridge, I worked at Google

Anita Williams
in the Global Affairs department as a
Legal Specialist. There I worked on
platform abuse protection initiatives
such as child sexual abuse investigations,
elections, advertising transparency and
counterfeit operations. I began my career
Graduate Researcher / Centre for Data Ethics with a keen interest in human trafficking
& Innovation prevention and obtained my BA in
Justice and Peace Studies from
Georgetown University with an emphasis
on labor and sexual exploitation. My
professional entry into the space of
online counter-abuse policy enforcement
has since elevated the scope of my career
focus to policy regulation and design.

What "solutions" to improving social


media have you seen suggested or
implemented that you are excited
about?

I have been very impressed with the


ideas for combating misinformation on
conversation platforms like Twitter and
Reddit. Reddit in particular is a
fascinating case study on
grassroots/community-created and -
enforced standards and has yielded very
high-quality conversation forums that
operate with integrity. You can find
subreddits like r/PoliticalDiscussion and
r/ChangeMyView that require extensive
citations for their members' statements
and stances.

How do we ensure safety, privacy and freedom of expression all at the same time?

Research is increasingly uncovering the fact that there is no absolute way to deal with online harms, and that only well-reasoned trade-offs between privacy and security will define how online platforms treat their users' data and voice. I believe a fundamental digital bill of rights is necessary to ensure the safety, privacy and freedom of expression of all users. The work by the Liberal Democrats,
titled 'Creating a Digital Bill of Rights: Why do we need it, and what should we include?' addresses these central trade-
offs in a measurable and governable framework.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

From internal tech company hiring processes and retention programs to the very products and services they design for
use in global markets, there is a dire need for intersectionality and gender studies professionals in product design, policy
and improvement initiatives. I have worked directly with product and process users who have provided feedback on the
lack of consideration of their unique communication styles and needs – needs that academics with these areas of
expertise would be best equipped to design a new system to address. The Leverhulme Center for the Future of
Intelligence is doing phenomenal work with their AI Narratives and Justice program, working to highlight this 'pain point'
within companies and within greater cultural imaginations of AI.

Connect with Anita Williams @AnitaMPhil

Note: The views expressed above represent the personal views of Anita Williams and do not represent the official
views of the Centre for Data Ethics & Innovation.

"Research is increasingly uncovering the fact that there is no absolute


way to deal with online harms, and that only well-reasoned trade-offs
between privacy and security will define how online platforms treat
their users' data and voice. "

-Anita Williams, Graduate Researcher at the Centre for Data


Ethics & Innovation



LEARNING FROM THE COMMUNITY

Ethan Zuckerman
UMass Amherst

Tell us about your role:

I teach public policy, information and communication and lead a lab dedicated to reimagining and building social media around an explicitly non-commercial, pro-civic model.

Tell us about your career path and how it led you to your work's focus:

I was an early dotcom brat in the late 1990s and became fascinated with all that is good and bad about human interaction online. I spent much of the 2000s focused on helping the internet become more global, working with communities in the developing world on online inclusion. I spent the last decade working on questions of the internet and social change. Now I work on changing social media itself.

In your opinion, what are the biggest issues facing social media?

For me, the biggest issues are ultimately structural. Human communities are not huge – real communities support 20-20,000 people. I think many of the problems we face – questions about polarization, moderation, truth, surveillance – would be best worked on not by tweaking the existing system but by building new systems that are much smaller and governed by the communities that use them.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Some of the work being done on decentralization is promising, but decentralization itself isn't the solution. We need more careful thought on how communities govern themselves, and not just technical architectures that take power out of one set of hands and put it into other places. I am advocating a broad set of reforms under the heading of "digital public infrastructure" that could serve as best practices for small, self-governing, civic-focused social media that would act as a counterweight, not a replacement, to the models we experience right now.

How do we ensure safety, privacy and freedom of expression all at the same time?

Better governance. It is not doable within the existing massive social networks. But human communities work through this all the time. We have different spaces with different expectations about privacy and freedom of expression – we speak differently in a church service than we do in the AA meeting in the basement of the church. We need more of these small spaces where we do the hard work of governing ourselves, and we need to accept that spaces designed for a billion people do not work well for any of those people.

When we discuss improving social media, we often toggle between the
responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of
citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the
most improvement?

Most of the talk about media literacy is a cop out – it is transferring responsibility from platforms that have tons of
resources to throw at making spaces less dangerous and toxic to already overburdened educational systems.

Connect with Ethan Zuckerman at @ethanz

"I think many of the problems we face – questions about


polarization, moderation, truth, surveillance – would be best worked
on not by tweaking the existing system but by building new systems
that are much smaller and governed by the communities that use
them."

-Ethan Zuckerman, Director of the UMass Institute for


Digital Public Infrastructure



LEARNING FROM THE COMMUNITY

Azmina Dhrodia
Senior Policy Manager, Gender and Data Rights at the Web Foundation

Tell us about your role:

I take a lead role in shaping the Web Foundation's evidence-based policy development and advocacy at the intersection of gender and data rights, and work to advance a gender-first approach to data rights with key stakeholder audiences, including policymakers, regulators, companies and the broader tech policy community. I also manage our emerging workstream convening consultations and policy design workshops to build human-centered solutions to online gender-based violence.

Tell us about your career path, and how it led you to your work's focus:

I was previously a Research and Policy Advisor on Technology and Human Rights at Amnesty International and spearheaded the organization's research, policy and advocacy on violence and abuse against women on social media platforms. I have authored several reports and articles on the issue, including the cutting-edge report "#ToxicTwitter: Violence against Women Online," where I applied an intersectional lens to analyze the human rights impact of abuse against women and non-binary people on social media platforms. I was most recently the Head of Operations and Research at Block Party, an early-stage tech startup that tackles online harassment by filtering out users more likely to send unwanted or harassing content, and was responsible for building and executing user research and operational strategies for Block Party's alpha and beta products.
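As a rough illustration of the filtering approach such tools take, the hypothetical Python sketch below scores incoming mentions on simple account signals and diverts likely harassment to a review queue rather than deleting it. The signals, weights and names are invented, not Block Party's actual product logic.

from dataclasses import dataclass

@dataclass
class Mention:
    author_followers: int
    author_account_age_days: int
    author_blocked_by_friends: int  # network signal from trusted contacts
    text: str

def risk_score(m: Mention) -> float:
    score = 0.0
    if m.author_followers < 10:
        score += 0.3  # throwaway-looking account
    if m.author_account_age_days < 30:
        score += 0.3  # newly created account
    score += min(0.4, 0.1 * m.author_blocked_by_friends)
    return score

def route(m: Mention, threshold: float = 0.5) -> str:
    # Key design choice: nothing is removed; the user (or a trusted
    # helper) can review the queue later, on their own terms.
    return "review queue" if risk_score(m) >= threshold else "main feed"

print(route(Mention(3, 5, 4, "you should be ashamed")))  # review queue
print(route(Mention(5200, 900, 0, "great thread!")))     # main feed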

In your opinion, what are the biggest issues facing social media?

One of the biggest issues facing social media is online gender-based violence and abuse against women. Violence against women and gender discrimination is not a new phenomenon, but what is new is the medium through which we are seeing it play out (i.e. online). Our digital world amplifies existing inequalities, particularly intersectional discrimination, and we are seeing this manifest on online platforms. Forms of online gender-based violence can include physical and sexual threats of violence, sexist or misogynistic comments towards women (often focusing on women's physical appearance) and privacy violations such as sharing non-consensual intimate images of a woman or uploading a woman's personal information, such as her phone number or email address, to cause her distress or alarm (doxing).

It is important to stress that women with multiple and intersecting identities, for example, women who identify as racial or ethnic minorities, young women, LGBT women, etc., are at heightened risk of online abuse and disproportionately affected, because the abuse they face often targets their different identities.

When abuse on social media goes unchecked, women can end up silencing or self-censoring themselves online, which has a detrimental effect on whose
voices and opinions we hear online. Experiences of online gender-based violence can also force women offline
completely. Online gender-based violence not only has harmful psychological and economic impacts on women, but the
silencing and censoring of women’s voices online is a threat to democracy and fails to respect their right to freely express
themselves online without fear.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

At the Web Foundation, we’re working to co-design policy and product solutions with tech companies and civil society as
part of the Contract for the Web. Our pilot program focuses on online gender-based violence. The Web Foundation hosted a series of consultations throughout 2020 and into 2021, bringing together tech companies, civil society
organizations, researchers, academics and women impacted by online abuse directly to collaborate on the technical,
policy and design challenges that need to be addressed to tackle gender-based violence on social media platforms.
Insights from four multi-stakeholder consultations will inform a series of policy design workshops that will bring
participants from the consultations and tech companies to co-create policy and product solutions to online gender-based
violence using an innovative human-centered approach.

Specifically, we are also calling on companies to:

Use “gender by design” when developing products and services – and think about the impacts of products and services on women from the very beginning of the design process, for example, building products based on gender-disaggregated data and considering the safety and security of women when building location-based features.
Consult with women in the design of technology products, platforms and terms of service.
Conduct regular gender audits of products and services.
Include gender-disaggregated data in transparency reports.
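As a toy illustration of that last recommendation, the short Python sketch below disaggregates a single transparency metric (the share of abuse reports actioned) by gender, which is what makes group-level gaps visible at all. The records, categories and figures are invented; a real report would draw on a platform's moderation data.

from collections import Counter

# Invented abuse-report records; in practice these would come from a
# platform's moderation pipeline, not a literal list.
reports = [
    {"gender": "woman", "actioned": True},
    {"gender": "woman", "actioned": False},
    {"gender": "non-binary", "actioned": False},
    {"gender": "man", "actioned": True},
    {"gender": "woman", "actioned": False},
]

totals, actioned = Counter(), Counter()
for r in reports:
    totals[r["gender"]] += 1
    actioned[r["gender"]] += r["actioned"]

# Disaggregation shows whether some groups' reports are actioned at
# lower rates -- the gap a single aggregate number would hide.
for gender in totals:
    rate = actioned[gender] / totals[gender]
    print(f"{gender}: {totals[gender]} reports, {rate:.0%} actioned")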

How do we ensure safety, privacy and freedom of expression all at the same time?

Ensuring the safety of users online helps ensure that those who are disproportionately targets of online abuse are able to
freely and safely express themselves on social media without fear of violence and abuse. Ensuring safety and the right to
freedom of expression are not at odds with one another, and we must shift the narrative also to consider the right to free
expression for those voices being silenced and censored because of online gender-based violence and identity-based
attacks. When online abuse is allowed to flourish, especially when we know that some groups are targeted more than
others, it creates a culture where some users' right to free expression is seen to be more important than others'. Privacy
settings are one way social media users can keep themselves safer online, but these settings can often be confusing, and
there is a lack of standardised terminology across platforms which makes it difficult to understand how a privacy feature
or safety tool can be used. Accessible and user-friendly privacy and security settings online can offer users an
empowering way to ensure that they are able to curate online experiences that make them feel safe and able to freely
express themselves without fear.

What do you see as the risk of doing nothing to address the shortcomings of social media?

When social media platforms fail women and silence or censor their voices by not adequately dealing with online gender-
based violence and abuse on their platforms, we also fail a whole generation of young women whose voices can also end
up being silenced as a result.

Connect with Azmina Dhrodia @snazzyazzy



LEARNING FROM THE COMMUNITY

Samantha North
Director, North Cyber Research; PhD Candidate, University of Bath

Tell us about your role:

I am a disinformation researcher helping organizations to understand, identify and counter inauthentic online behaviour. My approach blends behavioral psychology with open-source intelligence methods. I draw on my academic research on tribalism and cognitive biases to better understand the motivations and incentives behind inauthentic online behaviour. I provide specialized disinformation analysis for a range of organizations, from non-profits to cybersecurity firms and social media platforms.

Tell us about your career path and how it led you to your work's focus:

My career path started in Istanbul, working as a freelance journalist covering politics and security. I was there in 2014, when ISIS became prominent. Studying ISIS propaganda got me interested in how online spaces are used to influence people in real life. I have always been interested in the psychology that makes people susceptible to disinformation. That focus led me to a relevant PhD, alongside a number of projects as a disinformation analyst.

In your opinion, what are the biggest issues facing social media?

Certain aspects of social media's current design promote the innate human tendency toward tribal behaviour. This often manifests as online hostility toward the perceived outgroup. I have seen this happen around a great number of political topics, such as the Brexit debate. When people behave tribally, their main goal is to assert their identity as a member of the ingroup. So they're less likely to think critically about the information they share. Unfortunately, social media's current design rewards engagement in all its forms, including the negative, which exacerbates tribal tendencies and leaves users more susceptible to influence campaigns. I believe this issue needs to be urgently addressed.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I was happy to see Twitter testing a 'read before you retweet' function. This may seem like a simple fix, but it has powerful implications for the spread of disinformation. On Twitter in particular, people often retweet without critically engaging with, or even opening, the article. They often take a knee-jerk, reactive approach, retweeting things that fit their identity as part of their ingroup, based solely on the headline. This practice provides addictive dopamine hits in the form of approval (likes, RTs) from their tribe. Whether or not the content is true is often a secondary concern. That's why I like this new feature from Twitter, and I hope they implement it in earnest.
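As a rough sketch of the idea behind that prompt, the logic amounts to checking whether the user has opened the link before amplifying it, and adding friction rather than a block when they have not. The data model and wording below are invented; Twitter's real implementation is not public.

# Hypothetical sketch of a 'read before you retweet' prompt.
from dataclasses import dataclass, field

@dataclass
class Session:
    opened_urls: set[str] = field(default_factory=set)

    def open_link(self, url: str) -> None:
        self.opened_urls.add(url)

    def retweet(self, article_url: str | None) -> str:
        # Friction, not a block: the user can still share unread.
        if article_url and article_url not in self.opened_urls:
            return "prompt: headlines don't tell the full story -- read first?"
        return "retweeted"

session = Session()
article = "https://example.org/analysis"
print(session.retweet(article))  # prompted: the link was never opened
session.open_link(article)
print(session.retweet(article))  # goes through after reading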
When we discuss improving social media, we often toggle between the responsibility of platforms, the role of
media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they
engage with platforms. In your opinion, what area do you think needs the most improvement?

A key area to focus on here is getting citizens used to engaging with social media in a healthier way. This could
include teaching people about the tactics trolls use to draw them into hostility online, helping people break out of endless
dopamine loops, and teaching them how to develop more distance from their mobile devices. We also need to educate
social media users about the many cognitive biases involved in platform use.

What do you see as the risk of doing nothing to address the shortcomings of social media?

The side effects of social media have already caused at least five years of damage to the information space and social
cohesion. Trust in institutions is at an all-time low, and many people across different countries are firm believers in
conspiracy theories. As we've seen recently in the US and other places, this can have disastrous results in real life. I do
not want to claim social media is wholly responsible for this, but it certainly has had significant influence.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Psychology and sociology are critical to include. Any efforts to tackle online influence campaigns would also benefit from experts in geopolitics, as well as cultural expertise in countries like Russia and China.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

This is an especially tricky challenge for governments in the current environment. Several Western governments have
been instrumental in polarizing their societies. We can look to the Brexit situation for a prime example of this. During the
debate, government officials deliberately used tribal terminology, such as “Brexiteer” and “Remainer,” which then
filtered through to social media, to be wielded by the two tribes against one another. Similar dynamics have been at play
in the United States, particularly during Donald Trump's presidency. To help solve this conundrum, governments need to
be more transparent and work harder at social cohesion, rather than leveraging division for political ends. Polarized
societies are unhealthy societies, prime targets for hostile influence campaigns.

What makes you optimistic that we, as a society, will be able to improve social media?

The cycles of history give me some cause for optimism. Whenever a new medium emerged in the past, it took societies
some time to get used to it. Social media is still in its infancy, and perhaps the human brain needs time to catch up with it.
The people who design platform architecture have a lot of control. I am confident they can figure out ways to make social
media less harmful. After all, it offers a lot of positive factors as well.

Connect with Samantha North @sjnrth

"To help solve this conundrum, governments need to be more


transparent and work harder at social cohesion, rather than leveraging
division for political ends. Polarized societies are unhealthy societies,
prime targets for hostile influence campaigns."

-Samantha North, Director, North Cyber Research



LEARNING FROM THE COMMUNITY

Ioanna Noula
Head of Research and Development, Internet Commission

Tell us about your role:

I lead the Internet Commission research team and am responsible for the delivery and quality of research-related activities, collaborating across the Delivery Team and Advisory Board. My main responsibilities include directing and overseeing the research team to provide expert and evidence-based opinion and developing an interdisciplinary research agenda in partnership with a growing network of academic and other partner organizations.

Tell us about your career path, and how it led you to your work's focus:

During my undergraduate studies in education, I discovered my strong interest in sociology, understanding education as a force that can drive or undermine societal and political change. I explored these connections further during my doctoral studies in the area of Citizenship Education. The end of my PhD coincided with a turning point in the history of social media. The first wave of societal and political crises had already made its appearance (Cambridge Analytica), and the first well-thought-out and coordinated attempts to regulate the internet were already on the horizon (GDPR).

These conditions steered my research focus toward exploring the impact of new media (including social media) on democratic citizenship, the role of education in mediating the relationship between citizens and their new public spaces and the need for public debate across sectors. As a visiting fellow in LSE's Department of Media and Communications, I led discussions on the impact of digitalization with participants from different sectors. The broader consensus around the importance of holding powerful digital actors to account and the responsibility of digital organizations ("digital responsibility") inspired me to become the co-founder and research lead for the Internet Commission, an organization whose mission is to promote digital responsibility by offering evaluation mechanisms that help companies demonstrate leadership and accountability and offer insight to policymakers.

In your opinion, what are the biggest issues facing social media?

There is no question in my mind that moderating content, conduct and contact at scale is the biggest challenge social media and society are faced with. Developing fair and wise moderation processes that will balance freedom of expression and minimization of harms in an ecosystem that caters across cultures, political regimes and infrastructures is an impossible task. Throw AI into the mix and you have a mutation of the challenge.

I said fair and wise, as I wanted to avoid the use of the words efficient and smart. The former implies time, the latter means speed. Humanity needs to give time and reflection to develop not just the right
tools and procedures but also the appropriate corporate cultures that will not just care about not being evil. We are trapped in a whirlpool of unchecked, inescapable services and products (ones we cannot do without), their speedy development and their toxic impact.

Content moderation needs to be resourced and refined in ways that respond to the scope, audience and functionalities of the platform. Language proficiency, cultural awareness, understanding users' intent and offering pathways to redress are some of the areas where AI could potentially help, but it cannot yet be trained to do so. This task is immense; it should be led by public deliberation and cross-sectoral consultation, and it requires ethical corporate cultures characterised by openness and commitment to learning and accountability.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I have recently been part of a conversation on AI-powered age verification (YOTI). I was fascinated by the insightful, inclusive approach, driven by experts from a range of sectors and disciplines, and by the leadership team's dedication to children's welfare.

How do we ensure safety, privacy and freedom of expression all at the same time?

I do not think I can offer an answer to this. It is a complex philosophical issue that is tied to the nature of democracy. I
think we should make peace with the fact that true democracy is a work in progress and it is based on struggle, dissent
and readiness to protect itself but also change when needed. In this sense, we should be driven by the value of human
dignity and try to draw lines according to the historical circumstances. The War on Terror came together with the
imperative to safeguard citizens' lives. The prioritization of safety came at the expense of our right to privacy. Snowden
exposed a system that had gone rogue and democratic institutions moved to address this.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

I think we need empowered and responsive (democratically elected) governments that can deploy pertinent regulation
and protect citizens. We are also in need of meaningful and critical education that will produce critical and responsible
citizens. Digital literacy and other education initiatives cannot be outsourced. I could not hold a profit-driven social
media company responsible for the lack of education of the general public.

I would, however, expect governments to hold media corporations accountable for not collaborating and sharing
knowledge that will assist governments to identify risks, assess societal impact and educate citizens in meaningful ways. I
cannot expect citizens to understand algorithms and the impact they have on their lives, if governments have no
understanding of how Facebook or Twitter's algorithms work. Democracies and their governments are tasked with
safeguarding, educating and empowering citizens and imagining the future of our society. Knowledge sharing from the
side of social media corporations is needed so that our governments can ask them the right questions, and offer the right
answers to their citizens.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

Our organization, the Internet Commission, is advancing digital responsibility through the evaluation of social media and
other digital organizations. We have recently addressed the contentious topic of organizational decision-making about
online content, conduct and contact. To deliver this, we went through the laborious process of convincing established
organizations to participate and secure a much wider endorsement.

We applied diverse experience of business, public service and academic research to the creation of detailed case studies.
Guided by a detailed evaluation framework, we identified and analyzed the key organizational practices that enable and
shape decisions about online content, contact and conduct.

For the first time, we looked "under the hood" of these organizations speaking to people involved in making these
decisions and who are generally unseen and unknown. We also interviewed those on the front line who can face very
challenging conditions.

We documented sophisticated technologies that help to safeguard human moderators and the public, but which can also
amplify harmful stories, reinforce gender and racial biases and shape or limit the spread of ideas.

This accountability exercise was an independently scrutinized proof of concept, demonstrating that auditing and holding social media companies to account can be delivered.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

I think the dialogue on the diversity of tech issues is moving in the right direction, and we are consistently shining new light on the under-representation of ethnic groups, the toxic masculinity cultures, the suppression of voices of dissent that challenge those cultures, the biased training of algorithms and the unaccounted-for audiences. I think we need to include children and also consider the intersectionality of identities that amplify disadvantage. I would want to see children (including disabled children, children of colour, children in poverty) being part of the discussion at the stage of designing tech and developing tech regulation. The recent adoption of General Comment 25 by the UN Committee on the Rights of the Child is a pivotal moment for children's rights that recognizes that these rights apply in the digital world. Considering this new instrument alongside the breakthrough Article 12 of the UNCRC (which stresses the importance of listening to children) spotlights the imperative to consider children's voices in the process of designing their digital future.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I do not think so. I think institutional vigilance is important, and governance should ensure that the right people can
support pertinent legislation and enforcement.

Connect with Ioanna Noula @ioannanoula

"I think we need empowered and responsive (democratically elected)


governments that can deploy pertinent regulation and protect
citizens. We are also in need of meaningful and critical education
that will produce critical and responsible citizens."

-Ioanna Noula, Head of Research and Development, Internet


Commission



LEARNING FROM THE COMMUNITY

Douglas Rushkoff
Professor of Media Theory, author and host of Team Human

Tell us about your role:

I write books about media, technology, and society, focusing on the impact of the digital media environment on human autonomy and connection.

Tell us about your career path, and how it led you to your work's focus:

I was a theater director, fed up with the elitism of professional theater. I was attracted to emerging interactive technologies as a new 'people's' medium. Then, as the net was sold out to corporate interests, I became an advocate for using these technologies in pro-human ways.

In your opinion, what are the biggest issues facing social media?

Well, social media isn't really alive. It's got nothing facing it. I just do not care how Facebook or Twitch feels about things, because they cannot feel. Nothing faces them. I think there are some big issues facing people who are using social media—mainly, the intentional assault on their basic cognition, the resulting damage to their mental health, the increase in paranoia, and the destruction of civil society. It is not an exaggeration to say that America may itself collapse as a viable society, thanks chiefly to the impact of social media on our capacity to understand or engage in civics.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Excited? That's a pretty tall order. I am excited mainly by people leaving the social media networks and doing things in real life. I am excited when people decide to visit each other's houses or meet outside in a park or something.

How do we ensure safety, privacy and freedom of expression all at the same time?

You cannot ensure any of this. You can, instead, help people become more resilient and able to function in a world where safety, privacy, and freedom cannot be ensured—particularly not at the same time. Those sorts of totalizing goals are the way tech people think and talk, but they're not of the real world.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

We toggle? Like machines? I fear you are spending too much time with engineers. We do not toggle. Of the potential activities you mention, I guess "the role of citizens" is closest to my own agenda of helping people strengthen their cognitive immunity. Media literacy is certainly part of that. Social health and civics are others.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

I think Mozilla may be looking at this. I do not really see much out there.

What do you see as the risk of doing nothing to address the shortcomings of social media?

The greatest risk? We all die. An insane population cannot govern itself or address collective problems.

What models do you see coming on line for providing a digital community (beyond today’s ad-based, extraction
model) – platform cooperatives? Decentralized Autonomous Organizations (DAOs)? For example, are there promising
applications for the blockchain?

Sure, DAOs could work. Most are just for building platforms, or alternative social networks, or versions of Facebook that
are less this or that. I think the open web and TOR type things are interesting. Platform cooperatives are certainly better
than platform monopolies. But how many businesses really need to be operating as decentralized, non-local networks?
Most of us should be working in local cottage industries. Maybe use the networks for B2B connectivity between the
cottage industries. That’s called anarcho-syndicalism.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Liberal arts, civics, economics, ethics…. everything that’s not included in STEM.

There is a concern that the focus on improving social media is overly US-centric. Can you point to stellar work
happening outside the US?

I cannot point to any stellar work that’s happening in the US. Who is focused on improving social media in the US? I like
what they’re doing at Enspiral in New Zealand.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

All you have to do is break up the monopolies, so that smaller communities can make their own decisions.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

Algorithms optimize for extraction of user data or for engagement/addiction. If we tuned the algorithms for something
other than mental illness, they might not make people as sick.
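Read literally, "tuning the algorithms for something else" is just a change of objective function. The hypothetical Python sketch below contrasts an engagement-only ranking with one that also weighs a well-being signal; the posts, signals and weights are invented for illustration, not any platform's real system.

# Same candidate posts, two objective functions. All values invented.
posts = [
    # (title, predicted_engagement, predicted_wellbeing_impact)
    ("Enraging rumor", 0.9, -0.8),
    ("Neighborhood meetup", 0.4, 0.7),
    ("Friend's photo album", 0.5, 0.5),
]

def engagement_only(post):
    _, engagement, _ = post
    return engagement

def wellbeing_adjusted(post, alpha=0.6):
    # Blend engagement with a (hypothetical) well-being estimate, e.g.
    # from user surveys; alpha sets how much engagement still counts.
    _, engagement, wellbeing = post
    return alpha * engagement + (1 - alpha) * wellbeing

for objective in (engagement_only, wellbeing_adjusted):
    top = max(posts, key=objective)
    print(objective.__name__, "ranks first:", top[0])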

What makes you optimistic that we, as a society, will be able to improve social media?

Dude—you have the cart and horse reversed. I do not want society to focus on improving social media. I want social
media to focus on improving society. I am optimistic about those who have given up on social media. All media is social.
The platforms you’re calling “social media” are the least social media ever developed. As people realize this, and turn to
real life and other forms of media to accomplish social ends, we can improve society.

Connect with Douglas Rushkoff at @rushkoff



LEARNING FROM THE COMMUNITY

Soraya Chemaly
Writer & Executive Director of The Representation Project

Tell us about your role:

I am the Writer & Executive Director of The Representation Project, an organization that uses films, activism and art to challenge discriminatory norms and stereotypes.

Tell us about your career path and how it led you to your work's focus:

As a writer and an activist, my work is focused on women's equality, freedom from violence, and freedom of expression. I write and speak frequently on topics related to gender norms, inclusivity, social justice, free speech, sexualized violence and technology.

My writing appears in The Atlantic, The Nation, TIME, Salon, The Guardian and The New Statesman and speaks to our growing need for corporate inclusivity, freedom of speech, comprehension of identity in the creation of knowledge, human systems, data and technology. I am also the author of Rage Becomes Her: The Power of Women's Anger, which has been translated into several languages, and a contributor to multiple anthologies, most recently Free Speech in the Digital Age and Believe Me: How Trusting Women Can Change The World.

Prior to 2010, I spent more than 15 years as a market development executive and consultant in the media and data technology industries. I currently serve on the national board of the Women's Media Center. I have also served on the boards of Women, Action and The Media, Women in Journalism, and the DC Volunteer Lawyers Project, as well as on the advisory councils of the Center for Democracy and Technology, VIDA, Secular Woman, FORCE: Upsetting Rape Culture, No Bully and Common Sense Media DC. As an activist, I have spearheaded several campaigns challenging corporations to address online abuse, restrictive content moderation and censorship, and institutional biases that affect free speech.

In your opinion, what are the biggest issues facing social media?

Widespread social distrust – of fellow citizens, of institutions, of truth, of experts – pervades society and, so, social media. Social media companies continue to promote the idea that technical solutions can solve socio-technical problems like this one.

How do we ensure safety, privacy and freedom of expression all at the same time?

We expand our understanding of safety, privacy and FoE so that they are grounded in the experiences of the most vulnerable instead of, as they are now, the most powerful. Each of these is defined, substantively, by assumptions about life experiences, perceptions of threat, assessments of risk.

When we discuss improving social media, we often toggle between the
responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of
citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the
most improvement?

The areas that need the most improvement are those that reside at the intersections between these sectors. Each has its
own areas of focus and addresses different dimensions of complex problems. Where concerns overlap, we find critical
expressions of cultural values that social media companies minimize in a cultural preference for tech fixes. Social media is
a sociotechnical system and requires sociotechnical understandings, at every level.

What do you see as the risk of doing nothing to address the shortcomings of social media?

Deeper social distrust, conservative and deeply authoritarian retrenchment, violent political disruptions.

How does social media look different five years from now?

It will be more deeply embedded in the Internet of Things and in our bodies.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Philosophers, historians, social scientists, ethicists.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I do not believe governments can solve this problem. Particularly since social media companies benefit and profit from
operating transnationally and in proto-governmental ways.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

While the role that algorithms play in individuals' experience is significant, I believe that the impacts of algorithms, systemically and at scale, are far more worrying. Racism, sexism and caste-enforcement operate invisibly and powerfully
to reproduce and empower long-standing discrimination. Individuals may never see or personally experience or realize
that they are subject to these effects.

What makes you optimistic that we, as a society, will be able to improve social media?

Our desire for novelty and adaptability to media.

Connect with Soraya Chemaly @schemaly

"I do not believe governments can solve this problem. Particularly


since social media companies benefit and profit from operating
transnationally and in proto-governmental ways."

-Soraya Chemaly, Writer & Executive Director of The


Representation Project



LEARNING FROM THE COMMUNITY

Steven Renderos
Executive Director of MediaJustice

Tell us about your role:

I am the executive director of MediaJustice, a national organization fighting for the media and technology rights of people of color. I have been in my role for the last year, having taken over from MediaJustice's founder, Malkia Devich-Cyril. In this role, I am responsible for ensuring our work is mission-aligned and putting us on the path toward our ultimate vision, a future in which everyone is connected, represented and free.

Tell us about your career path and how it led you to your work's focus:

I grew up in Los Angeles, a child of a Salvadoran immigrant, at the height of a nationwide war on drugs and anti-immigrant fervor in California. The stories told about my community rarely matched up with my lived experience and, for as long as I can remember, I was passionate about telling my own story through the creation of media. I wrote for my school newspaper, produced films and even pretended to be a radio DJ. In college, I was a host at my college radio station and founded Radio Poch@, a DJ collective that still broadcasts a weekly radio show out of a community radio station in Minneapolis, Minnesota.

After college, I became a community organizer working on issues from affordable housing to police violence to immigrant rights. Across all those issues, I realized how critical it was to tell our own stories but felt challenged by a media ecosystem with many corporate gatekeepers. I came across MediaJustice not just as an organization, but as a political framework that weaved together many beliefs I held. When I first heard our founder (Malkia Devich-Cyril) speak at an event, I remember them saying that the "power to communicate, and therefore the power to transform our society, belongs to everyone." I knew then I wanted to be part of a movement led by people of color to reshape our media system. I have been in the MediaJustice movement for over 12 years and have been fortunate to fight on the cutting edge of issues that are transforming our society.

In your opinion, what are the biggest issues facing social media?

In short, the scale and business model of social media networks contribute to a variety of major issues. First and foremost, the unrelenting data collection these platforms depend on to generate revenue via ads has put their users, particularly people of color, in harm's way. Deceptive and outright racist ads have leveraged this data to discriminate. Platforms have shared this data with third party vendors that have used it for political purposes, e.g., in the 2016 U.S. election.

Additionally, the platforms have been designed through their algorithms to reward highly engaging content, from what posts show up on your news feed to what groups you're suggested to join. And much like news coverage of crime in
the 1990s when “If it bleeds it leads” dictated editorial decisions, social media platforms reward the most sensational and
hateful content. This has created the conditions for disinformation and misinformation to spread, which has led to the
mainstreaming of fringe conspiracies like QAnon. Prior to social media, white supremacists were smaller in ranks and disconnected. In the social media age, white supremacy has grown as fast as Facebook's algorithms will allow. Social media has been a tool used to mainstream white supremacist beliefs and incite violence that has caused offline harms. We saw that during the summer of 2020 with the violence perpetrated by white supremacist groups as a way to co-opt protests in response to the killing of George Floyd.

Lastly, these companies have been left to regulate themselves, as policy making has not caught up to the threats these
platforms pose.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

There is no singular solution that will do the greatest good, but there are a multitude of solutions that together can start
to tackle the many complicated issues these platforms pose to our society.

First, solutions should tackle the business model of social media companies. Compel transparency around the algorithms
these companies use and adopt civil and human rights protections to ensure the most vulnerable users in the US and
beyond are safe. Create limits to the data that can be collected and prevent data from being shared with third party
vendors. And transition ownership of that data to the individual that data belongs to.

Second, break up the monopoly power these companies hold. It is not just that Facebook also owns Instagram (which is a
direct competitor) but that alongside Google, these two companies account for the vast majority of growth in the digital
market space. These companies should be broken up.

And the last solution I have heard about is taking the social media approach at a smaller scale to tackle the infodemic. I
have heard of hyper local social media networks at a city level with strong curation rules that attempt to engage people
in healthy dialogue over the issues facing their communities. A key part of the solution to address misinformation is
strengthening where people are getting their news from. Hyper local networks where people rely on their neighbors is a
start, and so will revitalizing journalism.

How do we ensure safety, privacy and freedom of expression all at the same time?

Balancing safety, privacy, and freedom of expression begins with acknowledging who historically and currently does not
enjoy those protections on social media. According to Pew Research, 1 in 4 Black people in the U.S. has faced online
harassment because of their race. Women on Twitter have been doxxed by misogynists so routinely that it even
prompted a research project by Amnesty International (“Toxic Twitter”) to study this phenomenon. Arabs and Muslims
have seen social media used to mobilize violent actions directed at mosques.

Meanwhile, for years, the right has tried to co-opt the debate over free speech by claiming that social media platforms
are biased against conservatives. This runs counter to studies that routinely show right-leaning content performs better
on the platforms.

Balancing safety, privacy and freedom of expression should start where the greatest harm is happening. To quote a
recent op-ed by MediaJustice founder Malkia Devich-Cyril, “When an oppressed minority seeks equality and justice, and
freedom from the harm and violence brought on by the systematically privileged speech of others, that’s not censorship,
that’s accountability.”

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

Governments have largely left these large platforms to regulate themselves. Platforms have proven ill-equipped to address many of the problems that have cropped up, from the proliferation of disinformation to the spread of white supremacy. There is a critical need for policy to define what behavior and practices are lawful and what is not. The
platforms will never make choices that are counter to their own business practices, and only the adoption and
enforcement of laws can ensure that people of color are protected.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

MediaJustice is a part of a couple coalitions that are aiming to improve social media, including the Change the Terms
coalition, which was founded by people of color-led organizations to tackle the spread of online hate. We’re part of the
leadership council of the Disinformation Defense League, which is aimed at fighting against racialized disinformation.
Organizations within these coalitions, including Color of Change, Free Press, United We Dream, First Draft, the Lawyers' Committee for Civil Rights, the National Hispanic Media Coalition and others are trying to improve social media for good.

What do you see as the risk of doing nothing to address the shortcomings of social media?

There is a high concentration of people in society today who live in an alternate reality shaped by misinformation. The
Capitol riot was a violent manifestation of what happens when a lie, like that of a stolen election, is directed at the
government. Democracy itself hangs in the balance, as an electorate divorced from reality can vote in a government with
the ability to cause harm on a massive scale. We spent the last four years seeing a president run a country via social
media, and as always it was communities of color who faced the brunt of the harm.

How does social media look different five years from now?

We forget that Facebook is less than 18 years old (not even old enough to vote in a metaphorical sense). Other major
platforms that people have gravitated toward lately are less than five years old. A helpful point of comparison for me is
streaming platforms. For a while, there has been an undisputed giant in Netflix, but in recent years more niche platforms
are emerging. I think the same is likely to happen to social media, where we’ll network ourselves along place, identity or
topic. This happens now on platforms like Facebook, but I do see an appetite for stepping outside of that infrastructure.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

It is hard to feel confident that the government can solve this conundrum when it wasn’t long ago that a sitting US
Senator described the Internet as a “series of tubes.” That said, I would argue against false equivalencies. Deplatforming
Trump was a necessary action, albeit too late, to hold a public official accountable for inciting violence. Doing so was not
counter to protecting vulnerable voices on platforms. I believe the standards should be higher for public officials who are
given a platform to directly reach their audience, not lowered because their speech is inherently newsworthy, despite
the harm they cause. I also do not see the issue with platforms as one with an endpoint, for as long as platforms for
speech have existed from newspapers to radio to TV and now the Internet, the question of whose voices are centered
has always been an issue. And so long as white supremacy, embedded in U.S. institutions, continues to define who has
power and wealth, then any new innovation in communications will face the same challenge of amplifying the speech of
the privileged while suppressing dissenting voices.

What makes you optimistic that we, as a society, will be able to improve social media?

There is a greater understanding that local work matters. The multiracial alliance that drove the highest voter turnout in
U.S. history was rooted in local organizing. The way to combat information deserts is to rebuild local journalism, as the
work of Free Press’s News Voices is attempting to do.

In my first job out of college, as an organizer for a tenants union, I had a chance to organize neighborhoods that were composed of immigrants and poor white folks. They fought together because they shared an interest in living in a safe
and clean neighborhood. They had more in common with each other than differences, and organizing together helped
make that real. The silver lining in this moment where social media has driven divisions far and wide is that healing those
divisions has to start where we’re at. For most of us, that’s in the communities we live and work in every day.

Connect with Steven Renderos @stevenrenderos



LEARNING FROM THE COMMUNITY

Pia Zaragoza
UX Mentor at Springboard, Presidential Innovation Fellow

Tell us about your role:

I am a technologist, researcher and educator exploring the current use cases and future possibilities of emerging technologies. The work that I do is at the intersection of design, research, engineering and computer science. Previously, I was the Vice President of Accessibility Research and Insights at a Fortune 500 bank. Currently, I am a Presidential Innovation Fellow working alongside federal changemakers in developing creative solutions to complex problems. In addition to this, I also mentor students at Springboard who are looking to take the leap into the field of User Experience (UX).

In your opinion, what are the biggest issues facing social media?

Eliza Greenwood, Accessibility Trainer and Digital Marketer, said it best when she posed the question, "Who is or isn't showing up in the different social media channels?" Digital accessibility, and the lack thereof in social media, is one of the biggest barriers for people with disabilities. UsableNet reported that, in 2020, web, mobile app and video accessibility lawsuits were up by almost 25% in the United States. For companies without existing efforts in inclusive design and accessibility research, more needs to be done to understand the ability, identity, habits and preferences of people of all abilities. This framing builds off of the research of Dr. Amy Hurst. It is all about ensuring that social media enables people with disabilities to be seen, heard and understood.

What makes you optimistic that we, as a society, will be able to improve social media?

In Clay Shirky's book Here Comes Everybody: The Power of Organizing Without Organizations, he wrote, "The tools are simply a way of channeling existing motivation." The 2020 Accenture Technology Trends report makes a similar point: "people do not just want more technology in our products and services; we want technology that is more human." To some degree, this speaks to the intrinsic motivation, on a societal and personal level, that will potentially drive the post-pandemic social media landscape.

Though I am optimistic that, as a society, we will be able to improve social media, it is important to consider what could possibly go wrong and what has gone wrong. Misinformation and disinformation, or the "infodemic," has in recent times proven to be a significant threat to progress. It will be interesting to see what emerges from a policy and regulatory standpoint, as well as how nascent technologies can better address prevention and intervention efforts.

Connect with Pia Zaragoza @pzaragoza

Amanda Lenhart

Program Director, Health + Data, Data & Society Research Institute

Tell us about your role:

I am the program director for Health and Data at Data & Society Research Institute. I lead a team that studies both how technology and data are used in medical and non-medical settings to improve human health and how the use of technology can adversely affect well-being.

In your opinion, what are the biggest issues facing social media?

A fundamental tension faced by companies that build and maintain social media platforms is between a moral responsibility to minimize the harm that their products cause to users – either through what they directly cause or what they enable other people to do – and the pressures for revenue and growth that incentivize duration, stickiness and other platform design practices that may diminish user well-being. When user well-being is foregrounded, it is typically through the most measurable, but often the least important and impactful, modes of action – e.g., focusing on reducing screen time without working to improve the overall climate of the user experience on a platform. The scale of the problems of platform management, especially content moderation, but also cross-cultural complexities, push platforms to look for automated, AI-driven solutions that lack nuance and often punish more users who do not look like a platform's imagined average user.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am currently most interested in actions that are implemented inside technology companies that improve the process by which products are designed and bring a holistic concern for user well-being to every step of the design process, rather than parachuting in right before launch – a sort of "Well-being by Design" initiative that empowers those deputized to think about user well-being with the power to make decisions and make change. While it is important to give users choices, we can no longer place all of the burdens of managing risk and avoiding harms at the feet of the users.

How do we ensure safety, privacy and freedom of expression all at the same time?

We do not. There is no right to freedom of expression on private platforms. Free speech does not mean freedom from the consequences of your speech, and unbridled free speech so often ends up silencing all but the most powerful groups. To safeguard safety and privacy and preserve the well-being of users – and frankly the utility of our spaces as sites of civic dialogue – we cannot have completely ungoverned speech.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of the media to educate the general public,
governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your
opinion, what area do you think needs the most improvement?

We have spent over 15 years working to educate the public and enhance the media literacy of platform users. It is time to
hold platforms more accountable for the harms to users, especially those that affect structurally marginalized
communities. These are difficult problems to manage. Platforms have not prioritized their moral responsibilities over
their pursuit of profits, and many need nudges from regulators to rebalance those priorities.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Technology companies need to enhance the diversity of their workforces – not just in who they hire, but who they retain
and promote. Employees of companies at all levels need to reflect the breadth of the users of their products and services.
Not just diversity by race and ethnicity, but by age, parent status, gender identity, disability status and other categories
of users who are not always well-served by platforms and their moderation. And where this diversity cannot be readily
acquired or maintained, companies must seek out and do direct research about users to better understand their
experiences on the platform. Further, emphasizing exposure to a variety of academic disciplines like the social sciences
and humanities, including but not limited to thinking about ethics among all types of hires – including engineers and
computer scientists, not just among legal and trust and safety teams – will also help seed ideas about responsibility to
users at all levels and across roles in technology companies.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

Algorithms indelibly shape user experiences on many social media platforms. They also help companies manage scale
problems by eliminating human moderators or assisting them with content moderation. However, the lack of
transparency in the algorithms that govern these systems, and bias enabled by assumptions built into automated
systems, harm certain sub-groups of users (especially structurally marginalized groups) and make it difficult for anyone
using the system to understand and critically respond to outputs. Requiring algorithmic impact assessments prior to the
launch of algorithms used (or modified) by platforms can start the process of providing greater transparency, especially
with regard to an algorithm's outcomes.
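
As a concrete illustration of the idea, here is a minimal sketch of what a machine-readable algorithmic impact assessment record might capture before launch. The field names and the toy launch gate are our illustrative assumptions, not any regulator's or platform's actual schema.

from dataclasses import dataclass, field

@dataclass
class AlgorithmicImpactAssessment:
    # Hypothetical record a platform might publish before launching or
    # modifying a user-facing algorithm. All field names are illustrative.
    system_name: str
    intended_purpose: str
    affected_groups: list[str] = field(default_factory=list)            # incl. marginalized sub-groups
    training_data_sources: list[str] = field(default_factory=list)
    known_bias_risks: list[str] = field(default_factory=list)
    mitigations: dict[str, str] = field(default_factory=dict)           # risk -> mitigation
    measured_outcomes: dict[str, float] = field(default_factory=dict)   # group -> disaggregated metric
    published: bool = False                                             # is the assessment public?

    def ready_for_launch(self) -> bool:
        # Toy gate: every named bias risk has a mitigation, outcomes were
        # measured for every affected group, and the assessment is public.
        risks_covered = all(r in self.mitigations for r in self.known_bias_risks)
        groups_measured = all(g in self.measured_outcomes for g in self.affected_groups)
        return risks_covered and groups_measured and self.published

The point of such a record is less the code than the discipline it encodes: outcomes measured per group, and mitigations named per risk, before launch rather than after.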

Connect with Amanda Lenhart @amanda_lenhart

"The scale of the problems of platform management, especially


content moderation, but also cross-cultural complexities, push
platforms to look for automated, AI-driven solutions that lack
nuance and often punish more users who do not look like a
platform's imagined average user."

-Amanda Lenhart, Program Director, Health + Data, Data &


Society Research Institute



Laura Manley

Director, Technology and Public Purpose Project, Harvard Kennedy School

Laura Manley is the inaugural Director of the Technology and Public Purpose (TAPP) Project at the Harvard Kennedy School Belfer Center. At TAPP, Laura launched several new initiatives, including the TAPP Fellowship, the Tech Fact Sheets for Policymakers series and the Tech Spotlight, to make advances in tech more inclusive, safer and fairer. She has also testified twice to Congress on her research to improve science and technology expertise in government.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Section 230 reform could help address some concerns, but the real focus needs to be on the business models of social media companies and how they incentivize engagement at all costs.

How do we ensure safety, privacy and freedom of expression all at the same time?

There is no silver bullet to address this question. Actions across multiple fronts need to be taken – policymakers need to reform CDA 230 and evaluate advertising liability; companies need to improve and update their content moderation policies; and advocacy organizations and citizens must hold social media platforms accountable.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

Government oversight. It is not the responsibility of my 65-year-old mother to understand the intricacies of Facebook's manipulative algorithm and how to protect her data. It has gotten to a point where the government needs to intervene because there are real harms being done to our democracy and citizens.

What makes you optimistic that we, as a society, will be able to improve social media?

I like watching historical movies and shows. It always reminds me how far we've come as a society on so many fronts – healthcare, education, equality and so on – so it gives me hope that we can overcome anything. It may take a long time, but we will eventually figure out a solution to the social media dilemma. We are at the uncomfortable moment in time when the tech has outpaced the societal guardrails, but they will sooner or later be developed.

Connect with Laura Manley @Laura_Manley1

John Rousseau

Partner at Artefact

Tell us about your role:

I am a Partner at Artefact, a strategy and design firm, where we work with clients to shape more responsible futures. My practice is primarily concerned with understanding change, identifying strategic choices and envisioning future scenarios, products and services across multiple time horizons. I am interested in emerging technologies, new business models, social change and complex systems.

In your opinion, what are the biggest issues facing social media?

Social media is a wicked problem. While it is easy to identify singular issues like disinformation, prospective solutions like better content moderation or authentication invariably encounter complex systemic relationships between the business model, incentive structures, product design, technologies, competing interests, power, politics and human behavior (among others!). This makes specific problems incredibly difficult to solve, unless we adopt a holistic perspective and are willing to address the core dysfunction of the system. Doing so would require significant efforts across industry, government and society; tradeoffs amongst various stakeholders; and a willingness to completely redesign the policies, products and platforms themselves. Today, the system is functioning as designed and in the interests of those who designed it.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am particularly interested in decentralized identity, an idea most prominently advocated by Balaji Srinivasan. His concept – Pseudonymity – refers to a middle ground between anonymity (where identity cannot be verified) and transparency/permanent identity (which can be verified but is high-risk/fragile). A pseudonymous identity might include a range of authenticated and secure profiles: for example, a social identity, a transacting identity and a permanent identity – all encrypted on a blockchain ledger and owned by the individual. This would improve social media by creating ways to build social capital without fear of reprisal, improving discourse while protecting free speech, leveling unbalanced power relationships, increasing participation and minimizing the impact of bad actors and bots.
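
To make the pattern concrete, here is a toy sketch, under stated assumptions, of how a single user-owned master seed could derive separate, mutually unlinkable signing keys for a social, a transacting and a permanent identity. This is our illustration of the general idea, not Srinivasan's actual design; the HMAC-based derivation and the Python cryptography library are assumptions, and the blockchain ledger layer is omitted entirely.

# pip install cryptography  (assumed dependency for this sketch)
import hashlib
import hmac

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def derive_identity(master_seed: bytes, context: str) -> Ed25519PrivateKey:
    # Derive an independent signing key per context ("social", "transacting",
    # "permanent"). Different contexts yield unlinkable public keys, yet only
    # the holder of the master seed controls all of them.
    context_seed = hmac.new(master_seed, context.encode(), hashlib.sha256).digest()
    return Ed25519PrivateKey.from_private_bytes(context_seed)

master_seed = bytes(32)  # in practice: randomly generated and held only by the user
social = derive_identity(master_seed, "social")
transacting = derive_identity(master_seed, "transacting")

# A post signed under the social pseudonym is verifiable by anyone...
post = b"hello, town square"
signature = social.sign(post)
social.public_key().verify(signature, post)  # raises InvalidSignature if forged

# ...but nothing links social.public_key() to transacting.public_key()
# unless the owner chooses to prove the connection.

Accounts that are verified in this sense yet still pseudonymous are what would let platforms push back on bots without forcing users to expose a single permanent identity.
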
When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

Responsibility starts with platforms – as I said previously, they are functioning as designed. And yet, as Jaron Lanier has
argued, the technology companies are unlikely to fix themselves absent real external pressure, due to conflicting
incentives and a distorted world view built on the primacy of their products.

Following Lanier, the only form of agency most of us have is to simply stop using the products and collectively imagine a better future. I use the word "imagine" to suggest that responsibility for envisioning the future falls to each of us, and to society at large. It is not the exclusive domain of pundits, technology companies or the government (though the latter can and should structure incentives that are aligned with a clear vision).

We need to question our assumptions about how the system works, and what outcomes we seek, if we are to have any voice in shaping a better world. So – how might the future be different? Will there be perfectly targeted advertising, or no advertising at all? Will you own your personal data, or will others? Is engagement the primary metric, or well-being?

What models do you see coming online for providing a digital community (beyond today's ad-based, extraction
model) – platform cooperatives? Decentralized Autonomous Organizations (DAOs)?

The proposals I have seen for new platforms – particularly those with novel incentive structures – seem like they would
create a range of unexpected negative externalities at scale. Similarly, fully decentralized networks could address issues
of data sovereignty and disrupt the extraction model but could struggle to maintain order and moderate the flow of
disinformation without robust and reliable governance. This is particularly true as real power is involved, as the
discourse becomes more and more polarized and as the system influences real-world outcomes.

How does social media look different five years from now?

In five years, the most likely outcome is that social media is a lot like today, but worse. It will be dominated by the same
monopolistic platforms, which have made superficial changes to policies and products in response to public pressure and
regulation. These fixes will fail to have a significant impact because they did not address root causes (e.g., business
model) and systemic dysfunction (e.g., incentivizing platform growth and user engagement). The 2024 US presidential
election will be a high-water mark for disinformation.

This is not a hopeful vision, I know. My sense is that, in terms of social media and several related issues, things will get
much worse before they get better. Or rather, before enough people demand change, are willing to make personal
sacrifices for the greater good and are willing to act on that belief. Until then, keep scrolling.

What makes you optimistic that we, as a society, will be able to improve social media?

The reason to be optimistic about the future is that, unlike social media, society is (hopefully) enduring and has the
capacity to improve itself over time. Social media is not a mirror that merely reflects society; it is more like a powerful amplifier turned up to 11 and stuck in a runaway feedback loop. To make meaningful progress, we need to turn the
volume way down, begin from first principles, question our assumptions, and truly address the complex and systemic
issues that got us here with an eye toward a preferable future.

Connect with John Rousseau @john_rousseau

"We need to question our assumptions about how the system works,
and what outcomes we seek, if we are to have any voice in shaping a
better world."

-John Rousseau, Partner at Artefact



trade-offs they’re making and how to
LEARNING FROM THE COMMUNITY
protect themselves. I want users to make
intentional and informed digital safety

Camille Stewart
decisions based on their risk tolerance
and to understand the consequences of
each choice, similar to the physical safety
decisions they make each day. A more
informed citizenry will make demands of
both the industry and government to
Cyber Fellow, Harvard Belfer Center drive the market and direct government
regulation and oversight. All these areas
are important, but we are underinvesting
in comprehensive citizen education on
digital literacy and civics – especially as
educating low-income, rural and minority
populations continues to be largely left
out or underfunded.

Part of our mission at All Tech Is Human


is to diversify the people working in the
tech industry. In your opinion, what
academic or experience backgrounds
should be more involved in improving
social media?

All backgrounds should be represented.


Not just academic or experience but also
race, ethnicity, gender, religions, etc.
Those pieces of our identity influence
and ultimately drive how we engage with
technology, how we see the world and
how we innovate. Our diversity is our
greatest asset. We all have a role to play
in elevating new voices and different
lived experiences. I co-founded
#ShareTheMicInCyber to spur action and
remind cyber practitioners that as
individuals they can elevate the voices of
Tell us about your role: making (e.g., convenience v. their underrepresented colleagues, in
privacy, convenience v. security). this case Black cyber practitioners. Learn
I work to empower others through more HERE.
technology by elevating and When we discuss improving social
working to solve the complex media, we often toggle between Connect with Camille Stewart @CamilleEsq
challenges at the intersection of the responsibility of platforms, the
technology, security, society and role of media to educate the
the law. general public, governmental
oversight, and the role of citizens
In your opinion, what are the in terms of literacy and how they
biggest issues facing social media? engage with platforms. In your
opinion, what area do you think
The conflicting societal needs the most improvement?
expectations, trust, truthfulness,
and an under-informed and under- We need to invest in educating
engaged user base that does not citizens on digital literacy and civics
understand the tradeoffs they are so they better understand the
David Ball

Headstream Director at SecondMuse

Tell us about your role:

I am grateful for the opportunity to lead Headstream, an innovation program focused on shaping an economy around the digital places where young people can thrive. Headstream is powered by SecondMuse, a B-Corporation that is constantly probing for systems, structures and economies that are primed for socially and environmentally just transformation.

Two years after starting our exploration of teens, technology and wellbeing, Headstream stands at the heart of both a pandemic that has magnified our dependence on digital spaces, and a fight for racial justice that has awakened the country to the injustices BIPOC communities endure. Work that was critical two years ago, as rates of depression, anxiety and loneliness among young people rapidly increased, is now at the forefront of how we as a society adapt to parallel crises.

Headstream is an initiative exploring the intersection of technology and wellbeing, focused on the opportunity for digital places and experiences to allow young people, particularly Black, Latinx, Indigenous, LGBTQIA+ and multi-hyphenated youth, to thrive and find support through the technology that is intertwined in their lives. We do this through research, like our crowdsourced mapping project Digital Delta, through the Headstream Accelerator, which supports entrepreneurs working on social technology innovations, and through our Youth 2 Innovator program for our future leaders.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Over the last year we have accelerated and incubated more than 30 social technologies working to improve digital places for young people, through the Headstream Accelerator for market-based innovations and the Headstream Incubator for young entrepreneurs with transformative ideas.

Technology is intertwined in our daily actions. We are dependent on, reliant on and at times consumed by social technologies. For most young people, there isn't a distinction between IRL and life online; they run in parallel. Embedded in the notion of living online is a responsibility to create digital innovations that empower young people to build new skills, skills that allow them to build meaningful relationships, relationships that enable an environment that fosters creativity, patience, learning, collaboration and where young people can be seen and heard.

We admire social platforms like 2Swim and Trill Project that create safe communities for young people to thrive, create meaningful connections and explore their evolving identities. The startups Weird Enough Productions and Novelly provide young people with the
agency to create and connect around art and writing in a digital place where they can address the most important social
issues in their lives. Online games have also become incredibly social experiences for young people. Innovations like
Liminal Esports, Social Cipher, EquipT and Gamersafer are using gaming platforms to provide safe digital experiences for
young people to connect and develop new skills critical to their wellbeing, all while having fun.

These are the types of social technologies we believe will meet the wellbeing, relationship and developmental needs of
young people moving forward.

What do you see as the risk of doing nothing to address the shortcomings of social media?

We have a long way to go to create a digital economy that meets the needs of all of its participants. Structural injustices exist both for the creators of these new technologies and for communities of users who are often completely ignored.
At Headstream, we strive to deconstruct the unjust system that exists for women as well as Black and Indigenous People
of Color in the technology sector. We take a stand, through the innovations that we source and accelerate, to bring
solutions that serve communities of users who are traditionally unjustly served by social technologies. Our fear is that
we will lose a generation because we didn’t take care of their wellbeing in time. Social technology needs to be a space
where young people can nurture their talents as they grow up, connect meaningfully and find support when times are
tough.

How does social media look different five years from now?

Young people across the country have cracked the code and are flourishing because of social technologies. They are
connecting and collaborating with communities they may never have imagined existed, bridging socioeconomic and
racial divides. Their voices, ideas and creativity now have the potential to reach every corner of the globe. They can find
and pursue their diverse and unique passions. And of course, they can play and find joy. What would growing up be
without that? All of these incredibly rich and positive experiences point to the real social value that technology can
create. As Ose Arheghan, a national youth LGBT advocate and member of the Headstream community puts it, “Having
access to technology has been really empowering because I have been able to educate myself on my identity and what
that means for me, and I am not confined to what other people think being a part of the LGBT community means.” That
experience exists for some today; in five years, hopefully it exists for almost all young people.

Connect with David Ball and Headstream @headstreaminno

"Embedded in the notion of living online is a responsibility to create


digital innovations that empower young people to build new skills, skills
that allow them to build meaningful relationships, relationships that
enable an environment that fosters creativity, patience, learning,
collaboration and where young people can be seen and heard."

-David Ball, Headstream Director at SecondMuse



Liz Lee

Founder, OnlineSOS; Sr. Product Trust Partner, Twitter

This interview represents the personal views of Liz Lee and not those of Twitter.

Tell us about your role:

I currently am a Sr. Product Trust Partner in Trust & Safety at Twitter, where I develop product policies and make recommendations on feature development. Previously, I started OnlineSOS to support people, particularly journalists, facing online harassment. OnlineSOS was the first and only nonprofit in the US to provide free, professional support for online abuse, including mental health services and case management. While we no longer provide direct support, we have checklists and other resources to empower people with information to take action. Our Landscape Report maps out the ecosystem's needs and recommendations for systemic change. Our Online Stalking Legal Project outlines what rights people have in select states to get restraining orders.

Tell us about your career path and how it led you to your work's focus:

I accidentally landed in this field! I am a very private person who does not like using social media. My early career was spent investing in VCs and startups. But in 2015, while at Morgan Stanley, I was staring at two monitors, one with news headlines of people committing suicide due to online harassment, and the other showing the financial performance of social media apps. I had never spoken about my own experience being stalked and extorted online. I realized I wasn't alone; this experience impacts 85 million Americans. I asked myself: How many more people need to be harmed before I do something? I started sharing my story; less than a year later, I left my job and launched OnlineSOS to create the tools I wish I had.

What do you see as the risk of doing nothing to address the shortcomings of social media?

Social media isn't going away. As new technologies, interactions and content formats emerge, the problems will only continue to get thornier. There is an opportunity to address problems before they develop, as platforms and features are designed, to limit potential harm. Not enough people took these issues seriously in the past, resulting in many of the problems today. It is important to address the shortcomings now as we see them, to foster authentic introspection within a company and communication across companies. It is critical to proactively identify those blind spots. The risk of doing nothing is high – doing nothing can cause irreparable and inadvertent harms. Amplification of content occurs at a rapid velocity that can alter conversations and perceptions, and potentially cause physical harm to individuals, groups and communities.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what academic or experience backgrounds
should be more involved in improving social media?

People who have been trained in the social and behavioral sciences, and who understand human interaction and behavior as well as historical developments and societal patterns, should consider entering or contributing to the field. Sociologists, psychologists, historians and community organizers know – whether firsthand, through one-on-one primary research, or through academic study – the very challenges that we grapple with on a day-to-day basis, both in person and online. This is especially crucial in order to prevent systemic inequalities from being mirrored or exacerbated in online spaces.

There is also a need for people who can speak to the experiences of those from historically underrepresented and marginalized communities, including but not limited to communities of color, LGBTQ+ people and women, as well as people who understand systemic racism and power dynamics. They are a necessary part of the overall solution to our puzzle. Anyone who can use data to advocate for users, and who keeps in mind the structural or systemic biases that we may have – not only in our algorithms, but also in our design and decision making – is critical.

Tech companies can build formal processes and initiatives for planning, prioritization/stack ranking, escalations and risk mitigation, but companies are made of individuals and, in the middle of a crisis, an individual will choose what to prioritize and whether to use their political capital to raise an issue. However, it is also the responsibility of tech companies to ensure that diverse candidates are welcomed and set up to succeed; we cannot and should not assume that everyone should fit into the existing tech culture.

What makes you optimistic that we, as a society, will be able to improve social media?

There has been a major shift in the public consciousness about the role of social media and the importance of online
experiences. Online harassment, toxicity, abuse, misleading information – these issues are now increasingly well
understood. Five years ago, people were still saying, “Turn off your computer, it is not real life.” Awareness and a demand
for change are the first steps.

Now, as more talent and dollars go into addressing multiple prongs of the range of challenges, I am certainly optimistic.
Improvements to social media are no longer a “nice to have” but generally understood and accepted as a “must.” Getting
funding for initiatives tackling online abuse is easier in 2021 than in 2015.

It is easy to be resigned about the progress that we haven’t made, but we also cannot discount the shift in tide of public
opinion. Real, lasting, systemic and structural change will require radically new levels of cross-stakeholder collaboration,
with tech companies, grassroots and advocacy groups, community organizers, lawmakers, researchers, educators,
technologists, policy makers, etc.

The announcement of the Biden/Harris Online Harassment Task Force indicates that they are taking violence against women and online harassment seriously. Vice President Harris demonstrated an understanding of cybercrime issues, including nonconsensual pornography, as Attorney General and then Senator of California. She has a track record of engaging with members of the community who understand the problem well, including advocates for victims/targets of abuse and grassroots groups.

Connect with Liz Lee @wanderwonderful

"It is easy to be resigned about the progress that we haven’t made,


but we also cannot discount the shift in tide of public opinion."

-Liz Lee, Founder of OnlineSOS



Ashley Boyd

VP of Advocacy and Engagement, Mozilla

At Mozilla, Ashley mobilizes millions of internet users to stand up for privacy, security, inclusion and decentralization online. She leads global campaigns that empower individual internet users and hold Big Tech like Facebook, Amazon and YouTube accountable.

In your opinion, what are the biggest issues facing social media?

The AI systems powering social media are some of the most influential forces in the world. They're also some of the most opaque. Despite influencing what billions of people see, read and believe, Facebook, YouTube and other platforms do not disclose how they develop and train their algorithms, nor do they share data on how this AI influences users. As a result, these AI systems can misinform, polarize and radicalize millions of users, and society has no way to understand how or push for change.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

At Mozilla, we're building crowdsource-powered tools that make opaque social media platforms more transparent. For example, we launched the RegretsReporter browser extension that lets people send us data about the recommendation rabbit holes that they get sent down on YouTube. Similarly, during the 2020 U.S. elections, Mozilla researched and published platforms' policies on misinformation and disinformation; access for researchers; ad transparency; and consumer control.

We plan to continue that work looking at global elections in 2021. And Mozilla's Common Voice allows people to donate voice data to contribute to an open dataset that can be used to make AI-enabled voice technology more inclusive.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

We toggle between these areas because each of them is required for a healthy information ecosystem. Our information ecosystem cannot and should not be governed by any one of these actors. Rather, it is a matter of media, platform companies, people and democratic institutions working together to create a system of accountability that has checks and balances against the power of any one actor. Meaningful transparency is an essential starting point for decentralizing power and increasing individual agency.

In the U.S. context, civil society organizations have been more successful holding the platforms to account in recent years, but we're about to see a large swath of new legislation. It is crucial that those civil society groups continue to engage in that process since we've
gained a deep understanding of, and firsthand experience with, many of the harms that social media can exacerbate in real life.

What models do you see coming online for providing a digital community (beyond today's ad-based, extraction
model) – platform cooperatives? Decentralized Autonomous Organizations (DAOs)? For example, are there promising
applications for the blockchain?

Tech today collects our data and then uses it to make important decisions about us, from what news we read to who we
date. And there is a serious power imbalance at work here: Why should Amazon know so much about our shopping
impulses? Why should Facebook get to create secret profiles on us?

To rebalance power, we need new data governance models like data trusts, which operate as trusted intermediaries
between consumers and big tech platforms.
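
Here is a minimal sketch of that intermediary pattern; the class and method names are invented for illustration (a data trust is above all a legal construct), but the information flow is the point: members deposit data with the trust, and a platform's query succeeds only for purposes the member has consented to.

class DataTrust:
    # Toy trusted intermediary: holds members' data and releases it only for
    # purposes each member has approved. Purely illustrative.
    def __init__(self):
        self._records = {}   # user_id -> data
        self._consents = {}  # user_id -> set of approved purposes

    def deposit(self, user_id: str, data: dict, approved_purposes: set):
        self._records[user_id] = data
        self._consents[user_id] = set(approved_purposes)

    def query(self, user_id: str, requester: str, purpose: str) -> dict:
        # The trust, not the platform, enforces the member's terms.
        if purpose not in self._consents.get(user_id, set()):
            raise PermissionError(f"{requester}: purpose '{purpose}' not consented")
        return self._records[user_id]

trust = DataTrust()
trust.deposit("alice", {"interests": ["cycling"]}, approved_purposes={"research"})
trust.query("alice", requester="platform-x", purpose="research")        # allowed
# trust.query("alice", requester="platform-x", purpose="ad-targeting")  # raises PermissionError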

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

We need to shed the harmful misconception that technologists alone should develop technology. Today’s consumer tech
products, from search engines to voice assistants, impact billions of people every day. As a result, we need people with a
range of expertise, from sociologists to psychologists, to design, build and govern these systems. That’s how we fix blind
spots and prevent technologies from inadvertently excluding or manipulating entire communities. My colleague Kathy
Pham wrote a great essay about exactly this for Fast Company.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

One concern we have with how algorithms affect the social media experience is that they often drive people toward
more extreme content. Case-in-point: In the weeks leading up to the 2020 U.S. elections, we called on Facebook to
discontinue group recommendations as evidence mounted that people were increasingly being exposed to
disinformation and misinformation after joining groups recommended by Facebook’s algorithmic recommendation
engine. People who might never have been radicalized otherwise end up going down a rabbit hole on YouTube, engaged
in a feedback loop on Facebook or otherwise exposed to harmful content largely because of algorithmic
recommendations.

Giving consumers more control over content recommendations, eliminating algorithmic recommendations and
increasing transparency about how platforms’ algorithms are trained are all ways we can help mitigate this problem.
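
To see why user control matters here, consider a deliberately crude simulation of the rabbit-hole dynamic. This is our toy model, not Mozilla's research or any platform's actual system; the 5%-more-extreme step and the user-set ceiling are invented assumptions.

def engagement_optimized_next(history):
    # Recommend content slightly more extreme than the last item engaged with,
    # since (in this toy model) extremity predicts engagement.
    return min(1.0, history[-1] + 0.05)

def user_controlled_next(history, ceiling=0.4):
    # Same recommender, but the user has set a hard cap on extremity.
    return min(engagement_optimized_next(history), ceiling)

for picker, label in [(engagement_optimized_next, "engagement-optimized"),
                      (user_controlled_next, "user-controlled")]:
    history = [0.1]  # extremity scores in [0, 1], starting with mild content
    for _ in range(30):
        history.append(picker(history))
    print(f"{label}: final extremity = {history[-1]:.2f}")

# engagement-optimized: final extremity = 1.00
# user-controlled: final extremity = 0.40

Even this toy version shows the shape of the problem: when the objective is engagement alone, drift toward the extreme is the equilibrium, and a user-held control is the simplest circuit breaker.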

What makes you optimistic that we, as a society, will be able to improve social media?

Recently, more and more people have engaged in the challenge. There is increased political will to work toward solutions,
and platforms themselves recognize that, between consumer power, advocacy from civil society and government
interventions, they will have no choice but to make crucial improvements.

Connect with Ashley Boyd @ashleyboyd



Julie Inman-Grant

eSafety Commissioner, Australian Office of the eSafety Commissioner

Tell us about your role:

For the past 4+ years, I have served as Australia's eSafety Commissioner—I started with a staff of 35 and have grown it into a nimble and innovative agency of 115. Established in 2015, the Office of the eSafety Commissioner (eSafety) is an independent online safety regulator and educator whose sole purpose is to ensure that our citizens have safer and more positive experiences online. We are the first regulator of its kind in the world and take a multipronged approach to achieving our goals. We focus on Protection—through our reporting schemes and investigations; Prevention—through education, programs and awareness raising; and Proactive & systemic change—where we try to stay ahead of technology trends and work with industry to encourage them to develop safer online products.

eSafety has a range of civil powers to compel takedown of illegal or harmful content, whether it's child sexual abuse material, pro-terrorist content, image-based abuse (colloquially known as "revenge porn," but we don't call it that), or serious cyberbullying of a child. There is pending legislation that would give us additional powers to compel take-down of serious adult cyber abuse and require companies to live up to Basic Online Safety Expectations ("the BOSE"), including the ability to compel transparency reports—to reduce opacity in policies, but also to understand how certain issues are being tackled and whether companies are enforcing their policies consistently and fairly.

Tell us about your career path, and how it led you to your work's focus.

I most definitely didn't imagine in the 1980s, when I was attending university in Boston, that I would end up becoming a government regulator of the technology industry in the Land Down Under, but my career has been bookended by roles in government. My first job interview out of university was at the CIA [and] was to analyze the psychology of serial killers, but I ended up taking a role on Capitol Hill with my hometown Congressman instead. I was working on a range of social justice issues, but the Congressman asked if I would take on the break-up of the Baby Bells and look after technology policy because we had a "small little software company in our District called Microsoft." So, in 1991, I embarked upon a career that worked at the intersection of technology, safety and policy, before there was an Internet.

After a stint in the NGO sector and in Brussels, I landed as Microsoft's second DC lobbyist immediately prior to the US DOJ antitrust case. In this role, I was involved in shaping Section 230 of the CDA, helping to organize the first White House Summit on Online Safety during the Clinton Administration and, after five LONG years, I moved to Australia to start their corporate affairs function in the region. I developed my specialty in safety, security and privacy in an APAC
role and finished my 17-year career at Microsoft as the global head of privacy and safety policy and outreach at Redmond HQ. I had two exciting and eye-opening years at Twitter setting up and running their public policy and philanthropy functions in ANZ and South East Asia before joining Adobe as their head of Government Relations across Asia Pacific. Nine months later, this poacher became a gamekeeper and I was appointed to serve as eSafety Commissioner of Australia.

In your opinion, what are the biggest issues facing social media?

The failure of corporate leadership to recognize and embrace their tremendous societal impact and the ill effects
technology can have on humanity and to actively take responsibility for these hazards. Had more of these companies
prioritized the safety, privacy, security and overall well-being of their users and balanced the imperatives of “profits at all
costs” with their responsibility to prevent and protect a range of online harms, they would be in a much better position. If
you add to that tremendous market power, the perception of evasion of international taxes, and occasional recalcitrance
towards governments, the biggest issues facing them will be the force of global governments regulating them in ways
that might be both unworkable, inconsistent and detrimental to their future growth. So, to me, this is the biggest threat
to the industry.

[T]he looming threats to users of social media and the industry’s ability to address these threats involve the various ways
threat actors are weaponizing their platforms to spread child sexual abuse material, pro-terrorist/extremist content and
other forms of illegal content—these cause the most harm to society but there are also a range of technology tools
available to tackle these issues, if there were greater will to do so. It is the more “contextual issues” and the forms of
harmful content that aren’t patently illegal and are likely to require more “ecosystem change,” investigation and human
moderation that are likely to be more challenging to tackle. This includes issues related to incitement or facilitation of
violence and sexual harassment.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about? How
do we ensure safety, privacy and freedom of expression all at the same time?

I served as an “online safety antagonist” within industry for more than two decades and I could never convince company
leadership that addressing “personal harms” should be elevated to the same status as privacy or security. I brought
“safety by design” to Microsoft leadership more than a decade ago and while there was a tacit understanding of the
importance of online safety, it was never given the priority, investment or attention that the other disciplines were.
Whilst at Twitter, I saw the devastation that targeted online harassment wrought on humanity every single day – and it
demoralized me too. The company that I was so excited to join, that stood for the levelling and promotion of voices online that previously weren't heard, simply wasn't doing enough to protect those voices, particularly marginalized voices. I
could not defend this anymore, so I sadly left a company that I saw having so much potential to do good in the world.

As eSafety Commissioner, I built an incredible team to work WITH industry to create a set of “safety by design principles”
that were achievable, actionable and meaningful. I understood that this is something we needed to do with industry
rather than to industry to be effective, as it will involve changing the ethos of how technology design, development and
deployment typically happen. We went “deep” over about 8 months to uncover innovation and best practice in this space
to elevate as examples and ended up with three sets of principles: "Service Provider Responsibility"; "User Empowerment and Autonomy"; and "Transparency and Accountability." Because we want industry to be successful at assessing risk at
the front end and building in safety protections to prevent misuse rather than retrofitting after the damage has been
done, we also WANT companies to be successful at achieving higher levels of safety. So, we decided that we’d take the
principles and turn them into a free, interactive assessment tool so that companies could use this as an audit tool of sorts,
learn how to address safety weaknesses and have a robust “safety impact assessment" to help them build their roadmap.
This tool will be released in a few months—one tool is for start-ups, the other is for more mature enterprises.

Safety by design doesn’t end there – we believe the VC and investment community has an important role to play in
ensuring user safety as a way to ensure more ethical investing, managing risk and in preventing “tech wreck moments” –
these are preventable. In January 2021, we released an investor toolkit. We're also piloting safety by design curricula in four universities in Australia – we believe the next generation of coders, designers and engineers should be building technology with ethics, human rights and safety in mind.

By the way, I reject the supposition that privacy, safety and freedom of expression are diametrically opposed or mutually exclusive. They need to be balanced—and occasionally recalibrated—like the four legs of a stool.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

They all need improvement and need to work in harmony if we are going to make the online world more hospitable, civil
and positive. This balance has informed the way in which I structured eSafety. Everything we do is evidence-based, so I
have an internal research team that delves into qualitative and quantitative measures, and this informs our public
messaging, education materials and resources. These are designed to reach specific audiences, whether parents,
educators, senior citizens or children themselves with the aim of helping citizens to harness the benefits of technology,
understand and mitigate risks with pragmatic and actionable solutions. We are aiming to encourage behavioral change
(which takes a long time) and measure that impact through evaluation. We reject purely fear-based messages and also
leverage the education sector to help reinforce messages and incident response throughout a child’s educational
journey.

Clearly, we believe that government oversight is required to serve as a “safety net” for our citizens when online abuse
falls through the cracks of their content moderation systems, to remove harmful content and when necessary use civil
penalties to punish perpetrators and fine content hosts. While I'd much rather use the carrot, there are times the stick is
definitely needed. And, as expressed through our commitment to safety by design, we absolutely believe that industry
has to do better in making their platforms safer, more secure and that they need to be both more transparent and
accountable for harms that take place on their platforms. They build the online roads, they also need to erect the
guardrails, occasionally police those roads for dangerous drivers and enforce the rules so that other users don’t end up
online roadkill.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

There are so many people doing such great work all around the world, committing to make the world a safer place. We
notice that examples of such work often focus on North America and Europe and that very few people outside of the
safety community know what we’re doing in Australia. We may be small and far away but we think what we’re doing for
our citizens is pretty unique and has impact.

There are some incredible technologists out there devoting their brain power and careers to making the online world a
better place—this includes Dr. Hany Farid, from Berkeley and the inventor of PhotoDNA, Christian Berg from Sweden
who has developed tools for law enforcement and several companies like NetClean and Paliscope. There are some really
great safety tech companies popping up too, including Spectrum Labs, Sentropy, Hive, Tiny Beans, Family Zone and
numerous others.

There are incredible researchers and advocates, particularly all of those affiliated with Global Kids Online, that bring a
lot of rigor mixed with a genuine concern for children and human rights, mixed with common sense. Dr. Sonia
Livingstone, Amanda Third and Anne Collier come to mind and I love the work of Sameer Hinduja and Justin Patchin of
cyberbullying.org. They are the real deal!! Some of the female lawyers and academics in the US working on intimate privacy and ethical AI and advocating for women and minorities online are doing groundbreaking work – Danielle Citron, Mary Anne Franks and Safiya Umoja Noble are my she-roes! I am honored to work with some amazing human
beings through the WeProtect Global Alliance including Baroness Joanna Shields, Ernie Allen, Julie Cordua of Thorn and
passionate advocates like John Carr. It is amazing what a bit of compassion, strategy, brains and strong communications
skills can do to enable meaningful change!

What do you see as the risk of doing nothing to address the shortcomings of social media?

Yes, the risk of doing nothing is too great. That is why we are doing stuff here in Australia. One of those things is seeking
to build capacity and capability in the online safety regulator space. We expect that there will be a network of online
safety regulators in the next 5 years, which is great. It’s been lonely at times, and challenging not having a playbook to
refer to. For us to have real impact, we need some "pincer moves." Ireland, the UK, Canada and Fiji will be the next batch of online safety regulators—we hope this catches on further with the European Digital Services Act. We are also heartened
that Biden/Harris are talking about forming an Online Harassment Taskforce.

Of course, one of our key goals is for Safety by Design to really take off and to become the de facto way that companies design, develop and deploy (or refresh) their technology. This is not incompatible with innovation—in many ways it
requires innovation—and I would argue it’s a much better investment to embed safety upfront than to have to re-
engineer after a major regulatory, revenue or reputational threat.

Can governments solve this conundrum?

I don’t think any one sector can “solve” this conundrum on its own—but certainly, more governments affirmatively
committing not just to curbing tech industry power but also to ensuring that they are regulating for a range of safety and
privacy shortcomings would be a good start. eSafety was established to serve as a “safety net” and I think this is a good
model. I don’t think governments should be in the role of serving as the “censors” or “arbiters of speech or the cultural
wars” but they have a role in pointing out when banter turns to serious abuse and when hate speech poisons public
discourse.

Each social media company can be seen as a house. They inhabit a global neighborhood where there are zoning rules and
laws that prevent encroachment on others. They set their own house rules so as long as those rules are transparent and
clear, they should be able to decide how to discipline those who violate these rules. It might mean sending little Johnny to
his room or it might mean kicking the mean, drunk uncle out. But you are still not allowed to commit crimes in your house
—and you need to let the police in (with a warrant) when wrongdoing occurs. There will always be the need for a housing
commission or police to enforce such rules – not all homeowners are going to be law-abiding or respect their neighbors' rights.

Speech is much more difficult, of course. And, given the power, amplification potential and standing of the user, that level
of influence (or not) should also have bearing on how the rules are designed and enforced.

Connect with Julie Inman-Grant @tweetinjules

You can connect with the Australian Office of the eSafety Commissioner at esafety.gov.au

"While I'd much rather use the carrot, there are times the stick is
definitely needed. And, as expressed through our commitment to safety
by design, we absolutely believe that industry has to do better in
making their platforms safer, more secure and that they need to be
both more transparent and accountable for harms that take place on
their platforms."

-Julie Inman-Grant, eSafety Commissioner of Australia



Merve Lapus

VP Education Outreach and Engagement at Common Sense

Tell us about your role:

I oversee Education Outreach and Engagement for Common Sense Education, including school adoption, district implementation, parent/community engagement, strategic marketing and community development. Centered on supporting communities to create positive learning cultures around media and technology, I develop leaders focused on empowering kids and families to be thoughtful and innovative consumers of media and technology.

Tell us about your career path and how it led you to your work's focus:

I have worked in EdTech for over 18 years, focused on developing programs and resources to support academic and social emotional building blocks for learning and life for kids. At Common Sense, my work began with supporting districts throughout CA to address digital citizenship. Now, 11 years later, I focus on developing a team to provide high-quality professional development and community engagement for educators/schools across the country.

In your opinion, what are the biggest issues facing social media?

I think one of the biggest issues is whether or not social media will continue to be a private enterprise, or if it (or at least some/one of the platforms) will become more of a public domain-like space, i.e., a "digital town square."

Other issues include... Social displacement: the ongoing cultural belief that face-to-face time is valued within any given community's norms. Is that still true? Challenging online norms of communication: the information and narrative constructed from online information and platforms can confuse adolescents during their final stages of personal development. Exacerbating cultural appropriation: everything from Fortnite dances to TikTok challenges to racial language has become appropriated freely and often without accountability and/or credit. Misinformation also continues to be a pervasive issue. Civil discourse is encouraged and important: how has misinformation affected free speech vs. hate speech?

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Redefine norms of online engagement. Create legislation and policies that authenticate and legally hold users and platforms accountable. Otherwise, it goes without saying that education at both the federal and state levels on the merits of digital citizenship goes a long way. Integrating digital citizenship into K-12 instruction has been largely successful in educating kids and their communities to better prepare and support kids as they engage with social media platforms. Shifting from ad-based monetization would also greatly benefit how content is
shaped, pushed and developed for these platforms.

How do we ensure safety, privacy and freedom of expression all at the same time?

This is a really tough one, because – legally speaking (at the federal level) – a lot of hate speech is protected as free
speech. From a different perspective, I think we can advocate that hate speech is the enemy of free speech, since hate
speech is intended to silence marginalized voices and therefore does not fit into the scope of civil discussion. But what
exactly hate speech is and isn't is itself up for debate (on both sides). And there are disagreements – even within
the different camps – over where the line gets drawn. One solution I think about a lot, and I know has been discussed
elsewhere out there, is a form of ID for people to use social media. If social media is to become a digital public square,
then building in a form of accountability that exists in the real world might be a solution. However, this is a fraught
solution, as there are huge privacy and identity implications for various individuals and groups. And it could have the
opposite effect (i.e., in countries where dissent isn't tolerated, for victims of stalking, etc.).

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

I think there is probably lots more room for improvement elsewhere (government, platforms, etc.) but in terms of where
some improvement is most likely to happen, and most likely to make a difference? I think education. More digital
citizenship education is probably something that would have the biggest ROI in terms of impact. But it is not a quick fix;
you'd have to do it right, then wait a generation to see the results. People probably want quick change, but in order to do
that, policy may need to hold platforms accountable, and platforms themselves will need to step up. Primarily: drastically adjusting revenue models and setting firm language to address free speech vs. hate speech.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

Common Sense has been doing a lot of work focused on improving social media and the platforms that support them. On the K-12 education side, Common Sense provides a free, robust curriculum addressing digital citizenship across K-12 schools, focused on empowering kids and families to think critically, navigate safely and participate responsibly with media and
technology.

Policy-wise, Common Sense Kids Action has been working with lawmakers to pass a number of state and national
policies focused on addressing privacy, dark patterns and hate speech, to name a few. An additional resource supporting
these efforts is the Common Sense Social Media Scorecard focused on looking at how different platforms handled the
flood of videos, memes and hashtags based on last year's elections. This provides a framework for how these platforms
work.

More cynically, everyone who has consciously quit using social media in recent years and everyone who continues to do
so is an improvement. =)

What do you see as the risk of doing nothing to address the shortcomings of social media?

Hyperchambers, filter bubbles and misinformation, all exacerbated. I think an end to democratic government and relative peace in the world – not just in the US, but worldwide – is something that has to be seriously discussed at this point. It sounds hyperbolic, but it is not: violence, conflict, all-out war.

How does social media look different five years from now?

I fear that shows like Black Mirror have already shown us what social media might look like five years from now. With services and products becoming so heavily socially rated, and social platforms exacerbating validation-type experiences, what is to stop our social interactions from being rated the same way? I wouldn't be surprised. My hope is that platforms take a stand on hate speech and misinformation and facilitate spaces that are truly about fostering positive and accountable communities.


Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

More ethicists. And not just those who have come from within the industry. We cannot just "tech our way out of this" so
to speak. Documentaries like “The Social Dilemma” present the solution to the problem as coming from within tech, and
as much as those voices are helpful, lifting diverse perspectives outside of the industry is important. Mental health
professionals should be more heavily consulted, and their scrutiny should be weighed as heavily as their hopeful
perspectives. Educators also should be consulted, as their understanding of whole child development is essential. These
platforms have terms meant for adults, but we know children are heavy consumers and contributors.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

As long as platforms (and their algorithms) are developed for profit, the problem will persist. They optimize and develop
your persona, focused on capturing your attention and investment – information and experiences that can then shape
thinking and behavior. Maybe a truly nonprofit social platform would be the way to go? I do not know how that would
work, but you gotta have the will to make it happen and have investors invested in people, not potential profit.

What makes you optimistic that we, as a society, will be able to improve social media?

Quite honestly, I am not optimistic about this. There are a lot of great experiences and relationships that social media
provides us, but largely, the disinhibition and disconnected nature of platforms foster many issues, perpetuated by the
platforms themselves. I do see glimmers of collective support and movements designed to bring people together, as
opposed to lengthening our divide, but until we thrive on unity and not one-sided satisfaction, social media will continue
to act as a fractured community.

Connect with Merve Lapus at @molapus

"I think one of the biggest issues is whether or not social media will
continue to be a private enterprise, or if it (or at least some/one of the
platforms) will become more of a public domain-like space."

-Merve Lapus, VP Education Outreach and Engagement at Common Sense



LEARNING FROM THE COMMUNITY

Jeff Collins
Sr. Director, Global Trust & Safety, TikTok

Tell us about your role:

At TikTok, I lead a team that works to create a safe and welcoming environment for our community. We develop policies and features that help people have a positive app experience, and we work with a range of experts and non-profit organizations to continually improve our approach. What this means differs on any given day. One day I may be working with our product designers to help create features to promote screen-time management and wellbeing, the next I may be working with an academic expert to refine our approach to misinformation, and the next I may be conducting interviews to grow our world-class team. Adaptability has never been more important!

Tell us about your career path and how it led you to your work’s focus:

A desire to make a positive contribution to our world drives me personally and professionally. I grew up hearing stories about my grandfather, a U.S. naval attaché and pilot who led the evacuation mission at the outset of the Korean War, and I craved that sort of impact. To satiate that desire, I joined the State Department as a Foreign Service Officer and helped navigate some difficult bilateral relationships and advance U.S. human rights initiatives in Cuba, Iraq, Turkey, Bolivia and Venezuela. I also had the honor of serving in the Obama National Security Council as the first Director for Turkish Affairs.

I left government service to return to my home state of California as Senior Counsel for International Policy at Chevron. There, I helped integrate a human rights framework into the company's business model.

Then, craving a more innovative environment and the opportunity to have a greater impact on the direction of a company, I moved to the world of tech to join a startup. I applied my cross-cutting background to help the startup After School, a U.S.-based teen social network that was all the rage in 2016. I joined the company after it had experienced numerous problems related to its hypergrowth, and worked with the co-founders to reinvent the company into one that acted proactively to prevent bullying and harassment.

Most recently, I decided to apply my experience to larger-scale problems and again strive for global impact. TikTok was growing by leaps and bounds and had exactly what I was looking for: an array of challenging safety issues, cross-cultural teams and a global user base – and the energy and excitement of a startup.

In your opinion, what are the biggest issues facing social media?

As an entertainment platform, TikTok shares many challenges related to user-generated content and online community governance that have affected social media platforms, including how to
prevent the amplification of misinformation and disinformation, hateful behavior, and other harmful content. At the
heart of these challenges are the sheer volume and virality of content that is created (around the world, tens of
thousands of videos are uploaded on TikTok every minute, some reaching millions of views the same day). We are
focusing on how best to moderate at scale in a way that empowers the originality of our creators while keeping our
communities safe.

At TikTok, we are tackling these challenges by developing clear policies grounded in evidence and the experience of the
past decade in social media, sound processes to train human moderators on how to understand and apply these policies,
more robust feedback loops between the two, machine learning algorithms to better detect potential policy violations,
and greater transparency. I am particularly proud of and excited by our work to generate more frequent and detailed
Transparency Reports to help external stakeholders better understand our work, as well as our Transparency and
Accountability Centers – physical locations where experts can take a look under the hood and learn about how we
approach content moderation, how we keep our platform secure and how our For You feed algorithm works.
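
To make the interplay of those pieces concrete, here is a minimal, hypothetical sketch – invented names and thresholds, not TikTok's actual system – of how automated detection, human review and a feedback loop can fit together:

# Hypothetical sketch of a "policies + ML detection + human review + feedback
# loop" moderation pipeline. All names and thresholds are illustrative only.
from dataclasses import dataclass, field

REMOVE_THRESHOLD = 0.95  # act automatically only on high-confidence scores
REVIEW_THRESHOLD = 0.60  # route uncertain cases to human moderators

@dataclass
class Post:
    text: str
    violation_score: float  # stand-in for a classifier's output

@dataclass
class ModerationPipeline:
    removed: list = field(default_factory=list)
    review_queue: list = field(default_factory=list)
    training_data: list = field(default_factory=list)  # fuel for the next model update

    def triage(self, post: Post) -> None:
        # Clear-cut violations are actioned automatically; borderline
        # content goes to a human who can weigh context the model cannot.
        if post.violation_score >= REMOVE_THRESHOLD:
            self.removed.append(post)
        elif post.violation_score >= REVIEW_THRESHOLD:
            self.review_queue.append(post)

    def record_decision(self, post: Post, label: str) -> None:
        # Human decisions become labeled examples: the feedback loop
        # between moderators and the detection model.
        self.training_data.append((post.text, label))

pipeline = ModerationPipeline()
pipeline.triage(Post("obvious policy violation", 0.97))  # removed automatically
pipeline.triage(Post("borderline satire", 0.72))         # queued for human review
for post in list(pipeline.review_queue):
    pipeline.record_decision(post, "allow")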

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am incredibly inspired and influenced by the groundbreaking work of Eli Pariser and Talia Stroud at Civic Signals (now
New_ Public). Eli is well known for his leadership on "filter bubbles." He coined the term in 2011 – a year that saw both
the Arab Spring and a "can do no wrong" high point for big tech. From his work at MoveOn.org and Upworthy, Eli
recognized and called attention to the potentially harmful impact of algorithmic amplification, the siloization of
communication and online rabbit holes. Eli and Talia have put tremendous effort into researching and developing a
framework to help us use learnings and signals from how we design and use real world physical spaces (parks, town halls,
city streets, etc.) to design online spaces with respected rules and norms.

If we are honest, it is clear that we need to move beyond what I call Trust & Safety 2.0 – reliance on large-scale machine
learning detection and human moderation to police platforms. A new design-influenced paradigm that draws on lessons
from the offline world is an instructive element that can help us invent (because this truly does need to be
invented) Trust & Safety 3.0 and create safe, positive and trustworthy spaces.

How do we ensure safety, privacy and freedom of expression all at the same time?

These are the kinds of trade-offs that my team at TikTok navigates daily. Safety, privacy and freedom of expression are
all principles that we endeavor to uphold; however, at times those priorities conflict with one another. For example,
when we consider when to send safety resources to users who may have had a video reported for self-harm, we also have
to consider their privacy. Or, when we design educational campaigns to raise awareness about eating disorders, we also
have to consider protections for others who may find that content triggering. There is no perfect approach here, but we
do a few things to help us strike the right balance:

- We bring diverse perspectives into our decision-making, both from our teams and through partnering with experts.
- We take a broad view of the potential harm we must mitigate to include several dimensions – physical, psychological, societal, etc.
- We iterate on our policies and tactics frequently to ensure we keep pace with – and ultimately anticipate – new trends and threats.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

When it comes to creating a safer online environment, the responsibility is shared. The platforms, governments, media
and public all need to be involved – especially because many of the challenges we face online have a nexus with broader
societal challenges, such as the uptick in polarization and decrease in trust in institutions we've seen in many areas of the
globe. With that in mind, I believe strongly that we as individuals need to step up in industry, in the government, in civil
society, and as community members. We also need to be proactive in building much greater connectivity and
collaboration across these dimensions. Doing this requires creativity and fresh thinking. This is an ever-evolving
landscape that requires moonshot goal-setting, commitment, investment and fierce determination.

What people and organizations do you feel are doing a good job toward improving social media?

This list is long, which is important to recognize because the general public is unaware of the amazing work being done by
a global army of people and organizations to help platforms promote creativity and a diversity of viewpoints, while
maintaining guardrails to keep different online communities safe and generate greater societal trust in our companies. In
addition to All Tech Is Human, which is doing amazing work to infuse a thoughtful, responsible approach into high tech,
I'll mention just a few:

Aspen Digital's mission is to help policymakers, civic organizations, companies and the public to be responsible
stewards of technology and media in the service of an informed, just and equitable world. I am excited to be a part of
this organization's Virtually Human Working Group, whose purpose is to identify critical issues at the nexus of
human connection and technology and develop a repository of best practices, shared definitions and state-of-the-art
methodologies and measurement tools.

The DQ Institute (DQI) is an international think tank that is dedicated to setting global standards for digital
intelligence education, outreach and policies. We partnered with DQI to develop a safety guide for Safer Internet
Day and are working to develop educational videos based on their recognized standards for digital literacy, digital
skills and digital readiness.

Among the many amazing organizations working on mental wellbeing and technology, Crisis Text Line continues to
be a leader in providing real-time support that saves lives. Our partnership with Crisis Text Line, launched last year,
is helping address the needs of our community of users in the U.S. We're looking forward to doing much more in the
coming year to make a positive impact on our community.

What do you see as the risk of doing nothing to address the shortcomings of social media?

Given the events that led to the so-called "techlash," invasions of privacy, amplification of dangerous conspiracy theories,
election disinformation campaigns and much more, inaction would be disastrous. But I do not actually think we're at risk
of doing nothing; I work with too many dedicated Trust & Safety colleagues to believe that. But I do feel that there are
some challenges that will be really difficult to solve. After all, the problems we deal with are large, complex societal
problems that involve education systems, legislative systems and diverse social and cultural mores.

We are at a point where "Alone Together" has gone from a dire warning (in Sherry Turkle's groundbreaking book) to an
empowering and empathetic TikTok campaign to support one another during the pandemic. It is a fool's errand to try to
stop your kids from using screens. These are realities, not trends, and they will continue to morph and evolve along with
the evolution of our social systems. So for me, the key question is: how can we harness these technologies to move
humanity in a positive direction? How can we use technology, including the social dimensions of platforms, to help
educate, recognize and treat mental health issues, invent ways to save our climate, advance equity and inclusion in our
communities and build bridges across cultures, while minimizing the negative externalities (which we will never
eliminate, given that we are human beings)?

What models do you see coming online for providing a digital community (beyond today’s ad-based, extraction
model) – platform cooperatives? Decentralized Autonomous Organizations (DAOs)? For example, are there promising
applications for the blockchain?

Blockchain companies definitely are pioneering some interesting community governance models that seek to advance
the Reddit- and Wikipedia-type models of involving users in the creation and application of rules. There will certainly be
interesting lessons to learn from these efforts.

Industry has made progress on key issues, via the Global Internet Forum to Counter Terrorism, the Technology Coalition
(for CSAM-related issues) and multi-stakeholder initiatives like the Global Network Initiative. In just one recent example
of industry cooperation, TikTok is working to bring companies together to minimize the proliferation of suicide-related
content across platforms and think more deeply about how we approach mental wellbeing. Of course, there have been
pointed (and not invalid) criticisms raised about the manner in which companies come together to make content
decisions (see, e.g., Evelyn Douek's writings on "content cartels").

While it is unpredictable how things will move forward, we do know that actual progress in dealing with content in a
manner that is just and balances the variety of stakeholder concerns will require more, rather than less, collaboration
across government, civil society and business.

How does social media look different five years from now?

The landscape already has changed so significantly that what we initially called "social media" barely exists. Rather, we
are in a world where entertainment and social [media] have fused, with companies falling somewhere along a spectrum
of being more pure entertainment (think Netflix or Hulu here) vs. pure social networking (think Facebook here).

As this evolution continues toward social-entertainment, my hope is that in five years platforms will have made strides to
understand, respond to and help address important societal issues from mental health to education to the creation of
career opportunities. In the near term, this means more proactive work to address issues around diversity, equity and
inclusion as well as potential bias in algorithms, maturing our efforts to balance safety and privacy considerations and
enhancing our approach to supporting the mental health and wellbeing of our users. At TikTok, we're investing in
growing our team and expanding our commitments across these areas. I am optimistic that our progress will be evident
with concrete, real-world impacts in the years to come.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

More than any particular background, I think having expertise from a cross-section of disciplines is important. I am a
huge believer in the "triple strength leadership" paradigm, which holds that today's increasingly complex global
challenges – from climate to economic growth to health – require multi-faceted leaders who can bring to bear experience
in the public, private and social sectors to create solutions (which of course require the involvement/application of
various types of specialized expertise) – see https://hbr.org/2013/09/triple-strength-leadership

Tech in general, and Trust & Safety in particular, sits at the nexus of countless major issues, trends and challenges. As we
leverage our technical capabilities to address major societal issues – health, equality and democracy – we need people who
can connect the dots, so to speak. At the moment, one of my top priorities is to attract talent with expertise in the social
sciences, human rights and AI ethics, to help us strengthen the way in which we connect, support and give voice to our
user/creator community.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I believe cross-sector work is incredibly important because neither governments nor platforms can answer this question
alone. Laws and regulations will continue to play an important role in setting consistent expectations for platforms and
representing the views of citizens. But we are global entities that span multiple countries and legislative systems. This
means that we must work actively and cooperatively with governments and civil society across the globe to help develop
approaches that are fit-for-purpose, rather than blunt instruments with negative unintended consequences. Speaking for
my team at TikTok, we are constantly evolving our policy frameworks so that we can continue to empower people to
create and share authentic content that lifts up rather than denigrates or creates harm. And we are always seeking out
feedback to do a better job of this.

What makes you optimistic that we, as a society, will be able to improve social media?

Two things: my colleagues and the creativity people bring to social media every day.
I have the privilege of leading an incredibly dedicated Trust & Safety team, and I wish I could convey just how true it is
that we put user safety first. I have colleagues around the world who hop on calls in the middle of the night or work
through the weekend to make sure we are responsible stewards of the creator experience and what all users encounter
online. They have thoughtful and difficult conversations about our work and sometimes take on an emotional load when
they review harmful content and make tough decisions. At the same time they bring great passion to their work and take
pride in the meaningful strides we make to counter bad actors and empower our community. This makes me feel
extremely optimistic about the future of our app and for our online lives going forward.

I also am consistently impressed with the ways people leverage platforms to educate one another, organize for the
common good and simply create cool and interesting material. In safety-focused roles we tend to think first of
harmful content or tough policy calls, but the majority of what creators are putting out into the world is positive,
purposeful, encouraging and fun. I think that's what people want to see online, so I am hopeful that, as we tackle the
many challenges ahead of us, we can make sure to amplify the good.
LEARNING FROM THE COMMUNITY

Pinal Shah
Behavioral Engineer, Robinhood

Tell us about your role:

As a Behavioral Engineer at Robinhood under the Trust & Safety team, I work on increasing the safety, security and privacy of our users. Behavioral engineering is about taking into account human psychology and people's lived experiences as determinants for how they will interface with technology. I use behavioral insights to help people make stronger security and safety decisions. Another element of my job is reverse engineering social engineering. Social engineering is when a bad actor exploits another person and tricks them into giving up personal or confidential information that the person would not have given under normal circumstances, had they realized who the bad actor truly was. As we build our products and features, I think about the ways bad actors could try to trick our users, and I work with product and engineering teams to build mitigations against such actions. I also help our content teams with writing content that is easily digestible and written in a way that does not presuppose a certain level of experience or education.

What I love most about my role is that I get to cater to and take into consideration people from all demographics, all socioeconomic levels and educational backgrounds. This is truly what we mean when we talk about product inclusivity. Further, I think about how we are helping to nudge people towards making better security and safety decisions to help protect themselves, as well as educating them on the why.

Tell us about your career path and how it led you to your work’s focus:

I love to explain my career path as me jumping in between the overlapping circles on the Venn Diagram that is my life. I have so many varied intellectual and social interests, and throughout my career, I have loved stepping into roles that, at first glance, seem so random. But then I step back and realize that although my path to that role was in no way linear, it makes sense in terms of who I am and past work I have done. I first went to law school thinking I was going to work in International Human Rights. I chose my alma mater, Howard University School of Law, because of its legacy of building civil rights leaders and I wanted to walk in their shadows and be inspired by the greats, and I wanted the non-hegemonic legal perspective. While at HUSL, I became interested in federal government service and was introduced to the national security world. I spent several years working for the Department of Homeland Security as an Obama Administration appointee, where I delved into the worlds of international diplomacy, tech policy and cybersecurity.

After government service, I wanted to try my hand at the startup scene, so I joined Lyft, where I helped to build and scale their Privacy team. While that was a fascinating experience as we dealt with
CCPA and GDPR regulations and concepts like Privacy by Design, I knew I was ready for something that was closer to
the human interface with technology, which led me to Robinhood's Trust & Safety team.
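
As a concrete illustration of the behavioral-engineering mitigations Shah describes above, here is a minimal, hypothetical sketch – invented names, signals and rules, not Robinhood's actual logic – of adding friction and an educational nudge when an action pattern resembles social engineering:

# Hypothetical friction-and-nudge check for a sensitive account action.
# All signals, thresholds and copy are illustrative assumptions.
from datetime import datetime, timedelta

RECENT_WINDOW = timedelta(hours=24)  # a real system would weigh many more signals

def needs_friction(action: str, last_password_reset: datetime,
                   is_new_device: bool, now: datetime) -> bool:
    # A sensitive action soon after a password reset, or from an unfamiliar
    # device, matches a common social-engineering pattern.
    recently_reset = now - last_password_reset < RECENT_WINDOW
    return action == "change_payout_account" and (recently_reset or is_new_device)

def nudge(action: str) -> str:
    # Educate people on the "why" rather than silently blocking them:
    # the behavioral-engineering goal of supporting better decisions.
    return (f"Hold on: '{action}' is a common target for scammers. "
            "We will never ask you to do this over the phone. Continue?")

now = datetime.now()
if needs_friction("change_payout_account", now - timedelta(hours=2), False, now):
    print(nudge("change_payout_account"))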

In your opinion, what are the biggest issues facing social media?

I remember joining Facebook in college. At that time, you were just connected to people at your particular school (in my
case, University of California, Berkeley). It was a fun way to interact with more people and "poke” them so you could start
a conversation. It did not really serve as a replacement for real social engagement, but rather an extension of it. Over
time, social media platforms began to serve as catalysts for entire movements and as pulpits for the unheard.

We are at such a defining moment for social media as it relates to mis/disinformation. When you take our human limitations
in processing large amounts of information (aka the attention economy) and you compound it with our cognitive biases,
our echo chambers and the various agendas that people want to push, it is a recipe for disaster.

We are seeing that play out daily as it relates to Covid-19 misinformation, as well as in the role it plays in radicalizing
people politically, under the guise of exercising free speech and democratizing voices.

There is so much work to do in terms of creating a whole-of-society effort to combat misinformation and the impacts it
has on our society. Misinformation exacerbates existing societal fissures, and the rapid spread makes it that much harder
to combat. It is so easy to spread, and yet it is like Pandora's box. Once the information is amplified, reeling it back in is
nearly impossible. So the key is preventing the misinformation from getting out there to begin with. And that is the
challenge of this decade.

How do we ensure safety, privacy and freedom of expression all at the same time?

I remember one of my law school professors emphasizing that the First Amendment is the First Amendment for a reason.
Freedom of expression is what we as Americans hold as the most sacred right. It is critical to our American culture and
our democracy that we should maintain a healthy culture of the free exchange of ideas. It is what allows our society to
innovate and progress. But rights also come with responsibilities.

The intellectual discourse around the freedom of speech debate tends to settle around allowing all forms of speech, even
hate speech, to come forward, so that it may be brought to light, refuted, and progress can then be made. In theory, this is
a great way to discredit harmful and hateful ideologies, or just plain wrong information.

But this is premised on the assumption that such ideologies will in fact inherently be discredited before they cause harm. This is
not the case. In a pluralistic society such as ours which allows for such open discourse, we have to constantly remain
vigilant to the narratives that are being shared, who is sharing them and the motivation(s) behind them. This is not to
suggest that we live in a constantly-teetering-on-the-brink-of-censorship state. But because we know that the ideologies
that people share online can lead to physical and emotional harm, we all have a responsibility to maintain
vigilance against the spread of harmful ideologies and narratives.

We cannot be blind to the harms that online discourse can cause in the name of free expression, and we must work at
examining speech and curbing it when necessary, when it begins to infringe on others’ rights to privacy and safety.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

When it comes to improving social media, it is pretty clear to me that this is a whole-of-society responsibility. Although
platforms broker the engagement, there are downstream impacts on all of us. Most people get their news online now, so
media agencies have tailored their stories to fit into the mobile experience. Most companies now have social media
managers that cultivate an online presence for marketing purposes. We have now seen an entire presidency conducted
via Twitter (is it just a coincidence that Twitter's character limit bumped up to 280 characters during the Trump era?).

When it comes to making social media better, we need to ask ourselves: Who are we making it better for? Everyone has
an angle. Social media platforms have an interest in monetization, so they will constantly anticipate user needs and add
new features. But who is looking out for the user (beyond giving them cool new features)?

Using data privacy as an example, most data collection practices done by companies just five years ago were fairly
opaque, and most consumers did not pay much attention to their data. But as European legislation emerged and GDPR
took center stage, companies were forced to be more transparent about what data they collected about users, and to
give consumers more agency over their data.

So the government and lawmakers clearly have a role here, and we cannot just rely on companies to self-regulate. Media
companies have an added responsibility to verify the accuracy of what they share or procure via social media. And
individuals have a responsibility to stop and examine what they are consuming and maintain a healthy dose of skepticism
if it is not coming from a reputable source, especially if it appears to be emotionally charged.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

I think ethics organizations (like the Center for Humane Technology) and think tanks have really stepped up lately
and have been providing great research, public education and commentary on the pitfalls of lack of oversight when it
comes to social media regulations. I would love to see academia and non-profit orgs partner with public health
organizations to share more knowledge and awareness with the public on how social media impacts our mental and
psychological health. There is currently some awareness out there, but I think we need to see more campaigning done by
public health groups to really create more education around the mental and physical impacts of too much social media
and too much screen time.

And in terms of organizations, one of my favorites is TheBridge. They send out weekly emails that are so informative and
they also host regular talks with experts on really pressing tech policy issues. They’re such an informative and down-to-
earth organization. I would recommend them if you are just getting started in the space, and even if you are tenured. And
of course, All Tech Is Human! I love how they are trying to bring together different facets of society to solve hard
problems, and especially that they are building the responsible tech pipeline.

What do you see as the risk of doing nothing to address the shortcomings of social media?

You cannot manage what you do not measure. We have seen all too well the pitfalls of letting misinformation and
disinformation campaigns spread on social media. If we do not monitor it, we cannot make the connections to its effects.
And if we do not understand the effects, we cannot propose relevant solutions. Doing nothing is not an option.

How does social media look different five years from now?

You know that phrase, "You are not sitting in traffic, you are traffic"? Well, we are not just consuming media, we are
media. Individuals are now micro-influencers with thousands of followers. Social media has gone from the quaint (us
sharing pictures with friends and family members) to users creating personal brands and viable businesses solely via
social media.

Social media will constantly evolve to keep up with society. We have seen a democratization of platforms in the last
decade, with more and more people having the access and visibility to share their truths and experiences. There is no
turning back the clock. Social media will now always serve as witness to the zeitgeist. More and more, though, we will
start to see a symbiotic relationship as social media will continue to shape how we live our lives. The first time I
understood the power and reach of social media was in 2008, when a student was jailed in Egypt and tweeted that he had
been arrested, prompting efforts to get him released. It was at that moment I knew social media would be a true force. As
a Californian, at the first signs of a shake, I immediately go to Twitter to validate whether or not what I experienced was
an earthquake. Behaviorally, we are wired now to consider social media in our daily lives.

Because of this, I do see a trend towards more privacy, security and trust-building by companies. Platforms understand
that users are becoming more educated about privacy matters and will choose platforms that can protect their privacy.
This is especially true as we move to more video interfaces.

We will also continue to see a wider array of voices, as more and more individuals gain access and create individual
platforms.


Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

The beauty of the tech industry is that it actually needs people from every single demographic, every cultural and
linguistic background, every nationality, every religion, and every age and gender. This is because global access has
increased dramatically, and as we close the digital divide, every single member of humanity will need to be represented.

And the case for diversity is also a business case. Of course, we want companies to hire a wide range of individuals from diverse
backgrounds because it is the right thing to do, but there is a solid business case to be made for representation amongst
the ranks. We have seen the impacts that exclusion has had on product innovation and marketing. You cannot innovate
for and market for a group whose DNA you do not understand. Thus, we need global representation in the local meeting
rooms.

The other great thing about tech is that because it engages so many different parts of our brains, it needs thinkers from
all academic backgrounds.

I am a lawyer by training with a social sciences background, but I think my analytical skills, my national security
experience, and my lived experience of being bicultural, multilingual, and having lived and travelled so much globally
give me a unique perspective. I am also naturally curious and empathetic, so I think about how others might experience
technology as well. So although I do not have a "traditional” tech background, I am a value-add for teams when it comes
to product design and product inclusion, user experience and proactively identifying trust and safety issues.

But how does someone who has never worked in tech even know that their skills would be valuable? I do think companies
need to think more broadly about how to attract people from varying backgrounds, including those that are
neurodiverse. Tech problems do not require tech solutions. They require a variety of humans who use these technologies
to develop those solutions.

What makes you optimistic that we, as a society, will be able to improve social media?

Humans always fight for the betterment of society, and social media is no exception. There are so many brilliant people
working on these challenges, so I have faith that we will figure out how to make social media less addictive, how to
engage with it meaningfully rather than letting it consume our lives, how to balance free expression and ideas against
hateful ideologies, and how to use it to enrich our lives so that we can learn from each other and keep in touch via social
media.

Connect with Pinal Shah @womanoffuego

"I do think companies need to think more broadly about how to


attract people from varying backgrounds, including those that are
neurodiverse. Tech problems do not require tech solutions. They
require a variety of humans who use these technologies to develop
those solutions."

-Pinal Shah, Behavioral Engineer at Robinhood



LEARNING FROM THE COMMUNITY

Sydney Weigert
Policy Administration Manager, Business and Legal Affairs, SoundCloud

Tell us about your role:

As a Policy Administration Manager on the Business and Legal Affairs (BALA) team at SoundCloud, I work with teams across the organization to shape policies that ensure the company’s vision is reflected therein. A large part of my role is working in conjunction with our legal team to oversee SoundCloud’s external and internal facing policies surrounding content moderation and compliance. I also advise on policy in relation to platform integrity, managing the development work. Outside of policy development, I manage all inbound legal requests involving data distribution. I also support company training on best practices around topics such as data protection and privacy matters.

Tell us about your career path and how it led you to your work’s focus:

I am still early in my career, but I began a variety of internships in the legal field at quite a young age. I believe this helped to kickstart my career by giving me the insight and work ethic that brought me to where I am today. I spent a few years exploring the world of criminal law at the Philadelphia District Attorney’s Office in high school and early college, then moved in-house to the legal team of a utility company in the city of Philadelphia, then switched back again to criminal law. After receiving my B.A., I moved to Berlin to work on the Trust and Safety team at SoundCloud. I love the world of Trust and Safety; it is exciting and extremely important. The work I did on the team definitely gave me the skills needed for my current position. After a couple of years I decided I wanted to take the next step and have more influence over policymaking, especially given the current climate. I felt it is where my experience best suits me and where I could have the most impact… So here we are!

How do we ensure safety, privacy and freedom of expression all at the same time?

I do think there is something to be said about human moderation teams “holding down the fort” behind the large curtain that is algorithmic solutions and detection, policy makers and executives. I am personally of the opinion that, to ensure this harmony, we are never going to be able to move into a fully automated space, and that the value of a set of human eyes and context-dependent moderation will remain very important.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

What I have learned through my work in responsible tech is that it is a group effort. We need to work together, communicate and
hear each other’s voices, struggles and accomplishments. Each area mentioned above is able to bring different insight
and influence to the table, and each complements the others. Governmental oversight works best when platforms,
specifically smaller platforms, are heard. I would therefore hesitate in concluding that one area needs the most
improvement and instead place emphasis on teamwork and alliance.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

In my opinion, projects and initiatives such as Tech Against Terrorism and the Aqaba Process hosted by Jordan do a
fantastic job toward improving social media. We’ve had executives from SoundCloud attend the Aqaba Process, which
has helped foster dialogue around terrorism and extremism and led to fruitful cooperation with companies such as
ActiveFence. These initiatives and organizations are transparent and honest – and therefore impactful. They are there
for guidance, not for gain, and they simply want to help make the internet a better place. Having help, resources and
feedback focused in an area or areas where you may not have detailed knowledge is also extremely useful in educating
your team on making informed and responsible decisions, especially for smaller platforms.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

Another million dollar question! Platforms will most likely always have to police, in some sense. That is to say, we all must
follow the law and be guided by those requirements. At the same time, laws differ by country and region, so that is tough
to navigate since one-size-fits-all is probably not realistic. Beyond that, platforms should do their best to be clear about
what is acceptable for their platform and be thoughtful and consistent in enforcing their policies. I do not believe this is
something governments can solve alone. However, I do believe they can have a positive impact.

"What I have learned through my work in responsible tech is a


group effort. We need to work together, communicate and hear
each other’s voices, struggles and accomplishments."

-Sydney Weigert, Policy Administration Manager, Business


and Legal Affairs, SoundCloud



LEARNING FROM THE COMMUNITY

Nicole Chi
Co-Founder, Mobius Project

Tell us about your role:

I co-founded the Mobius Project alongside Avi Zajac and Ji Su Yoo, which aims to help product teams be more proactive about addressing platform abuse. We built PlatformAbuse.org, a knowledge source of technological harms and mitigations to guide safer product development. Last summer, we participated in Mozilla's Fix-The-Internet MVP Lab, and we are now working on a framework for abusability testing to help folks systematically approach and mitigate problems of platform abuse.

Tell us about your career path and how it led you to your work’s focus:

I have gotten a chance to work in so many different areas of what broadly can be defined as technology that serves the public good – nonprofit digital capacity building, civic tech, tech policy, ML ethics and platform abuse. Through this work, I have come to realize that what excites me the most about technology is its potential to help humans collaborate at scale and broaden their empathy for people beyond their limited friend circles. However, in all these areas, I have seen how technology today disproportionately harms vulnerable communities instead of making empathy easier. This led me to my current work at the Mobius Project, where I am so excited to be working with amazing collaborators to seriously think about (and then create!) what tools and resources we need to build a thriving community of practice around mitigating platform abuse.

In your opinion, what are the biggest issues facing social media?

When we talk to people working in social media companies about platform abuse, the question always comes back to the bottom line: how to make fixing harassment and other harms more compelling by aligning it with profit incentives. Limiting our solutions to things that will align with private companies’ profit incentives massively cripples our imagination of what social media can be. We need investments in social media that think ambitiously about what we as a society want from it, rather than what it can do for private companies.

Relatedly, the current revenue model for the Internet is not creating the Internet we want, but rather an Internet that sells our data and thrives on outrage. We need more alternatives to this economic model, and we need to build consumer power to demand it.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you say their work is helping?

Most recently, I attended a great conference by Civic Signals (New_ Public) on how to build better digital public spaces. I think their framing of social media as public parks is very powerful,
and we need more spaces that encourage interdisciplinary work rather than creating more rooms full of technologists
trying to solve social problems.

In general, we need more solutions for social media that are built by and for the very people that it has harmed. Block
Party, by Tracy Chou, is a great example; it is an app created to make harassment on social media easier to mitigate. It
solves a real problem because it was built out of her experiences being harassed as a woman of color. I also love projects
like Archive of Our Own (AO3) that give us great case studies for what platforms can look like when they are designed
and developed entirely by the people they serve. AO3 is a fan-built social network with a women-led development team,
built with feminist HCI values in mind.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

I would love to see more people who do not have traditional tech backgrounds, especially people who have personally
experienced harms on social media or have experience navigating complex social issues like conflict resolution or
restorative justice. I know folks who have backgrounds in human rights, policy and community building who are
interested in improving social media, but there are not a lot of avenues for them to do so.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

Zebras Unite, a cooperatively owned movement to build businesses that are better for the world, quotes Rebecca Solnit
in discussing how the “tyranny of the quantifiable” affects venture financing. Measuring impact is important, but I think a
similar bias exists in social media products. What is measurable, clicks and likes, takes precedence over value added to
consumers that is harder to quantify, such as the deepening of relationships or broadening of worldviews. This is reflected
in product metrics and also in algorithms that shape the social media experience. I think giving consumers more power
over their experience is one way we can improve it – platforms are already doing this today by allowing people to
organize their timelines by top or most recent content, but I would love to see more of that. We also need more research
on the effects of algorithms on human behavior and society, both from companies and independent efforts. J. Nathan
Matias has a great article on this entitled “The Obligation to Experiment.”
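
As a toy illustration of that point – hypothetical data, not any platform's code – consider how differently the same feed reads when ordered by the easily measured signal versus simple recency, the control some platforms now hand back to users:

# The same posts, ordered two ways. All data and field names are made up.
posts = [
    {"text": "thoughtful essay from a friend", "likes": 12,   "minutes_ago": 30},
    {"text": "outrage bait",                   "likes": 4800, "minutes_ago": 300},
    {"text": "family update",                  "likes": 3,    "minutes_ago": 5},
]

# "Top" ordering optimizes what is measurable (clicks, likes) ...
by_engagement = sorted(posts, key=lambda p: p["likes"], reverse=True)

# ... while "most recent" gives people direct control over their timeline.
by_recency = sorted(posts, key=lambda p: p["minutes_ago"])

print([p["text"] for p in by_engagement])  # outrage bait ranks first
print([p["text"] for p in by_recency])     # the newest post ranks first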

Connect with Nicole Chi @nchisays

"I would love to see more people who do not have traditional tech
backgrounds, especially people who have personally experienced harms
on social media or have experience navigating complex social issues
like conflict resolution or restorative justice."

-Nicole Chi, Co-Founder, Mobius Project



LEARNING FROM THE COMMUNITY

Azza El Masri
NAWA Program Associate at Meedan

Tell us about your role:

At Meedan, I manage and coordinate projects in the North Africa and Western Asia region (NAWA) focused on empowering local independent media and human rights organizations and training journalism students in Egypt, Lebanon, Syria, Yemen and Palestine on open-source investigations and fact-checking through our open-source annotation and verification software Check. Separately, I research content moderation practices in the region, focusing on platforms' enforcement of their policies related to terrorist and violent extremist content.

Tell us about your career path, and how it led you to your work’s focus:

My work sits at the intersection of journalism, media research and tech, and my current position encapsulates my diversified career path. I began my career as both a journalist and researcher examining ISIS's online coordinated campaigns, and worked on media literacy projects and curricula in the NAWA region before receiving a Fulbright scholarship to study the impact of misinformation on sectarian prejudice. I worked in the digital rights space as a campaigner, covering issues related to free speech, misinformation and privacy in the region, which then allowed me to dig deeper into content moderation and platform accountability.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I have advocated for decentralized content moderation practices that prioritize context, which can only happen if invested civil society organizations are invited to "co-design" these policies.

How do we ensure safety, privacy and freedom of expression all at the same time?

We do that by enriching, supporting, empowering and advocating for open-source projects and software.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you say their work is helping?

The people behind the Internet Freedom Festival and CommUNITY have been doing a fantastic job in engaging a global community of digital rights activists, researchers and technologists about the increasingly complex challenges to digital rights and online freedom of expression.

What do you see as the risk of doing nothing to address the shortcomings of social media?

Increased political polarization, offline violence, (self-)censorship and discriminatory laws, to enumerate a few.

Part of our mission at All Tech Is Human is to diversify the people working in the
tech industry. In your opinion, what academic or experience backgrounds should be more involved in improving social
media?

Just as platforms should take the steps to decentralize content moderation practices, conversations about bettering or
improving social media should be global, multidisciplinary and equitable. Civil society groups, human rights defenders,
technologists and independent journalists in countries such as Nigeria, Myanmar, India, Lebanon, Palestine and others
have a lot to bring to this conversation – they just need to be allowed the space to do so.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

We can (hopefully) solve this conundrum by implementing context-driven content moderation policies.

Connect with Azza El Masri @aemasri

"I have advocated for decentralized content moderation practices that


prioritize context, which can only happen if invested civil society
organizations are invited to "co-design" these policies."

-Azza El Masri, NAWA Program Associate at Meedan



LEARNING FROM THE COMMUNITY

Michelle Cortese
Design Lead Manager, Facebook Reality Labs

Tell us about your role:

At Facebook Reality Labs, I work within the Oculus product group, leading UI design for our latest social VR experience, Facebook Horizon. Outside of Facebook, I independently explore XR design ethics and teach creative technology at New York University and Queens College, City University of New York. Integral to all my work is critical investigation into the transmutation of human expression across new technologies and formats, and the application of real-world anthropological frameworks to the design of future technologies. Much of my recent independent research investigates how designers can use the ideology of body sovereignty and consent to build safer VR spaces and systems.

Tell us about your career path and how it led you to your work’s focus:

When I chose to study graphic design in the early 2000s, I assumed I would be typesetting books or art directing magazine layouts, but I found the most compelling design opportunities to be in the then-burgeoning digital space. I focused my research on accessibility and assistive technology and, while my career hurled me toward art direction in the fashion advertising industry, I rediscovered my passion for using technology to enrich lives when virtual reality began making its way into advertising. After earning a master's degree in creative technology and executing a ton of VR work in the experiential advertising industry, I moved in-house at Facebook with the intention to make the future of communication technology a better place. Here, I have had the opportunity to set inclusive, empowering and safety-driven design paradigms at the outset of early social VR networks, such as Horizon.

How do we ensure safety, privacy and freedom of expression all at the same time?

These are exactly the sorts of questions I like to turn to existing – often anthropological – frameworks for. Safety, privacy and freedom of expression are states of being that should be considered basic human rights, in or outside of digital space. Most of the problems we face in the virtual world (whether it is embodied VR or an image-sharing social platform) have been uncovered, researched and often addressed in the real world.

For basic needs such as these, we can consider their importance and implementation through the lens of Hancock's Hedonomic Pyramid: a theoretical framework that stands as an ethos for Hedonomics, which is the branch of science devoted to the promotion of pleasurable human-technology interaction. The pyramid's five layers begin with a foundation of Safety and stack Functionality, Usability, Pleasure and Individuation on top. This gives us a ranking order for the inclusion of safety features (foundational), privacy features (functional) and expressive
features (pleasurable and personal). From here, we inject these principles, at these respective levels, into the fabric of an
application. The short answer: We must embed safety, first and foremost, into the core architecture of anything we build.
That foundation sets a tone to support privacy, and a palpable layer of privacy allows for freedom of expression.

How does social media look different five years from now?

In the coming years, I anticipate we will see growth in what are currently considered unconventional social networks. I
am particularly interested in the rapid growth of videogame social networks. Fortnite and Animal Crossing are platforms
that, while centered around a game, are functionally social networks. As of May 2020, Fortnite reported having 350
million players; Animal Crossing: New Horizons reportedly sold 31 million copies; and VRChat claims tens of thousands
of concurrent users. These are not trivial numbers. And they're a sign of a new generation of social networking: one that
is immersive, creative and not tied to real identity.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Everyone. This might feel like a cheap answer, but I stand by it. I want to live in a world where everyone has the tech
literacy to understand, criticize and demand more from their devices and software services. I want all elected officials to
understand the complexity of the digital social landscape – and enact policies to protect individuals. I want everyone
involved, because everyone is affected.

What makes you optimistic that we, as a society, will be able to improve social media?

I am actually incredibly optimistic that we, as a society, will materially improve social media in the coming years.
Every day at FRL, I get to see the potential future architecture of communication technology come together. Implicit in
that work is an understanding that communication devices and services are an invaluable part of our world. When
designing a new experience, we acknowledge and investigate the impact it may have on the lives of the people who use it,
before they ever use it. This culture of responsible product development is baking itself into the ethos of product design
on an industry level – and that's my source of hope.

Connect with Michelle Cortese @ellecortese

"When designing a new experience, we acknowledge and investigate


the impact it may have on the lives of the people who use it, before
they ever use it. This culture of responsible product development is
baking itself into the ethos of product design on an industry level –
and that's my source of hope."

-Michelle Cortese, Design Lead Manager, Facebook Reality


Labs



LEARNING FROM THE COMMUNITY

Stephanie Humphrey
Technology Contributor, Author, Speaker

Tell us about your role:

I am currently a technology contributor to ABC News, and I speak about good digital citizenship through my seminar, Til Death Do You Tweet.

Tell us about your career path and how it led you to your work's focus:

I am a former engineer who used school career days to help students understand the pitfalls of social media. I expanded the program when I saw how negatively social media could affect young people, and Til Death Do You Tweet was born!

In your opinion, what are the biggest issues facing social media?

In my opinion, the biggest issues facing social media are content moderation and data privacy. Platforms seem to be a free-for-all of abuse and misinformation, while companies only appear to care about how much of our data they can sell.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Unfortunately, I do not know that I have seen any sustainable solutions to improving social media, although I may have missed something. The protections in place in Europe with GDPR are a start, but that policy and others like it still have a ways to go.

How do we ensure safety, privacy and freedom of expression all at the same time?

We could start by having the platforms simply enforce the policies they already have in an equitable manner. That would require an even bigger investment in human moderators and AI, but it would be a good start. We also need better legislation around what information companies are allowed to collect and what they can do with that info. The companies themselves could also do better due diligence in identifying revenue streams/models that do not involve advertising to reduce the amount of data they collect.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

Honestly, all of these areas could use a lot of work, but I think we could start with better media integrity and informed governmental oversight.

What people and organizations do you feel are doing a good job toward improving social media?

My company, Til Death Do You Tweet, is doing the work of citizen engagement and literacy on social media platforms. Common Sense Media, Bark Technologies, and CyberWise are also doing great work in this area, while the Electronic Frontier Foundation is leading the charge of online privacy protection.

What do you see as the risk of doing nothing to address the shortcomings of social media?

We've already seen the risk of doing nothing to address the shortcomings of social media. Starting with the 2016
election, through a global pandemic, and all the way up to the insurrection on January 6, 2021, social media has been
grossly manipulated in ways we should have been able to see coming and the results have been devastating to our
country and the world.

How does social media look different five years from now?

I think some users will be open to the idea of paid platforms that provide the type of privacy protection and content
moderation that would make your time on the platform a more valuable experience.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

Algorithms are only as effective as the people who develop them. There is still a lot of work to be done to diversify this field and reduce bias.

What makes you optimistic that we, as a society, will be able to improve social media?

I believe society will be able to eventually improve social media because I believe human ingenuity and innovation have
improved society as a whole more often than not. However, I am not as optimistic about what it will actually take to make
those necessary changes. Every time it feels like we may have hit rock bottom with respect to social media, something
else happens that makes it seem like there is no floor. I worry about what has to happen before we all agree that
something more disruptive has to be done and we take actionable steps to do it.

Connect with Stephanie Humphrey at @techlifesteph

"In my opinion, the biggest issues facing social media are content
moderation and data privacy. Platforms seem to be a free-for-all of
abuse and misinformation, while companies only appear to care about
how much of our data they can sell."

-Stephanie Humphrey, Technology Contributor & Author



LEARNING FROM THE COMMUNITY

Jamie Cohen, PhD
Digital Culture Researcher & Digital Void Host

Tell us about your role:

The 2010s were a period of horizontal tech advancement with extreme changes to our communication through the evolution of visual discourse that emerged from forums to social media to weaponization in physical spaces. I study, educate and explain the digital culture that creates surplus media.

Tell us about your career path and how it led you to your work's focus:

I knew I wanted to produce television since I was in 8th grade. I studied TV and art history in college and, after graduating, worked on a variety of shows as an editor and color corrector. I advanced to production a little over a year later and found myself producing television shows for NJN (PBS). I was hired at mtvU, MTV's college network, to work on a reality show soon after. The experience was intense and life changing. I had a sneaking feeling that the method of reality show production could amplify bad actors using the method of entertaining verisimilitude. I feared reality tv could create a model for modern fascism. I left and went to grad school to study media, culture and art. I started teaching and founded a new media degree at Molloy College. (Over two dozen people walk around with a degree I invented!) My doctorate focused on visual culture, memes, and digital literacies, and I authored a book titled Producing New and Digital Media: Your Guide to Savvy Use of the Web. Now I co-produce/co-host the Digital Void Salon and Podcast Series, a show that bridges the gap between digital cultures and common understanding.

In your opinion, what are the biggest issues facing social media?

Social media's biggest issue is its profit model. Let's focus on just the past few years. When Trump made it to the White House, it was clear that a singular character would dictate the engagement factors for the users. Groups on Facebook became the locus of evolution, Twitter aimed to amplify the president's speech into a feedback loop running from Twitter to mainstream media back to Twitter, and YouTube created a funnel system that relied on influencers to engage with viewers and customize their directive flow. This backfired and created rabbit holes that algorithms helped operate.

The idea of the open web was never to be a vast infrastructure that simply hosted four websites as the main traffic, followed by static web pages that sort of just occupied server space. It was supposed to be dynamic and social. But social media became a platform, and an unregulated one at that. I am not arguing for government oversight, but I would easily argue for moderation and ethics boards – and for advisory and trust and safety to be at the forefront. Social media is our landing zone on the internet. Its priority shouldn't be profit, but connectivity and welcoming.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am really excited about the possibility that Trust and Safety and advisory boards have become much more prominent in
the last few years. I think that bringing the discussion of platform moderation and safety into common talk of social
media is a really good sign.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

To be honest, I think the role of media to educate the general public needs a lot of work. Here's why. First, media in general is, well, general: it is made for the widest audience possible. Second, media is designed to be consumed, and consumed strategically – as in advertising time, network clocks, scheduling, booking. In this way, media uses reductionism and talking points to deliver nuanced concepts to a public. This requires either the host to be well versed in nearly everything or a guest to be media-trained to distill depth into shorter segments.

The additional issue is the in-built bias – not left or right, but rather the assumption that the audience isn't that smart (I
know this for a fact). So the reductionism employed by media to educate the public is simplified to a nearly unusable
system so that people continue to watch but do not feel the need to activate or participate – just consume. What we need is on-air talent and guests who have internet literacies, meme literacies or internet fluency. The audience actually wants this, but the corporate media model does not believe it to be true.

What do you see as the risk of doing nothing to address the shortcomings of social media?

If we do not address the shortcomings of social media, the systems will become recursive. They will inevitably create
their own ecosystems (which, to an extent they have) and eventually their own universes, cordoned off from the other
social media. We need to address the profit model first and reestablish the user's rights, but then of course create a
functioning way forward for social media that straddles the efforts of communication, connection and speech with
common sense approaches that prevent feedback loops, rabbit holes and dangerous activity.

If you want to go a step further, if we do not address the shortcomings, the market will possibly create a new model, a
more fractured web with distinct media flows that do not enable any cross-flow or connective discourse. The editorial
web, one with unique media flows, will literally result in competing realities.

How does social media look different five years from now?

I would have said something completely different prior to the insurrection. Now, I believe that the conversation has
moved out of the esoteric circles and into public conversation. Having the conversation about social media helps make
using social media more intentional. I think if many of the concerns are addressed, we'll see a more useful approach to
social media. What I actually foresee are new platforms that are not immediately crushed or purchased by the MAAAF
giants and we can use multiple platforms together. Sort of like how Zuckerberg wanted to merge the messaging on the
Facebook products, but without the possibility of complicity in more war crimes.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

This question will be biased, as I come from a technical background and was further educated in the Humanities and Cultural Studies. I think, first, diversity is the top priority. Founders must prioritize bringing BIPOC, people with disabilities and immigrants into their organizations and tech. Second, I think some sort of humanities and critical-thinking background needs to be on the resume
of those involved. Engineers are extremely important, and I think there are engineers who also have knowledge, interest
or experience in critical courses from English to media studies to history to art, etc.

Connect with Jamie Cohen @NewandDigital



LEARNING FROM THE COMMUNITY

Annie Brown
Founder of Lips

Tell us about your role:

I am the Founder of Lips. Lips is a novel, alternative social media platform built by and for women, non-binary folks and the LGBTQIA+ community to safely express themselves and sell their work without biased censorship, harassment or plagiarism.

Tell us about your career path and how it led you to your work's focus:

I have been working to create spaces of free expression for womxn for over 10 years. The concept of Lips originated as a project for my Introduction to Women's Studies course at William and Mary University. I realized that there were no spaces on campus where women could express themselves, and especially their sexuality, safely, openly and honestly. Lips asked women to mail in – or anonymously drop into a P.O. box – stories, poetry and artwork expressing their sexuality for the publication. Quickly, the idea became a hit on campus and Lips Zine grew to five other local campuses as well. The existence of Lips Zine empowered those who contributed and even those who didn't. It created and claimed a space where female* voices were allowed, acknowledged and valued. After graduating, I worked for gender equality organizations such as Grameen Bank, Humsafar Trust and Planned Parenthood, and participated in Y Combinator as the Communications Director for SafetyWing (W18).

I soon began speaking and writing on the topics of AI, Blockchain and Diversity in Tech. Today, I work as the founder of Lips and am a guest lecturer at University of California, San Diego's Rady School of Management and a contributor for Forbes. Since Lips' inception, I have continued growing the community, combining my expertise of innovative technologies with a passion for creating space for open and honest expression.

In your opinion, what are the biggest issues facing social media?

While social media faces many issues today, our team at Lips primarily focuses on addressing discriminatory censorship, harassment and plagiarism enabled by mainstream social media platforms, disproportionately affecting marginalized communities – particularly women, non-binary folks, Black, Indigenous & People of Color and the LGBTQIA+ community.

These issues cause significant harm to mental and physical health and reproduce existing inequities rather than correcting and solving them. For example, it is common knowledge that many women* deal with self-image issues and that social media has chiefly made it worse. However, a large portion of body-positive content on social media is [censored].

Also, hate groups and trolls have unfortunately become inescapable on social media — trans people being one of
the most vulnerable populations to their abuse — and sadly, most platforms have done little to control or prevent
harmful antics. Their features often reinforce the behavior by removing the creator’s account when reported without
detecting that the actions are motivated purely by hate.

Finally, these issues perpetuate economic inequality. As just one example, female wellness brands are barred from selling and advertising, and sexuality educators/coaches are shadow-banned for "inappropriate content." Without better options, these entrepreneurs will continue to be burdened with additional labor that is both wasted time and dollars lost; hundreds of business owners and creators have vented to us about the hours they've spent personally contacting Facebook and Instagram reps about unfair rejections and deleted content.

When creators are faced with violence, shame and censorship by digital platforms, they are not able to reach their fullest
potential as artists, as entrepreneurs and as humans, and we want to change that.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

What does safe space online for women* & the LGBTQIA+ community actually look like? Well, it takes bringing these
marginalized voices in and centering them in the design process, which is a practice known as Design Justice led by
Professor Sasha Costanza-Chock. These communities are well equipped with the knowledge and tools necessary for their
flourishing, but representatives are not often brought to the decision-making table. As members of the communities
ourselves, using research from user testing and co-design workshops, all of our decisions – from features to the language
of our “Community Guidelines” to how our community is built, maintained and moderated – are made with this
awareness. We host workshops for creators, brands and LGBTQIA+ youth to ensure that the communities for whom we
are building the app are centered, have been involved and will remain involved in the app’s design process at every stage.

We aim to give brands, artists, influencers a place to share content that educates and empowers marginalized
communities. Our “Community Guidelines” were collectively written and are rooted in freedom and fairness for all
groups. We do not tolerate hate speech, harassment, abuse or discrimination of any kind. Lips operates on the philosophy
sometimes referred to as the paradox of tolerance: “If a society is tolerant without limit, its ability to be tolerant is
eventually seized or destroyed by the intolerant.” [Karl Popper, the Paradox of Tolerance.] Hate speech is not free speech,
as it forces others into silence in order to survive.

We keep our people safe through a vetting process and only allow those who express understanding and shared
appreciation for our values to contribute their work to the community. Anyone can browse Lips, but only approved
members can post. Our patent-pending “inclusive” AI moderation system and blockchain technology are what will enable
us to maintain the norms and values of the community.
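As a rough illustration of that "anyone can browse, only approved members can post" gate (a sketch under our own assumptions – Lips' actual patent-pending moderation system is not public, and all names here are invented):

```python
# Hypothetical sketch of a browse-versus-post permission gate, as
# described above. Not Lips' actual implementation.

from dataclasses import dataclass

@dataclass
class Member:
    handle: str
    vetted: bool = False  # set True after the values-alignment review

def can_browse(_viewer) -> bool:
    # Browsing is open to everyone, including logged-out visitors.
    return True

def can_post(member: Member) -> bool:
    # Posting requires passing the community vetting process.
    return member.vetted

visitor = Member("guest")
artist = Member("artist01", vetted=True)
assert can_browse(visitor) and not can_post(visitor)
assert can_post(artist)
```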

As we grow, we will continue to engage the community in design decisions along the way. Many other platforms deal with
issues of abuse by making it possible to turn features such as messaging, commenting and tagging on and off. Adopting
solely an approach like this is basically a form of virtual victim-blaming. Creators have the choice of continuing to receive
hate or disabling messaging, which usually comes at the expense of their businesses. Lips is much more proactive about
preventing this type of behavior in the first place.

Connect with Annie Brown @andreafrancesb



LEARNING FROM THE COMMUNITY

Tiffany Xingyu Wang
GM & Co-founder of Oasis Consortium; Chief Strategy Officer of Spectrum Labs

Tell us about your role:

I am Chief Strategy Officer at Spectrum Labs, a contextual AI platform set to build a safer internet. At Spectrum, we help companies manage their online communities, where toxic behaviors often run rampant. Recently I co-founded Oasis Consortium to establish governance around brand digital safety issues. If we do not start a movement of forging trust and safety for the digital spaces where we live, work and play, the next generations may inherit a digital world polluted by predators, hate speech and mistrust. I am also a venture partner at Tribe, a Singapore government-backed blockchain accelerator, leading investor relations in the United States.

Tell us about your career path and how it led you to your work's focus:

Years on the road have given me an appreciation that humanity is common and cuts across diverse cultures. A graduate of the Wharton School of the University of Pennsylvania, Sciences Po in Paris and Fudan University, I have built extensive experience in leading cross-border investments and operating global alliances. I was an award-winning asset investor focused on emerging markets, including sub-Saharan Africa, APAC and Europe. From an asset investor to a tech enterprise operator today, I focus on driving operational excellence and scalable growth. My time at Salesforce, now Spectrum, and co-founding Oasis Consortium has given me the know-how of building ethical brands and platforms.

As new technologies are reshaping our practices and challenging norms, I stay abreast of the latest developments, specifically in AI, Blockchain and Quantum, through investments and advisory roles. I speak on global stages in the areas of AI for good and investing in emerging markets.

In your opinion, what are the biggest issues facing social media?

Lack of attention and investment in safety by design, privacy by design and DEI [diversity, equity & inclusion] by design features led to the unfortunate events of today. Social media channels have become dangerous back alleys of conspiracy theorists, empowered bullies, scam artists and worse.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

It is obvious Contextual AI is the way to go. But why is it not a prevailing solution yet? The answer I hear is that it is hard and costly. The more ambiguity and the more corner cases there are, the more data you need to correctly navigate those complexities. Like skiing: if it were a straight shot down the hill, you'd only need one gate (keyword match); if the path is twisting and curvy, you need more gates to direct the skier down the hill.

Gates are the data we need here (by vertical, platform, use case), and the path toward the destination – your
community’s guidelines – is the AI model, Contextual AI. The problem at hand, Trust & Safety of the internet, is hard but
exciting.
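To put the skiing analogy in code, here is a toy sketch (ours, not Spectrum Labs'; the word lists and rules are invented stand-ins for signals a real contextual AI model would learn from data) contrasting a single keyword "gate" with a check that adds contextual "gates":

```python
# Toy illustration of the "gates" analogy: a keyword match is one gate;
# a contextual approach adds more gates (signals) so ambiguous phrases
# are judged by their surroundings. Hand-written rules for clarity only.

TOXIC_TERMS = {"kill"}

def keyword_gate(message: str) -> bool:
    # One gate: flag any message containing a listed term.
    return bool(TOXIC_TERMS & set(message.lower().split()))

def contextual_gates(message: str) -> bool:
    # More gates: the same term is tolerated in a gaming context
    # and flagged when it is aimed at a person.
    words = set(message.lower().split())
    if not TOXIC_TERMS & words:
        return False
    gaming_context = {"boss", "raid", "respawn", "level"} & words
    targeted = {"you", "yourself"} & words
    return bool(targeted) or not gaming_context

print(keyword_gate("let's kill the raid boss"))      # True  (false positive)
print(contextual_gates("let's kill the raid boss"))  # False (context-aware)
print(contextual_gates("i will kill you"))           # True
```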

How do we ensure safety, privacy and freedom of expression all at the same time?

Speed to trust will make the next generation of decacorns. Acknowledging this and acting now is the first order of all
things. Then it comes down to the three "first principles" for ethics by design: safety by design, privacy by design and DEI
by design. If we can bake the principles in the policy, product and platform design phase through ideas, people and
technologies, we will stand a chance to build a better internet. One where we can ensure safety, privacy and freedom of
diversity at the same time.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

It is a shared responsibility across regulators, platforms, media and users. The fundamental challenge underlying today's chaos is that corporations are assuming a government-like role to enact, promote and enforce community policies. This is an extremely hard and costly task for platforms. Acknowledging this, here are a few recommendations to increase ethical transparency:

1. Collaborations: Technologists share the responsibility to educate and collaborate with policymakers. This will enhance
regulators' understanding of the issues du jour and drive effective policies.

2. Proactive approach: A platform needs to do more than just rely on users to report guideline infractions. This can take
shape in many ways (ideally using contextual AI) to proactively detect behaviors across all content.

3. Established process and consistent enforcement: A platform needs to know how its detection and reporting capabilities are performing against what's happening on the platform. There are going to be some cases that can be automated away. Other cases are going to always require human moderation. And some of those are going to be unclear gray areas that will be tough to deliberate on. It is important to have an internal process for how to handle these.
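One way to picture such an internal process (a hedged sketch of the general pattern only, not any particular platform's pipeline; the thresholds are invented) is confidence-based triage that automates the clear cases and queues the gray areas for human moderators:

```python
# Hypothetical triage sketch: route content by model confidence.
# Clear cases are automated away; gray areas go to human review.

AUTO_REMOVE = 0.95  # invented thresholds -- tuned per policy and vertical
AUTO_ALLOW = 0.05

def triage(violation_score: float) -> str:
    if violation_score >= AUTO_REMOVE:
        return "remove"        # automated action, logged for appeal
    if violation_score <= AUTO_ALLOW:
        return "allow"         # automated action
    return "human_review"      # the unclear gray area

for score in (0.99, 0.02, 0.60):
    print(f"{score:.2f} -> {triage(score)}")
```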

By improving policy transparency and content moderation on platforms, we can as an industry stay ahead of the curve of
government regulations.

Connect with Tiffany Xingyu Wang at @tiffanyxingyuw

"If we can bake the principles in the policy, product and platform
design phase through ideas, people and technologies, we will stand a
chance to build a better internet. One where we can ensure safety,
privacy and freedom of diversity at the same time."

-Tiffany Xingyu Wang, GM & Co-founder of Oasis Consortium



LEARNING FROM THE COMMUNITY

Suw Charman-Anderson
Founder & CEO, FindingAda.com

Tell us about your role:

For the past 4+ years, I have served […] achieving our goals. We focus on […]

I am the founder of FindingAda.com, a business that supports women in science, technology, engineering and maths (STEM) through mentoring, events, conferences and content. Our three main projects are Ada Lovelace Day, an international celebration of the achievements of women in STEM which aims to inspire girls and women to study and work in STEM; the Finding Ada Network, a mentoring and knowledge sharing network for women and gender equality advocates in STEM; and the Finding Ada Conference, an international conference for women, D&I and HR professionals, and educators.

Tell us about your career path, and how it led you to your work's focus:

I started in science, moved into web design in the late '90s, then became a social technology consultant in 2004. I was one of the UK's social media pioneers, working with household brands and writing for a variety of technology news outlets. I co-founded the Open Rights Group in 2005, campaigning for digital rights in the UK alongside my consulting work, and then founded Ada Lovelace Day in 2009 to campaign for equality for women in STEM. In 2014, I moved to the USA and began working on Ada Lovelace Day full time.

In your opinion, what are the biggest issues facing social media?

The unwillingness of social media companies, particularly the giants like Twitter and Facebook, to fully grapple with the abuse and harassment of women on their platforms is one of the largest problems social media faces. It is not just an issue because of the terrible impact such abuse has on its targets, but because widespread harassment leads to a culture of self-censorship and self-anonymisation by women, even if they are not directly attacked, as they seek to avoid becoming a target.

This has a profound and worrying chilling effect on women's participation in online public spaces. It is literally impossible to measure the damage this does, as we are not able to identify or measure the ideas that are not had, the connections that are not made, the businesses that are not founded, the creativity that isn't expressed, because women fear the consequences of speaking up and conversing in public. The impact is felt not just by women themselves, who must constantly be aware of their internet surroundings and constantly question whether, if they speak up, they are about to be the recipient of abuse, but also by society, which is deprived of women's cultural and political contributions, and by the economy, which is deprived of women's business and financial contributions. Women's activities are curtailed and their professional and personal growth stunted because they cannot take part in the public commons on an equal footing to men.

How do we ensure safety, privacy and freedom of expression all at the same time?

The tech industry tends to focus on freedom of expression without considering the responsibilities that come with every
freedom, and it is only by considering our responsibilities that we can fairly balance safety, privacy and freedom of
expression. We have to think about what we owe to each other, what our responsibilities are to one another. Only when
we consider these questions can we begin to formulate appropriate responses to the question of how free our freedoms
can really be, and where they end.

The radical individualism and libertarianism of the American tech industry leads businesses to consider freedom of
expression as an absolute right, rather than considering that it has limits. Where is the boundary between your right to
free speech and my right to freedom from abuse, harassment and violence? Not only do these boundaries need to be
considered and defined, platforms need to decide what to do about transgressors, to apply rules consistently and fairly,
and have an accessible and transparent appeals process. There is no doubt that this is a difficult challenge for companies
with millions or even billions of users that are active across multiple jurisdictions with radically different and sometimes
opposing definitions of acceptable speech, but there is also no doubt that this nettle has not been grasped and that social
media platforms are failing to balance rights and responsibilities.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

There are multiple axes of diversity, and any attempt to diversify the tech industry must address all of these axes. Whilst
I campaign for women's rights, I recognize not just that our work must be intersectional, but that there must also be
broader diversification projects that support communities I cannot reach. Obviously women are badly underrepresented
in tech, and social media companies are no different to any other in the sector. We need more women at all pay scales,
but we also need to see more women from a variety of backgrounds, particularly women of colour, women with
disabilities, and women from working class, immigrant and other under-served communities. Only when we see these
women who bring invaluable perspectives and expertise to the table progressing into senior roles will we start to see
social media companies developing the skill, empathy and insight required to deal with problems such as harassment and
abuse.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?

The biggest problem with algorithms is that they take the biases of a small group of individuals – the developers – and
bake them into the system, whilst also removing choice from the user. We need to recognize that content sorting
algorithms exist for the benefit of the platforms, not the user. The idea that they surface valuable content to the user is
nothing but a myth, given how poorly these algorithms perform. While content sorting algorithms serve the business's
profit motive rather than the user's interests, it will be almost impossible to engender the required change. This isn't just
a matter of user experience, but a problem with respect to issues that have serious repercussions for society, such as far-
right radicalisation or the suppression of speech from marginalised groups. It seems difficult to see how platforms will
solve this problem when it is not in their interest to do so, and I fear that regulatory intervention is needed.
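To make the point about whose objective the sorting serves, here is a toy sketch (ours, with invented fields and scores – no platform publishes its ranker) contrasting an engagement-optimized feed with a user-selectable chronological one:

```python
# Toy sketch of the critique above: the default sort optimizes the
# platform's engagement objective; restoring user choice is as simple
# as exposing an alternative ordering. All values are invented.

from datetime import datetime, timezone

posts = [
    {"text": "outrage bait", "predicted_clicks": 0.9,
     "posted": datetime(2021, 3, 1, tzinfo=timezone.utc)},
    {"text": "friend's quiet update", "predicted_clicks": 0.2,
     "posted": datetime(2021, 3, 2, tzinfo=timezone.utc)},
]

def rank_for_engagement(feed):
    # Platform-serving objective: maximize predicted clicks.
    return sorted(feed, key=lambda p: p["predicted_clicks"], reverse=True)

def rank_chronological(feed):
    # User-controlled alternative: newest first, no inferred interests.
    return sorted(feed, key=lambda p: p["posted"], reverse=True)

def render(feed, user_choice):
    ranker = rank_for_engagement if user_choice == "engagement" else rank_chronological
    return [p["text"] for p in ranker(feed)]

print(render(posts, "engagement"))     # ['outrage bait', "friend's quiet update"]
print(render(posts, "chronological"))  # ["friend's quiet update", 'outrage bait']
```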

Connect with Suw Charman-Anderson at @suw

"The biggest problem with algorithms is that they take the biases
of a small group of individuals – the developers – and bake them
into the system, whilst also removing choice from the user. We
need to recognize that content sorting algorithms exist for the
benefit of the platforms, not the user."

-Suw Charman-Anderson, Founder & CEO, FindingAda.com



LEARNING FROM THE COMMUNITY

Sonja Solomun
Research Director, Centre for Media, Technology and Democracy at McGill University

Tell us about your role:

I am the Research Director at the Centre for Media, Technology and Democracy. We examine how media and emerging technologies shape democracy. We approach the current set of platform harms as symptoms of a more systemic and entrenched structural problem, one that is directly threatening democracies around the world. So in directing the research program, I help identify ways we can conduct and highlight research that focuses on those very structures that generate and maintain racism, extremism and the reproduction of historical and social inequality. We do a lot of work in technology governance, including platform governance and facial recognition governance, as one core solution to this problem (among others).

In your opinion, what are the biggest issues facing social media?

It is clear that the increasingly algorithmic infrastructure of social media reproduces and amplifies racism and misogyny, which is a fundamental problem (and was always a fundamental problem of the internet more broadly). The current challenges we see with social media are a threat to democracies around the world.

There seem to be two ways to answer this question of "what is wrong" more generally. One is that it seems resoundingly clear that the economic models underpinning social media are incompatible with the public interest. I think to address "what is wrong," we also need to look at the norms, the practices and the experience of platforms as something other than a tool. At the end of the day, we experience social media and technology more broadly as many different kinds of things.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

I think there is an urgent need right now for government to address the clear and repeated patterns of harms on social media, especially the ways in which the technological structure and economic models of social media platforms generate and maintain racism and misogyny, and the ways in which the tech infrastructure and its affordances are weaponized to belie the fabric of democracies around the world. So the problem of social media goes beyond just the technology and therefore requires solutions that go beyond just fixing the tech. I also think we need to create more pan-coalitions between groups currently doing critical, reflexive and justice-oriented work in improving and resisting these harms. That means uniting climate justice groups with tech governance groups; it means gig workers, academics, artists – and especially community organizers – coming together to work on these challenges.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

There are so many civil society, academic and community groups and organizations working for fair and accountable
platform governance that are doing such critical work in this area.

I am really excited by the first annual conference of a platform governance research network (Platgov.net). Research on
platform governance often remains fragmented by discipline, methods and regions; or focuses on the most popular
platforms, usually based in the U.S. Too often, work by the most affected communities remains excluded, including so
much critical and justice-oriented work coming out of the Global South. The conference aims to bring those voices
together and to build a new research network that aims to coalesce a global conversation about platform governance,
while highlighting underrepresented groups and disciplines.

Organizations such as IT for Change, CIS-India, Mnemonic, Digital Africa Research Lab, KICTANet and countless other Global South organizations are leading the charge and coming up with innovative ways to govern platforms in more responsible and justice-oriented ways.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

I work in platform governance, where work tends to focus on the dominant U.S. platforms and on the same kinds of
problems that affect mostly North American and European users. That means that critical work on other types of
platforms such as payment platforms, or on problems facing gig workers gets left out of the platform governance
conversations. Research from other disciplines and approaches, like game studies or sex worker rights, which may not be typically associated with platform governance, does not get brought into these conversations. Most importantly, the work
of Global South groups long at the forefront of working for fair and accountable tech governance needs to be brought
into global approaches to the problems we are seeing today (including organizations mentioned above).

More broadly, I think we need to hear more from what different groups using and building social media can teach us
about it, including sex workers (including the work of Zahra Stardust, Gabriella Garcia and Chibundo Egwuatu) and what
other models of relating to technology and ethics of technology look like more broadly (such as Sareeta Amrute [2019], "Of Techno-Ethics and Techno-Affects," Feminist Review 123(1): 56–73, and Jason Edward Lewis, Noelani Arista, Archer Pechawis, and Suzanne Kite [2018], "Making Kin with the Machines," Journal of Design and Science). Bringing those most marginalized by technology into the design process is also key (Sasha Costanza-Chock [2020], Design Justice: Community-Led Practices to Build the Worlds We Need, Cambridge, MA: MIT Press).

Connect with Sonja Solomun @SonjaSolomun

"Research on platform governance often remains fragmented by


discipline, methods and regions; or focuses on the most popular
platforms, usually based in the U.S. Too often, work by the most
affected communities remains excluded, including so much critical and
justice-oriented work coming out of the Global South"

-Sonja Solomun, Research Director, Centre for Media,


Technology and Democracy at McGill University



LEARNING FROM THE COMMUNITY

Anne Collier
Founder of NetFamilyNews.org

Tell us about your role:

Executive director of The Net Safety Collaborative, founder/writer at NetFamilyNews.org, safety adviser to Facebook, Snapchat, Twitter, YouTube and Yubo, consultant to platforms, news businesses, NGOs and investors.

Tell us about your career path and how it led you to your work's focus:

As a journalist in analog media (print, radio, TV) and then as a news publisher's first Web editor, I was fascinated by the story of how the Internet and digital media were changing every aspect of society. I formed a nonprofit (Net Family News, Inc.) in 1999 and started writing about the impact on youth. I soon found myself in the middle of an ongoing (increasingly worldwide) media panic around children's online safety and felt compelled to push out the research that was slowly emerging (starting in 2000), to replace fear with facts wherever possible. With each new development of digital media – the Web, social media, smartphones – new fears arose, so that the evidence, from our children's own experiences to research evidence, continues to encounter more skepticism than receptivity. I find there are two ways to solve this problem. Pushing out the research, which we do via our sites, speaking and social media channels, isn't enough. The other piece is cross-sector conversation and collaboration, such as what All Tech Is Human is about – because working together is even more persuasive than facts, and a siloed, single-sector approach cannot work when problem-solving for a media environment that is, by definition, social.

In your opinion, what are the biggest issues facing social media?

I think we are all seeing an increasingly conflicted marketplace for social media: growing demand for content moderation – which requires centralized or hierarchical control of "speech" – and at the same time growing distrust of the platforms' moderation of speech. The public has this intractable love-hate-fear relationship with social media: the hate part being our reaction to the techlash since the US's 2016 election, the Cambridge Analytica story and a rolling technopanic dating back to the CDA and the birth of Section 230; the love part represented in continued avid use of social media by both users and advertisers, and continuing reinforcement of a business model that now affects societies and challenges governments worldwide; and the fear part being this rolling moral panic since the late '90s, spiking with each new development (Web, social media, smartphones, etc.). We can't afford to perpetuate this love-hate-fear logjam. We're at a tipping point where more and more of us are demanding and participating in thoughtful, cross-discipline, cross-continent conversations about how to make our new media work for us, most especially the most vulnerable media users.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

With the pandemic and related mental health effects, ever greater demand for social justice and growing distrust of Big
Tech, social media and the dominant ad-based business model, the conditions have never been more ripe for
experimenting with business and governance models for a social media environment that serves people better.
Diversification is needed – and we are seeing innovation in both the business and investing communities. I am excited to
see the experiments with platform-based and -enabled cooperatives, or “platform cooperativism" (platform.coop);
private vertical-interest digital communities such as 2Swim.plus; user-governed decentralized autonomous
organizations such as Mastodon and DAOs on Aragon (aragon.org); and blockchain-enabled transparency and
accountability. I am thankful that people are challenging what scholars in Australia have called the “control paradigm,”
whether manifest in digital media or governments.

How do we ensure safety, privacy and freedom of expression all at the same time?

Societies do not have the answer to this question yet. Especially in the US, we have not figured out how to regulate social
media, even as some social media platforms are calling for regulation! We need to figure this out in this country (I think of
ideas being put forth by University of Toronto law Prof. Gillian Hadfield and what Australia’s eSafety Commissioner,
Julie Inman-Grant, and her office are modeling).

Just in the past few months, European regulators hit a logjam in their efforts to protect privacy and safety at the same time, to the point where Microsoft, Google, LinkedIn, Yubo and Roblox felt the need to sign a joint statement saying they would continue to detect, remove and report online child sexual abuse content despite Europe's new law (requiring confidentiality of communications data on devices other than phones). But promising experiments are being proposed
and tried, convening experts in all three and other fields, such as Social Media Councils and TSPA-like professional
associations for workers who deal with all sorts of content. And Facebook spawned, funded and spun off an Oversight
Board as a new, experimental form of regulation. We need to stay tuned.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?

Regulation on this side of the Atlantic is what I see as needing the most improvement – with an eye to innovation in
regulation, for example, law professor Gillian Hadfield's concept of "super-regulation". It is useful to consider all
potential solutions, such as revising Sect. 230 and antitrust action, yes, but not as the only possible solutions. There
needs to be a conscious effort to be less reactive to public fears and less focused on revising old models. And as for old
models, regulation and legislation need to be newly flexible – have built-in expiration dates or mandate being revisited so
as to keep up with changing technology.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you
say their work is helping?

If you mean "improving social media" in the broader sense of improving people's experiences in our new, very social,
media environment, I feel Internet helplines – such as the Europe-wide network of helplines instituted by the European
Commission over a decade ago, New Zealand's Netsafe and Australia's Office of the eSafety Commissioner – are doing a
great deal to improve young people's experiences with social media. More need to be established so that psychosocial
Internet help – independent of the Internet industry but in cooperation with it – could form a network that covers all the
planet's time zones – and counterbalances law enforcement responses to online harm.

Helplines take different forms in different countries – from long-standing services such as Child Focus in Belgium, which
added Internet help to existing child help services, to new ones that support the well-established offline services of the
likes of Canada’s Kids Help Phone – but societies need to be sure that help for children, and other vulnerable groups,
includes expertise in social media and technologies (Internet helplines can also provide psychosocial and Internet
expertise to law enforcement and social services). A way-after-the-fact "appeals court" like the Oversight Board Facebook created is great and serves an important purpose but is also way too late for the harms social media users can
experience. Users experiencing online harm need and deserve realtime – or near-realtime – help independent of
industry, and we have great models for this in a number of countries now, including eSafety in Australia, Netsafe in New
Zealand and Internet helplines in a number of EU countries.

How does social media look different five years from now?

Big Tech will remain, but alongside the giant platforms, there will be many more media options in five years, including
private, vertical-interest, member-governed membership communities both for-profit and non-profit, on the Web, on
the blockchain, etc. We're already seeing examples, from Mastodon to DAOs (decentralized autonomous organizations)
on Aragon.org to platform cooperatives with physical presences. All of them need to be embracing Safety by Design, but
it’s great that they’re all about giving power to their users, involving them in governance and, at least to a degree, giving
users ownership of their own data. This is actually a pretty exciting moment in media history.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?

Psychology, social science, media studies, neuroscience, anthropology, constitutional law, criminology, human rights
(including children’s rights), pedagogy/andragogy, child/adolescent development, pediatrics – to name just a handful. I
would say any field that tracks human development and sociality in the digital age, mirroring the methodology of a multi-
disciplinary report on bullying and cyberbullying by the National Academies in 2016. Just as the name All Tech Is Human
implies, social media is just as much about our humanity as our technology, if not more. Social media is global, too; we
can’t ever think together in terms of only one country or society.

What makes you optimistic that we, as a society, will be able to improve social media?

MLK's “the arc of history...bends toward justice." When bad stuff happens, such as Cambridge Analytica, election
manipulation and disinformation and the massive cyber attack on the US that came to light late last year, people –
advocates, activists, innovators, investors, researchers, students, educators, parents, policymakers and pundits – call for
and effect change. It takes time, but change happens, at least in many societies around the world. It just may not happen
where and as we most want it to right now. An example of meaningful change for a huge sector of humanity – children
and young people, who represent fully one-third of Internet users worldwide – is General Comment 25, bringing their
digital rights to the more than 30-year-old UN Convention on the Rights of the Child. The UN Committee thereof just
announced the General Comment’s adoption earlier this month. The US is the only country on the planet that hasn’t
ratified the UNCRC, unfortunately, so I’m hoping the “arc of history” will include that development and US-based
Internet corporations will honor minors’ digital rights in any case.

Connect with Anne Collier @annecollier

"[T]he conditions have never been more ripe for experimenting with
business and governance models for a social media environment that
serves people better. Diversification is needed – and we are seeing
innovation in both the business and investing communities."

-Anne Collier, Founder of NetFamilyNews.org



LEARNING FROM THE COMMUNITY

Yalda Tehranian-Uhls
Founding Director, Center for Scholars & Storytellers, UCLA

Tell us about your role:

I run the Center for Scholars & Storytellers, based in the psychology department at UCLA, and I teach and do research.

Tell us about your career path, and how it led you to your work's focus:

I was a movie executive and then got a PhD in child development. I am combining my expertise in the entertainment industry and academics to support positive youth development.

In your opinion, what are the biggest issues facing social media?

Commercialization of communication between people, especially young people.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

Media education, digital literacy, etc. – and funding for that from platforms and government.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what academic or experience backgrounds should be more involved in improving social media?

Psychologists, and in particular those who study childhood and adolescence.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play such a large role in an individual's experience and how can we improve this?

They choose content based on your interests; in theory this is good, but in practice we know it creates problems. It is important to be exposed to other points of view. Otherwise, we live solely within our in-groups, in person and through the media we consume. This is an issue for our society, as we cannot build empathy without being exposed to other perspectives.

What makes you optimistic that we, as a society, will be able to improve social media?

Things are improving. I know there are good people in the industry who are motivated.

Connect with Yalda Tehranian-Uhls @drYaldaUhls
LEARNING FROM THE COMMUNITY

Charlotte Willner
Executive Director, Trust & Safety Professional Association

Tell us about your role:

I am the founding executive director at the Trust & Safety Professional Association (TSPA) and the Trust & Safety Foundation Project (TSF). We support the people who work in online trust and safety around the world.

Tell us about your career path, and how it led you to your work's focus:

I started my career in online safety 15 years ago as a part-time message board moderator and have since had the privilege to work with hundreds of people in the trust and safety field at both small and giant platforms. In that time, I have seen and experienced both the rewards and the challenges of frontline T&S work; T&S practitioners pour endless energy, imagination and empathy into helping others, but their work is often invisible to the public and even to their own colleagues. That reality is what motivates our work to support T&S professionals in their own journeys in the field, whether that is specialized training, expert resources, career coaching or just a community of people who can listen and know what it is like.

What do you see as the risk of doing nothing to address the shortcomings of social media?

There is no such thing as "doing nothing" – accepting the status quo and perpetuating the systems that made it is a choice, too. In trust and safety, "doing nothing" typically only means agreeing to accrue technical debt until we can cover the cost of the cycles necessary to fix the problem – a gambit that often works out, but only if the cost does not become insurmountable in the meantime. Social media and communication technology move at such a rapid pace that it is foolish to agree to accrue that debt; if we are not actively working on the problem every day, the cost will become insurmountable.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what academic or experience backgrounds should be more involved in improving social media?

We must work to bring historically marginalized voices into the content moderation ecosystem – as moderators, as rule-writers, as system-designers, as product-builders. Too often, tech and social media manage to hurt communities that are already less structurally powerful – even if that technology was ostensibly made with the needs of the less powerful in mind. If the life experiences of your end users are not core to your development process, you still have work to do.

Connect with Charlotte Willner @helloyouths
LEARNING FROM THE COMMUNITY

Lawrence Ampofo
CEO of Digital Mindfulness

Tell us about your role:

I am the CEO and founder of Digital Mindfulness, a digital ethics and responsible innovation company based in London. I am responsible for product development and research at the company.

Tell us about your career path, and how it led you to your work's focus:

My PhD focused on the emergence and impact of computational propaganda. At the same time, I built a career in data and analytics. After finishing, I developed Digital Mindfulness to help companies and product owners better understand ways to operationalise digital ethics and responsible innovation.

In your opinion, what are the biggest issues facing social media?

The biggest issues facing social media really stem from balancing its ambitions for growth, which are driven both by the market and the user base, with the guardrails being put on social media by governments, quangos and supranational agencies around the world. In addition to this, there is the impetus to grow whilst ensuring the platform meets the needs and goals of the user base.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I am interested in the application of deep ethical and responsible frameworks within the everyday experiences of social media platforms, as the platforms integrate increasing amounts of technological advancements such as AI, ML and more. I am also interested in the emergence of new social platforms and the impact they will have on more established platforms.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

For me, platform innovation in this sphere has the greatest potential for growth and impact on global social media users. From this, all else will follow.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you say their work is helping?

I think there is good work currently being done by some of the Asian social media platforms such as WeChat. In addition, dating apps are doing interesting work in improving the user experience in complex and highly emotive environments.

What do you see as the risk of doing nothing to address the shortcomings of social media?

Widespread social degradation, in the form of groups with nefarious intentions using these platforms for their own ends – whether asymmetric warfare, disinformation or market manipulation. The potential consequences are grave indeed for the world and humanity. Addressing them is strategically important.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what academic or experience backgrounds should be more involved in improving social media?

Psychologists and governance experts should have more influence in product decisions if social media is to thrive.

Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

There will always be a balancing act around policing platforms. It is unclear whether nation states will be able to solve these problems, but perhaps this will be solved by supranational organizations.

Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play such a large role in an individual's experience and how can we improve this?

I see that algorithms can have a strong impact on people's everyday experiences, both online and offline. They can play an influential role in shaping people's beliefs, decisions and life experiences. The solutions to ensuring that algorithms work harmoniously with society are manifold, but the best thing at the outset is to ensure that high-level multi-stakeholder engagement is put in place in companies so that the best possible product is produced.

What makes you optimistic that we, as a society, will be able to improve social media?

I believe that, as the community of people involved in building technologies and discussing the real improvement of social media grows, it will bring social media closer to its stated goal of being a real force for global good.

Connect with Lawrence Ampofo @digital-mindful

"Psychologists and governance experts should have more influence in product decisions if social media is to thrive."

-Lawrence Ampofo, CEO of Digital Mindfulness
LEARNING FROM THE COMMUNITY

Lisa Thee
Data for Good Practice Lead

How do we ensure safety, privacy and freedom of expression all at the same time?

First, anonymity undermines accountability and enables victimization. The trend in social media right now is to go toward full encryption. This seems like a wise course of action on the surface, but in order to hold child predators accountable for their actions, companies must have the ability to determine if illegal content is being traded on their platforms, which violates terms of service (in addition to being a felony).

Secondly, the Right to Safety must be preserved for the protection of people. The tech industry is not the first to stumble when it comes to finding the right balance of regulation from the government, but it is uniquely protected from it, due to Section 230 of the Communications Decency Act (CDA) of 1996. CDA 230 protects third-party platforms from liability for what is posted on their sites. The first modifications to this law began in 2018, when Trump signed the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which removed congressionally granted protections for platforms that “knowingly” participate in the crime of human trafficking on their platforms. The most visible outcome of this legislation was Backpage.com losing the CDA 230 protections that, until 2020, had prevented victims from holding the site accountable for profiting from their abuse.

When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage with platforms. In your opinion, what area do you think needs the most improvement?

The internet is over 20 years old and we are at a tipping point – hoping the tech industry can regulate itself is no longer serving society. With new competitors coming into the market every day, it is important that there is a legal framework that is enforced in order to make it a fair playing field for industry and a safer space for society. The first versions of the automobile did not include safety features like seatbelts, but eventually there was enough traffic on the road that the need to mitigate risk to consumers required safety laws.

What people and organizations do you feel are doing a good job toward improving social media?

Center for Humane Technology, and Thorn (resources for eliminating child sexual abuse and recovering human trafficking victims).

What do you see as the risk of doing nothing to address the shortcomings of social media?

Further division in the social fabric of society.

Connect with Lisa Thee @Lisa_thee
LEARNING FROM THE COMMUNITY

Gillian K. Hadfield
Director of the Schwartz Reisman Institute for Technology and Society

Tell us about your role:

My current research focuses on innovative design for legal and regulatory systems for AI and other complex global technologies, computational models of human normative systems and working with machine learning researchers to build ML systems that understand and respond to human norms.

As the Director of the Schwartz Reisman Institute for Technology and Society, I work with a team of highly experienced senior scholars in fields ranging from computer science to philosophy, political science and law to investigate and transform how we think about how technology, systems and society interact. I believe that developing responsible, fair, and beneficial AI isn’t only a computer scientist’s or an engineer’s job; it is the job of social scientists and humanists as well, and I strive to convene conversations between those experts so that we can solve contemporary problems – together.

At SRI, we also collaborate with external partners in industry, government and civil society to build human-centered solutions and public policy initiatives that are innovative, actionable and both locally and globally relevant to the challenge of building safe, responsible and inclusive AI and other advanced technologies.

Tell us about your career path and how it led you to your work’s focus:

I have been interested in integrating normative thinking about fairness and justice with systems thinking about how human societies work since reading John Rawls’s A Theory of Justice as an undergrad. I arrived at Stanford in 1983 to pursue a joint degree in law and economics, and my PhD focused on the economics of contracting and bargaining, including issues of incomplete contracting.

I then experienced legal systems personally during a family dispute and it was a major wake-up call: they’re expensive, complex and do not work very well! I started examining how legal systems work, realizing that they’ve grown increasingly out of step with the modern world.

My 2017 book, Rules for a Flat World: Why Humans Invented Law and How to Reinvent It for a Complex Global Economy, re-envisioned legal systems for an increasingly complex world. I am now applying that work to the risks, benefits, evolution and regulation of powerful technologies like AI. We’re currently trying to shoehorn new technologies into old concepts, so re-thinking terms like “regulation,” “data,” “privacy” and “consent” can ensure adequate governance models for the AI era.

My work in AI began with a joint paper with my son, now a professor at MIT in AI, about incomplete contracts and the alignment problem. How can we make sure our machines achieve our desired outcomes? I think we need to study
normativity itself as a phenomenon in order to figure that out. I have come full circle, back to Rawls: What is fair, and how
do we get there?
What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

Most solutions call for governments to conduct fairly traditional oversight and rulemaking. But I am most excited about new ideas that leverage technology and markets. For example, Ron Bodkin, our Engineering Lead at the Schwartz Reisman Institute, is spearheading an initiative to focus on what machine learning systems are optimizing for – especially recommendation systems. Ron’s group will explore what goes into the design of optimization objectives (looking at both engineering decisions and business incentives) and how better objective functions could mitigate undesirable outcomes. This is a pragmatic and transparent approach that can be implemented today. We want to show what’s possible right now and to spur innovation. I think this is exciting. I am also talking to researchers about ideas I have developed for “regulatory markets.” We know that governments do not have the capacity or speed to keep up with technological developments, so we need to create market incentives for new regulation ideas.
We can do this by creating a market for licensed regulatory services and requiring social media companies to purchase
independent regulatory oversight services. Regulators would be required to show that their systems achieve goals set by
governments. So rather than passing a law that specifies what social media platforms can and cannot do (as a bill
currently before the U.S. Senate proposes), governments would license private regulatory companies whose methods
demonstrably reduce excessive and harmful engagement on social media. Governments then regulate the regulators—
who are incentivized to do the research and tech development we need.
When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?
A big part of my research is on the innovation of governance methods, so I would say we need the involvement of both
governments and platforms most urgently. But, what’s key here is that they must do things very differently. We need to
rethink the entire relationship between governments and corporations, and we need to involve diverse parties like
standards organizations, social science and humanities scholars, and public policy experts in this endeavor. We also know
that self-regulation simply does not work.
The bulk of responsibility should not rest on citizens to fully understand and think critically about powerful technologies.
Certainly, media literacy and public education are important, but primary reliance on these is a sort of victim-blaming.
One of my colleagues at the Schwartz Reisman Institute, Lisa Austin, has written about how the “individual consent”
model – one in which people are expected to, for example, read, understand and consent to the use or disclosure of their
personal information – is simply untenable and unrealistic in light of the complexity and opacity of current data flows and
data ecosystems. Who reads – let alone understands – terms and conditions on apps and other digital products?
Instead, the builders of technologies and the governments, whose job it is to protect people, need to imagine a new kind
of collaborative governance framework. That’s why my vision of innovative governance includes both tech- and market-
based regimes. I suppose that’s a more advanced version of “governmental oversight,” but that’s where I think our best
option lies.
Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?
If we find more effective means of regulating with some of the methods I have mentioned, I really believe we’ll get better
at finding the right balance. But it is important to remember that we do a lot of this kind of political toggling around all
important public issues, not just this one. So there is no reason to think this will be any different.
The tech world, of course, would like to wish away politics. But politics is not going anywhere, and we need to get better
at making this complex question about politics and, by extension, about people. We often see contested public decision-
making in democratic institutions (appellate courts, election observation, etc.) and that’s a fundamental part of a healthy
democracy. There is rarely a decision that affects a large part of the public that does not get scrutinized – and rightfully
so.
So I think the question is not: Will we ever solve this? But rather: How can we ensure that our collective decision-making
is accountable to our shared values, subject to review and revision and, most importantly, amenable to flexibility in
constantly changing circumstances?

The current problem is that the power of Big Tech swamps our regulatory and democratic methods. Consequently, these
companies have outsized power in these kinds of decision-making processes. That’s what we need to change.
Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?
The problem is not algorithms themselves – or how large their role is. There are “good” algorithms and “bad” ones as well.
The crux of this issue is how we build those algorithms. What exactly are we optimizing for? Are we optimizing too much?
There is an axiom in economics that says, when you cannot measure everything you care about, you have to be careful
about over-optimizing for the things you can measure. In other words, we need to look at results with some skepticism
when we know that not all inputs were available for measurement. Take teacher incentives, for example—we can
measure student performance on standardized tests, but not as easily measure creativity or moral growth. If we tie
teacher incentives to standardized tests, we pay too much attention to optimizing student test performance, and not
enough to the things we cannot measure. The overall outcome is worse for everyone.
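To put toy numbers on that axiom (our illustration, not the interviewee’s), suppose measured test scores rise with prep hours while an unmeasured good, such as creativity, declines. An optimizer that sees only the metric chooses the extreme, while the true optimum lies far lower:

import numpy as np

hours = np.arange(0, 41)            # hours of test prep per month (hypothetical)
measured = 10 * np.sqrt(hours)      # test scores: visible to the metric, diminishing returns
unmeasured = 50 - 1.5 * hours       # creativity/moral growth: invisible to the metric
true_value = measured + unmeasured  # what we actually care about

print(hours[np.argmax(measured)])   # 40: the proxy-optimizer prescribes maximum prep
print(hours[np.argmax(true_value)]) # 11: the real optimum, once the hidden cost counts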
Excessive optimization is a problem we're facing in multiple domains – scheduling of retail workers, for example. Just
because we can optimize with data does not mean we should.
The fact that we are building very powerful algorithms that play a large role in dictating users’ experiences on social
media is not, in itself, a problem. What we need is a complex incentive structure – and corollary research and policy
infrastructures – that truly understand the power of these algorithms in order to nurture and shape them into the kinds
of algorithms we want to be influential—the kinds that reflect human values.
Connect with Gillian K. Hadfield @ghadfield

"The problem is not algorithms themselves – or how large their role is. There are “good” algorithms and “bad” ones as well. The crux of this issue is how we build those algorithms. What exactly are we optimizing for? Are we optimizing too much?"

-Gillian K. Hadfield, Director of the Schwartz Reisman Institute for Technology and Society
LEARNING FROM THE COMMUNITY

Paloma Viejo
Research Assistant/Post Doc

Tell us about your role:

I am currently a Research Assistant, postdoctoral researcher for FUSE at Dublin City University's National Anti-Bullying Research and Resource Center. I assist schools in tackling bullying, hate speech-based bullying and online safety. My doctorate has explored how Facebook governs hate speech, and I am deeply interested in the area of platform governance.

Tell us about your career path, and how it led you to your work’s focus:

My background is in Media and Culture Studies. Through the latter, I obtained an internship position with the Spanish government, working as a cultural specialist for the Minister of Foreign Affairs. This internship led me to spend the initial years of my professional career working for the Spanish Development Unit in Guatemala and Sudan and for UNESCO as a visitor researcher. Drawing influence from both academic and professional practice, my critical thinking developed and I became increasingly interested in the subjects of class and race. In 2013, I completed the MPhil in Race, Ethnicity and Conflict at Trinity College Dublin, which revolves around race-critical theory and critical social studies. By 2014, Professor Eugenia Sapiera of Dublin City University School of Communications opened a PhD research position in racism and hate speech in online environments. I was selected as the PhD candidate to research the conditions of possibility for the creation and circulation of racist material in social media. Inquiring about the notion of hate speech led me to look at the evolution of the mechanisms put in place over time to “control hate,” particularly in the period between 1940 and the 2010s (from the drafting process of the Declaration of Human Rights to the time of social media), by looking into the principles and values that underpin each actor who has regulated hate.

Ultimately, I am looking to research any potential challenges posed by the rise of social media platforms as both new cultural powers and spaces where “hate speech” regularly occurs.

In your opinion, what are the biggest issues facing social media?

I would say it is hate speech, or more accurately, the conditions of possibility for hate speech to be on the platforms. Hate speech is for the most part framed by Facebook and by politicians as an operational problem. However, through my research, I have observed that the problem is a more profound one, rooted in the values and principles upon which Facebook has built its technology – and that its technology perpetuates. This question needs to be unfolded. Perhaps this interview has no room for it, but I will [give] you a simple example. Facebook has two values to justify how users upload content: Voice and Equity. Voice and Equity are technologically reflected on a Facebook user’s wall under a simple question: “What is on your mind?”
Among many other possibilities, Facebook asks the user: “What is on your mind?” That is the type of question you ask someone who is lost in thought, who is staring at the ceiling. It does not ask for elaborate thoughts; it is asking one to speak, simply speak, and the question is supported by two principles: Voice and Equity. Voice means that all individuals can upload whatever is in their minds, and Equity implies that all users are arithmetically equal, regardless of whether or not they belong to the oppressed or the oppressor. Every single user is in a position to speak their mind. That is, at the end of the day, what “Platform for all” means, but – also – here is where the problems start.

In this particular case, Facebook has invited us to post anything we want, whatever is on our mind, and that potentially includes hateful content. Yes, we have the Community Standards forbidding specific expressions and automatic detection to stop them. However, operationally speaking, those are activated once the content is flowing on the platform – once the word is out. That is only a small example of how Facebook’s Principles and Values affect how we interact. We could also talk about how Facebook’s value of Equality determines the policy definition of hate speech and embraces a post-racial understanding of hate speech.
What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

What do we mean by improving? Do we mean adding more product solutions designed upon the same principles? Or do
we mean altering the conditions of possibility for hateful content to be on the platform? If it is the first case, I can say I am
excited to see how Facebook will expand its product solutions to “advance racial justice” (see [Mark] Zuckerberg’s post
on June 5th, 2020). It is a new project currently led by Fidji Simo, head of the Facebook app, and Ime Archibong, who is in
charge of Product Experimentation on Facebook.

I look forward to seeing what kind of solutions they propose.

If by improving, we mean altering the conditions of possibility for hateful content on the platform, platforms like
Facebook would have to change enormously, to the extent, I argue, that they would no longer be the platforms we know.
Therefore, it would no longer be an improvement but a change. I am inquisitive to know how building platforms with
different values would affect the way we connect and communicate.

How do we ensure safety, privacy and freedom of expression all at the same time?

When it comes to ensuring safety and freedom of expression, the matter of fact is that Facebook already does. It is a technicality, but one I find fascinating.

Tacitly, Facebook makes the distinction between freedom of expression and freedom of information. If we look closely, all the mechanisms and techniques that Facebook has implemented to provide safety do not dictate what users have to say. Users’ voices are left intact; the mechanisms mostly interfere with how users receive and disseminate information. Take a look (a schematic sketch follows the list):

1. Users’ settings regulate user visibility.
2. The user flagging report system lets Facebook know what content users consider should not keep circulating.
3. Automatic detection, for obvious reasons, applies only to content that is already on the platform.
4. Human moderation, whose task is to eliminate or filter the visibility of content.
5. The Oversight Board, whose ultimate task is to decide whether certain content should be back in circulation or not.
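A schematic sketch of that ordering in Python (hypothetical names only – an illustration of the argument, not any platform’s actual code): nothing gates the act of posting, and every safety mechanism then operates on the post’s circulation.

def publish(text):
    # "Freedom means you do not have to ask permission first":
    # nothing is checked before the post goes live.
    return {"text": text, "visible": True}

def circulation_controls(post, user_settings_ok, flagged, classifier_risk,
                         moderator_removes, board_restores):
    post["visible"] = post["visible"] and user_settings_ok  # 1. user settings regulate visibility
    if flagged or classifier_risk > 0.8:                    # 2-3. flagging and automatic detection
        post["visible"] = False                             #      act only on already-live content
    if moderator_removes:                                   # 4. human moderation filters visibility
        post["visible"] = False
    if board_restores:                                      # 5. the Oversight Board can restore circulation
        post["visible"] = True
    return post

post = publish("whatever is on your mind")                  # the word is out first
post = circulation_controls(post, user_settings_ok=True, flagged=True,
                            classifier_risk=0.2, moderator_removes=False,
                            board_restores=False)
print(post["visible"])  # False: the speech itself is untouched; its circulation is curtailed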
Zuckerberg summarized this well in 2017: “Freedom means you do not have to ask permission first, and that by default
you can say what you want. If you break our community standards or the law, then you're going to face consequences
afterwards. We won't catch everyone immediately, but we can make it harder to try to interfere.” (Zuckerberg, Mark, 21
September 2017).
As such, freedom of expression and safety are ensured. Perhaps we should start talking specifically about freedom of information. I actually think that, to talk about privacy, we will need to open a different question, but to an extent it is also linked with circulation. The lower your visibility, the lower the circulation of your content – although that is not guaranteed: you would have to rely on your close contacts not to circulate a post whose privacy is important to you.
When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?
Governmental oversight. No doubt. I like Suzor’s (2019) idea when he suggests that terms of service should respond to general law. It would affect community standards, I guess. Furthermore, I would say Facebook would be grateful for it. They clarify that they want to be neither the arbiters of discrimination nor the arbiters of truth. That is at least what they say in public, and I do not have arguments that prove that what they – Facebook – say is not what they believe.
What makes you optimistic that we, as a society, will be able to improve social media?
It makes me feel optimistic that we will keep testing different forms of connecting digitally. Not sure if it has to be on a
platform. I do not see why we cannot own our data and share it with whoever we want. I would love to have a small data
center in my kitchen, right beside my toaster.
Connect with Paloma Viejo @palomaviejo
"It makes me feel optimistic that we will keep testing different forms of
connecting digitally. Not sure if it has to be on a platform. I do not see
why we cannot own our data and share it with whoever we want. I
would love to have a small data center in my kitchen, right beside my
toaster."
-Paloma Viejo, Research Assistant/Post Doc
LEARNING FROM THE COMMUNITY

Justin Hendrix
Editor at Tech Policy Press

Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at New York University’s Tandon School of Engineering.

Tell us about your career path and how it led you to your work’s focus:

I have spent my career at the intersection of media and technology. For the past few years, I have been more concerned about an adjacent intersection: where media and technology intersect with democracy. Here and abroad, there are profound challenges to democracy – now is the time to tackle them.

In your opinion, what are the biggest issues facing social media?

I am definitely of the camp that believes our current social media ecosystem is damaging to democracy and built to satisfy incentives that promote an unhealthy information ecosystem. Some of the companies in that ecosystem are more honest with themselves and with the public than others are about that reality. But while I am a pessimist in the short term, I am an optimist in the long term, for a few different reasons. The problems we face are now much better understood than they were a few years ago. Investments in research, the efforts of journalists and activists and the bravery of individuals across the world in raising awareness of the real violence and damages they have experienced have created the conditions for a renewal I believe is just getting underway.

What "solutions" to improving social media have you seen suggested or implemented that you are excited about?

I would point to people like Karen Kornbluh and Ellen Goodman, who are leading a “Digital New Deal” initiative. They have proposed a variety of priorities – including clarifying that “what happens online is subject to the same legal standards as what happens in real life, update regulations and increase enforcement,” insisting that “the industry make a high-level commitment to democratic design — a so-called digital code of conduct,” and creating “a new ‘PBS of the Internet’ to strengthen our civic infrastructure and ensure a strong online supply of trustworthy, nonpartisan scientific and election information.” There is a great deal to be done – these are good ideas to start.

How do we ensure safety, privacy and freedom of expression all at the same time?

I do not believe addressing disinformation, safety and privacy issues
online requires us to limit free speech, but it does require limits on reach for certain types of statements, especially on
major social media networks that can channel dangerous speech and incitement to violence instantly to millions. Look at
experts like Jameel Jaffer at Columbia’s Knight First Amendment Institute, who found the decision of the platforms to
suspend Donald Trump justifiable, because “to incite violence is causing harms that cannot be countered by speech and
cannot be undone,” which should be an obvious line we draw on free speech rights. Even now, Donald Trump’s speech is
not in any way limited, even if some companies have decided to limit his reach utilizing the privately owned platforms
that they control. We need to think about the true harms to people when we think about these issues.
When we discuss improving social media, we often toggle between the responsibility of platforms, the role of media to
educate the general public, governmental oversight, and the role of citizens in terms of literacy and how they engage
with platforms. In your opinion, what area do you think needs the most improvement?
The one that needs the most attention at present is governmental oversight, followed closely by the responsibility of the
platforms. That is because the responsibility of the platforms needs to be defined by governments in liberal democracies.
What people and organizations do you feel are doing a good job toward improving social media?
I would point to the excellent ideas that just came out of the New Public festival, and the new partnership described by
Nantina Vgontzas and Meredith Whittaker between "militant workers, engaged social movements, progressive
politicians, radical lawyers and critical researchers" who want to develop a new future. We need to put our attention on
what comes next.
What do you see as the risk of doing nothing to address the shortcomings of social media?
It is wrong to blame social media for every problem in the public sphere; but it is equally wrong not to ascribe some blame
to these massive platforms for the fact that democracy is losing ground around the world. We risk losing this form of
governance, which is one of humanity's greatest achievements.
What models do you see coming on line for providing a digital community (beyond today’s ad-based, extraction
model) – platform cooperatives? Decentralized Autonomous Organizations (DAOs)? For example, are there promising
applications for the blockchain?
I am certainly interested in some of these new models. The subscription economy in media has its advantages, for
instance, and the platform co-op vision seems worth continuing to build. But the dominant model, the capitalist,
attention economy model, will be hard to displace. That is why efforts to hack it such as the ideas proposed at New Public
are important.
How does social media look different five years from now?
I reckon we will see continued innovation. At least two or three new platforms will be prominent. We may also see more
fragmentation into peculiar, more narrow communities such as Parler. I also reckon that ubiquitous 5G networks will
drive new modes of interaction – Clubhouse may be an early example, Spatial another. Expect more media-rich
experiences.
Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what
academic or experience backgrounds should be more involved in improving social media?
I'll point to what Courtney Cogburn and Desmond Patton at the Columbia School of Social Work often say: We need
more social workers in tech! They will help us identify the issues and ideas that are important to society that mainstream
technology companies and their engineers may regard as "fringe.”
Will we (all) ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-
platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?
Yes, they can – if they establish bodies that can build precedent, iterate on changes and incorporate new data as they
move forward. We do not need one framework; we need a system that can grow and evolve and change as we learn more
and observe the effects of regulation.
Algorithms play a significant role in the social media experience. What issues do you see with how algorithms play
such a large role in an individual's experience and how can we improve this?
I would point to writers like Shoshana Zuboff or Cathy O'Neil or Kate Crawford or danah boyd who have written
extensively on these issues. Algorithms increasingly shape reality – they are defined by people with biases. We need to
be careful about their application and honest about how and why they are applied.
What makes you optimistic that we, as a society, will be able to improve social media?
I am optimistic because I teach. There are great ideas coming out of the young people I work with. They are always
looking for something to fix. I try and connect them with these various movements that are pushing toward a more just,
equitable, democratic information ecosystem.
Connect with Justin Hendrix @justinhendrix
"It is wrong to blame social media for every problem in the public
sphere; but it is equally wrong not to ascribe some blame to these
massive platforms for the fact that democracy is losing ground around
the world. We risk losing this form of governance, which is one of
humanity's greatest achievements."
-Justin Hendrix, Editor at Tech Policy Press
IMPROVING SOCIAL MEDIA

Organizations
and Resources
Read about and connect with the many
organizations that are involved in improving
social media, and utilize the vast amount of
resources available
AllTechIsHuman.org | ImprovingSocialMedia.com
Access Now (@accessnow) “[D]efends and extends the digital rights of users at risk around the world. By
combining direct technical support, comprehensive policy engagement, global advocacy, grassroots
grantmaking, legal interventions, and convenings such as RightsCon, we fight for human rights in the digital
age.” Accessnow.org
RESOURCE: AccessNow Digital Security Helpline: Services include support for securing users and NGOs’ technical infrastructure, websites, and social media against attacks (government or otherwise)
Accountable Tech (@accountabletech) "We are facing a crisis of truth. Accountable Tech advocates for
the social media companies at the center of today’s information ecosystem to strengthen the integrity of
their platforms and our democracy." Accountabletech.org
RESOURCE: The Tech Transparency Project (TTP) is a research initiative of Accountable Tech that seeks to hold large technology companies accountable.
Ada Lovelace Institute (@AdaLovelaceInst) "An independent research institute and deliberative body
with a mission to ensure data and AI work for people and society." Adalovelaceinstitute.org
RESOURCE: Algorithms in social media: realistic routes to regulatory inspection
AI Now Institute (@AINowInstitute) “The AI Now Institute at New York University is an interdisciplinary
research center dedicated to understanding the social implications of artificial intelligence.”
ainowinstitute.org
RESOURCE: How to Interview a Tech Company
Algorithmic Justice League (@AJLUnited) "The Algorithmic Justice League’s mission is to raise awareness
about the impacts of AI, equip advocates with empirical research, build the voice and choice of the most
impacted communities, and galvanize researchers, policy makers, and industry practitioners to mitigate AI
harms and biases. We’re building a movement to shift the AI ecosystem towards equitable and accountable
AI." AJL.org
RESOURCE: Coded Bias Documentary
AlgorithmWatch (@algorithmwatch) "Non-profit research and advocacy organisation to evaluate and
shed light on algorithmic decision making processes that have a social relevance, meaning they are used
either to predict or prescribe human action or to make decisions automatically. Also keeps a shared
inventory of AI related principles." Algorithmwatch.org/en
RESOURCE: Automating Society Report
All Tech Is Human (@AllTechIsHuman) "Building the Responsible Tech pipeline by informing & inspiring
the next generation of responsible technologists & changemakers. Building a better tech future by changing
those involved in it, making the pipeline more diverse, multidisciplinary and aligned with the public
interest." AllTechIsHuman.org
RESOURCE: Guide to Responsible Tech: How to Get Involved & Build a Better Tech Future, aka the "Responsible Tech Guide"
The Asia Foundation (@Asia_Foundation) “The Asia Foundation is a nonprofit international development
organization committed to improving lives across a dynamic and developing Asia. Through their emerging
issues lab they examine shifting labor markets, how to help workers adapt, and setting a policy agenda for a
future of work that promotes prosperity, jobs, and inclusive growth.” AsiaFoundation.org
RESOURCE: Violent Conflict, Tech Companies, and Social Media in Southeast Asia
Aspen Digital (@AspenInstitute; @AspenDigital) "We empower policy makers, civic organizations,
companies, and the public to be responsible stewards of technology and digital media in the service of a
more just and equitable world." Aspeninstitute.org
RESOURCE: The Future of Social Connection, Loneliness, and Technology
Aspen Tech Policy Hub (@AspenPolicyHub) "The Aspen Tech Policy Hub is a West Coast policy
incubator, training a new generation of tech policy entrepreneurs. Modeled after tech incubators like Y
Combinator, we take tech experts, teach them the policy process through an in-residence fellowship
program in the Bay Area, and encourage them to develop outside-the-box solutions to society’s problems."
AspenTechPolicyHub.org
RESOURCE: Aspen Tech Policy Hub Projects
Atlantic Council (@AtlanticCouncil) "The Atlantic Council promotes constructive leadership and
engagement in international affairs based on the Atlantic Community’s central role in meeting global
challenges. The Council provides an essential forum for navigating the dramatic economic and political
changes defining the twenty-first century by informing and galvanizing its uniquely influential network of
global leaders." Atlanticcouncil.org
RESOURCE: GeoTech Center - Tech, Data, People, Prosperity, Peace
Avaaz (@avaaz) "Avaaz is a global web movement to bring people-powered politics to decision-making
everywhere. Avaaz empowers millions of people from all walks of life to take action on pressing global,
regional and national issues, from corruption and poverty to conflict and climate change. Our model of
internet organising allows thousands of individual efforts, however small, to be rapidly combined into a
powerful collective force." Avaaz.org
RESOURCE: Disinfo Hub
Berggruen Institute (@berggruenInst) "Exploring new ideas across tech, governance & philosophy in an
era of great transformations." Berggruen.org
RESOURCE: The Berggruen Fellowship Program
Berkman Klein Center (Harvard) (@BKCHarvard) "The Berkman Klein Center for Internet & Society at
Harvard University is dedicated to exploring, understanding, and shaping the way we use technology."
Cyber.harvard.edu
RESOURCE: Digital Citizenship and Resource Platform
Betalab (@betaworksVC) “An early-stage investment program for startups aiming to Fix The Internet.”
betaworksventures.com/betalab
RESOURCE: Application for Cohort 2 of the Betalab funding program
TheBridge (@TheBridgeWork) "TheBridge is a non-partisan organization breaking down silos and
connecting professionals across technology, policy and politics — building stronger, more collaborative
relationships. We believe mutual understanding will lead to further collaboration among these cultures. As
a neutral organization, TheBridge provides a unique forum for productive discussions on hotly debated
issues. We host a tech, policy, politics jobs board and a career platform helping translate career skills
between these industries, we convene our members, make connections and provide resources on our
website including a searchable leaders database." Thebridgework.com
RESOURCE: TheBridge Leaders Directory
Build Tech We Trust (@buildtechtrust) "We are a collective of tech CEOs, activists, changemakers, and
workers who believe the time to act to counter the hate and terrorism is now. We believe technology should
improve the human experience and quality of life for everyone, and that tech companies and
leaders should take responsibility for the harm caused by their platforms and tools. We believe technology
has the power to transform our lives for the better, but only if we prioritize people over the gains for the
few. Today, we invite you to join us in changing the way we build and use tech." BuildTechWeTrust.com
RESOURCE: Contact form to share more information with Build Tech We Trust
Center for Democracy & Technology (@CenDemTech) "The Center for Democracy & Technology.
Shaping tech policy & architecture, with a focus on the rights of the individual...Our team of experts
includes lawyers, technologists, academics, and analysts, bringing diverse perspectives to all of our efforts."
Cdt.org
RESOURCE: CDT's collection of reports and insights
Center for Humane Technology (@HumaneTech_) "We are a team of deeply concerned technologists,
policy experts, and social impact leaders who intimately understand how the tech industry’s culture,
techniques, and business models control 21st century digital infrastructure. Together with our partners, we
are dedicated to radically reimagining technology for the common good of humanity." Humanetech.com
RESOURCE: Collection of Resources
Center for Information Technology and Policy (CITP) at Princeton University (@PrincetonCITP)
“CITP is an interdisciplinary center at Princeton University. The center is a nexus of expertise in technology,
engineering, public policy, and the social sciences on campus. In keeping with the strong University tradition
of service, the center’s research, teaching, and events address digital technologies as they interact with
society.” citp.princeton.edu
RESOURCE: Tech Policy Case Studies
Center for Media, Technology and Democracy at McGill University (@MediaTechDem) "The Centre
produces critical research, policy activism, and inclusive events that inform public debates about the
changing relationship between media and democracy, and that ground policy aimed at maximising the
benefits and minimizing the systemic harms embedded in the design and use of emerging technologies.”
Mediatechdemocracy.com
RESOURCE: Related projects
Center for Technology Innovation at Brookings (@BrookingsInst) "[F]ocuses on delivering research that
affects public debate and policymaking in the arena of U.S. and global technology innovation. Our research
centers on identifying and analyzing key developments to increase innovation; developing and publicizing
best practices to relevant stakeholders; briefing policymakers about actions needed to improve innovation;
and enhancing the public and media’s understanding of technology innovation."
Brookings.edu/center/center-for-technology-innovation/
RESOURCE: A focused federal agency is necessary to oversee Big Tech
Center for Technology & Society at the ADL (@ADL) "How do we ensure justice and fair treatment for
all in a digital environment? How do we counter online hate, protect free speech, and use social media to
reduce bias in society? The Center for Technology and Society takes ADL’s civil rights mission and applies it
to the 21st century." ADL.org/who-we-are/our-organization/advocacy-centers/center-for-
technology-and-society
RESOURCE: The Online Hate Index
Change the Terms (@changeterms) “To ensure that companies are doing their part to help combat
hateful conduct on their platforms, organizations in this campaign will track the progress of major tech
companies – especially social media platforms – to adopt and implement these model corporate policies
and give report cards to these same companies on both their policies and their execution of those policies
the following year.” changetheterms.org
RESOURCE: Adopt the Terms
Citizen Browser (The Markup) (@themarkup) “The Markup, a nonprofit newsroom that investigates how
the world’s most powerful institutions use technology to reshape society, today announced the
development of The Citizen Browser Project—an initiative designed to measure how disinformation travels
across social media platforms over time.”
RESOURCE: Citizen Browser
CiviliNation (@CiviliNation) "CiviliNation's goal is to advance the full capability of individuals to
communicate and engage in cyberspace in an open, responsible, and accountable way. We believe
that by fostering an online culture in which individuals can fully engage and contribute without fear
or threat of being the target of unwarranted abuse, harassment, or lies, the core ideals of democracy
are upheld." Civilination.org
RESOURCE: Resources to Protect Yourself Online
Clean Up Twitter Online forum for interdisciplinary discussions of combating online hate on all
platforms.
RESOURCE: The forum
Common Sense Media (@CommonSense) "Common Sense has been the leading source of entertainment
and technology recommendations for families and schools...Together with policymakers, industry leaders,
and global media partners, we're building a digital world that works better for all kids, their families, and
their communities." Commonsensemedia.org
RESOURCE: Tweens, Teens, Tech, and Mental Health: Coming of Age in an Increasingly Digital, Uncertain, and Unequal World 2020
ConnectSafely (@ConnectSafely) "[D]edicated to educating users of connected technology about safety,
privacy and security. Here you’ll find research-based safety tips, parents’ guidebooks, advice, news and
commentary on all aspects of tech use and policy." Connectsafely.org
RESOURCE: Educator Guides
Consentful Tech Project "The Consentful Tech Project raises awareness, develops strategies, and shares
skills to help people build and use technology consentfully."
RESOURCE: Building Consentful Tech
Contract for the Web (@webfoundation) "A global plan of action to make our online world safe and empowering for everyone." Contract launch, 2018. Founder: Sir Tim Berners-Lee. Webfoundation.org & Contractfortheweb.org
RESOURCE: Algorithmic Accountability: Applying the concept to different country contexts
Countering Crime (@CounteringCrime) “Team of experts in online-trafficking, extremism/terrorism
and tech policy. Let's make the Internet safer.” counteringcrime.org
RESOURCE: The ESG Report: Organized Crime and Terror on Facebook, WhatsApp, Instagram and Messenger
Coworker.org (@teamcoworker) "At Coworker.org, we deploy digital tools, data, and strategies in service
of helping people improve their work lives. Coworker.org is a laboratory for workers to experiment with
power-building strategies and win meaningful changes in the 21st century economy." Coworker.org
RESOURCE: For Workers in Tech
Cyber Civil Rights Initiative (@CCRInitiative) "Empowering victims of nonconsensual porn (NCP) to
become stewards of their own life and doing everything in our power to eradicate NCP altogether.
CCRI’s Mission is to combat online abuses that threaten civil rights and civil liberties. CCRI’s Vision is of a
world in which law, policy and technology align to ensure the protection of civil rights and civil liberties for
all." Cybercivilrights.org
RESOURCE: 2017 Nationwide Online Study of Nonconsensual Porn Victimization and Perpetration: A Summary Report
CyberPeace Institute (@CyberpeaceInst) "A Cyberspace at peace, for everyone, everywhere." Based in
Geneva, Switzerland." Cyberpeaceinstitute.org
RESOURCE: The COVID-19 Infodemic: When One Epidemic Hides Another
CyberWise (@BeCyberwise) "CyberWise is a resource site for BUSY grownups who want to help youth
use digital media safely and wisely. It is the companion site to Cyber Civics, our comprehensive digital
literacy program for middle school." Cyberwise.org
RESOURCE: Social Media Learning Hub
Dangerous Speech Project (@dangerousspeech) “The Dangerous Speech Project was founded in 2010
to study speech (any form of human expression) that inspires violence between groups of people – and to
find ways to mitigate this while protecting freedom of expression.” Dangerousspeech.org
RESOURCE: Counterspeech: A Literature Review
Data & Society (@datasociety) "Data & Society studies the social implications of data-centric
technologies & automation. We produce original research on topics including AI and automation, the
impact of technology on labor and health, and online disinformation." Datasociety.net/
RESOURCE: Reorienting Platform Power
DemocracyLab (@DemocracyLab) "Nonprofit, open source platform empowering people who use
#technology to advance the public good by connecting skilled #volunteers to #techforgood projects."
Democracylab.org
RESOURCE: Tech For Good projects
Design Justice Network (@design__justice) "The Design Justice Network challenges the ways that design
and designers can harm those who are marginalized by systems of power. We use design to imagine and
build the worlds we need to live in — worlds that are safer, more just, and more sustainable. We advance
practices that center those who are normally excluded from and adversely impacted by design decisions in
design processes." Designjustice.org
RESOURCE: Design Justice Network Principles
Digital Wellness Collective (@dwforall) "We enhance human relationship through the intentional use and
development of technology." Digitalwellnesscollective.com
RESOURCE: Digital Wellness Day Toolkit
Digital Forensic Research Lab, Atlantic Council (@DFRLab) "Atlantic Council's Digital Forensic
Research Lab. Cultivating a global network of digital forensic analysts (#DigitalSherlocks) to combat
disinformation." Based in Washington, DC. Digitalsherlocks.org
RESOURCE: Reports
DQ Institute (@DQforAll) "The DQ Institute (DQI) is an international think-tank that is dedicated to
setting global standards for digital intelligence education, outreach, and policies.”
RESOURCE: #DQEveryChild
Electronic Frontier Foundation (@EFF) "We're the Electronic Frontier Foundation. We defend your civil
liberties in a digital world." EFF.org
RESOURCE: Privacy Without Monopoly: Data Protection and Interoperability
EU Disinfo Lab (@DisinfoEU) "A vibrant home for disinformation activists and experts. EU DisinfoLab is
an independent non-profit organisation focused on tackling sophisticated disinformation campaigns
targeting the EU, its member states, core institutions, and core values." Disinfo.eu
RESOURCE: Automated tackling of disinformation
Facing Facts (@FacingFactsEU) “Facing Facts is an innovative programme aiming to tackle the issue of
hate crime and hate speech in Europe. Due to increasing demand for capacity building programmes in this
field by EU Member States, the Facing Facts training offer is now available online
(www.facingfactsonline.eu) and is used by law enforcement and civil society representatives. Multiple
courses in multiple languages address specific aspects of identifying, monitoring and countering hate crime
and hate speech.”
RESOURCE: Facing Facts courses
Family Online Safety Institute [FOSI] (@FOSI) "FOSI convenes leaders in industry, government and
the non-profit sectors to collaborate and innovate new solutions and policies in the field of online
safety." Fosi.org/
RESOURCE: Tools for Today's Digital Parents
Fight for the Future (@fightfortheftr) "Fight for the Future is a nonprofit advocacy group in the area of
digital rights founded in 2011. The group aims to promote causes related to copyright legislation, as well as
online privacy and censorship through the use of the Internet.“ Fightforthefuture.org
RESOURCE: Projects
First Draft News (@FirstDraftNews) "We work to protect communities from harmful information by
sharing tips and resources to build resilience and improve access to accurate information."
Firstdraftnews.org
RESOURCE: Too much information: a public guide to navigating the infodemic
Future Says (@futuresays_) "Powered by the Minderoo Foundation, Future Says is a new global initiative,
committed to accountability in the tech ecosystem, to rebalancing power, and to reimagining technology in
a pro-public way – built and designed by people for people." FutureSays.org
RESOURCE: Reimagine Tech
Global Disinformation Index (@DisinfoIndex) “The Global Disinformation Index (GDI) aims to disrupt,
defund and down-rank disinformation sites. We collectively work with governments, business and
civil society. We operate on three core principles of neutrality, independence and transparency.”
RESOURCE: Related Research
Global Internet Forum to Counter Terrorism (@GIFCT_official) "The mission of the Global Internet
Forum to Counter Terrorism (GIFCT) is to prevent terrorists and violent extremists from exploiting digital
platforms. Founded by Facebook, Microsoft, Twitter, and YouTube in 2017, the Forum was designed to
foster technical collaboration among member companies, advance relevant research, and share knowledge
with smaller platforms. Since 2017, GIFCT’s membership has expanded beyond the founding companies to
include over a dozen diverse platforms committed to cross-industry efforts to counter the spread of
terrorist and violent extremist content online."
RESOURCE: Conspiracy Theories, Radicalisation and Digital Media
HONR Network (@honrnetwork) "Organization dedicated to empowering victims of online abuse through
education and advocacy." Honrnetwork.org/honr-today
RESOURCE: Why hasn't social media done more?
Information, Communication & Society (@ICSJournal) "Information Communication & Society is an
academic journal that publishes current work on the social, economic, and cultural impact of information
and communication technologies."
RESOURCE: Antecedents of support for social media content moderation and platform regulation: the role of presumed effects on self and others
Institute for Strategic Dialogue (@ISDglobal) “ISD’s work surveys the wide range of disinformation
tactics used to promote polarisation, to undermine elections and to threaten democratic discourse. This
includes smear campaigns, distortive and deceptive media, and the range of inorganic methods used to
amplify this content to wider audiences." ISDglobal.org
RESOURCE: Disinformation material
Lincoln Network (@JoinLincoln) “Lincoln Network believes that when technology meets and supports the
cause of liberty, our society wins and our future becomes brighter.” LincolnPolicy.org
RESOURCE: Open Data Initiative
Lumen Database "The Lumen database collects and analyzes legal complaints and requests for removal of
online materials, helping Internet users to know their rights and understand the law. These data enable us
to study the prevalence of legal threats and let Internet users see the source of content removals."
RESOURCE: The database
MediaJustice (@mediajustice) “MediaJustice (formerly CMJ) fights racial, economic, and gender justice in
a digital age.” MediaJustice.org
RESOURCE: Big Tech and Platform Accountability
Meedan (@meedan) “Meedan builds digital tools for global journalism and translation. We are a team of
designers, technologists and journalists who focus on open source investigation of digital media and
crowdsourced translation of social media. With commercial, media and university partners, we support
research, curriculum development, and new forms of digital storytelling.” Meedan.com
RESOURCE: Content Moderation Toolkit
MIT - The Media Lab (@medialab) "An antidisciplinary research community and graduate program at MIT
focused on the study, invention, and creative use of emerging technologies." Media.mit.edu
RESOURCE: The Human Dynamics group
The Mozilla Foundation (@mozilla) "The Mozilla Foundation works to ensure the internet remains a
public resource that is open and accessible to us all." Foundation.mozilla.org
RESOURCE: When Content Moderation Hurts
NAMLE (@MediaLiteracyEd) "The National Association for Media Literacy Education (NAMLE) is a non-
profit organization dedicated to advancing media literacy education. We define both education and media
broadly." Namle.net
RESOURCE: Journal of Media Literacy Education
The Net Safety Collaborative "Through NetFamilyNews.org, SocialMediaHelpline.com and participation
in the international public discourse about our new, very social media environment, The Net Safety
Collaborative provides insights into research, trends and developments that surface ways stakeholders are
helping it serve humanity better." Netfamilynews.org and Socialmediahelpline.org
RESOURCE: Beyond ‘The Social Dilemma’ to social solutions
New_Public By Civic Signals (@WeAreNew_Public) “We’re a community of thinkers, designers, and
technologists building the digital public spaces of the future.” newpublic.org

RESOURCE: Building better digital spaces

The News Literacy Project (@NewsLitProject) "The News Literacy Project, a nonpartisan national
education nonprofit, provides programs and resources for educators and the public to teach, learn and
share the abilities needed to be smart, active consumers of news and information and equal and engaged
participants in a democracy." Newslit.org

RESOURCE: The News Literacy Project resources hub

OASIS Consortium (@ConsortiumOasis) "[A]n association of key stakeholders across digital platforms,
media, government, and academia focused on brand and user safety. This includes advocating for brand
and user safety, focusing on actionable terms and standards of behavior, and steering the industry toward
clarity and responsibility." Oasisconsortium.com

RESOURCE: Brand Safety Exchange

One in Tech (@WeAreOneInTech) "One In Tech is focused on the prevalent issues of inequality, inequity,
and bias in technology and digital inclusion affecting under-resourced, under-represented, and under-
engaged populations throughout the world. Our organization works to bridge the global Digital Divide,
which is the gap between people with and those without effective access, resources, and skills to enable
healthy digital engagement with the internet and other digital technology." Oneintech.org

RESOURCE: Three Key Programmes

Online Hate Prevention Institute (@OnlineHate) “The Online Hate Prevention Institute (OHPI) is an
Australian Harm Prevention Charity that conducts research, runs campaigns, provides public education,
recommends policy changes and law reform, and seeks ways of changing online systems to make them
more effective in reducing the risks posed by online hate. We work to change online culture so hate in all its
forms becomes as socially unacceptable online as it is in real life.” ohpi.org.au

RESOURCE: Measuring the Hate: The State of Antisemitism in Social Media

OnlineSOS (@onlinesos) “Online SOS is a safe place where people can find tools, information and, above
all, empowerment, in the face of online harassment.” Onlinesos.org

RESOURCE: Assess and Take Action: Identify what type of online harassment you’re experiencing, and
take action.

Open Source Researchers of Color (@osroccollective) “We are a radical and ethical collective of
investigators who research and preserve crowd-sourced information; we create resources on
security, privacy, investigating, and archiving social movements.” Osroc.org

RESOURCE: Related resources

Oxford Internet Institute (@oiioxford) "The Oxford Internet Institute is a multi-disciplinary department
of social and computer science dedicated to the study of information, communication, and technology, and
is part of the Social Sciences Division of the University of Oxford, England."

RESOURCE: Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation

People-Centered Internet (@PCI_Initiative) "Working to ensure that the Internet is a positive force for
good, improving the lives and well-being of people around the world. This includes promoting connectivity,
fighting disinformation, contributing to the discussion about technology ethics, supporting the development
of people-centered applications and initiatives, advising policymakers, and leveraging technology to help
communities be more resilient." Peoplecentered.net

RESOURCE: Policy & Governance

Pew Research Center: Internet & Technology (@pewinternet) "Analyzing the social impact of digital
technologies." Pewresearch.org/internet/

RESOURCE: The Role of Social Media in News

Privacy International “Investigating brands using Facebook for advertising, exposing how difficult it is to
understand how our data's used and demanding Facebook make it easier to exercise our rights."

RESOURCE: Advertisers on Facebook: who the heck are you and how did you get my data?

Prosocial Design Network (@DesignProsocial) “We believe that digital products can be designed to help
us better understand one another. That’s why we are building an international network of behavioral
science and design experts to articulate a better, more prosocial future online; and to disentangle the Web’s
most glaring drawbacks: from misunderstandings to incitements to hatred.” prosocialdesign.org

RESOURCE: Case studies

Pivot For Humanity (@Pivot4Humanity) "We’re working to professionalize the social tech industry and
create a more responsible and accountable Silicon Valley." Pivotforhumanity.com

RESOURCE: A Way Forward

Public Data Lab (@PublicDataLab) "Creators of A Field Guide to 'Fake News' and Other Information
Disorders, which explores the use of digital methods to study false viral news, political memes, trolling
practices and their social life online." PublicDataLab.org

Public Knowledge (@publicknowledge) "[P]ublic-interest advocacy organization working to defend your
rights in the emerging digital culture. We work on issues ranging from preserving and protecting an open
and neutral Internet, to making sure that copyright law is balanced and realistic so that values like access
to information and the capacity to create and compete will be preserved and protected in the digital age."
PublicKnowledge.org

RESOURCE: Public Interest Advocacy Training

Ranking Digital Rights (@rankingrights) "Evaluating the world's most powerful digital platforms and
telecommunications companies on their commitments to #digitalrights." RankingDigitalRights.org

RESOURCE: Recommendations for governments and policymakers

Safer Internet Day (@SaferInternetDay) "Starting as an initiative of the EU SafeBorders project in 2004
and taken up by the Insafe network as one of its earliest actions in 2005, Safer Internet Day has grown
beyond its traditional geographic zone and is now celebrated in approximately 170 countries worldwide.
From cyberbullying to social networking to digital identity, each year Safer Internet Day aims to raise
awareness of emerging online issues and current concerns." SaferInternetDay.org

RESOURCE: Gallery of resources



Shorenstein Center on Media, Politics and Public Policy (@ShorensteinCtr) "The Shorenstein Center
on Media, Politics and Public Policy is a Harvard University research center that explores the intersection
and impact of media, politics and public policy in theory and practice." Shorensteincenter.org

RESOURCE: Media Manipulation

Santa Clara University, The Internet Ethics program at the Markkula Center for Applied Ethics
(@IEthics) "The Markkula Center for Applied Ethics explores privacy, big data, social media, the 'right to be
forgotten,' cybersecurity, and other issues in Internet Ethics." Scu.edu/ethics

RESOURCE: Santa Clara Principles

Stanford Psychiatry’s Center for Youth Mental Health & Wellbeing "Their Media and Mental Health
Initiative co-designs with youth and implements interventions to support the mental health and wellbeing
of young people ages 12-25, including the youth-led #goodformedia project."

RESOURCE: Good for Media project

Tech Against Terrorism (@techvsterrorism) “[A]n initiative launched and supported by the United Nations
Counter Terrorism Executive Directorate (UN CTED) working with the global tech industry to tackle
terrorist use of the internet whilst respecting human rights.” Techagainstterrorism.org

RESOURCE: Research and Publications

TechFreedom (@techfreedom) "[A] non-profit, non-partisan technology policy think tank. We work to
chart a path forward for policymakers towards a bright future where technology enhances freedom,
and freedom enhances technology." Techfreedom.org

RESOURCE: Digital Security white paper

TechCongress "Tech experts and professionals spend one year with relevant Members or Committees in
the House and Senate. The fellowship's goal is to help Congress make more informed decisions about
technology and policy by providing technical insight. At present, only 6 out of 15,000 staffers have a
technical background." Techcongress.io

RESOURCE: Policy Opportunities for Technologists

Tech Policy Press (@TechPolicyPress) “The goal of Tech Policy Press is to provoke new ideas, debate and
discussion at the intersection of technology, democracy and policy. We invite you to submit essays, opinion,
reporting and other forms of content for consideration.” techpolicy.press

RESOURCE: Browse by topic

Tech2025 (@JoinTech2025) "Tech 2025 is a platform and innovation community for learning about, and
discussing, the most consequential emerging technologies that will impact our world in the next 5 years."
Tech2025.com

RESOURCE: Online courses

Tech Transparency Project (@TTP_updates) “TTP is an information and research hub for journalists,
academics, policymakers and members of the public interested in exploring the influence of the major
technology platforms on politics, policy, and our lives.” Techtransparencyproject.org

RESOURCE: Reports

Thorn (@thorn) "[A]n international anti-human trafficking organization that works to address the
sexual exploitation of children. The primary programming efforts of the organization focus on Internet
technology and the role it plays in facilitating child pornography and sexual slavery of children on a global
scale."

RESOURCE: Sound Practices Guide Download

Trust & Safety Professional Association (@tspainfo) "TSPA is a forum for professionals to connect with
a network of peers, find resources for career development, and exchange best practices for navigating
challenges unique to the profession." Tspa.info

RESOURCE: Trust & Safety Resource Library

WashingTech (@WashingTECH) "As America's 'inclusive voice of tech policy,' WashingTECH's mission is
to convene diverse technology public policy professionals to defend America's rich diversity with programs
that promote an inclusive narrative about technology's impact on society." WashingTech.org

RESOURCE: What Is Technology Policy?

The Web Foundation (@WebFoundation) "The World Wide Web Foundation was established in 2009 by
web inventor Sir Tim Berners-Lee and Rosemary Leith to advance the open web as a public good and a
basic right. We are an independent, international organisation fighting for digital equality — a world where
everyone can access the web and use it to improve their lives." WebFoundation.org

RESOURCE: Tackling Online Gender-Based Violence and Abuse

5Rights Foundation (@5RightsFound) "5Rights Foundation exists to make systemic changes to the digital
world that will ensure it caters for children and young people, by design and default, so that they can thrive.
5Rights works with, and on behalf of, children and young people to reshape the norms of the digital world in
four priority areas: design of service, child online protection, children and young people's rights, and data
literacy." 5rightsfoundation.com

RESOURCE: Data Literacy

Contributors to this Report
This report was developed through the collaboration of a wide range of contributors, each bringing their
unique expertise and perspective.

Adewale Babalola Felicia Chen Michelle Sheu
Adrian Mack Felicia Vacarelu Nana Young
Alfredo Pupillo Francesca Scapolo Nandini Ranganathan
Amanda Conrad Gabriel Kobus Natalia Monje
Amit Dar Grace Juster Nicholas J Picard
Amy Giddon Gurshaant Bassi Nicholas Perry
Andre Arias Heera Kamboj Nina Joshi
Andrew Bolson Iris Thiele Isip Tan Nupur Sahai
Aneekah Uddin Ivy Mahsciao Osiris Parikh
Anh Dao Jasmin Crentsil Patrick McAndrew
Ankita Joshi Jason Lajoie Phil Surles
Anne Collier Jennifer Forsberg Rania Wazir
Ariba Jahan Jenny Korn Rebecca Newton
Aron Rosenberg Jessica Ji Rebecca Sealfon
Arsh Shah Jessica Pham-Ruhland Rebekah Tweed
Arushi Saxena Jessica Rudd Rhett King
Ashley B Hixson Jigyasa Sharma Ricky Marton
Bijal Mehta Jon Pincus Rohan Light
Borhane Blili-Hamelin Josh Sprague Rusk Fello
Brian Reitz Joshua Nunn Sanjeeb Kumar Mishra
Caleb Gardner Kaliya Young Sara-Jayne Terp
Cara Hall Karen Aloysia Barreto Simerpreet Kaur
Carolina Christofoletti Kari Dreyling Simon Penwright
Catherine Daar Karina Alexanyan Stephanie Davey
Chhavi Chauhan Kashia Dunner Stephen Gray
Christina Wong Kasia Jakimowicz Tara Osler
Christy Casey Kayla Brown Teodora Pavkovic
Dan Gorman Laure Cast Tiffany Jiang
David Ryan Polgar Lili Siri Spira Tina Purnat
Deb Schultz Lisa Thee Toby Shulruff
Devika Malik Lucina Di Meco Tracy E. McDowell
Eli Clein Mari Escoto Trishnika Chakraborty
Ellen Rowe Mariana Avelar Vicki Harrison
Erin Carr-Jordan Matt Klein Waqar Hussain
Eva Sachar Maya Sellon

Next Steps

PLATFORMS: Becoming more open, transparent, and democratic.

KNOWLEDGE BASE: Greater knowledge-sharing and collaboration to inform all nodes.

TECH WORKERS: Greater awareness of their power and role.

POLICYMAKERS: Moving from reactive to proactive.

USERS: Increasing education and voice in the decision-making process.

ADVERTISERS: Recognition of responsibility as the lifeblood of ad-based models.

NEWS MEDIA: Greater ethics, tech literacy, and accountability to serve the public.

FUNDERS: Withholding funding for exploitative business practices.

In Summary
This report is a catalyst for change and a resource to begin
transitioning from theorizing to greater action.

The current media narrative around improving social media often paints a simplistic for-or-against
dichotomy and leaves many important, diverse voices out of the conversation.

Mapping out an approach to improve social media depends on understanding what an ideal social media
future looks like. This requires understanding the diverse range of opinions and options, and ensuring that
a wide range of voices is included in its development.

To successfully improve social media, we need to move toward a holistic, collective approach that
considers the roles of platforms, users, policymakers, tech workers, news media, advertisers, funders, and
the underlying Knowledge Base that informs the entire ecosystem.

We can improve the quality of the underlying Knowledge Base (researchers, academics, advocates,
activists) by promoting a culture of knowledge-sharing and collaboration.



All Tech Is Human
All Tech Is Human is an organization that is building the
Responsible Tech pipeline by informing & inspiring the next
generation of responsible technologists & changemakers. Our
aim is to improve our tech future by changing those involved
in it, making the pipeline more diverse, multidisciplinary, and
aligned with the public interest.

Our organization was launched in 2018 with our inaugural ethical tech summit in NYC. Since that time, we
have been building an expansive community across civil society, government, and industry, connecting the
people and organizations in the Responsible Tech ecosystem.

In September 2020, we released our Guide to Responsible Tech: How to Get Involved & Build a Better Tech
Future. The "Responsible Tech Guide" provides pathways for more voices to get involved, and is aimed at
college students, grad students, young professionals, and career-changers looking to be onboarded into
the Responsible Tech ecosystem. Our organization runs a Responsible Tech Job Board and is currently
exploring ways to expand our work.

Our COMMUNITY PARTNER for this report is TheBridge. TheBridge is a non-partisan community
connecting professionals across technology, policy and politics with a common interest in breaking down
silos and building stronger, more collaborative relationships.
We'd love to hear your feedback about our report and
have you involved. Request to join our Slack group, or
send us an email at Hello@AllTechIsHuman.org

You can find the most up-to-date version of our report at ImprovingSocialMedia.com

If you would like to join our newsletter, you can do so at AllTechIsHuman.substack.com

Stay on top of all the latest issues facing social media with our monthly livestream series with TheBridge

If you have an idea about how we can better meet our mission to build the Responsible Tech pipeline,
reach out! AllTechIsHuman.org
