
Elements of Digital Ethics

A card deck with 32 elements illuminating different aspects of digital ethics. Used in workshops to spark discussion and reflection on how digital services affect individuals and society.

  • Organisation 🤷

    Lack of Ethical Ownership

    When nobody in a company or organisation has the explicit responsibility of ensuring ethical consideration in day-to-day operations, any initiatives will be unstructured and vulnerable to neglect. Lack of ownership makes organisations complicit in activities that lead to negative impact.

    Who should bear ultimate responsibility for ethics and care for wellbeing? How is this work planned and carried out? How do we follow up?

    or01
  • Organisation 🧽

    Ethicswashing

    Ethical codes, advisory boards, whitepapers, awards and more can be assembled to provide an appearance of careful consideration without any real substance behind the gloss. Ethicswashing may also direct attention towards investments and donations in goodwill projects, which upon further scrutiny prove to be of little value with regard to reaching the expressed goal of mitigating the negative impact of the digital services and products themselves.

    How do we ensure that the efforts we make are meaningful? How can we tie these efforts more concretely to our actual work?

    or02
  • Organisation 👀

    Monoculture

    Through the lens of a small subset of human experience and circumstance it is difficult to envision and foresee the multitude of perspectives and fates that one new creation may influence. The homogeneity of those who have been provided the capacity to make and create in the digital space means that it is primarily their mirror-images who benefit – with little thought for the wellbeing of those not visible inside the reflection.

    What barriers exist to involving more voices and perspectives in the work? Whose perspectives are needed, and when and where in the process? Where do we find these perspectives and how do we invite them?

    or03
  • Organisation 👨🏻‍💼

    Power Concentration

    When power rests with a few, their own needs and concerns will naturally be top of mind and prioritized. The more their needs are prioritized, the more power they gain.

    Who holds power over development decisions? Can we shift or broaden that power in any way? What transparency exists around decision-making?

    or04
  • Organisation 🫣

    Unaccountability

    There are very few mechanisms in place to hold the creators of digital solutions accountable for any number of real-world harms instigated by the design and development of tech services and products. Often both machine-makers and lawmakers are unaware of harm. At other times the machines themselves are perfect scapegoats.

    How do we ensure it is possible to take responsibility for the effects and consequences of our operations? How do we want things to work? Should it be possible to hide in order to escape accountability?

    or05
  • Organisation 😵‍💫

    Digital Gaslighting

    An advantage in knowledge makes it possible to frame concerns and hesitancy as the fault of the user or someone else. When users are harmed they are sometimes led to believe it is by their own doing, and deserved.

    When, where, and how might users misunderstand technology to their own detriment? When do they blame themselves? How do we give users the ability to understand better and/or signal their misunderstandings?

    or06
  • Organisation 📦

    Digital Obfuscation

    It is easy to hide far-reaching agreements and terms in far-from-intuitive digital locations and in walls of content obscured by deceptive language or innocent phrasing.

    How do we ensure that people can easily find and understand the rules that apply, including their own responsibilities? How can we use design to make risks and important decisions more visible without needing to refer to terms and conditions? How do we remind users over time or when circumstances change?

    or07
  • Organisation 🧷

    Naive Recklessness

    In the wake of overconfidence and a naive understanding of both the value of data and the security of online storage, personal information is often managed haphazardly and ends up accessed by problematic actors.

    When, where, and how is information stored? What information is stored and what does not need to be stored? How do we respond when something goes wrong?

    or08
  • Machine 🤖

    Algorithmic Injustice

    Nothing that is built can be considered neutral. Everything created mirrors the values and beliefs of the creator, consciously or not. We cannot escape power imbalance and we cannot escape prejudice, as these are necessarily coded into the automated decision-makers. Worryingly, the prejudice and its impact can easily be made exponentially more efficient, essentially increasing the reach and severity of harmful outcomes.

    How do we know which biases may be amplified? How do we know when biases go too far? What level of uncertainty can we accept? How much transparency must there be in how the solution works? Who can help us understand the complexity?

    ma01
  • Machine 👾

    Invisible Decision-Making

    The more complex the algorithms become, the harder they are to understand. As more people become involved, time passes, integrations with other systems are added and documentation falls short, they deviate further from human understanding. Many companies will hide proprietary code and evade scrutiny, sometimes themselves losing sight of the full picture of how the code works. Decoding and understanding how decisions are made will be open to ever fewer people.

    Who needs to know how the code works? How is transparency in the code ensured? How do we handle code that cannot be understood? What risks of invisible code exist that more people need to understand?

    ma02
  • Machine 🧠

    Psycho-Engineering

    With infinite interconnected data points about individual biology and behavior the machines will know more about us than we ourselves could ever know. This allows for prediction and modification of behavior in unprecedented and unpredictable ways. While behavioral economics still has an undercurrent of attention to human consent, when authority shifts from human to machine it will be ever more difficult to determine who is in charge. It's one thing for persuaders to be hidden, another for them to hack humans with razor-sharp precision, all the while having humans believe they are acting autonomously.

    What behaviors can be amplified or suppressed? How might relationships to society, friends, and family change? How do we know what is happening, and how should we draw attention to risks and effects?

    ma03
  • Machine 🦠

    Viral Reproduction

    Anything created digitally can have an exponential impact far beyond the creator's control.

    What could be positive or negative viral effects? What can we do to proactively limit viral properties that cause the organisation to lose control? How can we manage harmful viral effects?

    ma04
  • Society 📏

    Rule of Quantity

    Faster, bigger and more efficient have never been easier to measure, and anything that can be awarded a number, rank or score often is. This is rarely out of necessity, or with a strategic end in sight, but more commonly because "it's possible". Humans are more often judged by a number, a numbers game that now permeates business, medicine, physical exercise and even the quantity of friends.

    What do we measure and why? What do we not measure? How can what we measure lead us astray? How do we notice over time that our measurements may need to change?

    so01
  • Society 🧑‍💻

    Technosolutionism

    When a new societal challenge appears, there is widespread belief and determination that digital technology should be used to solve it. This dogma is quick to dismiss non-digital suggestions. Ingrained in technosolutionism is of course a deep conviction that technology can solve problems, but also often a glaring disregard for technology's ability to create problems.

    What problems do we create by going digital? Which parts could work without digitalisation? What are we converting to code that could actually be eliminated instead?

    so02
  • Society ⏰

    Attention Economy

    Modern society is a competition for attention, waged in ways that strive to create opportunities for attention where there previously were none. This makes it profitable and desirable to push for less sleep, support less time with loved ones, encourage multitasking and engage in behavior tracking and modification.

    How do we affect attention at a small and large scale? What portion of the attention required or created promotes or undermines wellbeing? Can we contribute to healthy use of attention?

    so03
  • Society 📡

    Surveillance Capitalism

    The more we know, the more we can persuade. Data about humans – coupled with their attention – allow organisations to target people when they are at their most vulnerable to specific messaging. This vastly increases the chances of a desired behavior or purchase taking place. The one that knows the most has the most power and can charge the highest prices. The market rewards them rather than questioning the surveillance required to gain that power.

    What information that we store or collect gives us power over people? What should we do with that information? What should we not do?

    so04
  • Society 🧌

    Rampant Harassment

    The platforms built for greater numbers and greater attention also empower each and every person with enhanced abilities for bullying, provoking and tyrannizing at scale and for personal enjoyment. Digital harassment and terror can happen anonymously and 24/7, exposing a great number of people to trauma and suffering that is ever harder to evade as society demands their digital availability.

    How can we counteract harassment before it happens? How can we detect harassment that arises on, or with the support of, our services? How do we respond when harassment occurs?

    so05
  • Human 🛑

    Exclusion

    Language, citizenship, physical ability and a multitude of other reasons exclude people from equal access to digital services. And yet, even though digital solutions carry the potential to include nearly everyone, they are still designed in ways that keep the proverbial door closed. Organisations choose to dismiss human beings on the grounds of 'edge cases', financial constraints or standardized ways of working.

    Who do we exclude and why? Who do we not consider as a target audience? What prevents us from including more people, and what might we gain by doing so?

    hu01
  • Human 🤐

    Voice Suppression

    Rarely heard voices are not sought out to participate in digital development. Instead the already strong voices are heard and enabled and find their needs met, while the silent voices fade even more. The suppressed voices, not invited to the platforms, find themselves without one. When they are invited, some people will still struggle to communicate equally well on digital platforms that require a certain level of experience and aptitude in writing. Uncertainty and anxiousness are more easily identified in physical meetings, while online they can go unnoticed.

    Which users exist who may be uncomfortable? Whom are we missing in our surveys and follow-ups? Whose voices are loudest and may be giving a misleading picture of what works well or less well?

    hu02
  • Human 🛋️

    Abuse Enablement

    Technology designed to control things proves itself equally capable of controlling beings. In abusive relationships the abuser now wields power over lights, doors, communication, cameras and location-enabled devices. The systems fail to provide protection for those who find themselves targeted by tech-enabled abuse.

    At home, at work and in public spaces, who is given more power and who is stripped of power? How can technology be used for abuse and how much do we contribute to an increased risk? What measures can reduce, counteract, or eliminate abuse?

    hu03
  • Human 😶‍🌫️

    Privacy Loss

    To gain access to services, paying with personal information has become the norm, often accepted without a second thought. Understanding of how personal data is stored, how much is given away and how it is used is generally low. At some point the experience evolves into resignation: accepting the terms not because the deal is good but because the exchange has been normalized and there is a sense of being too late for regrets. There is a concerning tolerance of sharing not only your own data but also the data of friends and family. And the desire to understand more appears to dwindle with time.

    How do we affect personal privacy, both for users and non-users? How aware are people of how sensitive their information is, how it is handled, what risks exist, and how it can be misused? How can we help people both understand and feel trust?

    hu04
  • Human 👨🏼‍🎤

    Behavior Suppression

    Anything we do and say can end up on the Internet. Anything we do and say in public can be recorded. Anything we do and say at work can be measured to determine performance. We know our purchases end up in a database because rewards programs tell us what we buy. Our locations are tracked. Meeting software can gauge our "attention". The freedom to act is always inhibited by apprehension of how that act will be perceived out of context in the eyes of an external assessor. The number of assessors keeps rising.

    How do people's behaviours change given that information about them is visible or being processed? How could information be misinterpreted by other parties and contribute to harmful outcomes?

    hu05
  • Human 🫆

    Biometric Abuse

    The collection of biometric data is on the rise, with little regard for the permanence, and thus immense delicacy, of this data. Once lost, biometric data can rarely be reclaimed, and it can be used to identify people even where anonymity has been promised. The continuous collection of vast amounts of data about humans is creating biometric markers where many did not expect them: beyond fingerprints, face recognition and gait, there is already use of heartbeat recognition and keystroke biometrics.

    Can we avoid collecting biometric information and still achieve the purpose or function? How do we best guide people in decisions and interactions that involve collecting biometric information? What types of stored biometric information do we need to disclose?

    hu06
  • Supervision 🧱

    Permanent Impact

    Many of the problems in the digital space are triggered by actions that are not easily reversed. When data has been shared, it can be copied endlessly. When a fingerprint has leaked, it cannot be changed. When a machine learning algorithm has been implemented, few can understand it and even fewer can change its behavior. Additionally, the values of today are coded into decision-making systems with no regard for how the values of societies change over the span of a few decades. The prejudice of today is what we are teaching the digital systems that make decisions tomorrow.

    What permanent characteristics make it difficult to change system behavior in the future? What fundamental pillars are needed for the product or service to function (both technical and social)? How can we enable product changes going forward as values, working methods, knowledge, and the environment shift?

    su01
  • Supervision 📚

    Education Defects

    Integrity, ethics and privacy laws are rarely on the curriculum of the training programs that have given us the digital professionals of today. Often programming and design skills are self-taught or gained through online courses with no cross-disciplinary teachings. While demand for digital developers and makers continues to rise, employers pay little attention to an ability to care for, and reason around, human wellbeing.

    How can digital creators be better equipped to make decisions that affect the wellbeing of people, communities, and nature? Are there particular areas of ethics, philosophy, and human rights that are relevant to what we do? Can we in some way encourage greater consideration of these questions?

    su02
  • Supervision 🧑‍⚖️

    Regulation Defects

    Law and policy often attempt to target each new technology individually while missing the bigger picture. Change happens at a higher rate than regulation can manage, which suggests that regulation should begin to manage the rate of change itself and the conditions for change. How many times during a lifetime should a citizen need to relearn how to apply for a job, buy a bus ticket or book a doctor's appointment – and who should bear the cost of that change?

    What laws are we subject to? How does oversight function in our area? What regulatory gaps can we identify that affect the organisation, the individual, and/or society at large? Can we build trust by describing how we ourselves act in light of these insights?

    su03
  • Supervision 📯

    Private-Public Infrastructure

    Many countries have moved towards the informal acceptance of privately-owned services as platforms for public discourse and free speech. Still, control of the platforms and their participants is determined by the private companies and not by the state, and equal access laws rarely apply to private companies to the same extent as to the public sector. This creates many grey zones and confusion with regard to accountability, accessibility and responsibility, all while affecting people's ability to participate in an open society.

    What happens if we are regarded as infrastructure for communities or entire nations? What do we need to take responsibility for? What do we need to clarify that we do not take responsibility for? What do we want to happen and how do we move in that direction?

    su04
  • Supervision 🗺️

    Geographic Resistance

    When trying to manage digital services, governments are to a large extent limited by physical boundaries and jurisdictions that the digital systems are happy to ignore. Many countries find their citizens adopting tools and services that are difficult to influence through national or regional law.

    To what extent do we account for varying requirements and laws based on region and country? When do we block content and what consequences might that have? How do we help users understand how features may differ depending on their location? When people travel, what guidance do we provide?

    su05
  • Environment 👨🏾‍💻

    Worker Exploitation

    In services where millions of users convene to publish content there is always a percentage of harmful and illegal content that needs to be filtered and removed. To manage this, low-wage workers may have to sift through hate speech, violent attacks against animals and graphic pornography. Workers are also endangered by the very knowledge of the harm their work contributes to, while duress or external pressure makes it difficult for them to refuse to contribute to that harm.

    How do we take responsibility for the people behind the scenes? What do we do to ensure that suppliers respect wellbeing and dignity? What do we require of suppliers and of ourselves? What do we make visible and what do we hide?

    en01
  • Environment 🪏

    Supply Chain Neglect

    To realise digital services and solutions we need hardware and software. Behind the production of these there can be any number of oppressive relationships and exploitation of workers. One example is the need for cobalt, used in lithium batteries, which is often mined under unfair conditions in oppressive environments reminiscent of slavery. Failing to consider one's own part in the supply chain required to deploy digital services is to ignore accountability for potential harm to others.

    What do we know about the components and materials required for our own development and for our products? How can we be more transparent, impose requirements, and act to reduce and eliminate suffering?

    en02
  • Environment ⚖️

    Unequal Access

    Only slightly more than half the world's population has access to the Internet, and within different countries access to the Internet and to specific services will vary. In some countries the Internet is systematically used as an oppressive force by closing and opening access during unrest. Although the Internet is often talked about as open, it is important to recognize how it is closed to a significant number of people around the world, and to reflect on the consequences of ignoring them.

    Who is excluded from access and from the ability to use our services? How is the balance of power changing and worsening? Is there interest in reaching more people, and what would that look like and how would it work?

    en03
  • Environment 🏭

    Ecological Neglect

    As digital services grow they demand more energy, more transportation of goods, and data centers that displace physical land and communities. It is easy to overlook the overall environmental impact of creative work in the digital industry when it is not being measured and other measurements take precedence. As always, not all variables can grow for the better at the same time – and we need to be aware of what grows for the worse based on our actions.

    What improvements and deteriorations does our work contribute to? What do we measure and what do we take into account? How could we better assess if our work improves something without worsening other things?

    en04

v1.3.2