Practical insights for compliance and ethics professionals and commentary on the intersection of compliance and culture.

Round-up on compliance issues with online platforms: Instagram

This is the third in a series of six posts on compliance issues with various online platforms. The first post was about YouTube. Last week’s post was about Facebook. Today’s post will discuss Instagram. Next week’s post will focus on Twitter. On April 5, the fifth post in the series will cover Snapchat. The sixth and last post, on April 12, will be about Reddit.

The photo and video sharing social media service Instagram was created in 2010 and has been owned by Facebook since 2012. Instagram has grown massively in popularity, adding users exponentially year after year, and creating features, such as thematic hashtags and aesthetically curated content, that have driven enormous engagement and inspired imitation on other platforms.


Instagram and the internet’s code of ethics

Instagram is a very popular social media app based on sharing photos and videos, publicly, with selected users, and via direct, private message. It was launched in 2010 and since April 2012 has been owned by Facebook, another giant in the social media industry. In less than a decade of existence, Instagram has grown a very large and active community, where users can interact with their friends and “followers” as well as with other communities that maintain a presence there, public figures, media sources, and corporate brands.

The presence of all these wildly different groups from all over the world, sharing content and commentary on one platform, is exciting and promises many opportunities for collaboration. Along with these positive connections, though, come negative surprises and the possibility of challenges and abuses. With all the influence Instagram gains through its popularity comes responsibility for defining the standards and limits of the community, as well as for what it puts out into the internet and the world.

Instagram has faced its share of criticism for its efforts to implement and maintain effective controls and reporting mechanisms. Instagram relies heavily on users to report inappropriate content, such as posts depicting illegal activity or the use of “coded” hashtags and emojis to conceal and continue such practices. Understandably, even the most aggressive attempts to keep pace with this behavior on social media will fall behind quickly, leading to criticism that the community is unsafe. When Instagram is too proactive or overreaches in deleting comments, posts, or users, however, controversy about intrusions into privacy and expression begins in response.

Kevin Systrom, one of the original creators of Instagram and its current CEO, wants to strike this balance between protection from abuse and freedom of expression. Under his leadership, Instagram is dedicated to ensuring that the content and tone on the platform comply with its community guidelines. Changes to the comment sections on photos – including allowing users to filter out comments containing certain words, or to post photos with comments disabled – are intended to encourage safer self-expression by posters who might otherwise fear harassment or offensive responses below their photos.
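The keyword-based comment filtering described above can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not Instagram’s actual implementation; the blocklist and function names are assumptions made for the example.

```python
# Minimal sketch of a keyword-based comment filter. The blocklist and
# matching logic are illustrative only, not Instagram's real system.

def build_filter(blocked_words):
    """Return a predicate that hides comments containing blocked words."""
    blocked = {w.lower() for w in blocked_words}

    def is_hidden(comment):
        # Normalize case and strip trailing punctuation before matching.
        words = comment.lower().split()
        return any(w.strip(".,!?") in blocked for w in words)

    return is_hidden

is_hidden = build_filter(["spam", "scam"])
print(is_hidden("This looks like spam!"))  # True: contains a blocked word
print(is_hidden("Lovely photo"))           # False: no blocked words
```

Even a sketch this simple shows the limits of the approach: it catches exact words but not misspellings, coded slang, or context, which is why platforms layer machine learning on top of plain word lists.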

Platforms such as Instagram, of course, can never be neutral. Any technology’s relationship with its users is fraught with moral concerns, starting with the ethics of its design, and made only more complex by algorithms, automated accounts, and real users whose decisions about what content to share and promote run the gamut from universally appropriate to offensive, harassing, or even illegal. In such a context, applying a code of ethics is a very hard task, but perhaps that inherent difficulty is exactly what makes it so important to try.

Creating filters and tools to hide and promote, prevent and engage – whether deployed by community management behind the scenes or elected by users – is just the beginning of the design choices Instagram’s engineers have made to mount technical responses to problematic tone in some corners of the platform. Instagram also deploys artificial intelligence to help sort real posts from fake ones and to learn from its data, using a technique called word embeddings, which represents words as numerical vectors so that the system can recognize when an otherwise innocent comment is abusive in context. AI has its limitations, of course, but in any rules-based approach to governance it is necessary to start with something good and improve it continually, rather than leave risks unaddressed in hopeful pursuit of the best.
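The word-embeddings idea mentioned above can be made concrete with a toy example: each word becomes a vector of numbers, and geometric closeness between vectors approximates semantic similarity. The three-dimensional vectors below are invented for illustration; real systems learn embeddings with hundreds of dimensions from large text corpora (for example, word2vec or GloVe), and nothing here reflects Instagram’s actual models.

```python
# Toy illustration of word embeddings: words become vectors, and
# cosine similarity between vectors approximates semantic similarity.
# These vectors are made up for demonstration purposes only.
import math

embeddings = {
    "idiot": [0.90, 0.80, 0.10],
    "moron": [0.85, 0.75, 0.15],
    "photo": [0.10, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Insulting words sit close together in the vector space...
print(cosine(embeddings["idiot"], embeddings["moron"]))  # high, ~0.99
# ...while an unrelated word sits far away.
print(cosine(embeddings["idiot"], embeddings["photo"]))  # low, ~0.30
```

This geometry is what lets a classifier generalize: a slur it has never seen may still land near known abusive words in the embedding space, and surrounding words supply the context that distinguishes an innocent comment from an abusive one.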

Time will tell how effective Instagram’s efforts to make the platform a safer place for expression really are, and what they will actually produce: a place open to creative sharing and communication but closed to toxicity and abuse, or a censored, sanitized, disingenuous photo collection where self-expression is restricted and speech is suppressed? Perhaps Instagram will succeed in going against the tide of the internet and much of modern life, where the level of social discourse has sunk low, tinged with anger and darkened by people’s worst impulses, and make a place where the conversation can be a bit more civil, even if it has to be filtered first to get there.

For more detail on Kevin Systrom’s ambition of making Instagram a safe haven and role model platform on the internet, and the challenges that both motivate and complicate this mission, see Nicholas Thompson’s story on Wired.


Round-up on ethics of design in technology

One of the most interesting and challenging inquiries in the evolving ethical code of technology has to do with design choices. Ethical decision-making and process design have a direct impact on the fluid, complex process of creating the devices, interfaces, and systems that are brought to market and used by consumers on a constant basis. In such a disruptive and innovative industry, there are moral costs for every design decision: every new creation replaces or changes an existing one, and for everyone who gains new access or benefits, others bear the costs of these decisions. Therefore the ethics of design as applied to technology and, of particular interest, social media have concrete importance for everyone living in a world increasingly dominated by user experiences, communities’ terms of service, and smart devices.

  • Former Google product manager Tristan Harris has gone viral with his commentary on the ethics of design in smartphones and the platforms creating apps for them. There is a tipping point in online design where platforms go from being useful and intuitive to encouraging interruption and even obsession. Many people worry about the effect “screen time” may have on their attention span, quality of sleep, and offline interactions with people. Design techniques may actually keep people attached to their devices in a constant loop of advertisements, notifications, and links, as content providers and platforms compete to grab viewers’ attention. Alerting people to the control their devices have over their attention and time is one step, but urging more ethical choices in the design process is the next frontier for innovation reform:  Our Minds Have Been Hijacked By Our Phones. Tristan Harris Wants To Rescue Them.
  • The above phenomenon of addictive design has become so embedded in the creation of app features that even the most subtle changes can have a huge impact on the consumption practices of users. But when do features go from entertaining and user-friendly to compulsive, even addictive? Refreshing an app can be like pulling the lever on a slot machine, rewarding the brain with new content to keep the loop going at the expense of other activities and priorities. These design improvements, then, may function less as improvements than as manipulations:  Designers are using “dark UX” to turn you into a sleep-deprived internet addict
  • These small, ongoing redesigns are intended to make apps more readable and consumable, rendering content more captivating and enabling longer browsing – again prompting the question: what is the ethical code for the control designers wield over users with these choices? From a design ethics perspective, these small changes can be more alarming than major ones, because they are so incremental that many users do not consciously notice them, and “optimization” tips into “over-optimization,” with meaningful interaction possibly becoming destructive:  Facebook and Instagram get redesigns for readability
  • Artificial intelligence always captures the public’s imagination – thrills and fears about the possible developing capabilities of robots and predictive algorithms that could direct and define – and perhaps threaten – human existence in the future. AI has been developing in recent years at a breakneck pace, and all indications are that this innovation will continue or multiply in the coming period. The science fiction-esque impact of AI on society will grow and bring with it all kinds of ethical concerns about the abilities of humans to define and control it in a timely and effective way:  Ethics — the next frontier for artificial intelligence
  • Social media platforms have developed into social systems, with all the dilemmas and dynamics that come along with that. These networks may face the choice between engagement and all of the thorny dialogs that come with it, and a simpler, more remote model that can be enjoyable but is less interactive and therefore, perhaps, less provocative:  ‘Link in Bio’ Keeps Instagram Nice

Queries into design ethics and choice theory in technology, especially social media, ask what human experience will become in a world that is increasingly digitized and networked. The design decisions made in creating these devices and systems require an ethical code and a sense of social responsibility in order to define the boundaries of the best collective choices.
