By Luis F. Yanes
This blog originally appeared on SLSA Blog.
From Aristotle to contemporary thinkers, many have suggested that there is a human instinct to produce and to enjoy artistic experiences or expressions. But how do we define such a natural instinct? When we really enjoy something – something we believe to be well done and particularly beautiful – like a car, a house, a table, or even a person, we tend to refer to it as ‘a piece of art’, highlighting a characteristic that distinguishes it from everything else. Is art, then, beauty?
By Vivian Ng
Each week the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre, prepares an overview of related news stories from the week. This summary contains news articles from 25 May – 1 June 2018.
You can follow the HRBDT Project on Twitter: @hrbdtNews.
By Vivian Ng
The House of Commons Science and Technology Committee recently released the Fourth Report of Session 2017–19 on ‘Algorithms in decision-making’. The release of the Committee’s findings and recommendations for the government is particularly timely, following recent revelations regarding Cambridge Analytica and Facebook and the increasing recognition that these issues extend far wider. This post unpacks how human rights have featured in the Committee’s analysis, and argues that human rights should underpin and be central to the understanding of how algorithms affect individuals and groups in society, as well as to the responses designed to address the resulting risks and challenges.
By Lorna McGregor, Daragh Murray, and Vivian Ng
This blog originally appeared on The Conversation
Whether or not you realise or consent to it, big data can affect you and how you live your life. The data we create when using social media, browsing the internet and wearing fitness trackers are all collected, categorised and used by businesses and the state to create profiles of us. These profiles are then used to target advertisements for products and services to those most likely to buy them, or to inform government decisions.
Big data enable states and companies to access, combine and analyse our information and build revealing – but incomplete and potentially inaccurate – profiles of our lives. They do so by identifying correlations and patterns in data about us, and people with similar profiles to us, to make predictions about what we might do.
But just because big data analytics are based on algorithms and statistics does not mean that they are accurate, neutral or inherently objective. And while big data may provide insights about group behaviour, these are not necessarily a reliable way to determine individual behaviour. In fact, these methods can open the door to discrimination and threaten people’s human rights – they could even be working against you. Here are four examples where big data analytics can lead to injustice.