NOT-EQUAL: EXPLORING SOCIAL JUSTICE IN THE DIGITAL ECONOMY
We have launched Not-Equal, a three-year £1.2m UK Research and Innovation Network+ project on Social Justice and the Digital Economy.
Not-Equal aims to bring together and resource collaborations between academia, industry, government and civil society to explore and develop innovative responses to issues of social justice in technology design and implementation.
The project will explore current issues around the ethics of technology, algorithmic bias, and how to create fairer working conditions in the platform economy.
Led by Newcastle University, in collaboration with Royal Holloway University of London, the University of Sussex and Swansea University, Not-Equal includes funding calls for collaborative projects and offers support for events and activities (symposia, workshops, hackathons, design sprints).
We wish to support transnational collaborations, give exposure to initiatives and projects that tackle the societal challenges digital innovation presents today, and make our digital society work for social justice.
Not-Equal already has more than 30 partners including Google, the BBC, and the National Innovation Centre for Ageing based at Newcastle University, as well as the Trades Union Congress and organisations such as Citizens Advice and Voluntary Organisations’ Network North East (VONNE).
It aims to build sustainable networks across academic communities (for example, social scientists collaborating with technology designers) and partners from industry and civil society, developing shared areas of interest and fostering new collaborations.
Challenges for Network+
The Network+ will run an open commissioning process and related activities to fund projects that address the critical challenges it has so far identified.
The challenges mirror public conversations around the gig economy, algorithmic bias, and digital security.
Algorithmic Social Justice
Algorithmic bias and the implications of using algorithms to make decisions have been making the news recently. However, when used appropriately, algorithms can have positive effects, saving time and money and spotting problems that humans would miss.
How can computers and their underlying algorithms help make the decisions that affect us all fairer?
Not-Equal has already run a number of events looking at how we can navigate the potential risks and opportunities of algorithmic service delivery.
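One way to make the question of algorithmic fairness concrete is a demographic parity check: comparing the rate of positive decisions an automated system gives to different groups. The sketch below is purely illustrative and is not part of the Network's methodology; the function name, the group labels, and the decision data are all hypothetical.

```python
# Minimal sketch of a demographic parity check: compare the rate of
# positive decisions across groups. All data here is made up for illustration.

def selection_rates(decisions, groups):
    """Return the positive-decision rate for each group label."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return rates

# Hypothetical automated decisions (1 = favourable outcome) for two groups.
decisions = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rates = selection_rates(decisions, groups)
# A large gap between groups can be a signal of disparate impact,
# prompting a closer look at the system's inputs and design.
gap = abs(rates["A"] - rates["B"])
print(rates, gap)
```

A check like this is only a starting point: a small gap does not prove a system is fair, and which notion of fairness matters depends on the context in which decisions are made.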
Digital Security for All
Computers and applications should safeguard and protect the interests of everyone. As more of our private data is uploaded online, from our shopping habits to healthcare records, questions are arising about whether this data is being used ethically.
There are fears around data breaches, and cases such as the recent Cambridge Analytica scandal have brought the use of data in the political sphere to the forefront of many discussions.
What digital security models can ensure the safeguarding of all of us in this digital society?
Fairer Futures for Businesses and Workforce
Technology has irreversibly changed the way people work, and the rise of the gig economy is raising issues around workers' rights, sick pay and working hours. How can we ensure fair opportunities and working conditions for everyone in the future of work in our digital society?
For more information, or to collaborate with the Network, visit our website.
Does the use of personas within design processes prevent meaningful participation?
This project looks at the design considerations for developing a digital directory of domestic violence services in the North of England.
Working with participants in Brixton, London and Pallion, Sunderland, this study aims to better understand under what conditions smart city technologies can benefit city residents.