Vienna Business Agency
A Plea for Digital Humanism: How we can set the correct course for the future
1. The system is failing!
Ground-breaking technological innovations are causing fundamental changes to our society, both for the better and for the worse.
Our ability to reflect and create is what sets us humans apart from other species. We love to develop ideas and are proud of our new achievements. At the same time, however, we are afraid of change. There are good reasons for this. History has taught us that ground-breaking innovations also often have unintended consequences that present us with great social challenges.
In this sense, the invention of the printing press in the 15th century not only laid the foundation for emancipation and enlightenment, but also served as an instrument for propaganda and thus became one of the precursors of the growing polarisation of society that culminated in riots and mass murders during the Thirty Years' War (Berens, 2008).
At the end of the 18th and the beginning of the 19th century, steam propulsion emerged and became a game-changer. The use of steam engines increased productivity and made it easier to travel and to trade goods around the world. At the same time, however, it led to structural unemployment and growing inequality and was thus, among other factors, one of the triggers of the two World Wars of the 20th century (Perez, 2017).
Without steam engines, the industrial revolution would not have been conceivable in the way it took place. Coal was the fuel for this increasing industrialisation. It is true that coal has become less important, but the overall consumption of fossil fuels has continued to rise until the present day. The greenhouse gases that are released this way have led to the climate crisis, which is the greatest threat humanity is currently facing.
Microchips have ushered in radical changes since the 1970s. The Internet has accelerated this development further. As a global computer network for the decentralised operation of various services, it has created fundamental change in the economy and in our society since the 1990s. It is now easier than ever to overcome geographical borders, we have access to unlimited knowledge, our everyday life has become even more convenient in many ways and some groups have seen the opening up of completely new opportunities to participate in and shape society.
Internet pioneers saw in this technological innovation the foundation of a completely new freedom. The Internet was to become the open platform on which information was shared free of charge and monopolies of power could be undermined. It was driven by a counter-culture and characterised by a very strong sense of freedom. In his much-acclaimed article “How the Hippies Destroyed the Internet”, Moshe Vardi (2018) describes how the lack of transparent market mechanisms led to the development of opaque, web-based business models. Precisely because the Internet does not belong to anyone, is provided to us all at no cost and therefore cannot be monetised as such, business models were developed that aim to maximise clicks and increase advertising value while the information appears to be “free of charge”.
This indirect monetisation of the Internet forms the basis for many worrying developments. Fake news is spread across social media and undermines democracy. Filter bubbles distort our perception. Digital surveillance technology leads to a loss of privacy, and companies operate opaquely while consumers are rendered transparent (Strassnig et al., 2019). Large technology companies use algorithms to influence our behaviour in order to make even more money. With their digital marketplaces and platforms, many of the most profitable companies have secured a worrying monopoly position on the Internet and beyond.
Given these developments, Tim Berners-Lee, the inventor of the World Wide Web, attracted attention in 2017 when he warned that “the system is failing”.
It almost seems as if digital technologies have taken over and are shaping our society and our culture today. For decades now, dystopian magazine headlines have regularly issued alarming predictions that robots and artificial intelligence will take over. However, history has taught us that inventions and technological developments do not just happen, but are always the product of social culture. Or, as the French philosopher Gilles Deleuze (1987, p. 39) put it, “The machine is always social before it is technical”. Charlie Gere (2008) likewise provided evidence that digital culture is not the product of digital technologies, but rather that digital technology is the product of digital culture. Society is not the victim of technological developments but the very thing that creates and shapes new technologies.
Therefore, it is up to us to mould digitalisation in such a way that it benefits us humans and does not lead to our downfall. Experts are increasingly calling for elected governments to play a strong role in the shaping of our digital future. The market will not take care of it for us.
2. What can cities do for a human-centred digitalisation?
The Republic of Florence is considered to be the birthplace of the Renaissance, and other Italian city-states soon followed. In general, cities were the main drivers of the transition to modernity in the 15th and 16th centuries. Once again, cities should now play a vital role in the Next Renaissance and could be the nucleus of a human-centred digitalisation.
The City of Vienna aims to shape the Next Renaissance in the digital transformation of society and the economy. With its Digital Agenda 2025, Vienna has set the framework for a Viennese way of digitalisation and, in a broad and, of course, digitally supported dialogue with representatives from companies, research institutions and civil society initiatives, has defined 12 principles as guidelines for its journey into the digital future. Equal opportunities, participation, focus on service, an open culture of error, gender equality, innovation, consolidation, sustainability, openness, cooperation with the regional economy, independence and security are to characterise the Viennese path of digitalisation.
This commitment of the City of Vienna is supported by the Vienna Manifesto for Digital Humanism, which leading scientists proclaimed in Vienna in May 2019. In it, they formulate a “call for reflection and action in the face of current and future technological development” and announce 11 key demands to academic communities, educators, industry leaders, policy makers and professional societies around the world.
The commitment to digital humanism also points the way forward for Viennese economic policy. The Viennese path to digitalisation was pinpointed as a central topic for economic policy in the new Viennese economic and innovation strategy WIEN 2030. Vienna is to stand for digital solutions that guarantee fairness, transparency, safety and self-determination. Vienna does not want to copy Silicon Valley, but rather quite the opposite. It wants to build on the fact that Vienna already has an excellent reputation around the world when it comes to cyber security, digital citizens’ rights, transparent and trustworthy artificial intelligence, assistive technology for older and handicapped people and the calculation of complex predictions for the future.
In this context, Vienna's rich tradition in art, culture and science provides inspiration and secures the basis for a vibrant and democratic society of the future. Art and culture create social spaces that need to be open and accessible to the population as a whole and in all its diversity (Stadt Wien, 2021). Through the intersection of culture and technology in the creative industries, new points of access as well as innovations are created. This is at the centre of our focus on Culture & Technology in the creative industries, in accordance with the motto of Digital Humanism (Vienna Business Agency, 2022).
As employees at the Vienna Business Agency, we ask ourselves if it is possible to position Vienna as a pioneer for digital humanism in the international race for talent and business ideas and to differentiate Vienna from other emerging centres of innovation.
In this article, we discuss concrete developments in which digitalisation brings dynamic and sometimes also worrying changes. We believe that we have collected noteworthy ideas and concrete initiatives that show how the course for the future can be set correctly in the spirit of digital humanism.
3. Who owns the data?
If we set the course correctly, then people will gain sovereignty over their data.
Data has become a central building block of capitalism in the digital society. In her book “The Age of Surveillance Capitalism”, Shoshana Zuboff (2019) describes very vividly how companies have commercialised data about our behaviour and created a market for it. The ubiquity of “smart” technology today, from loudspeakers to children's toys, makes it possible to collect conversations, patterns of behaviour, habits, emotions and more. Some of this data really is used for product improvements, but smart assistants also collect and process data that goes well beyond that. This data, which Zuboff calls “behavioural surplus”, is converted into prediction products, that is, calculations concerning a person's next actions. In this way, humans become a free commodity for the data economy through their behaviour on the Internet. Companies like Google, Amazon and Facebook (Meta) generate billions based on their behavioural predictions.
However, it is not only predictions that are commercialised but also the modification of behaviour by smart technology. In some cases, social media deliberately fosters addiction and influences elections and purchase decisions. This happens largely in secret and is invisible to users. Surveillance capitalism was created as a business model to monetise the use of the “free of charge” Internet (Zuboff, 2019).
The European Union already reacted to this development in 2016 with its General Data Protection Regulation (GDPR) to protect users from surveillance capitalism. The Regulation has been directly applicable in Austria since 2018 and is therefore binding for all organisations that process data. It mandates the implementation of Privacy by Design and Privacy by Default and guarantees private individuals the right to obtain information about their data and to request its deletion. The GDPR imposes high fines in the event of violations.
Politically, this regulation is a great success, attracting a lot of attention worldwide and serving as an example for other regions. Further regulations have followed or will likely follow. The Data Governance Act and, above all, the Digital Services Act will further strengthen the state's right of intervention and create more transparency.
We see this increase in state intervention as a positive development. These regulations are necessary, but not yet sufficient. Since the GDPR came into effect, we have observed that companies make the necessary declarations of consent as complicated as possible, so that users of Internet services authorise the use of all their data without even knowing how it is processed. Rights and obligations concerning the handling of data are transparent in theory, but in practice they are unknown, and data protection continues to be a challenge for individuals.
At the same time, data protection limits innovative business models and is also often seen as an obstacle in research. It prevents researchers from breaking down data silos and exchanging data from different sources to analyse complex correlations. While the implementation of GDPR has presented small and medium-sized companies with great challenges, large companies hired lawyers who adapted their terms and conditions in such a way that they hardly needed to adjust their data handling in practice.
Further technical and non-technical approaches are therefore necessary to ensure the consistent protection of private data. One interesting proposal comes from the Austrian association OwnMyData, which considers itself a data sovereignty enabler for individuals and companies. The association recommends making consent forms machine-readable and also provides a technical solution for this, with which individuals can record how they want their data to be handled. An algorithm compares these preferences with the machine-readable information about how a service processes data. If the preferences cover the service's declared processing, consent is given automatically; otherwise, consent has to be given or refused manually, as in the sketch below. This can simplify the handling of complicated declarations of consent. Unfortunately, it is still rare for providers to publish their data processing information in a machine-readable manner. Additional state regulation may be required.
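A minimal sketch of how such automated consent matching could work. The preference and policy fields below are illustrative assumptions, not OwnMyData's actual data format:

```python
# Hypothetical consent matching: compare a person's stored preferences with a
# service's machine-readable processing declaration and decide automatically
# whenever the declaration stays within the stored preferences.

user_preferences = {
    "allowed_purposes": {"service_provision", "product_improvement"},
    "allow_third_party_sharing": False,
    "max_retention_days": 365,
}

service_policy = {
    "purposes": {"service_provision", "personalised_advertising"},
    "third_party_sharing": True,
    "retention_days": 730,
}

def evaluate_consent(preferences: dict, policy: dict) -> str:
    """Return 'grant' if the declared processing fits the preferences, else 'ask_user'."""
    if not policy["purposes"] <= preferences["allowed_purposes"]:
        return "ask_user"  # the service wants purposes the user has not pre-approved
    if policy["third_party_sharing"] and not preferences["allow_third_party_sharing"]:
        return "ask_user"
    if policy["retention_days"] > preferences["max_retention_days"]:
        return "ask_user"
    return "grant"  # every declared practice is covered by the stored preferences

print(evaluate_consent(user_preferences, service_policy))  # -> ask_user
```

In this example the service declares personalised advertising and third-party sharing, so the request falls back to a manual decision; a service that stayed within the stored preferences would receive consent automatically.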
Citizens also need corresponding data competence in order to use and share their data in a self-determined manner. The state is therefore also challenged to ensure that the publicly funded education system teaches humanistic values, knowledge about the mechanisms of the data economy and skills in handling data. Currently, many large companies benefit from the lack of data competence in the population.
Data sovereignty is the objective. This means the right to self-determination, the right of each individual to determine when, for what purpose and at what price his or her data can be used. Data sovereignty also enables full control over people’s data – regardless of whether this is personal data or the data of a company.
Still, data sovereignty should also promote innovation by encouraging self-determined data exchange and a modern data economy. Data donation is an exciting idea that allows people to actively support civil society initiatives, research projects and innovative business ideas by sharing their data. Data that is provided voluntarily can lead to ground-breaking insights and thus improve the lives of all.
We also find the MyData initiative noteworthy. It is a global association of scientists, lawyers, software developers and political activists who wish to drive a human-centred data economy. MyData aims to advance the discourse on data sovereignty with position papers, concrete case studies and regular meet-ups at the regional hubs: “The core idea is that we, you and I, should have an easy way to see where data about us goes, specify who can use it, and alter these decisions over time.” (MyData)
4. Open as a matter of principle?
If we set the course correctly, the many will benefit from new insights and the data that is collected.
Open Data is data that is made available for free use in a standardised and machine-readable form. It may be used, distributed and re-used for any purpose. Personal data and other sensitive data cannot, by definition, be Open Data. There are similar, sometimes overlapping concepts in the areas of Open Source, Open Hardware, Open Educational Resources and Open Access in the scientific sector. The term Open Government Data relates specifically to the public sector, and cities are at the forefront of this movement.
Open Data and Open Government Data provide great additional social value. Positive social, political and economic effects, in particular, have been demonstrated. Open Government Data strengthens transparency, promotes the provision of innovative public services, favours the participation of citizens in political processes and plays an important role in the generation of social capital in a society. Company founders develop new business models that would not have been possible before, administrations offer central databases that can be used commercially, scientific work can be accelerated, and processed data improves decision-making processes by various stakeholders (Gurin, 2014). The concrete economic benefits are also well studied: the study “The Economic Impact of Open Data” (Huyer & van Knippenberg, 2020), initiated by the European Data Portal, calculated a market size of 184 billion euros for 2019 for the EU27 countries plus Iceland, Liechtenstein, Norway and Switzerland.
The relevance of Open Data and Open Government Data was recognised early in Europe. The European Union issued the Public Sector Information (PSI) Directive as early as 2003. The directive aimed to make more data from public administration accessible to the public. In 2019, it was replaced by the Open Data Directive, so that data from public companies and publicly funded research data is also covered. Moreover, the availability of dynamic real-time data and application programming interfaces (APIs) and the use of standard licences are promoted, and the charging principles are simplified and made more transparent.
The City of Vienna has taken on a pioneering role in the area of Open Government Data and was the first German-speaking city with a dedicated Open Data portal. More than 500 data sets provide detailed information on one-way streets, public transport, historical aerial images, measurements of air pollutants and Wi-Fi locations, to name just a few of the areas. The City of Vienna follows the guiding principle of “open by default”: all databases that are classified as public are published as Open Government Data. In the current ranking of the Open Data Maturity Report 2020, Austria is in the leading group of “trend setters”, in 7th place among the 27 European states assessed.
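As a brief illustration of what “standardised and machine-readable” means in practice, the sketch below queries an open data catalogue programmatically. Austria's central catalogue data.gv.at, which also lists Vienna's data sets, is based on the CKAN software; the exact endpoint URL and search term are assumptions for illustration only:

```python
# Illustrative sketch: search an open data catalogue via the standard CKAN API.
# The base URL below is an assumption; check the portal's documentation for the
# authoritative endpoint before relying on it.
import requests

CKAN_BASE = "https://www.data.gv.at/katalog/api/3/action"  # assumed CKAN endpoint

def search_datasets(query: str, rows: int = 5) -> list[str]:
    """Return the titles of data sets whose metadata matches the query."""
    resp = requests.get(
        f"{CKAN_BASE}/package_search",
        params={"q": query, "rows": rows},
        timeout=10,
    )
    resp.raise_for_status()
    return [pkg["title"] for pkg in resp.json()["result"]["results"]]

if __name__ == "__main__":
    # Hypothetical query for Viennese air quality measurement data.
    for title in search_datasets("Luftgüte Wien"):
        print(title)
```

Because the catalogue metadata and the data sets themselves follow open standards, such scripts can be reused across cities and countries with little more than a change of endpoint.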
There are numerous databases, particularly in public administration, that cannot be published because of data protection requirements. However, these data sets have a particularly high potential for society, for research and, with some caution, also for companies. Technologies that protect privacy but still allow evaluation can help to unlock this potential. Creating opportunities for targeted evaluation in trusted environments, with transparent and safe processes that guarantee the lawful processing of this kind of sensitive data, would have a major effect on the innovative use of this data. This would make it possible to use data as an instrument of public funding.
So-called data synthesisation is another approach. Here, AI technology learns the patterns and statistical correlations of a data set and then generates a synthetic, completely anonymous data set. This data set does not contain any personal information but remains highly statistically representative, so that, unlike with conventional anonymisation techniques, its quality for analytical purposes is retained. The City of Vienna is testing the creation of synthesised population register data in a pilot project.
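A deliberately simplified sketch of the idea, not the City of Vienna's actual pipeline: fit simple statistical models to an original table and sample a fully synthetic table from them. Production-grade synthesisers additionally model the joint dependencies between columns, for example with copulas or generative neural networks:

```python
# Toy data synthesisation: learn per-column statistics from an original table and
# generate new rows that contain no real person's record but preserve the marginals.
import numpy as np
import pandas as pd

def synthesise(df: pd.DataFrame, n_rows: int, seed: int = 0) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    columns = {}
    for col in df.columns:
        if pd.api.types.is_numeric_dtype(df[col]):
            # Numeric column: sample from a normal fitted to the observed mean/std.
            columns[col] = rng.normal(df[col].mean(), df[col].std(ddof=0), n_rows)
        else:
            # Categorical column: sample categories with their observed frequencies.
            freqs = df[col].value_counts(normalize=True)
            columns[col] = rng.choice(freqs.index.to_numpy(), size=n_rows, p=freqs.to_numpy())
    return pd.DataFrame(columns)

original = pd.DataFrame({
    "age": [34, 41, 29, 52, 47, 38],
    "district": ["1010", "1100", "1220", "1100", "1010", "1160"],
})
print(synthesise(original, n_rows=1000).describe(include="all"))
```

Even this toy version makes the trade-off visible: the synthetic rows are statistically plausible but correspond to no real individual, which is exactly what makes the approach attractive for otherwise unpublishable registers.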
Felix Stalder and Mélanie Dulong de Rosnay (2020) describe a promising approach with the Digital Commons, a kind of digital common ground. They point out that several parties are often involved in the creation of data and that it is not possible to simply allocate data to one person or organisation. Stalder (2020) names electric scooter sharing as an illustrative example: the data is created jointly by users, the mobility provider, other road users, the city and other stakeholders, yet it currently “belongs” exclusively to the electric scooter provider. At the same time, open data is processed by large companies and the resulting profit is privatised, even though many stakeholders have contributed to generating the valuable data. Stalder argues that data commons provide a third way to produce and use data collectively. In this process, the stakeholders involved need to jointly establish rules on how the data can be used.
5. Can we trust artificial intelligence?
If we set the course correctly, then digitalisation does not heighten human prejudices but rather guarantees transparency and fairness.
Artificial intelligence (AI) is a promising technology and is already used in various areas such as image recognition, human speech processing and assistance systems. When used correctly it can improve the common good and our quality of life. However, AI also has a dark side.
In general, applications that are based on AI use historical data, which serves as training data to teach the desired behaviour to the artificial intelligence. The rules upon which these decision patterns are based remain unrecognised and, above all, unexamined; this is what we call a black box. A lot of training data contains unconscious biases, which in practice leads to questionable decisions or predictions. As decisions by AI cannot be traced, there is a very serious risk that existing biases will be further exacerbated and, additionally, perceived as the seemingly objective decisions of a machine.
There are numerous concrete examples of this. Amazon used AI, at least in test settings, to prioritise applications from potential employees. The AI displayed a clear gender bias after being trained on the CVs of a predominantly male workforce: applications including the word “women's” were automatically rated poorly. As a result, applicants who had attended a “Women's College” had a substantially worse chance of succeeding in automated recruitment than other applicants (Dastin, 2018).
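A deliberately tiny, hypothetical sketch of this mechanism, with invented CVs and labels rather than Amazon's actual data or system: a model trained on biased historical screening decisions learns the biased signal and reproduces it on new applications.

```python
# Toy illustration of bias propagation: the historical labels encode a prejudice,
# and the trained model faithfully reproduces it as a seemingly objective score.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    "chess club captain, software engineering internship",
    "women's chess club captain, software engineering internship",
    "led robotics team, built a compiler",
    "led women's robotics team, built a compiler",
]
labels = [1, 0, 1, 0]  # 1 = shortlisted, 0 = rejected (biased historical decisions)

vectoriser = CountVectorizer()
model = LogisticRegression().fit(vectoriser.fit_transform(cvs), labels)

new_cv = ["women's debate club president, machine learning research"]
# The only discriminative token the model has learned is "women" -> low score.
print(model.predict_proba(vectoriser.transform(new_cv))[0, 1])
```

The point of the sketch is not the particular model but the pipeline: nothing in the code is malicious, yet the output inherits the prejudice baked into the training labels, and without transparency requirements nobody would ever see why.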
Another alarming example comes from courts in the USA that used the COMPAS software to estimate the probability of reoffending. Trained on historical data, the software evaluated white defendants more positively than black defendants (Tashea, 2017). The software was supposed to produce neutral decisions, but in practice people's racist and misogynistic biases were further exacerbated by the algorithms and given a veneer of objectivity.
Strong regulation is therefore needed here as well, so that AI is able to enhance the common good and our quality of life. The European Commission has taken on a pioneering role here, too, and formed an expert group to create guidelines for trustworthy AI (HLEG, 2019). According to these guidelines, AI must comply with all applicable laws and regulations, follow ethical principles and remain robust with regard to technical and social issues. The ethical principles are based on fundamental rights: respect for human autonomy, the prevention of harm, fairness and explicability. Concrete requirements for trustworthy AI are the primacy of human decision-making, technical robustness, protection of privacy, transparency, ecological and social well-being, non-discrimination and fairness, as well as accountability.
The next step is to enshrine these principles in law. The European Commission submitted a first draft regulation on the use of AI in spring 2021 (European Commission, 2021). The proposal aims to create a legal framework for trustworthy AI and follows a risk-based approach that sorts applications into different risk classes. AI applications that pose an unacceptable risk because they violate fundamental rights are to be prohibited; a toy with an integrated voice assistant that encourages minors to engage in dangerous conduct, for example, would be banned. High-risk applications are to be strongly regulated. High-risk systems exist in many areas; in the education sector, for instance, systems designed for decisions on university admission or for the evaluation of students fall into this class. Most AI applications, however, are considered low-risk and are therefore hardly regulated.
There is no schedule yet for when the regulation will come into effect, and many stakeholders do not think that the draft goes far enough. Even though this regulatory initiative is, in principle, going in the right direction, many are demanding that significantly more applications be prohibited or more strongly regulated. Human Rights Watch (2021) argues that too narrow an approach has been chosen and that the focus lies solely on the technical avoidance of biases (so-called debiasing).
Action is also being taken at the municipal level. In 2019, the City of Vienna was one of the first European cities to develop its own strategy for the use of AI, involving experts from science, business and civil society. One chapter of the Viennese AI strategy is dedicated to ethics, risk and ensuring human-centred AI. The strategy document states: “Transparency, traceability and verifiability must be guaranteed in AI systems so that effective protection against distortions, discrimination, manipulation or other improper uses is guaranteed, particularly with regard to the use of forecasting and decision-making systems. This also means that the decision-making sovereignty is not left to a computer system, but remains the responsibility of a human”.
The certification of AI systems is also an important step to achieving their trustworthiness. The public utility company in Vienna (Wiener Stadtwerke) has positioned itself as a pioneer in this regard. This group of companies organises public transport and the energy system in Vienna on behalf of the city and is thus an essential player in the city’s organisation and development. The public utility company in Vienna was the first organisation in the world to have an AI solution certified by IEEE in 2021, thus ensuring its ethical soundness. IEEE is the world’s largest technical professional association with more than 400,000 members and has developed its own certification programme for ethically sound software systems (Ethics Certification Program for Autonomous and Intelligent Systems). This certification is a designation for safe and trustworthy AI systems and aims to achieve transparency, accountability and the prevention of biases.
In connection with Vienna's rich cultural tradition, art and design can take on the important role of intervening in order to counterbalance the supposedly objective “decisions” made by machines and to counter the chaos of new media through active skill building. With our focus on Culture & Technology and the Creatives for Vienna and Content Vienna competitions, we at the Vienna Business Agency promote creative innovations in the field of digital design and support their implementation (Vienna Business Agency, 2022).
6. Forever young?
If we set the course correctly, then digitalisation will provide us with additional, healthy years of life.
The ageing of society and the increase in chronic illnesses are placing enormous demands on the health system. At the same time, digitalisation promises ground-breaking improvements, particularly in the health system. In diagnostics, some machines already surpass human performance in terms of speed, precision and even analytical interpretation. In his book “Deep Medicine”, the cardiologist and geneticist Eric Topol shows how AI is fundamentally changing medical research; AI tools allow us to learn more about ourselves than we could ever have imagined (Topol, 2019).
In their book “Die digitale Pille”, Elgar Fleisch and his colleagues from the University of St. Gallen also show how digitalisation can contribute to a healthy life and an efficient health system: Big Data and AI improve diagnostics. Standardised treatments will soon be a thing of the past, because therapies can be tailored individually to each patient in the digital age. Telemedicine permits a regular exchange on an equal footing between doctors and patients, and comprehensibly processed information empowers us all to live a healthy lifestyle (Fleisch et al., 2021).
However, even these rosy prospects for the future are countered by justified concerns. Doctors are already permanently overstretched, and human interaction with patients falls short. There is a risk that pressure on the system will increase even further if digitalisation promises increased efficiency and reduced costs and is used as an argument for cost savings. Instead, the time that is freed up by machines taking over routine work should be used to improve the individual support of each patient.
This would also, incidentally, mitigate a second justified concern: we have learned that AI works through pattern recognition. As a logical consequence of this mode of operation, it has blind spots where people differ from the norm; intersex people, for example, have justified concerns that they will no longer have a place in digitalised medicine. This concern can also be countered with a consistent customer orientation of the health system.
The fear surrounding the transparent person, who can only count on support from the public health system if he or she meets the expectations of a healthy lifestyle, is also quite justified. Only this mistrust of the public health system can explain why people in Europe could not be motivated to use digital devices for seamless contact tracing, and thus for containing infections, during the COVID-19 pandemic.
Solid financing of the health system, the reliable protection of highly sensitive health data and the development of capacities to provide this data in anonymised form for research and innovation must therefore be given the highest priority if we wish to use the benefits of digitalisation in the health sector.
INiTS, the Viennese high-tech incubator, has initiated Health Hub Vienna to drive digital health innovations. Pharmaceutical companies, medical product manufacturers, private and public insurance companies, health service providers, and start-ups work within Health Hub Vienna on the development of patient-centred health solutions. At Health Hub Vienna, startups receive custom support in the critical steps associated with the development of products and services and their introduction on the market within a complex health system. There is a particular focus on the fulfilment of regulatory requirements around data protection and the certification processes.
7. Fake or News?
If we set the course correctly, we will stop polarisation and create a transparent media landscape.
The dissemination of fake news has been a source of concern for several years now. In his book “Lie Machines”, Philip N. Howard, Director of the Oxford Internet Institute, traces how leading politicians influence people with the help of modern communication technologies, in democracies and autocracies alike. Individuals, companies and governments develop “lie machines”, as Howard calls the algorithms that distribute false information and thereby undermine both the authority and legitimacy of persons and institutions and the voters' capacity for judgement. Howard shows how political opinion is manipulated by combining social media profiles with data from credit card histories, credit reports, political donation records and other sources in order to send tailored messages to individual voters through “microtargeting”. He pleads for an educational offensive so that citizens understand how lie machines work and question them critically. The algorithms behind social media services should be subject to public control in order to stop the dissemination of fake news (Howard, 2020).
The well-known Austrian journalist Corinna Milborn and the media manager Markus Breitenecker tackle the dark side of the digitalisation of the media landscape in their book “Change the Game”. They argue that Google, Facebook (Meta) and Amazon are not merely useful platforms for sharing content but rather a media system that delivers content curated by algorithms in newsfeeds, personalised supermarkets, search results and automatically generated playlists. The targeted distribution of this content can be purchased from these companies at high prices: Google and Facebook (Meta) alone account for 60% of the online advertising market in the USA. Remember: these web-based business models became established because the Internet itself is free of charge.
Milborn and Breitenecker advocate learning from past media upheavals how we can win the Internet back from Facebook (Meta) and Google and actively shape the digital media revolution. In the past, media laws balanced the right to freedom of the press with responsibilities on the part of media companies.
Media companies must check information for truthfulness, may not invade people's privacy, must protect minors and the rights of ethnic and religious groups, and may not steal content from others. Violations of these rules are punished. Furthermore, the state ensures a diverse media landscape through antitrust law, media funding and public service media. Milborn and Breitenecker argue that the media companies Facebook (Meta) and Google need to be subject to exactly the same regulations if we want to stop the dissemination of fake news and the increasing polarisation on the Internet (Milborn & Breitenecker, 2018).
In 2016, the private TV broadcaster that Milborn and Breitenecker work for, ProSiebenSat.1 PULS 4 GmbH, initiated the 4Gamechangers Festival in Vienna. This annual festival is a mixture of a symposium, an innovation fair and a cultural festival at which pioneers, visionaries and rebels develop a road map for how digital transformation can be used for the common good.
In addition to statutory obligations, a varied and innovative regional media landscape is also a remedy against polarisation and the uncertainty created by fake news. In Vienna, the “Vienna Media Initiative” funding programme of the Vienna Business Agency started in autumn 2019 with a budget of 7.5 million euros. This funding programme is Vienna's response to the increasing digitalisation of the media landscape and aims to promote greater media diversity and quality journalism.
8. Job guarantee or basic income?
If we set the course correctly, automation does not lead to job losses and performance pressure but rather to a higher quality of life.
In its Future of Jobs Report 2020, the World Economic Forum predicts that 85 million jobs will be lost by 2025 because machines will take over tasks that were previously done by humans. At the same time, the new division of labour between people and machines will create 97 million new jobs. In this context, further training and retraining measures will play an even more important role in the coming years than they have in the past.
For the state, it is essential to ask how social balance can be achieved in the face of these upheavals. Sebastian Becker describes two possible scenarios of how digitalisation could affect the welfare state. Increasing automation could lead to structural mass unemployment because the factor of labour loses importance as capital input (e.g. robots, AI) grows. A welfare state based on the taxation of labour income would then face the problem of having to support more people with less revenue. Social equity could then only succeed through a radical change in the tax system. However, Becker also counters this scenario with a more optimistic version: digitalisation could create inclusive growth. It could deliver a powerful productivity boost and thus real wage increases for broader society, and even cushion the pressure that an ageing society puts on the welfare state. In this case, the financing of the welfare state would be assured. In Becker's opinion, no robust evidence currently exists for either scenario (Becker, 2019).
In his bestseller “Utopia for Realists”, Rutger Bregman advocates a 15-hour week and an unconditional basic income. He argues that rapid technological advances are polarising the labour market: jobs for those with high or low qualifications remain largely stable, while the number of jobs on offer for those with medium qualifications is constantly shrinking. This causes the foundations of modern democracy to crumble and challenges politics to redistribute the available work and the resulting added value (Bregman, 2020).
The philosopher and economist Lisa Herzog also made it onto the bestseller lists with her appeal “Die Rettung der Arbeit”. She describes work as something fundamentally human that holds societies together and proposes ensuring work not only for a privileged few but for everyone. She refers to empirical studies showing that even the much-quoted supermarket cashiers enjoy doing their jobs. The point is not only to secure one's livelihood financially but also to make a contribution to society. The complaint is not about the work itself, but about a lack of co-determination and appreciation. This is why Herzog advocates a job guarantee and the democratisation of labour (Herzog, 2019).
The MAGMA project in the Lower Austrian community of Gramatneusiedl (a model project for guaranteed work in Marienthal) aims to provide evidence of the effects of a job guarantee. The study follows on from the 1933 study “The Unemployed of Marienthal”, with which the social scientists Marie Jahoda, Paul Lazarsfeld and Hans Zeisel demonstrated the dramatic social consequences of the closure of the local textile factory, which caused many community members to lose their jobs. They showed that long-term unemployment meant not only a loss of income but also resignation and social isolation, which in the end damaged the health of those affected. Almost 100 years later, the labour market service of Lower Austria (AMS NÖ) wants to implement an evidence-based model of a job guarantee in the same town with its MAGMA study. The aim of the project is to give 150 people in Gramatneusiedl, as the town is called today, who have been unemployed for more than one year a new job. The AMS is financing 100% of the wage costs for jobs in the private sector, and new positions are being created in the non-profit sector for people who cannot find jobs in private companies. The project was initiated in October 2020 and is planned to run for three years. It is accompanied by economists from the University of Oxford and sociologists from the University of Vienna (AMS, 2020).
It is not yet known whether digitalisation will ultimately lead to an unconditional basic income for all or to a job guarantee. What is clear, however, is that increasing pressure on the labour market, a rise in precarious working conditions and the erosion of employee rights are not appropriate answers to the achievements of digitalisation.
9. Equal rights for all?
If we set the course correctly, men and women will benefit equally from digitalisation.
Looking at the data from Eurostat, one could think that digitalisation is male. Just 17.2% of the 1.4 million Europeans who decided to study information and communication technology, and just 16.7% of the almost 8.2 million ICT workers who were employed in the EU in 2016, are women. Only 19% of managers in the ICT sector are women, while the average in other sectors is 45% (European Parliament, 2018).
The world of startups, which is dominated by software developers, is also shaped by men. The latest analysis by the Austrian Startup Monitor 2020 shows a slight increase in the number of female founders, but the great majority of startups (64%) are still founded by men or all-male teams. 27% of the founding teams comprised both men and women, and just 9% of the startups were founded by an all-female team (Leitner et al., 2020).
Female founders also have a disadvantage when it comes to access to capital. An analysis of the investment rounds recorded by Dealroom in 2020 shows that 91% of the invested capital benefited male founders with only 9% going to female founders (Atomico, 2020).
Against this backdrop, the Vienna Business Agency announced the first funding competition for corporate research and development projects in which only projects that are led or significantly implemented by women are funded. Six FemPower calls have been announced since 2004, supporting Viennese companies in developing new products, services and processes on the condition that the projects are led by women. While it was still debated in 2004 whether there were enough sufficiently qualified women and whether labelling them as quota women would do them any favours, measures to fund women in applied research have now become state of the art. Ongoing monitoring has confirmed that the management roles in the funded projects had a positive effect on the career paths of the participating female researchers.
The Eurostat figures show that positive discrimination in favour of women in IT is still necessary. We also need to spark children's and young people's curiosity about innovation as early as nursery and school and to overcome traditional role clichés. The serious changes in the labour market could help here: many children and young people will, when they grow up, work in professions that do not even exist today. They will therefore no longer be able to orient themselves towards role models in their immediate environment, but will have to find their own way. For this path, the education system needs to equip them with self-confidence and a solid trust in their own self-efficacy.
10. It is in our hands!
It is in our hands to find the right path to digital transformation. We can design digital technologies, products and business models in such a way that they focus on people’s well-being. We can promote broad, participatory and inclusive discourse about digital culture in order to define the digital society and the handling of digital technology. We can also empower people to hold this discourse competently and on an equal footing.
With its intellectual and political tradition, Vienna is predestined to become the capital of digital humanism. In doing so, we can tie in with schools of thought such as the “Wiener Kreis” and the rise of psychoanalysis around Sigmund Freud, with which Vienna already triggered a global revolution in thinking. Digital humanism is the next step in this development, and Vienna is the ideal breeding ground for it (Strassnig et al., 2019).
We expect our elected representatives at all political levels to recognise the immense social relevance of digitalisation and to design the regulatory framework in such a way that our privacy remains protected, everyone has equal access to the digital world, market monopolies are prevented and laws are not undermined. In doing so, they can count on the expertise of the many researchers who have signalled their willingness to contribute by endorsing the Vienna Manifesto for Digital Humanism.
We want to motivate Viennese companies, research facilities and civil society to work together to make our city an internationally respected role model for digital humanism. In 2022, we will start an initiative to encourage as many stakeholders in Vienna as possible to create their own road map for digital humanism.
We would like to quote an appeal in the Vienna Manifesto for Digital Humanism: “We must shape technologies in accordance with human values and needs, instead of allowing technologies to shape humans. Our task is not only to rein in the downsides of information and communication technologies, but to encourage human-centred innovation. We call for a Digital Humanism that describes, analyses, and, most importantly, influences the complex interplay of technology and humankind, for a better society and life, fully respecting universal human rights”.
Literature
AMS (2020): AMS NÖ startet weltweit erstes Modellprojekt einer Arbeitsplatzgarantie;
https://www.ams.at/regionen/niederoesterreich/news/2020/10/ams-noe-startet-weltweit-erstes-modellprojekt-einer-arbeitsplatz (accessed 23. 11. 2021).
Atomico (2020): The State of European Tech;
https://2020.stateofeuropeantech.com (accessed 24. 11. 2021).
Becker, Sebastian (2019): Digitaler Strukturwandel und der Sozialstaat im 21. Jahrhundert.
In: Deutsche Bank Research Institute. EU-Monitor;
https://www.dbresearch.de/PROD/RPS_DEPROD/PROD0000000000486872/Digitaler_Strukturwandel_und_der_Sozialstaat_im_21.PDF (accessed 1. 12. 2021).
Berens, Philipp (2008): Die Macht der Sensation. Medien im Dreißigjährigen Krieg. In: Spiegel Wissenschaft;
https://www.spiegel.de/wissenschaft/mensch/medien-im-dreissigjaehrigen-krieg-die-macht-der-sensation-a-535375.html (accessed 16. 11. 2021).
Bregman, Rutger (2020): Utopien für Realisten. Die Zeit ist reif für die 15-Stunden-Woche, offene Grenzen und das bedingungslose Grundeinkommen, Hamburg bei Reinbek.
Dastin, Jeffrey (2018): Amazon scraps secret AI recruiting tool that showed bias against women.
In: Reuters Business News; https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G (accessed 11. 12. 2021).
Deleuze, Gilles (1987): Foucault, Frankfurt am Main.
Dulong de Rosnay, Mélanie/Stalder, Felix (2020): Digital commons. In: Internet Policy Review, 9 (4), 1-22.
European Commission (2021): Proposal for harmonised Rules on Artificial Intelligence (Artificial Intelligence Act);
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206 (accessed 9. 12. 2021).
European Parliament (2018): More women in ICT. Empowering women in the digital world; https://www.europarl.europa.eu/news/en/headlines/society/20180301STO98927/more-women-in-ict-empowering-women-in-the-digital-world (accessed 24. 11. 2021).
Fleisch, Elgar/Franz, Christoph/Herrmann, Andreas/Moenninghoff, Annette (2021): Die digitale Pille. Eine Reise in die Zukunft unseres Gesundheitssystems, Frankfurt/New York.
Gere, Charlie (2008): Digital culture, London
Gurin, Joel (2014): Open data now. The secret to hot startups, smart investing, savvy marketing, and fast innovation, New York.
Werthner, Hannes et al. (2019): Vienna Manifesto on Digital Humanism;
http://www.informatik.tuwien.ac.at/dighum/wp-content/uploads/2019/07/Vienna_Manifesto_on_Digital_Humanism_EN.pdf (accessed 9. 11. 2021).
Herzog, Lisa (2019): Die Rettung der Arbeit. Ein politischer Appell, Berlin.
HLEG (High-Level Expert Group on AI) (2019): Ethics guidelines for trustworthy AI; https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=60419 (accessed 12. 11. 2021).
Howard, Philip N. (2020): Lie Machines. How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operation, New Haven.
Human Rights Watch (2021) Q&A. How the EU’s Flawed Artificial Intelligence Regulation Endangers the Social Safety Net; https://www.hrw.org/sites/default/files/media_2021/11/202111hrw_eu_ai_regulation_qa_0.pdf (accessed 3. 12. 2021).
Huyer, Esther/van Knippenberg, Laura (2020): The Economic Impact of Open Data. Opportunities for value creation in Europe; https://data.europa.eu/sites/default/files/the-economic-impact-of-open-data.pdf (accessed 15.12.2021).
Tashea, Jason (2017): Courts Are Using AI to Sentence Criminals. That Must Stop Now. In: WIRED; https://www.wired.com/2017/04/courts-using-ai-sentence-criminals-must-stop-now/ (accessed 26. 11. 2021).
Leitner, Karl-Heinz/Zahradnik, Georg/Schartinger, Doris/Dömötör, Rudolf/Einsiedler, Johanna/Raunig, Markus (2020): Austrian Startup Monitor 2020;
https://austrianstartupmonitor.at/wp-content/uploads/2021/05/Austrian-Startup-2020.pdf (accessed 24. 11. 2021).
Strassnig, Michael/Mayer, Katja/Stampfer, Michael/Zingerle, Simon (2019): Akteure, Instrumente und Themen für eine Digital Humanism Initiative in Wien. Studie im Auftrag der Stadt Wien Magistratsabteilung 23;
https://gmbh.wwtf.at/upload/digital-humanism-wien.pdf (accessed 12. 11. 2021).
Milborn, Corinna/Breitenecker, Markus (2018): Change the Game. Wie wir uns das Netz von Facebook und Google zurückerobern, Wien.
MyData; https://mydata.org/mydata-101/ (accessed 9. 12. 2021).
Perez, Carlota (2017): It is time for government to come back boldly, wisely and adequately: a view from the history of technological revolutions. Drucker Forum Vienna 2017;
https://www.youtube.com/watch?v=B30gLQ3kjjA (accessed 12. 11. 2021).
Stalder, Felix (2020): Data Commons und Wiens Digitaler Humanismus;
https://www.digitalcity.wien/data-commons-und-wiens-digitaler-humanismus/ (accessed 9. 12. 2021).
Stadt Wien (2019): Digitale Agenda Wien 2025. Wien wird Digitalisierungshauptstadt;
https://digitales.wien.gv.at/wp-xontent/uploads/sites/47/2019/09/20190830_DigitaleAgendaWien_2025.pdf (accessed 1. 12. 2021).
Stadt Wien (2019): Künstliche Intelligenz Strategie. Digitale Agenda Wien;
https://digitales.wien.gv.at/wp-content/uploads/sites/47/2019/09/StadtWien_KI-Strategiepapier.pdf (accessed 3. 12. 2021).
Stadt Wien (2020): Wien 2030. Wirtschaft & Innovation;
https://www.wien.gv.at/wirtschaft/standort/strategie.html (accessed 5. 12. 2021).
Stadt Wien (2021). Die Wiener Fortschrittskoalition, 4.1 Kunst und Kultur – Kulturmetropole Wien; https://www.wien.gv.at/regierungsabkommen2020/files/Koalitionsabkommen_Master_FINAL.pdf (accessed 25. 02. 2022).
Topol, Eric (2019): Deep Medicine. How Artificial Intelligence can make Healthcare human again, New York.
van Knippenberg, Laura (2020): Open Data Maturity Report 2020. European Data Portal; https://data.europa.eu/sites/default/files/edp_landscaping_insight_report_n6_2020.pdf (accessed 3. 12. 2021).
Vardi, Moshe Y. (2018): How the Hippies Destroyed the Internet. In: Communications of the ACM, 61 (7), 9.
Vienna Business Agency (2022): White Paper Culture & Technology. A white paper to inspire creative ideas for digital applications in art and culture;
https://viennabusinessagency.at/fileadmin/user_upload/Kreativwirtschaft/Publikationen/WA_WhitePaper_Culture_and_Techonology_WEB_EN.pdf (accessed 19.04.2022)
World Economic Forum (2020): The Future of Jobs Report 2020; https://www3.weforum.org/docs/WEF_Future_of_Jobs_2020.pdf (accessed 19. 11. 2021).
Zuboff, Shoshana (2019): The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power, New York.
Eva Czernohorszky
Director of Technology Services at the Vienna Business Agency
Eva Czernohorszky is a political scientist who works on the development of innovation ecosystems. She is Director of Technology Services at the Vienna Business Agency, helped shape Vienna’s economic strategy WIEN 2030 and is active in setting up the European Innovation Community EIT Manufacturing.
Picture: © Karin Hackl
Georg Sedlbauer
Technology Expert at the Vienna Business Agency
Georg Sedlbauer holds master's degrees in political science and history from the University of Vienna. He subsequently specialised further in the field of Digital Humanities and completed a third master's degree at the University of Turku in Finland. He currently works for the Vienna Business Agency, where he advises innovative Viennese companies with a focus on connecting the stakeholders of the regional ICT landscape.
Picture: © Karin Hackl