Read highlights from CEDA’s PIT Forum 2020 with Ellen Broad, Senior Fellow at the Australian National University's 3A Institute; Scott Farrell, Partner at King & Wood Mallesons; Professor Jeannie Paterson, Co-Director of the Centre for AI and Digital Ethics at The University of Melbourne; and Andrew Stevens, Chair of Innovation and Science Australia, who discussed putting the digital technology 'ball in the court of the consumer' and explored the effect of new technologies on consumer decision making.
06/12/2020
This session of the Public Interest Technology Forum, chaired by Andrew Stevens, Chair of Innovation and Science Australia, sought to put the digital technology ‘ball in the court of the consumer’ and explore the effect of new technologies on consumer decision making, choice and consent, and how consumers might be empowered to use technology and data for their benefit, including through the legislated Consumer Data Right (CDR).
Professor Jeannie Paterson from The University of Melbourne kicked off the conversation by making a point similar to one made in a prior session by Human Rights Commissioner Ed Santow: there are existing laws and regulations that need to be applied to ensure that digital services and technology-enabled products are safe, fit for purpose and actually deliver the services or products promised. Australia is lucky, she argued, to have a robust consumer law framework, and has adopted progressive structures such as the CDR; however, rather than looking to more laws, there is a need to think about standards in the digital space and about the coherence of laws and how they work together. For instance, the privacy and consent mechanisms that operate through the CDR differ from those that apply under the Privacy Act.
Scott Farrell, a partner at King & Wood Mallesons and a leading expert and advisor on open banking and the CDR, articulated the key priorities in protecting and empowering consumers in the context of digital technologies. First and foremost, he argued, we need to start from a position of empathy for the consumer, making it easy for people to make choices and to be confident in sharing their information for their own benefit. If people are scared that their data or information will be used against them, they are more likely to ‘lock it away’, which might deny them important benefits from its use. The right way to go about this is to recognise what people want to and will do, and provide a framework that enables them to do so safely – akin to the flags on the beach that encourage people to swim in the safest spot.
An important point around empowerment that emerged through Scott’s comments, and those of Ellen Broad from the 3A Institute, is that genuinely empowering someone requires recognising that people have different preferences, capabilities and so on. As Ellen noted, “you need to recognise people where they are” and then provide different supports and options that enable them to make their choices safely.
Ellen observed that before the CDR and open banking discussions, the focus was on providing people, through ‘my data’ approaches, with bulk copies of raw data and hoping they might figure out what to do with it. The CDR provides a framework for people who want to do more, or less, with their data to do so safely and for their benefit, in contrast to many existing apps, which often require you to provide your data in unsafe ways.
The discussion of consent touched on a range of interesting issues. Scott reminded us that we don’t want to have to give consent every time we share our data, in the same way that we don’t give consent at every stage when we use money, and suggested that we look to the controls we have in place for the use of money as a guide to how we might build frameworks for consent around data. He emphasised the need to let people have a say and to involve real people in a process to learn where the boundaries for consent might be drawn. Scott also stressed the importance of using plain language, noting that risk does not arise from misunderstandings of nuanced legal language but from situations in which people ‘point blank did not know’ what was being done with their data and information.
Reiterating a theme that came through in the opening session to the forum, Jeannie spoke to the importance of ‘earned trust’ and open, transparent and lawful use of data, proving the value of the service promised and delivering in line with expectations.
Ellen rounded out the discussion of consent by suggesting that while consent is often seen as empowering consumers, that is not the case if consumers don’t really have a choice. Perhaps the more important aspect of consent is that it creates a set of promises and expectations about the behaviour of the provider or data user, which is useful in restricting unsafe and untrustworthy behaviour.
All speakers agreed that information around data sharing, consent and notice provisions could be made simpler and easier, for example through the adoption of standardised consent dashboards and the like. In other words, we can make better use of technology to make these processes better for consumers.
It is interesting to note that while all speakers acknowledged the potential benefits of the CDR, the vast majority of the audience indicated they are unsure whether they will use the CDR because they don’t know enough about it.