Report for EBU Big Data Conference 2017
29 Mar 2017 02:00h
Event report
Day 1
The European Broadcasting Union (EBU) Big Data Conference brought together heads of digital, marketing, communications and legal departments alongside technologists and academics to share best practices for big data media strategies, with a focus on public service media (PSM).
The conference was opened by Ms Ingrid Deltenre, Director General of the EBU, who stressed the importance of big data as a ‘game changer’, as it allows for an improved understanding of the audience. As such, big data is not an aim in itself, but a tool to better engage with audiences. Mr Guillaume Klossa, Director of Public Affairs & Communications at the EBU, further elaborated on the importance of collaboration to break the silos between media, communications, legal, public affairs, technologists, and other experts and to ‘create conditions for EBU’s members to embrace big data as rapidly as possible’.
Building Trust
The first panel discussed the creation of trust and the importance of ethics, laws, and human rights. Ms Lokke Moerel, Senior of Counsel at Morrison & Foerster LLP, explained that ethics rules need to be respected in order to achieve social acceptance of the use of big data. This means adhering not only to laws, but also to underlying ethics and social conventions, including the principles of non-instrumentalisation, non-discrimination, equity, and consent. Ideally, these conventions are analysed prior to implementation, as privacy- or ethics-by-design, because violations will not be automatically rectified by an ‘invisible hand’: everybody is violating the rules in order to keep up with the competition. Prof. Alessandro Mantelero, of the Politecnico di Torino, echoed Moerel’s comprehensive approach and pleaded for preventive policies to address and mitigate potential privacy, ethical, and social impacts. These impact assessments should be based on human rights charters and community values, but tailored to the specific context in which the data is used.
Mr Joseph Cannataci, UN Special Rapporteur on the Right to Privacy, looked deeper into the concept of privacy and argued for a ‘very very very’ critical approach to big data. Privacy needs to be understood not only as the protection of personal data, but also as an enabler of the free development of personality and of other human rights, including freedom of expression. After elucidating several international legal challenges, he claimed that ‘In an Internet without borders, we need safeguards without borders and remedies across borders’. According to Mr Joe McNamee, Executive Director of European Digital Rights, privacy challenges are primarily related to a ‘broken market’ in which ‘consequences don’t matter’ and incentives for transparency and accountability are lacking. The introduction of the EU’s General Data Protection Regulation (GDPR) is an opportunity to move away from a ‘race to the bottom’ and ‘move the balance back towards the trust and reliability we expect’. Finally, Mr Pierre-Nicolas Schwab, Big Data/CRM Manager at RTBF, shared his organisation’s experience of privacy protection, addressing three problems: over-personalisation, the absence of alternative viewpoints, and threats to privacy. To mitigate these challenges, education is key: it should focus on empowering users to control their data, building knowledge of personalisation, and opening the black boxes to ‘show’ algorithms.
During the Q&A that followed, the speakers argued for a more inclusive approach to the challenge of trust-building. Cannataci stressed the importance of moving beyond transparency on algorithms (which will be understood by less than 1% of the population) towards a total societal impact assessment. Mantelero underlined the need to be open to discussion and to seek advice from experts in different fields, including anthropology, in order to establish trust. Schwab agreed that data scientists should not be left alone, and that there is a continued need for sociological models.
Personalisation and recommendation systems: Delivering quality
This panel, moderated by Mr Alberto Messina, R&D Coordinator at the RAI Centre for Research and Technological Innovation, discussed the opportunities of recommendation and personalisation systems, as well as the risks of filter bubbles that could arise from them. First, Mr Andrew Scott, Launch Director of myBBC, shared experiences on the balance between automation and smart curation. While the former is entirely algorithmic, the latter requires human capabilities and allows for the provision of ‘breadth’, linking users to content they do not necessarily expect. Mr Mika Rahkonen, Head of Development, News and Current Affairs at Yle, debunked five misconceptions about news personalisation and filter bubbles:
- Personalisation is not about breaking things, it is about fixing something that is already broken, adapting to the digital age and attracting new audiences.
- Personalisation is not forced on people, it is a tool that people can choose to use.
- Personalisation will not replace every important news story ‘with pictures of Kim Kardashian’s cleavage’ – there is no evidence of this happening.
- Filter bubbles are not new, there have been bubbles ‘since the bubble of cave 1 and the bubble of cave 2’.
- There is no single newsfeed for everyone that personalisation endangers.
While these misconceptions might give rise to a false understanding of the challenges, Mr Michael de Lucia, Head of Digital Innovation at RTS, provided three examples of key challenges of the data-driven era: infobesity, competition with Internet giants, and algorithms and artificial intelligence, which might leave PSM unable to stand out. These challenges can be mitigated by adopting a more global and co-operative approach, and by continuously learning and sharing experiences.
The discussion that followed reiterated that personalisation and smart curation ultimately aim at understanding the user and providing content in a better way. There is enough quality content; the question is ‘getting the right content to the right people at the right time through the right devices’. Although it is difficult to compete with the ‘infobesity’ on the web, through trust, transparency, and the right tone of voice, PSM can make a difference and avoid a situation in which ‘a lot of good content goes to waste’.
Algorithms and online platforms: Limitations and opportunities
Mr Robert Amlung, Head of Digital Strategy at ZDF, moderated the discussion, which aimed at sharing experiences to find out how the combination of platforms and algorithms can ensure that great content is easily found by users. According to Mr Michael Hlobil, Data Insights COE Solution Architect at Microsoft, the most important thing in this effort is to ‘document what you do’ and to have a diverse team of experts involved. Furthermore, data quality is crucial: ‘if you put shitty input, you get shitty output’. Mr Michael Paustian, Creative Director at Axel Springer, elaborated on the importance of human involvement: ‘an algorithm itself is just a set of instructions’, and the process of getting there is scientific and can be understood as a dialogue with the problem. Yet Mr Rigo Wenning, Legal Counsel at W3C, wondered whether we have ‘the right instructions’. There are many ‘dumb algorithms’ that ‘make people naked’. Furthermore, challenges remain regarding the re-centralisation of the web and the power of its gatekeepers, who continuously change the rules. This problem was also emphasised by Mr Sylvain Lapoix, freelance data journalist, who explained that Internet platforms have become Internet service providers, leaving PSM vulnerable and dependent.
Data Journalism: New possibilities for investigation, collaboration and ubiquity
This discussion focused on the utility of big data for journalism and reporting, and was moderated by Mr Laurens Cerulus, Reporter at Politico Europe. Mr Mirko Lorenz, Information Architect at Deutsche Welle, stated that although individual media outlets might be small, ‘collectively we’re really big’ and can compete. There is a need to push back against fake news, to invest in stories with data narratives, and to think about the future of content for future generations. Zooming in on the value of data journalism, Mr Neal Rothleder, CTO of ORBmedia, explained how data brings new perspectives: by showing large pieces of the world all together, by grasping how things change over time, and by providing new views on complex situations.
Although data journalism holds much promise, it might not be easy to get right at the first attempt. Mr Jan Lukas Strozyk, Journalist at NDR, added five lessons from his experience in data journalism:
- The size of the data never matters
- Data itself does not provide full stories
- The best datasets are useless if they are not readable
- You will miss obvious things if you do not visualise the data
- You cannot make it alone
Once data-driven stories are created, they need to be visible; Mr Roland Schatz, CEO of Media Tenor International, spoke about the importance of knowing the audience to be attracted. Furthermore, such stories do not necessarily have to be time- and resource-consuming if they are produced through partnerships with organisations that already collect extensive amounts of data, and with researchers and academics who can assist in the data analysis.
The following Q&A addressed the importance of building partnerships and collaborating with other sectors, the need for the organisation as a whole to be more data-aware (train people in Excel!), and the need to bridge the gap between data scientists and editorial teams: the former might lack the knowledge to build the narrative, while the latter might not know what is possible with the data. This requires a cultural shift in the organisation.
The path towards a data company
This round-table focused on the new mindsets, technologies, tools, and strategies needed for the creation of a data company, and was moderated by Mr Aleksi Rossi, Head of Interfaces at Yle. Mr Ignacio Gomez, Director of Analytics and Future Media at RTVE, started by explaining the need to connect ‘tv people’ with data teams, as they are two halves of a brain that does not yet work in sync. Mr Sanjeevan Bala, Head of Data Planning and Analytics at Channel 4, added that there is a need for senior buy-in, as data should be in every part of the business, and for co-operation across different teams. Furthermore, recruiting from outside the broadcasting sector might help, as it allows for learning from other practices. These insights were echoed by Mr Dieter Boen, Manager of Research and Innovation at VRT, as the panel concluded that collaboration is key: identify shared interests with others and seize opportunities to collaborate (‘better together!’).
Day 2
The second day of the Big Data Conference: Serving Citizens, held on 22 March 2017 at the EBU Headquarters in Geneva, started with welcome remarks by Mr Guillaume Klossa, Director of Public Affairs and Communications at the EBU, who reaffirmed the conference’s purpose: developing strategies and implementing recommendation systems aimed at fostering citizens’ trust in data.
The first panel, ‘How Can Big Data Help Public Service Media Better Serve Citizens?’, explored the possibility for Public Service Media Companies (PSMC) to better accommodate citizens’ demand for and use of digital content. Mr Gilles Marchand, General Director of Télévision Suisse Romande (TSR) and Radio Télévision Suisse (RTS), began by observing that competition in this sector is increasing considerably.
He stressed the importance of optimising the current co-operative processes among different PSMCs. In particular, he suggested a threefold approach, based on intelligence (optimisation of the distribution of all content), community (involvement of the public through the use of smart data), and journalism (smart data can optimise the user-on-user content and consequently increase public trust).
The second speaker, Dr Mirko Schäfer, Leader of the Utrecht Data School, discussed the positive use of datafication as a potential means of fostering a European public sphere. He considered that active online participation is closely linked to civic action. Moreover, recalling the need for strategy expressed during the previous day, he reaffirmed that big data should be approached from a top-down perspective (that is, at the top decision-making levels) rather than with a bottom-up approach.
The second panel, Audience Measurement: Evolution or Revolution?, included three main speakers moderated by Mr Kristian Tolonen, Head of the Audience Research Department at the Norwegian Broadcasting Corporation (NRK), who opened the discussion by illustrating the importance of big data for audience measurement. In particular, he considered that the use of big data is beneficial along four dimensions: the target (the profile of the audience), the source (shifting from a one-source measurement system to hybrid solutions), the time (optimising the time required for measurement), and the level of the discussion (the depth of the information collected).
Mr Emil Pawłowski, Chief Science Officer at Gemius, further considered whether the big changes that have modified consumption patterns in the past decades should prompt a re-evaluation of existing measurement techniques. Currently, accurate measurement is impaired by economic constraints – even research on a small panel is expensive – and by a fragmentation of data caused by the existence of multiple browsers. He affirmed that the ultimate goal of audience measurement will be multimedia research, that is, a measurement system that encompasses all the media used by the consumer (Internet, television, radio, press) simultaneously, rather than analysing each medium separately.
Mr Nick Johnson, Solutions Architect at Ireland’s National Television and Radio Broadcaster (RTE), centred his speech on the challenges of measuring the performance of RTE programmes across all its platforms. Consumption patterns have changed over the past years, driven by the Internet and smart devices; he explained how difficult it is for RTE to assess the total value of its assets and to measure it efficiently.
Lastly, Dr Uffe Høy Svenningsen, Audience Researcher at Danmarks Radio (DR), illustrated the new TV-audience measurement system launched in Denmark on 1 January 2017. This innovative approach is based on four main sources: a basic panel, a digital panel, a web profile panel, and census data. The information from all these sources is combined and calibrated to produce a more accurate measurement. Although such a system allows for a better measurement of overall consumption, there are still challenges regarding the calibration of the information obtained (e.g. for certain on-demand programmes) and the actual mapping of all the content consumed by the viewer.
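To make the hybrid approach concrete, the sketch below shows calibration in its simplest form: panel viewing records are weighted so that the panel’s demographics match census totals before reach is estimated. All data, group labels, and field names here are hypothetical, and DR’s actual system is far more elaborate than this.

```python
# Minimal sketch (hypothetical data and field names): combining a viewing
# panel with census totals via simple post-stratification, in the spirit
# of the hybrid measurement approach described above.
from collections import defaultdict

# Hypothetical panel: one record per panellist, with a demographic group
# and the set of programmes watched.
panel = [
    {"group": "15-34", "watched": {"news", "drama"}},
    {"group": "15-34", "watched": {"drama"}},
    {"group": "35+",   "watched": {"news"}},
]

# Hypothetical census totals per demographic group (in thousands).
census = {"15-34": 1200, "35+": 3100}

# Post-stratification: each panellist represents census_size / panel_size
# people within their demographic group.
panel_sizes = defaultdict(int)
for p in panel:
    panel_sizes[p["group"]] += 1
weights = {p_id: census[p["group"]] / panel_sizes[p["group"]]
           for p_id, p in enumerate(panel)}

# Weighted reach estimate per programme (in thousands).
reach = defaultdict(float)
for p_id, p in enumerate(panel):
    for programme in p["watched"]:
        reach[programme] += weights[p_id]

print(dict(reach))  # e.g. {'news': 3700.0, 'drama': 1200.0}
```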
The event continued with two Toolbox sessions, which offered speakers a space to share hands-on experience through specific case studies.
The first, on Privacy Policies, was moderated by Mr Pierre-Nicolas Schwab, Big Data/CRM Manager at Radio Télévision Belge de la Communauté Française (RTBF), who recalled the crucial role of education and consumers’ trust. He illustrated the four-step approach taken at RTBF, based on a confidentiality charter, a single-sign-on platform, a recommendation system (sensitive to ethical concerns, marginalised groups, and gender equality), and an educational programme on Artificial Intelligence (AI).
Ms Lucy Campbell, Marketing Director of TV & Digital at RTE, presented RTE’s single-sign-on platform: myRTE. Digital services are raising consumers’ expectations, and the proliferation of actors providing media content is making this sector very competitive. For these reasons, RTE has launched a consumer experience strategy aimed at achieving a better understanding of the audience by providing personalised services and thus better experiences.
Mr Peter Farrell, Head of Legal BBC Workplace and Information Rights, complemented Ms Campbell’s speech on the necessity of rendering digital content more personal and relevant. He presented the myBBC single-sign-on platform and stressed the importance of building consumers’ trust towards the platform through clear and transparent privacy policies.
The second toolbox, on The Innersource Approach to Personalisation, focused on the Personalisation for EACH (PEACH) technology system. Mr Michael de Lucia, Head of Digital Innovation at RTS, reminded the audience that the system aims to deliver personalised media recommendations to audiences that increasingly access content on demand, through a variety of devices and platforms. As Mr Anselm Eickhoff, Software Architect at Bavarian Broadcasting (BR), further explained, the PEACH system aspires to deliver ‘the right content, at the right time, to the right person, on the right device’.
Furthermore, Mr Michael Barroco, Head of Software Engineering at the EBU, illustrated the organisational structure of the project. PEACH is a cross-organisational system developed by the EBU, featuring two main stakeholders: BR and RTS. More specifically, the team is composed of developers and data scientists working collaboratively as a single Scrum team across organisations and borders.
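The talks did not detail PEACH’s algorithms, but a minimal content-based recommender illustrates the kind of matching a ‘right content to the right person’ system performs. Everything below – the catalogue, the tags, and the user profile – is hypothetical.

```python
# Generic content-based recommendation sketch: rank catalogue items by the
# cosine similarity between a user's taste profile and each item's tags.
# All data is hypothetical; this is not PEACH's actual model.
import math

def cosine(a, b):
    """Cosine similarity between two sparse tag-weight dicts."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Catalogue items described by tag weights (hypothetical).
catalogue = {
    "evening-news": {"news": 1.0, "politics": 0.6},
    "nature-doc":   {"documentary": 1.0, "science": 0.7},
    "satire-show":  {"comedy": 1.0, "politics": 0.4},
}

# User profile derived from viewing history (hypothetical).
user_profile = {"politics": 0.8, "news": 0.5}

# Rank items by similarity to the profile, most relevant first.
ranked = sorted(catalogue.items(),
                key=lambda kv: cosine(user_profile, kv[1]),
                reverse=True)
for title, _ in ranked:
    print(title)  # evening-news ranks first for this profile
```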
The event concluded with a presentation of Project Kelvin by Mr Bram Tullemans, Project Manager at the EBU. The project aims to use real-time data collected from video players to produce information that can optimise the distribution flow of content. The ultimate goal is to identify the content-delivery method that performs best.
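As a rough illustration of the idea (not Kelvin’s actual pipeline), the sketch below aggregates hypothetical player telemetry per delivery method and scores each one; the event fields, method names, and score weighting are all assumptions.

```python
# Minimal sketch: compare content-delivery methods from player telemetry.
# Real pipelines would stream events continuously and use richer
# quality-of-experience metrics than these two.
from collections import defaultdict

# Hypothetical player events: (delivery_method, startup_ms, rebuffer_ratio)
events = [
    ("cdn-a", 850, 0.010),
    ("cdn-a", 920, 0.004),
    ("cdn-b", 610, 0.002),
    ("cdn-b", 700, 0.003),
]

# Accumulate per-method totals.
totals = defaultdict(lambda: {"n": 0, "startup": 0.0, "rebuffer": 0.0})
for method, startup_ms, rebuffer in events:
    t = totals[method]
    t["n"] += 1
    t["startup"] += startup_ms
    t["rebuffer"] += rebuffer

def score(t):
    # Lower is better: mean startup time plus rebuffering penalised by an
    # arbitrary illustrative weight to put it on a comparable scale.
    return (t["startup"] / t["n"]) + 10_000 * (t["rebuffer"] / t["n"])

best = min(totals, key=lambda m: score(totals[m]))
print(f"best-performing delivery method: {best}")  # cdn-b in this example
```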