Report Quality

In Chapter 3, we identified company adoption of integrated reporting as the key indicator of momentum. In Chapter 2, however, we also discussed the difference between a "combined report" and a truly "integrated" report. In Chapter 6, we described how greenwashing occurs when companies are insufficiently disciplined in their development of what we call the Sustainable Value Matrix (SVM). What matters is not solely the absolute number of companies practicing integrated reporting, but the quality of adoption. How thorough and comparable these integrated reports are begins with the quality of frameworks for integrated reporting and standards for reporting on non-financial information. Although companies may achieve a truly integrated report by other means, the effectiveness with which they apply these frameworks and standards will determine how useful their reports are to investors.

To assess report quality, we analyzed the self-declared integrated reports of 124 listed companies in the context of the "Consultation Draft of The International <IR> Framework" (Consultation Draft), published in July 2013.1 Of these, 100 were the English-language reports, sourced from the Global Reporting Initiative (GRI) website via its "Sustainability Disclosure Database" on October 17, 2013, of the 135 non-South African companies that had made this declaration. To these, we added the reports of the 24 largest South African listed companies by revenue. The analysis team began its work on October 8, 2013, and completed it on March 14, 2014. During that period, the team held numerous conference calls to discuss the research and analysis. Over 400 hours were spent coding the data template for each of these 124 reports, with an additional 500 hours spent aggregating and analyzing the data.2

While it may seem counterintuitive to use a framework that did not exist at the time of report preparation to analyze reports, this approach allows us to gauge whether, even at this early stage of the movement, companies were intuitively following the principles of integrated reporting as articulated in "The International <IR> Framework" (<IR> Framework), published in December 2013. If company practice matches the framework's suggestions, it both validates the framework and suggests that the framework is not unreasonably difficult to apply.3

In comparing company reports with the Consultation Draft, we scored each report on the draft's seven Content Elements, its Six Capitals, and seven Special Factors, for a total of 20 factors. Each factor was scored from 0 (lowest) to 3 (highest), so the maximum score a report could receive was 60. Sub-scores were calculated for the Content Elements, the Six Capitals, and the Special Factors. While some degree of subjectivity is inevitable in scoring narrative data, numerous steps were taken to ensure that the coding was done as consistently and reliably as possible across reports and coders. Appendix 7A contains a full explanation of the methodology used.
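The scoring arithmetic above can be sketched in a few lines of code. This is a minimal illustration only, with hypothetical function and variable names and made-up scores; it is not the actual coding template described in Appendix 7A. It shows how 20 factor scores of 0 to 3, grouped into the three categories, roll up into sub-scores and a total with a maximum of 60.

```python
# Hypothetical sketch of the scoring scheme: 20 factors in three groups,
# each scored 0-3, so the maximum possible total is 20 * 3 = 60.

FACTOR_GROUPS = {
    "content_elements": 7,  # seven Content Elements
    "capitals": 6,          # Six Capitals
    "special_factors": 7,   # seven Special Factors
}

def score_report(scores):
    """Validate per-factor scores (0-3) and return sub-scores plus a total."""
    result = {}
    for group, n_factors in FACTOR_GROUPS.items():
        group_scores = scores[group]
        if len(group_scores) != n_factors:
            raise ValueError(f"{group}: expected {n_factors} scores")
        if any(s < 0 or s > 3 for s in group_scores):
            raise ValueError(f"{group}: each score must be between 0 and 3")
        result[group] = sum(group_scores)          # sub-score for the group
    result["total"] = sum(result.values())         # at most 60
    return result

# Made-up scores for one hypothetical report:
example = score_report({
    "content_elements": [3, 2, 2, 1, 3, 2, 1],
    "capitals": [2, 2, 3, 1, 0, 2],
    "special_factors": [1, 2, 2, 3, 2, 1, 2],
})
print(example["total"])  # prints 37
```

The sub-scores make it possible to compare, say, how a report handles the Content Elements versus the Special Factors, which is exactly the kind of breakdown the weakness analysis below relies on.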

We had no expectations of report quality prior to our analysis, but the results pleasantly surprised us. With admittedly substantial variation, the 100 non-South African companies were, on average, doing a fair job. The 24 South African companies fared noticeably better, likely because they had at least two years of experience producing an integrated report thanks to King III and the Integrated Reporting Committee of South Africa's 2011 Discussion Paper (IRC of SA Discussion Paper), and because, spurred by these requirements, they had been learning how to improve based on audience feedback and on observing the practices of other companies. However, some areas of noticeable weakness appeared in both samples. Discussed below, these included outlook among the Content Elements and, among the Special Factors, materiality, connectivity of information, and stakeholder engagement.
