Dear Rose,


I am writing to endorse the approach taken to develop interim indicators from the People and Nature Survey. The survey, which collects vital information on people’s engagement with the natural environment in England, went live in April 2020 as scheduled, and your team has managed to publish timelier data and to adapt the survey to collect additional information on the impact of COVID-19. I would like to congratulate everyone involved for their work to produce these timely and valuable statistics in challenging circumstances.

My team has conducted a rapid regulatory review of the published information. We have reviewed the extent to which the statistics have been produced in accordance with the Code of Practice’s Trustworthiness, Quality and Value pillars, while taking account of the pressures you and your teams have faced to deliver timely statistics about an important topic. A summary of our findings is set out below and more detailed feedback has been provided to your team.

Value
  • The People and Nature Survey (PANS) is already a hugely valuable survey under normal circumstances, and the data are even more relevant since the COVID-19 pandemic. The data will help answer questions about how COVID-19 is impacting people’s experience of nature, how this is associated with physical and mental health, and how people’s attitudes and behaviours towards nature are changing. These data provide important evidence for the pandemic response and recovery. To measure the impact of COVID-19 specifically, new questions were added to the survey from May to gather information on topics like the amount of time spent in gardens and changes in behaviour and well-being under lockdown. We welcome the rapid publication of these indicators and the flexible survey design to support understanding of the impact of COVID-19.
  • Your approach to user engagement for these statistics has been exemplary. Users have been involved in the development of the statistics from the outset – your team selected the set of indicators and developed the COVID-19 questions in consultation with stakeholders from across government, academia and non-governmental organisations – and the indicators provide users with additional opportunities to collaborate on their ongoing development. Your team has strengthened links with known users and established links with new users. Together, these activities ensure that the statistics meet users’ needs as well as they can, and provide a strong foundation for future developments.
  • The statistics are published in an accessible HTML format. We welcome that your team is developing additional interactive ways of presenting the data, including a dashboard.
  • It is particularly good that your team is exploring the use of external data sources, such as land use and habitat type data and index of multiple deprivation data, to contextualise the PANS data and generate further insight.

Quality
  • These statistics are among the few official statistics based on an online panel survey. There are known issues with the representativeness of panel samples. Your team is aware of these limitations and has put quotas in place for age, gender, region, education and ethnicity to ensure that key population groups are adequately represented, but there is more work to be done to understand potential bias in the sample and the results. The harmonisation exercise that your team intends to carry out between PANS data and data from the Monitor of Engagement with the Natural Environment survey (MENE – PANS’ predecessor, which was a face-to-face random probability survey) should help you assess the representativeness of the PANS sample and the impacts on the quality of the statistics, along with any implications for use.
  • The methods used and their limitations are explained clearly and aid interpretation of the statistics. Information on limitations covers the representativeness of the online panel sample, as well as the comparability of the monthly indicators with other data (including the MENE data) and the weighting approach (MENE weights will be used until bespoke weights have been developed for the new survey). Confidence intervals are presented alongside the indicators to give an indication of the uncertainty around the estimates, and the interim nature and experimental status of the statistics are highlighted prominently.
  • The quality assurance process is proportionate and robust. Your team works closely with the contractor that runs the survey to check and validate the data. External stakeholders peer reviewed the bulletin, which allowed you to gather feedback on the indicators themselves as well as suggestions for improving your quality assurance process. Quality assurance for the full dataset, which is expected to be published on a quarterly basis from September, will be more comprehensive, for example, through the use of scripts to conduct automated quality checks.

Trustworthiness
  • Survey data are collected in line with Government Statistical Service guidance on data collection during the COVID-19 crisis. You have been transparent about the reasons for continuing with the survey.
  • Survey participants are informed that current government rules and regulations are still in place, and the survey does not encourage participants to engage with the natural environment outside of current restrictions. Relevant questions, including those about visits outside, are contextualised and this context is updated as the government guidance is updated. This clear, unambiguous messaging contributes to survey participant safety.

We look forward to seeing these statistics develop as more data are collected on the impact of COVID-19. As set out in the guidance on changes to statistical outputs, you can include a statement in your release such as: “These statistics were already in development pre-COVID-19. We have adapted the existing survey to understand the impact of the pandemic and these monthly interim indicators have been produced quickly in response to developing world events. The Office for Statistics Regulation, on behalf of the UK Statistics Authority, has reviewed them against several key aspects of the Code of Practice for Statistics and regards them as consistent with the Code’s pillars of Trustworthiness, Quality and Value.”

I am copying this to Ken Roy, Defra Group Head of Profession for Statistics, and Simon Doxford, Natural England official statistics lead.

Yours sincerely

Ed Humpherson

Director General for Regulation