
Tenant Satisfaction Measures
The government asks all housing associations to report a set of performance and satisfaction numbers each year. See our scores from last year.
This is just one of the ways we listen to your views. You may be contacted to take part in a satisfaction survey over the phone or via email. The questions we ask you are called ‘Tenant Satisfaction Measures’.
We contact a random group of residents to complete the survey each year; this might be over the phone or via email. The survey is short and should take only 5 to 7 minutes to complete.
We read every piece of feedback you share and use it to spot ways to improve. Sometimes we can make changes straightaway and sometimes we use your feedback to help us plan for improvements in the future. We will tell you how your feedback is impacting on our services through our regular communications.
The surveys are completed anonymously. However, if you are dissatisfied with your experience, you can give us permission to contact you to talk about your experience and try to make it right.
There are lots of ways you can get in touch to share your feedback. Whether you want to compliment a member of our team or let us know we didn’t get it right, we love to hear from you.
We received survey responses from 1,301 people. This includes 1,008 responses from people living in our rented homes, including retirement housing communities, and 293 responses from people living in shared ownership homes.
The survey was conducted between September 2024 and March 2025. We conducted the survey throughout the year, asking different residents to share their views over the time period.
All surveys were conducted over the phone.
All residents and shared owners were chosen randomly. At the start of the programme, we created a list of residents eligible to be contacted for a TSM survey. A daily automated process checked the list and removed anyone no longer eligible — for example, if the tenancy had ended, or they had already completed or declined the survey.
The list was paired with a random number generator to create a fair, shuffled order. This ensured each eligible resident had an equal chance of being selected.
The report showed the top 25 residents by needs category to be contacted – this was the live call list for staff. Each time the report was refreshed, a new random set of 25 residents was shown.
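For readers interested in the detail, the selection process described above can be sketched in Python. This is an illustrative outline only; the field names, batch size and example data are assumptions, not the actual system.

```python
import random

def refresh_call_list(residents, batch_size=25, seed=None):
    """Drop ineligible residents, shuffle fairly, and return the live call list."""
    # Remove anyone no longer eligible: tenancy ended, already completed, or declined.
    eligible = [
        r for r in residents
        if r["tenancy_active"] and not r["completed"] and not r["declined"]
    ]
    rng = random.Random(seed)
    rng.shuffle(eligible)          # random order: each eligible resident has an equal chance
    return eligible[:batch_size]   # the top 25 shown to staff as the live call list

# Illustrative data: 100 residents, one of whom has already completed the survey.
residents = [
    {"id": i, "tenancy_active": True, "completed": False, "declined": False}
    for i in range(100)
]
residents[3]["completed"] = True   # already surveyed, so removed from the list

call_list = refresh_call_list(residents, batch_size=25, seed=42)
print(len(call_list))  # 25
```

Refreshing the report corresponds to calling `refresh_call_list` again, which re-filters and re-shuffles, producing a new random set of 25.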
Although people were selected at random, we monitored responses carefully to ensure we received opinions from a wide range of people.
In most cases, yes, it does. We want to make sure our survey results reflect the views of all types of residents. To do this, we compare the mix of people who responded with the mix of people living in our homes.
We look at how many responses we’d expect from each group if everyone had taken part, and we compare this to the actual number of responses we received. We use a statistical technique called the Standardised Residual Value to calculate how different an actual result is from what we expected. If the value is above +2 or below –2, it usually means the difference is statistically significant.
This helps us see if any groups are over- or under-represented. This is shown in the demographic breakdown report under the column ‘Standardised Residual Value’: if it is highlighted in green, it is not statistically significant; if in red, it is statistically significant.
We don’t include groups that are too small (fewer than 5 expected responses), as the results wouldn’t be reliable.
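The calculation can be illustrated with a short sketch. The standard form of a standardised residual is the observed count minus the expected count, divided by the square root of the expected count; we assume that is the form used here, and the ±2 threshold and the exclusion of groups with fewer than 5 expected responses come from the description above.

```python
import math

def standardised_residual(observed, expected):
    """(observed - expected) / sqrt(expected); a value beyond +/-2 flags a significant gap."""
    if expected < 5:               # groups this small are excluded as unreliable
        return None
    return (observed - expected) / math.sqrt(expected)

# Illustrative figures only, not real survey data:
print(standardised_residual(80, 100))   # -2.0 -> at the significance threshold
print(standardised_residual(95, 100))   # -0.5 -> not significant
print(standardised_residual(3, 4))      # None -> group too small to assess
```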
We monitored responders against property type, ethnicity, age, location, needs category, property size and household size. We found that, under most categories, there were fair and consistent trends over the survey period.
For Low Cost Rented Accommodation, slightly fewer residents living in flats responded to the survey than in the overall population. For Low Cost Home Ownership, we had fewer survey respondents from 3-person households than in the overall population.
No. Weighting is a statistical technique that can be used to adjust survey responses so they better represent the wider community. We did not use weighting because we made sure we spoke to people from a wide range of ages, abilities and backgrounds, and we believe no group was at a disadvantage through the random approach.
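To show what weighting would have involved had it been used, here is a small sketch of one common approach (post-stratification): each response is weighted by the ratio of a group’s share of the population to its share of the sample. The groups and figures below are made up for illustration.

```python
from collections import Counter

def weighted_satisfaction(responses, population_shares):
    """Reweight responses so each group counts in proportion to the population."""
    counts = Counter(r["group"] for r in responses)
    n = len(responses)
    total = 0.0
    weight_sum = 0.0
    for r in responses:
        sample_share = counts[r["group"]] / n
        w = population_shares[r["group"]] / sample_share  # population share / sample share
        total += w * r["satisfied"]
        weight_sum += w
    return total / weight_sum

# Hypothetical example: flats are 75% of homes but only 50% of responses.
responses = [
    {"group": "flat", "satisfied": 1}, {"group": "flat", "satisfied": 0},
    {"group": "house", "satisfied": 1}, {"group": "house", "satisfied": 1},
]
result = weighted_satisfaction(responses, {"flat": 0.75, "house": 0.25})
print(result)  # 0.625 (unweighted average would be 0.75)
```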
Yes. We worked with a partner, Pexel Research Services, who completed 41 surveys with our Shared Owner residents on our behalf to ensure we met our sample size. They are an experienced customer feedback agency who follow the Market Research Society code of conduct to ensure their approach is ethical and independent, and they are accredited to ISO 20252.
No, we did not exclude any households due to exceptional circumstances, as described in the TSM guidance.
A minimum of 973 responses from Low Cost Rented Accommodation residents and a minimum of 293 responses from Low Cost Home Ownership residents were required to meet the sample sizes set by the Regulator of Social Housing. We achieved 1,008 and 293 respectively.
To make sure our survey results reflected the views of all residents with a high level of confidence, we used a sample size that gives a margin of error of +/-3% for Low Cost Rented Accommodation and +/-5% for Low Cost Home Ownership. This means we can be 95% confident that the reported satisfaction levels are accurate within 3% or 5% of the true values for each group.
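The margin-of-error figures above can be reproduced with the standard formula for a proportion at 95% confidence, with a finite population correction. The population counts in the example below are hypothetical, chosen only to show how sample sizes of roughly this order produce margins near +/-3% and +/-5%.

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite population correction.

    n: sample size; population: total group size; p=0.5 is the most
    conservative assumption; z=1.96 corresponds to 95% confidence.
    """
    fpc = math.sqrt((population - n) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# Hypothetical population sizes, for illustration only:
print(round(margin_of_error(1008, 10000), 3))  # approx. 0.029, i.e. about +/-3%
print(round(margin_of_error(293, 1200), 3))    # approx. 0.050, i.e. about +/-5%
```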
No, we simply asked people to spare a few minutes to share some feedback to help us improve.
We asked 11 questions about people’s experiences and perceptions of our services.
We also asked if residents wanted to provide more information and checked if they gave permission for us to follow up on negative feedback to try to put things right and share the information with other teams.
You can read all the questions in the survey questionnaire design.
No. We conducted all our surveys this year over the phone, compared to last year where we conducted around 47% of surveys over the phone and 53% via an online survey. While it’s acknowledged that phone surveys can sometimes yield more positive responses, they also provide richer, more nuanced insights, with residents able to explain their experiences in their own words.
Learning from our three previous surveys, we chose to conduct our resident satisfaction surveys using our own staff, over the phone, to build a better understanding of residents’ experiences and ensure their feedback directly informs how we shape and improve our services. Speaking directly with residents allowed us to clarify responses, provide support in real time, explore concerns in greater depth, and address misunderstandings (for example, residents thinking we did not provide a certain service) or urgent issues as they arose, something that was not possible through external, online or email-based surveys.
Conducting the survey over the course of the year enabled us to identify and act on emerging trends quickly, as well as respond effectively to issues as they developed. This approach also helped us reach residents who may not regularly use email or have reliable access to digital platforms, ensuring their voices were heard.
We did not use any visual features to support the survey, as it was conducted over the phone.
We believe our improved resident satisfaction scores this year are mainly a direct result of the proactive measures we’ve taken to address key areas of concern raised by residents in the last three surveys. For example, anti-social behaviour had low satisfaction scores. To address this, we lowered the threshold and began investigating a wider range of anti-social behaviour, and we recruited more staff to support these investigations. This new service was promoted to all residents across the website and social media, as well as in newsletters and emails sent to residents before the survey started.
The way we handled complaints also had low levels of satisfaction from the beginning. To address this, we introduced a new complaints process and renewed our in-house training for all staff with a sharper resident focus. We also recruited a new Resident Feedback and Experience Manager to support the management of complaints and ensure we learn from the experiences residents share with us; again, this was in place before the survey started.
Click on the headings to view the results by topic:
Results for residents of rented homes:
Results for shared owners:
Our repairs service
Communal areas
Results for residents of rented homes:
Results for shared owners:
We responded to 100% of all stage 1 and 2 complaints within the Housing Ombudsman Complaint Handling Code timescales.
Combined results for residents in low cost rented homes and low cost shared ownership homes: