Agenda item

Customer Service Centre Update

Councillor Dan Levy, Cabinet Member for Finance, Property and Transformation, Susannah Wintersgill, Director of Public Affairs, Policy and Partnerships, Tom Parsons, Head of Customer Experience, Clare Martin, Strategic Improvement Lead, and Richard Merritt, Operational Manager Contact Oxfordshire have been invited to present a report on the Customer Service Centre.

 

The Committee is asked to consider the report and raise any questions, and to AGREE any recommendations it wishes to make to Cabinet arising therefrom.

 

Minutes:

Councillor Dan Levy, Cabinet Member for Finance, Property and Transformation, Susannah Wintersgill, Director of Public Affairs, Policy and Partnerships, Tom Parsons, Head of Customer Experience, Clare Martin, Strategic Improvement Lead, and Richard Merritt, Operational Manager Contact Oxfordshire attended the committee to present an update report on the Customer Service Centre.

 

Councillor Levy introduced the Customer Experience service update. The Council had prioritised enabling residents to contact the authority through channels that suited them, supported by investment in online and telephone systems. Officers introduced the customer experience team and explained that the redesigned service had gone live on 1 February, with management layers streamlined and additional capacity added in workforce planning and complaints to improve response times and learning. The service had been structured around customer feedback (complaints/FOI/MP enquiries), a contact centre covering communities and neighbourhoods (excluding adult social care), a dedicated social and health care ‘front door’ for adult social care, and a small strategic improvements function focused on customer journeys and working with Zoom. A major demand spike linked to the temporary congestion charge was highlighted, which had included handling around 15,000 emails in six weeks. Officers said the focus had been on reducing avoidable ‘chase’ calls and switchboard misrouting, and that new Zoom reporting had provided better insight than low response-rate satisfaction surveys, alongside reduced waiting times.

 

In response to the report received and introduction given, members of the committee began their questioning.

 

Members sought to know how AI had been used to help answer the high volume of enquiries, and what impact this had had on staff wellbeing. Officers said AI functionality within the new Zoom platform had surfaced relevant information from the council’s knowledge base (including web content and training materials) to advisers during calls, helping them provide faster and more accurate answers. AI had also been used to analyse call sentiment and engagement across all interactions, and virtual-agent technology had supported intent-based routing and reduced switchboard demand and customer waiting times. On wellbeing, officers said the peak period had strengthened teamwork and morale, with staff working flexibly to manage demand, and that a positive workplace culture had been promoted.

 

Clarification was sought on the service level agreements (SLAs), which issues had affected their achievement, and whether there had been major drivers of contact other than the temporary congestion charge. Officers said SLAs had been set as the proportion of calls answered within a specified time and had been tailored by service to reflect average call length, rather than applying a single ‘80/20’ industry standard. Adult social care calls were typically lengthy (often around 40 minutes and sometimes significantly longer), so the SLA had been set at 70% answered within five minutes, whereas more transactional lines (including general enquiries) had operated to shorter SLAs and generally performed strongly. Performance had varied predictably by day and time and could not be fully controlled because customer calling patterns clustered. On other drivers, the overall service-level figure had been pulled down primarily by adult social care telephony due to call complexity, safeguarding contacts and the need for strengths-based conversations at the start of care pathways; if adult social care were separated out, performance for the other teams would have been higher.

 

Members sought to know what feedback the service had received from staff leaving the team (whether leaving the council or moving to other roles), whether turnover created operational challenges, how these were managed, and whether the service wanted people to stay. Officers said they viewed progression from the contact centre into other council roles as a positive outcome, and that most leavers moved internally (notably into Adult Social Care roles such as commissioning and brokerage, and into Highways), reflecting the transferable knowledge gained in the contact centre. Turnover could create pressure because vacancies could arise quickly and some service areas were complex to learn, but this had been managed by maintaining a pool of casual staff. The team benefited from a wide age range and flexible working arrangements, including apprenticeships and roles for disabled staff.

 

Members asked about the level of preventative activity to reduce call volumes. Officers explained that calls had been reduced by using the Zoom Virtual Agent to route customers correctly first time and to answer routine questions without adviser intervention. Traditional push-button phone menus had been replaced with intent-based voice routing, so callers could describe what they needed in their own words, which had reduced switchboard ‘general enquiries’ demand and avoided double-handling. This approach had also reduced time spent navigating phone menus by around half. In addition, web chat bots had been deployed on key webpages (including the congestion charge and school admissions) to provide answers drawn from the council’s knowledge base and website content, with an option to transfer quickly to a person where required. Officers added that they had used call transcripts and contact data to identify common queries, refine the bots, and feed back issues with online systems to service owners to reduce avoidable contact.

 

Further information was sought on the impact the introduction of AI agents had had on the customer experience team, including whether staff might have been concerned about jobs being replaced; members also raised the need for transparency and for considering environmental impacts. Officers said they had involved staff in developing the AI tools and had been clear that these were intended to deflect routine contact, changing the adviser role rather than removing it by enabling advisers to focus on more complex cases and customers who needed more support. Staff had been re-skilled as transactional queries were handled through virtual agents, and the system had helped advisers by surfacing relevant guidance during calls and by providing better insight into why customers contacted the council. Customers could still ask to speak to a person, and officers would continue to monitor feedback and performance data; officers acknowledged the need to be explicit so that customers knew when they were speaking to a virtual agent.

 

The Committee pressed further on how the council had ensured transparency given that voice bots could sound realistic. It was suggested that virtual agents should begin by stating that they were not a real person and explaining how to reach a human adviser. In response, officers said that, for the congestion charge voice bot, callers had already come through an automated system and the service had announced that the interaction was with a virtual agent. Officers said this approach would be the model for future virtual agents, and confirmed that customers could ask to speak to a person and would be transferred through without challenge.

 

Members sought to know what feedback there had been on customer satisfaction with the AI bot implementation. Officers said it had been early days but there had been little negative feedback, with staff reporting generally positive comments. Officers had reviewed transcripts and early data, and significant internal and user-group testing had been undertaken to ensure the system worked for people with different needs and to avoid callers becoming trapped in loops.

 

Finally, members asked what support was available for residents who came into County Hall in person, and where the ‘front door’ would be located when the council left County Hall ahead of the move into Speedwell House. Officers said the County Hall reception team had been part of the contact centre and had supported residents and visitors in person for the services delivered by the contact centre, with staff presence available each day. The contact centre did not cover every council service, there had been no general appointment system, and County Hall had not been designed for confidential ‘walk-in’ conversations, which had created challenges when residents sought support for services outside the contact centre. The intention for Speedwell House had been to provide a more suitable resident hub with appropriate privacy, and officers had explored options such as a video kiosk to enable residents to speak to other council teams remotely.

 

The Committee AGREED to make the following recommendation to Cabinet:

 

-        That the Cabinet ensures that there is full transparency over whether a caller to the Council is interacting with an AI bot or a human

 

The Committee also AGREED to request the following additional information:

 

-        Performance statistics broken down across the different CSC teams

-        An estimate of how much time the Council has given back to Oxfordshire residents because the improvements made have reduced the time they are left waiting on hold

-        Details of the sustainability impacts of the use of AI bots

-        A breakdown of the things that people seek help for from the CSC when they physically come into County Hall

 

 

Supporting documents: