Question 1 – How have you adapted service delivery in response to the bushfires, floods and Coronavirus pandemic? When has it worked and when hasn’t it worked? How will this affect how you deliver services in the future? Have your service adaptations included better integration with other initiatives?
Marymead moved towards telehealth options for all clients. This included telephone and video communication tools.
Marymead moved towards offering support to staff via telehealth: individual phone support/supervision, peer support and team meetings, with a very quick setup for staff to work from home.
There was no change in engagement with Aboriginal and Torres Strait Islander or CALD communities during COVID-19.
Intake saw an increase in enquiries for this type of service from outside the funded geographical footprint.
NH groups within schools were cancelled at the schools' request, due to the challenges of clinicians logging into sessions remotely as guests during the school day. When schools returned to face-to-face teaching, there was a delay in reintegrating NH services as schools began to modify visitor access in line with COVID-19 safety protocols.
The removal of NH outreach meant more clients could be serviced per day: clinicians were no longer travelling, so they could schedule more appointments.
Overall, sessions became more focused on emotional support and management during unprecedented times.
Children under five were challenging to engage for the typical session duration (50 minutes); this often resulted in more frequent check-ins of shorter duration.
As soon as possible, this program moved back to face-to-face delivery, which most clients preferred. Some CYP preferred to stay online, and parents noted the convenience of a phone consult during the day rather than a face-to-face meeting. Service type is now individualised based on client preference.
CAPS – Early Life Matters (ELM)
ELM responded to COVID-19 by moving all existing COSP groups online immediately. No barriers existed for clients in terms of internet speed or access to a computer; however, these were considered.
Management within the ELM team participated in conversations with international colleagues about how to transition the COSI protocol online while remaining within the evidence-based framework.
Based on these conversations, there was discussion of utilising a hybrid of COS models to provide an alternative to face-to-face filming (see 'Attachment-based intervention improves Japanese parent-child relationship quality: A pilot study').
The move to COSI online met with mixed success; this was not due to the protocol but rather to the external stressors impacting clients (children at home, their own stress levels, working from home, etc.).
ELM responded to COSI group pressures by offering clients the opportunity to complete COSI individually online.
FARS – Family Skills (FS)
During COVID-19, FS moved all scheduled groups online. The team made modifications to group content and timing to better suit the online format.
While the transition to telehealth for groups was warmly received by clients, once Marymead was deemed an essential service, individual counselling clients returned to face-to-face sessions.
Question 2 – Are the proposed key outcomes for the families and children programs the right ones? Are there any major outcomes missing? How can we include strengths-based outcomes that focus on family or child safety?
Marymead feels that these key outcomes are the right ones. The identified outcomes resonate with the programs delivered and with each respective program logic.
When referencing the key outcomes, it may be helpful to include 'contextual factors' or the elements of ARACY's Common Approach: safety, physical and mental health, relationships, material basics, and learning.
ARACY's Common Approach is an underpinning framework utilised for FMHSS, CAPS and FARS at Marymead.
Question 3 – What tools or training would support you to effectively measure and report outcomes through the Data Exchange Partnership Approach?
Marymead would value hands-on training in use of validated outcome measurement tools and their translation into SCORE. This would increase confidence in staff, but also ensure consistency of use and expectations across other service providers within each contract.
Marymead has welcomed SCORE as a means of recording a client’s status in a domain at different points in time during the intervention to record change over time. This has been utilised even in programs where it is not yet compulsory.
However, Marymead does not consider the recorded data useful for attributing change in clients to service delivery, or for simplistic interpretations such as 'positive change is good and negative change is bad'. It is our experience that contextual information is often required. For instance, within CAPS and the use of the SDQ (Strengths and Difficulties Questionnaire): relying only on differences in SDQ scores, it looks as though the intervention has made the client worse. However, when additional information is gathered, the client indicates that they are actually more reflective and informed about the child's behaviour and how they parent, which can result in scores lower than their pre-intervention scores.
The translation of the SDQ into SCORE aligns well with the Common Approach and FAP to support and assess therapeutic change in clients.
However, how DEX data is captured is unclear. NH captures both clients (CYP) and their support people (carers, siblings, etc.), and DEX shows data for all of these. For instance, if we have a CYP, mum and sister, we are currently capturing data for only one of the three clients.
Marymead would be keen to hear how other organisations approach this.
When a client's engagement extends over three reporting periods, this affects the changes reported in DEX. For example, a client engages with Marymead pre-service (counselling) and completes a pre-SCORE dataset in December 2020, but may not commence counselling until the next reporting period. Could this be considered pre-service and not require SCORE data? By the time they commence the intervention in February and complete SCORE again, there has been no change, or change that cannot be attributed to the intervention. SCORE is then taken again post-intervention in the third reporting period.
FARS – Family Skills (FS)
FS is currently unable to find a single tool that translates into SCORE; the PWI is used to inform the Circumstances domain.
CAPS – Early Life Matters (ELM)
The SDQ is used in this program; however, as above, reliance on this tool as a pre/post measure is challenging. Increased reflective capacity, a positive outcome of the intervention, often leads to lower post-test scores.
Marymead would benefit from training and additional support in the use of SCORE across all contracts.
Question 4 – Do you already have a program logic or theory of change outlined for your program? Did you find the process useful? If you do not have one, what has stopped you from developing one? What capacity building support would assist service providers to develop program logics and theories of change?
Marymead has program logics for FMHSS, CAPS and FARS.
Marymead’s program logics are visually represented and show intended short, medium- and long-term outcomes. Program logics are known by all staff and incorporated into practice.
Program logics are helpful for guiding practice, and are easily accessible within procedure manuals.
Procedure manuals are updated regularly in line with evidence-based frameworks and changes to internal practice.
Question 5 – If longer-term agreements are implemented, how can the department work with you to develop criteria to measure and demonstrate performance? How can the Data Exchange better support this?
Specifically, for improvement of outcome measurement recording in DEX the following would be helpful:
1.) Staff training
2.) An increased range of outcome measurement tools for translation into SCORE
3.) Simpler processes for translating measures into SCORE
Marymead feels that a review of the DEX protocols in relation to SCORE is essential. Since entering data into SCORE, and since receiving feedback in relation to our SCORE data, we have detected some anomalies.
Reliance on SCORE alone reduces rich data to simplistic terms which often do not show the true changes.
As outlined above, different programs experience difficulties in utilising SCORE data. Further direction from DSS on how other service providers are handling or inputting this data would be helpful.
While overall Marymead supports the movement towards program evaluation and the incorporation of an evidence base into practice, there should be consideration of its impact on expected contractual outputs for a specified period of time. To consistently source, train in and implement an evidence-based evaluation for each program, staff require less client-facing time. The increased administrative load, for service delivery staff and management alike, should either be considered in expected outputs or incur additional funding.
Question 6 – What does success look like for your service, and how do you assess the overall success of your service?
Client outcomes and feedback
Qualitative data at any time (via formal or informal means)
Quantitative data obtained via tools
Client and stakeholder feedback
Case studies – these are obtained by each clinician every 6 months
Staff retention and satisfaction
Question 7 – Do you currently service cohorts experiencing vulnerability, including those at risk of engaging with the child protection system? If not, how does service delivery need to adapt to provide support to these cohorts?
Marymead does service cohorts experiencing vulnerability, including those at risk of engaging with the child protection system within Families and Children contracts.
FMHSS (NH) works with those at risk of child protection removal, not those under the care of child protection. This is in line with the service being an early intervention service, not a crisis service. Service delivery cannot adapt to supporting CYP under the care of CYPS as it is not set up for crisis support; to move towards this model, additional funds would be required to support on-call arrangements and crisis support training for service delivery staff and management.
CAPS (ELM) and FARS (FS): both services currently work with families both at risk of involvement with and under the care of child protection services. Both programs support clients individually through counselling and in psychoeducation groups. Often, clients are referred to these Marymead services by CPS.
Question 8 – If you are a Children and Parenting Support or Budget Based Funded service provider, do you currently link with a Communities for Children Facilitating Partner or other regional planning mechanism to understand what other services are provided in the community and what the community identifies as their needs? How does this work in practice? Would you value the increased support of being attached to a local Facilitating Partner?
Marymead is funded under the CAPS agreement. While we are not formally linked in with a Communities for Children Facilitating Partner, we are well connected within the local community.
Question 9 – For all providers, are there other ways to improve collaboration and coordination across services and systems?
Funding certainty for some services, such as the FMHSS and FARS contracts, has assisted this and will encourage collaboration.
The creation of a service hub for case discussion (i.e. where clients are accessing multiple services, how can we identify and discuss this to provide wraparound support?). Such a hub would avoid service duplication and instead encourage continuity of support.
Question 10 – The capability building support offered under Families and Children Activity programs has gone through several iterations. What works well? What do you think should change?
Marymead is not familiar with the available capability building support, but would welcome further support if available.
Question 11 – Aside from additional funding, how can the department best work with you to support innovation in your services while maintaining a commitment to existing service delivery?
The development of AWPs allows some innovation in service delivery. It has been Marymead's experience that FAMs have been positive in discussing changes to both outputs and innovation within existing services through AWPs. The fact that AWPs are both reported on and updated annually allows for evaluation, modification and innovation as required.
As above, a hub for local agencies to discuss cases would be advantageous. The same hub could also provide a space for agencies to share empirical research and evidence-based practice, and to collectively move towards a benchmark of best practice.