CatholicCare Wilcannia-Forbes
Question 1 – How have you adapted service delivery in response to the bushfires, floods and Coronavirus pandemic? When has it worked and when hasn’t it worked? How will this affect how you deliver services in the future? Have your service adaptations included better integration with other initiatives?
During the COVID-19 pandemic, we increased remote delivery via digital means, including video-enabled delivery, web-based self-help options and phone support. We had to act quickly to reach people in a meaningful way. However, remote access was not a new issue for us given the area we cover (the western 52% of NSW). Many of our clients live in isolated locations, with the tyranny of distance being their major barrier to service access.
We were well-placed to transition to video-enabled service delivery as we had already installed the required technology; it was functional and ready to go. However, to that point we had used it primarily for staff interactions and only on a very small scale for client interactions. The pandemic caused us to fast-track video-enabled and web-based service solutions.
Moving forward, we will offer a greater suite of service options to suit the diverse needs of our clients. Remote service delivery models have proven useful for clients who cannot access face-to-face services, and have enabled us to cover a greater geographical footprint, e.g. by joining participants from multiple locations for a single workshop. Remote delivery has also enabled us to respond to service gaps: communities that cannot offer a certain service are now able to refer to us despite us not having a physical presence there. For example, the initial stages of COVID-19 restrictions saw a spike in referrals to our anger management program from multiple locations, and we were able to respond promptly by coordinating classes via Zoom with participants linking in from several communities.
Whilst remote delivery has so far shown little benefit for some target groups (e.g. Aboriginal families), others have embraced it and benefitted greatly, for example clients experiencing social anxiety. We have also discovered that social media platforms complement traditional service delivery models well, specifically for the 'new' younger generation of parents. A combination of face-to-face delivery with Facebook content has been well received and has enabled us to bring content into the home without intruding. Our Facebook presence then had the ripple effect of increased self-referrals into our playgroups once they started up again.
Many of our workers observed an almost enforced empowerment of clients, as workers were unable to assist in the way they had pre-COVID; e.g. clients completing forms themselves with only minimal support over the phone.
The other significant issue in our footprint is the drought, which has almost become a part of life. We are constantly adapting services to meet the needs of drought-affected clients. This has been achieved through targeted funding made available by philanthropic funders with a focus on rural and remote issues. An example of this type of innovative service is CatholicCare's Wellbeing Mobile, which visits farming properties and villages to deliver wellbeing activities.
Question 2 – Are the proposed key outcomes for the families and children programs the right ones? Are there any major outcomes missing? How can we include strengths-based outcomes that focus on family or child safety?
Yes, the outcome framework resonates very well with us. The outcomes reflect those identified in our own program logics. We appreciate the graphic representation that highlights how outcomes for children, adults and families are interlinked and need to be seen in the broader context of community, aiming towards a set of broader outcomes. Our only suggestion would be to consider replacing 'physical health' with a more general 'health and wellbeing' so as to acknowledge all aspects of good health. We feel that strengths-based outcomes focusing on family and child safety are already captured under 'Children and Young People Thrive'. One further recommendation relates to physical needs, e.g. "basic needs (food, shelter, access to healthcare) are met".
Question 3 – What tools or training would support you to effectively measure and report outcomes through the Data Exchange Partnership Approach?
CatholicCare Wilcannia-Forbes would value hands-on training in the use of validated outcome measurement tools and their translation into SCORE. We engaged in the partnership agreement from the moment DEX was introduced. We welcomed SCORE as a means of recording a client's status, at different points in time, in a domain in which we want to make a difference. We do not think the recorded data is useful for attributing change in clients to service delivery, or for simplistic interpretations such as positive change is good and negative change is bad. We appreciated the capacity to translate a suite of validated outcome measurement tools into SCORE. However, using these tools did not become our consistent practice, for several reasons. Some programs found it difficult to find a tool that was just right. Under a holistic assessment model, most tools are too narrow and would require staff to use multiple tools; this would have added significantly to the administrative burden for our team members, many of whom work part-time. On the other hand, broad tools such as the PWI-A may lack detail and are not viewed as useful in examining a client's complex circumstances. Many of our team members would have liked to see the inclusion of tools they find useful, such as the Outcomes Star, the Resilience Doughnut and the CAARS wheel. The CAARS wheel (compulsory under FMHSS) is greatly appreciated by staff as it aligns well with the Family Action Plan; the assessment therefore fits neatly into the case management process, makes sense and is not viewed as an additional burden. It would be good if translation into SCORE were possible and could be simplified. In all the years of the partnership agreement, we have not successfully used outcome measurement tools and the translation matrix to record outcome measures in SCORE. Training and additional support are needed.
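To illustrate what such a translation involves, the minimal sketch below maps a raw result from a validated tool onto a SCORE rating. The band boundaries are purely illustrative assumptions; the actual translation matrix published with DEX defines its own cut-points per tool.

```python
# Minimal sketch (illustrative thresholds only, NOT the official DSS
# translation matrix): converting a validated tool's raw result into a
# SCORE rating of 1-5.

def pwi_to_score(pwi: float) -> int:
    """Map a Personal Wellbeing Index result (0-100) onto SCORE (1-5).

    The band boundaries below are hypothetical assumptions for
    illustration; the real matrix defines tool-specific cut-points.
    """
    bands = [(20, 1), (40, 2), (60, 3), (80, 4)]
    for upper, score in bands:
        if pwi < upper:
            return score
    return 5

print(pwi_to_score(35.0))  # 2
print(pwi_to_score(82.5))  # 5
```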
Question 4 – Do you already have a program logic or theory of change outlined for your program? Did you find the process useful? If you do not have one, what has stopped you from developing one? What capacity building support would assist service providers to develop program logics and theories of change?
Yes, we do. Program logic development has been part of our practice for many years: we develop a program logic for every program that we develop, tender for and/or win. Program logics are developed through a workshopped approach with key stakeholder involvement, including frontline staff and occasionally clients and partner agencies. The more stakeholders are involved, the richer the program logic becomes. Whilst in the past our program logics were a very simple outcomes chain, in recent years we have enhanced our template and content and included a space for underpinning theories of change. Our staff find program logics exceptionally useful. They are the foundational document for all conversations relating to a program and are regularly referred to during team meetings and supervision. They are the 'go to' document for project briefs and proposals for innovation. Enhancements to our logic template were triggered by work conducted under the TEI (Targeted Earlier Intervention; DCJ) reform.
Question 5 – If longer-term agreements are implemented, how can the department work with you to develop criteria to measure and demonstrate performance? How can the Data Exchange better support this?
We value the tailored support that DSS has been able to offer in the past. DSS has a strong track record of working in partnership with its funded service providers through a solution-focused approach, specifically through the appointment of grant agreement managers who are subject matter experts in program areas and have a good understanding of the communities where services are being delivered. We hope that this type and level of support will continue. To improve outcome measurement recording in DEX, the following would be helpful: staff training; an increased range of outcome measurement tools for translation into SCORE; and simpler processes for the translation into SCORE. A review of the DEX protocols in relation to SCORE is essential. Since we began entering SCORE data and receiving feedback about it, we have detected some anomalies that make us question the usefulness of SCORE data.
In the Discussion Paper, the principal way that SCOREs are used is to look for individual differences over time and attribute the change to service provision. We support looking for individual differences over time, but do not support attributing the change to service provision, or simplistic interpretations of what the change might mean. To get the most from SCOREs when looking for individual differences over time, one needs to compare pre, (interim) and post scores, especially for long-term clients. The current way this works in DEX is flawed: for example, DEX will compare a POST score from a prior session with a PRE score from a subsequent session and report that the service has not achieved an outcome, when in fact the wrong scores are being compared.
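A minimal sketch of the pairing logic we mean is below, using a hypothetical data model rather than the actual DEX schema: for a long-term client, the earliest PRE score should be compared with the latest POST score, rather than a prior POST being compared with a subsequent PRE.

```python
# Minimal sketch (hypothetical data model, not the actual DEX schema):
# pair a client's earliest PRE score with their latest POST score.

from dataclasses import dataclass

@dataclass
class ScoreRecord:
    client_id: str
    session_date: str   # ISO date, e.g. "2021-03-01"
    kind: str           # "PRE", "INTERIM" or "POST"
    score: int          # SCORE value, 1-5

def change_over_time(records: list[ScoreRecord]) -> int | None:
    """Return last POST minus first PRE for one client, or None if
    either endpoint is missing. The flawed approach we describe would
    instead subtract a later PRE from an earlier POST."""
    ordered = sorted(records, key=lambda r: r.session_date)
    pres = [r for r in ordered if r.kind == "PRE"]
    posts = [r for r in ordered if r.kind == "POST"]
    if not pres or not posts:
        return None
    return posts[-1].score - pres[0].score

# Example: a long-term client with two service episodes.
history = [
    ScoreRecord("c1", "2021-02-01", "PRE", 2),
    ScoreRecord("c1", "2021-06-01", "POST", 4),
    ScoreRecord("c1", "2021-09-01", "PRE", 3),
    ScoreRecord("c1", "2022-01-01", "POST", 5),
]
print(change_over_time(history))  # 3 (first PRE 2 -> last POST 5)
```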
There are other uses of SCORE data at a population level, e.g. comparing the pre-scores of all clients entering a program over one period with the pre-scores from a later period, to monitor changes in the status of people as they come into services. Reflection on this data may help in understanding how the needs of the population differ over time.
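The sketch below illustrates this population-level use with hypothetical intake data; the point is that a shift in intake scores informs planning rather than attributing change to the service.

```python
# Minimal sketch (hypothetical data, not the DEX reporting format):
# compare the distribution of intake (PRE) scores across two periods
# to monitor how client needs change over time.

from statistics import mean

# PRE scores (1-5) for clients entering the program in each period.
intake_2020 = [2, 3, 2, 1, 3, 2]
intake_2021 = [1, 2, 1, 2, 2, 1]

def summarise(label: str, scores: list[int]) -> None:
    """Print the average intake score and the share of clients
    entering at the lowest SCORE levels."""
    low = sum(1 for s in scores if s <= 2) / len(scores)
    print(f"{label}: mean PRE = {mean(scores):.2f}, "
          f"{low:.0%} entering at SCORE 1-2")

summarise("2020 intake", intake_2020)
summarise("2021 intake", intake_2021)
# A falling mean suggests clients are presenting with higher needs.
```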
There are many references to, and support for, evaluation throughout the document. Given the comments above about SCOREs and their limits for attributing cause and effect from service delivery to outcomes, and about not making simplistic interpretations of what a difference in scores means, it is very important that thorough evaluations are used to better design programs, measure their outcomes and improve them. It will be important to fund organisations to do evaluations. It is also important to fund relevant foundational research, e.g. studies on the reliability and validity of SCOREs and work on the cultural relevance of concepts and interventions. Much of the "evidence" that is referred to is often not applicable to the cultures and populations being worked with in services. These issues, and many others, need to be addressed through funded research and funded evaluation.
Investment in research and evaluation is necessary to get practice right.
Question 6 – What does success look like for your service, and how do you assess the overall success of your service?
The simple answer is client outcomes. A good program logic links outputs and outcomes under a 'what if' framework; therefore, if the logic is sound, outputs can be a valuable measure of success. Outputs and outcomes need to be understood in the context of the narrative, the client's story. Case studies are therefore a critical part of the success story; output and outcome measures alone lack depth, and causality can only be assumed.

The Discussion Paper uses the term "outcomes" in multiple ways, including, e.g.: something to be achieved; domains in which we want to make a difference; a client's current status in a domain in which we want to make a difference; and what has been achieved with the client through the provision of a service. Because the term is used in different ways, it is often not clear what the intended meaning is in particular cases. For example, does "the reporting of outcomes through the Data Exchange Partnership Approach" mean reporting the client's current status in a domain in which we want to make a difference, or does it mean reporting what has been achieved with the client through the provision of a service? If the former, we would fully support it. If the latter, we could not support it, as it is not possible to use the difference between a client's status in a domain at two different points in time to measure what has been achieved with the client through the provision of a service. There are multiple causes and multiple effects in changing contexts for people receiving services. There are also different situations to consider: were we expecting to see change? Would it have happened anyway? Are we stopping the client's situation from getting worse, so that no change is good? Would it have got even worse without the service, so that even a negative change is good? The pre and post domain statuses are therefore starting points for asking questions. They are not a basis for attributing the impact of service provision, nor should they be combined with simplistic notions that positive change is good and negative change is bad. It is really helpful to collect the data as a client's current status in a domain in order to ask good questions; it is simultaneously important not to misuse the data as a measure of service impact or with overly simplistic interpretations.

In the Discussion Paper, "measure" is consistently used with "outcomes" ("measure outcomes"). Measuring outcomes can mean different things, and because of cause-and-effect issues it is not possible to attribute impact to service delivery. When developing and using "measures", there are also issues of reliability and validity to be addressed: if two people used the same measure with the same client, would they get the same result? Does the measure measure what it says it measures? For there to be confidence in SCORE data, there needs to be evidence to show that SCOREs are reliable and valid.
Question 7 – Do you currently service cohorts experiencing vulnerability, including those at risk of engaging with the child protection system? If not, how does service delivery need to adapt to provide support to these cohorts?
Yes, we do. Key ingredients for success here are a physical presence in community, visibility, locally employed staff, and a long-term commitment to community that instils the trust on which rapport is built. In addition, flexibility is critical: meeting clients on their terms. Time is of the essence; trusting relationships do not develop overnight. It is therefore critical that this component of client work is recognised and that sufficient resources are allocated. If we were to go down the path of a prescribed percentage of evidence-based content, we need to be cautious not to do so to the detriment of the relationship component of client work. The best evidence-based content cannot be delivered until a solid, trusting relationship with the client is in place.
Yes, we increasingly work with clients who are already engaged with the child protection system. In some of our communities we now have systems in place that enable case coordination, with DCJ (child protection) and our programs sitting at one table to provide wrap-around support for vulnerable families. This is a model we would like to see in every community.
We would like DSS to give thought to broadening the eligibility criteria for FMHSS. Under current guidelines we cannot work with children who are wards of the state; these are the children who need FMHSS the most.
Question 8 – If you are a Children and Parenting Support or Budget Based Funded service provider, do you currently link with a Communities for Children Facilitating Partner or other regional planning mechanism to understand what other services are provided in the community and what the community identifies as their needs? How does this work in practice? Would you value the increased support of being attached to a local Facilitating Partner?
CatholicCare Wilcannia-Forbes is funded under both CfC FP and CaPSS; however, the CfC and CaPSS sites do not overlap. We hold the auspice for the Dubbo CfC site; our CaPSS locations are further west.
We value the CfC model and are firmly committed to it; over 14 years it has become a community-owned initiative. Through community consultation, the allocation of contracts to small community-based organisations and a governance framework that gives voice to a broad range of stakeholders, CfC is the epitome of a place-based, needs-responsive and community-driven service model. The key to its success is the funded coordination provided through the Facilitating Partner role; without dedicated funding, similar 'voluntary' approaches to service coordination often fail. Indeed, we would like to see it rolled out on a much larger scale.
CaPSS, on the other hand, has given us the chance to deliver much-needed family-focused support services in our smaller rural and remote communities. Through the tendering process we were able to propose a service model that met needs identified through our experience as a locally embedded provider (and through consultation with staff and clients), whilst filling service gaps and avoiding duplication. In many of these communities we are the only provider, or one of very few.
From our perspective, we wish to offer two thoughts for consideration:
- Take the key ingredients of CfC (i.e. structured consultation, development of a strategic document and ongoing key stakeholder input through an advisory committee) and apply them to CaPSS.
- If we were to go down the path of the CfC FP model, remove the stipulation that FPs cannot deliver services, as in rural and remote locations the FP may often be the only provider able to deliver the proposed activities.
In principle, we believe that a ‘blend’ of CfC FP and CaPSS is possible and most certainly feasible. However, all implications need to be considered carefully and thought needs to be given to transition processes that are fair, don’t disrupt service delivery to communities and don’t ‘throw the baby out with the bath water’.
CatholicCare Wilcannia-Forbes has for quite some time given thought to an expansion of its CfC site (Dubbo) to include the western part of NSW (from Dubbo to Bourke). This could be achieved without a significant increase in FP funding, as the FP, governance and administrative functions are well-established; the majority of funding would therefore go to frontline delivery. This solution would combine the best elements of CfC and CaPSS whilst being highly cost-effective.
We would very much support this approach, as long as the FP is able to deliver services in communities where it is the only or best-suited provider for a particular CSP activity.
Question 9 – For all providers, are there other ways to improve collaboration and coordination across services and systems?
We feel that five-year contracts have taken the edge off competitiveness. This has had a positive impact on organisations' willingness to collaborate and share resources.
Greater consideration needs to be given to State-funded initiatives. From our perspective, the duplication between Targeted Earlier Intervention and CaPSS is a greater issue than the duplication between CfC and CaPSS.
Question 10 – The capability building support offered under Families and Children Activity programs has gone through several iterations. What works well? What do you think should change?
The paper does not define evidence, evidence-based practice or evidence-informed practice. There is considerable literature on the meaning of these terms, and the meanings can change across settings and professions. Definitions are essential for commenting on, or supporting, particular initiatives, e.g. "meeting evidence-based program requirements". From our experience, what DSS currently means by evidence, evidence-based practice or evidence-informed practice is too narrow. Innovation is essential for service development, and we need to be cautious not to let the evidence base stifle innovation. Again, definitions are needed.

Comments on capability-building support:

Industry List – We accessed many valuable programs through the list and trained our staff in several. Most are in use, but not all, as some proved not useful for many of our families. For many programs, prerequisites could not be met, e.g. only qualified psychologists could be trained. We have very few trained psychologists in our region, but many gifted and skilled community workers with lesser formal qualifications; greater flexibility here would have been useful. For some programs it was near impossible to access training, although this improved with time. We have also observed an interesting trend with many of the programs on the list: fees have increased and licensing periods have been introduced, most of quite short duration; e.g. the annual licence fee for 'Fun Friends' is $3,000, which needs to be factored into budgets. The range of programs is too narrow; more child-focused and adult-relationship-focused programs need to be added. There are highly recognised programs that in our eyes should be on the Industry List, for example "My kids and me" and "RAGE".

AIFS accreditation process – We engaged in this process under CfC. Our community consultation had shown a need for after-school and school-holiday activities for the most vulnerable children in our community, children living in a social housing estate. An activity was designed, developed and implemented that, through a highly client-centred and co-designed approach, offered daily activities to these children in a space where they felt safe and could build trusting relationships with adults. Our Community Partner needed permission to commit 100% of their time to this important work, which was showing promising outcomes. For that reason we put the program forward for 'accreditation', but found that it could not be 'bottled' in terms of dosage, frequency and content due to its highly client-centred approach. We found the narrow view of evidence base that underpinned the accreditation unhelpful and counter-productive to the intent and philosophy of CfC.

Training and CoPs offered by the Family Action Centre and ANU were extremely useful, as they were tailored to our needs.
Question 11 – Aside from additional funding, how can the department best work with you to support innovation in your services while maintaining a commitment to existing service delivery?
In line with our response to Question 10, it is important that we continue to respond to community need, and that will require innovation. We need to be cautious not to stifle innovation through structures created by evidence-based programs and EBP ratios firmly stipulated in funding agreements. The department can help by allowing services to consult with communities, explore workable solutions and negotiate EBP ratios on a case-by-case basis. The department can also help by supporting agencies in their evaluation efforts for innovative models. We also see great benefit in creating funding schedules with room for flexibility, so as to allow service providers to respond to emerging needs. The current TEI (Targeted Earlier Intervention) reform is a good example of this type of flexible funding agreement.
Question 12 – Is there anything else you would like to share about the ideas and proposals in the Discussion Paper?
We look forward to our continued involvement in the discussion. Thank you.