
Colorado Technical University Program Design and Development Worksheet

Question Description

Identifying Types of Data and Developing an Information-Sharing Plan

Return to the issue you are addressing from the Riverbend City: Examining the Ruby Lake Community media piece. Address the following for this assignment:

  1. Create a data reporting table for the program you are developing, similar to Table 15.3 on page 441 of your Program Development in the 21st Century textbook. Address all data types relevant for your selected program and structure data for each of the four key aspects:
    • Responsibility.
    • Timeframe.
    • Methods.
    • Recipients.
  2. Prepare an example of a data reporting tool that would help you assure the quality of program implementation and determine whether your program objectives are being met. You may choose any aspects of project implementation on which to report data, such as staffing, budgeting, or information systems (a minimal structural sketch follows this list).
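As a starting point for structuring such a table, a minimal Python sketch follows. It represents each row of a reporting plan with the four key aspects named in item 1; the data types, names, and example values are illustrative assumptions, not a reproduction of Table 15.3.

```python
from dataclasses import dataclass

# One row of a data reporting table, mirroring the four key aspects named in the
# assignment (responsibility, timeframe, methods, recipients). Values are examples only.
@dataclass
class DataReportingRow:
    data_type: str        # e.g., "Outcome data", "Process evaluation data"
    responsibility: str   # who collects and reports the data
    timeframe: str        # how often it is collected and reported
    methods: str          # how it is collected and communicated
    recipients: str       # who receives the report

reporting_plan = [
    DataReportingRow(
        data_type="Outcome data",
        responsibility="Program director",
        timeframe="Quarterly and annually",
        methods="Electronic summary report with charts",
        recipients="Funding agent, staff, community stakeholders",
    ),
    DataReportingRow(
        data_type="Human resources data (aggregate)",
        responsibility="Program manager",
        timeframe="Quarterly",
        methods="Dashboard reviewed at staff meetings",
        recipients="Program staff, administrators",
    ),
]
```

Rows like these can then be rendered as the table required for the assignment or extended with additional data types (for example, financial or compliance data).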

Review the scoring guide for the assignment carefully to determine what is required to achieve a distinguished level of performance.

Assignment Requirements

  • Written Communication: Written communication should be free of errors that detract from the overall message.
  • APA Formatting: The paper, including resources and citations, should be formatted according to current APA style and formatting guidelines.
  • Length of Paper: 3–5 typed, double-spaced pages, not including the title page and references.
  • Font and Font Size: Times New Roman, 12 point.

Chapter 15 (pp. 419–449), Program Development in the 21st Century: An Evidence-Based Approach to Design, Implementation, and Evaluation

ISBN: 9781452238142

By: Nancy G. Calley

BUT THE PROGRAM IS EFFECTIVE

On Tuesday afternoon, Reggie received a call from his contract manager telling him that his contract for mentoring at–risk youth would not be renewed next year because funding for children’s services had been reallocated. The contract manager went on to explain that the county was facing budget cuts and, therefore, had to make decisions about which programs to continue funding. While these decisions were difficult to make, they were based on identifying the most essential programs and those that had produced strong outcomes. Unfortunately, the mentoring programs were not viewed as essential nor was there evidence of their success.

When Reggie heard this, he was flabbergasted. He explained to the contract manager that he had been conducting a program evaluation since his program’s inception 2 ½ years ago and that the outcomes were extremely positive. He quickly shared with her some of the highlights of his program:

  • Teens who had been successfully matched with mentors: 342
  • Percentage of mentees who graduated high school compared with the region’s 67% graduation rate: 94%
  • Percentage of the mentees who pursued college: 68%
  • Of the 32% who did not pursue college, percentage who pursued vocational training or were employed: 25%
  • Percentage of the mentees who were either living at home or living independently: 92%
  • Percentage of the mentees who had been involved in criminal activities post–mentoring services: less than 5%
  • Percentage of the mentees who had experienced substance abuse problems: 6%; less than 3% required treatment
  • Percentage of mentees and parents/caregivers who identified having had a mentor as one of the most important aspects of their teenage years: 94% and 98%, respectively
  • Total cost per youth: $302 (compared with the $1,680–$6,440 costs related to much more intensive case management services and comprehensive community–based programs for court–involved youth—precisely what Reggie’s program was designed to prevent being needed)

After quickly reviewing these outcomes with the contract manager, Reggie promised to send her the full set of evaluation data and the summary report. He told her that he had been planning to send her the report and evaluation results once he had 3 full years of data and said he was sorry that he had held onto the information.

The contract manager shared her surprise with Reggie, stating that she wished she had known sooner about the program’s success, since it very well could have meant that the funds for mentoring programs would not have been cut. However, legislative action had already been taken, so the funding decisions were final. She did encourage Reggie to make the evaluation findings available and told him that they may still prove fruitful in the next funding cycle’s decision–making process.

CONSIDERING REGGIE

  1. What did Reggie do right, and what mistakes did he make?
  2. If you were Reggie, what would you do next?
  3. What practices should Reggie put in place to ensure that relevant program information is shared with all who have a need to know on an ongoing basis?

About This Chapter

This chapter’s focus is the significance of data, the critical information that data provides about all aspects of a program and organization, and most importantly, the invaluable need for comprehensive information sharing. We will examine both indirect and direct benefits of information sharing. In addition, we will explore the various types of data that are collected as part of comprehensive program development, including outcomes data, process evaluation data, human resource data, financial data, compliance and quality improvement data, and other pertinent data. We will explore the questions related to whom information should be shared with and why, as well as how frequently and through what medium information should be communicated. To guide comprehensive data collection and to illustrate the importance of examining all the program data in order to understand the total program operations, the Quarterly/Annual Comprehensive Data Report Tool is provided. The chapter concludes with a case illustration to further reinforce the content of the chapter, followed by a data report plan exercise and questions for reflection and discussion.

STEP XIII: DEVELOP AN INFORMATION–SHARING PLAN

Significance of Information Sharing

Seemingly, Reggie did everything right. At the implementation of a new program (i.e., mentoring), he designed and implemented a comprehensive process evaluation and a comprehensive outcome evaluation. He methodically collected and analyzed the data and, as a result, was intimately aware of the significant details of his program and the impact that it had made. However, he made one unforgiving mistake: He failed to share the data that he had collected and analyzed with all the people who had a need to know. As a result, his highly successful program would cease to operate.

While Reggie’s example illustrates one of the toughest lessons about the business of which program development is a part, it is unfortunately not uncommon. As mental health and human service professionals have continued to become much more concerned with evaluation methods and other data collection activities, there continues to be a lag in following through once data has been collected. This can create obvious challenges, particularly since any data that is collected but not used should not have been collected in the first place, since it incurred a cost without producing a benefit.

There are many other important lessons that Reggie’s vignette illustrates (see Box 15.1).

BOX 15.1

LESSONS FROM REGGIE

  • Comprehensive program evaluation must be conducted at the start of any new program.
  • Outcome data is essential and, therefore, must be collected.
  • Output data, such as cost–effectiveness, is essential and, therefore, must be collected.
  • Program data must be shared frequently and regularly with all stakeholders.
  • Collecting and analyzing data without sharing it with stakeholders may have devastating results.
  • Collecting and analyzing data and not sharing the results may have the same effect as if no data had been collected or analyzed.
  • Conducting various types of evaluation and sharing the findings with stakeholders may have a direct effect on your program’s sustainability.

Whereas each of these lessons is significant, the more critical issue related to information sharing has to do with why data is collected in the first place. A large part of this answer is provided in previous chapters in the discussions related to program design, implementation, and evaluation and assessment. However, in addition to implementation and evaluation data, other data must be collected, such as client demographic data and financial data. Through comprehensive data collection and analysis, mental health professionals are empowered—empowered to better understand and manage their program. The notion that information is power can be clearly illuminated in program development efforts, particularly as the more knowledgeable the program developer is about the program, the easier it is to articulate the program to others. Conversely, without detailed information about the ongoing operations of the program, it’s more challenging both to communicate the program to others and to garner support for the program. Because the operations provided by human service organizations depend on people (Gibelman & Furman, 2008), the central role that ongoing communication, including information sharing and data sharing, plays in supporting a program’s operations is critical.

More importantly, without effective means by which to communicate the work that mental health professionals provide and the impact that this work makes, mental health care itself is at risk. Morris et al. (2010) speak about this global issue from an Irish perspective:

As with all areas of health care in Ireland and internationally, the health information deficit in the mental health services serves to impede the decisions of policymakers, health care workers, patients, and their families. It is imperative that mental health information becomes more accessible, useful, and comprehensible so that a culture of information gathering and use can be fostered both internationally and in Ireland. This information can then provide the evidence required for the provision of high–quality health care. (p. 360)

Direct and Indirect Benefits

In addition to what is listed above, there are numerous other benefits—both direct and indirect—that may result from sharing information related to program operations and outcomes with stakeholders. Indirect benefits refer to benefits that may not produce a direct result but that produce some impact, whereas direct benefits are those whose effect is concrete. For instance, by sharing information about program operations with staff, employees may have an increased level of engagement with the program/organization. This level of engagement may not be quantifiable, but it may mean that some employees choose to remain at the organization even when other more lucrative opportunities arise. Because you may not be aware of this impact, particularly since you may not have had any idea that someone was considering leaving, the impact is indirect—yet still significant. Alternatively, the sum effect of employee engagement may produce the direct benefit of employee retention, especially since employee retention results in decreased expenditures associated with hiring. This benefit can be tremendous, as any effective program developer and human resources manager can tell you exactly what it costs to replace an entry–level professional employee (e.g., case manager, therapist), which may range from $6,000 to $12,000. Thus, reducing unwanted employee turnover is an objective of most managers, because replacing an employee creates additional and often unnecessary expense to the organization that cannot be recouped. The costs are largely attributed to such administrative work as processing new applicants, hiring–related activities, coordination of employee benefits, and new employee orientation and training, among others. Considering these unnecessary costs, it is not difficult to see the benefit of staff retention.

Box 15.2 provides a snapshot of other indirect and direct benefits related to information sharing.

BOX 15.2

INDIRECT AND DIRECT BENEFITS RELATED TO INFORMATION SHARING

Indirect Benefits

  • Increased ownership in the program/organization among employees, resulting from increased knowledge of shared responsibilities
  • Creation of a culture of transparency and shared commitment
  • More flexible workforce that can more easily adapt to changes when needed as a result of being consistently informed

Direct Benefits

  • More productive and effective workforce as a result of increased knowledge of the business
  • Problems and deficits able to be quickly identified and resolved so that program/organization is continuously improving
  • More competitive program and organization as a result of increased productivity and effectiveness
  • Increased business and growth opportunities
  • Program/organizational sustainability

Types of Data

There are multiple types of data that mental health and human service professionals collect as part of the program management process. Indeed, at times, some mental health professionals claim that they are more data collectors than mental health professionals—with responsibilities of collecting intake information and administering and collecting assessment data, treatment planning data, quality assurance data, contract compliance data, and so on. However, the issue is not one of data collector versus mental health professional but, rather, of mental health professional whose role very much involves data collection and management. Data is pertinent to our ability to effectively assess and treat clients, manage staff and other resources, manage programs and organizations, and continue to enjoy our livelihood. Or put even more succinctly, “Data collection is the sine qua non of effectiveness–based program planning” (Kettner, Moroney, & Martin, 2008, p. 19). Data collection and management, therefore, must be both respected and appreciated—not as an added job but as one of the most integral parts of our job. Once this has occurred, the power that information holds can be fully unleashed.

While there is an enormous amount of data that may be collected, the primary reason for collecting the data has to do with gaining knowledge about all aspects of the program. However, all data that is collected must be fully justified. And as Gard, Flannigan, and Cluskey (2004, p. 176) remind us, the four questions that should guide the data collection process are as follows:

  1. What do we want to know?
  2. Why do we want to know it?
  3. What should we measure?
  4. How should we measure it?

Knowing that all data that is collected has a specific use is essential. Often, the most essential data is collected for a process or outcome evaluation, human resource management, financial management, or contract compliance and quality improvement activities. While these data sets can be reviewed independently, they also must be thoroughly reviewed concurrently, thus forming a complete picture of the program. By doing so, a critical understanding of how each of the data sets interacts with the others can be achieved. Each of these various types of data sets is discussed next.

Process Evaluation Data

As discussed in Chapter 12, a comprehensive process evaluation allows you to assess the myriad aspects of a program throughout its implementation. Depending on the type and scope of the process evaluation, a variety of data can be collected that includes client demographic and other descriptive characteristics and program outputs such as number and type of interventions provided, treatment length, and number and qualifications of staff providing treatment. In addition, coverage and equity data can be collected to provide specific information about who is being served and who is not being served.

Demographic and descriptive data can be highly useful in gaining increased understanding and knowledge of your client population and, therefore, must also be collected and analyzed. This data has multiple uses, including as part of a process evaluation in identifying the target population and needs, increasing knowledge about program outcomes as related to client subpopulations and specific characteristics, advocacy efforts, and pursuing funding opportunities. Indeed, possessing specific and comprehensive knowledge about client populations is essential to effective program management. Box 15.3 provides a sample of possible types of demographic information that may be collected and reported.

BOX 15.3

SAMPLE OF DEMOGRAPHIC DATA CHARACTERISTICS FOR A TRANSITIONAL HOUSING PROGRAM

  • Age
  • Gender
  • Race
  • Ethnicity
  • Language
  • Dependent children (ages, gender, and special needs)
  • Intimate partner status
  • Special needs
  • Academic history
  • Employment history
  • History of homelessness
  • Family, friends, and other supports
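To show how the characteristics in Box 15.3 might be captured as a structured intake record, here is a minimal sketch; the field names, types, and the idea of a single record per client are assumptions for illustration, not the program's actual data collection instrument.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TransitionalHousingIntake:
    """One client's demographic/descriptive record, mirroring Box 15.3 (illustrative)."""
    age: int
    gender: str
    race: str
    ethnicity: str
    primary_language: str
    # Each dependent child recorded as a dict, e.g., {"age": 4, "gender": "F", "special_needs": None}
    dependent_children: list = field(default_factory=list)
    intimate_partner_status: Optional[str] = None
    special_needs: Optional[str] = None
    academic_history: Optional[str] = None
    employment_history: Optional[str] = None
    history_of_homelessness: Optional[str] = None
    # Family, friends, and other supports
    supports: list = field(default_factory=list)
```

Collected this way, individual records can be aggregated for the coverage and equity analyses discussed below.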

Demographic data provides rich information; however, it is often in collecting this type of data that mental health professionals run into trouble. Much too often, data is collected that is not needed—data that is not going to be used for a specific purpose. This goes back to the principle that no data should be collected without a specifically identified use; otherwise, you risk doing a disservice to those whom you are serving as well as wasting time and money. For instance, each of the data elements in Box 15.3 must serve a specific purpose that justifies its collection. And in this case, each data element does serve a purpose, as illustrated in Table 15.1.

Table 15.1 Data Elements and Rationale

In addition to the specific purposes listed above, client demographic and descriptive data also can be used to learn specifically about program coverage and program equity—significant information for program developers, communities, and funding sources.

Coverage data provide feedback on the extent to which a program is a) meeting the community need and b) reaching its target population. Monitored during program implementation, coverage data can be used not only to determine the extent to which the target group is being reached but also to ensure that individuals ineligible for the program are not served. (Kettner et al., 2008, p. 258)

Similarly, equity data provides feedback on the various subgroups within a region to identify what, if any, disparities exist in regard to who is being served.

Unless a program is targeted at a specific subgroup of a community, all other things being equal, geographical subareas and subgroups should be served by a program in roughly the same proportion as their composition in the community. Equity data can be used to ensure adequate coverage of subgeographical areas and subgroups during implementation or at the end of a program to document that a program is or is not reaching some geographical subarea or subgroup. Utilized in a performance measurement approach, coverage data provides stakeholders with information about the distribution of outputs, quality outputs, and outcomes across subgeographical areas and subgroups. (Kettner et al., 2008, pp. 258–259)
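The two quoted passages frame coverage and equity as comparisons of proportions: the share of each subgroup served against its share of the community. A minimal sketch of that comparison appears below; the subgroup labels and counts are hypothetical.

```python
# Hypothetical community composition and clients served, by subgroup.
community = {"Subgroup A": 5200, "Subgroup B": 3100, "Subgroup C": 1700}
served = {"Subgroup A": 180, "Subgroup B": 60, "Subgroup C": 60}

total_community = sum(community.values())
total_served = sum(served.values())

for group in community:
    community_share = community[group] / total_community
    caseload_share = served[group] / total_served
    gap = caseload_share - community_share
    print(f"{group}: {community_share:.1%} of community, "
          f"{caseload_share:.1%} of caseload, gap {gap:+.1%}")
```

A large negative gap for a subgroup would prompt the kind of equity review described above, while the number served relative to the estimated eligible population speaks to coverage.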

In addition to client demographic and descriptive data, various types of information are collected specifically for the process evaluation to provide comprehensive information related to program implementation and operations and to promote further knowledge of outcomes. Several of the types of data that are collected as part of a process evaluation are reviewed in Chapter 12; so please refer back to that chapter if needed. Briefly, information about the implementation process itself is collected, including the number of resources (e.g., staff, money) allocated to the program, location of service delivery, and unexpected occurrences, to name a few.

To reiterate, fidelity assessment may be included in the process evaluation in order to specifically assess the degree to which a treatment is delivered as intended. The five major areas of fidelity are treatment design, training, treatment delivery, receipt of treatment, and treatment skill enactment (Borrelli et al., 2005), and each requires specific data to be collected and analyzed. Treatment design data may include the number and type of interventions and the theoretical basis of treatment, while training data may include the content and methods used to prepare staff to deliver the treatment and staff credentials. Treatment delivery data may include the number and type of interventions actually delivered, the time frame in which treatment was delivered, and the credentials of the individual(s) delivering the treatment. Other specific data that may be collected and analyzed in a fidelity assessment were also discussed in Chapter 12, so please refer back for a more comprehensive discussion of the data types involved in a fidelity assessment.
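As a rough way to keep fidelity data organized by area, the sketch below maps each of the five fidelity areas to example data elements drawn from this paragraph; the elements listed for receipt and enactment are assumptions (see Chapter 12), not items specified here.

```python
# Hypothetical organization of fidelity-assessment data elements by fidelity area.
# Area names follow the paragraph above; element lists are examples only.
fidelity_data_elements = {
    "treatment design": [
        "number and type of planned interventions",
        "theoretical basis of treatment",
    ],
    "training": [
        "training content and methods used to prepare staff",
        "staff credentials",
    ],
    "treatment delivery": [
        "number and type of interventions actually delivered",
        "time frame in which treatment was delivered",
        "credentials of individual(s) delivering treatment",
    ],
    "receipt of treatment": [
        "client attendance and engagement measures",          # assumption; see Chapter 12
    ],
    "treatment skill enactment": [
        "client use of treatment skills outside sessions",    # assumption; see Chapter 12
    ],
}

for area, elements in fidelity_data_elements.items():
    print(f"{area}: {len(elements)} data element(s) tracked")
```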

Because of the unique power that process evaluation data holds—including demographic and fidelity assessment data—sharing this information with stakeholders is critical. Client demographic data can be particularly useful not only in increasing knowledge of your particular target population or region but also in informing the broader field about client needs and characteristics. Therefore, this information is of great value to staff, funding agents, and other professionals. In addition, this data is pivotal to ongoing program planning efforts. For instance, program modifications may need to be made to a program that was originally designed for adolescents but that currently has a majority client population of older teens, since there are often significant developmental differences between the two groups. Likewise, a subpopulation of clients may not speak English, and therefore, specific program modifications and additional supports will be required to effectively serve this group. In addition, information about the type and scope of resources, such as staff credentials, administrative oversight, and adjunctive services, is essential not only to fully understanding all the aspects that contribute to the program’s success but also to understanding all that must be in place to effectively support the program. This information has specific relevance to planning, managing, and sustaining programs and is directly related to the program’s finances.

Because treatment fidelity data speaks directly to the design of a particular treatment, sharing information about the degree to which fidelity has been maintained throughout implementation is critical for program staff. As such, this information provides direct feedback about their performance as well as about the success or failure of the program developer in planning for retaining treatment fidelity. In addition, this information is critical to clients as part of the informed consent process and as consumers of services with a right to know that they did receive what they were told they would receive. Moreover, this information is significant to funders, as it speaks to accountability and treatment design. Finally, this information is essential to other professionals and stakeholders in continued efforts to better understand treatment design and to understand the relationship between design, implementation, and outcomes.

Outcomes Evaluation Data

The types of data collected in the outcomes evaluation are specific to the program’s objectives and, therefore, are unique to a program. However, there are also often common outcomes relevant to different types of programs. For instance, a treatment program for juvenile sex offenders and a second–chance academic program for young adults (i.e., a high school diploma program for young adults) may share the outcome of academic success. Whereas both programs may share an outcome related to academic success, the outcomes for juvenile sex offenders may also include recidivism (i.e., reoffending), improved family functioning, and increased independence.

Outcomes are generated directly from the program design. For instance, family therapy and the development of a family support network are designed to improve family functioning and, therefore, must be evaluated to determine if the interventions did indeed lead to the anticipated outcomes. This interdependent nature between design, implementation, and evaluation is essential to understanding the complexity of programs and program development efforts, and therefore, highlighting these relationships in discussions of outcomes is helpful in increasing knowledge of the program’s efforts.

In addition, because outcomes reflect a program’s success, they are critical to all stakeholders, including clients, staff, funding agents, community members, legislators, and professionals in the field. Staff are particularly interested in outcomes since they reflect another measure of their performance and, as such, provide another integral link to what they do and why they do it—a critical benefit to us all while working with individuals. However, whereas it is necessary and healthy for any organization to continuously evaluate outcomes, caution must be exercised in the degree to which outcomes are directly associated with employee performance. Indeed, the employee is a vehicle by which an intervention is delivered, and therefore, rarely is it the employee who failed but, rather, the intervention that failed.

Funding sources and legislators are specifically interested in outcomes since this information is pertinent to decisions about continued and future funding. Community members are interested in outcomes, since they also have a particular interest in what works and what doesn’t. After all, as taxpayers, community members provide primary support for nonprofit services. Finally, other mental health professionals have a vested interest in the continued development of knowledge and understanding not only related to what works but why it works so that efforts can continue to develop and implement the most effective types of services and treatment.

Continuous collection, analysis, and reporting of outcomes must occur to ensure that all stakeholders are well informed. Without this, the dilemma that Reggie faced may become a reality for other mental health professionals—regardless of how good their work is.

Human Resources Data

Human resources data includes all data pertaining to staff (i.e., personnel). This includes but is not limited to such data as illustrated in Box 15.4.

BOX 15.4

EXAMPLES OF HUMAN RESOURCES DATA

  • Hiring
  • Job descriptions
  • Performance reviews
  • Educational records
  • Medical information
  • Salary information
  • Insurance
  • Company–sponsored retirement plan information
  • Tax information
  • Citizenship information
  • Staff vacancies (unfilled positions)
  • Training completed by staff
  • Staff credentials
  • Retention
  • Separation
  • Disciplinary action
  • Staff challenges/problems
  • Staff commendations/rewards


Human resource professionals maintain various documents and data pertaining to staff and are responsible for each staff member’s personnel file to ensure that confidential information remains confidential. The types of data that are private and, therefore, must be maintained in a confidential manner include medical and other personal information, salary, tax and citizenship information, and disciplinary action. Whereas this data is relevant to program managers/administrators, specific personal information is not relevant to program staff, and it is not permissible to disclose such information to staff who do not have a justified need to know. However, data related to staff that can be reported in aggregate and that is not considered confidential is highly useful to program developers/clinicians and all program staff, as it relates to program operations.

This data, including staff vacancies, hiring, credentials, job descriptions, and training activity, is often most valuable to program managers and staff when examined as trend data—for instance, you might examine staff training needs versus training completion on a quarterly basis to strategize methods to address training needs. Or you may evaluate staff retention to determine what trends might exist related to when staff end their employment and the reasons why they choose to do so.

By collecting and analyzing data related to staff, program managers are equipped with pertinent information about their staffing infrastructure. For example, if employee exit interview results indicate that 76% of staff voluntarily terminated their employment last year due to either not feeling connected to the program/organization or due to the lack of professional development activities the organization offered, this essential information can be used in staff retention efforts. However, this data is meaningful to all staff—not simply the manager/supervisors—particularly because sharing this type of information among staff and involving staff in retention efforts may serve to engage staff. As a result, the method by which future retention efforts are developed may in fact be a critical retention tool.
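As an illustration of the kind of aggregate, non-confidential analysis described here, the sketch below tallies exit-interview reasons and computes the share that is retention-related; all records and reason labels are invented for the example.

```python
from collections import Counter

# Hypothetical exit-interview data: one primary reason per departing employee.
exit_reasons = [
    "not feeling connected", "lack of professional development", "relocation",
    "not feeling connected", "higher salary elsewhere", "lack of professional development",
    "not feeling connected", "retirement", "lack of professional development",
    "not feeling connected", "lack of professional development", "not feeling connected",
]

counts = Counter(exit_reasons)
retention_related = {"not feeling connected", "lack of professional development"}
n_retention = sum(counts[reason] for reason in retention_related)

print(f"{n_retention / len(exit_reasons):.0%} of departures cited a retention-related reason")
for reason, n in counts.most_common():
    print(f"  {reason}: {n}")
```

Reviewed quarterly alongside vacancy and training data, a summary like this gives managers and staff a shared basis for retention planning.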

Financial Data

Financial data comprises all the program’s finances: costs and revenue. Employee salaries, office space, furniture, supplies, administrative support services, and the contract rate(s) are all essential financial data. Financial data is pertinent to program planning, management, and sustainability and, as a result, must be collected and analyzed frequently. Effective program developers and managers are keenly aware of the financial aspects of their program(s).
