Comparing Two Forms of Public Schooling: What Relevance to Homeschooling?
A recently published study compared the academic achievement of government-run public correspondence school students and government-run public conventional school students in Alaska (McCracken and Coleman, 2020). The reason that this study is being reviewed by an author and institute that focus on homeschooling (home education) is that the authors of the new study held that their findings tell us something about homeschooling. Do they?
To be specific, McCracken and Coleman studied government-run and tax-funded correspondence schools (i.e., “Alaska Statewide Correspondence Schools”; Alaska Department of Education and Early Development, 2020a). The researchers compared the academic success of Alaska students in public correspondence schools to that of students in “traditional” schools in Alaska. The authors did not, however, define “traditional” school, so readers will likely assume that it means conventional brick-and-mortar institutional public schools.
The first thing the reader should notice is that although the title of the article mentions homeschooling, it also tells the reader that this is a study of government-school students. The title is, “A meaningful measure of homeschool academic achievement: Statistical analysis of standardized test performance in Alaska public correspondence schools.” It should be noted that there are several indications that this is not a study of homeschool students.
First, many statutes (laws) control these government correspondence schools that are termed and defined as “correspondence study programs” in Alaska law, and they are not called, termed, or defined as homeschooling (Alaska Department of Education and Early Development, 2020b). For example, “… an individual learning plan for each student enrolled …” must be created every year and it must be developed with the assistance and approval of a government-certificated teacher, must be consistent with government-school standards, and must include government-controlled learning assessments of each child. If the student scores below a certain level, the learning plan must be modified. The laws also control what curriculum materials may be purchased with the government-disbursed tax dollars. Tax-funded state-certified teachers are involved. Textbooks for use in these public correspondence schools “… shall be selected by district boards for district schools.” “Partisan, sectarian, or denominational [religious] doctrines may not be advocated in a public school during the hours the school is in session.” In other words, the correspondence schooling of a child is directly, clearly, and comprehensively controlled by the state, with the “collaboration” of the parents. Alaska law defines homeschooling as something provided by the parent, not the government.
The second thing to notice with respect to the McCracken and Coleman paper’s title is that neither the modern homeschool movement community nor some 40 years of research on homeschooling has considered government-controlled and tax-funded distance programs to be homeschooling. Researchers and scholars, since the early 1980s, have considered the key attributes of homeschooling to be that it is led or directed by the parents, home- or family-based, and generally a form of private education that does not rely on state-run public schooling or institutional private schooling, on state-certified teachers, on government-approved curriculum, or on tax funding for the child’s education (Ray, 2012). The scholarly and popular literature has defined and conceptualized homeschooling this way for over three decades (Lines, 1987; Ray, 2012; Van Galen, 1987).[i]
The following definition accurately captures the meaning of modern-day homeschooling (or home education, as it is called in some countries):
Homeschooling is parent-directed, family-based education. Parent-directed means the parents have deliberately chosen to take responsibility for the education of their children, controlling both the education process and the curriculum (course of study). Family-based means the center of educational gravity is the home, with other resources being secondary. Parents may choose to partner with other homeschooling parents in cooperatives or support groups to provide portions of the education. They may also choose to have others assist in the education process (grandparents, tutors such as a music teacher, older siblings). (Homeschoolingbackgrounder.com, 2020)
Further regarding the definition of homeschooling, Williams (2003) provided a simple two-dimensional diagram that explains “four types of education,” with one of them being private home education (homeschooling) (Figure). Consistent with the definitions above, Williams showed that the clearest form of homeschooling is home-based (or family-based) and is operated under private money and private control.
With the aforementioned in mind, McCracken and Coleman presented an analysis of standardized academic test performance in Alaska public correspondence and public conventional schools, but also reported that they had “… the purpose of contributing to the literature on the academic outcomes of homeschooling” (p. 191). This purpose of theirs will be considered later in this review.
The researchers never specifically stated this, but their study is either a cross-sectional descriptive non-experimental study or a cross-sectional explanatory non-experimental study (Johnson, 2001); they imply that it is the latter. Designating the design is important to understanding and applying the findings.
The main dependent variable was academic performance, as measured by “pass rates,” a percentage value (p. 196). They analyzed about 12 years of data from two standardized tests, the Standards Based Assessment (SBA) and the High School Graduation Qualifying Examination (HSGQE). “The SBA test was designed to be a criterion-based assessment aligned with the Alaska Grade Level Expectations for each grade” (p. 193). The HSGQE “… was intended to ‘reflect the essential skills that students should know as a result of their public school experience’ …” (p. 193). Scores from the SBA were the authors’ “… largest source of testing data …” (p. 192).
During 2003-2014, student scores from the two tests were reported as pass rates. The authors calculated pass rates from data on the number of students enrolled in a given category, the percent in the category who participated in the assessment, the number in the category who achieved a score of proficient or above, and the number of students in the category who scored below the proficiency standard.
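The authors' pass-rate reconstruction reduces to simple arithmetic. The following is a minimal sketch of that calculation; the function name and all numbers are invented for illustration and are not drawn from the study.

```python
# Hypothetical illustration of a pass-rate calculation like the one the
# authors describe; the figures here are invented, not from the study.

def pass_rate(num_proficient: int, num_tested: int) -> float:
    """Percentage of tested students in a category scoring proficient or above."""
    return 100.0 * num_proficient / num_tested

# Example: suppose 850 students in a category took the test and 620 of them
# scored proficient or above.
rate = pass_rate(620, 850)
print(round(rate, 1))  # 72.9
```

Note that because participation rates differed between school types (a limitation the authors acknowledge, discussed below), the denominator of tested students may not represent all enrolled students in a category.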
The researchers’ main independent variable was type of government school (correspondence versus conventional) and the several background independent variables included “… gender, race/ethnicity, disability status, economic status, migrant status, and limited English proficiency status” (p. 194).
SBA Pass Rates
First, they found that the overall SBA pass rates of government correspondence school students were not statistically significantly different from those of government conventional (traditional) school students. A more fine-tuned analysis, however, did find some differences.
Public correspondence students scored significantly higher than traditional students in reading and writing. These government correspondence students scored significantly lower in math than their conventional school counterparts.
Further, at a more detailed level, they found school-type differences in SBA pass rates depending on the academic subject. For example, non-Caucasian and economically disadvantaged public correspondence students scored significantly higher than non-Caucasian and economically disadvantaged public conventional students in reading and writing, but significantly lower in math.
Male correspondence students scored significantly higher overall than male conventional school students. School type was not related to female students’ overall scores.
There was no significant difference in scores by school type between African American and Hispanic students. Alaska Native/American Indian government correspondence students, however, scored significantly higher than their government conventional school counterparts. Also, Asian/Pacific Islander correspondence students scored significantly better than traditional school students.
“When the scores of all non-Caucasian groups were combined, non-Caucasian correspondence students were significantly more likely to be proficient overall than non-Caucasian traditional students” (p. 197). Caucasian correspondence government school students, on the other hand, scored significantly lower than their traditional government school counterparts, as did correspondence students of Two or More Races.
Income and Economic Disadvantage
Public correspondence students who were economically disadvantaged scored significantly higher than public economically disadvantaged conventional school students. On the other hand, “… correspondence students who were not economically disadvantaged scored significantly lower than their traditional counterparts” (p. 197).
Disabled public correspondence students scored significantly higher than disabled conventional school students. On the other hand, government correspondence students who were not disabled scored significantly lower than their public traditional school counterparts.
Limitations Regarding SBA Data
McCracken and Coleman pointed out one of the limitations of their study. They reported, “… it is difficult to draw definitive comparisons between traditional and correspondence students because these groups did not actually take the SBA at the same rate” (p. 201). Although all public school students are required to take the SBA, the authors found that public “… correspondence students opted out of the test at a significantly higher rate than traditional students” (p. 201).
HSGQE Pass Rates
Government school “… correspondence students were significantly more likely to pass overall” (i.e., grade levels 10 to 12, overall) than were government conventional school students (p. 206). When examining only pass rates at the 10th grade, there was no statistical difference between the public correspondence and public traditional students.
At more detailed levels of analysis, the findings comparing public correspondence to public conventional students were mixed. For example, at the 10th grade, there was no difference in reading or writing pass rates, but the conventional students’ pass rates were better in math. There were no significant differences between school types by gender or by disability. Findings regarding ethnicity/race, economic disadvantage, and income level were mixed; that is, whether correspondence or conventional students had the significantly higher pass rates varied by category.
Limitations and Cautions
McCracken and Coleman did not state the type of study they were doing (e.g., non-experimental, experimental, causal, causal-comparative). They did, however, offer reasonable detail regarding their methods. They explicitly stated some, but not others, of the limitations or “caveats” of their study. For example, they divulged that they had to “… reconstruct much of the data using two methods” (p. 195). They also put forth some of their analytical assumptions, and explained some challenges with the nature of the data.
It is fitting that McCracken and Coleman stated the following: “Our data merely shows [sic] a correlation between school type and academic achievement for disadvantaged students in Alaska. … A more detailed quasi-experimental design would be necessary to hypothesize a causal relationship” (p. 213). They thus imply that this is a non-experimental study and it does not establish causation regarding different pass rates between public correspondence school and public conventional school students. In this vein, the authors should have been more careful about the use of the term effect (e.g., “… we found no significant effect of school type …,” p. 197), since their study design is not experimental or one that can establish causation, and they are not discussing effect sizes.
It is also proper and helpful that the authors explained that government correspondence students in every demographic category were significantly more likely than conventional public students “… to opt out of or simply be absent from the required state assessments” (p. 213). The authors do not know whether the public correspondence students in their study who opted out were more likely to be high-achieving or low-achieving students, and this fact, they acknowledged, “… makes it difficult to make definitive statements comparing the scores of …” (p. 214) public correspondence students to the scores of public conventional students in Alaska.
Also significantly, the authors mentioned another “caveat” about their study. That is, “… correspondence students in Alaska may not be representative of all homeschooled students in Alaska” (p. 214). In reality, as noted previously in this article, government public correspondence school students are not homeschool students and therefore no one should expect them to be representative of homeschool students.
A probing scholar will look at the practical differences between groups where the authors found statistical differences. For example, for males, the statistically different pass rates were about 73% and 71%; this difference might not be of practical consequence or worth in the world of schooling. Further, the authors did not address the inflation of the overall error rate that arises from the large number of statistical tests they conducted; such multiple testing amplifies the probability of false-positive findings (Ranganathan, Pramesh, & Buyse, 2016; Schochet, 2008).
Summary and Comments: Comparing Two Forms of Public Schooling — What Relevance to Homeschooling?
McCracken and Coleman have provided an engaging look at the relative academic success of two types of public school students in Alaska. One group is engaged in government-run correspondence schooling and the other is in conventional institutional schools. Their study will likely add to the research base and debate surrounding whether students in conventional public institutional schools or public distance, virtual, blended, online, or correspondence schools are better off academically (Miron, Shank, & Davidson, 2018; Vasquez & Serianni, 2017).
It is unusual and convoluted that the authors’ main empirical and analytical focus throughout the study was comparing the academic achievement of two types of public/government school students, yet they stated that doing so was “… with the purpose of contributing to the literature on the academic outcomes of homeschooling” (p. 191).
In their discussion section, McCracken and Coleman spent considerable space comparing the findings of their study that compared the pass rates of government-controlled public correspondence school and government-controlled conventional school students to research findings on homeschooling. However, as the terms they use – “Alaska correspondence students,” “traditional school,” “correspondence education in Alaska,” and “correspondence programs like Alaska’s” – show, they did not study homeschool students as defined and considered by scholars, policymakers, lawmakers (including Alaska’s), and the general public for the past roughly 40 years (as explained at the beginning of this article). [ii]
This study expands our knowledge and understanding of government-run public correspondence and conventional schools. It tells us very little, however, about homeschooling. It is not clear why McCracken and Coleman tried so hard to lump government public correspondence school students in with homeschool students. Many Americans lump whitetail, mule, blacktail, Coues, and Sitka deer, along with elk and moose, together as big animals with antlers (on the males). Mammalogists, wildlife biologists, and hunters know, however, that there are many differences between these various ungulates in terms of habitat and their feeding, social, and mating behaviors. There are many sound reasons to not lump them together for purposes of management, taxonomy, scientific study, and hunting. It is something of a mystery why McCracken and Coleman confuse government-run, licensed-teacher managed, and tax-funded correspondence schooling with homeschooling.
Perhaps the strangest claim by McCracken and Coleman is the following at the very end of their article: “Despite these caveats, our findings have important implications for homeschooling policy” (p. 215). Since their study clearly compared the academic achievement (measured as pass rates) of government public correspondence school students to government public conventional school students, their findings have no implications for policy related to homeschooling. Tax-funded, government-controlled public school programs such as “Alaska Statewide Correspondence Schools” are different in terms of law, conceptual construction, pedagogical practice, management, philosophy, and familial involvement from the vast majority of modern-day homeschooling. The authors are admittedly comparing the proverbial apples to oranges, with admitted major caveats and a non-experimental design, and then claiming that there is much to learn about homeschooling and much to apply to the government control (i.e., “policies”) of private homeschooling education.
In a final and peculiar twist, the authors implied that their study suggests a significant enough number of homeschool children are being harmed by homeschooling compared to those in government-run public schools that states and government school districts should analyze the data that they have in order to create “… homeschooling policies that protect children from negative academic outcomes” (pp. 215-216). Since (a) McCracken and Coleman’s study is not about homeschooling, (b) their study, from the big-picture view, largely finds a slight academic advantage for public correspondence school students over public conventional school students, and (c) a majority of peer-reviewed studies to date have found that homeschool students perform academically better than institutional school students (Ray, 2017), it is unfitting for the authors to imply that their study shows that homeschool students are at any kind of risk that needs to be mitigated by government policy, law, or controls.
McCracken and Coleman’s study provides more knowledge and insight regarding the relative academic success of two types of public school students in Alaska. One group is enrolled in government-operated correspondence schools and the other is enrolled in government-operated conventional institutional schools. Their study will likely provide information from which public school policymakers and educators can learn.
On the other hand, there is nothing in McCracken and Coleman’s findings that tells us about the academic achievement of homeschool students as compared to those in government conventional institutional schools or in government-controlled correspondence schools. Further, no empirical evidence in their study informs government policy or controls regarding private parent-directed and home-based homeschooling education.
Alaska Department of Education and Early Development. (2020a). Alaska statewide correspondence schools. Retrieved September 9, 2020 from www.education.alaska.gov/alaskan_schools/corres
Alaska Department of Education and Early Development. (2020b). Article 3. Correspondence Study Programs. Retrieved September 9, 2020 from https://education.alaska.gov/alaskan_schools/corres/docs/Correspondence_School_Statutes.docx
Homeschoolingbackgrounder.com. (2020). Retrieved September 8, 2020 from www.homeschoolingbackgrounder.com
Johnson, Burke. (2001). Toward a new classification of nonexperimental quantitative research. Educational Researcher, 30(2), 3-13.
Lines, Patricia M. (1987). An overview of home instruction. Phi Delta Kappan, 68(7), 510-517.
McCracken, Chelsea, & Coleman, Rachel. (2020). A meaningful measure of homeschool academic achievement: Statistical analysis of standardized test performance in Alaska public correspondence schools. Other Education: The Journal of Educational Alternatives, 9(1), 207-252.
Miron, Gary; Shank, Christopher; & Davidson, Caryn. (2018). Full-time virtual and blended schools: Enrollment, student characteristics, and performance. Retrieved September 9, 2020 from https://nepc.colorado.edu/sites/default/files/publications/RB%20Miron%20Virtual%20Schools%202018_0.pdf
Ranganathan, Priya; Pramesh, C.S.; & Buyse, Marc. (2016). Common pitfalls in statistical analysis: The perils of multiple testing. Perspectives in Clinical Research, 7(2), 106–107.
Ray, Brian D. (2012). Evangelical Protestant and other faith-based homeschooling. In James C. Carper & Thomas C. Hunt (Eds.), Praeger handbook of faith-based schools in the United States, K-12 (chapter 12, pages 123-135). Santa Barbara, CA: Praeger, ABC-CLIO.
Ray, Brian D. (2013). Homeschooling associated with beneficial learner and societal outcomes but educators do not promote it. Peabody Journal of Education, 88(3), 324-341.
Ray, Brian D. (2017). A systematic review of the empirical research on selected aspects of homeschooling as a school choice. Journal of School Choice: International Research and Reform, 11(4), 604-621. Retrieved February 20, 2020 from https://www.nheri.org/a-systematic-review-of-the-empirical-research-on-selected-aspects-of-homeschooling-as-a-school-choice/
Schochet, Peter Z. (2008). Technical methods report: Guidelines for multiple testing in impact evaluations. Washington, DC: The Institute of Education Sciences, U.S. Department of Education.
Van Galen, Jane A. (1987). Explaining home education: Parents’ accounts of their decisions to teach their own children. The Urban Review, 19(3), 161‑77.
Vasquez, Eleazar, III, & Serianni, Barbara A. (2017). Research and practice in distance education for K-12 students with disabilities. Retrieved September 9, 2020 from https://journals.sagepub.com/doi/10.1177/875687051203100406
Williams, Rodger. (2003). Four types of education. Retrieved September 8, 2020 from http://www.hstuac.org/four-types-of-education.png
[i] I recognize that there are differences in philosophical, pedagogical, and legal conceptualizations regarding “parent-led” or “parent-directed” between those parents, families, and children who consider themselves as unschooling rather than homeschooling.
[ii] A decade or so ago, when there was much less research on homeschool learner outcomes, scholars sometimes mentioned Alaska government correspondence programs from about two or more decades ago in reviews of homeschool research, pointing out that this correspondence schooling was not the same as homeschooling (e.g., Ray, 2013). At the time, the government exerted less legal and practical control over the correspondence programs.