
Parental Educational Decision Making: The Information They Seek and What They Want From Data Systems


by Ellen B. Mandinach, Ryan C. Miskell & Edith S. Gummer - 2020

Background/Context: Parents are important consumers of educational information, especially given stipulations in ESSA. Yet research on parental information needs is limited, and it is important to understand how information can be displayed for parents in accessible and understandable ways. This study uses a theoretical perspective of parental engagement and social networking to ground the research.

Purpose/Objective/Research Questions: Education is awash with information. This study sought to understand the nature of the information parents seek to make decisions about their child’s education. It also sought to understand the characteristics of data displays that would make information accessible and understandable. What information do parents seek to make decisions about their children’s education? Where do parents go to obtain educational information? What sources do parents access to obtain the needed information? What design characteristics do parents prefer to help them navigate websites and understand the presented information?

Setting: The study took place in focus groups held across Missouri.

Population/Participants: Twenty-one focus groups were convened with 118 parents from urban, suburban, and rural areas of Missouri.

Intervention/Program/Practice: Focus groups addressed the need to involve parents in educational decisions and their need for data. They were asked a series of questions to elicit responses about the information they seek to make decisions about their child’s education. Parents were also asked about the design, ease of access, and understandability of data displays.

Research Design: The study used small focus groups and a standardized interview protocol. To elicit more detailed and focused responses, alternative prompts were used to ensure that participants understood what was being asked.

Data Collection and Analysis: Parents responded to structured questions and visual displays. Responses were recorded through comprehensive note taking. The responses were analyzed using ATLAS.ti to identify underlying themes.

Findings/Results: Parents seek qualitative data sources that supply descriptions of schools and districts. They want not only test scores and school grades but also information that helps them understand the schools. They want details about the teachers, leaders, and programs offered and information about safety and processes. Parents want the information presented in accessible and understandable formats that include better graphics and more easily understood details.

Conclusions/Recommendations: Education agencies need to consider how to present diverse information that will meet stakeholders’ needs. This is relevant because not all parents are familiar with data displays. Parents seek descriptive data that are locally relevant and represented on school or district websites and data systems.



INTRODUCTION


This study focuses on both the information needs of parents and the nature of reporting mechanisms for data systems to support educational decision making. The topics are timely and relevant because the Every Student Succeeds Act (ESSA; The Congress of the United States, 2015) requires data use from the federal to the local level, including performance information that can be used for knowledge creation and capacity building (Darling-Hammond, Wilhoit, & Pittenger, 2014). ESSA also requires states to address parental needs for connection to their children’s education, a need that necessitates data collection and reporting. States have begun to address the ESSA parental component (Donovan, 2018). A growing trend is for states to pass legislation to empower families by providing student data (R. Anderson, 2018). For parents to make informed decisions about their children’s education, they must have easily accessible and understandable information, presented in user-friendly media and technologies that provide the kinds of information parents seek. Thus, this study addressed the following research questions:


What information do parents most need and seek to make decisions about their children’s education?

Where do parents go to obtain educational data and information? More specifically, what sources and resources do parents access and consult to obtain the needed information?

What design characteristics do parents prefer to help them easily navigate a data system or website and understand the data and information that are presented?


The review of the literature briefly describes the educational technology literature, noting that there is a dearth of research that pertains to stakeholders other than educators that appears in peer reviewed journals. It then turns to the limited literature on what information parents need to make educational decisions.


A BRIEF HISTORY OF EDUCATIONAL DATA SYSTEMS AND DATA REPORTING


There is a vast literature on educational data systems spanning two decades, yet only recently has there been a focus on data displays and reporting. This change reflects a trend in technologies from data systems that collect data to dashboards and applications that display and report data in accessible and easily usable ways. Many of the traditional data systems that focus on data collection are quite robust, showing the progress that has been made in the development process. These systems function effectively for the purposes for which they were developed. That said, the problem is not with the systems’ capacity to collect data. The emerging issue is with the data display systems, dashboards, and reporting mechanisms that make the data usable for others. The display systems have made data more sortable, comparable, and, overall, visible. Such functionalities will be essential for broad use by educators and other stakeholders.


In an informative and cutting-edge document, Wayman, Stringfield, and Yakimowski (2004) provided an early review of educational technologies, including detailed descriptions and critiques of their functionalities. The review spurred further examinations of emerging data systems. Wayman (2005, 2007) noted that there were four main types of data systems used by school districts: (a) student information systems, (b) assessment systems, (c) instructional management systems, and (d) data warehouses. Wayman noted that no one of these systems performs all of the needed functions on its own. Hence, vendors sought to combine features and diverse solutions into suites of applications within a single tool.


More recently, Wayman, Cho, and Richards (2010) re-examined educational data systems and identified four characteristics of good applications: (a) user friendliness; (b) user features; (c) information access; and (d) data quality. Although the authors focused on data use by educators, these four characteristics also generalize to other user populations, including parents. For audiences that are perhaps less sophisticated data users, ease of use, ready access, understandable displays, and understandable query tools may be particularly important.


Other technological applications are emerging to support data use. These include applications (apps) loaded on mobile devices and data dashboards that provide ready access to data, such as early warning indicators. The important component is that they provide access for data use. Rankin (2016) has noted that for any of these data technologies, there is a concern that the systems fail to provide proper interpretative guidance for users to understand the data.


Having useable data systems is a pervasive finding in the literature. Hamilton et al. (2009) conducted an extensive review and derived five recommendations for effective data use, one of which was that every district must have a data system to inform practice, either home grown or commercially developed. The data systems must be aligned to educational objectives at the classroom, school, and district levels.


At the state level, state education agencies are required to have statewide longitudinal data systems (SLDS). The Statewide Longitudinal Data Systems Grants Program has funded almost every state, expending over $1 billion to develop and implement these large and complex data systems. In the early rounds of funding, the focus was on building the technological infrastructure. In 2015, the focus shifted to data use, recognizing that the next step after development was the actual use of the data (National Center for Education Statistics, 2017).


The SLDSs have been used primarily for accountability and compliance purposes, and only recently has there been a focus on more locally relevant data use for classroom and district continuous improvement. Some states, like Arizona, have developed ancillary tools like data dashboards for more real-time data use by educators (see http://www.azed.gov/aelas/azdash/). The development of such dashboards reflects the growing need to provide data displays that are easy to understand and use and provide real-time data to educators.


State and local data systems focus on data collection for educator data use. Several assumptions underlie such technologies: (a) these data systems are about informing decisions for educators; they are not intended for public dissemination and the sharing of information; (b) there is a certain level of data literacy that is needed to examine the data (Mandinach & Miskell, 2017); and (c) the data housed in the systems are not those that have utility and relevance for other stakeholders, such as parents (Schneider, Jacobsen, White, & Gehlbach, 2018). There is a need for applications to display easily accessible and understandable data for a range of users from educators to other stakeholders, with rich data displays and easily interpretable reporting.


To tease out data needs and their utility, Schneider et al. (2018) conducted an experimental study to examine the impact of different types of data systems and their data on participants’ perceptions of school quality. They compared the utility of a state data system with a system that contained more comprehensive data, including data connected to teachers and the teaching environment, school climate, resources, indicators of academic learning, and character and well-being outcomes. The authors found that having a broad range of data impacted how participants perceived school quality and that the more comprehensive the data, the less reliance participants put on test scores and typical accountability indices. They also found that, given heavy reliance on social media, having a more comprehensive set of data provides “vicarious exposure” to data that may be more reliable than typical word-of-mouth information. However, rather than provide a comprehensive, diverse, and visually interpretable set of data, a majority of states have relied on composite indicators, like an index score, an A–F grade, a star system, or a qualitative descriptor (e.g., reward, focus, excellent, meets standards) to sort schools and present effectiveness levels (Adams et al., 2017). Reliance on composite indicators limits the information available to different stakeholder groups that can help direct school improvement.


Given data limitations, other data repositories have emerged to make data more publicly accessible to a variety of stakeholder groups. For example, the Virginia Department of Education (http://www.doe.virginia.gov/statistics_reports/research_data/index.shtml) developed a sophisticated portal for researchers. The Kauffman Foundation developed EdWise, a data visualization platform (http://www.kauffman.org/microsites/edwise) populated with data from the Missouri Department of Education, to provide data to stakeholders, including researchers, education support organizations, businesses, civic and neighborhood leaders, and media.


A HISTORICAL PERSPECTIVE ON PARENTAL USE OF INFORMATION


Parents wanting information about schools and districts is not a recent phenomenon. Two decades ago, A-Plus Communications (1999) conducted a study for Education Week’s Quality Counts entitled Reporting Results: What the Public Wants to Know (see also Olson, 1999a, 1999b). The purpose of this work was to understand what information stakeholder groups (including parents) want in order to hold schools accountable so that improvements can be made. The premise is that accountability reporting is only one aspect of why parents seek information. The report yielded four key findings. First, parents sought performance data (e.g., test scores, promotion rates, graduation rates). But parents also sought more detailed information about school safety and educator qualifications. Second, parents wanted comparative data across schools as well as in comparison to a standard. To the study’s credit, this finding came with a caveat about the overreliance on test scores. Third, the report yielded information about the design characteristics of accountability reports. Parents wanted the reports to be short and well designed, with access to additional levels of detail. Finally, the report documented a level of dissonance between the information contained in the accountability reports and what parents actually want. The ultimate finding of this work was that “what the public wants to know is not necessarily what school districts and state education departments want to provide” (p. 13).


School safety and teacher qualifications were the most desired data, followed by class size, graduation rates, dropout rates, state testing results, parental satisfaction survey results, SAT/ACT scores, and promotion rates. The least desired data were about technology offerings, attendance at parent-teacher conferences, and demographics. Other sources of data that were moderately desired included information about course offerings, attendance rates, expenditures, salaries, hours of homework, and number of students.


The study also examined parents’ perceptions about the characteristics of a good accountability report. It is important to note that less than one-third of the parents affirmed having seen an accountability report. Five topics were thought to be important for inclusion in the reports: funding expenditures, student performance, time allocation, school environment, and action steps for school improvement. Parents rated reports from school districts or the federal education agency as only moderately credible. Reports written by non-profit organizations based on policy or research had the most credibility.


The A-Plus Communications (1999) report concludes with key points that remain relevant today in terms of parental information needs. Although student performance is important, other data need to be provided, perhaps most notably school safety data. Another finding that generalizes to today is that data reports should be brief and readily understandable to diverse stakeholders. Finally, in building data reports, educational agencies should consult stakeholders to understand what information is important to them. To that end, Texas (Reynolds, 2018) is conducting focus groups to develop its data dashboard, and Mississippi (Donovan, 2018) is obtaining feedback to make its ESSA report cards more user-friendly for parents.


In a recent study, the Data Quality Campaign (2016) conducted a national survey of parents to understand their data needs. The survey found that 89% of the parents want student test scores and grades, while 48% want indicators of school quality (i.e., how well schools are preparing students for the future). Parents (87%) want data to help them make educational decisions. The study noted that data such as test scores are “polarizing” in that they focus solely on one indicator of student and school performance, sometimes to the exclusion of other sources of data. This sole reliance narrows the focus of the work of schools and eliminates other important factors. As the report stated, “Although assessment data is one important piece of student information, the narrow focus of the public conversation has ignored the fact that parents by and large want access to multiple types of education data” (p. 1). Yet 91% of parents reported being only somewhat interested in receiving information about social-emotional learning. They were more interested in receiving data that are perceived to help their children progress successfully. Parents wanted data that indicate on-track performance. Parents who reported not receiving these data wanted information about student performance and progress. The study also found that parents trusted educators to use data to support improvement.


A further analysis of the survey results for the Data Quality Campaign (Harris Poll, 2015) reveals additional information. For parents, the most important characteristics of school quality were preparedness for the future (48%), educator qualifications (48%), and school safety (47%). When asked about knowledge of the data being collected, 72% reported that they know what the schools are collecting, while 70% reported understanding how the data are being used.


The Data Quality Campaign (2016) noted a socioeconomic status effect in the survey responses, in which more affluent parents preferred what is known as “hard data,” such as grades, test scores, and graduation rates. In contrast, parents from less affluent circumstances were more accepting of social-emotional data. The study also reported that parents feel that schools and districts can do more to provide data, but they also feel that the data they do receive are timely and secure. Parents also reported that the data are accessible and easy to understand. Yet 70% of parents reported that there is a problem with one of the most important indices of school quality: how schools are preparing students for the future.


In contrast, the Harris Poll (2015) survey found no patterns for gender, age, region, and wealth among parents. Most parents (93%) receive information about how their child is progressing in school, whether their child is on track to graduate (80%), whether there is a need for remediation (78%), school safety (76%), preparation for the future (70%), and whether their child is ready for more advanced coursework (70%).


A follow-up poll was conducted by the Data Quality Campaign (2017). The report found four key results: (a) 89% of the parents reported that school performance influences their decision making (e.g., school A–F grades); (b) 91% reported that they used test results and grades to understand their child’s performance and progress; (c) 95% of the parents want teachers to use data (grades, test scores, and attendance) to personalize instruction; and (d) 88% report that their school is protecting the privacy of student data (an improvement from 81% in 2015). The report also found that parents sought data on educator qualifications (45%), school safety (43%), the availability of options and services (42%), and preparing students for the future (41%).


Impediments abound in parental use of data. One possible challenge is that there is no central repository of educational data, according to 64% of parents participating in the Harris Poll (2015). Parents have to seek out and have access to multiple resources to acquire the desired information. Another impediment may be that parents connect with the familiar (Phi Delta Kappan, 2015); that is, parents may have a bias toward rating school quality higher for schools with which they have familiarity. Parents might ignore hard data that conflict with the impressions they have gained through personal experience. Third, there is an over-reliance on unreliable data from social media and word-of-mouth sources (Schneider et al., 2018). Parents may or may not be good consumers of information. As A-Plus Communications (1999) showed, parents are skeptical about information found in public media such as newspapers, television, and radio. Even the chamber of commerce was viewed as relatively unreliable. Dabney (2018) reported unpublished results from the Project Tomorrow Speak Up research project that asked high school parents about the sources from which they seek information specifically about college and career readiness. Results indicated that 35% of the parents used word of mouth, 24% went to third-party sites (e.g., GreatSchools, Zillow), and 24% went to state websites, followed by high school websites and school district websites.


Real estate sites have become a source for information about schools and districts over the past decade. Websites like Zillow are linked to information about local schools as a selling point for rentals and purchases. Even before the advent of websites, real estate agents were armed with CD-ROMs filled with test scores and other data that parents could use as a proxy for real estate values. Yet the use of such data is problematic at best. Yoshinaga and Kamenetz (2016) noted that there is a “legal gray area” around the provision of such data. High-performing schools positively affect real estate values. Yet certain data may be off limits under Fair Housing Act policies, information that potential purchasers may want to know. Such data may include demographics, crime rates, diversity, income disparity, socio-economic status, and the racial composition of the schools. Steil and Jordan (2017) note that, overall, blacks and whites use similar information sources when making real estate decisions, such as networks, realtors, and newspapers; yet early research on real estate decision making found differential patterns of use by race, with blacks using social networks rather than the internet (Krysan, 2008). Given the proliferation of internet availability, Steil and Jordan (2017) note the need for more research.


THEORETICAL GROUNDING FOR THE STUDY


The theoretical perspective underlying this study is informed by two literatures viewed as working in parallel: parent or community engagement (Goodall, 2012; Hornby & Lafaele, 2011) and social networking (Stephenson, 2005). Although parents may make individual decisions about their children’s education, they seek input from a variety of sources, many of which may be in their immediate networks. As noted above in terms of real estate decision making, Krysan (2008) described the importance of social networks from which to seek information. Lawson and Alameda-Lawson (2012) and Warren, Honig, Rubin, and Uy (2009) noted the importance of collective parental engagement or social networks, particularly for low-income parents, to help mitigate barriers to student performance. Parents engage with community groups, school groups and leaders, and other relevant supports to enhance engagement with the school system.


Parental engagement relies on communication with trusted sources. Thus, Stephenson’s (2005) quantum theory of trust in social networking is relevant when parents seek information about schools and districts. Although Stephenson’s work focuses on business relationships in the form of formal social network analyses, the fundamental premise is that individuals tend to seek out people they trust. “People have at their very fingertips, at the tips of their brains, tremendous amounts of tacit knowledge, which are not captured in our computer systems or on paper. Trust is the utility through which this knowledge flows” (Stephenson as quoted in Kleiner, 2002). Trusted networked relationships may be the most important source of information for parents about the education of their children, beyond more vetted and quantitative sources.


METHODS


RECRUITMENT


The study sought the feedback and experiences of a diverse group of Missouri parents. The researchers relied on third party organizations to help recruit and secure commitments from parent participants. The research team worked with the National PTA to establish a relationship with the Missouri PTA and its school-level affiliates. These affiliates helped to inform parents of upcoming focus groups and secure commitments. To broaden the population of parents participating in focus groups, researchers also contacted nonprofit organizations that work with families in need (the Children’s Education Alliance of Missouri, the Columbia Housing Authority, and A Vision for Children at Risk), charter schools, and university programs.


Researchers met with 118 parents of diverse backgrounds and socioeconomic status. Of the 118 participants, 21 were male and 97 were female; 66 were white, 35 African American, 13 Hispanic, and 3 Asian. Seventy-one of the parents were recruited through the PTA and 47 through the nonprofit organizations. Fifty-one parents had students in schools in an urban setting, 50 parents had students in a suburban setting, and 17 parents had students in a rural setting. Each participant received a $50 gift card for agreeing to provide feedback. Parents attending the focus groups had children in varying grade levels, from soon-to-be pre-kindergarteners to recently graduated high school seniors: 39 parents had children in pre-kindergarten or younger; 77 had children in elementary school; 31 had children in middle school; 41 had children in high school; and 31 had children who had graduated from high school.


FOCUS GROUP INTERVIEWS


The focus groups were conducted with the use of an interview protocol that consisted of two parts addressing information needs and technology design issues. Each group was asked the same questions, but in a semi-structured manner to take into consideration the responses of the participants and reduce redundancy. In this semi-structured format, each question was addressed to elicit common information from the participants. The intent was for the focus groups to be limited to 10 or fewer participants so that interaction, discussion, and input would be maximized for all participants. One group had 11 participants because the research team did not want to turn away the extra person. The participants were told that the reporting of the research results would address the themes from the discussions and include quotations.


Two researchers were present for the focus group sessions. One researcher conducted the interview while the other typed verbatim notes throughout the discussion. The focus groups did not use voice recording (see Appendix A for a sample of the notes). The research team made the decision to rely on the notes rather than audio recordings for several reasons. First, having checked with the schools and the community organizations, the researchers found that the idea of audio recordings was not favorably received. This was confirmed through discussion with our agency IRB. Given the diversity of the sample, this decision also aligns with Tillman’s (2002) culturally sensitive approach to research. Researchers assured the participants that names and other identifying information would not be mentioned or reported.


The research team made every effort to engage each participant, regardless of the size of the group. For the larger groups, the leader sought input from each participant on each question. The leader probed specific individuals if they had not weighed in on a particular question. The leader also controlled the sessions to ensure that no participant monopolized the discussion, politely diverting the conversation to other attendees if needed. The focus groups all ran smoothly and were conversational, interactive, and respectful. There was only one instance in which a participant made a comment that required intervention by the research team. It was a racially charged statement and was immediately addressed. Fortunately, it was a small focus group and the comment occurred at the end of the session.


Education Information


The attendees were asked preliminary questions to gauge their level of interaction with schools. Participants were asked the ages and grades of their children, how long they had been members of the school and district community, and whether they were involved in the school’s PTA.


The researchers decided not to ask direct questions about information needs because they anticipated that such questions would not yield the kinds of responses sought from the participants. For example, asking some parents, “What kind of information do you need to make a decision about your child’s education?” or “What do you want from a website?” might lead to confusion and a lack of useful information. Instead, the researchers chose to use scenarios because the method has been shown to elicit deeper thinking from respondents and to provide more context for them (Dotger, 2011; Dotger, Dotger, & Tillotson, 2010). Scenarios also enable respondents to place themselves into authentic situations. Scenario-based questions were developed that placed the participants in situations that would be realistic and readily understandable. The scenarios were presented to participants to help ground the discussion and have participants consider the different viewpoints from which parents may seek education data. The scenarios are described below. Variations of these scenarios were pursued if there was a need to solicit further information. If a group went off topic or did not understand a scenario, the researchers rephrased it or provided parallel examples to solicit the desired information.


You are a new parent to a school and even to the area. You are pretty savvy about the neighborhood, education, and what you want for your family, but you want to know more about the school, the district, and the state. At an informal gathering you spot a neighbor who seems knowledgeable about the school and area. The neighbor has a child the same age as yours. What kinds of questions do you ask this neighbor? What is most important for you to know? How would you use that information?

If you were a real estate agent trying to sell a house in the neighborhood where the potential buyer’s children would attend school, what information do you share? What messages do you have for these parents?

A parent new to the area approaches you to ask about your child’s school. What are the most important things you would share with this parent?

You are a PTA officer for your child’s school. What information do you share with parents and community members? How can they learn more about the school? What are some developments at the school that parents and the community should be aware of?

You are looking to move in the near future and have the ability to take time to find an area that is right for you and your family. What types of information will help you make a decision about where to move? How will you access that information?

Your child will be entering elementary/middle/high school next year. At the grocery store, you run into a teacher who you know teaches at that school. What questions do you have for that teacher? What do you want to know about the school your child will attend?


Participants were asked to reflect on types and sources of educational information: (a) the educational information that is important to them; (b) the information to which they currently have access; (c) the information to which they wish they had access; (d) the kinds of decisions they have to make as parents now and in the future and what information will help them make these decisions; (e) the sources they seek out to explore education information; (f) which sources have a good reputation and why; (g) the forms of information parents seek most consistently and the forms they avoid; and (h) the information they want to have about a school, its leadership, and faculty. Participants were prompted to consider whether there were differences in the information they use and desire when their children are entering elementary school, middle school, and high school.


The participants also were asked how they reconcile conflicting information. Specifically, they were asked about the information parents tend to trust or rely on if their own experiences or the verbal feedback they receive from other parents conflict with published data and statistics. The researchers provided examples for parents to consider. In Case 1, parents were told that a school had received a grade of “D,” but that the school had good values and strong school safety, and that parents walking down the hall had excellent impressions of the school. In Case 2, parents were told the school had received a grade of “B” or “A,” but that impressions of the school were not positive. Parents were asked which school they would choose and why.


Website Information


Websites take different approaches to presenting information. Some are primarily verbal, others more graphical or statistical. Some require significant drill-downs to find information, while on others the information is less deeply embedded; this difference can be summarized by the click metric: how many clicks does it take to reach the desired information? Some websites require substantial navigation expertise, whereas others are geared to technological novices. Websites differ in their help components, such as Frequently Asked Questions (FAQs), definitions, translations, and explanations; in their visual appeal; and, more importantly, in what information and data actually are presented. They also differ in the assumptions developers make about the level of sophistication of the parental users (Donovan, 2018).


Asking participants directly about websites and their functionality would be too abstract to elicit useful information about design characteristics. Instead, the participants were asked to examine concrete examples of three websites that present data and information in distinct ways. The research team reviewed the websites of schools, districts, state departments of education, and organizations that seek to share education data. The selected websites were actual sites that varied markedly in how they present data and information and in their navigation functionalities. The three websites were selected specifically for their diversity, so that participants could react to their obvious differences and express their preferences. The researchers led parents through a navigation of three websites that display education data:


a National Science Foundation website (https://nsf.gov/nsb/sei/edTool/) that uses questions to help guide users to available data;

the Illinois Report Card website (http://www.illinoisreportcard.com/) that organizes information by key topics of interest; and

the Jacksonville Public Education Fund website (http://www.jaxpef.org/) that organizes information by school-level areas.


Each site was shown to parents one at a time using paper copies, and parents reflected on their reactions to the graphics, the ease of the site’s navigability, what they liked and did not like about the website, and which aspects were clear or confusing. Participants then compared the three websites and ranked them in order of preference.


Analyses


The software ATLAS.ti was used to examine and coordinate emerging themes across participants and focus group sessions, keeping in mind both parental engagement and social networking. Focus group session notes were captured as verbatim as possible during each session by one researcher while another researcher led the discussion. The data analysis process relied on the general inductive logic described by Strauss and Corbin (2007), Merriam (2009), and others to explore and build on specific observations and points raised by focus group participants. An “open coding” procedure was used in which each of the two researchers read through the typed notes and identified text segments of hypothesized relevance to the research questions. These codes were informed by the theories of both social networking and parental engagement. The researchers compared the segments they identified and developed low-inference “tags,” which were used to identify additional text segments and consolidate them into categories useful for analyzing the data across sessions. This coding hierarchy and its associated tags were applied to each document within the ATLAS.ti software.


RESULTS


Findings are reported by the topic areas addressed: what parents value in available information; what data and information they would like to access; and what they want from a website focused on their children’s education data.


WHAT PARENTS VALUE IN AVAILABLE INFORMATION


Parents were asked to consider the kinds of education data to which they pay attention, where they go for education information, and how they determine whether data are valuable and reliable. They also were asked how they reconcile conflicting information.


Data to Which Parents Pay Attention


All focus group sessions began with parents sharing their experiences accessing standard quantitative indices of school success, such as test scores, graduation rates, school grades, and educator qualifications. Parents discussed paying attention to reports that detail school, district, and state test score averages. One parent shared, “Test scores are my top priority.” Parents appreciate data that are comparable across schools, districts, and, in some cases, states. One parent commented, “What are the test scores? The ACT score is a reference that I understand. I don’t want too much to digest. If I can find a data point with a national standard that I understand how it is measured, that will help me.”


Graduation rates were discussed readily by parents who had children in high school or by parents who had recently moved or were thinking about moving. One parent shared, “If I was moving to a new area, I would want to know graduation rates. It is nice to know graduation rates and the ready for college percentage. It is an understandable presentation of data.” Seven focus groups consisted of parents who had moved based on education data they sought out and 10 focus groups consisted of parents who have considered or are considering moving based on these data. These parents reported that they considered the graduation rates of area schools when making home buying decisions. Another parent reflected on a possible move to Tennessee, stating that while she values test scores as a key piece of information, her husband focused more on the future of the community to which they would move: “I thought about what makes a good school. Are the kids progressing? Are they making progress? So, I look at test scores. My husband likes to think about growth in the area, if young people are moving in.”


Parents were looking for data that could help determine if schools would be successful with their children, with one commenting, “Will I trust these people to be with my child more than I am? Will they give her a strong foundation to move on in life?” Another parent described the search process: “I was in a situation where I could transfer my daughter and I could pick my top three schools in the district. I went to a website and looked at attendance, school websites, if teacher information was available. I picked the teacher I wanted. I looked at graduation rates, curriculum, the transportation route to the school, the district’s rating. I looked for gifted [and talented] testing. I looked for special needs services, emotional needs, and if the school supports them. I wanted to make sure the school could cover my kid’s needs, emotional and academic.”


Data on teacher qualifications also were of interest and were discussed by 13 of the 21 focus groups. Parents who noted the importance of educator qualifications as an informative data source reported paying attention to teachers’ backgrounds, training, schools of education, and number of years of experience at the school and in the profession. As explained later, opinions differed between parents who favored an experienced teaching staff and those who favored a younger teaching staff that may be more open to innovative practices. When possible, parents reported examining where teachers attended educator training programs. Some parents felt hesitant to send a child to a school where many teachers were alternatively certified through a program like Teach For America. One parent commented, “How many teachers have alternative certification? How many are Teach For America or Missouri Fellows? How many have little education experience? Some may be good, but others are bad, and some leave.” Parents were particularly concerned with teacher turnover, which was discussed in 17 focus groups. One parent commented, “I would look at staff turnover. Are there new teachers every year? I would avoid a lot of turnover.”


Different types of information were more important as children progressed through grade levels. Parents of children in pre-kindergarten and elementary school sought information on a school’s learning environment. These parents looked for data to show the schools had supportive and caring cultures, climates, and staff. A parent of an elementary school child commented, “Climate is huge. I do PTA to know what my child’s day is like. Climate is how much the teacher enjoys the building and colleagues, how much warmth is in the building.” Another parent shared that when looking at elementary schools, she asked, “Are counselors provided? How does the school respond to the social emotional needs of the students?”


Parents of middle school children expressed an interest in the enrichment activities available at school. These parents sought information on available student electives and extracurricular activities. One parent noted, “Many schools offer varying programs. I would like to see a matrix of what the programs offered at each school are. That would let you see what you didn’t know existed: [International Baccalaureate], career and [Advanced Placement] courses, STEM, STEAM, and gifted and talented. You would be able to see what programs are offered through the state, in districts, and how schools compare.” The parents also wanted school staff to be supportive of students in transitioning to be more independent and self-directed.


Parents of children in high school also expressed a concern for enrichment activities, with an added emphasis on advanced course offerings (Advanced Placement and International Baccalaureate), college and career readiness activities, and amenities provided at the school site. One parent noted, “There is one great high school in the district with a lot of amenities and staff. It provides strong academics and has technology for students, and has staff and coaches. Amenities are huge.” These parents were less concerned about the supportiveness of staff members. As one parent said, “By the time students get to high school, they are who they are.”


As discussions continued, school safety became the most discussed type of data considered by parents, and quantitative indices became of secondary importance. Parents with children in all grade levels discussed paying attention to information that would help them understand school safety. One parent shared, “When I think about middle school, I am concerned with safety. How do they handle discipline and how can they handle learning disabilities and behavioral problems?” Parents focused on the need to send their children to schools with safe and supportive learning environments. As described below, these concerns varied by grade level and urbanicity, that is, whether parents lived in urban, suburban, or rural contexts.


Parents of pre-kindergarten and elementary school children sought data to show the school had caring and supportive staff members and school counselors. To these parents, safety was centered on the culture and climate maintained by the school leaders and faculty, with one parent commenting, “I wish I could ask about how the school creates a socially welcoming environment for students.” For parents of secondary grade students, safety centered on the culture and climate as defined by student-level actions and data, such as suspension rates. Parents of middle school children discussed bullying as a concern. One parent asked, “Does the school have cameras? What if there is bullying? How are discipline issues handled?” These parents also reflected on the various forms in which bullying occurs. One parent moved her child out of a school that she found to be disorganized and placed him in a school that provided more services connected to student safety: “The [Parent Teacher Organization] and the district have evenings where lecturers come in and talk about cyber safety and cyber bullying. They are working to help parents talk about and learn about important things.” Parents of high school students focused on the prevalence of drugs and violence. A parent reflected on the high school in his son’s feeder pattern and the worries he has around potential violence: “We have heard they have discipline issues. We heard this on the news. They had fights, they have had drugs. What will the school do to ensure my child is safe and to make sure I know he is safe? I am also concerned about school shootings and what they do? What is their evacuation plan? How will they do that?”


Where parents live affected how they perceived information about school environment. Parents in suburban and rural areas focused on the within-school environment. For the purposes here, within-school environment means details on the school leaders and teachers, their mindsets, and their approach toward creating a supportive classroom and school environment. One suburban parent commented, “I think the principal is why the school is so successful. The principal sets the example and the teachers take that example and it affects their behavior and how they work with students.”


In contrast, parents in urban areas focused on a more limited definition of school environment while also focusing on the environment of the surrounding neighborhood. These discussions focused on school safety policies and the safety of that neighborhood. These parents expressed a deep understanding of and concern for the area’s crime history and rates and how they impact the school and surrounding area. The parents wanted information about school safety procedures and processes including the existence of security officers, metal detectors, and surveillance cameras. Issues included safety from violence and what the school was doing to protect students. Parents expressed fear about guns being brought into the schools and about predators lurking in the neighborhood. One parent shared, “Neighborhood details are important. There are vacant houses near the school and those can be a problem.” Another parent shared that when she was making a decision on where to move, “I think about neighborhoods, while others think about schools. They are different. I would want information on the neighborhood. What did kids have to walk home?” Another parent added, “Public transportation information is important. Knowing school boundaries and their transportation options matter.” For these parents, having both school-level data concerning safety as well as neighborhood-level data matters and affects their decision making.


In addition to safety, parents in urban areas expressed interest in how schools focus on the unique needs of their students in ways that did not come up in suburban and rural groups. Parents in urban areas recognized that many students face issues related to poverty and mobility. They were interested in the ways schools understood these problems and worked to address them, at a school level and in partnership with other organizations. One parent commented, “I want to know what resources are available in the community. We have organizations that provide coats, clothes, and shoes. School partnerships are important. Can the school provide for student needs?”


Where Parents Go for Education Data


The vast majority of parents shared how they access education information from a variety of sources. Parents have easy access to quantifiable data such as test scores and graduation rates, and these data are important. Parents get this education information from local district and school websites, the state department of education, websites such as GreatSchools (https://www.greatschools.org/), real estate websites, and realtors. Because parents were directly asked to imagine they were a local realtor and consider what information they would share with other parents, experiences with real estate websites and realtors were discussed in every focus group. Many parents have used this information, with one parent commenting, “Realtors provide a lot of information. Some will tell you where you do and don’t want to live. The realtors find school information useful and prospective parents find it useful.”


Parents want easily accessible and functional websites that require few navigational clicks. One parent shared that while accessing information online, “If I can’t find something in a few clicks, I will go elsewhere.” Absent a reliable and comprehensive source of information, many parents simply use Google to conduct searches and explore the websites that appear in the results. Relying on Google was explicitly mentioned in 16 of the 21 focus groups. Ten parents commented that if they are unable to find information of interest on a website after a few clicks, they will return to their Google search and try a new source. One parent shared, “I use Google and see what comes up. I seriously consider websites with ‘.org’ and ‘.edu.’”


Only a small number of parents sought information from the state department of education. Parents in 7 of the 21 focus groups commented on using the Missouri Department of Elementary and Secondary Education (DESE) website (https://dese.mo.gov/), but shared that, while it contains valuable information, that information is hard to access and understand. One PTA president commented, “approximately 10 percent of the parents in my PTA know of and use DESE, though I wish more could and would.” Another parent, who works as a demographer and is comfortable making sense of data similar to what DESE provides, said, “I try to use DESE but it is not user friendly and it is not intuitive.” After looking at a website example that one parent found easier to navigate than DESE’s website, that parent commented, “How different is this from DESE? There, you have to know what you are looking for and the abbreviations can be confusing.”


Parents expressed a strong preference for descriptive data, including parent and student experiences. They sought these data through social media and parental comments, with social networking looming large. One parent shared, “I appreciate the full circle way social media allows you to see the involvement and events at the school. You can see what is happening, read parent comments, and the Facebook page directs you to more information.” However, the way by which parents access descriptive data is not limited to the Internet. Parents discussed talking with other parents, attending school functions, interacting with community members through religious organizations, contacting the PTA, and attending board meetings. One parent shared that when moving to a new area, she “Looked up schools in the area, found their websites, found churches near the schools, and reached out to church members” when possible.


Parents also discussed visiting schools to talk with educators. One parent stated that while test scores matter to her, “I have to be able to visit the school.” These visits help parents develop a sense of a school’s culture, climate, curriculum, classroom activities, and homework policies. One parent stated that she wants to know, “Are teachers and kids looking happy?” Parents valued sources of information that can provide this type of qualitative data, largely believing these data paint a more complete picture of a school.


Determining Value and Reliability


As described above, parents who access websites for education information often determine a website’s utility in a matter of computer mouse clicks. Parents are looking for “trusted,” third party sources of information, with one parent noting, “Making education information public on a neutral website will make the districts work to maintain equity.” But parents want this information to be easily accessed and professionally presented. If a source of information is difficult to navigate or does not appear to be professional, parents will abandon that source and seek others. All but one focus group (20) discussed seeking out multiple sources of information. One parent commented, “Sometimes a website will look credible, but if I don’t see the source information, I will find another website. Tell me where [the data] came from.”


Parents want sources of information that make clear what the data mean and where they come from. When possible, parents prefer to have this information augmented with parent feedback. Participants acknowledged the limitations inherent in websites that provide parent feedback: biased parents are likely to provide an incomplete portrayal of the school. However, websites that pair parent comments with the dates those comments were published are more likely to be trusted than websites showing quantitative data with no information on when the data were published or how they were collected and measured. One parent shared that his preferred website would have four things: “The data, the source, the date, and a comment section where real people can leave comments.”


Parents expressed frustration when different forms of quantitative and qualitative data conflict with one another. As this conflict became apparent at the first focus group, a set of questions that would specifically elicit more information about conflict resolution was introduced for subsequent focus groups, as described in the Methods section above. In almost all instances, parents deferred to the descriptive information. If the school received a poor rating but the parents felt positively about the school, the parents focused more on the positive feelings. One parent commented, “When I look at ratings, I know there are good kids at bad schools and bad kids at good schools. I look to accountability. Do the teachers hold my kid accountable? I look to the school’s motto and language. If they talk about being responsible, respectful, and doing the right thing, the things I practice at home, then I am happy.”


Conversely, if the school was highly rated but the parents had negative impressions, they again opted for their impressions rather than the data source. Parents more readily believe their intuitions and subjective information available to them than the harder, quantitative evidence, with one parent commenting, “I would have to visit the school and then go with my gut feeling.” Another parent added, “I don’t look at the [ratings] online because you can’t find everything. It comes to interactions with people in the school. There are relationships in the school. It’s not always about getting answers correct, it’s about who [my daughter] is as a person.”


Same Information, Different Interpretation


Not only did parents have to deal with conflicting information; different parents also conflicted in how they interpreted particular data sources. Focus group discussions yielded evidence of conflicts in how parents interpreted the same sources of information concerning technology, student demographics, and teacher experience.


Ten focus groups discussed the technology available within schools and districts, and all but two parents wanted schools to provide varied technological experiences that help students develop important skills. The majority of parents wanted to know how much districts and schools spend on technology and viewed this as a measure of how much districts invest in students. One parent shared that she was very pleased that her child’s school “Has one-to-one technology with Chromebooks.” Two dissenting parents expressed a desire not to have elementary-aged students use technology in school. These parents noted that technology use is already prevalent and did not want children to have more time on devices. One parent shared that schools “Are introducing [technology] to students too young. I don’t want kindergartners on laptops. I want them interacting.” These parents felt that the use of technology came at the expense of collaborative and engaging teaching practices, viewing it as a replacement for lazy teachers. While the majority of focus group parents would view data on the technology available to students as a favorable metric, these two parents viewed it as unfavorable, but still a metric they wanted to know.


A second example of interpretive conflict arose around student demographic information. Diversity and demographics were discussed in 15 of the focus groups, and parents overwhelmingly desired data on a school’s student demographics to find diverse schools that could give their children experiences with diverse peers. One parent commented on searching for the most diverse schools and loved that she found an elementary school “that was like a little United Nations.” Another parent sought out a diverse neighborhood into which to move so that her daughters would have as diverse an educational experience as possible; the school she selected was primarily Spanish-speaking, although she spoke only English. A third parent commented, “I want my child to be in a diverse environment. It matters.”


However, there were two instances in which parents described not wanting a school to be diverse. One parent’s child attended a suburban school that was now accepting students from a nearby urban area due to accreditation issues. This parent viewed diversity as a sign of potential decreases in school-wide achievement and safety due to perceived deficiencies in the urban school from which new students were coming, explaining, “The school has changed from six years ago. There have been environmental changes . . . with an influx of students that increased free and reduced lunch rates and changed demographics. There was then turnover in teachers, so now we have more new teachers. We’ve seen challenges with discipline and focus.” For another parent, race and ethnicity were clearly an issue. This parent made it explicitly apparent that he not only did not want his children in a diverse school but also sought out a school with a faculty lacking diversity. It is important to note that these two examples represented a stark minority. For all other parents, diversity was an asset and demographic data were viewed as helpful resources in school selection. Parents by and large want access to school demographic data, but a minority of focus group parents would use those data differently than the majority.


A third example of conflict was how parents interpreted teacher experience, which was of interest in the majority (13) of the focus group sessions. Most parents expressed an inclination to send their children to schools where teachers had several years of experience. One parent shared that she looks for “Years of experience versus credentials.” However, in three focus group sessions, parents expressed a desire to see a school staff consisting of younger teachers, even if they were less experienced. These parents thought that younger teachers would be more comfortable incorporating technology and innovative practices, whereas more experienced teachers may not be as comfortable or as open to changing their practice. After this opinion was voiced, the groups split between parents who agreed and those who still favored a more experienced teaching staff. One parent commented, “I’m not sure if I want a new teacher or an older teacher. New teachers might be more open to changing their behavior management and consequence system [so they are] conducive to learning and growth.” While the majority of focus group parents would look favorably upon a school with experienced teachers, a minority of parents would interpret the same data point differently.


WHAT DATA PARENTS WANT FROM INFORMATION SOURCES


Parents were asked to identify the data to which they would like to have access in order to get a sense of what they want and do not currently have. The variety of data desired by parents would be difficult to capture within one information source. Some of these data would be suitable only for school-specific websites. Parents recognized these limitations. However, parents expressed a desire for a source of information that could provide an overview of key education data, a high-level analysis of that data, and direction to more nuanced sources of information (including district, school, and appropriate nonprofit websites).


Though the type of qualitative data desired by parents depended on the urban/suburban/rural context of the focus group location, as described previously, all focus group sessions generally concluded with parents expressing a desire for school-specific qualitative and quantitative data that are comparable across schools, districts, and years. One parent commented, “I like being able to see and compare schools.” Another parent added that it is important to have the capability to compare across schools and districts because, “We need to know what and why we are not offering the same things.”


The quantitative data that parents already access largely are the data they desire: academic data such as test scores, graduation rates, and proficiency rates; school-level data such as attendance, class size, total enrollment, and student demographic information; and district-level data such as financial information. One parent commented, “I am focused on statistics and trends. If I see trends that other schools are increasing in various ways and our school is not, that would be a consideration to leave. We can use that information to make decisions.” Parents wanted the capability to access a variety of data and to make sense of those data, or not pay attention, as they saw fit.


School-specific information, both qualitative and quantitative and largely housed on school and district websites, depends on the capacity of those schools and districts to provide updated and accurate information. As a result, these data are more difficult for parents to access reliably, especially for schools with little capacity. One parent commented, “I like to know how much the district spends per child, how much the library spends per child, how much the district spends on technology. But we don’t get that because it’s not published, and we cannot compare to other districts.” Parents expressed frustration at not being able to depend on the quality and timeliness of the data on these websites. Parents want reliable access to school and district mission and vision statements, leadership information, data on how students are being prepared for college and careers, and data on how successful students are in college or in obtaining employment. One parent shared that information on “Leadership and the mission and vision is important to know and how it is carried out matters.” Another parent shared that she had always desired the amenities of larger schools and districts but wanted a smaller environment and was pleased to find a school that had a “Population of 500 but we offer an array of career and college courses, from technical to businesses. There’s a myriad of programs being offered with a student-teacher ratio I like.”


Parents also expressed a desire for financial information such as per pupil spending, teacher salaries, and expenditures on facilities and programs. Parents wanted qualitative data to be paired with these quantitative measures. They greatly appreciated access to the parent comments provided on social media websites, while recognizing that these are not always the most reliable sources of information. There was a consensus that websites providing clear and accurate quantitative data alongside a section for parent comments and feedback would be highly valued.


WHAT PARENTS WANT FROM A WEBSITE


Parents considered the different ways in which education data could be published and navigated through websites. Parents were shown three examples of websites that differed in the ways information was displayed and asked to consider what they liked and disliked about each website, how easy or difficult it was to navigate each website, and then to compare the three.


Features of Websites that Parents Liked


Participants were asked about the presentation of certain data elements. Figure 1 demonstrates a data display with four features that the majority of parents (17 of the 21 focus groups) identified as necessary for making sense of education data. This example, from a school’s page on the Illinois Report Card website, provides users with a question mark (see *1A in Figure 1) next to the name of the data being displayed. This feature allows users to click the question mark and learn the definition of the data, in this case, what PARCC means. Parents expressed wanting the ability to quickly learn and define education phrases and terms without having to navigate to a new page. After examining the question mark feature, one parent commented, “I like that this is easy to navigate, and I don’t have to open a new page to Google this term.”


This display provides icons that allow users to interact with the displayed data by downloading it to Excel or PDF format, or by printing the image (see *2A in Figure 1). Parents expressed wanting these capabilities as they navigate education data. The display allows users to quickly compare data by grade level, student groups, or by school, district, and state. One parent commented, “I like that you can download the facts to Excel. I can download the data and make use of it.” It also allows users to toggle between different subject areas and comparison groups (see *3A in Figure 1). Parents expressly desired this capability, which allows them to more easily access and interpret data. One parent shared, “It is so easy to drill down. If I went to a school’s page, I wouldn’t know how it compares to other schools in the district. I like to see this.” Another parent added, “This makes it easy to compare. It is always nice to compare on the statistics you want to see.”


Finally, this display also provides a graphical display of the data (see *4A in Figure 1). While a minority of parents were comfortable reading data in tables and charts, most parents wanted graphical displays of data when possible. One parent commented that this display is “A nice way to represent information, having district, state, and school all in one graph. I can compare easily.” Another added, “The percentages also help a lot. This has clear information.”



Figure 1. Four examples of data features desired by parents

[39_22944.htm_g/00002.jpg]

N.B. The figure demonstrates four main features parents look for when searching for education data. *1A: The question mark allows users the ability to click and learn the definition of “PARCC.” Parents want the ability to quickly learn what phrases and terms mean without having to navigate to a new page; *2A: These functions allow users to interact with the displayed data by downloading it to Excel, to PDF, or printing the image. Parents expressed wanting these abilities as they navigate education data; *3A: This feature allows users to toggle to different types of data without having to navigate to a new webpage. Parents expressed wanting this feature as it allows them to more easily access and interpret data; *4A: This feature provides a visual display to compare data at the school, district, and state level. Parents expressed a desire to compare data at these levels and to have data displayed in a graphic when possible. Source: Illinois Report Card (http://www.illinoisreportcard.com/).




Parents liked website features that made it easy to navigate the available information with minimal computer mouse clicks. One parent noted, “I like to have a snapshot of information without having to navigate off the [web]page.” They favored websites that were designed in warm colors and with a clear and easy-to-use search feature that allowed for varied search terms (e.g., topics, districts, schools, addresses, and zip codes). Parents did not want websites that appeared sterile, with one commenting, “I think there can be more friendly colors than others. I like the warmth.”


Parents also appreciated website functionality that allowed them to better understand the presented information. The ability to access FAQs and have a quick way to define an education term or specific measurement made them more likely to value and use a website. For example, a website that allowed a user to hover the computer mouse over the term “PARCC” to learn the meaning of the acronym and read a brief description of it was strongly favored. Included in website functionality was the ability to translate a webpage from English to another language. Both Spanish-speaking and English-speaking parents noticed when this capability was and was not present and valued websites that provided for this ability. One parent commented, “I like that you can translate the page into Spanish by clicking the button.”


Parents strongly favored the presentation of information through icons and “quick facts” that allowed them to gain a general sense of multiple data points quickly without having to navigate to a new webpage. One parent shared, “I like the snapshots and data below, that is great. There is no clicking back and forth.” Parents liked the ability to then select one of these data points to explore it further and gain more clarity and understanding if it was of interest to them. The ability to use those data to compare schools to one another, to the district overall, and to the state was an essential feature for the participants. One parent noted, “I like the comparison of the school, state, and district. I like to see if a school is better or worse and that can tell me to pick a different school or stay in it.” Parents also wanted to know when the featured data were collected and published. Up-to-date data made a difference to the parents.


Parents also liked having a variety of ways to retain and share the information provided on a webpage. Features that allowed parents to download the data to Excel, print the page, convert the page to a PDF, or share a link to the page via email, Facebook, or Twitter were perceived as helpful and valuable. One parent shared that she and her husband do research on schools on their own and then talk about it, but it can be hard to find a specific link again. She “liked that you can get a PDF and print it.” This feature would help her be more effective in using and sharing the data: “My husband and I can do the research when we have time, print out what we find interesting, and talk about it later,” she added. Another parent added, “I like that this website has the ‘print this page’ button large and where you can see it.” A third added, “If I were choosing between a few different areas, I would print off pages and then compare.”


Parents disliked features that made webpages appear unprofessional. These included the use of stock photography, unattributed data, unexplained abbreviations and acronyms, and unrefined graphs and tables. Such features made parents less likely to explore a website in depth to understand its capabilities and use; rather, they would close that page and begin a search for a new website. One parent shared, “I get distracted by stock photography. It is dumb and pandering.” Another added, “Stock photos turn me off. It makes it sterile and it does not seem credible.” Parents also disliked loud and bold colors, finding them both unprofessional and disconcerting. More muted tones were preferred and did not detract from the real data.


Parents expressed a need to balance the information available with how that information is presented. While some parents mentioned wanting a lot of information, there was overwhelming agreement about not wanting to navigate through several webpages or hard-to-understand text and graphics. One parent noted, “People don’t have time to read everything. We want to be able to scan and find something quickly,” and another added, “If [the webpage] has a snapshot and data below, that is great. I don’t want to click back and forth.” Presenting data clearly and simply is necessary for parents’ data use and understanding.


Website Design Considerations


A second component of the study was to understand website design characteristics. Figure 2 demonstrates a key feature of the Illinois Report Card that the majority of participants liked: the Fast Facts icons. This feature displays icons that provide an overview of 10 school-level data points that users can click for further examination. A user can hover the computer mouse over the icons to read a description of each variable. This feature meets the needs of participants because it provides an easy way for them to quickly explore a lot of data without navigating to new pages, allows them to better understand data with which they are unfamiliar, and helps them decide which data to explore further. One parent commented, “I like that you get facts without having to click.” Participants also favored these icons over the stock photography used on other websites. As noted above, they found stock photography to be insincere. In the absence of pictures of actual students in the school, participants found icons to be an acceptable way to accompany data and make for more user-friendly webpages. That said, using actual pictures raises the potential problem of protecting students’ privacy.



Figure 2. Example of a data display favored by the majority of participants

[39_22944.htm_g/00004.jpg]

N. B. Figure 2 demonstrates a data display favored by the majority of participants. Parents preferred displays that had multiple data points displayed simultaneously to limit the need to navigate to other webpages. Source: Illinois Report Card (http://www.illinoisreportcard.com/).




The Illinois Report Card platform demonstrated ease of use, ease of understanding, and ease of navigation, with the user having to make limited decisions. The way in which these data are navigated closely matches how many parents use the Internet: through a search engine. Overwhelmingly, parents go to Google, enter a search term, and navigate from there. The Illinois Report Card directs parents to data through a similar method: parents enter a search and navigate to data. The way in which the parent refines this search is minimal.


Overwhelmingly, parents preferred websites that provided a search bar to access information throughout the site. Parents compared this preference to being comfortable with search engines. Providing a home page that predominantly promotes a general search bar, like the home page of the Illinois Report Card website, will help parents be more comfortable in accessing information. Regarding the search bar, one parent shared, “You find it inviting. I like the detail. People don’t have time to read everything, we want to be able to scan and find something quickly. The search bar jumps out at me.”


After completing a search, parents overwhelmingly preferred having multiple forms of data displayed that they could then further dissect and analyze. A recommended design is a template that populates a synthesis of important school and district data before the user has to select and filter for terms.


CONCLUSIONS AND IMPLICATIONS


DISCUSSION


The current study examined two topics. First, it examined the information that parents want in order to make decisions about their children’s education. Second, it examined the design characteristics that parents see as desirable to make data systems informative, usable, and accessible. The study included focus groups with parents from across Missouri who were diverse in socioeconomic status, ethnicity, educational background, public/private school enrollment, and geographic location. Despite this diversity, one message was clear in all of the focus groups: parents care deeply about their children and the education they receive. Parents want:


information that will help them make better and informed decisions about education;

the information to be easily accessible and easy to understand; and

information from user-friendly interfaces that provide a nice balance of information without complexity.


In education, districts, schools, and educators are often judged, rightly or wrongly, on what are seen as more objective, quantitative indices of quality. These may include test scores, graduation rates, dropout rates, educators’ qualifications, school grades, and per pupil expenditures. Such data may reside in local and state data repositories where they can be used for comparisons, compliance, and accountability. Educators want data that can lead to a process of continuous improvement. Parents are different. They want data and information to make informed decisions about their child. Those data may well be more qualitative and not easily reside in a data system. Parents often rely on social networking to obtain the needed information. In reality, whether for educators, parents, or other stakeholder groups, data use is all about how the evidence is interpreted and made actionable through the decision-making process.


As acknowledged in ESSA (Congress of the United States, 2015), parents form an essential stakeholder group, which is why the legislation emphasized parental engagement. Parental information needs are likely to differ from those of other stakeholder groups (A-Plus Communications, 1999; Olson, 1999a, 1999b). Thus, parental engagement is seen as important in the educational process. Further, some parents may have limited sophistication with data, constraining their ability to make meaning from them effectively. Some parents even lack access to online resources from which to obtain educational information. Data systems that can balance providing easily understandable information while not limiting the data reported may be essential in getting the appropriate data into the hands of parents and others.


Many of the findings here concur with the few studies that address parental data use. Parents want to know about traditional school performance indices. Parents also want to know about their own child’s performance and have easy access to student-level data, typically through a parent portal on a local data system. But what now looms largest for parents is information about school safety. This topic is especially important for parents with young children, but the importance continues throughout the levels of education. It is especially important for parents in challenged schools. Parents want to know that their children will be safe in a supportive environment. Parents also have specific information needs based on the age of the child; that is, for elementary, middle, or high school level children, there are some common information needs but also specific needs. For example, college and career readiness looms large in high school, but preparedness begins at middle school. Enrichment activities are important at the middle and high school levels. In contrast, the most important information at the elementary school level is safety and support. Parents want to make sure their children have a loving and supportive environment.


Parents tend to seek information from a variety of sources. These include education (such as GreatSchools), real estate (such as Zillow), and local district and school websites. Although real estate sites provide limited information about schools, parents may consult them even if they are not in the market to buy or rent. Zillow contains school ratings provided by GreatSchools.org as well as grade levels for local schools and proximity to a targeted property. The parents interviewed in this study seemed more concerned with proximity than school rating. At no time in the focus groups did anyone mention the use of the websites to obtain demographic data. It is important to note here that the purpose of the inquiry about potential sources of information, such as Zillow, GreatSchools, or Trulia, was to determine where parents seek out information about schools, not to identify potential or confirmatory sources of bias.


Of course, there are risks involved with the use of the real estate websites due to Fair Housing laws and potential bias (Bischoff & Reardon, 2013) and real estate agents have strict guidelines about not discussing demographics. Yet as research has shown, many variables go into a housing decision (Bayoh, Irwin, & Haab, 2006; Dunning & Grayson, 2014). The Harvard Joint Center for Housing Studies (2017) convened a symposium to discuss how communities can become more inclusive. One of the panels at this symposium was entitled, “What would it take... To promote residential choices that result in greater integration?” (Steil & Jordan, 2017).


Parents rarely go to state department of education websites. When parents seek websites with education information, they most often use Google and conduct a simple search, rather than having a particular website in mind to navigate. But the prevalent sources of information are not technology-based; they are word-of-mouth through social networks. Parents talk to other parents. They talk to PTA members and community leaders. They get a feel for schools by walking the halls and talking with the principal, the teachers, and others in the school. They form an impression of a school from a teacher’s expressed vision, information about the curriculum, classroom activities, homework policies, and displays in the halls.


Perhaps the most interesting finding is how parents reconcile conflicting information, essentially making a statement about how they prioritize educational data. In almost all instances when parents were provided conflicting information, they deferred to the descriptive information. Participants reported that they more readily rely on information gathered through experiences with schools or from trusted informants, and that they are more likely to use subjective information, rather than quantitative data, to make decisions. This finding has major implications for the provision of information.


Another finding concerns how the same data may be interpreted differently by different people. The role of data repositories is to provide access to the desired information, leaving the interpretation of the data to parents and other stakeholders. One aspect of the validity of the data themselves lies in data quality—that is, timeliness, accuracy, and completeness. Yet there is also validity of interpretation: how end users make meaning of the data provided to form the basis for their decisions. That is clearly the case in the examples of the conflicting data sources.


The study also sought to understand what makes good data displays. This is particularly relevant and timely, given that states are now struggling to design effective and meaningful school report cards and data dashboards as required by ESSA (Donovan, 2018). Currently, education has robust data systems that collect and store much more data than are being used. In fact, the field is awash with data. But data access and data visualizations are lacking, which makes interpreting data and translating information into decisions difficult or impossible. Data dashboards are hard for teachers to use, but at least they exist. Data dashboards for parents are much less well developed. Thus, increasing attention is being given to developing better and more understandable ways to visually present data (Friedman, Bernstein, & Miceli, 2018).


Part of the issue is about what data are accessible and how they are displayed. Parents want data that are easily accessed, displayed, and understood. They want FAQs, drill-down capability, translations, and explanations of indices. They want easy navigation. They want a warm and friendly design, and they want to know how current the data are. These findings not only inform the design of robust websites dedicated to education data, but also the ways in which schools and districts can better tailor their website design and content. Schools, districts, and states can apply these findings to make their websites more parent-friendly and to empower parents with information they can navigate, make sense of, and use to make decisions.


Making the right data available to parents is a first step to empowering them to use data. But another issue looms large and has received little or no attention in the research literature. Having access to and being given data is one thing. Knowing how to make sense of the information is a more complicated issue. There can be no assumptions about the level of sophistication most parents have when it comes to dealing with data and information. The field knows that educators often lack data literacy and have not been trained adequately to effectively use data (Mandinach & Gummer, 2013, 2016a, 2016b). States have only recently begun to attend to the importance of educator data literacy (N. Anderson, 2018; Data Quality Campaign, 2014; Mandinach, Friedman, & Gummer, 2015). If educators lack the capacity to use data, what expectation is there for parents to be able to understand and make meaning from complex information? There may well be a disparity among parents who differ in their education levels, but even among the most educated, understanding educational data is not a trivial enterprise.


Not only may data literacy be an issue, but the digital divide also remains an impediment. There is an implied assumption that all parents have access to electronic information. This is not the case. In one focus group in an economically challenged location, several participants stated that they did not have access to the Internet, computers, or mobile devices. They had flip phones with no connectivity. It was unlikely that these individuals would seek access at a library or elsewhere where there might be publicly available computers. This circumstance raises the issue of how unconnected parents can seek information other than through human interaction.


The need for data and information is only going to increase over time. It is essential that data repositories are secure, accessible, user friendly, and contain data that are valid and of high quality. These data need to be readily available to end users. But the issue remains whether the kinds of data that parents find most useful and actionable are the ones that typically reside in data warehouses and district websites. Access to more data is not being accompanied by better data dashboards and displays. The Schneider et al. (2018) study provides evidence that providing more comprehensive data than are typically found in state (and likely local) data systems can serve to fill the information gap. But the question remains whether the broader range of data can be collected and stored in district and state data systems in ways that are understandable to the general public.


LIMITATIONS


This study is limited by the representativeness of the parents. Participants were recruited through outreach to local PTAs and to local nonprofit organizations that work with families in need. This type of outreach made it difficult to secure equal numbers of parents representing all grade levels, socioeconomic statuses, and urban, suburban, and rural school contexts. Using the PTA as a conduit for recruiting also introduced a potential bias to the sample: PTA members tend to be more active than other parents, a bias that was unavoidable in the study. While parents who identified as upper middle class did attend focus group sessions, there were no parents from more affluent backgrounds. Perspectives from parents in more affluent communities may have yielded different feedback. While there were parents from rural areas, representation from parents in more extreme rural districts was not possible. Some parents mentioned having moved from these types of school districts, but no participating parent currently lived in those areas.


Additionally, relying on third-party partners to recruit parents made it difficult to maintain consistent numbers of participants across focus groups. Researchers worked with these partners to generate communications to potential participants and to cap outreach after a certain number of parents (10) committed to attend the focus group. Limiting the number of participants per session was done in an effort to ensure each participant could be an active member of the group and to streamline compensation for the participants. However, some outreach partners were more capable of ensuring parental commitments than others. In the best scenarios, researchers were able to communicate with partner organizations without issue and to receive updates if and when parents had to cancel their participation, allowing for further outreach to be conducted. In the worst scenarios, partner organizations were not able to find participants and failed to communicate that to the researchers until right before the focus group date. This fluctuation in partner organization communication and outreach effectiveness resulted in focus group sessions that had 2 or 3 parents in attendance when 8 to 10 were expected.


Researchers ensured that every participant was able to respond and express their perspectives by structuring the focus groups to include open discussion and, when needed, direct questioning. The team applied appropriate wait-time strategies and directed questions to quieter participants. None of the focus groups had a respondent who dominated the conversation. However, there was one incident in which a respondent made a racially charged statement. Fortunately, it was a small group that included the individual’s spouse and only one other individual. In all other groups, respondents expressed desires for racial diversity that seemed both sincere and unassailable. Despite these limitations, the focus groups did yield rich information about the targeted foci of investigation. They provided feedback about the design of data displays and reporting and an understanding of the kinds of information parents seek to make educational decisions.


No doubt, the development of universally understandable data systems and displays is a daunting task. The proliferation of both qualitative and quantitative data sources presents a challenge. Yet the technologies are becoming more sophisticated, allowing state and local education agencies to consider better ways to display information to stakeholders. The agencies are now paying more attention to communicating with data and information, in part because of ESSA requirements. There is still much work to be done, particularly in how to reach unconnected segments of the population. If the agendas at recent STATS-DC conferences (the annual data conferences from the Institute of Education Sciences) are any indication, districts and states are taking seriously the need to provide accessible and understandable information to diverse stakeholder groups, not just educators.



Acknowledgment


The authors would like to acknowledge support from the Ewing Marion Kauffman Foundation for this work.


References


A-Plus Communications. (1999). Reporting results: What the public wants to know: A companion report to Education Week’s Quality Counts ’99. Arlington, VA: A- Plus Communications.


Adams, C. M., Ford, T. G., Forsyth, P. B., Ware, J. K., Olsen, J. J., Lepine, J. A., Sr., Barnes, L. B., Khojasteh, J., & Mwavita, M. (2017). Next generation accountability: A vision for improvement under ESSA. Palo Alto, CA: Learning Policy Institute.


Anderson, N. (2018, July). How to develop and disseminate sustainable online data use training for teachers. Presentation at STATS-DC, Washington, DC.


Anderson, R. (2018, May 3). 2018 state legislation update: State leaders look at data privacy – and much more. Washington, DC: Data Quality Campaign. Retrieved from https://dataqualitycampaign.org/2018-state-legislation-update/


Bayoh, I., Irwin, E. G., & Haab, T. (2006). Determinants of residential location choice: How important are local public goods in attracting homeowners to central city locations? Journal of Regional Science, 46(1), 97–120.


Bischoff, K., & Reardon, S. F. (2013). Residential segregation by income, 1970-2009. Stanford, CA: Center for Education Policy Analysis, Stanford University. Retrieved from http://cepa.stanford.edu/sites/default/files/report10162013.pdf


Congress of the United States. (2015). Every Student Succeeds Act (S. 1177). Washington, DC: One Hundred and Fourteenth Congress of the United States.


Dabney, E. (2018, June). More than a number: Tools for talking about education data. Presentation made at the Maryland Connections Summit, Towson, MD.


Darling-Hammond, L., Wilhoit, G., & Pittenger, L. (2014). Accountability for college and career readiness: Developing a new paradigm. Education Policy Analysis Archives, 22(86), 1.


Data Quality Campaign. (2014). Teacher data literacy: It’s about time. Washington, DC: Author.


Data Quality Campaign. (2016, April). Parents want their children’s data. Washington, DC: Author. Retrieved from http://dataqualitycampaign.org/resource/parents-want-childrens-data/


Data Quality Campaign. (2017, August). Parents value, trust, and rely on education data. Washington, DC: Author. Retrieved from https://dataqualitycampaign.org/resource/parents-value-trust-rely-education-data/


Donovan, D. (2018, July). Empowering parents: Mississippi’s ESSA report card. Presentation made at STATS-DC, Washington, DC.


Dotger, B. H. (2011). The school leader communication model: An emerging method for bridging school leader preparation and practice. Journal of School Leadership, 21, 871–891.


Dotger, S., Dotger, B., & Tillotson, J. (2010). Examining how preservice science teachers navigate simulated parent-teacher conversations on evolution and intelligent design. Science Education, 94(3), 552–570.


Dunning, R., & Grayson, A. (2014). Homebuyers and the representation of spatial markets by information providers. International Journal of Housing Markets and Analysis, 7(3), 292–306.


Friedman, K., Bernstein, H., & Miceli, M. (2018, July). Connecting the dots: From visualization to useful information. Presentation made at STATS-DC, Washington, DC.


Goodall, J. (2012). Parental engagement to support children’s learning: A six point model. School Leadership & Management, 33(2), 133–150. doi:10.1080/13632434.2012.724668


Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sci­ences, U.S. Department of Education. Retrieved from https://ies.ed.gov/ncee/wwc/Docs/PracticeGuide/dddm_pg_092909.pdf 


Harris Poll. (2015). Parent survey: Prepared for the Data Quality Campaign. Rochester, NY: Author.


Harvard Joint Center for Housing Studies. (2017). A shared future: Fostering communities of inclusion in an era of inequality. National symposium. Cambridge, MA: Author.


Hornby, G., & Lafaele, R. (2011). Barriers to parental involvement in education: An explanatory model. Educational Review, 63(1), 37–52. doi:10.1080/00131911.2010.488049


Kleiner, A. (2002, October 11). Karen Stephenson’s quantum theory of trust. Strategy+Business, 2002(29). Retrieved from https://www.strategy-business.com/article/20964?gko=8942e


Krysan, M. (2008). Does race matter in the search for housing? An exploratory study of search strategies, experiences, and locations. Social Science Research, 37, 582–603.


Lawson, M. A., & Alameda-Lawson, T. (2012). A case study of school-linked, collective parent engagement. American Educational Research Journal, 49(4), 651–684.


Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to build educators’ capacity to use data: A systemic view of the issue. Teachers College Record, 117(4). Retrieved from http://www.tcrecord.org/PrintContent.asp?ContentID=17850


Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy into educator preparation. Educational Researcher, 42(1), 30–37.


Mandinach, E. B., & Gummer, E. S. (2016a). Data literacy for educators: Making it count in teacher preparation and practice. New York: Teachers College Press.


Mandinach, E. B., & Gummer, E. S. (2016b). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education, 60, 1–11.


Mandinach, E. B., & Miskell, R. C. (2017). Focus groups to support the development and utility of EdWise for parental stakeholders: Findings from parental focus groups. Washington, DC: WestEd.


Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco: John Wiley & Sons.


National Center for Education Statistics. (2017). Statewide longitudinal data systems grants program. Retrieved from https://nces.ed.gov/programs/slds/about_SLDS.asp


Olson, L. (1999a, January 11). A closer look: What makes a good report card? EdWeek, 18(17), 29. Retrieved from https://www.edweek.org/ew/articles/1999/01/11/a-closer-look-what-makes-a-good.html


Olson, L. (1999b, January 11). Report cards for schools: No two states report exactly the same information. EdWeek, 18(17), 27–28, 32–36. Retrieved from https://www.edweek.org/ew/articles/1999/01/11/report-cards-for-schools.html?r=1980558413&qs=prototypes+of+data+reports


Phi Delta Kappan. (2015, September). The 47th Annual PDK/Gallup Poll of the public’s attitudes toward the public schools. Bloomington, IN: PDK International.


Rankin, J. (2016). Data systems and reports as active participants in data interpretation. Universal Journal of Educational Research, 4(11), 2493–2501. doi:10.13189/ujer.2016.041101


Reynolds, K. (2018, July). Building a continuum of training for educator dashboards. Presentation made at STATS-DC, Washington, DC.


Schneider, J. S., Jacobsen, R., White, R. S., & Gehlbach, H. (2018). The (mis)measure of schools: How data affect stakeholder knowledge and perceptions of quality. Teachers College Record, 120(6).


Steil, J., & Jordan, R. (2017, April). Household neighborhood decisionmaking and segregation. Presentation in the session “What would it take . . . to promote residential choices that result in greater integration?” at the national symposium A Shared Future: Fostering Communities of Inclusion in an Era of Inequality, Cambridge, MA. Retrieved from http://jchs.harvard.edu/research/publications/shared-future-household-neighborhood-decisionmaking-and-segregation?_ga=2.114402946.402950750.1526316432-975570592.1526316432


Stephenson, K. (2005). Trafficking in trust. In L. Coughlin, E. Wingard, & K. Hollihan (Eds.), Enlightened power: How women are transforming the practice of leadership (pp. 243–264). San Francisco, CA: Jossey-Bass.


Strauss, A., & Corbin, J. M. (1997). Grounded theory in practice. Thousand Oaks, CA: Sage.


Tillman, L. C. (2002). Culturally sensitive research approaches: An African-American perspective. Educational Researcher, 31(9), 3–12.


Warren, M., Honig, S., Rubin, C., & Uy, P. (2009). Beyond the bake sale: A community-based relational approach to parent engagement in schools. Teachers College Record, 111, 2209–2254.


Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295–308.


Wayman, J. C. (2007). Student data systems for school improvement: The state of the field. In TCEA Educational Technology Research Symposium: Vol. 1 (pp. 156–162). Lancaster, PA: ProActive Publications.


Wayman, J. C., Cho, V., & Richards, M. (2010). Student data systems and their use for educational improvement. In B. McGaw, P. Peterson, & E. Baker (Eds.), The international encyclopedia of education (Vol. 8, pp. 14–20). London: Elsevier.


Wayman, J. C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of student data (CRESPAR Tech. Rep. No. 67). Baltimore, MD: Johns Hopkins University, Center for Research on the Education of Students Placed at Risk. Retrieved from http://www.csos.jhu.edu/crespar/techReports/Report67.pdf.


Yoshinaga, K., & Kamenetz, A. (2016, October 10). Race, school ratings and real estate: A ‘legal gray area’. NPR Ed. Retrieved from http://www.npr.org/sections/ed/2016/10/10/495944682/race-school-ratings-and-real-estate-a-legal-gray-area



APPENDIX: SAMPLE NOTES


You are a new parent to a school and even to the area. You are pretty savvy about the neighborhood, education, and what you want for your family, but you want to know more about the school, the district, and the state. At an informal gathering you spot a neighbor who seems knowledgeable about the school and area. The neighbor has a child the same age as yours. What kinds of questions do you ask this neighbor? What is most important for you to know? How would you use that information?


P1: Does it feel like a community among the staff? They will pass that to the students. What does parent involvement look like? Do parents help out and stay involved or do they simply drop kids off?

P2: Which school are you working at? How would you evaluate your school to others? How to compare: Want the psychological profile, how the teachers coordinate together, how long they have been there. What is the best thing and what is something you would change?

P3: My son’s teacher quit halfway through the year. I want to know about issues before.

P4: Do you like working there?

P5: What is the culture? Class size, I would want to know. In a bigger school, he may be one of 30. How involved is the PTA? Is there one? I stay at home so PTA is my social outlet.

P6: How do you like your principals? What is your class like? What are the students like? How do you get along with students? How do students get along with you? What is the demographics of the teachers? Is it a mixed community but do teachers only represent one race?


You are a PTA officer for your child’s school. What information do you share with parents and community members? How can they learn more about the school? What are some developments at the school that parents and the community should be aware of?


P1: Many schools offer varying programs. If there was a way to see a matrix of what the programs offered at each school are. See what you didn’t know existed: IB, Career and AP courses, STEM, STEAM, gifted and talented. What programs offered throughout the state, districts, and how schools compare. We have strings. Do other schools have that? What is open to them? There are programs that I probably don’t even know exist but knowing options is key. What they teach changes every year. Know what program and what method they are using and teaching from. Common Core? Is it something else? For special education, I don’t want my child to have to take a bus during the day to go to a specialist. Want a specialist on site.

P2: I’d tell them this school is walkable. What is turnover like? Even without a special needs child, it could affect a parent knowing the school has many resources, because it shows the school is a thought out functioning institution.

P3: Use GreatSchools but it is not super productive and does not seem up to date. Parents go on Facebook and say what curriculum is good and bad, I want to hear from teachers and the school on what they use and why, how it is helpful? Kids are bussed in for sage, would like to see schools offering this.

P4: My child has dyslexia and the staff has been so helpful. I want to know what accommodations they make and what staff does to support. Know if there is a dedicated special education teacher.

P5: I talk about the staff; they support your child. I feel good about the staff. I explain the staff and the PTA involvement, the fun activities, it’s still a learning environment. Turnover is a big one, and what is it due to? Did a teacher leave for professional growth to become a principal? Not the details but why the turnover happened. And the demographic of the teachers. Is it one race of teachers teaching a mixed student body? Does it match the students?

P6: Curriculum is another piece. They don’t teach penmanship or spelling, which is important for life skills.




Cite This Article as: Teachers College Record, Volume 122, Number 1, 2020, p. 1–42. https://www.tcrecord.org ID Number: 22944

About the Author
  • Ellen Mandinach
    WestEd
    E-mail Author
    ELLEN B. MANDINACH is a senior research scientist and the director of the Data for Decisions Initiative at WestEd. She has worked in the area of data-driven decision making for over 15 years to understand how to help educators use data more effectively. She has studied data use in classrooms, schools, districts, and state departments of education. Mandinach has served on numerous national advisory boards around data use. Her research interests include data literacy, data-driven decision making in education, the integration of data literacy into teacher preparation programs, and the use of data to improve teacher preparation programs. Her two most recent books are: Mandinach, E. B., & Gummer, E. S. (Eds.). (in press). Data for continuous programmatic improvement: Steps colleges of education must take to become a data culture. New York, NY: Routledge; and Mandinach, E. B., & Gummer, E. S. (2016). Data literacy for educators: Making it count in teacher preparation and practice. New York, NY: Teachers College Press.
  • Ryan Miskell
    WestEd
    E-mail Author
    RYAN C. MISKELL is a research associate in WestEd’s Learning Innovations program. His work involves the evaluation of programs implemented by schools, districts, states, and foundations to inform and improve educational policy and program decisions. He also provides technical assistance and training to school leaders and teachers to build capacity and improve programs. Recent work has included monitoring of the U.S. Department of Education’s Charter School Program and Magnet School Assistance Program grantees, evaluations of the special education programs of two school districts in Maryland, and technical assistance provided to three charter networks in Washington, DC. A recent publication is: Mandinach, E. B., & Miskell, R. C. (2018). Blended learning and data use in three technology-infused charter schools. LEARNing Landscapes, 11(1), 185–200.
  • Edith Gummer
    Arizona State University
    E-mail Author
    EDITH S. GUMMER is the executive director of the Office of Data Strategy in the Mary Lou Fulton Teachers College at Arizona State University. She has been a program director at the Ewing Marion Kauffman Foundation and the National Science Foundation, supporting the funding of research and development in education with a specific focus on STEM education. Her research interests focus on the effective use of information to inform educational decisions ranging from instructional decisions in the classroom to educator preparation programs. She has been supporting the development of a research agenda in personalized learning with LEAP Innovations and serves on the advisory boards of a number of organizations that support education entrepreneurs. Recent publications include: Mandinach, E. B., & Gummer, E. S. (2016). Data literacy for educators: Making it count in teacher preparation and practice. New York, NY: Teachers College Press; and Gummer, E. S., & Mandinach, E. B. (2015). Building a conceptual framework for data literacy. Teachers College Record, 117(4). Retrieved from http://www.tcrecord.org/PrintContent.asp?ContentID=17856
 
Member Center
In Print
This Month's Issue

Submit
EMAIL

Twitter

RSS