Knowing something and knowing how to do something are very different things. In past years the world has recognized the U.S. as a leader in education and a nation of doers and innovators. U.S. schools produced these innovators who kept our economy strong and our country secure. However, the ability of U.S. schools to produce citizens with those abilities seems to have ebbed, and our innovative prominence has eroded.
It appears that over the past several decades the approach to education has been to prepare students for standardized (high-stakes) tests rather than to teach them how to apply knowledge (Archbald & Newmann, 1988; Martinez & Stager, 2013). Possessing knowledge is very important; however, being able to draw upon and apply that knowledge is necessary to function adequately in life, and it is the reason learning is so important. In a study identifying a means to improve students’ statistical thinking, Sedlmeier (2000) found that “learning by doing has a large and lasting effect on how well people can solve conjunctive probability tasks” (p. 227). In support of the activity-based learning approach, in 2011 Robert Yager (Professor of Science, University of Iowa) posed the question: “Why is there not more attention to all students (and teachers) actually ‘doing’ science in every K-16 science classroom?” (p. 62).
The purpose of this study is to determine the extent to which U.S. public school elementary and secondary science, technology, engineering, and mathematics (STEM) students are doing activities in their classrooms. This is the second in a series of articles discussing the “Doing-Based Learning” study. The first article (Moye, Dugger, & Starkweather, 2014) introduced the study and reported selected findings from the first round of this four-year effort. This report identifies the methods used, results, and recommendations to improve student STEM education success.
Ongoing international studies and other standardized measures have provided performance data on the quality of education for students in participating countries. These studies collect data on cognitive knowledge but place little emphasis on measures related to doing. Measuring cognitive knowledge without measuring the ability to apply that knowledge is controversial: to a great extent, the emphasis remains on high-stakes standardized testing, with very little focus on whether students can use that knowledge.
During the time that the U.S. has increased its focus on standardized testing, “there has been a marked increase in the share of jobs that require creative problem-solving skills” (PISA, n.d., para. 1). On the 2012 Programme for International Student Assessment (PISA) problem-solving assessment, “students in the United States perform slightly above the average (500 points) of the 28 OECD countries that took part in the assessment” (PISA, n.d., para. 2). Further, the PISA report identified that 18.2% of U.S. students did not reach “the baseline level of proficiency in problem solving—meaning that, at best, they are only able to solve very simple problems that do not require thinking ahead” (PISA, n.d., para. 5).
The Trends in International Mathematics and Science Study (TIMSS) reveals similar results. In 2011, “At Grade 4, the United States was among the top 15 education systems in mathematics” (NCES, n.d.a., para. 3), and “at Grade 8, the United States was among the top 24 education systems in mathematics” (NCES, n.d.a., para. 9). At Grade 4, the United States was “among the top 10 education systems in science” (NCES, n.d.b., para. 3), and “at Grade 8, the United States was among the top 23 education systems in science” (NCES, n.d.b., para. 9).
Education leaders should ask: Are we satisfied that U.S. students are deemed average? Are we using all available resources to improve U.S. students’ STEM and problem-solving skills? Martinez and Stager submit that,
The past few decades have been a dark time in many schools. Emphasis on high-stakes standardized testing, teaching to the test, de-professionalizing teachers, and depending on data rather than teacher expertise has created classrooms that are increasingly devoid of play, rich materials, and the time to do projects. (2013, p. 1).
Another question education leaders could ask is to what extent U.S. students are learning by doing. The purpose of this study is to answer that question. The researchers developed three instruments, one each for the elementary, middle, and high school levels. Each instrument asked teachers to respond “Yes” or “No” to 13 statements. The first two statements were identical across all three instruments. The 11 subsequent statements were specific to each grade band (Grades 3-5, 6-8, and 9-12) and were based on the Next Generation Science Standards (NGSS, 2013a), the Standards for Technological Literacy (STL) (ITEA/ITEEA, 2000/2002/2007), and the Common Core State Standards for Mathematics (CCSS, 2010). Grades K-2 were not included in this study because the NGSS standards specific to engineering design begin at the third-grade level. The NGSS authors state, “With increased maturity students in third through fifth grade are able to engage in engineering” (NGSS, 2013b, p. 52).
Often when items such as curricula are designated “standards-based,” they may in fact only allude to those standards. The statements designed for this study were gleaned directly from the NGSS and STL. The Common Core State Standards for Mathematics used were the “Standards for Mathematical Practice” (NGSS, 2013b, p. 138). For example, one of this study’s middle school statements was “My students have analyzed and interpreted data to determine similarities and differences in findings.” The statement was based on NGSS Middle School Engineering Design standard MS-ETS1-3, “Analyze data from tests to determine similarities and differences among several design solutions to identify the best characteristics of each that can be combined into a new solution to better meet the criteria for success” (NGSS, 2013a, p. 86). The reader will find that the statement also reflects the Standards for Technological Literacy Standard 13, Grades 6-8 benchmark, “Interpret and evaluate accuracy of information” (ITEA/ITEEA, 2000/2002/2007, p. 213). The researchers also provided a list of definitions to clarify the meaning of words and terms used in this study.
The instruments were validated by elementary, middle, and high school STEM teachers. The validating teachers were asked to “Agree” or “Disagree” that each statement reflected something that a teacher at their grade level could expect students to do in their courses. They were also given an opportunity to include any comments they felt should be considered in the study. Based on this feedback, the researchers deemed the instruments valid.
The researchers prepared a cover letter introducing the study and requesting teacher participation. The letter explained the purpose of the study and how to use the links to access the list of definitions and the survey instruments hosted on SurveyMonkey. The researchers emailed the cover letter to each state science, technology, engineering, and mathematics specialist as well as to the boards of directors of state associations. The researchers also used the U.S. News Education List of Best High Schools website (U.S. News, n.d.) to identify email addresses of teachers in those schools; that website also led to many school districts’ elementary and middle school websites. The researchers ultimately sent emails to approximately 5,000 elementary, middle, and high school science, technology and engineering, and mathematics teachers, state supervisors, and association affiliate representatives.
This survey was open from March 1, 2014 until April 15, 2014. Although not all teachers responded to each statement, there was a total of 1,670 responding teachers. A total of 437 elementary, 404 middle and high school science, 544 middle and high school technology and engineering, and 285 middle and high school mathematics teachers responded to the first statement. As for the second statement, the number of responses was the same, with the exception of one less middle and high school mathematics teacher (284 versus 285). For statements three through thirteen, 365 elementary teachers responded. At the middle and high school levels, there were 133 middle and 220 high school (total 353) science, 194 middle and 308 high school (total 502) technology and engineering, and 104 middle and 151 high school (total 255) mathematics teachers who responded.
The reader will see that there was a significant drop between the number of teachers who responded to Statements 1 and 2 and the number who continued to respond to Statements 3 through 13.
The first two statements were designed to elicit teachers’ opinions concerning students doing projects in their classrooms. Table 1 identifies the two general statements asked at both the elementary and secondary levels, the total number of responses, the number and percentage of elementary teachers, and the combined number of middle school and high school teachers in each subject area who indicated “Yes” to each statement. Technology and engineering teachers were grouped together because it was impossible to distinguish between the two types of teachers, who use similar content.
Of the 437 elementary teachers who responded to Statements 1 and 2, 365 responded to the remainder of the statements (3-13). Table 2 identifies the statements and the total number and percentage of elementary teachers responding “Yes” to each statement. The last row of the table contains the total responses and the percentage of doing in courses. The researchers derived these data by adding the number of “Yes” responses in the Total Elem. Resp. column and dividing that number by the total number of responses.
Four hundred thirty-one middle school teachers responded. Of those, 133 were science, 194 were technology and engineering, and 104 were mathematics teachers. Table 3 identifies the statements as well as the number and percentage of teachers responding “Yes” to each statement. The last row of the table contains the total responses and the percentage of doing in courses. The researchers derived these data by adding the number of “Yes” responses in each column and dividing that number by the total number of responses.
Six hundred seventy-nine high school teachers responded. Of those, 220 were science, 308 were technology and engineering, and 151 were mathematics teachers. Table 4 identifies the statements as well as the number and percentage of teachers responding “Yes” to each statement. The last row of the table contains the total responses and the percentage of doing in courses. The researchers derived these data by adding the number of “Yes” responses in each column and dividing that number by the total number of responses.
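The summary-row computation described for Tables 2-4 can be sketched in a few lines. This is a minimal illustration using hypothetical counts (not the study’s actual data), assuming the “percentage of doing” is the share of “Yes” responses among all responses recorded in a subject column:

```python
def doing_percentage(yes_counts, total_responses):
    """Percentage of 'Yes' responses across all statements in one column.

    yes_counts: number of 'Yes' responses per statement (Statements 3-13)
    total_responses: total responses recorded in that column
    """
    return 100.0 * sum(yes_counts) / total_responses

# Hypothetical example: 11 statements, each answered by 100 teachers.
yes_per_statement = [80, 75, 90, 60, 85, 70, 95, 65, 88, 72, 77]
total = 11 * 100  # total responses in the column

print(round(doing_percentage(yes_per_statement, total), 1))  # prints 77.9
```

The same computation is repeated per column (science, technology and engineering, mathematics) to fill the last row of each table.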