Foreign Languages: The Scales Tip in Favor of Starting Early
To the Editor:
I must take issue with the Commentary by Brad Marshall ("Is There a 'Child Advantage' in Learning Foreign Languages?," Feb. 9, 2000). In asserting that young children do not have an advantage over adults in learning new languages, Mr. Marshall contradicts not only the conclusions of several neuroscientists but the experiences of countless parents, students, and teachers as well.
Dr. Norman Relkin, who participated in a study of language-sensitive regions of the brain for the Memorial Sloan-Kettering Cancer Center and the Cornell University Medical College, concluded that there is "a critical period for language learning . . . if you miss that period, it may be putting you at a disadvantage later in life." Research by Patricia Kuhl, the chairwoman of the department of speech and hearing sciences at the University of Washington, found that children can acquire new languages easily in their preschool years, and that "if we're going to expose children to second languages, it's best to do it early."
Mr. Marshall claims that because some adults learn new languages easily, the medical claims regarding children's language advantage must be wrong. However, exceptions to the rule do not disprove the rule, and there are always deviations from the mean. He cites research that preschool children learned less of a new language than older students did after the same amount of exposure, but he does not say just how much exposure there was. He also cites research that the listening skills of teenagers surpassed those of 6-year-olds. This is of course obvious to anyone, and again says nothing about the natural language advantage of younger children.
Certainly, environment plays a major role in the language advantage young children have. Young children tend to be less self-conscious about making mistakes, and they seem to have a greater eagerness to interact with native speakers of the second language, for example. But these environmental factors provide even greater reasons why second-language-acquisition programs must start as early as possible.
I also found it interesting that Mr. Marshall claims it is easier to learn a second language if it is very similar to the student's native language. I agree with him on this point, but he is contradicting the claims of bilingual education supporters like Jim Cummins and Stephen Krashen, who claim that literacy skills transfer easily from one language to another, regardless of differences between the two.
Vice President for Education
Center for Equal Opportunity
To the Editor:
I am writing to express outrage at your publication of Brad Marshall's Commentary. Any fair-minded researcher would present a balanced review of the research literature, instead of hand-picking only the studies that support his bias. For this reason, I and many others who have been both practitioners and researchers in the field take strong issue with this scientifically unsound essay.
The author neglected to mention studies that do indeed support the "child advantage" for starting foreign languages before the age of 10. Eileen Rafferty's statewide study of students in Louisiana compared those who studied a foreign language in elementary school with those who did not, in terms of performance in reading, language arts, and mathematics in English (Second Language Study and Basic Skills in Louisiana, 1986). The results, which replicated those of similar studies in the 1970s, indicated that those studying a foreign language outperformed those who did not.
Other studies by Stephen Krashen and Michael Long have shown that those children who began their study of a foreign language before age 10 or 11 developed near-native pronunciation of the foreign language, while those who started later, after puberty, almost never were able to reach that high level of pronunciation (Child-Adult Differences in Second Language Acquisition, 1982). Other studies conducted by the Educational Testing Service compared the Advanced Placement results on the 1995 French AP examination and found that students who had started in grades 1-6 outperformed those who had started French in grades 7-12 ("Does FLES Help AP French Students Perform Better?," Practical Handbook to Elementary Foreign Language Programs, 1998).
Other studies, by John Carpenter and Judith Torney, found that before the age of 10, children had greater openness to other cultures than older students had ("Beyond the Melting Pot," Children and International Education, 1973). Charles Hancock and I compared elementary school foreign-language students' attitudes toward the French people and their culture with those of students not studying French and found similar advantages in starting early ("A Study of FLES and Non-FLES Pupils' Attitudes Toward the French and Their Culture," The French Review, 1976).
Richard Landry, in Quebec, has conducted studies on cognitive thinking, divergent thinking, and creativity, and found that those students who started foreign-language study early outperformed those who had not studied a foreign language early.
The last area I will mention (there are many others, which I have enumerated in my books and articles on FLES) deals with the studies by brain researchers. There are many researchers, such as Harold Chugani, who have examined positron emission tomography, or PET, scans and observed the glucose metabolization of various areas of the brain, uncovering the timetable for the development of various areas of the brain. Michael Phelps, a biophysicist and the co-inventor of the PET scan, said, "When small children learn a new language, the ability to use that language is wired in the brain" ("Kids' Brainpower," Oregonian, Dec. 13, 1993).
This is a mere sampling of the growing body of solid research concerning the "child advantage." From having examined both the negative and positive research studies on starting foreign-language study early, I can say that, clearly, the scales tip in favor of starting early.
National FLES Institute
University of Maryland
'Value Added' and Absolute Standards
To the Editor:
Your recent article on the utility of value-added assessment ("Ariz. Ranks Schools by 'Value Added' to Scores," Feb. 9, 2000) encourages readers to believe, unfairly perhaps, that there is an absolute choice between a value-added test approach and the more traditional means of reporting test results, and, further, that the value-added approach is the better of the two. In reality, the two approaches do not compete against one another; in fact, one without the other leaves schools with an incomplete picture of student achievement. I'm not the principal of an elementary school, but if I were, I'd want to know both how much my students had grown (information provided through the value-added approach) and whether these students had achieved some absolute benchmark standard (information provided through the traditional testing approach).
I'd be thrilled if I saw that my 3rd graders managed, from the beginning of grade 2 to the conclusion of grade 3, to grow, on average, more than their counterparts elsewhere in my district in reading or math. But before chilling the champagne, I'd want to know more.
After all, my 3rd graders could grow this much—far more than others—and still not be reading on grade level at the end of 3rd grade. And if I'm concerned about my 3rd graders competing with others, then I won't be satisfied knowing that 3rd graders elsewhere in my district were reading, on average, at a 5th grade level (even if these advanced readers hadn't worked as hard as my readers).
I'm pleased to see more and more school districts and state systems using the value-added approach; however, it is unfortunate to think that some educators believe this relieves schools, especially schools in poorer neighborhoods, of reaching absolute standards. It shouldn't.
Working hard is always important, but hard work means little if poor students fail to achieve the same targeted benchmarks as their more affluent counterparts.
Joseph A. Hawkins
Senior Study Director
After-Hours Tips From Texas District
To the Editor:
Thank you for your excellent article on after-school programs ("Research: After the Bell Rings," Feb. 2, 2000). I would like to add some information based on what we have learned and are continuing to learn from students who participate in the junior high after-hours programs in my suburban Dallas district.
We started offering these programs at each of our nine junior high schools in 1998-99 with the support of principals, teachers, volunteers, and the Youth Services Council, a community-based coalition of agencies that serve young people. Our young teenagers obviously approve of what we are doing because they keep coming back for more. Parents are delighted, and principals tell us many students feel more connected to their schools.
Martha Horan, the executive director of the Youth Services Council, offers these suggestions for community leaders who are planning similar programs for junior high or middle school students:
- Ask students to identify the activities that interest them. Through focus groups at some schools and student surveys at others, we found that students wanted to take part in athletic activities and art classes and were eager to participate in such skills-oriented classes as cooking, auto maintenance, chess, and swing dancing. Offering a wide variety of activities attracts students.
- Schedule a transition period at the end of the school day. Provide food, give students a chance to socialize, and consider using this time to teach leadership skills.
- Offer professional pay to certified teachers and consider hiring some college and graduate students to make funds go further. We have some wonderful young adults who are eager to work with our programs and are excellent mentors, partly because they are about seven years older than our students. Research indicates this age difference often contributes to the success of mentoring relationships.
- Provide academic help to students in a relaxed, non-threatening atmosphere.
- Be creative in finding other ways to help students learn. Only a few of our students signed up for an excellent service-learning program offered by a community agency, but they responded enthusiastically when we expanded their auto-maintenance class to include doing oil changes for low-income families.
- Identify the students who are not participating in the program and plan classes to attract them. When we realized we were attracting few high-achieving students, we started planning a prelaw course that we believe will please both current participants and the more academically skilled students. Students from a nearby law school will help with this program.
- Look for ways to reach students who do not want structured activities after school. By next year, we hope to provide adult-supervised "student unions" where these students can play games and socialize with friends. We expect to charge a 25-cent admission fee because we believe people value a service more if they invest in it.
- Welcome suggestions from your students, parents, staff members, and volunteers. Adjust your program as necessary.
We expect to do some testing next year to determine how after-hours programs affect student attitudes, and we are looking for other ways to document the effects of these programs in our increasingly diverse district of about 34,500 students. We know these programs are not a panacea for all societal problems, but we also know they provide students with a safe environment where they can enjoy making new friends and learning new skills.
Richardson Independent School District
Hirsch Is Right on the Essentials
To the Editor:
I found it interesting that you published both E.D. Hirsch Jr.'s Commentary ("The Tests We Need") and Alan Stoskopf's "Clio's Lament" in your Feb. 2, 2000, issue. Mr. Hirsch stoutly defends content-based high-stakes testing, while Mr. Stoskopf laments its effect on the teaching of history.
As a 20-year teacher of history, I find myself agreeing with Mr. Hirsch. The purpose of most middle and high school history classes is to provide students with a broad foundation for future study. While standardized tests obviously cannot measure every aspect of a successful instructional program, they can point out glaring deficiencies.
As Mr. Hirsch indicates, one would question any high school program in which the students cannot answer the following test question correctly: The Civil War ended in: (a) 1812 (b) 1830 (c) 1865 (d) 1880. It is hard to believe that requiring students to master such elementary material is stifling to the inspired teaching of history.
While intensive studies of life in Colonial America or immigration in early-20th-century America may be extremely valuable, they are not a substitute for a basic foundation in a broader range of history. It is hard to argue that the Civil War question is unfair, regardless of the merit of any other program designed to cover that period of time.
Admittedly, the problem may lie in the test itself. As Mr. Hirsch indicates, a question such as "The Civil War ended in: (a) 1864 (b) 1865 (c) 1866 (d) 1867" is inappropriate and only encourages the trivialization of instruction. The greatest challenge of any history teacher is to determine what material is essential to the level of his or her students. For this reason, the objective quantification of fair test questions is problematic.
Mr. Hirsch's Civil War example, however, points out a common ground that few can successfully challenge. High-stakes testing is not designed to provide a measurement of what is excellent. It is merely a fail-safe device to ensure that students acquire the rudimentary aspects of what a literate person is expected to know.
I also take exception to Mr. Stoskopf's contention that "a single high-stakes test severs the bond between teacher and student." The increased accountability of competency testing has the opposite effect. It reinforces that teachers and students are on the same team and working toward a common goal.
Clearly, there are difficulties with high-stakes testing, especially in areas such as history that cover wide ranges of unagreed-upon content material. It is more useful, however, for educators to work toward more valid and reliable means of accountability than to continue to graduate high school seniors who do not possess even the basic elements of essential knowledge.
Jeffrey T. Stroebel
The Sycamore School
Help With Dropouts Is Not Needed Here
To the Editor:
I'm curious about your article noting that Gerry House has resigned the superintendency in Memphis, Tenn., to take on the leadership of the Institute for Student Achievement, a public-private partnership based on Long Island, N.Y., that works with schools on dropout prevention and preparing at-risk students for college or careers ("House Resigns as Memphis Superintendent," Feb. 2, 2000).
The article mentions the institute's decision to expand its work from New York into other districts, such as Fairfax County, Va. Why Fairfax? It's a school system with over 150,000 students and a dropout rate of 2 percent, as opposed to major cities with dropout rates as high as 60 percent.
I would suggest that the Institute for Student Achievement study Fairfax County to see what it's doing to have such enviable dropout and college-admission rates for minority students.
Emily Yurman
Ranking Schools As Sports Teams
To the Editor:
The goal of California's Public Schools Accountability Act is to reward schools where students achieve academically and apply pressure to those where students do not ("Calif. Schools Get Rankings Based on Tests," Feb. 2, 2000). To ensure that schools are compared with others in like socioeconomic circumstances, complicated formulas have been applied to results from last year's standardized test scores. But most readers of the academic-performance index, or API, rankings simply want to know who is No. 1 and how their child's school measures up.
The public's familiarity with sports rankings probably makes such use of lists inevitable. When we read that Arizona's basketball team is ranked second and Stanford's third, or that the University of Southern California is the best team in the Pacific 10 Conference, we have confidence that such placement on a scale is fair and relatively accurate. We take it for granted that straightforward game results determine rankings.
Applying the same approach to academic achievement, we tend to forget that basketball results involve counting how many times a ball goes through a hoop. Educational results are rather more complicated to measure.
Schools and teachers should be held accountable for improving instruction, and the measurement of every school's progress toward achieving academic excellence over time makes eminent sense. It doesn't make sense to assume that a test score is the best measure of such progress. Even when other indicators, such as attendance patterns and dropout rates, are factored into the equation, the matrix is still incomplete.
How can you rate a teacher's enthusiasm for literature or instinctive ability to transmit this enthusiasm to students? How can you quantify the extent to which the culture of a high school campus either supports or undermines student achievement? Where are children's attitudes toward learning measured? Shouldn't schools receive progress points for helping parents become more involved in their children's education? What about a measurement for reduced hours spent in front of the television on school nights? That would be true progress.
Basketball fans have likely started grumbling that similar factors—the quality of coaching, team morale, and a player's work ethic—are exactly what determine the number of times a ball goes through the hoop. There is one huge difference. Every college team in the rankings, whether first or last on the list, practices with regulation balls in a clean and well-lighted gymnasium, wearing first-rate equipment. In California's public schools, no such parity exists. Until it does, we need to think long and hard about judging the quality of a school on the basis of its API ranking.
Santa Monica, Calif.
The writer teaches English at Santa Monica High School and directs the California Reading and Literature Project at the University of California, Los Angeles.
Applicants Can Now Post Résumés Online
To the Editor:
In "Quality Counts 2000: Who Should Teach?," Jan. 13, 2000, you indicate in the "Incentives and Recruitment" table that candidates can find teaching openings in Connecticut online, but that they cannot post résumés, applications, and other materials online.
Beware of Toxins in Art Products
To the Editor:
In your article about white markerboards in the classroom ("Modern Classrooms See Chalkboards Left in the Dust," Jan. 12, 2000), you did not discuss any of the pitfalls of markerboard use.
Many "magic markers" contain toxic aromatic hydrocarbon solvents, such as toluene and xylene, which, in large amounts, can cause drowsiness and damage to the upper respiratory system, kidneys, and liver. Other markers contain less toxic, alcohol-based solvents. Though these markers are less toxic, they still can emit an intoxicating vapor that can produce central nervous system depression (drunkenness).
Virginia, like many other states, has a law that restricts the use of toxic materials by children under 12 years old. Children are more susceptible to toxic art materials because of their size and faster metabolism.
It is for these reasons that the Boston-based Art & Creative Materials Institute, or ACMI, has created a labeling system to raise awareness of the health risks in children's art products. Products that are nontoxic will carry the "AP Nontoxic" (approved product) or the "CP Nontoxic" (certified product) label. Products with these labels meet the American Society for Testing and Materials Standard D-4236, as having no acute (short-term) or chronic (long-term) health hazards. Products that contain materials with known health implications will be labeled "Health Label."
When markers are dry-wiped, a dust is formed. That dust becomes airborne and, because of its particle size and weight, slips past the body's natural respiratory defenses (mucous membranes and cilia) and is easily drawn into the lungs. Chalk also creates dust, but chalk dust typically settles out of the air because it is so heavy (that is why we see so much of it in the tray). In other words, we do not breathe chalk dust deep into our lungs during normal use.
In addition, chalkboards clean up with water, which is nontoxic. Whiteboards require a spray cleaner that contains ethylene glycol (the active ingredient in antifreeze, toxic by ingestion) and that irritates the eyes when sprayed into them.
Teachers should take particular notice of all art and office products in use in their classrooms (liquid paper, pottery glazes, oil paints, and markers). Teachers should verify that all products used (school-purchased and home-bought) meet the American Society for Testing and Materials D-4236 label and are labeled AP Nontoxic or CP Nontoxic.
Douglass T. O'Neill
Environmental Health Specialist
Fairfax County Public Schools
Vol. 19, Issue 25, Pages 49-51