It’s 8:15 a.m. Inside Room 215, 20 hunched-over students scratch and scrawl away with No. 2 pencils on their first task of the day.
It’s both familiar and uncharted territory for these high school juniors and seniors. Like the nearly 1.4 million students across the nation who took the grueling three-hour-and-45-minute SAT in spring and early summer, they must not only answer multiple-choice questions but also, for 25 minutes, demonstrate their writing ability for the first time.
By lunch, papers shuffle up to the front. Eraser shavings are brushed to the floor. Pencils drop, lightly clacking, reverberating down the slope of desks. Along with their answers to the revised math and critical reading sections – what was once known as the verbal – each student’s drafted argumentative essay will soon make its way into the hands of the College Board.
Once graded, an applicant’s score is sent to colleges, where it is measured against those of other applicants in the pool. At some institutions, admissions staffers pull the essays offline and slide them into college files for review. For Ray Brown, TCU dean of admissions, “it’s just one more piece to the puzzle that tells us about somebody.”
But because the writing section is so new, it has sprouted unanswered questions for TCU: How much bearing will it have on future applicants, and how accurately does it evaluate a student’s writing ability?
High school seniors applying to TCU face a new challenge: competing against a volume of applications that has grown 60 percent since 2000.
Because of those numbers, the TCU admissions staff is finding new ways to select incoming freshmen.
Using the essay, Brown said, might be “yet another cut we can make to separate students in the application process,” which he admits is an “uncomfortable truth.”
“Part of it is positioning,” he added.
As of 2004, TCU moved into the No. 2 spot behind Rice as the second-most selective university in the state, he explained.
To keep that spot, Brown said, TCU will continue to be more selective.
Adding requirements to the application process, such as a tested writing section, might be in order, Brown said. But because the writing test is so new, the 2006 admissions period will be a trial-and-error experiment, he said.
“We’ve never done this before, [so] we just don’t know how to use it,” Brown said.
As the writing test is administered over the next few years and universities see how its scores relate to the actual performance of incoming freshman classes, those results might foreshadow changes at TCU, Brown said.
One change might be the placement of freshmen, after admissions, into different writing classes based on SAT scores, he said.
“I can see in another year or so,” Brown said, “we are going to be saying in essence, ‘You know, we love you a lot; we want you to come here, but you are going to have to go to the little red schoolhouse because your writing needs a little bit of work.'”
According to College Board literature on the new SAT, “The writing section measures a student’s mastery of developing and expressing ideas effectively.”
In 25 minutes, students must respond to a prompt that presents a broad topic, taking a position and then defending it with examples.
During the grading process, two readers are assigned to each essay, and each awards a score between one and six. Those scores are added together, and the total is factored into the rest of the section, which is composed of multiple-choice questions.
Bernie Phelan, an English teacher in Flossmoor, Ill., and a committee member for the new SAT, said the test “resembles something in the freshmen writing experience.” Other educators, however, say the SAT writing section reflects only the initial stages of writing, something closer to a first draft created in real time than to a finished product that has had time for revision.
At Rice University, the Office of Admissions knows and supports that distinction.
Ann Wright, vice president for enrollment at Rice and a member of the College Board research and development advisory committee, said: “Increasingly there have been demands from corporations and businesses for students to be more articulate and better at writing. The test is designed to do a better job at measuring how well students have achieved and how well they will fit.”
In previous years, Rice used the SAT II Subject Test in Writing, similar in format to the new SAT, to do just that.
With the onset of the new SAT writing section, the writing subject test has become obsolete, and all types of universities and colleges can now use the new section in the ways Rice has for many years, including placing freshmen into various writing classes.
Wright cautioned, however: “[Rice] also recognize[s] it is the first year and we don’t have any experience with it. If the test is completely off, it will count for less.”
Because the test is different, the College Board suggests scores on the SAT I should be examined differently.
Caren Scoropanos, a representative from the College Board, said, “The old subject test scores are not comparable to the scores of the new writing section since they are each different tests.”
One reason they are different, Phelan said, is that students who took the SAT II writing test came from a different talent pool, scoring around an 8 on the essays. Students now taking the SAT I writing test scored slightly lower, around a 7.5, he added.
Brown said TCU hopes to see scores averaging around 8.5 to 9. But until those scores come in, TCU won’t be able to assess what they say about its applicant pool, gauge the writing test’s difficulty, or decide whether the scores should be used at all.
Even if TCU decided the scores were a valid indicator of how a student should be placed into a writing program, Carrie Leverenz, TCU director of composition, said she would have qualms about it.
“There has been a lot of controversy among writing program administrators about the writing test counting very much because how can you tell?” she said.
Writing tests, Leverenz said, are “based on your ability to use a pre-existing form and write in complete sentences without any research.
“For assessment to be reliable, to test what it purports to test, the [exam] has to match the skill. So, if all we did in college is ask students to write impromptu 25-minute essays, then the test would be a perfect match,” she said.
But it’s not, she noted.
“It does not match the skills we try to teach in 10803, [TCU’s freshman Writing Workshop class],” she added.
However, “at TCU, it’s a moot point,” she said, because placement isn’t a tool used here after the admissions process.
But it might be in the future, Brown said.
If it is, Leverenz said, the most reliable way to assess a student’s ability is to look at a portfolio of different types of revised writing, including argumentative, personal and research essays, reflecting what a student would learn in 10803.
Richard Enos, TCU professor and holder of the Lillian Radford chair of rhetoric and composition, said if placement does become part of the process, it is up to TCU to make sure students have, and continue to develop, good writing skills.
“We need to be really careful because you have to ensure that every student has [writing] competency,” he said. “We can’t be in a position where we just have the logistics drive the ideals. We have to make sure students write well.”