I Went To Another College Board Presentation About The New SAT…

Back in September, I attended the College Board's Annual Counselor Workshop at the University of Richmond. You can read about the session in detail in the post I wrote at the time, but the simple summary was that they gave a very detailed description of the new SAT/PSAT, but seemed unprepared for the negative reactions of many of the counselors in attendance.

Last week, I was in Ocean City, Maryland to attend the annual meeting of the Potomac and Chesapeake Association of College Admissions Counseling (PCACAC), the regional affiliate of the National Association of College Admissions Counseling (NACAC). PCACAC is made up of members from Delaware, Maryland, Virginia and West Virginia, as well as the District of Columbia. I was there to represent my company, Method Test Prep, but I was able to attend the College Board's session discussing the new SAT and how colleges and universities will use it in the next year.

The College Board was represented by Mr. René Rosa and Ms. Cassandra Allen. They were joined by Josh Lubben, Assistant Director of Admission at the University of Maryland, Baltimore County. The plan for the hour-long presentation seemed to be to give an overview of the changes in the SAT, followed by information about the data the new test will provide and how secondary and higher ed personnel can interpret it. Josh was there to describe how he and his colleagues have been preparing to receive scores from the new SAT. Unfortunately, the presentation went off the rails pretty quickly and did not go as intended. Once again, the College Board seemed to lack the emotional intelligence to anticipate how its audience would respond to its presentation.

The session started with Mr. Rosa asking "how many of you are educators?" This seemingly innocuous icebreaker was met with a rude reply of "we're not educators!" which flustered Mr. Rosa and prompted the follow-up "I'm sorry–how many of you are K-12 teachers?" That really ticked off the room, as more than one person said "we're counselors" and "there are no teachers here!" I would dispute the idea that a person can't be both, but the hostility seemed to wrong-foot Mr. Rosa and he had some trouble regaining the room.

Mr. Rosa finished up his section by describing the Khan Academy/College Board partnership, but I’m not sure that he did so effectively. While he touted that students can have a “personalized study plan” if they link their College Board accounts to their Khan Academy accounts, he also told us that so far over 1.1 million students have “logged on” to the Khan Academy SAT prep program, but only 300,000 have linked their College Board scores (and he had no details about how much work those students have since done in the program). He also–inadvertently–weakened his argument that “kids can use Khan Academy to supplement what they learn in school” by describing how his elementary school-age son has used Khan’s SAT prep to work on math. While I’m sure that young master Rosa is a very brilliant boy, it undercuts the idea of the SAT being a useful tool to gauge college readiness if kids can work with it before entering middle school.

The next presenter was Ms. Allen. Her presentation took up the majority of the session and was another example of not responding properly to the emotions of the audience. Many people in high schools have been very unhappy with College Board tardiness, especially the late release of PSAT scores and the decision to hold March SAT scores until after the May test (which made it impossible to use the first test to prepare for the second one). Ms. Allen blithely told us that "March scores will be out in mid-May, along with a concordance" to compare the new test to the old one. At no point did she offer even a token apology to mollify an audience that was rightly skeptical of College Board overpromising and underdelivering.

She then went on to talk about the revised College Board benchmarks, a preliminary version of which is also due in mid-May. I wrote about this four years ago, but simply put, the College Board believes that it can correlate performance on the SAT to future success in college. The old SAT generated a single benchmark score, but the new one has five. Students who achieve the benchmark are believed to have "a 75% chance of earning a C or better by the end of their first semester in college". That doesn't actually mean much when you think about it, because 90% of college grades are C or better, but I guess it's a start.

One of the attendees asked Ms. Allen how the College Board could calculate these benchmarks when no scores have been earned on the new test yet. "Easy", we were told. Apparently the College Board has a top secret concordance based on "a pilot group of 2000 highly incentivized students": members of the class of 2015 who took the new SAT last summer, before it was publicly available. Ms. Allen did note that the benchmarks "may or may not change" (which, of course, is true of everything) and might get recalculated after the class of 2017 finishes its first year of college.

According to Ms. Allen, there are grade level benchmarks on every test the company offers, so the (meaningless) PSAT 8/9 is a benchmark for success on the (largely meaningless) PSAT, which is a benchmark for success on the SAT, which is a benchmark for success in college. 

At this point, Ms. Allen began saying things that frustrated the audience, and she did not react well. She noted that everyone has "the ability to go online and pull data yourself" and visualize it via "nice pretty pictures to print out", and that this is better than the previous system, in which the College Board sent schools a CD-ROM with all the relevant data. Unfortunately, not every counselor has full, unfettered access to this online data, a problem Ms. Allen minimized by saying that "District level access managers" grant view privileges. She then elaborated that there are actually five levels of access to the online report and indulged in her favorite binary truism: "You may or may not have full access to data. You may only see summary data, not student data." She even noted that one unnamed group ("I forget what it's called") doesn't get access at all.

A member of the audience asked how she, as a college counselor, could use this data, and Ms. Allen unwisely dismissed her question by saying "If I was a counselor I wouldn't use this information. I would encourage my math and English teachers to use it so that they can see how their kids will do on other Common Core tests." Besides the fact that her answer was unhelpful to private schools and to schools in Virginia and West Virginia (those states eschew the Common Core), it was a little brusque.

Ms. Allen was asked about anecdotal reports from schools which seemed to indicate that PSAT scores in Math were lower than in other sections of the new test. After some hemming and hawing, she was asked “are you saying that the math is harder?”, to which she responded, “that’s what I want to say without saying it.” That point of view seems to jibe with surveys taken of students who took the test in March, but colleagues of mine in attendance were a little unhappy to hear that the math is perceived as more difficult. 

Ms. Allen was asked numerous questions about average scores (on the old test, each administration was curved so that the average score would be 500 in each section), but every time she answered with references to benchmarks (which are set at 510, for now). That was disappointing, and she never answered the question about how much the new test has to be curved to reach that level. As this part of the conversation spiraled out of control, Ms. Allen lost the room for good. She was asked how students would be able to use the PSAT to predict SAT scores (since the PSAT is scored on a 320-1520 scale, as opposed to the 400-1600 scale of the SAT). "Any student ready for college should be able to understand ratios like 600/720 vs. 600/800, so it won't be hard", we were told. Ms. Allen's fellow panelists visibly winced at this.

Mr. Rosa returned to present his information on "Higher Education: Using and Interpreting Scores", but there was very little time remaining, so he had to scramble. He noted that colleges will now get a lot more data about each applicant: not just total and section scores, but also cross-test scores, test scores and subscores. He noted that "It is up to colleges to decide what to do with this." When asked if the College Board has given colleges guidance, or received hints from higher education about how these scores will fit into a "holistic" evaluation, he said no. "They are interested in receiving the scores", we were told, "but they are in 'wait and see' mode to find out if the new scores are valid for admission decisions." So, in other words, the College Board admits that the test that more than a million kids will take next year will not be seen as fully "valid for admission decisions" next year.

Mr. Lubben described the multiple committees at UMBC that have been set up to get the institution ready for the new SAT. "The reality at this stage", he said, "is we are still trying to figure this out." They don't know what to expect from applicants in the class of 2017, and he said they are also trying to figure out how long they will accept old SAT scores (in the case of transfers or non-traditional students). Ultimately, he concluded, nothing has changed: "students should send us every score because we will use whatever is most advantageous" to the applicant.

As my colleague Evan Wessler has pointed out, colleges and universities will not be able to superscore between the old and new SAT, because the assessments are so different. But colleges will need a concordance table to compare a student's score on the old test with the same student's score on the new one (not to mention with an ACT score). This concordance table will be "available soon" and will have three columns: new SAT score, old SAT score, and ACT score, with the old SAT serving as the bridge for both comparisons. So a person who wants to compare the new SAT to the ACT will actually be doing two conversions: new SAT to old SAT, then old SAT to ACT.

I still believe that the new SAT is a much better test than the old one was, and I think it is much more directly relevant to what students see in their high school careers. But I am baffled as to why the College Board's presenters are so apparently clueless about their audience, unaware of how education professionals on both sides of the admission desk use test scores, and unable to empathize with the concerns that are shared with them. Soon enough, people will forget that this rocky conversion process ever happened, but in the meantime, it is challenging all around!
