NC state senators call on Senator Berger to delay Istation implementation and review contract process

Thirteen North Carolina state senators are calling on Senate President Pro Tempore Phil Berger to investigate the process Superintendent Mark Johnson followed in unilaterally awarding the multi-million dollar contract for a K-3 reading assessment to Istation.

In a letter sent to Senator Berger today, the lawmakers request that Berger “establish a Select Senate Committee to review the procurement process for the contract.” The senators are further asking that Berger delay implementation of the new K-3 reading assessment for a year.

As justification for the request, the senators point to Johnson’s dismissal of the findings of an evaluation committee that recommended the state continue using the mClass tool, which has been in use since 2013.

The letter asks that Berger take these steps to “restore some of the confidence lost by superintendents, teachers, and parents regarding the procurement process.”

Amendment which would allow school districts to opt out of Istation switch passes NC House, now on to Senate

An amendment by Representative Graig Meyer to Read to Achieve legislation sponsored by Senator Phil Berger passed the House tonight by a vote of 62-51, and the amended bill is now on its way back to the Senate.

As amended, the legislation would give local boards of education the flexibility to “select different assessments” to meet the universal screener requirement of Read to Achieve.

[Embedded document: Meyer amendment]

This change would give school districts the option to continue using the mClass reading assessment they’ve been using since Read to Achieve was implemented in 2013, instead of switching to the computer-based Istation tool, which State Superintendent Mark Johnson unilaterally adopted against the recommendations of a team of educators.

The legislation now heads to the Senate where lawmakers will vote to either concur or not concur with SB 438 as amended.

**Updated 7/23:** It appears that any school districts that opt to use assessments other than the one selected by the Department of Public Instruction would most likely be responsible for 100% of the cost of the alternate assessment. This raises equity concerns that deserve some healthy public debate.

Action needed: NC legislator to offer amendment which would arm teachers

On Monday, July 22, Cabarrus County Representative Larry Pittman will reportedly once again introduce legislation to arm teachers.  

According to NC gun rights organization Grassroots North Carolina, Pittman will offer his School Defense Act as an amendment to a Senate School Safety Omnibus bill which will be voted on in the House.

Pittman’s last effort to put guns in the hands of teachers came in the wake of the 2018 Parkland massacre, in which 17 people lost their lives and an armed security officer declined to enter the school building and engage the shooter.

At the time, Pittman urged fellow lawmakers to support his legislation, saying:

We need to allow teachers, other school personnel and other citizens, who are willing, to be screened and to receive tactical training and bring their weapons to school, in cooperation with local law enforcement who would need to be informed as to who is doing this. We should give them a fighting chance. Otherwise, when they die, and children die whom they could have defended, their blood will be on our hands. I cannot accept that. I hope you will think this through and find that you cannot accept it, either.

Pittman is totally out of touch on this issue, just as he was last spring when he fought against a bill which prohibited corporal punishment in public schools.

What makes the amazing things going on in North Carolina’s public schools possible is the positive culture that professional educators work so hard to establish.  The relationships we build with our students help them see our classrooms as a safe harbor, a place where they will be respected and given the support they need to succeed.   We can’t keep that all-important culture intact while militarizing our classrooms.  It’s as simple as that.

Adding more guns to our buildings is not going to solve the problem of school shootings; it will only make things worse. Statistically speaking, our schools are still very safe places to be. And a recent national survey of educators found that more than 95% did not believe that teachers should carry a gun in the classroom. The notion of a pistol-packing badass teacher taking out a villain in a blaze of gunfire is nothing more than the action movie fantasy of an out-of-touch, NRA-purchased politician.

Grassroots North Carolina, of course, disagrees. They cite research by gun rights advocate John Lott in claiming “schools that allow educators and administrators to be armed are much (much) safer than their gun-free counterparts.”

Side note: GRNC refers to Lott as a “respected researcher.” He’s not:

A little over a decade ago, he was disgraced and his career was in tatters. Not only was Lott’s assertion that more guns leads to more safety formally repudiated by a National Research Council panel, but he had also been caught pushing studies with severe statistical errors on numerous occasions. An investigation uncovered that he had almost certainly fabricated an entire survey on defensive gun use. And a blogger revealed that Mary Rosh, an online commentator claiming to be a former student of Lott’s who would frequently post about how amazing he was, was in fact John Lott himself. He was all but excommunicated from academia.

Grassroots North Carolina is mobilizing its members to contact legislators today and urge them to support Pittman’s amendment. They have conveniently provided email addresses for all House Republicans, which I am including below. For some odd reason, they declined to include House Democrat contact information, but you can find all House members’ individual contact information here.

Please take the time to reach out today and express your opinion on the matter, and encourage others to do the same.

*NC House Republicans Copy/Paste Email List(s):
Jay.Adams@ncleg.net; Dean.Arp@ncleg.net; Lisa.Barnes@ncleg.net; John.Bell@ncleg.net; Hugh.Blackwell@ncleg.net; Jamie.Boles@ncleg.net; William.Brisson@ncleg.net; Mark.Brody@ncleg.net; Dana.Bumgardner@ncleg.net; Jerry.Carter@ncleg.net; George.Cleveland@ncleg.net; Debra.Conrad@ncleg.net; Kevin.Corbin@ncleg.net; Ted.Davis@ncleg.net; Jimmy.Dixon@ncleg.net; Josh.Dobson@ncleg.net; Jeffrey.Elmore@ncleg.net; John.Faircloth@ncleg.net; John.Fraley@ncleg.net; Edward.Goodwin@ncleg.net; Holly.Grange@ncleg.net; Destin.Hall@ncleg.net; Kyle.Hall@ncleg.net; 

Bobby.Hanig@ncleg.net; Jon.Hardister@ncleg.net; Kelly.Hastings@ncleg.net; Cody.Henson@ncleg.net; Craig.Horn@ncleg.net; Julia.Howard@ncleg.net; Chris.Humphrey@ncleg.net; Pat.Hurley@ncleg.net; Frank.Iler@ncleg.net; Steve.Jarvis@ncleg.net; LindaP.Johnson@ncleg.net; Brenden.Jones@ncleg.net; Donny.Lambeth@ncleg.net; David.Lewis@ncleg.net; Pat.McElraft@ncleg.net; Chuck.McGrady@ncleg.net; Allen.McNeill@ncleg.net; Tim.Moore@ncleg.net; Gregory.Murphy@ncleg.net; Larry.Potts@ncleg.net; Michele.Presnell@ncleg.net; 

Dennis.Riddell@ncleg.net; David.Rogers@ncleg.net; Stephen.Ross@ncleg.net; Jason.Saine@ncleg.net; Wayne.Sasser@ncleg.net; John.Sauls@ncleg.net; Mitchell.Setzer@ncleg.net; Phil.Shepard@ncleg.net; Carson.Smith@ncleg.net; Sarah.Stevens@ncleg.net; Larry.Strickland@ncleg.net; John.Szoka@ncleg.net; John.Torbett@ncleg.net; Harry.Warren@ncleg.net; Donna.White@ncleg.net; Larry.Yarborough@ncleg.net; Lee.Zachary@ncleg.net; Jeffrey.McNeely@ncleg.net

* Limitations of certain email programs and spam filters may require you to send this message to smaller lists. If that is necessary, the above email list is conveniently split into three parts to allow you to easily send this message three times—once to each list.

About Istation’s threats of legal action against NC educators

Up to this point, I haven’t said much in public about multi-million dollar ed-tech corporation Istation’s threats of legal action against me and two other North Carolina educators. But since the media has now published those threats, I thought they deserved a response.

About a month ago it was brought to my attention that a significant change in the way we evaluate our children’s reading ability in North Carolina was flying very much under the radar.  I began to research the matter and was dismayed by what I learned.  

I learned that there were some crucial differences between the mClass assessment tool and the Istation assessment tool which had resulted in an evaluation committee overwhelmingly recommending that North Carolina’s schools continue using mClass.

I learned that our state superintendent had disregarded that committee’s input and awarded the contract to Istation anyway.

I learned that the superintendent and his representatives had claimed repeatedly that the committee had not recommended mClass.

I learned that documentation existed that would show those claims to be false.

I learned that the level of fear within the Department of Public Instruction is so high that it’s very difficult to get current–and even, in some cases, former–employees to talk about what goes on there.  But it’s not impossible.

I believe that the people of North Carolina deserve government that is truthful and transparent.  I believe the policies of our education system should be informed by the consensus of the people who are most knowledgeable about how they will affect our children.  I believe that those children deserve best practices in their classrooms that will lead to the brightest future they can have.

With those goals in mind, I have put countless hours into researching this matter over the last month.  I have spoken with dozens of people who are deeply invested in education and government in North Carolina and learned much about everything from dyslexia to procurement rules.  

Along the way I have passed along a lot of information that I felt would be helpful to the public in understanding an issue that deeply impacts our children and therefore the future of all of our communities in North Carolina.  Not one time have I stated anything that I did not believe to be absolutely true. And indeed, when the Department of Public Instruction finally gave in to massive pressure to comply with lawful public records requests and released a trove of records on July 12, those documents substantiated all the claims that had been made about the evaluation committee’s recommendations and revealed even more troubling details about the contract process.

On Monday, Istation’s North Carolina attorneys sent me the following cease and desist letter accusing me of making “demonstrably false, misleading, and defamatory public statements about Istation” based on what they refer to as “unverifiable speculation and unsubstantiated statements.”  They informed me that Istation is considering its legal options against me at this time. It’s a curious PR strategy for a company that you’d think would be focused on winning over North Carolina teachers right now.

[Embedded document: Parmenter cease and desist letter]

It’s unfortunate to see attempts like this to silence educators who simply want the truth and what’s best for our children.  

Thank you to everyone who has reached out over the past few days in support.  My attorney assures me that the truth is an absolute defense against charges of defamation.  With that in mind, no matter what course of action Istation chooses, I know we are in great shape.  

What DPI’s newly released documents reveal—and what they don’t reveal—about Superintendent Mark Johnson’s unilateral Istation contract award

Last week the NC Department of Public Instruction finally released information related to the procurement process that ended with Superintendent Mark Johnson unilaterally awarding a three-year, multimillion-dollar contract for North Carolina’s K-3 diagnostic reading assessment to Istation.

Both Johnson and DPI Communications Director Graham Wilson had previously claimed that the evaluation committee assembled in the fall of 2018 to inform the decision had failed to come to a consensus or make a recommendation.  The records provided by DPI show those claims are absolutely false.

The documents also reveal some important details about the path Johnson took as he disregarded the input of the team of evaluators.  However, the release omits records which will be crucial in substantiating DPI’s version of events.

Here’s what we know based on the records DPI released:

On October 5, 2018, the Request for Proposal (RFP) evaluation team first met under the direction of co-business managers Pam Shue and Amy Jablonski to discuss background for the project, evaluation ground rules, and how the process would work.  The team included both voting members and non-voting members and was made up of DPI employees and a broad collection of subject matter experts.  

Notice the importance of selecting an effective dyslexia screener in the initial project scope as presented to the RFP evaluation committee.  Some of the strongest outcry that has followed Johnson’s selection of Istation has been about the tool’s inability to flag children who are at risk for dyslexia and other specific learning disabilities.  DPI representatives have responded by explaining that dyslexia screening is outside the purview of Read to Achieve and is not the state’s responsibility, as DPI Director of K-3 Literacy Tara Galloway told the State Board of Education last week.

[Embedded document: October project background]

Individual team members were given until mid-November to evaluate the four vendors (Amplify, Istation, NWEA, and Curriculum Associates), at which time they were expected to be prepared to meet, discuss their findings, and come to a consensus ranking which would later be presented to Superintendent Johnson.

The consensus meeting took place on November 19 and 20, 2018.  Notes from the records release indicate that participants were reminded at the outset of the meeting that their goal was to arrive at a consensus on which product should be selected, and that “consensus means general agreement and not unanimity.”

[Embedded document: RFP November consensus meeting notes]

The team discussed their findings in painstaking detail before ranking the products.  They agreed unanimously that Amplify was the best choice. Istation came in second.

[Embedded document: November meeting results and ranking]

On December 4, 2018, Amy Jablonski, Pam Shue, DPI Procurement Officer Tymica Dunn, and Project Manager Srirekha Viswanathan met with Superintendent Mark Johnson to present the committee’s findings in a PowerPoint which is included in the released DPI records.  They told Johnson that the team had selected Amplify’s mClass tool as its top choice to be used as the K-3 reading diagnostic assessment in all of North Carolina’s schools.

The next records DPI provided are from a meeting on January 8, 2019, between Superintendent Mark Johnson and the members of the evaluation team who worked for DPI. The stated purpose of the meeting was a “consensus meeting to recommend finalist for negotiations,” which is odd since the team had already presented its unambiguous recommendation to Johnson the month before.

[Embedded document: January RFP meeting with Johnson]

According to the notes, Mark Johnson began the meeting by thanking those present for their input on the K-3 screener selection.  He gave a speech about the importance of freeing up more time for teachers to teach and the need to provide them with the right tools.  As this was his first reaction to the team recommending that schools continue using the Amplify tool, Johnson’s comments could be interpreted as an attempt to influence the team toward changing their recommendation to Istation (a computer-based tool which Istation advertises as requiring minimal class time).  Johnson then asked the 10 voting members present to vote for the second time and stepped out of the room “to maintain integrity of the process.”

After the superintendent exited the room, team members wrote their choices on sticky notes, and the project manager tallied the results.  Amplify again easily came out on top, with six people recommending negotiations proceed with Amplify only, three with Istation only, and one voting that negotiations continue with both companies.  Pam Shue was tasked with informing Johnson of the committee’s recommendation the next day.

On March 8, 2019, another meeting was held to discuss the procurement.  This time only 8 of the 10 DPI voting members who had been at the previous meeting were present.  Superintendent Johnson was not in attendance, but new General Counsel Jonathan Sink was.  

Sink informed those present that the procurement process was being cancelled.  According to the notes, he gave two reasons for the cancellation. The first reason was that a voting member of the evaluation committee had breached confidentiality on the procurement process.  The second reason provided was that there had been no unanimous consensus in selecting a vendor for the K-3 reading assessment.

[Embedded document: March RFP meeting with Jonathan Sink]

There are a couple of important things to note here.  First of all, Sink gave no additional detail on the alleged confidentiality breach at the meeting, and the records DPI released include no information about exactly what the breach was or the identity of the person responsible.  

Given DPI’s pattern of dishonesty on the procurement and Mark Johnson’s apparent desire to award the contract to Istation, it’s fair to wonder whether a breach really occurred.  If it did, records detailing the breach should have been provided to the public as information relevant to the procurement process. Nothing in North Carolina public records law prevents DPI from releasing that information and corroborating the claim.

Secondly, remember that the evaluation team had been informed from the beginning of the RFP process that “consensus means general agreement and not unanimity,” so the lack of unanimous agreement does not seem to be a valid reason for cancelling the procurement.  Indeed, it’s hard to imagine procurements in general being successful if the process required those involved to unanimously agree.

After the March 8 meeting, the RFP process was cancelled and restarted with a smaller evaluation committee which had very little expertise in literacy or teaching.  The new committee selected Istation as the vendor, and Mark Johnson announced the contract award to the public on June 7.

Mark Johnson appears to believe that the legislation which called for the procurement gave him sole decision-making authority on the K-3 reading assessment and, as such, that he could simply disregard all the work done by the evaluation team.  The exact wording of the bill in question reads  “…the Evaluation Panel, with the approval of the State Superintendent, shall select one vendor to provide the assessment instrument…”  

Ultimately, it may be up to a judge to determine whether Johnson had that authority.

Amplify has filed a protest with the Department of Public Instruction under the North Carolina Administrative Code, and representatives of Amplify and DPI will meet for the first time to discuss the protest on Thursday, July 18.

Amplify is protesting under North Carolina General Statute 150B, which mandates that government contract awards be reversed in a number of circumstances, including when the state agency that awarded the contract “acted arbitrarily or capriciously.”  Depending on the outcome of that meeting, the procurement may then be reviewed by the Department of Information Technology, and the decision would be subject to appeal in superior court.

DPI releases documents showing NC Superintendent Mark Johnson lied about K-3 screener procurement

Late Friday afternoon the Department of Public Instruction finally responded to Lord knows how many requests by the public seeking records about the mClass/Istation procurement process.

The documents DPI made available confirm claims made by former DPI Division Director of Integrated Academic and Behavior Services Amy Jablonski that the initial Request for Proposal (RFP) evaluation committee recommended that DPI adopt mClass as the tool to evaluate K-3 reading achievement as required under Read to Achieve.

They also show that both NC Superintendent Mark Johnson and his staff were lying when they said the team had not recommended mClass.

When the controversy first broke over Superintendent Mark Johnson unilaterally selecting Istation, DPI spokeswoman Jacqueline Wyatt denied to the Raleigh News & Observer that the RFP committee had reached a consensus and recommended mClass. In an email to representatives of the North Carolina School Superintendents Association, Johnson repeated the claim, saying “No consensus was reached by the RFP evaluation committee, so there was no recommendation.”

The documents, released today after weeks of pressure from the public, clearly show that the evaluation committee ranked Amplify as its number one choice in nearly every category and easily first overall.

You can see the relevant portion of the released documents below. A complete upload of what DPI released (including some caveats by Mark Johnson) is posted here.

[Embedded document: RFP committee results on K-3 screener]

Court documents allege CEO sought to influence research on Istation

In a lawsuit filed against Istation in 2010, company co-founder and former CEO George Grayson alleged that current CEO Richard Collins had proposed a plan to give a “gift” of $150,000 to his alma mater Southern Methodist University in return for a study “that would be favorable to the company.”

Collins and Grayson founded Istation in Dallas in 1998.  According to court documents, Grayson was Istation’s sole shareholder until 2006 and served as the company’s CEO from its founding until 2007, when he was ousted by Collins following a recapitalization.

In 2010, Grayson filed suit against Istation and Collins for breach of fiduciary duty.  In that complaint, Grayson alleged that, in 2008, Collins had announced to the company’s board a plan to pay for research showing Istation in a positive light:

The allegations add an interesting wrinkle in North Carolina, where the public continues to seek clarity on how Istation was able to win a multimillion dollar K-3 reading screener contract despite the reported recommendation of a broad evaluation team that the contract be awarded to Amplify for its mClass product.  

One criticism of Istation in comparison to mClass has been the relative scarcity of independent research on the tool’s efficacy.  When he announced his decision to give Istation the contract, State Superintendent Mark Johnson referred to the company’s “proven results of helping students grow.”  Interested members of the public reached out to Istation to find out more about those proven results.

Istation obliged by providing links to just a handful of studies, three of which were written by a Dallas professor who, it turns out, actually works for Istation:

In addition to providing some shaky research, Istation is continuing its PR offensive.  Today company president Ossa Fisher spammed state legislators with quotes by anonymous North Carolina public school teachers who, of course, just love the new product.   

[Embedded document: Istation’s anonymous testimonials from NC teachers]

Fisher claims Istation was “designed to make life easier for educators.” Let’s be clear.  Istation was designed to make money. Mr. Grayson’s allegations may shed some light on just how far Istation’s leadership is willing to go to turn that profit.

(h/t Chelsea Bartel for unearthing the court documents)

New Istation tool underwhelms for North Carolina’s dyslexic children

The controversy over Superintendent Mark Johnson’s unilateral adoption of the Istation K-3 reading diagnostic tool against the wishes of a team of experts continues to swirl as the public waits to see if the superintendent will release related records as required by law.

One reason the backlash has been so intense has to do with Istation’s limitations as a screener for dyslexia and other specific learning disabilities.  

Identifying reading difficulties as early as possible is crucial in allowing schools to provide the targeted interventions necessary for student success.  Effective screening for dyslexia requires having a reader produce sounds and read words to determine whether phonological processing problems are present.  

Because Istation is an online tool, its phonological awareness measure is limited to having students listen to a sound and match it with an answer choice instead of actually segmenting individual sounds from a spoken word and producing them on their own. This approach is inconsistent with the International Dyslexia Association’s recommendations on dyslexia assessment and appears to fall short of meeting the North Carolina General Assembly’s mandate that students “receive the necessary and appropriate screenings” as well.  The mClass tool, which has been in statewide use since 2013, aligns much more closely with those requirements.

For its part, Istation continues to claim its tool is “capable of identifying and supporting students with learning disabilities” including dyslexia.  

Now, more detailed claims by both Istation and Amplify–the maker of mClass–have been provided by the North Carolina Department of Information Technology.  Actually, detailed is probably not the right word to use to describe Istation’s claims.

In the original RFP, both companies were asked by the Department of Public Instruction to provide a description of how measures “adequately and accurately identify indicators of risk for dyslexia in grades K-3.” 

Here’s the description Amplify provided in its Response to RFP:

Amplify and the co-authors of DIBELS Next support the use of a universal screener (the DIBELS Next measures assess decoding, accuracy, fluency, and comprehension) for identifying students with reading difficulties, including students at risk of dyslexia. Amplify has developed additional screening measures of Vocabulary, Spelling and Rapid Automatized Naming (RAN), that may also be administered to obtain additional information on risk for dyslexia and possible impact of difficulty on related skills.

The Vocabulary(VOC) measure is an indicator of a student’s general vocabulary knowledge. It assesses a student’s depth of knowledge of a set of grade level high frequency and high utility words that are used across domains (Tier 2 words; Beck, McKeown, & Kucan, 2002) and content specific words and whether a student has strategies for making meaning of words encountered in text. It incorporates a variety of tasks that vary across grade levels and allow for assessing multiple dimensions of vocabulary knowledge across multiple contexts. The tasks included in each grade level are those that produced the most reliable results for that grade. The number of items increases in each grade level as students’ ability to sustain attention increases with age.

The first task administered to all students is the two-question vocabulary task adapted from the work of Kearns & Biemiller (2010). In this task, students are asked two yes or no questions about a target word. This requires deeper knowledge of a word than many traditional tasks as the same word is included in two contexts or questions. Evidence of the reliability, validity and sensitivity to differences in students vocabulary skills has been documented (see Kearns & Biemiller, 2010). In grades 1 through 3, students also complete traditional fill in the blank questions. They read a sentence with a missing word and select the word that best completes the sentence from a set of four words. Distractor or incorrect responses for each item include words that look or sound similar to the target word, words that mean almost the same thing as the target word but are not correct in the context of the sentence, or are related to the target word or sentence context but are not correct in the context of the full sentence.

Finally, grades 2 and 3, students complete items that require matching a word to its basic definition. The words included in this question type are words that are included in earlier portions of the assessment. The purpose is to see if a student has basic knowledge of the definitions of these same words. 

The Spelling measure is an indicator of a student’s level of general spelling skills. It is designed based on the principles of General Outcome Measurement and Curriculum-Based Measurement (CBM; Deno, 1992). Assessments from this approach are designed to efficiently screen for students who are at-risk for difficulty – they are brief assessments of critical skills that are sensitive to student learning and growth overall. CBM measures do not assess all skills within a domain but provide a snapshot of a student’s skills in a given area using tasks that are instructionally useful and can be reliably administered (Deno, 2003).

The Amplify spelling measure incorporates the key features of CBM Spelling measure design and administration. It is administered on a computer or tablet so typical procedures for administering the measure were modified to fit the software environment. The target word is spoken (by the computer) and the student uses letter tiles to spell the word. Both correct letter sequences (CLS) and words spelled correctly (WSC) are calculated.

The Rapid Automatized Naming (RAN) measure indicates how quickly students can name aloud numeric symbols. While there is not strong agreement in the field of exactly which cognitive processes RAN is measuring, a large body of evidence has documented RAN as one of the best predictors of overall reading skill, including word reading, text reading, reading fluency, and comprehension and an area difficulty for students with reading disabilities (Araujo, Reis, Petersson, & Faisca, 2015). Deficits in rapid automatized naming have also been shown to be a robust indicator of risk for dyslexia in children (Gaab, 2017). 

Because of the strong predictive relationship that RAN displays with tasks that measure various reading skills, researchers hypothesize that completion of RAN tasks requires the coordination of multiple processes. “The seemingly simple task of naming a series of familiar items as quickly as possible appears to invoke a microcosm of the later developing, more elaborated reading circuit” (Norton & Wolf, 2012, p. 427). The full circuit requires coordination of attention, working memory, visual processing, phonological processing, etc., individual processes also required for reading. 

The addition of these measures is in line with the definition of dyslexia included in NC HB 149. These measures provide additional information about a student’s processing, spelling, and decoding abilities. In addition, the Vocabulary screener and the existing mCLASS:Reading 3D measures allow teachers to continue to understand the overall reading skills of students and potential “secondary consequences” or problems in reading comprehension or vocabulary as described by North Carolina’s definition of dyslexia. The addition of these measures in the screening process aids in the development and implementation of targeted interventions with ongoing progress monitoring through a multi-tiered system of supports. Progress monitoring data should then be used to determine whether additional assessment and evaluation are needed for the student.

The Vocabulary measure is administered on the computer. Each item and all answer options are spoken (by the computer) to the students who then select their answer choice. The Rapid Automatized Naming and Spelling measures are administered on the online student testing platform. To administer this assessment, an educator enables the measures from Online Assessment Management portal then logs into the student account to launch the RAN and Spelling assessment with the individual student. Both the teacher and student can view the screen and listen to the audio prompts that guide the student through a model, practice, and each assessment item. The teacher controls the input device and selects student responses.

For students with adequate self-regulation and computer skills, teacher assistance may not be needed for Spelling measure and students can interact directly to enter their response.

A student may be flagged as demonstrating additional risk for reading difficulty including dyslexia when the DIBELS Next composite result is in the Well Below Benchmark range and results from either the Spelling and RAN measures are in the Well Below Benchmark range. Educators will have the option to include this information in the Home Connect Letter to share the student’s performance on these additional measures with parents. Please see Appendix G (pgs. G-1 – G-23) for more information on the Vocabulary, Spelling, and RAN measures.  
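That last paragraph describes a simple decision rule. To make it concrete, here is a minimal sketch of the logic as I read it; the benchmark label, field names, and function name are my own shorthand for illustration, not anything from Amplify’s actual software.

```python
# Minimal sketch of the flagging rule described in Amplify's RFP response above.
# Labels and names here are illustrative shorthand, not Amplify's actual API.

WELL_BELOW = "Well Below Benchmark"

def flag_additional_risk(composite: str, spelling: str, ran: str) -> bool:
    """Flag additional risk for reading difficulty, including dyslexia, when the
    DIBELS Next composite is Well Below Benchmark AND at least one of the
    Spelling or RAN measures is also Well Below Benchmark."""
    return composite == WELL_BELOW and (spelling == WELL_BELOW or ran == WELL_BELOW)

# Example: composite and RAN both Well Below Benchmark -> student is flagged
print(flag_additional_risk(WELL_BELOW, "Below Benchmark", WELL_BELOW))  # True
```

The point is simply that the flag combines the existing composite score with the new Spelling and RAN measures rather than relying on any single score.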

Now take a look at the “meh, we got that covered” response from Istation:

That’s literally all they had to say about it.  And somehow, it was enough for Mark Johnson.

New documents show DPI dismissed dyslexia screening deficiency in selecting Istation tool

Newly released documents obtained through a public records request with the NC Department of Information Technology reveal that the Department of Public Instruction was fully aware of Istation’s dyslexia screening shortcomings–and chose to purchase the tool anyway.

When the North Carolina Department of Public Instruction originally issued its Request for Proposal (RFP) for a K-3 literacy assessment tool, one requirement was that vendors explain how their product would identify students at risk for dyslexia.  

The image you see below is from the RFP that DPI released in the fall of 2018.  Business Specification #8 requires vendors to provide a description of how measures “adequately and accurately identify indicators of risk for dyslexia in grades K-3.” 

The RFP includes a link to legislation passed by the General Assembly in 2017 which mandates that students with specific learning disabilities such as dyslexia “receive the necessary and appropriate screenings” and tasks local boards of education with reviewing “diagnostic tools and screening instruments used for dyslexia…to ensure that they are age-appropriate and effective.”

Clearly it was a priority in the fall of 2018 that the Department of Public Instruction procure a reading assessment tool that could be used to identify students who are at risk for dyslexia.  That was the understanding of the two robust evaluation committees that thoroughly reviewed the available vendors before recommending to Superintendent Mark Johnson in December of 2018 that mClass was the best choice for North Carolina’s children. It’s also an approach consistent with this Dyslexia Topic Brief produced by DPI’s Exceptional Children division in 2015, which points out that “Assessments that serve as screening tools can provide early warning indicators of students who are at risk of reading failure.”

After the RFP evaluation committees both recommended mClass, the RFP process was cancelled.  DPI has yet to explain what led to the cancellation, although last month a spokeswoman for the department referred in a cryptic statement to “actions that jeopardized the legality of the procurement.”  

Whatever the true cause for the RFP cancellation, the procurement process was restarted in the spring with a new evaluation team, and Requests to Negotiate were sent to two vendors:  Amplify (the company which produces mClass) and Istation. DPI’s Contract Award Recommendation document lays out the following timeline for what occurred:

According to DPI’s letter, the Evaluation Committee that chose to award the contract to Istation consisted of the following individuals:

Note that it’s a much narrower team than the roughly 20-25 knowledgeable statewide education leaders–including specialists in general education, special education, and English language learner services, school psychologists, representatives of institutions of higher education, and dyslexia experts–who made up the two committees that originally evaluated the available screeners before recommending mClass to Mark Johnson.  You know, right before the RFP was cancelled.

The team that chose Istation specifically cited the program’s lack of a separate dyslexia component as a weakness (while also holding up Amplify’s dyslexia component as a strength of mClass):

In explaining the final choice, the Evaluation Committee again noted Istation’s deficiencies with regard to screening for dyslexia–and then dismissed them as being outside the scope of the procurement.

The law referenced here by DPI (GS 115C-83.1) states the primary goal of the Read to Achieve program:

The goal of the State is to ensure that every student read at or above grade level by the end of third grade and continue to progress in reading proficiency so that he or she can read, comprehend, integrate, and apply complex texts needed for secondary education and career success.

I’d argue that effective dyslexia screening is part of ensuring “that every student read at or above grade level by the end of third grade,” and that it should have been considered a “primary obligation of this procurement.”

For a young child, failing to have a learning disability detected early on can completely alter his or her life trajectory.  

Our students deserve better than this.

Research shared by Istation deems Istation an inadequate substitute for human teachers

The “e-learning” company Istation has been engaged in a public relations offensive ever since NC Superintendent Mark Johnson’s early June announcement that he had awarded a three-year, multimillion-dollar contract to the company for use of its K-3 literacy assessment tool led to massive pushback from North Carolina’s public school parents, teachers, and superintendents.

It isn’t going very well.

Johnson’s unilateral decision disregarded the input of two teams of professional educators who overwhelmingly recommended that students continue using the mClass tool, which has been in North Carolina schools since 2013.  The aftermath has seen both public outcry and an official protest by Amplify, the company that produces mClass.

One key difference between the two products is that mClass requires one-on-one interaction between student and teacher, while Istation has young children sit and work alone on a computer.

When he announced the news about Istation, Johnson referred to the company’s “proven results of helping students grow.”

Durham school psychologist Dr. Chelsea Bartel, whose research focused on identifying and implementing effective interventions to address skill deficits, reached out to Istation to find out more about those proven results.

Istation obliged by providing links to a handful of studies, three of which were written by a Dallas professor who, it turns out, actually works for Istation:

But it was an independent study by Tarleton State University professor and education researcher Rebecca Putman that caught Dr. Bartel’s eye.

As provided by Istation, Putman’s study “Technology versus teachers in the early literacy classroom: an investigation of the effectiveness of the Istation integrated learning system” is behind a paywall.  You’d have to pay $39.95 to actually read anything beyond an abstract that mentions Istation’s “statistically significant effect” on kindergarten literacy skills.

If you don’t have that much cash to spare, you could just read the Istation.com summary of the article entitled “Does Istation’s Technology Improve Learning? Research Says Yes!” by Istation’s digital marketing manager Rachel Vitemb.  Not surprisingly, the Istation summary of Dr. Putman’s research focuses exclusively on positives, crediting the software with “improved students’ letter-sound knowledge as well as their ability to hear and record sounds and write vocabulary.”

But if you only read the Cliff’s Notes version provided by Istation’s marketing team, you’d be missing out on some of the study’s most important conclusions.  Fortunately, the study’s author graciously provided Dr. Bartel with the full piece free of charge.

Dr. Putman’s article explains that she gave kindergarten students the ISIP-ER assessment at the beginning of the study, then had them spend 135 minutes per week using computer-based Istation reading interventions.  The study lasted 24 weeks. At the end of the study (so, after approximately 54 hours of kindergarten children working alone, wearing headphones and clicking boxes on a computer screen), the students took the ISIP assessment again.  Comparing the results, Dr. Putman found the program to be effective in teaching students early literacy skills such as letter-sound recognition “that require drill and repeated practice.”
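A quick back-of-the-envelope check of that 54-hour figure, using only the numbers reported in the study:

```python
# Back-of-the-envelope check of the ~54-hour figure using the study's reported numbers.
minutes_per_week = 135
weeks = 24
total_hours = minutes_per_week * weeks / 60
print(total_hours)  # 54.0 hours of solo screen time over the 24 weeks
```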

I’d bet $39.95 you could get even better results if you spent those 54 hours working on sound recognition with a human teacher.

Here’s what Istation left out about Putman’s research:

In the full study, Dr. Putman explains that, in addition to assessing Istation for its effectiveness in supporting early literacy achievement, her goal was to determine whether Istation served as what she refers to as a “more knowledgeable other” when compared with a classroom teacher: 

 Another purpose of this study was to investigate whether Istation is an adequate substitute for the more knowledgeable other (MKO) in the classroom. In other words, did this particular application of technology scaffold students’ learning as effectively as a classroom teacher and serve as a MKO? Generally, a MKO refers to a person who has a higher level of understanding and knowledge about a particular topic or concept (Vygotsky 1978).

Her conclusion?  It isn’t.

Istation does not appear to be an adequate substitute for the MKO when it comes to creating meaning and applying early literacy skills to more complex literacy tasks. Based on the data from this study, early literacy skills that require the integration of a variety of literacy skills and strategies, such as reading and comprehending a book, understanding concepts about print, and reading words, seem to require the instruction and feedback of a human, one who is able to interact, provide multidimensional feedback and allow for the student to take on a more active role in the social interaction.

At the end of the study, in a section titled “Implications for use of Istation in early literacy education,” Dr. Putman acknowledges that technology is often seen as a quick fix for literacy problems and calls for more independent research so that it can be incorporated into early childhood classrooms in a way that is healthy for students:

There is increasing pressure on school districts to find quick and efficient solutions to perceived problems in reading achievement, and often, the focus is on improving early reading skills (Paterson et al. 2003). A popular solution to these problems is educational technology. As the use of technology becomes more prevalent in elementary schools, and particularly in early childhood classrooms, there is an increased need for independent research on the relationship between technology and literacy in order to justify (or discourage) districts’ large expenditures and inform their decisions about how to integrate technology into the instructional curriculum (Tracey and Young 2007).

Istation is the wrong choice for North Carolina’s children, and the research that the company itself is sharing just confirms it.

Note:  Chelsea Bartel has conducted an in-depth review of the available research on Istation.  You can read her summaries of the studies and takeaways in her piece “Try Again, Istation.”