Community Partnership System DPS Data Report: Definitions & Descriptions

Data Descriptions

The purpose of this document is to inform program staff and/or evaluators about data points that may be useful for improving programming, and to provide guidelines on how each piece of data should be used. In order to make the best use of DPS assessment data, especially for evaluation purposes, it is essential to understand these and other basic concepts:

Causal vs. Correlative: The distinction between causal and correlative impacts is important to make. A causal relationship means you can show that a program CAUSED a change in student outcomes, whereas a correlative relationship means the two are related but it is not clear what causes the relationship. You will often see a correlative relationship, either positive or negative, but it is considerably more difficult to prove a causal relationship, especially in education, where multiple interventions and/or uncontrolled variables are in play at any given time.

Comparison Group: A comparison group is a group of very similar students who did not receive the intervention. It is sometimes referred to as a control group, although "control group" may imply a different type of experimental design.

Indicator: An indicator is a measure that helps quantify the achievement of a result. It demonstrates whether or not students are progressing toward outcomes.

Outcome: An outcome is the impact you intend to see based on participation in your program.

Output: An output is what your program will DO in order to reach short-, mid- and long-term goals. Examples are number of participants served, number of program hours, curriculum delivered, etc.

Regular Attender, Program Completer, etc.: A student who has received a dosage significant enough that you would expect to see the intended impacts. It is important to differentiate these participants from unduplicated participants.
Short-term, mid-term and long-term goals: In order to reach long-term goals, such as increased academic achievement, it is important to know what you expect to see in the short term to ensure adequate progress is being made.

Unduplicated Participant: A unique student who is counted in only one treatment or control group and who has participated, even minimally, in your program(s). He or she may or may not have received enough dosage to see the benefit. It is important to differentiate this from a regular attender or a participant who completed the intervention.

For information and resources regarding the use of data to drive program improvement and evaluation, please refer to www.denvergov.org/denverafterschoolalliance

For more information email: [email protected]. Page 1 of 13 Revised 11/2013

DPS Data Elements

Aggregate data points available through the CPS Summary Report:

Some or all aggregate data points (including TCAP, FRL, etc.) are only available if the uploaded roster has more than 16 verifiable students with information. This serves to protect student confidentiality.

Attendance Rates: Average attendance rates of students in the uploaded roster, of students attending the school if a site is located at a school, and of students across the district, for each month.

Percent of Student Ethnicities: Percentage of students of each ethnicity for the program, the school site (if a site is located at a school) and the district.

Absenteeism Rates: Percentages of students in the program, school and district whose attendance rates fall in the Satisfactory (>95%), At Risk (90-95%), Chronic (80-90%), and Severely Chronic (<80%) categories.

Student count by Zip Code for participants in an uploaded roster: The number of students whose primary home address on record with DPS falls within each zip code represented.
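The confidentiality rule described above (aggregates are reported only when at least 16 verifiable students are on the roster) is a form of small-cell suppression. A minimal sketch of the idea follows; the function and threshold names are illustrative, not part of the actual CPS report.

```python
# Small-cell suppression sketch: report an aggregate only when the group
# is large enough to protect student confidentiality.
MIN_GROUP_SIZE = 16

def reportable_aggregate(values, min_group_size=MIN_GROUP_SIZE):
    """Return the group mean, or None when the group is too small to report."""
    if len(values) < min_group_size:
        return None  # suppressed to protect student confidentiality
    return sum(values) / len(values)

small_roster = [0.92] * 10        # 10 verified students -> suppressed
full_roster = [0.95, 0.90] * 8    # 16 verified students -> reported

print(reportable_aggregate(small_roster))  # None
print(reportable_aggregate(full_roster))   # 0.925
```

The same gate applies to any of the aggregates above (TCAP proficiency rates, FRL percentages, ethnicity breakdowns), not just means.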
Content-Specific TCAP Proficiency Rates (Reading, Writing, Math, Science, Lectura, Escritura): Percentage of students in each of the four proficiency bands (Advanced, Proficient, Partially Proficient, Unsatisfactory) and percentage of students with no score. These are given for program participants, schools and the district. A mid-line is given for the percentage of students at or above proficiency. These columns of data are only available if the number of verified students in the roster who took each specific test is 16 or greater.

TCAP MGP (Transitional Colorado Assessment Program Median Growth Percentile): Aggregate summary of growth scores for the uploaded roster, school site and/or district. Represents how much a group of students grew from one year to the next compared to students with similar TCAP performance in the past. See below for an explanation and for how it can be used. This data is only available if the number of verified students in the roster who took the test is 20 or greater.

FRL (Free/Reduced Lunch): The percentage of students who are eligible for free or reduced-price lunch. Because it is based on income eligibility, this is an indicator of income/poverty levels. Percentages are reported for the uploaded roster, school site and/or district. Because this data is highly confidential, it will not be reported for group sizes smaller than 16, and student-level data will not be available. This data is only available if the number of verified students in the roster is 16 or greater.

Student-level data points available through the CPS Detail Report:

Demographic Data

In many instances demographic data can be used to target and individualize services to students and families. It should also be used to ensure programs are reaching the target population.
State Student ID: This identification number is different from the DPS ID number and travels with the student throughout the state.

School Number and School Name: The school number and name associated with a student's current enrollment.

Program Count: The number of programs a student is currently enrolled in, as reported in CPS. These could be multiple programs within one organization or in different organizations.

Zip Code: Zip code of the student's main residence.

ELA (English Language Acquisition) Flag: Current means a student is currently designated as an ELL (English Language Learner), No means the student is not currently designated as an ELL, and Past signifies the student may have been designated previously. Each district school has an Instructional Services Advisory (ISA) team. ISA teams are responsible for placing English language learners in ELA program services, classifying them as English language learners, reviewing their progress while receiving program services, recommending them for exit from program services, and monitoring students for one year after they have exited from ELA program services.

SPED Flag (Special Education): Current, Past or No. Eligibility for special education services is based on a wide variety of special needs, ranging from vision or hearing impairments to ADD/ADHD, autism and emotional disability.

GT Flag (Gifted and Talented): These students typically represent the highest-achieving 10% of the school, as evidenced by their body of work, and may be eligible for GT services. Students typically performing in the top 1%-3% may be eligible to attend a Highly Gifted and Talented Magnet program.

Grade: Grade level currently enrolled.

Gender: Male or Female.

Ethnicity: Ethnicity categories include: American Indian; Black; Asian; Hispanic; White, not Hispanic; Pacific Islander; Two or More Races.

Credits Attempted: The number of credits attempted within the previous 365 days. An average semester course load is 30 credits, with an annual course load of 60.
Attendance Data

Attendance data can be used to provide services to students specifically struggling with day school attendance. It is important to work with students and families to understand the root causes, which may include transportation, sibling care, disengagement from school, peer relations, etc. Student attendance data can also be used as a student-level outcome (did this student's attendance rate increase or decrease during the time they were enrolled in the program? are participant attendance rates higher or lower than the school average?), although it should be noted that trends show a student's day school attendance varies throughout the school year.

Attendance data is calculated based on the most recent 365 days of the enrollment period of the CPS site (as defined in the CPS site record), which is the period a student was being served by a program. If a student is enrolled for one year or longer, the data is limited to the previous 365 days. The attendance rate is cumulative for the entire enrollment period or year. For example, if a site has an enrollment period of February 1 through June 1, the time span over which the attendance data is reflected is the four-month period between February 1 and June 1.

Day school attendance rates are a measure of how much a student is in school. Research shows that students who are not in school at least 90% of the time are less likely to master the content being taught. In addition to the overall rates as defined above, attendance rates, absenteeism and tardy rates are also given for the current month (CM), current quarter (CQ) and current half (CH).

YTD Attendance Rate: Attendance rates are based on the percent of time students are "present," calculated from enrolled and absent minutes for attendance and instructional days.
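A sketch of this rate calculation, together with the risk bands used in the YTD Absenteeism categories, might look like the following. This is illustrative, not DPS code; the exact handling of values that sit on a cut point (e.g., exactly 95%) is an assumption, since the report lists the band boundaries as overlapping ranges.

```python
# Illustrative sketch of the YTD attendance rate (percent of enrolled
# time a student was present) and the absenteeism risk bands.

def attendance_rate(present_minutes, enrolled_minutes):
    """Percent of enrolled time the student was present."""
    return 100.0 * present_minutes / enrolled_minutes

def absenteeism_band(rate_pct):
    """Map a YTD attendance rate to the report's risk categories.

    Boundary handling at the cut points is an assumption; the report
    lists the ranges as 95-100, 90-95, 80-90, and 0-80.
    """
    if rate_pct >= 95:
        return "Satisfactory"    # shown Light Green
    if rate_pct >= 90:
        return "At Risk"         # shown Gold
    if rate_pct >= 80:
        return "Chronic"         # shown Orange
    return "Severe Chronic"      # shown Tomato

rate = attendance_rate(present_minutes=41_800, enrolled_minutes=44_000)
print(round(rate, 1), absenteeism_band(rate))  # 95.0 Satisfactory
```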
Attendance rates are cumulative for each enrollment period (or up to 365 days) to create a rating for the entire time span of the site enrollment period.

YTD Absenteeism: A risk factor is given based on attendance rates. (Note that these colors are different from the cohort graph colors.)

• Satisfactory (95%-100% attendance rate): Light Green
• At Risk (90%-95% attendance rate): Gold
• Chronic (80%-90% attendance rate): Orange
• Severe Chronic (0-80% attendance rate): Tomato

YTD Tardy Rate: The percentage of time students have been tardy to school over the time span of the program site. Tardy rates are cumulative for each enrollment period (or 365 days) to create a rating for the entire time span of the site enrollment period.

In School Behavior

Student behavior records include referrals, in-school suspensions, out-of-school suspensions and expulsions. Behavior and discipline policies, as well as tracking protocols, can vary across schools, making it difficult to compare behavior records across schools. Referrals are not included in the CPS report. Similar to attendance, behavior data can be used to target interventions or services to students who are struggling with behavior. Program staff should work with students and families to better understand the context of behavior incidents in order to support students.

In School Suspensions: The number of in-school suspensions given as a resolution to a behavior event.

Out of School Suspensions: The number of out-of-school suspensions given as a resolution to a behavior event.

Academic Status

Academic scores, grades and credits can be used to target interventions based on specific academic needs. Some assessments give scores within specific domains (e.g., comprehension levels vs. oral reading scores) and can be used to group students based on content needs. Many schools have data coordinators and/or data teams that are valuable resources for better understanding these specific assessments and how students can be targeted based on needs. When used as student-level outcomes, academic status scores should be used carefully.
Work with an experienced and knowledgeable evaluator to determine which scores can be used to measure growth or can be compared with those of another group of students to demonstrate correlation. See below for more information on these assessments.

Credits Earned: The number of credits successfully earned (passed) within the previous 365 days. An average semester course load is 30 credits, with an annual course load of 60.

Grades A-F: The number of each grade earned within the previous 365 days in core subjects (Reading, Writing, Math, Science).

TCAP Content-Specific Proficiency Levels and Date: The proficiency level the student's score falls within for the most recent score available in each subject. Students who do not have a score in the most recently available test administration will reflect a "no score." Previous scores will not be available.

TCAP Content-Specific Scale Scores and Date: The student's scale score for the most recent score available in each subject. Note that scale scores cannot be compared across subjects or grade levels. Students who do not have a score in the most recently available test administration will reflect a "no score." Previous scores will not be available.

DRA Scores and Dates: Scores for the fall, winter and spring assessments.

STAR Scores: Scores for the fall, winter and spring assessments.

DPS Assessments

The following assessments measure how well a student understands concepts and possesses specific skills in a particular educational area. The assessment scores represent a snapshot of achievement at a specific point in time. As with any assessment, it is important to remember that many factors can affect a student's scores, and these assessments give only one picture of how well a student is doing in school.
TCAP: The Transitional Colorado Assessment Program

Background: TCAP, known prior to 2011-12 as CSAP (the Colorado Student Assessment Program), is the Colorado standardized test measuring proficiency and growth in Reading, Writing, Math and Science for grades 3-10. TCAP is offered in English for all grades and in Spanish for reading and writing in grades 3 and 4. There is an alternative version of this assessment designed specifically for students with significant cognitive disabilities, called the Colorado Alternate (CoAlt), formerly known as CSAPA (Colorado Student Assessment Program Alternative Assessment). Scores from CoAlt are not included in the CPS report.

TCAP Proficiency Levels: TCAP uses scale scores instead of raw scores so that scores can be compared across different time periods and versions of the assessment. Achievement levels have been developed to measure a student's performance relative to the Colorado Model Content Standards and translated to the Common Core State Standards (CCSS). The four achievement levels, called proficiency bands, are Advanced (A), Proficient (P), Partially Proficient (PP), and Unsatisfactory (U). Proficiency is defined as scoring in the Proficient or Advanced bands; non-proficient students are those scoring in the Partially Proficient and Unsatisfactory bands or receiving a No Score.

Growth Scores: The Colorado Growth Model is a statistical model that shows how individual students (and groups of students) progress from year to year toward state standards. Each student's progress is compared to the progress of other students in the state with a similar score history on CSAP in that subject area. For an individual student, growth is a measure of progress in academic achievement. In Colorado, growth is expressed not in test score point gains or losses, but in student growth percentiles. An individual's test scores are used as the basis for a growth calculation, using a statistical model called quantile regression. The student growth percentile tells us how a student's current test score compares with those of other similar students (students across the state whose previous test scores are similar). This process can be understood as a comparison to members of a student's academic peer group.
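The peer-group idea above can be illustrated with a deliberately simplified toy. The real Colorado Growth Model fits quantile regressions over full score histories; this sketch only ranks a student's current score among a hypothetical "academic peer group" of students whose prior-year scores were similar, which conveys the intuition but not the actual method.

```python
# Toy illustration of a student growth percentile: the student's current
# score ranked among academic peers (students with similar prior scores).
# NOT the actual quantile-regression model used by Colorado.

def toy_growth_percentile(student_current, peer_current_scores):
    """Percent of academic peers whose current score the student matched or beat."""
    at_or_below = sum(1 for s in peer_current_scores if s <= student_current)
    return 100 * at_or_below // len(peer_current_scores)

# Hypothetical current-year scale scores for ten peers with similar histories
peers = [430, 445, 450, 455, 460, 470, 475, 480, 490, 500]
print(toy_growth_percentile(470, peers))  # 60
```

A result of 60 corresponds to the statement in the text that such a student "grew as well or better than 60% of her academic peers."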
Colorado's measure of growth is therefore a normative rather than an absolute one. A student growth percentile of 60 indicates the student grew as well as or better than 60% of her academic peers. It is not about how the recent test score compares to all other test scores; even students with very low test scores can receive high growth scores.

Median Growth Percentile (MGP): The MGP summarizes student growth rates by district, school, grade level, or other group of interest. The median is calculated by taking the individual student growth percentiles of all the students in the group being analyzed, ordering them from lowest to highest, and identifying the middle score: the median. The median may not be as familiar as the average, but it is similar in interpretation; it summarizes the group in a single number that fairly reflects the group as a whole. (Medians are more appropriate than averages when summarizing a collection of percentile scores.) When reporting growth percentiles it is recommended to use a group size of 20 or greater. Results are made available mid to late August and are regularly the last piece of TCAP data available.

Who takes the TCAP? All students in grades 3-10 take the reading, writing and math assessments. Students in grades 3 and 4 who qualify can take the Lectura and Escritura assessments. An additional science assessment is administered in grades 5, 8 and 10. It is important to note that in order to receive a growth percentile, students need a valid English CSAP/TCAP score in two consecutive years with a typical grade-level progression (e.g., third grade to fourth grade). Therefore, growth scores are not available for students in 3rd grade and below, nor in science, as students do not take the Science TCAP in two consecutive years.
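The MGP calculation described above (order the group's growth percentiles, take the middle value, and only report for groups of 20 or more) can be sketched directly; names here are illustrative.

```python
# Sketch of the Median Growth Percentile: the median of a group's
# student growth percentiles, gated on the recommended group size.
from statistics import median

def median_growth_percentile(growth_percentiles, min_group_size=20):
    """Median of the group's growth percentiles, or None if the group is too small."""
    if len(growth_percentiles) < min_group_size:
        return None  # below the recommended reporting threshold of 20
    ordered = sorted(growth_percentiles)  # lowest to highest, as described
    return median(ordered)

group = [22, 35, 41, 44, 47, 50, 52, 55, 57, 58,
         60, 61, 63, 66, 68, 71, 74, 78, 83, 90]
print(median_growth_percentile(group))  # 59.0 (midpoint of 58 and 60)
```

Using the median rather than the mean keeps one or two extreme growth percentiles from distorting the group summary, which is why medians are preferred for percentile data.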
What does it measure? TCAP assesses students' proficiency in Math, Reading, Writing, and Science. The Colorado Growth Model tracks how a student's performance from year to year measures up to that of their academic peer group: how much content a student has learned in a given year relative to his or her academic peers. The Median Growth Percentile is used to describe the level of relative growth reached by a typical student in a larger group (either the program in question or a school).

When is it administered and when are scores available? TCAP is administered for 3rd grade reading in late February to early March. All other sections of the TCAP are administered in early/mid March to late March. TCAP results are available mid to late August; third grade reading scores are available late April to early May.

How is the data to be used?

Scale Scores: TCAP uses scale scores instead of raw scores so that scores can be compared across different time periods and versions of the assessment. The scoring range for each achievement level differs by grade level and content area. For example, the scoring range for Performance Level 1 for Reading Grade 3 is 150 to 465, while the scoring range for Performance Level 1 for Reading Grade 4 is 180 to 516. Therefore, scale scores cannot be averaged across grade levels or subjects, as the scales for each grade and subject may differ. Based on their scale scores, students fall within one of four proficiency bands: Advanced, Proficient, Partially Proficient or Unsatisfactory. Students may also receive a No Score (NS).

A common and useful way to look at TCAP status scores is to compare the percentage of students who actively participate in a program who are proficient (including Proficient and Advanced) against other students, either other students in the school or other students in the district. When reporting the percentage of students at proficient or above, it is recommended to use a group size of 16 or greater.

Growth Scores: For an individual student, growth is a measure of progress in academic achievement. In Colorado, growth is expressed not in test score point gains or losses, but in student growth percentiles. The student growth percentile tells us how a student's current test score compares with those of other similar students (students across the state whose previous test scores are similar); this can be understood as a comparison to members of a student's academic peer group. A student growth percentile of 60 indicates the student grew as well as or better than 60% of her academic peers. It is not about how the recent test score compares to all other test scores; even students with very low test scores can receive high growth scores.

The test score data underlying these student growth percentiles are not perfectly precise, because they contain measurement error, so the growth percentiles themselves are in turn also not perfectly precise. A student with a growth percentile of 63 may not actually be growing significantly faster than another student with a 60. In a similar way, even though you might not be able to reliably discern a 63-decibel sound from a 60-decibel one, you can still easily categorize different sounds as soft, normal, or loud; finer-grained comparisons are hard to make.
For this reason, student growth percentiles are categorized as "low," "typical," or "high" growth; we can be fairly sure about these large differences, even if small differences may not be reliable or meaningful. A growth percentile of 35-65 represents typical growth, a growth percentile below 35 represents low growth, and a growth percentile above 65 represents high growth.

Median Growth Percentile (MGP): When trying to summarize a group of students, for example a group of students who regularly attended an after school program, the median growth percentile (MGP) can be used. The median is calculated by taking the individual student growth percentiles of all the students in the group being analyzed, ordering them from lowest to highest, and identifying the middle score: the median. The median may not be as familiar as the average, but it is similar in interpretation; it summarizes the group's performance as a whole in a single number. (Medians are more appropriate than averages when summarizing a collection of percentile scores.)

For current and prior TCAP state, school, and district reports visit: http://www.cde.state.co.us/assessment/CoAssess-DataAndResults.asp

For TCAP achievement levels and proficiency levels, visit http://www.cde.state.co.us/assessment/coassess-additionalresources and click on achievement levels.

Links to data/info on CDE's website: http://www.cde.state.co.us/assessment/CoAssess-FrameworksAndFactSheets.asp

For more information please visit: http://www.schoolview.org/ColoradoGrowthModel2.asp

MAP: Measures of Academic Progress

Background: MAP is a computerized academic assessment primarily designed for students in grades 6-12. MAP assessments are adaptive achievement tests in mathematics, reading, language usage, and science.
The difficulty of the test adjusts based on how the student responds to the questions: if the student answers a question correctly, the next question becomes more difficult, and if the student answers incorrectly, the next question becomes easier.

Who takes it? Currently 22 Intensive Pathway schools and five charter schools use the MAP assessment.

What does it measure? MAP measures students' achievement and growth in math, reading, language usage and science. Scores are reported on the RIT (Rasch Unit) scale, an equal-interval scale, so scores can be aggregated to calculate class or school averages (means). Teachers see an overall RIT score and information about each goal performance area.

When is it administered and when are scores available? MAP test results are made available in three windows: fall reports in November, winter reports in February, and spring reports in June.

How can the data be used? The Northwest Evaluation Association (NWEA) compares the fall-to-spring and fall-to-fall administration windows to determine student growth. Students with at least two tests in the same school are included. When there are multiple tests in a testing window, use the student's first test for that window: the student's earliest test in a window is compared to the following window's earliest test, and the difference in RIT scale scores is then evaluated for the School Performance Framework. Only students whose first test and last test were at least 40 instructional days apart are included. Reading, Mathematics, and Language Usage are evaluated separately. It is recommended to use a group size of 16 or greater when aggregating MAP scores.

How is MAP data different from TCAP data? Because students can be tested multiple times each year, MAP data gives more real-time progress on content specific to what the teacher is teaching or has taught.
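The window-to-window growth rule above (earliest test in each window, at least 40 instructional days between tests, RIT difference) can be sketched as follows. This is an illustration, not NWEA's implementation; in particular, calendar days stand in for instructional days here, which is a simplification.

```python
# Sketch of MAP window-to-window growth: compare the earliest test in
# each window and require the tests to be sufficiently far apart.
from datetime import date

def map_growth(window_a_tests, window_b_tests, min_days_apart=40):
    """RIT growth between the earliest tests of two windows, or None if ineligible.

    Each test is a (date, rit_score) tuple. Calendar days approximate
    instructional days here, which is an assumption.
    """
    if not window_a_tests or not window_b_tests:
        return None
    first_a = min(window_a_tests)  # earliest test in the first window
    first_b = min(window_b_tests)  # earliest test in the following window
    if (first_b[0] - first_a[0]).days < min_days_apart:
        return None  # tests too close together to count as growth
    return first_b[1] - first_a[1]

fall = [(date(2013, 9, 20), 215), (date(2013, 10, 2), 218)]
spring = [(date(2014, 4, 15), 224)]
print(map_growth(fall, spring))  # 9 RIT points of fall-to-spring growth
```

Note that the second fall test (RIT 218) is ignored, since only the earliest test in each window counts.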
For more information: http://www.nwea.org/

STAR Early Literacy and STAR Reading

Background: The STAR assessments (STAR Reading and STAR Early Literacy Enterprise) are computer-adaptive tests used in DPS classrooms. They are aligned to state standards and provide information about student mastery of those standards, so the success of student learning can be judged as it happens.

Who takes it? Only K-5th graders are required to take the tests; however, some teachers do test students in other grades. Tests can be administered any number of times; however, when reporting, only tests within the three testing windows should be considered.

Students who take STAR Early Literacy include:

• Students in grades K-2 without previous STAR test history, or K-2 students with a previous STAR Early Literacy scale score <775
• Students in grades 3-5 who do not pass the STAR Reading practice questions

Students who take STAR Reading include:

• Students in grades K-2 with a previous or current scale score ≥775 on STAR Early Literacy (Literacy Classification "Probable Reader")
• Students in grades 3-5

What does it measure? Students are tested on the five areas of reading, organized in the following way:

• The kinds of words students know and are able to use
• How well students understand what they read
• How well students can take apart a story and figure out what it means
• Students' ability to understand and evaluate what a story or article explained

STAR scores are grouped into four performance levels:

A = At/Above Benchmark
W = Watch
I = Intervention
UI = Urgent Intervention

These performance levels are intended to help educators identify which students require some form of intervention to accelerate growth and move toward proficiency.
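The routing rules above for which STAR assessment a K-5 student takes can be sketched as a small decision function. Function and parameter names are illustrative, not DPS identifiers; kindergarten is represented as grade 0 here.

```python
# Sketch of the STAR routing rules: which assessment a K-5 student takes.
# Kindergarten is represented as grade 0 (an assumption for this sketch).

def star_test_for(grade, prior_early_literacy_score=None, passed_practice=True):
    """Return which STAR assessment the rules above assign to a K-5 student."""
    if grade <= 2:
        # K-2: Early Literacy, unless a prior/current scale score of 775+
        # ("Probable Reader") routes the student to STAR Reading
        if prior_early_literacy_score is not None and prior_early_literacy_score >= 775:
            return "STAR Reading"
        return "STAR Early Literacy"
    # Grades 3-5: STAR Reading, unless the practice questions are not passed
    return "STAR Reading" if passed_practice else "STAR Early Literacy"

print(star_test_for(1))                                  # STAR Early Literacy
print(star_test_for(2, prior_early_literacy_score=780))  # STAR Reading
print(star_test_for(4, passed_practice=False))           # STAR Early Literacy
```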
Students performing at the At/Above Benchmark level are meeting the minimum level of performance needed to meet end-of-year grade-level expectations as measured by Colorado state standards.

When is it administered and when are scores available? These assessments are given three times each year (October, December, and April/May), and the STAR data is therefore available in three windows: fall data in October, winter data in January, and spring data in May.

How can the data be used? These assessment scores can be used to identify students' reading levels, check progress in reading, and/or identify needs for additional support.

The STAR Early Literacy Enterprise and STAR Reading assessments are reported in scaled scores (SS), calculated from the difficulty of the items and the number of correct responses. Because the same range is used for all students, scaled scores can be used to compare student performance across grade levels for the same version of the test. Do not combine SS for different tests. STAR Early Literacy Enterprise scaled scores range from 300 to 900 and relate directly to the literacy classifications Emergent Reader, Transitional Reader and Probable Reader. STAR Reading scaled scores range from 0 to 1400. It is recommended to use a group size of 16 or greater when aggregating STAR data.

Percentile rank (PR) is a norm-referenced score that measures a student's reading ability against other students in the same grade nationally. The percentile rank score, which ranges from 1 to 99, indicates the percentage of other students nationally who obtained scores equal to or lower than the score of a particular student.
For example, a student with a percentile rank score of 85 performed as well as or better than 85 percent of other students in the same grade nationally.

Instructional reading level (IRL) is a criterion-referenced score that indicates the highest reading level at which a student is at least 80 percent proficient at recognizing words and understanding material with instructional assistance. For example, a seventh-grade student with a score of 8.0 reads eighth-grade words with 80 percent accuracy or better. IRL scores are Pre-Primer (PP), Primer (P), grades 1.0 through 12.9, and Post-High School (PHS).

For more information:
http://www.renlearn.com/sel/
http://testing.dpsk12.org/resources/star.html

DRA2: Developmental Reading Assessment, 2nd Edition
EDL2: Evaluacion del Desarrollo de la Lectura

Background: The Developmental Reading Assessment, 2nd Edition (DRA2) and the Evaluacion del Desarrollo de la Lectura (EDL2) provide a standardized method for assessing students' reading development. The DRA is intended to be administered by classroom teachers to 1) assess a student's independent reading level and 2) diagnose a student's strengths and weaknesses in relation to accuracy, fluency, and comprehension.

Who takes it? All students in kindergarten through 5th grade take the DRA2 or EDL2 to check their reading skills. Additionally, the Colorado Department of Education has determined that all newly enrolled English Language Learners (ELLs) in grades K-12 take this assessment. Some students may have scores for both assessments; in this case, use the score flagged for accountability for evaluation purposes, and use both scores for instructional purposes.

What does it measure?
DRA2/EDL2 tests student reading levels based on:

• Reading Engagement: how students interact with books away from the classroom
• Fluency: how well words and sentences are read
• Comprehension: how well students understand what they have read

Students should not be tested more than two grade levels above the DPS Target Level Expectations for their current grade level. DRA2/EDL2 scores significantly above or below grade level become unstable and should not be considered accurate on their own. Having students go too far beyond their grade level on the DRA2/EDL2 tends to inflate the impact of word calling at the expense of deep comprehension. Because of this, the district recommends going only two grade levels above on the DRA2/EDL2, and then building a larger body of evidence with running records using benchmark books or DRA2 Progress Monitoring passages, regular ongoing classroom running records, anecdotal notes, STAR, STAR Early Literacy, etc.

When is it administered and when are scores available? The assessment is typically given twice a year (fall and spring), but is sometimes administered mid-year for kindergartners. In the fall, the assessment is administered at the instructional level; this is a higher-level administration that allows teachers to assist the student and develop individual instruction. In the spring, the assessment is administered at the independent level; this is a lower-level administration, as per state requirements.

The DRA2/EDL2 data is available in three windows: fall data in October, spring data in June, and mid-year data (kindergarten only) upon request in March.

How can the data be used? DRA2/EDL2 scores are reported as text levels (see table below), which represent the individual's overall score. The range of values begins with P, then A, then continues numerically from 1 through 80; 80 is the end-of-year grade-level target for 8th grade students. It is important to note that text levels cannot be averaged and cannot be used as scale scores. Commonly used metrics are the percentage of students at or above grade level and the percentage of students at or above the expected text level. It is recommended to use a group size of 16 or greater when reporting on DRA2/EDL2 scores.
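A sketch of the "percentage of students at or above grade level" metric follows, using the K-5 end-of-year text level targets from this report's target table. Names are illustrative, and treating the non-numeric levels P and A as ordered below level 1 is an assumption based on the stated ordering of text levels.

```python
# Sketch of a common DRA2/EDL2 metric: percent of students at or above
# the end-of-year text level target for their grade, with the report's
# recommended minimum group size of 16.

# End-of-year targets by grade, from the target table in this report
END_OF_YEAR_TARGET = {"K": 4, "1": 16, "2": 28, "3": 38, "4": 40, "5": 50}

def text_level_value(level):
    """Order text levels: P, then A, then the numeric levels 1-80 (assumed)."""
    if level == "P":
        return -2
    if level == "A":
        return -1
    return int(level)

def pct_at_or_above_target(grade, text_levels, min_group_size=16):
    """Percent of students at/above target, or None below the reporting threshold."""
    if len(text_levels) < min_group_size:
        return None  # the report recommends groups of 16 or more
    target = END_OF_YEAR_TARGET[grade]
    hits = sum(1 for lv in text_levels if text_level_value(lv) >= target)
    return 100.0 * hits / len(text_levels)

# 16 hypothetical kindergartners; the end-of-year kindergarten target is level 4
k_levels = ["P", "A", 2, 3, 4, 4, 4, 6, 6, 8, 3, 2, 4, 6, 1, 4]
print(pct_at_or_above_target("K", k_levels))  # 56.25
```

Counting students against an ordinal target like this sidesteps the warning above that text levels cannot be averaged.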
The beginning-of-year, mid-year, and end-of-year grade-level text targets for DRA2 and EDL2 test takers are:

Grade           Beginning of Year   Mid-Year   End of Year
Kindergarten    A                   2          4
First Grade     4                   12         16
Second Grade    16                  20         28
Third Grade     28                  34         38
Fourth Grade    38                  40         40
Fifth Grade     40                  50         50

For more information: http://www.scholastic.com/parents/resources/article/book-selection-tips/assess-dra-reading-levels

For more information on the READ Act please visit: http://www.cde.state.co.us/coloradoliteracy/ReadAct/

The following links are available for those with DPS login credentials:

This link displays which assessment score a student will have: http://testing.dpsk12.org/secure/cbla/ELA%20Clarifications%20Document.pdf

For a timeline including the testing requirements please visit: http://testing.dpsk12.org/secure/cbla/CBLA_Requirements_&_Timelines_1213.pdf

For more information on the overall CBLA testing information please visit: http://testing.dpsk12.org/resources/cbla.htm

For target level expectations for 2012-2013 scores please visit: http://testing.dpsk12.org/secure/cbla/DRA2-EDL2%20Target%20Level%20Expectations%20K-8%201213.pdf