Data Warehousing: Not our fathers' spreadsheets

Shiawassee County’s
Data Project
AESA – October 25, 2007
Why a Data Project?
• Student achievement is the goal of all
school districts.
• Resources (time & money) are limited.
• NCLB & Education YES! have created a
high stakes situation for schools.
• The use of data to make informed
decisions is more crucial than ever.
Student Achievement is
the Goal
What are our Needs?
• In 1995, Shiawassee County Schools determined they had a need and desire to enter into a countywide assessment project. As a result, a county assessment committee was formed and a search conducted for a common assessment.
• In 1997, on-line testing began. Seventy-five percent of the districts in the county participated in the project.
Since 1997…
• An on-line testing system was used countywide multiple times a year, every year.
• K-5 testing requirements were instituted by the state.
• NCLB was created and implemented.
• Shiawassee County schools expanded their assessment capabilities and activities as well as their data use and analysis abilities.
As Our Expertise Has
Grown…
• One year of Wahlstrom work/study provided a background in the multiple types of data (outcome, demographic, process, and perception).
• Three years of Data Packets have been used to make limited school improvement decisions.
• Trainings & experiences took place using a variety of data for decision-making purposes (Savvy with SASI, Test Wiz, School Improvement Planning Days, etc.).
• County Assessment Committees were formed & their results analyzed, which further stressed the need for data-based solutions.
• Local Service Planning results found county consensus that data organization and analysis were a need.
• County Assessment Survey and CCIC Surveys discovered districts were looking for similar attributes in a data system.
The Question Now Is…
What do schools need to use data effectively…
– as a means to monitor a program’s impact on student achievement
– to identify the most critical opportunity areas on which to focus school improvement efforts?
A Data Warehouse
• A tool to help districts become data driven in order to meet the requirements of NCLB and Ed YES!
• A collection of various sets of data, found in a variety of unrelated locations and formats, brought into one relational database.
• A system that will allow districts to ask complex questions and find answers that uncover underlying problems – leading to the design of data-driven student achievement and school improvement strategies.
• A program that will incorporate data into a fully relational data warehouse that includes:
  – Financial data
  – Personnel data
  – Building infrastructure data
  – Student demographic data
  – Student achievement data
  – Assessment data
  and answers a variety of diverse and interactive questions easily (a minimal schema sketch follows).
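The slide describes the target architecture only at a high level. As a minimal, hypothetical sketch – the table and column names here are illustrative, not the vendor's – a relational warehouse of this kind keys every data domain to a common student identifier so that demographic, program, and assessment data can be queried together:

```python
import sqlite3

# Minimal, hypothetical star-schema sketch: one student dimension plus
# assessment and program tables, all keyed to the same student_id.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE student_dim (
    student_id   INTEGER PRIMARY KEY,
    grade_level  INTEGER,
    lunch_status TEXT,     -- demographic data
    ethnicity    TEXT
);
CREATE TABLE assessment_fact (
    student_id  INTEGER REFERENCES student_dim(student_id),
    test_name   TEXT,      -- e.g. a state assessment
    school_year TEXT,
    scale_score REAL
);
CREATE TABLE program_fact (
    student_id   INTEGER REFERENCES student_dim(student_id),
    program_name TEXT,     -- e.g. Title One, summer school
    school_year  TEXT
);
""")

# The kind of cross-domain question a single spreadsheet cannot easily answer:
# average score by lunch status for students served by a given program.
rows = con.execute("""
    SELECT s.lunch_status, a.school_year, AVG(a.scale_score)
    FROM assessment_fact AS a
    JOIN student_dim  AS s USING (student_id)
    JOIN program_fact AS p USING (student_id)
    WHERE p.program_name = 'Title One'
    GROUP BY s.lunch_status, a.school_year
""").fetchall()
print(rows)   # empty here, since no rows were loaded into the sketch
```

The closing query is exactly the sort of question that tools addressing only one data source in isolation (next slide) cannot answer.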
Many Programs are Data
Mining Tools
They address the following in isolation:
• Assessments
– Most of these tools do not contain data from other sources
beyond student demographics.
• Student Information Systems
– These packages were not designed to link data from multiple
years with assessment and special program data, or even
teacher data.
• Document Storage
– There is no association to student, teacher and assessment
data in order to identify areas to target for school improvement.
The Answer…
We have experience with data mining
(through MAPS, Data Packets, etc.).
We are ready to move to the next level.
Our local districts all agree an
interactive data warehouse is what
would best meet their needs.
Data Warehouse Timeline
2004-2005 Academic Year
SRESD staff attended multiple vendor demonstrations throughout the state
September 2005
RFI requirements and criteria established
September – October 2005
The opportunity to submit an RFI was given to:
Achieve! Data Solutions
Chancery SMS
Compass
CREST
CRM
dataMetrics Software, Inc.
Edmin.com, Inc
Edsmart
Enterprise Computing Service, Inc
eScholar LLC
Executive Intelligence, Inc.
IBM
Just 5 Clicks
Kent ISD
MI Tracker
Midwest Educational Group
National Study of School Evaluation
Pearson School Systems
Performance Matters
Plato
QSP
Regional Data Services
Riverdeep, Inc.
Sagebrush Corp.
SCHOLARinc
SchoolCity, Inc.
Schoolnet, Inc.
School Interoperability Framework
Skyward
Swiftknowledge, Inc.
TetraData Corp.
TurnLeaf
Data Warehouse Timeline
September – October 2005
continued
RFI received from:
Achieve! Data Solutions, LLC
Edmin.com, Inc.
Edsmart
eScholar LLC
Kent ISD
MidWest Educational Group
Pearson School Systems
Sagebrush, Corp.
SchoolCity, Inc.
TetraData Corp.
November 1- 3, 2005
Prescreening occurred – the following RFIs were eliminated:
Edmin.com, Inc.
Kent ISD
MidWest Educational Group
Sagebrush, Corp.
November 10, 2005
RFI Committee Review of candidates by county data project committee consisting of curriculum directors,
technology experts, principals and teachers:
Achieve! Data Solutions
Edsmart
eScholar, LLC
Pearson School Systems
SchoolCity, Inc.
TetraData Corp.
RFI Committee Review Criteria
Training Requirements
Speed and Efficiency of Data
Easy to Query
Drill Down Capabilities
Longitudinal Data Capabilities (consider # of years as well as ability)
Formats (charts, graphs, etc.) (Graphs in system, including longitudinal, & do not require export)
Pre-formatted Reports
Export Capabilities
Web Based
Multiple Levels
Importability
Proven Data Inputs (data elements are cited that are supported by research)
Customized Fields
Customized Reports/Flexibility
Student Work Tracked
Michigan Curriculum Frameworks/GLCE Addressed
SIF Compliant
Testing Capabilities
Achievement Data
Demographic Data
Process Data
Perception Data
Support
(Each criterion was rated as Critical, Important, or Bonus.)
Narrowing the Field
November 10, 2005
Final two vendors chosen for in-house demonstrations:
Achieve! Data Solutions
TetraData Corp.
Demonstration requirements and scoring criteria were determined by the county data project committee.
December 6, 2005
In-house demonstrations by Achieve! Data Solutions and
TetraData Corp.
The Process
Vendors received demonstration requirements:
• Demonstration for users at six different levels: ISD, Superintendent, Curriculum Director, Principal, Teacher, Parent.
• Samples of queries, pre-formatted reports (drilled down to individual student level), charts and graphs; creation of non-preformatted queries and reports; different types of data “interaction”; a “live” data import.
• Examples of the training model, including content and timing (timeline for set up, data upload, and implementation), as well as an explanation of technical and user support available.
• Explanation of all costs (including components needed, technology costs, start-up costs, annual costs, consultation costs, and training/support costs).
• Explanation of the frequency of updates (how often new data is added to the warehouse and when that data will “show up” in reports/queries).
Demonstration Criteria
SRESD district participants see vendor demonstrations and rate the products
based on the following criteria:
 Ease of use
 Charts & Graphs
 Drill down ability
 Pre-formatted reports
 Pre-formatted queries
 Creation of customized reports
 Creation of customized queries
 Interaction of process & perception data with achievement & demographic
data
 Data upload speed & efficiency
 Training
 Initial setup timeline
 Technical support
 User support
 Frequency of updates
The Final Selection: TetraData Corp.
Data Framework for Continuous Improvement
(Diagram: four overlapping data domains resting on an information foundation; their intersection is “Gaining active insight by analyzing data to improve learning for all students.”)
• Demographics – Enrollment, Mobility, Attendance, Dropout/Graduation Rate, Ethnicity, Gender, Grade Level, Teachers, Language Proficiency
• Student Learning – Standardized Tests, Norm/Criterion Referenced Tests, Grade Point, Formative Assessments
• School Process – Programs, Instructional Strategies, Classroom Practices, Assessment Strategies, Summer School, Finance, Transportation
• Perceptions – Perceptions of Learning Environments, Values and Beliefs, Attitudes, Questionnaires, Observations
Copyright ©1991-2005 Education for the Future Initiative, Chico, CA
Analyzer and DASH
• Two components of the warehouse.
• Analyzer allows for flexibility of complex reports: comparing variables, longitudinal reports, looking at trends, etc. (see the sketch after this list).
• DASH provides a snapshot of an identified issue.
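As an illustration only – the Analyzer itself is a menu-driven tool, and this hedged Python sketch merely mimics the kind of longitudinal comparison described above – a trend report can be thought of as a grouped aggregation over school years:

```python
from collections import defaultdict

# Hypothetical records of the kind a longitudinal report draws on:
# (school_year, subgroup, met_proficiency)
records = [
    ("2004-05", "All Students", True),
    ("2004-05", "All Students", False),
    ("2005-06", "All Students", True),
    ("2005-06", "All Students", True),
]

# Longitudinal view: percent proficient by school year (a trend over time).
by_year = defaultdict(list)
for year, subgroup, met in records:
    by_year[year].append(met)

for year in sorted(by_year):
    results = by_year[year]
    print(f"{year}: {100 * sum(results) / len(results):.0f}% proficient")
```

Roughly speaking, a DASH-style snapshot would surface one such figure for a single identified issue rather than the full multi-year trend.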
Building a Warehouse
• Step 1 – Data Discovery
– Defining & acquiring data to be included in the data warehouse
• Step 2 – Mapping
– Mapping data from source system(s) to the data warehouse
– Aligning various data elements into common folders
• Step 3 – Engineering
– Building the warehouse & adding the data
• Step 4 – Quality Assurance
– Querying in the warehouse to determine if the data is mapped & loaded accurately (see the sketch after this list)
• Step 5 – Implementation
– Using the warehouse to make data decisions
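The five steps are listed only in outline above. As a minimal sketch of Steps 2 and 4 – the source layout and field names below are assumptions, not the district's actual extract – mapping renames source columns to the warehouse's common element names, and quality assurance queries the loaded rows before they are used:

```python
import csv
from io import StringIO

# Hypothetical extract from a source student information system.
SOURCE = StringIO("""local_id,bldg,gr,meap_math
1001,Elm Elementary,4,412
1002,Elm Elementary,4,
""")

# Step 2 (Mapping): source columns -> the warehouse's common element names.
FIELD_MAP = {"local_id": "student_id",
             "bldg": "building",
             "gr": "grade_level",
             "meap_math": "meap_math_score"}

def map_row(row):
    """Rename source fields to the warehouse's column names."""
    return {FIELD_MAP[k]: v for k, v in row.items() if k in FIELD_MAP}

warehouse_rows = [map_row(r) for r in csv.DictReader(SOURCE)]

# Step 4 (Quality Assurance): query the loaded rows to confirm the data
# mapped and loaded as expected, e.g. flag missing scores before use.
missing = [r["student_id"] for r in warehouse_rows if not r["meap_math_score"]]
print(f"Loaded {len(warehouse_rows)} rows; "
      f"{len(missing)} missing MEAP math scores: {missing}")
```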
Available Data in Warehouse
Right Now
• Achievement Data
  • MEAP
  • Grades
  • GPA
  • Courses
  • Credits
  • Teachers
• Student Data
  • Subgroups
  • Lunch Status
  • Special Education
  • Language
  • Ethnicity
  • Discipline
  • Attendance
• Process Data
  • Title One Programs
  • Extra Curricular Activities
  • Programming
Two Distinct Uses:
• Summary Reports – annual/semi-annual long-term results, after instruction
• Monitoring Reports – on-going checks of progress
Welcome to Our Warehouse
http://analysis03.tetradata.com/ease-e/Login.aspx
Using Our Warehouses
• Summer School Intervention Identification
• Annual Report Achievement Trends
• Professional Development Planning Regionally
based on Student Achievement (Ds and Fs)
• District and Building Profiles
• Program Evaluation
• NCA Goal Identification and Monitoring
• Responses to Board Questions
• Grant Applications (Special Education, Writing,
Math, CASM, etc.)
Professional Development
Then, Now, and in the Future
(Based on Zoomerang Survey)
• Five Day Analyzer Trainings – All 8 districts have been involved
• Local District and Board Overview Presentations
• Dabbling with Data Sessions
• RtI and Program Evaluation
• Office Professional Training (2 Days)
• Report of the Month Specialized Group Trainings
• Data Ambassador Training
• Half Day Specialized Skill Ongoing Trainings
• Annual Report, NCA Goal, Profile Update Topic Work Sessions
• Etc., etc., etc.
Questions