A Collection of Social Studies Resources


  • What happens when cultures collide?
  • How can I be part of the solution?
  • Are rights the same as responsibilities?
  • What influences my space and place?
  • How can data be used to tell a story?

We need more inquiry, more beautiful questions, more Problem Seekers, not just Problem Solvers.

Lately, I have been making connections between literacy and social studies. Along with refining inquiry instructional frameworks and strategies, I have begun collecting useful resources for educators. Find the list HERE.

Have I forgotten any of your favorites? Let me know!

Assessment Types Explained for Educators


Assessment in education, in the early years, typically took the form of oral evaluation. Tests were subjective, often performed at the front of the classroom, and largely teacher-directed, with the teacher posing questions to the student around typical areas of mastery needed to pass to the next grade level. From there, assessing students took its traditional form (students at their desks with a paper-and-pencil test) in the late 1890s, following the institution of letter grades (A, B, C, etc.) to replace the teacher's subjective measure of a student's ability.

The first standardized test in education was the Stone Arithmetic Test (early 1900s), and the SAT made its way onto the education landscape in the 1930s as a way to check a student's readiness for college.

Current trends in education have seen an increase in testing and data-driven decision-making, but in the era of the TLA (another Three-Letter Acronym), the sheer volume of assessments educators and districts can use, or are required to use, often leads to confusion. The following is a list of assessment terms commonly found in education, along with my simple definition and use for each.

Types of Assessment

Each assessment type below includes who it is typically used with, its purpose, and common examples.

Formative Assessment – formal and informal assessment used to monitor and provide feedback on student understanding of targeted learning goals. Formative assessment is frequent and ongoing; it is not typically graded.
  • Who: Whole class
  • Purpose: Used to inform teacher instruction and by students to set goals and next steps.
  • Examples: Exit slips, games, pretests, 3-2-1

Summative Assessment – culminating assessment used to evaluate student learning, skill acquisition, and achievement. It typically occurs at the end of a unit, lesson, semester, or year. It is commonly considered "high-stakes" testing and is graded.
  • Who: Whole class
  • Purpose: Demonstration of understanding by the student.
  • Examples: Project, portfolio, test, paper

Screener – a valid, reliable, evidence-based assessment used to indicate or predict student proficiency or identify those at risk. Screeners are brief, identify the "who," and are given a few times a year.
  • Who: Whole class or targeted group
  • Purpose: Identification of students who are at risk and need additional support.
  • Examples: AIMSweb, DIBELS, FAST, EasyCBM, iReady, STAR

Diagnostic – a tool used to provide insights into a student's specific strengths and weaknesses. The data collected gives the teacher specific skills to target when designing individualized instruction. Diagnostic assessments identify the "what" for the student.
  • Who: Individual student
  • Purpose: After a student has been identified via a screener, a diagnostic assessment is used to determine specific areas of focus.
  • Examples: Error analysis of literacy progress monitoring data, phonics inventory, reading miscue analysis

Progress Monitoring – a tool used to assess a student's academic performance and rate of growth during individualized or targeted instruction.
  • Who: Individual student
  • Purpose: To ensure the response to instruction is helping the student grow in a targeted area. Based on the specific intervention or instruction.
  • Examples: The diagnostic tool can be used if multiple forms are available.

Norm-Referenced Assessment – compares a student's performance to the "average student" score, which is constructed from a statistically selected group of test takers, typically of the same age or grade level, who have already taken the exam.
  • Who: Whole class, whole grade level
  • Purpose: Designed to rank test takers on a bell curve; used to determine how students in a particular school or district rank relative to others who take the same test.
  • Examples: Standardized tests such as the California Achievement Test, Iowa Test of Basic Skills, Stanford Achievement Test, and TerraNova

Criterion-Referenced Assessment – measures student performance against a fixed set of standards or criteria that define what a student should be able to do at a certain stage in education. The score is determined by the number of questions answered correctly. (A short sketch contrasting norm-referenced and criterion-referenced scoring follows this list.)
  • Who: Whole class
  • Purpose: Can be high-stakes (used to make decisions about students, teachers, schools, etc.) or low-stakes (used to gauge student achievement, adjust instruction, etc.).
  • Examples: Multiple choice, true/false, short answer, or a combination; can be teacher designed

Benchmark Assessment – fixed assessments (also called interim assessments) used to measure a student's progress against a grade-level standard or learning goal. Often given in between formative and summative assessments.
  • Who: Whole class or individual student
  • Purpose: Used to communicate to educators, students, and parents which skills are important to master and a student's progress (so far) toward those learning goals.
  • Examples: Fountas and Pinnell, Reading A to Z Benchmark Passages
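
To make the norm-referenced versus criterion-referenced scoring difference concrete, here is a minimal sketch in Python. The scores, the norming sample, the 80% cut score, and the function names are hypothetical illustrations I made up, not values or procedures from any named test.

```python
# Hedged sketch: the same raw score viewed through a norm-referenced lens
# (compare to a group) and a criterion-referenced lens (compare to a fixed standard).

def percentile_rank(score, norm_group_scores):
    """Norm-referenced view: where does this score fall relative to the norm group?"""
    below = sum(1 for s in norm_group_scores if s < score)
    return 100 * below / len(norm_group_scores)

def percent_correct(num_correct, num_items, cut_score=0.80):
    """Criterion-referenced view: how much of a fixed standard was met?"""
    proportion = num_correct / num_items
    return proportion * 100, proportion >= cut_score

# Hypothetical data
norm_group = [12, 15, 18, 20, 22, 25, 27, 30, 33, 35]  # scores from a norming sample
print(percentile_rank(26, norm_group))   # rank depends entirely on the comparison group
print(percent_correct(26, 30))           # score depends only on the fixed criterion
```

The same raw score can look very different through the two lenses: the percentile rank changes whenever the comparison group changes, while the percent-correct judgment depends only on the fixed criterion.
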
Other Assessment Terms You May Encounter
CFAs (Common Formative Assessments) – assessments that are collaboratively created and agreed upon by a group or grade-level team to measure students' attainment of the learning goals.
Alternate Assessment – assessments for students with severe cognitive disabilities. These tests have less depth and breadth than the general assessment. (For the small number of students on IEPs who are unable to take the general test.)
Alternative Assessment – also called authentic assessment or performance assessment. Alternative assessment stands in contrast to the traditional standardized test and focuses on an individual's progress and multiple ways to demonstrate understanding.
Authentic Assessment – replicates real-world challenges that experts or professionals in the field encounter. Used not only to demonstrate mastery of learning goals or standards but also critical thinking and problem-solving skills. (Students construct, respond, or produce to demonstrate understanding.)
Synoptic Assessment – combines multiple concepts, units, or topics in a single assessment that requires students to make connections between the learning; a holistic approach to assessment and the interconnectedness of learning.
Quantitative Data – data that can be measured and recorded in numbers.
Qualitative Data – data that is more subjective and relies on the expertise of the teacher to provide an informed opinion based on trends and past experience.

 

The ability to choose the right assessment to meet the needs of students and teachers is essential. Most often, the confusion is not between formative and summative assessments. Through my own work with districts and educators across the nation, I have found a need to clarify the definitions and purposes of a Screener, a Diagnostic Tool, and Progress Monitoring. These three assessment types are essential when digging deep into student needs, and they help to inform instruction.
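
To make the "rate of growth" idea behind progress monitoring concrete, here is a minimal sketch of the arithmetic, comparing a student's actual weekly growth to the growth needed to reach a goal (an aim line). The scores, goal, and timeline are hypothetical, and this is an illustration of one common way to think about the numbers, not a prescribed procedure or any specific product's method.

```python
# Hedged sketch: weekly progress monitoring scores for one student on one targeted
# skill (e.g., words read correctly per minute). All numbers are hypothetical.

weeks  = [0, 1, 2, 3, 4, 5, 6]
scores = [42, 44, 43, 47, 49, 52, 54]   # one probe per week, equivalent forms

baseline, latest = scores[0], scores[-1]
actual_rate = (latest - baseline) / weeks[-1]       # growth per week so far

goal_score, goal_week = 70, 16                      # end-of-intervention target
needed_rate = (goal_score - baseline) / goal_week   # growth per week required (the "aim line")

print(f"Actual growth: {actual_rate:.2f} per week")
print(f"Needed growth: {needed_rate:.2f} per week")
if actual_rate >= needed_rate:
    print("Student is on track; keep the current intervention.")
else:
    print("Growth is below the aim line; consider adjusting the intervention.")
```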

Resources to Explore:

My Collection of Edtech Tools for Assessment

List of Screeners

List of Diagnostic Tools

Progress Monitoring List

Authentic Assessment

7 Resources to Fight Digital Misinformation in the Classroom

Accessing information online is like looking for the proverbial needle in a haystack. The abundance of resources available 24/7 makes information literacy an essential life skill for one's working, civic, and personal life. As an educator, it is imperative to recognize the shifts in locating reliable and relevant sources online. I spoke about this need at ISTE 2018 in my Ignite. Developing healthy skepticism and honing fact-checking skills are an important part of being literate today. Recently, a number of new tools have been released to support this endeavor, along with updates to some of my favorite resources.

Here are 7 Resources to Support Information Literacy Online and to Fight the Misinformation Out There:

  1. NewsGuard – NewsGuard is a browser extension to add to your Chrome or Edge browser. Trained journalists, with "no political axe to grind," help readers and viewers know which sites are reliable. True to its tagline, "Restoring trust and accountability," NewsGuard uses 9 criteria to give websites color-coded ratings from red to green. If readers want to understand the rating a site was given, they can read the expanded "Nutrition Label" that provides this information. NewsGuard also has great resources for libraries and is user-friendly.
  2. SurfSafe – SurfSafe is also a browser extension for Chrome, with one goal: to detect fake or altered photos. After installing this extension, users can hover over an image on the web or Facebook and it is instantly checked against hundreds of trusted sites for its validity. SurfSafe provides a rating to users, along with links to other websites. Users can also help "defend the internet" against misinformation by reporting suspicious images.
  3. News Literacy Project – The News Literacy Project is a national education nonprofit offering nonpartisan, independent programs that teach students how to know what to believe in the digital age. They have been helping students and teachers identify fact from fiction on the web for the past 10 years. On their website, educators will find resources, information, infographics, stats, and much more. Schedule a virtual visit, or catch up on their blog; News Literacy Project is a beneficial resource for all teachers.
  4. Factitious – Factitious is a Tinder-like game, but it involves news instead of potential dates. Created by JoLT (a collaboration between American University's GameLab and School of Communication tasked with exploring the intersection of journalism and game design), the game gives users a headline and brief news text; they swipe right if they think it is real, or swipe left if they believe it to be fake. After guessing, users are given the link to the source and a brief summary statement pointing to strategies that can be used to identify misinformation. This game is fun and fast-paced.
  5. Snopes – A website that many turn to first, Snopes is a resource that all educators and students should be aware of and use when questioning the validity of digital information. What began in 1994 as David Mikkelson's project to study urban legends has now "come to be regarded as an online touchstone of research on rumors and misinformation." Snopes provides users with a description of its methods and selection process on its About page, which is important information to point out to students. Users can search for a specific topic or check out the "What's New" or "Hot 50" lists to stay current on the misinformation spreading across the digital waves, and the actual truth behind it.
  6. PolitiFact – One word: Truth-O-Meter. PolitiFact's core principles, "independence, transparency, fairness, thorough reporting and clear writing," give citizens the information they need to govern themselves in a democracy. PolitiFact was born in 2007 and has continued to fact-check and provide ratings on its Truth-O-Meter on all things political. From statements made by politicians to bloggers, PolitiFact offers users information on a global, national, and state level.
  7. Common Sense – Common Sense is a leading nonprofit organization dedicated to improving the lives of kids and families by providing the trustworthy information, education, and independent voice they need to thrive in the 21st century. Fortunately for all of us, Common Sense's News and Media Literacy section offers a Toolkit for educators with strategies, resources, videos, and lessons to support understanding of news and media literacy and the promotion of digital citizenship. This is a website to check frequently for updates, news, and excellent educator resources; one of my favorites!

 

Have I missed any of your favorites? Drop me a comment to investigate additional resources.

Research Isn’t Sexy… But You Need It Anyway!


For some time now Steven Anderson and I have been reflecting on our own learning and professional development practices, looking for gaps in instruction and aiming to improve our craft. One of our longest conversations has been around research. In the work we do, we are constantly reading and attempting to understand the research behind the popular instructional movements of today. What we find is that much of the educational research available today isn’t used, isn’t cited, and really isn’t sexy.

Take, for instance, literacy instruction.

The RTI (Response to Intervention) process and framework were created in an effort to capture the success in student literacy achievement documented in the research and studies done from the 1980s to the early 2000s (Vellutino et al.). RTI was developed to help schools replicate the gains witnessed in this research. Today, RTI has morphed into MTSS (Multi-Tiered System of Support), but very few districts have seen the increase in student achievement that was initially experienced.

What happened?

While the RTI framework was adopted and utilized throughout the nation, the actual reading strategies and interventions used to achieve this growth were left behind. Implementing only half of the research (the RTI framework) while substituting different interventions and reading strategies has produced only limited results, leaving many administrators, teachers, and students frustrated. We know what works in literacy, and there is research behind it (most reading research sits in the field of psychology), but we still fail to dig into it, much less use it (Kilpatrick).

Education research is vast. It spans disciplines, instruction, leadership, and many other components that contribute to a school. The problem has become that research has been co-opted by publishers, organizations, and individuals to sell one-size-fits-all quick fixes, programs and books. Many will ignore the fundamental findings of the research and insert their own ideas and practices that help these packages fly off the shelf.

With the abundance of research available, why do very few practitioners use it? What barriers exist that slow the transfer into the classroom? And what can be done to support administrators and practitioners in their quest of research-based methods?

We believe there are 5 Main Barriers that impact how, or if, educators use research. While more could certainly be added to this list, we feel these are the top 5 Problem Areas.

  1. Access and Abundance – Digging deep into research is typically done during college: free access to databases, extensive libraries, experts for days. But upon graduation, access is limited and met with dreaded paywalls when locating many peer-reviewed articles, journals, and studies. On top of limited access, the abundance of research out there is overwhelming. A simple search on Google Scholar with the keywords "Struggling Readers" lists 500,000 results. It is no wonder educators do not know where to begin when sifting through the research.

  2. Lack of Research in PD – There is no doubt that access to professional development is more abundant now than ever before. But that comes with a risk for individual educators and district leadership. We want to provide and participate in high-quality learning; however, much of it isn't grounded in realistic or research-based practices. Instead, it is built on ideas someone read about, heard about, or tweeted about. Anyone today can learn about innovation, makerspaces, augmented reality, really any instructional practice, create a slide deck, and share it with the world. Instructional practices that impact student learning are based in more than tweets and blog posts. As learners and leaders, we have to model and understand where these ideas are coming from and whether they are based in sound research.

  3. Time – Time is a commonly mentioned barrier for educators. From new initiatives, faculty meetings, lesson planning, and connecting with students; time to do everything well is a deterrent for many educators when it comes to research.

  4. Research is Written For Researchers – Most research is written for other researchers, not necessarily the practitioners in the field. With this in mind, it is no wonder that many educators find it inaccessible because of the jargon used by a specific group of professionals. This jargon is filled with technical terminology that is understood at both a literal and a figurative level by the group but leaves the rest of us guessing. (Education Jargon Generator)

  5. Distrust and Disconnect Between Theory and Practice – First, the disconnect. There is often a gap between theory and reality when reading research done by professionals in the same discipline but with little to no educational background. Ideas, studies, and strategies are examined with a skeptical lens, and doubt is raised when research seems isolated or without consideration of the whole child or the demands on educators. What many educators do not realize is the disconnect within education research itself. With no agreed-upon definition of "research-based," no common training methods for preservice educators, and both qualitative and quantitative inquiries producing complementary but still fragmented results, the disconnect, cognitive bias, and skepticism of authority are not only confusing but create a sense of distrust among the education community. Educators are more apt to believe other teachers implementing a program or using a specific framework than years of research with statistics and data (D.W. Miller).

Items To Consider

  • Comparing Sources: One source isn't gospel. Do your homework. Be open to opposing ideas. Don't be married to an idea just because you agree with it. If the research isn't there, it's not there. Be a critical discerner of information and ask lots of questions when you participate in professional development or sit in a presentation. Ask where the research is and investigate it yourself. Can you draw the same conclusions? Compile a list, according to your discipline, of leading theorists in the field to cross-check what you hear and read.

  • Research-Based Is A Convoluted Term: When you design activities that include techniques or strategies that have been research-proven, you can then call them "research-based." Since there are varying degrees of improvement (statistically significant, yes, but by how much?) and some approaches work better and have a higher effect size than others, it is important to have a basic understanding of the research (a short example of one common effect-size measure follows this list). If there isn't any research, that doesn't mean you can't use an approach; it just means you have to be more skeptical of the long-term results.

  • Be A Researcher: No, this doesn’t mean you need to know about standard deviations or methodologies. What it means is be a student of your students. Gather data and examine what’s happening with student learning. What does the data show? Are the practices you are using improving student understanding? Are the results what you expected? What went wrong (or right!)? Be a reflective educator.

  • Always Remember To Keep Students First: Even research can get it wrong. You know your students. Always do what is best for them, even when that means going against what others say.
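
As mentioned in the list above, here is a minimal sketch of one common effect-size measure, Cohen's d (the standardized difference between two group means), to illustrate the "how much improvement" question. The class scores below are hypothetical, and this is only an illustration of the arithmetic, not an endorsement of any particular study design.

```python
# Hedged sketch: Cohen's d with made-up post-test scores from two hypothetical groups.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardized mean difference between two groups (pooled standard deviation)."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = ((n_a - 1) * stdev(group_a) ** 2 + (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

new_strategy = [78, 82, 85, 90, 74, 88, 81, 79]   # hypothetical post-test scores
old_strategy = [72, 75, 80, 70, 68, 77, 74, 73]
print(f"Cohen's d = {cohens_d(new_strategy, old_strategy):.2f}")
```

By Cohen's rough convention, values near 0.2 are read as small effects, 0.5 as medium, and 0.8 as large, which is one common way to answer the "statistically significant, yes, but by how much?" question.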

Resource/Reading List:

ISTE Ignite: Flat Earth, 9/11, Anti-Vax: Things People Doubt in the Digital Age of Information


Photo Credit: Shawn McCusker, Thank you, friend!

This year, I pushed my comfort level and gave an Ignite (5 minutes, 20 slides) at ISTE in Chicago. It was my first time presenting in this format, and I chose to speak about a topic that I am passionate about – How to Develop Healthy Skepticism and Fact-Checking in Students.

I started off with a personal story from college about a girl on my floor who was sucked into a cult…

Armed with flyers and a headful of answers, Cassandra pushed her way into our room and began her recruitment speech.

The misinformation of today is more difficult to recognize, posing as legitimate websites and Facebook pages. As educators, it is our obligation to recognize that the checklists we once used to verify information have a hard time exposing the fake news, half-truths, media bias, propaganda, and fallacies that we consume on a daily basis.

Critical literacy skills are needed not only for current discourse but also for rhetoric in modes we haven't even considered, for instance, deepfakes. Fueled by AI, creators can hijack a person's identity, voice, face, and body. Think of it as Photoshop on steroids, but with video, and now audio. Alterations that were once easy to recognize have become so sophisticated that they are almost impossible for either a human or a computer to detect.

We must recognize the shifts in information, adapt to the new mediums, and equip students with critical thinking skills that allow them to get closer to the truth than they once could. To move beyond checklists, I suggest looking into the work of Michael Caulfield, who provides guidance with his 4 Moves for digital information.

Verification is a process, not a simple yes or no. You may ask if it is worth it, or why the government doesn't step in and take down these websites. On the surface, that may seem easiest, but upon further reflection, once one allows censorship to invade their space, it creeps into every aspect of their life.

The answer is not censorship but empowerment. When our students walk out that door for the last time, I hope they leave with a critical lens for consuming information, equipped with the ability to not only think critically but also speak with authority and advocate for themselves and others in the great unknowns of the world.

Thank you to all of the people that supported me during this process and cheered me on as I took the stage! Steven Anderson, Adam Bellow, and Erin Olson

Until next year!!!

Thank you to Dan Kreiness for recording Round 2 #Ignites. If you would like to see my whole presentation, click the link! Shaelynn's Ignite

On a mobile device? Try this link at the 47-minute mark. Round 2 all Ignites