Decoding is a foundational literacy skill in which readers break down and sound out unfamiliar words to understand them. Mounting evidence, including a 2024 replication study by Reading Reimagined and ETS that confirmed a decoding threshold, suggests that weak decoding may help explain why so many older students continue to read below national standards.
“If you teach kids to break words into their smallest meaningful pieces, like ‘un-’ for ‘not’ or ‘-ness’ for a state of being, they’re more likely to be able to handle ‘unhappiness’ by spotting its parts,” recommends Rebecca Kockler, executive director of Reading Reimagined. Read the article to understand the pressing need for the sector to reconsider how we teach foundational literacy skills in schools.
Assessment for Good (AFG), an R&D program within the Advanced Education Research and Development Fund (AERDF), creates rigorous assessments that provide a comprehensive picture of the skills that power learning. While most assessments focus on subject mastery, educators often lack information about the foundational skills that would help them determine the next best step to support the learning process. Even when thoughtful educators want to understand student learning, existing assessments offer limited, rarely personalized or contextualized views of learning beyond whether or not a state standard was reached. AFG focuses on how to more effectively measure over 30 skills that power learning, such as perspective taking, initiation, and metacognition, which are key parts of the learning equation but are often missing from the information we provide to educators for instructional decision-making.
At AFG, inclusive research and development (R&D) is at the core of how we build. At this year’s AI Show at ASU+GSV, we shared a case study highlighting how AFG has scaled its inclusive R&D approach through a strategic and intentional integration of AI, while keeping community voice at the center. Our goal was to honor educators and learners as the experts, ensuring a continuous community feedback loop that centers subject matter expertise and community insights throughout our R&D process.
Our assessment development process starts and ends with the learners we serve.
From the start, every assessment item has been shaped by real feedback from learners. Since 2022, nearly 1,000 assessment items have been co-created with learners to ensure they reflect learners’ contexts and the assets they bring. This feedback loop ensures our assessments reflect not just academic standards, but learners’ lived experiences.
In 2024, we built a capability that explores the use of generative AI as a tool to inform assessment development. Rather than treating this as a way to replace the human in the loop, we asked how an inclusive process could expand how learner, educator, and research expertise contribute to assessment development while reducing the resource burden of measure development that is felt across the assessment industry.
What is different about this process?
We wanted to include learner and educator voice at the beginning, middle, and end. Our training set of items was rooted in AFG’s scientific frameworks, and that grounding carried through to subject matter expert review, in which AI-generated assessment items were accepted, rejected, or revised for inclusion. Overall, the AI-generated items had an acceptance rate of 82% for alignment to our scientific frameworks. Lastly, we tested our AI-generated items alongside our human-generated items with students and found no differences in item performance. In six months, we increased our bank of assessment items by 167%, far faster than if we had continued to generate them all by hand. This technical capability, born out of our inclusive R&D efforts, can now be deployed within AFG’s other assessment products to personalize assessments at scale, or licensed to other assessment creators.
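To make the shape of that review-and-compare step concrete, here is a minimal sketch in Python. It is not AFG’s pipeline or code; the item fields, the definition of “acceptance,” and the use of a simple Welch’s t-test are all assumptions made for illustration.

```python
# Illustrative sketch only -- not AFG's actual pipeline or analysis code.
# The item fields ("source", "sme_decision", "p_correct") are hypothetical stand-ins.
from dataclasses import dataclass
from typing import List

from scipy import stats  # for a simple two-sample comparison


@dataclass
class Item:
    source: str         # "ai" or "human"
    sme_decision: str   # "accepted", "revised", or "rejected" by subject matter experts
    p_correct: float    # proportion of students answering the item correctly in field testing


def ai_acceptance_rate(items: List[Item]) -> float:
    """Share of AI-generated items kept (accepted or revised) after expert review."""
    ai_items = [i for i in items if i.source == "ai"]
    kept = [i for i in ai_items if i.sme_decision in ("accepted", "revised")]
    return len(kept) / len(ai_items)


def compare_item_performance(items: List[Item]):
    """Compare field-test difficulty of AI-generated vs. human-written items (Welch's t-test)."""
    ai = [i.p_correct for i in items if i.source == "ai"]
    human = [i.p_correct for i in items if i.source == "human"]
    return stats.ttest_ind(ai, human, equal_var=False)
```

In practice, “no differences in item performance” would rest on fuller psychometric analyses than a single t-test, but the sketch shows where an acceptance rate and an AI-versus-human comparison would come from in a workflow like this one.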
Our AI strategy is to support and scale human-centered processes, not replace them.
By anchoring AI tools in trusted knowledge bases, including content informed by learners and educators, we can streamline content creation while maintaining high ecological validity—the alignment between what’s being assessed and the real world learners inhabit. We believe assessment development should be broadened to include a greater variety of experts, including learners, caregivers, and educators; the key to better R&D is to keep centering their voices.
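As a rough illustration of what anchoring AI tools in a trusted knowledge base can look like, here is a toy retrieval-grounded drafting sketch. Everything in it is a hypothetical placeholder: the keyword retriever, the `draft_item_with_llm` callable standing in for whatever generation model a team actually uses, and the knowledge-base entries standing in for learner- and educator-informed content.

```python
# Toy sketch of retrieval-grounded item drafting -- hypothetical names throughout.
from typing import Callable, List


def retrieve(knowledge_base: List[str], construct: str, k: int = 3) -> List[str]:
    """Naive keyword retriever: rank knowledge-base entries by word overlap with the construct."""
    terms = set(construct.lower().split())
    ranked = sorted(
        knowledge_base,
        key=lambda entry: len(terms & set(entry.lower().split())),
        reverse=True,
    )
    return ranked[:k]


def draft_item(
    construct: str,
    knowledge_base: List[str],
    draft_item_with_llm: Callable[[str], str],  # placeholder for a generation model
) -> str:
    """Ground the drafting prompt in learner- and educator-informed content."""
    context = "\n".join(retrieve(knowledge_base, construct))
    prompt = (
        f"Using only the context below, draft one assessment item for '{construct}'.\n"
        f"Context:\n{context}"
    )
    return draft_item_with_llm(prompt)  # drafts still go to human expert review
```

Whatever the real implementation, the design choice the paragraph describes is the same: generation is constrained by content the community has already shaped, and every draft still passes through human review.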
Inclusive R&D isn’t about completing a checklist—it’s a smarter approach to development. By honoring human insight and strategically using AI to support, not replace, community-driven expertise, we’re redefining what it means to scale assessment innovation.
This Spring, AERDF is excited to share our innovations and impact while connecting with educators, researchers, technologists, and innovators at three leading national conferences. Don’t miss the opportunity to hear firsthand from our team and partners about our scientific breakthroughs and how advanced research and development that centers learners and teachers can push the boundaries of what’s possible in PreK–12 education.
Here’s where you can find us this month:
ASU+GSV Summit
San Diego, CA | April 6–9, 2025
We’re thrilled to join some of the most influential voices in education, innovation, and technology this weekend at the 2025 ASU+GSV Summit.
Tune in to presentations from AERDF and R&D partners during the AI Learning Show on April 7th:
Catch our team in panel discussions exploring the power of collaborative R&D and how to secure funding for breakthrough ideas:
And don’t miss our interactive Glass Classroom workshop spotlighting the latest insights and innovations from AERDF and our R&D partners! Participants will get a behind-the-scenes look at our evidence-based, learner-centered education breakthroughs and how they come to fruition—from concept to prototype to adoption.
Eighth Annual CREA Conference
Chicago, IL | April 8–11, 2025
At this year’s CREA Conference, AERDF’s Assessment for Good team will lead a session focused on what students themselves say they need to succeed and feel supported in school. Dr. Leslie Nabors Oláh and Dr. Lauren D. Kendall Brooks will share insights from their research highlighting how AFG has invited and cultivated student partnership in its R&D process to develop learner-centered assessment tools.
AERA Annual Meeting
Denver, CO | April 23–27, 2025
We are excited to showcase our R&D at the upcoming American Educational Research Association (AERA) Annual Meeting in Denver! From examining how math beliefs, anxiety, and metacognition interact to shape problem-solving skills to exploring new findings on decoding challenges and targeted support for older students, join AERDF’s R&D teams and partners for engaging sessions and evidence-based approaches.
Stay tuned to our social channels for real-time updates and reflections, and we look forward to seeing many of you on our travels!
“Assessment has really tried to serve so many purposes in education, from trying to make sure resources get aligned in the right place for the right learners at the right time to supporting educators trying to understand how to unlock the next instructional moment. This is because assessment can give us a lot of information–it’s one of the most powerful levers we have in education [to make informed teaching and learning decisions]. However, when assessment is used improperly, it can lead to decisions that close doors for opportunity for learners all across this nation,” highlights Dr. Temple Lovelace, Executive Director of Assessment for Good, one of AERDF’s flagship programs, in a recent episode of the Teaching Matters Podcast, from NPR.
The power of assessment was at the heart of the conversation between Dr. Lovelace and Dr. Scott Titsworth. Together, they explored the possibilities of formative assessment, the implications of high-stakes testing, and innovative approaches to assessment—including the uses of AI and machine learning—that prioritize learner-centered practices.
To best support students to thrive in school and lead in our rapidly changing world, we need accurate and understandable information about student learning as it happens, not just in the moment they’re sitting for a test. Listen to the episode to be inspired by new ideas on how assessment can be an invaluable tool that helps maximize the growth and potential of all learners.
Imagine a future where the limits of today’s knowledge are the starting point for unlocking critical scientific insights and technical capabilities that push us just beyond what we believe is possible in education – that’s where our work at the Advanced Education Research and Development Fund (AERDF) begins.
In a recent conversation with Mike Palmer for the Trending in Ed podcast, Auditi Chakravarty, CEO of AERDF, explained how AERDF’s R&D model starts at the edge of what the science of learning tells us we know. It is from that edge that we consider what must be proven or unlocked today so that, in three to five years, what’s happening in the classroom looks very different. And not just different, but better, with far more students positioned to thrive in the economic and global conditions they will face.
In this podcast episode, you will hear how our programs are using R&D to bridge scientific research with real-world applications by co-designing solutions with educators, students, and caregivers to produce scientific knowledge, technical capabilities, and scalable classroom-ready prototypes:
- With a focus on mathematical conceptual understanding and complex problem-solving, EF+Math, in its final phases as an AERDF program, explores strengthening executive functioning (EF) while innovating math teaching and learning. EF+Math’s R&D teams, Fraction Ball, MathicSTEAM, and CueThinkEF+, are generating evidence that math learning experiences designed to bring out the innate skills inside every student can improve math outcomes.
- By leveraging the power of AI and the expertise of learners, Assessment for Good is creating a seamless, game-based assessment experience that students want to use because it helps them learn more about who they are as learners, while providing educators with instructional strategies in real time.
- Reading Reimagined aims to better support older students who need to develop foundational reading skills like decoding, the ability to break down and sound out unfamiliar words to understand their meaning. Having confirmed the decoding threshold, Reading Reimagined is now working to develop, test, and implement tools to assess students’ decoding skills and curriculum that bolsters foundational reading skills.
- While much of today’s conversation centers on using AI to make current approaches to education more efficient, AugmentED, our newest program, will bring together expert teachers, researchers, and technologists to reimagine education for the AI era, developing and testing new teaching approaches and AI-powered tools to nurture every student’s potential.
Take a listen to the podcast to hear about the distinct ways that AERDF is using R&D to catalyze and expedite advancements in teaching and learning.
“NAEP is a critical tool for providing transparency, informing education policy decisions, and examining long-term trends in student outcomes,” asserted Auditi Chakravarty, CEO of the Advanced Education Research and Development Fund (AERDF), in an article by The Hill.
The recent cancellation of the long-term NAEP assessment for 17-year-olds by the U.S. Department of Education raises concerns about the future of educational data transparency. Research-backed policies ensure taxpayer dollars are spent efficiently on education initiatives that are most likely to bolster student achievement. Without reliable and accurate data, we compromise the rigor of our education system and leave the success of our students to chance.
Read the article to better understand what is at stake with respect to this vital resource for educators, parents, and policymakers.
Education R&D can fuel technological and scientific breakthroughs that unlock innovative teaching methods and improve student outcomes, positioning all students to adapt and thrive in a rapidly changing world. An investment in education R&D is an investment in our nation’s economic growth and global leadership.
The Advanced Education Research and Development Fund champions the power of research and development to unlock scientific breakthroughs and deliver research-backed solutions to pressing teaching and learning challenges.
Read the article to learn how deepening investment in education R&D can pay dividends, and what a pathway forward can look like.
Decades of research show that when learning is fun, it’s not only more effective but also keeps young people truly engaged. Building on this research, Dr. Andres Bustamante and his team at UC Irvine are combining community partnership, cultural context, and play to transform everyday spaces like parks, bus stops, and local markets into vibrant hubs for learning all across Santa Ana, California.
Recently, Dr. Bustamante was invited onto the UC Irvine Podcast to discuss his work with Playful Learning Landscapes: common places where families and community members naturally gather that are designed to invite play and learning. He shared how his team is partnering with communities to ensure that these learning opportunities are deeply rooted in local values, cultural practices, and familiar routines. This ensures the installations are effective, useful, and sustainable for the local communities where they are located.
One such installation is Fraction Ball. Designed at the suggestion of teachers at El Sol Academy, Fraction Ball integrates fractions into the three-point arc and smaller arcs on the basketball court, creating a hands-on, engaging way for students to understand fractions. The court features both fraction and decimal representations, helping students visualize their equivalence. Rational numbers, like fractions and decimals, are notoriously challenging for students, yet they are foundational for future success in math learning, especially for algebra.
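As a small worked example of that equivalence, suppose (hypothetically; the actual court markings may differ) one arc is labeled 1/4 and another 1/2, with matching decimal labels. A student tallying two made shots sees the same running score in both notations:

```latex
% Hypothetical arc values; the real Fraction Ball court layout may differ.
\[
  \frac{1}{4} + \frac{1}{2} = \frac{3}{4}
  \qquad \text{and equivalently} \qquad
  0.25 + 0.5 = 0.75
\]
```

Keeping score in both notations side by side is what lets students see fraction-decimal equivalence in the flow of the game rather than as an abstract rule.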
With funding support from EF+Math, a program of the Advanced Education Research and Development Fund (AERDF), and in partnership with the Santa Ana School District, Fraction Ball has been introduced to 5,000 students across 27 schools. Initial experimental studies conducted over the last five years have consistently shown strong positive impacts on student learning.
Listen to the podcast to learn more about Dr. Bustamante’s work and the success of Fraction Ball so far.
Math skills are and will remain in high demand in the job market, so it’s critical that students have rich math learning experiences, in school and beyond, that build their confidence as math learners. In a recent Education Week article, AERDF president and CEO Auditi Chakravarty asks us to consider, “What kind of exposure do kids get at a really young age to numbers and numeracy to build that same kind of literacy [as reading readiness]?” Providing all young people with opportunities to build mathematical proficiency will support their academic achievement and create pathways to a wide variety of careers.
As such, it’s of great concern that recent NAEP scores indicate nearly 40% of 8th graders and almost 25% of 4th graders in the U.S. are below basic proficiency in math. One reason may be math anxiety. Described as a learned emotional response to math-related activities, math anxiety is said to affect 20-30% of students. It can look like going blank, sweaty palms, and a racing heartbeat when faced with math problems.
Michelle Tiu, co-executive director of EF+Math, a program of AERDF, recommends mastering basic math facts and focusing on conceptual understanding over math drills and timed tasks to promote math fluency. “Not possessing fluency adds a lot of cognitive load to each step of a mathematical process. If students’ cognitive load is not taken up with thinking about basic fluency facts, it frees them up to be able to focus on higher-order thinking and conceptual understanding.”
Read the article to learn more about evidence-based strategies to mitigate math anxiety and improve student learning.
Introduction
Lifelong learning starts with self-awareness. When students recognize their strengths and understand how they shape their learning, they don’t just participate—they take ownership. That’s why student co-design is at the heart of Assessment for Good’s (AFG) work. By collaborating with learners, we ensure that assessments reflect their experiences, needs, and potential. To dive deeper into this approach, we are excited to talk with Dr. Lauren D. Kendall Brooks, Research Scientist for AFG, who has led many of the student interviews and focus groups that shape both the development of assessment constructs and the design of assessment environments. In this conversation, she shares more about AFG’s approach to collaborating with students to co-design meaningful assessments, along with stories of students playing an active role in their own learning, and ours!
What is assessment and why is it the focus of AFG?
Assessment tells us what’s important to our culture. Traditionally, if a student studies something that isn’t on the test, it’s often framed as studying the “wrong” thing, rather than a reflection of their learning. This is the nature of summative assessment. At AFG, we aim to impact formative assessment, which is a type of assessment that focuses on understanding what students know and are learning in the moment–allowing educators and caregivers to intervene and support students in real time.
What does it mean for assessment to be inclusive and effective?
Formative assessment is inclusive when students are able to show what they know in ways that are authentic to how they learn and understand the world. Assessing them in environments that feel comfortable for them, safe for them, and in ways they’re able to best express themselves, makes it effective. Assessment should provide students with information that supports continual learning. Traditionally, assessment is thought of as the end of learning when it is actually just the beginning – the beginning of being able to use the knowledge that you’ve gained to shape your learning journey.
How is AFG engaging in assessment development and design differently?
We see our students as people, not numbers or data points. Their perspectives matter because they are the experts of their own lived experience–they understand how they learn and process information better than we ever could. By combining our expertise in assessment and students’ deep understanding of themselves, we find that middle ground where their voices shape what and how we measure.
This became especially clear to me in a conversation with a student when I used the phrase, “students like you.” She immediately questioned it: “What does that mean? Like other third-grade Black girls?” I paused and clarified, “No, I mean students who may share your interests or preferences.” But she made it clear—she didn’t see herself as similar to other third-grade Black girls in that way. By the end of the interview, I recognized what she meant. I also realized there were likely thousands of other third grade Black girls who, like her, don’t feel seen.
Her insights directly influenced the questions we included in our assessment. I had planned to invite her to participate in product testing–where we work with students to test and refine our playful assessment tools–but when I asked her if she wanted to play a game next time, she surprisingly replied, “No – I want to keep doing this because sometimes kids that are different like me need help with these questions.” She saw herself as a real voice for other third grade students who don’t enjoy assessment. This is exactly why we listen.
A cornerstone of AFG’s approach is codesign. How do you define it?
Codesign is working with and for students. Codesign is collaborating with partners from the start, giving them the roughest version of a plan, a research question, or a prototype and asking questions like: “Am I headed in the right direction?” “Can you show this to me?” “What do you think about this?” “How would you change it?” “Does this even make sense?”
It’s giving them a storyboard idea before you get to a prototype, or a drawing on a page before they are given something they can touch and feel, and asking them to co-create the path forward.
How are students who are a part of AFG’s codesign process affected by this engagement?
Students get really excited. I might hear feedback from five different students in the same day, each eager to share their thoughts. By letting them know that their input is valued, I find that I get to the richness of codesign. It’s them correcting me, showing me, teaching me: “Actually, we would use this word instead.” “That word doesn’t make sense here.” “I would change this.” “This looks weird.” “It reminds me of this negative thing” (like a TV show or video game).
I think this allows them to feel valued–they see their voices shaping something real, something that will reach others. I always ask them, “What does it feel like to know that you have given feedback on something that will help thousands of kids?” Their responses are powerful:
- “This is really important because not everyone has someone to talk to.”
- “Sometimes people have problems and they don’t know where to go and this may help them think about things differently.”
- “Kids don’t have anybody to talk to and this may be a safe place. Not everybody has that like I do.”
They start getting very deep. They want to help other people because they recognize that not everyone has a trusted adult and this could help and support them.
It’s apparent that trust is crucial in order to codesign effectively. How do you approach building trust with students?
I am as honest with them as possible. I try to give them agency up front, reminding them repeatedly that they don’t have to participate. I read them their consent form and give them a chance to ask questions. I let them know what is going to happen along the way at each step. They can choose not to answer any of my questions. We use pseudonyms for anonymity–they come up with some fun ones–but I always make sure to remember their real name at the end. I thank them for being there before they go.
It’s also important to allow them to see the mistakes in the prototype. They have seen my spelling errors, or perhaps a question I accidentally doubled. When they point out the mistakes, they see that this is a real prototype, and know that I am really listening to them. I hear a lot of the same things, but I always ask them to keep going. I ask them to clarify, which validates their responses. We have conversations about what they mean exactly. They get to correct me. Eventually, they give me lots of explanation without me having to prompt them. I love when students come back because they can see the changes that have taken place. They can see that they are part of the process.
Conclusion
Through co-design, students move beyond being research participants—they become collaborators, shaping something bigger than themselves. When we truly listen, we don’t just build better assessments—we help students see their own power in shaping the world around them. At AFG, we are excited to continue engaging students, educators, and caregivers in co-designing asset-based, formative assessments that measure the critical skills that power learning–we look forward to sharing what we’re learning along the way!