
Our work at AERDF centers on building innovative educational solutions to support the brilliance of Black and Latino students and students experiencing poverty.

At the center of that work, of course, are the learners themselves — students from across the country who are involved in, or will be involved in, research and development (R&D) with AERDF.

Working with these students directly is a privilege, and it’s one we don’t take lightly. Our teams have worked diligently to create R&D processes that involve students, caregivers, and educators collaboratively while mitigating the risks of participation.

We’ve also taken exhaustive steps to safeguard the student data we’re collecting from potential hacking or misuse. We firmly believe that any data worth collecting is worth protecting. 

As such, we treat U.S. regulatory data protection standards as a baseline — and then we go above and beyond them to mitigate any risk that an individual learner’s reading-ability data or social-emotional learning feedback, for example, falls into the wrong hands.

At AERDF, we view data protection not as a barrier to innovation, but rather as a framework that unlocks and deeply informs our investigative discoveries.

In our work, we also acknowledge that, historically, research on Black and Latino populations has at times been unethical or undertaken without proper consent. The Tuskegee syphilis experiments and studies using Henrietta Lacks’ cells without her family’s awareness are examples of this kind of research misuse.

Instead, we actively and openly engage with study participants and protect their shared data throughout our research process in order to prevent repeating past harms.

 

Safeguarding data through a multidimensional approach

 

AERDF’s comprehensive data privacy protocols are driven by a multidisciplinary team with expertise in all facets of data protection. This allows us to tackle the wide-ranging ethical, technical, and legal considerations involved when collecting personal information about real people.

In addition to leveraging my own expertise as a privacy attorney, our data protection team includes cloud architects, data security experts, IRB (Institutional Review Board) specialists, generative AI experts, and student privacy consultants.

This multidimensional approach allows us to do what many larger, for-profit organizations have struggled to do: namely, get the many moving parts of R&D to work together effectively and cohesively.

In a traditional R&D framework, lawyers and tech team members often aren’t sure how to work collaboratively with one another, given their different backgrounds and even differing jargon. In these environments, for example, an attorney may struggle to explain privacy demands for a particular data collection platform if tech team members have previously worked on platforms that didn’t require them.  

But at AERDF, we have broken down such stagnating silos of expertise to build a single, multidimensional R&D team — one in which everyone knows the hows and whys behind what everyone else is doing. By keeping information flowing among all our R&D team members, we’re able to create an R&D process that’s innovative and inclusive, while mitigating involvement risk for participants.

From a technology perspective, we’ve built a comprehensive, deeply protected cloud-based platform where all our study data lives and where new prototypes can be safely developed. Every AERDF researcher has access to this platform to ensure the data they collect receives cutting-edge security protections.

Together, our team is also working to plan for and stay ahead of new regulations that may reshape data collection best practices in the future. Even before the FTC announced new potential COPPA rules, for example, we were already working on ways to keep identifiable student data out of searchable algorithms.

All this work has one goal: ensuring that the learner and educator data empowering our work remains free from misuse or unwarranted distribution in an era when student data has become a key target of hackers.

 

Privacy by design

 

Data protection informs every stage of the work we do at AERDF, from early project inception and logistics ideation to best practices for ethical data collection, storage, and — once the project ends — data de-identification or disposal.
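As one simplified illustration of what end-of-project de-identification can look like (this is a generic sketch, not AERDF’s actual pipeline, and all field names are hypothetical), a common approach drops direct identifiers outright and replaces a record key with a keyed hash, so that de-identified records can still be linked within a study while the underlying identity cannot be recovered without the key:

```python
import hashlib
import hmac

# Hypothetical example only: in practice the secret key would live in a
# managed key store, never in source code.
SECRET_KEY = b"replace-with-managed-secret"

# Fields that directly identify a student and should be dropped entirely.
DIRECT_IDENTIFIERS = {"student_name", "email"}


def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed hash (HMAC-SHA256).

    The same input always yields the same token, so records remain
    linkable across datasets, but the original value cannot be
    recovered without the secret key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()


def deidentify(record: dict) -> dict:
    """Drop or pseudonymize direct identifiers in a study record."""
    clean = {}
    for field, value in record.items():
        if field == "student_id":
            # Keep linkability via an irreversible token.
            clean["participant_token"] = pseudonymize(value)
        elif field in DIRECT_IDENTIFIERS:
            continue  # drop names, emails, etc. outright
        else:
            clean[field] = value  # retain research measures as-is
    return clean


record = {
    "student_id": "S-1042",
    "student_name": "Jane Doe",
    "reading_score": 87,
}
print(deidentify(record))
```

Real de-identification programs go further (quasi-identifier generalization, key rotation, disposal schedules), but the core idea is the same: identity leaves the dataset while the research signal stays.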

Put another way, AERDF projects are designed for privacy optimization from the outset, and privacy remains a top priority at every stage as projects prepare for launch and implementation. Here, privacy is not an end-stage afterthought but rather a driving force of innovation.

In their earliest stages, all AERDF research projects are reviewed by Institutional Review Boards (IRBs) that specialize in protocols for safe data collection and informed consent from participants. These IRBs are often housed at universities affiliated with our researchers or at external, independent partnering organizations.

Before projects launch, our investigators work diligently to provide study participants with every detail they may need — including the research hypothesis and what data we will be collecting, how, and why — so that educators, parents, and students can make a knowledgeable, autonomous decision about whether to participate in our studies.

Additionally, before any data is collected, each project goes through internal review by AERDF’s Research and Development and Ethics Committee. This review process asks investigators to, among other things, consider their own biases about Black and Latino students that could inadvertently or improperly affect their research protocols or study findings.

As part of project review, investigators also receive a data risk matrix, which asks them to report whether participants might face emotional, social, or physical harm from participating in an AERDF-sponsored study — and offers guidance on ways they might mitigate that risk.

Finally, before any project officially starts, AERDF enters into formal, written data-sharing agreements with school district administrators, learners, caregivers, and educators to ensure all involved parties are informed and feel comfortable about what data is to be collected and how it will be used.

To some, this may sound like a cumbersome or overly meticulous process, but we feel it actually speeds up our work. Through these extensive protocols, we can reduce the risk of mid-project research shutdowns due to data breaches, insufficient consent, or other ethical concerns.

 

Pursuing informed and inclusive R&D

 

AERDF is committed to ethical and inclusive R&D. Providing participants with ample information to enable informed consent and ensuring their information is protected, once shared, are key steps in our pursuit of that goal.

Above all, we want our study participants to feel confident that the data they’re sharing will be safeguarded and that their voices and insights will be heard and valued.

Ethical R&D understands that Black and Latino communities and all communities experiencing poverty know and understand the problems their neighborhoods and schools are facing. They don’t need an outsider to “identify” or “discover” them. What they need is someone willing to listen and support possible solutions.

That’s why our work aims to center and engage the experiences and rich knowledge base of the communities at the heart of our work. We endeavor to provide these students — and their caregivers and teachers — meaningful opportunities to participate in R&D with us. After all, it’s their lived experiences that hold answers to the questions we’re pursuing.

With their help, we’ll achieve our goal of providing every student with the educational support they need to succeed.

 

By Asia Parks, Ethics and Privacy Counsel for AERDF
