Interview with Kristen DiCerbo, Educational Psychology PhD and Education Research Scientist; Lead, Center for Learning Science & Technology.
Kristen DiCerbo’s career path shows that the route from point A to a job in which you thrive is not always a straight line. Starting out as a psychology and sociology major doesn’t seem like the most direct path to becoming a research scientist for technology in the classroom, especially since her early aspirations were academic. But that’s exactly how DiCerbo started out. Her interest in analyzing data and results led her to a career as a research scientist developing learning science principles that help children learn in the classroom.
As a psychology and sociology major, DiCerbo earned her Bachelor’s degree from Hamilton College. She then went on to earn her Master of Education and PhD—both in educational psychology—from Arizona State University. Her early career as a school psychologist was a great entrance into the world of education and the mind of the student. Since then, DiCerbo has held roles in varied areas of education, including a recent stint as adjunct faculty teaching statistics and research methods.
As an authority on the subject of educational technology, DiCerbo has written and contributed to numerous peer-reviewed journal articles and white papers on topics such as game-based assessment of persistence, design and discovery in educational assessment, and research methods in psychology. DiCerbo is also a sought-after speaker and workshop leader, sharing game-development and research expertise with industry peers. And when she’s not presenting or contributing, she’s writing informative blog posts for Pearson’s Research and Innovation Network.
Enjoy our full interview with Kristen DiCerbo as she discusses the science and methodology behind finding what educational technology works and how today’s teachers can use it to their benefit.
I went to graduate school in a School Psychology Training Program intending to become a school psychologist. I ended up in school psychology because, even at a relatively young age (middle school, maybe), I was really interested in why some things seemed easy for me to learn and hard for other people. I wanted to understand more about how people learn.
I was in a scientist-practitioner program and during my first year of the program, I realized I actually really liked statistics classes. I had always been good at math, but never really liked it. When I took stats, suddenly I saw an application of quantitative reasoning to interesting questions about learning. I took a lot more stats classes than most school psychology students would and began to think that I might go into academia (as one does in grad school). Also during grad school, I was involved with a number of different research groups, which I think gave me a broader view of how different researchers work.
However, I still wanted to get experience in the schools. I did my year-long internship in the schools and worked for two subsequent years as a school psychologist. I was then thinking about making the move into academia. In the process of getting letters of recommendation, I got back in touch with a professor who had left academia to design an assessment system for the Cisco Networking Academy. This is a program in which the company develops curricula and tools to teach basic computer networking and gives them away to high schools and colleges in 160 countries around the world as part of its social responsibility programs. They were looking for educational psychologists to help design the learning and assessment tools. This was clearly a big decision: academia or industry? Ultimately, my decision was based on a couple of things: 1) the direct impact on nearly a million students using things that I helped develop and 2) the ability to still conduct and publish educational research.
I did a number of things with the Networking Academy, but ultimately ended up with a group creating simulations. It turns out that having kids practice on enterprise routers that cost tens of thousands of dollars isn’t great. Schools can’t afford them and kids tend to cause problems while asking “what if I try this?” So, we started looking at simulations, and how we could use data from their interactions to understand what the students know and can do. Then we asked what would happen if we added a game layer on top of the simulation.
After some time working on this, some folks from Pearson heard me presenting about the game and simulation-based assessment work and expressed interest in expanding that to domains other than computer networking. So that is how I ended up where I am now. It is probably clear by now that this is in no way the plan I had entering grad school, but I love what I’m doing and where I’ve ended up. However, I was open to options and sought out lots of experiences that weren’t “typical” for my path or program.
My research relies heavily on understanding how students learn and therefore draws heavily on both cognitive and educational psychology. Cognitive psychology provides insight into things like attention, working memory, and other executive functions that we then translate into the design of digital games and simulations. Psychology as a field defines things like accepted evidence for validity and reliability that we then apply as we build models of what students know and can do from the data of student interactions.
My experience as a school psychologist helps me in two ways: 1) understanding the ecosystem of schools and 2) understanding students struggling to learn. The first is important when we think about the constraints of implementing educational technology in schools. Having been in schools gives me a perspective of how systems work. I know how decisions get made, what happens in classrooms, the pressures teachers feel, etc. I also have a lot of experience understanding learning difficulties for students and designing and monitoring interventions. That clearly impacts a lot of the work I do in designing digital activities and systems.
The first key is that I have defined my research niche. I do research on the application of learning science to the design and study of rich learning environments—‘rich’ in terms of the interactions students have in them, the data they collect, and the feedback they provide. Next, having worked in the field for more than ten years, I know the major questions that need to be addressed. In my head, I have a rough progression of a research program that will lead to progress on those major questions. Finally, I work in an applied position. So, I am constantly designing research so it serves the dual function of informing the products and services the company creates while also advancing the field.
Right now I’m excited about a project we’re working on called the Insight Learning System. It is a proof of concept that seeks to show how we can integrate a variety of digital activities and in-person classroom activities—theoretically, around a learning progression, and quantitatively, through statistical modeling of data. I am excited because it demonstrates the process end to end: everything from designing activities that align with stages in a learning progression, to developing Bayesian networks that integrate very different kinds of data, to communicating information to teachers in ways that support their instructional decisions. We have used a very iterative development process, so we have had many opportunities to see students interacting with the activities and to build and refine both our learning and assessment models. You can find out more here.
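The kind of evidence integration DiCerbo describes—combining very different data sources into one estimate of what a student knows—can be illustrated with a minimal Bayesian update over a binary “mastery” state. This is a hypothetical sketch, not the Insight Learning System’s actual model; the evidence-source names and probabilities below are invented for illustration:

```python
# Illustrative only: a two-source Bayes update over a binary latent
# "mastery" state. Real systems use full Bayesian networks with many
# more variables; the numbers here are made up.

def posterior_mastery(prior, likelihoods, observations):
    """P(mastery | evidence), treating evidence sources as
    conditionally independent given the latent state."""
    p_m, p_not = prior, 1.0 - prior
    for source, observed in observations.items():
        # (P(success | mastery), P(success | no mastery))
        p_given_m, p_given_not = likelihoods[source]
        if not observed:  # use complement probabilities for a miss
            p_given_m = 1.0 - p_given_m
            p_given_not = 1.0 - p_given_not
        p_m *= p_given_m
        p_not *= p_given_not
    return p_m / (p_m + p_not)  # normalize

# Hypothetical evidence sources: one game event, one quiz item.
likelihoods = {
    "game_level_cleared": (0.85, 0.30),
    "quiz_item_correct": (0.90, 0.25),
}

belief = posterior_mastery(
    prior=0.5,
    likelihoods=likelihoods,
    observations={"game_level_cleared": True, "quiz_item_correct": False},
)
print(round(belief, 3))  # mixed evidence pulls the estimate below the prior
```

The point of the sketch is the integration step: a success in the game and a miss on the quiz each reweight the same latent state, so heterogeneous data sources end up on a common scale.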
I lead a small group of researchers in the Center for Learning Science & Technology, so I have some more administrative duties (working on larger research and company strategy definition, budgeting, etc.) and communications tasks (like doing this interview) than a typical research scientist.
That said, at any given time I have three to four research projects at various stages of the pipeline: planning/literature review, activity design/creation, data collection, data analysis, and reporting and dissemination of results. So on a given day I will split my time among things like working with a team that is building a game; meeting with a global team interested in investigating how to build learning progressions; writing and editing everything from blog posts to peer-reviewed journal articles; and presenting results, often to people who are not researchers. Although I work in industry, I also still do things like review for peer-reviewed journals and conferences.
In my first years of graduate school, my professors still came in with a stack of transparencies and put them on the overhead projector. That was technology. Although, reflecting on that, I’m not sure the move to PowerPoint presentations is a huge step up. Really, the big changes are the huge increases in computing power that allow us to create engaging digital learning environments that students can interact with in a meaningful way, and our ability to capture and use data from those interactions to understand what learners know and can do. We still aren’t anywhere close to making full use of these two things, but that is where I believe the potential for changing education lies.
I hesitate to talk about teachers. In general, that is not my area of research specifically. I think educational psychology in general has a lot to say about being an effective student. One of the researchers in the research center I lead, Liane Wardlow, recently wrote a nice blog post about what learning science tells us are some of the biggest mistakes students make in learning.
Absolutely. There are many times when interaction with one or more peers or a more knowledgeable person is essential for learning. Certainly that can be done over technology, but I believe (as someone who works on a geographically distributed virtual team) that these interactions work better when there are established relationships and those relationships are easier to build face-to-face. In addition, for some topics, interaction with the real world and real objects is essential for mastering concepts and skills.
One of the biggest things I see is not a skill but more of a disposition. Teachers need to learn to be comfortable with not having all the answers all the time. I see this when we introduce games into classrooms. Teachers are hesitant to implement them unless they have played all the way through and are still nervous about students getting stuck and not being able to help them. However, one of the engaging aspects of games for many players is getting stuck and then figuring out for themselves how to get unstuck. I know even when I’m testing games with students, it is so tempting to jump in and help players, but I also find if I shut my mouth and sit on my hands, players figure it out and ultimately get more from the experience.
The second important thing is to think about what technology can help you do that you couldn’t do before, rather than simply replicating in the technology what was done before. I go to a lot of assessment conferences and often see people talking about putting paper-and-pencil tests onto the computer, copying exactly what they had on paper. I totally disagree with this approach. Rather than sticking with what we have always done, how can we do it differently, and hopefully better? This is why I spend so much time focusing on data from games, investigating whether we can eliminate the need to stop and give a traditional test when we already have all this information from the learning activity.