Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you!
You can contact us via the Podcast page on the IDC website at https://ideadata.org/
### Episode Transcript ###
00:00:01.55 >> You're listening to "A Date with Data" with your host, Amy Bitterman.
00:00:07.37 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day.
00:00:19.53 >> "A Date with Data" is brought to you by the IDEA Data Center.
00:00:24.60 >> On this episode, we're joined by Ginger Elliott-Teague, Director of Data Analysis with the Oklahoma State Department of Education. Welcome.
00:00:32.33 >> I've been at the State Department of Education in Oklahoma for about 6 years now, always in the same role, working with part-B data and part-C data functionality, the SPP/APR, the SSIP reporting and evaluation and a wide variety of other data elements.
00:00:51.20 >> So states recently submitted their first SPP/APR of the federal fiscal year 2020-2025 cycle, and that included a lot of changes to the indicators. How, in your state, did you support districts and other stakeholders to understand those changes and the implications of those changes?
00:01:08.97 >> We've traditionally had a hard time having districts understand what the APR is, why we have to report it the way we do, why we make the calculations we do. Understanding the role of the federal reports and how that trickles down to districts can be difficult to explain, particularly when we have a lot of turnover, as I'm sure a lot of states and districts do. So this year when we were explaining the changes, a lot of new folks didn't know there were things that had changed, so talking about comparisons to the past didn't always work. We did that a little bit for those who had experience, but we tried to generally explain the calculations, particularly for indicator one. Indicator two, dropouts, did not change in Oklahoma because we've been using the same methodology for quite a while, the one that will be required moving forward. But for indicator one, graduation, the change to the exiting file was an important conversation with districts because it really changes how they understand what a graduate is. So rather than thinking about the cohort basis, changing it to the percentage of graduates out of all 14-to-21-year-old exiters is a massive change in calculation. So that took some time to explain, and I think we're going to continue to do that over time as they see that graduation and dropout are now basically mirror images of each other. In Oklahoma, we don't have a lot of students who reach max age, and we do not have an alternate diploma or certificate or anything, so graduates plus dropouts is pretty much our entire exiting report. That at least makes those two numbers align really nicely, but it also means that if you fail at one, you fail at the other, so that's going to be a little bit of a change. Indicator three was really hard to explain because with all of the new assessment measures and the limitations to specific grades, we went through a lot of charts and a lot of data.
And when we started talking about target setting, we didn't have the assessment results yet, so we projected different options in our explanations of what the targets might be. We said, if our proficiency rate were X, this is what our targets could be. So we set forward some proposed target lines that differed based on that starting point of our data this year because this, of course, is baseline, and we would be looking forward from there. So we asked, what if proficiency were 10 percent or 20 percent or 30 percent? And, in Oklahoma, unfortunately we had to go down to four percent, which is where we are this year in some of our baseline data. We are at two, three, four percent proficiency for special education students, which is just horrifying, but that's where we are and how we will move forward. So that was interesting to talk about with stakeholders and to say, "We must set targets. We're not sure what they will look like, but here are some options for raising it annually and where we might want to be in the end. What do you think? How do you want to move forward?" We always asked about methodology options as well as final target preferences so that stakeholders could really vote on them, and we did polling in our meetings so that we could get a firm preference for those options, but then, of course, we had discussion and got some qualitative feedback as well. Indicator six C was really the only other indicator that stakeholders thought was potentially problematic, with percent receiving services in the home. But we didn't have a lot of discussion on that, mainly because we have maybe 30 or 40 a year out of our entire child count that receive services in that setting, so that's more a question of, why are we measuring that? Why do we have to report that? So we do what we can, say what we must, and then also say, "Well, that won't be in your determination, so it's not something you need to worry about at the local level."
00:05:24.39 >> What would you say worked really well in terms of engaging the districts and stakeholders and increasing their understanding of the changes?
00:05:31.77 >> Talking through the potential options, I think, worked really well. They appreciated knowing that we weren't presenting one option that they had to say yes or no to. So we said, "Okay, if we had really high targets, what would that look like over time? If we had medium targets, or if we had conservative targets, what would that look like over time? Where would we want to be in the end?" Whether we had an incremental approach or set specific percentage increases over time, we talked through all of those options, and I hope that process helped them understand that there are a lot of options for setting targets, and one is not necessarily required. We do want stakeholder input into what those will be, and that decision stays with us for 6 years, so understanding the implications and importance of choosing [Indistinct] that was part of the discussion. We put up charts and tables and those kinds of things, and we got good feedback about comparing target lines right next to each other. That seemed like a successful approach. The hardest part was actually getting people to attend, so while we had good participation from those who did attend, we had low participation overall.
00:06:50.67 >> And what plans do you have for ongoing, continuous engagement with stakeholders moving forward?
00:06:56.33 >> So for the first 16 indicators of the SPP/APR, we do report annually to stakeholders, particularly through our advisory panel. We post information online and meet all of those public reporting requirements. Traditionally we haven't done a lot of other stakeholder engagement, and I know moving forward we need to, so that is something we are thinking about. With the SSIP, we have much more engagement and a lot of opportunities for stakeholder engagement. We have several standing committees, advisory committees of different types of stakeholders, to help us move forward with that. That's been really successful, and we plan on engaging those groups every other month for the foreseeable future. How we can use that model for the rest of the APR, we'll have to think about; that might be too much in general. I'm not sure, but at least that model has worked really successfully for us for the SSIP.
00:07:50.83 >> Can you share a success story, something you're really proud of related to the stakeholder engagement work you've been doing?
00:07:56.93 >> We are ... As a state, we have a really good relationship with our parent center, and that's something that we think of as successful. They will host meetings with us. They promote our stakeholder engagement. They always attend, of course, and try to bring other parents in. They're not necessarily successful at ensuring that a wide variety of parents attend. They advertise, but that doesn't mean parents will necessarily participate, but the fact that they are engaged and they share information with their, I don't want to say members, but with their stakeholders themselves has been really helpful for us, and we really appreciate the support with that. The SSIP stakeholder groups, those standing committees, we think, have been quite successful, and we are glad that those are continuing, as well.
00:08:46.34 >> This is all such incredibly rich information that you've shared with us. Before we go, is there anything else that you would like to leave us with?
00:08:52.52 >> So this year was the first time I've gone through the stakeholder engagement process for an entirely new cycle of the SPP/APR, so that learning experience has been really valuable. Working with new ways to engage stakeholders, we've set up a new website and tried to share as much information as possible to build stakeholder capacity, and I think we're going to learn from that information and that process of how we can continue to engage stakeholders. In Oklahoma, we don't have a lot of stakeholder engagement. We aren't highly litigious, so we just don't hear from stakeholders very much in a wide variety of ways, so finding new ways to bring them into the process and ensure that they have a voice has been great to practice, and I'm hoping that we can learn from that and develop new ways to pull them in as we move forward.
00:09:44.54 >> Thank you, Ginger. We really appreciate having you on the podcast with us.
00:09:48.02 >> Thank you. I appreciate the opportunity to share.
00:09:51.74 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at IDEAdata.org.