New PD Site: SuperAdmins


Starting off with this project, I knew it would need a strong admin interface. One of the problems I mentioned initially is that all of the PD troubleshooting and administration went through me. That became a big burden, especially when presenters needed something done.

The new platform includes a SuperAdmin area which can be used to manage events and users. This role can be given to anyone, so it’ll be nice to have some other team members who are able to make system-level changes.

Navigation

The navigation menu includes two options for SuperAdmins: Event Management and User Management. I decided to split management into two different views because the combined view was getting very complex. SuperAdmins can also create events, just like presenters.

A navigation menu with 'Home', 'My Schedule', 'Documents', 'Event Management', 'User Management', and 'Create Event' listed as choices.

Event Management

SuperAdmins have access to all events in the system. They can see registration details and update attendance. From time to time, a presenter may miss a person who was actually there, so the admin can go in and mark that person as having attended after the fact.

The SuperAdmin event controls are nearly the same as the Presenter controls, with two major differences:

First, SuperAdmins can add any user as a presenter to an event. Presenters can only search through users who are already presenters in the platform. SuperAdmins override that and can add any registered user. When that user is made a presenter, their user role is also updated and they’ll be granted the Presenter permissions detailed in my last post.
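The promote-on-add behavior can be sketched as a small helper. This is an illustrative model only, not the platform's actual code; the role names, rank ordering, and field names are all assumptions:

```python
# Hypothetical role ladder; the real platform's role names may differ.
ROLE_RANK = {"user": 0, "presenter": 1, "superadmin": 2}

def add_presenter(event: dict, user: dict) -> None:
    """Attach a user to an event as a presenter, upgrading their role if needed."""
    if user["id"] not in event["presenter_ids"]:
        event["presenter_ids"].append(user["id"])
    # Promote, but never demote: a SuperAdmin added as a presenter keeps their role.
    if ROLE_RANK[user["role"]] < ROLE_RANK["presenter"]:
        user["role"] = "presenter"
```

The one-way comparison is the important part: adding an existing SuperAdmin to an event shouldn't strip their system-level permissions.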

Second, SuperAdmins can completely delete events. Presenters could set an event status as ‘inactive,’ which hides it from the UI, but doesn’t remove any of the data. The SuperAdmin can nuke the entire event along with registrations, attendance, and other associated data.

Because the deletion is completely irreversible, this has an extra confirmation step before actually performing the operation.

A popup asking the SuperAdmin if they are sure they want to delete all information and registrations for an event.
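As a sketch of how an irreversible, all-or-nothing delete like this can work, here's a minimal example using SQLite foreign keys with ON DELETE CASCADE. The table names, columns, and confirmation flag are assumptions for illustration, not the platform's real schema:

```python
import sqlite3

def make_db() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection
    conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, title TEXT)")
    conn.execute(
        "CREATE TABLE registrations ("
        " id INTEGER PRIMARY KEY,"
        " event_id INTEGER REFERENCES events(id) ON DELETE CASCADE,"
        " user_id INTEGER, attended INTEGER DEFAULT 0)"
    )
    return conn

def delete_event(conn: sqlite3.Connection, event_id: int, confirmed: bool) -> None:
    """Delete an event and all dependent rows; requires explicit confirmation."""
    if not confirmed:
        raise ValueError("Deletion is irreversible and must be confirmed")
    with conn:  # transaction: the event and its registrations go together or not at all
        conn.execute("DELETE FROM events WHERE id = ?", (event_id,))
```

Pushing the cascade into the schema means registration and attendance rows can't be orphaned if the delete logic changes later.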

User Management

When a user registers for the site, their account is set by default to User. This limits what controls the general user has (again, detailed in another post) and lets people sign up without interaction from presenters or admins.

There are times when users need to graduate up a level. The User Management area allows admins to change user roles with a dropdown menu. The change is immediate: on the next login (or page refresh), the user's permissions update.

A list of users with their name, location, email, and role. Each user has an Edit button available to the SuperAdmin.
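An immediate role change like this works without touching the user's session if permissions are resolved from the stored role on every request rather than cached at login. A minimal sketch, with illustrative role and permission names that are not the platform's actual identifiers:

```python
# Hypothetical permission map; names are illustrative assumptions.
ROLE_PERMISSIONS = {
    "user": {"register", "view_documents"},
    "presenter": {"register", "view_documents", "manage_own_events"},
    "superadmin": {"register", "view_documents", "manage_own_events",
                   "manage_all_events", "manage_users"},
}

def permissions_for(users: dict, user_id: int) -> set:
    """Resolve permissions from the currently stored role on every call."""
    return ROLE_PERMISSIONS[users[user_id]["role"]]
```

Because nothing is cached, the dropdown change takes effect the next time the user's permissions are looked up.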

SuperAdmins can make edits to users within the platform. Their login matches their Google account, but maybe they want their first name displayed differently. Or, more importantly, they didn't register for an event that they actually showed up for. If an edit is necessary, the SuperAdmin can do all of those things in the sidebar. This is the same sidebar mentioned in the post on presenters, just with different form fields.

I decided not to allow admins to disenroll a user from an event because we want to be able to compare what was registered for vs what was actually attended. There isn't a view for this report yet, but we'll be able to add one in the future.

In the end...

This was a project of necessity to help us move forward as a cohesive team, but it was also one of significant learning for me. This is my first real full-stack project, from the database up to the frontend. The entire project is on GitHub, and while it isn't ready for general distribution yet, you can look at the design at the code level.

There are several features I've added since starting these posts (duplicating events, UI updates, mobile styles, accessibility improvements, etc.) that I won't be covering specifically. I'm going to do one last post on technical aspects and explain some of my design and code choices if you're interested.

If this is something you'd like to consider using in your school or district, I'd be happy to set up a demo to walk through the platform as a whole. Get in touch if you'd like to do that.

New PD Site: Staff User

This spring and summer, I've taken on a full-blown re-write of our PD registration and management system. Our current system is a collection of centralized Google Apps Script projects which create, share, and manage all event folders, documents, and records. It's worked well over the last two years but there's been a single point of contact for everything: me.

Now that we're getting more people involved in professional development, it's time to have a more robust, user-based system for creating and managing events as well as teacher signups and recordkeeping. This post is going to explore the first role: Staff Users. These are teachers or staff who are registering for and tracking participation in events.

Home

The home page shows logged-in users all district events. Their own status is shown on each course badge as either Attended or Registered. Clicking a course shows specifics (presenters, location, etc.) in a sidebar. When a user registers for an event, their course badge updates dynamically, which prevents multiple registrations by the same person.
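The duplicate-registration guard can be sketched with a simple membership check. In a real database this would typically be a unique constraint on the (user, event) pair; the in-memory set here is an illustrative stand-in:

```python
def register(registrations: set, user_id: int, event_id: int) -> bool:
    """Register a user for an event; return False if already registered."""
    key = (user_id, event_id)
    if key in registrations:
        return False  # the badge already shows Registered; no second row is created
    registrations.add(key)
    return True
```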

To do

  • Google calendar invitation to events
  • Only show upcoming events
  • Disable registration on events with no remaining space

Schedule

This is essentially a filtered view of workshops or events the staff member has signed up for. Each event's status is shown and details are displayed in the sidebar when the event is clicked.

To do

  • Custom date filtering
  • Expand view to remove a click for details

Documents

We've had a digital sign-up tool in place for several years. The biggest improvement I'm excited about is the document processing. Every registration goes into a database which can be queried and filtered by a number of parameters. This lets me build a nice spot for teachers to find their documents on demand and print whatever they need rather than waiting on us to generate a PDF from a spreadsheet and send it off.

This page shows which events have their participation or completion confirmed by the facilitator. This confirmation step is so important because we need to move away from being trained and toward showing competency. A workshop might be part of a program, but attending it does not guarantee that the staff member has actually improved.

This is a big shift for us. In the past, we used a standard feedback form. But, given the variety of presenters working with us, we wanted to give people more freedom in how they collected feedback. Also, since we were generating all the feedback forms centrally, we found presenters were less likely to actually read the feedback because the questions may not have been relevant to their own goals. At worst, participants were filling out multiple forms at events: one for us, and one for the presenter. Taking the form out of the documentation flow simplifies things for everyone.

Without getting into the presenter interface for now, this view shows every confirmed event for the user. They're also given a couple of snapshots at the top: total registrations (how much am I signing up for?) and Professional Growth Points (PGPs) earned for completing requirements.

From here, they can either print a summary of all activity on record or print individual documents as needed. All of these details are generated by the database. The record is also validated by the server and database rather than taking input directly. There's no more wondering when an event was or how many PGPs it was worth because it's all driven from a single source of truth.
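Computing the summary from stored records rather than client input can be sketched like this; the record fields and PGP values are illustrative assumptions, not the real schema:

```python
def summarize(records: list[dict]) -> dict:
    """Total registrations and PGPs earned, computed entirely from stored records."""
    return {
        "total_registrations": len(records),
        # Only facilitator-confirmed events count toward earned PGPs.
        "pgps_earned": sum(r["pgp"] for r in records if r["confirmed"]),
    }
```

Because the totals are derived server-side from the database, a printed document can never disagree with the record of what an event was worth.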

That's a quick view of a portion of the site that's just about finished. But there's a lot happening in the background to make it work and to let different people manage it. In future posts, I'll detail Presenters and SuperAdmins and their roles in creating and managing events. I'll also publish a technical post on the technology used to build the site as well as deployment.

Considering PD Structures

I'm in the midst of an action research course and my topic is evaluating and reflecting on our systems of PD in the district. This post is the literature review I did as part of the research process. This is similar to some of the work I did last year on leadership development and PD and those links to related items are at the bottom of this post.


“Professional development” as a catch-all for staff training has a degree of uncertainty associated with it, which clouds our ability to critically discuss and reflect on programming. As an instructional team, we have not taken time to critically assess and address our effectiveness in presentation or facilitation, nor have we done any work to gauge the effectiveness of professional development in changing teacher practice.

In Elkhart, we have worked mainly with self-selected groups of teachers as technical coaches according to the definition provided by Hargreaves & Dawe (1990). Though our sessions contained collaborative elements, they were singularly focused on developing discrete skills to meet an immediate need. As a team, these have been effective in closing a significant digital teaching and learning skill gap present in the teaching staff. We have not, to date, considered specific models of professional development as a mechanism for planning or evaluating the effectiveness of workshops offered in a given school year.

According to Kennedy (2005), comparative research exploring models of professional development is lacking. Her analysis and resulting framework provide helpful questions when assessing and determining the type of offerings for staff. Reflective questions range from the type of accountability organizers want from teachers to determining whether the professional development will focus on transformative practice or serve as a method of skill transmission. It is tempting to always reach for models which support transformative practice, but there are considerations which need to be made for those structures to be truly transformative.

As a district, our efforts have centered on active processes with teachers, but this has been done without an objective measure of what those types of programs actually look like in practice. Darling-Hammond & McLaughlin (1995) summarize our working goal succinctly: “Effective professional development involves teachers both as learners and as teachers and allows them to struggle with the uncertainties that accompany each role,” (emphasis mine). Struggling with uncertainties requires some measure of collaboration, but collaboration alone does not necessarily lead toward transformative ends and can even drive top-down mandates to improve palatability (Hargreaves & Dawe, 1990).

To structure collaborative development opportunities, Darling-Hammond & McLaughlin (1995) make a case for policies which “allow [collaborative] structures and extra-school arrangements to come and go and change and evolve as necessary, rather than insist on permanent plans or promises.” This counters many district-driven professional development programs which require stated goals, minutes, and outcomes as “proof” of the event’s efficacy and resultant implementation. The problem with these expectations is that truly collaborative groups are constantly changing their goals or foci to meet changing conditions identified by the group (Burbank & Kauchak, 2003).

In response, a “Transformative Model” (Kennedy, 2005) attempts to move beyond a simple “collaboration” label and build a professional development regimen which pulls the best from skills-based training into truly collaborative pairs or small groups attempting to make changes in practice. She argues that transformative development must consist of a multi-faceted approach: training where training is needed, and open spaces when groups need time to discuss. All work falls under the fold of reflection and evaluation of practice in the classroom. Burbank & Kauchak (2003) modeled a collaborative structure with pre-service and practicing teachers taking part in self-defined action research programs. At the end of the study, there were qualitative differences in the teachers’ responses to the particulars of the study, but most groups agreed that it was a beneficial process and they would consider participating in a similar structure in the future. Hargreaves & Dawe (1990) alluded to the efficacy of truly collaborative research as a way to combat what they termed “contrived collegiality,” where outcomes were predetermined and presented through a “collaborative” session.

Collaboration as a means alone will not change practices. Hargreaves and Dawe’s (1990) warning against contrived collegiality describes collaborative environments whose scope is limited “to such a degree that true collaboration becomes impossible”. A group working toward a shared goal of transformative practice is undercut when the professional development structures disallow questioning of classroom, building, or district status quos. If collaborative professional development groups are allowed to “struggle with the uncertainties” (Darling-Hammond & McLaughlin, 1995) present in education both in and beyond the classroom, the group will be more effective in reaching and implementing strategies to improve practice. This view subtly reinforces Hargreaves & Dawe’s (1990) perspective that collaboration must tackle the hard problems in order to have a lasting impact.

There are several other factors identified which contribute to the strength and efficacy of professional development. These include continuous, long-term commitments (Darling-Hammond & McLaughlin, 1995; Hargreaves & Dawe, 1990; Richardson, 1990), work that is immediately connected to classroom practice (Darling-Hammond & McLaughlin, 1995; Richardson, 1990; Burbank & Kauchak, 2003), and a group dynamic which recognizes the variety of perspectives informing teaching habits across a wide spectrum of participants (Kennedy, 2005).

As an instructional coach, one of my core responsibilities is to help create a culture of learning amongst members to mitigate division or power dynamics based on experience (Darling-Hammond & McLaughlin, 1995; Burbank & Kauchak, 2003), which is particularly evident in mixed-experience groups. In addition to fostering a strong group dynamic, the instructional coaching role becomes facilitative rather than instructive to help teachers address problems of practice (Darling-Hammond & McLaughlin, 1995). It is easy to fall into a technical coaching position in collaborative groups, but such a role reduces the chances for transformative work to emerge as teachers become trainees rather than practitioners (Kennedy, 2005). This becomes more apparent as districts add instructional coaching positions, but limit the scope of the role to training sessions under the guise of “encouraging teachers to collaborate more…when there is less for them to collaborate about” (Hargreaves & Dawe, 1990). Ultimately, the coaching role is most effective when it is used to support teachers through “personal, moral, and socio-political” choices (Hargreaves & Dawe, 1990) rather than technical skill and competence.

In order to fully reflect upon and evaluate our programming, Kennedy’s (2005) framework for professional development will serve as a spectrum on which to categorize our professional development workshops and courses. Hargreaves & Dawe (1990) also provide helpful reflective questions (i.e., are teachers equal partners in experimentation and problem solving?) to evaluate just how collaborative our “collaborative” groups are in practice. Once our habits of working are established on the framework, we can address shortcomings in order to build toward more effective coaching with the teachers in the district.

Resources

Burbank, M. D., & Kauchak, D. (2003). An alternative model for professional development: Investigations into effective collaboration. Teaching and Teacher Education, 19(5), 499-514. doi:10.1016/S0742-051X(03)00048-9

Darling-Hammond, L., & McLaughlin, M. W. (1995, April). Policies that support professional development in an era of reform. Phi Delta Kappan, 597+. Retrieved from http://link.galegroup.com.proxy.bsu.edu/apps/doc/A16834863/BIC?u=munc80314&sid=BIC&xid=abd8b6f2

Hargreaves, A., & Dawe, R. (1990). Paths of professional development: Contrived collegiality, collaborative culture, and the case of peer coaching. Teaching and Teacher Education, 6(3), 227-241.

Kennedy, A. (2005). Models of continuing professional development: A framework for analysis. Journal of In-Service Education, 31(2), 235-250.

Richardson, V. (1990). Significant and worthwhile change in teaching practice. Educational Researcher, 19(7), 10-18. doi:10.2307/1176411


Here's a presentation I did for a class about a year ago on similar themes, but with a leadership spin.

The featured image is by Jaromír Kavan on Unsplash.

The Why Loops

I spent some time last week running through some "why" loops to home in on reasons behind my potential research question. I think the question is broad enough to allow for several avenues of exploration, but it was insightful to run through the cycle several times (below). We've actually used this mechanism as an instructional coaching team in the past, and being familiar with the process helped me focus on larger issues. Granted, some of the issues contributing to the behaviors we see are well beyond my specific purview and definitely outside the scope of my AR project.

Below is a straight copy/paste of my brainstorming. I think items two and three are most within my realm of influence. I can use my time to focus on teachers who have recently participated in PD to help provide that instructional support. I can also work proactively with principals, helping them follow up with staff members learning new methods or techniques and recognizing those teachers either with informal pop-ins to see students in action or public recognition in front of their staffmates.

Why don’t teachers implement the training they’ve received in PD?

  1. Teachers don’t put their training into practice   
    • There are good ideas presented, but no time to work on building their own versions.   
    • The PD was focused on the why, not enough on the how   
    • Teachers don’t understand why they need to change practice   
    • The district’s communication about the offered PD is lacking clarity   
    • There is a lack of leadership when it comes to instructional vision.
  2. Teachers do not show evidence of putting training to use with students.
    • Teachers don’t know how to implement ideas they’ve learned in the workshop   
    • There are so many demands on their time, planning new lessons falls to the back burner   
    • In-building support systems are lacking   
    • The district is strapped for money and hiring instructional coaches isn’t a priority.
  3. Teachers do not put learning from PD into practice.   
    • There is no outside pressure to implement ideas learned in training   
    • Principals are spread too thin to pay close attention to the inservice sessions teachers are attending
    • Principals do not know what to look for after teachers attend inservice.   
    • Teacher evaluations are based on outdated criteria and promote superficial expectations.
  4. Teachers do not communicate implementation of learning   
    • Workshops in the district are often standalone with no formal structure for long term support   
    • The resources committed to PD for several years were focused on one-off training   
    • The district lacked a vision for teacher development as a continual process   
    • District leadership did not see the value of instructional support as a formal position in the district.
  5. Teachers do not implement learning from workshops   
    • No one follows up on the learning from the PD   
    • There was no formal method for recognizing PD   
    • There is no formal expectation of implementation from supervisors (principals, etc)

"Loop" by maldoit https://flickr.com/photos/maldoit/265859956 is licensed under CC BY-NC-ND

Checking Implementation

Running PD for an entire district is a challenge. The biggest gap I see is knowing how or when teachers actually use what they've learned in a session or a series of sessions. We have automated systems in place, but they don't give us information on the effectiveness of our instruction.

We coach our teachers to check for understanding and watch for application of learning with their students, yet this is something I have not done well with the teachers I work with. Granted, I work with all five secondary buildings (and teachers in general with my partners), so geography and time are a challenge in gathering and collating the right kind of information.

I'm interested in which of the supports we provide will help teachers actually use what they've learned. We run several programs, but which ones are the most effective at engaging and enabling our teachers to make changes to their teaching? What kinds of environments or availabilities are the most helpful to the staff?

Homing In

I haven't defined a specific question yet, but several I'm thinking about include:

  • How long do teachers wait before implementing training they've received from the district?
  • What professional development structures or systems best enable teachers to implement skills or strategies learned in professional workshops?
  • How does student engagement or learning change as a result of a specific instructional change by a teacher after attending a training event?
  • What are the reasons teachers do not put strategies or systems in place after a workshop?
  • Do professional development workshops make an impact on day to day instruction by the teaching staff?

My main concern is that several of these questions are very subjective. Measuring the result - either quantitatively or qualitatively - will be difficult and rely on select groups of teachers self-selecting an evaluation tool. We already send a basic implementation survey to teachers three weeks after an event, so my intent is to go through all of those records and begin to identify the response rate as well as the most common reasons given for implementation vs non-implementation by teachers. I'm also hoping to gain some candid insight on the state of our professional learning opportunities from teachers' perspectives.