The role of the faculty in the post-LMS world (opinion)

However, working outside the LMS, well-trained instructors will be able to do far more than meet the minimal requirements for moving college courses online...

Source: The role of the faculty in the post-LMS world (opinion)

I don't work at a university, but we're in the process of moving teachers into using Canvas in our district, so this resonates. I see two groups of people:

  • those who already had material online and are struggling to work backwards (essentially) to fit items into Canvas.
  • those who have nothing online and are struggling to make sense of what works well digitally and what doesn't.

The LMS is a weird stepping stone. I've had materials online for years, so I don't like the constriction an LMS brings to what I've done in the past, but I do appreciate the streamlined data I can grab from the system (I need to write more on using Outcomes in Canvas later...)

For the second group, it's a great intermediate step and I'm already seeing people look for more online on their own. They want to push the system now that they understand it better. They're seeing the benefit of using the Internet as a whole and not limiting their courses to the flow in Canvas.

Striking the balance between structure and variety is difficult. I'm not sure the LMS will ever completely go away, but I can see its influence waning as skills develop and alternatives become more accessible to teachers.

Salon Will Hide Ads in Exchange for Bitcoin

Several articles swirled around early this week about Salon's (sorry, not going to link it) new ad-blocking choice for users. You can:

  1. Disable your ad blocker to see the content, including ads.
  2. Keep your ad blocker on, but allow Salon to mine Bitcoin with “spare processing power.”

This is a terrible, terrible system for several reasons.

Bitcoin isn’t mined with “spare processing power,” as the FAQ claims. It’s mined with electricity that you pay for, with no benefit to you. I’m not going to go into detail because this post, written on a related topic, has a great explainer on how Bitcoin “mining” actually works (jump to “Why this is bad” in the post).

Also included in the FAQ: “nothing is installed” on the user’s computer if they choose to opt in to mining. This isn’t true either. It’s true only in the sense that you don’t download and install a program in the traditional way. But if you opt in, Salon silently loads a script through the browser which runs in the background with no further notice to the user.

Also this week (coincidentally), there was a malicious mining script placed on thousands of government websites. When a user loaded a page, the mining script went to work at the expense of the user’s computer. As cryptocurrencies continue to bubble, I think we’ll be seeing more and more of these “opportunities” at the expense of the user.

The problem with ads isn’t the fact that I’m seeing ads. The problem is that ad technology on the web is invasive, expansive, opaque, and a really terrible experience for most users. Ad software builds a profile of an individual to target more “relevant” ads based on your browsing history. If a company tracks you on a particular page, that page’s content is stored and called up next time you hit a page with that company’s software.

These algorithms are totally opaque - no one knows exactly how they work, which means you - the user - are a product, not the consumer. As a consumer, sure, I want to see relevant ads. But that data which is used to show me advertisements is also sold by clearinghouses to other companies for profit. I’m a transactional item, not a customer. The nature of advertising on the Internet has fundamentally changed.

Salon’s PR push to convince people that this is a fair exchange is misleading and does nothing to address the fact that Salon-the-organization is taking money from companies with, at best, shady business practices. Selling readers while claiming to sell ad space takes advantage of illiteracy in how the Internet works. Masking this practice is underhanded and should be recognized as such.

Featured image: Stop sign flickr photo by deb & devin etheredge shared under a Creative Commons (BY-NC) license

Slides Tweeter Update 3

(Part 1, Part 2)

I’m excited.

A beta version of the Slides Tweeter AddOn will be ready this week. Two major updates helped get it to this point:

  1. Google changed the URL pattern for the thumbnail image, meaning I can grab a much smaller file which greatly increases the speed of the AddOn. Most tweets are posting in less than 20 seconds. Currently, the AddOn is grabbing a 500px wide image, but I may bump it up to 700 or 800px to see if I can squeeze a larger image without the loss of performance.

  2. I’m using the PropertiesService function of Apps Script to store the active Slides ID and title. When I first built the proof of concept, I didn’t need to store IDs because I could access the getActivePresentation() property directly. As an AddOn, I need to open the presentation by ID to make sure the correct one is being opened at any one point. This also allowed me to set the webapp as a static address, accessible by anyone using the AddOn. No data is pushed to the client (browser) other than the images of the Slides, so no data is exposed.
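
The storage pattern is simple. A sketch of the idea (function and property names here are mine, not necessarily the AddOn's actual source):

```javascript
// Store the active presentation's ID and title so the web app,
// which lives at a static URL, can open the right deck later.
function saveActiveDeck(id, title) {
  var props = PropertiesService.getUserProperties();
  props.setProperty('deckId', id);
  props.setProperty('deckTitle', title);
}

// Read the stored values back when the presentation window is served.
function getActiveDeck() {
  var props = PropertiesService.getUserProperties();
  return {
    id: props.getProperty('deckId'),
    title: props.getProperty('deckTitle')
  };
}
```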

To make it easier, I updated the initial UI slightly. Here’s the updated launcher:

The title and hashtag are customizable; the ID field is not. There is still a little tweaking to do to ensure the player launches correctly every time.

If you’d like to be whitelisted for a beta, fill out the form below. I’ll follow up directly via email once it’s ready.

Link Google Forms to Bitly Automatically

If you have a Bitly account, you can get a public API token which can be used to create short links. This is really handy in my situation, where I’m creating a ton of feedback spreadsheets (another monster post on that later). Using a small code snippet, you can create a feedback form, throw in some code, and have it display as a short URL and QR code.

If you’re starting from scratch, create a template form and spreadsheet. When you need to make a feedback form, use File > Make a copy on the spreadsheet to copy over the code.

Otherwise, you can make a copy of this blank template to get started (code is already inserted). If you’re going to make your own, be sure you have a form linked. If there is no form on your sheet, you’ll get an error.

The code

The full source code is below. Note that there are two files: the main script file and popup.html. If you’re copying/pasting, you need to create an HTML file (File > New > HTML file in the script editor) and call it ‘popup’.
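
The shortening call itself is a single UrlFetchApp request. A sketch of the shape of that code, assuming Bitly's v4 /shorten endpoint and a token you supply (function and variable names are illustrative):

```javascript
// Shorten the linked form's URL with Bitly.
// Assumes a v4 API token; pass it in or store it in script properties.
function shortenFormUrl(longUrl, token) {
  var resp = UrlFetchApp.fetch('https://api-ssl.bitly.com/v4/shorten', {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + token },
    payload: JSON.stringify({ long_url: longUrl })
  });
  // Bitly returns JSON with the short link in the "link" field.
  return JSON.parse(resp.getContentText()).link;
}
```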

In action

The wheels of the web keep spinning…

Cog in the wheel flickr photo by WickedVT shared under a Creative Commons (BY-NC-ND) license

Tweeting Google Slides Automatically

An app called Keynote Tweet has been around (in various working and non-working states) since the late 2000s and lets users auto-tweet images of their Keynote slides during a presentation to a hashtag or stream. Google released the Slides API this year, and one of the API methods allows you to get a thumbnail of each slide which can then be sent to other applications. You can see an example of this in a slideshow now by going to View > HTML view. It opens a tab with slide images embedded in plain HTML formatting. Since we can now get the images, we can start to push them out to other platforms with Google Apps Script.

This post is going to be technical in nature and is really meant as a proof-of-concept. I’ll explain some of the shortcomings of this implementation in context. The code is broken up into several chunks and the entire source is posted to GitHub.


First, the Slides API has to be enabled in the Google Cloud Console. Once that’s done, getting the thumbnails is pretty easy.
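With the API enabled, the thumbnail request can be sketched like this (assuming the Slides advanced service is also turned on in the script; the underlying REST method is presentations.pages.getThumbnail):

```javascript
// Collect a thumbnail URL for every slide in a deck.
// SlidesApp gives us the slide objects; the advanced Slides
// service exposes the REST thumbnail endpoint.
function getThumbnailUrls(presentationId) {
  var deck = SlidesApp.openById(presentationId);
  return deck.getSlides().map(function (slide) {
    var thumb = Slides.Presentations.Pages.getThumbnail(
        presentationId, slide.getObjectId());
    return thumb.contentUrl; // PNG hosted on Google's servers
  });
}
```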

Off the bat, the Slides API doesn’t have event triggers like the Forms, Sheets, or Docs services do. I wanted each slide to be tweeted as the presentation advanced, so I needed a custom presentation view. To get this to work, I wrote a web app presentation window served by Google’s HtmlService.

This simple HTML page requests and displays the slides from an array created by the backend. There are some controls that hide on the bottom of the screen and a position indicator in the top right. Hover the mouse and they’ll pop up for interaction.

Issue 1

  1. The initial page load for the web app varies depending on the size of the presentation. The request for slides fires as soon as the document loads in the browser. The loading GIF is replaced by the slides when they’re returned.

  2. The slide thumbnails are returned as 1600×900 pixel PNGs, so they’re big, which increases load time. There is no way to specify the size of the image returned at this point.

Each slide is sent as an image on a tweet as the show is advanced, and a posted class is added to prevent multiple tweets of the same slide. The “previous” button does not trigger a tweet in the event you go backwards.
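
The advance-and-tweet guard is only a few lines of plain JavaScript on the client. A sketch of the idea (not the app's exact code):

```javascript
// Track which slides have already been tweeted so advancing,
// backing up, and advancing again never double-posts a slide.
function makeTweetGate() {
  var posted = {}; // slide index -> already tweeted (the "posted class")
  return function shouldTweet(slideIndex, direction) {
    if (direction !== 'next') return false; // "previous" never tweets
    if (posted[slideIndex]) return false;   // already posted once
    posted[slideIndex] = true;
    return true;
  };
}
```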

I used Martin Hawksey’s TwtrService library to connect my Twitter account. He has a detailed post on how to connect and use the library, so I’m not going to go through that here. This is also where the second major snag comes up.

Issue 2

Google recommends not using libraries in production code because they can negatively impact script runtime. This is especially apparent on the first slide in this script – it times out frequently (3 of 5 times?) and I’m not sure why. Subsequent slides come in between 20-50 seconds, which isn’t terrible considering the image size being uploaded. But if you’re a fast talker, this won’t be able to keep up unless some kind of queueing is implemented.

To do this without a library, the OAuth flow needs to be incorporated into the main script. It’s beyond my ability at the moment, so if you’d like to contribute that component and help this run as a standalone app, you can submit a pull request on the GitHub repo.


Sending the tweet is actually a two-step process. First, the slide thumbnail is posted and then the media_id assigned is attached to the tweet. This is all done on the Google Apps Script side of the code to account for security considerations.

Google’s thumbnail is generated and hosted on their server, so I used the UrlFetchApp to request the content as a blob. This is serialized data that can be passed on to Twitter’s image hosting service.

Once the image is uploaded, we can take the returned media_id string and attach it to a tweet. The Twitter API object for a tweet has a number of options, but all I’m using is status (what you’re saying) and media_ids, which takes the image ID string from the upload.
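
In outline, the two-step flow looks something like this, using Hawksey's TwtrService library for the upload and post calls (treat exact method names and signatures as placeholders):

```javascript
// Build the tweet object Twitter expects: status text plus the
// media_id string returned by the image upload.
function buildTweet(status, mediaId) {
  return { status: status, media_ids: mediaId };
}

// Step 1: fetch the slide thumbnail as a blob and upload it to
// Twitter's image host. Step 2: attach the returned media_id to
// the status update. All of this runs server-side in Apps Script.
function tweetSlide(thumbnailUrl, status) {
  var blob = UrlFetchApp.fetch(thumbnailUrl).getBlob(); // step 1
  var media = TwtrService.uploadMedia(blob);
  return TwtrService.post('statuses/update',            // step 2
      buildTweet(status, media.media_id_string));
}
```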

Right now, the status string is hard-coded into the script. It could be set via the Apps Script UI tools if this gets turned into an AddOn at some point, assuming I can speed it up.

Issue 3

Twitter requires a high degree of authorization for posting. I tried implementing the OAuth flow without using a library to speed up performance, but I couldn’t get it to work. TwtrService stores the app credentials for the OAuth flow and has both an upload and a post method that make the tweeting easy. But performance varies from 20 seconds to as long as 300.


The app works, and it was exciting to put together and watch run. It’s functionality that would be great in a number of situations, and implementation will only get better as the Slides API improves. I’d love to work with someone with more experience to speed up the API calls significantly by including all the necessary authentication in the main script rather than in a library. If you’d be willing to contribute, the source code is on GitHub.

If you’d like to play with it, you can either copy all the files from GitHub or copy and paste the separate embeds here into an empty project. Add postTweet and getThumbnails to the code below.

Mountain Bluebird flickr photo by Andrej Chudy shared under a Creative Commons (BY-NC-SA) license

Parsing a JSON log feed with Python

I have several Google Sheets doing several things on their own through Google Apps Script. I’ve started to make it a habit that each action is logged to a separate, isolated spreadsheet so I can pop in and look for error messages in one place rather than several.

This poses a small problem. I have to actually remember to open that sheet. Usually, something goes wrong, and then I remember to check the logs. I wanted to have something more up to date that I could glance at without too much effort.

You can get Google Sheet data as JSON which is handy in a number of contexts (here and here are two examples from my own work). It’s not as straightforward as tagging .json on the end of the URL (though that would be sweet) but the process isn’t hard. To get the data, this post details how to publish your sheet and find the feed.

Once the dataset was live online and updating regularly, I needed to decide how to get it. I use GeekTool on my desktop, so I decided to use a Python script and the Requests library to gather and process the feed.

I put this into a Geeklet on my desktop and voila!

Give it a try with your own sheet. You can run it in your terminal to get a printout of the last 5 entries of the sheet. The JSON output from Google is really weird, so it helps to put it into a prettifier to make it more readable before parsing keys.

What did I miss? What would you do differently?

Featured image, Logs, flickr photo by CIFOR shared under a Creative Commons (BY-NC-ND) license

Ideas for Apps Script in Google Slides

Google Slides got a big update from Google this week, notably the inclusion of AddOns and Apps Script functionality. The UI updates are nice (grid view, skip slide, etc.) but the real power is the extensibility of Slides through GAS, which allows for connections beyond the immediate audience.

Some ideas I know I’m going to play with:

– Auto tweet images of slides through a presentation to a hashtag

– Update slides with data/charts from a spreadsheet so data is always up to date

– Auto-generate photo slideshows from a Drive folder of images

– Memes. All the memes.

Slide projector flickr photo by Yair Aronshtam shared under a Creative Commons (BY-SA) license

Date Countdown in Sheets for Triggers

I have a Google Sheet which displays all upcoming PD in the district. It also tracks registrations for people through a web app. I’ve documented that in other places, so I want to focus on an easy method of calculating days until an event to use as a script trigger.

This started because teachers were looking for an automated email reminder a few days before the workshop so they didn’t forget to come. I’d rather they get a Calendar invitation when they register for the event, but I ran into some authentication snags, so that aspect is back burner for the time being. Currently, the sheet is using today’s date and the date of the workshop to trigger an email four days in advance.

Calculating the “days remaining” is pretty easy. The cell formula is:
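
Something along these lines, assuming the workshop dates live in column B starting at row 2 (the column references are my assumption):

```
=ARRAYFORMULA(IF(ISBLANK(B2:B), "", ROUNDDOWN(B2:B - NOW())))
```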


There are several components of this:

– ARRAYFORMULA applies a formula to a range of cells rather than a single cell. This saves me from having to copy the formula down to each new entry.

– ISBLANK checks for data in a cell. Because it’s inside ARRAYFORMULA, it looks at the cell in the matching row. If the cell is blank, TRUE is returned.

– ROUNDDOWN rounds a result to a whole integer. This is useful because the subtraction taking place inside the formula returns a large decimal, and an integer is easier to test in the script.

– NOW gives the date and time when the sheet is updated. Any time you make a change, NOW is recalculated.

– The IF conditional wraps everything up and keeps the sheet clean. The syntax is IF(_logical test_, _value if true_, _value if false_). So this reads, “If column B for this row is blank, show nothing. If it is not blank, calculate the difference between the PD date in column B and NOW.”

The core of the function is the count down calculation. For instance, today is Friday, September 8. Subtracting it from a date in the future like Monday, September 11, returns a whole integer: 3. I can test for that integer (or any integer) in a simple script.

This is particularly helpful with timed triggers in scripts. I have a utility script wrapped in a conditional:


if (date === 3) {
  // do something here
}

If the condition isn’t met in the script, nothing happens and I don’t get a failure email notification. This is also nice because if I want to adjust the timing, the trigger can stay the same (daily, for instance) without changing the codebase.
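
Put together, the daily trigger target might look like this (the sheet name and column positions are illustrative, not from my actual script):

```javascript
// Runs on a daily time-driven trigger. Column C holds the
// countdown formula's output; rows hitting the target get a reminder.
function sendReminders() {
  var sheet = SpreadsheetApp.getActive().getSheetByName('PD');
  var rows = sheet.getDataRange().getValues().slice(1); // drop header row
  rowsDueIn(rows, 3).forEach(function (row) {
    // e.g. MailApp.sendEmail(row[3], 'PD reminder', 'See you soon!');
  });
}

// Pure helper: keep rows whose countdown value (column C, index 2)
// equals the target number of days.
function rowsDueIn(rows, days) {
  return rows.filter(function (row) { return row[2] === days; });
}
```

Because the days-until check lives in the script, the trigger itself can stay a simple daily run.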

Outsourcing, EdTech, and 1986

Outsourcing education doesn’t look like robots taking over our classes. It happens when we willingly turn over the tasks of teaching to machines without thinking through implications or repercussions thoroughly.

Computers are really good at a lot of things. Media companies are also really good at a lot of things. When the two really teamed up in the late 90’s/early 2000’s, with the Internet becoming more consumer focused, there was a big shift in the way the Western world – Americans in particular – interacted with media. The move from producer to consumer started in the 50’s as television became more ubiquitous and speed-of-light imagery took over our visual world. Information was available instantly through the telephone, captured on film, and broadcast to us in the comfort of our homes.

These films ultimately made their way into the classroom, and mixed media instruction, the precursor to “edutainment,” became an expectation. With the computer revolution of the 1980’s and the shift of entertainment into all areas of life (political and social, in particular), education soon followed suit with educational films and games that focused on the entertainment aspect and not so much on the educational component. The teacher was starting to be outsourced because content should now be decontextualized and consumable in a comfortable amount of time.

The growth of EdTech in the late 2000’s has pushed this boundary even further. Teachers are no longer consumers – they’re “ambassadors,” focused on serving students with some perks on the side. Content can – and should – be outsourced because information is available in all of our pockets. Why should I, the teacher, be focused so much on the curriculum when I need to focus on the experience my students have?

Neil Postman paints the early days of edtech in Amusing Ourselves to Death. It’s stark, reading this book 21 years after its original publication. Postman devotes an entire chapter to the trend of entertainment-as-king in education and his predictions ring true.

Yes, teachers are undervalued, scapegoated, undersupported, and treated poorly all around today. Our classes are large, our schools and policies can be suffocating. We lack resources, time, and, frankly, pay to accomplish the impossible tasks set before us. Yet we show up every morning to continue the work. (I won’t raise teaching to the realm of nobility because that comes with its own set of problems.)

Outsourcing is subtle and often overlooked. We want lessons to be memorable. We want to provide the best experience possible for our students. There is nothing wrong with that goal. The problems come when the means to achieve the goal sink to places which ultimately continue the cycle of devaluation of the profession.

Highlighted recently, the frequency of product “ambassador” programs which throw perks to teachers in exchange for recommendations (and even students as guinea pigs) has grown exponentially. Companies promising to revolutionize learning are taking advantage of a cultural bias against teachers while telling themselves they’re providing a service.

We’d be well served to remember that if software is free, you, and by extension your students, are the product. The freemium model is dead, and to stay open, these companies need customers. Providing a few all-star, typically already privileged teachers with resources in exchange for “some feedback on a product” is an attempt to hide what is really happening: those teachers become willing participants in corporate strategy and market gains. Why focus on perks? If the value a teacher ambassador brings is so great, pay them for their insight and time.

From Amusing Ourselves…

…We delude ourselves if we believe that most everything a teacher normally does can be replicated efficiently by a micro-computer. Perhaps some things can, but there is always the question, What is lost in the translation? The answer may even be: everything that was significant about education.

Outsourcing ourselves in the name of efficiency or engagement sells short the role of the teacher. Focusing on the authentic “as-is” nature of learning is always a better option than the more efficient, computerized, compromised classroom. Recognizing that edtech companies and teachers have different goals is also important. Companies exist and function to make money. Period.

Teachers exist and function to make better people in the world.

Postman called this out in 1986. No one listened. 21 years later, are we ready to listen?

This post was written immediately after finishing Amusing Ourselves to Death. I highly recommend picking up a copy to read.

Featured image is Improving Kids flickr photo by cogdogblog shared under a Creative Commons (BY) license

Digital Teaching and Learning is Great (Until It’s Not)

The nuances of digital teaching and learning are often lost on Twitter and in off-the-cuff blogs. Posts long enough to explore the finer points of teaching and learning today are often skipped over as “too academic” or “too heady” – apparently not written for teachers “in the trenches.”

I’ve moved fully into a coaching position with my district. One of my primary goals in this role is to help teachers digest and process what it means to teach effectively, equitably, and responsibly in a digital world.

We cannot separate ourselves completely from bits and bytes. The Internet has gone from being encased in phone lines to flowing in, around, and through us all day, every day. The Internet used to be hard to use. Now, it’s an expectation that it’s just there. The change in availability and usability means the user base increases exponentially while understanding of the mechanisms for use decreases.

With the explosion of online “learning media,” it seems that teaching can be boiled down to engaging videos and the right entrepreneurial mindset. The personal branding narrative of edu-Twitter and edtech in general is a byproduct of the deconstruction and dissolution of structured debate and discussion about solid pedagogical practices.

Intentionality in Instruction

Popular posts in the edu-blogosphere inevitably come back to teachers leaving the “sage on the stage” role to become a “guide on the side.” The sentiment rolls off the tongue and it makes us feel good about making connections with students. But, it lacks the nuance necessary to have any kind of significant conversation about the differences between didactic instruction and active learning.

We have set up a false narrative. I do not have to remove myself as an expert in teaching and learning in order to make connections with students or allow them to explore their interests. The guide-as-greater narrative attempts to make the case that we are partners in learning, but not without the devaluation of the profession as a whole. As a result, schools are throwing students into virtual credit programs led by a single teacher at a dashboard and equating it with the in-person experience down the hall.

Sherry Turkle calls this out in Reclaiming Conversation in a chapter focused on changes in education practices which have shifted as a result of prolific digital resources. She doesn’t go so far as to say that Internet-ready tools are destroying a generation but she does call for specific behaviors to change on the part of developers and users alike.

Her most poignant observation was calling out the difference between the natural, as-is instructional setting and the digital, as-if representation. When students are working in the same space – conversing and collaborating with one another – they are experiencing community and content in a real way. “The medium is the message,” as McLuhan said, and when we connect teaching and learning with very human interactions, the content gains new relevance.

As a teacher, it’s still your responsibility to construct a learning environment where context lends relevance to the content, whether it’s through constructionist work or through direct instruction. Without intentional preparation and implementation, digital or tangible, instruction suffers.

Finding the Proper Place

Andy Crouch offers insight on technology being in its proper place in his book, The Tech Wise Family. He opens with a story about blitzkrieg cleaning when his children were young. Anything out after 10 minutes was either donated or trashed. (He tells a story about dangling favorite toys over the donation bin to speed things along.) The point being that a house is out of order when things are not in their proper place.

In the classroom, we make proper place decisions about everything, it seems, except for technology. Since we have it, the edu-Twitter cultural push is to use it all the time. Need to do an assessment? There’s an app for that. Want to encourage collaboration? Use this website. Ditch your books for Google because “they’re out of date the minute they’re printed anyway.” The suggestions handed to teachers starting out on this path are wholesale and without nuance, and that’s hurting educators across the world.

Technology is not taught in its proper place, and that is a problem. Just like intentional instruction, technology use has to be hyper-intentional. We’re seeing this right now as we move into year one of a distributed iPad rollout in our district. The iPad (or Chromebook or Surface tablet or Linux machine) can be a powerful tool for learning but only when it is in its proper place. Students need to be taught to use the hardware as an instructional aid. Teachers need to be taught how to design units and lessons which intentionally place technology in spots where it can be used for powerful purposes. It requires a cultural shift for all parties.

For teachers, it is much more than taking a plunge into paperless classrooms, making sure they’re a part of every Twitter chat they can get in on, and starting a blog. It’s important to remember that we are training future adults – we have to keep the long game in mind. Using some gimmicks now to keep students “engaged” for the day is robbing them of a life skill which can help them function as adults. Some growth may come through chats and blogging (my own growth included those things) but not without recognizing that they aren’t required for change to happen. Instead of making flat recommendations about what people should do, we need to be approaching these conversations from our personal perspectives, telling stories of what worked – and, more importantly, what didn’t work – as we grew.

Reading and Writing for Nuance

Another component of my work is staying on top of what teachers in the district are reading and talking about. I noticed our central library had a number of copies of Ditch That Textbook by Matt Miller. I grabbed a copy so I would be able to carry on a conversation with people who have read it.

If there is a book that exemplifies a lack of nuance, it’s Ditch. Much of the book can be boiled down to:

– Join Twitter.

– Use Google Apps [G Suite] religiously.

– Talk about how awesome you are now that you’re on Twitter and G Suite.

Each chapter up until Section 4 – page 197 – reads like a blog post pushing a thin implementation of tech for poor reasons. For example, much of the first section talks about the power of being paperless without really diving into instructional effectiveness. As I read, I highlighted in the margins simple suggestions written as if they were the best solution to a particular problem. My intent is to go back through and identify instructional situations where those suggestions are relevant, to give context to teachers looking for help in school.

The difference in tone between books all taking on the same topic is stark. Markets segment (education is a market, after all, and edtech is a particularly lucrative submarket), and these books speak to their particular audiences. After three months, I’m focusing on ways to bring teachers from the realm of edtech sex appeal into technology-rich instruction with fidelity to nuanced practice.

Making the Transition

I realize that some of the judgements I’m making are not fair at face value. I’m also very aware that changing practice takes a long time, especially if you’re searching for methods to change on your own without support. But, I’m not convinced that the path most teachers follow through the edtech regions is the best, or only, one.

I don’t disagree that the more exciting changes come from trying apps and tools, because they show off well. Changes in philosophy are harder to show in a tweet and even harder to process and make essential in our day-to-day work. As a coach, it would be a disservice not to push teachers toward the philosophical shift in everything I do, even through the lens of using a particular tool more effectively.

This is the spot when I would offer a handful of poignant, but not heady, methods for making the shift.

I don’t have any.

This is an intensely personal process. It requires reflection and relationships. The goal for teachers, in any case, is the same: improve teaching using resources intentionally.

Edtech preaches a wholesale shift away from the tangible in favor of the digital. Deniers push back with a deep-seated reluctance to discuss new ideas or methods. I’m convinced that proselytizing either approach, while good for personal branding and making a name for yourself, is ineffective in the long run. Reading with a critical eye, looking for statements in absolutes and ultimatums, and thinking beyond short-term gains make the difference.

Featured image is abstract green flickr photo by dr.larsbergmann shared under a Creative Commons (BY-NC-SA) license