The recently released Family Link app from Google provides parents with several helpful measures for monitoring and restricting the device use of their children. Parents can approve and deny various apps from being used, set bedtimes at which point the phone or tablet will be inaccessible, and monitor how much time their kids spend using each application.
Family Link is undeniably a powerful tool with great benefits. However, it raises a question about how safety should be managed on a larger scale for K-12 schools and districts. Should schools and districts focus on measures that restrict functionality for students, or should they focus on proactive safety measures?
Broadly speaking, there are two ways to make technology safe for students. The first is by creating limitations or restrictions that prevent students from accessing or creating inappropriate content. One example of this is the “walled garden” approach to student safety—creating limitations on whom students can communicate with, as well as what they’re able to send and receive. The second way to make technology safe for students is by creating ways to receive early warning detection of inappropriate use in real time.
Neither of these offers a complete solution to the issue of student safety. There are obvious needs for access restriction, which is why we always insist that Gaggle Safety Management should not be seen as a replacement for your network security. But it’s wrong to think that limiting or restricting student communications with one another, or even with individuals outside the school or district, is going to be a sufficient answer to the risks and threats they face inside and outside of the classroom.
The Family Link app is a limitation-focused tool. It allows parents to prevent certain apps from being downloaded, as well as prevent device use during certain hours of the night. These limitation-focused features need to be paired with a proactive means of ensuring student safety. Based on our history of reviewing student content, here are some reasons why limitation-focused measures don’t offer a complete solution.
When Gaggle Safety Representatives discover inappropriate communications or content, in most cases the technology was not the cause of the issue, but merely the delivery mechanism or messenger. For instance, cyberbullying that occurs within G Suite for Education is also likely occurring outside of Gmail or Google Docs. The instance within G Suite tools is actually what gave you the ability to discover it.
The same goes for mentions of self-harm or suicide. Students are very transparent with one another online, and they share these thoughts. Removing the technology prevents you from being able to detect them and intervene in a timely manner.
Creating limitations does not support digital citizenship. Good citizens do the right thing because they know it’s right, not because they’re forced to. When a group of four-year-olds cannot play with a toy without fighting, it could very well be appropriate to take the toy out of the equation.
As children get older, however, it is important to adopt policies that actually encourage good behavior and a positive moral attitude toward others. For this reason, removing technology that could be used inappropriately does nothing to positively shape the moral horizon of your students. They do not become good digital citizens simply by having limitations imposed on them.
Some limitations will always be necessary. But be sure to remember that removing access to tools will not teach students how to use them appropriately. Implement tools that focus on early warning detection of inappropriate behavior if you want to create a safe environment that also promotes digital citizenship.