March 2021

Last year, amid the pandemic, reports emerged that some thirty candidates in the Georgia State Trooper academy were accused of cheating on an online exam. It resonated across the nation because it mirrored the behavior many saw taking place at our universities and colleges. The stakes were much higher, though. The candidates were disciplined severely, with many dismissed from the program, and the director and deputy director of the Georgia Public Safety Commission both resigned. In addition, because the cheating occurred on a test involving the writing of speeding tickets, and the cadets had since written almost two hundred tickets, most of those violations were thrown out.

The article in USA Today described the cheating behavior in detail:

“Investigators found the cadets utilized written or typed notes, received direct assistance from another cadet (test answers), utilized test questions and answers posted by a cadet on the GroupMe online application, and queried an internet search engine for test questions and answers,” the department said.

DPS added that two Snapchat group chats were created, which included members of the class. There were 33 members of the 106th Trooper School, the department said — before Wednesday, one had already been dismissed, another had resigned, and a third was on military leave.

According to the DPS, the allegations were:

  • Everyone in the 106th Trooper School cheated on the Speed Detection Operator Exam
  • A cadet at the time had helped other cadets with their online exams
  • Three cadets at the time had assisted another cadet with passing his exam
  • A training instructor had printed a written makeup exam and permitted two cadets who had failed the exam to return to their dorm rooms with the makeup exam and turn it in the next day

Stop me if you’ve heard this one. After all, the behavior described in the reports sounded similar to cheating behavior we saw across higher education last year. It was a sad story, one that underscored the assault on integrity in a year when remote instruction had necessarily become the norm.

But a funny thing happened on the way to the courtroom.

In January, nearly all of the troopers were cleared by an investigation from the Georgia POST council, an independent council of state government. To be honest, I was surprised. How could the Georgia Department of Public Safety get the investigation so wrong? I read through nearly every article I could find about the POST investigation and the clearing. After all, perhaps there was some takeaway from their investigation that we should consider in higher ed. Perhaps there was something we could use as a case study for our students and for integrity professionals. As I read article after article, they sounded eerily similar. They sounded, frankly, like they had come from the same press release.

Most of these reports reiterated the same blurb about POST’s conclusion: that while the cadets clearly had collaborated without the permission of the instructor, they had not intended to cheat. In summary, it was all a big misunderstanding, despite the fact that the instructor had said, repeatedly, that cadets were told they were not to work together. It’s as if a college professor told her students they couldn’t share notes, talk, or work together while completing an exam, found out they did, and yet was told by a reviewer that the reviewer could see inside the students’ minds and ascertain they hadn’t intended to cheat. Worse yet, it appeared that the cadets were simply asked if they had intended to cheat. Considering that a lifelong career was on the line, the answer they gave was unsurprising.

I wanted to read more, and so I kept looking for one of the local news agencies (11 Alive News or even CNN) to link out to a summary report. Most referred to a report, but none of them appeared to have laid eyes on it.

I reached out to an investigative reporter at a local news station and asked if I was just missing it, and if he could point me in the right direction. He responded by saying that, under normal circumstances, he would have run a parallel investigation and requested the report under the open records act. These were not, he said, normal circumstances. Keep in mind, we were only twenty days away from the riot at the US Capitol and the nation held its breath waiting to see how the inauguration of President Biden would transpire. In other words, I could understand why he didn’t have the bandwidth to chase down an old story, especially one that had presumably been cleared by an independent agency. He gave me POST’s contact info and suggested that I request the documents. You know, all 8,000 of them.

So, I did. At least, I reached out to POST, and asked about getting the summary report. This should be easy, after all. It was referenced in the first paragraph of this piece by the AJC.

Turns out, it doesn’t exist. I asked twice. But I was told by a very nice, but very terse public relations person that I was welcome to all 10,000 pages (which she offered to put on a disk for $20), but no summary report would be included. I’m ordering that disk. It turns out that the most comprehensive analysis of the Georgia State Patrol and POST’s investigation was completed by the Atlanta Journal-Constitution, who apparently reviewed the 8,000 pages available to POST and the 1,500 pages provided by the GSP investigation. Their final conclusion? “What mattered to one set of investigators,” they said, “didn’t quite register with the other.”

Now, I’m not sure what any of this adds up to. The executive director of POST, Mike Ayers, must have been referencing something when he made this statement. But did he, alone, read the 10,000 pages covering the actions of 32 fired state troopers? And if not, how did he or the agency overseeing this investigation come to the questionable conclusion that, although the trooper cadets did cheat, their intentions were not to cheat, therefore absolving them?

I’m not an investigative reporter. I was a hearing officer for academic misconduct cases for four years, and in that time a student’s “intent to cheat” was never a standard we strove to meet. After all, we can’t peer into someone’s mind and discern their intentions. We have an evidentiary standard and if the evidence meets that standard, the student is found responsible. I understand that law enforcement has a different standard, but doesn’t that underscore how odd this finding was? After all, if you don’t mean to kill someone, but do so anyway, you’re charged with manslaughter. Likewise, if you didn’t “intend” to cheat, but did so, shouldn’t you hold some culpability? It feels weird having to explain this to law enforcement professionals and those overseeing them, but here we are.

There are a lot of reasons why the state of Georgia and POST would have embraced this finding. The case was ugly and it made national news. More troubling is the methodology POST seems to have employed. Despite the fact that the instructors stated they had made it clear to the cadets that they were not to collaborate (online or otherwise), POST established that the candidates didn’t have the intention to cheat by…asking them if they had intended to cheat. Unsurprisingly, all but one cleared that low bar.

Why does it matter?

Maybe they did sweep this all under the rug, or maybe there is a perfectly reasonable explanation for why these cadets (students) were cleared without the kind of documentation or reasoning people who normally deal with integrity incidents would expect to see.

However, this story does provide a window into the importance of transparent oversight. Without being able to follow the state’s reasoning or the evidence upon which they based that reasoning, questions will remain. Are these cadets truly innocent or were they the beneficiaries of an investigation designed to find them (and their superiors) innocent? The main takeaway for those of us doing academic integrity work is to make sure that our decisions and the methods we use to arrive at them are understandable to the communities we serve. They must also survive the scrutiny of those asking serious questions about what we do. Without that, we’re a bigger punchline than anything Steve Guttenberg could deliver.

The area of math assessment is a rapidly evolving one, and the way educators think about academic integrity in this area needs to evolve with it. Apps that not only solve math problems but also show the steps taken towards the solution are readily available to students. Meanwhile, “study help” websites allow students writing tests and exams outside of their schools to outsource questions with fast turnaround times.

Are current academic integrity policies equipped to deal with this aspect of remote learning? At which point does the use of these technologies represent cognitive offloading (Dawson, 2021)? In considering ways to address this, what are the effects on the stress levels of students (Eaton & Turner, 2020)?

Join the multidisciplinary Learning Commons team for an interactive session where academic integrity is anchored in teaching and learning (Bertram Gallant, 2016). Participants will leave this session with an improved understanding of math applications and their uses, of how to define cheating behaviours for specific assessments, and of assessment from the student perspective.

Presenters: Lynn Cliplef (Faculty Development Coach), Craig Dedrick (Learning Strategist), Caitlin Munn (Quality Assurance Specialist), and Josh Seeland (Manager, Library Services & Academic Integrity).

Date: Wednesday, April 7th, 2021

Time: 12:00 p.m. – 1:30 p.m. (CST)

Capacity: Limited to 40

Platform: Zoom

To register, email: 


Bertram Gallant, T. (2016). Leveraging institutional integrity for the betterment of education. In T. Bretag (Ed.), Handbook of academic integrity. Springer.

Dawson, P. (2021). Defending assessment security in a digital world: preventing e-cheating and supporting academic integrity in higher education. Routledge.

Eaton, S. E., & Turner, K. L. (2020). Exploring academic integrity and mental health during COVID-19: Rapid review. Journal of Contemporary Education Theory & Research, 4(1), 35–41.

Josh Seeland is the Manager, Library Services & Academic Integrity Officer at the Assiniboine Community College (ACC) Library in Brandon, MB, Canada, where his primary duties include research initiatives and library instruction/outreach at ACC locations across Manitoba. He is a member of the Manitoba Academic Integrity Network (MAIN) and chairs ACC’s Academic Integrity Advisory Committee. Seeland holds a Bachelor of Arts in History and Philosophy from the University of Manitoba and a diploma in Library and Information Technology from Red River College.

During the COVID-19 pandemic I have evolved from being an academic integrity advocate to being an academic integrity activist. I have learned that being an activist does not require being an antagonist. Some activism is big, bold, and public and other kinds are quiet, discreet, and cooperative. Standing up for what matters is important no matter how you do it.

In a book chapter I am writing with Dr. Natasha Kenny for Academic Integrity in Canada (forthcoming, 2021), we discuss how academic integrity work is often invisible. It involves conversations with individuals, small groups, and big committees. These conversations can be unscheduled and informal or they can be formal and demand a ton of preparation, including reports and slide decks. All too often, these reports are internal documents that never become publicly available. I expect many schools have collections of such reports and documents that never see the light of day. These are the invisible artefacts of integrity.

In academia, the work we do must be visible in order to receive recognition in regular performance reports and in applications for promotion and tenure. But much of the work that many of us do as academic integrity leaders, researchers, and activists is entirely invisible. I am sure I am not alone in becoming frustrated beyond words when administrators and colleagues demand “evidence” for aspects of this work that are in a pre-evidentiary state. When I – and others – started becoming vocal a few years ago about the ways in which contract cheating companies blackmail students, we were mocked by some colleagues as sensationalist and dismissed by others who insisted that unless we had “evidence” we had no business making such claims.

When Yorke et al. (2020) published their article on blackmailing of students by contract cheating companies, the academic integrity community finally had evidence to substantiate what we had been talking about for years. When Australia’s national quality assurance body for education, the Tertiary Education Quality and Standards Agency (TEQSA), developed an infographic to help promote awareness about how contract cheating companies blackmail students, that further legitimized the conversation. Over time, we will gather more evidence and have more conversations about the insidious practices of contract cheating, but the underlying issue of critics shutting down conversations about important issues due to lack of “evidence” remains problematic.

During the Black Lives Matter movement, a number of academic integrity advocates began having conversations about how particular student groups are over-represented in academic misconduct reporting. This is a topic that Tracey Bretag addressed in her workshop, “Academic Integrity and Embracing Diversity” when she joined us at the Canadian Symposium on Academic Integrity. There is some evidence from other countries that students from particular backgrounds get reported more often for misconduct than others, but as yet, we have not collected data on this in Canada. Let’s get one thing straight: Just because we have not yet collected data on a problem does not mean that the problem does not exist.

In 2020, I produced a discussion paper about why we need more data relating to student misconduct to better understand how and when students from particular groups might be over-represented (Eaton, 2020). Critics (particularly in my own country of Canada) emerged from the woodwork to demand “evidence” that there was injustice and implicit bias with regards to which students get reported for misconduct. I am paraphrasing, but the general gist of the comments was, “until you can prove to me that international students do not cheat more than domestic students, then I don’t believe you.” I carefully try to explain that those who get reported for misconduct may not include everyone who commits misconduct. The critics are not interested. Their myopia prevents them from entertaining the idea that a problem might exist even in circumstances where formal data are not yet available. Once again, we find ourselves in a pre-evidentiary state.

Insisting on having “evidence” for invisible work is frustrating, and at times it seems downright ludicrous. Many of us who work in academic integrity research are working as fast as we can to conduct research and gather the necessary data. As I have pointed out in an article I co-authored with a graduate student a few years ago, in Canada, very few researchers have successfully received any federal funding to study these questions (Eaton & Edino, 2018). I will keep applying for federal research grants to study these topics. Until then, I do the work anyway, because it is important and urgent.

For me, doing academic integrity research is not an ideologically agnostic endeavour. This work is not values-free. It is entirely values-laden. When we study ethics in educational contexts, we do not do so merely as an intellectual endeavour. We are not dispassionate, detached, or objective. In many cases, we are passionate not only about the work, but about the change that can result because of the work. For many of us, academic ethics inquiry is intertwined with advocacy. We do this work because we care deeply about our students, our colleagues, and the systems that are supposed to support us all.

I have had many sleepless nights mentally preparing for conversations about academic integrity and ethical issues in education, particularly during the pandemic. These conversations may happen quietly or behind closed doors, leaving no trace that they ever occurred. The impact of the conversations can change the trajectory of how individuals or organizations act. Just because work is invisible does not mean that it does not have impact. And in the world of academia where we are under constant and unrelenting pressure to show the “impact” of our work, much of this work will continue to go unrecognized by our superiors. But we do the work anyway knowing that sometimes the invisible efforts are just as effective – if not more so – at creating lasting change.

Dr. Leslie Reid, the University of Calgary’s Vice Provost Teaching and Learning, has commented to me more than once that change happens “one conversation at a time”. During this pandemic, my identity as an academic integrity activist has definitely evolved. I recognize that I must undertake the invisible work in addition to – not instead of – the visible (and quantifiable) work such as research articles, book chapters, books, conference presentations, and so on. But like so many others who engage in this work, I know that the invisible work matters.

I will be an activist on my own terms: having one conversation at a time, sometimes publicly, but also (and often) privately. But no matter how those conversations happen, they matter.

Eaton, S. E. (2021). On Becoming an Academic Integrity Activist: Reflections on the Impact of COVID-19 on Scholarly Identity. University of Calgary.


Bretag, T. (2019). Academic integrity and embracing diversity. Workshop presented at the Canadian Symposium on Academic Integrity, Calgary, Canada.

Eaton, S. E. (2020). Race-based data in student conduct: A call to action. Calgary, Canada.

Eaton, S. E., & Edino, R. I. (2018). Strengthening the research agenda of educational integrity in Canada: A review of the research literature and call to action. International Journal of Educational Integrity, 14(1).

Kenny, N., & Eaton, S. E. (2021). Academic integrity through a SoTL lens and 4M framework: An institutional self-study. In S. E. Eaton & J. Christensen Hughes (Eds.), Academic integrity in Canada: An enduring and essential challenge. Springer.

Tertiary Education Quality and Standards Agency (TEQSA). (2020). Contract cheating and blackmail. Retrieved from

Yorke, J., Sefcik, L., & Veeran-Colton, T. (2020). Contract cheating and blackmail: A risky business? Studies in Higher Education, 1–14. https://doi.org/10.1080/03075079.2020.1730313

Related Reading

Eaton, S. E. (2020). Academic Integrity During COVID-19: Reflections from the University of Calgary. International Studies in Educational Administration, 48(1), 80-85. Retrieved from

Eaton, S. E., & Turner, K. L. (2020). Exploring academic integrity and mental health during COVID-19: Rapid review. Journal of Contemporary Education Theory & Research, 4(1), 35-41. Retrieved from

Curtis, G. J., Slade, C., Bretag, T., & McNeill, M. (2021). Developing and evaluating nationwide expert-delivered academic integrity workshops for the higher education sector in Australia. Higher Education Research and Development. https://doi.org/10.1080/07294360.2021.1872057


This research accompanied the rollout of national academic integrity workshops (19 in total) funded by the Tertiary Education Quality and Standards Agency (TEQSA) and facilitated by a small team of academic integrity researchers and practitioners, led by Professor Tracey Bretag. These workshops, held in late 2019, added to previous sector-wide academic integrity initiatives and preceded the development of a toolkit of resources, freely available on the TEQSA website. The workshops aimed to capitalise on existing academic integrity knowledge and practice across universities and independent higher education providers and to encourage a collaborative culture across the sector. Four hundred and fifty-two participants attended the three-hour workshops across 17 different locations. To the authors’ knowledge, this is the first investigation of its kind to assess the impact of academic integrity workshops across a nation.

Research Approach

The goal of the research was to quantitatively measure participant learning related to the workshop content, rather than just participants’ reactions to the workshop itself; only one question related to participant reactions. Short pre- and post-workshop surveys containing the eight items outlined in Table 1 were given to participants, with a response rate of 75.7%. The surveys focused on awareness of key issues discussed in the workshops, the confidence participants had in their institutions’ academic integrity approach, and whether there were any changes across the items pre- and post-workshop.

Table 1: Awareness and confidence items used in the workshop surveys

Item No. | Awareness of… | How confident are you that your institution’s…
1 | The Model Statement of Commitment to Academic Integrity | Strategies will be able to mitigate academic integrity risks
2 | Australian and international academic integrity research | Staff can detect contract cheating
3 | Your institutional policies and procedures | Contract cheating responses are aligned with those recommended by the TEQSA Good Practice Guide
4 | Ways to promote academic integrity | Technology effectively supports academic integrity



The research found that participants’ awareness increased for Items 1, 2, and 4, particularly concerning academic integrity research (Item 2). The aggregated data suggest that participants benefit from academic development workshops led by expert facilitators with first-hand knowledge of the topics and strong teaching delivery skills. Responses for Item 3, ‘Awareness of your institutional policies and procedures’, however, indicated decreased awareness from pre- to post-workshop. Post-workshop reflection by the facilitators suggests that as participants progressed through the workshop content, they may have realised that their institutions’ policies and procedures were lacking.

Confidence levels were the second area investigated. Only a small increase in confidence was seen, and it was influenced by institutional demographics such as size and type. In particular, the research found that post-workshop confidence levels were higher among participants from larger institutions than among those from smaller institutions, even though the former began the workshop with a significantly lower average confidence level.


The findings provide empirical evidence based on pre- and post-workshop changes in both awareness and confidence concerning the workshop content. Further, the study shows the benefit of one-off workshops in which participants can respond to the content by reflecting on their institution’s progress in addressing academic integrity issues and on the policy or practice areas that still need improvement. This research adds to the limited evidence on the efficacy of academic integrity professional development workshops for higher education staff (both professional and academic) and lends empirical support to the argument that such workshops are more effective when they form part of an overall sequence of themed academic development activities.


This study shows that one-off workshops, led by experts in the field and forming part of a sequence of initiatives, can make substantial advances in participants’ awareness of, and confidence in, academic integrity foundations and practices.


The author thanks TEQSA for its financial support of this work, and acknowledges the expertise of her colleagues, Professor Tracey Bretag, Dr Guy Curtis, and Dr Margot McNeill, in developing and facilitating the workshops and toolkit of resources and in undertaking this research. The author also acknowledges the loss of Tracey, who passed away in 2020. Her contribution to the sector and the field of academic integrity is greatly missed.