Reflection on Week 3 of LTEC 6512 - The Ethical Challenges of Using Technology in Teaching and Research
- Marcus D. Taylor, MBA
- Feb 5
- 4 min read
Updated: Feb 13
Introduction: Wrestling with Ethical Questions
As someone who uses technology in both teaching and research, I often find myself in a tug-of-war between excitement and concern. There’s no doubt that digital tools make education more accessible and research more efficient, but at what cost? The more I rely on technology, the more I notice the ethical challenges that come with it.
I’ve had to ask myself some tough questions: Are my students truly benefiting from technology, or are some of them being left behind? Am I encouraging critical thinking, or are we becoming too dependent on AI? What happens to the massive amounts of student data being collected? These are the thoughts that have been weighing on me, and I think they deserve deeper reflection.
1. The Digital Divide: Are We Leaving Students Behind?
I used to assume that because we have access to technology, everyone can benefit from it equally. But that’s just not true. I’ve had students struggle to complete assignments because they didn’t have reliable internet at home. Others couldn’t afford the latest software or devices required for research. It made me realize that technology, rather than leveling the playing field, sometimes makes existing inequalities even worse.
What I’m Learning:
Access isn’t universal. I need to be mindful of students’ different financial situations when assigning tech-based work.
Open resources matter. Using free, high-quality educational materials can make a big difference.
Digital literacy is a skill. Just because students grow up with technology doesn’t mean they automatically know how to use it well.
2. Are We Becoming Too Dependent on AI?
I use AI-powered tools all the time—to streamline grading, assist in research, and even generate ideas. And while AI has its benefits, I worry about what we’re losing. I’ve noticed that some students are relying on AI to do their thinking for them, using chatbots to answer questions instead of struggling through the material themselves. As a researcher, I’ve also seen how easy it is to take AI-generated insights at face value without questioning their accuracy.
What I’m Grappling With:
AI should assist, not replace. I need to be intentional about when and how I use AI in the classroom.
Students need AI literacy. It’s my responsibility to help them understand AI’s limitations and biases.
Critical thinking must come first. Just because AI provides an answer doesn’t mean it’s the right one.
3. The Uncomfortable Reality of Student Data Collection
I’ve become increasingly uneasy with how much data universities collect on students. From tracking attendance through learning management systems to analyzing online activity, it sometimes feels more like surveillance than support. I wonder: Do students fully understand what’s being collected? Who owns this data? Could it be used against them?
What I’m Reflecting On:
Transparency is key. Students have the right to know how their data is being used.
Less is more. Institutions should only collect data that directly benefits learning.
Privacy matters. Just because we can collect data doesn’t mean we should.
4. The Influence of Money in Research
Funding is a constant challenge in academia, and I’ve seen how corporate money can shape research in ways that aren’t always obvious. Even when researchers aim to be objective, financial ties can influence which questions get asked and which findings get published. I’ve asked myself: Am I engaging with research critically enough? Am I aware of the financial interests behind the studies I rely on?
What I’m Reminding Myself:
Always check funding sources. Knowing who pays for research helps me evaluate its credibility.
Open access is the future. Paywalls shouldn’t stand in the way of knowledge.
Independent peer review is crucial. A strong review process helps keep research honest.
5. Virtual Reality, Augmented Reality, and Ethical Blind Spots
VR and AR are exciting teaching tools, but I can’t ignore the ethical concerns they bring. I’ve been thinking a lot about how immersive technology affects students psychologically. Could too much exposure to virtual spaces blur their sense of reality? And what about privacy? VR platforms collect data in ways we don’t fully understand yet.
What I’m Keeping in Mind:
Balance is necessary. VR can enhance learning, but it shouldn’t replace real-world experiences.
Privacy protections need to be stronger. The data collected in virtual spaces should be treated with extreme care.
Not all VR content is ethical. We need clear guidelines for what’s appropriate in educational settings.
Conclusion: Ethical Tech Use Starts with Awareness
The more I reflect on these ethical concerns, the more I realize that technology itself isn’t the problem—it’s how we use it. If I’m not thinking critically about the way I integrate technology into my work, I could be contributing to the very problems I’m worried about.
I don’t have all the answers, but I do know this: The more we talk about these issues, the better we can navigate them. I want to keep asking these hard questions, and I hope others—educators, researchers, and students—will join me in the conversation.
What ethical concerns have been on your mind when it comes to technology in education and research? I’d love to hear your thoughts.