We set up a table at the University of Minnesota and asked students to share how they use AI in their studies, and how they feel about the technology. Fear was the common thread: fear of being caught, fear of not learning, and fear of AI taking away jobs or making their degrees less valuable. In some cases, students shied away from trying AI tools even in productive ways, for fear of being accused of academic misconduct. This special episode of Learning Curve was co-produced with Ceci Heinen, podcast producer of the university's student newspaper, the Minnesota Daily.
In the Know, the podcast of the University of Minnesota's student newspaper, The Minnesota Daily.
"Everyone Is Cheating Their Way Through College," in New York Magazine.
Anonymous Student:
What is this? We're doing a podcast about AI and like students …
Jeff Young:
I recently set up a table with a group of student journalists in front of the Student Center at the University of Minnesota in Minneapolis. We even brought some chocolates to entice students to talk to us.
Anonymous Student:
I just want a chocolate. Yeah, it's shaking. Which one's a dark one? Let me get two of these.
Jeff Young:
But we didn't need to worry. Students were eager to share their thoughts about the topic. We came to talk to them about their use of AI in their schoolwork.
There was just one thing a lot of them did not want to share.
Reporter:
So what's your name and major?
Anonymous Student:
Oh wait, I thought it was anonymous.
Reporter:
You can be anonymous.
Anonymous Student:
Okay, I'll be anonymous.
Jeff Young:
We granted anonymity so that we could get their unguarded views. Even so, some claimed to be speaking only in hypotheticals.
Anonymous sports management major:
Hypothetically speaking, if I did use AI, I would use it for classes I just don't really care about. I would just copy the questions, put them into the AI bot to get the answers, and put them on Canvas. And then when I eventually get them right, I'll just repeat that.
Jeff Young:
Welcome to Learning Curve. I'm Jeff Young, and I'm joined today by Ceci Heinen, the podcast producer at the Minnesota Daily, which is the student newspaper at the University of Minnesota. Thanks for doing this, Ceci. And also, what got you interested in this topic?
Ceci Heinen:
Yeah, thank you so much for having me. I think what got me interested in this topic was just being a student right now and having to deal with AI in my own right: in my job at the Daily, in my schoolwork, and in kind of every aspect of my life. I've also been wondering how other students, who aren't like me, are dealing with this, how they're coping with it, and what they think about it.
Jeff Young:
So we set up a table in this central spot on campus, brought out some microphones, put up a sign, and said: we want to know how you use AI. This team of student reporters talked to 20 of their classmates of various years and majors, and we heard a wide range of views and experiences. Lots of professors that I talk to, and I think lots of folks outside of colleges, have this idea that students are really abusing AI, essentially hitting the easy button when they have to do homework and just having the bot do it for them.
This really came through in an article that ran in New York Magazine this summer. It went viral; you might have seen it. The headline was "Everyone Is Cheating Their Way Through College," and it had anecdote after anecdote of people doing nothing as far as learning. And it's true, some students we talked to at the University of Minnesota are cheating with AI, and I think they would admit it. But we also heard from lots of students who are doing kind of the opposite: figuring out how to harness AI to study better, so they learn more than if they did not have the tools.
Anonymous senior:
I use it to study. Google Gemini has a guided learning thing where you can just give it whatever material you're working on, and it'll ask you questions that lead you to the answer. I think people using it to straight up just do their assignments are wasting AI. I think it's more of a tool than a get-out-of-work-free card, you know?
Anonymous junior:
The way I use AI in my schoolwork is I use Google NotebookLM, and if I have a reading due, I will put the reading into the AI generator, and then it will create, like, a podcast out of it, so I don't have to read a really dense reading.
Anonymous junior:
Let's say I had a statistics class. Okay, I went to the lectures, I studied it, but I did not learn much. So, for example, I sat down and asked AI to teach me, teach me in the simplest way possible. You know, I set it up in a way so it could learn how I understand things. And it was a good experience. I had an exam right then, you know? And I did very well. And all I studied from was, you know, ChatGPT. I put in my lectures and asked it to teach me in the best way possible.
Anonymous junior:
On occasion, if I really don't understand a problem, I use it to try to find one step of that specific problem. I don't try to have it solve the entire problem, because that would be counterproductive. Sometimes I just can't figure out one step and don't have time to go to office hours.
Anonymous early childhood education major, junior:
I kind of just use it if there's part of a direction for a paper, and I don't really know what it means, because it's a lot of fancy words, and I say, hey, simplify this for me, because I don't know what this means.
Jeff Young:
Those were four juniors and a senior that we talked to, majoring in industrial and systems engineering, economics, strategic communications, engineering, and early childhood education, respectively. Okay, Ceci, were you surprised by the uses that you heard here?
Ceci Heinen:
Honestly, many of these were new to me. I have used NotebookLM before for some of the readings that I have to do, and it can very scarily turn readings into a deep-dive podcast, which is helpful but also slightly terrifying. For those who haven't used it, it's this free Google tool that uses AI to help summarize material in different ways. It was interesting to hear, though, how each major had a different approach to using AI.
Jeff Young:
For some of the students we talked to, AI was helping to push them past something they were really stuck on, especially when what the professor provided just wasn't working for them. In my reporting, I've been finding that a lot of the time, AI seems to be a band-aid for parts of the traditional teaching system that are kind of broken, or that just don't work for a lot of students.
Ceci Heinen:
You'd think that many professors would want their students to use AI this way.
Jeff Young:
Yeah. I mean, these seem like good things. But even some of the students we talked to who had found productive uses of AI also admitted to pressing that easy button in other cases. Whether they did seemed to depend on whether the student perceived the class or the assignment as valuable.
Anonymous student:
I don't know. There are just some classes that seem a little pointless to me, so I do use AI for those.
Anonymous sports management major:
But I will be using AI for classes that don't matter for my major. Because why am I taking a geology class when I'm working in sports?
Anonymous sophomore:
It makes things go by faster. Sometimes there's something that I don't necessarily need to do, like it's not gonna help me study, I just need to get it done. Yeah, like a little five-point easy thing. Get it out of the way so I can do things that do matter.
Jeff Young:
Those were three sophomores. The second one is the same student we heard from at the start of the episode. He's majoring in sports management.
Ceci Heinen:
It’s clear that students are using AI in a variety of innovative and maybe some questionable ways.
Throughout all the interviews with students, fear was the common thread: fear of being caught, fear of not learning, and fear of AI's rapid growth.
One of those fears is about the effect of AI on the jobs that college students are hoping to graduate into when they finish school. A group of students that I know are getting hit hard by this are the Minnesota Daily staff, and it's not just a question of jobs, but a question of whether AI will take over mass communications, or whether the human voice will continue to be valued by readers.
We spoke to a handful of Minnesota Daily staff about their unique perspective on AI as budding journalists.
Jeff Young:
Yeah, both of us were really curious to hear the perspective of student journalists at the Minnesota Daily, because for them, these issues are not theoretical. They run a newspaper. They're doing all the writing and the editing and making podcasts, so they are wrestling firsthand with whether to experiment with AI in their work, to see if it could maybe improve what they offer to readers, or whether to stay away from it so that they hone their human skills without the crutch of these chatbots.
Ceci Heinen:
Yeah, I think overall, the general vibe about AI from Minnesota Daily staff was extremely negative. Nearly all of the half dozen staffers that we talked to stressed a desire to maintain the human-to-human connections that journalism fosters.
Jeff Young:
Ceci, you had mentioned to me that the Minnesota Daily has seen some examples where students applying to join the newspaper staff turned in applications that didn't quite seem like their own work.
Ceci Heinen:
Yeah, I've unfortunately had several applications from students who want to work with me on the podcast desk whose cover letters sounded bland and unnatural and clearly were AI-generated, which made me sad. But our managing editor, Sam Hill, has noted that I'm not the only editor that's been seeing that.
Sam Hill:
We've had a lot of problems with AI at the Daily. Journalism is all about writing your own stuff, getting your own information, verifying your own information, and when people use AI, first of all, it's unethical that they use an outside source to get the information that they have to write as their job. But also, when we're getting cover letters produced by AI, work produced by AI, you just don't know if people are, A, competent, and, B, doing the work that they need to.
Ceci Heinen:
So some reporters are afraid to use it at all, so that they don't get labeled as someone who doesn't do their own work. That's the case for campus reporter Isabella Morden Wheelden.

Isabella Morden Wheelden:
With my work at the Daily, I stay away from it because I am a little bit scared of, I don't know, I just try to stay away from it with writing. I think the reason I'm scared of ever using it here is, well, I think it happened this summer where somebody kind of accidentally copied and pasted a link from ChatGPT, but ChatGPT puts its name in the link of whatever source you're using. And there was a little bit of a rift about that, and I think, yeah, that just has kind of scared me away from it.

Ceci Heinen:
A lot of folks here at the Minnesota Daily felt this fear of using it in their own work. And many, like opinions desk reporter Amy Watters, were adamantly against it, as they believed it would take away from their learning opportunities and the process of journalism.
Amy Watters:
I think there's a growing level of apathy among students in general about schoolwork and the world and things like that, right? They care more about getting a good grade than the process. I do think students want to learn, and I think if you removed grades from the equation entirely, there'd probably be a lot less AI use. But people care more about the achievement than the actual process, and at the Daily we care about the process. We care about the writing, we care about the learning, we care about talking to people.
Jeff Young:
As I listened to these student journalists, I was reminded that putting out great articles is just one goal for them. They're also in a kind of unofficial classroom. They don't get grades for doing the paper, but it is a training ground, a way to try out a profession they might go into after graduation. So efficiency does not make as much sense in that context as it would in a commercial newsroom.
Grace Aigner:
We are very much approaching what we do with the mindset that we are student journalists. We are learning, so it must be done by us. You know, I think when you get out of that realm, the questions of convenience and productivity and all those things come in. Those things are a part of the Daily, but we have a slightly different focus.

Ceci Heinen:
When it comes to how much AI might play a role in journalism in the future, the staff at our student paper have differing opinions. Owen McDonnell, the video editor, believes that AI will soon become the standard for newsrooms.
Owen McDonnell:
I think that within a year, it is going to completely replace people who are writing briefs and breaking news. You're just going to plug it in, and it will write it right away. I think that within a few years, even, it's going to be pretty hard to distinguish the vast majority of stories between human and AI. The big human ones, the ones that take six months of reporting, those, maybe not. But even then, you just plug in all your information, you plug in how you write. It's going to be pretty close.
Jeff Young:
Most of the Daily staff seem pretty optimistic about their future careers, though, with many of them talking about how AI is not really going to replace the human touch of journalism. Grace Praxmarer, the copy desk chief, is one of many journalism undergrads who still believe there is hope for the journalism market.
Grace Praxmarer:
I tell people not to be too concerned about it, because I don't really see how AI could replace journalists. It's a very unique job, and it requires a lot of face-to-face interactions, interpersonal connections, and, you know, having that unique voice. I don't think AI is capable of doing those things as well as humans can.
Ceci Heinen:
But time and time again, the conversation with these student journalists returned to their almost visceral feelings about AI.
Grace Praxmarer:
I hate it. I do. I'm sorry, people are no longer thinking critically because they're outsourcing their thinking to whatever. I'm not going to talk about that specifically, but I don't like AI. I think it's weird, and it scares me.
And it all goes back to the human aspect of it. Journalism was created as a public service for people, by other people. It's not okay to me: if you want to be a newsroom that is supposedly publishing for people, and you're not going to have people do that work, it just does not make any sense to me. It's very contradictory to the core tenets of journalism.
Ceci Heinen:
Student journalists are grappling with the unknowable future of AI. I will say I do not think that AI will ever be able to make a podcast like this, where you have to convince sources to share their thoughts and their feelings. So at least that's reassuring.
Jeff Young:
Yeah, I definitely have to believe that's right.
We did learn from our reporting that many students are feeling anxious about how AI is changing the job market though.
Amy Watters:
Are you worried about your first job possibly being jeopardized by AI?
Anonymous Student:
Definitely, somewhat. I hope that it won't be. I'm looking to major in psychology.
Anonymous sophomore majoring in political science:
I just worry about entry-level jobs, especially in white-collar business and other jobs AI can just take. Would my degree be less valuable now than when I went into it two years ago?
Jeff Young:
That was a sophomore majoring in political science, but not all the students that we talked to were worried. Some looked forward to using AI in their jobs, like Yusuf Yuser, an engineering major.
Yusuf Yuser:
In industrial engineering, we're responsible for improving and innovating businesses. So I could see, in the future, I could just give an AI all the info that I have and ask it, what's the most efficient way for this business to spend its money? From there, with my college education, I could see if that's a valid answer and use it. It would be a tool.
Ceci Heinen:
Or they felt their career wouldn't really be impacted either way by AI, like that sports management major we keep coming back to.
Student journalist:
Are you worried about your first job being jeopardized by AI?
Anonymous sports management major:
No, because it's not, and I know it because I already have the job. I'm working it right now, and we do not use AI at all.
Jeff Young:
After the break, we ask students how their professors are reacting to AI. Stay with us.
Promo Swap:
Hey, everybody. I wanted to tell you about another podcast that I think you'll enjoy: College Matters, from the Chronicle. College Matters is a weekly show from the Chronicle of Higher Education, and it's a great resource for news and analysis about colleges and universities. The host, Jack Stripling, is my former colleague at the Chronicle, who has covered and investigated higher ed for two decades. Jack really knows his stuff, and I think you'll get a kick out of his conversations with reporters and newsmakers. So check out College Matters on Apple Podcasts, Spotify, or wherever you get your podcasts. Now back to the episode.
Ceci Heinen:
Another layer of this whole AI story, and how it relates to students, is how university faculty are approaching AI. Syllabi in every university course now include a subsection specifically related to AI use. Every instructor is allowed to set their own rules about whether and how students can use AI in assignments, leaving students facing a confusing landscape where the rules can vary from class to class. Many students we talked to said their professors have gone down the route of banning AI from their courses.
Will:
Most professors have a policy against AI use, especially in essays and stuff. Hence why, last semester, my professors moved all exams to blue books.
Toby Williams:
My professors on the technology side are very adamantly against using it, especially when we are still learning some of the basics. Because if you are using AI to substitute for learning the basics, they believe that once you advance in your sections or your classes, you really haven't learned much. And when AI can't help you, then you've got no sort of background or footing to propel yourself forward on.
Jackson Bugg:
I definitely think they're anti-AI, more so that they're not very lenient on it. They're like, we're gonna check all of your work and make sure that nothing is being used. Limiting it is going to be more challenging, so I think it's about finding a way to be able to use it while also making sure it's not the only thing students are using.
Jeff Young:
The reality is that professors can't really tell when students are using AI or not, at least not for sure. There are AI detectors that promise to help, but these have proven pretty unreliable. Even worse, they often falsely accuse honest students of using AI, especially when a student's native language is not English.
The University of Minnesota's teaching support website actually outlines a lot of these problems with AI detectors, and it has a statement that says, quote, "the use of AI detection tools is not recommended." And it seems like many of the students that we talked to realize that professors could never definitively tell, such as an anonymous second-year computer science major.
Anonymous student:
I feel like they kind of give you the whole spiel at the start of class, like, oh, we can tell. But they never tell. I don't think anybody could tell something's AI, even, like, Turnitin. If you look at it when you get flagged for AI, I don't really see it. It looks like natural language at this point, especially if you're smart and you don't just directly copy and paste it. There's no telling.
Ceci Heinen:
Really? This is interesting, because we had a physics grad student who is a TA come up to the table for an interview to talk about her experience grading lab reports.
Anonymous grad student:
Physics TAs can tell when you use AI to write your lab reports, mainly because you're putting in a lot of stuff that we didn't ask you to put in. We can tell it's at a higher level; we've set up the lab report so that you know how to do it. They have four lab reports to do a semester, so every time there's a lab report due, I might see, like, one person submit something that has AI in it.
Jeff Young:
But these students that the TA was grading, I gotta say, they seem particularly clumsy in their AI use, kind of showing they don't understand what's being asked. So I'd guess that plenty of students who are hitting the easy button do accidentally give themselves away with how little they understand the material.
Ceci Heinen:
I will say, every professor I've had has known, or feels like they know, and I've met many classmates who've been caught for AI use in a paper or assignment and had to go through the university's academic dishonesty process. And that's another thing: the university is sending really contradictory signals about AI. Political science major Will said he was confused by the university's recent deal with Google Gemini, which the university now provides to all students for free, and what message that promotes.
Will:
They have an agreement with Gemini, so students get Gemini with their student fees. But I mean, there's the case of a PhD student last year who was accused of using AI during his research, and there are countless other examples of students facing honor code violations for AI use. So the university needs to put out some real, clear guidelines. There's a task force right now from the university on AI use, but because of how rapidly this is growing, I think we need some real policy results now.
Jeff Young:
The case of this PhD student was covered in the Daily. Haishan Yang, a third-year PhD student at the U of M, was expelled in January because of accusations that he used generative AI in his work.
Ceci Heinen:
Yang sued the university and filed a complaint with the Minnesota Department of Human Rights, claiming he was wrongfully accused and faced discriminatory treatment based on his national origin. His writing was flagged for AI use, but he said that, as a non-native English speaker, AI detectors can often flag his style of writing.
Jeff Young:
We had multiple students cite this very case as a reason that they fear AI: they don't want to be expelled. Ceci, you reached out to administrators at the U to try to get their official position. What did you hear?
Ceci Heinen:
Yeah, in a statement to the Daily, Lauren Adamski, director of the Office for Community Standards, said that they began tracking academic integrity cases involving student AI use in spring 2023. They've since found that the share of cases involving AI jumped from 26% to 39.4% between 2023 and 2025.
She said that the U of M will continue to empower instructors to define their own parameters for generative AI use in their courses, and that there is a course offered by the university covering AI basics, ethical use, and responsible engagement with AI while learning, if students choose to take it. Adamski stressed that when they receive a case from a professor, they follow the same procedures and ensure that due process rights are respected and that the student knows all of their options for resolving the case. They take evidence seriously, and they have high standards for what constitutes an academic violation.
Jeff Young:
When I talk to college leaders around the country, sending mixed signals about AI is pretty typical, because really, universities are afraid of AI, too. For one thing, it is this huge challenge to academic integrity. AI makes it hard for colleges to prove students are learning anything, since it really is tricky to detect whether a student submitted an essay or homework themselves or just had AI do it. So there is this risk that the public will lose faith in degrees if they feel that most students aren't actually learning. And because of that, there's a push at colleges around the country to revamp assignments and make them more AI-proof, by doing things like having students do projects AI can't easily do, or moving to old-fashioned things like blue books.
Ceci Heinen:
I've had several professors who've switched to blue books, so I'd say that's definitely happening here. I personally love writing in a blue book. I find it really gratifying when I can prepare for an exam and write it in a blue book with no resources except my brain, but I know that many students struggle with synthesizing ideas on paper when we're so used to having Google right at our fingertips.
Jeff Young:
And now it's AI, too. Colleges also realize they need to prepare students for jobs that are increasingly adopting AI, and they don't want to look like they're not adapting to this generative AI revolution that seems to be booming these days.
Ceci Heinen:
There's really no quick fix to generative AI's presence in universities, and many students are still worried about not being taught the right skills for their future careers. Third-year biomedical engineering and genetics, cell biology and development major Sam Thibodeau thinks that allowing each professor to develop their own AI policies is counterproductive, and that students just need one clear policy to go by.
Sam Thibodeau:
I don't think that the U is doing a good job at preparing students for using it, because there's not necessarily the most clear policy, and it differs for every class. And so with that, I don't think a lot of students see it as a tool. They either see it as a means for cheating in the class, or they don't use it at all because their professors explicitly banned it. I think the reality of it on the job is going to be more like a tool. I don't think there are going to be a lot of jobs where you're going to be able to use generative AI for the entire job, like students do when they write essays or use it to solve problems, but I think it will be used as a tool in almost every profession, and I think explicitly banning it kind of harms students' ability to learn how to use it as a tool.
Jeff Young:
Yeah. And, I mean, I totally hear that, but in reality, the problem is that colleges can't force professors to have the same policy on AI, just like they can't force professors to have the same policy on late assignments. It's all part of academic freedom. Of course, those larger rules about how professors work at colleges could be changed, but that's not something that's going to happen overnight, even if colleges are certain they want a blanket policy one way or another.
Ceci Heinen:
It's definitely tricky. An overarching theme we discovered throughout this entire episode was a longing for human-to-human connection. Students are afraid not only of AI taking jobs and diminishing their education, but of it taking away their opportunities for connection, too.
Grace Aigner:
Just all of it in general, it just makes me sad. Humans are desiring ways to live less humanely. Creating things, writing things, solving problems, thinking about how you want to structure an essay, making a grocery list: all of these things that are just parts of living are now being outsourced to a little robot. It makes me very sad. It feels like humans are trying to escape the human life we live, and I hate it.
Ceci Heinen:
The future is uncertain, and young professionals and college students are the biggest group of people grappling with that reality.
I hope that human connection wins out over a desire for further productivity and ease, because learning, failing, struggling, finding new things and meeting new people are all quintessentially human, and in my opinion, are the reason we live life.
Jeff Young:
We wanted to get all these points out there, but we want to take a few minutes now to kind of debrief about this.
Ceci Heinen:
Let's go off script here, Jeff. Tell me what you learned in this process.

Jeff Young:
Well, honestly, there was so much that we learned that we didn't even get in here, like the environmental concerns that a couple of students had. One student stopped using AI because they learned how much water and energy it uses. And there are so many professors I'm hearing from who also don't want to use it for environmental reasons, which I totally respect. But it's like, how do you work through this? Because the students you hear from want to make sure they get a job, they want to make sure they're prepared, and if they're not using AI, even if it's for a good reason, then maybe they're not prepared for a job. It's very complicated.
Ceci Heinen:
Yeah, I know, it's hard. I think it's hard being a student at this time, because on top of all the stress of getting a job and going through school, and school's hard, you have this whole other ethical battle of, do I use AI or do I not use AI?
Jeff Young:
Yeah, that really came through, this idea that people were really thinking about it. I think that's why they came up to the table. They were like, yeah, you know, I don't want to not learn. I heard I might not learn, and I'm here paying all this money. I think people did say that: so much money. And you're like, well, yeah, what is it all for if you're just clicking buttons?
Ceci Heinen:
Yeah, for sure. And then you have that threat, too, of getting caught and being expelled. It's crazy, you could literally be expelled for just suspicion.
Yeah, I don't know. It's just a scary time.
Jeff Young:
I was really struck by how hard it is for these students to wrestle through it, even if they were totally honest in everything they were doing with AI, and, in fact, how some were scared to even try the tools that might help them, because they just don't want to risk it.
Ceci Heinen:
I know, that's another thing, too. I'm in some courses with really dense readings, you know, and I have friends who'll be like, oh my god, I was up all night last night trying to finish this reading. And that's a time when I'm like, you should be using AI, because genuinely, it helps so much. I'm not a big user of AI, but that's the one area where I'm like, this is great, and I think students need to do this, because I think it helps you learn better.
Jeff Young:
And that's how it always is with these things. It's like, what are you comparing it to? Take the student who wasn't going to do the reading at all. Is it better to listen to a NotebookLM podcast that summarizes a dense reading, even if it skips over some things or CliffsNotes it a little bit? Is that better than nothing, so that students at least have some grounding for a discussion in class? Because I definitely heard from professors, even before AI, that students just don't come to class having done the reading, and then how do you have a discussion? Or are we just settling for the CliffsNotes version? I can definitely hear other professors saying, no, you need to read it, right? And I totally get that too. I would rather have a world where we all do read all the things. So it is tricky. I also was really struck, and we didn't really get it in here, by the COVID effect. Even before AI, you were noting, Ceci, that students aren't necessarily practiced at talking to each other or feeling comfortable in social settings. And now we have this.
Ceci Heinen:
That reminds me of a story that Amy Watters actually just did about loneliness. I think COVID and AI, those two factors specifically, came together at the worst time and made it so that my slice of Gen Z, college students, it's not that we don't have social skills, but it's just so much harder for people to make friends, and so much harder to make true connections, because everything's on our phones and we were isolated for so long at such a young age. That has such a lasting impact. And now with AI, another thing that makes me think of: group projects are such a big part of college, but now students are just using AI instead of meeting with their group, working through it, making friends, studying together, doing all the stuff college students do together. That's just yet another thing AI is really heavily impacting in college students' lives.
Jeff Young:
Yeah, we had a student who talked to us on that windy day, and the wind actually ruined that recording, but he said he feels like AI gets in the way of just talking to another student, like, "Hey, what was that homework?" when he hadn't really understood something. "Oh, I just used AI." They just used AI instead of doing what we all would have done before, which is to think, well, I don't really feel comfortable asking that girl over there, but I'm gonna do it because I need to find out. Now you don't need to do it.
Ceci Heinen:
I think that's a super common occurrence for me, honestly: asking someone a question about a reading or an assignment or something like that, and they're like, "Oh, I don't know, I just used AI." Okay, so conversation over.
Jeff Young:
That's kind of a dystopia there, right? So it's clearly getting in the way, even when students individually are honest about it.
Ceci Heinen:
Yeah, I think it just is. And I can't even imagine how it's impacting younger generations. That's going to be a whole other podcast, the high school version of this one. I don't know if you've done one about that, but I think it would be interesting, because they're at a really foundational stage. You need to develop these learning habits before you go to college, because there are some things in college that AI can't do, that you do have to do yourself.
Jeff Young:
What I'm hearing too is that at least you had a pre-AI learning experience. You had a time when there was no such thing as ChatGPT.
Ceci Heinen:
Oh, yeah. And it might have also been the high school that I went to, but I learned how to write an essay, and I learned how to do math and science and all of that without any help. And now I feel like high schoolers and younger kids have that crutch that college students are also leaning on. Are students now even going to learn how to write a thesis, if they're just going to be plugging it into AI every time? I don't know. It just scares me for the kids growing up, what it's going to be like for them.
Jeff Young:
And even this idea of professors designing AI-proof assignments, which I've covered a lot: it just feels like that's getting harder and harder as AI gets better and better, and all these tools come out. For a while the thinking was, oh, we'll have the students give a little video presentation, if it was an online class, because then they can't just do it with ChatGPT text. But now students can do it on HeyGen. Now you could literally have your bot do the presentation. Or you could be reading a script, kind of on the down low, without really knowing anything. You're just reading it.
Ceci Heinen:
It makes me so sad too, because a whole other area we didn't even touch on in this episode is how AI is impacting art and creativity. AI is literally making art now, and that, to me, is just so messed up. And it's kind of the same thing with podcasting, too: the fact that NotebookLM can take really complex ideas that journalists are struggling through and just turn them into a quick and easy podcast.
Jeff Young:
Yeah, and their reading is better than mine, maybe, or their delivery, just because it's a robot. Oh, it's so scary, yeah.
But I think that was one thing we basically had in common here, across our generational divide: in different ways, it's scary. Scary about what's going on with jobs, at whatever level you're at.
Ceci Heinen:
Yeah, I know. I'm interested, for you: do you know people who have lost their jobs to AI?
Jeff Young:
I think it's really hard to draw that line exactly with AI right now, because there are so many other crazy forces at work. I know a lot of people who have been laid off, but I don't think you could trace it just to AI; you just see the shrinking of newsrooms. But I do think it doesn't help that this tool is there, and there's news every day. I was going to mention one article, but it seemed a little off topic: a brand new article this week about entry-level jobs broadly being cut, where AI was found to be the reason. So I'm sure it's coming. I just don't have an example personally yet.
Ceci Heinen:
Yeah, I know. It scares me for sure, but I guess there's nothing you can really do about it other than learn how to use it, as a lot of students were saying: using it as a tool and not as a crutch, using it to support your learning rather than replace your learning.
Jeff Young:
Yeah, there was one person, and maybe we can find this clip and put it after the credits or something, but one person basically said, I'm actually getting prepared to use AI at work by the dance I have to do to kind of hide it from the professor while still doing something better than I could do by myself. And I'm like, oh, that's kind of weird, because people definitely want to be able to show they are worth hiring. A job isn't the same as being in college, but nobody wants to be known as the person who doesn't really do their own work.
Ceci Heinen:
Yeah, no, I don't think anyone wants to be that person. It makes you feel so guilty. And that's another thing. I don't know if we even put those clips in there, of students saying, even when I use it productively, I feel guilty, because it's so new, and we've never had something like this before. Students my age are used to having to do things on their own, and now they have this extra thing, and it's like, do you feel guilty using it or not?
Jeff Young:
Yeah, we had a student say they felt guilty for basically using it as a study tool that I think any professor would have been fine with.
Ceci Heinen:
Yeah, it helped them learn the content better. I know. There's no easy through line to draw between all of this. It's so complicated.
Jeff Young:
Yeah, and I guess we wish we had some better North Star to take us out on. But the theme that came back to me is this kind of fear and uncertainty. And I think that just wasn't there a couple of years ago. This is a new layer of mess on top of everything.
Ceci Heinen:
Yeah, I know. I wish I could go out on an optimistic note, but I think it's just something people should continue to learn about, and continue to not let AI take away your human experiences.
Jeff Young:
What I was saying to myself as I heard the interviews, especially with the Daily staff, was how human it was, like a fight for humanity.
Ceci Heinen:
Literally.
Jeff Young:
It was this big, very passionate thing. And I mean, I definitely relate to that, but it was so strident, and I'm like, oh yeah, it's good to be reminded of the fight.
Ceci Heinen:
Yeah, and that there are people, not that AI is a bad thing fully in itself, because I don't think it is, who still want to live a human experience, not impacted by technology and not impacted by AI. And I think that's reassuring to know.
Jeff Young:
So there's our positive note, I guess. All right, this has been great. It's so cool to work with you on this.
Ceci Heinen:
I know, it's been wonderful. I'm so glad it finally happened, because we've been plotting this since summer. It's true, there was planning. But all right, cue the outro.
Jeff Young:
This has been Learning Curve, Episode 7. This episode was written and produced by Ceci Heinen and by me, Jeff Young.
Thanks to the staff of the Minnesota Daily, especially to the student journalists who conducted the interviews on that windy, rainy day last month or provided other support. They included Wren Warren, Jacobson, Amy Watters, Atticus Marse, Grace Aigner, Lucas Vasquez, Callie Burch, Matthew Yeagers and Vivian Wilson. You should check out their work at mndaily.com.
Editing for this episode was by Rob McGinley Myers, and you should check out his amazing podcast about narrative podcasts. It's called Phonograph. You can find it wherever you listen.
Music for this episode is by Komiku, and our episode art is a photo that I took and ran through Midjourney.
Please support this podcast by telling a friend about Learning Curve or sharing it on social media. Of course, I hope you are already following the show on your podcast app. If not, click the button.
We'll be back next week with more on how education is adapting to this AI era. Thanks so much for listening.