44 comments

  • dehrmann 5 hours ago

    Full disclosure: I'm a Purdue graduate, though I disagree with certain things the school has done (Purdue Global).

    Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement to teach how to use it in a way that improves learning rather than just enhances cheating makes sense. The problem with the broad, top-down approach is that it looks like what happens in Corporate America, where there's a CEO edict that "we need a ____ strategy," and every department pivots its projects to include that, whether or not it makes sense.

    • daxfohl 4 hours ago

      I like this take. It seems like it would be useful to require professors to sit in on the class too. It'd be interesting to hear lots of different perspectives, ideas, concerns, etc., rather than a lecture format to half-awake students about something they arguably know more about than the instructor.

  • mwkaufma 5 hours ago

    Heads up: forbes.com/sites/xyz pieces are by people and groups who pay to publish under the domain, but aren't edited or promoted by Forbes itself. Almost always conservative interest groups posing as journalists.

    • mossTechnician 3 hours ago

      Additional information about Forbes' downward trajectory: https://larslofgren.com/forbes-marketplace/

    • andy99 5 hours ago

      Yes this has conservative psy-op written all over it /s

      • mwkaufma 4 hours ago

        Nietzel's whole shtick is "college reform" i.e. dismantling and financialization. See his book "Coming to Grips with Higher Education." Mixing non-agitprop into the feed is part of agitprop.

  • noitpmeder 6 hours ago

    How to Speedrun devaluing the credentials your institution exists to award.

  • conartist6 6 hours ago

    Well that's a public embarrassment...

    • throaway45425 16 minutes ago

      It is easy to forget though that the vast majority of people have no idea what is being talked about on this forum.

      What percentage of students who graduated in 2025 have no idea what machine learning is?

      Forget "Attention Is All You Need" and transformers. What percentage can't define machine learning? What percentage have no idea what the question even means? A highly non-trivial percentage.

      ChatGPT prompting 101 would obviously be stupid, but there is more than enough material to do a fantastic AI 101 class.
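
      A single lab session could already cover the core loop of fit, predict, evaluate. A minimal sketch of the kind of exercise I mean (scikit-learn; the dataset is just whatever ships with the library):

        # AI 101, lab one: "machine learning" = fit a model on examples,
        # then test it on examples it has never seen.
        from sklearn.datasets import load_iris
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        X, y = load_iris(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0)

        model = LogisticRegression(max_iter=1000)
        model.fit(X_train, y_train)       # "learning": estimate parameters from data
        preds = model.predict(X_test)     # generalize to unseen examples
        print("held-out accuracy:", accuracy_score(y_test, preds))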

    • andy99 6 hours ago

      That was my thought; it feels like something a career college or high school would do. Are CS students going to have to take a "how to talk to chat gpt course"? That's probably less condescending than making an arts student, or someone else who doesn't need to have anything to do with LLMs, sit through it.

      I thought Purdue was a good school; these kinds of gimmicks are usually the province of low-tier universities trying to get attention.

      • turtleyacht 5 hours ago

        Optimistically, the idea could be to push prerequisites to an always-on, ever-available resource. Depending on the major, skills could include organizing papers into outlines, using Excel, or building a computer.

        Professors can tailor lectures to narrower topics or advanced, current, or more specialized subjects. There may be less need to have a series of beginning or introductory courses--it's assumed learners will avail themselves of the AI.

        Pessimistically, AI literacy contributes to further erosion of critical thinking, lazy auto-grading, and inability to construct book-length arguments.

      • basch 3 hours ago

        > "how to talk to chat gpt course"?

        It's not unrealistic to be selecting for people with strong language skills and the ability to break tasks into discrete components and assemble them into a process, or the skill of being able to define what they do not know.

        A lot of what makes a person good with an LLM also makes them good at general problem solving.

  • capyba 2 hours ago

    What exactly is an "AI working competency"? How to have a conversation with a chatbot? How to ask a chatbot a question that you confirm with a Google search and then confirm that with a trusted online reference and confirm that with a book you check out of the library three weeks later?

    Perhaps the world is going the direction of relying on an AI to do half the things we use our own brains for today. But to me that sounds like a sad and worse future.

    I’m just rambling here. But at the moment I fail to see how current LLMs help people truly learn things.

  • djoldman 6 hours ago

    The announcement is here:

    https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...

    Where the actual news is:

    > To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide "artificial intelligence working competency" graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.

    So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.

    Who knows what will end up in the final requirement.

    • gmfawcett 5 hours ago

      Delegated to the provost and deans. Who else would you expect to hold accountable for developing a graduate attribute?

      • djoldman 3 hours ago

        I guess they would already have had that authority?

        I think it would be the ongoing job of the deans, or at least someone, to set graduation requirements. Why would the trustees have to explicitly delegate it?

        • gmfawcett 18 minutes ago

          I think this was more of a press release than an edict. The Purdue announcement says, "Built on recently launched AI majors, minors and certificates across colleges, and following the establishment of a working group last summer, with additional careful deliberation and advice from the University Senate through its Undergraduate Curriculum Council..."

  • whatever1 3 hours ago

    Realistically, universities will have to ban the use of computers for exams and homework.

    For the same reason that elementary schools don't allow calculators in math exams.

    You first need to understand how to do the thing yourself.

    • smegger001 3 hours ago

      Realistically you can't ban computers for homework, since you don't control the environment. Banning them for exams might work for most of the humanities, math, chemistry, and physics, but good luck trying to teach a computer science degree without computers, or graphic design, or any number of other programs that rely on them, if you want students to be competent with the standard tools of their trade, which are on computers. Audio engineering? Computers. Video editing? Computers.

      • vostok 3 hours ago

        Outside of distance learning, computers are very rarely used for CS exams, in my (albeit dated) experience.

      • whatever1 2 hours ago

        People not long ago were literally programming on paper.

        So no, computers are not required to teach computer science.

  • jleyank 5 hours ago

    From my long-ago uni courses, current-day AI could have helped with the non-major courses: English and History, doing the first draft or even the final drafts of papers, etc. As a science major, I'm not sure what the point of relying on an AI is as it would leave you empty when considering further education or the tests they require. And as far as a foreign language goes, one needs to at least read the stuff without relying on Google Translate (assuming they have such a requirement anymore).

    But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.

    • thfuran 4 hours ago

      >As a science major, I'm not sure what the point of relying on an AI is as it would leave you empty

      Why do you think it wouldn't do the same for other fields? The purpose of writing essays in school is never to have the finished product; it's to learn and analyze the topic of the essay and/or to go through the process of writing and editing it.

    • conartist6 5 hours ago

      It'll get you an academic integrity investigation if you get caught using it to write either a first draft or a final draft of a paper, and especially for an English class where the whole point is for you to learn how to write.

      If you're going to try to fake being able to write, better to try to dupe any other professor than a professor of English. (source: raised by English majors)

      • jleyank 4 hours ago

        Hope so. But if you can’t use it here, where CAN you use the thing??

  • gamblor956 6 hours ago

    This is going to be like when all the schools were pushing big data because that was going to be the next big thing.

    After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience, (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.

    That's the bright future that Purdue is preparing its students for.

    Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.

    • jart 4 hours ago

      I hope Anthropic is saving all my interactions with Claude so they can replace me when I'm gone.

      Then future generations who like old school systems hacking will be able to pair program with Justine AI.

      • SOLAR_FIELDS 4 hours ago

        This is a much lighter take than mine, which is that our behaviors being input into this system will eventually be used to subjugate and control future generations. I like it.

    • andy99 5 hours ago

      This has nothing to do with whether the technology is valuable or not; it's about cramming superficial treatment of trendy topics into academic degree requirements, which, whatever one thinks of AI, should be frowned upon.

    • ivape 5 hours ago

      It's definitely something that won't age well. Kids are going to grow up with many AI friends by the time they get to college.

      • bigstrat2003 2 hours ago

        If that's the case it sounds like universities will need to hire an army of psychologists to undo the damage that will have been done to those kids. Treating LLMs as a friend is profoundly unhealthy and will not end well.

  • AndrewKemendo 2 hours ago

    I was on the academic board of engineering mechanics for Purdue almost a decade ago.

    Purdue, not necessarily uniquely but in keeping with its charter, does a really good job of focusing its engineering programs on workforce development. They are highly focused on staffing and training and less so on the science and research part - though that exists as well.

    This tracks with what I would expect and is in line with what I think best practice should be.

  • danaris 3 hours ago

    If they were to set down today what the curriculum for such a requirement needs to be, by the time the students who matriculate in August graduate, it will be so out of date as to be effectively worthless.

    This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing far too fast to be sure that even fundamental skills related to it will remain relevant for 4-5 years.

  • 65 4 hours ago

    Seems like a knee-jerk reaction more than anything. I'm sure this is to justify hiring even more administrators.

  • bgwalter 4 hours ago

    https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...

    "all as informed by evolving workforce and employer needs"

    "At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."

    Purdue is engaging in the oldest profession in the world. And the students pay for this BS.

  • keiferski 5 hours ago

    I don’t really get the dismissive comments here. Universities have had gen ed requirements for years, one of which is usually something to do with computers. AI seems to be a technology that will be increasingly relevant…so a basic gen ed requirement seems logical.

    • BeetleB 4 hours ago

      The problem is the field is changing way too fast. It's almost certain that whatever they learn will be outdated/wrong/poor practice by the time they graduate. Just compare with the state of things 2 years ago.

    • bigstrat2003 2 hours ago

      > AI seems to be a technology that will be increasingly relevant

      That's why you don't understand the dismissive comments. The reality is that the technology sucks for actually doing anything useful. Mandating that kids work with a poor tool just because it's trendy right now is the height of foolishness.

    • UncleEntity 5 hours ago

      Yeah, I'm still bitter I had to pass a literacy exam to get my BA and that was 28 years ago.

      And I just know this is going to turn into a (pearl-clutching) AI Ethics course...

    • alephnerd 5 hours ago

      These are the same people who would pooh-pooh teaching Excel and basic coding skills to non-STEM majors, or having CS students take ethics or GenEd classes.

      AI/ML isn't going to completely shift the world, but understanding how to do basic prompt engineering, validate against hallucinations, and know the difference between ChatGPT and GPT-4o is valuable for people who do not have a software background.
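
      Even the "validate against hallucinations" part fits in a one-hour exercise. A rough two-pass sketch (assuming the OpenAI Python SDK; the model name, prompts, and the reference file are only illustrative):

        # Pass 1: get an answer. Pass 2: ask the model to flag every claim
        # in that answer that is not supported by a reference text we supply.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        def ask(prompt: str) -> str:
            resp = client.chat.completions.create(
                model="gpt-4o",  # illustrative; any chat model works
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content

        question = "When was Purdue University founded, and by whom?"
        reference = open("purdue_history.txt").read()  # hypothetical trusted source

        answer = ask(question)
        audit = ask(
            "List every factual claim in the ANSWER below that is not supported "
            f"by the REFERENCE text.\n\nANSWER:\n{answer}\n\nREFERENCE:\n{reference}"
        )
        print(audit)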

      Gaining any kind of knowledge is a net win.

      • hansmayer 5 hours ago

        "basic prompt engineering" - since when has writing English-language sentences become nothing less than "engineering"?

        • IncreasePosts 3 hours ago

          It's more about knowing the tricks to get LLMs to give you the output you want.

          However, there's no reason to think any given trick will still be relevant even a year from now. As LLMs get better, why wouldn't we just have them auto-rewrite prompts using the appropriate prompt engineering tricks?
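
          That rewrite pass is already just a few lines of glue. A rough sketch (assuming the OpenAI Python SDK; the model name and the rewrite instructions are only illustrative):

            # Pass 1: have the model apply the "tricks" by rewriting the prompt.
            # Pass 2: answer the rewritten prompt.
            from openai import OpenAI

            client = OpenAI()  # reads OPENAI_API_KEY from the environment

            def ask(prompt: str) -> str:
                resp = client.chat.completions.create(
                    model="gpt-4o",  # illustrative
                    messages=[{"role": "user", "content": prompt}],
                )
                return resp.choices[0].message.content

            def answer(user_prompt: str) -> str:
                rewritten = ask(
                    "Rewrite the prompt below so it is specific, structured, and "
                    "states the desired output format. Return only the rewritten "
                    "prompt.\n\n" + user_prompt
                )
                return ask(rewritten)

            print(answer("explain transformers to a history major"))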

  • turtleyacht 6 hours ago

    Upfront computer literacy may never have been convincing enough on its own; AI could be the ubiquitous and timely lever that opens the way to general machine thinking.