Hacker News | cluoma's comments

Location: Canada

Remote: Yes

Willing to relocate: Yes, Canada/US/Germany

Technologies: C, R, Python, Multiple flavours of SQL

Resume/CV: Link in bio

Email: colin.luoma (at) gmail.com

I have several years' experience as a data analyst, mostly in the video games industry and, more recently, in public service. A large part of this has been data engineering work, which I've found I tend to enjoy more than the analysis side. I've also always maintained a personal interest in programming through side projects, so I am looking to transition into a development/data engineering role.

Lately, I have been spending my time in the embedded space working on retro game console accessories, which has been very rewarding. I would love to carry some of this experience into a professional role.


I used to get this same feeling during lectures in uni. Often the information was presented well and, along with some clear examples, everything seemed to make perfect sense.

It wasn't until working through practice problems later, on my own, that it became clear how much detail I was missing.


> It wasn't until working through practice problems later, on my own, that it became clear how much detail I was missing.

This is a common problem in learning. Recognition is easier than recall, and smoothness is mistaken for understanding.

You actually need to struggle with the concepts a bit to learn effectively. Without the struggle, learning feels more effective, but it is not.


Missing details and being confidently wrong are two different things though?

Edit: Claude told me the other day that my entire building might have to be demolished due to a slight bow in my newly poured stem wall. I uploaded a photo and it was like, "yes, this is a serious structural issue, blah blah blah". The inspector came to look at it and literally laughed that I was worried about it.


Now consider what's happening to the learning process of the (rather large) subset of current college students choosing to replace that struggle for detailed understanding with LLM queries.


It’s the biggest crisis since math students started using graphing calculators.


LLMs are just graphing calculators for the humanities.


That’s beautiful


You are not the only one. There was a paper covering this exact topic in the Proceedings of the National Academy of Sciences a few years back [0].

Passive learning (lecture) scored better on:

* Student Enjoyment

* Feeling of Learning

* Instructor effectiveness

* I wish all my courses were taught this way

Active Learning (i.e., not lecture) scored better on:

* Actual learning

The differences are not small.

[0] https://www.pnas.org/doi/10.1073/pnas.1821936116


I suspect a majority of my students this semester used LLMs to complete homework assignments. It is really depressing. I spent hours making these assignments, and all they probably did was copy and paste them into ChatGPT. The worst part is when they write to me asking for help, sharing their code, and I can see it was written by LLMs. The errors are mostly there because the assignments occasionally refer to something we did in class. Without that context, LLMs make assumptions and the code fails to generate the exact output. So now I am fixing parts of the code that some of my students didn't bother to write themselves.

Edit: Added "I suspect" in the beginning as I can't prove it.


Why are you fixing their code? You're just doing the same thing as the LLM you're complaining about.

Also, as someone who attended university in Germany, the mental image of a professor helping undergrads with homework already seems strange if not funny to me. That is... at least I hope they're undergrads, because if people managed to get any sort of CS degree while having to rely on an LLM to code, I might be sick.


Just going by the last two years of university teaching (energy-focused computer science in Germany), I feel like LLMs have already had a devastating effect. There has been a large influx of students who seemingly got through their entire Bachelor's degree with nothing but ChatGPT. The university is slow to adapt and ill-equipped to deal with this.

This is absolutely killing my enjoyment of teaching. There is nothing more disheartening than carefully preparing materials for people to grasp concepts I find extremely interesting, just for them to hand in ChatGPT-generated slop without understanding anything at all. In stark contrast, just a couple of years prior I had quite rewarding projects and discussions with students. I also refuse to give detailed feedback on such "solutions" anymore, because the asymmetry between student effort and my effort is just completely unreasonable.

This development is something very different from the oft-quipped "graphing calculator in maths education". With a graphing calculator you still need to know the mathematical foundations to input the correct things and get the correct results. LLMs are mostly used by just pasting in the exercise of the day.

This is not to say LLMs can't be a useful tool for learning. They absolutely can. But that is not how the majority of students use them... to their own detriment and the detriment of those trying to teach them.

If universities don't adapt to this quickly, then the already weak signal of "university degree implies some amount of competence" will be entirely lost.


I've heard experts comment on this from the other side: they'll give a quick layperson's soundbite about their subject of expertise that, for reasons of time and audience interest, doesn't defensibly lay out all the possible exceptions, edge cases, and weirdness, and then they'll be inundated with comments calling them liars and accusing them of falsifying things or not actually understanding the subject.


That's been exactly my experience as well. Sometimes doing a little research on a lunch break gives enough direction on how to spend available time later on my project.

Accepting that progress will be slow has been the most difficult adjustment, and it applies to more than just side projects. Choosing books or games also becomes a more strategic decision when what used to be a weekend sprint turns into a several-week marathon.


I think I'm throwing in the towel on my 7 Plus this year. I'd love to keep it around for a bit longer but too many apps and websites are no longer functional. It's starting to become a usability issue.


I grew up in BC as well and never heard it. My parents were from Ontario and always called it the flipper. Because it flips channels I guess. Felt like every household had a different name for it though.


https://github.com/cluoma/Pico2Maple-fw

Currently working on a USB/Bluetooth to Sega Dreamcast controller adapter.

It's based on the Raspberry Pi Pico 2/W board and originally started as a fun project to play around with the PIO features of the RP chips and get my Steam Controller working on the Dreamcast. It has since expanded to support more controllers, keyboards, and mice, and it acts as a VMU (the Dreamcast's memory card) while plugged in. Building the dongle itself has also been a fun exercise in 3D printing, CAD, and PCB design.

I'd really like to expand the number of supported USB devices, as I only have access to a limited amount of hardware that I can test myself. I've started looking into whether I can use the SDL Game Controller DB (https://github.com/mdqinc/SDL_GameControllerDB) to crowdsource support for a bunch of controllers. I'm definitely open to other ideas though. I feel like I'm slowly learning that USB controllers are a minefield of one-offs and edge cases.
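As a sketch of how that crowdsourcing might work: each SDL_GameControllerDB line is a GUID, a human-readable name, and then comma-separated key:value button/axis mappings. A minimal Python parser (the example entry below is illustrative; real firmware would also match the GUID against the attached device and filter by the `platform` field):

```python
def parse_mapping(line: str) -> dict:
    """Parse one SDL_GameControllerDB line into guid, name, and a mapping dict."""
    # Split on commas and drop the empty field left by the trailing comma.
    fields = [f for f in line.strip().split(",") if f]
    guid, name, *pairs = fields
    # Remaining fields are key:value pairs like "a:b1" or "leftx:a0".
    mapping = dict(p.split(":", 1) for p in pairs)
    return {"guid": guid, "name": name, "mapping": mapping}

# Illustrative entry in the documented format (GUID is a placeholder).
entry = parse_mapping(
    "030000004c050000c405000000010000,PS4 Controller,"
    "a:b1,b:b2,x:b0,y:b3,platform:Linux,"
)
print(entry["name"])          # PS4 Controller
print(entry["mapping"]["a"])  # b1
```

For a microcontroller target, it might make more sense to run something like this offline and generate a static lookup table for the firmware, rather than parsing the DB file at runtime.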


The PIO functionality is, I think, a killer feature for interfacing with retro consoles. I've been working on a Sega Dreamcast controller adapter, and the PIO made it pretty easy to start interfacing with the console's custom protocol, even for somebody with near-zero experience in this kind of thing.


What would you look for to see if somebody is 'staying relevant'? Side projects? Personally, I would like to explore new technologies, and I am still excited to learn, but I feel limited by what is actually needed in my day-to-day, which is unlikely to change.


It's more that your day-to-day *should* change over the course of 15-20 years. What I'm referring to are situations where people's current day-to-day reads exactly the same as it did in the early aughts: for instance, poking at pre-generics Java to parse XML files, populating some enterprise beans, and pushing to a web page using whatever framework was in vogue back then.

There's nothing wrong with someone turning their brain off and working in that role for 20 years. But they shouldn't be surprised if their experience is completely irrelevant in the modern job market. If one wants to develop new experiences and their job won't provide that over a long period of time (say 5-10 years of employment, not every 6 months!) then it's time to consider the tradeoffs you're making.


In my experience technologies that were popular in the past but are basically dead now are a red flag. Perl, Pascal, Smalltalk, maybe even Java and C at this point, if they don't know anything newer.


Java is an "it depends". Are they writing early aughts style enterprise Java, following early aughts patterns, and using early aughts frameworks? Or are they using fairly modern Java, doing things in a fairly modern way?

I've hired for Java roles in the last handful of years and this was a very bimodal group.


I feel you; it's extremely frustrating when the phone is otherwise in good working order. I've started to get update prompts from apps on my aging iPhone, even though no new versions exist for my OS.


There is the Zink project [1]. It is an OpenGL-to-Vulkan translation layer.

[1] https://docs.mesa3d.org/drivers/zink.html


But it's part of Mesa; it's not something you can drop into an app written against OpenGL to translate the calls to Vulkan, right?


You absolutely can. It can even build and run on Windows. I've used it to play some modded Minecraft builds where Zink outperformed the native OpenGL 2 drivers on my machine, mainly because the native OpenGL 2 driver for my hardware was terrible at the time. But it's 100% a thing you can do.
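On a Linux system with Mesa installed, for example, an unmodified OpenGL binary can be routed through Zink with a single environment variable (`MESA_LOADER_DRIVER_OVERRIDE` is documented in Mesa's environment-variable docs; `glxgears` here is just a stand-in for any GL app):

```shell
# Ask Mesa's loader to use the Zink (GL-on-Vulkan) driver
# instead of the native OpenGL driver, for this one process only.
MESA_LOADER_DRIVER_OVERRIDE=zink glxgears
```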

Some games are even shipping on it [1]

[1] https://www.gamingonlinux.com/2023/02/x-plane-12-now-uses-th...


Where do you think <GL/gl.h> comes from?

