One thing I genuinely hate about modern tech is that it punishes you for planning ahead. I deliberately spent time setting up a password manager and 2FA so I could log in faster and stay safe. Then suddenly every company decided it was time to go passwordless or do passkeys, and all my work (researching different products, setting each one up, making sure they work on all my devices, etc etc) went straight down the drain.
Actors have known this for decades: self-expression isn't only a stage problem. It's a life problem. Most people fail to express themselves on an hourly basis. Being good at expressing yourself is unnatural. Even having clarity about what "yourself" is is unnatural. The truth is that we're all making comments, cracking jokes, and deciding what's important and what isn't using old programming in our brains… programming given to us by our childhood and our education. Very few people consistently have the luxury of being creative with that old programming, and even those who do often have to plan ahead or rigidly control their environment to achieve a creative result.
The exact same problem exists with writing. In fact, it seems to exist across all fields: science, for example, is filled with people who have never run a groundbreaking study, presented a new idea, or solved an unsolved problem. These people and their jobs are so common that the education system orients itself around teaching them rather than anyone else. In the same way, an education in literature focuses on the traits you're most likely to need for a job: hitting deadlines, following the expected story structure, etc etc.
Having confined ourselves to a tiny little box, can we really be surprised that we’re so easy to imitate?
The first iterations of the Apple keyboard were perfect. They got everything right, no notes.
Then it seems like they started teaching to the bottom of the class and made a bunch of terrible decisions: substituting touch-to-select for touch-to-move-the-cursor was genuinely awful and makes typing a constant chore, and their autocorrect now overcompensates so hard that it stops me from writing perfectly good words simply because they're uncommon.
Side note: anyone else have moments where you can’t press delete once predictive text has shown up?
Ngl I feel like most people only accept these criticisms of AI because they're against AI to begin with. Look at the claims and they fall apart pretty quickly. The environmental cost is negligible and has less to do with AI than with computing in general. The consolidation of resources assumes that larger, more expensive models will outcompete smaller local models, and that's not necessarily happening. The spread of misinformation doesn't seem to have accelerated at all since AI came along (probably because we were already at peak misinformation and AI can't add much more). And the decay in critical thinking is far overblown, if not based on outright manipulated data.
About the only real problem here is the increase in surveillance, and you can avoid that by running your own models, which are getting better by the day. That people are so willing to accept these criticisms without much scrutiny is really just indicative of prior bias.
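If anyone wants to try that, here's a minimal sketch using the Hugging Face transformers pipeline. The model name is just one example of a small open-weights model; swap in whatever fits your hardware:

```python
# Minimal local-inference sketch: everything runs on your own machine,
# so no prompts or outputs leave it. Assumes `pip install transformers torch`.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-1.5B-Instruct",  # example ~1.5B model; any local model works
)

out = generate(
    "Summarize the tradeoffs of local vs hosted LLMs in two sentences.",
    max_new_tokens=120,
)
print(out[0]["generated_text"])
```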
Karpathy recently did an interview where he said the future of AI is 1B models, and I honestly believe him. Small models are getting better and better, and it's going to end up decentralizing power more than anything else.
How did this article get so many upvotes? Even among articles that pine for the good old days, this one is trash. Like 80% of it is just "remember that movie? And the things we thought were meaningful back then?"
The idea that modern movies don't take risks is absurd. Have you seen Poor Things? The Zone of Interest? Mickey 17? OBAA? More movies are taking more risks in this era of film than ever before. You're just not watching them.
The real story here is the way lighting has changed and how it makes you feel when you watch the movie.
For the San Francisco venues, some of my favorite artists come on random days, and as a 40-something with a kid I would only go out on a night other than Friday/Saturday if I REALLY liked the artist. From that perspective, it might make sense to weight weekday outliers higher.
Very clearly shows how much more sensitive our eyes are to luminance than to hue or saturation, which was the main observation that allowed for JPEG's high compression rate.
Are you speaking of chroma subsampling, or is there a property of the discrete cosine transform that makes it more effective on luma rather than chroma?
Probably chroma subsampling - storing color at lower resolution than luminance to take advantage of the aforementioned sensitivity difference. With 4:2:0, luma stays full-res while each chroma channel is stored at 1/4 resolution, so raw samples drop from 3 per pixel to 1.5 - that alone can almost halve the data before any other compression.
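Easy to check empirically, by the way. A quick sketch with Pillow (photo.png is a stand-in for whatever test image you have):

```python
# Compare JPEG file sizes with and without chroma subsampling.
# Assumes Pillow is installed and "photo.png" is any test image.
import io
from PIL import Image

img = Image.open("photo.png").convert("RGB")

def jpeg_size(subsampling: int) -> int:
    buf = io.BytesIO()
    # Pillow's JPEG encoder: subsampling=0 is 4:4:4 (full-res chroma),
    # subsampling=2 is 4:2:0 (chroma at 1/4 resolution).
    img.save(buf, format="JPEG", quality=90, subsampling=subsampling)
    return buf.tell()

print("4:4:4:", jpeg_size(0), "bytes")
print("4:2:0:", jpeg_size(2), "bytes")
```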
Saying it’s the insight that led to JPEG seems wrong though, as DCT + quantization was (don’t quote me on this) the main technical breakthrough?
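For the curious, that step looks roughly like this - a sketch of the 2D DCT plus quantization on a single 8x8 block, using the example luminance table from Annex K of the JPEG spec:

```python
# Sketch of JPEG's core transform step: 2D DCT on an 8x8 block, then
# quantization. Assumes numpy and scipy are installed.
import numpy as np
from scipy.fftpack import dct

# Example luminance quantization table from Annex K of the JPEG spec.
Q = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
])

# A smooth gradient block as a stand-in for natural image data,
# level-shifted to be centered on zero as the spec prescribes.
x = np.arange(8)
block = (x[:, None] * 8.0 + x[None, :] * 4.0) - 128.0

# 2D DCT-II (orthonormal), applied along rows then columns.
coeffs = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

# Quantization divides each coefficient by its table entry and rounds;
# for smooth blocks most high-frequency entries become 0.
quantized = np.round(coeffs / Q).astype(int)
print(quantized)
```

The rounding is where information actually gets thrown away, and the resulting runs of zeros are what the later entropy coding squeezes down.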