There is a more feasible future, IMO, that paints AI as the washing machine, calculator, computer, spreadsheet, automation, etc. Jobs AI can complete don't lead to people getting let go; rather, those people sit on top of an AI that can do their job much better (sometimes at larger scale, sometimes not). Better outputs for the same cost (well, wage + AI costs) -> more purchasing power, more efficient business, etc.
I don't know if this is how AI will go, but this exact thing happened to me with deep learning. I did stupid math to optimize algos in 2012, but by 2022 deep learning was 100x better than me. I just babysat the AI, as it (and LLMs) still can't talk to clients, understand business/cultural nuances, navigate an org, politic, innovate, etc.
Money is just a proxy for value add. When automation replaces the labor, the only remaining value add may be withholding violence. I hope we don't get there
I doubt this was ever classified information. It's written all over DoD and NSA requirements and best practices for staff and diplomats.
She was probably briefed repeatedly about this as a member of that committee.
Here's one example:
> Headphones are wired headphones (i.e. not wireless) which can be plugged into a computing device to listen to audio media (e.g. music, Defense Collaboration Services, etc.).[0]
I have worked on out-of-sample problems, and AI absolutely struggles, but it dramatically accelerates the research process. Testing ideas is cheap, support tools are quick to write, and the LLM is a tremendous research tool in itself.
More generally, I do think LLMs grant 10x+ performance for most common work: most of what people do manually is in the training data (which is why there's so much of it in the first place). 10x+ in those domains can, in theory, free up more brain space to solve the problems you're talking about.
My advice to you is to tone down the cynicism, and see how it could help you. I'll admit, AI makes me incredibly anxious about my future, but it's still fun to use.
In the US, homeless individuals foremost suffer from financial hardship, not mental illness. Consider that 39% of homeless individuals are in families [0: Page 17], while 40% have a serious mental illness or drug problem.[1] Many develop these problems while homeless.
Homelessness in the US has also increased by 47% since 2018. [0: Page 2] I doubt mental illness or drug abuse has increased accordingly.
People make the mistake of thinking otherwise because it's not the homelessness you often see.
If you have lived anywhere with a significant drug-addicted (opioid or fentanyl) population through this time period, you've seen the increase; if you haven't, you may be lucky for it.
I'm a former Ruby guy who ended up in stats/ML for a time. I think it's all about information density.
Let's use your example of `A = P(1 + r/n)^(n*t)` -- I can immediately see the shape of the function and how all the variables interrelate. If I'm comfortable in the domain, I also know what the variables mean. Finally, this maps perfectly to how the math is written.
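For instance, here's a rough sketch in Python (the function and variable names are made up, just to illustrate): the terse version reads almost one-to-one like the math, while the spelled-out version is clearer in isolation but noisier once you know the domain.

def compound(P, r, n, t):
    # A = P(1 + r/n)^(n*t): terse names map directly onto the math
    return P * (1 + r / n) ** (n * t)

def compound_interest(principal, annual_rate, periods_per_year, years):
    # same formula with spelled-out names
    return principal * (1 + annual_rate / periods_per_year) ** (periods_per_year * years)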
If you look at everything in the post, all of the above apply. Everyone in the domain has seen Q = query, K = key, V = value a billion times, and some variation of (B, N_h, T, D_h). Frankly, I've had enough exposure that after I see (B, N_h, T, D_h) once, I can parse (32, 8, 16, 16) without thinking.
I, like you, found this insane when I started studying stats, but over time I realized there's a lot to be gained once you've trained yourself to speak the language.
This brought up memories of Hungarian notation. I think I'll now try to use it in my PyTorch code to solve a common problem I have with NN code: keeping track of tensor shapes and their meanings.
B, T, E = x.size()  # batch size, sequence length, embedding dimensionality
q, k, v = self.qkv(x).split(self.embedding, dim=-1)
# split heads: (B, T, E) -> (B, heads, T, E // heads)
q, k, v = map(lambda y: y.view(B, T, self.heads, E // self.heads).transpose(1, 2), (q, k, v))
attention = (q @ k.transpose(-2, -1)) * (1.0 / math.sqrt(k.size(-1)))
...
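Shape suffixes on the names themselves might push the idea further. A self-contained sketch of what I have in mind (dimensions and names invented, just to show the convention):

import math
import torch

B, T, E, H = 32, 16, 128, 8                              # batch, time, embed, heads
x_BTE = torch.randn(B, T, E)
qkv = torch.nn.Linear(E, 3 * E)
q_BTE, k_BTE, v_BTE = qkv(x_BTE).split(E, dim=-1)
q_BHTD = q_BTE.view(B, T, H, E // H).transpose(1, 2)     # (batch, heads, time, head_dim)
k_BHTD = k_BTE.view(B, T, H, E // H).transpose(1, 2)
scores_BHTT = (q_BHTD @ k_BHTD.transpose(-2, -1)) / math.sqrt(E // H)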
My takeaway is that the clock is ticking on Claude, Codex, et al.'s AI monopoly. If a local setup can do 90% of what Claude can do today, what do things look like in 5 years?
I think they have already realized this, which is why they are moving towards tool use instead of text generation. Also explains why there are no more free APIs nowadays (even for search)
This seems like a classic time vs space trade off.
Instead of reconstructing a "wide event" from multiple log lines with the same request id, the suggestion seems to be logging wide events repeatedly to simplify reconstruction from request ids.
I personally don't see the advantage, and in either scenario, if you're not logging what's needed, you're screwed.
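To make the trade-off concrete, a rough sketch (field names invented): the narrow approach emits several small lines you later join on request_id, while the wide-event approach accumulates context and emits one fat line per request.

import json, logging, time, uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("app")

def handle_request_narrow(user_id):
    # several narrow lines, stitched back together later by request_id
    request_id = str(uuid.uuid4())
    log.info(json.dumps({"request_id": request_id, "event": "request_start", "user_id": user_id}))
    log.info(json.dumps({"request_id": request_id, "event": "db_query", "duration_ms": 12}))
    log.info(json.dumps({"request_id": request_id, "event": "request_end", "status": 200}))

def handle_request_wide(user_id):
    # one wide event: everything about the request in a single line
    event = {"request_id": str(uuid.uuid4()), "user_id": user_id, "start": time.time()}
    event["db_duration_ms"] = 12          # accumulated as the request progresses
    event["status"] = 200
    event["duration_ms"] = round((time.time() - event["start"]) * 1000, 2)
    log.info(json.dumps(event))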
Eternal September came up in conversation today, about how users don't do effort posts any longer; they just want to leave funny comments below reaction videos and then swipe to the next one.
Anyone got any good effort post oases I can lurk and help out in?