Hacker News | smj-edison's comments

https://github.com/smj-edison/zicl

Porting/reimplementing a Tcl interpreter from C to Zig, based on the design of Jimtcl. This is one of those sub-projects that spun out of another project (folk.computer, in this case). The biggest differences are thread-safe value sharing and (soon) lexical variable capture.

But why? Right now folk.computer pays about a 20% overhead serializing and deserializing values as they get sent between threads, and it also means we can't send large amounts of data around. I previously attempted to make the Jimtcl interpreter thread-safe, but it ended up being slower than the status quo. So, I started hacking on a new interpreter.

Commands evaluate, basic object operations are in place, but there's still a ton of work to do in order to implement core commands. It may even be good enough to swap in some day!
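zicl itself is written in Zig, but the core idea of thread-safe value sharing can be sketched in Rust: an immutable value behind an atomic refcount crosses threads with a pointer copy and a refcount bump, instead of a serialize/deserialize round trip. This is purely illustrative, not zicl's actual design:

```rust
use std::sync::Arc;
use std::thread;

fn main() {
    // A large immutable value. Sharing it between threads costs one
    // atomic refcount increment per thread, not a full copy or a
    // serialize/deserialize pass.
    let value: Arc<Vec<u64>> = Arc::new((0..1_000_000).collect());

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let v = Arc::clone(&value); // cheap: bumps the refcount
            thread::spawn(move || v.iter().sum::<u64>())
        })
        .collect();

    for h in handles {
        let sum: u64 = h.join().unwrap();
        println!("{sum}");
    }
}
```

The tradeoff is that shared values must be immutable (or internally synchronized), which is roughly why bolting thread-safety onto an interpreter with mutable shared objects is hard.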


I'd distinguish between physical art and digital art, tbh. Physical art already grappled with being automated away with the advent of photography, but people still buy physical art because they like the physical medium and want to support the creator. Digital art (for one-off needs), however, is in a trickier spot, since I think that's where AI is doing the displacing. It's not making masterpieces, but if someone wanted a picture of a dwarf for a D&D campaign, they'd probably generate it instead of contracting it out.

Right, but the question then is, would it actually have been contracted out?

I've played RPGs, I know how this works: you either Google image search for a character you like and copy/paste and illegally print it, or you just leave that part of the sheet blank.

So it's analogous to the "make a one-off dashboard" type uses from that programming survey: the work that's being done with AI is work that otherwise wouldn't have been done at all.


Every time I've tried visual programming for anything other than data flow, I've found it way more painful than just typing code. It's particularly hard to follow control flow, and organization is a pain. So I'm not convinced that visual programming (as conventionally envisioned) is the future. That said, I'm a big fan of making use of visual intuition, at least how Dynamicland and Folk Computer envision it (they also have the datalog-like reactive database that glues different programs together in a nicer way than data flow).


It'll be interesting to see whether OpenAI is able to enforce the guardrails that Disney would want...

They have to consider China. Right now Z-Image Turbo lets you render stills of any popular cartoon character you like, at frankly-disturbing levels of quality, doing almost anything you like. That's a relatively tiny 6B-parameter model. If and when a WAN 2.2-level video model is released with a comparable lack of censorship, that will be the end of Disney's monopoly on pretty much any character IP.

Also, notice how Disney jumped all over Gemini's case before the ink was dry on the OpenAI partnership agreement. My guess is that Altman is just using Disney to attack his competitors, basically the 'two' part of a one-two punch that began by buying up a large portion of the world's RAM capacity for no valid business reason.


I'd put Noita in that category. I usually describe it as "broken both ways", because (as a roguelike) you have very little healing, and the enemies are punishingly hard. Not only that, but it's a full falling-sand-plus-physics simulation, so certain elements will randomly combine and kill you in the most unexpected and spectacular ways. On the flip side, the wand system is near Turing-complete, and gets abused in the craziest ways, to the point that you can do millions of damage per tick. One of the most chaotic and fun games I've played!

What form of interfaces would you want? Something trait-based? Rust's orphan rule has bitten me many times now, and it causes consolidation around certain libraries. Something like Go's interfaces, where the signature just needs to match? That would be nicer than the current situation with `anytype`, but I don't know if it's better enough to justify a full interface system. Curious to hear your thoughts.

Essentially enough syntactic sugar so you could write eg the Allocator interface without manually specifying the vtable and the opaque pointer.

But yeah, Go’s system is nice and simple. I am not sure, but I think the fact that Zig programs are a single compilation unit might have some bearing on the orphan rule. There is no concept of crates so “traits”/interfaces can be defined and implemented anywhere.


That's a good point that it's all a single compilation unit, which removes the orphan-rule problem. But it still leaves the issue that there could be multiple different implementations of a trait, each from a different dependency.

Though, I am seeing your point that a simple interface system would be enough for something like the allocator interface, or the hash map interface.
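For anyone unfamiliar with the pattern this thread is about: the hand-rolled "interface" is an opaque context pointer plus a struct of function pointers, which Zig's std.mem.Allocator spells out by hand. A rough Rust transliteration of the shape (all names here are made up; the real Allocator differs in detail):

```rust
// A "vtable": plain function pointers that each take the opaque context.
struct CounterVTable {
    increment: unsafe fn(ctx: *mut ()),
    get: unsafe fn(ctx: *const ()) -> u64,
}

// The "interface" value: opaque pointer + vtable, passed around together.
struct CounterIface {
    ctx: *mut (),
    vtable: &'static CounterVTable,
}

impl CounterIface {
    fn increment(&self) { unsafe { (self.vtable.increment)(self.ctx) } }
    fn get(&self) -> u64 { unsafe { (self.vtable.get)(self.ctx) } }
}

// One concrete implementation, wired up manually.
struct SimpleCounter { count: u64 }

unsafe fn simple_increment(ctx: *mut ()) {
    (*(ctx as *mut SimpleCounter)).count += 1;
}
unsafe fn simple_get(ctx: *const ()) -> u64 {
    (*(ctx as *const SimpleCounter)).count
}

static SIMPLE_VTABLE: CounterVTable = CounterVTable {
    increment: simple_increment,
    get: simple_get,
};

fn main() {
    let mut c = SimpleCounter { count: 0 };
    let iface = CounterIface {
        ctx: &mut c as *mut SimpleCounter as *mut (),
        vtable: &SIMPLE_VTABLE,
    };
    iface.increment();
    iface.increment();
    assert_eq!(iface.get(), 2);
}
```

The "syntactic sugar" being asked for upthread is essentially having the compiler generate this boilerplate (the casts, the vtable struct, the forwarding methods) from a signature.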


Allocator can be implemented as one function pointer - recallocarray, see https://man.openbsd.org/calloc.3
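A sketch (in Rust, to match the other examples in this thread) of the one-function-pointer idea: allocate, resize, and free all go through a single realloc-style entry point. This only captures the shape; OpenBSD's actual recallocarray also takes overflow-checked element counts and handles zeroing of freed regions, which this hypothetical `system_recalloc` omits:

```rust
use std::alloc::{alloc_zeroed, dealloc, realloc, Layout};

// One entry point covers the whole allocator interface:
//   allocate -> f(null, 0, n)
//   resize   -> f(ptr, old, new)  (grown bytes are zeroed)
//   free     -> f(ptr, old, 0)
type RecallocFn = unsafe fn(*mut u8, usize, usize) -> *mut u8;

unsafe fn system_recalloc(ptr: *mut u8, old_size: usize, new_size: usize) -> *mut u8 {
    if new_size == 0 {
        // Free path.
        if !ptr.is_null() {
            dealloc(ptr, Layout::from_size_align(old_size, 1).unwrap());
        }
        return std::ptr::null_mut();
    }
    if ptr.is_null() {
        // Fresh, zeroed allocation.
        return alloc_zeroed(Layout::from_size_align(new_size, 1).unwrap());
    }
    // Resize; recallocarray zeroes the newly grown region, realloc does not.
    let new_ptr = realloc(ptr, Layout::from_size_align(old_size, 1).unwrap(), new_size);
    if !new_ptr.is_null() && new_size > old_size {
        std::ptr::write_bytes(new_ptr.add(old_size), 0, new_size - old_size);
    }
    new_ptr
}

fn main() {
    unsafe {
        let f: RecallocFn = system_recalloc;
        let p = f(std::ptr::null_mut(), 0, 4);
        *p = 42;
        let p = f(p, 4, 8);
        assert_eq!(*p, 42);      // old contents preserved
        assert_eq!(*p.add(7), 0); // grown tail zeroed
        f(p, 8, 0); // free
    }
}
```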

Oh my gosh, that would be incredible! In one of my Rust projects, I used enum dispatch so simple functions could be inlined, but it relied on a slightly hacky macro that couldn't do static dispatch. One of those things that comptime matches very well.
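The enum-dispatch technique mentioned here, written out by hand in Rust (the Shape types are hypothetical): a closed enum plus a match gives the compiler direct calls it can inline, where a `Box<dyn Trait>` forces a virtual call through a vtable:

```rust
trait Shape {
    fn area(&self) -> f64;
}

struct Circle { r: f64 }
struct Square { side: f64 }

impl Shape for Circle {
    fn area(&self) -> f64 { std::f64::consts::PI * self.r * self.r }
}
impl Shape for Square {
    fn area(&self) -> f64 { self.side * self.side }
}

// The enum stands in for `dyn Shape`: no heap allocation, no vtable,
// and each match arm is a direct (inlinable) call.
enum AnyShape {
    Circle(Circle),
    Square(Square),
}

impl Shape for AnyShape {
    fn area(&self) -> f64 {
        match self {
            AnyShape::Circle(c) => c.area(),
            AnyShape::Square(s) => s.area(),
        }
    }
}

fn main() {
    let shapes = vec![
        AnyShape::Circle(Circle { r: 1.0 }),
        AnyShape::Square(Square { side: 2.0 }),
    ];
    let total: f64 = shapes.iter().map(|s| s.area()).sum();
    println!("{total:.4}");
}
```

The downside (and where macros like the enum_dispatch crate come in) is writing the forwarding `match` by hand for every trait method, which is exactly the kind of boilerplate comptime could generate.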

I see people say that if you like C you'll like Zig, and if you like C++ you'll like Rust. I kinda disagree, since both languages take a lot from C and C++.

Rust feels like "let's take the best safety features from both", while Zig feels like "let's take the best transparency from both". In a way, Rust's traits are more foreign to C++ than Zig's comptime, as Zig has similar duck typing (though the ergonomics are much cleaner, especially with @compileError).


My brother is studying economics right now, and he said everyone could use some basic economics knowledge, because getting an intuition for how markets work really helps you as you're looking for jobs and navigating around companies. Maybe business knowledge is better, but I'm personally biased towards the empiricism of economics :) You're onto something though about the need for awareness of how companies think and work.

