Hacker News | jabowery's comments

yeah because the amount of money a government spends on technology is correlated with its rate of progress!!!

https://jimbowery.blogspot.com/2017/07/fusion-energy-prize-a...

https://youtu.be/boLdXiLJZoY


Replace the 16th Amendment with a single tax on net assets at the interest rate on government debt, assessed at their liquidation value ... and use the revenue to privatize government with a citizen's dividend.

https://ota.polyonymo.us/others-papers/NetAssetTax_Bowery.tx...
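
A minimal back-of-the-envelope sketch of the proposal's arithmetic (all figures below are hypothetical placeholders, not numbers from the linked paper):

    # Tax base: net assets at liquidation value; rate: interest rate on government debt.
    net_assets_liquidation_value = 500_000   # hypothetical individual net assets, $
    gov_debt_interest_rate = 0.04            # hypothetical interest rate on government debt
    net_asset_tax = net_assets_liquidation_value * gov_debt_interest_rate
    print(f"annual net asset tax: ${net_asset_tax:,.0f}")          # $20,000

    # The revenue funds privatized services and a citizen's dividend.
    total_revenue = 2.0e12                   # hypothetical aggregate revenue, $
    citizens = 250e6                         # hypothetical eligible population
    print(f"citizen's dividend: ${total_revenue / citizens:,.0f} per citizen")   # $8,000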

When we got a law passed to privatize space launch services back in 1990

https://www.youtube.com/watch?v=boLdXiLJZoY

we were in the midst of a quasi-depression so I decided to address the problem of private capitalization of technology with the aforelinked proposal.


Schemes like this tend to move assets to more tax-favourable locations and otherwise cause widespread restructuring to minimise the loss. Taxation is inherently complicated.


Not only that, but it would incentivize people to not own assets. People would live paycheck to paycheck, they’d rent longer, and they’d waste money on lavish trips or food (at least until the next economic downturn when they get evicted and lose everything).


In identical twins, autism and homosexuality both show about the same discordance rate between the twins: roughly 50%. This means, of course, that autists and gays are "born, not made" by the environment. Move along. Nothing to see here.


Don't count on it. My experience is that funding goes to those who are not serious about autism epidemiology. Back in the mid-1990s, I was at a startup in Silicon Valley with about 100 employees where, over a period of a few years, 5 of the employees had children diagnosed with autism severe enough that they were barely verbal at best. This struck me as a great opportunity to discover the cause, so I contacted a Berkeley epidemiologist who had been funded to do autism research. His comment was simply, "Yes, we know that these microclusters exist," and that was that. No follow-up.


This is reminiscent of an argument I had with the Mercury Prolog guys regarding "typing" in logic programming. My point boils down to this:

Any predicate can be considered a constraint. Types are constraints. While it may be reasonable to have syntactic sugar for type declarations that, at compile time, is transformed into predicates, it is unreasonable to lard a completely different kind of semantics on top of an already adequate semantics such as first order logic.

https://groups.google.com/g/comp.lang.prolog/c/8yJxmY-jbG0/m...
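
To make the point concrete, here is a minimal Python sketch (not Mercury or Prolog syntax; the names are illustrative) of a type declaration being desugared into an ordinary predicate and then checked like any other constraint:

    # A "type" is just a unary predicate over ground terms.
    def integer(x):
        return isinstance(x, int) and not isinstance(x, bool)

    # Another constraint of exactly the same kind.
    def greater_than_3(x):
        return x > 3

    # A declaration like ":- pred p(integer)" desugars, at compile time,
    # into prepending the type predicate to the clause body:
    #     p(X) :- integer(X), X > 3.
    def desugar(type_pred, body):
        return lambda x: type_pred(x) and body(x)

    p = desugar(integer, greater_than_3)

    print(p(10))     # True:  passes the (former) type declaration and the body
    print(p(2))      # False: fails the body
    print(p("ten"))  # False: fails the type predicate, now just another goal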


Interval Arguments: Two Refutations of Cantor’s 1874 and 1878 Arguments:

https://www.academia.edu/93528167/Interval_Arguments_Two_Ref...


It's always fun to make fun of cranks. Thanks for linking that. The author really needs to find the right statement of what they call the Nested Interval Theorem. I cracked up at the complete misuse of it in the "Interval Argument for the Rationals" section.


Dear Claude 3, please provide the shortest python program you can think of that outputs this string of binary digits: 0000000001000100001100100001010011000111010000100101010010110110001101011100111110000100011001010011101001010110110101111100011001110101101111100111011111011111

Claude 3 (as Double AI coding assistant): print('0000000001000100001100100001010011000111010000100101010010110110001101011100111110000100011001010011101001010110110101111100011001110101101111100111011111011111')
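
For contrast, a much shorter program exists: the string is the 5-bit binary representations of 0 through 31, concatenated, so a few lines of ordinary Python reproduce it exactly:

    # The 160-bit target, split for readability.
    target = ('0000000001000100001100100001010011000111'
              '0100001001010100101101100011010111001111'
              '1000010001100101001110100101011011010111'
              '1100011001110101101111100111011111011111')

    # The 5-bit binary codes of 0..31, concatenated, reproduce it.
    short = ''.join(f'{i:05b}' for i in range(32))
    assert short == target
    print(short)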


Learning theory is the attempt to formalize natural science up to decision. Natural science's unstated assumption is that a sufficiently sophisticated algorithmic world model can be used to predict future observations from past observations. Since this is the same assumption Solomonoff makes in his proof of inductive inference, you have to start there: with Turing complete coding rather than Rissanen's so-called "universal" coding.

It's ok* to depart from that starting point in creating subtheories but if you don't start there you'll end up with garbage like the last 50 years of confusion over what "The Minimum Description Length Principle" really means.

*It is, however, _not_ "ok" if what you are trying to do is come up with causal models. You can't get away from Turing complete codes if you're trying to model dynamical systems, even though dynamical systems can be thought of as finite state machines with very large numbers of states. In order to make optimally compact codes you need Turing complete semantics executing on a finite state machine that just so happens to have a really large but finite number of flip-flops, or some other directed cyclic graph of universal gates (e.g. NOR, NAND).
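
A toy Python illustration of why a Turing complete description can be far shorter than a "universal" (entropy-based) code for algorithmically simple data (this is not a formal MDL or Solomonoff construction; program text length is just a crude stand-in for description length):

    import math

    # An algorithmically simple string: 0..255 in 8-bit binary, concatenated.
    data = ''.join(f'{i:08b}' for i in range(256))   # 2048 bits

    # Order-0 (memoryless) code length: n * H(p), the best any fixed
    # symbol-frequency code can do without modelling the structure.
    p1 = data.count('1') / len(data)
    entropy_bits = len(data) * -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

    # Description length of a short generating *program* for the same string.
    program = "''.join(f'{i:08b}' for i in range(256))"
    program_bits = 8 * len(program)

    print(f"order-0 code length ~ {entropy_bits:.0f} bits")   # ~2048 bits
    print(f"generating program  ~ {program_bits} bits")       # ~312 bits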


Code...Matters...

You can tell everyone "who is anyone" is in hysterics when this gets virtually no attention at Salamandar's old stomping ground here at ycombinator.


Xanadu, folks. It's quite a tragedy that Wired Magazine's article failed to uncover the real reason Xanadu failed to become the WWW (and hence why JavaScript, rather than Smalltalk, became the web's scripting language, etc.).

https://www.wired.com/1995/06/xanadu/


What's the real reason Xanadu failed to become the WWW?


I read a couple of Xanadu papers recently and my conclusion is that it failed to get big because it was mostly vaporware, and when it finally delivered something, it was much less than the press had promised. The papers are interesting to read, but brilliant non-beings will always lose to more pedestrian beings. The story reminded me quite a bit of Chandler in "Dreaming in Code" by Scott Rosenberg.

Don't get me wrong, if you read Ted Nelson writing about Xanadu uncritically, you'll get a tale of utopia denied and genius tortured, but the reality seems much more prosaic.

Edit: I should also add that the web as originally built has real advantages over Xanadu. In order to implement its universal transclusion and DRM (yes, Xanadu had a scheme for DRM and micropayments to creators), Xanadu had to be centralized. I'd argue this is worse both socially and technologically. Adding DRM to the infrastructure of the web is something that I would really hate.


I was there (you'll find my name in the Wired article), and on the whole I would agree that Xanadu's reach far exceeded its grasp. Compared to the simplicity of the http protocol, Xanadu's complexity was high enough and its performance low enough that there was little opportunity for a genuine competition.

But I will say that Xanadu was conceptually not centralized; the peer-to-peer exchange of arbitrary information at scale was definitely part of the architecture. However, the major and systemic performance problems entirely prevented any scaling up of the system, which effectively means the distributed architecture was never proven.

I agree to a certain extent with the Chandler analogy, insofar as there was a lot of "architecture astronautics" that added complexity to the system beyond the ability of the team to manage -- especially given the limitations of early 1990s development machines.

One could refer to the article itself for Walker's own view of the sad outcome:

'Rather than push their product into the marketplace quickly, where it could compete, adapt, or die, the Xanadu programmers intended to produce their revolution ab initio.

'“When this process fails,” wrote Walker in his collection of documents from and about Autodesk, “and it always does, that doesn’t seem to weaken the belief in a design process which, in reality, is as bogus as astrology. It’s always a bad manager, problems with tools, etc.—precisely the unpredictable factors which make a priori design impossible in the first place.”'

He wasn't wrong. Xanadu tried to leap fully formed into the world as a megalithic architecture capable of arbitrarily large data structures supporting arbitrarily small comparisons and transclusions, and it couldn't compete with HTTP's fully open specification and implementations, low barrier to entry, and extreme simplicity.


I appreciate the boots-on-the-ground perspective, so thanks for posting! I do want to be clear that I appreciate the research and enjoy reading the papers produced by Xanadu. My goal was never to belittle the project itself, just to talk about the reasons history played out as it did.


No worries, I didn't interpret your comment as belittlement. I agree the project was over-ambitious and overly complex, but it was also visionary and influential.


"In order to implement its universal transclusion and DRM (yes, Xanadu had a scheme for DRM and micropayments to creators), Xanadu had to be centralized."

Fallback positions from the idealized "roadmap" are what happens when VCs get involved with a system that offers that Zero To One advantage -- but you have to have a One to offer the VCs, which Memex didn't. The question then becomes how much of your roadmap can be recovered or, perhaps more to the point, how much of it you even _want_ to recover in the light of ground-truth experience. At present there is a lot of potential for Information Centric Networking that would be more likely realized in a Ship-Dumbed-Down-Decentralized-Xanadu1994 alternative universe than is likely to be realized now.


One man's vaporware is another man's roadmap. Think about it like this:

Why was Brendan Eich under such pressure from VCs to throw together a scripting language over a weekend?


Not "from VCs". Marc was not a VC then, Netscape's investors didn't direct any of our strategic or tactical moves. Bill Joy at Sun also supported JS as scripting language for Java, and signed the trademark license. Excerpt on the early days and why we did JS (part of 3 hour Lex Fridman interview):

https://www.youtube.com/watch?v=S0ZWtsYyX8E

HOPL paper: https://hopl4.sigplan.org/details/hopl-4-papers/10/JavaScrip...


I stand corrected (and also note the "weekend" was really 10 days). Jim Clark* apparently gave Marc Andreessen the necessary latitude.

* The other person with an office opening into Keith's work area at Memex was Ron Resch https://www.historyofcg.com/pages/university-of-utah/


1994: In the next room from me at Memex Corp., poor Keith Henson was draped over a chair (due to a bad back) working, alone, on the C++ Xanadu code to debug garbage collection among other things, because the original Smalltalk source had been lost. Memex Corp. was early enough in the development of HTTP's lock-in network effects that its acquisition of Xanadu _might_ yet have turned the tide.

Why had the Smalltalk code been lost? Well, all I can tell you is that, from my work with Roger (starting in 1996 on a rocket engine), my understanding of events differs from that reported in Wired (and most other accounts, including, to some extent, Roger's own) and involves some pretty, shall we say, "bad behavior" on the part of certain parties that were more than a little partial to C++. Since this is hearsay, I won't go into more depth stating things "as fact". But it is pretty clear to me that, against the effort and investment put into making HTML, JS, etc. de facto standards, Memex's acquisition of the Xanadu rights (and potential willingness to open up the Xanadu protocols and implementation) at that critical juncture was fatally hampered by the C++-only handicap suffered by the Xanadu source.

Why didn't I step in and help poor Keith? Ever heard of Croquet's TeaTime?

https://dl.acm.org/doi/abs/10.1145/1094855.1094861

I was in a position to resurrect at least _that_ much of the original work I'd done at Viewtron Corp. of America based on David P. Reed's PhD thesis, and Reed was just down the street from us at Interval Research at that time, which rather tempted me away from helping Keith, even if I'd been authorized to do so, which I wasn't.

