2024-01-03 15:26:31 sigh. i'm starting to see myself heading into the same spiral of confusion and abstraction that i fell into before walking away from forth many years ago
2024-01-03 15:36:51 Why so, zelgomer? What kind of confusion?
2024-01-03 15:38:47 because to write a metacompiler you end up with three identical vocabularies: one for the interpreter that you're writing the metacompiler in, one for the metacompiler, and one for the target, and that's just too confusing for me
2024-01-03 15:39:55 I see. Yeah, metacompiling seems to be the hardest thing in Forth.
2024-01-03 15:41:01 and then to get compile-time execution when you're building your target image, you have to have words to switch back and forth between target and host mode. to do it seamlessly, you have to compile a more abstract representation, like a list of objects that can be traversed and interpreted in interpret mode, or traversed and compiled. so you end up writing forth in a forth in a forth, and it's just too
2024-01-03 15:41:08 confusing to me. and add to that that forth is challenging for me to be effective with in the first place
2024-01-03 15:41:11 e.g., i really miss my structs
2024-01-03 15:41:33 cmForth supposedly has a particularly clean metacompiler - have you researched it?
2024-01-03 15:42:00 no, i have absolutely no interest in that color forth stuff. that sounds like hell to me
2024-01-03 15:42:15 I don't think cmForth is colorForth - they're different things.
2024-01-03 15:42:21 oh
2024-01-03 15:42:30 cmForth came earlier in Forth's history.
2024-01-03 15:42:57 https://dl.acm.org/doi/pdf/10.1145/382125.382916 has a few bits on cmFORTH's metacompiler
2024-01-03 15:45:18 I assume it still has some way to specify whether compilation should be for the host or the target; I doubt there's any way around having to call that out somehow.
2024-01-03 15:47:45 IIRC, it uses { to switch between host & target
2024-01-03 15:48:56 ah, i was debating that.
i settled on [[ ( host mode ) and ]] ( target mode )
2024-01-03 15:49:09 seemed kind of analogous to [ and ]
2024-01-03 15:49:49 in cmforth, { toggles between modes, } was defined as an alias
2024-01-03 15:51:15 sheesh... that is also how i did it at first. but then i knew at some point i would imbalance them by mistake and it would get to be super confusing, so instead i made ]] switch to target mode or no-op if already in target mode, and ditto for [[ and host mode
2024-01-03 15:52:50 maybe i'll keep moving forward with what i'm doing, then. i think it just falls short of my (maybe misaligned) expectations going into it
2024-01-03 16:00:56 If I understand it right, cmForth separates interpreter and compiler into completely separate loops - the interpreter is running, but one of the things it can run is the compiler. And when it is compiling, there is an extra vocabulary that gets searched, before searching the "regular" vocabularies. I've assumed that the host compiler and the target compiler would be the same compiler word, but might use
2024-01-03 16:00:58 different extra vocabularies.
2024-01-03 16:01:38 Anyway, it seems to me it's heavily driven by vocabularies. If you're compiling to the target image, you'd better be looking up the words you compile in the target dictionary, and likewise for the host.
2024-01-03 16:02:08 yeah, that's what i'm doing. but it means that when you write a word for the target, you can't execute it at compile time unless you switch to host mode and write a host version of it
2024-01-03 16:02:48 Yeah, that extra vocabulary would contain the immediate words for the appropriate compile target.
2024-01-03 16:03:03 crc did your multiple vm implementations give you the knowledge to create a source to source compiler (transpiler)?
2024-01-03 16:03:41 like bytecode generation is after all code generation
2024-01-03 16:05:24 vms14: To some extent any modification of the dictionary structure is akin to "code generation."
May or may not be "literal," but I think of anything that 'crafts the dictionary' as being in that category.
2024-01-03 16:06:29 But the dictionary is really just a data structure, so it's different only in the eye of the beholder.
2024-01-03 16:06:35 I'm not really interested in transpiling my forth source to other languages
2024-01-03 16:06:49 For the VM, I could probably write a generic model that could be transpiled, but it'd not be much less effort to write the transpilers than just implementing the vm by hand
2024-01-03 16:06:52 I don't expect to be interested in that either.
2024-01-03 16:07:25 crc but once you have that generic model you can implement it in itself
2024-01-03 16:07:42 vms14: my son is working towards that
2024-01-03 16:07:45 and it would get transpiled to all the languages you have
2024-01-03 16:08:12 i prefer source code generation over a vm
2024-01-03 16:08:38 but i assume bytecode generation is a good start to learn about code generation anyways
2024-01-03 16:09:49 that's why i wonder if what you have learned over all this time with your vm implementations has a lot to share with source code generation
2024-01-03 16:10:05 and it does really help with understanding and learning
2024-01-03 16:10:15 My main goal is to dispense with other languages and tools. So transpiling is somewhat moving in the wrong direction for me.
2024-01-03 16:10:35 If I transpile to source, then I need a compiler for that source.
2024-01-03 16:11:12 in your case it would be similar to having a forth assembler
2024-01-03 16:11:14 Not that I object to transpiling in general. My favorite one is verilator.
2024-01-03 16:11:43 and having your forth written on top of that assembler
2024-01-03 16:11:54 It transpiles Verilog to C, and apparently produces cycle-accurate models for quite large and complex circuits.
2024-01-03 16:12:19 that assembler could be extended to several other devices/platforms
2024-01-03 16:12:48 It's kind of like that, though exactly how clean and complete that "assembler" will be remains to be seen.
2024-01-03 16:12:50 your forth would automatically extend to them since it's written in that assembly
2024-01-03 16:13:15 A perfectly valid way to do it is to just emit bytes into the dictionary, which basically means you're assembling by hand.
2024-01-03 16:13:21 and forth being able to have immediate words and mess with itself, it could be fun
2024-01-03 16:13:29 But yeah, one way or another you have to get your primitive words into the dictionary.
2024-01-03 16:14:39 stupid question: is the order in which 2! or 2@ cells appear in memory defined?
2024-01-03 16:15:05 That sounds like endian stuff?
2024-01-03 16:15:07 as years go by i'm finding it increasingly difficult to find details on words from older forths
2024-01-03 16:15:31 I think double precision ints should observe the same byte order as standard ints.
2024-01-03 16:15:37 zelgomer: big first then little on big endian
2024-01-03 16:15:54 ^ Right. And opposite that on little endian.
2024-01-03 16:16:45 and strings start at a higher address and go to lower addresses on little endian
2024-01-03 16:17:06 Oh, I've never done that.
2024-01-03 16:17:27 I see what it would be trying to get at, though.
2024-01-03 16:17:48 so is the rest of the little-endian stuff
2024-01-03 16:17:49 Make it so strings sort the same way numbers do?
2024-01-03 16:18:13 then you have the wacky pdp middle-endian
2024-01-03 16:19:05 2! and 2@ are not always used for double-precision numbers, though. sometimes they're used just for two independent cells, as in the article crc linked
2024-01-03 16:19:36 But that's just kind of hijacking them for an extra purpose, isn't it?
2024-01-03 16:19:50 The reason they're *there* is double precision?
2024-01-03 16:19:58 i don't know, is it?
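[editor's note] The "emit bytes into the dictionary" point above can be made concrete with a toy model: the dictionary is just a growable byte area plus a linked list of name headers. This is a Python sketch, not any real Forth's internals, and all the names (comma, create, find) only loosely mirror the traditional words:

```python
# Minimal model of a Forth-style dictionary as a plain data structure.
class Dictionary:
    def __init__(self):
        self.mem = bytearray()   # the dictionary's data space
        self.latest = None       # link to the most recently defined header

    def here(self):
        # next free address in data space, like HERE
        return len(self.mem)

    def comma(self, byte):
        # append one byte to data space, roughly like C,
        self.mem.append(byte & 0xFF)

    def create(self, name):
        # a header is (link, name, code-field address); linking each
        # header to the previous one gives the classic search chain
        self.latest = {"link": self.latest, "name": name, "cfa": self.here()}

    def find(self, name):
        # walk the link chain newest-first, like a vocabulary search
        entry = self.latest
        while entry is not None:
            if entry["name"] == name:
                return entry
            entry = entry["link"]
        return None

d = Dictionary()
d.create("dup")
for b in (0x48, 0x89, 0xC3):     # hand-"assembled" bytes, arbitrary here
    d.comma(b)
assert d.find("dup")["cfa"] == 0  # the code field points at those bytes
assert d.find("nope") is None
```

"Assembling by hand" is then just calling comma with opcode bytes you worked out yourself; a real assembler vocabulary only automates producing those bytes.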
2024-01-03 16:20:08 nope, two cell access
2024-01-03 16:20:14 I thought so, but... I guess I don't know.
2024-01-03 16:20:42 The standard specifies the order https://forth-standard.org/standard/core/TwoFetch
2024-01-03 16:20:46 if it's just two cell access then endianness wouldn't be my first consideration, hence the question
2024-01-03 16:21:11 GeDaMo: and many of us ignore it
2024-01-03 16:21:42 GeDaMo: i don't know what that standard is or where it comes from, but i've found it to be nothing but misdirection when i'm searching for traditional forths' behavior
2024-01-03 16:22:01 That just seems... odd to me. But there it is - there's no figuring standards.
2024-01-03 16:22:49 I'm pretty sure the historical origin was in doubles.
2024-01-03 16:23:08 Lemme see if I can find something...
2024-01-03 16:24:38 source generation doesn't really help with having an interactive, self-sufficient environment.
2024-01-03 16:25:36 in my experience, whatever forth-standard.org is, it seems to be more harmful to forth than good. rather than preserving forth history, it tends to clutter my search results with modern things that don't interest me
2024-01-03 16:25:41 zelgomer: there's an American National Standard for Forth from back in the day; this site documents it and is apparently trying to update it
2024-01-03 16:25:44 Ok, in my old FIG-era reference it specified that the high order half of the number would be uppermost on the stack (i.e., DUP would duplicate the high order half).
2024-01-03 16:26:54 GeDaMo: but it doesn't document most of the now-obscure words that i want to read about, and instead introduces new obscure words that i don't care for
2024-01-03 16:27:33 but it keeps coming up in my search results, which only makes my search more difficult
2024-01-03 16:27:59 KipIngram: but what about in memory?
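[editor's note] The linked forth-standard.org page does answer the in-memory question: 2@ is specified as equivalent to DUP CELL+ @ SWAP @, so the cell at the lower address comes back on top of the stack, and 2! stores the top-of-stack cell at the lower address. Combined with the FIG-era convention quoted above (high-order half uppermost on the stack), a double's high-order cell lands at the lower address regardless of the machine's byte endianness. A minimal Python model over a cell-addressed memory (cell indices, not bytes, so byte order never enters into it):

```python
# Cell-addressed toy memory modeling the standard's 2! / 2@ cell order.
CELLS = 8
mem = [0] * CELLS

def two_store(x1, x2, addr):
    # 2!  ( x1 x2 a-addr -- ): x2, the top of stack, goes to a-addr,
    # and x1 goes to the next consecutive cell
    mem[addr] = x2
    mem[addr + 1] = x1

def two_fetch(addr):
    # 2@  ( a-addr -- x1 x2 ): equivalent to DUP CELL+ @ SWAP @,
    # so the cell at the lower address ends up on top
    return mem[addr + 1], mem[addr]

# A double keeps its high half uppermost on the stack, so 2! puts
# the high-order cell at the lower address:
low, high = 0x1111, 0x2222
two_store(low, high, 0)
assert mem[0] == high and mem[1] == low
assert two_fetch(0) == (low, high)
```

For two independent cells (the "hijacked" usage mentioned above) the same rule applies mechanically: the top cell goes to, and comes from, the lower address.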
2024-01-03 16:28:12 zelgomer: if it helps, I have a few older forth documents at http://forthworks.com:8800/forth
2024-01-03 16:28:22 Do you know about https://forth.sourceforge.net/mirror/comus/index.html and http://www.wilbaden.com/neil_bawd/ ?
2024-01-03 16:31:09 did not know about any of these links. thanks
2024-01-03 16:32:21 I don't see yet how the double store and fetch words operated. But I've always looked at this as something driven by your processor. It's going to be either big endian or little endian, and I'd conform all my words to that, both on-stack and in-RAM.
2024-01-03 16:32:33 It seems obvious to me.
2024-01-03 16:33:06 the x86, for example, can do 8, 16, 32, or 64 bit accesses, and it does them in a particular way.
2024-01-03 17:25:57 Looks like interesting stuff is going on in physics. Ever since Newton, the general theme was that space and time (or spacetime) were "given" and then physics was the study of stuff happening on that "stage." But it's looking more and more like spacetime isn't actually fundamental at all, but is in fact part of the "output" of physics. The latest work is focused on other structures fully outside of
2024-01-03 17:25:59 spacetime, from which spacetime and quantum mechanics "emerge" as effects. I'm trying to crack into it, but it's fairly rough going; probably will take several exposures.
2024-01-03 17:27:30 A couple of guys in consciousness studies have been saying something like this for a while - it's interesting to see the high energy physics crowd kind of "converging" with them.
2024-01-03 17:29:53 the folks in consciousness studies are just unaware
2024-01-03 18:00:56 I'm not so sure; I think some of them may be on the right track. Especially since it seems like actual physics is nudging in the same direction. Of course these are hard-core physicists and you won't hear the word "consciousness" come out of their mouths - they're on a thoroughly different track. Scattering amplitudes and things like that.
2024-01-03 18:01:36 And of course plenty of "consciousness folks" are indeed just out in the weeds wandering around.
2024-01-03 19:12:16 I get particularly frustrated with the ones that try to roll out quantum theory or whatever to support their New Age ideas. You really don't start to see the effects of quantum phenomena in the world until you get down to a scale way, way below our day to day lives. Our "big world" works pretty much exactly the way it looks like it works.
2024-01-03 19:12:35 If it didn't, we wouldn't have gotten so far in science "pre-quantum."
2024-01-03 19:12:46 We just didn't need it to make sense of things for quite a few hundred years.
2024-01-03 19:13:12 After we really started getting scientific about things, I mean, which I generally place in the 1600s or so.
2024-01-03 19:13:34 Newton was quite the alchemist
2024-01-03 19:13:55 Though actually the Romans were doing pretty damn well. If it hadn't been for their collapse we'd probably have saved a thousand years or so.
2024-01-03 19:14:10 Sure - even those prominent guys still had their foibles.
2024-01-03 19:14:47 https://acoup.blog/2022/08/26/collections-why-no-roman-industrial-revolution/
2024-01-03 19:14:48 The thing is, in the end the alchemists were right. It actually IS possible to turn lead into gold. It's just a heck of a lot more expensive than it would be to buy the gold.
2024-01-03 19:15:09 Oh, neat - thanks.
2024-01-03 19:15:32 Hero of Alexandria built a "steam engine" (really just a toy, but it proved the principle) as early as 50 AD.
2024-01-03 19:16:11 And it really was the steam engine that brought on the industrial revolution.
2024-01-03 19:17:26 The Romans had no qualms about using slave labor, though, so a steam era wouldn't have really been feasible until steam-based tech became more economical than slave labor.
2024-01-03 19:18:22 I also read somewhere that steam engines didn't really get effective until they started to be coal-powered, and England turned out to have ample coal reserves.
2024-01-03 19:23:06 I generally agree with the claim in that link that it was Roman social and cultural structure that really precluded them having an industrial revolution. They also just lacked "stability" - apparently there was almost no such thing as a peaceful transfer of power from one regime to the next in Rome. There was a lot of instability in their system.
2024-01-03 19:26:33 Very good article.
2024-01-03 19:26:44 Bret does good stuff
2024-01-03 19:30:21 What Rome didn't have, that kept their steam devices in the toy category, was a *science* underpinning the phenomena. It's not a particularly big deal to just see that boiling water makes steam and steam makes pressure. But the details matter, and we really needed a science of thermodynamics before we could actually move the idea forward.
2024-01-03 19:31:05 Once the theory was nailed down, that's when steam engines really started to go.
2024-01-03 19:31:48 Maybe I'm going crazy, but a week ago I was reading a forth implementation and suddenly I "saw" a program. It was a tree structure with each node describing a "part" of it. It's strange because I'm confident I know this, but seeing it was an entirely different experience.
2024-01-03 19:32:03 just like in "The Matrix"
2024-01-03 19:33:18 Clubbed to Death now playing in my head :)
2024-01-03 19:33:36 https://www.youtube.com/watch?v=pFS4zYWxzNA
2024-01-03 19:53:59 Quite a few oopsies in this article - I'm wagering it didn't get a proofread.
2024-01-03 19:54:32 "preconditions which were on true on Great Britain..."
2024-01-03 19:54:46 I'm sure that was meant to be "only true in."
2024-01-03 19:54:59 he's writing more or less for free in his copious spare time
2024-01-03 19:55:00 AI GENERATED!
2024-01-03 19:55:12 naw, bret has an article dissing on AI
2024-01-03 19:55:28 thrig: Sure - I don't really mean to be criticizing him. It's an excellent and insightful article.
2024-01-03 19:56:57 Actually that's exactly the sort of error I would *not* expect from an AI.
2024-01-03 19:57:28 but you can expect typos galore in his blog (maybe his other prose is perfect, so his blog is where he lets his hair down?)
2024-01-03 19:57:50 Yeah, he's probably operating in a more casual, conversational mode.
2024-01-03 19:58:04 And it's really the ideas that are important.
2024-01-03 19:58:31 That was good - thanks again for sharing.