2023-07-30 02:37:23 Man, I'm so confused about how the definition of : works in SmithForth
2023-07-30 02:38:21 There's a word Head, which sets up a new word header in the dictionary and expects [rsi] to point to the length of the word.
2023-07-30 02:39:15 But, IIUC, : just calls pname, which parses a new word, shoving its address in rbp and length in rax.
2023-07-30 02:51:36 xelxebar: it seems to me ':' sets STATE to compiling via ']', then any word would get compiled instead of executed in the interpreter loop (call hit), until ';' restores STATE to interpreting. (guessed from SF comments..)
2023-07-30 02:52:50 next4th: Yeah, after the dictionary is set up, it switches to compiling, which is fine.
2023-07-30 02:53:03 I'm confused about how it's correctly setting up the dictionary header, though.
2023-07-30 02:53:44 ah, okay, that detail i won't look into now :\
2023-07-30 02:54:09 Head expects rsi to point at bytes like [count] [word char 1] [char 2] ..., but rbp just points at [word char 1], right?
2023-07-30 02:59:27 i see the length is in rax from pname and PARSE, and Head has 'xchg eax, r32', maybe it's passed this way..
2023-07-30 03:01:01 Hehe, I see you're actually looking :D Yeah, it's xchg rax, rcx, I believe. However, the instruction just before that is stos, which clobbers rax.
2023-07-30 03:03:36 well, it has "[rdi++] = al stos m8", not sure, but it will keep rax, right?
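The `[rdi++] = al` reading of `stos m8` can be sketched as a toy Python model (registers and memory are just variables here; this is not SmithForth's actual code):

```python
# Toy model of x86 `stos m8` with the direction flag clear:
# write AL (the low byte of RAX) to [RDI], then increment RDI by 1.
# RAX itself is left untouched, which is the point of the question above.

def stos_m8(regs, mem):
    al = regs["rax"] & 0xFF   # AL is the low byte of RAX
    mem[regs["rdi"]] = al     # store AL at the destination index
    regs["rdi"] += 1          # auto-increment; +1 because it's a byte store
    # note: regs["rax"] is never written

regs = {"rax": 0x12345641, "rdi": 0}
mem = bytearray(4)
stos_m8(regs, mem)
# now mem[0] == 0x41, regs["rdi"] == 1, regs["rax"] unchanged
```

This write-a-byte-and-bump-the-pointer step is also exactly what a `C,`-style word does to a dictionary/data pointer, which is relevant to the `.` discussion below.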
2023-07-30 03:04:20 i don't know what 'stos' means yet lol
2023-07-30 03:41:30 next4th: STOS is 'STOre String'; it stores data in AL/AX/EAX/RAX to the destination index (RDI)
2023-07-30 03:41:59 So STOS M8 I would guess means store AL into the byte at RDI
2023-07-30 03:42:57 And it also auto-increments or auto-decrements RDI, depending on the value of the direction flag (most ABIs leave this as auto-increment)
2023-07-30 03:43:23 So yes, that wouldn't affect RAX
2023-07-30 03:44:02 [rdi++] = al is a good summary of the instruction, provided the direction flag is clear
2023-07-30 03:45:06 (also, it goes without saying the 'increment' is based on the size of the store, so if you store a word it increments by 2, etc.)
2023-07-30 03:45:40 (but AL is a byte register, so it increments by 1)
2023-07-30 03:48:20 veltas: Much cleaner explanation. Thank you.
2023-07-30 04:01:27 Hrm. So . essentially writes a byte into the dictionary/data piece of memory?
2023-07-30 04:01:44 Looks like SmithForth uses r15 for its stack pointer.
2023-07-30 04:04:59 I don't know the full context here, but . usually prints a number followed by a space
2023-07-30 04:06:17 Hrm... Okay. I'm clearly not seeing the full picture then.
2023-07-30 04:07:20 Looking at SmithForth's source, I agree with your statement
2023-07-30 04:07:51 [ switches into immediate mode (this is standard), and then . seems to write the number to the code area
2023-07-30 04:08:48 He documents right at the bottom the definition `: .` as 'the normal Forth dot'
2023-07-30 04:09:04 So he obviously uses `.` in a different sense in the compiler
2023-07-30 04:10:56 Oh, nice. It gets redefined at the very end in the Forth-code library.
2023-07-30 04:10:59 When I say "switches into immediate mode" I mean it starts *interpreting* rather than *compiling*
2023-07-30 04:11:23 Do you think the compiler '.' is defined by the assembly code in the machine code dump file?
2023-07-30 04:12:49 Found it: '. ( char -- ) nonstandard name for C,'
2023-07-30 04:13:19 If you search for that in the dump, that's the definition he gives for '.' while bringing the system up
2023-07-30 04:13:34 Yeah, I'm basically looking at the dump file before it gets to sys.
2023-07-30 04:14:15 But instead of looking at the commented version, I just manually dumped the hex of the binary and am working with that.
2023-07-30 04:14:56 The idea is to force myself to actually grok the code and the decisions it makes.
2023-07-30 04:15:31 Oh okay
2023-07-30 04:16:09 Anyway, looking at system.fs, it does look like he's using . to manually write bytes into the data area, which lets him shove opcodes there directly from Forth code.
2023-07-30 04:16:27 It's essentially just C,
2023-07-30 04:16:51 I don't know why he doesn't call it that, unless he can't parse words longer than one character in the bootstrap parser
2023-07-30 04:17:50 But he can, because looking at the Forth source he's using stuff like UM/MOD early on
2023-07-30 04:18:19 I suppose it's a bit easier on the eyes to write [ 01 . 02 . 03 . ]
2023-07-30 04:18:33 That's almost *exactly* the limitation. You can define arbitrary-length words, but the search code just matches on the first character.
2023-07-30 04:19:18 Yeah, the +1 is literally defined as the opcode triplet for `inc [rdi]`
2023-07-30 04:19:56 Ah okay
2023-07-30 04:20:06 Whoops, s/rdi/r15/
2023-07-30 04:20:30 I agree with his statement that he needs to use a top-of-stack register
2023-07-30 04:21:03 He does everything in memory, which is a bit slower; `inc [r15]` could be just `inc rax`
2023-07-30 04:22:31 Ah, good point. After ingesting all this once, I'll have to spend some time digesting the design-level questions.
2023-07-30 04:23:48 My unsolicited advice is that I don't think this exercise is going to help you understand better; I think you should just read the commented source
2023-07-30 04:24:04 But I don't own you, so do what you feel like :)
2023-07-30 04:26:53 Haha. I'm certainly understanding the x86 architecture better than I ever have.
2023-07-30 04:27:48 Which is probably good for writing and understanding Forth implementations, but not so necessary for simply writing good Forth.
2023-07-30 04:28:47 Mostly, this is my way in. I probably don't have enough motivation to learn Forth in a vacuum.
2023-07-30 04:29:51 The main way I've learned x86 is by writing it in assemblers and also actually inspecting the machine code listings
2023-07-30 04:30:30 And this has allowed me to write better, shorter assembly
2023-07-30 09:10:07 xelxebar: i think i just got what you mean for 0x00-0x3f; also, in op[------dw], w is the last bit: 0 for byte width, 1 for full width (32-bit, or 16-bit with a 66h prefix). this http://c-jump.com/CIS77/CPU/x86/X77_0050_add_opcode.htm helps me some.
2023-07-30 09:19:48 next4th: Yeah! Exactly.
2023-07-30 09:23:44 okay, next day i'll read about the op fields and maybe REX; little steps :^)
2023-07-30 10:19:15 Hope you're having fun :) It actually seems like getting to 80% proficiency might be relatively straightforward, at least for the one-byte opcodes. The 0Fxx series still feels pretty intimidating.
2023-07-30 10:48:26 KipIngram: https://journal.stuffwithstuff.com/2015/02/01/what-color-is-your-function/ # Remembrance of operations past
2023-07-30 10:48:44 LtU GoTo
2023-07-30 12:00:07 That's a good article.
2023-07-30 12:01:15 I've had problems grasping the idea of closures, but I think this may have helped. At least it got me thinking of possibilities that "might be right."
2023-07-30 12:04:31 So a closure just relates to a process of computation that hasn't yet completed (so you've got a call stack defining its state), and eventually you let it finish and that's what actually brings you your result?
2023-07-30 12:12:24 See, the idea of a thread not being done yet seems entirely straightforward to me. No mystery there. That's part of what threads are all about - they wait for things sometimes. And obviously you can't use the thing being waited for until it's arrived.
2023-07-30 12:12:55 In all of our discussions of closures before, I don't think anyone had explicitly connected them to blocked threads the way this article did.
2023-07-30 12:13:14 That was a key insight for me - a mental "click."
2023-07-30 12:13:42 I'm sure the idea got "talked around," but I just didn't explicitly see the connection.
2023-07-30 12:17:52 So correct me if I'm wrong, but it looks like in some cases you never actually have to return up through the call stack for those operations. You call down, and get blocked. Meanwhile other stuff is running right along. And then when that block is removed and the result is delivered, that thread could potentially just die, right?
2023-07-30 12:18:23 Its work is done. So the "closure" and all the other fancy business around all this is just a way of keeping track of that whole process?
2023-07-30 12:21:56 Yes, potentially, and that might be Tail-Call Optimization. If has 2 _Continuations_: then and else. In Python: {True: then, False: else}[test("if desired")]("if desired").
2023-07-30 12:23:12 Is it somehow like making a call now that gives you back not a result but a function, and later you can call that second function to get the actual result?
2023-07-30 12:23:52 It's like a roadmap that you can later use to get your result; the roadmap can exist even before the result does.
2023-07-30 12:23:54 ?
2023-07-30 12:23:57 Yes. Currying, Partial Application.
2023-07-30 12:24:09 See, those are terms that just further confuse me.
2023-07-30 12:24:16 This happens to me reading math papers all the time.
2023-07-30 12:24:49 Math is just full of ideas that are ultimately pretty straightforward, but they dress it up in so much fancy language that I wind up not being able to assemble a large enough collection of "understood words" to get to the meaning.
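The op[------dw] encoding bits that came up earlier in the log can be illustrated with a tiny Python decoder. This is a sketch covering only the register-form ADD opcodes 0x00-0x03 described on the linked c-jump page:

```python
# Toy decoder for ADD opcodes 0x00-0x03, whose low two bits are d and w.
# Bit 0 (w): 0 = byte operands, 1 = full-width operands.
# Bit 1 (d): 0 = r/m <- reg, 1 = reg <- r/m (direction of the transfer).

def decode_add(opcode):
    assert opcode & 0b11111100 == 0x00, "only the register-form ADD opcodes"
    w = opcode & 0b01
    d = (opcode >> 1) & 0b01
    width = "full-width" if w else "byte"
    direction = "reg <- r/m" if d else "r/m <- reg"
    return f"ADD {direction}, {width}"

print(decode_add(0x00))  # ADD r/m <- reg, byte
print(decode_add(0x01))  # ADD r/m <- reg, full-width
print(decode_add(0x03))  # ADD reg <- r/m, full-width
```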
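The "call now, get back a function, call that later for the actual result" idea from the discussion above can be sketched in Python with a thread and a queue (the names `start_work` and `f` are illustrative, not from the article):

```python
import threading
import queue

def start_work():
    """Kick off a computation and immediately return a function that,
    when called, blocks until the result has actually arrived."""
    box = queue.Queue(maxsize=1)

    def worker():
        box.put(21 * 2)  # some computation that finishes "later"

    threading.Thread(target=worker).start()

    def f():
        return box.get()  # blocks here if the result isn't ready yet
    return f

f = start_work()  # returns right away - this is the "roadmap"
# ... the main flow of control moves on and does other things ...
result = f()      # later: blocks if needed, then yields the result (42)
```

The worker thread's only job is to put the result where f() can find it; after that it dies, exactly as the log describes.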
2023-07-30 12:25:12 Then eventually I'll find an explanation someone writes using simple language, and I'm like "Oh... well, that's SIMPLE."
2023-07-30 12:25:49 "This result won't be ready until later, but when it is you can call f() to get it."
2023-07-30 12:25:52 ^ Simple.
2023-07-30 12:26:33 And the code that actually puts the result where f() can find it - once it's done that, it may not have any further purpose and can just go away.
2023-07-30 12:27:06 Your main flow of control moved on long ago.
2023-07-30 12:28:31 Well, not quite "get"... from whose perspective? You usually have a Parameter with an appropriate name, "return", and You call it with _results_, from 0 to however many!
2023-07-30 12:29:37 One little math example of that is that I'd known about function types "one to one," "onto," and so on for a long time. Those words actually DESCRIBE what's going on to some extent.
2023-07-30 12:29:55 And as a very interesting example, _if_ has two Continuations, _then_ and _else_, in place of _return_!
2023-07-30 12:29:57 Then they start tossing in "bijection," "surjection," etc.
2023-07-30 12:30:05 Why? What was wrong with one to one and onto?
2023-07-30 12:31:41 It's easy enough to learn the new words when someone actually tells you explicitly what they mean. But that's just a little example of a pervasive issue, and in many cases the paper at hand isn't in the business of explaining all the terminology - it has some other purpose.
2023-07-30 12:31:58 So it's easy to get "blocked" by a mass of unfamiliar terminology.
2023-07-30 12:32:12 Good point about math. IDK.
2023-07-30 12:32:20 If there are only one or two "unknown" terms in a passage, you can often figure out what they must mean from context.
2023-07-30 12:32:29 But if it's 90% unknown terms, you just get lost.
2023-07-30 12:32:41 Perhaps to emphasize its _Form_al nature?
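The "if has two continuations, then and else, in place of return" remark above can be made concrete in Python, following the {True: then, False: else} dispatch sketched in the log (function names here are illustrative):

```python
# if-as-two-continuations: instead of returning a value for the caller
# to branch on, the test dispatches directly to one of two continuation
# functions, per the log's {True: then, False: else}[test(...)] sketch.

def cps_if(test, then_k, else_k):
    return {True: then_k, False: else_k}[bool(test)]()

result = cps_if(3 > 2,
                then_k=lambda: "took the then branch",
                else_k=lambda: "took the else branch")
print(result)  # took the then branch
```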
2023-07-30 12:33:25 Well, when I get annoyed at it, I accuse them of deliberately creating a "priesthood" culture, where they communicate in a secret language and the whole point is to show everyone how fancy they are.
2023-07-30 12:35:05 Good point! I agree.
2023-07-30 12:35:07 And I know that's an exaggeration - I imagine what I said up there is the real explanation: the paper they're writing has a purpose, and that purpose doesn't include educating noobs.
2023-07-30 12:35:26 They're trying to share their ideas with people who are fit to judge them.
2023-07-30 12:35:59 They're trying to work at the frontier of their field, and that's different from "basic education."
2023-07-30 12:36:25 Having Wikipedia around to look up terminology helps a LOT.
2023-07-30 12:37:09 I think part of it is that I'm lazy. I like PERUSING papers more than rolling up my sleeves and studying them the hard way.
2023-07-30 12:38:02 I'll read a paper and glean a few new insights from it. Then another and another, and when the process works I eventually reach a point where those insights start touching each other, and then a whole block of new understanding will just "cascade into place."
2023-07-30 12:38:11 It can happen really suddenly, and when it does it's a great feeling.
2023-07-30 12:38:38 Have You noticed that "Mapping" is significantly more general than "function"? And then "function" in "math", (K&R) C, JavaScript? I think it is deliberately done!!
2023-07-30 12:38:51 I took a class in tensor methods in graduate school. Made an A. But looking back, I realize I didn't fully "get it." I learned to turn the crank on the machinery well enough to score well on the tests, but I didn't really understand what I was DOING.
2023-07-30 12:38:55 Didn't get the "why."
2023-07-30 12:39:17 Then decades later I was watching a lecture on general relativity by Leonard Susskind, and he always goes through the relevant math in his lectures.
2023-07-30 12:39:23 So he was going through tensor methods.
2023-07-30 12:39:42 And in like the blink of an eye it all just clicked together and suddenly I "got it."
2023-07-30 12:39:55 The raw material for that had been lying around in my head for years and years.
2023-07-30 12:42:13 What happens if you call that function that initially comes back to you in a closure but the result still isn't ready yet? I guess that just blocks in the usual way, and now perhaps you really do wait?
2023-07-30 12:42:54 When I was dickering around with that Xilinx FPGA Forth processor years ago, I had a fetch unit and an execute unit, with a FIFO in between.
2023-07-30 12:43:12 The fetch unit unraveled the Forth call structure until it found opcodes, and passed them through the FIFO.
2023-07-30 12:43:18 The execute unit just executed opcodes.
2023-07-30 12:43:28 Then the question of conditional branches came up.
2023-07-30 12:43:39 The fetch unit has to decide which way to go, but we might not know yet.
2023-07-30 12:44:20 To avoid having that just block every time, I came up with a "split decision" type thing. I separated the conditional test from the jump decision, so that I could separate them as far as the problem at hand allowed.
2023-07-30 12:44:35 So hopefully, by the time fetch reached the branch point, execute had evaluated the condition.
2023-07-30 12:44:56 There was a "flags" mechanism that allowed execute to post boolean values back to fetch.
2023-07-30 12:45:10 Fetch could tell if that had been done yet or not - "empty" was a possible flag value too.
2023-07-30 12:45:24 So, if fetch reaches the branch and the pertinent flag is empty, fetch has to wait.
2023-07-30 12:45:33 But hopefully the flag has arrived by then.
2023-07-30 12:45:46 Seems... "related" in some ways to this other stuff.
2023-07-30 12:46:43 There was some small number of those flags, to allow for nested loops.
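The flags handshake described above can be sketched as a Python toy (a software model, not the actual FPGA design): execute posts a boolean into a flag slot, and fetch, on reaching a conditional branch, either reads it or stalls on "empty".

```python
# Toy model of the fetch/execute "flags" handshake:
# execute posts a boolean to a flag slot; fetch, on reaching a
# conditional branch, stalls only while the slot is still "empty".

EMPTY = object()  # the third possible flag value besides True/False

class Flag:
    def __init__(self):
        self.value = EMPTY

    def post(self, boolean):        # execute side
        self.value = boolean

    def take(self):                 # fetch side
        while self.value is EMPTY:  # fetch stalls here if execute is behind
            pass                    # (the real design waits in hardware)
        v, self.value = self.value, EMPTY
        return v

flag = Flag()
flag.post(True)      # execute evaluated the condition early...
taken = flag.take()  # ...so fetch doesn't stall at the branch
print("branch taken" if taken else "fall through")
```

Having the condition posted well before fetch reaches the branch is the whole point of the "split decision" scheme; the linked-list example below shows where that separation comes from.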
2023-07-30 12:47:00 No attempt to automate the choice - that was programmer-chosen. I just needed to know what I was doing.
2023-07-30 12:47:04 Your pipeline was not deep enough for stalls? Branch delay slots?
2023-07-30 13:42:52 The pipeline could have been as deep as I wanted.
2023-07-30 13:43:08 What I was trying to avoid was any sort of trying to pre-fetch multiple paths.
2023-07-30 14:14:44 Anyway, the goal was to come as close as possible to reducing the call overhead to zero.
2023-07-30 14:17:19 There was at least one "classic case" where that split conditional thing worked really well. Consider just processing the elements of a linked list. One way of doing that would be do { ...process *p... ; p = p->next } until (p == NULL);
2023-07-30 14:18:24 But you can check the next field at the start of the loop and post the result to the flags. THEN do the processing, and branch after the processing.
2023-07-30 14:18:57 There would be cases where the decision was just impossible to evaluate prior to the main processing, so I'm sure there would still be periods of fetch stall.
2023-07-30 14:19:19 And consequently periods where execute exhausted the FIFO.
2023-07-30 14:24:23 Of course, if you ever got into a situation where fetch was stalled and the FIFO was empty... that can't be recovered from.
2023-07-30 14:30:25 "Can't." I say that like it was ever real. :-)
2023-07-30 14:31:48 pull the stick back and maybe oh wait not a plane
2023-07-30 20:06:32 Ugh. My AT&T UVerse service has been down for the last three and a half hours.
2023-07-30 20:06:45 I'm just hobbling along on my (very) marginal hotspot.
2023-07-30 20:07:02 Works really nicely here, given IRC's low bandwidth needs.
2023-07-30 20:33:08 I tend to forget how "network addicted" I am until it goes down.
2023-07-30 20:37:08 I can quit any time!
2023-07-30 20:37:53 Heh heh. Well, I wouldn't "die" if I lost it, but there isn't much I can think of that would cause me to voluntarily cut myself off.
2023-07-30 20:38:47 This channel right here and the Dresden Files subreddit are my primary social connections.
2023-07-30 20:40:30 And my biggest "time consumer" on the leisure front is one form or another of learning things online.