2023-10-20 01:25:18 https://cs.stanford.edu/people/shadjis/blas.html
2023-10-20 07:24:40 about sectorforth: nand is the shit, with only nand gates you can build any logical circuit.
2023-10-20 07:25:17 heat dissipation is one of the major limitations in designing a chip
2023-10-20 08:06:35 you *can* but it's easier if you use other types of gates
2023-10-20 08:06:50 building a register out of just NAND gates is tedious
2023-10-20 08:15:29 You can program in brainf*** but it's not recommended
2023-10-20 09:49:03 xcombelle: Yeah, NAND is a "universal logic group" all by itself.
2023-10-20 09:49:22 NOR is too.
2023-10-20 09:50:15 To prove that you can just show how to implement AND, OR, and NOT using NAND only or NOR only.
2023-10-20 09:50:39 De Morgan's theorem helps with a formal proof.
2023-10-20 09:52:51 https://www.nandgame.com/
2023-10-20 09:54:27 Also, heat dissipation is one of the major limitations in the performance of electrical machinery (motors and generators) too.
2023-10-20 09:54:38 That heat - it's always getting in the way.
2023-10-20 09:56:12 I was just reading "Look ma, no fans: Mini PC boasts slimline solid-state active cooling system" https://www.theregister.com/2023/10/20/mini_pc_boasts_slimline_solidstate/
2023-10-20 09:57:07 better than users sticking a pencil into the rattling fan that was annoying them
2023-10-20 09:58:32 Oh, that's neat.
2023-10-20 10:14:00 12
2023-10-20 10:14:10 sorry, wrong window
2023-10-20 10:36:06 xcombelle: That thermal limitation - that's really what I was reminded of last night while looking into AVX and particularly AVX512 instructions. They're hugely powerful instructions, but if you start using them heavily your processor is apt to downclock to keep temperature in bounds. So if the main limit on your performance is thermal, then to some extent using more powerful instructions doesn't do you a
2023-10-20 10:36:08 whole lot of good.
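The "NAND is universal" point above is easy to demonstrate in code. Here's an illustrative sketch (mine, not from the channel) that builds NOT, AND, OR, and XOR out of a single `nand` function and checks the truth tables; the OR construction is De Morgan's theorem at work.

```python
# Demonstrate NAND universality: derive the other basic gates from NAND
# alone, then verify them against Python's bitwise operators.

def nand(a, b): return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))   # De Morgan: a|b = ~(~a & ~b)
def xor(a, b):  # the classic four-NAND XOR construction
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b)  == (a | b)
        assert xor(a, b)  == (a ^ b)
    assert not_(a) == (1 - a)
print("all NAND-built gates match")
```

The same exercise works with NOR as the starting gate, which is why it is universal too.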
2023-10-20 10:36:53 It's going to help some, because at least with the vector instructions you're getting more "data work" done per unit of "control work," but you're definitely not going to just multiply your performance by N.
2023-10-20 10:38:59 I read some stuff last night about how that frequency scaling works. Apparently the algorithm is very complicated. It doesn't just see you use a heavy instruction once and snap your frequency down. It's a gradual thing, so it has to do with what portion of your instruction stream is "high energy."
2023-10-20 10:39:23 And it's per core, so if you can segregate your AVX work to one core, the others can still run at full speed.
2023-10-20 12:27:08 So, re: AVX and APL style vector processing, APL supports arbitrary size vectors, which will necessarily be in RAM. Each "chunk" would need to be passed through the AVX hardware, iteratively.
2023-10-20 12:28:10 But, if you happen to know you're solving a 3D or 4D problem, like maybe doing something physics related, then you could potentially just keep your "vector top of stack" in the AVX registers, much like we cache our TOS currently.
2023-10-20 12:28:53 I.e., we could implement a "vector stack," up to 4D (I'm assuming we'd like to use double precision).
2023-10-20 12:29:03 Up to 8D single precision.
2023-10-20 12:48:12 i've been playing around with a forth in my head. how sacrilegious would you consider it if words were tokenized and then compiled rather than compilation occurring the moment a space is encountered after each word?
2023-10-20 12:48:31 tokenized and compiled when the semicolon is encountered, i mean
2023-10-20 12:48:37 is it no longer forth at that point?
2023-10-20 12:49:07 and i'm wondering what ramifications this might bring about that i haven't thought of
2023-10-20 12:52:12 I think "Forth" gives you a lot of latitude for experimentation.
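The tokenize-then-compile idea above can be sketched in a few lines. This is a toy illustration, not anyone's actual Forth, and every name in it is invented: the definition's tokens are collected first and only turned into something executable when the `;` shows up, which leaves room to build a tree and optimize in between.

```python
# Toy "deferred compilation": buffer the whole colon definition, then
# compile it in one shot when ; is encountered, instead of word-by-word.

def compile_definition(source, dictionary):
    """source looks like 'name body-words... ;'."""
    tokens = source.split()
    assert tokens[-1] == ";", "definition must end in ;"
    name, body = tokens[0], tokens[1:-1]
    # Because the whole body is visible at once, a smarter version could
    # build a tree here and optimize before generating code.
    def compiled(stack):
        for tok in body:
            if tok in dictionary:
                dictionary[tok](stack)      # execute a known word
            else:
                stack.append(int(tok))      # number-literal fallback
    dictionary[name] = compiled

words = {"+": lambda s: s.append(s.pop() + s.pop()),
         "dup": lambda s: s.append(s[-1])}
compile_definition("double dup + ;", words)
st = [21]
words["double"](st)
print(st)   # [42]
```

Whether this is still "Forth" is the question raised above; it does give up the one-pass, parse-as-you-go character of the classic interpreter.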
2023-10-20 12:52:21 There are optimizing, native code generating Forth compilers
2023-10-20 12:52:39 Chuck Moore, Forth's creator, once said that "Forth" basically means "uses a data stack" and "has colon definitions."
2023-10-20 12:52:46 That leaves you quite a lot of room to play.
2023-10-20 12:52:54 And besides, sacrilige is fun...
2023-10-20 12:53:12 You could have compilation add each token to a tree or other similar structure, and then ; would generate the code
2023-10-20 12:53:31 s/sacrilige/sacrilege/
2023-10-20 12:54:25 GeDaMo: that's pretty close to where i'm going. the key is still keeping with the spirit of making it simple
2023-10-20 12:56:20 zelgomer: I've also spent some time thinking about how to bring a bit more parsing sophistication to Forth. A driving reason for me is the desire to support more complicated "literals." If you think about it, Forth already "special cases" literal numbers.
2023-10-20 12:56:33 it's just a special test it does after a dictionary search fails.
2023-10-20 12:56:49 But that's just one kind of literal. What about quoted strings, literal lists, literal arrays, etc.?
2023-10-20 12:57:05 there are Forths which use multiple stacks, right? e.g. iirc in the old days floats belonged to a second stack?
2023-10-20 12:57:13 They really aren't any "different" conceptually from numbers, except that you can't put one of them into a single stack item.
2023-10-20 12:57:37 rendar: Quite a lot of multiple data stack efforts have been made over the years, but none of them ever "caught on."
2023-10-20 12:57:47 why not?
2023-10-20 12:57:55 One of Forth's key advantages is that you know precisely where the operands for each operation are.
2023-10-20 12:58:12 I.e., the top of the stack. If there's more than one stack, then you'd have to somehow specify.
2023-10-20 12:58:32 They call it "implicit addressing."
2023-10-20 12:58:32 yeah
2023-10-20 12:58:42 And it contributes to code compactness.
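One way to picture the "more kinds of literals" idea above is to generalize the number fallback into a chain of literal recognizers tried after the dictionary search fails. A hypothetical Python sketch follows; the recognizer names and the string syntax are invented, and only the number case matches what standard Forths actually do.

```python
# Generalized literal fallback: dictionary lookup first, then a chain of
# recognizers (numbers, quoted strings, ...) before giving up.

def recognize_number(tok):
    try: return int(tok, 0)
    except ValueError: return None

def recognize_string(tok):
    if len(tok) >= 2 and tok[0] == tok[-1] == '"':
        return tok[1:-1]
    return None

RECOGNIZERS = [recognize_number, recognize_string]

def interpret(token, dictionary, stack):
    if token in dictionary:
        dictionary[token](stack)        # found: execute the word
        return
    for rec in RECOGNIZERS:             # the generalized literal fallback
        val = rec(token)
        if val is not None:
            stack.append(val)
            return
    raise NameError(f"{token} ?")       # classic Forth-style complaint

st = []
for tok in ['2', '3', '+', '"hello"']:
    interpret(tok, {"+": lambda s: s.append(s.pop() + s.pop())}, st)
print(st)   # [5, 'hello']
```

The stack items here are Python objects, which dodges the "doesn't fit in a single cell" problem the discussion mentions; a real Forth would need boxed references or a separate stack for such values.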
2023-10-20 12:59:16 I think mostly, though, it's just that having multiple stacks didn't really provide a lot of advantage - no one ever pointed out a "killer app" for it.
2023-10-20 12:59:16 Normally the words you use would specify which stack they work on e.g. f@ uses an address on the data stack but fetches to the floating point stack
2023-10-20 12:59:26 Yes.
2023-10-20 12:59:39 That is done fairly often (separate floating point stack).
2023-10-20 12:59:48 It makes sense when it's always clear what to do.
2023-10-20 13:00:04 wait, nowadays Forth still has a separate floating point stack?!
2023-10-20 13:00:14 You can probably find some that do.
2023-10-20 13:00:22 There's no universal rule about it.
2023-10-20 13:00:24 I believe gforth does
2023-10-20 13:00:33 you can find some that don't support floats
2023-10-20 13:00:36 Well, I shouldn't say that - the standard may have something to say on the topic.
2023-10-20 13:00:48 i see
2023-10-20 13:00:58 https://gforth.org/manual/Stack-Manipulation.html
2023-10-20 13:01:33 A big reason it was done that way in the past is because the HARDWARE had a separate floating point stack. Back when we used the 8087 co-processor.
2023-10-20 13:01:44 it had an internal stack, which encouraged people to recognize that stack in Forth.
2023-10-20 13:01:53 Only eight items deep, though.
2023-10-20 13:02:26 Eight is actually a lot - I got all the way through college with an RPN calculator sporting a four-deep stack.
2023-10-20 13:03:02 yes indeed
2023-10-20 13:03:10 It's an interesting little challenge to write a quadratic equation root taker that works completely in a four-item stack.
2023-10-20 13:03:17 i.e., uses no "scratch registers."
2023-10-20 13:03:28 It's possible, but hard enough to make it interesting.
2023-10-20 13:03:38 heh
2023-10-20 13:03:57 Yeah, I'm just such an exciting guy - that's the kind of stuff I spend my time on. :-)
2023-10-20 13:04:17 Drives the ladies crazy...
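The four-deep-stack quadratic exercise mentioned above does work out. Here's one possible instruction sequence, checked in a throwaway Python stack machine; the op names (dup, swap, over, rot) are the usual Forth/RPN ones, everything else is invented, and the real-root case is assumed. The trick is to normalize first: with h = b/(2a) and s = sqrt(h^2 - c/a), the roots are -h ± s, and no intermediate ever needs a fifth stack slot.

```python
# Solve a*x^2 + b*x + c = 0 on an RPN machine whose stack never exceeds
# four items, like a classic HP calculator - no scratch registers.

import math

class Stack4:
    """Tiny RPN stack that raises if depth ever exceeds four."""
    def __init__(self, *items):
        self.s = list(items)
        self.max_seen = len(self.s)
    def _check(self):
        self.max_seen = max(self.max_seen, len(self.s))
        if len(self.s) > 4:
            raise OverflowError("stack deeper than 4")
    def push(self, x): self.s.append(x); self._check()
    def dup(self):  self.push(self.s[-1])
    def swap(self): self.s[-1], self.s[-2] = self.s[-2], self.s[-1]
    def over(self): self.push(self.s[-2])
    def rot(self):  self.s.append(self.s.pop(-3)); self._check()
    def neg(self):  self.s[-1] = -self.s[-1]
    def sqrt(self): self.s[-1] = math.sqrt(self.s[-1])
    def _bin(self, f): x = self.s.pop(); y = self.s.pop(); self.push(f(y, x))
    def add(self): self._bin(lambda y, x: y + x)
    def sub(self): self._bin(lambda y, x: y - x)  # second - top
    def mul(self): self._bin(lambda y, x: y * x)
    def div(self): self._bin(lambda y, x: y / x)  # second / top

def quadratic_roots(a, b, c):
    s = Stack4(a, b, c)                  # stack: a b c   (c on top)
    s.rot(); s.swap(); s.over(); s.div() # b a q          q = c/a
    s.rot(); s.rot(); s.div()            # q p            p = b/a
    s.push(2); s.div()                   # q h            h = p/2
    s.dup(); s.dup(); s.mul()            # q h h^2
    s.rot(); s.sub(); s.sqrt()           # h s            s = sqrt(h^2 - q)
    s.over(); s.neg(); s.over(); s.add() # h s (s-h)      first root
    s.rot(); s.neg(); s.rot(); s.sub()   # (s-h) (-h-s)   both roots
    return s.s, s.max_seen

roots, depth = quadratic_roots(1, -5, 6)
print(roots, depth)   # roots of x^2 - 5x + 6 = 0: [3.0, 2.0] 4
```

The stack touches depth four only briefly (at the `over` and second `dup` steps), which is what makes the puzzle tight but solvable.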
2023-10-20 13:04:26 KipIngram, i see you as a very enthusiastic and passionate supporter of Forth. when I publish my Forth version, could I link the docs to you, so you can give me some feedback? if you want..
2023-10-20 13:04:43 I would be happy to take a look.
2023-10-20 13:04:47 thanks!
2023-10-20 13:07:00 I think what I'm really into is simplicity and absence of "waste." Forth just exhibits those things, so... I'm a fan.
2023-10-20 13:07:47 And also, that calculator I mentioned was the first thing I ever programmed, so my brain got "tuned" to stack machines right from the start.
2023-10-20 13:08:32 It was really its RPN operation that first drew me to Forth - I only appreciated the simplicity and efficiency of it later on, after I'd scratched around under the hood for a while.
2023-10-20 13:15:19 Ha! This conversation ^ got me to thinking about Forth "parsing" in general, and suddenly it occurred to me that my Forth would probably crash if I typed a : alone on a line.
2023-10-20 13:15:32 I tried it, and sure enough - adios, folks. Hung.
2023-10-20 13:16:04 What that does is create a new definition for the null word, and then that displaces the old definition and I'm no longer able to "terminate" lines.
2023-10-20 13:16:40 It's immediately broken, since I don't have a smudge bit. That new definition takes effect the minute it's added to the dictionary.
2023-10-20 13:18:22 The existing null word basically just executes an rdrop exit, which breaks me out of INTERPRET's infinite loop.
2023-10-20 13:19:36 I assume that it a) parsed out a null and created a new definition with it, then b) parsed null again, and executed that unterminated definition.
2023-10-20 13:19:55 But at any rate, even if it hadn't flown off into never never land, it would have just continued to parse that null out forever.
2023-10-20 13:20:35 My WORD won't advance past a null termination. If it's begun peeling out a word when it hits null, then that completes that word.
Then subsequent calls to WORD will just return null over and over, forever.
2023-10-20 13:34:00 Ok, I think this picture I have in mind for an "improvable" file system is pretty clear now. It will be immediately usable, but low-ish performance, with hooks that will let me accelerate it with more advanced methods later.
2023-10-20 13:34:19 And that low-ish performance version is SIMPLE.
2023-10-20 15:07:21 https://www.nature.com/articles/d41586-023-03267-0
2023-10-20 15:07:26 Yay team...
2023-10-20 15:08:55 Also of interest:
2023-10-20 15:08:57 https://www.scientificamerican.com/article/grammar-changes-how-we-see-an-australian-language-shows/
2023-10-20 15:09:16 Newspeak, anyone?
2023-10-20 15:15:59 plusgood
2023-10-20 15:29:20 Honestly the whole notion of Newspeak struck me as one of the most frightening things I'd ever read, the first time I read 1984.
2023-10-20 15:29:28 That's a totally chilling idea.
2023-10-20 15:34:29 orwell did "simple english" translations during The War
2023-10-20 15:34:42 Wow - that language article is fascinating.
2023-10-20 15:36:22 Totally unrelated, but you mentioned "the war" and it made me think of it. Johnny Cash, the country music singer, was the first person in the West to know that Stalin had died. He was a Morse code operator for the Air Force, and was assigned to copying Soviet radio traffic. He found out that way, and his report was how others in the military found out.
2023-10-20 19:21:23 I wrote up some notes on these file system thoughts I've had the last day or two:
2023-10-20 19:21:25 https://pastebin.com/GrgVZ21F
2023-10-20 23:07:45 Anyone have a guess at the "common" range of "number of files per directory"?
2023-10-20 23:10:35 My only guess is "varies pretty wildly."
2023-10-20 23:10:51 Most directories I create never have too terribly many files.
2023-10-20 23:11:00 On the other hand, my /bin has 4556.
2023-10-20 23:52:54 You know, there are some really nice attributes of file systems, but some "pains" too.
I'm planning this system for remoting comments and other documentation. It seems very straightforward to me to create a tree of links that connect "this" to some other "that."
2023-10-20 23:53:06 I.e., "this text in this block" is connected to "that text in that block."
2023-10-20 23:53:39 And when I edit a block it will be easy to look up any links that involve the higher offset parts of that block and nudge their offset to account for the edit.
2023-10-20 23:54:33 But... if I have a file system, I might make a link from some text to something somewhere else, and then later edits might not just move that link's origin text to a different place in the block - it might SPLIT IT over a block boundary.
2023-10-20 23:54:53 So what then? Logical thing would seem to be to make it two links.
2023-10-20 23:55:09 Which probably isn't actually problematic - it's just an annoyance.
2023-10-20 23:55:28 So I'm sitting here contemplating alternatives.
2023-10-20 23:56:02 The simplest one I see is to allow multiple blocks to be gathered together as a file, but treat them as "pages within files."
2023-10-20 23:56:26 That is, I'd still honor each block boundary - I'd just recognize that sequence of blocks as collected under a name.
2023-10-20 23:57:11 In that order. In that case, I wouldn't have one "sentence," or whatever you want to call my self-contained bits of text, straddle a block boundary.
2023-10-20 23:57:33 So in a real way I'd still be working with blocks - I'd just have some help organizing and associating names with them.
2023-10-20 23:57:50 A file would be "a collection of blocks."
2023-10-20 23:58:05 Not one monolithic stream of stuff.
2023-10-20 23:58:47 I could still take a set of blocks and use them to copy an image into RAM - I'd just pay attention to the bytes in use of each one and glue them together properly.
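A minimal sketch of the "file = ordered collection of blocks" picture above, with invented names throughout: each block tracks its own bytes in use, and loading a file just glues the used portions together in order. The 1024-byte size is the traditional Forth block size; nothing else here comes from the discussion.

```python
# Hypothetical "file as ordered blocks" model: block boundaries are
# honored, but a name collects a sequence of blocks together.

BLOCK_SIZE = 1024   # traditional Forth block size

class Block:
    def __init__(self, data=b""):
        assert len(data) <= BLOCK_SIZE
        self.data = data                # only the bytes actually in use
    @property
    def bytes_in_use(self):
        return len(self.data)

class File:
    """A named, ordered collection of blocks - not a monolithic stream."""
    def __init__(self, name, blocks):
        self.name = name
        self.blocks = list(blocks)
    def load(self):
        # Copy an image into RAM: glue the in-use bytes of each block
        # together, in order.
        return b"".join(b.data for b in self.blocks)

f = File("notes", [Block(b"first page "), Block(b"second page")])
print(f.load())    # b'first page second page'
```

Because no self-contained bit of text straddles a block boundary in this scheme, a link into a block never has to be split in two by an edit, which is the annoyance the discussion is trying to avoid.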
2023-10-20 23:59:50 It seems like it might retain some of the "tidiness" of blocks, while getting me some of the organizational benefits of files, directories, and so on.