2022-07-10 11:43:32 KipIngram: wasn't sure what to think of that claim about proteins and functionality. 2022-07-10 11:43:51 log(20^150) is about 195 so ... 2022-07-10 11:44:09 I'm still not sure what to think of it, but at least I've got some sense as to how few that is amongst the possibilities. 2022-07-10 11:44:46 I haven't kept up with the field to know how far along they're getting with heuristics for the protein folding problem 2022-07-10 11:45:11 so I'm not even sure where that estimate (10^77) is coming from 2022-07-10 11:46:30 It's just a number I saw in something I read. I can't comment on its integrity. 2022-07-10 11:46:57 yeah it's just interesting to think about this after not really thinking about it for a while 2022-07-10 11:47:24 Main thing I noted from it is that if that's so then they still have some things left to figure out about abiogenesis. 2022-07-10 11:47:33 oh my mercy yes 2022-07-10 11:47:55 Unless there's a multiverse - if you throw a factor of 10^500 or 10^1000 universes into the mix, then things that were otherwise improbable almost have to happen somewhere. 2022-07-10 11:48:16 one of the theories of abiogenesis is that the first things that looked like "life" were RNA 2022-07-10 11:48:49 I mean, the "random chemistry in a warm puddle over a long time" theory sounds nice, but when you start including numbers like 10^77 in the mix then the probabilities wind up really small. 2022-07-10 11:48:49 but that's a hard thing to work with in a lab because now there are so many enzymes all over the place that chew up RNA pretty thoroughly 2022-07-10 11:49:25 Yeah - there's a lot of "interference" floating around that didn't exist at that time. 2022-07-10 11:49:43 Hard for life to pop up when life is already around and is so good at eating things. 2022-07-10 11:50:02 yup 2022-07-10 11:50:06 they discovered in the '80s I think it was that RNA could perform catalytic aka enzymatic functions 2022-07-10 11:50:11 Yes. 
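The arithmetic quoted above can be sanity-checked directly; the 10^77 figure is just the number cited in the conversation, not derived here:

```python
import math

# Distinct sequences for a 150-residue protein drawn from the
# 20 standard amino acids: 20^150.
digits = 150 * math.log10(20)
print(round(digits, 1))  # ~195.2, i.e. log10(20^150) is about 195

# If only ~10^77 of those sequences are functional (the figure
# quoted above), the functional fraction is roughly 10^(77 - 195).
print(round(77 - digits))  # ~ -118, i.e. about 1 in 10^118
```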
2022-07-10 11:50:19 There've been nice bits of progress over the years. 2022-07-10 11:50:32 There's some interesting aspects of non-equilibrium thermodynamics too. 2022-07-10 11:50:50 That indicate complexity seems to emerge spontaneously in certain types of situations where there's a continuous energy flow. 2022-07-10 11:51:12 There seems to be a natural tendency for things to arrange themselves for maximum entropy production. 2022-07-10 11:51:33 And that's even before you have any self-replication and natural selection. 2022-07-10 11:54:11 I'm wondering currently if maybe that sort of thing will fill in some of the blanks in a final theory about all that stuff. 2022-07-10 11:54:52 Once you get life started, natural selection handily explains the rest. It's just that initial kicking off that's massively unlikely. 2022-07-10 11:56:29 I find it kind of entertaining that guys like Richard Dawkins, who are not physicists, have jumped all over multiverse theory. They don't really have the background to have a really strong scientific opinion about it, but it really *helps* them - it puts their own arguments on much more solid numerical ground. So, they're fans. 2022-07-10 11:57:59 I think a really important thing will be whether or not we ever find any life of independent origin. Right now we only know about ONE example of life having "initiated." That means it's ok for it to be incredibly, incredibly low-probability. No matter how low the probability was, it still could have happened once. 2022-07-10 11:58:16 But if we find another tree of life, then suddenly we need an explanation that provides a much higher probability. 2022-07-10 12:07:24 A few months ago there was a flap about spectral indicators of possible life in the upper atmosphere of Venus. I heard about it fairly often for a few weeks, but it's been a while now; may have fizzled out. At the time it seemed worth following up on to me, with a probe of some kind. 
Even if it's there, though, it could turn out to be from the same tree - apparently during an earlier phase of the life of 2022-07-10 12:07:26 the solar system material moved around a good bit and they think it would have been possible for genetic material to be included in that. 2022-07-10 12:07:56 they "think" 2022-07-10 12:08:57 probes don't do too well on venus 2022-07-10 12:11:46 Yeah I know. This one would just have to dip briefly into the very top of the atmosphere and then bail back out. 2022-07-10 12:11:58 It's not as bad up there (or there wouldn't be any chance of life). 2022-07-10 12:12:26 There are some non-organic chemical reactions that could have created what they think they saw, but they aren't considered likely under the conditions there. 2022-07-10 12:12:56 Anyway, it may be a dead issue now; I lost track of it after a while. 2022-07-10 12:14:40 And it would only be "hugely significant" if it turned out to be an independently originating life; if it's related to us it's "interesting" but much less important. 2022-07-10 12:14:44 someone would need to fund, plan, carry out a mission with a suitable instrument. so, years 2022-07-10 12:14:52 Yes. 2022-07-10 12:15:18 I think there had already been some scuttle about a Venus mission anyway, so some of the preliminaries might have already been done. 2022-07-10 12:55:18 so, in addition to how hostile things are now for RNA I occasionally just sit in total wonder at the aerobe/anaerobe thing we've had going on since the oxygen catastrophe 2022-07-10 12:56:24 which is where my thoughts went pretty quickly seeing speculation above about Venus, or about a common initiating event for the solar system as a whole 2022-07-10 12:57:05 I'm more Team Gas Giant for a systemwide initiating event having survived somewhere 2022-07-10 12:58:50 dave0: Apparently nasm doesn't support direct assembly of instructions that use rip-relative addressing. 2022-07-10 12:59:09 But the instruction set does. 
If I try to say 'lea rcx, [rip]', that throws an error. 2022-07-10 12:59:29 But you can assemble it here: 2022-07-10 12:59:31 https://defuse.ca/online-x86-assembler.htm 2022-07-10 12:59:48 and then insert the resulting bytes right into the instruction stream in nasm using db. 2022-07-10 12:59:55 And that appears to work just fine. 2022-07-10 13:00:04 about multiverses, the thing is they solve the matter of why our universe seems so fine-tuned for sentient life 2022-07-10 13:00:16 Yes, it does put that question very nicely to rest. 2022-07-10 13:00:26 Just slam dunks it. 2022-07-10 13:00:41 Which of course doesn't mean it's true. But it may be. 2022-07-10 13:01:05 because if there weren't multiverses, there would be a lot of questions that, well, imply a god 2022-07-10 13:01:10 If you *don't* assume it, though, then that fine tuning question becomes very onerous for everyone except the religious folk. 2022-07-10 13:01:22 ^ Exactly. 2022-07-10 13:01:58 the thing is, will we ever know an answer to this? 2022-07-10 13:02:29 Yeah, we may not. 2022-07-10 13:02:40 No promises that we get to know everything for sure. 2022-07-10 13:03:14 but yeah, I can see how people like Dawkins would really be attracted to multiverses 2022-07-10 13:04:21 thing is, fine-tuning does not help most traditional religious people either, but rather deists 2022-07-10 13:04:37 Yeah, I agree. 2022-07-10 13:04:46 Doesn't help the "literalists." 2022-07-10 13:06:20 if they were to either prove or disprove multiverses, though, that would have major implications either way 2022-07-10 13:06:32 It really would. 2022-07-10 13:07:05 That would also necessitate a "higher probability" theory of abiogenesis. 2022-07-10 13:07:11 i dont think its exactly something that can be proven or disproven 2022-07-10 13:07:16 If that 10^77 number is right. 2022-07-10 13:07:17 it's a metaphysical not a physical question 2022-07-10 13:07:35 Yeah, I think it's beyond our ability to observe. 
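The `db` workaround described above can be written straight into nasm source. The seven bytes below are the standard x86-64 encoding of `lea rcx, [rip+0]` (REX.W prefix, opcode 8D, a ModRM byte selecting rcx with RIP-relative addressing, and a zero 32-bit displacement), which loads the address of the *next* instruction; this is a sketch of the trick, not a claim about any particular nasm version:

```nasm
; Equivalent of 'lea rcx, [rip]' (rcx <- address of the following
; instruction), emitted as raw bytes since nasm rejects the mnemonic:
        db 0x48, 0x8D, 0x0D, 0x00, 0x00, 0x00, 0x00
next_insn:
; Note: nasm does have its own RIP-relative syntax via the 'rel'
; keyword, e.g.
;         lea rcx, [rel next_insn]
; which, with next_insn placed immediately after, should assemble
; to the same bytes (displacement 0).
```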
2022-07-10 13:07:40 assuming that there are 'realities' 'parallel' to our own, what makes them not simply part of our universe? 2022-07-10 13:08:05 there's nothing that makes 'the multiverse' not simply 'the universe' 2022-07-10 13:08:21 Well, one of the multiverse theories holds that our "universe" is a "bubble" in which cosmic inflation has stopped, while all around it inflation continues. 2022-07-10 13:08:26 And there would be other bubbles. 2022-07-10 13:08:33 Way beyond anything we could confirm. 2022-07-10 13:08:51 the key thing is the fundamental constants of the universe could be different 2022-07-10 13:09:03 Right, for the fine tuning resolution. 2022-07-10 13:09:08 sure, but those could simply be properties of the universe 2022-07-10 13:09:21 And they throw out numbers for those "bubbles" of like 10^500. 2022-07-10 13:09:34 like, idk 2022-07-10 13:09:42 So with all of THAT out there, then suddenly 1 in 10^77, or even one in something much bigger, is no longer "rare." 2022-07-10 13:09:49 Well, no longer "unlikely." 2022-07-10 13:09:50 nothing prevents these 'bubbles' from simply being part of the universe 2022-07-10 13:09:52 Still rare I guess. 2022-07-10 13:10:00 maybe the universe has different fundamental constants and laws based on location 2022-07-10 13:10:07 Yeah, it depends on what you want to call "the universe." 2022-07-10 13:10:12 the fundamental constants and laws are already subject to change 2022-07-10 13:10:14 exactly! 2022-07-10 13:10:26 it's a metaphysical question 2022-07-10 13:10:32 and metaphysics is bloody useless 2022-07-10 13:10:32 It is. 2022-07-10 13:10:44 Useless scientifically, yeah. 2022-07-10 13:10:49 It can be fun, though. 2022-07-10 13:10:55 that's fair! 2022-07-10 13:11:25 the thing is that I don't think that things like whether there could be other universes is necessarily beyond proof or disproof 2022-07-10 13:11:26 If we can't measure it and confirm or reject it, it's not really science. 
2022-07-10 13:11:54 I don't think they have to be - our bubble might interact with some other bubble. 2022-07-10 13:12:15 But I think it's pretty clear that "remote" bubbles would be beyond our ability to detect. 2022-07-10 13:12:22 Unless something new comes along. 2022-07-10 13:12:58 i dont think there's really any point in pursuing it :) 2022-07-10 13:13:15 As far as the fundamental constants go, our universe could be sitting at an equilibrium point in a big nonlinear parameter space, and "vibrating" around it. If it somehow got knocked to a different equilibrium point, all those constants might change. 2022-07-10 13:13:15 i don't want to end up a rat biting into an electric cable, yknow? 2022-07-10 13:13:23 And we might go "poof." 2022-07-10 13:13:52 Very much like the response of an amplifier depending on the transistor's Q-point. 2022-07-10 13:14:55 I'm not a physicist so I don't feel qualified to speak on these matters 2022-07-10 13:17:44 Oh god, I'm not either. I read a lot about it, but I'm nowhere near expert. 2022-07-10 13:17:57 Maybe a little more than a lot of people, but still way down the curve. 2022-07-10 13:18:27 I loaded up on high math in graduate school because I knew I wanted to learn more over the years, but I still missed out on some of the important math. 2022-07-10 13:18:53 I'm pretty good with linear algebra and tensors, but I completely missed the group theory boat, and that turns out to be pretty important. 2022-07-10 13:18:58 Still trying to catch up there. 2022-07-10 13:22:21 I always feel like a dumbass when it comes to math to tell the truth 2022-07-10 13:23:36 probably because I never went to grad school 2022-07-10 13:29:34 ACTION just went to the wiki, looked up "group theory", and came out wondering what the heck a group really is 2022-07-10 13:30:28 it's an algebraic structure, which is a set of objects with some operations on it. 2022-07-10 13:30:56 the Abstractions of Zargel III are attacking! 
2022-07-10 13:30:56 those operations can follow some laws. associativity, etc. 2022-07-10 13:31:42 imode: well I mean beyond that 2022-07-10 13:31:47 that's it. 2022-07-10 13:31:53 that is literally it. 2022-07-10 13:32:10 the "universe" of algebraic structures is just that. 2022-07-10 13:32:47 what is the relationship between group theory and category theory? 2022-07-10 13:33:37 group theory is a single algebraic theory. 2022-07-10 13:34:12 like I know what monads, functors, and monoids are 2022-07-10 13:34:45 rings, groups, lattices, modules. 2022-07-10 13:35:03 all different algebraic theories. 2022-07-10 13:35:32 category theory is a bit more abstract, mainly because of all the mental wankery. 2022-07-10 13:36:02 categories are "maps of the territory". they aren't the territory, but they're a map of the territory. 2022-07-10 13:36:13 how you convert between different structures. 2022-07-10 13:36:39 what operations are available and why. 2022-07-10 13:37:22 of course, the reason why I'm familiar with the likes of monads, functors, and monoids is from my time working with haskell... 2022-07-10 13:37:59 algebraic data types should've tipped you off, because the method for describing types forms an algebra. 2022-07-10 13:38:11 haskell, a system to annoy you with type errors 2022-07-10 13:38:19 hee 2022-07-10 13:38:23 *hee 2022-07-10 13:38:26 *hehe 2022-07-10 13:39:11 as they say, though, once haskell stops generating type errors your program is probably correct 2022-07-10 13:39:32 whereas Forth, well, has no guardrails 2022-07-10 13:40:00 stack/queue languages can have the same guarantees as haskell. 2022-07-10 13:40:24 types don't stop you from walking a list the wrong way 2022-07-10 13:40:26 mainly because effects are a complete language to describe the types of functions. 2022-07-10 13:40:27 imode: it's not the stack but rather the direct access to memory 2022-07-10 13:40:40 that's.. not even close. 
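The definition sketched above — a set of objects, one operation, and a few laws — can be checked concretely for a small case. Assuming nothing beyond integers mod 5 under addition:

```python
# The integers mod 5 under addition form a group: closed, associative,
# with an identity (0) and an inverse for every element.
elems = range(5)
op = lambda a, b: (a + b) % 5

# Closure: op never leaves the set.
assert all(op(a, b) in elems for a in elems for b in elems)

# Associativity: (a+b)+c == a+(b+c).
assert all(op(op(a, b), c) == op(a, op(b, c))
           for a in elems for b in elems for c in elems)

# Identity: 0 leaves every element unchanged.
assert all(op(0, a) == a and op(a, 0) == a for a in elems)

# Inverses: for every a there is a b with a+b == 0 (namely (5-a) % 5).
assert all(any(op(a, b) == 0 for b in elems) for a in elems)

print("Z/5Z under addition is a group")
```

Drop the inverse requirement and the same structure is only a monoid; add commutativity and it is an abelian group — the different algebraic theories mentioned below differ exactly in which laws they demand.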
2022-07-10 13:40:59 direct access to memory isn't something a type system or haskell prevents. 2022-07-10 13:41:14 you can actually access memory in haskell. 2022-07-10 13:41:31 imode: well, yes, in Haskell you can directly access memory... but you don't do so by mistake 2022-07-10 13:41:44 I can't do that in Java either. 2022-07-10 13:42:27 or Go. 2022-07-10 13:42:39 of course the real issue in Haskell is space leaks 2022-07-10 13:43:00 it's also founded on the wrong abstraction for building things. 2022-07-10 13:43:23 erlang is better founded. 2022-07-10 13:45:56 Haskell is better-suited to straight-line pure computation whereas Erlang is better-suited to large-scale concurrent applications 2022-07-10 13:46:12 I wouldn't write a compiler in Erlang myself 2022-07-10 13:46:26 whereas Haskell IMO is a great language to write a compiler in 2022-07-10 13:46:34 I'm not sure why you wouldn't, but I'd also wanna know what the hell "straight-line pure computation" is. 2022-07-10 13:47:02 other than some masturbatory ivory tower supremacy shit. 2022-07-10 13:47:32 pure computation, taking data A and spitting out data B, without invoking concurrency or IO 2022-07-10 13:47:57 great. and nearly every language on the planet is good at doing that, even erlang. 2022-07-10 13:48:09 brainfuck or the SKI calculus is great at doing that. 2022-07-10 13:48:34 it's literally the minimum requirement to be a programming language. sequential programming. 2022-07-10 13:48:36 you're essentially saying that being Turing-complete is "great at doing that" 2022-07-10 13:48:39 yes. 2022-07-10 13:48:47 because that's literally the point of a turing machine. 2022-07-10 13:49:00 take input tape, produce output tape. 2022-07-10 13:49:08 take input lambda expression, reduce to output lambda expression. 2022-07-10 13:49:23 the thing is that in Haskell the type system is your friend and is very well-suited for representing data 2022-07-10 13:49:32 so is OCaml's. 
2022-07-10 13:49:42 and Rust's, to some extent. 2022-07-10 13:49:51 I uninstalled haskell instead of making friends with the type system 2022-07-10 13:50:09 well, the basic type systems, before you invoke extensions, of Haskell and OCaml are very similar 2022-07-10 13:50:24 because type theory isn't something Haskell-only. 2022-07-10 13:50:26 ACTION used to program in OCaml before he switched to Haskell 2022-07-10 13:50:56 but Haskell has nice things that OCaml doesn't have such as type classes 2022-07-10 13:51:08 and the extensions aren't all that useful. if you really need dependent types, go with C++ templates or Agda. have fun risking that your type checker won't halt. 2022-07-10 13:51:37 type classes are one of my favorite aspects of Haskell, and OCaml's eqtypes suck balls in comparison 2022-07-10 13:51:45 Holy cow - I walk away for a few minutes and miss a ton... 2022-07-10 13:51:47 let me take you for a tour of "why you have been retreading the wrong road along with your colleagues for the past 60 years". 2022-07-10 13:52:07 software formalists really did try to understand how to properly build software like civil engineers. 2022-07-10 13:52:12 that was in the 50's and 60's. 2022-07-10 13:52:24 they failed. 2022-07-10 13:53:03 if I ever write a large application that I'm not doing so for my job on the PC again, sorry, but I'm doing it in Haskell 2022-07-10 13:53:18 I'll be impressed if anything works after 3 months. 2022-07-10 13:53:42 I work in Forth and assembly these days on my own, but that's because I target embedded systems now 2022-07-10 13:53:52 evangelists of this paradigm of organization always hush up. 2022-07-10 13:54:01 because they genuinely don't make anything that works. 2022-07-10 13:54:18 the only thing sane that ever came out of haskell (and it was an accident) is pandoc. and I love the bloat that that thing has built up for it. 2022-07-10 13:54:23 I've definitely written things that work in Haskell 2022-07-10 13:54:28 oh yeah? show me. 
2022-07-10 13:55:39 https://github.com/tabemann/botwars < mind you, it's probably suffered from software rot, because I compiled it last several years ago, and GHC has moved on since then 2022-07-10 13:56:01 something about you seems to evoke "Haskell hater" to me though 2022-07-10 13:56:03 oh neat, a toy. 2022-07-10 13:56:20 or this: 2022-07-10 13:56:21 https://github.com/tabemann/amphibian 2022-07-10 13:56:24 something about you seems to evoke "Haskell evangelist". 2022-07-10 13:56:40 as in, you don't understand why you like something, and will espouse its virtues on command, as if puppeted. 2022-07-10 13:56:57 I spend most of my time these days working in Forth and assembly, at home, and C++, at work 2022-07-10 13:57:11 I manage digital accounting at Amazon. 2022-07-10 13:57:21 so I'm a "Haskell evangelist" because I like Haskell and think it has its place 2022-07-10 13:57:36 I think you haven't been exposed to the working environments that put models to the test. 2022-07-10 13:57:55 imode: you obviously don't realize that I've run into things such as space leaks myself 2022-07-10 13:58:08 everybody runs into them. it's on the wiki. 2022-07-10 13:58:18 if I were to recreate Haskell from square one today I'd make it strict-evaluated 2022-07-10 13:58:18 congratulations, you are average. 2022-07-10 13:58:19 I consider it slightly annoying that there are valid instructions that nasm won't directly assemble. 2022-07-10 13:58:48 everybody has a favorite toy, everybody enjoys evangelizing their favorite toy. 2022-07-10 13:58:55 thinking critically is difficult. 2022-07-10 13:59:09 you have to do the latter when you're dealing with actual problems. 2022-07-10 13:59:09 I plan on trying out Idris some day myself, but from reading about Agda wonder if I will ever wrap my brain around dependent types 2022-07-10 13:59:21 Yes - it is hard to think critically about something that you are either "very fond of" or "very non-fond" of. 
2022-07-10 13:59:21 dependent types are really simple. 2022-07-10 13:59:40 it's just that the side-effect is that typechecking makes them undecidable. 2022-07-10 14:00:00 because value types are things that are produced by potentially halting computations. 2022-07-10 14:00:29 hence totality checkers and that nonsense (yes, you don't need to tell me about the halting problem, I know how totality checkers work) 2022-07-10 14:00:49 which kind of defeats the point of formal proofs, eh? 2022-07-10 14:01:01 the point was to use a weaker system to describe and constrain a stronger system. 2022-07-10 14:01:27 this is the full fucking point of why church et. al. looked at bare LC and said "well shit we can't usefully describe anything because we can describe everything". 2022-07-10 14:02:16 so you affix a limiter to it, and the thing becomes useful. to the extent that you can describe what you want within that new, limited system. 2022-07-10 14:02:28 I've been working in functional programming for a long time. 2022-07-10 14:02:51 hell I even implemented my own term rewriting language for a job, and then built a type theory on top of it. 2022-07-10 14:02:53 I am woefully ignorant of it. Barely know what it "is." 2022-07-10 14:02:58 And may be wrong about that in some ways. 2022-07-10 14:03:15 there isn't much meat that's useful, KipIngram. 2022-07-10 14:03:27 KipIngram: functional programming is basically where your program, ideally, is a mathematical expression 2022-07-10 14:03:43 Yeah, that's basically how I think of it. 2022-07-10 14:03:52 "Turn it into math." 2022-07-10 14:04:01 that's a lame explanation. 2022-07-10 14:04:09 most functional programming languages get around the little problems of IO and concurrency by breaking from that model at some level, introducing monads, or introducing linear types 2022-07-10 14:04:57 a more apt explanation would be there's no way to implicitly model state in the units of code that you write. 
every function (it's in the name) is actually a mathematical function: one input, one output, domain and range. 2022-07-10 14:05:14 it can't change from under you, and your programs in the abstract form a dataflow graph. 2022-07-10 14:05:28 which then gets reduced and compiled down to a target. 2022-07-10 14:05:34 yeah 2022-07-10 14:05:38 it's beneficial because you don't give a shit about the evaluation of that graph. 2022-07-10 14:05:41 only that it evaluates. 2022-07-10 14:05:41 But then they talk about "side effects." 2022-07-10 14:05:50 right. which, like with a dataflow graph, can be better isolated. 2022-07-10 14:05:55 And given the work I did it was those effects I was interested in. 2022-07-10 14:05:59 you can understand and deal with the stateful parts or pass handles around. 2022-07-10 14:06:08 I wanted to DO SOMETHING to some actual piece of stuff. 2022-07-10 14:06:18 you can. all functional programs are dataflow graphs. 2022-07-10 14:06:22 and those need inputs and outputs. 2022-07-10 14:06:39 side effects are implemented either by enforcing some order to evaluation, which allows you to put parts in that do IO or concurrency, or by introducing either monads or linear types 2022-07-10 14:06:49 But the other models of computation get at that more directly. 2022-07-10 14:06:58 largely because they were built on a different paradigm. 2022-07-10 14:07:00 a linear type btw consists of values that can only be acted upon once 2022-07-10 14:07:06 Yeah. 2022-07-10 14:07:14 this is model dichotomy. 2022-07-10 14:07:17 Oh, thank you - I thought about asking, but then decided I'd try to look it up later. 2022-07-10 14:07:24 turing/post's machines, and church's lambda expressions. 2022-07-10 14:07:34 so each function can have an "inputIO" value and return an "outputIO" value, with linear types 2022-07-10 14:07:41 church's input is "the expression". turing calls out inputs. 
2022-07-10 14:08:13 so in the process of reduction of that expression, in order to have any kind of I/O, you need something that gets "swapped out" for the value you're inputting on whatever I/O method you choose. 2022-07-10 14:08:39 that's where linear types come in. you hand me an "I/O token", I swap it for the thing on the input bus. 2022-07-10 14:08:44 but you can't use that again. 2022-07-10 14:09:09 notice, though, how that's a very real analogy. people talking to people. 2022-07-10 14:10:19 the difference between the use of an IO monad and an IO linear type, of course, is that the latter requires a lot of work to thread the IO linear type through your program 2022-07-10 14:10:19 ivory tower mental masturbation hides the fact that functional programming's strength is that it's stitching together data flow modules, and each module has a "pinout" (phrased as a type signature) that it's expecting to get hooked up to, in and out. 2022-07-10 14:10:35 it's why it makes a very good paradigm for HDLs. 2022-07-10 14:10:54 and it's why chuck moore moved very fast when making his own EDA systems. because forth is functional to a large extent. 2022-07-10 14:11:16 just with some additional shortcuts for I/O that you can eventually model into whatever flavor of I/O you want. 2022-07-10 14:12:51 forth can be functional yes, even though forth the way I've used it is not because so much of what is going on is talking to hardware directly or, in the case of my Forth, invoking concurrency 2022-07-10 14:13:33 the only difference between forth and haskell is the lack of enforced and well-typed stack effects, and the fact that the I/O isn't "hidden" from you. you can absolutely talk to the hardware in haskell. 2022-07-10 14:13:47 most of haskell is written in haskell. 2022-07-10 14:13:51 how else would it bootstrap. 
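The "I/O token" idea above — hand the token over, get the input value plus a fresh token back, and never use the old token again — can be mimicked at runtime in Python. A real linear type system enforces single use at compile time; this sketch only checks it dynamically, and all names here are illustrative:

```python
class IOToken:
    """One-shot I/O capability: using it consumes it and yields a successor."""
    def __init__(self):
        self._spent = False

    def read_line(self, source):
        # Swap the token for the next value on the "input bus",
        # plus a brand-new token for the next read.
        if self._spent:
            raise RuntimeError("linear IO token already consumed")
        self._spent = True
        return next(source), IOToken()

lines = iter(["hello", "world"])
t0 = IOToken()
value, t1 = t0.read_line(lines)   # ok: t0 is consumed, t1 replaces it
print(value)                      # hello
try:
    t0.read_line(lines)           # reusing t0 violates linearity
except RuntimeError as e:
    print(e)
```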
2022-07-10 14:15:38 most forths do not rely heavily on things like tail recursion, though, since most forths do not have real tail call "optimization", but rather use traditional imperative loops 2022-07-10 14:16:51 most forths aren't interested in combining functional programming with their stuff. 2022-07-10 14:17:03 you'd need to look at the general landscape of concatenative languages. 2022-07-10 14:17:09 yes, like Factor 2022-07-10 14:17:17 I've always meant to try it out, never have though 2022-07-10 14:17:21 it's alright. 2022-07-10 14:17:50 it's interesting in how far it's developed but it doesn't bring much new to the table. 2022-07-10 14:18:14 considering that it was originally just a scripting language on top of Java 2022-07-10 14:18:36 clojure is still that and powering a great deal of things. 2022-07-10 14:19:12 I've looked at Clojure and have felt that it's impressive but is just too Java-influenced for my taste 2022-07-10 14:19:28 ACTION used to program at Java at his previous job and isn't too fond of it 2022-07-10 14:19:39 there's some good ideas there, hickey, for what he's worth, is in the right area but with the wrong scope. 2022-07-10 14:20:19 personally I like GHC's implementation of STM better than Clojure's though 2022-07-10 14:20:32 the fact that you need it means we're doing something wrong. 2022-07-10 14:21:39 well, STM fits a traditional single-machine concurrency model well but does not extend to a multi-machine distributed concurrency model 2022-07-10 14:21:54 message passing is better. 2022-07-10 14:22:08 qualified isolation is better. 2022-07-10 14:22:44 you can implement message passing on top of STM, btw 2022-07-10 14:22:54 and you can implement STM on top of message passing. 2022-07-10 14:23:16 funny how "things talking to eachother" is such a universal concept. 2022-07-10 14:23:34 that's why erlang wins. 2022-07-10 14:23:38 I imagine going the opposite way is much harder 2022-07-10 14:23:48 trust me when I say it isn't. 
2022-07-10 14:27:37 of course in Forth I don't do any of this but rather do traditional imperative concurrency based on channels (whether queue or rendezvous), semaphores, locks (a specialized kind of semaphore of course), and the like 2022-07-10 14:28:34 rendezvous/blocking channels are a feature of biology. 2022-07-10 14:28:44 they're robust for a reason. 2022-07-10 14:36:27 I find in practice I can get better throughput in many cases with queue channels though 2022-07-10 14:37:38 because one task can fill up the entire queue and then block once the queue is full, then the other task can receive from the entire queue and block once the queue is empty, back and forth 2022-07-10 14:38:11 whereas a rendezvous channel is essentially a queue channel with a buffer of size 0 2022-07-10 14:38:23 so it always blocks 2022-07-10 14:39:12 yep. this is because you favor larger, long-running processes that target a single or few communication channels rather than phrasing your computation as the orchestration of tons of tiny processes that communicate locally over rendezvous channels. 2022-07-10 14:39:24 everybody does that and it's why erlang went with queues. 2022-07-10 14:42:26 the reason I go with larger, longer-running processes is that stack space is expensive on embedded systems, so having large numbers of tasks gets expensive quickly 2022-07-10 14:42:28 but 2022-07-10 14:42:53 you need a smaller model of processes. 2022-07-10 14:42:55 what I did do was implement something I called, for a lack of a better term, "actions", which are like processes minus stacks 2022-07-10 14:43:03 ayyy. 2022-07-10 14:43:03 I also think it's just easier for a lot of us to think in terms of long-running processes. 2022-07-10 14:43:16 May just be habit, but it's a comfort zone. 2022-07-10 14:43:25 KipIngram: long-running isn't bad. I guess the focal term is "larger". 2022-07-10 14:43:33 That's fair. 
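The batching argument above — a buffered channel lets the producer deposit several items before blocking, while a rendezvous channel blocks on every transfer — can be sketched with the standard library's `queue`. (Note one Python quirk: `queue.Queue(maxsize=0)` means *unbounded*, not a size-0 rendezvous channel, so the rendezvous case is only described in a comment.)

```python
import queue

# A buffered channel of capacity 3: a producer can deposit three
# items back-to-back before it would block.
chan = queue.Queue(maxsize=3)

for item in range(3):
    chan.put_nowait(item)        # fills the buffer without blocking

try:
    chan.put_nowait(3)           # fourth put: buffer full
except queue.Full:
    print("producer would block here")

# The consumer then drains the whole batch in one go.
batch = [chan.get_nowait() for _ in range(3)]
print(batch)                     # [0, 1, 2]

# A true rendezvous channel is the buffer-size-0 case described above:
# every put blocks until a matching get, so there is no batching at all.
```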
2022-07-10 14:43:40 actions do purely blocking messaging with one another, and are very, very fast too 2022-07-10 14:43:49 It's another "dimension" of the same thing that makes GA144 programming really different. 2022-07-10 14:43:51 monolithic. runs of sequential execution delimited by communication, rather than communication delimited by sequential execution. 2022-07-10 14:43:58 And you're right - those can be long running. 2022-07-10 14:44:07 They're just "small" compared to what many of us are used to. 2022-07-10 14:44:10 the cells in your body are long running in their relative timespan. 2022-07-10 14:44:17 Yeah. 2022-07-10 14:45:20 I still think digital circuits have some analogies with this. 2022-07-10 14:45:31 Each "event" at a gate in a digital circuit is a very small thing. 2022-07-10 14:45:40 there's a symmetry there. 2022-07-10 14:45:45 the key thing with "actions", though, is code that uses them has to be structured in terms of individual words that get called, have complete control over the parent task while they run, and then give up control, without maintaining stack space in between 2022-07-10 14:45:46 which is why Feather has a software and a hardware model. 2022-07-10 14:45:52 The gate does the same thing most of the time, but it does it over and over and those are independent events. 2022-07-10 14:46:26 they're rather similar to how asynchronous computing is done in JavaScript actually 2022-07-10 14:46:36 except JavaScript has closures 2022-07-10 14:46:40 and Forth does not 2022-07-10 14:46:47 unless you want it to. 2022-07-10 14:46:53 :-) 2022-07-10 14:47:28 I've always thought that digital logic, particularly asynchronous, is a good "training ground" for multi-thread programming. 2022-07-10 14:48:14 I find digital logic (from what I did in it at school back when at least) is very different from multitasking, at least after having really implemented a multitasking system 2022-07-10 14:48:29 there's no scheduler, that's for sure. 
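The "actions" described above — units that run with undivided control until they return, keeping no stack between activations — can be sketched as a round-robin loop over plain callables, where a continuation is just another callable rather than a saved stack. All names here are illustrative, not from the Forth system being discussed:

```python
from collections import deque

# Each "action" is a callable that runs to completion and optionally
# returns the next callable to schedule; no stack survives in between.
ready = deque()

def spawn(action):
    ready.append(action)

def run():
    while ready:
        action = ready.popleft()
        nxt = action()          # undivided control while it runs
        if nxt is not None:
            ready.append(nxt)   # reschedule its continuation

log = []

def step_a():
    log.append("a1")
    return step_a2              # continuation, not a saved stack frame

def step_a2():
    log.append("a2")

def step_b():
    log.append("b")

spawn(step_a)
spawn(step_b)
run()
print(log)                      # ['a1', 'b', 'a2'] -- the actions interleave
```

Because each activation returns before another runs, the only per-action state is whatever the continuation closes over or stores explicitly — which is the JavaScript-style async pattern mentioned above, minus closures unless you add them.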
2022-07-10 14:48:48 Single-core/single-thread is kind of like a one-wire comm channel. You send one bit, then you send the next, etc. etc. Multi-core/multi-thread is like a general circuit, with stuff happening all over the place with independent timing to a large extent. 2022-07-10 14:48:49 components need to be self-timing, which means handshakes for literally every transaction along component boundaries. 2022-07-10 14:49:04 in multitasking an individual task has undivided control over a core, interrupts aside, until it blocks or the scheduler decides that its time is up and gives control to another task 2022-07-10 14:49:47 the idea that every task is running at once is an illusion 2022-07-10 14:49:56 Well, I don't disagree. I'm just talking about the timing / race condition issues that can arise *between threads*. You can get that same kind of thing in general asynchronous circuits. 2022-07-10 14:50:03 Same type of "faults" can byte you. 2022-07-10 14:50:10 heh, definitely. 2022-07-10 14:50:11 So I'm only trying to draw a fairly limited analogy. 2022-07-10 14:50:37 And when I say it's a good "training ground," I only mean for certain issues that arise in multi-thread programming, not the whole business. 2022-07-10 14:50:41 yeah 2022-07-10 14:50:49 Dog bytes man, film at 11 2022-07-10 14:50:54 race conditions are much more acute in digital logic 2022-07-10 14:51:04 but that's almost a good thing 2022-07-10 14:51:06 The point being that single-thread programming doesn't really prepare you for those issues at all. 2022-07-10 14:51:10 because they bite you much sooner 2022-07-10 14:51:13 It's very "neat and tidy" by comparison. 2022-07-10 14:51:23 sequential execution, man. 2022-07-10 14:51:25 whereas in multitasking you can go quite a while without your race condition biting you 2022-07-10 14:51:26 it's a drug. 2022-07-10 14:51:43 Yes, that's a perfect description. 
2022-07-10 14:52:12 yeah, desync or race conditions can actually screw up far more than local effects in a multithreaded environment. 2022-07-10 14:52:17 Right - I don't necessarily claim the training works both ways. 2022-07-10 14:52:19 it can cascade into.. everything. 2022-07-10 14:52:34 the chip can burst into flames. 2022-07-10 14:52:42 (not really but in a manner of speaking.) 2022-07-10 14:52:50 Well, in digital logic you have metastability. 2022-07-10 14:52:58 And that's just... a whole different ballgame. 2022-07-10 14:53:06 I'd be impressed if I saw a chip deflagrate due to a race condition lol 2022-07-10 14:53:07 That can REALLY ruin your day. 2022-07-10 14:53:34 How long metastable conditions persist is a probability thing. 2022-07-10 14:53:34 it can ruin your day, but problems that happen sooner are more reliably better in the end 2022-07-10 14:53:41 Less probable for it to last longer. 2022-07-10 14:53:52 But it's *possible* for it to last any length of time you choose. 2022-07-10 14:53:55 tabemann: halt and catch fire! ;) 2022-07-10 14:54:00 It's like balancing a ball on the tip of a pencil. 2022-07-10 14:54:17 In theory it might sit there forever, but we know it's not going to. 2022-07-10 14:54:18 easier if you stab the ball 2022-07-10 14:54:26 :-) 2022-07-10 14:54:57 Sooner or later noise will take care of it, and it will find its way back into some valid state 2022-07-10 14:55:07 But God knows what state that might be, across the whole circuit. 2022-07-10 14:55:08 my most fun multitasking bugs are ones that take long periods of time to manifest 2022-07-10 14:55:42 like bugs I could only make manifest if I ran my code overnight and while I worked and eventually came back to it to see that it had failed 2022-07-10 14:56:31 Just in case someone doesn't know - metastability happens when you clock a flip flop when its input isn't decisively "high" or "low" in keeping with the setup and hold times. 
Flip flop doesn't know what to do with that input, and it can take an arbitrary length of time for it to make a choice. 2022-07-10 14:57:06 If it lasts long enough, its output will then be out of valid range when the next clock comes, so metastability can be contagious. 2022-07-10 14:57:17 yeah 2022-07-10 14:57:48 at Amazon, my code handles traffic from all physical and digital ordering streams along with all of the metadata/"transaction tracking" services that multiply that traffic by 5. if we have a fault, it spreads to the point where a billion dollar bleed is a certainty and not a nightmare you get to wake up from. 2022-07-10 14:57:55 what I'm talking about is race condition issues that take many hours to manifest themselves 2022-07-10 14:58:14 cross-service race conditions and serving 800+ independent business teams is fun. 2022-07-10 14:58:28 if one of them fucks up, your system needs to be responsive and resilient to it. 2022-07-10 14:58:29 Sounds like it could have some real nightmare aspects. 2022-07-10 14:58:41 that makes me glad I now work on MRI machines 2022-07-10 14:58:42 it's why I harp on resiliency. our systems self-heal. 2022-07-10 14:58:47 so much fun when the wheels of the payments wagon come off 2022-07-10 14:58:57 thrig: it makes audits more exciting. ;) 2022-07-10 14:59:11 I worked in Amazon Payments for a while 2022-07-10 14:59:47 oh fun. you work with Herd at all? 2022-07-10 15:00:16 this was a while ago, pre 2010 2022-07-10 15:00:34 "things are different" is an understatement. 2022-07-10 15:00:35 heh. 2022-07-10 15:01:03 3000 line shell scripts to do payment batching, etc 2022-07-10 15:01:21 the idea of doing payments with 3000 line shell scripts is scary 2022-07-10 15:01:24 we still do that.. buuuut in groovy... 2022-07-10 15:01:33 ....and ion... 
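The "less probable for it to last longer" point above follows the standard first-order metastability model, which the discussion doesn't spell out. Here is a sketch of that model in Python; the device constants are purely illustrative, not from any real flip-flop:

```python
import math

# First-order metastability model: the chance a flip-flop is still
# metastable after settling time t decays exponentially, P(t) ~ e^(-t/tau).
# Synchronizer MTBF is commonly estimated as:
#     MTBF = e^(t_settle / tau) / (T0 * f_clk * f_data)
# All constants below are illustrative placeholders, not real device data.
tau   = 50e-12   # resolution time constant (s)
T0    = 0.1e-9   # metastability capture window (s)
f_clk = 100e6    # clock frequency (Hz)
f_dat = 1e6      # asynchronous data transition rate (Hz)

def mtbf(t_settle):
    """Mean time between synchronizer failures, in seconds."""
    return math.exp(t_settle / tau) / (T0 * f_clk * f_dat)

# Every extra tau of settling time multiplies MTBF by e, which is why
# adding one more synchronizer stage (a whole extra clock period)
# improves reliability astronomically.
print(mtbf(2e-9))    # MTBF with 2 ns of settling time
print(mtbf(10e-9))   # vastly larger with 10 ns
```

This is why "it's possible for it to last any length of time" and "less probable for it to last longer" are both true: the tail never reaches zero, it just shrinks exponentially.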
2022-07-10 15:01:43 the idea of doing anything mission-critical with large shell scripts is scary 2022-07-10 15:01:57 ACTION is not a fan of shell as you can imagine 2022-07-10 15:02:27 it got replaced with some Java version that had the exact same business logic bug as the old code did 2022-07-10 15:02:28 and groovy is like "let's program in Java... but without compile-time type-safety" 2022-07-10 15:03:10 we're downstream from all the upstream so we get heaped on crap like nobody's business. 2022-07-10 15:03:40 there's a multi-million dollar a day bleed because someone thought modeling the same events with a different name wouldn't result in some interesting... conflicts.. with line items. 2022-07-10 15:04:19 I'll be back in a bit 2022-07-10 15:04:22 o/ 2022-07-10 15:04:38 It's worth noting that asynchronous systems have less trouble with metastability issues, because each part can wait as long as it needs to - if you design it to "do nothing" unless its input is *valid*, then things just slow down a little and then move on properly most of the time. 2022-07-10 15:05:38 it's a beautiful concept. shamefully underutilized, though. 2022-07-10 15:06:50 Yeah. Synchronous became a habit. 2022-07-10 15:07:06 velocity keeps it up. 2022-07-10 15:07:17 And it's often more logic-efficient, since lots of elements of the circuit share their timing control. 2022-07-10 15:07:25 Early on efficiency was important. 2022-07-10 15:07:38 Well, still is, I guess. 2022-07-10 15:07:40 hard to get a clock signal from point A to point B if dist(A, B) is large enough, though. 2022-07-10 15:07:59 Yes, that's true, but dist(A,B) just kept getting smaller and smaller over the years. 2022-07-10 15:08:07 that's why you strap an atomic clock to your wrist, and then 2022-07-10 15:08:23 it's more like it flatlined. 2022-07-10 15:08:37 That was an interesting bit of that Belt machine we talked about a couple months ago. I'd never given thought to there being a down side of a larger cache. 
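The self-timed "do nothing unless the input is valid" idea above can be modeled in a few lines. This is a toy sketch of a valid/ack handshake, not a real asynchronous-logic design; all the names are invented for illustration:

```python
# Toy model of the self-timed handshake described above: the consumer
# refuses to act until its input is flagged valid, so an undecided input
# simply delays the transfer instead of propagating a garbage value.
class Producer:
    def __init__(self, items):
        self.items = list(items)
        self.valid = False   # "req" line: data is stable and may be read
        self.data = None

    def step(self):
        if not self.valid and self.items:
            self.data = self.items.pop(0)
            self.valid = True

class Consumer:
    def __init__(self):
        self.received = []

    def step(self, producer):
        if producer.valid:               # do nothing unless input is valid
            self.received.append(producer.data)
            producer.valid = False       # "ack": producer may send the next

p, c = Producer([1, 2, 3]), Consumer()
for _ in range(10):   # step count doesn't matter; only ordering does
    p.step()
    c.step(p)
# c.received is now [1, 2, 3]
```

The point of the sketch is that correctness here depends only on the ordering of valid/ack events, never on absolute timing, which is exactly why metastable hesitation only costs latency.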
2022-07-10 15:08:39 you have more circuitry but it's more dense, the electrical paths' distances are growing. 2022-07-10 15:08:48 But if the cache is larger, parts of it are further away, and that can slow things down. 2022-07-10 15:08:52 So there's a sweet spot. 2022-07-10 15:08:57 yeah. compute wants to be next to data. 2022-07-10 15:09:00 like, really close to it. 2022-07-10 15:09:08 it's why trillion-core chips are the future. 2022-07-10 15:09:14 Makes total sense. 2022-07-10 15:09:44 we just don't know how the fuck to program them yet. 2022-07-10 15:12:42 because we're still thinking in terms of large tasks. I don't even think our minds work on that level, it's why we need task boards and decomposition. 2022-07-10 15:12:43 Similar with quantum computers. We have a few algorithms, but I don't think we *really* understand the lay of that land yet. 2022-07-10 15:13:01 As in, we don't have a general theory of what we can do with them and what we can't, like we do with classical computers. 2022-07-10 15:13:42 Some encryption schemes we know are quantum vulnerable, but others we "think" are not. But we're not sure. 2022-07-10 15:13:56 it's unexplored. it's the edge. 2022-07-10 15:14:01 Yeah. 2022-07-10 15:14:05 we're still working on the mainland. 2022-07-10 15:14:17 And damn hard to build, too, from an engineering perspective. 2022-07-10 15:14:37 Quantum things don't generally like to stay to themselves; they like to interact with everything around them. 2022-07-10 15:16:33 QDCA is an interesting contender for logic. 2022-07-10 15:16:48 https://en.wikipedia.org/wiki/Quantum_dot_cellular_automaton 2022-07-10 15:17:20 Yeah, and there are some interesting ideas based on knot theory too, that seem to be a bit more resistant to decoherence. 2022-07-10 15:17:32 But no "practical successes" yet. 2022-07-10 15:17:57 I figure when quantum computers come they may come suddenly; someone "finds a new way" that just... works. 
2022-07-10 15:18:19 Particularly if they're ever going to be cheap and small. 2022-07-10 15:18:43 But I just don't know to what extent all of us "need" quantum computing. 2022-07-10 15:18:48 we don't. 2022-07-10 15:18:54 Won't really help us check our email and update our Facebook, etc. 2022-07-10 15:19:02 it won't even help us build or compute in new ways. 2022-07-10 15:19:11 I think the chemical and pharma industries will be the biggest beneficiaries. 2022-07-10 15:19:17 yeah, always domain specific. 2022-07-10 15:19:42 Maybe some logistics type operations. 2022-07-10 15:20:25 my design partner is heavily interested in QC circuits. 2022-07-10 15:20:28 There's some famous quote where the CEO of IBM once estimated the world demand for computers at about five. 2022-07-10 15:20:44 In one of the biggest errors in history. :-) 2022-07-10 15:20:53 But it may come closer to being true for quantum computers. 2022-07-10 15:20:55 It won't be five. 2022-07-10 15:21:05 But I doubt there will be one in everyone's pocket, too. 2022-07-10 15:21:30 once popsci gets over it, it'll be better. 2022-07-10 15:21:54 I think a lot of folks don't understand that, though - they see the word "quantum" and mentally replace it with "magic" and just assume it's "more and better in every way" and will *displace* conventional computing. 2022-07-10 15:22:45 Yeah, popsci loves "quantum." And they hype the "mysticism" of it more or less to death. 2022-07-10 15:22:54 Makes their articles more sensational. 2022-07-10 15:23:05 look for the financial incentive and that's where you'll find innovation. 2022-07-10 15:23:19 Follow the money. 2022-07-10 15:24:24 That's exactly why I think chemical and pharma will be the big adopters. There's a whole lot of potential payoff there for them, in having access to cheaper, more correct models of chemical processes. 2022-07-10 15:25:01 And it really might usher in a golden age in those industries. 
We might wind up with drugs and stuff that seem like magic compared to what we have now. 2022-07-10 15:25:23 Super materials, etc. 2022-07-10 15:26:03 I'll be glad to see it. but there's incentives that lie beyond miracles. 2022-07-10 15:26:26 if anything, it'll be the deprivation/gradual rollout of those miracles. 2022-07-10 15:29:38 I may be over-expecting. I read The Expanse a few months ago; it's hard not to think "protomolecule" here. 2022-07-10 15:29:53 Though obviously that was literary hyperbole. 2022-07-10 15:30:07 That thing more or less was "magic." 2022-07-10 15:30:27 magical mcguffin is a common trope, heh. 2022-07-10 15:30:47 Yeah. It's a pretty good story, in spite of how heavily they leaned on it. 2022-07-10 15:31:43 And they did a great job with the first couple of TV seasons too - one of the best book->TV "conversions" I've ever seen. 2022-07-10 15:32:14 The show diverged some in season 3, unfortunately. 2022-07-10 15:32:45 I've been looking for some series to watch while I do work, is it worth a watch? 2022-07-10 15:32:53 is the acting non-shit. 2022-07-10 15:33:02 Yes it is. 2022-07-10 15:33:10 Well, I thought the acting was completely competent. 2022-07-10 15:33:37 I've only watched the first three seasons, but 1 and 2 were some of the best TV I've seen in a long long time. 2022-07-10 15:33:54 Definitely recommend at least trying it out. 2022-07-10 15:34:03 I'll put it on my list! 2022-07-10 15:34:13 Cool; let us know! 2022-07-10 15:34:31 sci-fi is rarely done well IMO. 2022-07-10 15:34:44 I'm not a purist but acting and suspension of disbelief. 2022-07-10 15:35:10 I agree. 2022-07-10 15:36:07 McGuffin aside, I was pretty impressed with the "hard scifi" aspect of it. No mystical tech that is just beyond belief. 2022-07-10 15:36:28 The science is advanced but believable. 2022-07-10 15:36:42 The tech, rather. 
2022-07-10 15:37:01 the ships do somehow move at the speed of plot 2022-07-10 15:37:51 Yes, they don't share the dwell of cross-solar system travel. In the books it's more clear that the appropriate length of time has elapsed, though. 2022-07-10 15:39:29 it'd be interesting to see the details of that travel. 2022-07-10 15:40:10 They do have a "space drive" tech that makes the whole "mastery of the solar system" thing possible - the "Epstein drive." But I actually have a theory about how that could work that is in tune with the laws of physics. 2022-07-10 15:40:45 It doesn't let them go at super speed - it mostly just lets them dodge the usual burden of having to carry a ridiculous amount of rocket fuel around with them. 2022-07-10 15:41:26 My theory is that it would work by somehow ejecting the propellant at near the speed of light, so you'd get the benefit of that exhaust becoming much more massive because of going so fast. 2022-07-10 15:41:50 Propellant would give you a lot more bang for the buck than it does in real rockets. 2022-07-10 15:42:01 So you wouldn't need as much for long missions. 2022-07-10 15:43:13 Anyway, I don't think anything I've said is a spoiler, but I'll shut up about it now. I predict you'll enjoy it at least some. 2022-07-10 15:43:28 I'm not terribly easy to impress, with TV or with books. 2022-07-10 15:43:36 And that one sucked me in pretty good. 2022-07-10 15:43:48 some massive accelerator on par with the LHC with some energy improvements would actually be feasible. 2022-07-10 15:43:57 just with much larger or more consistent mass streams. 2022-07-10 15:44:07 The main problem with books these days is that I've now read that urban fantasy series The Dresden Files, and now nothing else really compares for me. 2022-07-10 15:44:18 oh? 2022-07-10 15:44:28 Oh yeah - it's almost "too good." 
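The near-light-exhaust theory above can be sanity-checked with the standard relativistic rocket equation. The delta-v and exhaust speeds below are illustrative guesses, not figures from the books:

```python
import math

# Back-of-envelope check of the idea above: exhaust ejected at speed v
# carries momentum gamma*m*v per unit of propellant rest mass, so faster
# exhaust means far less propellant for a given delta-v.
c = 299_792_458.0  # speed of light, m/s

def gamma(v):
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

def mass_ratio(delta_v, v_exhaust):
    # Relativistic rocket equation, solved for the mass ratio m0/m1:
    #   delta_v = c * tanh((v_ex / c) * ln(m0 / m1))
    return math.exp(math.atanh(delta_v / c) * c / v_exhaust)

dv = 0.001 * c   # ~300 km/s: an illustrative "run around the system" budget

chemical = mass_ratio(dv, 4500.0)    # roughly chemical-rocket exhaust speed
epstein  = mass_ratio(dv, 0.9 * c)   # hypothetical near-light exhaust

# At 0.9c each kilogram of propellant carries gamma*v ~ 2.07c of momentum
# per unit rest mass, versus v for slow exhaust.
print(chemical)  # an absurd fuel fraction
print(epstein)   # barely more than 1: almost all ship, hardly any fuel
```

The contrast in mass ratios is the whole story: the drive doesn't make the ship fast so much as it makes the fuel bill survivable, which matches the "dodge the ridiculous amount of rocket fuel" description above.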
2022-07-10 15:44:51 There was one season of a TV show for it back around 2005 or so, but it doesn't adhere to the books much at all; it was "ok," but the books are OUTSTANDING. 2022-07-10 15:45:16 Author is Jim Butcher - first novel is Storm Front. 2022-07-10 15:45:19 huh, my wife's really into urban fantasy.. maybe I'll grab a copy for her. 2022-07-10 15:45:25 There are 17 novels and two short story anthologies so far. 2022-07-10 15:45:42 We're expecting about 25 novels total, with the last three being a "big apocalyptic trilogy." 2022-07-10 15:45:48 damn. 2022-07-10 15:46:20 Yeah - he cranked out the first 15 novels in about 15 years, and then there was a big dwell, and he's taken to alternating between Dresden and another series. I think it's not as fun for him as it used to be. 2022-07-10 15:46:27 But officially at least he's "hanging in there." 2022-07-10 15:46:43 that always hurts. :\ 2022-07-10 15:46:44 I sure hope he makes it to the end - I really want to finish this one. 2022-07-10 15:47:00 Oh yeah - the fans were climbing the walls during that long wait. 2022-07-10 15:47:09 I imagine his publisher wasn't awfully happy either. 2022-07-10 15:47:32 Surely they had financial forecasts based around that "one a year" pace he'd set for 15 years. And then... nothing. And more nothing. 2022-07-10 15:47:40 Then in 2020 two new ones came out. 2022-07-10 15:47:47 Those are the last two currently. 2022-07-10 15:47:58 at least he continued. 2020 was a shakeup year for old stuff. 2022-07-10 15:48:04 NiN did a thing. 2022-07-10 15:48:04 It was originally intended to be one, but it came in too big and the publisher had him split it. 2022-07-10 15:48:11 They're really one story. 2022-07-10 15:48:49 The series he's alternating with is called Cinder Spires; there's one book out so far called The Aeronaut's Windlass. It's outstanding too. 2022-07-10 15:48:57 So I really look forward to seeing where that one goes. 2022-07-10 15:49:04 He's currently working on Cinder Spires #2. 
2022-07-10 15:49:04 I need to read more.. 2022-07-10 15:49:25 Cinder Spires is a steampunk genre thing. 2022-07-10 15:49:37 aha I'm into that. 2022-07-10 15:49:59 Well, try that one out too; I was really impressed with Windlass as a "first installment." 2022-07-10 15:50:27 With Dresden it took him several books to really get the "world" rolling, but I think the Spires 'verse is already "in flight." 2022-07-10 15:50:43 Of course when he wrote those first Dresden books he was young and inexperienced. 2022-07-10 15:50:52 it's difficult to bootstrap a world. 2022-07-10 15:50:53 He's matured a lot now; it's over twenty years later. 2022-07-10 15:54:41 back 2022-07-10 15:58:01 web. 2022-07-10 21:13:07 Hey, do typical direct threaded Forths have the entire code handler for definitions, variables, and so on just before the parameter field? Or is it typically a "snip" to secure the PFA and a jump to the routine proper? 2022-07-10 21:14:06 I would argue that if it's the latter then that largely defeats any speed advantage direct threading might have - you've wound up doing two direct jumps vs. one indirect jump. 2022-07-10 21:16:00 Those routines aren't very big - I could see someone just putting a copy down in every place. 2022-07-10 21:23:26 I think the whole point of direct threading is that all such stuff is copied before each definition 2022-07-10 21:23:39 which is what gives its performance boost over indirect threading 2022-07-10 21:31:13 I think that would be the only way to get that boost, so I figure you're right. 2022-07-10 21:31:50 And that boost is the only reason I know to prefer direct threading, so if you gave that up you might as well have done indirect. 2022-07-10 21:32:35 I'm fairly convinced that at the very least create/does> is easier to implement if you're indirect. 2022-07-10 21:32:56 I guess maybe I just did it in a dumb way when I did a direct threaded version of it. 2022-07-10 21:33:06 But it sure went better the next time, when I did indirect. 
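The dispatch difference under discussion can be sketched abstractly. In a real Forth this is machine code, so the Python model below only shows the structural difference (the extra fetch in indirect threading), not actual NEXT timing; the primitives and runner functions are invented for illustration:

```python
# Illustrative model of threaded-code dispatch. Each primitive is a
# function; a colon definition's body is a list of execution tokens.
stack = []

def lit_two(vm): stack.append(2)                     # primitive: push 2
def dup(vm):     stack.append(stack[-1])             # primitive: DUP
def plus(vm):    b, a = stack.pop(), stack.pop(); stack.append(a + b)

# Indirect threading: each cell in a definition points at a code *field*,
# and dispatch must read that field to find the handler -- one extra fetch.
def run_indirect(body):
    for code_field in body:       # IP advances cell by cell
        handler = code_field[0]   # the extra indirection
        handler(code_field)       # handler gets the word's own address

two_cf, dup_cf, plus_cf = (lit_two,), (dup,), (plus,)

stack.clear()
run_indirect([two_cf, dup_cf, plus_cf])   # 2 DUP +

# Direct threading: each cell *is* the handler address; no second fetch.
# This only pays off if every CREATEd word carries its handler (or a jump
# to it) inline, per the discussion above.
def run_direct(body):
    for handler in body:
        handler(None)

stack.clear()
run_direct([lit_two, dup, plus])          # same program, one less fetch
print(stack)  # [4]
```

Both runners compute 2 DUP + = 4; the only difference is whether dispatch reads through a code field first, which is exactly the cost that distinguishes ITC from DTC.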
2022-07-10 21:35:27 I've never done a code threaded system. 2022-07-10 21:41:43 I can't remember what processor it was, but a long time ago I ran into one case where direct threading was faster than code threading. Speaking only of the threading - the subroutine call on that processor was slow for some reason. 2022-07-10 21:42:08 I know code threading offers opportunity to inline as well; that probably would have tipped things in favor of code threading overall. 2022-07-10 21:45:18 One thing to keep in mind is that when you're doing non-inlined primitives in code threading you have to call the primitive and return from it; with direct and indirect threading you just go directly from primitive A to primitive B. 2022-07-10 21:47:12 No return stack activity. 2022-07-10 21:52:43 I think if I were starting over I'd consider having a CFA/PFA table at the low address end of my body region. Instead of storing them right by the names in the header region. The way I've done it this time, the definitions cells point to the CFA field IN THE HEADER. So I can't do without the header region. Can't have a headerless target. The new way I'm thinking of would require an extra pointer; the two 2022-07-10 21:52:45 pointers that are in the header now would move down into the runtime region, and then I'd still have to have a pointer in the header to those. But what it would gain me would be that everything needed to run compiled code would be in the one region, and that would be the region that would be copied to a headerless target. 2022-07-10 21:53:24 Those CFA/PFA pointers would never move, but I could move the definitions that follow them up if I needed more room. 2022-07-10 21:53:51 The definitions themselves would be completely relocatable. 2022-07-10 21:54:17 Since they would refer to one another through that CFA/PFA table. 
2022-07-10 21:56:10 If I moved the definitions up N bytes to make room for more CFA/PFA pairs, I'd just adjust all the CFA's and PFA's by N.
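The CFA/PFA-table layout sketched above might look like this. The class, addresses, and index scheme are hypothetical, chosen only to show that relocation touches the table and nothing else:

```python
# Hypothetical model of the proposed layout: compiled definitions reference
# each other through small indices into a CFA/PFA table at the low end of
# the body region. Moving the definitions up by N bytes then means patching
# only the table entries; the definitions themselves never change.
class BodyRegion:
    def __init__(self):
        self.table = []   # one [cfa, pfa] entry per word

    def add_word(self, cfa, pfa):
        self.table.append([cfa, pfa])
        return len(self.table) - 1   # compiled code stores this index

    def relocate(self, n):
        # Definitions moved up n bytes to make room for more table entries:
        # adjust every CFA and PFA, and nothing else.
        for entry in self.table:
            entry[0] += n
            entry[1] += n

    def cfa(self, idx): return self.table[idx][0]
    def pfa(self, idx): return self.table[idx][1]

region = BodyRegion()
dup_idx = region.add_word(cfa=0x100, pfa=0x104)
region.relocate(0x40)   # room for more CFA/PFA pairs
# region.cfa(dup_idx) is now 0x140, but dup_idx itself never changed,
# so compiled references (indices) survive relocation untouched.
```

Since compiled cells hold indices rather than addresses, the whole body region stays position-independent, which is what makes the headerless-target copy possible in the scheme described above.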