2023-09-04 02:11:05 So, now I keep thinking about roots of polynomials. We can use Newton's method to find a root, and I think that works fine with real or complex roots. And it's easy enough to divide an (x-root), or an (x^2 - 2*Re(root)*x + |root|^2) factor for a complex conjugate pair, out of the polynomial. I'm wondering, though, how feasible would it be to go straight for an (x^2+a*x+b) term, and divide those out? Then you'd only ever be doing real
2023-09-04 02:11:07 arithmetic. And we have an explicit expression for the roots associated with a term like that.
2023-09-04 02:14:56 Newton's method is x(i+1) = x(i) - f(x(i))/f'(x(i)).
2023-09-04 02:16:04 So in what I'm thinking about, it's the coefficients a and b that are the independent variables, and the "function" we seek to zero out is the remainder after dividing x^2+a*x+b out of the original polynomial. It will have linear and constant terms - call them u and v.
2023-09-04 02:16:46 f(a,b) would just be (u,v), and f'(a,b) would, I think, be the Jacobian matrix formed of partial derivatives:
2023-09-04 02:17:03 [ du/da du/db ]
2023-09-04 02:17:12 [ dv/da dv/db ]
2023-09-04 02:17:43 Dividing by that matrix would be multiplying by its inverse.
2023-09-04 02:18:39 So, [a,b](i+1) = [a,b](i) - inverse(Jacobian)*[u(a,b), v(a,b)], treating [a,b] and [u,v] as column vectors.
2023-09-04 02:21:04 https://web.mit.edu/18.06/www/Spring17/Multidimensional-Newton.pdf
2023-09-04 02:22:33 https://imgur.com/a/fpZfz7v
2023-09-04 05:57:25 Wow, that "Spinors for Beginners" video playlist I linked yesterday is just outstanding. I finished the whole thing just now.
2023-09-04 05:57:54 I think there's more to come - he made reference to the "next video" during the last playlist item. I'll be watching for that one.
2023-09-04 05:58:25 This heavily leaned on that "geometric algebra" stuff I've mentioned here before. That stuff that "isn't really taught" in the mainstream curriculum that I think should be.
2023-09-04 05:58:44 It just makes everything so damn streamlined.
2023-09-04 05:59:13 It's also sometimes referred to as "Clifford algebras."
2023-09-04 06:01:10 It's a pretty simple idea - you have some number of basis vectors, and if p of them square to +1 and q of them square to -1, that's the Clifford algebra Cl(p,q). It's always the case that distinct basis vectors anti-commute: e1*e2 = -e2*e1. Those rules are enough to specify the whole deal.
2023-09-04 06:01:38 Special relativity treats time and space differently, so you can use either Cl(1,3) or Cl(3,1).
2023-09-04 06:02:04 Complex arithmetic is Cl(0,1).
2023-09-04 06:02:27 And quaternion arithmetic is Cl(0,2).
2023-09-04 06:03:19 At first I wanted to say Cl(0,3), because all of the i, j, k gizmos in quaternions square like imaginary numbers.
2023-09-04 06:04:39 But Cl(0,3) is eight-dimensional; the quaternions are the four-dimensional Cl(0,2), with k showing up as the bivector e1*e2.
2023-09-04 06:37:47 The stuff I speculated on earlier - using Newton's method to find quadratic factors of polynomials - raises the question of the initial guess on each iteration. Newton's method is always subject to trouble if your initial guess is poor. However, there's a trick that can be used generally. Let's say I've got a cubic polynomial ax^3 + bx^2 + cx + d and I'm looking for a quadratic factor x^2 + Ax + B.
2023-09-04 06:37:49 One possibility is to form the modified cubic:
2023-09-04 06:39:16 f*ax^3 + g*bx^2 + cx + d, and initially set f=0 and g=1/b. Then use A=c and B=d. For those initial settings of f and g that's an exact solution.
2023-09-04 06:40:09 Now perturb f and g slightly; your candidate quadratic will still be "close," so Newton's method should quickly converge. Then nudge f and g up and repeat. Just keep going until you get f and g set so you've got your full target polynomial "engaged."
2023-09-04 06:40:39 I think that would entirely solve the initialization problem.
2023-09-04 06:44:12 What I'm scratching my head over now is the best way to get that Jacobian matrix. The brute force way would be to calculate the partial derivatives numerically, but I wonder if the polynomial division algorithm could be modified to get an exact calculation of them as an additional result of the division process.
2023-09-04 07:16:36 I just have a feeling that could be slipped into the division process. I worked through an example, with the cubic main poly and the candidate quadratic above, and was able to write down concise expressions at the end for the partials. I'm sure they'd get more complicated as the main poly order went up, but... this is kind of what computers are for.
2023-09-04 07:17:56 If I were to write the division myself in Python, though, the main calculation would run as interpreted code, whereas if I use numpy or something the division process would be much faster. It might wind up being more efficient to do that and just accept having to do the partials numerically.
2023-09-04 07:18:36 I'd just have to run it three times, for (A,B), (A+dA,B), (A,B+dB).
2023-09-04 07:20:23 Of course, the whole point here is really for fun - if I were using Python I'd just have numpy find the roots and be done with it.
2023-09-04 17:57:28 is '.' an alias for some more meaningful word?
2023-09-04 18:03:47 "display n in free field format" according to ANS
2023-09-04 18:07:29 No - it's just the name that got chosen for that, for whatever reason.
2023-09-04 18:07:49 I use . as a "prefix" for a subset of my words, and in that context I attach a particular meaning to it.
2023-09-04 18:08:08 It means "do what the non-prefixed word would do, but hang on to the deepest stack item that would normally be discarded."
2023-09-04 18:09:18 It's useful when you need to test a value against a whole set of comparison cases. = will leave the flag on the stack as normal, and it discards the second argument - the value you compared TO - but it keeps the value you were comparing so you can easily compare it to something else.
2023-09-04 18:09:31 That is, .= would do that.
2023-09-04 18:09:47 No "reason" for the character choice - I just "decided on something."
2023-09-04 18:10:10 But it worked out well with .. which does exactly what . does but keeps the argument.
2023-09-04 18:10:15 That's handy for debugging.
2023-09-04 18:10:25 Just a "peek at the top item" ability.
2023-09-04 18:10:59 And to be clear, there is no intelligence around this in my compiler - they're just names of words.
2023-09-04 18:11:42 Like, if I were to try to type a b .+ in hopes of winding up with a and a+b on the stack, that wouldn't work; it doesn't "figure anything out."
2023-09-04 18:12:51 So, indeed polynomial roots are easy in Python.
2023-09-04 18:12:58 import numpy as np
2023-09-04 18:13:12 np.roots([1, 4, 6, 4])
2023-09-04 18:13:14 for example.
2023-09-04 18:56:49 KipIngram, i see
2023-09-04 19:00:01 You know, on these calculators that have a limited stack, when you drop the stack the deepest reg will keep its value. So if you fill the stack with a value, you get as many copies of it as you want. It's great for evaluating polynomials - you just fill the stack with x, key in the leading coefficient, and then iterate on * coef +.
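In Python terms, that stack-fill evaluation is just Horner's rule - the accumulator below plays the part of the X register, and multiplying by x each pass is what the replicated T register provides on the calculator. A minimal sketch (the function name is made up):

def horner(coeffs, x):
    # Coefficients are given highest power first, mirroring the calculator
    # loop: key the leading coefficient, then repeat "multiply by x, add
    # the next coefficient".
    acc = coeffs[0]
    for c in coeffs[1:]:
        acc = acc * x + c
    return acc

horner([1, 4, 6, 4], 2.0)   # x^3 + 4x^2 + 6x + 4 at x = 2 -> 40.0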
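And, coming back to the quadratic-factor hunt from earlier in the log, here's a rough Python sketch of that scheme: Newton iteration on (A, B) driving the remainder (u, v) to zero, with the Jacobian estimated numerically by running the division three times as described - essentially Bairstow's method. All the names, the step size, the tolerance, and the starting guess below are arbitrary choices for illustration, not a worked-out implementation:

def divide_quadratic(c, A, B):
    # Divide the polynomial with coefficients c (highest power first)
    # by x^2 + A*x + B; return the quotient coefficients and the
    # remainder u*x + v.
    c = list(c)
    q = []
    for i in range(len(c) - 2):
        lead = c[i]
        q.append(lead)
        c[i + 1] -= A * lead
        c[i + 2] -= B * lead
    return q, c[-2], c[-1]

def quadratic_factor(c, A, B, tol=1e-12, h=1e-7, max_iter=100):
    # Newton-iterate on (A, B) until x^2 + A*x + B divides c to within tol.
    for _ in range(max_iter):
        _, u, v = divide_quadratic(c, A, B)
        if abs(u) < tol and abs(v) < tol:
            break
        # Jacobian [[du/dA, du/dB], [dv/dA, dv/dB]] by forward differences,
        # from divisions at (A, B), (A+h, B), (A, B+h).
        _, ua, va = divide_quadratic(c, A + h, B)
        _, ub, vb = divide_quadratic(c, A, B + h)
        j00, j01 = (ua - u) / h, (ub - u) / h
        j10, j11 = (va - v) / h, (vb - v) / h
        det = j00 * j11 - j01 * j10
        # [dA, dB] = inverse(J) * [u, v]
        A -= (j11 * u - j01 * v) / det
        B -= (-j10 * u + j00 * v) / det
    return A, B

# x^3 + 4x^2 + 6x + 4 = (x^2 + 2x + 2)(x + 2), so from a nearby guess this
# should land on A = 2, B = 2 - consistent with np.roots([1, 4, 6, 4]) above.
print(quadratic_factor([1, 4, 6, 4], 1.0, 1.0))

The exact-partials idea speculated about above would just replace the two extra divisions with recurrences carried along inside the main division.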
2023-09-04 19:00:45 About that stack-fill trick, though - I just realized it's not as nice for evaluating a poly's derivative, because you'd at least have to specify the starting exponent, which you don't need to specify for straight evaluation.
2023-09-04 19:00:45 i see
2023-09-04 19:01:56 In case it's not obvious, I'm thinking about doing a DM-42 polynomial package.
2023-09-04 19:03:48 yeah indeed
2023-09-04 19:04:55 On this model every register (RAM and stack) can hold arrays, so I'm thinking of just implementing a polynomial stack.
2023-09-04 19:07:44 KipIngram, what about this syntax: '2 3 4 5 poly4' to represent 2x^4 + 3x^3 + 4x^2 + 5x + 1 ?
2023-09-04 19:07:46 Then I could have a tool that could feed into that by generating the Taylor poly of order whatever from an arbitrary non-linear function.
2023-09-04 19:09:37 That's close to what I was thinking. For getting polys in I was thinking of putting the thing in deep stack mode and then being able to say coef coef ... coef n poly
2023-09-04 19:10:16 Then the native functionality will let me save those arrays in named variables if I want to.
2023-09-04 19:11:30 For the Taylor series thing, I'd put the point I wanted the expansion around in X, run the program taylor, and it would show me a menu of available functions - I'd pick one and it would produce the polynomial.
2023-09-04 19:11:48 Oh, and I'd also specify the order.
2023-09-04 19:12:12 Anyway, I'm in that "thinking" phase of figuring out just how I'd like all the pieces to come together.
2023-09-04 19:12:48 i see
2023-09-04 19:12:54 pre waterfall
2023-09-04 19:14:15 At some point I'd like to write a plotting package too. The display on this calculator is too nice to not have that.
2023-09-04 19:14:43 Unfortunately, I don't think they've got any built-in font resources, so I'm not quite sure how I'd get labeling on there.
2023-09-04 19:16:50 There also seems to be very little community work on general plotting. The thing is a derivative of the HP-42S, and most of what I find is oriented toward plotting on the IR printer that was available for that one.
2023-09-04 19:23:35 Oh, I take it back. There are some font resources.
2023-09-04 19:44:37 https://www.youtube.com/watch?v=A8DOWgo4oXU
2023-09-04 19:47:43 hmm. I know it can do some simple graphing
2023-09-04 20:06:32 Yeah; it looks like you generate output as alpha strings, with each character specifying an eight-pixel-tall, one-pixel-wide strip.
2023-09-04 20:06:55 And then there are ways to output the various font characters more directly.
2023-09-04 20:07:00 Looks reasonable.
2023-09-04 20:07:26 A general purpose plotting program will probably be a somewhat big program, but it looks "straightforward."
2023-09-04 20:07:39 like sixels but with eight instead of six?
2023-09-04 20:07:44 Yes, more or less.
2023-09-04 20:13:07 Watching that video it seems reasonably responsive too. This calculator runs at 24 MHz on battery power, and goes up to 80 MHz when on USB.
2023-09-04 20:17:56 how many MIPS?
2023-09-04 20:20:27 or more precisely millions of operations per second
2023-09-04 20:23:55 I don't know - I haven't seen a number for that.
2023-09-04 20:27:35 The guy says that depending on exactly how deep you're trying to go a Mandelbrot rendering can take an hour or more. Of course, that's an entirely open-ended problem, so I could make any system take an hour or more if I pushed it far enough. So they've got some sense of "commonly of interest" built into that.
2023-09-04 20:41:23 It's an ARM Cortex M4F.
2023-09-04 20:41:37 Hey, that's the same processor I have on that little board I'm working with.
2023-09-04 20:43:35 I see a table of various ARM units that gives a DMIPS benchmark; they vary from a bit under to a bit over 1 DMIPS/MHz. So presumably somewhere around 24 DMIPS on battery and 80 on USB for this one.
2023-09-04 20:50:40 KipIngram: thats weird. i thought it had a graphing mode like the simple mode on the hp42
2023-09-04 20:53:02 There are other ways of doing graphics. You can control individual pixels too.
2023-09-04 20:53:27 Which would likely be the better way to draw an actual data curve.
2023-09-04 20:53:52 The alpha string method is probably better for drawing grids, backgrounds, etc.
2023-09-04 20:54:11 I do think it has most / all of the old HP-42S functionality.
2023-09-04 20:54:51 It's not an image-faithful emulation of the 42S - no 42S ROM required. It's a "faithful rewrite."
2023-09-04 20:55:05 Rather, that's what Free42 is, and the DM42 uses Free42.
2023-09-04 20:55:30 Apparently it's quite an open platform - you can load other software on there if you've got it.
2023-09-04 21:01:43 ya Free42 is very, very close to the original
2023-09-04 21:04:21 That combined with that 128-bit Intel float library, and that's a pretty nice platform.
2023-09-04 21:44:24 Here's a video on re-loading the DM-42 with the HP-48GX OS:
2023-09-04 21:44:27 https://www.youtube.com/watch?v=I5jgLzw_0-o
2023-09-04 21:44:39 Don't know if it's precise or just "very similar."
2023-09-04 21:49:22 I don't know if the 42 does this, but the WP-34S supports not only the usual LBL instruction in programs but also has BACK and SKIP, which jump a specified distance in the program. You don't have to have a LBL there to receive the jump, and it's much faster since it doesn't have to look for the label.
2023-09-04 21:49:46 Of course, if you edit your program you might have to adjust all of those, but once you have something working well it's a nice way to speed it up.
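On the Taylor-polynomial feeder idea from earlier, a rough sympy sketch of producing the coefficient list (highest power first, in powers of (x - x0)) that such a tool would push onto the polynomial stack - purely illustrative, the names are made up, and the calculator program would obviously do this differently:

import sympy as sp

x, t = sp.symbols('x t')

def taylor_coeffs(expr, x0, order):
    # Coefficients of the order-n Taylor polynomial of expr about x = x0,
    # highest power first, in powers of (x - x0).
    shifted = expr.subs(x, x0 + t)
    poly = sp.series(shifted, t, 0, order + 1).removeO()
    return sp.Poly(poly, t).all_coeffs()

print(taylor_coeffs(sp.sin(x), 0, 5))   # [1/120, 0, -1/6, 0, 1, 0]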
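And going back to the pixel-strip plotting idea: a toy Python sketch of scaling a curve into eight-pixel-tall columns, one byte per column. The bit packing here (bit 0 = top row, one lit pixel per column) is an arbitrary assumption for illustration - the actual DM-42 / Free42 character encoding for that kind of alpha-string output would have to be checked against the documentation:

import math

def curve_to_columns(f, x_min, x_max, width):
    # Sample f across 'width' columns and return one byte per column,
    # with a single lit pixel marking the curve within each 8-pixel strip.
    xs = [x_min + (x_max - x_min) * i / (width - 1) for i in range(width)]
    ys = [f(x) for x in xs]
    lo, hi = min(ys), max(ys)
    cols = []
    for y in ys:
        row = round(7 * (hi - y) / (hi - lo)) if hi > lo else 3  # 0 = top row (assumed)
        cols.append(1 << row)
    return cols

print(curve_to_columns(math.sin, 0.0, 2 * math.pi, 16))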