2023-04-02 03:40:40 I saw an article on hacker news about calculating cos(x) in C the other day
2023-04-02 03:41:06 And it inspired me to spend a little time calculating faster approximations of cos(x), a quadratic and a cubic
2023-04-02 03:41:16 Well actually sin(x) because that's faster to calculate
2023-04-02 03:43:47 But what struck me was that when I got into the weeds and decided to try googling for more help on this, there's very little info about it online at all
2023-04-02 03:44:44 It makes me wonder how many people know, for instance, that the Taylor expansion is useless for this problem
2023-04-02 03:47:56 The 'best' polynomial f(x) to approximate a function g(x) for x in a to b is the one that minimises the integral from a to b of (f(x)-g(x))^2 dx
2023-04-02 03:48:05 But go ahead and try calculating that, it's extremely tedious
2023-04-02 03:48:08 and error prone
2023-04-02 03:48:56 Once you've calculated the integral you can find the minimum with respect to the parameters by differentiating with respect to each parameter and looking for where those derivatives are zero
2023-04-02 03:49:29 And confirm it's a minimum by checking that the second partial derivative is positive
2023-04-02 03:50:40 Often you don't want to leave the parameters totally unconstrained, you probably want e.g. f(0)=0, f(pi/2)=1. And maybe f'(0)=1, f'(pi/2)=0 as well.
2023-04-02 03:51:35 Which is how you skip all this tedious work. For a quadratic, a good approximation can be found by fitting to the initial, mid, and final points.
2023-04-02 03:52:13 Fitting to points works terribly for higher degrees, but you can fit to the initial and end points and then fit the derivatives of higher and higher order (a bit like a Taylor expansion in two places at once)
2023-04-02 03:55:18 Or, as someone said on the hacker news thread, sometimes cos(x)=0.7
2023-04-02 03:55:28 is enough to hit the target
2023-04-02 07:59:53 What's wrong with Taylor series for sin and cos?
2023-04-02 08:00:01 Also, did you try CORDIC?
2023-04-02 08:00:25 MrMobius: CORDIC is merely LUT interpolation, right?
2023-04-02 08:00:33 No
2023-04-02 08:00:45 You can get as many decimal places as you like
2023-04-02 08:00:52 With a relatively small table
2023-04-02 08:02:16 Although it loses accuracy when you get near 90 degrees so I guess exactly as many decimal places as you like
2023-04-02 08:02:36 *so I guess NOT
2023-04-02 09:35:27 CORDIC is the way to go for computing trig functions.
2023-04-02 09:35:36 Faster than Taylor series.
2023-04-02 09:37:01 JITn: CORDIC algorithms basically break your angle down into a sum of certain "special angles" whose trig values correspond well to the way we represent numbers in binary in the computer.
2023-04-02 09:37:30 It's fairly straightforward, but I haven't looked at it in a while and can't describe it "correctly" here, but when you see what's going on it makes a whole lot of sense.
2023-04-02 09:39:04 faster than Taylor if you don't have a multiplier
2023-04-02 09:39:13 As algorithms they're not really "interesting" in a mathematical way, but they're interesting from a digital technology perspective.
2023-04-02 09:39:31 I've always been awfully fond of Taylor series in general.
2023-04-02 09:39:58 My daughter's taking calculus and physics in high school this year and has just studied them, and I was stressing to her how fundamentally important they are.
2023-04-02 09:40:40 They're the reason that practically everything in the world is "just a simple harmonic oscillator" if you look at small enough excitations.
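
A minimal C sketch of the constrained fit discussed above, for sin(x) on [0, pi/2]: the ansatz f(x) = x + a*x^2 + b*x^3 already satisfies f(0)=0 and f'(0)=1, and the remaining constraints f(pi/2)=1 and f'(pi/2)=0 give a 2x2 linear system in a and b. None of this code is from the log; solving the system at startup and the error check in main are illustrative choices only.

#include <math.h>
#include <stdio.h>

static const double HALF_PI = 1.57079632679489661923;

static double A, B;  /* fitted coefficients of f(x) = x + A*x^2 + B*x^3 */

static void fit_sin(void)
{
    double h = HALF_PI;
    /* f(h)  = h + A*h^2 + B*h^3 = 1   ->  A*h^2 + B*h^3   = 1 - h
     * f'(h) = 1 + 2*A*h + 3*B*h^2 = 0 ->  A*2*h + B*3*h^2 = -1     */
    double m00 = h * h,   m01 = h * h * h,   r0 = 1.0 - h;
    double m10 = 2.0 * h, m11 = 3.0 * h * h, r1 = -1.0;
    double det = m00 * m11 - m01 * m10;
    A = (r0 * m11 - m01 * r1) / det;   /* Cramer's rule */
    B = (m00 * r1 - m10 * r0) / det;
}

/* cubic approximation of sin(x), intended for x in [0, pi/2] */
static double sin_approx(double x)
{
    return x * (1.0 + x * (A + x * B));
}

int main(void)
{
    fit_sin();
    double worst = 0.0;
    for (double x = 0.0; x <= HALF_PI; x += 0.001) {
        double e = fabs(sin_approx(x) - sin(x));
        if (e > worst) worst = e;
    }
    printf("a = %.7f, b = %.7f, max |error| over [0, pi/2] ~ %.2e\n", A, B, worst);
    return 0;
}

The same structure would carry over to the quadratic three-point fit mentioned earlier; only the small linear system being solved changes.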
2023-04-02 09:46:38 That paper I linked here once and bragged about so much uses Taylor expansions too - basically the guy regards the whole universe as a huge dynamic model sitting at some equilibrium point in a gigantic state space. Then he makes a Taylor expansion of the dynamics and considers only the simple harmonic oscillator terms (small wiggles around equilibrium). Then he imposes the requirement that there be at least one conserved quantity,
2023-04-02 09:46:40 and basically the laws of physics come pouring out.
2023-04-02 09:47:11 Electromagnetism, special relativity, pieces of quantum theory, etc. etc. Even three spatial dimensions, which he did not assume a priori, just "show up."
2023-04-02 09:47:15 It's fascinating.
2023-04-02 09:47:35 like from Pandora's pithos
2023-04-02 09:47:44 It says that the laws of physics pretty much HAVE TO BE what they are, so all the hype you read about other universes with radically different laws of physics doesn't really fly.
2023-04-02 09:48:02 You might get different values of various constants, but the "shapes" of the laws of physics are more or less required.
2023-04-02 09:48:48 and that one conserved quantity - that seems like an awfully modest assumption to me. If you don't have at least that, then you'd have nowhere to go at all - it would just be chaos.
2023-04-02 09:49:05 It's like the smallest assumption you could possibly make.
2023-04-02 09:49:34 It doesn't have to be a particular quantity that's conserved - it can be anything.
2023-04-02 09:50:21 Three wasn't the only number of dimensions that would "work," but a whole lot of dimension counts won't work - there's a limited set of possibilities.
2023-04-02 09:50:39 Three is the smallest, and is in some ways what he describes as the "most interesting."
2023-04-02 09:50:58 A couple of the counts that string theorists like would work as well.
2023-04-02 09:51:37 Oh, also the existence of a "maximum possible speed" is a consequence of the whole thing too.
2023-04-02 09:51:46 We just happen to call that c.
2023-04-02 09:52:38 The model he starts out with (his vector of state variables) winds up being what we'd think of as "the wave function of the universe."
2023-04-02 09:52:51 And one of the things he shows is that it wouldn't be possible to observe it directly.
2023-04-02 09:53:02 So that fits with mainstream physics too.
2023-04-02 09:54:23 And I find it PARTICULARLY interesting that the spatial dimensions emerge rather than being assumed going in. Some processes that pop up involve those dimensions, and those are limited to speed c. But there are processes that DON'T involve the spatial dimensions at all, and I think that gives us a plausible basis for the whole entanglement thing, where effects seem to move faster than light.
2023-04-02 09:54:33 Those effects just aren't interested in "space."
2023-04-02 09:55:22 It also answers the riddle re: "does space have a boundary?"
2023-04-02 09:55:33 it doesn't, but on the other hand there's not an infinite amount of it either.
2023-04-02 09:55:44 There's just however much of it you happen to need right now.
2023-04-02 09:56:15 Space just isn't a "pre-existing container" that things appeared in.
2023-04-02 09:56:45 It's just the way we've evolved to perceive certain relationships among things.
2023-04-02 09:58:13 It's kind of sad to me that it's not a famous paper - it feels to me like it should be. I've asked people I've pointed it out to to please show me what's wrong with it, if they find something wrong with it, so I can stop feeling that way, but so far no one has.
2023-04-02 09:58:39 Maybe there's some huge gaffe in there, but I've not been able to put my finger on one.
2023-04-02 09:59:38 He doesn't get the physics of nuclei (quarks, etc.), but when he makes that Taylor expansion he drops everything except the linear term. I think it's very reasonable to believe that further physics would arise from keeping more terms, etc.
2023-04-02 09:59:58 He cites a reference in that direction, but I wasn't able to follow that referenced paper as well.
2023-04-02 10:00:54 But it made me wonder if maybe physics basically never ends - you could just keep on carrying more and more terms of that expansion, and it might just keep on giving you physics.
2023-04-02 10:01:05 And as we test at higher and higher energies, we see more of it.
2023-04-02 10:01:29 So it may be that there's just not a "lowest level" that we're ever actually going to find.
2023-04-02 10:02:42 But if the linear term gets you quantum electrodynamics, and the next term gets you quantum chromodynamics, then what would the one after that get?
2023-04-02 10:06:21 quantum [redacted]
2023-04-02 10:19:24 MrMobius: Taylor series is the right approach for higher-degree polynomials, and it approximates around a point. If you want a lower-degree approximation over an interval, it's the wrong choice.
2023-04-02 10:19:38 ah
2023-04-02 10:22:35 I'll have to look into CORDIC, looks interesting
2023-04-02 10:22:46 I was mostly just being spontaneous with my maths yesterday
2023-04-02 10:25:12 I had a good slide deck on CORDIC once - lemme see if that's around.
2023-04-02 10:26:30 I also warmed up coming up with an approximation for Pi
2023-04-02 10:26:48 Yes - here:
2023-04-02 10:26:51 https://redirect.cs.umbc.edu/~tinoosh/cmpe691/slides/CORDIC-gmu.pdf
2023-04-02 10:26:53 But it was terrible, it requires a repeated square root
2023-04-02 10:27:15 Gets into implementations and so on.
2023-04-02 10:27:59 Thanks
2023-04-02 10:28:13 The older the material the better, when it comes to approximations (or anything really)
2023-04-02 10:28:34 Newer material can sometimes have a superior algorithm, but if the algorithm's the same, people don't seem to explain it as well the 1000th time
2023-04-02 10:28:37 Indeed.
2023-04-02 10:28:59 I have no idea if that's "state of the art," but I thought it made the basic "trick" apparent.
2023-04-02 10:29:06 I know a 'Hiren'
2023-04-02 10:29:19 Nice
2023-04-02 10:31:20 tan(phi) = +/- 2^i
2023-04-02 10:31:44 sorry - 2^-i
2023-04-02 10:31:50 where i is a bit number.
2023-04-02 10:32:14 That defines a set of special angles, and you can build up any angle by adding or subtracting angles in that set.
2023-04-02 10:32:41 Not the imaginary unit I hope lol
2023-04-02 10:32:48 No, just a bit #.
2023-04-02 10:33:16 You're explaining it better than the slides lol
2023-04-02 10:33:32 Yeah, I recall having to stare at them and think a little.
2023-04-02 10:33:35 Something about having to learn something makes you better at teaching it than someone who's already very familiar with it
2023-04-02 10:34:07 But once you know how to decompose your angle in terms of those special ones, you can use trig identities to combine the sine/cosine/tangent of those angles (which you have tabulated) to get the same for your whole angle.
2023-04-02 10:34:44 And you can do that combining just with add and subtract. I don't remember if products are required or not.
2023-04-02 10:35:00 But it gets it down to arithmetic digital hardware can do.
2023-04-02 10:35:09 add, subtract and shift
2023-04-02 10:35:16 That sounds right.
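
A rough C sketch of the scheme just described (editor's illustration, not from the log): rotation-mode CORDIC, which rotates a pre-scaled unit vector toward the target angle by adding or subtracting the special angles atan(2^-i). It's written in double for clarity; in a fixed-point version the multiplications by 2^-i become plain right shifts, so each step is just shift, add/subtract, and a table lookup.

#include <math.h>
#include <stdio.h>

#define ITERS 32

static double atan_tab[ITERS];  /* the "special angles" atan(2^-i) */
static double gain;             /* accumulated stretch of all the rotations */

static void cordic_init(void)
{
    gain = 1.0;
    for (int i = 0; i < ITERS; i++) {
        double t = ldexp(1.0, -i);      /* 2^-i */
        atan_tab[i] = atan(t);
        gain *= sqrt(1.0 + t * t);      /* each un-normalised rotation stretches by this */
    }
}

/* rotation mode: angle should be in [-pi/2, pi/2] */
static void cordic_sincos(double angle, double *s, double *c)
{
    /* start from a unit vector pre-shrunk by the known gain,
     * then rotate it toward the target angle */
    double x = 1.0 / gain, y = 0.0, z = angle;
    for (int i = 0; i < ITERS; i++) {
        double t = ldexp(1.0, -i);      /* "x >> i" and "y >> i" in fixed point */
        double nx, ny;
        if (z >= 0.0) { nx = x - y * t; ny = y + x * t; z -= atan_tab[i]; }
        else          { nx = x + y * t; ny = y - x * t; z += atan_tab[i]; }
        x = nx; y = ny;
    }
    *c = x;   /* cos(angle) */
    *s = y;   /* sin(angle) */
}

int main(void)
{
    cordic_init();
    double s, c;
    cordic_sincos(1.0, &s, &c);
    printf("sin(1) ~ %.9f (libm %.9f)\n", s, sin(1.0));
    printf("cos(1) ~ %.9f (libm %.9f)\n", c, cos(1.0));
    return 0;
}

Products do appear in each step, but only by powers of two, which is why the fixed-point version needs no multiplier.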
2023-04-02 10:36:57 It certainly struck me as about as computationally efficient as you could be.
2023-04-02 10:37:27 there is some really interesting info on how this worked on the HP35 calculator here: https://www.jacques-laporte.org/
2023-04-02 10:37:43 site is only partially functional. i think the owner passed away
2023-04-02 10:37:56 some people have tried to preserve parts of it: https://archived.hpcalc.org/laporte/Trigonometry.htm
2023-04-02 10:38:13 whole site actually it seems: https://archived.hpcalc.org/laporte/
2023-04-02 10:38:13 Oh, that first link does look good.
2023-04-02 10:38:32 ya I used it when I was writing CORDIC stuff for my calculator projects
2023-04-02 10:38:58 this one is good: https://archived.hpcalc.org/laporte/TheSecretOfTheAlgorithms.htm
2023-04-02 11:12:58 Wow - now I've seen it all:
2023-04-02 11:13:00 https://imgur.com/a/55qBRJp
2023-04-02 11:13:51 Note to self: never let this person be my heart surgeon.