00:00:00 --- log: started forth/19.03.14
01:29:26 The *FULL* IEEE754, or just the binary formats?
02:41:41 ttmrichter: How much would I need to, for instance, implement a quadratic equation solver?
02:41:58 The binary formats, string conversion
02:51:36 forth scientific library probably solves quadratics.
03:08:52 on the topic of this, it seems that http://mrob.com/pub/ries/ is a neat project to port to Forth
03:09:34 > ries (or RIES, an acronym for RILYBOT Inverse Equation Solver) takes any number and produces a list of equations that approximately solve to that number.
03:12:42 uh, actually, http://mrob.com/pub/ries/src/ries.c.txt
03:13:01 > Constants and (eventually) functions can share a common FORTH-like syntax.
03:24:08 I assume it implements forth as part of the algorithm.
04:04:40 I like the idea of the forth scientific library.
05:14:33 john_cephalopoda: Me too. That's one of the things I want to work on at some point.
05:15:14 I want to write an environment similar to Octave / Matlab in mine.
05:15:24 RPN-based, of course.
05:30:51 KipIngram: Having vector and matrix math working would be a great thing. I want to eventually render 3d stuff with forth and that's practically impossible without matrices.
05:31:41 john_cephalopoda: did you watch that apl programming video?
05:32:20 Yes, totally agree. I'm going to try for some sort of generalized tensor facility; not exactly sure yet how it would work.
05:33:36 And working with larger matrices for various problems - what a great opportunity to try to deploy multiple cores in a smooth way.
05:45:07 corecode: Haven't seen that yet.
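[For reference on the quadratic-solver question above, a minimal sketch of the textbook formula. Python is used purely for illustration here, not Forth; a production solver would also guard against catastrophic cancellation when b*b is much larger than 4*a*c.]

```python
import cmath

# Toy quadratic solver: roots of a*x^2 + b*x + c = 0 via the
# textbook discriminant formula. cmath.sqrt keeps complex roots
# working when the discriminant is negative.

def solve_quadratic(a, b, c):
    d = cmath.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)
```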
05:45:24 KipIngram: I have written some basic code for matrix multiplication but nothing really sophisticated as of now.
05:46:50 I did a lot of numerical work back when I worked for The University of Texas at Austin, so at least I have a notion of what kind of things I want to be able to do. That was mostly electromagnetic field related.
06:00:25 * john_cephalopoda . o O ( The University of Texas in Canberra )
07:18:29 which forth implementation popularized printing 'ok' after executing a word?
07:19:30 Having complex mathematical operations almost certainly means adding some sort of type system to Forth
07:19:42 You don't want to add scalars and vectors together :)
07:23:46 i think the forth answer would be "so don't do that"
07:25:12 or, begin begin begin, instead
07:31:57 zy]x[yz: it would mean you create a language that's no longer Forth, or you incur a large cost by rolling the type system in Forth
07:32:31 huh?
07:33:12 forth traditionally has no type checking. you deal with it by having different operators for working with different types
07:33:23 i don't see how vector versus scalar is any different
07:36:46 if you don't want to leave it up to the programmer, probably the easiest way to deal with it would be runtime dispatch
07:36:57 then you could overload operators and type-check at runtime
07:37:10 but like you said, there would be an overhead associated with that
07:47:50 Forth is already crazy fast, so I wouldn't mind incurring some cost for applications such as scientific computing
07:48:02 it is?
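[The runtime-dispatch idea above can be sketched like this. Illustrative Python rather than Forth, and all the names (scalar, vector, plus) are made up for the example: values carry a type tag, and the overloaded operator checks tags at runtime, with the overhead that implies.]

```python
# Runtime operator dispatch for a typed "+": tagged values, checked
# when the operator runs, so a scalar can never be added to a vector.

def scalar(x):
    return ("scalar", x)

def vector(*xs):
    return ("vector", list(xs))

def plus(a, b):
    ta, va = a
    tb, vb = b
    if ta != tb:
        raise TypeError("cannot add %s and %s" % (ta, tb))
    if ta == "scalar":
        return ("scalar", va + vb)
    if len(va) != len(vb):
        raise ValueError("vector length mismatch")
    return ("vector", [x + y for x, y in zip(va, vb)])
```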
07:48:10 according to who? compared to what?
07:50:41 some old forths have had types.
07:50:47 Ah, got my block editor working with dual blocks.
07:51:03 I shrunk my font to 10-point the other day, so all 64 lines of a block would show on my Mac screen.
07:51:21 I don't think a lack of typing is what distinguishes Forth from something else.
07:51:28 That made the block narrower too, and I realized I could have two side-by-side in the partial screen I use for that stuff.
07:51:46 I wouldn't want types though
07:51:51 bleh
07:51:59 i don't think anyone said the lack of typing is what distinguishes forth
07:52:19 I want to write applications that support types, but that's different from having them supported in the basic Forth system.
07:52:34 siraben: you'd just write your code to not add the vectors and the scalars
07:53:18 KipIngram: Right, having them separate makes sense
07:53:29 The numerical stuff john_cephalopoda and I were talking about - when you have vectors and matrices of all different sizes floating around, you just don't want to have to keep track of everything a "dumb words" system would force you to track.
07:53:52 You'd have to put all the right sizes on the stack for every word, etc.
07:53:53 Ugh.
07:54:09 It just needs to KNOW that M is a 3x3 floating point matrix, and so on.
07:54:41 KipIngram: I used a cstring-style approach in my library.
07:54:56 Tell more...
07:55:04 Just use the first two fields for the size of the matrix (I only had float matrices)
07:55:31 yep
07:55:46 ( m n xs...)
07:56:00 Exactly.
07:56:38 my last matrix words returned the address at index 0,0
07:57:07 you would 1- to get n etc
07:57:40 Oh, right.
07:57:43 I see what you mean.
07:57:47 I think in some earlier versions I also tried ( s m n xs ) where s is the type (e.g. float, 8-bit fixed, 16-bit fixed etc) but it turns out that if you want to do matrix multiplications, float is the only sane option :þ
07:57:53 Yes; I'm not yet sure how I'd do that, but we'll see.
07:58:18 For image formats it would make sense to have bit-ness defined, for matrices not so much.
07:58:20 Generally I tend to lean toward designs where the type decisions are made at compile time, so having the dimensions and such in the headers would work too.
07:59:42 all my type decisions are made at edit time.
08:00:27 kekeke
08:00:32 8-D
08:00:36 Showoff...
08:00:44 ;)
08:00:45 Yes, THAT is the Forth way.
08:01:24 all my type decisions are made when my program stops crashing
08:01:57 I really do admire Forth's simplicity, and I don't want to "corrupt" it.
08:02:20 you become one with nothing
08:02:23 But I also want to think that I can implement anything I want using it, and some applications profit from that sort of capability.
08:02:36 the alpha and the omega
08:02:38 I continue to marvel at how compact Forth code can be.
08:02:47 and the beta tester
08:02:50 I think a big part of it is that you don't call out parameters for function calls.
08:03:20 personally I think the stack is the greatest innovation
08:03:34 Instead of delta = foo(alpha, beta, gamma); epsilon = bar(beta, delta, zeta); you just say foo bar.
08:03:35 the dual stack machine as the target
08:04:13 it's true simplicity, and it's brilliant
08:04:22 Yeah, the fact that Forth makes the return stack REALLY THERE for you, visible, is a huge advantage.
08:04:40 Every single language out there has a return stack, but they wrap it up in armor so you can't really deal with it.
08:05:16 Maybe they deal with it for you for certain purposes (locals, etc.), but in Forth it's a system resource like anything else.
08:05:54 I like a corrupted forth
08:06:01 And psychologically the fact that new words you define behave in every respect, and APPEAR in every respect, just like the built-ins - that's huge.
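[The "cstring-style" ( m n xs... ) layout described above - the first two cells hold the dimensions, the data follows, and words hand around the address of element (0,0) so "you would 1- to get n" - can be sketched as follows. This is illustrative Python standing in for Forth memory cells; all the helper names are invented for the example.]

```python
# A matrix is a flat cell list [m, n, x00, x01, ...]: the two cells
# just below the data hold the dimensions, mirroring the header layout
# discussed in the channel.

DATA = 2  # index of element (0,0); m and n live at DATA-2 and DATA-1

def make_matrix(m, n, elems):
    assert len(elems) == m * n
    return [m, n] + list(elems)

def dims(mat):
    # reach back past the data pointer for the stored dimensions
    return mat[DATA - 2], mat[DATA - 1]

def at(mat, i, j):
    m, n = dims(mat)
    return mat[DATA + i * n + j]

def mat_mul(a, b):
    """Multiply two header-carrying matrices into a new one."""
    m, n = dims(a)
    n2, p = dims(b)
    assert n == n2, "inner dimensions must match"
    out = [sum(at(a, i, k) * at(b, k, j) for k in range(n))
           for i in range(m) for j in range(p)]
    return make_matrix(m, p, out)
```

Because every matrix carries its own dimensions, the words never need sizes passed on the stack - exactly the bookkeeping the conversation wants to avoid.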
08:06:16 functions in other languages *look* different; parentheses, parameter lists, etc.
08:06:23 But it's all seamless and beautiful in Forth.
08:12:52 presiden: corrupted in what way?
08:25:23 crc, lego blocks but instead with various bells and whistles
11:57:38 presiden: re: 'ok'; this goes back to the very early days. In http://worrydream.com/refs/Moore%20-%20Forth%20-%20The%20Early%20Years.pdf Chuck Moore says it was in use in 1966, predating both the use of threaded code and the existence of a compiler
17:50:39 siraben: You can solve quadratic equations with binary or decimal formats. I'm just curious to see someone finally support ALL of IEEE754 for a change. :D
17:53:26 In the "what everyone should know" floating point paper they show that the binary format achieves better accuracy, for any particular size.
17:53:41 I forget the details, but remember it making sense at the time.
17:53:41 stdfix.h
17:59:24 crc, ah, nice to know
19:03:33 corecode, you might like this, https://www.youtube.com/watch?v=PlM9BXfu7UY
19:25:53 KipIngram: Only by pretty trivial amounts.
19:26:37 decimal128 gives 34 decimal digits of precision. binary128 gives 34.02 decimal digits of precision.
19:28:11 The exponent side is a bit messier since there are so many things you can point at, but there are a couple of cases where decimal128 is slightly superior to binary128 and vice versa, IIRC.
19:33:02 I think people get annoyed when they enter a whole number and then it prints out as a decimal number close to that number. :-)
19:33:16 I understand.
19:33:24 If you don't know what's going on that would seem... just WRONG.
19:33:54 Well, I renamed my :: / ::WIPE thing to .: / .WIPE
19:34:16 That makes it more consistent to add .VARIABLE and .CONSTANT, and also at least obliquely evokes Linux hidden files.
19:34:37 Though I think I may go with VAR and CONST rather than the spelled out ones - that just seems like wasted space to me.
19:34:53 Or maybe even INT - not sure about that yet.
20:45:38 KipIngram: Personally I think floating point is usually an error.
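[The 34 vs. 34.02 digit figures above check out: binary128 carries a 113-bit significand (112 stored bits plus the implicit leading 1), and 113 bits correspond to 113·log10(2) ≈ 34.02 decimal digits, against decimal128's exact 34 digits. A one-liner confirms the arithmetic:]

```python
import math

# binary128: 113-bit significand -> equivalent decimal digits
binary128_digits = 113 * math.log10(2)
decimal128_digits = 34  # decimal128 stores exactly 34 decimal digits

print(round(binary128_digits, 2))  # prints 34.02
```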
20:46:08 Very rarely do calculations need the ranges afforded by the entirety of even binary64, not to mention binary128.
20:46:48 It would be better to use some kind of fixed-point representation in your specific needed range. (The approach used by, among other things, the Apollo control modules.)
20:47:22 I tend to agree with you, but I do believe that floating point massively reduces the amount of attention and effort you have to invest in that part of your work.
20:47:46 Fixed point can clearly be better, since it provides more mantissa bits, and any knowledge whatsoever we have about our problem helps us make that work.
20:48:41 For example, if you KNOW that a number in your work is confined to a particular range, then you can avoid having to record the exponent at all.
20:48:54 Use all the bits for mantissa and embed the exponent knowledge in your code.
20:49:32 So I'm looking at checksums / CRCs tonight. I want to implement a reasonable "easy to run test" of this code generator that will let me keep an eye on it.
20:49:44 Leaning toward Fletcher32 at the moment - anyone have any comments?
20:50:40 /
20:50:43 Ooops.
21:20:08 What specific kinds of bit errors are you looking at?
21:54:02 :-)
21:54:23 Just implemented a tight little Fletcher32 setup, using rbx as the accumulator for both sums.
21:54:49 I'll see if I can find another register to spare later on.
21:55:32 I don't really have another handy one, though.
21:56:06 Turned out I wasn't really using rbp for anything. I switched that over to rbx, because it's handy in this application to have access to the thing as a 64 bit reg and a 32 bit reg.
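[The fixed-point idea discussed above - when you know the range, keep the exponent implicit in the code and spend every bit on the mantissa - can be sketched with a Q16.16 format. The format choice and helper names are invented for illustration, in Python rather than the assembly/Forth context of the discussion.]

```python
# Q16.16 fixed point: 16 integer bits, 16 fraction bits. The binary
# point's position is a compile-time convention, never stored.

SHIFT = 16
ONE = 1 << SHIFT          # the value 1.0 in Q16.16

def to_fix(x):
    return int(round(x * ONE))

def fix_mul(a, b):
    # the raw product has 32 fraction bits; rescale once afterward
    return (a * b) >> SHIFT

def to_float(a):
    return a / ONE
```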
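[For reference alongside the Fletcher32 discussion above: the algorithm is just two running sums modulo 65535 over 16-bit words, which is why a single wide register can hold both accumulators. The speaker's actual implementation is register-level x86; this Python version is only a sketch of the algorithm itself, using little-endian word assembly and zero-padding for odd lengths.]

```python
# Fletcher-32 over 16-bit little-endian words: s1 accumulates the
# words, s2 accumulates the running s1, both modulo 65535, and the
# result packs s2 into the high half and s1 into the low half.

def fletcher32(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"                      # pad odd-length input
    s1 = s2 = 0
    for i in range(0, len(data), 2):
        word = data[i] | (data[i + 1] << 8)  # little-endian 16-bit word
        s1 = (s1 + word) % 65535
        s2 = (s2 + s1) % 65535
    return (s2 << 16) | s1
```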
23:59:59 --- log: ended forth/19.03.14