00:00:00 --- log: started forth/18.10.15
00:57:35 re
01:25:31 how to disable this warning ? "DUMP is a system word in an application word in file"
02:34:44 smokeink2: gforth?
02:34:57 I think you can just add "no warnings" to the top of the file
02:55:44 win32forth. tried adding "warning off" at the top of the file, it suppressed word redefinition warnings, but this DUMP warning still shows up
03:54:14 re
04:24:58 hi KipIngram
04:48:29 Hi Dave.
04:48:43 They rebooted my seedbox - had to get all connected up again.
04:49:01 I'm setting up a Matrix bridge right now, should be interesting.
04:49:06 test
04:57:58 That looks promising, I guess. :-)
04:58:44 I'll have to get used to this graphical interface, it's a trade-off.
04:59:23 Yeah, I try to shun GUIs as much as possible.
04:59:31 The 16 character width limit on the calculator is ridiculous, I'm planning a re-write of the display system
04:59:32 So many circles
04:59:58 Um, isn't there just a limit on pixels that will put a cap on anything you do?
05:00:04 Or are you going to have it scroll?
05:00:11 16 is fine, for a calculator.
05:00:22 I guess.
05:00:29 There is a small font variant that is used in other programs
05:00:40 A few years ago I bought an "anniversary edition" calculator HP put out - HP-35S.
05:00:43 But then it becomes harder to read. I'll just manage space well.
05:00:55 It actually has the old-style key technology that my HP-41CV had back in college.
05:00:57 So nice...
05:01:21 I haven't done a whole lot with it, but I got it out last night and fired it up.
05:01:38 My HP-42S emulator on my phone is far more capable, but the 35S is fun to play with.
05:01:59 I wonder what students will be using decades from now
05:02:07 There's also a "grass roots" calculator out there called the WP-34S.
05:02:33 It's based on some HP calculator and the community developed an alternate flash and a keyboard overlay and sticker set relabeling it.
05:02:45 It's generally considered the "most powerful non-graphing calculator."
05:02:54 There's pressure to not make calculators more powerful, because it could mean cheating.
05:02:56 I've got a couple of those too.
05:03:09 I find the graphing feature mostly a gimmick.
05:03:12 Well, they're already powerful enough for that.
05:03:21 Me too - I've never actually owned a graphing calculator.
05:03:31 My wife has an HP-48, though.
05:03:44 I don't recall ever using it in tests.
05:03:58 I have sort of the same attitude there that I do re: GUIs.
05:04:05 I'm just a "text oriented" guy.
05:04:28 Technically it is possible to cheat using the approved calculators like the TI-84
05:04:45 Sure.
05:04:49 They ask you to clear RAM before tests, but there's nothing stopping you from archiving to flash and restoring
05:04:52 It's a computer, so.
05:05:08 I had a pretty open-minded professor once that said we could use our programmable calculators on the tests as long as we used only software that we wrote.
05:05:19 We had to let him inspect the calculator anytime he wanted to.
05:05:32 That makes sense
05:05:40 Back then there weren't that many upscale calculators, so he could plausibly know them all better than the students did.
05:05:41 Hah, I wonder if a Forth interpreter would pass.
05:05:45 That wouldn't be the case anymore.
05:05:52 I wrote a TON of stuff for that 41CV.
05:06:12 In what language?
05:06:18 What CPU did it use?
05:06:28 I implemented complex arithmetic, Gaussian elimination, and so on - I could actually just inspect an electric circuit, and do some complex math to set things up, hit a button, and have all the voltages and currents.
05:06:41 In its native programming language. Keystroke-based programming.
05:06:57 It didn't handle complex math natively - I implemented that. Had a "complex stack" in memory, etc.
05:06:57 Interesting.
05:07:04 The TI-84 supports that natively.
05:07:26 But it's bad at: factoring, displaying in other bases, arbitrary integer arithmetic, etc.
05:07:34 So I'd just visit each component in the circuit, do a bit of arithmetic around it, and hit a button to integrate that into the system of equations.
05:07:41 Once I'd done that for all components, I'd hit "solve."
05:07:54 It would crunch for a couple of minutes and spit back the phasor voltages and currents.
05:08:02 Coincidentally I'm learning about electrical circuits right now
05:08:14 Cool stuff. :-)
05:08:20 RC circuits, rectifiers, transformers, etc.
05:08:22 Yeah, it is.
05:08:30 I also wrote a Smith chart program.
05:08:38 What's that?
05:08:42 Smith chart is a graphical tool used for dealing with transmission lines, waveguides, and so on.
05:08:50 You "use it" in a graphical way.
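(An aside on the "complex stack" idea above: a minimal sketch of complex addition and multiplication in standard Forth, keeping each number as a re/im pair on the floating-point stack. The names z+, z* and the scratch fvariables are invented for illustration, not anything from the 41CV.)

  fvariable c'   fvariable d'
  : z+  ( F: re1 im1 re2 im2 -- re im )   \ componentwise add
    frot f+ frot frot f+ fswap ;
  : z*  ( F: a b c d -- re im )           \ (a+bi)(c+di) = (ac-bd) + (ad+bc)i
    d' f!  c' f!                          \ stash c and d
    fover fover  c' f@ f*  fswap  d' f@ f*  f+   ( F: a b ad+bc )
    frot  c' f@ f*  frot  d' f@ f*  f-  fswap ;  ( F: ac-bd ad+bc )

e.g. 1e 2e 3e 4e z* leaves -5 and 10, i.e. (1+2i)(3+4i) = -5+10i.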
05:08:59 Ah, so this is pretty useful stuff.
05:09:12 And I implemented it that way, so that there was an operational analog to all the "graphical things" you'd do with an actual Smith chart.
05:09:24 Another student once saw me using it, and asked about it.
05:09:32 When I told him what it did he was like "Can I have that?"
05:09:39 He sat there and wrote the whole program down.
05:10:18 I liked solving the problem that way, because (as opposed to just solving the equations) I didn't miss out on the "physical experience" of using the Smith chart.
05:10:32 So I still learned all the "what can you do with a Smith chart" stuff.
05:10:45 It depends on the problems being asked for as well. My calculator wouldn't be able to prove a group isomorphism, for instance.
05:11:11 I guess it was the same with the circuit solver - I was still going through the motions of setting up the equations and so on (node voltages, mesh currents, etc.)
05:11:19 The calculator just kept up with the details for me.
05:11:34 Sounds like a pretty advanced calculator
05:11:49 So those were "cooperative" tools - I still had to know what I was doing and understand the basics.
05:11:58 I just got rid of the tedious arithmetic stuff.
05:12:10 So I felt like that professor would have been fully satisfied with my use of the calculator.
05:12:21 It was the best there was in that day (circa 1982).
05:12:40 And of course there were the HP / TI wars we all waged on one another.
05:12:57 Sort of like vim and emacs - both were great, capable tools.
05:13:19 But I just committed myself to RPN and never looked back.
05:13:35 Now it's just TI
05:13:38 All the way
05:13:49 Yeah. HP just lost their way somewhere in the 90's.
05:13:52 Or late 80's.
05:13:55 But the thing is, a lot of people don't know how to use the more advanced features of the calculator
05:14:17 One thing I use the most is the built-in single variable equation solver.
05:14:21 I think in the heyday HP was still run by engineers - it basically still was the "garage workshop" company it had started out as.
05:14:31 But then the bean counters took over, and that was the end.
05:14:47 Type in something: "0 = ln(sin(x))-x^3" and it spits out the first answer for x
05:14:59 I use it all the time when I get an equation and don't want to manipulate it
05:15:09 Bean counters?
05:15:20 Yeah, sure. That's symbolic arithmetic, which is a step beyond mere numerical solving.
05:15:27 I never had a calculator that could do that.
05:15:34 But man, I could do the numerical stuff.
05:15:57 I wrote several different differential equation integrators for it as well.
05:15:59 Oh, it's not symbolic
05:16:01 Different core algorithms.
05:16:04 Just floating point
05:16:07 Oh, ok.
05:16:09 I believe it uses Newton's method.
05:16:14 Well, I wrote that for mine.
05:16:25 Did your calculator have floating point?
05:16:25 Stuff to factor polynomials, etc.
05:16:38 I don't know if it had hardware floating point.
05:16:43 But obviously it did floating point.
05:16:43 Ah, there's a built-in feature for polynomial factoring as well
05:16:49 Exclusively, in fact.
05:16:58 Same here, Z80 doesn't know floating point.
05:17:00 See, I could claim I understood that stuff, since I wrote the algorithms.
05:17:10 Right.
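(Since Newton's method comes up here: a sketch of a single-variable numerical solver in the spirit of the ones being described, assuming a Forth with the floating-point wordset and DEFER/IS. fn, fn', h and the fixed 40-iteration cutoff are all illustrative choices, not the HP or TI algorithm.)

  defer fn                ( F: x -- fx )  \ the function whose root we want
  1e-7 fconstant h
  : fn'  ( F: x -- f'x )                  \ central-difference derivative
    fdup h f+ fn  fswap h f- fn  f-  h 2e f* f/ ;
  : newton  ( F: x0 -- root )             \ iterate x := x - f(x)/f'(x)
    40 0 do  fdup fn  fover fn'  f/  f-  loop ;

  :noname ( F: x -- fx ) fdup f* 2e f- ; is fn   \ f(x) = x^2 - 2
  1e newton f.                                   \ prints 1.4142135...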
Even integers are floating point until they hit > 2^33
05:17:23 In fact, knowing something well enough to code it tightly is a deeper level of knowing - you can't just half-know it and do that.
05:17:55 I never really needed to distinguish - integers were just special cases of floats for most of the work I did.
05:18:05 A lot of problems in my math/science exams require actual understanding of the topic, the calculator can only help crunch a bit.
05:18:41 Oh, and astronomy. I had programs to give me positions of planets at any time, convert right ascension / declination to hour angle / azimuth, and so on.
05:18:47 I had a real-time clock module.
05:18:54 So I didn't have to enter the time for that stuff.
05:18:58 Nowadays people use Python, I presume
05:19:10 It's getting to be that way, yes.
05:19:12 ^^^ a lot of what you said reminds me of the Emacs calculator
05:19:19 Which is a programmable, stack-based calculator.
05:19:25 I use it all the time.
05:19:31 One of the famous books is "Numerical Recipes." It started out presenting FORTRAN, but then later they issued a C version.
05:19:43 Wouldn't surprise me at all to learn that there's now a Python version.
05:20:01 What does it include?
05:20:12 Well, I haven't looked at it in years.
05:20:40 Roots of polynomials, Newton's method, Gaussian elimination, eigenvalues, etc.
05:20:43 "The works."
05:21:05 Special functions, like Bessel functions, spherical harmonics, etc.
05:21:14 Makes me appreciate the built-in features more.
05:21:19 And there's source code in there for all of it.
05:21:23 :-)
05:21:29 I appreciated learning to write them.
05:21:40 See yesterday's discussion of libraries.
05:21:41 See, I wish TI made their OS at least open-source, if not totally GPL'ing it
05:22:15 The Emacs Calculator manual is a fascinating read. It has all of what you mentioned.
05:22:22 You should go hunt down the WP-34S code.
05:22:25 And arbitrary precision fixnum
05:22:27 It may be open source.
05:22:38 Yeah, I want to write a bignum package at some point.
05:22:50 For Forth?
05:23:01 I'm thinking BCD-style arithmetic, once I understand the DAA instruction
05:23:23 My "proof of principle" problem on that will be to implement a Mandelbrot viewer.
05:23:31 I know it's been done, but I want to do it myself.
05:23:39 Don't do BCD.
05:23:44 Why not?
05:23:53 Use full integer precision, just arbitrary.
05:23:59 Like for me that would be N 64-bit cells.
05:24:05 It will be higher performance.
05:24:06 Oh, hm.
05:24:13 Same concept totally.
05:24:17 I have 16 bit cells
05:24:28 So that's four hex digits per cell.
05:24:37 So it would be "binary coded hex."
05:24:42 But then you're not wasting any bits.
05:24:43 Yeah
05:24:50 But it's the same thing, in every way.
05:24:55 Just better.
05:24:57 0304 = 34
05:25:04 Right?
05:26:01 No, 34 would be 0034.
05:26:10 Well, you could do it the way you said.
05:26:12 Ah yes
05:26:14 But you're wasting half the bits.
05:26:23 And I'm just saying let that be *hex* 34, not decimal 34.
05:26:25 * siraben[m] uploaded an image: addr0.png (4KB) < https://matrix.org/_matrix/media/v1/download/matrix.org/DhMHLCrGvVhFMdxPAZsoOxOG >
05:26:29 Apparently Matrix can send images, I'm checking to see what it looks like
05:26:49 Ah it appears as if I typed /me
05:26:54 Yeah, hex 34
05:27:29 $80, $AC, $46, $19, $18, $45, $80, $00, $00 = -4.61918458 × 10^44
05:27:48 Ok.
05:27:55 So from the third byte onward it's the significand
05:27:56 I didn't work it out, but seems reasonable.
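(The "use full cells, not BCD" advice above, sketched: little-endian multi-cell addition with carry in standard Forth. addc, b+ and the carry variable are invented names; a real bignum package would add comparison, multiplication, and so on.)

  variable carry
  : addc  ( u1 u2 cin -- sum cout )   \ 3-way add via double-cell math; cout is 0 or 1
    >r 0 r> 0 d+ rot 0 d+ ;
  : b+  ( a1 a2 n -- )                \ a1 := a1 + a2 over n cells, least significant first
    0 carry !
    0 do
      over i cells + @  over i cells + @  carry @ addc  carry !
      2 pick i cells + !
    loop 2drop ;                      \ a final carry, if any, is left in CARRY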
05:28:12 No bits are wasted
05:28:40 I have a lot of the groundwork done for me, it's just a matter of reading user input into the floating point format.
05:28:41 If you use base 10 digits you waste bits.
05:28:53 Ah, base 10 woes.
05:29:01 I could just reimplement everything instead of relying on the calculator
05:30:13 Yes, you could. I likely would. :-)
05:30:25 That's just sort of how I roll.
05:30:57 I am using the available fpu instructions in my processor.
05:31:03 But that's relying on hardware.
05:31:21 My NUMBER will convert strings with a decimal point to 64-bit IEEE floating point.
05:31:27 Doing that "manually" would have been a nightmare.
05:31:28 It should be modular. i.e. I can rely on the OS for now and gradually phase over to my own implementations.
05:31:34 I think I saw how to do it, but wow - involved.
05:31:38 But do I really need to redo floating point square root?
05:31:59 I just integer convert the mantissa and exponent and then calculate M*10^E using the fpu.
05:32:18 Ah, well no FPU here.
05:32:20 It's hard manually because the IEEE format has the exponent as a power of 2, not a power of 10.
05:32:36 I'll just stick with the OS then until I have more time
05:32:57 There's a lot of floating point ROM calls done for me, SIN COS TAN EXP SQRT LN LOG etc.
05:33:10 So if I were trying to implement floating point manually, for my own system, I'd probably cut a corner and store numbers as power-of-10 things.
05:33:23 Yeah, I'd be hesitant to give those up.
05:33:39 That wasn't quite what I was thinking of when I commented about "doing it yourself" a minute ago.
05:33:48 My end goal is this: eventually be able to make it through an entire test without leaving the interpreter at all
05:33:57 The calculator's internal format may not even be IEEE float.
05:34:07 "as if I were doing a test"*
05:34:11 Right, I don't think it is.
05:34:17 It's its own.
05:34:48 You can show that power-of-2 results in better preserved accuracy (less round off error) than power of 10, or power of anything else.
05:34:48 Fortunately, my math unit (set theory, group theory) doesn't need any calculator usage at all
05:35:08 That paper I linked you to a few weeks ago (what programmers should know about floating point) lays that argument out.
05:35:26 Yeah, I skimmed that paper but a more detailed reading will come
05:35:26 I need to learn more group theory.
05:35:32 I have only a vague understanding of it.
05:35:36 Haha, what for?
05:35:37 Sort of a "glimpse of the power."
05:35:50 It comes up in particle physics and other areas of advanced theoretical physics.
05:35:59 Well, a group consists of a set of elements and a binary operation that satisfies four properties
05:36:08 associativity, identity, inverse and totality
05:36:09 Lie algebras and so on.
05:36:27 It's pretty intuitive, I think.
05:36:31 Yes, I have the basic concept down, at least.
05:36:42 Combines a lot of ideas you've had about various things
05:36:52 It may not even be group theory itself I need more of, but rather things it touches in its applications.
05:36:57 Like rotations and reflections, they form a group. Permutations are groups. Rubik's Cube moves are groups
05:37:07 Right.
05:37:14 Symmetry is a big deal in physics.
05:37:23 Are you a physicist?
05:38:19 No, I'm an engineer. PhD, though, and I crammed math in graduate school.
05:38:30 PhD in which area?
05:38:31 Specifically because I wanted to be able to self-educate in physics later on.
05:38:35 Ah I see
05:38:57 Engineering.
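(The "integer-convert the mantissa and exponent, then compute M*10^E on the FPU" step above, sketched in standard Forth; the name m*10^e is made up.)

  : m*10^e  ( m e -- ) ( F: -- x )   \ e.g. 314159 -5 m*10^e leaves 3.14159
    swap s>d d>f  s>d d>f  10e fswap f**  f* ;

The single rounding through 10^e is exactly the corner being cut; a fully correct decimal-to-binary conversion is much more involved, which is the "wow - involved" part.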
My research was in electromagnetic theory - worked on electric guns and things like that (railguns, coilguns, etc.)
05:39:09 So engineering with a deep physics connection.
05:39:15 Wow
05:39:27 Did a lot of numerical modeling of em fields.
05:39:28 Maybe fields are more applicable to "real world" physics
05:39:32 Finite elements, etc.
05:39:49 As in mathematical notions of vector fields
05:39:57 Right.
05:40:00 Tensors are cool.
05:40:07 Physics just has a lot of math in it, it's hard to see what's most useful
05:40:10 Ah, I don't know what they are
05:40:14 You get up on top of tensor calculus and there's a "huge unification" of things.
05:40:20 Linear algebra is on my list of things to learn next
05:40:35 Also, Maxwell's equations seem like a lot of black magic to me
05:40:40 So many things that you learn earlier as special cases (like Coriolis forces and so on) turn out to be "just the plain physics, properly expressed."
05:40:45 I love seeing special cases drop away.
05:41:12 When you get that, suddenly you see very plainly how gravity bends light rays even though they have no mass and so on.
05:41:43 You see how gravity affects EVERYTHING, because it's not really a force but rather is the shape of spacetime.
05:42:12 Linear algebra is VERY useful.
05:42:16 It's one of the cornerstones.
05:42:18 I'm learning a lot of seemingly unrelated things; Kirchhoff's laws, induction, force acting on a moving charge through a field
05:42:28 All derived from Maxwell's equations, right?
05:42:34 Right - Maxwell's Equations bring them all together.
05:42:45 Yeah, it's like finding out how calculus unifies a lot of kinematics
05:42:55 Kirchhoff's laws are an "integral formulation" of certain aspects of Maxwell's equations.
05:42:59 Know of physics channels on IRC?
05:43:07 Not really.
05:43:32 In recent years my physics self-training has been primarily YouTube based.
05:43:50 Stanford has a great series of lectures, where Leonard Susskind teaches all manner of things.
05:44:15 I can't wait to have a long school break to finally get to the root of things
05:44:20 They're not just short "lay audience entertainment" pieces - they're full-on college-course-length things.
05:44:39 And there's some absolutely golden stuff out there that Feynman did.
05:44:40 Ah, and not much prerequisite knowledge is needed?
05:44:43 I should watch more lectures
05:45:06 His 1964 or so Messenger lectures, and also a set he did in Australia or New Zealand where he presented his theory, quantum electrodynamics, in a very "consumable" way.
05:45:29 I think the Messenger lectures were titled "The Character of Physical Law."
05:46:22 It's pretty fun watching how Feynman changed over the years.
05:46:37 In those 1960's lectures he wore a shirt and tie, was very clean-cut, and so on.
05:46:45 Much more casual in the Australian lectures.
05:46:56 Then I saw some he did much later, on computing, and he was almost a hippie.
05:47:16 But he was a master educator.
05:47:23 I'm guessing you were around to see the hippie movement?
05:47:39 Well, I was a kid. I was born in 1963.
05:47:47 So yes, but I didn't really "grok it."
05:48:09 My dad was a university professor, so I saw lots of college students in the late 60's, early 70's.
05:48:11 Oh wow, a lot of change since.
05:48:18 Oh yeah.
05:48:21 Interesting life.
05:48:48 My dad taught chemistry, particularly organic and bio.
05:49:12 So I had a pretty easy go of high school chemistry.
05:49:23 Haha, I'm having a pretty hard time with that.
05:49:27 I don't really remember much of it.
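(For reference, the four Maxwell equations being alluded to, differential form, SI units:)

  \nabla\cdot\mathbf{E} = \rho/\varepsilon_0  \qquad  \nabla\cdot\mathbf{B} = 0
  \nabla\times\mathbf{E} = -\,\partial\mathbf{B}/\partial t  \qquad  \nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\varepsilon_0\,\partial\mathbf{E}/\partial t

In the lumped-circuit limit, charge conservation (from Gauss's law plus the Ampere-Maxwell law) gives Kirchhoff's current law, and Faraday's law with negligible changing flux gives the voltage law - which is the "integral formulation" remark below.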
05:49:37 To the extent I do it's those parts of chemistry that are really physics.
05:49:41 Electron structure, etc.
05:50:18 The way quantum theory just tells you how the electrons will arrange themselves, and the way the periodic table just falls out of that, is pretty terrific.
05:52:10 It's kind of overwhelming seeing how deep all these areas of study are now.
05:53:19 I'd imagine it was much easier to get started with programming decades ago, because of the simpler nature of computer systems?
05:53:45 It's now non-trivial to set up a programming environment for a complete newbie
05:54:04 Sometimes it's like a Catch-22, to know how to program, you need to know how to program
05:54:17 I think so. When I was in college (at least in EE) we learned a lot about processor architecture, and so on, and the idea that it's just this state machine interpreting a set of bit patterns in consecutive memory locations was pretty straightforward.
05:54:29 There were no cache memories, there wasn't really much in the way of multiple cores, etc.
05:54:36 So it was a pretty clean, clear set of ideas.
05:54:38 many programming newbies take a lot of the "stack" for granted and never look below their high-level programming language
05:55:44 I remember the first time I heard the terms "stack" and "heap" was in dealing with C
05:56:04 But even C has a lot of confusing things for beginners
05:56:20 Undefined behavior, little vs. big endian, byte alignment
05:56:22 (in this case i meant the layering of different parts of technology)
05:56:34 Ah, that too
05:57:02 Are there still microcomputers on the market?
05:58:14 if you mean keyboard-integrated computers like the C64, arguably that's what laptops with a broken or missing screen are ;)
05:59:44 Well, I don't see those things as being "C" things.
05:59:51 The compiler will handle those things.
06:00:05 I guess if you're trying to cast strings to ints, you need to know that stuff.
06:00:12 But you can learn a lot of C without ever doing any casting.
06:00:44 As long as you're just using types that you declare the compiler will handle that stuff.
06:04:37 I think there's great value in understanding a certain amount about your underlying hardware, though.
06:05:09 For example, knowing how your cache works. There are some array processing algorithms that will run nice and fast if you process by rows, but suck if you process by columns, because you don't get the same kind of cache benefit.
06:05:21 Well, there's that
06:05:24 So knowing how the array is stored and how the cache works lets you get that right deliberately.
06:05:47 And branch prediction
06:06:00 I think there's too much of a perspective that software runs in a "virtual environment" without any real connection to the "underneath."
06:06:05 Yes, that too.
06:06:08 For sure.
06:06:31 Back when I started, you could look up the cycle count for each instruction in a book and add 'em all up, and that was that - that's how long the program took to run.
06:06:39 Like it was a clock.
06:06:43 Same here on my calculator
06:06:47 But not anymore
06:06:59 Totally not the case today.
06:07:00 Right.
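(The row-versus-column point above, made concrete in Forth; grid, at and the 100x100 size are arbitrary. Both words compute the same sum, but sum-by-rows walks consecutive addresses while sum-by-cols strides a whole row's worth of cells per step, which is what defeats the cache.)

  100 constant #rows   100 constant #cols
  create grid  #rows #cols * cells allot
  : at  ( row col -- addr )  swap #cols * + cells grid + ;   \ row-major layout
  : sum-by-rows ( -- u )  0  #rows 0 do  #cols 0 do  j i at @ +  loop loop ;
  : sum-by-cols ( -- u )  0  #cols 0 do  #rows 0 do  i j at @ +  loop loop ;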
06:07:09 A big problem I'm having is what is "data" and what is "code"?
06:07:22 I'm writing a disassembler, and it happily "disassembles" strings and so on
06:07:29 The underlying hardware is just as precise as ever - it's just that the events that can occur are so much more involved now.
06:07:33 code is what you execute, data is what you don't, there is no clearer definition
06:07:57 ah, a disassembler needs heuristics or user input :|
06:08:05 Ah, I see
06:08:09 Technically SEE is a disassembler.
06:08:20 But I'm talking about disassembling binary
06:08:21 Yeah a disassembler needs to be given a valid starting point.
06:08:28 So I can see what the ROM calls are all about
06:08:38 And can't always recognize when it stops seeing code and starts seeing "not code."
06:09:21 Well, when an unconditional RET is reached maybe
06:09:22 Or JP
06:09:24 Or really, I guess, can't EVER recognize that unless you've told it a way that will work in your system.
06:09:35 Possibly, if there's just one exit point.
06:10:01 If the disassembler was smart enough, it could see conditional jumps and know that where they went was more code.
06:10:13 But some kind of intelligence has to guide that process.
06:10:23 To some extent you can code that in, but sometimes not.
06:10:59 Even if your disassembler was very smart (followed conditional jumps, etc.), you could probably trick it.
06:11:15 I imagine I could pretty quickly write something that had a conditional jump that was never taken in operation.
06:11:22 And just have it point into the never never.
06:11:30 Disassembler would lose its mind.
06:12:12 never never == Dresden Files reference.
06:12:22 Best books EVER.
06:12:33 that's the idea behind the "interactive disassembler" (https://hex-rays.com/products/ida/index.shtml for example)
06:13:04 the user can think a little better than a set of heuristics, so it can correct the heuristics' mistakes
06:13:06 I think there are a lot of problems that are best solved via a "collaboration" between human and computer.
06:13:22 yep
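(A toy version of the "follow the conditional jumps" idea just discussed, for an imaginary 1-byte instruction set: opcode 0 is RET, 1 is a jump with a 1-byte absolute target, 2 is a conditional jump, anything else is a plain 1-byte instruction. Given a valid starting point it marks everything reachable; strings and data bytes that are never reached stay unmarked. All names are invented.)

  4096 constant /img
  create img  /img allot          \ the binary image
  create seen /img allot          \ nonzero = known code
  : byte  ( a -- c )  img + c@ ;
  : mark  ( a -- )  1 swap seen + c! ;
  : seen? ( a -- f )  seen + c@ 0<> ;
  : trace ( a -- )                \ mark all code reachable from address a
    begin
      dup /img u<  over seen? 0=  and 0= if drop exit then
      dup mark
      dup byte case
        0 of  drop exit  endof               \ RET: this path is done
        1 of  1+ byte  endof                 \ JMP: continue at the target
        2 of  1+ dup byte recurse 1+  endof  \ Jcc: trace the target, then fall through
        swap 1+ swap                         \ plain op: step to the next byte
      endcase
    again ;

As the discussion says, this is exactly the kind of tracer a never-taken conditional jump into garbage would fool.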
06:13:54 I'm pretty against a lot of machine learning stuff, because it's inherently a "black box"
06:14:03 I prefer correct programs by construction
06:14:19 Yes, and such black boxes make some really profound errors at times.
06:14:24 Not a linear algebra monster tuned by throwing data at it, although it might be "good enough"
06:14:32 There's a famous case where a learning algorithm was trained to discriminate dogs and wolves.
06:14:43 It misidentified a dog as a wolf, and the engineers wanted to know why.
06:14:43 "heuristics" doesn't imply machine learning
06:14:54 So they put in code to "white box" the algorithm - show what parts of the picture drove the decision.
06:15:03 Turned out in that case it was snow in the bottom corners of the screen.
06:15:04 Also Google's Photo AI confused black people for gorillas, because of a bias in the data set
06:15:12 They had to remove "Gorilla" as a category
06:15:14 Without realizing it, they had correlated wolves with snow in their input pictures.
06:15:26 Have you read Gödel Escher Bach by any chance?
06:15:28 Best book yet
06:15:38 Yes, 15-20 years ago.
06:15:44 That's a LONG book.
06:15:57 Took a couple of months of on-and-off
06:16:02 But I totally buy into the idea that there are things a human mind can "just see" the truth of that a machine can't prove true.
06:16:05 A bit heavy on mathematical logic, for good reason.
06:16:09 Yeah, LONG but worth it
06:16:25 I see GEB as just the quantitative "backup" of that idea.
06:16:36 I particularly like the demonstration of the A.I. program "ETAOIN SHRDLU", something like that
06:16:44 Or PLANNER
06:16:53 I don't remember it to that level of detail.
06:17:11 It was a dialogue between a human and program
06:17:18 "Move the blue cube on top of the red cylinder"
06:17:20 etc.
06:17:28 Ah, Ok.
06:17:52 Machine learning won't solve every problem, but it's a good start
06:18:02 Something deeper is missing in our notion of intelligence.
06:18:10 I think we'll continue to get better at making machines "mimic" human thinking processes, but I think it will always be a "mimic" and never the real thing.
06:18:17 Unless we start making computers that are more like brains.
06:18:28 Organic, maybe, or possibly quantum - don't know exactly.
06:18:30 Perhaps the answer is to simulate a brain?
06:18:36 But clearly humans can construct thinking machines.
06:18:38 Yeah, a hybrid might just work
06:18:47 It happens all the time, and the initiation of the process is FUN.
06:18:52 A chip on a brain, or vice versa
06:18:52 They're called babies.
06:18:58 Haha
06:19:20 Yeah, A.I. is just a naturally exciting field.
06:19:22 So I won't claim we'll never figure out how to do that using some other manufacturing technology.
06:19:25 But severely overhyped nowadays
06:19:33 But I feel pretty sure it won't be standard computing as we know it today.
06:19:40 "Look! A phone with an A.I. chip that's actually a GPU in disguise!"
06:20:12 I think these days a better name for "AI" would be "advanced probabilistic algorithms."
06:20:20 It's neat stuff.
06:20:23 But it's not "thinking."
06:20:30 Not aware.
06:21:08 And you've got really scary applications - like some state up in the northern part of the US has an AI program they use to recommend sentences in criminal cases.
06:21:12 That's just scary.
06:21:16 It's also a problem of the general public not knowing enough about how computers work in general
06:21:23 And they won't reveal the source code.
06:21:28 That should be a huge priority for education systems
06:21:31 Yes, I agree.
06:21:40 They're super important tools in our culture these days.
06:21:45 If I didn't have the motivation to, I would never have done all this programming.
06:21:49 People need to have at least a *basic* understanding of them.
06:21:51 I've never had a formal class in it
06:21:58 Right.
06:22:05 Ah, you're doing great then. :-)
06:22:09 I love self-education.
06:22:15 The internet is a marvelous resource for that.
06:22:21 Thanks :-)
06:22:22 Exactly!
06:22:38 IRC has accelerated my self-learning in recent years
06:22:42 The universities are secure, though, because the public won't give up football.
06:22:50 I ask around what people are doing or just watch and learn
06:22:56 Yes.
06:23:15 Well, feel free to ask me chemistry questions. Like I said, I only know parts of it well, but within those areas I might be helpful.
06:23:36 We're doing acids bases and redox right now
06:23:56 Ok, that's not one of my good areas, beyond just very vague recollection which you're probably already beyond.
06:24:01 I just don't have a firm intuition for it
06:24:24 I remember quite enjoying chemical kinetics (predicting reaction rates), but I can't remember enough of it to count now.
06:24:52 Here's also a huge barrier to learning: UI changes a lot, people can't keep up
06:25:05 UI?
06:25:08 We end up with people knowing only how to click things
06:25:10 user interfaces
06:25:14 Oh.
06:25:18 Yes, very much.
06:25:23 Computers have become crutches.
06:26:08 Also, a lot of people I know hate software updates because a. it interrupts their workflow and b. "things don't work like they used to"
06:26:17 Take a class like statistics.
06:26:18 People seem to have the impression that updating software makes it slower
06:26:31 Too many people get through that just learning how to run a set of canned algorithms.
06:26:35 What for?
06:26:37 Ah
06:26:50 Instead of really understanding what's going on in the sense of probability theory.
06:27:25 I've never had a statistics course, but I had a very good probability theory course, and from there I can "get there."
06:27:40 I might do more work than would be necessary with a swanky algorithm someone's developed, but I can get to the answer.
06:27:55 My understanding of stats is pretty basic
06:27:59 Just what is in the syllabus
06:28:02 And I wind up with a DEEP belief in the correctness of my answer, because the foundation it rests on is undeniable.
06:28:23 e.g. Bayes' theorem, a lot of black magic?
06:28:38 I read a book recently about thermodynamics, which turns out to be, fundamentally, the application of statistics to molecular behavior.
06:28:41 I don't like the idea of "just plugging it in"
06:28:44 And wow - the conclusions are POWERFUL.
06:28:49 That same sort of undeniability.
06:29:00 I get it when they say the 2nd law of thermo is our most certain, most inarguable law.
06:29:04 It pretty much has to be.
06:29:10 How so?
06:29:23 Well, that's the law that says entropy always increases.
06:29:32 Right.
06:29:57 I really don't understand enough statistics, not sure how that will help
06:29:58 And when you read through it, you see what entropy is, and so on, and you see that the probability of something ever moving the other direction, of its own volition, is just so small that it doesn't count.
06:30:26 Like, the molecules of air in your room *could* all suddenly run up into the corner.
06:30:37 That wouldn't violate any law of conservation of energy or momentum or anything like that.
06:30:42 But it's just not going to happen.
06:30:53 It's the 2nd law that gives you that.
06:31:01 I gotta drive to work - back in a bit.
06:31:20 Ah I see.
07:28:49 I'm back.
07:29:04 why
07:30:26 Wow, this Matrix thing is really good.
07:30:32 I have it on my mobile phone as well
07:34:00 zy]x[yz: I really don't know - I wonder sometimes. :-)
07:34:08 Habit?
07:44:01 My HP48GX can be programmed in RPL https://en.wikipedia.org/wiki/RPL_(programming_language)
07:45:03 Cool, looks like Forth.
07:45:23 It was at least partially inspired by Forth
07:47:37 Some of the HP calculator engineers were very much into Forth
07:51:42 :-) How could they not have been? It's such an obvious fit.
07:51:54 You wouldn't have to do much to "get there from here" either.
07:52:15 They suppressed explicit access to the return stack, but beyond that the similarity is huge.
07:52:37 They had an explicit "call" operation, instead of it being sort of implicit the way it is in Forth.
07:53:12 Why suppress access to the return stack?
07:53:23 Oh, I'm just saying they *did*.
07:53:38 It was exclusively managed by call ("XEQ") and return ("RTN").
07:53:44 I see.
07:53:45 Nothing else messed with it.
07:53:53 I'm just saying they suppressed that aspect of Forth.
07:54:07 that would eliminate any need for ' and [']
07:54:16 Yes.
07:54:29 it's more similar to c-like languages, where you have a call operator ()
07:54:29 And there weren't any immediate words.
07:54:40 Yes, in that way.
07:54:43 Just RPN.
07:55:07 I always liked the LASTX register.
07:55:10 I found it very useful.
07:55:41 siraben: LASTX was a place that "captured" the top of stack when you did things.
07:55:51 Like 3 2 + would send 2 into LASTX before doing the +.
07:56:05 That would be stunningly easy for me to implement in my Forth.
07:56:15 Any Forth, really, but especially those with TOS in a register.
07:56:27 BC (a 16 bit register) is the TOS for me
07:56:28 Just slip one reg-to-reg move into the appropriate places.
07:57:38 Makes things like DROP fast, because all I need to do is POP BC
07:57:44 Among other things.
07:57:53 Right, it almost always pays off to do that.
07:58:02 So much so that it's become "standard" in modern implementations.
07:58:10 I'd be surprised if I saw that a Forth didn't do that.
07:58:17 I'm really coding it tight here with the minimal amount of registers
07:58:28 Experiments have been done with keeping more than one item in registers, but the benefits drop FAST.
07:59:00 If you do a lot of fancy footwork in the compiler you can make keeping two or three pay off, but most systems don't bother.
07:59:27 The payoff is smaller than that "first payoff" of moving TOS to a reg, and that fancy footwork is substantial.
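(The LASTX idea really is a one-liner in Forth. A sketch: + is shadowed so the old TOS is recorded before the add, the way 3 2 + sends 2 into LASTX. The inner + still finds the old definition, since the new name isn't findable until the ; - most systems will print a redefinition warning, and a full version would wrap the other arithmetic words the same way.)

  variable lastx
  : +  ( n1 n2 -- n3 )  dup lastx !  + ;
  : lastx@  ( -- n )  lastx @ ;

  3 2 + lastx@   \ leaves 5 2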
08:00:24 I still have a pile of HP calculator related books, IIRC the manuals for the Saturn processor are available online.
08:01:04 Yeah, there are some nice old HP resources floating around.
08:01:16 There was this thing you could do with the 41CV that they called "synthetic programming."
08:01:25 It involved exploiting undocumented features of the processor.
08:01:47 Things that weren't documented instructions, but because of the way the instruction set mapped onto the hardware they'd work anyway.
08:01:58 Someone found a "back door" that let you create these instructions.
08:02:20 It wasn't really that amazing in hindsight, but it was fun and felt all "sophisticated and hacker-ish."
08:02:23 So it was entertaining.
08:02:58 Let you feel like you were a cut above people who didn't know how to do it, which I think is an ego thing lots of people take from all kinds of knowledge.
08:03:35 "Synthetic programming"
08:03:36 In an FPGA based stack processor you can just put the stacks in SRAM, making the entire stacks effectively register equivalent.
08:03:37 When you entered these instructions you'd get really bizarre looking symbols on the screen.
08:03:40 Sounds like program synthesis?
08:03:44 That was just the label that got applied to it.
08:03:48 No, nothing whatsoever to do with that.
08:03:52 It wasn't a very apt name, really.
08:04:00 It just caught on.
08:04:38 Well, undocumented instructions become documented if you, erm, write the documentation for them.
08:04:39 All you were really doing is tricking the calculator into letting you put bytes into the program memory that didn't correspond to any of the documented instructions.
08:04:54 :-) Right, but you have no guarantee that they'll continue to exist.
08:05:03 But firmware upgrades weren't the "thing" back then that they are now.
08:05:10 You bought the thing, and you ran the firmware in it forever.
08:05:24 So if you could do it you could do it - it was real for you.
08:05:54 Even if you could upgrade firmware on something, most people never did.
08:06:04 The things came with ROMs, not flash.
08:06:07 Flash didn't exist yet.
08:06:20 Literally chips that could only be programmed once, at manufacturing time.
08:07:25 Ah
08:07:34 Can't imagine all the hardware tricks people did back in the day to get every bit of performance
08:07:46 You could even implement each stack as a huge shift register, no need for stack pointers.
08:07:49 Nowadays we just seem to expect hardware to get exponentially faster and our code to get exponentially worse.
08:08:06 Yes, we do.
08:08:09 What a waste.
08:08:20 I want to DO STUFF with that power.
08:08:25 Instead of having the OS sop it up.
08:09:05 Wasn't there a Turing Award lecture, "The Computer Revolution Isn't Here Yet"?
08:10:33 I've not heard of that, but it sounds reasonable.
08:12:43 Here's a pretty bad thing: routers. I can't even modify mine
08:12:59 I wrote up a prototype stack machine where the stacks were shift-register based, but never got around to completing it or testing it.
08:13:00 And it's an extremely important piece of hardware
08:13:18 you can't put open source firmware on it?
08:14:04 rdrop-exit: I did some hobby work on an FPGA stack processor that had hardware stacks that were "sort of" shift registers.
08:14:21 It was a string of hardware registers, and I physically shifted the content on pushes and pops.
08:15:40 Cool, mine were shift registers, no stack pointers at all. I want to revisit that design one of these days and finish it. Too busy with other stuff right now.
08:16:05 Yeah, no stack pointers here either.
08:16:21 One down side of it, though, is that moving all that data around on every stack operation was power-intensive.
08:16:51 But I managed to develop a set of 5-bit instructions that very slickly controlled everything that needed to get done.
08:16:59 It was a VERY "low overhead" thing.
08:17:25 Harvard architecture, all instructions single-cycle, no idea how long each cycle would be though, never tested it.
08:18:39 The intent was a processor for deterministic real time apps, 2 cooperative tasks each with its own pair of stacks.
08:18:48 No interrupts.
08:19:28 Gotta go, wife wants to watch a video. See you all soon.
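(A software model of the pointer-less shift-register stack being described: pushing shifts every cell one place toward the far end, popping shifts them back, and the oldest value simply falls off, as in a fixed-depth hardware stack. A sketch in standard Forth; srs, spush, spop and the depth of 8 are invented.)

  8 constant depth
  create srs  depth cells allot
  : spush ( n -- )   \ shift the whole array up one cell, store at the top
    srs dup cell+  depth 1- cells  cmove>  srs ! ;
  : spop  ( -- n )   \ fetch the top, shift the whole array back
    srs @  srs cell+ srs  depth 1- cells  cmove ;

The model also makes the power cost visible: every push or pop moves the entire stack, which is the "moving all that data around" point above.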
08:26:27 wow, no wonder twitter is hemorrhaging users, it feels like everyone is in one unmoderated irc channel on a server with no ephemerality
08:28:03 I never did understand twitter
08:31:51 but then I don't really understand reddit, either
08:32:08 I miss the 90s when we used message boards
08:32:41 well, to me reddit is /. but with more topics and more control over your own frontpage
08:33:02 what is /.
08:33:48 heck the first use of twitter I saw was admins using it for downtime notifications to everyone in the company who cared.
08:33:56 zy]x[yz: Slashdot
08:34:07 never used it
08:34:44 an old nerd news site that somehow is still running and pretty good for stuff you might have otherwise missed
08:35:34 oh, I could never stay interested enough to read nerd news
08:35:56 well, it is mainly science, tech and such news.
08:35:57 I really could not care less about the new wizbang 5000 processor coming out next month
08:36:25 sure, but potshots at Intels latest ISA blunder and such, no?
08:36:49 I heard about those quickly enough as word spread through freenode
08:37:08 I think I check /. every fortnight or so
08:37:18 no fortnight is a video game
08:37:32 no, fortnight is two weeks or half a moon cycle
08:38:02 you're thinking of four score and seven years ago
08:38:12 it just happens that fortnight the game was cobbled together in a fortnight, hence the name
08:38:26 haha is that really why they named it that?
08:38:47 no, but I suspect from the code quality. Specially the shitty netcode
08:40:15 in FPS games with proper netcode, say Quake 3 or derivatives, the server checks each impulse message from a client as soon as it arrives and broadcasts it back to everyone.
08:41:06 none of this "update from server at 17-20 Hz" crap
08:43:26 I can't comment on that, either. I have some kind of personality defect that I'm never into whatever is popular at the time
08:44:41 me neither, I was just curious how these games were designed and implemented and the crappiness is astounding.
08:45:44 it's software. all software is bad.
08:46:37 well, software lies on a gradient from say bad to utter crap
08:46:48 I often wish I could get the motivation to spend money on a wintendo, but then I look at what games are popular today and it just doesn't interest me all that much
08:47:27 is this what it's like to be an adult
08:48:13 Mark Twain once said something like "When you discover you share an opinion with the majority, it's time to do some self-reflection."
08:48:15 dont look at what is 'popular' as that has ceased to be any good metric, look for genres you might be interested in
08:48:57 I am almost never "in sync" with the crowd.
08:49:05 oh, I have, and I'm still not interested
08:49:11 video games today just don't look all that good
08:49:33 there's one guy building a hard-core space sim that looks pretty interesting, Rogue System
08:49:43 * Zarutian finds the 'crowd' shallow, desperately trying to be 'in', and demanding to be entertained.
08:49:54 I can get momentarily intrigued with tech advances in gaming, but it never lasts.
08:49:56 I keep checking in on it, but I'm pretty confident it's one of those super ambitious things that will never actually be finished
08:50:56 zy]x[yz: what about games such as Stardew Valley? Shenzhen I/O? Exapunks? Factorio? Minecraft? Rimworld?
08:51:38 I haven't heard of a lot of those. I played minecraft for a bit when it was new, and that was pretty fun
08:52:32 I've played KSP too, which is also fun but also frustrating because, as you were talking about with fortnight, you can tell it's cobbled together by the shittest coders on the face of the planet and it just hurts knowing that
08:52:49 hell, I know people that still play Minecraft. But many of them have moved to modded minecraft. Some mainly due to the Chisels and Bits mod (which allows for subblock sculpting)
08:53:33 zy]x[yz: but yeah with KSP they do not even try to hide that fact, and I heard that they are steadily improving.
08:55:41 something that Epic Games (maker of Fortnight) tries desperately to hide
08:55:57 oh, well. these days my entertainment is made up mostly of playing with the cat, going to the shooting range, and revisiting old nintendo games I haven't played since I was like 10. and occasionally staring at my terrible forth and wondering where it all went wrong
08:57:18 yeah, I too play old Nintendo games that I never could afford when I was ten. Still, I only play them in emulators.
08:58:02 zy]x[yz: what kind of shooting do you do? Gun or bow or rocketry?
08:58:08 gun
08:58:41 and I won't say I'm very good at it, but I think I'm getting better
08:58:54 chemically powered slug throwing or electromagnetically accelerated?
08:59:23 * Zarutian leaks some future info. Should have settled at a later time period.
08:59:27 heh, I haven't seen a lot of rail guns around here
09:00:38 one thing I have seen with rocketry based shooting is that for longer ranges you dont have to compensate for gravity nor wind.
09:01:35 and btw that is an old art from China before the Ming Dynasty iirc
09:01:50 well for now I keep it at around 7-10 yards, so gravity and wind aren't so much of a concern
09:02:25 I need to get a good range rifle, though
09:02:53 there's a .22 I've been wanting, but I'm waiting for one to pop up for a good price
09:03:34 the only rifle I have is a 100-year-old (this month, in fact) springfield m1903
09:04:03 front loaded? or bolt action?
09:04:46 bolt action. it's the US's WWI and early WWII battle rifle
09:06:00 yeah, I read about those a bit. They were relatively cheap and fast to make but still somewhat reliable.
09:06:05 that can be fun, but in moderation. if I'm only wearing a t-shirt (which is usually the case), it'll leave a bruise
09:07:16 the one I got is right from the end of WWI, which is perfect imo because during WWII they made a lot of cost-reducing and manufacturing-simplifying changes so they could crank more of them out
09:07:43 has anyone played the prank of showing up with a shotgun and shooting it terminator 2 style? I know of one guy who made a pneumatically locking exoskeleton part to pull that off without hurting.
09:08:12 haha, no. I shoot at an indoor range, so no shotguns allowed
09:09:01 another reason to bring out the 1903 sparingly: indoors, even with double ear protection (plugs and over-the-ear), that thing is a monster
09:09:06 (the exoskeleton part locked when you pulled the trigger and transmitted the recoil force to your torso)
09:20:13 nice
09:53:27 zy]x[yz: say, why do you say your forth is shitty?
09:57:29 I struggle with factoring and coming up with good names for factored pieces
09:57:35 is just one reason
09:57:52 so before long I end up with this huge collection of unintelligible words
09:59:39 I don't know, I guess it's not THAT bad, but more often than not I'd say I find myself feeling critical rather than proud of what I've written
09:59:57 that's not exclusive to forth, but I think it happens more frequently with forth
09:59:59 hmm.. I find having copious amounts of comments for each such word helps to capture the context of why and what the hell it does.
10:00:24 yeah but then if I have copious amounts of comments then I lose the brevity
10:00:54 I might just be too attached to c to ever really love forth
10:02:41 hmm.. I usually have : flargb ( stack effects ) floob floob fle ; \ comment of why this came to be
10:04:07 here's an example of two words that I'm reasonably happy with how they work and are factored, but I think it illustrates the problem that it's unreadable
10:04:25 this is from my assembler, where I have an array that I'm using as kind of an operand stack
10:05:07 : (o) does >r flags dup -4 and swap (f optr) r> execute or (flags) ! o[] ;
10:05:25 oops. second line should be:
10:05:26 : >o does [: does tuck 1+ ;] (o) ! ;  : o> does [: does 1- tuck ;] (o) @ ;
10:06:02 ( to make sense of that, in my forth you always have to use "does" to begin a word definition )
10:06:45 the o in the names of these two words stands for operand?
10:06:50 yeah
10:07:47 for some reason I'm tracking the optr as a bitfield in a flags variable
10:07:49 why not (operand) and >operand ? some forths allow for aliasing, that is you can have (operand) and (o) point to the same word.
10:07:54 I guess it would be a lot simpler if I didn't do that
10:11:52 this is an assembler for what architecture?
10:11:58 amd64
10:12:07 so x86 based
10:12:11 yeah
10:12:16 x86-64
10:12:56 hmm.. do you try to follow gnu assembler or AT&T syntax or have you rpn'ed it?
10:13:11 it's this weird RPN AT&T-looking syntax
10:13:31 0 (%rcx, %rbx, 1) bptr %ebx movzx,
10:13:45 %rsi -8 (%rbp) mov,
10:13:45 so operands show up first, set some state vars and the opcode word consumes that state, yes?
10:13:52 right
10:14:46 hmm.. you do grok the octalness of x86, no?
10:15:08 yes
10:15:47 a lot of my constants are written in octal
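(For contrast with the bitfield version above, the plain array-and-pointer operand stack that "would be a lot simpler": a sketch in ordinary standard Forth, with no bounds checking and all names invented.)

  create ops 8 cells allot   variable optr   0 optr !
  : >o  ( n -- )  ops optr @ cells + !  1 optr +! ;    \ push an operand
  : o>  ( -- n )  -1 optr +!  ops optr @ cells + @ ;   \ pop it back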
10:17:40 I think that, like the ww1 rifle, I can just only enjoy forth in small doses
10:17:57 too much and it starts to hurt
10:25:53 could also be that the first task I've chosen to do with forth is just not very fun in general - building an assembler, and all that. so it might not entirely be forth's fault
11:33:45 I've actually had a quite nice time writing the "Forth" parts of my system. It's still being done via the assembler, but I actually write Forth, as a comment, and then just mechanically implement the word below using data definition directives. So far that's gone really smoothly - all the pesky parts were in getting the actual assembly to work.
11:33:59 Especially the system calls - MacOS system calls aren't very well documented, and that was painful.
11:35:27 I've tried to be very careful to keep that "comments Forth" synced with the assembly - I intend to actually use that Forth at some point.
17:11:10 Good morning
17:11:58 System calls are underdocumented here too
17:20:37 hi
17:56:24 Hi Dave.
17:58:02 Good morning all
17:58:11 Hi.
17:58:21 … or evening as the case may be
17:58:53 Still on my first cup of joe
17:59:29 Evening here.
18:01:06 9am here
18:01:10 First bowl of tomato and rice soup.
18:01:25 It got chilly today for the first time this year, and the family always likes this soup on chilly days.
18:01:43 Haven't had it since last winter. My mom's recipe, not that it's very involved or anything.
18:01:44 bon appetit
18:02:34 87 degrees here, cool morning
18:02:46 :)
18:11:01 It's in the low 50's here.
18:11:45 gives me chills just thinking about it
18:12:04 Windy and rainy too - that "wet air."
18:14:30 81% humidity, partly sunny
18:14:55 It’ll rain later today
18:16:13 rainy season
18:33:09 Whatever happened to ASYST, the Forth-based scientific computing package?
18:34:40 Ah, I have a vague recollection of that.
18:35:20 I think I tinkered with it at some point LONG ago.
18:35:24 At work.
18:36:03 There were writeups in FORML and JFAR IIRC, and a couple books published about it
18:37:45 I’ll google
18:42:17 It’s mentioned on the Forth Wikipedia page
18:42:33 rainy too :D
18:48:21 Campbell et al, "Up and Running with Asyst 2.0", MacMillan Software Co., 1987
18:50:40 any document written in this decade?
18:51:16 Doubtful
18:54:00 I don’t mind reading old computer literature, enjoy it even
19:02:01 Found 3 papers about ASYST in the Proceedings of the 1988 Australian Forth Symposium
19:05:23 Ooh, there’s a nice paper titled « Forth Engines vs Assembly » in these proceedings.
19:05:52 by Ray Gardiner
19:06:20 I’ll type up a quote…
19:07:00 « Does the language help define the solution? »
19:07:32 « The final software will be a reflection of how well we understood the problem. »
19:08:19 « By allowing the problem to be probed interactively and viewed from different angles Forth leads to a greater understanding of the problem. »
19:08:55 « The careful selection of names allows the programmer to ‘think’ about a problem in a high level way. »
19:10:11 « Simplicity. A correct solution is ALWAYS the simplest. This search seems to be encouraged by Forth itself. Usually with startling results. Forth itself is a simple solution to a complex problem. »
19:11:49 « When starting out on a new project it is usual to get misled by the detail and complexity of the issues to be absorbed. It takes practice and patience to strive for that special view of a problem which leads to a deeper insight and understanding. »
19:12:14 Nice
19:12:36 Yes, that's excellent stuff.
19:12:57 That should really be the first lecture (and often repeated) in Computer Programming 101.
19:13:12 Shouldn't just be a Forth thing.
19:14:24 I’ve given up on mainstream computer science except for algorithm related research.
19:17:17 Yeah. And this nonsense going on in the kernel community... :-|
19:20:28 just watched euroforth 2018's video
19:20:38 i am glad one of the guys mentioned redis
19:20:52 but unfortunately he only mentioned the client side
19:22:23 Forth is a computing technology stack from hardware to application rather than just a ‘programming language’; sometimes I feel the ANS Forth related discussions lose sight of that.
19:26:13 At the end of the Chuck interview he mentions that he’s revisiting the concept of Sourceless Forth. I’m very intrigued by that.
19:27:06 Yeah, I've spent a bit of time recently thinking about that too. I think I see what it's all about, and how Forth wouldn't need "much changing" to make it work.
19:28:02 The real trick is to make the language 100% decompilable.
19:32:34 I experimented with an approach a while back, the only real issue is derived literals (I think, not sure).
19:34:04 Do you have issues of the old ACM SIGForth newsletter?
19:35:08 No.
19:35:53 I’ll type up an article from one of them you might find interesting.
19:36:40 Literals would be an issue, at the very least because even if you can solidly identify them in the code stream you don't know what base they were expressed in in the source.
19:36:53 So the information needed to correctly render the source just isn't THERE.
19:37:47 You could put number strings in the symbol table along with all other strings.
19:38:12 Then that literal in compiled code space would be "pointed at" by a record in the table (which contains a string) just like, say, SWAP would.
19:38:18 or you could just detect (LITERAL) and special case it
19:38:40 like in my Forth you'd special-case (LITERAL), BRANCH, and ?BRANCH
19:38:40 Well, I figured you'd do that to know you were looking at a literal in the first place.
19:39:06 but if you find 01010101, say, do you type 0x55 or do you type 85?
19:41:01 As you're ‘composing’ the code, whatever will not be directly decompilable from the machine code would have to be stored in some sort of beefed up symbol table.
19:41:32 It’s spreadsheet-like in a way.
19:43:21 Right. SWAP is a well-defined instruction that will have the same value in code space everywhere it occurs.
19:43:42 So for that class of things you just need the symbol table to say "for that code value, this is the string" and you're done.
19:44:15 but the literal 0b01010101 might appear in the code some places where you typed 0x55 into the source, and some places where you typed 85.
19:44:48 So for THAT you have to have more than just the compile pattern - you have to have all of the ADDRESSES in code space where it appears as, say, 85, in a table entry that has the string "85."
19:44:58 The machine instruction decoding would be in an Instruction Table, while the rest would be in some sort of Supplemental Info Table.
19:45:14 Yeah.
19:45:36 That supplemental info table would be the "source code," but with stuff that you can represent more easily removed.
19:45:40 It's the "leftovers."
19:45:45 This was the approach used by the M-Code cross-assembler.
19:46:43 which was a DOS program written by Matt Biewer of STD bus fame.
19:47:23 The SIGForth article I want to share with you is about that program.
19:47:32 The other thing you could do here would be to have multiple (LIT) instances.
19:47:40 One for each base.
19:47:54 Or else you include an information field in the compiled code, which gets skipped over by (lit).
19:48:05 That's probably simpler than complexifying the symbol table.
19:48:30 I prefer the object code to be pure machine code.
19:48:33 binary (lit), hex (lit), decimal (lit), octal (lit).
19:48:44 Well, me too really, but it solves the problem.
19:49:13 In the first way I described it (multiple (lit) instances) it is still machine code.
19:49:15 They're all lit.
19:49:19 (lit) rather.
19:49:27 They all do the same operation - push the following cell to the stack.
19:49:46 I don’t think base is an issue; I would just allow switching the view from one to another.
19:50:08 Oh, I wouldn't like that.
19:50:15 I want to specify that at coding time and have it stick.
19:51:05 Then just save the address of the literal and the original base in the symbol table.
19:51:53 Well, we might wind up needing that because it might turn out that just knowing the base doesn't cover all the bases.
19:52:00 But for that problem alone I think it's a cleaner solution.
19:52:01 The harder part is when a literal was derived through a formula at edit time.
19:52:11 You don't need extra code to parse the extra symbol table, etc.
19:52:21 It follows the same paradigm as everything else, completely.
19:52:34 Since the formula wouldn’t end up in the object code.
19:52:34 Yes - agreed.
19:52:49 That's "source" that doesn't become object in a one-to-one way.
19:52:53 I do think that's harder.
19:53:16 And I think you would need the "extra info" for that - you'd have to have the compile-time formula stored somewhere.
19:53:29 Yes, like a macro.
19:53:38 So it's sort of like this.
19:53:51 Start by thinking of a "tokenized" representation of the full source.
19:53:54 or a compiler directive.
19:53:57 Much like what I'm implementing now.
19:54:16 I think that’s the wrong direction
19:54:19 Then recognize that you can come to a series of those tokens that can be totally decompiled from object.
19:54:41 So you don't need to store that source - you can just say (in the "source") "decompile N items from object."
19:54:59 So that lets you reduce the volume of the "source" substantially.
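
A minimal sketch of the multiple-(lit) idea in high-level Forth. It assumes a classic indirect-threaded system in which entering a colon definition pushes the caller's interpreter pointer onto the return stack (the textbook (LIT) pattern; not portable), and all names here are hypothetical. Each variant performs the identical action, pushing the inline cell that follows it in the thread, but carries a distinct execution token, so a decompiler can recover the display base:

    \ Shared runtime behavior, one word per base: fetch the inline cell
    \ and step the saved interpreter pointer past it.
    : (lit2)   ( -- n )  r> dup cell+ >r @ ;   \ render operand in binary
    : (lit8)   ( -- n )  r> dup cell+ >r @ ;   \ render in octal
    : (lit10)  ( -- n )  r> dup cell+ >r @ ;   \ render in decimal
    : (lit16)  ( -- n )  r> dup cell+ >r @ ;   \ render in hex

    \ Hypothetical compile-side helper: lay down the variant matching the
    \ current BASE, then the cell itself (only hex/decimal shown; assumes
    \ xts and data share one dictionary space).
    : lit,  ( n -- )
       base @ 16 = if ['] (lit16) else ['] (lit10) then compile, , ;
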
19:55:04 Start with the machine code, and figure out what extra info you need to store elsewhere.
19:55:09 It's sort of approaching the same middle point from the other end.
19:55:16 Right
19:55:30 I like viewing it as a continuum like that.
19:58:11 You end up with 3 data structures, I think: Machine code image + Instruction table for the processor + Supplemental Info Table
19:59:34 The original programmer can use all 3 to disassemble the original source ‘view’, while
20:00:03 someone who doesn’t have the third table can still do a normal disassembly.
20:01:07 The machine code image is pure.
20:01:40 The user interface would be in some ways similar to a spreadsheet of sorts.
20:04:14 or to an old debug monitor.
20:05:20 Comments also go into the Supplemental Info Table.
20:06:01 Since you're representing most of the "source code" as compiled code, that means it has to be compiled all the time.
20:06:06 Even when it's not in RAM.
20:06:15 It would live in a "virtual address space" on disk.
20:06:27 Then you'd either relocate it when you loaded it, or use virtual memory.
20:06:36 I think the whole thing would dovetail with modern MMUs quite well.
20:07:27 So unless you used a virtual memory system, your object code design would have to be perfectly relocatable as well.
20:07:44 You'd have to just look at it and be able to know how to put it in a different memory range.
20:09:33 Either you make it inherently relocatable using IP-relative addressing, or you just disassemble it and reassemble it for a new location.
20:09:57 Ah, you're right - the other problem solves this one.
20:10:19 If you can get to source, you're done.
20:11:55 Though at that point I have to wonder what the big payoff of this is.
20:12:21 It's interesting and mentally stimulating, but what does it BUY US?
20:13:28 It would take up more room on disk - every word would have a "full virtual cell" that it sat in within the code image on disk.
20:13:34 For Chuck it’s minimizing the semantic gap between language and processor. Bringing the map closer to the territory.
20:13:52 Whereas his ColorForth-era stuff (that I'm somewhat emulating) supports a high level of compression.
20:14:58 I think you'd have a win if you could literally just scoop the disk content into RAM, without a lot of work relocating or decompiling/recompiling.
20:15:13 It really would work well with virtual memory - you'd effectively be "executing it on disk."
20:15:20 Just with RAM being used to accelerate it.
20:15:48 I would just store the target images WITHIN the development environment’s executable, save the whole thing to disk.
20:15:49 That just wouldn't work on a processor with no MMU, though.
20:16:32 Yes, it would work really well for a system that always had the same code in RAM all the time, in the same spots.
20:16:53 And honestly I think that's the type of system he worked on later on.
20:17:04 He had one goal in mind, and was crafting a RAM image devoted to that goal.
20:18:29 In a more "mainstream" computer, though, you have to be able to select arbitrary parts of your code base and make them coexist in RAM at any given time.
20:18:34 His blocks are just in RAM as part and parcel of the development environment image.
20:19:18 That’s what I do in my host Forth.
20:19:49 I don't think I'm describing this very well. When your computer is off and everything is on disk, definitions reside at specific disk locations, and references to those definitions have to be references to those (disk) locations.
20:20:00 When you want to RUN the code, that all has to become RAM-referenced somehow.
20:20:28 So you either have to change it while loading, and keep up with the right way to do that, or you have MMU hardware that will let you always use the disk references, even when you wind up targeting RAM.
20:20:46 Either way, there's no compression.
20:20:56 It's a full-size representation, on disk and in RAM.
20:21:34 He was pretty jazzed over the compression at one point, or at least Jeff Fox was.
20:22:11 He talked about how he'd shrunk his disk blocks from 1k to 256 bytes, but wound up getting more information in the 256 bytes than he did in the 1k.
20:22:28 That was ColorForth era, and when Fox was working on Aha.
20:22:43 He was concerned about compressing source code; now he’s eliminating it.
20:23:03 Except he's not, really - there still has to be a disk representation.
20:23:20 --- join: smokeink (~smokeink@42-200-118-172.static.imsbiz.com) joined #forth
20:23:29 And there still has to be this "supplemental info," which will have to cover everything that's not actual definition code.
20:24:10 I like this if there's virtual memory hardware. I think it just doesn't gain much if there's not.
20:24:38 If you have to process the disk representation to get the RAM representation, you may as well have source.
20:24:40 Sure, but supplemental info is only what is needed to supplement the machine code image.
20:24:49 It's semantically still source in that case.
20:25:44 There’s less redundancy.
20:26:25 I don’t think we’re talking about the same thing.
20:27:03 He doesn’t need to process the disk representation in what I described.
20:27:06 I think we're in the same ballpark, but in different corners of it.
20:27:12 :-)
20:27:16 This all feels like it fits together into one picture somehow.
20:27:34 Ultimately what we care about is the RAM image.
20:27:39 That's what we can execute.
20:28:04 On the target machine though, not the development machine.
20:28:11 Well, time for me to sleep. Maybe my subconscious can work on it some.
20:28:24 Take care, enjoyed the chat.
20:28:25 Right, if you have both.
20:28:39 Me too - having lots of fun in this channel these days. :-)
20:28:40 Night.
20:28:50 :-)
20:29:01 --- quit: rdrop-exit (Quit: rdrop-exit)
20:30:07 I need a snippet of code that for some reason I'm having trouble thinking up
20:30:23 What's it need to do?
20:30:38 a function with the signature ( u*n u -- u )
20:30:56 basically drop all of the top members of the stack except the topmost n
20:30:57 : foo nip ;
20:31:05 yw
20:31:19 actually, nip with a loop might work
20:31:32 Yeah.
20:31:57 what is "all of the top members of the stack"
20:32:08 I have DROPS.
20:32:12 Drops N items.
20:32:27 so, you want to reset the stack to only the top n elements
20:32:31 So for you I could do : foo swap >r drops r> ;
20:33:01 I thought he wanted to delete n items from under the top element.
20:33:32 tabemann: DROPS is fast - I just calculate an adjustment to the stack pointer.
20:33:37 i don't know, he said "except topmost n"
20:33:48 But he also said "nip in a loop."
20:34:10 but his stank picture also doesn't agree with any of the three interpretations we've discussed
20:34:17 I wanted to delete n items from under the second-most top element
20:34:23 lol stank picture
20:34:27 goddamn keyboard
20:34:28 and I wanted to also drop the top element
20:34:50 So N is the top item, you want to keep the item under that, and the n below *that* go away?
20:35:00 : test 1- 0 ?do nip loop ; works
20:35:02 yes
20:35:11 well the n-1 under that
20:35:22 Ok.
20:35:37 : foo swap >r 1- drops r> ;
20:35:48 And drops doesn't need a loop.
20:35:50 instead of nip i would save the top element on the return stack and use drops or a loop with drop
20:35:55 yeah that
20:36:04 that would require me to implement drops
20:36:11 Yes.
20:36:29 You can use SP@ and SP! to do that.
20:36:33 It doesn't have to be a primitive.
20:37:51 SWAP >R gets the guy you want to keep saved.
20:38:10 Then it would be in the neighborhood of this:
20:38:22 : foo swap >r cells sp@ + sp! r> ;
20:38:34 You might need a 1- in there somewhere.
20:38:56 And there might be a problem lurking in there involving getting something from deep down there into the TOS register.
20:39:05 You might need this instead:
20:39:20 : foo swap >r 1- cells sp@ + sp! drop r> ;
20:39:50 The 1- and the DROP conspire to fix up the TOS register at the new deeper level.
20:40:41 That seems to work on my Forth.
20:40:53 : foo swap >r 1- cells sp@ + sp! drop r> ; ok
20:40:55 1 2 3 4 5 2 foo ok
20:40:57 . 5 ok
20:40:59 . 2 ok
20:41:01 . 1 ok
20:41:14 And that leaves the stack empty.
20:42:01 I'd definitely call that NIPS.
20:42:41 back
20:55:48 Actually the 1- and drop are unnecessary.
20:55:58 : nips swap >r cells sp@ + sp! r> ;
20:56:38 Because at the time the stack pointer moves, TOS is holding the new stack pointer value, which is dropped after being stored.
20:57:16 Ok - off to bed. Night guys.
20:58:25 --- quit: dave9 (Quit: dave's not here)
21:36:39 --- quit: kumool (Quit: Leaving)
21:42:54 --- join: kumool (~kumool@adsl-64-237-233-141.prtc.net) joined #forth
21:45:11 --- quit: dys (Ping timeout: 244 seconds)
22:23:08 --- join: dave9 (~dave@90.20.215.218.dyn.iprimus.net.au) joined #forth
22:23:22 re
23:17:24 --- join: dys (~dys@tmo-096-50.customers.d1-online.com) joined #forth
23:40:47 KipIngram: How did your system for calculating voltages and currents for circuits work?
23:41:00 i.e. how did you input the arrangement of the circuits and make it output the values?
23:47:16 --- quit: nerfur (Ping timeout: 272 seconds)
23:59:59 --- log: ended forth/18.10.15
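
As a footnote to the NIPS exchange above, here is a portable restatement in standard Forth. It is a sketch that sidesteps SP@ and SP!, which are non-standard and whose exact effect depends on whether the system caches TOS in a register, the very wrinkle the 1- and DROP were compensating for:

    \ Keep the top stack item, discard the n cells beneath it.
    : nips  ( xn ... x1 x0 n -- x0 )
       swap >r           \ park x0 on the return stack
       0 ?do drop loop   \ drop the n cells that were under it
       r> ;              \ restore x0

    1 2 3 4 5 2 nips   \ leaves 1 2 5, matching the session above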