00:00:00 --- log: started forth/21.04.25 00:10:12 --- quit: Vedran (Ping timeout: 260 seconds) 00:10:51 --- join: Vedran joined #forth 00:12:09 tabemann: packaged your attoforth into my overlay packages :) https://github.com/siraben/siraben-overlay/blob/master/pkgs/attoforth/default.nix 00:12:34 it could easily be upstreamed to nixpkgs as well 00:55:57 --- join: andrei-n joined #forth 01:00:31 --- join: tech_exorcist joined #forth 02:48:16 --- quit: wineroots (Ping timeout: 245 seconds) 03:51:28 * DKordic drops 5 Ki B of Forth. 04:04:55 --- join: f-a joined #forth 04:27:43 --- quit: dave0 (Quit: dave's not here) 04:29:56 --- quit: jedb (Quit: Leaving) 04:30:19 --- join: jedb joined #forth 05:47:21 --- quit: tech_exorcist (Remote host closed the connection) 05:57:45 KipIngram, re: "Suddenly you don't NEED IF / THEN" this is a big thing to achieve. I use guard statements but otherwise don't use if as a flow control inside a function/word at all. 07:16:24 --- quit: lispmacs (Read error: Connection reset by peer) 07:20:33 proteusguy: Yes, I was fairly happy when I realized what was going on (no IF ... THEN, etc.) 07:20:58 how you do not need them 07:21:14 siraben: I don't disagree re: the value of abstraction. I just had to snicker a little at the lengths the fellow went to to avoid giving the obvious answer to the question. 07:21:28 f-a: I've incorporated a family of conditional return instructions into my Forth. 07:22:07 I replace IF ... THEN by factoring ... into a new word with a conditional return at the very beginning. 07:22:26 what is this word 07:22:29 It accomplishes the same control flow and encourages me to factor more completely. 07:22:41 Well, I have a whole bunch of them, but if you wanted just one you might call it ?; 07:22:52 ?; 07:23:05 what does 1 ? do 07:23:06 I have a whole family, like =; !=; >; 0<; etc 07:23:23 It's just a name. question mark, semicolon. 07:23:32 yes but 07:23:34 It's meant to imply "possibly return." 
07:23:34 what does it do 07:23:40 If the flag is true it returns. 07:23:41 return as in C? 07:23:45 I c 07:23:47 thanks 07:24:04 Transfers control to the address on the top of the return stack, and pops the return stack. 07:24:20 I also have some that do two returns (like 0=;; ) 07:24:34 The only reason for those is so that I can push the conditional return decision down into a lower factor level. 07:24:59 So if foo calls bar, I can decide to return from foo from within bar. 07:25:11 Pops two return stack items, transfers control to the lower item. 07:25:31 This family of words has obviated pretty much all conditional structures. 07:25:49 I have a similar family where the action, instead of returning, is to jump back to the start of the current word. 07:26:00 That eliminates all normal looping constructs. 07:26:28 So a while loop looks like this: 07:26:43 : 0=; ... code ... loop ; 07:27:05 A do-until word would be 07:27:15 : ... code ... 0=loop ; 07:28:04 As I commented yesterday, all those usual control words just allow us to make a definition longer that ought to be factored instead. 07:33:07 siraben: this complete avoidance of physical details just isn't the direction that an electrical engineer's training goes. After all, for us it's ALL ABOUT the implementation. 07:34:32 The easy answer to the kid's question, without going into too much detail, was just, I thought, "There's not just one cons. All of them are different instances of cons, stored apart from one another, and ONE WAY OR ANOTHER the "information" about the specific numbers is "in there."" 07:38:28 f-a: I actually wrote a Python script to generate the nasm source code for the whole set of conditional returns, conditional loops, conditional double returns, etc. 07:39:02 It just systematically combined the "list of conditionals" with the list of actions to produce the entire set, which is on the order of a hundred primitives.
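KipIngram describes a Python script that cross-combined a list of conditionals with a list of actions to emit the nasm source for the whole primitive family. A minimal sketch of that generation scheme follows; the specific condition names, action suffixes, and descriptions are assumptions, and the nasm emission of the real script is omitted:

```python
# Hypothetical sketch of the generator KipIngram describes: cross a
# list of conditionals with a list of actions to name every primitive.
# The real script emitted nasm source for each one; here we only
# produce (name, description) pairs.
CONDITIONS = ["0=", "0<", "=", "!=", ">", "<"]
ACTIONS = {
    ";":    "return (pop one return-stack item)",
    ";;":   "double return (pop two return-stack items)",
    "loop": "jump back to the start of the current word",
}

def generate_words():
    """Return (name, description) for each condition/action pairing."""
    return [
        (cond + suffix, f"if {cond} holds: {desc}")
        for cond in CONDITIONS
        for suffix, desc in ACTIONS.items()
    ]

words = generate_words()
# 6 conditions x 3 actions = 18 names in this toy version; with more
# conditions and actions the full set reaches the order of a hundred.
```

The point of generating rather than hand-writing the set is that every combination stays systematic: adding one condition adds one word per action automatically.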
07:55:16 KipIngram: Continuations: Generalize any Function with a ""ret"" Parameter. Show me some examples. 08:02:45 --- join: phadthai_ joined #forth 08:02:46 --- quit: phadthai (Read error: Connection reset by peer) 08:06:57 --- quit: f-a (Ping timeout: 246 seconds) 08:08:46 --- join: f-a joined #forth 08:12:24 --- join: tech_exorcist joined #forth 08:19:21 --- quit: proteus-guy (Ping timeout: 252 seconds) 08:26:48 --- quit: Zarutian_HTC (Ping timeout: 240 seconds) 08:30:34 KipIngram: haha, they really do go into all the hairy details later! 08:30:44 Gerald Sussman is also an electrical engineer 08:31:22 But to answer the question about cons, just imagine two memory cells are being allocated 08:31:33 they point to the first and second arguments respectively 08:31:39 1 cons = 1 allocation 08:32:41 --- join: proteus-guy joined #forth 08:50:53 --- join: lispmacs joined #forth 09:02:42 --- join: Zarutian_HTC joined #forth 09:09:42 Sure - exactly. 09:09:50 But he carefully avoided saying that. 09:10:06 I just thought it was cute. 09:11:14 DKordic: I'm not sure I'm following; are you saying, consider that all words offer the possibility of choosing to return, as part of their basic operation? 09:11:54 That feels to me like mixing functionalities. In Forth + adds, - subtracts, swap swaps, and 0=; conditionally returns. We're advised not to mingle these things too much. 09:12:19 Each word is supposed to have one, completely clear, action. 09:14:24 I have a humble project going to control an ad9833 audio frequency synthesizer with FlashForth, if anyone is interested: 09:14:29 https://librehacker.com/2021/04/24/ff-ad9833-repo/ 09:28:51 this is the little board I'm using to generate the audio: 09:28:54 https://librehacker.com/2021/04/06/spi-driven-audio-circuit-w-frequency-selection/ 09:32:00 looks interesting, are there any videos showing the setup in action? 
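siraben's "two memory cells" answer to the cons question can be sketched in Python, where each call allocates a fresh two-cell pair; this also illustrates KipIngram's point that every cons is a distinct instance carrying its own information. The representation here is illustrative, not Scheme's actual one:

```python
def cons(a, b):
    # One cons = one allocation: a fresh two-cell block whose cells
    # hold (point to) the first and second arguments respectively.
    return [a, b]

def car(pair):
    return pair[0]

def cdr(pair):
    return pair[1]

p = cons(1, 2)
q = cons(1, 2)
# p and q carry the same information but are distinct allocations:
# car(p) == car(q) == 1, yet p and q are different objects in memory.
```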
09:38:52 --- quit: Zarutian_HTC (Remote host closed the connection) 09:52:10 KipIngram: to elaborate a bit more, they talk about linear memory and all, so close to assembly even 09:52:36 and how garbage collection works in terms of traversing a pointer graph 10:02:25 YOW! https://www.cs.helsinki.fi/u/jllang/Introduction_to_Lambda_Calculus.pdf ! De Bruijn Index is a succinct definition of Combinator!? ""Lambda: The Ultimate GOTO"" ( https://www.textfiles.com/bitsavers/pdf/mit/ai/aim/AIM-443.pdf ). 10:02:25 >> [ dfn ( Forth! Data Factor : ; ) ( Data ( Factor : ) ; ) ( swap : A1 A2 ; ) ( : A2 A1 ; ) ] < siraben: I just think it's an interesting contrast with the way I'd approach teaching someone about computing. I'd be inclined to start with digital logic. Gates, flip flops, registers, etc. State machines! Then show how generic decision structures can be built up from that. Processor design. Etc. 10:06:08 KipIngram: "fix" is called the Paradoxical Combinator, W/Fixed-point_combinator . 10:06:28 Number representation. Addition, subtraction, two's complement, etc. etc. etc. 10:07:24 Peano arithmetic is so far from what computers actually DO that I'd likely never mention it. 10:09:13 KipIngram: >> [ dfn ( not P True False ) ( P False True ) ] < I'm not able to follow that yet, but I opened that link for review later today. Still too early to take something like that on - it looked fairly abstract. 10:10:43 I saw some of the same kind of math that is my current barrier in my own "self education" in physics. 10:10:46 Show me the if. 10:10:54 ? 10:10:58 What do you mean? 10:11:09 You mean how I'd replace an IF THEN? 10:11:41 Ok, so if in standard Forth one would do this: 10:12:10 : foo ..a.. 0= if ..b.. then ..c.. ; 10:12:13 I'd do this: 10:12:19 Sorry, I meant to say W/Material_conditional . 10:12:31 : foo ..a.. bar ..c.. ; 10:12:39 : bar 0!=; ..b.. ; 10:12:55 Notice the reversal of the condition. 10:13:36 bar would be suitably named to imply what it was doing. 
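KipIngram's foo/bar factoring has a direct analogue in languages with early return: the factored-out word opens with a guard on the inverted condition. A hedged Python rendering follows; the arithmetic filling the ..a.., ..b.., ..c.. placeholders is invented purely so the two versions can be compared:

```python
# Structured version:  : foo ..a.. 0= if ..b.. then ..c.. ;
def foo_structured(x):
    a = x + 1            # "..a.." (placeholder computation)
    if a == 0:           # 0= if
        a = a - 10       # "..b.."
    return a * 2         # "..c.."

# Factored version:  : foo ..a.. bar ..c.. ;   : bar 0!=; ..b.. ;
def bar(a):
    if a != 0:           # 0!=;  -- the inverted condition guards a return
        return a
    return a - 10        # "..b.." runs only when a == 0

def foo_factored(x):
    a = x + 1            # "..a.."
    a = bar(a)           # conditional work pushed into a named factor
    return a * 2         # "..c.."
```

Both versions compute the same results; the conditional has simply moved to the head of a separately named factor, as in the Forth example, with the condition reversed.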
10:13:51 and its conditionality. 10:16:59 KipIngram: this is actually a tension I noticed TAing for my uni's intro-fp class -- the previous classes had all been explaining things in a machine-oriented way, and a lot of students got tripped up /hard/ by the differences between java/python/c/c++ and ocaml 10:17:15 >> [ dfn ( eor P Q True False ) ( P ( Q False True ) ( Q True False ) ) ] < which I sorta attribute more to ocaml being taught as "just another PL" instead of "here's some machine-executable math notation", than to "knowing what an allocation is poisoned the students' minds" 10:21:03 I forgot to mention W/Combinatory_logic . 10:21:35 tbh it'd be really neat to see a curriculum that starts with very low-level and very high-level langs at the same time (Forth+asm and Haskell+Scheme?), and as you progress you sorta meet in the middle, w/ the program finishing with "write a $high_level_lang compiler in $low_level_lang" 10:50:40 --- quit: gravicappa (Ping timeout: 265 seconds) 11:07:23 remexre: Sure - we "shape ourselves" to our training. A hard gear shift is a challenge to even a good student. 11:08:59 I'm an EE. I was exposed to logic gates first. I did study FORTRAN before studying assembly, but only one course in it. 11:10:03 What REALLY sculpted me, though, was my summer internship after freshman year at General Dynamics. A contractor working there named Ed sort of "took me under his wing" and mentored me. Loaned me books on system software. So I learned that summer how an assembler works, and how a loader works, and how operating systems do the major aspects of what they do, and so on. 11:10:27 It became crystal clear to me that everything a computer can actually DO (in hardware) is almost ridiculously simple. And that everything else is just building up from there. 11:10:42 It was a real eye-opening time for me, but probably only along a certain "knowledge front."
11:12:14 I think that once you really see how a state machine, the way you learn them in logic course, can manage the resources of a computer, you've uncovered the key mystery. After that there are just "alternative clevernesses." 11:13:34 I think where this training could perhaps be better is if they tried to prevent the mental wall between code and data from appearing. Obviously, when you really understand what the thing is doing, you realize that what's code at one moment is data at another. But nonetheless, that dichotomy creeps in, and then it's hard to tear it down again when you start to study something like Lisp. 11:16:07 I'll go back to the Peano arithmetic again - when your first training was in things like gates and adders and so forth and so on, the very idea that someone would introduce addition as repeated increment/decrement operations... it's hard to not see it as silly because of its inefficiency. 11:16:23 When "we all know" how it really works. :-) 11:16:45 I'm really trying to describe my own limitations here, that I'm struggling to overcome - not passing judgement on the course 11:17:52 This does point up a shortcoming in the common math training we give youngsters, though. I don't think many kids finishing high school REALLY grasp positional notation in all its beauty. 11:18:02 It's seriously just about the most amazing thing ever conceived. 11:18:28 We could not be where we are without it. 11:19:05 It's then natural to introduce binary arithmetic as the version of positional notation that is practical for computers to use, because of the bi-level nature of the circuitry. 11:26:59 Well, I guess spoken and written language itself ranks higher. 11:27:07 Since you can't get anything else without that. 11:44:33 `` To iterate is human, to recurse divine. '' -- L. Peter Deutsch 11:47:27 KipIngram: That is why You will find "fix" enlightening. 
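The "fix" DKordic keeps recommending can be made concrete. In a strict language one uses the Z combinator (the call-by-value form of the paradoxical Y combinator); the following Python sketch uses factorial as the usual demonstration. This is the standard construction, not something quoted from the channel:

```python
# Z, the strict fixed-point combinator: Z(F) behaves as a fixed point
# of F, letting a recursive function be built without naming itself.
Z = lambda F: (lambda x: F(lambda v: x(x)(v)))(
              lambda x: F(lambda v: x(x)(v)))

# Factorial, written with no self-reference in its own body: "self"
# is supplied by Z rather than by the function's name.
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
# fact(5) evaluates to 120
```

The eta-expansion (lambda v: x(x)(v)) is what keeps the strict evaluator from looping forever; under lazy evaluation the plainer Y combinator suffices.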
12:20:02 --- join: Zarutian_HTC joined #forth 12:27:58 --- quit: andrei-n (Quit: Leaving) 12:52:28 --- join: andrei-n joined #forth 13:01:48 That presentation linked earlier doesn't make lambda calculus any more clear to me. It still just seems to take simple ideas and render them in some extremely complex way that I can't yet find any sense in. 13:02:22 For one thing, it always seems far more complex than whatever concept it says it's representing. 13:02:41 I've yet to come across a reference that REALLY EXPLAINS it in a clear and concise fashion. 13:03:58 For example, it says that "f compose g" is (Lvwx.v(wx))fg where L is the lambda. 13:04:12 I look at that and just think, "HUH????" 13:04:38 Maybe it's one of those things that will seem obvious once I "get it," but so far that hasn't happened. 13:06:54 "LC doesn't burden us with boring and error-prone technical details, such as memory management and instructions sequencing." I find those aspects of a system to be FASCINATING. Not boring. 13:08:01 Ok, maybe it just hadn't gotten around to trying to explain it yet. Maybe it considered that other stuff... "motivational." So I'm retracting my judgement temporarily. 13:21:08 --- quit: mtsd (Ping timeout: 252 seconds) 14:07:46 --- join: Croran joined #forth 14:08:35 --- quit: andrei-n (Quit: Leaving) 14:28:54 --- join: f-a joined #forth 14:34:50 Ok, so so far this seems like it's mostly about simplifying expressions. Is there more to it than that? I think it would help to know what the pot of gold at the end of the rainbow is. 14:34:58 I'm referring to lambda calculus here.
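The term KipIngram quotes, (Lvwx.v(wx))fg, can be transcribed almost symbol for symbol into Python lambdas, which may make the nesting easier to see; f and g here are arbitrary functions chosen purely for illustration:

```python
# (Lvwx.v(wx)) is a curried three-argument function; applying it to
# f and then g leaves a one-argument function x -> f(g(x)).
compose = lambda v: lambda w: lambda x: v(w(x))

f = lambda n: n + 1
g = lambda n: n * 2
h = compose(f)(g)    # the whole expression (Lvwx.v(wx)) f g
# h(3) = f(g(3)) = f(6) = 7
```

Reading it this way, the lambda term is nothing more exotic than "a function that takes f and g and hands back their composition."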
14:52:08 idk abt in those lectures in particular, but it's simpler to write down than most other turing-complete formalisms while being fairly mathematician-writable 14:52:23 and perhaps even human-writable :) 14:53:18 --- quit: tech_exorcist (Quit: tech_exorcist) 14:53:26 and lambda calculus plus macros is more or less a completely practical language 14:53:55 --- quit: jedb (Quit: Leaving) 14:54:18 --- join: jedb joined #forth 14:56:49 (well, add conses and fixnums to that and you've got lisp) 14:56:51 --- quit: Croran (Quit: Croran) 15:38:11 --- quit: tabemann (Remote host closed the connection) 15:47:16 --- quit: f-a (Quit: leaving) 16:11:09 --- join: tabemann joined #forth 16:35:11 --- join: dave0 joined #forth 16:36:41 maw 16:57:35 --- quit: dave0 (Quit: dave's not here) 17:37:04 --- join: mark4_ joined #forth 17:37:04 --- quit: mark4_ (Client Quit) 17:40:29 remexre: they eventually got around to explaining, and I'm grokking it ok so far. 18:01:54 --- join: boru` joined #forth 18:01:57 --- quit: boru (Disconnected by services) 18:01:59 --- nick: boru` -> boru 18:10:49 --- join: mark4 joined #forth 18:16:43 --- mode: ChanServ set +v mark4 18:59:50 --- join: boru` joined #forth 18:59:52 --- quit: boru (Disconnected by services) 18:59:55 --- nick: boru` -> boru 19:12:43 --- join: dave0 joined #forth 19:14:45 maw 19:17:47 KipIngram: that's interesting yes, I would only recommend people learn computing like that if they really are learning CS as their main pursuit 19:18:41 I'm reminded of helping a finance major who was taking CS as a minor learn about state machines and logic gates for a class and she ended up dropping the minor :( 19:18:54 it's just a lot to handle when CS is not one's main pursuit 19:20:58 KipIngram: re: peano arithmetic, you're right, it's very far removed from actual arithmetic, and even in the FP community no one uses it, but it's the simplest number system that also has a simple inductive principle 19:21:30 what happens in practice is binary 
numbers are encoded, the equivalence of unary and binary number systems is established and all the proofs can get carried over for free from the unary world 19:25:36 DKordic: fix FTW, I was concerned about its metatheoretical properties but denotational semantics just says it's the least fixed point 19:25:48 it's quite something to see the semantics of a while loop expressed as a least fixed point 19:29:25 > For example, it says that "f compose g" is (Lvwx.v(wx))fg where L is the lambda. 19:29:25 read that as (define (compose f g) (((lambda (v) (lambda (w) (lambda (x) (v (w x))))) f) g)) 19:29:35 I guess you can see why that notation is superior to this :P 19:30:46 --- quit: rixard (Read error: Connection reset by peer) 19:41:38 --- join: rixard joined #forth 19:44:09 --- quit: sts-q (Ping timeout: 246 seconds) 19:54:26 KipIngram: My "dfn" of "Forth!" is a Composition. 19:54:29 siraben: Semantics of Natural Numbers?! 19:55:29 Let Us start with them. Or something else? 19:57:17 --- join: gravicappa joined #forth 19:59:35 DKordic: What do you mean semantics of natural numbers? 19:59:45 --- join: sts-q joined #forth 20:44:06 --- quit: gravicappa (Ping timeout: 245 seconds) 20:53:00 siraben: I can not remember any statement mentioning semantics that made any sense to me. I completely disagree ""peano arithmetic"". 20:54:17 siraben, KipIngram: Show me some simple add, sub, mul, and div Implemented as Combinators . 20:54:58 siraben: Your (most recent) ""compose"" is Scheming. 20:55:14 DKordic: yes it's scheme, since KipIngram is more familiar with that syntax 20:55:22 DKordic: peano arithmetic is simple, that's really all there is to it, it's not designed for efficiency 20:56:04 What might be wrong with it? 20:56:21 What criteria are you judging it by?
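The Peano arithmetic being debated, numbers built from zero by repeated successor with addition defined by recursion on the first argument, fits in a few lines of Python; the tuple encoding of numerals here is an arbitrary illustrative choice:

```python
# Peano numerals: ZERO, S(ZERO), S(S(ZERO)), ...
ZERO = ()

def S(n):
    """Successor: wrap the numeral one level deeper."""
    return ("S", n)

def add(m, n):
    # The inductive definition: 0 + n = n;  S(k) + n = S(k + n).
    if m == ZERO:
        return n
    return S(add(m[1], n))

def to_int(n):
    # Convert a numeral back to a machine integer for inspection.
    count = 0
    while n != ZERO:
        count, n = count + 1, n[1]
    return count

two = S(S(ZERO))
three = S(S(S(ZERO)))
# to_int(add(two, three)) evaluates to 5
```

Addition really is repeated increment here, which is exactly why it looks inefficient to hardware-minded eyes and why its value lies in the inductive proof principle rather than in execution speed.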
20:58:28 simple inductive principle: to show something holds for all peano numbers, show it's true for 0, and show it's true for k+1 assuming it's true for k, then it's true for all n 20:58:33 My question about ""compose"" is not about ""syntax"". (IMHO) Your statement about ""efficiency"" of ""peano arithmetic"" is _not even wrong_. 21:07:32 --- join: mtsd joined #forth 21:53:23 --- join: wineroots joined #forth 22:01:44 --- join: andrei-n joined #forth 22:22:43 KipIngram: oh yeah, the _only_ thing you can do in lambda calculus is substitution 22:23:48 remexre: I'm working through Programming Language Foundations now and it's nice to be able to formulate Hoare logic and use it to show imperative programs satisfy a desired behavior 22:25:20 oh, nice; my reading group gave up after the Equiv chapter 22:25:51 --- quit: mark4 (Quit: Leaving) 22:26:44 --- join: mark4 joined #forth 22:27:57 (nothing against that chapter, we'd just been doing coq for months and some people had dropped out from being sick of it) 22:41:51 --- nick: phadthai_ -> phadthai 23:03:10 --- quit: jedb (Quit: Leaving) 23:03:31 --- join: jedb joined #forth 23:07:27 --- join: gravicappa joined #forth 23:21:39 remexre: I had to grind through Equiv, I'm not totally done with it (the exercises involving the HAVOC construction were much harder because you couldn't just finish the proof by using determinism of evaluation) 23:21:48 I think the verified functional algorithms chapter is a nice progression from vol 1. 23:22:00 much more automation which makes you feel better about scaling up your proofs 23:22:12 vol 1 and 2 have quite manual proofs 23:26:40 One has to be sick in order not to get sick from Coq. 23:27:12 DKordic: what do you mean? 23:27:33 Coq is good for nothing!! 23:28:09 you are being hyperbolic? 23:28:36 are you* 23:29:53 IMHO Coq is an application of Ad Nauseam, like UNIX was for hardware of its day. 23:31:02 I don't understand, do you work directly with Coq?
It's used for a wide variety of applications 23:31:16 Certainly it's been used to formalize many famous non-trivial results in mathematics 23:31:36 (this would perhaps be a better conversation to have in #coq) 23:35:32 DKordic: It would be nice if you elaborated on its perceived technical shortcomings 23:36:45 siraben: I might be wrong. Please enlighten me with some introductory examples. 23:37:17 DKordic: I'm not understanding what aspects you are criticizing 23:37:30 I can address those specific points perhaps 23:40:42 What is straightforward in Coq? 23:44:22 DKordic: well that depends on your experience and prior knowledge 23:44:41 What's straightforward in Forth, or C++? for instance 23:45:06 maybe it would be good to motivate why coq (or any other theorem prover) is needed in certain domains 23:46:45 in mathematics, mistakes in proofs can be catastrophic especially if more work depends on it, directly or indirectly, and having machine-checked proofs guarantees the proof is valid 23:47:22 in programming language theory, similarly, soundness & completeness proofs of type systems can get pretty hairy, and you can also machine-check the proofs to be completely sure 23:47:45 I would like to "PowerON" my Computer into a Forth REPL. Then do everything from there. 23:48:11 I thought we were talking about the purpose of Coq? 23:48:34 I don't understand what you mean when you say "Coq is good for nothing", I would like to hear you elaborate 23:59:59 --- log: ended forth/21.04.25