00:00:00 --- log: started forth/10.02.17
00:43:19 --- join: kar8nga (~kar8nga@jol13-1-82-66-176-74.fbx.proxad.net) joined #forth
01:34:52 --- quit: alex4nder (Quit: leaving)
01:40:04 --- quit: kar8nga (Remote host closed the connection)
02:10:02 --- join: madwork_ (~madgarden@204.138.110.15) joined #forth
02:13:17 --- quit: madwork (Ping timeout: 276 seconds)
03:08:51 --- join: GeDaMo (~gedamo@dyn-62-56-89-110.dslaccess.co.uk) joined #forth
03:54:36 --- part: TR2N left #forth
04:36:23 --- quit: GeDaMo (Quit: Now I lay me down to sleep; Try to count electric sheep)
05:14:43 --- join: tathi (~josh@dsl-216-227-91-166.fairpoint.net) joined #forth
05:24:00 --- quit: nighty^ (Quit: Disappears in a puff of smoke)
05:43:31 Asau': I just knew you'd be the one to comment on my "stack->RPN" remark.
05:43:43 I'll have to take a look at Poplog.
06:05:08 Asau: Any good links to a "definitive overview" of Poplog syntax? All I seem to find is sites that talk about what it "is", generally, but nothing with specifics on usage.
06:05:40 Somewhere at http://www.cs.bham.ac.uk/research/projects/poplog/freepoplog.html
06:06:38 Oh, I found something. Ugh - it seems to give away all of the advantages of RPN. It has explicit parameter lists, and if it uses a stack I guess the system has to work out how to flow the math operations.
06:06:45 Like the example I see is this:
06:06:57 It still provides you with an open stack.
06:07:02 : define drizzle(x,y) -> result;
06:07:09 2 * x + y - result;
06:07:12 enddefine;
06:07:35 How is that better than : drizzle 2 * + ;
06:07:58 It uses mathematical notation.
06:08:12 Thus it is absolutely better.
06:08:20 I disagree.
06:08:25 I'd pick Forth over that any day.
06:08:29 Absolutely any day.
06:08:46 --- join: nighty^ (~nighty@x122091.ppp.asahi-net.or.jp) joined #forth
06:09:26 Asau: So, I have a question.
06:09:26 Translate this into Forth:
06:09:54 Why are you here? You seem to think these other tools are so much better than Forth; why aren't you just using them, hanging out on their IRC channels, and so on?
06:09:57 What do we offer you?
06:10:00 S(T, x; a) = RT (x ln x + (1-x) ln (1-x) + a x(1-x))
06:10:54 Grr.
06:11:02 One sec; translating.
06:11:04 --- quit: Deformative (Ping timeout: 260 seconds)
06:11:08 S(T, x; a) = RT (x ln x + (1-x) ln (1-x)) + a (1+bT) x(1-x)
06:12:19 What's b? I don't see it referenced on the left side.
06:12:32 No problem.
06:12:36 S(T, x; a, b) = RT (x ln x + (1-x) ln (1-x)) + a (1+bT) x(1-x)
06:15:49 In the application that uses this, would 'a', 'b', and 'T' be "fixed" in a greater sense than x? Or are all of these to be regarded as fully fluid independent variables?
06:16:03 I'm asking because I want to know if one would expect any of those three to be stored in variables.
06:16:13 That would matter in an actual implementation.
06:16:20 It shouldn't matter.
06:16:27 They may be, and they may not be.
06:16:35 It's a matter of the moment.
06:16:52 Of course it matters; if they're in variables (so that I don't have to put them there myself) then I can get them when I want them with "a @", for instance.
06:16:59 No, it doesn't matter.
06:17:18 If you want all of this on the stack then obviously this is a tough juggle. I'd hate to have to resort to PICK.
06:17:19 I worked with those for a pretty long time.
06:18:42 Ok, so start with this:
06:18:55 : xlnx dup ln * ;
06:19:01 That will be handy.
06:19:39 --- join: Deformative (~joe@141.212.202.207) joined #forth
06:19:54 Quite stupid because it has no physical meaning at all.
06:20:13 Alright, go on.
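A side note on the drizzle comparison at 06:07:35, for readers following along: a sketch of the same word with the conventional stack-effect comment spelled out. The name drizzle comes from the transcript; the comment and the swap (needed so that x pushed first, y on top, yields drizzle(x,y) = 2*x + y) are the sketch's own additions.

   : drizzle ( x y -- 2x+y )   swap 2 * + ;
   3 4 drizzle .    \ prints 10
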
06:20:47 I haven't decided yet how a, b, and T will go on the stack, but when you call S, x will be on top. So we will start with this:
06:21:10 : S 1 over - xlnx swap xlnx + RT
06:21:36 That gets RT( x*ln(x) + (1-x)*ln(1-x)) on top of the stack, I think.
06:21:42 R is the Mendeleev constant.
06:22:10 Oh; I couldn't tell without any explicit * symbols. So that would definitely be in a variable.
06:22:13 Or a constant.
06:22:27 So then we'd start with this:
06:22:43 : S 1 over - xlnx swap xlnx R *
06:23:10 I want T to be the second element on the stack on entry. So:
06:26:30 : S dup >r 1 over - dup >r xlnx swap xlnx + R * over * >r * 1 + * r> swap r> * r> * + ;
06:26:42 Ok, that might be off somewhere but it's certainly close.
06:27:02 And yes, we all know that juggling a large number of parameters on the stack isn't one of Forth's strong points.
06:27:37 But see here, somehow any compiler has to solve this problem. One of the philosophies of Forth is that you do these things yourself rather than have the system do them behind your back.
06:27:40 Now modify it to work with real numbers instead of integer ones.
06:27:47 It forces you to face the complexity of the problem so that you're aware of it.
06:28:34 No, I'm done coding for the day now. Just use the floating point stack rather than the integer stack and the integer stack rather than the return stack.
06:28:35 Writing an infix to postfix, prefix, or AST translator is:
06:28:41 1) a well-known task;
06:28:44 2) a one-time task.
06:29:05 "Just use" doesn't pass.
06:29:10 and then the programmer tends to forget that that work has to be done every time the expression is evaluated.
06:29:15 ">r" and "r>" are essentially integer.
06:29:35 You wouldn't use those; you'd use >F and F>, or whatever they're called.
06:29:48 How so?
06:29:52 You're not making any progress here toward changing my mind.
06:30:03 And by the way, you never answered my question about why you hang out here.
06:30:07 You seem to hate Forth.
06:30:08 Do you propose to introduce a second FP stack?
06:31:12 Oh, I see what you're saying. No, I'd actually probably put all of these things into variables, except for x, and reference them as needed. That felt like cheating, though.
06:31:39 There are two problems with your solution even if we restrict ourselves to integer numbers.
06:31:44 As I said, we all know that juggling more than a couple of things on the stack is hard for Forth. And harder still with floats, because of the point you've just made.
06:31:46 Or to single-cell real ones.
06:32:17 The first problem is that the structure of the expression isn't obvious to a programmer.
06:32:23 The expression is too long.
06:32:46 Sure - it's obtuse. You're essentially looking at what would be the *output* of a compiler, not the *input*.
06:33:10 But this is the restriction of the source language.
06:33:12 I've already told you that's a strength of Forth, not a weakness - Forth keeps the complexity of the task at hand (the *runtime* complexity) front and center and in your face.
06:33:35 There's a very direct correspondence between the code you write and the work the machine has to do.
06:33:40 There is nothing complex in handling the original mathematical notation.
06:34:20 You can easily count the number of multiplicative and/or additive operations
06:34:27 without resorting to postfix notation.
06:34:54 Look, you've picked a corner case that maximizes the power of your argument. But in many, many other cases (like the one I listed above) Forth wins. You have to look at the global situation.
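A minimal sketch of the floating-point-stack version suggested at 06:28:34, assuming an ANS Forth with the FLOATING and FLOATING EXT wordsets (fdup, fover, fswap, fln, f*, f+, f-, fconstant, fvariable). The constant values are placeholders, and parking T in an fvariable while x stays on the FP stack is one illustrative arrangement, not the one the speakers settled on.

   8.314e fconstant R          \ gas ("Mendeleev") constant, placeholder units J/(mol*K)
   1e     fconstant a          \ placeholder value, see 06:15:49
   1e     fconstant b          \ placeholder value
   fvariable Tv
   : fxlnx ( F: x -- x*ln[x] )   fdup fln f* ;
   : S ( F: T x -- S )
      fswap Tv f!                     \ x             keep x on the FP stack, T in a variable
      fdup fxlnx                      \ x x*ln(x)
      fover 1e fswap f- fxlnx f+      \ x M           M = x*ln(x) + (1-x)*ln(1-x)
      R f* Tv f@ f*                   \ x R*T*M
      fswap fdup 1e fswap f- f*       \ R*T*M x*(1-x)
      b Tv f@ f* 1e f+ a f*           \ R*T*M x*(1-x) a*(1+b*T)
      f* f+ ;                         \ S
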
06:35:03 This isn't a corner case.
06:35:27 And you're still making no progress in convincing me, and you still haven't answered my other question.
06:35:38 If you recall the flow of the discussion, you raised the application of mathematical notation.
06:36:29 What I've done is bring up a _real_ example of a quite simple expression.
06:38:06 I still contend, though, that we took this expression out of any context. In a real program there's no telling how I'd decide to arrange the data. I might have RT, bT, and a whole host of other things "precomputed" and stored away, because I was going to use them a bunch of times in this part of the code.
06:38:37 I might have even stored a*(1+bT) away. It just depends on the global situation and what makes overall sense in solving the problem.
06:38:59 In a _real_ program there's no need to depend on data representation.
06:39:03 The nastiness of all this stack manipulation would have been one of the things that drove those decisions.
06:39:54 In all real world applications you have to deal with both direct and inverse problems at the same time.
06:40:53 That's why it doesn't matter if you implement it as (T, x) -> (a, b) -> S
06:40:53 or as (a, b) -> (T, x) -> S
06:41:49 If you use mathematical notation, it doesn't matter at all.
06:42:05 Well, I have no familiarity with those problems at all, so I wouldn't know. But part of good Forth coding is to consider the *whole problem* and craft a *whole solution* that works well. It's not about looking at one bit of the problem and making any hard and fast decisions based just on that.
06:42:13 You're not going to answer my other question, are you?
06:42:17 You're just going to ignore it.
06:42:52 It doesn't matter at all.
06:43:04 No, it doesn't. I'm just curious.
06:43:54 If you resort to this kind of question, it means that you
06:43:54 have lost the argument and want to have some kind of retaliation.
06:44:17 Quite stupid for a man of your age.
06:44:40 I'd rather forget it.
06:44:51 --- join: GeDaMo (~gedamo@dyn-62-56-89-110.dslaccess.co.uk) joined #forth
06:45:09 No, I'm not conceding the argument at all. As I said, nothing we've discussed changes my mind about my choice of tools.
06:45:43 But I'm not going to run over to #lisp or #poplog and try to diss what those guys like. ;-)
06:45:50 --- quit: nighty^ (Quit: Disappears in a puff of smoke)
06:51:50 ASau`: Not to argue, but I've also always been curious about what keeps you coming back to Forth.
06:53:38 Oh, tathi; you just lost the argument and you weren't even in it... ;-)
06:53:53 I can understand being frustrated with the broken implementations, but it seems like it's more than that.
06:54:45 I can't know for sure, but to me it seems like the same sort of mindset that Deformative attributed to the advisor guy he talked to. Just no interest in anything minimalist. More complex -> better.
06:55:05 KipIngram: Heh. No, I'm just curious. I like Forth, but it really sucks for some things.
06:55:06 Simple -> stupid, misguided, archaic, etc.
06:55:47 Yeah, I agree.
06:56:05 --- quit: Deformative (Ping timeout: 256 seconds)
06:56:39 Although, I imagine that if someone really wanted to they could code up a word called "algebra" that translated an algebraic expression into Forth at compile time.
06:56:52 Then you wouldn't actually see the nasty in the source code.
06:57:20 Oh, sure, that has been done. Julian Noble's FTRAN, for instance. Or one of the Andrews(?) did something for Euroforth a year or two ago.
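A sketch of the "precompute and store" arrangement mentioned at 06:38:06 and 06:38:37, reusing fxlnx and the constants from the earlier sketch. The names set-T, RT, abT and Sx are hypothetical: recompute the temperature-dependent factors once whenever T changes, and the per-x evaluation then only touches x.

   fvariable RT   fvariable abT
   : set-T ( F: T -- )                 \ call once whenever T changes
      fdup R f* RT f!                  \ RT  = R*T
      b f* 1e f+ a f* abT f! ;         \ abT = a*(1+b*T)
   : Sx ( F: x -- S )                  \ per-x evaluation after set-T
      fdup fxlnx  fover 1e fswap f- fxlnx f+  RT  f@ f*
      fswap fdup 1e fswap f- f*        abT f@ f*  f+ ;
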
06:57:30 You'd get the "write it once" advantage that Asau touted for other languages, but you would of course have to pay attention to the complexity of the code generated lest that bite you performance-wise.
06:58:40 The trouble I see is that most long-time Forthers seem to be stuck at about 1980 -- minimalism is one thing, but most Forth code that I've seen is specialized to the point of being useless outside of the program it was written for.
06:58:44 The fact that Forth does, in fact, have shortcomings that no one has worked out a good way (yet) to insulate us from is one reason I find it hard to completely get away from C.
06:59:02 I wish I could use Forth for everything, but there are enough roadblocks still in place that I can't.
06:59:18 I don't have a feel for how many of them might be resolvable, but I know I don't have time to work on all that.
06:59:24 And these days there's enough computer power for a lot of things that programmer time is often more important than cpu time, and Forth falls down badly there.
06:59:27 Yeah.
06:59:47 In small embedded situations I can see that it might be different, but I don't have any experience there.
07:00:12 But I imagine that if I'd invoked FTRAN to solve Asau's challenge above he would have cried foul, even though every language he'd prefer would have precisely similar code under the hood.
07:00:46 Yes, minimizing programmer time and error is what it's all about in the mainstream these days.
07:00:58 --- join: nighty^ (~nighty@x122091.ppp.asahi-net.or.jp) joined #forth
07:01:31 Almost all of my software development experience is in the embedded world. But I'm not a software engineer -- I'm actually an electrical engineer by training.
07:01:36 --- quit: gogonkt (Ping timeout: 256 seconds)
07:01:40 I've had some formal software training and a lot of self-training.
07:01:59 So I've never been brainwashed into the party line.
07:02:23 I tend to bring the same mindset to software design that I do to hardware design.
07:03:22 --- join: gogonkt (~info@59.38.223.14) joined #forth
07:03:22 Ok, got to work. I'll be on, but I've got to stop talking so much.
07:03:28 Yeah, me too.
07:42:00 --- join: Deformative (~joe@bursley-183118.reshall.umich.edu) joined #forth
07:42:03 Hi all.
07:44:41 Hey.
07:44:56 How's it going?
07:45:08 Good. Just working.
07:46:16 I want to try hypnosis.
07:47:08 I fell asleep in class, and I have bruxism, so I was sleeping without my mouth guard and now I am in excruciating pain.
07:47:32 And hypnosis seems to show some slight results in "curing" bruxism.
07:47:36 Heh.
07:49:31 --- join: qFox (~C00K13S@5356B263.cable.casema.nl) joined #forth
07:51:31 Ugh, President Obama will deliver the commencement address this spring. Not a fan.
07:54:40 KipIngram: Run across any neat programming articles lately?
07:55:21 I saw one about a minimal meta-circular lisp interpreter which had speeds competing with the complex ones, but I was stupid and closed my web browser before I read it.
07:55:24 So now I cannot find it.
07:55:47 Your history doesn't have it?
07:56:35 Oh, good idea.
07:56:37 * Deformative checks.
07:57:34 Hooray!
07:58:14 --- nick: madwork_ -> madwork
07:58:42 So KipIngram, what computer concepts are you interested in other than forth?
08:00:20 Well, processor design, but I tend to mingle Forth into that as well.
08:00:47 Yeah, processor design seems like a ton of fun.
08:00:58 I would educate myself, but I don't need another set of classes that are like the programming ones.
08:01:13 Oh, I wanted to point out an analogy to you, since you're versed in software architecture but are getting into computer architecture.
08:01:30 Programming class is torture to sit through; in processor design you're actually learning.
08:01:40 We were discussing game coding, and touched on how those guys have a lot of neat tricks.
08:01:50 Yep.
08:02:11 Let's define "trick" as structuring the game operations (the algorithms) to take advantage of the hardware's performance features.
08:02:32 Mhmm.
08:02:41 Processor design is a lot the same - you structure the operations (your instruction set) to take maximum advantage of the gate-level capabilities your hardware has.
08:02:54 You craft an instruction set that will "work well" with your hardware.
08:03:10 If you're designing a processor as a custom thing (from transistors) you can define the hardware completely.
08:03:43 But if you're putting one in an FPGA then each manufacturer offers a different suite of hardware strong points.
08:04:13 Choosing the instructions wisely, mapping them into the bit fields wisely, etc., such that you get the top performance is the name of the game.
08:04:22 But it's really the same game that the game coders play, just at a lower level.
08:04:25 Thought you might like that.
08:04:35 From what I understand, the trend is that the instruction sets are never designed to take advantage of hardware anymore. Companies just buy licenses for using ARM instruction sets and implement them on their hardware.
08:06:03 Hmmmfff... When you know me better you'll know what I think of such things.
08:06:17 Heh.
08:06:28 How very "me too"...
08:06:43 Yeah, it is less fun, but ARM isn't a half bad instruction set.
08:06:46 Seems like no one really *thinks* / takes risks anymore.
08:06:52 No, certainly not.
08:07:07 And it is nice to be able to run the same code on pretty much every embedded platform on the market.
08:07:08 For one thing, I don't like licensing stuff.
08:07:50 In the applications I care about no one else will be playing. This will go into my company's down-hole tools, and nowhere else.
08:08:05 Don't need to be able to run it on someone's PDA.
08:08:08 Well, licensing seems to work well for Qualcomm; they are fabless and they license out their chips. ^^
08:08:16 Ah, yes, that is very understandable.
08:08:52 And, for example, I'm more interested in precision timing than I am in maximum performance. Well, I'm personally interested in high performance, but here at work I'm focused on timing precision.
08:09:08 I see.
08:09:20 So my processor will march along in a very, very predictable way, so that I will know *exactly* when things happen without having to mess with timers, interrupts, etc.
08:09:40 It will be sort of a simple processor / very advanced programmable logic controller hybrid.
08:09:45 But how many of these processors would you need? Few? Wouldn't it be more economical to just have overpowered processors than design an optimal one?
08:10:13 My goal is to tuck this thing into every FPGA that we field, and eventually no longer need a separate processor at all.
08:10:24 Overpowered = shorter battery life.
08:10:31 Indeed.
08:10:36 Some of these tools are battery powered, and the longer they can run downhole the better.
08:11:01 I am so looking forward to when ARMs compete with Intel; it is getting close now.
08:11:10 Has the potential to re-open the market.
08:11:36 ARM chips have so many companies backing them now.
08:12:33 Yeah, I think that trend and some of the trends in display technology like we were discussing the other day will bring on a really cool series of products in the next few years.
08:12:52 The processor in the Nexus One is more powerful than the P3 in my old laptop, yet it doesn't even have a heat sink and is cold to the touch, where my P3 would constantly overheat.
08:13:51 Ugh... I get so annoyed when people say "Let's solve performance issues with FFI!"
08:14:09 FFI?
08:14:18 Foreign function interface.
08:14:30 That definitely should not be what it's for...
08:14:52 It should be for when an API is not available in the desired language, not because the desired language is too slow to do anything on its own.
08:15:26 I agree with you.
08:17:56 --- join: erider (~chatzilla@pool-173-69-160-231.bltmmd.fios.verizon.net) joined #forth
08:18:15 --- quit: erider (Changing host)
08:18:15 --- join: erider (~chatzilla@unaffiliated/erider) joined #forth
08:19:21 I think I put too much hope into LLVM.
08:19:28 I really shouldn't, but it just seems so cool.
08:19:41 I know I will just end up disappointed.
08:22:50 Hmmm. Don't really know anything about it. Found the website, though, and at least it has an air of excitement about it. Seems like they have some good goals.
08:22:51 The concept of a general purpose VM, with little to no overhead, capable of JIT and PORTABLE self-modifying code, oh goodness.
08:23:11 But it is a VM, so.... "Little to no overhead" is unlikely at best.
08:23:48 Initial benchmarks are showing promising results.
08:24:10 But I have a feeling the optimization curve is going to plateau early.
08:24:36 Sooner or later with optimization comes bloat/size, and with bloat comes slowness.
08:25:05 What do you expect the causes to be? Because that's exactly where getting to design your processor and its instruction set could help. Identify the problems and then give the hardware a way to solve them.
08:25:51 I don't mean "managerial" or "political" issues; I mean what, in theory, would come up as the fundamental bottleneck in running a really lean and mean VM?
08:26:01 That would be particularly interesting, writing a processor capable of running LLVM bytecode natively.
08:26:14 There you go.
08:26:21 My kind of fun.
08:26:34 Heh.
08:26:59 Prototype it first in software, then in an FPGA, and if things turn out well design a chip.
08:27:25 You can get everything you need to get through the FPGA prototype for just a few hundred bucks.
08:27:34 Great, *great* college project, if you ask me.
08:27:36 University would fund it. :D
08:27:50 They have all the resources I could ever ask for.
08:27:53 You should do it.
08:28:09 Yeah, maybe graduate school.
08:28:14 The digital logic stuff isn't that hard to master.
08:28:36 I was planning on being compilers and operating systems focused rather than processor design focused.
08:28:37 The hardest part, for me at least, is staying on top of the FPGA vendor toolchains; they keep changing them and I have to learn new stuff.
08:28:59 Focus on both, and on how they interact and how you can get better results if you "co-design" them.
08:29:15 It is hard to fit both into 128 credits.
08:29:26 So like I said, it would probably need to wait for graduate school.
08:31:12 I think my major design experience for my undergrad is going to be a compiler.
08:31:23 A C compiler with self-modifying code and lambdas or something. :D
08:33:06 Well, I need to go eat before class.
08:33:08 Talk to you later.
08:37:55 --- quit: Quartus` (Ping timeout: 246 seconds)
08:53:12 --- quit: Deformative (Ping timeout: 265 seconds)
09:01:45 --- quit: erider (Ping timeout: 248 seconds)
09:05:10 --- join: forther (~62d2faca@gateway/web/freenode/x-zwlkmberwzrumflo) joined #forth
09:05:22 hi all
09:09:04 --- join: ASau (~user@83.69.227.32) joined #forth
09:09:19 --- join: Deformative (~joe@67-194-33-229.wireless.umnet.umich.edu) joined #forth
09:15:22 --- quit: ASau (Remote host closed the connection)
09:16:22 --- join: ASau (~user@83.69.227.32) joined #forth
10:06:16 --- quit: Deformative (Ping timeout: 260 seconds)
10:08:24 crcz: Gah. I wish you'd quit changing the version control for Retroforth. :)
10:30:57 --- join: Deformative (~joe@bursley-183118.reshall.umich.edu) joined #forth
10:36:11 --- quit: sleepydog (Quit: leaving)
10:46:57 What is IT?
10:47:28 Apparently I have an interview at Cisco for an IT position.
10:47:37 I got turned down for software development, it seems.
10:48:05 Information Technology
10:48:29 --- join: xjrn (~jim@astound-69-42-10-25.ca.astound.net) joined #forth
10:48:33 Without more details, could be pretty much anything :P
10:49:03 Mmm.
10:49:09 I think I will just turn down the interview.
10:49:53 You could always use it as interview practice
10:49:53 I don't see how I am qualified to do this.
10:49:59 Yeah, I suppose.
10:50:28 The available times are garbage though.
10:50:46 I would need to skip my logic design class.
10:51:02 tathi: I won't be changing it again
10:57:21 Eh, I just signed up for a time anyway.
10:57:26 Whatever, it's practice I suppose.
10:57:38 Plus I don't want to get blackballed from Cisco; I may want to work there one day.
11:20:31 As far as I know it, IT is just internal tech support.
11:20:46 they will install windows for you
11:20:54 make sure your phone is connected
11:21:12 they also do sysadmining
11:21:31 corp e-mail, etc.
11:22:41 also being around Cisco is an easier starting point for applying to some other position one day
11:25:00 Well, I already have a Job at qualcomm.
11:25:01 job
11:25:07 It is just for practice.
11:25:20 Practice selling myself and such, but it is hard to sell myself for it
11:25:24 I suppose I will try though.
11:31:50 --- join: Quartus` (~Quartus`@74.198.8.60) joined #forth
11:34:17 --- quit: Deformative (Ping timeout: 240 seconds)
11:41:46 --- join: Deformative (~joe@141.212.212.197) joined #forth
11:43:05 --- quit: Quartus` (Ping timeout: 240 seconds)
11:50:09 --- join: nizchka (~nizchka@h59237.upc-h.chello.nl) joined #forth
11:50:18 good luck
11:50:36 Thanks.
11:56:26 --- join: alex4nder (~alexander@wsip-72-215-164-129.sb.sd.cox.net) joined #forth
11:56:29 hey
11:57:57 Hi.
12:00:48 how's it going?
12:01:58 Meh, trying not to think about it.
12:05:25 You?
12:07:27 it's interesting.
12:07:35 I'm trying to figure out what to hack on.
12:08:39 Fun.
12:09:13 there's some firefox extension code I need to write, that I'm not super excited about.
12:20:12 Ha! Cisco's accepted our patch.
12:25:13 --- join: TR2N (email@89-180-200-34.net.novis.pt) joined #forth
12:26:41 --- quit: nizchka (Ping timeout: 240 seconds)
12:29:15 --- join: erider (~chatzilla@pool-173-69-160-231.bltmmd.fios.verizon.net) joined #forth
12:29:52 --- quit: erider (Changing host)
12:29:52 --- join: erider (~chatzilla@unaffiliated/erider) joined #forth
12:41:06 --- join: nizchka (~nizchka@h59237.upc-h.chello.nl) joined #forth
12:51:13 --- quit: alex4nder (Quit: Reconnecting)
12:51:29 --- join: alex4nder (~alexander@wsip-72-215-164-129.sb.sd.cox.net) joined #forth
12:56:41 --- quit: Deformative (Ping timeout: 240 seconds)
13:05:48 --- join: Deformative (~joe@bursley-183118.reshall.umich.edu) joined #forth
13:08:09 --- quit: nizchka (Quit: Leaving)
13:15:07 Hah, I think I will be able to do a forth system for my engineering project.
13:15:19 I have some aerospace kids in my group, and I was like, "So, NASA uses forth."
13:15:23 And I have there vote.
13:15:32 s/some/a
13:15:38 s/there/his
13:15:49 s/there/their while I am at it.
13:15:51 :P
13:16:40 But there is still one kid who wants to do a wave form audio simulator... So he is going to be a problem. Perhaps I can convince him that it would be easier to do such a thing in forth than in asm.
13:16:53 NASA also uses python
13:17:11 What is a "wave form audio simulator"?
13:18:28 --- join: DavidC99 (~DavidC99@bas2-windsor12-1088707672.dsl.bell.ca) joined #forth
13:19:24 Deformative: what would you be building in Forth?
13:19:49 that should be done in Matlab
13:20:19 or Octave if you want free
13:22:58 * erider is happy he got forth to run on his calculator
13:23:07 did you build pforth?
13:23:51 yup
13:26:00 alex4nder: I was having issues with getting the rest of the primitives to load, but now I am past the problem and I have an almost complete forth system on my calc
13:29:19 nice
13:32:28 ASau: Basically it draws a wave, then plays it on a speaker.
13:32:45 erider: What calculator?
13:33:11 Deformative: that's not hard even in Forth.
13:33:14 alex4nder: I would be building a forth system. :P
13:33:24 ASau: Of course.
13:33:27 TI-89 Titanium
13:33:34 We have 2 months to do this project.
13:33:34 Deformative: oh,.. you want to rope other people into this project?
13:33:40 alex4nder: Yes.
13:33:42 I wrote bindings to OpenAL once.
13:34:01 alex4nder: Because the whole group needs to agree on a project.
13:35:04 Deformative: what's the end goal, just to have a Forth:
13:35:05 er ?
13:35:26 alex4nder: Well, it has vga and keyboard input, so I want an interactive forth shell on this custom cpu.
13:35:35 coo
13:35:36 l
13:35:39 what's the CPU similar to?
13:35:52 It is just a 32-instruction educational cpu.
13:35:59 cool
13:36:04 Yeah.
13:36:29 Then depending on how much time is left over, we could do whatever in the forht.
13:36:31 forth
13:36:37 But the forth itself would get us a good grade. ^^
13:37:10 is this highschool, or undergraduate?
13:37:21 Undergraduate.
13:40:41 who else is on your team?
13:40:56 3 other students with minimal computer science experience.
13:41:13 For some reason they made me take this silly intro course.
13:42:04 I am in upper-level programming courses, but I had to take the intro course to fulfill a technical communication requirement.
13:45:22 Deformative: the TI-89 Titanium is what I am using currently
13:48:14 erider: I see.
13:48:44 I had a forth on my TI-83; it didn't work though.
13:58:46 I'd like to see a decent forth implementation for the Flash 10 vitual machine.
13:58:48 er virtual
14:33:29 --- quit: TR2N (Ping timeout: 240 seconds)
14:35:25 --- join: maht (~maht__@85.189.31.174.proweb.managedbroadband.co.uk) joined #forth
14:44:25 --- join: TR2N` (email@89-180-175-242.net.novis.pt) joined #forth
14:49:20 Deformative: what version of forth do you have running on your TI-83?
14:54:11 --- quit: qFox (Quit: Time for cookies!)
14:59:08 --- quit: GeDaMo (Quit: Now I lay me down to sleep; Try to count electric sheep)
15:00:53 --- quit: TR2N` (Ping timeout: 272 seconds)
15:01:24 --- join: TR2N (email@89-180-139-118.net.novis.pt) joined #forth
15:06:48 --- quit: alex4nder (Ping timeout: 264 seconds)
15:08:29 --- quit: TR2N (Ping timeout: 252 seconds)
15:09:31 --- join: TR2N` (email@89.180.167.55) joined #forth
15:11:13 --- join: skas (~skas@eth488.act.adsl.internode.on.net) joined #forth
15:11:31 --- nick: TR2N` -> TR2N
15:30:49 erider: I don't remember.
15:31:07 I downloaded it from some website and installed it, but it didn't work.
15:35:38 --- quit: maht (Quit: Leaving)
15:54:28 --- quit: forther (Quit: Page closed)
16:24:52 --- join: nighty__ (~nighty@210.188.173.245) joined #forth
16:57:42 --- join: snotforbrains (~snotforbr@ip68-226-15-108.ga.at.cox.net) joined #forth
17:04:54 --- quit: snotforbrains (Read error: Operation timed out)
17:21:36 --- quit: ASau (Ping timeout: 252 seconds)
17:27:53 --- join: Quartus` (~Quartus`@74.198.8.60) joined #forth
17:37:35 --- join: snotforbrains (~snotforbr@ip68-226-15-108.ga.at.cox.net) joined #forth
17:39:11 --- join: TR2N` (email@89-180-214-144.net.novis.pt) joined #forth
17:40:00 --- quit: TR2N (Ping timeout: 240 seconds)
17:45:54 Hello all.
17:49:57 hi Deformative
17:54:38 How are you crc?
17:54:45 I'm doing well
17:58:34 That's good.
18:02:33 I am reading this now: http://lambda-the-ultimate.org/node/2124
18:02:35 Rather interesting.
18:02:46 Instead of doing my homework..
18:04:09 I'm working on a web app to help me manage/access my ebook library
18:04:55 Cool.
18:23:32 --- quit: TR2N` (Ping timeout: 265 seconds)
18:33:47 --- join: TR2N (email@89-180-214-36.net.novis.pt) joined #forth
19:14:23 curious if I understand this correctly: the comparison operator produces the 0 or -1 on the stack, not the IF statement. The IF statement simply looks at the stack and executes if it's non-zero, ELSE only executes if it's 0, and THEN always executes no matter what
19:29:52 I believe so
19:39:28 --- quit: xjrn (Quit: ChatZilla 0.9.83 [XULRunner 1.8.0.9/2006120508])
19:39:33 --- quit: erider (Ping timeout: 272 seconds)
19:47:09 --- quit: sgtarr (Read error: Connection reset by peer)
19:47:11 --- join: sgtarr (~root@rasterburn.org) joined #forth
19:54:41 --- quit: yiyus (Ping timeout: 256 seconds)
20:07:10 --- join: yiyus (1242712427@je.je.je) joined #forth
20:40:28 I thought that IF compiles a "jump if zero" to the instruction that follows the ELSE. ELSE compiles an unconditional jump to the instruction that follows THEN. THEN doesn't compile anything; it just backpatches ELSE's jump address at compile time.
20:40:46 They also do some compile-time structure checking.
20:40:53 To make sure the constructs are properly formed.
20:42:59 KipIngram: You might like the article I posted earlier.
20:43:05 It is rather lispy, but still good.
20:53:39 Yes, it is kind of lispy.
20:55:14 Hey, one of my guys is implementing a "simplified" version of my FPGA processor. I threw out all attempts to pipeline, which took away all the problems. It won't run as fast, but it will be smaller, quicker to implement, and easier to be certain of operationally.
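A small worked example of the point made at 19:14:23 and 20:40:28: the comparison word itself leaves the flag (0 or -1) on the stack, and IF merely consumes whatever flag it finds there. The word name sign is hypothetical.

   : sign ( n -- )
      dup 0< IF  ." negative"  ELSE
      dup 0= IF  ." zero"      ELSE  ." positive"  THEN THEN  drop ;
   5 0< .     \ prints 0   (the comparison itself produced the false flag)
   -5 0< .    \ prints -1  (true flag)
   -3 sign    \ prints "negative"
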
20:57:35 Smaller is important. We tend to use smallish FPGAs, so if this thing takes too much room it won't get used.
20:59:24 I'm considering a "virtual memory" mechanism. Any reference to a high address (say in the 48k - 64k range) will not access local memory. Rather, it will communicate with the host so that that memory block can live on the host.
20:59:51 This will let me put 1) the dictionary and 2) the code associated with the outer interpreter on the host. The target will contain only the core system and the application code.
21:00:28 So that stuff will execute slowly, but it's only used during development so that's ok. All application code will execute fast, during testing and during actual use.
21:00:41 --- quit: snotforbrains (Ping timeout: 240 seconds)
21:00:49 Anyone heard of that sort of thing being done? I'd love to read a paper...
21:08:05 I might do this by using a different logic core during development, so that I had no overhead at all when I deployed the app. Sort of like debug vs. release mode when compiling software.
21:13:13 Interesting.
21:13:22 I was not aware FPGAs were used anywhere outside of education.
21:13:53 Or prototyping.
21:14:51 KipIngram: Did you actually read the link, not just the discussion?
21:14:54 Oh my gosh, they're used all over.
21:14:55 The pdf that's at the top.
21:15:14 I was reading the discussion. I'll look again.
21:15:38 Yeah, the discussion is useless.
21:15:46 http://software-lab.de/radical.pdf
21:15:48 That's what you want.
21:16:09 I found it interesting anyway.
21:16:29 Well, the declaration that Lisp is the *only* language that's good for app development is misguided right from the start. But I'll exercise patience...
21:17:36 Heh.
21:18:05 Ok. Looks like a good paper. Wow; Lisp and Forth really do share certain things but differ wildly in a lot of ways, don't they?
21:18:38 Mmm, I wouldn't say they share anything more than other languages from the same era.
21:19:04 I don't know; I think some of the under-the-hood stuff might be similar. Just a feeling.
21:19:25 I don't really know a whole lot about Lisp.
21:19:36 Oh well, lisp was my first language ever.
21:19:39 I wouldn't use it for anything.
21:19:44 Don't tell ASau, though; he thinks I'm a Lisp expert. ;-)
21:19:52 Heh heh....
21:20:04 But it was the best educational tool I could have asked for.
21:20:16 It's all the rage in AI, or at least used to be.
21:20:24 Though Prolog was used a lot for expert systems.
21:20:38 I have yet to learn prolog.
21:20:59 I suppose the AST is similar in lisp and forth.
21:21:02 I've forgotten most of what I once knew. I remember being enamored of it for a while.
21:22:14 Before I realized that AI itself isn't anything special. Just fancy algorithms; certainly not "intelligent" in the sense we think of ourselves as intelligent, or sentient.
21:22:43 Their call structure is very similar, I suppose, but forth results in longer routines.
21:23:01 Since forth can have multiple operations in the same expression.
21:23:38 I am basically interested in all things minimal and portable. :)
21:24:06 Yeah, I am not a fan of AI...
21:24:27 The machine learning professor here tried to convince me to take his course.
21:24:42 He was trying to get me an override into it and stuff; he really seems to like me.
21:24:43 Well, Forth *can* result in longer routines, but good programming practice encourages you to factor and modularize, which results in short words.
21:24:50 But eh, doesn't seem worth it to me.
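A purely illustrative sketch of the address-range dispatch described at 20:59:24, not the actual design: a fetch word that serves low addresses from local memory and forwards the 48K-64K window to the host. The boundary (49152 = 48K) comes from the example above; remote?, host@ and v@ are hypothetical names, and host@ is only a stub.

   : remote? ( addr -- flag )   49152 u< 0= ;   \ true for addresses in the 48K..64K window
   : host@   ( addr -- x )   drop 0 ;           \ stub; a real one would query the host link
   : v@      ( addr -- x )   dup remote? IF host@ ELSE @ THEN ;
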
21:25:25 The whole notion that living minds are nothing more than extremely complex computers is wrong to start with.
21:25:38 KipIngram: (define x (+ (+ 1 2) 3)) vs : x 1 2 + 3 + ;
21:25:47 The (+ 1 2) is its own s-expression.
21:25:50 Which is called.
21:26:02 Whereas in the Forth, it is all the same expression.
21:26:30 : y 1 2 + ; : x y 3 + ;
21:26:37 Each s-expression represents a pointer, in theory.
21:26:49 So each () is a structure being pointed to by the outer structure.
21:26:50 If you use 1 2 + a lot in your program then it makes sense to factor it out like that.
21:27:30 That would be like doing (define y (+ 1 2)) (define x (+ y 3)) in lisp.
21:27:36 Yeah, that makes it hard to read. "So, even though this is a separate expression, compiled and stored separately, we're going to cram it into the middle of the other expressions so you can look at a screenful of unintelligible gibberish."
21:28:01 That last post referred to the original way of writing it, not what you just said at the end.
21:28:14 I am not arguing for lisp as a practical language, just saying it's interesting.
21:28:21 I agree it's very interesting.
21:30:14 As a hobby, I am just looking for what is interesting. :)
21:30:17 And not necessarily impractical. I seem to recall thinking once that you could almost map Lisp and Forth onto each other, with a little work. If Lisp works for someone, and they can produce good code, then by all means they should use it.
21:30:42 When I am doing something for work or school, it's about practical. :)
21:30:54 --- join: snotforbrains (~snotforbr@ip68-226-15-108.ga.at.cox.net) joined #forth
21:31:52 I don't know about the mapping lisp to forth thing.
21:32:10 There are lots of headaches that would arise with memory management.
21:32:28 Oh, that paper put its finger on the reason Lisp is so popular with AI people. Because it "allows uniform treatment of code and data."
21:33:12 I never saw why lisp was so unique in that regard; you can do that in C...
21:33:14 That's true - Lisp uses a lot of garbage collection and stuff.
21:33:39 Expressions, represented by lists, change dynamically, I think. That doesn't happen in Forth. Once compiled...
21:34:42 Lisp is very well-suited to arenas where your understanding of the world, and therefore your codified representation of that understanding, stays in a state of flux all the time.
21:34:48 Well, with Forth, you can work with addresses. When you work with lisp, you don't. You only have static pointers. Fundamentally. I mean lisp can be hacked in an uncountable number of ways, but I am just talking theory here.
21:35:01 Forth presumes you will define your solution, express it, and call it a day.
21:35:56 Lisp more or less ignores that you are on a computer.
21:36:02 It just has variables, but no concept of memory.
21:36:12 Tries to be math.
21:36:18 Yeah.
21:37:36 Just so far from my world.
21:37:58 I tend to care about things like sampling analog-to-digital converters at precise rates, and so forth.
21:38:22 Understandable.
21:38:55 I tend to care about things that make me think in a way I didn't think before.
21:40:10 I enjoy those things, but time gets awfully precious sometimes.
21:40:39 Oh, of course.
21:41:08 Hey, I bought a nook today. The new Barnes & Noble ebook reader.
21:41:22 I was going to buy a Kindle, decided to hold out though.
21:41:51 I've never held a Kindle, but I've looked at pictures and the nook seems to have an "elegance" to it that the Kindle is lacking.
21:42:03 Kindle has web access.
21:42:14 And longer battery life.
21:42:29 nook's long enough. And I use my notebook for web access.
21:42:37 Hm, I didn't think they had it...
21:42:50 Had what?
21:42:57 Web access.
21:43:01 Oh, notebook.
21:43:03 I read nook.
21:43:07 :-)
21:43:08 I have been awake for too long.
21:43:26 But yes, if you want one platform for everything that would argue for the Kindle.
21:43:36 I bought based on the cool factor.
21:43:52 If I had an ereader, I might actually read books.
21:43:58 I tend to just read articles now.
21:44:09 I used to read books; now I don't have time.
21:44:20 I know what you mean; it's easy to lose the habit.
21:44:31 If a pdf is longer than 9 pages of full text, I put it on my todo list to never be seen again.
21:44:47 I downloaded the second part of a sci-fi trilogy I started a long time ago and have launched into it.
21:45:38 Hey, there is, in fact, something to be said for demanding that papers and so on deliver results to you quickly.
21:45:56 A few from my todo list:
21:45:58 Standard ML
21:45:59 AspectJ
21:46:01 ACL2
21:46:02 pldi articles
21:46:04 Design Concepts in Programming Languages
21:46:05 Algorithms in C, part x
21:46:07 "The Art of Computer Programming"
21:46:08 Expert C Programming: Deep C Secrets
21:46:09 If it's so obtuse that it can't be written down in a relatively straightforward, concise form, then it's probably not worth your time.
21:46:10 automata-based programming
21:46:52 Possibly.
21:47:13 But finding nine 9-page articles is much harder than finding one 81-page book.
21:47:31 Takes longer to find the articles than to read them.
21:49:11 I think I just said that nothing complicated is valuable, and that's not really what I meant to say.
21:49:29 What I meant was "there are enough simple things in the world to get the job done."
21:49:29 I understood what you meant.
21:49:52 So if you run into a complicated one, your time might be better spent by continuing to look for simple substitutes.
21:50:08 --- quit: snotforbrains (Ping timeout: 256 seconds)
21:50:12 "Good enough" is often good enough.
21:50:39 Yeah, but it is hard to learn a programming language in 9 pages..
21:50:41 I guess this has something to do with the 80/20 principle.
21:50:46 Not Forth. ;-)
21:50:55 Indeed.
21:51:08 But man cannot live on forth alone.
21:51:22 Sad but true.
21:51:35 So see, we're back to this problem with Wiggins.
21:51:56 You can accomplish 80% of what you need to in life with 20% of the concepts that are out there.
21:52:11 But academia is completely structured around pushing for 100% of the concepts. And then a few more.
21:52:28 So any notion that you *don't need* all that is just anathema to them.
21:52:31 Haskell is useless garbage.
21:52:53 I tried learning it, and I was just so turned off by all the extra words.
21:53:02 Nothing is in plain English.
21:53:03 I skimmed a paper and ran like hell.
21:53:32 So quickly that I can't even tell you *why* I didn't like it. It was just a feeling.
21:53:47 Apparently Haskell doesn't have any concepts that can be applied to any other programming language, from what I understand.
21:53:53 "There's nothing here for me..."
21:54:09 They are all so high on monads: "What can a monad be used for in C?" "Nothing, that's stupid."
21:54:12 "Kthanksbye"
21:54:49 I am not a fan of type systems, really.
21:55:09 I see verbs when I program.
21:55:09 Me either. Memory is clay. Sculpt it.
21:55:27 I don't tell the computer what data to use, I tell it what to do with the data.
21:56:11 Which is also why I find basic lisp interesting.
21:56:19 But it's also why I am fascinated by forth.
21:56:25 Even the nouns are verbs in forth.
21:56:35 :-) That's true.
21:56:49 It is : defineverb as verb verb verb verb verb verb verb verb verb verb verb ;
21:56:53 Very good observation.
21:57:00 So, bedtime. Night.
21:57:03 So it is like "Do this, then this, then this, then this, then this, then this." to me.
21:57:09 Which is how I think. :)
21:57:15 Ok, goodnight.
21:57:18 Talk to you later.
22:10:34 --- quit: schme (Ping timeout: 245 seconds)
22:10:37 --- join: schme (~marcus@c83-254-198-4.bredband.comhem.se) joined #forth
22:10:38 --- quit: schme (Changing host)
22:10:38 --- join: schme (~marcus@sxemacs/devel/schme) joined #forth
22:27:49 --- quit: TR2N (Ping timeout: 252 seconds)
22:36:06 --- join: TR2N (email@89.180.131.239) joined #forth
22:39:54 --- join: ASau (~user@83.69.227.32) joined #forth
22:46:09 --- quit: ASau (Ping timeout: 252 seconds)
22:55:27 --- quit: crcz (Ping timeout: 256 seconds)
22:56:00 --- join: ASau (~user@83.69.227.32) joined #forth
23:59:59 --- log: ended forth/10.02.17