Hacker News | ux266478's comments

Not at all, it's a completely different language with a very different computational foundation. It's an SML-Haskell type situation.

> It's a really weird language to reason with IMO

I know you likely mean regular Prolog, but that's actually fairly easy and intuitive to reason with (code dependent). Lambda Prolog is much, much harder to reason about IMO and there's a certain intractability to it because of just how complex the language is.


What would be some applications it handles better than regular Prolog? Something that naturally requires second- or higher-order logic rather than first-order logic?

Lambda Prolog isn't a "pure" HOL. It's a very restricted form of HOL using Higher-Order Hereditary Harrop formulas, granting us pretty solid generalized mechanisms of implication and universal quantification, which more or less means we get contextual reasoning and scoping rules baked into the grammar for free.

Implementing other programming languages and proving theorems are the low-hanging fruit, since you get variable binding without name management, but I genuinely think it has profound implications for expert systems, since it essentially removes a massive amount of complexity from contextual reasoning. Being able to account for patient history when providing a diagnosis, for example.
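To give a feel for the scoping idea: in lambda Prolog, `D => G` makes clause `D` available only while proving goal `G`, and the assumption vanishes afterwards. A loose Python analogy (my own toy names, not any real library) is a scoped fact store:

```python
from contextlib import contextmanager

facts = set()  # the current "program context" of assumed facts

@contextmanager
def assuming(fact):
    # Loosely mirrors lambda Prolog's D => G: the fact is available
    # only while proving goals inside this block, then retracted.
    facts.add(fact)
    try:
        yield
    finally:
        facts.discard(fact)

def provable(fact):
    # Trivial "proof": a fact holds iff it is in the current context.
    return fact in facts

with assuming(("allergy", "patient1", "penicillin")):
    print(provable(("allergy", "patient1", "penicillin")))  # True
print(provable(("allergy", "patient1", "penicillin")))      # False
```

In lambda Prolog this discipline is part of the logic itself rather than something you bolt on, which is why the contextual-reasoning claim above comes "for free".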


Owning things isn't free (and a VPS isn't owning things, either)

I absolutely agree with the concept, but people have to be ready to do their own work rather than delegating it to other parties. Consolidation has happened because these massive conglomerates absorb operational complexity on the cheap, and that's attractive. Moving away from them means we take on the responsibility of doing it ourselves.


That would be the domain of logic programming languages like Prolog. SQL and its dialects are more for very specific and restricted applications of relational calculus, not general languages for expression of relations, conditions and categories.

And in your mind moral objectivism fixes this how? You equate these things to post-modernism; do you believe disloyalty came to exist in the world for the first time during the 1950s?

> In the latter, if you see `foo` in the body of a function definition you have no idea if it's a simple computation or some sophisticated and complex control structure just from what it looks like.

All control structures are reserved keywords in Haskell, and they're not extensible from within the language. In C, I can't tell that `if (condition)` isn't a function call or a macro without searching for additional syntactic cues, short of already knowing that `if` is never a function. In practice I rely on syntax highlighting, plus the knowledge that `if` is always a control structure, and never scan around for the following statement terminator or block to disambiguate the two.

I've found that programmers generally mistake the unreadability they experience with the ISWIM family for an objective property of the grammar. It's really just a matter of unfamiliarity. Firstly, I say this as a programmer who did not get started in the ML family and initially struggled with these languages. The truth of the matter is that they simply engage a different kind of mental posture and have different structural lines you're perceiving; this is generally true of all language families.

Secondly, and pertinent to that last point: the sense of "well, this is clearly less readable" isn't unique to going from the Algol family to the ISWIM family. The same thing happens in reverse, and across pretty much any language family boundary. For example: Prolog/Horn clauses are one of the least ambiguous syntax families (less so than even S-expressions, IMO), and yet Elixir is vastly more popular than Erlang, and the most commonly cited reason for the preference has to do with syntax. Many will say that Erlang is unintuitive, confusing, strange, opaque, etc., and that it's hard to read and comprehend. It's the same unfamiliarity at play. I've never programmed Ruby; I find Elixir borderline incomprehensible, while Erlang is in my top 3 most readable and writable languages because I've spent a lot of time with Horn clauses.

I think there's a general belief among programmers that once you learn how to program, you've done so in a universal sense: once you've mastered one language, the mental structures you've built up are the platonic forms of programming and computer science. But this is not actually the case. More problematically, the belief is propped up and reinforced when a programmer jumps between two very similar languages (semantically and/or syntactically): while they do encounter some friction (learning to do without garbage collection, list comprehensions, etc.), nothing fundamentally requires building an entirely different intuitive model. This exists on a continuum in both semantics and syntax. My Erlang example indicates this, because semantically the language is nothing like Prolog; its differentiation from Elixir is purely syntactic.

There is no real universal intuition you can build up for programming. There is no point at which you've mastered some degree of fundamentals that you would ever be able to cross language family boundaries trivially. I've built up intuition for more formal language families than is possibly reasonable, and yet every time I encounter a new one I still have to pour a new foundation for myself. The only "skill" I've gotten from doing this ad nauseam is knowing at the outset that mastery of J does not mean I'd be able to get comfortable reading complex Forth code.


> There is no real universal intuition you can build up for programming. There is no point at which you've mastered some degree of fundamentals that you would ever be able to cross language family boundaries trivially.

I don't really agree with you on this, even though I agree with everything else here. Then again, I am an outlier: I've used ~40 programming languages in my career. There are a couple of language families (array languages like APL, exotics like BF) that I cannot read because I've had no real opportunity to learn them, and there's a significant difference between being able to read a language and being able to use it (I can read, but not really use, Haskell -- although I have shipped a couple of patches to small libraries).

I despair at the number of developers in the profession who understand only one or two programming languages…and badly at that.

(It's worth noting that I wholly disagree with the original post. 24 years ago I chose Ruby over Python because of syntax. Ruby appealed to me, Python didn't — purely on syntax. I never pretended that Python was less capable, only that its syntactic choices drove me away from choosing it as a primary language. I'm comfortable programming in Python now, but still prefer using most other languages to Python … although these days that has more to do with package management.)


> All control structures are reserved keywords in Haskell, and they're not extensible from within the language. In C, I can't tell that `if (condition)` isn't a function call or a macro without searching for additional syntactic cues, short of already knowing that `if` is never a function. In practice I rely on syntax highlighting, plus the knowledge that `if` is always a control structure, and never scan around for the following statement terminator or block to disambiguate the two.

Any Haskell function can serve as a control structure in the Algol sense, not so? As for `if(test)` that could indeed be a macro if the programmer’s a durned fool; but absent macros I don’t believe it can be a function call.
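To make the functions-as-control-structures point concrete in a language most readers share: with first-class functions, an ordinary function can read like a control structure at the call site. This is a Python sketch (`unless` is a made-up name, not a builtin), analogous to Haskell's `unless` from `Control.Monad`:

```python
def unless(condition, action):
    # Run the thunk only when the condition is false. Nothing in the
    # grammar marks this as "control flow"; it's just a function call.
    if not condition:
        return action()

print(unless(2 + 2 == 5, lambda: "still sane"))  # -> still sane
```

At the call site, `unless(...)` is indistinguishable from any other function, which is exactly the property being discussed.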

Mind you, I take your point about language familiarity; obviously people manage it.


> its differentiation from Elixir is purely syntactic.

Well, there's also the standard library: Erlang's is very messy while Elixir's is very consistent. And the pipe operator (`|>`) enforces a subject-first argument order even in low-quality third-party code, making the whole language more pleasant to work with. The same goes for UTF-8 binary strings everywhere and other idiomatic conventions.
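For readers who haven't used Elixir, the effect of `|>` (thread a value through functions left to right, subject first) can be approximated in Python; the `pipe` helper here is hypothetical, not part of any standard library:

```python
from functools import reduce

def pipe(value, *fns):
    # Thread value through fns left to right, loosely mirroring
    # Elixir's  value |> f1() |> f2() |> f3()  style.
    return reduce(lambda acc, f: f(acc), fns, value)

print(pipe("  Hello World  ", str.strip, str.lower, str.split))
# -> ['hello', 'world']
```

Because every stage takes the piped value as its first (here, only) argument, libraries written for this style end up with a consistent argument order, which is the claim above.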


It's not any different from having 153 steps in any other computational sense. Even limiting ourselves to elementary arithmetic, horrendous opaqueness arises with 153 operations spanning the whole set. Are we going to pretend arithmetic is systemically problematic because of this? Any non-trivial formal construct is potentially dangerous.

If you're having trouble reasoning about how variables are unified, it's either because you never actually built a strong gut intuition for it, or it's because you're writing Very Bad Code with major structural issues that just so happen to live in the type system. In this case it's the latter. For an HM type system, 153 choice points for an expression is ludicrous unless you're doing heavy HKT/HOM metaprogramming. The type system, and more broadly unification, is a system to solve constraints. Explosive choice indicates a major logical fault, and most probably someone naively trying to use a structural type system like a nominal one and/or a bit too much unsound metaprogramming.
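To ground the "unification is constraint solving" framing, here's a minimal first-order unifier in Python. The representation is my own toy one (strings beginning with `?` are variables, tuples are type constructors), and it omits the occurs check and everything HM-specific like generalization; it's a sketch of the idea, not an inference engine:

```python
def resolve(t, subst):
    # Follow variable bindings to a representative term.
    while isinstance(t, str) and t.startswith("?") and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    # Solve the constraint a = b, extending the substitution, or
    # return None when the constraints are unsatisfiable.
    subst = {} if subst is None else subst
    a, b = resolve(a, subst), resolve(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None  # clash propagates up
        return subst
    return None  # constructor mismatch: no unifier

# Unifying  fn ?a ?a  with  fn int ?b  forces ?a = ?b = int:
print(unify(("fn", "?a", "?a"), ("fn", "int", "?b")))
# -> {'?a': 'int', '?b': 'int'}
```

Every binding is forced here; choice points only appear when the constraint set underdetermines the solution, which is why 153 of them signals a structural problem in the code rather than in unification itself.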

Thankfully, of course, you can simply specify the type and tell the compiler exactly what it should be using. But that's not really resolving the issue; the code still sucks at the end of the day.

Now higher order unification? That's an entirely different matter.


It's the only way to live. Strict aliasing rule? Always disabled.

What's the point? If you want a framebuffer, ask the windowing system for one. Why go through a terminal emulator?

Because this one works through screen/tmux, and works over ssh without messing with DISPLAY variables and such.

That hasn't been true for a while, it's easily the best of the bunch at this point. It's also always been trivial to change, which can't be said of the others.

I'd say it's easily the least bad of the bunch, anyway, if you're really committed to cross-platform.
