Hacker News | luckymate's comments

This sounds like a generic AI comment without any real input into the conversation.


Does your comment pass that bar?


Can you link some of those projects? I'm genuinely curious.


pi, openclaw, vinext, browser, ccc compiler; the scope is only growing.

Look for the claude icon in the trending GitHub repos https://github.com/trending. It's like on all of them.

It's hard. :( .. Those who are not accepting this are in cognitive dissonance.


I'm taking the bait whatever. All those projects are just more fucking AI tools. It's all Claude seems to be good for - writing agents, skills, harnesses. Just a big fat ouroboros.

(Going down the /trending page - 13 of the 14 are some flavor of context manager or agent or smth)

Let me know when someone uses Gas Town or openclaw to write something that isn't "the next Gas Town or openclaw" and then we can talk


Just to be clear: you say by ‘dropping’ lisp you’re keeping it lightweight but it’s based on electron? So what does ‘lightweight’ mean in your opinion?


Thank you for the sharp question! You are absolutely right that Electron itself has a baseline memory footprint that isn't small.

To give a clearer picture of what I mean by "lightweight," here is a quick startup comparison video I took a while ago: https://x.com/elecxzy/status/2022003439757336583

(Sorry for the Japanese text in the video!)

Left: VS Code

Middle: Windows Notepad

Right: elecxzy

As you can see, elecxzy boots up almost as instantly as native Notepad.

To ensure the actual text editing remains just as snappy and responsive as Notepad despite running in a browser engine, elecxzy features several optimizations, including a custom Piece Table and a fully virtualized DOM/renderer.
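For readers unfamiliar with the structure: a piece table stores the document as spans over two buffers, so insertions never move existing text. This is only a toy sketch of the general idea in Python, not elecxzy's actual implementation:

```python
# Toy piece table: the document is a list of "pieces" pointing into two
# buffers -- the immutable original text and an append-only "added" buffer.
# An insert never copies existing text; it splits one piece and adds another.

class PieceTable:
    def __init__(self, text: str):
        self.original = text                       # read-only buffer
        self.added = ""                            # append-only edit buffer
        # each piece is (buffer_name, start, length)
        self.pieces = [("orig", 0, len(text))] if text else []

    def insert(self, pos: int, text: str) -> None:
        new_piece = ("add", len(self.added), len(text))
        self.added += text
        offset = 0
        for i, (buf, start, length) in enumerate(self.pieces):
            if offset + length >= pos:             # insertion falls in this piece
                split = pos - offset
                replacement = []
                if split > 0:
                    replacement.append((buf, start, split))
                replacement.append(new_piece)
                if split < length:
                    replacement.append((buf, start + split, length - split))
                self.pieces[i:i + 1] = replacement
                return
            offset += length
        self.pieces.append(new_piece)              # empty table / append at end

    def text(self) -> str:
        buffers = {"orig": self.original, "add": self.added}
        return "".join(buffers[b][s:s + n] for b, s, n in self.pieces)

pt = PieceTable("hello world")
pt.insert(5, ",")
print(pt.text())  # hello, world
```

Real editors add a tree over the pieces for O(log n) lookup, a line index, and cheap undo by snapshotting the small piece list.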

So in this context, "lightweight" means "Notepad-level startup speed and typing latency, but with native CJK IME support and Emacs keybindings." I should have been clearer about this distinction in my wording!


How often do you start your editor? I start Emacs once at boot and keep it running, using emacsclient to open additional files from the command line.

Look, I think that writing your own editor is useful because one learns a lot. And of course, it doesn't need to use Lisp or even be extendable at all. But it's not Emacs any longer if you remove the Lisp part.


Thank you for the comment!

To answer your question: I actually start my editor many times a day. I know the "start once and use emacsclient" workflow is the standard and most efficient way for Emacs users, but I personally tend to open and close editor windows frequently, just like using a simple notepad.

Regarding the Lisp part, I completely agree with you. As I mentioned in other threads, if you remove Lisp, it is absolutely not Emacs anymore.

I am not trying to build a true Emacs, nor am I trying to deny its great philosophy. I just deeply love the physical typing experience and muscle memory of Emacs keybindings. My goal was simply to extract that specific UX and package it into a standalone app that I could run immediately without any setup.

So you are right—it is just a personal project to recreate the typing feel I love, rather than an Emacs replacement!


Is this a fucking ChatGPT comment?


I’m writing these comments in Japanese and using AI to translate them into English, as I’m not a native speaker. I want to make sure I can communicate my thoughts and technical details as accurately as possible to this community.

I apologize if the phrasing feels a bit "AI-like" sometimes, but the ideas and the project itself are 100% mine!


What answer to that question and in this situation would make any sense?


The author's motivation/justification for why they believe removing Lisp but adding Electron somehow sums up to being "lightweight"?

Maybe the author thought of the UX/baggage/legacy or something else when they thought about "lightweight", rather than how much memory/cpu cycles something is using? Not sure, but maybe there is a more charitable reading of it out there.


Probably none. Still, I'm curious what the author's understanding is. Whether he actually thinks it is a lightweight solution, or whether that's a kind of advertising phrase, like 'blazingly fast'.


I believe it's called a rhetorical question.


None, just another Electron hater.


I think I am in love. Clojure + Rust: everything is typed, but I don't need to annotate. And algebraic effects, which I really wanted to explore in OCaml, but now can do in a language with a way easier syntax. I might miss a bit of Clojure's dynamic nature, but it looks like a bunch of really interesting ideas in one language.


Coming from Clojure, I like types being invisible. Square brackets feels like a needless change. If you want sexprs, just use sexprs. Interesting ideas, as you say.


Yeah. Clojure is by far my favorite dynamic language. But I love static types. At a quick glance, Loon looks like it could just flat out become my favorite language. Loon with a standard library that approaches Go’s would be :chefskiss:


<3


What’s the gain from using that over just using python?


A lot of folks (including me) find Python limiting for non-trivial use cases. The One Python Way was a great selling point in 2004, but the thing is, it's still basically the same Way in 2024. So, I hope you like nerfed lambdas and inheritance


I totally agree but always find myself in a very small minority when expressing this. People love Python and I don’t understand it.

For me the issue is that one cannot write in a “light functional programming” style in Python. The language lacks many simple things, chief among them multi-line lambdas! In 2024 I’m very surprised that people aren’t clamouring for this.
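To make the lambda complaint concrete: a Python lambda body is a single expression, so statements (assignments, try/except, and so on) force a named function even for throwaway logic. The function name and values below are mine, purely for illustration:

```python
# A Python lambda is a single expression -- no statements, so no
# assignments and no try/except inside it:
#
#     risky = lambda s: try: int(s) except ValueError: 0   # SyntaxError
#
# The workaround is always a named function, even for tiny throwaway logic:
def parse_or_zero(s):
    try:
        return int(s)
    except ValueError:
        return 0

print(parse_or_zero("42"), parse_or_zero("oops"))  # 42 0
```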


Ignoring the social benefits (which is easily the biggest draw of the language), Python feels really optimal for quick, simple tasks. The language lends itself to not overthinking it and building simple solutions.


Expanding on this point (because it's important):

A lot of folks (including me) find Python limiting for non-trivial use cases. The One Python Way was a great selling point in 2004, but the thing is, it's still basically the same Way in 2024, and because the language has been designed so exhaustively to ensure that there is only one correct way to do stuff, there has never been room for evolution. (Insufficient entropy!)

I'm going to compare it to JavaScript/TypeScript, because it's what I know best in 2024, and because it's an engaging contrast; yet the take-home message is also applicable to other languages, such as Rhombus (which looks cool!)

Python feels timeless to me, like Roman majuscules. It was, in its day, brilliant: cleaner than Java, saner than Perl, and just so, so hackable. The strong Pythonic cultural rejection of Perl's 'more than one way to do it' dictum was powerfully clarificatory; we didn't have StackOverflow, and the _good_ technical resources were still all in physical books, so being able to learn one pretty-good way of expressing a concept or pattern was magical.

But, like Roman majuscules, Python didn't evolve, because it didn't have to. The marginal cost of change threatened the original value proposition, so it just didn't really ever happen.

By contrast, while e.g. JavaScript had to evolve, because it was gobsmackingly bad, the necessity of that evolution has made JavaScript (as a community and language) open to variation, change, competing paradigms, and imports from academe and research. Evolution loves a mess.

TypeScript, for example, happened nearly overnight, and as a result of it and other innovations, I can spend my day working blissfully in algebraic types, doing frigging set theory to do precise type hinting, and passing around pure functions and immutable structures. My code runs everywhere, and my coding style is always changing (improving!), and the only real price I've had to pay is learning some regrettable nonsense about an extra '=' in my comparison operators, and maybe the idiocy of having both `undefined` and `null` types.

Whereas, when I peep the pythonista channels at my work, I notice they are still having essentially the same conversation about eliminating the GIL that I remember them having in 2007 (yes I am old.)

Which is not to say that Python is _bad_, per se; there are obvious advantages to having an imperfect but opinionated lingua franca, and I'd sure rather be thrown into an unfamiliar Python codebase from 10 years ago than an unfamiliar JavaScript codebase of an equivalent age.

Yet I'll warrant that Python's long summer of success, combined with its one-way-to-do-it culture, has closed the mind and the imagination, and will eventually make it less fit for purpose than its competition. It will remain in use, and it will even find new contexts (machine learning, say), but 'the code part of the codebase' will be done in other languages.

I suspect Python will, thanks to its exceptional readability and regularity, become a configuration language --- a part of the UI, essentially, a sort of shell. It will also continue to be a language used to teach programming. Hanging on here and there, sort of like how Latin hangs around in biology and medicine. But legacy Python codebases, thanks to that very readability, will probably be rewritten sooner rather than later.

Standards (Latin, Python) are _useful_, and _timeless_ standards are some of the most valuable artifacts humans have ever produced.

But it's the Innovator's Dilemma for sure.


Hm, I don't think the "one way of doing things" koan holds up in practice, other than being a nice narrative.

Nearly all other languages strive for doing things in one way, it is not something that makes Python unique. In fact, Python typically offers a complete mess of ways in which to solve something. Classes are sometimes good, sometimes they're not. Lists or Numpy arrays or Torch tensors, the choice depends mostly on performance, not on style.

And Python is evolving. There is optional type checking for instance.


The 'one way of doing things' mantra must be seen against the crazy ways of Perl (and later Ruby).


Just so. Code written in Python has always had the virtue of being _incredibly boring_, which is a virtue that, at the time of its inception, was criminally undervalued; this was, after all, the heyday of C++, and if you weren't bringing operator overloading and multiple inheritance and generics to the table, the hipsters sniffed.

For example, no one complained that Python had multiple inheritance; instead, we thought this was _a point in its favour_, over and against Java. (I imagine Guido added it grudgingly as a vox-populi.)

Thus, the Pythonic mindset emerged as a sort of 'refusal of the call', sort of like Indiana Jones shooting the sword guy (https://www.youtube.com/watch?v=kQKrmDLvijo). You could be against the hermetic complexity of Perl, but do it better than Java! Neat!

These days, however, I suspect that Python, while still boring, is boring in the _wrong_ way, leaving opportunities for concision, clarity and performance on the table -- now-basic stuff like immutable datatypes, monads, tail recursion, concatenative programming, and so on.

Python is the hobbit that stayed in Hobbiton.


> python feels timeless

As a longtime python programmer I disagree. Python is so stuck in its old ways that several useful and interesting PEPs just get rejected. In 2024, the thought of not having macros, a decent version manager, not being able to modify running code despite being an interpreted language, not having multi line lambdas, not having several of core language features in lambdas (e.g., no try/except), the pain of creating thunks, the overhead of closures, not being able to read module files easily (if they’re in another directory), etc. make Python one of the most frustrating languages.


Python arguably has better support for working with sum-types (algebraic types) than TypeScript does, because the language actually has a `match` statement (since 2021). Define a sum-type as a Union of dataclasses and the static type-checker (Pyright) can even tell you when your pattern-matching statements are non-exhaustive.

Do you track such developments, or spend the time dreaming up these elaborate theories? :)


Not particularly (point taken), but I _do_ note that describing a napkin-sketch as an 'elaborate theory' is perhaps more flattering than you mean it to be ;)


I can't tell what you mean concretely, because the two examples you give -- gradual typing and lack of concurrently executing threads -- are common to both languages. Python has support for gradual typing, and JavaScript is single-threaded (which is an even stronger property than having a GIL).


As I said, JavaScript was an example, picked because I work in it every day.

There are most certainly better examples, that I am less equipped to present; Rust springs to mind.

Yet whatever example is chosen, the claim remains unchanged: Python experiences the Innovator's Dilemma.


I don't think Rhombus is Python-style. And this won't convert anyone anyway; it's experimenting with new language constructs and such, which is what Racket is used for by most who use it. Many years from now, the lessons learned from these experiments might end up in a new language that might get converts and even rival Python/JS (I hope so; I find both terrible).


I was skeptical about Rhombus in the past. But as it turns out making new programming languages is the whole point behind Racket.

This is not supposed to be a production-use language; think of it more as something used to teach, and to experiment with programming-language features.

To that end, it will be used by its target audience (i.e., teachers and researchers).


Best-in-class macros! Rhombus makes metaprogramming safe and fun.

Rhombus uses RRB trees for its native list structure: immutable data type with O(log32(n)) random access and functional update. Really amazing data structure.
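Not Rhombus's actual implementation, but the path-copying trick behind that O(log32 n) functional update can be sketched with a plain fixed-width 32-ary trie (real RRB trees additionally allow irregular "relaxed" nodes so that concatenation and slicing are also logarithmic):

```python
# Path-copying update in a fixed-width 32-ary trie: an update copies the
# O(log32 n) nodes on the path from root to leaf and shares every other
# subtree with the old version.

BITS = 5                 # 2**5 = 32 children per node
WIDTH = 1 << BITS

def assoc(node, depth, index, value):
    """Return a new root with `index` set to `value`; the old tree is untouched."""
    copy = list(node)                              # copy just this node (<= 32 slots)
    if depth == 0:
        copy[index & (WIDTH - 1)] = value
    else:
        slot = (index >> (depth * BITS)) & (WIDTH - 1)
        copy[slot] = assoc(node[slot], depth - 1, index, value)
    return copy

# A two-level tree holding 32 * 32 = 1024 slots:
leaf = [0] * WIDTH
root = [leaf] * WIDTH                # all leaves shared initially
root2 = assoc(root, 1, 33, "x")      # "update" index 33 functionally

print(root2[1][1], root[1][1])       # x 0   (old version unchanged)
print(root2[0] is root[0])           # True  (untouched subtrees are shared)
```

With branching factor 32, even a billion elements need only about six levels, which is why "logarithmic" here is nearly constant in practice.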

Also, the Rhombus compiler isn’t the quickest right now (still in development) but once compiled, it’s generally much faster than Python. (Though that’s not hard.)


Macros, ease of making internal DSLs a la Haskell?


macros.


Indentation instead of syntax is still a crime.


Won’t it cause problems with Google/Apple stores?


No, this isn’t shipping native code dynamically, just content. Crudely: will the fetched bytes go into executable or non-executable memory? If non-executable, Google/Apple are usually lax.


That’s right, no native code is shipped dynamically. The dynamic part is the layout of components and styling. Many apps use this approach and refer to it as Server-driven UI (SDUI).
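A toy sketch of what SDUI boils down to: the server ships a description of the UI as data, and the client maps component names to renderers that already shipped in the binary. The component names, payload, and HTML-ish output here are invented for illustration:

```python
import json

# Client-side registry: only renderers compiled into the app can appear.
# The server decides *which* components render and with *what* props,
# but it cannot introduce new executable code.
def render_text(props):
    return f"<text>{props['value']}</text>"

def render_button(props):
    return f"<button>{props['label']}</button>"

REGISTRY = {"text": render_text, "button": render_button}

def render(node):
    if node["type"] == "column":                   # container node
        return "<column>" + "".join(render(c) for c in node["children"]) + "</column>"
    return REGISTRY[node["type"]](node["props"])   # leaf component

# The "dynamic" part is just this payload, fetched from the server at runtime:
payload = json.loads("""
{"type": "column", "children": [
  {"type": "text",   "props": {"value": "Sale ends today"}},
  {"type": "button", "props": {"label": "Shop now"}}
]}
""")
print(render(payload))
```

The store-policy point is visible in the registry: the fetched bytes only ever select and parameterize components, never execute.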


Pretty arbitrary and boring set of rules. I'd argue with a few of them. Repeating obvious things helps people consume content, pay attention, and actually remember. You start from the obvious, lead people into the unknown, and get back to the obvious, with a few additional details added along the way. It's like a good standup show, where the comic catches you in the 40th minute with a joke introduced in the 3rd. Then you remember. Presenting obvious, well-known information in a few interesting ways is mastery. And lots of simple, short slides might really help with the dynamics of your presentation.

One general rule I give to speakers, and stand by, is: take time to breathe.

I don’t see proof of the usability of these rules presented in the article, so yeah, it's just an arbitrary list; don’t mind it too much. Watch good standup, and learn presentation structure from those guys.

