Hacker News | EuAndreh's comments

As free advertising for rsync.net: I also like the service and support.

In many years I haven't deleted any of my daily Borg backups, covering my laptop and some servers. Borg has a `borg serve` feature equivalent to the one detailed in the article.

I use borg mount every other week. After the initial config, you can just type:

  $ borg mount :: borg/
...and browse the full archive lazily. Most of the time I'm looking at the last entry, but I've done some historical searching once or twice.

Even though I count the rsync.net backup only once in the 3-2-1 backup rule, it's geo-distributed under the hood, so there's also that.

I once forgot to update the payment credit card and got a few months of grace with periodic notices, but no service cut, and excellent customer support.

I'm also fond of their stance on pretend standards, like PCI compliance: https://www.rsync.net/resources/regulatory/pci.html

IIRC I created the account for use as a git-annex remote, but I've used it for archival and sshfs, which is very convenient.


Despite its flaws and issues, email is federated and has been working at scale for decades.


I agree. That’s an interesting model for chat. But remember, modern email is quite complex when you add all the standards for authenticity and security together, as well as informal requirements like spam filtering. But even if self-hosting were a pain, it’d still be an improvement over today’s fragmentation-by-app status quo.


> But even if self-hosting were a pain, it’d still be an improvement

Why hypothetical? Matrix already works sufficiently well.


Not at global scale it doesn't.


Are you aware of any particular bottleneck?


Besides being the standard, it also has a standard:

https://pubs.opengroup.org/onlinepubs/9799919799/utilities/e...


And my scripts are fully POSIX-compliant, running on GNU, slewsys, and OpenBSD ed(1), so the standard is pretty reasonable.


Come on, these three phrases aren't equivalent or comparable.

They "can arguably be construed as the violence" only from a discriminatory standpoint. An honest argument wouldn't present us with this false equivalence.


There are many other factors that influence language popularity besides technical quality, like:

  - marketing;
  - big companies using it;
  - familiarity;
  - history of the creators;
  - history of the influencing languages;
  - timing;
  - luck;
  - regional usage;
  - etc.
Despite some programmers seeing themselves as fully rational agents making cold decisions, we're just like everyone else.


> - marketing; - big companies using it;

These are the deciding factors.

If you look at which newish languages have gotten popular over the last few years, it was Rust, Kotlin, Swift, Go and Typescript.

Building a language and ecosystem around it takes a huge amount of resources, and often tedious work that doesn't happen if people aren't paid for it.

The street cred of "hey, large company X is using it, it must be good" is also very important.

(of course Swift and Kotlin are somewhat distinct, as the platform languages for iOS and Android)


> The street cred of "hey, large company X is using it, it must be good" is also very important.

Yes, and also, "large company X is spending lots of money on it, so they aren't just going to abandon it once it's no longer the newest, coolest thing."


If the "space industry" isn't big, I don't know what is.


Good suggestions, but also meh; e.g., the forward declaration requirement enables a single-pass compiler to emit code on the fly.

I have a much better list for things to add to C: Nothing. C isn't perfect, or nearly as good as it could be, but simply adding things onto C gets you C++.

Adjusting what sircmpwn says: in C you don't solve problems by adding features, but by writing more code in C.

I liked a Stack Overflow answer to a question on "how to write a generic function wrapper in Go", or something similar. Many suggestions involved reflection, but the author wanted something simpler, with varargs and without reflection. A comment simply said: "wrong language".

I'd rather adopt this position for some languages than add more and more to C3X. I already do away with things in C23, and don't want even more things added to C.

Making a stretch of OP's arguments: "look at all these cool things that C could do, and that D does!". Well, go on and use D, nothing wrong with that.

(BTW, I do write test targets for every file in my C projects, but I'm not so much into jogging).

Those things aren't that obvious, and I'd rather not have them added to C.

Wrong language.


> forward declaration requirement enables a single-pass compiler to emit code on-the-fly.

True, I know all about that. My Zortech C and C++ compiler was one pass (after the multiple preprocessing passes). The ground-up ImportC C compiler completed a couple of years ago has a separate parse pass.

So I well know the tradeoffs. The parser being stand-alone means it is much simpler to understand and unittest. I found no advantage to a single pass compiler. It isn't any faster.

> simply adding things onto C gets you C++

C++ doesn't allow forward declarations either.

Doing a parse-only pass on C code doesn't quite work. It turns out the grammar relies on a symbol table. Fortunately, only a symbol table of the typedefs. Once that was added in, ImportC worked. (I really tried to make it work without the typedef symbol table!)

C++ added a bunch more syntax that relies on the symbol table. I would not even try fixing it to work as parse-only.

> in C you don't solve problems by adding features, but by writing more code in C

The trouble with such sayings is that they're like following a Google map that says to cross this bridge, but wasn't updated with the news that the bridge is out.

> Those things aren't that obvious,

They are once you use another language that doesn't have those restrictions.

> and I'd rather not have them added to C.

C adds new things to the Standard all the time, like normalized Unicode identifiers, which are a complete waste of time. Every C compiler also adds a boatload of extensions, some good, some wacky, many ineptly documented, all incompatible with every other C compiler's extensions.


> The parser being stand-alone means it is much simpler to understand and unittest.

Stand-aloneness and single-passness are orthogonal.

> I found no advantage to a single pass compiler. It isn't any faster.

A gigantic advantage: a single-pass-compilable language is simpler. By definition.

Implementations may or may not be simpler or faster.

> C++ doesn't allow forward declarations either.

Well, that's not what I meant.

C++ is "C with just this thing" done way too many times.

> The trouble with such sayings is like following a google map that says cross this bridge, but wasn't updated with news that the bridge is out.

TBH, I didn't really get this. Is it about sticking to C as-is, even though it's outdated as-is?

C would be outdated if it didn't have, say, long long for 64-bit numbers. Having "true" be a keyword instead of a macro doesn't change how outdated it is or isn't, just like compile-time evaluation also doesn't.

> They are once you use another language that doesn't have those restrictions.

I have used many, and I still don't find them obvious.

> C adds new things all the time to the Standard, like normalized Unicode identifiers, which are a complete waste of time.

I agree that many/most are a waste of time, and shouldn't be added to C. The fact that C adds things to the standard all the time shouldn't justify adding even more; it should make one question whether those things are needed at all, and how to accomplish the goal without them.

> Every C compiler also adds a boatload of extensions, some good, some wacky, many ineptly documented, all incompatible with every other C compiler extensions.

I know about that, and my position is the same: just don't.

I don't use them either.


> A gigantic advantage: a single-pass-compilable language is simpler. By definition.

That's only "by definition" if you take a language that needs multiple passes, then remove the features that need multiple passes, and don't replace them with anything else to compensate.

The "by definition simpler" version of C would not only disallow forward references, it would have no forward declarations either. As-is, forward declarations add some complexity of their own.

(Also, if you can figure out a way to emit jump instructions in a single pass, you can probably figure out a way to call unknown functions in a single pass.)


Jump instructions are done in a single pass by creating a patch list; when the compilation is done, you walk the patch list and "fix them up".

Doing this with functions is a lot more difficult, because one cannot anticipate the argument types and return types, which downstream influence the code generation. Of course, early C would just assume such forward references took integer arguments and returned integers, but that has long since fallen by the wayside.


I have the impression you're mixing single-pass compilation and O(1) memory use of the compiler.

As is, C already is single-pass compilable, modulo some unnecessary syntax ambiguities.

As the compiler reads the text, it marks some character strings as tokens, these tokens are grouped into fragments of code, and some fragments of code are turned into machine code. A simple 100-line function doesn't need to be parsed to the end before the compiler can start emitting machine code.

Like the parser, this requires memory to keep tabs on information, and it doesn't work for all constructs, like a jump instruction to a label defined later in the function. The code emitter soaks up input until emission becomes possible, then does so, e.g. once the label is known and can be jumped to.


You cannot do any optimization when generating machine code that way. That's fine for a primitive compiler built for a school project, but not much else. (Even "no optimization" switch settings on a compiler do a lot of optimizations, because otherwise the code quality is execrable.)


> That's fine for a primitive compiler built for a school project, but not much else.

Not true.

On the one hand, just see how many non-compiled languages are used outside of primitive school projects.

On the other hand, this simpler approach is what enables writing actually fast compilers. Many modern compiled languages have compilers that take on the order of ~100ms on a simple file with 1k LoC, when they could (and arguably should) take on the order of ~1ms, IOW, imperceptible given the syscall overhead.

A 100x faster compiler that generates meh code is more useful 99% of the time: when one is recompiling all the time during development.


I have my own list of things that could "easily" be added to C, but I'd rather them not to be.


You get them anyway in the form of extensions.


Thanks, but no thanks.


    LISP doesn't really read well the bigger the codebase
[citation needed]


There is nothing wrong with using PEGs for SQL parsing, but this article (I didn't read the paper) presents flawed arguments:

- tech $X is from the 60s, therefore it is bad and/or outdated: one doesn't need to "disrupt" or innovate in everything to be modern. There are plenty of things from the 60s that still don't have a better replacement, and it's OK to keep using them.

- "YACC-style parsers" clumps together parsers that are generated at compile-time, from declarative grammars, using LALR(1). But that's not inherent to the technique or algorithm: a parser can be LALR(1) from a declarative grammar and still be extensible at run-time, or provide LL(1) alongside, or be built from statements instead of a grammar. There's nothing wrong with using PEGs over "YACC-style" parsers, but not for these distorted reasons.


I'm not sure that was their position. They're not saying tech from the 60s was inherently bad. They specifically mentioned that we today are not constrained by the same hardware restrictions that gave rise to the software in the 1960s. Those are two very different positions.

For example, I like Rust. But if Rust had been introduced as-is fifty years ago, no one would have used it, because the hardware needed to make Rust compilation practical simply didn't exist yet. Taking a week just to compile "hello world" would have been a nonstarter. Not because Rust is bad, but because the hardware of the time ruled something like it out.

2024 is not 1964 however, and it's always good to re-examine old assumptions.


No key-value dictionaries needed, just pointers.

A closure with no behaviour is just a pointer to the enclosed variable. A closure over 2 pointers is a pair, from which you can get the car and cdr.

The runtime needs to make the pointee available outside its defining scope, hence escape analysis, garbage collection, etc. But no dictionary is needed.


Zero accessibility =/

As great as it may be, this isn't something I'd pick, as it'd be unusable in these cases.


Untrue. One of the changes mentioned is that 1.4 supports scaling the interface with ctrl/+/-/0 (like in your browser). This is a huge accessibility win, the more so because it is now enabled by default for every FLTK program, actually putting it ahead of other toolkits.

Maybe you're thinking of screen-reader support for the blind though.


Judging by a sibling comment to yours... it seems a lot of people don't think of supporting blind users.

(I'm not blind, at least yet, but it is a pretty sad state of affairs.)


> Zero accessibility =/

You should see Windows (10, 11) or Android. Gray on gray, no scrollbars, messages using half the screen, buttons you didn't ask for (why do I need a microphone icon when the app has no microphone permission?).

