
I understand that dealing with complaints is annoying, but the response in the article was very unprofessional. Feel free to say what the change is, why it is there, and perhaps even address some of the concerns. But attacking users, even if it is a small segment of the population, does not paint The Document Foundation in a positive light.

The response is fine.

Entitlement, and frankly some of this crosses the line into bullying of the foundation and the maintainers, should be dealt with robustly. It will help to reset expectations about what's reasonable in the relationship between those developing LibreOffice and the community of users.

People need to recognise that they get a huge amount of value out of LibreOffice, for which they aren't required to pay a penny, so it's not unreasonable to be asked if they would like to contribute something back in return.

But amongst large populations of people, when it comes to free things, some portion of that population will always undervalue the free thing, fail to recognise how much benefit they get from it, and start acting entitled. There's nothing wrong with calling that out.


It doesn't look professional when you lose your temper, and this article is comparable to that.

When reading it, I kept thinking that the writer was too emotionally involved.


> It doesn't look professional when you lose your temper, and this article is comparable to that.

Nobody's lost their temper. In no world does the article read like anyone has. That's you applying your own interpretive lens to the text, not what the text actually says.

(But alienating the troublesome portions of their userbase might actually help them and the LibreOffice community over the longer term. Cf. firing customers.)


Bashing the media for having done their job badly

> Media coverage has largely omitted the fact that LibreOffice has been displaying donation requests for years.

Throwing Thunderbird under the bus

> Nobody is making the comparison with Mozilla Thunderbird, which has asked its users for donations practically every time it starts up, with clearly visible banners

And then Wikipedia

> The same logic applies to Wikipedia.

Replying to 'comments'

> Some comments have even suggested

C'mon, don't tell me it's professional; it looks amateurish.

First rule: you don't give out names.

Second rule: you don't push the fault onto others, even when it is theirs.

Third rule: you don't reply to 'comments', 'tweets', and so on. You say 'we heard feedback that this and this'.

I'll say it again: it feels like it was written by a guy alone, with no supervision whatsoever, who didn't take the necessary step back.


> Bashing the media for having done their job badly

Making a statement of fact about media coverage isn't "bashing". And when you start off your argument by characterising it that way, you've already lost.


The article is professional.

I agree. What's the alternative? Not only are they not allowed to ask for donations, but now they're not allowed to complain about entitlement?

How ridiculous would that be: users who get stuff for free setting the rules for how they should receive the free stuff.


Absurd take. The response was completely measured, and even if it wasn't, The Document Foundation has no obligation, legal or moral, to present as professional. They are not a business.

And I think it is fair to acknowledge that Linux doesn't fit the needs of all people. The thing is, the flip side is also true. While I can pick up my (admittedly technical) hobbies under Windows, it is more convenient under Linux. Without the FLOSS ecosystem, I could not afford to do so at all.

That's true. I run almost everything under Linux. All my daily-driver and work-related desktop systems have been Linux for more than two decades now. Heck, we don't have any Windows machines used for work in the datacenter. However, I wanted to highlight that Linux is not "there" yet, and saying "just use Linux, duh" doesn't solve all the problems a user has.

For photography and graphic arts, Linux can handle much if not most of the work (I use digiKam and Darktable with great success, for example), yet when it comes to audio, for example, it falls short due to a thousand papercuts.


And if you are a professional photographer, Darktable falls short by a thousand cuts. It is not even close to Lightroom, let alone Photoshop.

I'm not a professional photographer, though. I'm not a professional musician, either.

Yet, Darktable allows me to process my RAWs to a point which I like. Similarly, my audio equipment allows me to create some music which I like, too.

I didn't push Darktable to professional levels, but I believe it can match bigger tools for what I want to do with it. I don't do photo manipulation, for example. Just process RAWs. I expect the same from my audio equipment for my music endeavors.


That's the thing, not everyone is a professional photographer. Open source tools are fine for many of us. They are also great to get a taste of a field, to learn the basics, without a massive investment.

You don't have to be everything to everyone. You just have to satisfy a need.


Been there, done that, and all without the benefit of a home Internet connection. I also created a couple of scripts that I could run on my desktop computer to install new software or update my operating system. After running the scripts on my computer, I would wander over to the library with nothing more than my phone to download the packages, grabbing some videos to watch offline while I was at it.
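
For the curious, here's a minimal sketch of what one of those scripts might have looked like, assuming a Debian-based system; the package name and output file are illustrative, not what I actually used:

    # Sketch: list the URLs of the .deb files an install would fetch,
    # without fetching anything. Carry the list to a free connection,
    # download the files there, then install at home with: sudo dpkg -i *.deb
    import subprocess

    def package_urls(package: str) -> list[str]:
        # "apt-get --print-uris" prints one line per file it would download:
        # 'http://...deb' filename size checksum
        out = subprocess.run(
            ["apt-get", "install", "--print-uris", "-qq", package],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line.split()[0].strip("'") for line in out.splitlines() if line]

    if __name__ == "__main__":
        with open("to-download.txt", "w") as f:
            f.write("\n".join(package_urls("inkscape")))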

The issue isn't really living with 500 MB/month of data. For most people, the first hurdle will simply be knowing that it can be done. The next will be having the technical ability to actually do it. Then, once they've managed all of that, the question remains: will they be interested in the stripped-down Internet? A lot of us who frequent HN may be, since the results will still reflect our interests. There are people on IRC whom we would want to talk to. There is a slant towards tech sites with RSS. And so on. That isn't going to be reflected in sites targeted at a general audience.


The syntax of languages like Lisp and Forth is so fundamentally different that they don't need an explicit statement separator. You don't have to think about many other things either, or I should say you don't have to think about them in the same way. Consider how much simpler the order of operations is in those languages: in Lisp, (+ 1 (* 2 3)) makes the evaluation order explicit, and Forth's postfix 2 3 * 1 + does the same, so there is no precedence table to memorize.

Things I noticed right off the bat: framing it as a tourists-versus-locals issue, a complete lack of numbers backing that claim, and the few numbers presented in the article lacking any context. I realize this is a travel article, but it seems to be more of a propaganda piece.

Take the claim that the locals hate the changes. Well, the mayor was reelected. So they claim the voter turnout was low and people were complaining, so people obviously don't support it. Sorry, you can't draw that conclusion. Under ordinary circumstances, even 100% turnout would only tell you the overall support for a particular candidate or party, not for a particular policy. A low turnout may reflect an electorate that is not particularly passionate about any of the issues presented in the election, or it may mean something else. It was probably something else in the 2020 elections, because those were anything but ordinary: they fell during the peak of pandemic uncertainty (i.e. March to June). So it's a flimsy assertion based upon flimsy evidence.

Then there are the scanty numbers without context: a 4% increase in traffic jams since 2015 and a 31% decline in bus use between 2018 and 2024. First of all, the phrase "bus use" sounds highly selective. It looks like the Paris metro has been expanding and modernizing rapidly in recent years, which would both take load off of buses and be disruptive to transit users. Oh, and that pandemic thing raises its head again. I don't know about Paris, but a lot of cities took a hit to transit ridership during the pandemic, and some are claiming to reach pre-pandemic levels only now. Also, cyclists tend to be the whipping boy for traffic congestion. I can't speak for Paris, but the reality in my parts is that population growth and a surge in construction have been far more disruptive than cycling infrastructure.

Sorry about the rant, but I'm sick and tired of the views of one segment of the population completely overriding the views of another segment of the population ... especially when the assertions are based upon assumptions and flimsy evidence.


The thing is, any generation can say something similar. Just look at the article: it manages to produce and describe the creation of a simple spreadsheet, yet the code and accompanying description would only fill a small pamphlet.

There are various reasons for that, and those reasons extend beyond leaving out vital functionality. While C, archaic as it is by our standards, existed at the time VisiCalc was developed, VisiCalc itself was programmed in assembly language. It pretty much had to be, simply to hold the program and a reasonable amount of data in memory. That, in turn, meant understanding the machine: what the processor was capable of, the particular computer's memory map, how to interface with the various peripherals. You sure weren't going to be reaching for a library like curses. While curses, like C, existed by the time of VisiCalc's release, it was the domain of minicomputers.

I mean, can the current generation truly understand the craft when the hard work is being done by compilers and libraries?


I almost entirely agree with the author's assessment of new technology. Yet that statement rubbed me the wrong way.

Sometimes it is better to get into things early because they will grow more complex as time goes on; it is easier to pick something up early in its development. Consider the Web. In the early days, it was just HTML. That was easy to learn. From there on, it was simply a matter of picking up new skills as the environment changed. I'm not sure how I would deal with picking up web development if I started today.


This isn't a good example: people were completing 6-month bootcamps and getting $100k offers to do web development not too long ago, decades after the web and HTML took off. After a few years they were making as much as anyone who learned HTML and Web 1.0 back in the 90s.

Are the bootcampers better developers? Probably not. But they were still employable and paid about the same.


I think this applies a bit less to the AI sphere, which has the purported goal of making things easier and more automated over time. 90% of the time, if you have an AI question, you can just... ask the LLM itself.

Remember all the hoopla over how people needed to be a "prompt engineer" a couple of years back? A lot of that alchemy is now basically obsolete.

Think about the hoops you had to jump through with early GenAI diffusion models: tons of positive prompt suffixes (“4K, OCTANE RENDER, HYPERREALISTIC TURBO HD FINAL CHALLENGERS SPECIAL EDITION”) bordering on magical incantations, samplers (Euler vs. DPM), latent upscalers, CFG scales, denoising strengths for img2img, masking workflows, etc.
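
To make that concrete, here is a rough sketch of the old knob-twiddling in code, using the Hugging Face diffusers library; the model id and every parameter value here are arbitrary examples, not recommendations:

    # Illustrative only: old-style text-to-image generation with diffusers.
    import torch
    from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    # Swapping samplers (Euler vs. DPM) meant swapping scheduler classes:
    pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)

    image = pipe(
        # The magic-incantation suffixes went here:
        prompt="a lighthouse at dusk, 4k, octane render, hyperrealistic",
        negative_prompt="blurry, low quality, extra fingers",
        guidance_scale=7.5,        # the CFG scale
        num_inference_steps=30,
    ).images[0]
    image.save("lighthouse.png")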

And now? The vast majority of people can mostly just describe the desired image in natural language, and any decent SOTA model can handle the vast majority of use cases (gpt-image-1.5, Seedream 4, Nano-banana).

Even when you're running things locally, it's still significantly easier than it was a few years ago, with options like Flux and Qwen that can handle natural language, along with a nice intuitive frontend such as InvokeAI instead of the heavily node-based ComfyUI (which I still love, but understand is not for everybody).


Things will never be easier long term; you will just be expected to get more done. And if you don't spend time learning the tools, you will get less done than your competition.

The goal posts will move.


"It will grow more complex" is never a good reason to get into things early. It's just your mind playing FOMO tricks on you.

Many developers who picked up the web in the early years struggle with (front-end) web development today. It doesn't matter if they fetched jQuery or MooTools from some CDN as it was done in the mid-00s. Once the tooling became too complicated and ever-changing, they couldn't keep up as front-end dilettantes. It required committing as professionals.

If you started today, you'd simply learn the hard way, as it's always been done: get a few books or register for a course. Carve out some time every day for theory and practice. All the while, prioritize what matters most to get stuff done quickly right now, with little fluff. You will not learn Grunt, Bower, and a large array of historic tech. You'll go straight for what's relevant today. That applies to abstractions, frameworks, and tooling, but also to the fundamentals. You'll probably learn ES6+ and TS, not JS WAT. A lot of the early stuff seems like an utter waste of time in retrospect.

This is true for all tech. If you knew nothing about LLMs by the end of this year, you could find a course that teaches you all the latest relevant tricks in 5 to 10 hours for 10 bucks.


No, this thread and sub-discussion are specifically about early web fundamentals. The web is special in this sense: it's intentionally long-lived, warts and all. So the fundamentals pay outsized dividends. The Rube Goldberg machine that is modern JS dev still spits out an index.html result.

Being a good professional developer means not getting the primitives and the data model horribly pointed in the wrong direction. So it's extremely helpful to be aware of those primitives. And the argument "nobody is better off knowing assembly as a primitive" doesn't hold because, as said, the web is literally still HTML files. It's right there in the source.


The discussion is centered around the idea that "adopting early" provides some future-proofing in a rapidly evolving (and largely non-standard) terrain. I share the FA's position that it does not.

> The web is special in this sense: it's intentionally long-lived, warts and all. So the fundamentals pay outsized dividends.

Fundamentals pay dividends, but what makes you think that what you learn as an early adopter are fundamentals? Fundamentals are knowledge that is deemed timeless, not "just discovered".

The historical web and its simplicity are as available to anyone today as they were back then. People can still learn HTML today and make table-based layouts. HTML is still HTML, whether you learned it then or today. But if back then you intended to become a professional front-end developer, you would still have had to contend with the tremendous difficulties that some seem to have forgotten out of nostalgia. You'd soon have had to learn CSS in its early and buggy drafts, then (mostly non-standard) JavaScript (Netscape and IE6) and the multiple browser bugs that required all kinds of hacks and shims. Then you'd have had to keep up with the cycles of changing front-end tools and practices, as efforts to put some sense into the madness moved there. Much of that knowledge went nowhere, since it was not part of a progression but rather a set of competing cycles.

Fundamentals are indisputably relevant, but they're the knowledge that emerges victorious after all the fluff of uncertainty has been left behind. Front-end development is only now settling into that phase. With LLMs, we're still figuring out where we're going.


This sounds exactly right. I'm someone who learned the web back when IE6 was something we wished everyone was on, and also someone who learned the fundamentals of the web and CS in general well enough to try writing a book to teach everyone else.

Picking up the web early didn't help with the latter. I spent most of my early time memorizing tips and tricks that only applied to old browsers. I didn't pick up the fundamentals until I went back to school for CS and took a networking class.


Web fundamentals and web development fundamentals are different.

How HTML, CSS, and JavaScript come together was extremely relevant to developers 20 years ago and still is today.

I do support and agree with the parent comment (see the discussion), but getting into web development when it was raw and open paid dividends for me. Today's ecosystem is opaque in comparison. You don't think there's more friction today?


HTML, CSS, and JavaScript are just a small subset of web development.

And yes, understanding them is still relevant. But when I started, I was spending more time memorizing the quirks of IE6 than I was learning how JavaScript, CSS, and HTML come together.

I think if you start directly in React you don't learn the layer below it, sure. But there's no reason you have to start by learning React; there's nothing inherent about starting today that forces you to. You could start by building a static webpage. And if you did that, it would be easier and more fundamental than doing the same thing 20 years ago, because you can ignore most of the non-standard browser quirks.


Good points and thoughtful reply.

You're right, fundamentals are distilled, so to think they are free just by getting in early is likely backwards. And earning one's professional chops doesn't stop or start based on when you enter.

Web dev definitely is nostalgic. I miss the early days, but I also conveniently erased IE6, binding data to HTML, and the need for Backbone and jQuery to do anything. Hmm, yeah, it doesn't matter when you start; it's all a grind if you dig deep enough.


> I also conveniently erased IE6

Also known as PTSD-induced amnesia, haha. We all tried to forget.


> Once the tooling became too complicated and ever-changing, they couldn't keep up as front-end dilettantes. It required committing as professionals.

The best professionals did not fall for the insanity of the modern front-end and continued hacking shit without that insanity.

> You will not learn Grunt, Bower, and a large array of historic tech. You'll go straight for what's relevant today.

which will be outdated "tomorrow", just like Grunt/Bower are looked at today

> A lot of the early stuff seems like an utter waste of time in retrospect.

This couldn't be further from the truth. If you learned JavaScript early, like really learned it, that mastery gets you far today. The best front-end devs I know are basically JavaScript developers; everything else is "tech du jour" that comes and goes, and the less of it you invest in, the better off you'll be in the long run.

> If you knew nothing about LLMs by the end of this year, you could find a course that teaches you all the latest relevant tricks in 5 to 10 hours for 10 bucks.

Hard disagree with this, unless you are doing simple CRUD-like stuff.


> The best professionals did not fall for the insanity of the modern front-end and continued hacking shit without that insanity.

"Front-end professional" and "no tooling" have been exclusive propositions since the early 2010s. You either learned to use tools or you were out of the loop.

> which will be outdated "tomorrow", just like Grunt/Bower are looked at today

Not really. Historically, the main problem with front-end development has not been change, but the pace of it. That's how it ties in with the current discussion regarding the (now) ever-changing terrain of LLM-assisted coding. Front-end development is still changing today, but it's coalescing and congealing more than it's revolving. The chasms between transitions are narrowing. If you observe how long Webpack lasted and how familiarity with it carried over to using Vite, it's somewhat safe to expect that the latter will last even longer and that its replacement will be a near copy. Someone putting in time to learn front-end skills today might reap the benefits of that investment longer.

> If you learned JavaScript early, like really learned it, that mastery gets you far today.

I did. I got a copy of the Rhino book, 4th ed., and read it cover to cover. I would not advise learning JS today from historical references. JS was not designed like most other languages. It was hastily put together to get things done, and it had a lot of "interesting", but ultimately undesirable, artifacts. It only slowly turned into a more sensible standard after the fact. Yes, there are some parts that are still in its core identity, but a lot in the implementation has changed. Efforts like "JavaScript: The Good Parts", further standardization, and TS helped to slowly turn it into what we know today. You don't need to travel back in time for that mastery. Get a modern copy of the Rhino book and you'll be as good as the best of them.


Yeah, I still get use out of XMLHttpRequest to this day; good thing I got in early, and variable hoisting isn't gonna get me! /s

A lot of snark aside, there's a bit of a false dichotomy (I think) at work here. Whenever or wherever your jumping-in point into $something is, it will always pay dividends to learn the fundamentals of that $something well, and unless you interact with older iterations of that $something, you'll never have to bother learning the equivalent of Grunt, Gulp, Stylus, Nunjucks, and so on for that $something.

That being said, it's also good to set aside time once a year to check out a good, recommended (and usually paid) course from an established professional, aimed at busy professionals.

As for LLMs, I feel they're slowly becoming a big enough thing that people will have to consider where to focus their energy, starting in 2027. Kinda like how some people branched from web development into backend, frontend, and UI/UX a good while back. Do you want to get good at using Claude Code, or do you want to integrate gen AI features at work for coworkers or customers/users to use? It's still early days, just like when Node.js started gaining a lot of traction and people were making fun of left-pad.


And yet, at some point most web developers will have picked it up after the "raw HTML" era; that point has probably come, even.

The web/HTML is a great analogy. I too am in no rush to be hyper-effective with LLMs. In fact, I want to deliberately slow down, because AI-native coding is so exhausting.

That said, your point about the leverage of learning HTML and the web in the early days compared to now rings true. Pre-compiled isomorphic TypeScript apps are completely unrecognizable from the early days of index.html.


Faster than OS/2, sure. Now try to download a file in the background while doing work in the foreground. You would be lucky if your Windows 3.1 communications application could complete it without multiple retries.

The two operating systems were trying to solve different problems, and had different system requirements because of that. Windows 3.1 was fine for running multiple interactive applications, since none of the applications would be doing real work in the background. When Windows 95 entered the picture, that changed, and its system requirements weren't all that different from OS/2's.

And that is just one example. Windows 3.1 didn't provide much in the way of memory protection. (From my recollection, it could detect a memory access violation. At that point it would blue screen.) One of OS/2's most noteworthy features was memory protection. All of a sudden, you could use your computer for an entire day without losing work to crashes. Yeah, OS/2 would happily terminate an application (rather than the OS) when there was a memory access violation. On the other hand, that made it much easier for developers to detect and address such bugs.

On that last point: I have fond memories of bringing OS/2 boot disks to my high school programming classes after the upgrade to Windows 95 (and, when they started refusing to let me boot OS/2, they let me use the NT server). There was a world of difference between programming under OS/2 or NT versus Windows 95. No one bothered to try programming under Windows 3.1!


> No one bothered to try programming under Windows 3.1!

VB and Windows 3.11 paid for my first home. I wouldn't enjoy programming in C on Windows, though, and, IIRC, it was a while before Microsoft's C tooling got a Windows version.


I should have said: no one in my programming class bothered to program under Windows 3.1. Clearly there were people out there writing software for Windows 3.1, and I have heard that Visual Basic was a nice development platform for it.

I'm not sure what the C situation was like for Windows 3.1. I did have Borland C++ and fiddled around with its Windows IDE a bit, but I don't recall ever making any progress.


While we're at it, let's stop bundling UEFI.

I'm being slightly absurd here since you need some sort of firmware to simply start up the computer and install an operating system, but here is my point: to most people, the operating system is part of the computer. The computer is simply an expensive brick without it. On top of that, a lot of the negativity towards bundling Windows originates from Microsoft's past monopolistic practices[1]. We certainly don't hear many people criticizing the bundling of macOS or iOS on Apple products or Android/Chrome OS on Android devices or Chromebooks. (There may be people who want to load alternative operating systems on these devices, but that is different from criticizing the bundling of the OS.)

[1] Is Microsoft forcing hardware vendors to install Windows even a thing these days?


> Society has become quite 'entitled' to 'free' things. As popular as they are, torrents and free streams and emulation and clones of games in an open source lib are all stealing something. I know thats an unpopular thing to say but it a fact.

Emulators and game engine clones may encourage "stealing", but they are also unique creations. The people who develop said software are typically careful about keeping their software separate from copyrighted materials. In the case of OpenTTD, they did so by creating their own graphics and sound assets to accompany the game engine.

If you are claiming that creating an independent clone of the game engine is stealing, you are entitled to your own opinion. But do understand that it is an opinion and not a fact.

I would also ask you to consider the consequences if that opinion were codified into law. It would make all forms of progress (e.g. literary and technical) nearly impossible since nearly all ideas are derivative. To give an example: the computing landscape would be very different. IBM compatibles would not have been a thing, leaving the market either fragmented or consolidated in the hands of a single company. Oh sure, there were companies that did steal by producing verbatim copies of the IBM PC ROM or the mainboard layout ... but we are talking about a reimplementation in the case of IBM compatibles and OpenTTD, not copies.

Come to think of it, the entire computer industry would have been set back by decades with an excessively strong IP regime. No one seriously classifies the ABC as the first computer, yet the courts used it to strike down patents on early computers. In the early days, IBM played games with IP licensing to try to restrict their competition, something the courts shot down. AT&T didn't give away Unix, nor did they license patents on transistors, out of the goodness of their heart. They did so because regulators and the courts recognized that IP could be used to stifle competition (and, by extension, to inhibit progress). So I doubt that the courts would agree that emulation or game engine clones are stealing, either.


Ehh, it's not theft, it's copyright infringement. And in the case of OpenTTD, it's not the engine that has the legal problem, it's the graphics and the sound. OpenTTD is an engine for playing Transport Tycoon content. If OpenTTD distributes Transport Tycoon's graphics and sound (which they were), they are infringing on the owner's right to distribute.

Update: I got OpenTTD confused with OpenRCT. It looks like OpenTTD did redo the graphics and sound, so I think the parent post is correct and Atari has little to no legal ground to stand on. The only thing they could reasonably claim is trademark; that is, it is the use of the name Transport Tycoon that is the problem. And that is still not theft; it is trademark infringement.

At this point I would like to plug Simutrans, a Transport Tycoon clone that actually took the effort to make its own graphics and sound. But really, as much as I enjoy Simutrans, any normal Transport Tycoon connoisseur will hate it; it's a bit too different and clunky for them.

https://www.simutrans.com, but Steam is probably the easiest way to play.


> (which they were)

1) No, that is not the case; stop the FUD.

2) Simutrans is half proprietary, and that plug is a good attempt at spam, dear friend.

3) OK, fair, you corrected yourself. But for OpenTTD, OpenSFX and the rest are actually a way to create graphics and sounds that are both compatible with the existing mods and stand out on their own, kinda like FreeDoom: it's obviously made to be compatible with the Doom assets for walls and the like, but the artwork is closer to a modern HL than to Doom. FreeDoom needs to be like a weird Doom in a parallel universe for floors, walls, and the like, because PWADs demand it, so the art looks compatible (texturing, tiling, lighting) while not being an obvious Doom rip-off. And yet it works: I played lots of classic Doom 2 compatible PWADs and TCs, and the FreeDoom assets blend in perfectly in-game. Strain.wad looks even greater.

Look: https://freedoom.github.io/

