Hacker News | binary_ninja's comments

My humble perspective, having gone through it, is that young men find it very hard to find their value. Once you're past your teenage years you no longer get the "kid's pity" and unconditional love. You become only as useful as the things you do, and what can you do as a young man? Especially if you haven't grown up with a supportive father, you end up in a limbo where you're not yet a man but nobody really tells you how to be one. Equally, women generally don't find men who are still trying to figure things out very attractive. It's easier, then, to feel good about yourself once you're in your 30s and have been lucky enough to figure things out by yourself. Then and only then do you suddenly become "useful", attractive to women and respected by other men. Getting through the first stage without falling into suicide or hedonism is the hardest part.


Another problem, (at least here), is what happens if you "overcorrect" with regard to hedonism.

If you completely shun partying/clubbing/bars, casual sex, TV/social media addiction, etc... in favor of a more virtuous way of life (perhaps filled with more intellectual endeavors; without neglecting the body either, I should mention), you gain a superpower known as "invisibility from the female sight". And if you dare have standards about said opposite sex, $DEITY help you.

The state of things is really dire, let me tell you.


You can meet women at church... but that's another can of worms.


Can you please confirm that "under the couch" is a literal statement? If so, could you please provide further details? I am extremely curious.


“Under the couch” is a literal statement. This is a modern, not traditional couch, so its legs are raised maybe 10 inches? Gives some room for access and passive cooling.


Doesn't it get loads of dust?


Getting it dusted and vacuumed was the part of the deal I have with my wife regarding the existence of this server :-)


Why do you think 16GB is not enough for a home server? I am currently running a NAS with SMB, NFS, Nextcloud, LDAP, mail server, Roundcube, Uptime Kuma and a couple of websites, and using around 1GB of RAM. Just curious what other people are doing.


Probably stuff it to the brim with virtual machines, that kind of usage


Well, yes, if you go full enterprise with 6 VMs and 30 Docker containers I can understand how you'd aim for 64GB of RAM, but then why would you want to use a fanless mini PC for it?


Fanless, mini and 64 GB of RAM in a single device is going to be a tough challenge anyway.


I think Supermicro makes such edge devices. But then we're talking about a different category.


Interesting!

https://www.ahead-it.eu/en/shop/servers/embedded-iot/intel-a...

Those?

Technically they are fanless, but I wonder how well they would work outside of a rack with high ambient temperatures; they seem to rely on having plenty of air pulled across the casing. Mounting them vertically might work. 850 euro though!


Yep. I think they would throttle under heavy sustained load. But I don’t think these are for sustained load or home servers. Probably designed for places like traffic cams where maybe large image models need ram and prefer no moving parts that can fail. Even places like factory floors.


If somebody wrote a book about a civilisation that managed to put a foot on the moon and create an artificial intelligence that can write code but still struggles with printing, you could easily categorise it as a comedy.


Is it just me, or was there a certain passive-aggressiveness in that Oracle article?


Sheesh, that's not passive ...

"Finally, to IBM, here’s a big idea for you. You say that you don’t want to pay all those RHEL developers? Here’s how you can save money: just pull from us. Become a downstream distributor of Oracle Linux. We will happily take on the burden."


Don't get me wrong, I am very happy with this announcement. However, it bothers me slightly when I see this kind of announcement with the tone of "we love open source and we want to give this to the community" when the rationale probably was "RHEL is going to lose a bunch of customers and we can profit off of that". If it was a non-profit they might convince me, but as a Linux-for-enterprise kind of company I am not fully convinced by the motivations.


Likewise it grinds my gears when people think that a company should just do things to do them. At the end of the day they need to make a profit to be sustainable, to pay employees, to perform R&D, keep the lights on, etc. Altruism isn't enough on its own.

Why not both? Capitalizing on customers who need a solution while also helping the community? Not preparing for an influx of new business is a foolish business decision.


That is the freemium model and more or less the model that I thought the open source community had long ago agreed was acceptable. Open source development for free is cool with what spare time you have, but at a certain point you have to pay your own bills to keep a roof over your head, food in your belly, maintain your health, and actually be able to enjoy your life. Value added features and support have long been the way open source companies have been able to do that.


Not sure if the comment was addressed specifically to mine. If it was: I am OK with companies trying to make money, that's what they are for. It was more about how they phrase things. If you are a company that truly has open source as a driver, then make whatever claims you wish, however if your company's main driver is revenue (as SUSE likely is) don't make business announcements with "WE LOVE OPEN SOURCE" as your main rationale, just say "there's an opportunity for our business to grow" or any other corporate wording.


When you see Oracle making similar claims, you just KNOW that the reason is money, not any sort of care for Open Source.


Awk has always been a language that I loved but have struggled to use beyond quick text-parsing jobs. I understand it is meant to be used for exactly that, but the fact that it is simple, fast and lightweight sometimes makes me want to do more with it. When I start trying to do something besides parsing text, though, I find that it becomes awkward (pun intended?).


> but the fact that it is simple, fast and lightweight

I see awk as a DSL to be honest. Yes, it can be used as a general purpose language, but that quickly becomes, as you say, awkward :D

Like many DSLs, it is simple, fast and lightweight as long as it is used for its intended purpose. Once you start using it for something else, these advantages evaporate pretty quickly, because then you essentially have to work around the DSL design to get it to do what you want.


DSL == Domain Specific Language?


Yes


One simple thing I do with awk is to create a command processor: read one line at a time and do things on my data as a response. This is very useful because you can make your command as powerful as needed and call other unix tools as a result.
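For instance, a minimal sketch of what I mean (hypothetical commands, just to illustrate the read-a-line, dispatch-on-the-first-word pattern):

```shell
# Tiny awk "command processor": read commands from stdin, one per line,
# and dispatch on the first word. Real versions would shell out to other
# unix tools inside the action blocks.
printf 'upper hello\nsum 2 3\nquit\n' | awk '
$1 == "upper" { print toupper($2); next }   # transform an argument
$1 == "sum"   { print $2 + $3; next }       # arithmetic on the data
$1 == "quit"  { exit }
              { print "unknown command:", $1 }
'
```

Each new command is just another pattern/action pair, which is why it scales so nicely for interactive data munging.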


Do you have an example of this that is available somewhere?


I find it pretty nice for writing simple preprocessors. For example I have one which takes anything between two marker lines and pipes it through a command (one invocation per block). Awk has an amazing pipe operator which lets you do something like this:

    ... {
        print $0 | "command"
    }
"command" is executed once, and the pipe is kept open until closed explicitly by close("command"), at which point the next invocation will execute it again. The command string itself acts as a key for the pipe file descriptor.
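A runnable one-liner showing the same behaviour (my own toy example, not from the preprocessor above):

```shell
# Both print statements feed one long-running "sort" process; its output
# appears only after close("sort") in the END block shuts the pipe.
printf 'banana\napple\n' | awk '{ print $0 | "sort" } END { close("sort") }'
```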

And of course, no mention of awk is complete without the "uniq" implementation, which beats the coreutils uniq in every way possible (by supporting arbitrary expressions as keys and not requiring sorted input):

    !a[$0]++
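To illustrate the "arbitrary expressions as keys" part, here is the same idiom keyed on the second field instead of the whole line (toy input, obviously):

```shell
# Only the first line for each distinct value of $2 survives.
printf 'a x\nb x\nc y\n' | awk '!seen[$2]++'
```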


I had no idea about this "keep the pipe open" behaviour. I thought it would spawn the binary on every print statement and thus didn't consider it in the past. But now...


This is exactly why I moved from AWK to Perl for these quick jobs a couple of years ago. If you stick to an AWK-like subset, Perl is also simple, fast and lightweight. If you want to grow your scripts (and you have a lot of discipline) Perl – in contrast to AWK – gives you enough noose to hang^W^W^W^Wthe tools you need.


Perl? Wow. Is that better than bash, python or even nodejs? Why write Perl over these? Serious question; I was propagandized to hate Perl.


I write bash, python and nodejs all day, and have no professional history with Perl.

One day while avoiding working on something important, I spent half a day learning Perl in order to implement something related to a build tool that was being used in the important thing I was avoiding.

I was blown away. It's a really delightful language. Its big downfall is that it makes it feel good to do something "clever."

Perl is a joy to write, and a devil to read. I liked it, and wish I had started my career earlier so I could have enjoyed Perl in its heyday.

I have similar feelings about Ruby.


You need to make sure that you write the clever bits clearly. Maybe add a comment. It takes some discipline, but isn't hard.

In fact, Perl remains remarkably robust if you stack clever tricks on top of each other.


The same shortcut syntax that people complain about does make perl really handy for one-time tasks where you're iterating on ideas. Lots of features there that make that easy. One example:

  #!/usr/bin/perl
  while (<>) {
      # various processing here
      # $ARGV is set to either "-" for piped input, or the current filename
      # $_ is the data of the current line
  } 
That (<>) construct accepts data from stdin, redirection or file(s) named as arguments and iterates over the data. There's lots of things like that throughout the language.


And you can avoid even that minor boilerplate with the -n or -p flag. It even supports BEGIN and END like awk.


> Perl? Wow. Is that better than bash, python or even nodejs? Why write in Perl over these?

It depends on scale.

If you have some quick parsing to do, then awk will get you started quickly, but as you expand your experimentation on what you want to extract/manipulate, it may not be easy to add onto the awk beginnings of your "one liner".

But if you start with awk-like† syntax but invoking it with Perl, then if you find you have to expand, Perl has more elbow room.

The intention is not to 'go big', which those other languages may be better at, but to more easily 'start small'.

† IIRC, Larry Wall wanted a utility that had awk/(s)ed-like syntax for text manipulation, just 'with more'.


Have you ever tried to dig a hole? What tool did you use?

- Want to cut through and move loam, compost, sandy, and compacted soil? You're gonna want a rounded shovel.

- Want to break up rocky, clay soil? A pick mattock will penetrate deep, breaking up soil, shattering smaller rocks, and is used as a lever to uproot. A tiller is a faster method but disturbs the soil more.

- Want to dig a narrow, deep hole? An auger will quickly break up rocks and soil in a shaft and move them upwards.

What do you use the Perl tool for?

- Quickly and efficiently open files, read line by line, analyze text, and perform any kind of operation you can think of, with complex data structures, objects and modular code, using very few lines of code.

- Executing external commands with a shell, returning their output, and making complex yet short programs easily with arguments to the interpreter from a command line.


Perl can do sh/awk/sed and a bunch more at once.


Absolutely. It is comparable to python in some ways, but makes it much easier to write quick one-liners using regexes and data manipulation, and to scale those up to real programs. It fills the gap between bash scripts using awk, grep and sed, and C/java/C#. Compared to bash scripting, perl is a real programming language. The documentation and library ecosystem are excellent, backwards compatibility is legendary, yet it supports modern Unicode. The syntax is weird, but try it for a bit, read the man pages, it's not that hard. The OO system is weirder, and I wouldn't make complex class hierarchies in it, but it is usable.


I like how Awk is just a single executable. A single-executable Perl that includes only the core library would be great. There is Microperl [0, 1], but no idea how well it compiles with more up-to-date Perl versions.

0: https://github.com/bentxt/microperl-standalone

1: Original article from 2000 by the author Simon Cozens: https://www.foo.be/docs/tpj/issues/vol5_3/tpj0503-0003.html


Perl better? Maybe, maybe not.

It can be very useful, and Perl scripts are pretty robust. I have often found Perl scripts running for years and years without issues at different companies.

My main issue with Perl scripts is that they often are not "readable" by anybody but the original creator, who of course has left the company. (Not a fault of Perl itself, though.)

But your mileage may vary, and any script can be made (un)readable.


I've always found it weird that people bash on Perl relentlessly for being hard to read and then turn around and praise Rust's syntax when it is full of stuff like this:

    fn print_d(t: &'static impl Display) {


>> My main issue with Perl-scripts is that they often are not "readable" by anybody but the original creator.

Anyone writing Perl scripts like this should not be trusted with any programming language.

Perl scripts are no less readable than bash scripts or Awk scripts. This is because so much of Perl was written to do the same work as bash, awk, sed, and the other related Unix text processing command line programs, but all under one roof.

Don't believe me? Take a look for yourself:

https://learn.perl.org/

http://blob.perl.org/books/impatient-perl/iperl.htm


Perl can also be hilariously unreadable: https://www.foo.be/docs/tpj/issues/vol4_3/tpj0403-0017.html


>> Perl can also be hilariously unreadable: https://www.foo.be/docs/tpj/issues/vol4_3/tpj0403-0017.html

Most programming languages can be obfuscated. That does not mean people write code in those programming languages like that:

C: https://www.ioccc.org/

Javascript: view-source:https://www.google.com/

The truth is that insulting Perl is considered stylish by some, so many people do it despite knowing little to nothing about Perl and having never used it.

However, if you want Perl to be hilariously unreadable, why not write it in Latin:

https://metacpan.org/dist/Lingua-Romana-Perligata/view/lib/L...

Or Klingon:

https://metacpan.org/pod/Lingua::tlhInganHol::yIghun




It's like when we Gen-Xers were repeating bad stuff about COBOL without having seen a single line of it.

Then I saw a real COBOL program and... well... it was even worse than what I had imagined :-)


Perl has both one-liners/spaghetti code and games like Pangzero.


There's a limited problem domain where it's unquestionably the best: Perl beats awk and bash at their own game, on their home turf. That's the best way to put it. It's faster, has more shortcuts, fewer warts, more power, and more readability when well written. And while aged and not huge by modern standards, CPAN (like pypi or npm) is incredible for a hyper-powered awk-and-bash mash-up, for those tasks at the edge of that limited problem domain. It's installed almost everywhere, so it's almost always available.

That stuff is just awkward and painful in Python by comparison.


I don't write Perl code, but its CLI has been a very good way to replace sed with something decent. sed not supporting Perl regex syntax, by far the most common kind of regex out there, is frankly disappointing. Even grep was able to put it together and add the -P switch. But sed is still stuck in the prehistoric syntax of ERE ("Extended Regular Expressions", as described in man pages), which, e.g., instead of \d for a digit uses [[:digit:]], a syntax present in... zero? other tools or programming environments.


Better than bash? Mostly. Better than Python? Subjective; you would have to use them both yourself. I lean towards Perl as I like sigils to denote things, though I have nothing against Python. Both are typically installed by default now. I have never used nodejs for sysadmin work.


Perl is super-specialized at reporting (that's in fact the "r" in Perl). In particular there's a bunch of extremely useful implicitly defined variables that take their context from your place in a line-by-line loop through a text file.


Perl is a great language, but please listen to this old perl programmer's advice:

1. You can write totally unreadable perl. It is probably the single worst language in this regard most programmers will run into. Be careful to make your code readable.

2. Keep your amount of Perl small. 200-300 lines is a good limit.

So for quick bang it out scripts that want to parse text etc... perl is great. For writing a major application, not so much.


One other advantage is that Perl will be found in the base install of almost any unix-like system. Python, nodejs, even bash may not.


When discussing such languages, I would like to point out that Raku is also an option.


I have found a handful of unconventional applications for awk -- I once needed a tiny pcm pulsewave generator, and awk was surprisingly decent for the job [1].

Aside from that I've mostly been using it for quick statistics [2], but it quickly moves into perl territory...

1: https://github.com/9001/asm/blob/hovudstraum/etc/bin/beeps#L...

2: https://ocv.me/doc/unix/oneliners/#965bfcb8


It's a language for creating quick alternative views from line- and column-oriented text streams. That means, take the output of another tool and represent it in a different way.


I use awk mostly for one-liners and resort to Python when I need more than a few lines of code.


I feel like the intent was to say that Notepad from 20 years ago and Notepad from today have (approximately) the same functionality, whereas processors are 4x faster, so it should be at least as fast as it was before, shouldn't it? In my mind, regardless of the OS requirements, a processor 4x more powerful shouldn't need double the time to launch the same program unless you've added 4x the features.


Notepad back then could only edit 32kB maximum files, even on 32bit NT, it was literally all the text widget could handle.

So no, it's not really fair to compare a 'simple' text editor.


It is a fair comparison.

If you edit the same 1KB file on each computer side by side the 30 year old computer will be more responsive than the modern one.

That's what people are taking issue with.


Heh, I've not seen anyone talk about AV and things like the SmartScreen filter.

A huge number of security related things are going on.

Also windows logs a ton of telemetry these days.


I think the stock Notepad in Windows 10 is perfectly fine and speedy at least, I've never considered it too slow unless I open a huge file with word wrapping on.

Notepad2 is my all-time favorite though. It supports key features like line numbers and directionless search, but is much closer to stock than Notepad++. [0]

[0] https://www.flos-freeware.ch/notepad2.html


Notepad on NT4 could edit files as large as you had memory. I never used 3.5 but I guess they must have made that change in NT4.


64KB.


Another very similar example to this is the adding text feature in MS Paint. I noticed that somehow on the Windows 11 version, it takes many seconds after clicking the "add text" button to be able to actually start typing. Previously, it was instantaneous.


I can start Notepad on my relatively slow Win10 VM with spinning disks in RAID and it starts with similar speed; starting it on my physical Windows machine with an SSD, it launches at exactly the same speed.


I would assume the people who actually develop those either already are RHEL customers, or they would just pay for a single subscription so they can download the code and keep it in sync.


Are you suggesting blocking 127.0.0.1?


If you want to block requests to 127.0.0.1, DNS is the wrong level to do it.

