> if you must ask questions, imply the correctness of your originally held position by wording your question suggestively
So, is the question legit? If so, why can it not be answered?
This reminds me a bit of StackOverflow. "Question already solved" elsewhere. Well, before StackOverflow, people often asked a question, and were told "read the manual". So, were these people pro-social? Does the ORIGINAL position hold any merit when it comes to a question? HOW is it even inferred that a question was asked "suggestively"?
These bullet points are no good. They make too many assumptions. What does "anti-social" mean? When reddit moderators ban people and censor statements, is this pro-social behaviour?
> when all hope is lost in conversation, retreat into your self
What horrible recommendations. Hopefully AI wrote those, because I cannot believe a human wrote that, not even as sarcasm. I don't even see any sarcasm there. How do you detect sarcasm in written text accurately? Is my text sarcasm? Does everyone agree or disagree with that? People are different. The whole attempt to group people into social or anti-social is rubbish nonsense from A to Z.
Hmmm. I kind of like CSS but I hate the creep-up of complexity.
It's not that I don't understand the rationale - any programming
language offers more power than a non-programming language. But
I'd rather think here that something else could instead replace
all of HTML, CSS and JavaScript, rather than constantly wanting
to make everything more complex. I don't use most of the new
elements in HTML5, largely because I don't see the point in
using specialized tags for micro-describing a webpage. I succumbed
to the "it is a div-HTML tag and it has a unique ID" view; that's
what I think most of those containers really are. I even wanted
to have aliases to such IDs, simply to use as navigational href
intralink.
yeah I mean, to be clear, I'm less proposing "What if we add even more syntax and semantics to CSS" and more "what if we steal ideas from CSS, notice their similarity to logic / relational query languages, and use them to build something new". I probably could have articulated some of this better.
means, in English/pseudocode, roughly: "If you have an element X with attribute data-theme="dark", and X has a child Y with attribute data-theme="light", and Y is focused, then the outline-color of Y is black".
so we could write this also as, e.g.:
outline-color(Y, black) if
data-theme(X, "dark") and
parent(X, Y) and
data-theme(Y, "light") and
focused(Y)
that's Datalog, except I went ahead and replaced :- with "if" and "," with "and".
if we want even more syntax sugar, we could do:
Y.outline_color := black if
X.data_theme == dark and
Y.parent == X and
Y.data_theme == light and
Y.focused
imagine `X.attr == val` <==> `attr(X, val)` as a kind of UFCS for Datalog to make it palatable to Regular Programmers, right
the declaration and scope of these variables is implicit here; if you want something even more ALGOL-family, we could write
forall Y {
Y.outline_color := black if
Y.data_theme == "light" and
Y.focused and
Y.parent.data_theme == "dark"
}
here we've explicitly introduced Y, and made one of our joins implicit, and it looks even more like Regular Programming now, except the Datalog engine (or equivalent) is kind of running all these loops for you, every time one of their dependencies changes, in an efficient way ...
SELECT 'black' AS outline_color
FROM elements parent
JOIN elements child ON parent.id = child.parent_id
WHERE parent.data_theme = 'dark'
AND child.data_theme = 'light'
AND child.focused = true
there's a lot of ways to express the same thing! it's interesting to notice the connections between them, I think, and their strengths and weaknesses, e.g. I probably wouldn't want to write my whole design system in SQL, but since it's relational queries over the elements structure and properties, you could.
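The joins in that rule (and in the SQL version of it) can also be brute-forced in a few lines of a regular language. Here is a minimal sketch in Ruby, with a hypothetical `Element` struct standing in for the DOM; a real engine would evaluate this incrementally when a dependency changes, not re-run the loops from scratch:

```ruby
# Hypothetical flat representation of the element tree.
Element = Struct.new(:id, :parent_id, :data_theme, :focused)

elements = [
  Element.new(1, nil, "dark",  false),  # the dark-themed container X
  Element.new(2, 1,   "light", true),   # light child, focused: rule fires
  Element.new(3, 1,   "dark",  true)    # wrong theme: rule does not fire
]

# Naive nested-loop join over all (X, Y) pairs, one check per body atom.
derived = {}
elements.each do |x|
  elements.each do |y|
    next unless y.parent_id == x.id       # parent(X, Y)
    next unless x.data_theme == "dark"    # data-theme(X, "dark")
    next unless y.data_theme == "light"   # data-theme(Y, "light")
    next unless y.focused                 # focused(Y)
    derived[y.id] = { outline_color: "black" }  # the rule head
  end
end

derived  # => {2=>{:outline_color=>"black"}}
```

The whole point of a Datalog-style engine is that you never write these loops yourself, and that the derived facts get recomputed efficiently when a base fact (say, `focused`) changes.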
"Over roughly the same period my day job has changed and transitioned me from writing thick clients in Swing to big freaking enterprise web apps."
I mean, the web kind of won. We just don't have a simple and useful way to
design for the web AND the desktop at the same time. I also use the www of
course, with a gazillion useful CSS and JavaScript snippets where I have to. I
have not entirely given up on the desktop world, but I abandoned ruby-gtk
and switched to ... jruby-swing. I know, I know, nobody uses swing anymore.
The point is not so much about using swing per se, but simply to have a GUI
that is functional on windows, with the same code base (I ultimately use the
same code base for everything on the backend anyway). I guess I would fully
transition into the world wide web too, but how can you access files on the
filesystem, create directories etc... without using node? JavaScript is deliberately restricted, node is pretty awful, ruby-wasm has no real documentation.
Those AI-using software developers are beginning to show signs of addiction:
from "yay, claude is awesome" to "damn, it sucks". It's like withdrawal
symptoms now.
My approach is much easier: I'll stick to the oldschool way, avoid AI
and come up with other solutions. I am definitely slower, but I reason
that the quality FOR other humans will be better.
Now, most will say "but why, 1995 is ancient history, no such hardware exists anymore". The thing is ... should Linux get rid of what is old? I understand you have a smaller kernel when you have less code, less cost to maintain, I get it. Still, I wonder whether this should be the only allowed opinion. Would it not be better to, kind of, transition into a situation where any hardware built in the future, would be supported? So in 2050, we'd not say "damn, computers from 2026 are obsolete now". We could say "no problem, linux is forever". Everything is supported. I actually would prefer the latter over the "older than 30 years, we no longer support it".
> Would it not be better to, kind of, transition into a situation where any hardware built in the future, would be supported?
easier said than done -- the kernel's internal interfaces aren't static, they change often. The project has never committed to stabilizing its driver API, so every driver takes non-zero work to maintain.
I would assume computers that are still running these old ISA mouses (mice?) probably are also running an older version of linux; and if they're running a new kernel then it'll be somebody's job to port the drivers forward. There's some likelihood this will end up maintained by someone out-of-tree, which is a nice way of saying "we've sent your dog to a farm upstate..."
To add to this, as long as the diff representing the removal of the driver is kept in the git history it would be trivial for someone in the far future to say to an AI agent:
"Please take this linux source and patch the Bus mouse driver back in but match the new driver interface".
With code preserved in git history it's never actually "removed". It's just, disconnected.
That date feels a little bit late. The PS/2 devices that superseded the bus mouse started appearing around 1987. There were certainly still bus mice around in 1995, but they were thoroughly obsolete.
The real issue is that they don't have stable intra-kernel ABI/APIs. For technologies that are 10+ years old, it should be entirely possible to create a clean, stable abstraction layer. You maintain the abstraction layer, and everything on the other side of it doesn't have to track random kernel changes. Things like this then just keep working indefinitely.
> It is not. Most parents I know have seen what it does to their kids, but have zero childcare.
And you are able to tell this ... how exactly? Why should other parents care about YOUR opinion in this regard? Because ultimately this comes down to a difference in opinion.
I claim this is not about "protecting children", but to mandate age sniffing on the OS level eventually.
I also find this all questionable. An 18-year-old is not penalised? Why draw the line there? I should say that I don't use "social" media (unless commenting on a forum is called "social" now), but I find the attempt to explain this ... very poor. I cannot find a way to reason about it. I cannot accept the claim that it is meant to "protect" anyone at all. Is this pushed by over-eager parents, who don't understand what to do on a technical level? I really hate censorship in general. So, even while I think unsocial media such as Facebook should be gone, I hate any such restrictions. Then again I also don't trust any legislator who pushes for this - I am certain this is to force age-sniffing onto everyone. And then extend this slowly. Step by step. Salami slice by salami slice. Until anonymity is gone.
> eval, send, method_missing, define_method , as a non-rubyist how common are these in real-world code
This depends on the individual writing code. Some use it more than others.
I can only give my use case.
.send() I use a lot. I feel that it is simple to understand - you simply
invoke a specific method here. Of course people can just use .method_name()
instead (usually without the () in ruby), but sometimes you may autogenerate
methods and then need to call something dynamically.
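A minimal sketch of that use case (the `Greeter` class and method names here are made up for illustration): the method name is only known at runtime, so `.send` does the dispatch.

```ruby
class Greeter
  def hello(name)
    "Hello, #{name}!"
  end

  def goodbye(name)
    "Goodbye, #{name}!"
  end
end

g = Greeter.new
action = :hello           # could come from config, user input, or generated code
g.send(action, "Alice")   # => "Hello, Alice!", same as g.hello("Alice")
```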
.define_method() I use sometimes, when I batch create methods. For instance
I use the HTML colour names, steelblue, darkgreen and so forth, and often
I then batch-generate the methods for this, e. g. via the correct RGB code.
And similar use cases. But, from about 50 of my main projects in ruby, at
best only ... 20 or so use it, whereas about 40 may use .send (or, both a
bit lower than that).
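A sketch of the batch generation described above (the `Palette` class is hypothetical; the hex values are the standard CSS ones for these two colour names):

```ruby
COLOURS = { steelblue: "#4682B4", darkgreen: "#006400" }.freeze

class Palette
  # One reader method per colour name, generated in a loop.
  COLOURS.each do |name, hex|
    define_method(name) { hex }   # defines Palette#steelblue, Palette#darkgreen
  end
end

Palette.new.steelblue  # => "#4682B4"
```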
eval() I try to avoid; in a few cases I use it or its variants. For instance, in a simple but stupid calculator, I use eval() to calculate the expression (I sanitize it before). It's not ideal, but simple. I use instance_eval and class_eval more often, usually for aliases (my brain is bad so I need aliases to remember, and sometimes it helps to think properly about a problem).
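A minimal version of that "simple but stupid calculator" pattern (a sketch, not a recommendation - eval stays risky even with a whitelist): reject anything except digits, whitespace, parentheses and arithmetic operators before evaluating.

```ruby
def calculate(expr)
  # Whitelist: digits, whitespace, + - * / ( ) and the decimal point.
  raise ArgumentError, "bad input" unless expr.match?(%r{\A[\d\s+\-*/().]+\z})
  eval(expr)
end

calculate("2 * (3 + 4)")  # => 14
```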
method_missing I almost never use anymore. There are a few use cases when it is nice to have, but I found that whenever I would use it, the code became more complex and harder to understand, and I kind of got tired of that. So I try to avoid it. It is not always possible to avoid it, but I try to avoid it when possible.
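For contrast, here is the kind of method_missing code meant above (the `OpenBag` class is made up): flexible, but every typo now quietly "succeeds", which is part of why such code gets harder to follow.

```ruby
class OpenBag
  def initialize
    @data = {}
  end

  # Any unknown method becomes a hash read; any "foo=" becomes a hash write.
  def method_missing(name, *args)
    key = name.to_s
    if key.end_with?("=")
      @data[key.chomp("=")] = args.first
    else
      @data[key]
    end
  end

  def respond_to_missing?(_name, _include_private = false)
    true
  end
end

bag = OpenBag.new
bag.colour = "red"
bag.colour  # => "red"
bag.colur   # => nil  (a silent typo instead of a NoMethodError)
```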
So, to answer your second question, to me personally I would only think of .send() as very important; the others are sometimes but not that important to me. Real-world code may differ, the rails ecosystem is super-weird to me. They even came up with HashWithIndifferentAccess, and while I understand why they came up with it, it also shows a lack of UNDERSTANDING. This is a really big problem with the rails ecosystem - many rails people really did not or do not know ruby. It is strange.
"untyped parsing" I don't understand why that would ever be a problem. I guess only people whose brain is tied to types think about this as a problem. Types are not a problem to me. I know others disagree but it really is not a problem anywhere. It's interesting to see that some people can only operate when there is a type system in place. Usually in ruby you check for behaviour and capabilities, or, if you are lazy, like me, you use .is_a?() which I also do since it is so simple. I actually often prefer it over .respond_to?() as it is shorter to type. And often the checks I use are simple, e. g. "object, are you a string, hash or array" - that covers perhaps 95% of my use cases already. I would not know why types are needed here or fit in anywhere. They may give additional security (perhaps) but they are not necessary IMO.
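The "are you a string, hash or array" check mentioned above looks roughly like this (a trivial sketch, hypothetical method name):

```ruby
def classify(obj)
  if obj.is_a?(String)
    "string"
  elsif obj.is_a?(Hash)
    "hash"
  elsif obj.is_a?(Array)
    "array"
  else
    "something else"
  end
end

classify("abc")  # => "string"
classify([1, 2]) # => "array"
```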
Why do you say HashWithIndifferentAccess shows a lack of understanding? Like many Rails features, it's a convenience that abstracts away details that some find unpleasant to work with. Rails sometimes takes "magic" to the extreme through meta-programming. However, looking at the source [1], HashWithIndifferentAccess doesn't use eval, send, method_missing, or define_method. So I'm not sure how it seems weird to someone who works more with plain Ruby.
Seeing the performance improvement numbers I'm pretty sure there's a type-inference system below it to realize types in all paths (same as the AOT JS compiler I created).
It's not about being beholden to types per se, but rather that fixed types are way faster to execute, since they map to basic CPU instructions rather than operations having to first determine the type and then branch depending on the type used.
The problem with dynamic types is that they either need to somehow join into fixed types (like with TypeScript specifying a type-specification of the parsed object) or remain dynamic through execution (thus costing performance).
I think you could work around send(). Not a Ruby person, but in most languages you could store functions in a hashmap, and write an implementation of send that does a lookup and invokes the method (passing the instance pointer through if need be).
Won’t work with actual class methods, but if you know ahead of time all the functions it will call are dynamic then it’s not a big deal.
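The described workaround, sketched in Ruby (the `Widget` class is hypothetical): a dispatch table of lambdas instead of `send`. It only covers methods registered up front; anything outside the table still needs `send`/`public_send`.

```ruby
class Widget
  def paint(colour)
    "painted #{colour}"
  end

  # The "hashmap of functions": each entry wraps an ordinary method call.
  DISPATCH = {
    paint: ->(obj, *args) { obj.paint(*args) }
  }.freeze

  # A send-like lookup-and-invoke, restricted to the registered methods.
  def dispatch(name, *args)
    handler = DISPATCH.fetch(name) { raise NoMethodError, "no dynamic method #{name}" }
    handler.call(self, *args)
  end
end

Widget.new.dispatch(:paint, "blue")  # => "painted blue"
```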
I find the current documentation difficult to understand.
This is a problem I see with many ruby projects. How would I reword
this?
Well, first thing, after stating what spinel is, I would show a
simple example. Ideally a standalone .rb file or something like
that, that can be downloaded (or whatever other format). Yes,
the README shows this, but believe it or not, I have realised
that I am usually below average when trying to understand something
that is new. I even manage to make copy/paste mistakes. This is
why I think one or two standalone as-is examples would be best.
And then I would explain use cases.
The current structure of the document is strange. Has that been
written with AI? If AI replaces the human individual, why is it
then expected that real people should read that? So many questions
here ...
Also, I would really like for the ruby ecosystem to not be split
up into different entities. I understand that mruby does not have
the same goals as MRI ruby, but still, there is fragmentation.
Now there is spinel - how does it relate to other parts of ruby?
Why are truffleruby and jruby separate? (I know why, so I am not
objecting to the rationale; I am pointing out that for a USER it
would be better if things would be more unified here in general.)
Ruby really needs to focus on its inner core. The base should be
solid. Even more so when it is harder to attract genuinely new
developers.