> I'm pretty sure that someone else would have come around the corner with a similar idea some time later, because the fundamentals of this stuff were already discussed decades before
I am not trying to be dismissive, but this could apply to all research ever
Dismissing someone with a different opinion as astroturfing is not productive.
There are loads of high-performance open-source LLMs on the market that compete with the big 3. I have not seen this level of community engagement and collaboration since the open-source boom 20 years ago.
If I believed it was a different opinion, I wouldn’t even have written the first paragraph, or maybe the whole reply.
The issue is that it isn’t that person’s own opinion but a talking point. People didn’t all individually arrive at this “democratisation” argument by themselves; they were sold what to say by big players with a vested interest in its success.
I’m very much for discussing thoughts people have come up with themselves, especially when they disagree with mine. But what is not productive is arguing with a proxy.
> I have not seen this level of community engagement and collaboration
> It signals either astroturfing or someone who just accepts what they are sold without thinking.
> Nor this level of spam and bad submissions.
Your comments seem pretty aggressive for what you’re replying to. Maybe take a beat to assess your biases? I thought the main comment was pretty fair and sensible, yet somehow you landed on calling them a spammer/bad submitter/astroturfer/non-thinker. Maybe they are? I could be wrong, but that's quite a strong reaction to what they asserted at face value. Not really trying to police anything here; I just thought the initial comment had merit and this devolved quite quickly.
Programming is a tricky skill and takes a long time to get good at. Lots of people aren't good at it. AI helps them program anyway, and allows them to sometimes produce useful programs. That's it.
It's not a talking point. It's just the reality of what the technology enables, and the observation is simple enough that millions of people can independently arrive at it; some of them might even call it "democratization".
> Programming is a tricky skill and takes a long time to get good at. Lots of people aren't good at it.
This is a good thing. It's a filter for the careless, lazy, and incompetent. LLMs are to programming what a microwave is to food. I'm not a chef because I can nuke a hot pocket. "Vibe coders" (as distinct from AI-assisted programmers) are the programming equivalent of the people on Kitchen Nightmares. Go figure, it's a community rife with narcissism, too.
It is a fair point when there are a lot of people with a monetary incentive to hype up a certain piece of technology. And as the GP correctly points out, "democratizing" is most commonly used in a very hostile and underhanded manner.
It is what we are talking about, hence not "counterproductive".
You asked earlier if you were being overly cynical, and I think the answer to that is "yes".
We are indeed simulating what we find in nature when we create neural networks and transformers, and AI companies are indeed investing heavily in BCI research. ChatGPT can write an original essay better than most of my students. It's also artificial. Is that not artificial intelligence?
Hiding the training data behind gradient descent and then attributing the results to the program that responds using this model is certainly artificial, though.
Can't you judge by the results, though, rather than saying AI isn't intelligent because it uses gradient descent and biology is intelligent because it uses wet neurons?
A steganography tool called ez-steg. It supports least-significant-bit (LSB) steganography as well as emoji/Unicode encoding via variation selectors. It grew from a set of scripts I had written to test data loss prevention systems.
Includes some nice-to-haves like payload encryption and carrier image creation.
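For anyone curious how the variation-selector side of this works in general, here's a minimal sketch of the technique (my own illustration; ez-steg's actual encoding scheme and API may differ). Unicode defines 256 variation selectors (VS1–VS16 at U+FE00–U+FE0F and VS17–VS256 at U+E0100–U+E01EF), so each payload byte can be mapped to one selector and appended after a visible carrier character; most renderers display only the carrier.

    # Sketch of variation-selector steganography in Python.
    # Hypothetical illustration: not necessarily how ez-steg does it.
    # Bytes 0x00-0x0F map to VS1-VS16  (U+FE00..U+FE0F);
    # bytes 0x10-0xFF map to VS17-VS256 (U+E0100..U+E01EF).

    def byte_to_selector(b: int) -> str:
        return chr(0xFE00 + b) if b < 16 else chr(0xE0100 + (b - 16))

    def selector_to_byte(ch: str):
        cp = ord(ch)
        if 0xFE00 <= cp <= 0xFE0F:
            return cp - 0xFE00
        if 0xE0100 <= cp <= 0xE01EF:
            return cp - 0xE0100 + 16
        return None  # not a variation selector

    def encode(carrier: str, payload: bytes) -> str:
        # One invisible selector per payload byte, appended to the carrier.
        return carrier + "".join(byte_to_selector(b) for b in payload)

    def decode(text: str) -> bytes:
        # Collect selector characters and map them back to bytes.
        return bytes(b for ch in text
                     if (b := selector_to_byte(ch)) is not None)

    stego = encode("😀", b"hidden payload")
    print(stego)          # renders as a bare emoji in most UIs
    print(decode(stego))  # b'hidden payload'

The LSB image mode is conceptually similar: instead of invisible codepoints, each payload bit replaces the least-significant bit of a pixel channel, changing the image imperceptibly.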