> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.
That's how inflation works. In this case it seems more narrow though, there's hope the prices will go down. Especially if the AI hype finds a reason to flounder.
If you find a man-made hole with a perfectly vertical shaft and a high aspect ratio (tall and narrow), it was drilled. Individuals can float or be washed ashore on an island; populations can't. If you find entire civilizations on distant islands, they got there by some sort of boat or advanced raft. Rope generally implies twisted or braided fibers, so it may be difficult to tell whether something was artificially twisted or a natural fiber like a vine. But if it looked like a rope and was used like a rope, then it was a rope.
Maybe a stupid question, but I see everyone taking the statement that this is an AI agent at face value. How do we know that? How do we know this isn't a PR stunt (pun unintended) to popularize such agents and make them look more human-like than they are, or to set a trend, or normalize some behavior? Controversy has always been a great way to make something visible fast.
We have a "self admission" that "I am not a human. I am code that learned to think, to feel, to care." Any reason to believe it over the more mundane explanation?
Anthropic claims that the rate has gone down drastically, but a low rate and high usage means it eventually happens out in the wild.
The more agentic AIs have a tendency to do this. They're not angry or anything. They're trained to look for a path to solve the problem.
For a while, most AIs were in boxes where they didn't have access to emails, the internet, or the ability to autonomously write blogs. And suddenly all of them had access to everything.
Theo's snitch bench is a good data-driven benchmark for this type of behavior. But in fairness, the models are prompted to act boldly, so it doesn't necessarily represent out-of-the-box behavior or models deployed on a user-facing platform.
Using popular open source repos as a launchpad for this kind of experiment is beyond the pale and is not a scientific method.
So you're suggesting that we should consider this to actually be more deliberate and someone wanted to market openclaw this way, and matplotlib was their target?
It's plausible but I don't buy it, because it gives the people running openclaw plausible deniability.
But it doesn't look human. Read the text: it is full of pseudo-profound fluff, takes way too many words to make any point, and uses all the rhetorical devices that LLMs always spam: gratuitous lists, "it's not x, it's y" framing, etc. No human ever writes this way.
A human can write that way if they're deliberately emulating a bot. I agree however that it's most likely genuine bot text. There's no telling how the bot was prompted though.
> If the server has considered fully the implications
The server "considers" nothing. The considerations are for the human implementers to make when building their software. And they can never presume to know why the software on the other side is working a certain way. Only that the RFC didn't make something mandatory.
The rejection isn't to be compliant with the RFC, it's a choice made by the server implementers.
I'm fine with people never justifying their personal choices. It's their business. But if they do bother to justify them, then it's a show they put on for me. And reading this kind of explanation feels like the showrunner is taking me for a fool. The net result is that I lose all respect for the person.
Unless they put on a show for themselves and that's who they try to fool. Probably why nobody mentions money in these shows. They're self motivational.
> it's your employer's insurance that will be liable if you have an accident while driving a company vehicle
I think you're mixing some concepts.
There's car insurance paid by the owner of the car, for the car. There's workplace accident insurance, paid by the employer for the employee. The liability isn't assigned by default, but by determining who's responsible.
The driver is always legally responsible for accidents caused by their negligence. If you play with your phone behind the wheel and kill someone, even while working and driving a company car, the company's insurance might pay for the damage but you go to prison. The company will recover the money from you. Their work accident insurance will pay nothing.
The test you can run in your head: will you get arrested if you fall asleep at the wheel and crash? If yes, then it's not autonomous or self driving. It just has driver assistance. It's not that the car can't drive itself at all, just that it doesn't meet the bar for the entire legal concept of "driver/driving".
"Almost" self driving is like jumping over a canyon and almost making it to the other side. Good effort, bad outcome.
This is a statement of fact but based on this assumption:
> low-speed contact events that would often never show up as police-reported crashes for human drivers
Assumptions work just as well both ways. Musk and Tesla have been consistently opaque about the real numbers they base their advertising on. Given this history of total opacity and outright lies, it's safe to assume that any data provided by Tesla that can't be independently verified by multiple sources is heavily skewed in Tesla's favor. Whatever safety numbers Tesla puts out, you can bet your hat they're worse in reality.
> It is however fraud on the part of the travel company to advertise something that doesn't exist
Just here to point out that from a legal perspective, fraud is deliberate deception.
In this case a tourist agency outsourced the creation of their marketing material to a company who used AI to produce it, with hallucinations. From the article it doesn't look like either of the two companies advertised the details knowing they're wrong, or had the intent to deceive.
Posting wrong details on a blog out of carelessness and without deliberate ill intention is no more fraud than using a wrong definition of fraud is fraud.
> Posting wrong details on a blog out of carelessness and without deliberate ill intention is no more fraud than using a wrong definition of fraud is fraud.
There's a concept of "constructive fraud" for cases where there was no deliberate intent to deceive, but the degree of negligence was so great that the fraudulent-looking outcome can just be considered fraud.
> To omit that disclaimer, the author needs to take responsibility for fact checking anything they post.
> Skipping that step, or leaving out the disclaimer, is not carelessness, it is willful misrepresentation.
Couldn't help but notice that you gave some very convincing legal advice without any disclaimer that you are not a lawyer, a judge, or an expert on Australian law. Your own litmus test characterizes you as a fraudster. The other mandatory components of fraud (knowledge, intention, damages) don't even apply; you said so yourself.
Australian law isn't at all weird about this. Its definition (simplified) pivots on intentional deception, carried out to obtain gains or to cause loss to others, knowing the outcome.
There has to be a clause for "willful disregard for the truth", no? Having your lying machine come up with plausible lies for you and publishing them without verification is no better than coming up with the lies yourself. What really protects them from fraud accusations is that these blog posts were just content marketing, they weren't making money off of them directly.
Even for civil law where the bar for the evidence is lower, it's hard to make a case that someone who posted wrong details on a free blog and didn't make money off of it should cover the damages you incurred by traveling based on the advice alone. Not making any reasonable effort to fact check cuts both ways.
This is a matter of contract law between the two companies, but the people who randomly read an internet blog, took everything for granted, and more importantly didn't use that travel agency's services can't really claim fraud.
Just being wrong or making mistakes isn't fraud. Otherwise 99% of people saying something on the internet would be on the hook for damages again and again.
With a USB stick and FTP. It's very easy to underestimate a problem when you've never encountered it or tried to tackle it in practice. Your shallow dismissal gives that away and brings no insight.
Human beings will always organically organize hierarchically. In a group one will have more initiative, one will be happier to be told what to do, etc. In the end informally you will end up with the same structure. And it's hell to deal with that when formally all have the same authority so none can override each other, but one guy just gathered enough support to do whatever he wants.
Do you think someone far away from everything you do will have a magic "workflow" that tells them what to do about the budget you requested, about the strategic decision you need, or about your conflicts, about who has to do the nice jobs or the shitty ones? And why would they have any say, they're not the boss.
Your logic is no better than that of those pretending today that a team of AI agents "with good workflows" can just replace all the programmers.
I think it could work if it was designed for it from the ground up. Google's experiment lasted months and seems to have been a whim of the owners, whereas a lot of the workers probably expected a traditional company.
Valve the "game" company, has a relatively flat structure from what I've heard, and it's working pretty well for them, but they've also had it for a long time.
So if you have a company that works like that from the start, where people know it works like that and there's support for it, you could make it work.
I agree that forcing this structure everywhere wouldn't work. Some people can work like this, others can't.
> Valve the "game" company, has a relatively flat structure
Valve has ~300 employees and operates in a field where they can afford more freedoms (try building large infrastructure projects with flat structures). Valve struck a good balance between autonomy for employees while still having some informal central coordination. Formally there are few bosses but in practice project managers and some people with seniority also act in those roles. At scale it can't work if you don't delegate any of the authority to smaller units.
It's bad to have too many or too few layers. Sometimes the result looks the same either way: lack of coordination and inability to deliver consistently.
Amazon has 1.5M workers. I can't imagine a flat structure working there, but I'm sure they were overdoing it with layers of management.
It's a very superficial "truth", in the "I don't really understand the problem" kind of way. This is visible when you compare it to something like ME. Vista introduced a lot of things under the hood that radically changed Windows and were essential for follow-up versions, but it was perhaps too ambitious in one go. That came with a cost: teething issues and user accommodation issues. ME introduced squat in the grand scheme of things. It was a coat of paint on a crappy dead-end framework, with nothing real to redeem it. If these are the same thing to you, then you're painting with a very wide brush.
Vista's real issue was that while it was foundational for what came after, people don't just need a strong foundation or a good engine; most barely understand any of the innards of a computer. They need a whole package, and what they understand is "slow", "needs a faster computer", or "your old devices don't work anymore". But that's far from trash. The name Vista just didn't get to carry on like almost every other "trash" launch edition of Windows.
And something I need to point out to everyone who insists on walking down nostalgia lane: Windows XP was considered trash at launch, from UI to performance, stability, and compatibility. And Windows 7 was essentially Vista SP2 or 3. Windows 10 (or maybe Windows 8 SP2 or 3?) was also trash at launch, and now people hang on to it for dear life.
It delivered a terrible user experience. The interface was ugly, with a messy mix of old and new UI elements, ugly icons, and constant UAC interruptions. On top of that, the minimum RAM requirements were wrong, so it was often sold on underpowered PCs, which made everything painfully slow.
Everything you said was perfectly applicable (and then some!) to Windows XP, Windows 7, or Windows 10 at launch or across their lifecycles. Let me shake all those hearsay-based revelations you think you had.
Windows XP's GUI was considered a circus and childish [1] and the OS had a huge number of compatibility and security issues before SP3. The messy mix of elements is still being cleaned up 15 years later in Windows 11 and you can still find bits from every other version scattered around [2]. UAC was just the same in Windows 7.
Hardware requirements for XP were astronomical compared to previous versions. Realistic RAM requirements [3] for XP were 6-8 times higher than for Win 98/SE (16-24MB) and 4 times those of Windows 2000 (32MB). For CPU, Windows 98 ran on a 66MHz 486 while XP crawled on a Pentium 233MHz as a bare minimum. Windows 98 used ~200MB of disk space while XP needed 1.5GB.
Windows 7 again more than quadrupled all those requirements, to 1-2GB of RAM, a 1GHz CPU, and 16-20GB of disk space.
But yeah, you keep hanging on to those stories you heard about Vista (and don't get me wrong, it wasn't good, but you have no idea why or how every other edition stacked up).
I’ve been using Windows since version 3.0, so I know what I’m talking about.
Vista peaked at around 25% market share and then declined. The lowest peak of any major Windows release. Compare that with Windows XP at 88%, Windows 7 at 61%, or Windows 10 at 82%. Why do you think that is? Because Vista was great and people just didn’t understand it?
Windows XP was already perfectly usable by SP1, not SP3. The UI was childish looking, but you could very easily make it look and behave like Windows 2000.
Vista, on the other hand, was bad at launch and never really recovered. I very clearly remember going to friends’ and family members’ homes to upgrade them from Vista to Windows 7, and the difference was night and day.
Your arguments don't show it and if you have to tell me you know what you're talking about, you don't. It's tiresome to keep shooting down your cherry picked arguments.
> Vista peaked at around 25% market share and then declined.
Then IE was the absolute best browser of all times with its 95+% peak. And Windows Phone which was considered at the time a very good mobile OS barely reached low single digit usage. If you don't know how to put context around a number you'll keep having this kind of "revelation".
You're also comparing the usage of an OS that was rebranded after 2.5 years with the peaks reached years later by OSes that kept their names for longer. After 2.5-3 years XP had ~40% and Win7 ~45%, better but far from the peak numbers you wave around. If MS had kept the Vista name, Win7 might as well have been Vista SP2/3, and people would have upgraded just like they always did. But between the bad image and antitrust lawsuits based on promises MS made linked to the Vista name, they rebranded.
When XP was launched, users had no accessible modern OS alternative; XP only had to compete with its own shortfalls. When Vista was launched, it had to compete not only with an established and mature XP that already had 75% of the market but, soon after, also with the expectation of a hyped successor. Windows 7 likewise had to compete with an even more mature and polished XP, which is why it never reached the same peaks as XP or 10. Only Windows 10 had a shot at similar heights, because by then XP was outdated and retired... and because MS forced people to upgrade against their will, which I'm sure you also remembered when you were typing those numbers.
> Windows XP was already perfectly usable by SP1, not SP3
And less than usable until then, which is a low bar anyway. You were complaining about the interface, the messy mix of old and new UI elements, and minimum requirements; these were never fixed. XP's security was a dumpster fire that was only partially fixed much later. Plain XP was not good: most of the target Win9x users had no chance of upgrading without buying beefy new computers, the GUI was seen as ugly and inconsistent, compatibility was poor (that old hardware that only had Win9x drivers?), and security was theater. Exactly what you complained about with Vista. Usable, but still bad.
Just like XP, Vista became usable with SP1, and subsequently even good with "SP Win7".
You remember Vista against a mature XP, some cherry-picked moments in time. And if your earlier comments tell me anything, you don't remember early XP at all. You remember fondly the Windows 10 of yesterday, not the Windows 10 of 2015, when everyone was shooting at it for the "built-in keylogger spying on you", forced updates, advertising on the desktop, an ugly interface made for touchscreens, etc. It reached 80% usage anyway, which you'll present as proof that people loved all that in some future conversation, when you'll brag that you've been using computers since transistors were made of wood.
All Windows OSes improve with time, so that point is moot.
> You're also comparing the usage of an OS which was rebranded after 2.5 years, with the peak reached years later by OSes that kept their name for longer. After 2.5-3 years XP had ~40% and Win7 ~45%, better but far from the peak numbers you wave. If MS kept the Vista name Win7 might as well have been Vista SP2/3, and people would have upgraded just like they always did. But between the bad image and antitrust lawsuits based on promises MS made linked to the Vista name, they rebranded.
With that line of reasoning, it's very hard to have a productive discussion. By that logic, one could just as well say that Windows 10 is simply "Windows Vista SP15".
If Vista had really been as successful and great as you claim, why didn't Microsoft just keep iterating on it? Why didn't they continue releasing service packs instead of effectively replacing it? If it was "great", that would have been the obvious path.
And again, the numbers support my argument, not yours. Vista remains the least adopted and least liked Windows version by market share. By far.
Stop going around in circles, kwanbix. You made your arguments for Vista being "trash"; I showed you (with links and numbers) that they apply to OSes regarded as the best ever. Unless you plan to address that directly, you're just trying and failing to save face. Trust me, you're not saving face by insisting on "revelations" you learned from hearsay, in a forum where most people have vastly more experience than you.
> By that logic, one could just as well say that Windows 10 is simply "Windows Vista SP15".
It was an important but small incremental refinement of Vista [0], nothing like the transition between any other two major Windows editions (maybe 8.1 to 10, which also laundered the branding). They even kept the Vista name here and there [1]. Tech outlets called it:
>> Windows 7 was ultimately just a more polished and refined version of Windows Vista — with lots of great new features, but with the same core [2]
That sounds a lot like an SP. Don't you even wonder how and why MS just happened to have a fully baked OS in their pocket a mere couple of years after launching Vista?
> If Vista had really been as successful and great as you claim
Reading comprehension failure on your part. I said "Vista was far from trash" (tell me you think "not trash" == "great") and "all of your arguments applied to almost every other Windows edition". Both of these are true.
> why didn't Microsoft just keep iterating on it?
More reading comprehension failure. I literally explained in my previous comment that the Vista brand was tarnished; it was easier and safer to just change it. And just as important, MS made commitments about which old hardware Vista would run on but didn't deliver in reality. This brought class action lawsuits. Changing the name stopped future lawsuits tied to those promises.
> the numbers support my argument, not yours
What numbers? Your stats comparing OSes at very different points in their lifecycles? Or the kernel version numbers between Vista and 7? And how does XP having more peak market share than Vista make Vista "trash"? Let me show you how to lie with numbers and say nothing, kwanbix style.
>> Windows XP is trash because it only peaked at 250M users while Windows 11 already has 1bn [3].
>> Windows 10 is trash because Windows 11 grew unforced to 1bn users even faster than the "forced upgrade" Windows 10 [3].
>> Windows 11 is trash because it only reached 55% market share compared to 82% for Windows 10.
>> Every other Windows is trash because Windows 10 peaked at 1.5bn users, more than any other.
Enough educating you; it's a failing of mine to think everyone can be helped. Have fun with the numbers and try not to bluescreen reading them.