
Also the US politicians suffering from Khan Derangement Syndrome. He really is one of the most anodyne politicians around, obviously no one is genuinely upset about him.

"Guy being humped by a horse" wouldn't have been my first choice for demoing the capabilities of the model, but each to their own I guess.

It looks like a marketing move. It's a good quality, detailed picture. It's going to get shared a lot. I would assume they knew exactly what they were doing. Nothing like a bit of controversy for extra clicks.

Because every ML researcher is a viral social media expert.

(I don’t even know if I’m being sarcastic)


This is not some random ML researcher doing fun things at home. Qwen is backed by Alibaba cloud. They likely have whole departments of marketing people available.

Not to mention the gigantic legal iceberg the first time someone is attacked or worse after meeting through the app.

Private Equity has been buying up vets in the UK and jacking up prices massively, causing all sorts of knock-on negative consequences. Vets in the UK were not failing before.

Unfortunately our competition authorities are toothless. Their proposed remedy is that vets will have to publish prices on their websites from now on. Woohoo.

https://www.bbc.co.uk/news/articles/c8j3020kl04o

https://archive.ph/ikhpP (link to FT.com)


I have a discord account that I use very rarely, and just tried it (from the UK) and it didn't ask me for any ID or face scan. If they do start doing that, I'll simply stop using the service.

Not precisely, but C-- (hard to search for!) was a C-like (or C subset?) intermediate language for compilers to generate.

I found this Reddit thread that gives a bit more detail:

https://www.reddit.com/r/haskell/comments/1pbbon/c_as_a_proj...

and the project link:

https://www.cs.tufts.edu/~nr/c--/


A good, meditative film with a long arc of time and a bit of prison is Ash is the Purest White (https://www.imdb.com/title/tt7298400/)

You can't mention Kore-eda without mentioning After Life (1998), surely? (Confusingly called Wonderful Life in Japanese, and also I don't mean the Gervais series.)

There's a recent US "remake"/homage which I haven't dared to watch.


Yess! So good too. We could probably just recommend all his films.

I’d say he is my favorite contemporary director.

The only American director I’d consider right now is Terrence Malick. I just hope his Jesus film gets released…


I love Kore-eda but it took me many attempts to get through After Life. I'm really glad it wasn't my first Kore-eda film. I find it terribly boring.

It's not really important in latent space / conceptually.

This is the proper deep critique / skepticism (or sophisticated goal-post moving, if you prefer) here. Yes, obviously this isn't just reproducing C compiler code from the training set, since the output is Rust, but it is much less clear how much of the generated Rust code can (or cannot) accurately be seen as translated from C code in the training set.

The interesting thing here is what's this code worth (in money terms)? I would say it's worth only the cost of recreation, apparently $20,000, and not very much more. Perhaps you can add a bit for the time taken to prompt it. Anyone who can afford that can use the same prompt to generate another C compiler, and another one and another one.

GCC and Clang are worth much much more because they are battle-tested compilers that we understand and know work, even in a multitude of corner cases, over decades.

In future there's going to be lots and lots of basically worthless code, generated and regenerated over and over again. What will distinguish code that provides value? It's going to be code - however it was created, could be AI or human - that has actually been used and maintained in production for a long time, with a community or company behind it, bugs being triaged and fixed and so on.


The code isn't worth money. This is an experiment. The knowledge that something like this is even possible is what is worth money.

If you had known in 2022 that a transformer could pull this off, even with all its flawed code, you would have been floored.

Keep in mind that just a few years ago, the state of the art in what these LLMs could do was questions of this nature:

Suppose g(x) = f⁻¹(x), g(0) = 5, g(4) = 7, g(3) = 2, g(7) = 9, g(9) = 6; what is f(f(f(6)))?

The above is from the "Sparks of AGI" paper on GPT-4, where the authors were floored that it could coherently reason through the three steps of inversion (6 -> 9 -> 7 -> 4), while GPT-3.5 was still spitting out a nonsense argument of this form:

f(f(f(6))) = f(f(g(9))) = f(f(6)) = f(g(7)) = f(9).
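The correct chain is easy to check mechanically: since g = f⁻¹, each given g(a) = b means f(b) = a, so inverting the table and iterating three times recovers the 6 -> 9 -> 7 -> 4 path. A quick sketch (in Python, just to illustrate the arithmetic, not anything from the paper):

```python
# The given values of g = f^-1 from the puzzle.
g = {0: 5, 4: 7, 3: 2, 7: 9, 9: 6}

# Invert the table: g(a) = b  implies  f(b) = a.
f = {b: a for a, b in g.items()}

# Apply f three times starting from 6.
x = 6
for _ in range(3):
    x = f[x]
print(x)  # 6 -> 9 -> 7 -> 4, so f(f(f(6))) = 4
```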

This is from March 2023, and it was genuinely very surprising at the time that these pattern-matching machines trained on next-token prediction could do this. Something like an LSTM can't do anything like this at all, by the way; nowhere close.

To me it's very surprising that the C compiler works; it takes a ton of effort to build such a thing. I can imagine the flaws actually getting fixed over the next year as we push the goalposts out.

