
The id Tech 8 engine is a whole lot more performant than the Unreal 5 engine and absolutely does what it needs to, fantastically I would add, for the game it was made for.

By the way, the AMD Athlon 64 launched in 2003. The PS3 launched in 2006. I had an AMD64 processor in my laptop in 2005.

What wasn't viable?


Yeah that part didn't make sense, not to mention that neither the PS3 nor the 360 were running 64-bit software. They didn't have enough memory for it to be worth it.

You don't need memory to make 64-bit software worth it, just 64-bit mathematics requirements, which basically no video game console has, since from what I understand 32-bit floating point continues to be the state of the art in video game simulations.

Fundamentally it's still a memory limitation, just in terms of memory latency/cache misses instead of capacity. If you double the size of your numbers you're doubling the space it takes up and all the problems that come with it.

No it isn't. The 64-bit capabilities of modern CPUs have almost nothing to do with memory. The address space is rarely 64 bits of physical address space anyway. A "64-bit" computer doesn't actually have the ability to address a full 64 bits of memory.

If you double the size of numbers, sure, it takes up twice the space. If the total size is still less than one page it isn't likely to make a big difference anyway. What really makes a difference is trying to do 64-bit mathematics with 32-bit hardware. This implies some degree of emulation with a series of instructions, whereas a 64-bit CPU could execute that in one instruction. That one instruction very likely executes in fewer cycles than a series of other instructions; otherwise no one would have bothered with it.


"Bitness" of a CPU almost always refers to memory addressing.

Now you could build a weird CPU that has "more memory" than its addressable width (the 8086 is kind of like this with segmentation and 8/16-bit), but if your CPU is 64-bit you're likely not to use anything less than 64-bit math in general (though you can get some tricks with multiple adds of packed 32-bit numbers).

But a 32 bit CPU can do all sorts of things with larger numbers, it's just that moving them around may be more time-consuming. After all, that's basically what MMX and friends are.


The original 8087 implemented 80-bit operands in its stack.

It would also process binary-coded decimal integers, as well as floating point.

"The two came up with a revolutionary design with 64 bits of mantissa and 16 bits of exponent for the longest-format real number, with a stack architecture CPU and eight 80-bit stack registers, with a computationally rich instruction set."

https://en.wikipedia.org/wiki/Intel_8087


Typically, it doesn't have the ability to deal with a full 64 bits of memory, but it does have the ability to deal with more than 32 bits of memory, and all pointers are 64 bits long for alignment reasons.

It's possible but rare for systems to have 64-bit GPRs but a 32-bit address space. Examples I can think of include the Nintendo 64 (MIPS; apparently commercial games rarely actually used the 64-bit instructions, so the console's name was pretty much a misnomer), some Apple Watch models (standard 64-bit ARM but with a compiler ABI that made pointers 32 bits to save memory), and the ill-fated x32 ABI on Linux (same thing but on x86-64).

That said, even "32-bit" CPUs usually have some kind of support for 64-bit floats (except for tiny embedded CPUs).


The 360 and PS3 also ran like the N64. On PowerPC, 32-bit mode on a 64-bit processor just enables a 32-bit mask on effective addresses. All of the rest is still there, like the upper halves of the GPRs and instructions like ldd.

You misread my comment. I'm not saying that it limits the amount of memory, I'm saying that _using more memory has cost_.

> If the total size is still less than one page it isn't likely to make a big difference anyway

It makes a significant difference when you're optimizing around cache behavior and SIMD lanes.


Parts of the 360 did. The hypervisor ran in 64-bit mode, and used multiple simultaneous mirrors of the physical address space with different security properties as part of its security model.

It's not like the games weren't running in 64 bit mode too (on both consoles)

They had full access to the 64 bit GPRs. There wasn't anything technically stopping game code from accessing the 64 bit address space by reinterpreting a 64 bit int as a pointer (except that nothing was mapped there).

It's only the pointers that were 32 bit, and that was nothing more than a compiler modification (like the linux x32 ABI).

They did it to minimise memory space/bandwidth. With only 512 MB of memory, it made zero sense to waste the full 8 bytes per pointer. The savings quickly add up for pointer heavy structures.

I remember this being a pain point for early PS3 homebrew. Stock gcc was missing the compiler modifications, and you had a choice between compiling 32-bit code (which couldn't use the 64-bit GPRs) or wasting bandwidth on 64-bit pointers (with a bunch of hacky adapter code for dealing with 32-bit pointers from Sony libraries).


Games themselves ran in 32 bit mode.

The difference is that on PowerPC, 32bit mode on 64bit processors (clearing the SF bit in the MSR) is just enabling a hardware 32bit mask on the effective address before it gets translated into a virtual address.

Unlike on x86-64 and arm64, there's no free (or even that cheap) way to do an ILP32 abi purely in software. x86 and arm allow encodings for memory reference instructions that only use the bottom half of the registers (the E* registers on x86, and the W* registers on arm64). No such encoding exists on PowerPC for memory reference instructions, so you'd be stuck manually masking each generated pointer.

Because of that, the compiler hacks you're talking about are kind of the opposite of what you're describing. The hacks exist because in the upstream gcc PowerPC backend, having 32-bit pointers in hardware and having operations on 64-bit quantities shared the same feature flag, despite the two technically being separately enableable on actual hardware; it was just very rare to do so. So the goal of the hacks was to describe to the compiler a target that has 32-bit hardware pointers but can still issue instructions like ldd to operate on the full 64-bit GPRs.


I have some confidence that AMD's acquisition of ATI had a huge impact.

That allowed both a CPU and an advanced GPU to be on the same die.

They also wisely sold Global Foundries, and were able to scale with TSMC.


You have to remember that the AMD and Intel of today are very different companies than they were 20-25 years ago. AMD split off its fab capabilities, acquired ATI, adopted TSMC as a fab, and developed a custom silicon business.

At that time AMD wasn't in the custom CPU business, AMD64 was a new unproven ISA, and x86 based CPUs of that time were notoriously hot for a console. These were also some of the reasons why Microsoft moved away from the Pentium III it had used in the original Xbox.

The PS3 was launched in 2006 but the hardware design was decided years earlier to provide a reference platform for the software.


Because consoles don't use off-the-shelf CPUs for many reasons. Neither Intel nor AMD of that time would even consider making a bespoke CPU for Sony or MS.

Even if they could use an off-the-shelf SKU it wouldn't have been viable: neither one had a part that fit in the power envelope (not that it helped the Xbox...)


Consoles used off-the-shelf CPUs until the 6th generation. Even the Dreamcast and the first Xbox used off-the-shelf CPUs, it was only the PS2 and the GameCube that started the trend of using custom-made CPUs.

Not entirely accurate.

The PSX's CPU is semi-custom. The core is a reasonably stock R3000, but the MMU is slightly modified and they attached a custom GTE coprocessor. I guess you can debate whether attaching a coprocessor counts as custom or not (but then the PS4/Xbone/PS5/XBS use unmodified AMD Jaguar/Zen 2 cores).

IMO, the N64's CPU counts as off-the-shelf... however the requirements of the N64 (especially cost requirements) might have slightly leaked into the design of the R4300i. But the N64's RSP is a custom CPU, a from scratch MIPS design that doesn't share DNA with anything else.

But the Dreamcast's CPU is actually the result of a joint venture between Hitachi and Sega. There are actually two variants of the SH4, the SH4 and SH4a. The Dreamcast uses the SH4a (despite half the documentation on the internet saying it uses the SH4), which adds a 4-way SIMD unit that's absolutely essential for processing vertices.

We don't know how much influence Sega's needs had over the whole SH4 design, but the SIMD unit is absolutely there for the Dreamcast; I'm pretty sure it's the first 4-way floating-point SIMD on the market. The fact that the SH4/SH4a were then sold to everyone else doesn't mean they were off the shelf.

Really, the original Xbox using an off-the-shelf CPU is an outlier (technically it's a custom SKU, but really it's just a binned die with half the cache disabled).


They would have started designing the systems in 2003, and one of the first choices is CPU partner.

Do you trust the new line of CPUs that just launched that year?


It has been my experience that they would call a female agent when they needed to pat down another female. Has this changed?


It was a female agent, that doesn't mean that it's okay to start groping someone without warning.


Yeah, IME they recite a script (I’ve heard it many times) where they explain why they are doing it, what it will involve, and how you can refuse (in that case, it would probably mean not flying that day). And offer to do it in private.


You phrased it as "If you're not going to get consent to grope girls", so why is it relevant that she is female?


Mostly to make a point; more people are likely to care about it not being okay if it's phrased that way. (Something, something, toxic masculinity, "suck it up tough guy, can't handle a little pat-down?")

The fact that the TSA agent was a woman doesn't automatically make someone else comfortable with whatever liberties the agent feels like taking. It's still likely worse if it's a man doing it, but sharing a gender isn't an excuse for the agent to do whatever they want.


So you think that's a toxic attitude, but cater to it anyway? It's like saying "I mention she's white because racists might not care otherwise".


> It's like saying "I mention she's white because racists might not care otherwise".

Which is a very effective technique.


That's not really an equivalent statement. In general, women do face more harassment and unwanted touching, which often doesn't translate to the "white" example you give.


Because non-white people don't face more harassment from LEOs?


I'm saying exactly the opposite. Your example statement being equivalent to the one I gave would imply that white people face more harassment, which isn't the case.


But that isn't relevant to the point being made. Say "I mention they are black to pander to black supremacists" if you wish.


My job entails writing mostly ColdFusion all day long. I write new code in ColdFusion script; its syntax is heavily inspired by JavaScript, right down to the optional terminating semicolon. I still have to support a fair bit of code written in ColdFusion tag syntax, which I especially dislike given that the code base was written by amateur developers and just feels like bad PHP from 2003.


Oh I am very familiar with Coldfusion. My first job after dropping out of college [1] was doing Flash and Coldfusion work for a Martial Arts management company.

I have very mixed feelings on the language as a whole, both the tag and script versions, though they're mostly negative nowadays. I joined the CFML Slack a few months ago, which I was surprised to find, and the people on there were very nice and I respected their passion for the platform, but I personally still find the language pretty irritating, even with the scripty version.

Granted, I am very removed from web stuff now, and mostly work in data-land.

[1] I have a degree now, but that came considerably later.


There was no disputing it. Sony won the 5th and 6th.


Here's what's interesting: the Maple bus and the Dreamcast supported a second analog stick, they just never added it to the controller.


The Saturn is also my favorite. Is it bad that I think Panzer Dragoon Saga is kind of overrated? Zwei is my favorite, though I thought the music in the first one was awesome.


Yeah, that's not how you close an app on the Shield, as was pointed out.


Say what? Crazy Taxi? The Dreamcast had an amazing library! Sonic Adventure, Shenmue (1, 2), Grandia 2, Skies of Arcadia, Virtua Tennis, the entire 2K sports series, Samba de Amigo, House of the Dead, Soul Calibur, Dead or Alive 2, Jet Grind Radio, Test Drive Le Mans, F355 Challenge, Rez... the list goes on.

The weird yet cool games: Roommania, Segagaga, Seaman.

Of course many of these games got ported to other consoles later or had sequels released on other systems after the Dreamcast's demise.


The Dreamcast had some of the most innovative games by far; Samba de Amigo and Rez were unheard of in your home before the DC and laid the foundation for many games. We also got dance mats for the home.

When someone says Crazy Taxi was the only hit I can only assume they haven't played much at all, or are using some weird sales metric, by which all games were terrible on the DC because of how terribly SEGA marketed it.


Definitely sales metrics. DC didn’t sell shit.


If going by sales alone then Sonic Adventure(1) should be the only game you talk about.


Don't modern TVs come with a game mode to reduce this latency (it turns off any kind of image processing)?

I have a 12-year-old Samsung LCD monitor that is advertised as 2.5 ms.


Yes but like all non-default settings, a large portion of the player base doesn't have it enabled. Games have to be designed for a large market, not just high end OLED buyers.

Even then, most VA/IPS/LED displays have something more like 20ms of latency in game mode due to slow LCD refresh rates. Controllers are also randomly delayed by 2.4GHz interference.

This 8bitdo Pro 2 on my desk has 18ms latency all the time. It actually kind of sucks and it's one of the faster wireless controllers.


The DualSense (wired) and DualShock 4 (wireless) were some of the best controllers when it comes to latency. https://rpubs.com/misteraddons/inputlatency

Edit:// those 8bitdo controllers are pretty terrible looking at that list, Wow.


Yes. I get about 5 ms latency on my 2024 LG OLED (a bit more at 120 Hz, a bit less at 144 Hz).

But there are other sources of latency that stack.

