
Just wanted to add that Nitro was built by Nik Bhatt, who was the Senior Director of Engineering at Apple leading the Aperture and Core Image teams. I believe he built Nitro specifically to fill the vacuum Apple left behind. Not sure how close it gets to the OG.


I generally agree with the sentiment—if something is free, there’s often a tradeoff. But when there’s a paid tier, the free version can act more like an entry point or hook to get users into the ecosystem, rather than relying on harvesting user data. In JetBrains’ case, broad adoption brings a lot of strategic value on its own (like establishing industry standards or building community mindshare), so it makes sense for them to offer a genuinely free version without necessarily treating users as the product.


I’m intrigued—what’s your setup? Do you use any advanced refactoring features like Change Signature or Extract Selected Members? It’s been a couple of years since I last tried setting up Java in VS Code, and I’m curious how far things have come. Intuitively, I’d expect JetBrains to have the upper hand in this area, since they can build specialized UIs for complex tasks, rather than being constrained by the limitations of the Language Server Protocol.


> I’m intrigued—what’s your setup?

Quite often it's vim/emacs with a crazy collection of plugins and custom-written scripts where the most powerful tool is a fuzzy search.

Surprisingly few people know what an actual powerful IDE can even do.


Incorrect, many of them do, and make the determination that a few nice-to-have features don't outweigh the cost, lock-in, and incongruence with using open-source software where possible.

I know several world-class developers who have invented cornerstone technologies and who use text editors without any plugins, just running the compiler in a separate window. It turns out that you actually are not held back by not having your hand held by an IDE; it was just a skill issue.


> Incorrect, many of them do,

Most of the time in these discussions it's revealed that no, they don't.

The rest is a non sequitur.

Edit: Oh. You don't know either https://news.ycombinator.com/item?id=43917174


This gives me the impression that you are speaking about something you don't have tacit experience with.


Nowadays I just use an LLM. None of these features are relevant anymore. Unless JetBrains has a proprietary LLM that is better at coding than whatever else is on the market (they don't), there's no reason to pay money for their products.

I am not even particularly bullish on AI, but failing to see how LLMs have made IDEs irrelevant is like using Vim and crying foul about IDEs without ever trying one.


The best solution I've found is to not use Prettier and instead use ESLint Stylistic for formatting:

https://eslint.style/guide/why


I agree. I've tried coding in C-like languages with proportional fonts a few times, and punctuation ends up feeling cramped, hurting legibility. We need more proportional fonts for programming where punctuation gets the same size and spacing as in monospaced fonts.


I'm also curious about this func/proc distinction and have some questions. It certainly isn't a new concept. Functional languages like Haskell have a similar distinction between effectful actions and pure functions. D also has pure functions that can only call other pure functions, but I'm not sure how well that has worked out in practice. Languages like SML/OCaml make it easy to write purely functional code, but I don't think the language enforces it.

1: Is there really no way to reuse procedural code in a pure setting? Say I have a procedure

    proc inPlaceSort(a: mutable IntArray) { ... }
Is there any way to use this proc to define a func that returns a sorted copy? For example:

    func pureSort(a: IntArray): IntArray =>
        mutable array_copy = copyOf(a),
        sort(array_copy),
        array_copy
Do I have to write two separate sorting algorithms? In Haskell I can implement both efficiently with a single implementation in the ST Monad.
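For comparison, here's roughly what that reuse looks like in Haskell (a sketch using a simple insertion sort for brevity): the single in-place implementation is written against a mutable STUArray, and runST wraps the copy-sort-read sequence into a pure function.

```haskell
import Control.Monad (forM_, when)
import Control.Monad.ST (ST, runST)
import Data.Array.ST
  (STUArray, getBounds, getElems, newListArray, readArray, writeArray)

-- One in-place implementation (insertion sort via adjacent swaps),
-- written once against a mutable array in the ST monad.
inPlaceSort :: STUArray s Int Int -> ST s ()
inPlaceSort arr = do
  (lo, hi) <- getBounds arr
  forM_ [lo + 1 .. hi] $ \i -> do
    let sink j = when (j > lo) $ do
          a <- readArray arr (j - 1)
          b <- readArray arr j
          when (a > b) $ do
            writeArray arr (j - 1) b
            writeArray arr j a
            sink (j - 1)
    sink i

-- A pure function reusing the same implementation:
-- copy the input into a fresh mutable array, sort, read back.
-- runST guarantees the mutation can't be observed from outside.
pureSort :: [Int] -> [Int]
pureSort xs = runST $ do
  arr <- newListArray (0, length xs - 1) xs
  inPlaceSort arr
  getElems arr
```

The type system (via the `s` phantom parameter of `runST`) is what lets the compiler prove the mutation stays local, which is exactly the guarantee a func/proc split enforces syntactically.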

2: Can pure functions perform logging?

3: Can a procedure declare some parameters as immutable or as output-only?

- Immutable parameters: often you will want to perform an effect with some context which you don't want to mutate. Const parameters are heavily used in C++ and Rust.

- Output parameters: these are parameters that are initialized by the callee and not the caller. Most languages don't need this as they can return one or more values instead, but C# is a language that has this specific feature.

4: Why invent a new term "expressive conditional" when "conditional expression" already exists? :^)


1. You’re correct, there’s no way to do that. Fortunately in this case, you’d want to use the native sort function anyway because it’ll be much faster than one written in JS/Bagel, and that is accessible in a pure context (as a pure function of an array or iterator; not in-place). There will be some barriers to certain optimizations in Bagel, but that’s done knowingly. Bagel is intended as an application-level language, so certain choices are made that prioritize maintainability over performance. With that said, performance is still a priority, and certain things like the focus on iterators over array-cloning are there to tackle common JS performance problems.

2. I’m planning to make an exception here for debugging purposes; I’m thinking a function or operator that logs a value and then evaluates to it. Not strictly pure, but close enough from a maintainability standpoint.

3. Yes, any type can be labeled “const”, which is recursive. If a proc has a const argument, an otherwise-mutable object can be passed to it and it will just be treated as const within that proc. However, a const object cannot be passed to a non-const parameter (or variable, etc), otherwise some other code could then mutate it. Assignable (output) parameters will not be supported, but contents of parameters can be mutated by procs. I specifically wanted to discourage people from using procs to generate values; they should only say what to do with the values.

4. I may not have been fully versed on the terminology :)
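The debug-logging escape hatch described in (2) has precedent in Haskell's Debug.Trace. A sketch of the "log a value and then evaluate to it" shape (the `logged` and `hypotenuse` names are made up for illustration):

```haskell
import Debug.Trace (trace)

-- Log a labeled value, then evaluate to it. Impure under the hood
-- (the output is a side effect), but close enough to pure for
-- debugging purposes, as described above.
logged :: Show a => String -> a -> a
logged label x = trace (label ++ ": " ++ show x) x

-- Usage inside an otherwise-pure expression: the value flows
-- through unchanged while a debug line is emitted.
hypotenuse :: Double -> Double -> Double
hypotenuse a b = sqrt (logged "sum of squares" (a * a + b * b))
```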


1. Solving that one example kind of misses the point of the comment.


Yes, I agree. It feels like some context is missing here.


[OP] Ah, I've just seen a bunch of people showcase "amazing new tools" that are just bad compilers in disguise. State-of-the-art database query optimizers are good examples of this.


I have been waiting for good external Retina displays for years. That the situation has not improved after all this time is incredibly frustrating. Why is no display manufacturer [0] interested in setting itself apart by producing and marketing a lineup of reasonably priced external pixel-doubled (~200 ppi) displays? For some reason everything must be 16:9 and 4K with no regard to display size, resulting in some very awkward pixel densities. The fact that Apple is able to mass-manufacture a 5K display with a whole computer in it for just a little bit more than an UltraFine 5K is to me an indication that it's at least technically feasible.

[0] There is LG, but these displays have their issues as the article explains.


The monitor market as a whole is pretty underwhelming.

TVs and mobile/portable devices get lots of attention, but PC monitors rank very low among manufacturers' priorities.


Focus is on 1) gaming, where response time and high frame rates are more important than high resolution, and 2) regular offices, where price is more important than any other feature.

Add to this the fact that manufacturing displays is a costly affair, so there are no "artisanal" choices.


If you're looking for a 5K display, I've found Wikipedia's list of 5K devices to be a valuable reference point: https://en.wikipedia.org/wiki/5K_resolution#List_of_devices_...

Personally, I'm hanging out for a 5120x2880 display larger than 27". We've had 27" 5120x2880 displays for 6 or 7 years now; you'd think someone would have taken that resolution to a larger panel by now. But nope, still waiting.


If a 5120x2880 display were larger than 27", it wouldn't have a good pixel density. In my experience, either slightly above 100 ppi or around 200 ppi is the sweet spot for screen resolution, because those densities work well with an integer scaling factor.
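The arithmetic behind that, as a rough sketch (diagonal ppi, ignoring bezels):

```haskell
-- Diagonal pixels per inch for a given resolution and diagonal size.
ppi :: Double -> Double -> Double -> Double
ppi w h diagInches = sqrt (w * w + h * h) / diagInches

-- ppi 5120 2880 27 ~ 218  (clean 2x scaling)
-- ppi 5120 2880 32 ~ 184  (awkward in-between density)
```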


I'm aware of all of these monitors, but most are discontinued or unavailable. Luckily I've recently been able to acquire a used LG 5K monitor and it's pretty great most of the time, but my model has some display quality problems.


It's a matter of connectors and display bandwidth. Apple's extra-expensive display doesn't use HDMI or DisplayPort connectors; it uses a proprietary connector.

Edit: I'm wrong on that. Oops. Apparently that invalidates my entire point. (It doesn't.)

Display bandwidth matters, and connectors are where display bandwidth goes to die. Apple had to design a special one for the bandwidth requirements of that display.

It's not so much a matter of panels, but of display protocol bandwidth and the connectors that allow it.

Even Microsoft has the Surface Studio, with its excellent 4500x3000 display, and it's only available as part of the Surface Studio because that's the cheapest way they can get the display bandwidth all the way to the screen: eliminate the connectors and hardwire it.

This is also why laptops (especially Apple laptops) have such good displays. They don't have to destroy the signal integrity with connectors, and they can use more wires to carry the signal than HDMI or DisplayPort allow.

High-resolution, high-framerate monitors just won't happen over DisplayPort or HDMI without serious advances. I expect a new connector to appear before that.


DisplayPort over Thunderbolt is not a proprietary connector (this is what the Pro Display XDR uses).


It's really stupid to force Thunderbolt usage though when USB-C has an alt mode for DP directly.


I believe the reason, IIRC, is that it's actually driving two DP streams, because one doesn't have enough bandwidth for the full resolution. You can't do that in alt mode, where the PC is literally using the pins of the connector as though it were a single DP cable (although this may be more viable with newer DP specs these days; not sure).


The DP altmode didn’t provide enough bandwidth, which is why the best experience is over the Thunderbolt alternate mode.

Edit: you can in fact get HBR3 working; what doesn't run at full rate over DP instead of Thunderbolt is the USB 3 hub on the display.


Oh it does? Nice. Kinda expected Apple to go full asshole mode on that.


You mean Intel - older Thunderbolt displays don't support USB-C alt-mode because Intel didn't support it as input until Titan Ridge Thunderbolt controllers.


DisplayPort 1.4 (quite old and common) has 8k at 60Hz. DisplayPort 2.0 has 16k at 60Hz. How is that not enough?


I think at the time the 5k Mac screens arrived only DisplayPort 1.2 was available, and probably some of the chips they use still only support that. But indeed - since time moved on and DisplayPort can now do 8k/60Hz it would be nice to see more screens going beyond 4K and supporting it.


Why in the world do you need proof from me?

The proof is that video cards and monitors and TVs either simply don't support that bandwidth, or they're so expensive that they're essentially hand-made.

That's how you know: you look at the market and see what's available and what it costs. People would buy the heck out of this stuff if it were possible to make it work with common, sloppy, reusable connectors like HDMI or DisplayPort. The fact of the matter is that it isn't possible to make this stuff with removable connectors, yet.

That's why you see very high-DPI displays in applications where there isn't any need for high-cycle-count connectors, or only single-use connectors, well before you see the same displays with high-cycle-count connectors.

Or they use display compression to fake it.

But what do I know? You did some Google searches.


The "serious advances" is DSC, which the XDR is already using to support 6k over a single DisplayPort HBR2 link. Which, as others mentioned, can be carried via USB-C alt-mode in addition to Thunderbolt.

6k 120Hz and 5k 144Hz are both possible over a single HBR3 link with DSC. Such monitors don't exist because the panels don't exist; rather the closest panel for sale is perhaps the 5120x1440 at 240Hz in the Samsung G9. Which has the same bandwidth requirements as a hypothetical 5120x2880 at 120Hz.
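The bandwidth-equivalence claim checks out on raw pixel rate (a back-of-the-envelope comparison that ignores blanking overhead):

```haskell
-- Raw pixel rate in pixels per second (blanking intervals ignored).
pixelRate :: Integer -> Integer -> Integer -> Integer
pixelRate w h hz = w * h * hz

g9 :: Integer
g9 = pixelRate 5120 1440 240               -- Samsung G9 panel

hypothetical5k120 :: Integer
hypothetical5k120 = pixelRate 5120 2880 120  -- hypothetical 5120x2880 at 120Hz

-- Both come out to 1,769,472,000 pixels/s.
```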

If you want 240Hz at >4k, or 8k 120Hz, then sure, that exceeds existing DisplayPort 1.4 and Thunderbolt 4 bandwidth capabilities even with DSC, and you'll need the upcoming DisplayPort 2.0 link rates (or DSC + chroma subsampling).


It's using a Thunderbolt 3 connector; I wouldn't call that proprietary. There are quite a few TB3 monitors out there.


It uses a USB-C connector, although, given the bandwidth and power requirements, I guess it has to be rated for the task.


Looks like a really great app! However, screenshots from older versions of macOS on a product page always make me a bit worried about the future of an app. Do you intend to update the UI and such to match Big Sur and the upcoming Monterey?


It's been on my list and we've made some progress but to be completely transparent, I've been extremely overloaded and it's been hard to set aside time for it.

So, I'll try to get to it eventually but no promises on timeline.


Completely understandable. Best of luck!


I doubt they would intentionally limit the internal resolution in order to obtain a high maximum external resolution.


You're forgetting about the Pro Display XDR.

They want to make sure their 6k display is supported across their entire lineup. Even the M1 Air supports it.


They do. From the Mac mini's specs:

- HDMI 2.0 with up to 4k60

- DisplayPort with up to 6k60

And their laptops have lower-than-4K resolutions.

