Hacker News | s_tec's comments

Nifty! I recently bought a RISC-V VisionFive 2 Lite SBC, which required a lot of mucking with firmware and talking to the U-Boot serial console before it would boot Linux for the first time. A tool like this would have been super-handy during that time.

On the other hand, I'm a low-budget hobby user. I like things that are cheap, easy, and hackable. It sounds like your product might be for more-advanced users? Or do all these fancy features stay tucked away until you need them? If you make your product cheaply, that might hurt profit margins, but it might also open up the low-end market. I have so many questions about the business side of this.

But really, I am most curious about the user experience. It's not super-helpful if learning the tool becomes its own project, so I'm hoping it's simple.

Edit: Oh, it's a software project. I thought it was a hardware project. My bad.


This is more of a compressed-air battery than a sand battery, except that the "air" is CO2 and it's "compressed" enough to cause a phase change.

Heat-based energy storage is always going to be inefficient, since it's limited by the Carnot efficiency of turning heat back into electricity. It's always better to store energy mechanically (pumping water, lifting weights, compressing gas), since these are already low-entropy forms of energy, and aren't limited by Carnot's theorem.

I don't know much about this CO2 battery, but I'm guessing the liquid-gas transition occurs under favorable conditions (reasonable temperatures and pressures). The goal is to minimize the amount of heat involved in the process, since all heat is loss (even if they can re-capture it to some extent).
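The Carnot ceiling mentioned above is easy to sanity-check. A quick sketch, with illustrative temperatures I picked (not numbers from the article):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of stored heat recoverable as work (Carnot limit)."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative numbers: heat stored at 550 C, rejected at 25 C.
eta = carnot_efficiency(550 + 273.15, 25 + 273.15)
print(round(eta, 2))  # 0.64
```

So even a very hot thermal store gives back well under two-thirds of its energy as electricity, before any real-world losses.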


Thermoelectric cooling is pretty inefficient, because the materials need to balance competing requirements:

- Good thermal insulator
- Good electrical conductor
- Good semiconductor

This is because the hot & cold sides are sandwiched closely together as a PN junction, so once you move heat from one side to the other, it just leaks right back. Mechanical cooling doesn't have this problem, because the hot & cold sides are separated by thin bits of tubing. This makes the thermal leakage a "minor annoyance" in a mechanical system as opposed to "literally the whole problem we're trying to solve" as it is with thermoelectrics.

One work-around is to stack lots & lots of thermoelectric coolers on top of each other. That reduces the temperature difference at each individual PN junction, which in turn lowers the leakage. That's what this team is doing, but using layers that are only a few nanometers thick, so they can fit dozens or hundreds of junctions in a single package.


Twenty years ago there was a company trying to commercialise thermoelectric cooling based on a vacuum gap: https://web.archive.org/web/20031213235132/http://www.coolch...

They claimed 55% Carnot efficiency based on a 30-100 angstrom gap maintained by piezoelectric controllers, and a method to construct large electrodes with matched surfaces so that the gap could be maintained over a large area. It all sounded plausible but never went anywhere as far as I know.

Incidentally that means all their patents will have expired...


But isn't condensation-based cooling like 500% efficient?


The Carnot limit is the theoretical upper limit of the efficiency of a heat pump, so the stated number is presumably with respect to that, not heat moved per unit energy input like you're quoting.


A Carnot heat pump maintains the temperature in a house at 20C on a day when the temperature outside is 5C. What is the coefficient of performance of the heat pump?

The coefficient of performance (COP) of the Carnot heat pump is 19.5.

The coefficient of performance of a typical heat pump in a British home is around 4.

There is obviously a huge difference between 4 and 19.5 - although a good chunk of this is explained by large temperature differentials in the condenser and evaporator, and a British preference for using a water heating loop.
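The 19.5 figure falls straight out of the Carnot COP formula, using the temperatures from the example above:

```python
def carnot_cop_heating(t_hot_c: float, t_cold_c: float) -> float:
    """Ideal heat-pump COP (heat delivered per unit work), Carnot limit."""
    t_hot = t_hot_c + 273.15   # C -> K
    t_cold = t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

print(round(carnot_cop_heating(20, 5), 1))  # 19.5
```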


What if you use the technology in places where you actually want to maximize heat conductivity?

I'm thinking of the separation walls in counterflow heat exchangers (only useful at the end where the incoming stream is closer to its end temperature than the delta offered by thermoelectrics I guess). Can it do whatever it does across a temperature gradient?


That's an idea, but wouldn't you have to power each one, which then destroys the efficiency anyway?


I'm not an expert, but my understanding is that silicon really loves to be with lithium, which makes it a really energy-dense anode material for batteries. It's also cheap and abundant. The problem is that absorbing lithium causes silicon to mechanically expand, which quickly destroys cell life. The anode physically crumbles away with repeated charging and discharging.

These researchers have found a way to make a silicon electrode like a sponge, which helps with the mechanical problems. Their test cell has pretty "normal" degradation of 80% capacity retention over 1700 cycles, which is incredibly good for silicon. Normally it would be in the 10's or 100's, iirc.
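As a back-of-the-envelope check (my arithmetic, not the paper's), 80% retention over 1700 cycles implies a tiny per-cycle fade:

```python
# Per-cycle retention r satisfies r ** 1700 == 0.80
per_cycle = 0.80 ** (1 / 1700)
print(f"{per_cycle:.6f}")  # ~0.999869, i.e. ~0.013% capacity lost per cycle
```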


This isn't the charging circuit - that goes in the charger. This circuit is responsible for monitoring the state-of-charge (for that little LED bar graph on the front), disconnecting the cells if something goes wrong, and negotiating available current with the tool. It should also be responsible for cell balancing, but it looks like Milwaukee forgot to implement that feature (oops).

The videos at the bottom of the article have most of the details, since those dive into the communications protocols as opposed to the raw schematics.


Their M12 batteries don't have balancing (or a BMS inside), so they go out of balance and 'bad' very quickly. I've just added a balance plug to the outside of mine that I plug into my hobby Li-ion charger.


I believe you on the technical details, but as an anecdote, I have M12 batteries that are 10 years old and still working fine. At least, good enough that I have noticed no issues with them and I don't even know off the top of my head which of my batteries are newer and which are older. I also have a bunch of M18 tools and batteries, and I've noticed no particular difference in how they age compared to the M12 ones. But I'm just a DIY homeowner, so my usage is relatively light.


Yeah I figure most of them work fine, otherwise I'd find more similar stories.

We have about 10 of the M12 batteries, about half are the 3 cell ones and half the 6 cell larger capacity ones. And every single one has gone completely out of balance within 2 years of use.

I plug them into a balancer and they last another solid 6 months or so.


Fun fact, DeWalt 18/20V batteries also don't have balancing. The batteries have pins going to each cell for it but no chargers actually use them.


It's such a strange thing to do unless they wanted to intentionally make them have short lifespans.

The M12s have the pins too for voltage sensing in the charger (and tool, maybe?), but they have 1 MΩ resistors in line, so they cannot be used for balancing.
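Ohm's law shows why a 1 MΩ in-line resistor rules balancing out (cell voltage assumed, not measured):

```python
cell_v = 4.2            # fully-charged Li-ion cell voltage, assumed
r_series = 1_000_000.0  # the 1 megaohm in-line resistor
bleed_current = cell_v / r_series
print(f"{bleed_current * 1e6:.1f} uA")  # 4.2 uA: fine for sensing, useless for bleeding a cell down
```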


DeWalt also makes a series of batteries that will either provide ~20v (5s3p) or ~60v (15s1p) depending on the tool you plug it into. I've taken one of these apart before and how they implement this is just a mechanical switch between the cell groups, which only gets depressed when inserted into 60v tools. Not sure if that's why they put all the circuitry on the tool side, but I can't think of another good way to implement the same. (You'd need a pretty beefy boost/buck controller for the amps these tools pull)
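A toy model of that series/parallel switch, assuming a 4.0 V max cell voltage (the "20V"/"60V" names are max-voltage marketing figures):

```python
CELLS = 15
CELL_MAX_V = 4.0  # assumed max per-cell Li-ion voltage

def pack(series: int):
    """Return (pack voltage, parallel groups) for a given series count."""
    assert CELLS % series == 0
    return series * CELL_MAX_V, CELLS // series

print(pack(5))   # (20.0, 3) -> "20V" mode, 5s3p
print(pack(15))  # (60.0, 1) -> "60V" mode, 15s1p
```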


There are always two charging circuits for Li batteries, one in the battery and the other in the charger. They pair together to negotiate the voltage and current. There's a dedicated protocol to do this.


This is absolutely not "always" true. There are tons of 18650 cells that have no electronics whatsoever, and there are tons of dedicated charging ICs that connect directly to cells/batteries.


This is an excellent idea! It would probably make the most sense to perform these adjustments on April 1.


I assumed the post title meant nanometers. Why? Floating-point rounding bugs. A nanometer is about 9e-15 degrees of latitude, which is right about where a double-precision floating point number runs out of digits. So, if a piece of software uses exact `==` equality, it could easily have a bug where two positions 3600 nanometers apart are seen as being different, even though they should be treated as the same.
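A quick illustration of both halves of that claim (the nanometers-per-degree figure is approximate):

```python
import math

NM_PER_DEGREE = 111_000 * 1e9  # ~111 km per degree of latitude, in nanometers

# A double's resolution (ulp) at mid-latitudes is already sub-nanometer:
ulp_nm = math.ulp(45.0) * NM_PER_DEGREE
print(f"{ulp_nm:.2f} nm")  # ~0.79 nm

# ...so exact equality on computed coordinates is fragile:
print(0.1 + 0.2 == 0.3)                            # False: classic rounding mismatch
print(math.isclose(0.1 + 0.2, 0.3, abs_tol=1e-9))  # True: compare with a tolerance instead
```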


Thank you. People can be very bad about judging which scenarios are truly implausible.

Here’s a previous thread where someone thought it was absurd that there could exist native English speakers who don’t regularly go shopping, and treated that supposed impossibility as a huge “checkmate”!

https://news.ycombinator.com/item?id=32625340


React Native added auto-linking years ago, which solved the native dependency problems. Just `yarn add` whatever you need, and if it has native code, the Android side will incorporate it on the next build. On the iOS side, you do have to run `pod install` to lock in the changes, but everything after that is automatic.

Use Expo because you like the extra features it ships with, but not because you have problems with native dependencies. The React Native built-in experience is pretty much perfect to start with.


Each OS process has its own virtual address space, which is why one process cannot read another's memory. The CPU implements these address spaces in hardware, since literally every memory read or write needs to have its address translated from virtual to physical.

The CPU's address translation process relies on tables that the OS sets up. For instance, one table entry might say that the 4K memory chunk with virtual address 0x21000-0x21fff maps to physical address 0xf56e3000, and is both executable and read-only. So yes, the OS sets up the tables, but the hardware implements the protection.

Since memory protection is a hardware feature, the hardware needs to decide how fine-grained the pages are. It's possible to build a CPU with byte-level protection, but this would be crazy-inefficient. Bigger pages mean less translation work, but they can also create more wasted space. Sizes in the 4K-64K range seem to offer good tradeoffs for everyday workloads.
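Here's a toy version of that table walk, using the 4K mapping from the example above (single-level and no permission enforcement; real page tables are multi-level):

```python
PAGE_SIZE = 4096  # 4 KiB pages

# virtual page number -> (physical frame base, flags), per the example above
page_table = {
    0x21: (0xF56E3000, "r-x"),
}

def translate(vaddr: int):
    """Split a virtual address into page number + offset, then look it up."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    frame, flags = page_table[vpn]  # a KeyError here would be a page fault
    return frame + offset, flags

paddr, flags = translate(0x21ABC)
print(hex(paddr), flags)  # 0xf56e3abc r-x
```

The hardware does exactly this lookup on every access, which is why the CPU caches translations in a TLB.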


Just because some people do bad things does not mean you get to paint the whole group with those crimes. Imagine if you had said, "Black people are criminals, because a black person robbed the liquor store down the street last week." We all (hopefully) recognize the deep racism in this statement!

Unfortunately, there are parts of the country where this type of racism is acceptable and even common. There are similar attitudes towards Catholics, due to America's history as a predominantly Protestant country. The formula is the same both ways - pick a heinous crime committed by a few members, blame it on the group as a whole, and feel smug about yourself.


I'm attacking an institution, not individuals.

