
A simple way to debunk this fake complainer pretending to be a French non-profit is to check whether it's present in the Répertoire national des associations (RNA).



The keyword you missed is 'idle'. An app should use as much RAM as needed to cache computation and network results as it runs. But it shouldn't do so when idle in the background. The RAM used here is not for accelerating the user experience. It's for the internal functioning of the managed language (JavaScript) and the virtual machine (DOM and browser API), because the app has been packaged as a full Chrome clone running a single web app.


2048 times


I was being generous and assuming the chat app "only" used 1GB of RAM.


I share with the author this youth where a child learns coding before everything else. I really loved coding and made it my career. Yet I don't think I would have been among the recognized geniuses had I been born earlier. I don't think any of them spent most of their time smashing keys. They were rather conceptualizing and planning stuff, and had human skills I could only dream of.

That being said, we untalented programmers are experiencing what most jobs suffered over the last two centuries: massive automation of their everyday activities. I especially identify with those traditional farmers who took their own lives as their way of life was wiped out by artificial fertilizers, mechanization, chemicals, and hyperscaling.


What about generating an answer and scoring its confidence in parallel, then running a second LLM to rephrase the answer accordingly: "I vaguely remember Bob's birthday is 1st March, but I may be wrong; I should search the web."
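
A rough sketch of that idea, purely illustrative: chat() stands in for whatever LLM client you use, and the 0-100 confidence scale and the prompts are assumptions rather than any particular API.

    # Sketch of the two-pass idea: draft an answer, score confidence,
    # then rewrite the answer so its tone matches that confidence.
    # chat() is a stand-in for whatever LLM client you use.

    def chat(prompt: str) -> str:
        raise NotImplementedError("plug in your LLM client here")

    def answer_with_hedging(question: str) -> str:
        # Pass 1: draft answer plus a rough self-reported confidence (0-100).
        draft = chat(f"Answer concisely: {question}")
        confidence = int(chat(
            f"Question: {question}\nAnswer: {draft}\n"
            "On a scale of 0-100, how confident are you that this answer "
            "is factually correct? Reply with a number only."
        ))
        # Pass 2: rephrase so the wording reflects the confidence, e.g.
        # "I vaguely remember ... but I may be wrong, I should search the web."
        return chat(
            f"Rewrite this answer so its wording reflects a confidence of "
            f"{confidence}/100, hedging and suggesting a web search if "
            f"confidence is low:\n{draft}"
        )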


Plucker was amazing. I used it to transform online docs into Palm ebooks.


You're supposed to use a venv for everything except the Python scripts distributed with your OS.
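
For reference, a minimal sketch of doing that from Python itself via the stdlib venv module (the .venv directory name is just a common convention):

    # Roughly equivalent to `python -m venv .venv` on the command line:
    # create an isolated environment instead of installing into the system Python.
    import venv

    venv.create(".venv", with_pip=True)  # then: .venv/bin/pip install <package>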


This is brilliant. Check out my [Anatomically Correct Fish Tank](https://drawafish.com/fishtank-view.html?id=gYCWJ24lhxSvR4lG...)


This article reads like fantasy fiction: 'We thought antibiotics were to blame, but actually, it's NO2' (the next 5G?), while it has been widely recognized for the last ten years that the primary culprit is neonicotinoids: very potent and pervasive chemicals that accumulate in the biotope, killing all insects indiscriminately, contrary to the misleading claims made by the agro-industry.


Varroa mites are widely considered a greater cause of bee population decline than neonicotinoids.


And neonicotinoids are thought to directly increase susceptibility to Varroa mites.


Well, who's giving them the neonicotinoids? Just get them to stop.


> it has been widely recognized for the last ten years that the primary culprit is neonicotinoids

What would be your best source to back that up?

(I'm not trolling - we've been having a vivid debate about that exact topic for the past few weeks in France, and one common counter-point is that the decrease in bee population is multifactorial, as opposed to having any "primary" culprit. So any source is welcome :) )


Neonicotinoids kill all insects. They are extremely good at this. No one contests this. CCD ("Colony Collapse Disorder") started as neonicotinoid usage rose, and so did the "windshield phenomenon" that all rural residents over 50 can tell you about.

If there were no parasites, no pathogens, and no predators, then bees would certainly be less affected by pesticides. But parasites and pathogens existed before, while habitat loss and monoculture farming don't explain what happens in relatively preserved areas.

All pesticides obviously have an impact on insect populations. I agree one should not focus on only one class of them, and should work on an actual reduction of their usage while compensating farmers for profitability losses due to changes in pest management strategies.

You don't need to comment on this. I know it won't be done and we're all screwed.


So why the distraction?


I like how the author is surprised by the technological aberration that is Linux-powered home appliances. A Node server to power and publish a website, an API, and a web socket over WiFi, while the site is displayed by an outdated webview engine within a heavily constrained terminal which can't be reused for anything else. That's... the norm.

All this is very common. And yet displaying a couple of digits and a bar graph could be done with a pair of microcontrollers communicating over some wired bus.

With the power supplies of this era, this pair of devices probably draws 16 W idle. Running 24/7, they probably consume as much as a whole small fridge. The LCA (life-cycle assessment) of the solution must be dismaying as well, especially compared with a few one-dollar microcontrollers.

The worst of all is that this whole mess probably turned into bricks three years after it was installed, maybe less.
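
(To illustrate the "pair of microcontrollers on a wired bus" point above, a minimal MicroPython-flavoured sketch; the board, pin numbers, UART framing and SSD1306 display are all assumptions, not anything from the article.)

    # MicroPython-style sketch (pin numbers, UART framing and the ssd1306
    # driver are illustrative assumptions): one microcontroller sends a
    # reading over UART, this one draws it as a bar graph on an I2C display.
    import time
    from machine import I2C, Pin, UART
    import ssd1306

    uart = UART(1, baudrate=9600, tx=Pin(17), rx=Pin(16))
    i2c = I2C(0, scl=Pin(22), sda=Pin(21))
    display = ssd1306.SSD1306_I2C(128, 64, i2c)

    while True:
        line = uart.readline()            # e.g. b"42\n" meaning 42 percent
        if line:
            try:
                value = max(0, min(100, int(line)))
            except ValueError:
                continue
            display.fill(0)
            display.text("{}%".format(value), 0, 0)
            display.fill_rect(0, 20, 128 * value // 100, 20, 1)  # bar graph
            display.show()
        time.sleep_ms(100)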


The reason why the Mirai botnet is still at large is: Android.

From a business perspective, nobody wants to pay the costly people that can do microcontroller programming. Frontend devs are dirt cheap, especially for something as simple as that interface displaying the bar charts.


From an employee perspective, it was my impression that EE developers tend to get lower salaries than web developers.

But it could be the case that building an Android or web app for a simple UI would take fewer dev-months than an embedded app with similar functionality.


> building an Android or web app for a simple UI would take fewer dev-months than an embedded app with similar functionality

I'm quite sure that's correct.


There is also an enormous amount of flexibility gained when, instead of designing and building your own single-purpose device, you just use a cheap, mass-produced, off-the-shelf, general-purpose device.


> nobody wants to pay the costly people that can do microcontroller programming

The embedded world isn't known for paying well.


Actually competent microcontroller programmers definitely earn way more than bargain-barrel front-end devs.


You can easily say the same for the actually competent front-end devs.


Yes but the original point is that 'competent front-end devs' aren't needed in this case, so they can get away with the bottom of the barrel.


> within a heavily constrained terminal which can't be reused for anything else.

Except for botnets and/or spying. Some of those boards already contain MEMS microphones and cameras (the box in the picture even shows the camera lens). I'd have taken the device apart to take a look inside, or at least run some diagnostics to see which hardware was installed/detected.


I wonder if cat /dev/video1 would be enough to turn it into a surveillance device..


Not sure about cat, but if one could sneak netcat or, better, ffmpeg onto Android, then opening an audio/video channel to the outside could become trivial.


He probably would get more savings by removing the fuse again than keeping that useless thing on…


Pulling wires through anywhere after it's finished is an immense installation hassle though. It might be possible, or it might be completely impractical even if you can (e.g. low-voltage buses and unshielded power wires don't play nice together if they run in parallel).


Yeah, I don't understand why he is shocked that this communicates wirelessly. He even bought a modern flat with Ethernet because he clearly knows how much of a pain it is to add wiring to a house. Very weird.


It's painful to add wiring to a house as a random normie user. It's not painful if you're an installer or a construction person.

Or at least this is my understanding of why all construction is still done in a way that makes it near-impossible for the user/dweller to change anything without having to do a general renovation of half the place. Which, again, is hard primarily because it makes no sense to buy all the hardware that makes such work easy for a single job.


It is still painful if you're an installer. You have to chase walls, pull up carpets and floorboards, install trunking etc.

Why on earth would you do that when you can just use wireless? It would make zero sense.


> With the power supplies of this era, this pair of devices probably draws 16 W idle. Running 24/7, they probably consume as much as a whole small fridge. The LCA (life-cycle assessment) of the solution must be dismaying as well, especially compared with a few one-dollar microcontrollers.

At the average cost of electricity in the USA, this amounts to under $2/month. Seems negligible to me?

https://www.wolframalpha.com/input?i=16+watts+*+24+hours+*+3...
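
For reference, the back-of-the-envelope arithmetic behind that figure (the ~$0.17/kWh rate is an assumed rough US average):

    # Back-of-the-envelope check; the $0.17/kWh rate is an assumed rough
    # US average, not a figure from the thread.
    watts = 16
    kwh_per_year = watts * 24 * 365 / 1000        # ~140 kWh/year
    cost_per_month = kwh_per_year * 0.17 / 12     # ~$2/month
    print(round(kwh_per_year), round(cost_per_month, 2))  # 140 1.99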


For a device with so little functionality, and non-critical functionality at that, I wouldn't call $24/year negligible. My whole-home Ryzen router/server idles around there. Honestly, I'd bet the fuse was missing because the last tenant was an engineer, investigated this thing themselves, found it a useless waste, and pulled it.


$2 is the median daily wage in several countries. Negligible in the US, but not everywhere.

source: https://ourworldindata.org/grapher/daily-median-income


I don't think places like that have folks in new construction with Ethernet and Android tablets in the wall. TFA uses GBP as the currency in the images.


Yes, but those countries also have different energy cost.


you must be fun at parties.


Still cheap, but the cost per kWh in the UK is at least twice the average in the US.


Please send me $2/month if you don't mind.


Sure, no problem. Just power my 16 W tablet in exchange. That's what this thread is about: a tiny cost for a tiny convenience.


On the contrary, using HTML, an age-proven open technology, guarantees that even decades later the product will be usable and easily expandable.

On the other hand, adding one more graph to the microcontroller-powered solution would probably mean redoing it from scratch.


> A Node server to power and publish a website, an API, and a web socket over WiFi, while the site is displayed by an outdated webview engine within a heavily constrained terminal which can't be reused for anything else. That's... the norm.

I really wonder why this happens. It seems penny-wise and pound-foolish. Perhaps they failed to hire the right developer for the right abstraction level, and ended up with "web developers", I guess.


Money. That's almost always the reason for "why would they do that?"

It's much cheaper and more sustainable for the wealthy and powerful to train individuals on very high-level technologies, then reuse their skills in every way they can, regardless of feasibility, the economic and ecological footprint, or any concern outside of making a profit.

Electron is not some comic book villain. JavaScript is not horrible and can be the optimal choice for many software applications.

But these technologies and tools are easy to teach to many workers who may or may not understand the computational architecture well enough to come up with better economic efficiencies, have an interest in applying their skills to properly solve a problem rather than just put food on the table, and so on.

The higher-level the skill, the less interest and deep systemic understanding is needed for the job: many new jobs created.


> And yet displaying a couple of digits and a bar graph could be done with a pair of microcontrollers communicating over some wired bus

Can you expand on that? Would 7-segment displays and a couple of LEDs be enough? Which hardware would you use?

