
> packet-loss

For one, it can adequately show whether your WiFi is performing well...

In a home network setting I've got a UDP display sink (64x64px, RGB) with a custom protocol. It works fine -- no hard numbers, as the performance varies based on other factors, including congestion. I've been able to push it to >90fps, but around 42fps there is virtually no flicker.

For context, in my protocol I use all 512 bytes, with the first 3 being the "line number and such" header and the rest containing ~128 pixels.
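
A rough sketch of what a sender for that kind of protocol could look like -- the 3-byte header layout, the RGB888 pixel encoding, and the address below are my assumptions for illustration, not the actual firmware's format:

    import socket
    import struct

    PANEL_ADDR = ("192.168.1.50", 7777)   # made-up address/port for the display sink

    def send_line(sock, line_no, pixels):
        """Send one line/segment of the frame. pixels: list of (r, g, b) tuples."""
        # Assumed 3-byte header: line number, pixel count, reserved byte.
        header = struct.pack("BBB", line_no, len(pixels) & 0xFF, 0)
        payload = header + bytes(c for px in pixels for c in px)
        assert len(payload) <= 512   # keep within the 512-byte payload budget
        sock.sendto(payload, PANEL_ADDR)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Smoke test: one solid red line of a 64px-wide panel.
    send_line(sock, 0, [(255, 0, 0)] * 64)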


I monitor packet loss on both sides all day every day. It’s still a thing, but different than I’d imagined:

- Episodic, most often. Something transient causes high loss for a short time. This happens both locally and "in the cloud".

- Persistent, due to a bad connection or very high network load. I only really ever see this locally.

But I can go days at a time and not lose even one of millions of probes and responses.

When connections are good, they’re excellent these days. When they’re bad, well that doesn’t seem to have changed.
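
For illustration, a minimal probe/response loss counter can be sketched like this -- it assumes something on the far side echoes UDP datagrams back, and the address, counts, and interval are placeholders rather than the setup described above:

    import socket
    import time

    TARGET = ("203.0.113.10", 9000)   # placeholder echo endpoint
    sent = received = 0

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)

    for seq in range(1000):
        payload = seq.to_bytes(4, "big")
        sock.sendto(payload, TARGET)
        sent += 1
        try:
            data, _ = sock.recvfrom(64)
            if data == payload:
                received += 1
        except socket.timeout:
            pass  # counted as a lost probe
        time.sleep(0.1)

    print(f"loss: {100 * (sent - received) / sent:.2f}% over {sent} probes")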


I would give the LLM statistical tools that find general anomalies, distributions, etc. in the spreadsheet. Then it's about the LLM interpreting (or not) those results in natural language.

Of course this is a can of worms for a product, because we still can't guarantee accuracy.
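
As a sketch of the kind of deterministic tool you could hand the LLM -- the function name, column name, and z-score threshold here are just illustrative choices:

    import statistics

    def find_outliers(rows, column, z_threshold=3.0):
        """Flag rows whose value in `column` is more than z_threshold standard
        deviations from the mean; a deterministic result the LLM only has to
        interpret, not compute."""
        valid = [r for r in rows if r.get(column) not in (None, "")]
        values = [float(r[column]) for r in valid]
        if len(values) < 2:
            return []
        mean = statistics.fmean(values)
        stdev = statistics.stdev(values)
        if stdev == 0.0:
            return []
        return [r for r, v in zip(valid, values)
                if abs((v - mean) / stdev) > z_threshold]

    # e.g. with rows loaded from a spreadsheet export via csv.DictReader:
    # anomalies = find_outliers(rows, "monthly_spend")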


That makes sense to me in terms of tech, but I don't get how the economics will work. If one company has an agent that's a great statistical tool, another is great for making charts, another is great for grammar and spelling, etc., I'm guessing it'll cost a bazillion dollars in subscriptions.


Reduce Transparency in Accessibility settings removes the glass effect, but I believe it has been updated to be closer to the translucent effects of current iOS.


It's sad that so many of the settings people use to make Apple's products better/more usable always seem to be hidden in Accessibility. I'm sure that says something.


That building for accessibility helps more than just disabled people?


I find the "reduce motion" toggle to be a more pleasant experience on iOS as well.


Also, this is way better than on Android, where "remove animations" makes apps feel like a dumpster fire: many of them lose the parts of the UI that were animated instead of showing them statically, touch feedback often gets lost, things still wait on animations so you're stuck waiting a second or two for nothing, etc.


Another option is a water slide decal. It can give a really seamless look, but it is time-consuming and expensive.


Really cool!

A quick enhancement that would help mobile users is setting the appropriate touch-action on the drawing element. Currently I am unable to drag the pen to draw -- the whole page moves instead. With that fixed, I should also be able to use my Apple Pencil.

Specifically, this page describes it better: https://developer.mozilla.org/en-US/docs/Web/CSS/touch-actio...


Hi, thank you very much for your feedback! Unfortunately, I don’t have plans to support mobile so far, but maybe one day I will! Please stay tuned as the project continues to grow!


Better to draw the boxes, print, then scan. I reckon it would somehow be possible to see through the sharpie -- for example, from a change in shine.


As a hobbyist I use it to drive a HUB75 LED matrix. I use the W version, so it receives UDP packets for each line to draw a full frame. This equates to about 5 Mbps at my desired framerate, and it handles the task like a champ!
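
Back-of-envelope check of that figure, assuming a 64x64 RGB panel and roughly 40 fps (the panel size comes from my other comment; bytes-per-pixel is an assumption):

    # 64x64 pixels, 3 bytes per pixel, ~40 frames per second
    width, height, bytes_per_px, fps = 64, 64, 3, 40
    payload_bps = width * height * bytes_per_px * fps * 8
    print(payload_bps / 1e6)   # ~3.9 Mbps of raw pixel data; per-packet headers
                               # and UDP/IP overhead push the total toward ~5 Mbps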


I can't stress enough how fast it is when used in resource-constrained envs like a Pi Zero.

I intend to use the system python there, but previously poetry would simply crash the whole Pi while installing itself.


What really helped me was doing some sessions with an orthoptist to reeducate my eyes. I used to sometimes see double when stressed, and could never imagine converging/crossing my eyes while retaining focus. With the reeducation I was able to see the Impossible one in focus after a couple of tries.


I had to see an eye doctor at the hospital when I was ~7 and got to do some exercises, but I never learned to cross my eyes. It seemed like it probably wasn't very important, since I didn't have to go back to the doctor and no one mentioned it again, so I just went on with my life; overall, not being able to cross my eyes hasn't been a huge problem. But I guess it may be connected to my complete inability to see 3D effects, or to figure out how to see anything in the images in the article.


I tried to make a bidirectional channel from a webpage to a python script over MIDI. I'd just found that with sysex you can pack any arbitrary data that you want, that python can create virtual MIDI devices, and that Chrome can then connect to such devices.

I'm sure I had more code than what's in my 8-year-old repo, but the premise is simply https://github.com/prashnts/midipacks/blob/master/midipacks/...
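
Not necessarily what's in that repo, but the python side can be sketched with python-rtmidi roughly like this (the port names and the 0x7D "non-commercial" manufacturer byte are arbitrary choices here):

    import rtmidi  # python-rtmidi

    # Expose virtual MIDI ports and wrap arbitrary bytes in a sysex frame
    # (0xF0 ... 0xF7). Sysex data bytes must stay in the 0-127 range, so real
    # payloads need an encoding like 7-bit packing or base64 first.
    midiout = rtmidi.MidiOut()
    midiout.open_virtual_port("midipacks-out")   # visible to Chrome's Web MIDI

    def send_blob(data: bytes):
        midiout.send_message([0xF0, 0x7D, *[b & 0x7F for b in data], 0xF7])

    midiin = rtmidi.MidiIn()
    midiin.open_virtual_port("midipacks-in")
    midiin.ignore_types(sysex=False)             # sysex is filtered out by default
    midiin.set_callback(lambda event, _=None: print(event[0]))  # event = (message, delta_t)

    send_blob(b"hello from python")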

