Hacker News | WillDaSilva's comments

There's a repository setting you can enable to prevent actions from running unless they have their version pinned to a SHA digest. This setting applies transitively, so while you can't force your dependencies to use SHA pinning for their dependencies, you can block any workflow from running if it doesn't.

A lockfile would address this issue, with the added benefit that it would work

Pulumi may be what you're looking for. Same concept as Terraform, and many of its provider libraries are just wrappers around Terraform provider libraries, but you can use a variety of common programming languages to declare your desired state, rather than HCL.


Yeah, I tried it briefly some time ago and it seems like a solution.


Timezones are not static, and actually change somewhat frequently. A program that converts a future local time to UTC risks becoming incorrect, because political changes to that timezone's rules can shift when the local time actually occurs.
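The pitfall can be sketched with Python's stdlib zoneinfo (the meeting date and zone are just examples; São Paulo is a real case, since Brazil dropped DST in 2019 after decades of observing it):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A meeting scheduled for a future local time in Sao Paulo.
meeting_local = datetime(2030, 11, 3, 9, 0, tzinfo=ZoneInfo("America/Sao_Paulo"))

# Converting to UTC "bakes in" today's offset rules (-03:00 year-round
# under current tzdata).
meeting_utc = meeting_local.astimezone(ZoneInfo("UTC"))
print(meeting_utc.isoformat())  # -> 2030-11-03T12:00:00+00:00

# If Brazil later reinstates DST, the stored UTC instant no longer
# corresponds to 09:00 local time on that date. Storing the local time
# plus the zone name ("America/Sao_Paulo") avoids this.
```

Storing `(local time, zone name)` and converting to UTC as late as possible keeps the program correct across tzdata updates.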


I was able to get a Pasmo card as an alternative to Suica, and it was rechargeable with cash. I also did the same with an Icoca card. I didn't need a Japanese address in either case. I just bought the cards from machines at a couple of the larger metro stations. They could be reloaded with cash at any metro station.


There are times when a cache is appropriate, but I often find that it's more appropriate for the cache to be on the side of whoever is making all the requests. This isn't applicable when that is e.g. millions of different clients all making their own requests, but rather when we're talking about one internal service putting heavy load on another one.

The team with the demanding service can add a cache that's appropriate for their needs, and will be motivated to do so in order to avoid hitting the rate limit (or reduce costs, which should be attributed to them).
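A minimal sketch of what that client-side cache might look like (the TTL value and `fetch_user` helper are hypothetical, not from any real service):

```python
import time

class TTLCache:
    """Minimal client-side cache: the demanding client absorbs its own
    repeated reads instead of sending them all upstream. A sketch, not
    production code (no eviction, not thread-safe)."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get_or_fetch(self, key, fetch):
        now = time.monotonic()
        entry = self._store.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]  # still fresh: no request to the upstream service
        value = fetch(key)   # cache miss: one real request
        self._store[key] = (now + self.ttl, value)
        return value

calls = 0
def fetch_user(key):  # stand-in for the real upstream call
    global calls
    calls += 1
    return {"id": key}

cache = TTLCache(ttl_seconds=60)
cache.get_or_fetch("alice", fetch_user)
cache.get_or_fetch("alice", fetch_user)  # served from cache
print(calls)  # -> 1
```

The team that owns the demanding client also owns the TTL, so they can tune staleness against how often they hit the rate limit.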


You cannot trust your clients. Period. It doesn’t matter if they’re internal or external. If you design (and test!) with this assumption in mind, you’ll never have a bad day. I’ve really never understood why teams and companies have taken this defensive stance that their service is being “abused” despite having nothing even resembling an SLA. It seemed pretty inexcusable to not have a horizontally scaling service back in 2010 when I first started interning at tech companies, and I’m really confused why this is still an issue today.


I fully agree. The rate limits are how you control the behaviour of the clients. My suggestion was to leave caching to the clients, which they may want to do in order to avoid hitting the rate limit.


>why teams and companies have taken this defensive stance that their service is being “abused” despite having nothing even resembling an SLA.

I mean because bad code on a fast client system can cause a load higher than all other users put together. This is why half the internet is behind something like Cloudflare these days. Limiting, blocking, and banning have to be baked in.
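The "baked in" part is usually some form of per-client rate limiting; a token bucket is a common choice. A sketch (the rate and capacity numbers are arbitrary):

```python
import time

class TokenBucket:
    """Per-client token-bucket rate limiter (a sketch): each client gets
    `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self._buckets = {}  # client_id -> (tokens, last_seen)

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self._buckets.get(client_id, (self.capacity, now))
        # Refill in proportion to elapsed time, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self._buckets[client_id] = (tokens - 1.0, now)
            return True
        self._buckets[client_id] = (tokens, now)
        return False

limiter = TokenBucket(rate=1.0, capacity=2.0)
results = [limiter.allow("client-a", now=0.0) for _ in range(3)]
print(results)  # -> [True, True, False]
```

A misbehaving client burns through its own bucket without starving everyone else, which is exactly the "can't trust your clients" posture.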


This is exciting to see. I run some latency-sensitive code on Lambda with the Node runtime, so cold starts are troublesome. I hope I'll be able to use this once it's in beta or fully released.


Distributing your backup over the spare storage of many other NAS servers is the main idea behind Storj, which provides a remarkably cheap price per TB per month.


It is done, albeit rarely, due to how expensive and complex an operation it is. My friend's mother moved her house a few kilometres across some farmland. It was a rather large two-story house with a basement. It had to be moved to a plot that had a similar foundation and basement prepared for it.


If they sell their land, then presumably they'd receive a substantial windfall with which they could buy a new cheaper place, or rent. If they don't receive a substantial windfall, then the amount they were paying for the LVT must've been low.


"If they sell their land, then presumably they'd receive a substantial windfall with which they could buy a new cheaper place, or rent."

Not really. You might be forcing someone out of a 2.5% mortgage into a 6% mortgage on a new property, incurring property transfer taxes, moving costs, loan underwriting, and other fees. You very well could lose money in some situations. Your argument also assumes they are not in the cheapest homes already. If they are, they could be forced out of the geographic area altogether if there are no cheaper homes (and inherently rents will be more expensive than the cheapest mortgages under that system).


Land is worthless under LVT. Its entire revenue stream is captured by the tax, and an asset without revenue is definitionally worthless.


Assuming the tax is high enough (or grows over time to become high enough) to offset the negative externalities, and that the money raised is used to offset them, it's better phrased not as "it's fine to burn up the world as long as you're rich", but rather as "it's fine to emit CO2 as long as you sufficiently offset the damage". Accounting for the damage could involve investments into green technologies, or paying ordinary people to make the tax popular, among other things.

Personally I like the idea of setting the price for emitting 1 ton of CO2 equivalent emissions to the realistic cost of capturing 1 ton of CO2. At least, that seems like a reasonable end goal for a carbon tax, since that could fully account for the negative externality. This would of course be obscenely expensive, which would be a strong incentive to lower carbon emissions where possible, and for people to consume less of products that require large emissions to make or use.
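As a back-of-the-envelope sketch of that pricing rule (the capture cost per ton is a purely hypothetical figure, since real direct-air-capture estimates vary widely):

```python
# Price each ton of CO2-equivalent emissions at the cost of capturing
# one ton back out of the atmosphere.
CAPTURE_COST_PER_TON = 600  # USD per ton of CO2 (hypothetical figure)

def carbon_tax(tons_co2e):
    """Tax owed if every ton emitted must fund capturing one ton."""
    return tons_co2e * CAPTURE_COST_PER_TON

# e.g. an activity emitting 5 tons of CO2e per year:
print(carbon_tax(5))  # -> 3000
```

Even with a modest assumed capture cost, the numbers get large quickly, which is the point: the tax makes the full externality show up in the price.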

The carbon tax would also have to apply to imported goods to be effective, and tracking how much tax should apply to imports would be even more difficult than doing so for domestic sources of pollution.

