Yeah - it's a selective testosterone blocker, kind of, and tries to target the part that causes male pattern baldness without lowering overall T levels.
I assume it still blocks enough hormones to cause mood shifts or other effects?
> Yeah - it's a selective testosterone blocker, kind of, and tries to target the part that causes male pattern baldness without lowering overall T levels.
> I assume it still blocks enough hormones to cause mood shifts or other effects?
Endocrinology is a lot more complicated than you're giving it credit for. DHT blockers don't necessarily lower testosterone levels; they can actually increase it (although even then, the mechanism isn't as direct as you might think).
It's neither established nor a given that any side effects of finasteride have anything to do with effects on testosterone or hormone levels at all. A lot of people make that assumption, and there's reason to suspect there's truth to that hypothesis, but it's entirely possible the side effects arise through some other, unknown mechanism of the drug, and there hasn't been enough study to understand it (in part because the side effects are relatively rare and weakly established).
It’s not a testosterone blocker at all. It blocks 5-alpha reductase, an enzyme that converts testosterone to DHT. It can actually increase serum testosterone by more than 10%.
As for DHT, that hormone doesn’t appear to have much significance for adult males. It’s critically important in puberty, though!
Blocking the conversion of testosterone to DHT (the effect of this medication) causes an indirect and minor increase in both testosterone and œstrogen levels, although DHT is more potent than testosterone for the receptors in many tissues.
Are you a climate scientist? Do you have any understanding of how CO2 lowers alkalinity in a solution or what impacts that might have on the planet? It seems short-sighted to say “it’s not even close”
Ocean acidification is small fries compared to how much impact thermal effects have. Just about every area of concern when it comes to climate change - heat waves and extreme weather events, agricultural impacts, sea level rise - comes from thermal imbalance alone.
Can you "reduce output" globally, to negative values, within the next 5 years?
Because that's what's required to match the predicted effects of doing stratospheric aerosol injection at scale.
Currently, the temperature is still "chasing down" the sheer amount of CO2 that was emitted over time. Even the completely unrealistic scenario of reducing emissions to zero instantly would cause climate change to continue for a while.
Geoengineering offers a range of sharp, cost-effective interventions that can knock the temperature down more quickly and more directly.
you just restated what you already said in response to the question, "do you know what you're talking about?"
i'm not judging either way, i'm not a climate scientist and i have no opinion on the importance of ocean acidification, i just find it obnoxious when someone's asked to defend their position and they just say it again, but _harder_.
Unfortunately, I do know what I'm talking about. Which is where my sheer hatred for environmental activists is coming from.
The top 3 enemies of doing something about climate change are: fossil fuel megacorp PR and lobbying efforts (no surprise), mainstream media (little to no surprise) and environmental activists (fucking why).
I agree that we need to have a conversation about geoengineering. And I've been staying up to date with the sulphur regulation thing. That being said, what do you think of the position that we should avoid temporary solutions to global warming in order to drive a sense of urgency? You could make the argument that we want the slope of temperature change to be as high as possible in the near term to drive political action. Again, I have no doubt that geoengineering will become the only viable solution. But right now a significant % of the population doesn't even believe in climate change and wouldn't support any action taken. So maybe they need to be convinced - and so far education hasn't convinced them, so maybe 5 degrees Fahrenheit will.
I do not think that "climate change accelerationism" is a defensible position.
We are fighting climate change not to feel good about ourselves, but to prevent those higher-degree impacts from happening in the first place.
What's worse is that climate change has a considerable momentum. If you resolve to hit +2C before taking climate action, then even stopping all GHG emissions instantly would leave you with another ~+1C that would trickle in over time. In reality, there is no fucking way to obliterate all GHG emissions overnight.
Geoengineering solves a lot, but it doesn't delete all of the problems outright. Unless you commit to some truly unhinged methods. Which might not be the worst idea, really - but then every problem we have with making geoengineering happen applies at least tenfold.
Yeah I was mostly playing Devil's advocate. This sulphur cloud thing has been driving me nuts for over a year so I've entertained the accelerationist concept to save my sanity. It's incredible that it's not talked about more and yes it does call into question the rationality of many climate activists. At the end of the day I don't think the monkey brain evolved to handle this type of decision making.
Both of these games are still going. Atlantic has a huge player base. It’s not the cutthroat game it once was but it’s still very much exciting. You can still die and have all your shit go poof. Housing on Atlantic is still in demand and hard to get, if that gives you an idea of how healthy it is.
EQ has of course had some major server merges, but your old account will still be on both UO and EQ.
To me UO is a breath of fresh air after 20 years of trash games, except for a standout few. Seeing my old wood elf ranger with swift wind and lupine dagger still glowing was magical. Almost as magical as re-exploring Kelethin.
Not only is it still going but it's possible to dust off your ancient account. An old guild mate reported that he had recovered his account originally closed ~2000. Sadly, of the two accounts I had back in the day the only one still recoverable was my alt/mule account. My original account was one of the, well, original accounts from opening day, so it's a shame it was lost.
But still, it was fun to run around with one of my guys for a couple of hours. One thing I thought was cool was there had been some custom content involving my guild added near the bank where my guild hung out. It was still there, all these years later!
Because builds are gated by test coverage, people write tests for coverage and not for functionality. I’d say a good portion of the inherited tests I’ve run into wouldn’t catch anything meaningfully breaking in the function being tested.
Your issue is with targeting a metric then (coverage), not the unit tests. Good unit tests can be so useful. I've got a project currently that can't be run locally because of some dependencies, and coding against unit tests means I get to iterate at a reasonable speed without needing to run all code remotely.
I spent 3 years getting a Ruby codebase to 100% branch coverage, running locally in a few minutes (I wasn't just looking at coverage, I was also profiling for slow tests). Found a few bugs, of course, from having to read through the code so carefully. The real value was having a layer of defence while refactoring: if some unrelated test failed, it implied you'd missed some impact of your change. It also helped people avoid making changes to an area of code with no testing, since existing tests act as docs (which execute, so they don't go stale as easily) and make it easier to write new tests for new code by building on the existing ones.
This codebase was quick to deploy at Microsoft. We'd roll out every week, compared to other projects that took months to roll out through tangled release pipelines.
Anyways, I left for a startup and most of this fast-moving team dissolved, so the Ruby codebase has been cast aside in favor of projects with tangled release pipelines.
Politics isn’t a line. Center in most countries is “fine with whatever the government tells me” from forced medical experimentation to genocide.
You’re right tho in that it does seem like people who reject the lies their government tells them may be slightly more likely to say things that upset you on the internet (since I’m guessing that’s what you mean by persecution).
I was using persecution as a response to the parent post, which I took to mean that the far left and right are more likely to persecute political opponents and to expect persecution themselves, so they are reluctant to approve of government surveillance out of fear that they'll be on the receiving end.
> Center in most countries is “fine with whatever the government tells me” from forced medical experimentation to genocide.
This is a strawman, plain and simple. Calling the political center of any country "fine with genocide" isn't an argument, it's a smear. I'm a centrist democrat, I'm not fine with either of those things and you'd be hard pressed to find someone who is. You can disagree with moderates like me, but painting them as compliant with atrocities is dishonest and lazy.
> Politics isn’t a line
No disagreement here, sometimes it's a horseshoe and you're proving the theory true.
> Calling the political center of any country "fine with genocide" isn't an argument, it's a smear.
Well, the US isn't really doing genocide, so yeah, it's a smear.
But moderates run the status quo. MLK talked about how the biggest hurdle to integration and civil rights wasn't racists - it was the moderate white man. Ultimately things are the way they are because there are so many moderates who propose no solutions to anything and are terrified of anything that could be misconstrued as a change.
Moderates are, by and large, complacent about the atrocities currently going on in the government they belong to. Moderates in Israel certainly seem fine with genocide. Moderates in the US are fine with the US' war-mongering. They're fine with the pseudo-slavery system that exists in some US states like Georgia.
I wish I had a dollar for every time someone tried to clumsily wield MLK's Birmingham Letter as a moral fulcrum to dismiss modern centrism. That letter was written at a specific moment in time, aimed at a narrow group of people -- white moderates who used "order" as an excuse for inaction on civil rights. He wasn’t condemning all moderation for all time, and invoking his letter every time a moderate calls for practical reform over revolution cheapens its message.
> Ultimately things are the way they are because there are so many moderates who propose no solutions to anything and are terrified of anything that could be misconstrued as a change.
Are moderates "terrified" of the change itself, or do you disagree with how they want to get there? When I call myself a moderate, I don't do it because I'm "terrified" of a better healthcare system; I do it because I want reforms that are realistic, durable and broadly supported. I want change, but I want a solid plan and incremental steps to get us there -- not fiery rhetoric and slogans that crash the economy.
> Moderates in Israel certainly seem fine with genocide.
I'm not Israeli and don't pay attention to their politics, but you're playing a rhetorical shell game here by exporting the moral failure of one electorate and applying it to moderates of all electorates across the entire world. If you want to critique e.g. US policy, do it on the merits, not by hitching your argument to atrocities halfway around the world and then implying guilt by association.
> They're fine with the pseudo-slavery system that exists in some US states like Georgia.
I'm pretty in tune with US politics and I can't figure out what you're referencing here. Are you talking about for-profit prisons and prison labor? I agree that those things need to be abolished, and that's why I supported Biden ending federal contracts with private prisons. As someone who has had several family members in the prison system, I sincerely hope to see more criminal justice reform.
This is sound advice for keeping yourself free from malware as well. Many of these TVs end up running super vulnerable junk that doesn’t get updated and has known exploits.
I’ve had two devices end up with malware like this: a Sony Blu-ray player that was uploading 2 gigs a month before I caught it, and a Samsung TV.
It’s worth mentioning that you have to block the device or change your WiFi credentials. The device with malware may attempt to connect to any known WiFi network even if you disable WiFi on the device itself. I get 45,000 auth attempts a day from my TV.
That is not what is happening. Listen to Ursula. She’s telling you what is happening. EU countries are being “allowed” to go into debt without triggering EU debt procedures. It won’t be reinvestment. It will be dilution of the currency through debt. Something all too familiar to Americans.
Correct. Interestingly enough, it will massively increase the supply of euro bonds, and probably pull in a bunch of cash that goes to US treasuries now.
If there are enough pan-European bonds (which there won't be), then the reserve currency status of the dollar could be threatened.
Can you really make these claims when streamers for months were showing people looting? What about CHOP? What about that federal courthouse in Portland?
My city had hundreds of stores ransacked by groups of cars with hundreds of people all working together. Say what you want about opportunists - riots happened.
What about it? Did it burn down? How many were injured?
> streamers for months were showing people looting
It's interesting that people criticize the professional news media, then believe whatever they see on social media.
> riots happened
I expect some did, depending on the definition of riot, but that doesn't amount to 'nationwide rioting', nor does it justify calling in the military to suppress the great majority of peaceful protests.
There was rioting in many major cities and it was absolutely appropriate to call in the national guard to put a stop to it. My family didn’t leave Bangladesh to put up with third-world behavior like that.
Protest isn't third-world behavior, and part of being in the US is not restricting other people's liberties - regardless of your personal preferences and outrage.
If they are harming others or causing significant material damage, that's another story, but even that is a matter of degree (especially the latter).
Just for giggles, what projects do you work on that require such variation in tooling that something like this becomes worthwhile?
I always see these types of arguments for why nix is so great, but it’s never been a pain point for me across 10+ languages and 20 years of development experience. I see your example of bash scripts, but this can’t all be about writing scripts.
Not the OP, but I work in consulting. When I was still hands on keyboard, this would have been very helpful for the clients who don’t provide their own hardware or environment for us to use. I also do work for extremely large organizations who have literally dozens of different stacks accumulated over the decades.
In addition, I play with all sorts of open source tools and they often come with their own tool chains and expectations. Python version management in particular benefits a lot from this level of isolation. Instead of figuring out the different version management tools for each stack I use a higher order environment management tool in Nix.
Some others are solving these issues with containers, and that’s a part of the nix strategy as well.
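
For the Python case specifically, the isolation boils down to something like this - a minimal sketch only; the interpreter version and packages here are just illustrative nixpkgs attribute names, not anyone's exact setup:

    # shell.nix - a sketch; swap python311 and the package list for whatever
    # the project actually needs
    { pkgs ? import <nixpkgs> { } }:
    pkgs.mkShell {
      packages = [
        (pkgs.python311.withPackages (ps: [ ps.numpy ps.requests ]))
      ];
    }

Anyone who runs `nix-shell` in that directory gets that interpreter and those libraries on their PATH, regardless of what's installed system-wide.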
I've previously used Nix to manage C/C++ projects and ended up with a really nice flow, so I really want to use Nix for Python, since I've had so many issues with conda. However, every time I've tried, I've run into enough issues trying to get a lot of ML packages I use to work (dealing with transitive dependencies on esoteric packages, mostly) that I couldn't justify continuing rather than just hacking my way to getting the conda environment working with random pip packages, pinned versions, etc.
I've been considering an AI project for consuming a conda build recipe and digging into the codebase to extract extra info about the project and make it into a nix flake--which would be a bit more stable. I figure you could test for equivalence in a few ways and feed the outputs of that test back into the model. Hopefully there's enough context in the conda recipe, the project codebase, and whatever errors pop out to get some of them converted with minimal handholding.
Because regardless of what the cool kids are doing, important work is being done in conda, and usually by people whose expertise isn't software packaging.
Yeah, I get the idea, but I’m asking the OP for concrete examples. Python has its own environment management options that work well. I’ve read on this site over and over what it can do - I’m wondering if anyone has hard examples of tooling they switch between often enough to make it worthwhile.
Scripts are the main place where it matters tbh - most language ecosystems have their own way of doing this stuff, if you can stay within the language you're fine. But if you (or your client) have a culture where people throw in awk/grep/sed then there's just no real alternative. Or if it's a polyglot project where you have three different languages (including shell) then you may not be able to use a single language package manager.
Agreed, but people have a tendency to do otherwise. At the very least you still have to install the right version of the language. And there are probably a few other tools, linters and such... Next thing you know you've got quite a pile that's not covered by your language's package manager.
I find different languages/ecosystems have different cultures around this. In Java/Maven land it's fairly common to have a self-contained project where all the helpers like linters etc. are set up in Maven so all you need is a vaguely recent JVM and vaguely recent Maven. But there are other ecosystems where people like to throw a bunch of shell scripts etc. in.
When Python comes up as an example of a problematic packaging ecosystem, Java often comes up as an example of it being done right. I think the key is the cultural difference you're pointing to. JVM folk are not tempted to stray from the JVM. Python folk think of Python as a convenient harness for that cumbersome bit of FORTRAN that they can't live without.
I only worked in a Java shop once, but I remember that they looked at me like I was an alien when I proposed that we involve a subprocess written in a different language.
At the time I thought they were insane for writing everything themselves but I've since seen how gnarly packaging can get and now think that they're... less insane.
The most recent offenders were nodejs and kustomize used as part of a test flow orchestrated by a Makefile, run both locally and in CircleCI.
People will just install the latest version and start hacking away, and now you've got all this code that depends on that version. Backwards compatibility ain't perfect, so maybe several years later the original author doesn't work here anymore, tests are breaking in subtle ways when you install what's now the latest version, and there's nobody to ask what the "right" version is.
But since we're a culture that uses these tools (though I wish we weren't), this story has played out several times so different projects need different versions--you can't just discover the right one once and leave it installed in your system, you have to install the right one for your project and change it when you switch to a new project.
For the most part these are go projects, so even though there is language-specific dependency locking via go.mod and such, dependencies which aren't go libraries but which are nonetheless needed to work with the project (e.g. make) are left as an exercise to the reader. Make is pretty well behaved, I haven't had to do much version antics with it, but I wouldn't say that's the norm.
When I find one of these repos I put my archaeologist hat on and write a flake.nix to provide whatever the dependency is, and then I walk the version backwards until it starts working. That way next time I'm in that project I don't have to go through that exercise again.
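
The flake itself is tiny - roughly this shape (a sketch; the nixpkgs branch and the tool list are placeholders, and that ref is the knob you walk backwards until things build):

    # flake.nix - a sketch of the pin, not a known-good one
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.11";
      outputs = { self, nixpkgs }:
        let pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          devShells.x86_64-linux.default = pkgs.mkShell {
            # the tools the Makefile assumes are just lying around
            packages = [ pkgs.gnumake pkgs.nodejs pkgs.kustomize ];
          };
        };
    }

`nix develop` in that repo then hands everyone the same make/node/kustomize, and the flake.lock records the exact nixpkgs commit so it stays that way.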
To make matters worse, people often try to help by adding entries to the makefiles which download the correct version for that project, but some people have the newer arm chips and others are still on x86 so confusion of a new kind ensues. Of course it's easy to fix these scripts to detect the local architecture, but that's a whole extra step.
And then maybe you're trying to make this stuff work in CircleCI or somesuch, and you don't want the workflow to just be reaching out via https and blindly running whatever comes across the wire, because who knows if it'll be the same thing tomorrow, so you add hash checks. But once you've got the hash checks and the architecture checks and you're checking the right hash for your architecture... well, you've basically got a poor man's flake.lock at that point. Might as well use the same nix config in both places rather than use homebrew or apt or whatever locally and then figure out how to do the same thing via CircleCI orbs in YAML - and god forbid you have to do it in prod too, so now there's a Dockerfile... Having a single source of truth for dependencies and using it everywhere is super handy.
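
For comparison, those architecture-plus-hash checks collapse into something roughly like this on the nix side (a sketch with a made-up tool and URLs; lib.fakeHash just makes the first fetch fail and print the real hash to paste in):

    # a sketch, not a real tool: one entry per architecture, one hash each
    { pkgs ? import <nixpkgs> { } }:
    let
      sources = {
        x86_64-linux = {
          url = "https://example.com/sometool-v1.2.3-linux-amd64.tar.gz";
          hash = pkgs.lib.fakeHash;
        };
        aarch64-darwin = {
          url = "https://example.com/sometool-v1.2.3-darwin-arm64.tar.gz";
          hash = pkgs.lib.fakeHash;
        };
      };
    in
    pkgs.fetchurl (sources.${pkgs.stdenv.hostPlatform.system})

That only covers the fetching part the Makefile hacks were doing - you'd still wrap it in a small derivation to unpack it and put it on PATH - but the arch selection and the hash pinning come for free.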
That's work. Another example is in my personal projects. I use helix, so there's a .helix/languages.toml in my repo which defines the language servers that I'd like it to use for that project. But merely pointing helix at the language server isn't enough, it also has to be on the PATH. My older projects are using mypy, and my newer ones are using pyright (python type checkers).
Sure I guess I could just install both at the system level everywhere I go, but when I clone the project on a new machine I want to have everything work right away--I don't want to start coding and then wonder why my editor sucks and then go discover and install the right LSP and then resume coding. I'd end up with a smattering of different versions installed across all of my devices, even for the same project. If I find a bug which happens on this machine but not that one, I'd have a much harder time knowing where to start re: debugging it.
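
Concretely, the per-project file is tiny - something like this (a sketch, not my exact file):

    # shell.nix - pyright for the newer repos; the older ones list mypy instead
    { pkgs ? import <nixpkgs> { } }:
    pkgs.mkShell {
      packages = [ pkgs.pyright ];
    }

Clone the repo on a new machine, enter the shell, and helix finds the right language server on PATH with no manual install step.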
Finally there's this idea that maybe you don't even have to clone the repo, you can just reference the app you want to run from anywhere. I invoke some of my tools like:
nix run git+ssh://[email protected]/myorg/myrepo -- mycli --best-arg
Nix knows how to set up the environment for running the app (the one I have in mind is written in python, so a certain version of poetry is involved etc...) so I really don't need the caller to think about that environment at all. I like this because it decouples the orchestrator from the executor. So if I can manage to get something working locally with one of these commands, then I can go put the command someplace weird like in an Airflow DAG (kubernetes pod operator, NixOS image) and I have a pretty strong assurance that it'll work the same in the remote environment as it does locally.
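
The flake behind a command like that has roughly this shape (a sketch with made-up names and a toy program; the real default package wraps the poetry-built project and dispatches on the args it's handed, e.g. `mycli --best-arg`):

    # flake.nix - a sketch of the "runnable from anywhere" shape
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
      outputs = { self, nixpkgs }:
        let pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          packages.x86_64-linux.default = pkgs.writeShellScriptBin "myrepo" ''
            # stand-in for the real entry point (a poetry-built environment)
            echo "would dispatch: $@"
          '';
        };
    }

Everything after `--` lands in the program's arguments, and nix builds whatever the program needs before running it, which is why the Airflow/Kubernetes side never has to know anything about poetry or Python versions.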
From the perspective of a nix user, these problems are all the same problem and they're everywhere and nix is the only thing that solves all of them at once. My feeling is that from the outside, they look like separate problems, and it's not clear just how many of them are solved by nix--so the juice doesn't appear to be worth the squeeze.