One big disadvantage of using the domain name in the module specification is that, as a package provider, you are more or less permanently chained to the hosting provider of your choice, e.g. github.com. Moving your package would destroy its "identity".
It is bad enough how much of a monopoly and lock-in GitHub has today (probably the reason Microsoft bought it), and a language environment shouldn't contribute to or even amplify that role.
The alternative (as with e.g. npm) is being tied to the hosting provider for the package ecosystem. If you want to, you can use your own domain name for your Go package URL and then link it to a repo hosted elsewhere. There's really no namespace on the internet that you have a more permanent claim on than a domain that you yourself have registered.
In the case of npm you could also use something like Sonatype Nexus, push your packages to your own npm repository and install them with --registry or something like that: https://stackoverflow.com/a/35624145
I also ran GitLab in the past: https://about.gitlab.com/ but keeping it updated and giving it enough resources for it to be happy was troublesome.
There's also GitBucket: https://gitbucket.github.io/ and some other platforms, though those tend to be a little bit more niche.
Either way, there are lots of nice options out there, although I'd still have to admit that just using GitHub or the cloud version of GitLab would be easier for most folks. Convenience and all.
Not at all. If you just reference a package by its name, that is good enough for a compiler. I think automatic downloading of packages from third-party sites doesn't belong in a language and its native tooling. Downloading and installing third-party packages could be done by a tool shipped alongside the language's core tools, but it should not be required just to be able to compile a project.
Strictly speaking, as with cargo/rustc, the Go module system and Go compiler are separate, and the latter can run without calling the former. There are various flags to tell the "go" command (which is a frontend to other tools) how to behave when modules are involved, e.g. so that it works well on airgapped networks and locked-down intranets or can be used safely with untrusted source. You can also still vendor modules to ensure they are locally available.
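For the air-gapped case, a rough sketch of what that looks like in practice (the corp host name below is made up; the flags and environment variables are the standard ones):

    # vendor all dependencies into the repo so builds need no network at all
    go mod vendor
    go build -mod=vendor ./...

    # or forbid network access outright and rely on the local module cache
    GOPROXY=off go build ./...

    # for private/intranet modules, skip the public proxy and checksum database
    GOPRIVATE=*.corp.example.com go get corp.example.com/team/lib@latest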
That's why there's https://go.dev/ref/mod#vcs-find .
Your import path doesn't have to map directly to VCS repos, as long as you can serve an HTML meta tag that redirects it to wherever your current repo is.
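For example, something roughly like this served from the page at the module's URL (the module path and repo URL below are placeholders):

    <meta name="go-import" content="example.com/mylib git https://github.com/someuser/mylib">

When you run go get example.com/mylib, the go command fetches https://example.com/mylib?go-get=1, reads that tag, and clones the GitHub repo, so the repo can move as long as the tag keeps pointing at the right place.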
But you can't serve this meta tag if you don't control the URL/domain anymore, and you can't force all of your users to use e.g. an intercepting HTTP proxy for such requests.
How exactly? If e.g. the registrar seizes the domain name that I originally used and gives it to someone else, what, exactly, are my plans supposed to be for that?
Thanks for the link. When I started learning Go modules that spec was not available (or it was a lot smaller). After a brief look, I feel the Go modules spec is longer and more complex than the (original) Go language spec, which is quite ironic :-D. I still like Go, although it's sad to see it straying from the 'one true' path of simplicity.
It's a bit shorter than the current spec, but it does cover more: how it works conceptually, how the default toolchain works, and how it interacts with a lot of things outside of a world it creates on its own (which is what a language is).
The benefit is that you're not creating a new directory of names; you're reusing an existing, well-established one.
Creating a central authority for names is a lot of work.
I'm not that experienced with Go, but I believe it's possible to create a vanity package name on a domain you control. If you want to change hosts, you can just point your domain at something else.
Yes, if your module URL is on a site you control, you can serve a <meta name="go-import" ...> tag to redirect it to your source repo. The module URL is permanent but you can move the repo.
It's a bit more fiddly if your module is part of a monorepo and doesn't live at the root. In that case your go-import tag needs to point to a GOPROXY server. I have a proxy server here: https://github.com/more-please/more-stuff/tree/main/gosub
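In that setup, the go-import tag points at the proxy using the special "mod" VCS type, e.g. (host and module path here are made up):

    <meta name="go-import" content="example.com/monorepo/subpkg mod https://gosub.example.net">

which tells the go command to fetch that path over the GOPROXY protocol from the given server instead of cloning a VCS repo.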
Indeed, not creating a new directory of names is an advantage. It also means all the short names are effectively reserved for the standard library, and there's no scramble among developers to squat on all the "good" names.
I don't write Go myself, but man, did they get a lot of big decisions right. I'd be totally open to writing Go in the future, but I have Java so I don't really need it. I envy Go's build and deploy stories, though.
The obvious advantage of using domain names and in general URLs as the package names is that the Go project doesn't have to run a registry for package names. Running a registry is both a technical and especially a political challenge, as you must deal with contention over names, people trying to recover access to their names, and so on. By using URLs, the Go project was able to avoid touching all of those issues; instead they're the problem of domain name registries, code hosting providers, and so on.
I'm a simple man: I experiment a lot, I create a module locally and bam, right from the beginning I have to decide where I will host it. Very often I don't want to make it public, so I keep it only on my machine, but I often need to share the module among my several machines (laptop, mini desktop) without publishing it on GitHub, and it's annoying that there is no easy way to do this AFAIK. In a better world I would create a "module" in a folder, give it a symbolic name at most (like 'ShinyModule'), and share it in various ways: over samba, ftp, ftps, sftp, or https. On the consuming side you would just import 'ShinyModule' and have a single file per consuming module that maps that name to a location.
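For what it's worth, the closest approximation I know of today is a replace directive in the consumer's go.mod, pointing the (still URL-shaped) module path at a local or mounted folder. All the names and paths below are made up:

    module example.com/consumer

    go 1.21

    require example.com/shinymodule v0.0.0

    // never published anywhere; resolved from a local or shared folder instead
    replace example.com/shinymodule => ../ShinyModule

The replaced folder just needs its own go.mod declaring module example.com/shinymodule; nothing ever has to exist at that URL.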
And you can do all these things smoothly with Rust's cargo: use a local relative path, use a git URL, or use a published package name. It's perfect if you want to try things out and hack around on a dependency.
It's not because the tooling is better, which also happens to be true by far, but because they didn't tie themselves down to a domain-name scheme. Funny, given that Go waited a long time to take a shot.
Rust has a different problem: too many dead packages with desirable names on crates.io. There's a lot of derelict cruft in that shared namespace, especially for packages outside those most commonly used.
The company I am currently at has changed the domain name of the internal GitLab server three times in two years. The older domains are tentatively supported, but they behave differently with respect to authorization, so... we had to switch our imports wholesale, or the builds would just keep breaking for no apparent reason.
Don't play naive... the laws and regulations are not put in place for the share of business owners/managers who wouldn't abuse their employees out of principle even when they could; they're for the other ones, who do.
There are actually employees a company is allowed to abuse more than their "regular" workers, namely managers.
If there were a legal category of "startup employee", with significantly fewer protections, but who could only be employed at a minimum wage of e.g. 3x the average income, would you object to that?
Makes sense, though. Overclocking APUs in laptops is much trickier and riskier than on desktop CPUs, especially since the optimal settings have already been tuned by the OEM and locked in firmware based on the known thermal and VRM power design limitations, so overclocking won't get you any gains anyway but might brick your system.
If you really do want to tinker with your laptop APU in less risky ways, try this:
UXTU is unfortunately not very well written. I actually have a commit on this repo (actually on the handheld version, because for some reason that's a hard fork) to stop it from leaking process handles (a resource that can't be reclaimed on Windows except with a reboot)... and it adds up fast, since UXTU runs `powercfg` every 2 seconds or so.
The good news is that we can start helping right away; there is no need to wait for these long debates about the role of the first world towards the second and the third. That will take far too long. We can each pick two or three people from the third world and send them money to acquire the means of surviving the coming crisis. How much have you thought of donating monthly/yearly? I can put you in touch with people ready to act.
I'm not mastax, but I personally think borders are stupid and we shouldn't be locking people into the place they were born. I want us to move beyond such feudal ideas.
Of course I also understand that unrestrained migration right now would cause massive problems. The real problem here is not the migration itself, but what's causing it. Most people (apart from some adventurous souls) don't normally want to leave the place they were born and where their family lives, except when driven to leave by circumstances, whether those circumstances are poverty, civil war or climate change.
If we have a problem with migration, we should do more to address the causes of migration. And not create more causes for migration.
>>we should do more to address the causes of migration
In my opinion not even God himself (if he exists) can address the causes of migration unless He changes the natural rules overnight. The idea that you and I, or even Elon Musk plus Bill Gates, can actually do something to address the actual causes of migration sounds very naive, horribly naive, and please don't take that as an offense. Let's say for a moment that there is a miracle and "we" fix poverty and the climate worldwide tomorrow. How long do you think that will last? 20-30 years, until the world population doubles again? What do you do then, when that overpopulation combined with the corresponding consumption and pollution brings us back to the current situation? Will you invite "us" all again to fix everything back? How are "we" going to fix this situation? By each of us working overtime, or what?
The world population is unlikely to double again. In fact, a lot of countries have shrinking populations. And you know what reduces both population growth and poverty? Education.
I strongly disagree with your defeatism. We have the ability to solve problems, and we have done so in the past. We could do so again. The only thing we lack is the political will.
Good intentions and high hopes are worth close to zero. IMHO at least one of us is badly disconnected from reality. In my part of the world "we" are failing to educate our own kids worse than ever; how on earth are "we" going to educate the remaining 3-4 billion people... "Somebody do something!!!"
I disagree. Good intentions would help a lot. The problem is that in some areas, people with bad intentions dominate.
However, my purpose was initially to point out the possibility. The possibility is there. The next step is what we do with it. If you start out by giving up, you're never going to get anywhere. That's your disconnect.
I'm trying to understand what this is, and it seems I'm missing a lot of context. Could someone help me understand what this is and why it is important? Thank you.
Reddit announced new pricing for their API. The developer of Apollo (an iOS app) said that with the new pricing he couldn't afford to keep the app running, and he started talking with Reddit to see if they were open to making some changes.
They were not, and they made a lot of false statements about the Apollo developer. To show that those statements are false, he is presenting a lot of evidence; one piece of that evidence is this code.
go mod init example.com/hello
and the same path is also hardcoded in all the import statements:
import "example.com/hello"
What if you move the code to another domain? You have to change the module path and every reference to it, instead of only the references.
I find it quite cumbersome/counter-intuitive even after all these years.
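For reference, this is roughly what such a move looks like today (the new path example.org/hello is made up): the module line can be rewritten with a real go subcommand, but the import statements still need a blunt find-and-replace.

    # rewrite the module line in go.mod
    go mod edit -module example.org/hello

    # rewrite the import paths in the source files (GNU sed shown)
    find . -name '*.go' -exec sed -i 's|example.com/hello|example.org/hello|g' {} +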