Hacker News | piliberto's comments

> One reason projects with large binary files don't use Git is because, when a Git repository is cloned, Git will download every version of every file in the repository.

Wrong? There's a --depth option for git clone and git fetch that lets the user specify how many commits of history they want to fetch from the repository.
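A minimal local sketch of what --depth does (the repo names here are throwaway; file:// is used because plain local-path clones ignore --depth):

```shell
# Build a throwaway repo with two commits, then shallow-clone it.
git init -q src
git -C src -c user.email=a@b -c user.name=t commit -q --allow-empty -m c1
git -C src -c user.email=a@b -c user.name=t commit -q --allow-empty -m c2

# Depth-1 clone: only the most recent commit's history comes down.
git clone -q --depth 1 "file://$PWD/src" dst
git -C dst rev-list --count HEAD   # prints 1: only the tip commit was fetched
```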


--depth is broken; it can't be used dependably with submodules (including recursive submodules) because most hosts will refuse to serve unadvertised refs. We learned this the hard way. Or maybe it's submodules that are broken. Or Git itself.
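This failure mode is reproducible without any hosting provider, since upload-pack refuses unadvertised objects by default (uploadpack.allowReachableSHA1InWant is off out of the box, which is roughly what the big hosts do too). A local sketch, with made-up repo names:

```shell
# Submodule repo with two commits; the superproject pins the OLDER one,
# which is exactly the case hosts refuse to serve shallowly.
git init -q sub
git -C sub -c user.email=a@b -c user.name=t commit -q --allow-empty -m s1
OLD=$(git -C sub rev-parse HEAD)
git -C sub -c user.email=a@b -c user.name=t commit -q --allow-empty -m s2

git init -q super
git -C super -c protocol.file.allow=always submodule add -q "file://$PWD/sub" sub
git -C super/sub checkout -q "$OLD"
git -C super -c user.email=a@b -c user.name=t commit -qam pin

# The shallow submodule clone gets only the tip (s2), then has to ask for
# the pinned commit by SHA; upload-pack refuses the unadvertised object.
git -c protocol.file.allow=always clone --depth 1 --recurse-submodules \
    --shallow-submodules "file://$PWD/super" copy \
  || echo "shallow recursive clone refused"
```

A plain (non-shallow) recursive clone of the same superproject succeeds, because the full submodule history contains the pinned commit.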


--depth, submodules and multiple worktrees are all half-baked features that work fine up to a point, then start falling over, most notably if you try to use them together.


--depth just lets you control the amount of history. It won't let you exclude unwanted files at the tip of that history. So while the quoted statement isn't strictly accurate, excluding large files isn't what this feature is intended for.
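For completeness: the Git feature actually aimed at skipping file contents is partial clone (--filter), not --depth. A local sketch (repo names are placeholders; the server side has to opt in with the two uploadpack settings below, which major hosts enable):

```shell
# Server side: a repo with one biggish file, opted in to partial clone.
git init -q origin-repo
git -C origin-repo config uploadpack.allowFilter true
git -C origin-repo config uploadpack.allowAnySHA1InWant true
head -c 100000 /dev/zero > origin-repo/big.bin
git -C origin-repo add big.bin
git -C origin-repo -c user.email=a@b -c user.name=t commit -qm big

# Blob-less clone: commits and trees come down, blobs are only promised.
git clone -q --no-checkout --filter=blob:none "file://$PWD/origin-repo" lazy
git -C lazy rev-list --objects --missing=print HEAD | grep -c '^?'  # prints 1: big.bin's blob is absent

# Touching the file fetches its blob on demand.
git -C lazy checkout -q HEAD -- big.bin
```

So history depth and file content are controlled by two different mechanisms, and only the latter addresses the large-binary complaint in the quoted article.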


Yes, but 95% of devs, even fairly talented ones, don't really know how to use Git.


Author seems to be a manager, not necessarily a dev.


Git is fundamentally very simple. Any dev who doesn't understand exactly how it works is not even remotely talented.

