Just a tiny project with over 100 million downloads every month, over 4 million every day. No big deal. Just a small shop, don't overstate its importance.
Ruff is nice, but not important, uv is one of the few things making the python ecosystem bearable. Python is a language for monkeys, and if you don't give monkeys good tools, they will forever entangle themselves and you. It is all garbage wrapped in garbage. At least let me deploy it without having to manually detangle all that garbage by version.
I'm done pretending this is a "right tools for the right job" kind of thing, there's wrong people in the right job, and they only know python. If no one self-writes code anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions, and has 1 trillion lines of code in the collective memory of people who don't know what a stack is.
> If no one self-writes code anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions
I can get behind the idea that LLMs probably don't need a language designed for humans if humans aren't writing it, but the rest of this is just daft. Python's popularity isn't just pure luck; in fact, it's only been in recent years that the tooling has caught up to the point where it's as easy to set up as it is to write, which should really tell you something if people persevered with it anyway.
I'm sorry your favourite language doesn't have the recognition it so rightfully deserves, but reducing Python to just a "stupid language for stupid people" is, well, stupid.
Keep in mind that when Graham coined that term, Java and C++ were considered blub languages.
Speaking as a grey beard myself, I think it's safe to say that the grey beards among us will always deride those who didn't have to work as hard as they did.
I used to do backend development in superior languages, and sometimes do hobby frontend in superior languages, but my work is Python now. And it kind of has to be Python: we do machine learning, and I work with GDAL and PDAL and all these other weird libraries and everything has Python bindings! I search for "coherent point drift" and of course there's a Python library.
The superior languages I mentioned... perhaps they have like a library for JSON encoding and decoding. You need anything else? Great, now you're a library author and maintainer!
relax, soon you'll be rewriting the essence of all these libs into something new. python's days may be numbered too, since many engineering decisions are now cheap via llms.
This is cool! I ended up also inventing my own syntax to place at the top of one-off scripts to specify deps. (For single-file Python scripts, vs one with a full project dir that has pyproject.toml) I will adopt this instead.
if you are working on one tiny project on your machine that pips in four packages, you probably think pip is OK.
Circa 2017 I was working on systems that were complex enough that pip couldn't build them, and after I got to the bottom of it I knew it was not my fault but the fault of pip.
I built a system which could build usable environments out of pre-built wheels, and I sketched out the design of a system that was roughly 'uv but written in Python'. But I saw two problems: (1) a Python-dependent system can be destroyed by people messing with Python environments (in my experience, my Poetry install gets trashed every six months or so), and (2) the 'one tiny project on your machine that pips in four packages' people had no awareness that there was a correctness problem at all, while everybody else was blaming themselves for the problem without a clear understanding of what was wrong with pip, what a correct model for managing Python dependencies looks like (short answer: see Maven), or whether a 100% correct model was even possible, as opposed to always settling for a 97% model. The politics looked intractable, so I gave up.
Written in Rust, uv evaded the bootstrap problem, and it dealt with the adoption problem by targeting speed: people would see the value in that even if they didn't see the value in correctness. My system would have been faster than pip because it would have kept a cache, but uv is faster still.
> everybody else was blaming themselves for a problem and didn't have a clear understanding of what was wrong with pip or what a correct model for managing python dependencies is (short answer: see maven)
I always looked down on the Java ecosystem but if it turns out Maven had a better story all along and we all overlooked it, that's wild.
I still believe Rust is a red herring here. Your ‘uv but written in Python’ would probably have the same success as uv does now, if you did focus on speed over correctness. And I’ve yet to hear about pipx or Poetry getting trashed, but if it is a problem, I don’t think it’s impossible to solve in Python vs Rust.
> The politics looked intractable so I gave up.
So yeah, this is your actual problem. (Don’t worry, I’m in the same camp here.)
As much as I'm a Python fan I strongly disagree here that rust is a red herring.
Having a static binary makes distribution way simpler. There are a bunch of ways you could try to achieve something like it in Python, but the result would be significantly larger.
Performance-wise, writing it in Python would mean heavy startup overhead, and it wouldn't get close to the same level of performance.
Obviously you could achieve the same thing in many other languages, but Rust ends up being a really good fit for producing a small static binary for this workload: network-heavy, IO-bound, async/threading friendly, with the occasional bit of CPU-heavy work.
How is uv awesome and Poetry so bad? They do basically the same things except Astral re-invents the wheel but only part way instead of just relying on the existing tools. uv is fast. As far as I can tell, there's hardly any difference in functionality except for it also replacing PyEnv, which I never use anyway.
uv assuming your local Python is busted to hell and back helps a lot with isolation.
Poetry's CLI would often, for me, just fall over and crash. Crashing a lot is not a fundamental problem in the sense you can fix the bugs, but hey I'm not hitting uv crashes.
pipenv was even worse in terms of just hanging during package resolution. Tools that hang are not tools you want in a CI pipeline!
The end result: `uv run` I expect to work. `pipenv` or `poetry` calls I have to assume don't work, have to put retries into CI pipelines and things like that.
uv has a lot of sensible defaults that prevent clueless developers from shooting themselves in the foot. `uv sync`, for example, will uninstall packages not in pyproject.toml.
i kind of disagree with this. uv run is clunky; i don't want that. i want to keep the "activate the venv and do shit" model. i hate uv run as a primitive.
I don't know if it's still true, but ~7 years ago when I last looked at it, Poetry didn't have the kind of UX I have in mind (that Astral/uv do). I remember trying to make it work, and it would choose Python 2 for some reason, despite me never having used it, and it having been obsoleted years before. I remember hitting many problems/errors I can't recall the details of, but bad UX.
I guess it's an individual solution to that, but it's a solution that basically worsens the actual problem as I see it, which is strict/narrow version pinning with frequent updates to latest and minimal effort to track backwards compatibility, let alone maintain it. It just turns Python into the nodejs world of constant wrestling with package.json changes.
I've used python for roughly 15 years, and 10 of those years I was paid to primarily write and maintain projects written in Python.
Things got bearable with virtualenv/virtualenvwrapper, but it was never what I would call great. Pip was always painful, and slow. I never looked forward to using them, and every time I worked on a new system, the amount of finagling I had to do to avoid problems, and the amount of time I spent supporting other people who had problems, was significant.
The day I first used uv is about as memorable to me as the day I first started using Python (roughly 2004): everything changed.
I've used uv pretty much every single day since then and the joy has never left. Every operation is twitch-fast. There has never once been an issue. Combined with direnv, I can create projects/venvs on the fly so quickly I don't even bother using its various affordances to run projects without a venv.
To put it succinctly - uv gives me two things.
One - zero messing around with virtualenvwrappers and friends. For whatever reason, I've never once run into an error like "virtualenvwrapper.sh: There was a problem running the initialization hooks."
Two - fast. It may be the fastest software I've ever used. Everything is instant - so you never experience any type of cognitive distraction when creating a python project and diving into anything - you think it - and it's done. I genuinely look forward to uv pip install - even when it's not already in cache - the parallel download is epically fast - always a joy.
All of them (well - no HPUX in 15+ years, and I've never used uv in Solaris, or AIX) - but the major two client side environments that I use 'uv' in would be WSL2+Ubuntu/ext4 (work) and macOS/APFS at home.
But - neither the speed nor constant issues with pip/virtualenvwrappers are really a function of the OS/File System.
A frequent theme in this thread (probably most clearly described in https://news.ycombinator.com/item?id=47444936) is that relying on your Python environment to manage your Python environment always ends up in pain. Poetry had this issue as well.
One of Astral's key architectural decisions was to write the Python environment management tooling in Rust, so that it could never footgun itself.
Everything “just works” and is fast - and that’s basically it.
You can run a script with a one liner and it will automatically get you the same python and venv and everything as whoever distributed the python code, in milliseconds if the packages are already cached on your local computer.
Very easy to get going without even knowing what a venv or pypi or anything is.
If you are already an expert you get “faster simpler tooling” and if you are a complete beginner it’s “easy peasy lemon squeezy”.
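As a concrete sketch of that one-liner (the script name is illustrative, and this assumes uv is on your PATH):

```shell
# Reads the script's inline metadata, provisions the right Python and a
# cached venv, then executes it; milliseconds on a warm cache:
uv run example.py

# Or pull in an ad-hoc dependency with no metadata at all:
uv run --with requests example.py
```

A beginner can run either line without ever hearing the words "venv" or "PyPI".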
for one, it's one tool, that does the job of all three.
it just works. i'm not sure how else to describe it other than less faffing about. it just does the right thing, every time. there's a tiny learning curve (mostly unlearning bad or redundant habits), but once you know how to wield it, it's a one stop shop.
uv is nice, but not irreplaceable. An open source, maintenance-mode fork would work just fine. And even if all of uv disappeared today, I'd just go back to Poetry. Slower? Sure, a bit.
...and then I’ve read the rest of your comment. Please do go read the HN guidelines.
I was using poetry pretty happily before uv came along. I’d probably go back.
Note that uv is fast because — yes, Rust, but also because it doesn’t have to handle a lot of legacy that pip does[1], and some smart language independent design choices.
If uv became unavailable, it’d suck but the world would move on.
Like, the whole point of open source is that this thread is not a thing. The whole point is "if this software is taken on by a malevolent dictator for life, we'll just fork it and keep going with our own thing." Or like if I'm evaluating whether to open-source stuff at a startup, the question is "if this startup fails to get funding and we have to close up shop, do I want the team to still have access to these tools at my next gig?" -- there are other reasons it might be in the company's interests, like getting free feature development or hiring better devs, but that's the main reason it'd be in the employees' best interests to want to contribute to an open-source legacy rather than keep everything proprietary.
The leadership and product direction work are at least as hard as the code work. Astral/uv has absolutely proven this, otherwise Python wouldn't be a boneyard for build tools.
Projects - including forks - fail all the time because the leadership/product direction on a project goes missing despite the tech still being viable, which is why people are concerned about these people being locked up inside OpenAI. Successfully forking is much easier said than done.
I had a lot of trouble convincing people that a correct Python package manager was even possible. uv proved it was possible and won people over with speed.
I had a sketched-out design for a correct package manager in 2018, but when I talked to people about it I couldn't get any interest. I think the brilliant idea that uv had that I missed was that it can't be written in Python, because if it is written in Python, developers are going to corrupt its environment sooner or later and you lose your correctness.
I think that now that people are used to uv it won't be that hard to develop a competitor and get people to switch.
You seem to be underestimating the laziness of people, and overestimating their resolve. Angry forks usually don't last; angst doesn't prevent maintenance burnout.
You underestimate the value that something like uv and company bring to the ecosystem. Given enough time I could have seen it replacing some core utilities; now that it's owned by OpenAI I don't see that happening, unless OpenAI "donates" the project but keeps the devs on a payroll.
You are aware that ty has only recently entered beta status?
Ruff isn’t stable yet either and has evolved into the de facto standard for new projects. It has more than double the number of rules Pylint does, and it was downloaded more than three times as often as Pylint in the past month.
Pylint has some advantages, sure, but Ruff's adoption speaks for itself. Pylint is 25 years old. You’d hope they do some things better.
Saying that uv is their only winner is a hilarious take.
> I would stare longingly into the void, wondering if I can ever work another python project after having experienced uv, ruff, and ty.
You think you're disagreeing with me, but you're agreeing. To wit: The original post is silly, because ty is beta quality and Ruff isn't stable yet either. Your words.
These are just tools, Pylint included. Use them, don't use them, make them your whole personality to the point that you feel compelled to defend them when someone on the Internet points out their flaws. Whatever churns your butter.
>Saying that uv is their only winner is a hilarious take.
na this news is good enough reason to move from Ruff back to black and stay the course; I won't use anything else from Astral. I will use uv, but only until pip 2/++ gets its shit together and catches up, and hopefully then, as a community, we jump back on board and keep using pip even if it's not as good, because it's free in the freedom sense.
Because I hate dynamically typed languages for anything besides scripting and glue code.
Or are you asking why I haven't had success? Mostly because the people I work with are dead set that Python is perfect for everything. I had one guy argue it should be used for embedded work.
i think the main problem was that people didn't believe that pip was broken, or didn't think there was any value in a 100% correct package manager over a 97% correct package manager (e.g. misread "worse is better")
I had the problem basically understood in 2018 and I am still pissed that everybody wanted to keep taking their chances with pip just like they like to gamble with agent coders today.
Now that people know a decent package manager is possible in Python I think there is going to be no problem getting people to maintain one.
Idk how anyone could sustain the impression that pip was not broken unless they had basically never used anything else (including Linux package managers) long enough to have even a basic understanding of it.
And that's a big part of what's so frustrating about Python generally: it seems to be a language used by lots of people who've never used anything else and have an attitude like "why would I ever try anything else"?
Python has a culture where nominal values of user-friendliness, pragmatism, and simplicity often turn into plain old philistinism.
I had a breakthrough moment when someone at a workplace (software dev) said something about a thing that wasn't working on their device. Their language made it clear to me that they didn't know how to troubleshoot to figure out how to fix it. But they could write software that ran on millions of devices. Ok, that made me take a step back.
In the early 2000s I was in a rough patch in my career and wound up working at a small town web design shop that had done a lot of really amazing work, like a line of business system for car dealers, an e-commerce site for vineyards, a custom CRM for an academic department, etc. Nobody there knew about version control (not so weird in 2005) or how to write joins in SQL.
that makes zero sense to me. developing something like ruff from scratch takes a lot of things happening - someone having the idea, the time to develop it from scratch in their free time, or the money to do it as a job, and perhaps the need to find collaborators if it's too large a project for one person. but now ruff is there, there's no need to build it from scratch. if I wanted to build a python linter or formatter I would simply fork ruff and build on top of it. as others have said in this subthread, that's the whole point of open source!
> the time to develop it [not] from scratch in their free time, or the money...
How do you think the magic of open source resolves this issue? Think about this for it to make some sense
> I would simply fork
The only simple part here is pressing the "fork" button, which only gives you exactly the same code that already exists, without user awareness or distribution
you're moving the goalposts now. I never said it would be easy to get user awareness or adoption, just that it would be a lot easier to write a new linter by forking and continuing ruff development than by doing so from scratch.
as to how the magic of open source resolves the time and money issue, it literally gives you the building blocks you need to not have to invent everything from scratch. how is that not significant?
> just that it would be a lot easier to write a new linter by forking
And I never said anything about relative ease; you've moved the goalposts there yourself. $1m required to maintain is much less than $10m required to create, yet when you don't have $1m it doesn't matter: you'll still fail, and the reasons are the same as the reasons you couldn't build the original.
Blocks lying around does not a building make, so you haven't addressed that magic either.
it does not take $1M to maintain a linter, these tools can and have been built and maintained by people in their spare time. astral built a better one, for which I am genuinely grateful to them, but it's not like they invented linting or that the open source community was just waiting around for some business to supply their tooling. indeed developer tools are notoriously hard to make money off simply because so many good ones have been developed as either solo or community open source projects, largely by people in their free time.
Can we not, at some point, consider the tool "done"? I mean, what is there to constantly change and improve? Genuinely curious. It sounds like a tool that can be finished. Can it not be?
The “requests” package gets downloaded one billion times every month; should that be a multi-billion-dollar VC company as well? uv is a package manager and other neat tooling. It’s great, but it’s hardly the essence of what makes Python awesome; it’s one of the many things that make this ecosystem flourish. If OpenAI enshittified it, people would just fork or move on. That’s all I’m saying: it’s not in any way a single point of failure for the Python ecosystem.
This is not the point of uv or any good package manager. The point is that it keeps Python from sucking. For a long time, package management in Python was horrible compared to what you could see in other languages.
Don't understate its importance. I've been using Python for more than 30 years. They solved a problem that a lot of smart people didn't solve. The Python developer experience improved by an order of magnitude.
I mean, these sorts of numbers speak to the mind-bogglingly inefficient CI workflows we as an industry have built. I’d be surprised if there were 4 million people in the world who actually know what ‘uv’ is.
Maybe there needs to be some nonprofit watchdog which helps identify those cases in their early stages and helps bootstrap open forks. I'd pay into a sort of open-capture-protection savings account if I believed it would help ensure continuity of support for the things I rely on.
Right. If anything, this "tiny part" has pretty much taken over Python and turned it from OSS BDFL language into a company-backed one (like Erlang, Scala, C#).
I am still not sure why everyone jumped on uv. Sure, it's quicker than pip, but an installation rarely takes so long as to become annoying. Anyway, pip is still there, so whatever impact they have made can be rolled back if they try to pull the rug
I'm not sure, but it seems to be because of dependency management behaviors I find confusing. Like, I found out that apparently people or packages would just do `pip freeze > requirements.txt` or otherwise just not pay attention to what version limitations there are. It's not something I ever really ran into much, though.
In the 2024 Python developer survey, 18% of the ecosystem used Poetry. When I opened this manifold question[0], I'm pretty sure uv was about half of Poetry downloads.
Estimating from these numbers, probably about 30% of the ecosystem is using `uv` now. We'll get better numbers when the 2025 Python developer survey is published.
Same. It's game-changing - leaps and bounds above every previous attempt to make Python's packaging, dependency management, and dev workflow easy. I don't know anyone who has tried uv and not immediately thrown every other tool out the window.
I use uv here and there but have a bunch of projects using regular pip with pip-tools to do a requirements.in -> requirements.txt as a lockfile workflow that I've never seen enough value in converting over. uv is clearly much faster but that's a pretty minor consideration unless I were for some reason changing project dependencies all day long.
Perhaps it never grabbed me as much because I've been running basically everything in Docker for years now, which takes care of Python versioning issues and caches the dependency install steps, so they only take a long time if they've changed. I also like containers for all of the other project setup and environment scaffolding stuff they roll up, e.g. having a consistently working GDAL environment available instantly for a project I haven't worked on in a long time.
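For what it's worth, that pip-tools flow maps almost one-to-one onto uv's pip interface, so a conversion is mostly a find-and-replace (filenames illustrative):

```shell
# pip-compile equivalent: lock requirements.in into requirements.txt
uv pip compile requirements.in -o requirements.txt

# pip-sync equivalent: make the venv match the lockfile exactly
uv pip sync requirements.txt
```

The workflow and the artifacts stay the same; only the resolver underneath changes.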
2 things: First, you can (and should) replace your `pip install` with `uv pip install` for an instant speed boost. This matters even for Docker builds.
Second, you can use uv to build and install to a separate venv in a Docker container and then, thanks to the wonders of multistage Docker builds, copy that venv to a new container and have a fully working minimal image in no time, with almost no effort.
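A hedged sketch of that multistage pattern (the base image tags and the `myapp` module name are illustrative; check the uv docs for the currently recommended images):

```dockerfile
# Stage 1: resolve and install dependencies into a project venv with uv.
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS builder
WORKDIR /app
# Copy only the files needed for resolution so this layer caches well.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev
COPY . .

# Stage 2: copy the finished venv and source into a slim runtime image,
# leaving uv and the build cache behind.
FROM python:3.12-slim-bookworm
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "myapp"]
```

The final image carries only the interpreter, the venv, and your code; uv itself never ships to production.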
been in the python game a long time and i've seen so many tools in this space come and go over the years. i still rely on good ol pip and have had no issues. that said, we utilize mypy and ruff, and have moved to pyproject etc to remotely keep up with the times.
uv solved it, it will be the only tool people use in 2 more years. if you’re a python shop / expert then you can do pip etc but uv turned incidental python + deps from a huge PITA for the rest of us, to It Just Works simplicity on the same level or better than Golang.
Solved with direnv. Also - in my .bashrc in all of my (many) clients:
    uvi() { uv pip install "$@"; }

    uvl() { uv pip list; }

    uvv() {
        uv venv
        echo 'source .venv/bin/activate' > .envrc
        direnv allow
    }
You're welcome to live in the 90s dark ages, I feel this attitude and the shape of the old linux distros like Debian that laboriously re-package years-old software have been one of the biggest failures of open source and squandered untold hours of human effort. It's a model that works okay for generic infrastructure but requires far too much labor and moves far too slowly with quite a poor experience for end users and developers. Why else would all modern software development (going back to perl's cpan package manager in 1995) route around it?
If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?
Mostly no, sometimes I give up and still use pip as a separate user.
> If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool
I haven't felt the need to use Go, and the only Java software I use is in the OS repo. I don't want to use JS software for other reasons. This is one of the reasons why I don't like Rust rewrites. Python dependencies are very often in the OS repo. If there is anything else, I compile it from source, and I curse when software doesn't use or adhere to the standards of the GNU build system.
Thanks for explaining your workflow. It seems predictable, but like it really locks you into one of the few (albeit popular) programming languages that has many/most of its development libraries repackaged by your OS. There are plenty of very popular languages that don't offer that at all.
Go and Rust, specifically, seem a bit odd to be allergic to. Their "package managers" are largely downloading sources into your code repository, not downloading/installing truly arbitrary stuff. How is that different from your (presumably "wget the file into my repo or include path") workflow for depending on a header-only C library from the internet which your OS doesn't repackage?
I understand if your resistance to those platforms is because of how much source code things download, but that still seems qualitatively different to me from "npm install can do god-knows-what to my workstation" or "pip install can install packages that shadow system-wide trusted ones".
I very much appreciate the sentiment, and I agree about the random crap (particularly some of the insane dependency chains you get from NPM, but also Rust): you go to install a simple (at least you believe) package, and the Rust/NPM manager downloads several hundred dependencies.
But the problem with only using the OS package manager is that you then lock yourself out of the entire ecosystem of node, python, rust packages that have never been migrated to whatever operating system you are using - which might be very significant.
How do you feel about Nix? It feels like this is a nice half-way measure between reliable/reproducible builds, but without all of the Free For all where you are downloading who-knows-what-from-where onto your OS?
In general I agree with you. But not for software dev packages.
The package manager I use, apt on Debian, does not package many Python development repos. They've got the big ones, e.g. requests, but not e.g. uuid6. And I wouldn't want it to - I like the limited Debian dev effort to be put towards the user experience and let the Python dev devs worry about packaging Python dev dependencies.
What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.
And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
> What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.
I favor stability and the stripping of unwanted features (e.g. telemetry) by my OS vendor over cutting-edge software. If I really need that, I install it into /usr/local; that is what it is for, after all.
> And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
This is a reason to select the OS. Software shouldn't require exact versions, but should stick to stable interfaces.
Geospatial tends to be the Achilles' heel of Python projects for me. Fiona is a wily beast of a package, and GDAL too. Conda helped some but was always so slow. Pip almost uniformly fails in this area for me.
Trump has repeatedly used the term "Chinese Virus" since the pandemic began. Quoting The Conversation[1]:
> The expressions “Chinese virus” and “Wuhan virus” personify the threat... The adjective “Chinese” is particularly problematic as it associates the infection with an ethnicity. Talking about group identities with an explicitly medical language is a recognized process of Othering (here and here), historically used in anti-immigrant rhetoric and policy, including toward Chinese immigrants in North America. This type of language stokes anxiety, resentment, fear and disgust toward people associated with that group.
In this particular tweet, Trump used the term "China Virus" instead of "Chinese Virus". Following the development of recent events, it is very clear that such a term still ties to people of an ethnicity, or people of a location, rather than, say, just a location. The dismissal of the comparison is weak.
> So what? I'm gonna guess he wanted to amplify the photo of a mask-wearing Trump, not the China Virus.
Even if pg's intent was just to amplify the photo of a mask-wearing Trump, the fact that he liked and retweeted a message inciting hatred is beyond disappointing.
My dad was an aerospace engineer, and I always had trouble with this. It is the speed you need to be going at to not be pulled back by gravity. As a kid, I always thought: big deal, if I am in a plane and just keep going up, eventually I will be out. The key I always missed was that the velocity at that point is with no added power. So when the engine cuts off, what is your velocity? Is it enough to escape, or do you need more power? Of course there is a good chance I still do not understand it.
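The missing piece can be pinned down with energy conservation: escape velocity is the speed at which kinetic energy exactly matches the depth of the gravitational potential well, v = sqrt(2GM/r), with no further thrust needed afterwards. A quick check with rough figures for Earth:

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24  # mass of Earth, kg
R_EARTH = 6.371e6   # mean radius of Earth, m

def escape_velocity(mass: float, radius: float) -> float:
    """Speed where kinetic energy (1/2 m v^2) equals the potential well
    depth (G M m / r); past this, coasting alone gets you out."""
    return math.sqrt(2 * G * mass / radius)

v = escape_velocity(M_EARTH, R_EARTH)
print(f"{v / 1000:.1f} km/s")  # ~11.2 km/s
```

A plane climbing slowly never needs that speed precisely because it keeps adding power; escape velocity only matters for an unpowered projectile.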
Our mission is to provide an accessible and affordable health service in the hands of every person on earth.
We are doing it by combining the ever-growing computing power of machines with the best medical expertise of humans to create a comprehensive, immediate and personalised health service and making it universally available.
Yoyo Wallet | London ONSITE | Full-time
We’re the fastest growing mobile wallet in Europe and one of the most exciting FinTech companies around. Transforming the way retailers and their customers interact is no small task, but we’ve got the team, the board and the investors to meet our lofty goals.
The Yoyo Wallet product comprises iOS and Android apps that talk to a suite of APIs powered by the Yoyo platform. We use a service-oriented architecture to support real-time, high-volume transactions that consistently deliver sub-one-second response times at the point-of-sale.
Backend stack includes (but is not limited to): Python, Django, event messaging and RESTful APIs, a micro-services-oriented architecture, PostgreSQL, DynamoDB, RabbitMQ, Celery, Puppet, Fabric, Docker, CircleCI / continuous deployment via ChatOps, and is hosted on AWS.
Yes but I don't want to manage version numbers and release a new one each time I push a commit to my repo. Also anyone can take the code s/he wants directly.
Do you have some kind of automatic pypi release system?