
The article mentions:

> Scrum got in the way of shipping on a daily basis. The whole idea of Scrum revolves around Sprints, of committing to tasks at the beginning of the sprint, working on these during the sprint, and demoing what we did at the end.

What if each team member came up with a few tickets to work on during, let's say, a 2 week sprint, and then these tickets were shipped as they were finished, resulting in new code being released potentially every day or couple of days?

I've worked with a bunch of companies (contract work) and very rarely did they run 4-6 week sprints where everything shipped at once at the end. That would be really dangerous. Almost always, things were shipped as they were completed and reviewed.



You forget that there's a big administrative load added to a two week scrum (planning, dailies, ...): you need to test and validate, potentially validate the work with QA, and do a demonstration at the end. I've never seen a two week sprint work as intended, especially once you factor in interruptions for fixes, new feature discussions, tasks that turn out to be more complex than expected, and so on. If you add a manager to the loop, pressuring team members into agreeing to unrealistic deadlines, Scrum is pretty much the best way to destroy everyone involved.


I think most people I see doing Scrum these days do a story, demo it to the owner, release it, and move on to the next story. Rarely do they release at the end.

The review at the end is for all stakeholders to be able to catch up on what's been going on over the last 2 weeks.


> You forget that there's a big administrative load added to a two week scrum

It wasn't forgotten. Maybe the places I've done contract work for didn't follow Scrum to a T. There was no daily planning. We met once for 2 hours every 2 weeks to figure out what to work on, and then a few small teams self-regulated asynchronously until things were done. This included doing the work, updating the ticket, having at least 1 person review / test the PR, and getting it released. Usually there was a 1-3 day turnaround from the PR being put up for review to it being live on the site. It could be longer if the PR needed a lot of changes from things caught in review, but the ball was always moving.

It never killed a single developer's productivity to wait on a review, because every developer had a few tickets to work on during that 2 week process, so they could jump to the next one or review another person's PR in the meantime.

There was no manager component once the tickets were chosen for the sprint (which was often a manager + developer group effort). Code could go from a ticket to production with no administrative bottlenecks.


That's a long time from PR to release.


I think the biggest misunderstanding when talking about Scrum and/or Agile occurs when people from radically different industries meet. From the perspective of my previous team, 1-3 days from PR to prod would indeed be quite long. For such a team, "two weeks per sprint" can seem like an eternity and will only slow things down.

On the other hand, one of my best friends works for the largest insurer in the country. They do about one release every six months, and working in two week cycles would be an unimaginable speedup for them. It frequently takes more than a month to get word back from the regulators that their proposed code changes have been approved at all.

Both of the groups ("scrum slows you down"/"scrum is uncomfortably fast") have trouble imagining that the problems of the other group can even exist.


What would you do to go from PR to prod faster when you want at least 1 human who didn't open the PR to manually review each PR in a 10-20 developer remote team?

1 day doesn't seem too bad; often it can be 2-6 hours. It really depends on what lines up and how big the PR is.
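
For what it's worth, the part that seems most automatable is the gap after approval: nothing should have to wait on the author noticing that the review came in. Here's a rough sketch of that kind of automation against GitHub's REST API; the org/repo names and token handling are placeholders, not a description of any setup I've actually used:

    # Rough sketch: merge a PR as soon as someone other than the author has
    # approved it, so the only remaining wait is the review itself.
    # OWNER/REPO are hypothetical; assumes a GITHUB_TOKEN env var.
    import os
    import requests

    OWNER, REPO = "example-org", "example-repo"
    API = f"https://api.github.com/repos/{OWNER}/{REPO}"
    HEADERS = {"Authorization": f"token {os.environ['GITHUB_TOKEN']}"}

    def merge_when_approved(pr_number: int) -> bool:
        pr = requests.get(f"{API}/pulls/{pr_number}", headers=HEADERS).json()
        reviews = requests.get(f"{API}/pulls/{pr_number}/reviews", headers=HEADERS).json()
        approved = any(
            r["state"] == "APPROVED" and r["user"]["login"] != pr["user"]["login"]
            for r in reviews
        )
        if not approved:
            return False
        # Squash-merge once the "1 human who didn't open the PR" rule is satisfied.
        resp = requests.put(
            f"{API}/pulls/{pr_number}/merge",
            headers=HEADERS,
            json={"merge_method": "squash"},
        )
        return resp.status_code == 200

Wire something like that up to a review webhook or a periodic job and the PR lands the moment the required approval shows up, rather than whenever the author next checks their notifications. It doesn't shrink the hours spent waiting for a reviewer to free up, but it trims the tail end.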


A code review for a smallish change shouldn't take more time than sending a few chat messages. You send them the code review, they look through the 20 lines in the review tool, note that the changes correspond to the change description, see that you added tests and that those tests passed, and ask you to clarify a variable name; you update the code and send it again a few minutes later; they review the new change, see that the name was properly changed and the tests still pass, and press accept.

This is how most of my code reviews went at Google. A bigger change will take more time, but I see no reason why a smaller one should take longer than that. After all, you are just sending small bits of text back and forth; it really isn't more complicated than a few chat messages.


Yep, what you described is how most reviews go.

The delay isn't always the review itself taking long. It's that a dev is working on their own tickets and then needs an opportunity to check in and review someone else's code. You could end up getting a review on your code in 10 minutes or 5 hours, depending on what's going on for everyone at the time you open the PR. It might even take until the next day, and a bigger PR could involve a few more eyeballs and more time commitment for the review itself (outliers, but they do happen).



