I may be struggling with project management right now. Any help would be highly appreciated.
Top comments (5)
Software (and systems) exist to do one of a few things: solve a known problem, augment human labour, provide an ongoing service, or entertain people.
These purposes come with a range of uncertainty: pin manufacturing, for example, is very well defined, whereas entertaining people is very vague.
The traditional approach to software creation (i.e. the waterfall method) arose when the majority of purposes were well defined - solving known problems or augmenting labour - and thus assumes low uncertainty. This means requirements are well defined, and software (or systems) can be created efficiently in a production-line-like flow. The efficiency of creation is the highest priority.
Many more recent purposes (indeed most software purposes) are vaguely defined - providing services that can change over time - which breaks the primary assumption of the traditional approach. Waterfall fails to keep up because feedback from downstream consumers flows slowly back through changing requirements and the delivery of changes. The Agile Manifesto defines a new approach to creating software (or systems) that focuses on fast feedback and measures the effectiveness of the resulting software (or system). The effectiveness of the software (or system) is the highest priority.
For some historical context, I recall reading some years ago about how business software was developed in the early days of the mainframe computing era (from the 1950s).
It was apparently not unusual during those decades to build a bespoke piece of software completely from scratch, designed specifically for a given mainframe computer. Often such mainframes were single-purpose machines that just ran that one piece of software, without even having an operating system.
It seems that it was common to put in, from the start, all of the functionality that would be required for the life of the software, so a great deal of up-front analysis was needed (waterfall). After the initial deployment, it was not common to make changes to the software - and it was not at all easy to do so. At that time, the capabilities of the hardware were so limited that squeezing the required logic into the system in the first place was a feat in and of itself. It was not necessarily realistic to add more functionality to that code afterwards, and even making small changes was challenging.
If a company wanted to move the software to a newer mainframe, the design of the new mainframe was generally incompatible with the previous one, so the software was rewritten from scratch at that point. The IBM System/360 family appears to have been a big innovation in this sense - the architecture of newer models was compatible with previous ones, so porting code became possible.
With the standardization of instruction sets, operating systems, libraries, and programming languages, along with more memory and CPU, we now expect extreme plasticity from software - totally different from those early days. So we're in the agile era @phlash909 describes. The focus is on making a codebase useful as quickly as possible, releasing it into the wild, and then continually evolving it over time to meet the needs of its users as they arise. Automating the building, testing, and deployment of software is essential, and the practices around software development (things like object-oriented or functional programming, test-driven development, small loosely-coupled services) help to make software more amenable to change.
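To make one of those practices concrete, here is a minimal, hypothetical sketch of test-driven development in Python (the `parse_price` function and its behaviour are invented for illustration, not taken from any particular project): the tests are written first to describe the behaviour we want, and the implementation is the smallest code that makes them pass.

```python
import unittest

# Hypothetical example: the tests below were (conceptually) written first to
# describe the desired behaviour; the function is the minimal code that
# satisfies them.

def parse_price(text: str) -> float:
    """Convert a user-entered price string like '$1,234.50' into a float."""
    cleaned = text.strip().lstrip("$").replace(",", "")
    return float(cleaned)

class TestParsePrice(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("42"), 42.0)

    def test_currency_symbol_and_commas(self):
        self.assertEqual(parse_price("$1,234.50"), 1234.50)

if __name__ == "__main__":
    unittest.main()
```

Because the tests capture the intended behaviour, a later change (say, accepting another currency symbol) gets quick feedback on whether existing behaviour still holds - the fast feedback loop the agile approach relies on.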
Others have mentioned the manifesto, which is a great place to start.
One way I like to explain agility: it is about sensing what is happening and responding to it. You look at the honest reality of things and make a decision about what to do next.
That sounds obvious, but most of us crave plans and want to make predictions about the future. The bigger those plans are, or the further into the future they reach, the more risk they carry. Sadly, a lot of what people do relies solely on plans and predictions.
Ever had a conversation about getting back on track because a plan has fallen behind? Reality has told you the plan and its predictions were wrong. An agile approach would be to acknowledge things as they are and make some new decisions. A non-agile approach would be to tell people to try to make the plan a reality.
I'm not getting into Scrum, XP, or any of that here because, while they can help you pursue agility, they don't create it on their own. They give you the opportunity to sense and respond, but if you aren't going to use that opportunity, they won't make much of a difference.
Try starting with the Agile Manifesto if you haven't yet.
"We are uncovering better ways of developing software by doing it and helping others do it.
Through this work we have come to value:
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more."
agilemanifesto.org/
Rather than blindly following a plan you made up long ago, take an exploratory approach, acknowledging and adapting to the impediments on your way - always mindful of the one true metric that matters: happy customers/users.
Rather than treating humans like robots, embrace their human nature. Empower them to access their innate desire to do great things - perhaps even outside what they were hired for. Happy developers (including everyone who contributes - not just programmers) are the most effective in the long term.
Working in small (4-5 people), focused units with a variety of skills usually works out very well.