Confusion Leads to Plausible but Incorrect Explanations
An organization can use guidelines that tell it how to correct deeper, costly problems. Yet many institutions resist those tips because they believe simpler solutions exist. They think the recommendations are too elaborate or time-consuming and that more straightforward fixes are all they need. Yet those corrections are unlikely to address a company’s expensive issues, because they are based on a misunderstanding about what drives those difficulties. That confusion persists because of seemingly plausible explanations.
When a developer or architect sees a floundering project, he can find a litany of poor technical and design choices. He concludes, quite sensibly, that the endeavor would be in better health if different decisions had been made. Yet that technologist sees only the downstream effects. He is unaware of the impact his organization’s machinery had on those decisions. For example, the process of launching an undertaking might have set a schedule or scope that encouraged certain judgments. When a due date is staring a programmer or designer in the face, he makes choices he would not otherwise make. In retrospect, he blames those decisions. Yet they were downstream of his institution’s broader operations. A developer or architect does not see that machinery, so he reasonably blames what he does see: his own technical decisions.
Other personnel, from project managers to the CEO, see people below them taking questionable actions and decide that staffing changes are needed. They are rationally inferring that individuals who consistently make dubious choices are not fit for their roles. Yet a project manager might not know how external forces shaped those technical selections. A CEO might know only that projects are struggling; he is unaware of the interplay among the specific cogs in his organization’s machine. Both lack certain knowledge, so they draw conclusions from the information they have. Given what they can see, staffing problems appear to be the cause.
However, a wider band of insight leads management to conclude that its organization needs development process changes. A methodology is a deeper modification than altering personnel or technology choices, making it a rational tactic. Yet no procedure can insulate a project from the effects of an institution’s other units. For example, an iterative and incremental approach allows a team to avoid premature commitments and adjust to frequent changes. Those capabilities fix many scoping, scheduling, and budgeting problems, but not all of them. If an endeavor’s scope, timeline, and resource allocation are greatly misaligned, no technique can overcome that. Still, processes such as iterative and incremental development can defeat less monstrous foes, such as an inefficient use of man-hours. They fix many difficulties, making them a sensible path for executives to pursue.
Curated Content and Authors
Matt Heusser describes ways to rethink software estimates.
Louis Columbus discusses the growth of APIs in DevOps and its security implications.
Andrew Davis meditates on how to measure the performance of a delivery team.
Ernie Smith and Amelia Pang talk about what DevSecOps is, what benefits it brings, and how it can be incorporated into higher education.
Nataliya Shevchenko describes criteria for choosing and evaluating a threat modeling method.
We wrote an article describing how an organization can decide whether to investigate an underlying problem.
We wrote a piece stating that a model is the theory behind a strategy.
End Notes
This article describes why some straightforward solutions are rational but misguided. It does not dig deeper into each one or into what a better tactic would be. Those subjects will be addressed in future editions and blog posts. To read those works, subscribe to this newsletter and read the Software Development Journal. To get content as it is published, follow ExperTech Insights on Twitter.