Systems Analysis Quotes
Quotes tagged as "systems-analysis"
“If we want to solve problems effectively...we must keep in mind not only many features but also the influences among them. Complexity is the label we will give to the existence of many interdependent variables in a given system. The more variables and the greater their interdependence, the greater the system's complexity. Great complexity places high demands on a planner's capacity to gather information, integrate findings, and design effective actions. The links between the variables oblige us to attend to a great many features simultaneously, and that, concomitantly, makes it impossible for us to undertake only one action in a complex system.
A system of variables is "interrelated" if an action that affects or is meant to affect one part of the system will also affect other parts of it. Interrelatedness guarantees that an action aimed at one variable will have side effects and long-term repercussions. A large number of variables will make it easy to overlook them.
We might think that complexity could be regarded as an objective attribute of systems. We might even think we could assign a numerical value to it, making it, for instance, the product of the number of features times the number of interrelationships. If a system had ten variables and five links between them, then its "complexity quotient," measured in this way, would be fifty. If there are no links, its complexity quotient would be zero. Such attempts to measure the complexity of a system have in fact been made.
Complexity is not an objective factor but a subjective one. Supersignals reduce complexity, collapsing a number of features into one. Consequently, complexity must be understood in terms of a specific individual and his or her supply of supersignals. We learn supersignals from experience, and our supply can differ greatly from another individual's. Therefore there can be no objective measure of complexity.”
― The Logic of Failure: Recognizing and Avoiding Error in Complex Situations
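The toy measure in the passage above (the number of features times the number of interrelationships) is simple enough to compute directly. The sketch below is illustrative only; the function name and the graph representation are mine, not the book's:

```python
def complexity_quotient(variables, links):
    """Toy measure from the quote: variable count times link count."""
    return len(variables) * len(links)

# Ten variables with five links between them gives a quotient of fifty;
# with no links at all, the quotient collapses to zero.
variables = [f"v{i}" for i in range(10)]
links = [("v0", "v1"), ("v1", "v2"), ("v2", "v3"), ("v3", "v4"), ("v4", "v5")]
print(complexity_quotient(variables, links))  # 50
print(complexity_quotient(variables, []))     # 0
```

As the passage goes on to argue, any such number ignores the observer: two people with different stocks of "supersignals" face different effective complexity in the same system.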

“Military analysis is not an exact science. To return to the wisdom of Sun Tzu, and paraphrase the great Chinese political philosopher, it is at least as close to art. But many logical methods offer insight into military problems, even if solutions to those problems ultimately require the use of judgement and of broader political and strategic considerations as well. Military affairs may not be as amenable to quantification and formal methodological treatment as economics, for example. However, even if our main goal in analysis is generally to illuminate choices, bound problems, and rule out bad options, rather than arrive unambiguously at clear policy choices, the discipline of military analysis has a great deal to offer. Moreover, simple back-of-the-envelope methodologies often provide substantial insight without requiring the churning of giant computer models or access to the classified data of official Pentagon studies, allowing generalists and outsiders to play important roles in defense analytical debates.
We have seen all too often (in the broad course of history as well as in modern times) what happens when we make key defense policy decisions based solely on instinct, ideology, and impression. To avoid cavalier, careless, and agenda-driven decision-making, we therefore need to study the science of war as well, even as we also remember the cautions of Clausewitz and avoid hubris in our predictions about how any war or other major military endeavor will ultimately unfold.”
―
“Viewed abstractly, systems analysis implies rigorous thinking, hopefully quantitative, regarding the gains and the resource-expenditures involved in a particular course of action -- to insure that scarce resources are employed productively rather than wastefully.”
―
“The organizational consequence of this highly quantitative image of defense decision-making is an independent and high-level office of systems analysis (or program analysis) reporting directly to the secretary. This office, separated from the parochialism of the individual services, commands, and functional offices of the Defense Department, is charged with de novo analysis of the services' program proposals (and, indeed, with the generation of alternative programs) to assess the relative merits of different potential uses of the same dollars. Its activities culminate in the secretary's decision on a single coherent set of numerically defined programs.
This model imposes a requirement for close interaction between the secretary of defense and the principal program analyst. A suitable person for the job is difficult to obtain without granting him or her direct access to the secretary. This model, therefore, requires that the chief program analyst report directly to the secretary and it inevitably limits the program and budget role of the other chief officials of the OSD, especially the principal policy adviser.
The model leaves unresolved how the guidance for the analysis process is to be developed and even how choices are to be made. Analysis is not always made rigorous and objective simply by making it quantitative, and not everything relevant can be quantified. At its extreme, it can degenerate into a system in which objectives become important because they can be quantified, rather than quantification being important because it can illuminate objectives.”
―
“As one can see, the question is systemic; leadership needs a kind of compass to enable it to fulfill a preventive function; the adage then applies to the management of the affairs of the city: ‘Prevention is better than cure.’”
―

“The limits on a growing system may be temporary or permanent. The system may find ways to get around them for a short while or a long while, but eventually there must come some kind of accommodation, the system adjusting to the constraint, or the constraint to the system, or both to each other. In that accommodation come some interesting dynamics.
Whether the constraining balancing loops originate from a renewable or nonrenewable resource makes some difference, not in whether growth can continue forever, but in how growth is likely to end.”
― Thinking In Systems: A Primer
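The difference between renewable and nonrenewable constraints can be illustrated with a minimal stock-and-flow simulation. This is a sketch of my own construction under stated assumptions (logistic regeneration, harvest proportional to both capital and the remaining stock), not a model taken from the book:

```python
def simulate(steps=300, regen=0.0, K=1000.0):
    """Capital grows by harvesting a resource stock with carrying capacity K."""
    resource, capital = K, 10.0
    path = []
    for _ in range(steps):
        harvest = 0.1 * capital * (resource / K)   # yield falls as the stock thins
        capital += harvest - 0.05 * capital        # reinvestment minus depreciation
        resource += regen * resource * (1 - resource / K) - harvest
        resource = max(resource, 0.0)
        path.append(capital)
    return path

renewable = simulate(regen=0.1)     # the stock regenerates: growth levels off
nonrenewable = simulate(regen=0.0)  # the stock only depletes: growth peaks, then declines
```

With regeneration, the balancing loop settles capital near a sustainable equilibrium; without it, the same reinforcing loop overshoots and then decays. The contrast is in how growth ends, not in whether it can continue forever.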

“If insolvency is not transparent or well understood, and if illiquidity is backstopped by the Federal Reserve, then why do bank runs commence? The answer is psychology. Some customers or counterparties come to believe a bank will not repay them so they pull their money out or close transactions as quickly as possible. They are not reassured by ... press releases or positive comments by management. Word spreads, the withdrawals accelerate, and within days, sometimes hours, the bank closes its doors. From there it's an open issue whether the lost confidence spreads to other banks, in a process called contagion. No amount of capital or comment can stop a bank panic; it has a life of its own.
...
Enter AI. The next bank run may be triggered not by human panic but by AI imitating human panic. An AI bank analysis program with deeply layered neural networks and machine learning capability (perhaps complemented by a GPT capacity to speak with human analysts) could read millions of pages of financial data on thousands of individual banks, far more than any team of human analysts could review. Its training set of materials provides familiarity with the dynamics of bank runs, basically an emergent property of a complex dynamic system, along with historical examples, worst-case scenarios, and defensive moves. Events like the gold corner of 1869, the panic of 1907, the Great Depression of the 1930s, and the S&L crisis of the 1980s would all seem as fresh as today's news. This system would reach the same conclusion as a human analyst — move first, get your money out fast, don't be the last in line.
The true danger is not that the machine thinks like a human — it's supposed to. The danger is that it can act faster and communicate with other machines.”
― MoneyGPT: AI and the Threat to the Global Economy
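The run dynamic described above (withdrawals feeding on the sight of other withdrawals) can be caricatured as a threshold cascade. The Granovetter-style sketch below is my own illustration, not a model from the book: each depositor withdraws once the fraction already withdrawing exceeds a personal threshold.

```python
def run_cascade(thresholds):
    """Return how many depositors end up withdrawing in a threshold cascade."""
    n = len(thresholds)
    withdrawn = [t == 0.0 for t in thresholds]  # the initial panickers
    changed = True
    while changed:
        changed = False
        frac = sum(withdrawn) / n               # visible share of withdrawals
        for i, t in enumerate(thresholds):
            if not withdrawn[i] and frac >= t:
                withdrawn[i] = True
                changed = True
    return sum(withdrawn)

# A near-uniform spread of thresholds lets a single withdrawal tip everyone...
print(run_cascade([i / 100 for i in range(100)]))   # 100
# ...while a gap just above zero stops the same shock immediately.
print(run_cascade([0.0] + [0.3] * 99))              # 1
```

On this reading, the quote's warning is about speed rather than logic: machine agents do not change the cascade mechanism, they compress its rounds into milliseconds.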

“AI creates its own fog. The more sophisticated the algorithms, the less developers and engineers understand how the output emerges.”
― MoneyGPT: AI and the Threat to the Global Economy