Stefan M Wasilewski: Upgrade Report: November 2009–April 2010


Supervisors: Angela Espinosa & Jennifer Willby

Student 361429

Abstract

In 1990 Michael Rothschild wrote “Bionomics: Economy as Ecosystem” (Rothschild 1990), in which he stated that “BIONOMICS holds that economic development, and the social change flowing from it, is not shaped by a society's genes, but by its accumulated technical knowledge”. Rothschild recognises that the theories of Darwin (Darwin 1859), Adam Smith (Smith 1776), Thomas Malthus (Malthus 1798) and David Ricardo (Ricardo 1817) have co-evolved, along with Lamarck’s (Gould 2002). Yet even when he analyses economic evolution, observing that “by its very nature, an evolving economy cannot be planned, so the entire rationale of centralised economic decision-making collapses”, he is thinking of Marx and Communism; he remains on a two-dimensional plane upon which “the selfish genes” (Dawkins 1976) compete. Rothschild never takes us beyond competing genes and a tantalising glimpse of the spontaneous order that would become complexity theory. Yet had he looked wider, into the works of Ashby (Ashby 1956), McCulloch (McCulloch 1988), Wiener (Wiener 1954) and Beer (Beer 1981), he might have seen a world closer to Lovelock’s (Lovelock 2000) and defined the ‘economic-ecosystem’ differently. Their view was of a highly connected and interdependent set of systems, whether natural or social.


By late 2007, as the ‘Credit Crunch’ began to unfold, Black, Scholes and Merton’s option pricing approach (Black and Scholes 1973; Merton 1973) had been implemented in two trading platforms and one investment platform: RiskMetrics™/CreditMetrics™ and Long-Term Capital Management (“LTCM”). LTCM had to be rescued in 1998 in a bailout organised by the Federal Reserve Bank of New York, and RiskMetrics later withdrew its embedded Normal Distribution reference. The former had been the bedrock of trading shares and credit, whilst the latter had implemented hugely leveraged investments in a vehicle outside the regular governance regimes. Both (Morgan 1997; Lowenstein 2000) believed they were able to manage risk with models based upon the abundance of new economic data, and yet on reflection the new capital rules (Settlements 2009) talk in the same terms of connectedness as Ashby, Beer, Wiener and Vester.
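For reference, the Black–Scholes price of a European call option, the formula underpinning the approach above, is

\[ C = S_0\,N(d_1) - K e^{-rT}\,N(d_2), \qquad d_1 = \frac{\ln(S_0/K) + (r + \sigma^2/2)\,T}{\sigma\sqrt{T}}, \qquad d_2 = d_1 - \sigma\sqrt{T} \]

where \(S_0\) is the spot price, \(K\) the strike, \(r\) the risk-free rate, \(T\) the time to maturity and \(N\) the standard normal distribution function. The constant volatility \(\sigma\) and normally distributed log-returns are precisely the assumptions whose withdrawal is noted above.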


Lord Kelvin once said: “Anything that exists, exists in some quantity and can therefore be measured” (Beer 1967; Adams 1995). Whilst the physical and biological sciences have moved beyond Newton and Descartes into quantum electrodynamics (Feynman 1985), economics and business are still grounded in the second law of thermodynamics and the Keynesian cloud (Keynes 1936). Efforts have been made to model economics using Chaos Theory (Peters 1996) and modern thermodynamics (Prigogine 1997, 2003), but investment practice still follows a model sketched in 1900 (Bachelier 1900) and built upon by Prof. Eugene Fama (Fama 1976), the ‘Efficient Market Hypothesis’, in which information is perfect, rational investment expectations are the norm and activity follows a normal distribution (Moivre 1756). Risk measurement theories based on economic waves by R. N. Elliott (Elliott 1938) and socio-economic cycles by N. Kondratiev (Schumpeter 1939), followed by Modigliani & Miller’s work (Stiglitz 1969), whose paper some believe ushered in modern capital markets (Culp 2001), have all been criticised (Mandelbrot and Hudson 2008; Sorkin 2009) because their application led to risk strategies like LTCM.
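To make the normality assumption concrete, the Gaussian density behind this random-walk view of returns is

\[ f(r) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left( -\frac{(r-\mu)^2}{2\sigma^2} \right) \]

under which a five-standard-deviation daily loss has probability of about \(2.9 \times 10^{-7}\), roughly once in 14,000 trading years; markets have produced such moves repeatedly within living memory, the ‘fat-tail’ discrepancy taken up below.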


Current experience seems to indicate that the economy is a complex set of closely coupled agents whose governance model is fragmented. The recent increase in economic ‘boom-bust cycles’ seems correlated with rapid increases in regulatory activity (Goodhart 1988; Cooper 2008; Wiggin, Incontrera et al. 2008) whose original aim was to protect investors but whose implementation created feed-forward and feedback processes with unforeseen consequences.
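As a purely hypothetical toy sketch (not a model proposed in this report), the following Python fragment illustrates the mechanism: trend-chasing demand provides positive feedback, while a safeguard rule that forces selling on a 10% drawdown feeds forward into further falls, so the series cycles instead of settling.

```python
import numpy as np

# Purely hypothetical toy dynamics: random news shocks, trend-following
# demand (positive feedback) and a safeguard rule that forces selling
# whenever the price sits 10% below its recent peak (feed-forward control).
rng = np.random.default_rng(0)
FUNDAMENTAL, WINDOW = 100.0, 50
prices = [100.0, 100.0]
for _ in range(1000):
    shock = rng.normal(0.0, 1.0)                   # exogenous news
    momentum = 0.5 * (prices[-1] - prices[-2])     # agents chase the last move
    reversion = 0.05 * (FUNDAMENTAL - prices[-1])  # slow pull toward value
    recent_peak = max(prices[-WINDOW:])
    forced_sale = -1.5 if prices[-1] < 0.9 * recent_peak else 0.0
    prices.append(prices[-1] + shock + momentum + reversion + forced_sale)

# Once triggered, forced selling drags the price well below fundamental
# value before the rule releases, producing recurring overshoots.
print(f"min {min(prices):.1f} / max {max(prices):.1f} around {FUNDAMENTAL}")
```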


Mandelbrot’s (Mandelbrot and Hudson 2008) criticism of Elliott Waves is their subjective interpretation by chartists, and even though the “Schumpeter-Freeman-Perez” paradigm (Schumpeter 1939) has its followers, Kondratiev’s critics point out that too little explanation supported the components of his theory, much of it resting on the accuracy of the data and the exact development of technology. This is very important when considering the translation of technology within finance and how it affects risk and pricing, as the followers of a theme assume that the underlying rationale is robust.


With regard to the latter, two major events are worthy of comparison: the Lloyd’s of London Reinsurance Spiral (1988) and the global Credit Crunch of 2007. They have a lot in common from a regulatory and pricing perspective. Each has been analysed, but any predictive power and regulatory responses proved ineffectual and too little, too late; the markets had moved on and adjusted. The first begat the second by providing the capital markets with new financial structures and improved data manipulation with which to create new products. In between there were three intermediary economic bubbles, in 1992/3, 1998/9 and 2004, the middle one seeing the failure of Long-Term Capital Management (“LTCM”) (Lowenstein 2000); a more stringent regulatory response then may have averted both the 2004 and 2007 crises altogether (Bookstaber 2007). A few reviews captured the ‘agent-type’ activity (Arthur 1999), but none extended to embrace a close coupling between the two paradigms (Kuhn 1974) of insurance and banking, as much of the data was opaque to the underlying operational nature of the products being created (Cooper 2008).


Adam Smith envisioned a self-regulating economy (Heilbroner 1999), but in today’s economy investors are faced with at least 30 years of changing regulatory accounting, opaque reporting, inflationary markets and, for the larger deals, a leveraging process biased towards the lead principal. It is also the longest period of relatively peaceful financial activity in which data was gathered, but there is a catch. Homogeneous data is the source of good pricing, whether of equity or debt (Briys and Varenne 2001), and its precursor is access to consistent, clear and well-correlated information; though the data may have been gathered, it was neither homogeneous nor transparent.


Whilst risk and investment management have taken advantage of processing technology, the actual data is still based upon published accounts and cursory interviews with management. This is a Newtonian view that takes no account of an agent’s coupling to the broader economy, one that Complexity Economics (Arthur 1995; Beinhocker 2006) tries to take into account. The consequences of ignoring the close coupling of economies and the impact of regulatory change have been an increase in economic bubbles, in both size and frequency.


More importantly, both Beer (Casti 1975) and Vester (Vester 2007) reflect upon the importance of any system’s predictability horizon, which imputes a time limitation in financial risk analysis; for instance, securitisation may lock in profit now, but the ‘tail risk’ left with the buyer may extend beyond this horizon, and in a changing economy incorrect pricing beyond predictability boundaries exposes the underlying business to loss. This is compounded if leveraged finance is used, the consequence being the transfer of equity risk to debt investors and the creation of a seemingly new asset class with high returns, thereby diverting investors from establishing better data models for risk diversification (Ashby 1956) and a new supply of information on the systems that feed them (Beer 1972; Shannon and Weaver 1998).
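A worked example with hypothetical figures makes the risk transfer concrete:

\[ A = 100 \text{ (assets)}, \quad D = 90 \text{ (debt)}, \quad E = A - D = 10 \text{ (equity)} \]

A 15% fall in asset value gives \(A' = 85 < D\), so the equity is exhausted (\(E' = 0\)) and the debt investors absorb the remaining loss of \(D - A' = 5\). At 9:1 leverage, a moderate asset-price move, well within historical experience, passes straight through the thin equity layer to holders who priced the paper as low-risk debt.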


Current governance regimes base capital adequacy on risk-weighted asset models relating to systems that are dynamic, conditional and have ‘fat tails’, which, as Malcolm Gladwell (Gladwell 2008) points out, should not be dismissed. In its recent consultative document the Bank for International Settlements (Settlements 2009) offers a new standard of capital adequacy that is still based around risk weightings rather than the systems that support them, but does suggest a scenario-based approach for capital default.
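A minimal sketch of the risk-weighted-asset arithmetic referred to above (Basel-style; the portfolio and weights are illustrative assumptions, not any actual regime’s figures):

```python
# Basel-style capital arithmetic: capital >= 8% of risk-weighted assets.
# The portfolio below is a hypothetical illustration.
exposures = {
    "sovereign bonds": (400.0, 0.0),   # (exposure, regulatory risk weight)
    "mortgages":       (300.0, 0.5),
    "corporate loans": (250.0, 1.0),
}
rwa = sum(amount * weight for amount, weight in exposures.values())
minimum_capital = 0.08 * rwa
print(f"risk-weighted assets: {rwa:.0f}")              # 400
print(f"minimum capital:      {minimum_capital:.0f}")  # 32
# The weights are static lookups: nothing in them reflects the dynamic,
# conditional, fat-tailed behaviour of the systems generating the losses.
```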


If current governance structures and investment processes are failing to manage an increasing frequency of boom/bust cycles, how can Organisational Cybernetics (Wiener 1950; Beer 1985), Complexity Economics (Durlauf 1997) and modern Network Theory (Strogatz 2003; Watts 2003) assist investors in making better-informed decisions and make the market more self-regulating?


The purpose of this research is to explore how Complexity Economics, Organisational Cybernetics and the Viable System Model (Beer 1972) can be used to develop innovative analytical tools that better enable investor decisions, thereby restoring, at a primary level, the transparency of risk information that Smith (Heilbroner 1999) required for a market to be self-regulating.


The hypothesis is that non-homogeneous data sets and opaque management data cause poor investment decisions, and that the ‘fat tails’ embedded within many Value at Risk models (Wilmott, Howison et al. 1995; Wilmott 2000) should be reconsidered as vital outliers that could be interpreted using scenario modelling through tools based upon organisational cybernetics and complexity economic theory. It also posits that structured finance theory, coupled with excess leveraging, creates feedback loops that cause asset bubbles if the products that support it remain ungoverned.
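A minimal sketch of the ‘fat-tail’ point (assuming Student-t returns as a stand-in for fat-tailed market data; all figures are illustrative, not a method proposed in this report): a VaR computed from a fitted normal distribution understates the tail that an empirical or scenario view retains.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for fat-tailed daily returns: Student-t with 3 degrees of
# freedom, rescaled to a 1% daily volatility. Illustrative assumption.
df, vol = 3, 0.01
returns = rng.standard_t(df, size=100_000) * vol / np.sqrt(df / (df - 2))

alpha = 0.999
# Parametric VaR: fits a normal distribution and reads off its quantile.
var_normal = -(returns.mean() + returns.std() * stats.norm.ppf(1 - alpha))
# Empirical VaR: takes the loss quantile from the data itself, keeping
# the 'vital outliers' that the normal fit throws away.
var_empirical = -np.quantile(returns, 1 - alpha)

print(f"99.9% VaR, normal model: {var_normal:.4f}")
print(f"99.9% VaR, empirical:    {var_empirical:.4f}")
# Under these assumptions the empirical figure comes out roughly double
# the normal-model figure: the Gaussian fit discards the tail.
```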


Keywords: Chaos, Complexity Economics, Emergence, Network Theory, CyberFilter, Organisational Cybernetics, Longitudinal


Table of Contents

Research Aims and Objectives 8

Literature Review 11

Current State of Knowledge 11

Economics: A General Background 13

Capital Markets 15

Insurance 15

Equity/Debt Capital Markets 17

Systems 18

Stafford Beer: The Viable System Model & Team Syntegrity 19

Cyber Filter 20

Tying it all together 20

Complexity, Complexity Economics & Chaos Theory 21

Second Order Cybernetics 24

Summary 24

Conceptual Framework & Methodology 26

Conceptual Framework 27

Conceptual Methodology 31

Summary 35

Research Strategy 37

“Do I know what I know and how do I find out?” 37

“I’m satisfied that I know, but is it real to me?” 42

A Path to a Methodology? 43

So what is the Research Method? 43

Ethics, Intellectual Property and the Observer’s Role 46

Fieldwork Design 57

Structure 57


Appendices 60

Examples of Market Failures 60

Glossary of Terms 61

Ecosystems 61

Economics 61

Time 62

Recursion 62

System & Organisation 62

Supplemental Information 63

The Viable System Model 64

The Triple Index or CyberFilter 67

Team Syntegrity – TSI 69

Vester Sensitivity Model 70

R. Buckminster Fuller’s: Tensegrity Model 72

Algorithmic vs Linear or Deterministic Modelling 73

An Axiomatic Language for Financial Products 74

So you want to be a Researcher? 75

Complexity: Papers by Stafford Beer, John Casti & Steven Durlauf 77






