Monday, December 31, 2007

Stakeholder Theory

As originally detailed by R. Edward Freeman (1984), stakeholder theory identifies and models the groups which are stakeholders of a corporation, and both describes and recommends methods by which management can give due regard to the interests of those groups. In short, it attempts to address the "Principle of Who or What Really Counts."

In the traditional view of the firm, the shareholder view (the only one recognized in business law in most countries), the shareholders or stockholders are the owners of the company, and the firm has a binding fiduciary duty to put their needs first, to increase value for them. In older input-output models of the corporation, the firm converts the inputs of investors, employees, and suppliers into usable (salable) outputs which customers buy, thereby returning some capital benefit to the firm. By this model, firms only address the needs and wishes of those four parties: investors, employees, suppliers, and customers. However, stakeholder theory argues that there are other parties involved, including governmental bodies, political groups, trade associations, trade unions, communities, associated corporations, prospective employees, prospective customers, and the public at large. Sometimes even competitors are counted as stakeholders.

The stakeholder view of strategy is an instrumental theory of the corporation, integrating both the resource-based and the market-based views and adding a socio-political level.

This view of the firm is used to define the specific stakeholders of a corporation (the normative theory (Donaldson) of stakeholder identification) as well as examine the conditions under which these parties should be treated as stakeholders (the descriptive theory of stakeholder salience). These two questions make up the modern treatment of Stakeholder Theory.

Scholarly Contributions

There have been well over 100 articles and numerous books written on stakeholder theory, including an entire issue of the Academy of Management Journal (v. 42, n. 5, 1999). Recent scholarly works that exemplify research and theorizing in this area include Donaldson and Preston (1995), Mitchell, Agle, and Wood (1997), Friedman and Miles (2002), and Phillips (2003).

Donaldson and Preston argue that the normative base of the theory, including the "identification of moral or philosophical guidelines for the operation and management of the corporation" (p. 71), is its core. Mitchell et al. derive a typology of stakeholders based on the attributes of power (the extent to which a party has means to impose its will in a relationship), legitimacy (socially accepted and expected structures or behaviors), and urgency (the time sensitivity or criticality of the stakeholder's claims). Treating each attribute as binary and examining their combinations yields eight types of stakeholders, along with their implications for the organization. Friedman and Miles explore the implications of contentious relationships between stakeholders and organizations by introducing compatible/incompatible interests and necessary/contingent connections as additional attributes with which to examine the configuration of these relationships.
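
To make the typology concrete, here is a small Python sketch that enumerates the eight attribute combinations; the class labels follow Mitchell, Agle, and Wood's naming, while the regulator example at the end is purely hypothetical.

```python
from itertools import product

# Labels from Mitchell, Agle, and Wood (1997), indexed by the
# (power, legitimacy, urgency) attribute combination.
TYPOLOGY = {
    (False, False, False): "non-stakeholder",
    (True,  False, False): "dormant",
    (False, True,  False): "discretionary",
    (False, False, True):  "demanding",
    (True,  True,  False): "dominant",
    (True,  False, True):  "dangerous",
    (False, True,  True):  "dependent",
    (True,  True,  True):  "definitive",
}

def classify(power: bool, legitimacy: bool, urgency: bool) -> str:
    """Map a stakeholder's attribute profile to its Mitchell et al. class."""
    return TYPOLOGY[(power, legitimacy, urgency)]

# Enumerate all eight combinations.
for combo in product([False, True], repeat=3):
    print(combo, "->", classify(*combo))

# A hypothetical example: a regulator with power and legitimacy
# but no urgent claim is a "dominant" stakeholder.
print(classify(power=True, legitimacy=True, urgency=False))  # dominant
```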

The political philosopher Charles Blattberg has criticized stakeholder theory for assuming that the interests of the various stakeholders can, at best, be compromised or balanced against each other. Blattberg argues that this is a product of its emphasis on negotiation as the chief mode of dialogue for dealing with conflicts between stakeholder interests. He recommends conversation instead, which leads him to defend what he calls a 'patriotic' conception of the corporation as an alternative to the one associated with stakeholder theory.

http://en.wikipedia.org/wiki/Stakeholder_theory

Value Chain Analysis

The value chain, also known as value chain analysis, is a concept from business management that was first described and popularized by Michael Porter in his 1985 best-seller, Competitive Advantage: Creating and Sustaining Superior Performance.

A value chain is a chain of activities. A product passes through all the activities of the chain in order, and at each activity it gains some value. The chain of activities gives the product more added value than the sum of the added values of the activities taken independently. It is important not to confuse the concept of the value chain with the costs incurred in the activities. A diamond cutter illustrates the difference: the cutting activity may have a low cost, yet it adds much of the value of the end product, since a rough diamond is far less valuable than a cut one.

The value chain categorizes the generic value-adding activities of an organization. The "primary activities" include: inbound logistics, operations (production), outbound logistics, marketing and sales, and services (maintenance). The "support activities" include: administrative infrastructure management, human resource management, R&D, and procurement. The costs and value drivers are identified for each value activity. The value chain framework quickly made its way to the forefront of management thought as a powerful analysis tool for strategic planning. Its ultimate goal is to maximize value creation while minimizing costs.
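
To see the difference between cost and value added in practice, here is a small Python sketch that walks a product through a hypothetical chain of activities, in the spirit of the diamond-cutter example above; all activity names and figures are invented for illustration.

```python
# Each activity carries its own cost and the value it adds to the product.
# Figures are hypothetical: cutting is cheap to perform but contributes
# most of the final value, as with the diamond cutter.
activities = [
    # (name,                cost, value_added)
    ("inbound logistics",     50,   60),
    ("operations (cutting)",  20,  500),
    ("outbound logistics",    30,   40),
    ("marketing and sales",   80,  150),
    ("service",               10,   20),
]

total_cost = sum(cost for _, cost, _ in activities)
total_value = sum(value for _, _, value in activities)

for name, cost, value in activities:
    print(f"{name:22s} cost={cost:4d} value added={value:4d}")

print(f"total cost:  {total_cost}")                 # 190
print(f"total value: {total_value}")                # 770
print(f"margin:      {total_value - total_cost}")   # 580
```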

The concept has been extended beyond individual organizations. It can apply to whole supply chains and distribution networks. The delivery of a mix of products and services to the end customer mobilizes different economic actors, each managing its own value chain. The industry-wide synchronized interactions of those local value chains create an extended value chain, sometimes global in extent. Porter terms this larger interconnected system of value chains the "value system." A value system includes the value chains of a firm's suppliers (and their suppliers, all the way back), the firm itself, the firm's distribution channels, and the firm's buyers (and presumably extends to the buyers of their products, and so on).

Capturing the value generated along the chain is the new approach taken by many management strategists. For example, a manufacturer might require its parts suppliers to be located near its assembly plant to minimize the cost of transportation. By exploiting the upstream and downstream information flowing along the value chain, firms may try to bypass intermediaries, creating new business models, or otherwise improve their value systems.

The Supply-Chain Council, a global trade consortium with over 700 member companies, governmental, academic, and consulting groups participating over the last 10 years, manages the de facto universal reference model for the supply chain. It covers Planning, Procurement, Manufacturing, Order Management, Logistics, Returns, and Retail; Product and Service Design, including Design Planning, Research, Prototyping, Integration, Launch, and Revision; and Sales, including CRM, Service Support, Sales, and Contract Management, all of which are congruent to the Porter framework. The "SCOR" framework has been adopted by hundreds of companies, as well as national entities, as a standard for business excellence, and the US DOD has adopted the newly launched "DCOR" framework for product design as a standard for managing its development processes. In addition to process elements, these reference frameworks maintain a vast database of standard process metrics aligned to the Porter model, as well as a large and constantly researched database of prescriptive universal best practices for process execution.

http://en.wikipedia.org/wiki/Value_chain

Game theory

Game theory is a branch of applied mathematics that is often used in the context of economics. It studies strategic interactions between agents. In strategic games, agents choose strategies that will maximize their return, given the strategies the other agents choose. The essential feature is that it provides a formal modelling approach to social situations in which decision makers interact with other agents. Game theory extends the simpler optimisation approach developed in neoclassical economics.

The field of game theory came into being with the 1944 classic Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern. A major center for its development was the RAND Corporation, where it helped to define nuclear strategies.

Game theory has played, and continues to play, a large role in the social sciences, and is now also used in many diverse academic fields. Beginning in the 1970s, game theory has been applied to animal behaviour, including evolutionary theory. Many games, especially the prisoner's dilemma, are used to illustrate ideas in political science and ethics. Game theory has recently drawn attention from computer scientists because of its use in artificial intelligence and cybernetics.

In addition to its academic interest, game theory has received attention in popular culture. The Nobel Prize-winning game theorist John Nash was the subject of Sylvia Nasar's 1998 biography and the 2001 film A Beautiful Mind. Game theory was also a theme in the 1983 film WarGames. Several game shows have adopted game-theoretic situations, including Friend or Foe? and, to some extent, Survivor. The character Jack Bristow on the television show Alias is one of the few fictional game theorists in popular culture, along with math professor Charlie Eppes from the show NUMB3RS, who uses game theory in many episodes to solve crimes.

Although some game theoretic analyses appear similar to decision theory, game theory studies decisions made in an environment in which players interact. In other words, game theory studies choice of optimal behavior when costs and benefits of each option depend upon the choices of other individuals.

The first known discussion of game theory occurred in a letter written by James Waldegrave in 1713. In this letter, Waldegrave provides a minimax mixed strategy solution to a two-person version of the card game le Her. It was not until the publication of Antoine Augustin Cournot's Researches into the Mathematical Principles of the Theory of Wealth in 1838 that a general game theoretic analysis was pursued. In this work Cournot considers a duopoly and presents a solution that is a restricted version of the Nash equilibrium.
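
A quick numerical sketch shows what Cournot's duopoly solution looks like; the linear demand curve and cost figures below are assumed purely for illustration. Iterating each firm's best response converges to the Cournot equilibrium quantities, which coincide with the Nash equilibrium of the quantity-setting game.

```python
# Cournot duopoly with assumed linear inverse demand P = a - b*(q1 + q2)
# and constant marginal cost c for both firms. Parameters are illustrative.
a, b, c = 100.0, 1.0, 10.0

def best_response(q_other: float) -> float:
    """Profit-maximizing quantity given the rival's quantity."""
    return max((a - c - b * q_other) / (2 * b), 0.0)

# Iterate best responses from an arbitrary starting point.
q1, q2 = 0.0, 0.0
for _ in range(50):
    q1, q2 = best_response(q2), best_response(q1)

print(f"q1 = {q1:.3f}, q2 = {q2:.3f}")            # both converge to 30.000
print(f"closed form (a - c) / (3b) = {(a - c) / (3 * b):.3f}")
```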

Although Cournot's analysis is more general than Waldegrave's, game theory did not really exist as a unique field until John von Neumann published a series of papers in 1928. While the French mathematician Émile Borel did some earlier work on games, von Neumann can rightfully be credited as the inventor of game theory. Von Neumann was a brilliant mathematician whose work was far-reaching, from set theory, to calculations that were key to the development of both the atom and hydrogen bombs, and finally to his work developing computers. Von Neumann's work in game theory culminated in the 1944 book Theory of Games and Economic Behavior, written with Oskar Morgenstern. This profound work contains the method for finding mutually consistent solutions for two-person zero-sum games. During this time period, work on game theory was primarily focused on cooperative game theory, which analyzes optimal strategies for groups of individuals, presuming that they can enforce agreements between them about proper strategies.
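
The minimax idea for zero-sum games can be illustrated on matching pennies; the sketch below finds the row player's optimal mixed strategy by a brute-force grid search rather than by the book's own solution methods, so it is only a toy demonstration.

```python
# Row player's payoffs in matching pennies (a two-person zero-sum game).
# The column player's payoffs are the negation of these.
A = [[1, -1],
     [-1, 1]]

best_p, best_value = 0.0, float("-inf")
# Grid-search the probability p of playing row 0; for each p, the column
# player responds with whichever column minimizes row's expected payoff.
for i in range(1001):
    p = i / 1000
    worst = min(p * A[0][j] + (1 - p) * A[1][j] for j in range(2))
    if worst > best_value:
        best_p, best_value = p, worst

print(f"optimal mix: play row 0 with p = {best_p:.2f}")  # 0.50
print(f"game value: {best_value:.2f}")                   # 0.00
```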

In 1950, the first discussion of the prisoner's dilemma appeared, and an experiment was undertaken on this game at the RAND corporation. Around this same time, John Nash developed a criterion for mutual consistency of players' strategies, known as Nash equilibrium, applicable to a wider variety of games than the criterion proposed by von Neumann and Morgenstern. This equilibrium is sufficiently general, allowing for the analysis of non-cooperative games in addition to cooperative ones.
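
Nash's mutual-consistency criterion is easy to check by brute force on a small game. The sketch below uses a standard prisoner's dilemma payoff table (negative numbers standing in for years in prison) and flags every strategy profile from which no player can profitably deviate alone.

```python
from itertools import product

# Standard prisoner's dilemma payoffs (negative years in prison).
# payoffs[(my_move, their_move)] = my payoff.
C, D = "cooperate", "defect"
payoffs = {
    (C, C): -1,  # both stay silent: one year each
    (C, D): -3,  # I stay silent, partner confesses: three years for me
    (D, C):  0,  # I confess, partner stays silent: I go free
    (D, D): -2,  # both confess: two years each
}

def is_nash(p1: str, p2: str) -> bool:
    """True if neither player gains by unilaterally switching moves."""
    p1_ok = all(payoffs[(alt, p2)] <= payoffs[(p1, p2)] for alt in (C, D))
    p2_ok = all(payoffs[(alt, p1)] <= payoffs[(p2, p1)] for alt in (C, D))
    return p1_ok and p2_ok

for p1, p2 in product((C, D), repeat=2):
    print(f"({p1}, {p2}) Nash equilibrium: {is_nash(p1, p2)}")
# Only (defect, defect) survives, even though (cooperate, cooperate)
# would leave both players better off.
```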

Game theory experienced a flurry of activity in the 1950s, during which time the concepts of the core, the extensive form game, fictitious play, repeated games, and the Shapley value were developed. In addition, the first applications of game theory to philosophy and political science occurred during this time.

In 1965, Reinhard Selten introduced his solution concept of subgame perfect equilibrium, which further refined the Nash equilibrium (later he would introduce trembling hand perfection as well). In 1967, John Harsanyi developed the concepts of incomplete information and Bayesian games. Nash, Selten, and Harsanyi became Economics Nobel Laureates in 1994 for their contributions to economic game theory.

In the 1970s, game theory was extensively applied in biology, largely as a result of the work of John Maynard Smith and his evolutionarily stable strategy. In addition, the concepts of correlated equilibrium, trembling hand perfection, and common knowledge were introduced and analysed.

In 2005, game theorists Thomas Schelling and Robert Aumann followed Nash, Selten and Harsanyi as Nobel Laureates. Schelling worked on dynamic models, early examples of evolutionary game theory. Aumann contributed more to the equilibrium school, introducing an equilibrium coarsening, correlated equilibrium, and developing an extensive formal analysis of the assumption of common knowledge and of its consequences.

In 2007, Roger Myerson, together with Leonid Hurwicz and Eric Maskin, was awarded the Nobel Prize in Economics "for having laid the foundations of mechanism design theory." His contributions also include the notion of proper equilibrium and an important graduate text, Game Theory: Analysis of Conflict (1991).

http://en.wikipedia.org/wiki/Game_theory

Real Business Cycle Theory

Real Business Cycle Theory (or RBC Theory) is a macroeconomic school of thought that holds that the business cycle is caused by random fluctuations in productivity. (The four primary economic fluctuations are secular (trend), business cycle, seasonal, and random.) Unlike other leading theories of the business cycle, it sees recessions and periods of economic growth as the efficient response of output to exogenous variables. That is, RBC theorists argue that at any point in time, the level of national output necessarily maximizes utility, and government should therefore not intervene through fiscal or monetary policy designed to offset the effects of a recession or cool down a rapidly growing economy.

According to RBC theory, business cycles are therefore "real" in that they do not represent a failure of markets to clear, but rather reflect the most efficient possible operation of the economy. It differs in this way from other theories of the business cycle, like Keynesian economics and Monetarism, which see recessions as the failure of some market to clear.

Economists have come up with many ideas to explain what drives these fluctuations. The one which currently dominates the academic literature was introduced by Finn Kydland and Edward Prescott in their seminal 1982 work “Time to Build and Aggregate Fluctuations.” They envisioned the driving factor to be technological shocks, i.e., random fluctuations in the productivity level that shift the constant growth trend up or down. Examples of such shocks include innovations, bad weather, increases in imported oil prices, stricter environmental and safety regulations, etc. The general gist is that something occurs that directly changes the effectiveness of capital and/or labor. This in turn affects the decisions of workers and firms, who change what they buy and produce and thus eventually affect output. RBC models predict time sequences of allocation for consumption, investment, etc. given these shocks.

But exactly how do these productivity shocks cause ups and downs in economic activity? Consider a good but temporary shock to productivity. This momentarily increases the effectiveness of workers and capital. Also consider a world where individuals produce the goods they consume. This may seem simplistic, but at the aggregate level it averages out.

Individuals face two types of trade-offs. One is the consumption-investment decision. Since productivity is higher, people have more output to consume. An individual might choose to consume all of it today, but if he values future consumption, all that extra output might not be worth consuming in its entirety today. Instead, he may consume some but invest the rest in capital to enhance production in subsequent periods and thus increase future consumption. This explains why investment spending is more volatile than consumption. The life cycle hypothesis argues that households base their consumption decisions on expected lifetime income, and so they prefer to "smooth" consumption over time. They will thus save (and invest) in periods of high income and defer consumption to periods of low income.
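
The consumption-investment trade-off can be sketched as a simple two-period problem; the log utility, discount factor, and rate of return below are assumptions chosen for illustration, not parameters from any particular RBC model.

```python
# Two-period consumption-investment choice under log utility:
# maximize log(c1) + beta * log(c2) subject to c2 = R * (y - c1).
# All parameter values are assumed purely for illustration.
import math

y, beta, R = 100.0, 0.95, 1.05

def lifetime_utility(c1: float) -> float:
    c2 = R * (y - c1)  # whatever is not consumed is invested at return R
    return math.log(c1) + beta * math.log(c2)

# Brute-force search over first-period consumption.
grid = [y * i / 10000 for i in range(1, 10000)]
c1_star = max(grid, key=lifetime_utility)

print(f"consume today: {c1_star:.2f}")        # ~ y / (1 + beta) = 51.28
print(f"invest:        {y - c1_star:.2f}")    # the rest goes into capital
print(f"closed form:   {y / (1 + beta):.2f}")
```

Raising y, as a positive productivity shock would, raises consumption and investment together, with investment absorbing part of the windfall: the smoothing behavior described above.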

The other decision is the labor-leisure trade-off. Higher productivity encourages substituting current work for future work, since workers will earn more per hour today than they expect to tomorrow. More labor and less leisure result in higher output today, and more output means greater consumption and investment today. On the other hand, there is an opposing effect: since workers are earning more, they may not want to work as much today and in future periods. However, given the pro-cyclical nature of labor, it seems that the above "substitution effect" dominates this "income effect."

Overall, the basic RBC model predicts that given a temporary shock, output, consumption, investment, and labor all rise above their long-term trends, forming a positive deviation. Furthermore, since more investment means more capital is available for the future, a short-lived shock may have an impact well beyond it. That is, above-trend behavior may persist for some time even after the shock disappears. This capital accumulation is often referred to as an internal "propagation mechanism," since it converts shocks without persistence into highly persistent deviations in output.
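
A stripped-down simulation makes this propagation mechanism visible; the Cobb-Douglas technology, fixed saving rate, and parameter values below are illustrative assumptions, not a calibrated RBC model, which would derive saving and labor choices from household optimization.

```python
# One-period productivity shock fed through capital accumulation:
# y_t = z_t * k_t**alpha, k_{t+1} = (1 - delta) * k_t + s * y_t.
# Parameters are illustrative, not calibrated.
alpha, delta, s = 0.33, 0.1, 0.2

def simulate(shock_periods, T=20):
    k, path = 2.8, []  # start near the model's assumed steady state
    for t in range(T):
        z = 1.1 if t in shock_periods else 1.0  # +10% productivity shock
        y = z * k ** alpha
        path.append(y)
        k = (1 - delta) * k + s * y
    return path

baseline = simulate(shock_periods=set())
shocked = simulate(shock_periods={2})  # shock hits only in period 2

for t, (y0, y1) in enumerate(zip(baseline, shocked)):
    print(f"t={t:2d}  output deviation: {100 * (y1 / y0 - 1):+.2f}%")
# The deviation spikes in period 2, and output then stays above trend
# for several periods after the shock itself is gone, because the extra
# investment raised the capital stock.
```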

It is easy to see that a string of such productivity shocks will likely result in a boom. Similarly, recessions follow a string of bad shocks to the economy. If there were no shocks, the economy would just continue following the growth trend with no business cycles.

Essentially, this is how the basic RBC model qualitatively explains key business cycle regularities. Yet any good model should also generate business cycles that quantitatively match the empirical stylized facts that serve as the benchmark. Kydland and Prescott introduced calibration techniques to do just this. The reason why this theory is so celebrated today is that, using this methodology, the model closely mimics many business cycle properties. Yet current RBC models have not fully explained all behavior, and neoclassical economists are still searching for better variations.

It is important to note that the main assumption in RBC theory is that individuals and firms respond optimally all the time. In other words, if the government came along and forced people to work more or less than they would have otherwise, it would most likely make people unhappy. It follows that the business cycles exhibited in an economy are chosen in preference to no business cycles at all. This is not to say that people like to be in a recession. Slumps are preceded by an undesirable productivity shock, which constrains the situation. But given these new constraints, people will still achieve the best outcomes possible, and markets will react efficiently. So when there is a slump, people are choosing to be in that slump because, given the situation, it is the best solution. This suggests laissez-faire is the best policy response, but given the abstract nature of the model, this has been debated.

A precursor to RBC theory was developed by monetary economists Milton Friedman and Robert Lucas in the early 1970s. They envisioned the factor that influenced people's decisions to be misperception of wages: booms and recessions occurred when workers perceived wages as higher or lower than they really were, leading them to work and consume more or less than they otherwise would. In a world of perfect information, there would be no booms or recessions.

http://en.wikipedia.org/wiki/Real_business_cycle

Friday, October 26, 2007

Merger of NationsBank and BankAmerica

In 1997, BankAmerica lent D.E. Shaw & Co., a large hedge fund, $1.4bn so that the hedge fund would run various businesses for the bank. However, D.E. Shaw suffered significant losses after the 1998 Russian bond default. BankAmerica was acquired by NationsBank later that year.

The purchase of BankAmerica Corp. by the NationsBank Corporation was the largest bank acquisition in history at that time. While the deal was technically a purchase of BankAmerica Corporation by NationsBank, it was structured as a merger, with NationsBank renamed Bank of America Corporation and Bank of America NT&SA changing its name to Bank of America, N.A., the remaining legal bank entity. The bank still operates under Federal Charter 13044, which was granted to Giannini's Bank of Italy on March 1, 1927. However, SEC filings before 1998 are listed under NationsBank, not BankAmerica.

Following the US$64.8 billion acquisition of BankAmerica by NationsBank, the resulting Bank of America had combined assets of US$570 billion, as well as 4,800 branches in 22 states. Despite the mammoth size of the two companies, federal regulators insisted only upon the divestiture of 13 branches in New Mexico, in towns that would be left with only a single bank following the combination. This is because branch divestitures are only required if the combined company would have more than a 25 percent FDIC deposit market share in a particular state or a 10 percent deposit market share overall.
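
The divestiture rule quoted above reduces to a simple arithmetic check. The sketch below implements the 25 percent statewide and 10 percent nationwide deposit-share caps with invented deposit figures; only the thresholds come from the text.

```python
# Deposit-share caps described above: a merger triggers divestitures if the
# combined bank would hold more than 25% of deposits in any one state or
# more than 10% of deposits nationwide. All dollar figures are hypothetical.
STATE_CAP, NATIONAL_CAP = 0.25, 0.10

def merger_check(combined_by_state, market_by_state, national_market):
    """Return the states (or 'US') where the combined bank breaches a cap."""
    breaches = []
    for state, deposits in combined_by_state.items():
        if deposits / market_by_state[state] > STATE_CAP:
            breaches.append(state)
    if sum(combined_by_state.values()) / national_market > NATIONAL_CAP:
        breaches.append("US")
    return breaches

combined = {"NM": 30.0, "CA": 120.0}   # combined deposits, $bn, hypothetical
market = {"NM": 100.0, "CA": 900.0}    # total deposits per state, $bn
print(merger_check(combined, market, national_market=4000.0))  # ['NM']
```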


Source: http://en.wikipedia.org/wiki/Bank_of_America

Bank of Italy: The Roots of the Pre-1998 Bank of America

The roots of the pre-1998 Bank of America lie in the Bank of Italy, founded in San Francisco by Amadeo Giannini in 1904. When the 1906 San Francisco earthquake struck, Giannini was able to get all of the deposits out of the bank building and away from the fires. Thus, unlike many other banks, he retained the confidence of the depositors and also had money to lend to those struck by the disaster.

In the late 1920s, Giannini approached Orra E. Monnette, president and founder of the Bank of America, Los Angeles, about a merger between the two entities. The Los Angeles-based bank had exhibited strong growth throughout the 1920s, due in part to its success in developing an advanced branch banking system. The merger was completed in early 1929, and the combined company took the name Bank of America. It was headed by Giannini, with Monnette serving as co-chair.

While the names of many nationally chartered banks end with the initials 'N.A.' (National Association), Giannini picked a unique ending, National Trust and Savings Association, or 'NT&SA', because he wanted the name to highlight the different functions of the bank. Bank of America was the only NT&SA in the country. Thanks to good management, but also to aggressive development of the branch banking concept, the bank was soon the largest in California.

Bank of America: Largest Commercial Bank in the United States

Bank of America (NYSE: BAC, TYO: 8648) is the largest commercial bank in the United States in terms of deposits, and the largest company of its kind in the world.

It is the largest American company (by market capitalization) that is not part of the Dow Jones Industrial Average.

Type - Public (NYSE: BAC, TYO: 8648)
Founded - San Francisco, CA (1928, as "Bank of Italy"); acquired banks: Charlotte, NC (1874), Boston, MA (1784)
Headquarters - Charlotte, North Carolina, U.S.
Key people - Kenneth D. Lewis, Chairman, CEO & President; Amy W. Brinkley, Global Risk Executive; J. Steele Alphin, CAO; Joe L. Price, CFO
Industry - Banking
Products - Financial Services
Revenue - $117.01 billion USD (2006)
Operating income - $21.64 billion USD (2006)
Net income - $21.13 billion USD (2006)
Employees - 203,425 (2006)
Slogan - Bank of Opportunity
Website - www.bankofamerica.com