Monday, August 5, 2013

Jobs and Cook at Apple: From Visionary to Managerial Leadership

Founders and otherwise visionary leaders in business can be distinguished from managers, even when a manager is running a company. For one thing, managers may resent leaders for taking in a larger view while relegating, or even dismissing, the petty details that can be so alluring to the managerial mentality. Leaders, in turn, may view the implementation of a vision as nugatory at best. More abstractly, paradigmatic change (i.e., shifting from one broad framework to another) has its fans (visionary leaders), while the status quo has its defenders (managers). Vision and big ideas are typically associated with a company’s founder or visionary leader, whereas bureaucracy tends to go with the implementation focus of managers (including executives). In short, to suppose that leadership and management are the same is to ignore a lot that separates them. In the case of Apple, the shift from leadership to management that occurred with the passing of Steve Jobs may be at least partially responsible for the subsequent decline in the company’s stock price. In this essay, I explore the change at Apple to demonstrate why management should not be conflated with leadership.
 
Generally speaking, the shift from a company’s visionary founder to status-quo management can introduce political and psychological instability in the organization, as managers struggle to fill the founder’s shoes in terms of both ideas and power. Although managers have a sufficient instinct for power to grab up quickly whatever power becomes available, none of the epigones is likely to be visionary, oriented as they are to detail and control. Although a way to develop visionary ability may yet be found, I suspect that the wherewithal is either built into a person or it isn’t. That is to say, none of the managerial executives surviving a founder at a company is likely to stand out as a visionary leader, not to mention as THE visionary leader.
 
Steve Jobs is an interesting example of a founder who took quite naturally to being the visionary leader at Apple. Rather than being merely incremental improvements, the product innovations resulting from his unique ideas changed the world. The iPod, for example, stood to make even the laptop obsolete, even as the laptop in turn was beginning to replace the television for many young adults. Meanwhile, the smartphone was fundamentally altering the notion of a telephone. Crucially, Jobs could treat the assumptions we take for granted regarding the products we use as relics or artifacts of an age, and thus as replaceable. He could envision new products not beholden to those now-antiquated assumptions. This is the dynamic in visionary leadership. Even though questioning taken-for-granted assumptions can be taught, imagination is likely something a person either has or does not have. Hence, it is unlikely that people can be trained to be visionary leaders. Managers in particular may be especially handicapped in this regard.
 
It is thus significant that Tim Cook took over as CEO in 2011 after Jobs died. Cook had come to Apple in 1998 as an expert in sales and operations. According to the New York Times, he created “the efficient supply chain that helped catapult the company into the top ranks of the technology industry.” Being an expert in operations with an orientation to efficiency is tantamount to having “manager” tattooed on one’s back. Perhaps nothing could be further from visionary leadership. The New York Times observes that whereas Jobs “was famous for his creative vision and flamboyant performances at introductions of the company’s products,” Cook “was known for his behind-the-scenes work—particularly for his shrewd negotiating tactics with suppliers.” Creativity and attention-getting can be associated with visionary leaders, whereas an orientation to “behind-the-scenes” tactics is an orientation to implementation, and thus to managing toward a goal that has already been set. It is quite understandable that the shift from Jobs to Cook was not an easy one for the company.
 
A year after Jobs’ death, the senior management of Apple was shaken up in a move that Cook formally orchestrated. The move has the dull ring of old-fashioned “office politics.” Specifically, Cook pushed Scott Forstall and John Browett out of the company. To be sure, both managers had “stumbled,” according to the Wall Street Journal. Forstall had overseen the new mapping service, which was rushed out “riddled with bugs.” Meanwhile, Browett had overseen “the faulty implementation of a new staffing formula that cut some employee hours.” In spite of these vulnerabilities, it is also true that the two departures took place “as new fissures” emerged “among Apple executives, after some took on new roles following the death [of Steve Jobs].” That is, the organization’s phase of transition from the leader-founder to the manager-executives was a contributory factor.
 
Pointing to the void that typically exists after the departure of a visionary leader, Forstall observed that there was no “decider” after Jobs had left the company. Having been used to a central authority, the managers did not sufficiently fill the void. They could not. Besides lacking the stature to be viewed as unique, they had been so dependent on Jobs that they did not do enough to make up for the gaps in authority following his death. As a result, problems such as those in the mapping service and the staffing formula fell through the cracks organizationally. The mapping service, for instance, might not have been rushed out had there been a process of checks that did not depend on a central authority. Therefore, Scott Forstall and John Browett were not completely to blame for the failures attributed to them.
 
Whereas Steve Jobs’ “outsize personality had kept managers in check” by “always casting the winning vote or by having the last word,” Tim Cook was not able to keep the clashes from breaking out into the open. For instance, Jonathan Ive, Apple’s design chief, and Forstall “clashed so severely” that “they avoided being in the same room together.” Previously, they had always just let Jobs decide. With that option gone, the additional pressure of decision-making exacerbated the acrimony. Ive and Forstall could no longer simply push their unresolved disputes to Jobs’ desk. Either the two men had to resolve their own disputes, or the problems fell through the cracks.
 
By announcing “Now the Tim Cook era at Apple Inc. really begins” after the shakeup, the Wall Street Journal erred in implying that Cook had acted as a leader. Whereas managers at the company had “lived in constant fear of falling victim to a Jobs’ tirade or a whim,” Cook was “pushed into” firing Forstall and Browett in order to “steady the ship.” Such a muted, or “small picture,” response is a good indication that Apple had moved into the manager-executive stage. It is no wonder that Apple’s stock fell nearly $100 in six weeks. According to the Wall Street Journal, the firings failed “to address the question of who will fill [Steve Jobs’] role as Apple’s ultimate decider on products.” Being pushed into incrementally stabilizing an organization without addressing the more fundamental issue does not evince leadership. A visionary leader acts proactively to keep organizational politics from getting out of hand, because such politics are trivial in relation to the leader’s vision. In fact, the vision should include making that fundamental issue obsolete by “re-imagining” the organization itself.
 
For a CEO, such “re-imagining” can or even should extend to society itself. Steve Jobs doubtlessly imagined a very different world with his product ideas. In contrast, when Cook was asked while testifying before a U.S. Senate committee why two-thirds of Apple’s global pretax income in 2011 had been recorded in Ireland even though only 1% of the company’s customers were located in that low corporate-tax state in the E.U., he replied, “Unfortunately, the tax code has not kept up with the digital age.” Had he been a visionary leader, he most likely would have used the opportunity to present his vision of an alternative basis for corporate taxation in sync with the age. Instead, he dug in, insisting that Apple had paid taxes on all of its profits. Even though a Senate report had found that Apple had paid little or no corporate tax on at least $74 billion over the previous four years, Cook insisted, “We pay all the taxes we owe, every single dollar.” Astonishingly, Sen. John McCain credited Cook with having managed “to change the world, which is an incredible legacy for Apple.”

 

Sources:

Jessica E. Lessin, “Apple Executives to Exit,” The Wall Street Journal, October 30, 2012.

____________, “Apple Shake-Up Signals Tim Cook Era,” The Wall Street Journal, October 31, 2012.

Danny Yadron, Kate Linebaugh, and Jessica Lessin, “Apple CEO Defends Tax Practices as Proper,” The Wall Street Journal, May 21, 2013.

 Nelson Schwartz and Brian Chen, “Disarming Senators, Apple Chief Eases Tax Tensions,” The New York Times, May 22, 2013.

Sunday, August 4, 2013

Return of the Mortgage-Based Bonds: Another Bubble in the Making?

In case it has been a while since you have been entertained by going around in circles while sitting on a painted wooden horse, permit me to re-introduce you to the merry-go-round, a staple of virtually any amusement park. The world itself might just be such a ride, with us mere earthlings playing out our respective roles while spinning around and around as the world goes by once, and then again, and again. Lest it seem monotonous to go around in circles, it is possible, at least in principle, to learn something new on each pass. Weighing against a learning curve that might thwart an eternal recurrence of past failures are other, less salubrious proclivities of the mind. These include (but are not limited to) avarice, the lust for power, and even the force of habit. More damning still may be the arrogance of pride: the hubris of presumption. The anti-epistemological mechanism of human presumptuousness may dwarf greed and power-aggrandizement in holding us back as a species from realizing the better angels of our human, all too human, nature. Lest all of this come off as too abstract, dissolving in your hand as you grasp it (something like Jell-O), fortunately we have Wall Street, which can always be counted on to render such matters sufficiently concrete, if not utterly banal.
The full essay is at Another Bubble.

Wednesday, July 31, 2013

A Critique of Corporate Political Risk Analysis and U.S. Foreign Policy: The Case of Libya

Even though people the world over instinctively recoiled as reports came in of Gadhafi's violent retaliation against Libyan protests on February 21, 2011, the official reaction from the U.S. Government was muted at best. The refusal to act on the intuitive response to immediately remove the Libyan dictator's ability to wantonly kill people resisting his right to rule may have stemmed from concerns that the mounting tumult of a change of government in a major oil-producing region of North Africa could cause even just a disruption in the supply of crude. Indeed, even the mere possibility was prompting a spike in the price of oil (and gas): what one might call a risk premium. The prospect of a nasty electoral backlash from consumers facing a possible increase in their largely non-discretionary gas expense was not lost on their elected representative-in-chief at the White House. Even five days later, after some serious press on the rising price of gasoline hitting American consumers, the most the president would do was proffer a verbal "demand" from afar that Gadhafi leave Libya. "When a leader's only means of staying in power is to use mass violence against his own people, he has lost the legitimacy to rule and needs to do what is right for his country by leaving now," the White House said in a statement. The dictator must have been shaking in his boots. In actuality, Gadhafi had lost his legitimacy to rule five days earlier, and by the day of the statement the American administration could have been actively involved with willing E.U. states in stopping him inside Libya. Given the progress of the protesters-turned-rebels and the behavior of Brent crude that week, the interests of the American consumer (and Western oil companies, as well as the business sector overall) were by then firmly in line with an enforced regime change in Libya. Oddly, the old dogma of absolute governmental sovereignty was colluding with an excessively risk-averse corporate political-risk methodology to hold America back from acting as midwife to a new political awareness breaking out in the Middle East.

On the day of Gadhafi's self-vaunted shooting spree, the Brent crude benchmark vaulted past $108 a barrel (settling at $105.74, a two-year high). On the following day, it rose to $111.25. On the first day of March, the Dow Jones Industrial Average dropped 168.32 points, or 1.38%, to finish at 12058.02, its third triple-digit decline in the past week. Oil futures on the New York Mercantile Exchange, already up 6% on the year, jumped 2.7% to settle at $99.63 a barrel. Brent crude in London hit $115.42 a barrel, the highest settlement since Aug. 27, 2008. The graph shows the percentage change in the price of oil, though the change looks astounding in part simply because the graph's scale only goes to 15%; were it to go to 100%, the picture might seem less dramatic.

The Wall Street Journal had already reported on February 21st that the rise was "driven by increasing unrest in the Middle East." Specifically, worries that the turmoil in Libya was curtailing that country's oil output were said to be driving the price climb. However, USA Today cites Darin Newsome, an energy analyst at DTN, as pointing instead to the role of speculators around the world in propelling the price of oil: "The flow of money plays an enormous role in the direction, speed and volatility of these markets." In fact, the market mechanism itself may be flawed, because speculators can push commodity prices out of sync with the underlying supply of the respective commodities. Turmoil in Libya cannot be blamed for the ensuing “creation” of artificial value (such an increase, by the way, had fueled the housing bubble in the U.S. that came in for a hard landing in 2008). In fact, the rise in world oil prices began before the final third of 2010, before the prospect of widespread popular protest in the Middle East was realized. Indeed, the climb during the last third of 2010 looks a lot like the one that took place in the first third of 2009 (during a recession). It was not until well into February 2011 that the turmoil in the Middle East appeared, according to MSNBC, “to pose limited risk to global oil supplies. Neither Tunisia nor Egypt produce oil or gas.” Such “limited risk,” besides being mitigated, cannot very well be projected back into 2010 to explain the rise in the price of gas.

Incidentally, another interesting feature of the graph is the sustained drop in 2008, before the financial crisis in September (and the U.S. presidential election in November!). The “V” pattern at the end of 2008 is classic “electoral.” It suggests that the price of gas may be very attuned to the electoral interests of those in power, and therefore to government policy. My contention in this essay is that this dynamic was alive and well in Washington when Gadhafi was turning on his own people.

In any event, The Wall Street Journal observed on the day after the massacre that rising oil prices "could have big implications for the U.S. economy." Although the paper was perhaps overreacting to the day's news, it is true that the price of oil has a big impact on a consumer-driven economy. Energy expenses, like food, are nondiscretionary, Howard Ward of GAMCO Growth Fund told MSNBC. “And they’re now poised to take a bigger share of wages than we’ve seen in several years. That will have a dampening impact on discretionary spending. We still have an economy that is 70 percent consumer spending.” In such an economy, how could politicians turn a blind eye to domestic consumer interests, even at the expense of defending human rights abroad? Arjun Murti, an oil analyst at Goldman Sachs, told The Wall Street Journal that even as people "put so much emphasis on the U.S., . . . what is going on in the rest of the world matters as much if not more." However, elected representatives are inclined by their desire to stay in power to put world news through the prism of their constituents' pocketbooks, and thus to frame foreign policy to protect their consumers. In other words, an elected representative is apt to be more finely attuned to the grievances of his or her electorate than to stopping human-rights violations abroad. Perhaps it is such politics that keeps the heads of democratic governments from agreeing on an intergovernmental or international military mechanism that would act to stop a regime once it has violently turned on its own people.

Besides the political implications of consumers being even potentially shell-shocked by higher gas prices, the business sector can be expected to be averse to political instability in a region of the world in which so much oil is produced. This aversion is, in my view, overly risk-averse. As MSNBC points out, it is unlikely that any new regime in an oil-producing country would withhold supply as a matter of policy, because “any new government would badly need those oil revenues.” Libya produces only 2% of the global supply of crude, and the Saudi-controlled OPEC cartel would make up for any loss. “OPEC is ready to meet any shortage in supply when it happens,” the Saudi oil minister, Ali al-Naimi, said at a news conference after an OPEC meeting, according to The New York Times on February 23rd. “There is concern and fear, but there is no shortage.” In my view, the minister’s statement reflects the excessive risk aversion in corporate political-risk departments, for while fear is perfectly understandable in a protester being gunned down in the streets, the emotion points to an over-reaction among managers assessing political risk in financial terms from the vantage point of their carpeted offices in the steel fortresses of modern cities.

In another piece in The New York Times on February 23rd, Clifford Krauss put forth the argument that the relative quality of Libya’s reserves magnified its importance in the price spike. Saudi Arabia has more than 4 million barrels in spare capacity, but that includes “heavier grades of crude that are higher in sulfur content and more expensive to refine.” Larry Goldstein, a director of the Energy Policy Research Foundation, an organization partly financed by the oil industry, argues that “Quality matters more than quantity.” Furthermore, should Europe need to buy sweet crude from Algeria and Nigeria, that could push prices higher. “Nigeria and Algeria are already producing flat out so they can’t come up with another million barrels a day,” said Michael Lynch of the Strategic Energy and Economic Research consultancy. “That means there will be a scramble for lighter crude supplies.” The last time there had been a shortage of sweet crude (in 2007 and early 2008), oil prices soared to more than $140 a barrel, although the cause then was spiraling demand. Placing quality before quantity nevertheless seems questionable to me in looking at supply as it interacts with demand. Furthermore, the analysts were discounting the capacity of the Saudis and OPEC to counter any increase in costs by increasing supply. The New York Times reported on February 23rd that “Tom Kloza, the chief oil analyst at the Oil Price Information Service, estimated that the Saudis could pump an additional 1 million to 1.5 million barrels in a matter of days.” Additionally, OPEC has “a reserve capacity to deliver an additional four million to five million barrels to the world markets after several weeks of preparation. That is more than twice the oil that world markets would lose if production were halted completely by unrest in Libya.” In other words, in the wake of Gadhafi’s massacre as Brent crude hit $110, the business analysts should have realized that the Saudis would have had virtually to agree to, or otherwise go along with, any cost-induced spikes. Of course, the political-risk analysts have also argued that the Saudi royal family could fall, given the spread of protests throughout the region. To be sure, that is a possibility, but not necessarily one that would play out as the analysts have it, or with a cut-off in Saudi oil.
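To make these quantities concrete, here is a back-of-the-envelope sketch in Python using the figures cited above; the global-demand number (roughly 88 million barrels a day in early 2011) is my own assumption, not a figure from the cited articles.

# Can spare capacity cover a total Libyan outage? The Libyan share and the
# spare-capacity figures come from the reporting cited above; global demand
# (~88 million bbl/day in early 2011) is an assumption.
GLOBAL_DEMAND = 88.0        # million barrels/day (assumption)
LIBYA_SHARE = 0.02          # Libya: ~2% of the global supply of crude
SAUDI_FAST = (1.0, 1.5)     # million bbl/day available "in a matter of days" (Kloza)
OPEC_RESERVE = (4.0, 5.0)   # million bbl/day after several weeks (NYT)

libya = GLOBAL_DEMAND * LIBYA_SHARE
print(f"Libyan output: ~{libya:.1f} million bbl/day")
print(f"OPEC reserve covers a total Libyan outage "
      f"{OPEC_RESERVE[0] / libya:.1f}x to {OPEC_RESERVE[1] / libya:.1f}x over")

On these assumptions, Libya pumps roughly 1.8 million barrels a day, so even the low end of OPEC's reserve capacity covers a total outage more than twice over, which bears out the Times' arithmetic.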

On March 2nd, The Wall Street Journal ascribed the previous day's market jitters to fears of unrest intensifying in Saudi Arabia as authorities there arrested a prominent Shiite cleric who had been calling for political reforms. "If there are problems in Saudi Arabia, we will feel it and that's causing concern, obviously," said Marc Pado, a U.S. market strategist at Cantor Fitzgerald. Also, Iran reported clashes between protesters and security forces in Tehran. Concerning Saudi Arabia, which seems to have been the epicenter of the worry, analysts believed at the time that the political instability in Bahrain meant that Saudi Arabia itself could be at risk. Indeed, the political-risk argument may have come down to this contingency. Kloza pointed out that unless the unrest were to spread to the streets of Jeddah and Riyadh, “I think it’s a very manageable situation and prices are closer to cresting than they are to exploding higher.” Even he could have been overstating the risk, for besides discounting the financial appetite that a republic in Arabia would have for selling oil, his analysis projects too much from a kinship between Saudi Arabia and Bahrain. The New York Times article also points to oil experts who argued at the time that the “island nation has a majority Shiite population with cultural and religious ties to the Saudi Shiite minority that lives close to some of the richest Saudi oil fields.” However, a number of “ifs” would have had to be satisfied before this fuse could have gone off. For one thing, Saudi oil fields are well defended. Also, that a majority population might do something does not mean that a minority population would do likewise (and in a different and much larger country). Were the unrest sweeping the Middle East to hit Saudi Arabia and turn it, too, into a republic, that would be part of the broader sweep. In other words, I think the analysts overstated the significance of Bahrain and, moreover, missed the bigger picture (i.e., the transformation of the Middle East from autocracies into democracies). Such a historical transformation of the entire region could well have been happening, but that does not necessarily mean that a significant, sustained cut-off in the supply of oil would result. Indeed, such a conclusion ignores a basic fixture of human nature: greed. It is ironic that political-risk analysts in business would miss that element. In short, they were over-reacting via over-projecting.
 
Going overboard in making projections is one indication of an excessive aversion to risk in a personality. I suspect that this bias in corporate political-risk analysis comes not only from like personalities, but also from corporate culture, which eschews controversy of any sort. In the rarified corporate office, conflicting values are winnowed away in favor of the hegemony of efficiency and the associated business technique. This cultural aversion to uncertainty impacts business practice, including political-risk analysis. A well-run corporation would have someone in that department saying, in effect, “hey, loosen up, guys.” When things really are bad, as in September 2008 when the financial system almost collapsed, business is typically caught off guard just like the rest of us. In terms of the protests in the Middle East, we can take it to the bank that business was on the side of political stability, and thus of the extant regimes.

The price of oil affects so many industries that virtually any industry can be expected to lobby for foreign policies that give priority to the stability of the status quo (rather than to revolution). That is, both consumer and business interests could be expected to have pressured elected representatives in the U.S. Government to resist giving too much support to the protesters in the Middle East. For example, President Obama’s policy was that Mubarak should stay in power through the transition even as events in Egypt were rapidly forcing him out of office. Whereas strategic interests such as the Suez Canal might have been foremost in Obama’s calculation regarding Egypt, oil, and thus American consumers and business, might have been primary in his muted statements in the wake of Gadhafi’s retaliation. This sets up an interesting dilemma. While the immediate reaction of most people worldwide who recoiled in horror at the atrocities in Libya on February 21st was for something to be done right away to stop Gadhafi, even if that meant more chaos in the short run, business political-risk analysis proffered an alternative course: reducing the turmoil immediately, even if that meant retaining Gadhafi in power. Whereas proponents of democracy and human rights viewed the protests in Libya as a good thing, such people might be surprised to find the same activity portrayed in negative terms, from the business standpoint, even in our midst. For example, USA Today reported Peter Beutel, of Cameron Hanover, as saying, "We have all the wrong things working together at the right time: an economic recovery, (stocks) making new highs, a lower dollar, strong seasonal demand and unrest in the heart of oil production" (italics added). Libyans putting their lives on the line is also unrest in the heart of oil production. It is the starkness in the vector of valuation (i.e., very good vs. very bad) that is striking here. That a person in one house could view the spreading protests in the Middle East as instantiating a much-overdue development in government while the person next door was disdainful of all the unrest attests to how differently the same event can be viewed.

From the standpoint of the environment of international business, standing up for human rights is not as much of a priority as an observer might hope. In other words, what is good for GM is not necessarily good for the world. The theory that increasing international business (e.g., trade and foreign direct investment) leads to or guarantees peace suddenly looks inadequate as a philosophy of international business. One implication is that if corporate lobbyists have real sway over governments, the latter can be expected to shy away from policies and actions that would increase short-term political instability even where such turmoil would be a good thing from the standpoint of democracy and human rights. Politicians who allow themselves to be controlled by corporate executives can be expected to overvalue stability and shortchange leadership (and real change). It may be that even the very existence of large corporations in a republic is problematic from this standpoint. Corporations, and ironically even elected representatives, may be predisposed to advocate policies that are at odds with expanding democracy in the world.

In general terms, I contend that both toady politicians and the timid business executives who do not want to rock the boat for financial reasons are short-sighted even by their own rather narrow criteria. In the case of Libya, had an overwhelming multinational military force descended on the country as Gadhafi's men were ravaging Libyans in the streets, rather than waiting for the U.N. Security Council to act, Gadhafi could have been stopped in his tracks in short order, and thus order preserved and civilians spared (i.e., oil supplies undisturbed and a slaughter averted). Of course, as with any military action, things can go wrong. For example, Gadhafi could have sabotaged the Libyan oil wells, as Saddam Hussein's forces did to Kuwait's wells at the end of the first Gulf War (1991). Then again, the world's failure to take the initiative may itself have given Gadhafi time to set up explosives ready at the touch of a finger in Tripoli. Indeed, there were reports on the day following the massacre that Gadhafi intended to blow up the oil wells anyway. So the destruction of Libyan oil production could have come either from the world acting or from its failing to act in the wake of Gadhafi's violence against the protesters. Given the ambiguity of such risk, corporate political-risk analysis would probably still come down in favor of retaining Gadhafi, because the status quo is typically presumed to proffer the most stability. This I would call the fallacy of the status quo, which I believe dominates bureaucratic and state-department thinking.

Instead of placing corporate political-risk analysis on center stage, I submit that business is not the focal point of society (or politics). At the societal level, the hub-and-spokes stakeholder framework must be replaced by a web structure wherein there is no central entity. Corporate political-risk analysis, from this broader perspective, should be consulted without being allowed to become dominant. Therefore, governments around the world ought to overcome the pressure from their respective corporate political-risk analysts and, in favor of human rights, place real limits on governmental sovereignty, backed up by an international or multinational force on permanent stand-by, with a mechanism for activation agreed to before any occasion arises. Such a leap would of course take principled leadership. Such leadership could be partially reconciled with more immediate strategic political interests by making the mechanism go into effect after the present term of office. While not optimal, this method would indeed deliver (eventually).

Hence, even after five days of carnage in Libya and worsening volatility and price spikes in oil, as well as in gasoline and jet fuel, at the expense of the American consumer and business firm, the Obama administration (the regime of real change) could muster only a statement and a freezing of assets. "When a leader's only means of staying in power is to use mass violence against his own people, he has lost the legitimacy to rule and needs to do what is right for his country by leaving now." It would be almost a month after Gadhafi had turned on his protesting people before the U.N. Security Council brought itself to act by authorizing all necessary means for member countries wanting to step in to protect civilians in Libya. By that time, the protesters had become armed rebels, and Gadhafi's military was on a roll, killing rebels and civilians alike. A clean cut would have been better than a period of indecision.
 

Sources:

Jerry DiColo and Brian Baskin, "A Stealth Comeback for $100 Crude Oil," The Wall Street Journal, February 22, 2011, pp. C1, C3.

http://online.wsj.com/article/SB10001424052748704506004576173961240139414.html?mod=ITP_moneyandinvesting_0

Gary Strauss, "If Unrest Spreads, Gas May Hit $5," USA Today, February 22, 2011, p. A1.

http://www.msnbc.msn.com/id/41739499/ns/business-personal_finance/

http://www.nytimes.com/2011/02/24/business/energy-environment/24oil.html?_r=1&hp

http://www.nytimes.com/2011/02/23/business/global/23oil.html?ref=todayspaper

http://www.msnbc.msn.com/id/41785849/ns/world_news-mideastn_africa/

http://www.nytimes.com/2011/03/18/world/africa/18nations.html?hp

Wall Street As More of the Economy: Unjust and Riskier?

The financial sector, which includes banks like JPMorgan and insurance companies like AIG, had the fastest earnings growth in the Standard & Poor’s 500 in 2012.[1] As of mid-2013, the sector comprised 16.8% of the S&P 500, almost double its share back in 2009. With the technology sector weighing in at 17.6% in 2013, the financial sector was poised to become the largest sector in the S&P 500. The traditional critique of the financial sector taking a larger share of the economy is that the sector doesn’t “make” anything. As this argument is well known, I want to point to two others.
 
 
First, the financial sector has been responsible for much of the rising economic inequality in the U.S. over recent decades. "Together, finance and executives accounted for 58 percent of the expansion of income for the top 1.0 percent of households and an even greater two-thirds share (67 percent) of the income growth of the top 0.1 percent of households," according to Josh Bivens and Lawrence Mishel of the Economic Policy Institute.[2] Increasing the financial sector’s portion of the U.S. economy translates into more income going to the top 1 percent. This trend in turn has rather negative implications not only for the poor, but also for American democracy (as distinct from plutocracy, rule by the wealthy).
 
 
The increasing income inequality from banking harkens back to medieval thought on usury (the charging of interest on lent funds). Under the just price theory, the respective values exchanged in a transaction should be equal, or the transaction is not fair.[3] With the time value of money not yet recognized, and most loans in the Middle Ages being for consumption rather than productive enterprise, the return of the lent principal was taken to be of equal value to the money (or commodities) lent.[4] A lender demanding more (e.g., interest) was thus considered unjust under Canon Law, unless the debtor was late in returning the funds, or the lender missed an opportunity for economic gain or incurred a loss because the lent funds were unavailable during the period of the loan. These exceptions came into effect during the High Middle Ages. In the modern context, if the interest being extracted exceeds the time value of money, we could predict increasing inequality of wealth as a result. In other words, the contribution of the financial sector to increasing economic inequality may indicate that the just price theory is being violated by Wall Street charging clients too much. As that sector becomes more of the economy, the unfairness expands.
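To restate the medieval test in modern terms, consider a minimal sketch in Python; the principal, the charged rate, and the time-value rate below are hypothetical, chosen only to illustrate the "unjust surplus" implied by just price theory.

# Interest up to the time value of money compensates the lender for waiting;
# anything beyond it is the surplus that just price theory would deem unjust.
# All figures are hypothetical.
def just_interest(principal, time_value_rate, years):
    # Interest exactly offsetting the lender's loss from deferred use of funds.
    return principal * ((1 + time_value_rate) ** years - 1)

def unjust_surplus(principal, charged_rate, time_value_rate, years):
    # Interest charged beyond the time value of money (0.0 if none).
    charged = principal * ((1 + charged_rate) ** years - 1)
    return max(0.0, charged - just_interest(principal, time_value_rate, years))

# A $10,000 one-year loan at 18%, with a 3% time value of money:
print(unjust_surplus(10_000, 0.18, 0.03, 1))   # 1500.0 -> the excess over a just price

On this test, the inequality grows mechanically: the surplus is a transfer from debtors to lenders above and beyond any compensating loss to the lender.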
 
 
Second, to the extent that a larger financial sector means that the economy is more reliant on leverage, the systemic risk of banks being too big to fail increases as well. According to former U.S. Senator Ted Kaufman, the Dodd-Frank financial reform law of 2010 largely failed to diminish systemic risk in the financial sector during the law’s first three years. As that sector becomes a larger proportion of the economy, relying on the Dodd-Frank regulations becomes increasingly risky for the economy as a whole. In other words, increasing the role of Wall Street in the economy means that unless government can safeguard the public interest against harm from the financial sector imploding, the American taxpayer is even more exposed than in September 2008.
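As a minimal sketch of why a more leveraged sector is more fragile, consider the following; the balance-sheet figures are hypothetical, not drawn from any bank discussed here.

# The thinner the equity cushion, the smaller the asset shock that renders a
# bank insolvent. All figures are hypothetical.
def equity_after_shock(assets, leverage, decline):
    equity = assets / leverage            # equity cushion implied by the leverage ratio
    return equity - assets * decline      # negative means insolvent

for leverage in (10, 20, 30):
    left = equity_after_shock(assets=100.0, leverage=leverage, decline=0.04)
    print(f"{leverage}:1 leverage, 4% asset shock -> equity {left:+.1f} "
          f"({'solvent' if left > 0 else 'insolvent'})")

At 10:1 leverage, a 4% decline in asset values leaves a cushion; at 30:1, the same shock wipes out equity entirely. That is why a larger, more leveraged financial sector raises the stakes should the Dodd-Frank safeguards fail.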




1. Alex Barinka and Whitney Kisling, “Banks Poised to Lead S&P 500 as JPMorgan Beats Microsoft,” Bloomberg, July 29, 2013.
2. Josh Bivens and Lawrence Mishel, “The Pay of Corporate Executives and Financial Professionals as Evidence of Rents in Top 1 Percent Incomes,” The Economic Policy Institute, June 20, 2013. (Accessed July 29, 2013).
3. James F. Childress and John Macquarrie, eds., The Westminster Dictionary of Christian Ethics (Philadelphia: The Westminster Press, 1986), p. 639.
4. The time value of money is based on the principle of instant gratification. Having money to spend today (i.e., immediate utility or pleasure) is worth more than only having money tomorrow (i.e., delayed gratification). Applied to lending, the time value of money means that the fact that a lender does not have use of his or her funds until the debtor repays the principal represents a loss in value for the lender in addition to the value of the principal. Just price, or equivalence of value exchanged, could thus justify interest of equal value to the time value of money.

The Financial Crisis: A Systemic and Ethical Analysis

According to a study by the Dallas Federal Reserve, the financial crisis of 2007-2009 “was associated with a huge loss of economic output and financial wealth, psychological consequences and skill atrophy from extended unemployment, an increase in government intervention, and other significant costs.”[1] The study’s abstract goes on to “conservatively estimate that 40 to 90 percent of one year’s output ($6 trillion to $14 trillion, the equivalent of $50,000 to $120,000 for every U.S. household) was foregone due to the 2007-09 [sic] recession.”[2]
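The study's per-household range is easy to verify with back-of-the-envelope arithmetic in Python; the household count (roughly 117 million U.S. households at the time) is my assumption, not a figure from the study.

# Sanity check on the Dallas Fed range, assuming ~117 million U.S. households.
HOUSEHOLDS = 117e6
for loss in (6e12, 14e12):
    print(f"${loss / 1e12:.0f} trillion -> ~${loss / HOUSEHOLDS:,.0f} per household")
# -> ~$51,282 and ~$119,658, close to the abstract's $50,000 to $120,000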

Interestingly, the Huffington Post “reports” the study’s finding in the following terms: “a ‘conservative’ estimate of the damage is $14 trillion, or roughly one year’s U.S. gross domestic product. This is based on how much output was lost during the crisis and Great Recession, along with all the damage done to potential future economic growth.”[3] In fact, the article’s title claims that the crisis cost more than $14 trillion! Lest it be thought that the reporter and editor suffer from a learning or reading disability, the gilding here is notably in the direction of “selling more papers.”

Ironically, the Huffington Post also published an article pointing to the lack of accountability in that “the executives that [sic] were in charge of Bear’s headlong dive into the cesspool of subprime mortgage lending hold similar jobs at the most powerful banks on Wall Street: JPMorgan, Goldman Sachs, Bank of America and Deutsche Bank."[4]

The upshot is that the stakeholders who played a role in the crisis, most significantly the people running the government, the media, and the banks, have gone on relatively unscathed, while the systemic risk has remained or has actually become even greater. As a first step toward recovery, a systemic map depicting the interrelated parts in the systemic failure, together with a related ethical analysis, can provide a basis for reforms sufficient to thwart another major financial crisis.
 
                                                Two Alternative System Designs Depicting a Society
I created this diagram for my dissertation to illustrate my theory of ethical strategic leadership. For our purposes here, the circles and the lines between them depict society as a system. Two designs are shown, each of which can be used to depict a society as a system of interrelated stakeholders. The design on the left privileges a company's point of view. The design on the right stresses a "web-like" structure applied at the societal level. "Fixing" a society from a systemic-design standpoint involves analyzing and changing whichever design is being used to depict the society.
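As a minimal illustration (my own sketch here, not taken from the dissertation), the two designs can be encoded as graphs: the hub-and-spokes design is a star centered on the firm, whereas the web design has no privileged node.

# Two ways to encode a society of stakeholders as a graph (illustrative only).
# Hub-and-spokes: every tie runs through the firm. Web: ties run among all parties.
stakeholders = ["firm", "customers", "employees", "regulators", "media"]

hub_and_spokes = {s: ([t for t in stakeholders if t != "firm"] if s == "firm"
                      else ["firm"])
                  for s in stakeholders}

web = {s: [t for t in stakeholders if t != s] for s in stakeholders}

print(hub_and_spokes["customers"])   # ['firm']: customers connect only via the firm
print(web["customers"])              # direct ties to every other stakeholder

"Fixing" the society then amounts to changing which graph one treats as the society's design.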
 
What contributed to the financial crisis? The most obvious players include the mortgage servicers and their underwriters, who respectively produced and approved subprime mortgages as if handing out candy to all-too-willing children who should have known better. As if this practice were not bad enough, many such mortgage-division managers suffered little if any negative consequence, even for committing fraud.

For example, “AMBAC Assurance Corp., a company that guaranteed some of Bear’s mortgage bonds and went bankrupt in 2010, accused Bear of fraud in a . . . lawsuit that described actions by . . . six mortgage division leaders . . . . Yet all six continue to work at the top levels of their field, earning salaries and bonuses that have allowed them to live in luxury while the mortgages that made up the bonds they sold have defaulted at alarming rates.”[5] The homeowners foreclosed on were doubtless in a much worse financial condition than the “leaders” who continued to live in luxury. Besides the financial and psychological costs estimated by the Fed in Dallas, the ethical dimension is salient.

The ethical problem in fraud is obvious; the verdict can be gleaned simply from the label “liars’ loans,” which applies to a good many of the subprime mortgage applications that the mortgage producers accepted. In terms of the consequences, a distinct ethical verdict can be reached using Rawls’s theory of justice. Rawls contends in A Theory of Justice that a system’s rules, or policies, should favor the least well-off most, for under Rawls’s “veil of ignorance” no one designing the system knows where in it he or she will be. For the wealthy mortgage bankers to continue living in luxury while the poor facing foreclosure pay the price turns Rawls’s “difference principle” on its head. In other words, the consequences themselves are unethical in terms of the foreclosure policies and the lack of accountability for the bankers. The prescriptive implication is that the relevant white-collar crime laws and their enforcement should be strengthened, while the foreclosure laws should be changed from favoring the banks to taking greater account of the dire situations of homeowners, particularly during a recession.

Besides the mortgage producers and the subprime borrowers, the investment bankers who securitized subprime mortgages into bonds referred to at Goldman as “crap” built a house of cards, which insurance companies like AIG and rating agencies like Moody’s negligently disregarded. In the late 1990s, Clinton administration officials, including Larry Summers, who would later serve as Obama’s chief economic advisor, had lobbied Congress to keep financial derivatives unregulated. Meanwhile, Clinton himself signed off on repealing the 1933 law that had prohibited commercial banks from engaging in the riskier field of investment banking. After the financial crisis, Congress and Obama did not close the circle.

First, the Dodd-Frank law allows commercial banks to continue their activities in investment banking; hence the $6.2 billion “London Whale” trading loss at JPMorgan, with Jamie Dimon holding onto both the CEO post and the chairmanship of the board tasked with holding the CEO accountable. Second, the law continues to allow rating agencies to be compensated by the parties issuing the securities to be rated. Third, the law allows public accounting firms to continue to be paid by the companies they audit. In other words, Congress and the U.S. President looked the other way concerning at least two major institutional conflicts of interest. The system itself is thus still vulnerable, sort of like a bike tire with a weak spot that could cause a flat (my back tire has such a spot, which feels like a bump when I’m riding fast).

Besides the failure of Congress and the White House to enact the systemic reform requisite to obviating another such financial crisis, the media has continued to ignore the two conflicts of interest mentioned above. Moreover, as demonstrated above in the reporting by the Huffington Post, the media can be more interested in “selling papers” or scoring political points than in investigating and reporting. For example, the media was nearly silent on Paul Volcker’s recommendation that the $1-trillion-plus banks be broken up. It is perhaps no accident that the media companies are intertwined with the financial sector in particular and the business world in general. That members of Congress and the U.S. President are too (Goldman Sachs being among the largest sources of Obama’s campaign contributions in ’08), and are all too willing to do Wall Street’s bidding in exchange for campaign contributions, essentially closes the loop against systemic reform being viable. The problem is that the system itself must be changed for there to be any chance of cutting off another financial crisis like that of 2007-2009. Ethically, here too the verdict is not good.

Whether for an editor or an elected representative, forsaking social responsibility or the public interest, respectively, for private gain is unethical on utilitarian grounds. That is, “the greatest good for the greatest number” is violated.[6] It is also violated by a CPA firm or rating agency that puts its own profit above acting in the public interest. That the public relies on audits and ratings makes the egoism that much more unethical.

In short, the stakeholders involved in the financial crisis have continued to be overly concerned with their own interests and welfare at the expense of the public good. Hence, the damaged system goes on as if it were whole rather than warped. Hence the question: who will look out for the system itself? In this regard, who has the incentive to do so? If the answer is “nobody,” then it might be asked whether incremental fixes to the parts are enough for the system’s design itself to be corrected.


1. Tyler Atkinson, David Luttrell, and Harvey Rosenblum, “How Bad Was It? The Costs and Consequences of the 2007-09 Financial Crisis,” Staff Paper No. 20, Federal Reserve Bank of Dallas, July 2013.
2. Ibid.
3. Mark Gongloff, “The Financial Crisis Cost More Than $14 Trillion: Dallas Fed Study,” The Huffington Post, July 30, 2013.
4. Lauren Kyger and Alison Fitzgerald, “Former Bear Stearns Executives Seemingly Unscathed by Financial Crisis They Helped Trigger,” The Huffington Post, July 31, 2013. The article was originally published by the Center for Public Integrity.
5. Ibid.
6. John Stuart Mill and Jeremy Bentham, two early modern philosophers, provide the basis of this ethical theory.