Some thoughts on Globalization

Preamble

The concept of globalization is deceptively simple. Like most academic concepts, there is no monolithic definition of globalization. Concisely put, globalization is the extension of markets beyond the bounds of a single country. Broadly speaking, it can be considered the expansion of linkages across geo-political boundaries, leading to a re-organization of social and economic life on an unprecedented scale and, by extension, to a global consciousness. Even this broad definition, it must be confessed, is too parsimonious to capture the true scope of the term. While globalization is generally construed in a positive light owing to the perceived benefits of increased communication, free trade, economic growth, and greater trans-national exchange of people, ideas, culture, and technology, empirical claims suggest its pros may have been overstated (Kiser and Laing, 2001). Strictly from an economic perspective, the widening wealth gap, the contagious nature of inter-dependent financial markets brought about by globalized economic integration, over-dependence on debt to sustain the economic pyramid, and the irrational expansion of markets have garnered the attention not only of economists but of pundits from all disciplines; so much so that many socialists consider the term globalization a pejorative (Gupta, 2006), with some believing it a precursor to revolution (Tally, 2013). In the course of this paper, the economic pros, cons, and dichotomies of this phenomenon will be addressed, and potential areas for future research will be adumbrated.

 What does globalization entail?

While admitting upfront that the ambit of this paper is the economic aspects of globalization, it would be remiss to overlook globalization's multi-disciplinary tentacles. As such, the definitions propounded by experts are rich in the diversity of their disciplinary spectrum. For example, McMichael (2000), from his economics vantage point, considers globalization a project that pursues market rule on a global scale. Thomas L. Friedman offered an influential view, calling it the inexorable integration of markets, nation-states, and technologies that enables reaching the world farther, faster, deeper, and cheaper (Satyavrata, 2004). The most eloquent and holistic definition, in this writer's survey, lies in Mittelman's (2000) understanding of globalization:

“As experienced from below, the dominant form of globalization means a historical transformation: in the economy, of livelihoods and modes of existence; in politics, a loss in the degree of control exercised locally . . . . and in culture, a devaluation of a collectivity’s achievements . . . . Globalization is emerging as a political response to the expansion of market power . . . . [It] is a domain of knowledge.”

In light of the definitions presented above, the practical scope of globalization is fairly clear.

Debate on Globalization

The controversy surrounding globalization is not new. The question of whether its pros outweigh its cons has been debated by experts from many disciplines. While many economists are concerned with the empirical challenges of quantifying the performance of globalization, experts in other fields offer alternative viewpoints for appraising it. For example, Cooper (2001) analyzes globalization from a historical perspective, given Africa's colonial and slavery-ridden past. A study sanctioned by the WHO investigates how globalization affects the healthcare of global citizens (Dollar, 2001). From a risk and management angle, Schwartz and Gibbs (1999) conduct a unique study chronicling risky and irresponsible behavior of trans-national corporations impelled by the age of globalization. Without digging deeper, it is safe to say the debate surrounding the efficacy and justification of globalization is old, extensive, and far from settled. In the paragraphs that follow, some backdrop will be provided for this debate as a prelude to the writer's subsequent economic analyses.

Globalization generally has a positive impact on a nation specializing in a particular good or service, provided that country is able to find a consenting partner for exchange; i.e., to generate trade. The torrent of regional and global economic integration, coupled with free trade campaigns, has resulted in an unstoppable push for economic globalization. While many experts laud the trend (Irwin, 2015; Behrens and Murata, 2012), citing the abundance brought forth by modernization, growing wealth, and rising living standards, a sizeable number of nations have not forgotten the ills of globalization. The evidence ranges from the colossal inequality and underdevelopment of formerly colonized countries of Africa and Asia to present-day American imperialism (Kwame-Sundaram, 2014), modern financial aggression (Borio, 2014), and a proliferating culture of profit-maximization (Young and Makhija, 2014; Gandolfo, 2014; Dolgui, Kovalev, and Pesch, 2015).

Longitudinal and cross-sectional studies have shown ameliorated living conditions, based on broad indicators of well-being, across the decades contemporaneous with globalization (Zhang and Herring, 2012; Shaikh, 2007; Yusuf, Evenett, and Wu, 2001). While this rings true for most countries, the biggest beneficiaries have been advanced economies; in fact, most developing economies significantly lag behind (Askari, Iqbal, and Mirakhor, 2010).

So far, the main spur behind globalization is international trade. The exponential rise in trade is linked with higher economic activity, and thereby growth. This is not universal, however. For instance, at a time when yearly growth rates in ASEAN countries ranged between 6% and 8%, most African countries scored below 0.5% (Urata, 2002). This growth is also touted to have created more jobs and thus to have reduced unemployment. A closer look, however, shows that despite creating more new jobs overall, the effect has not been felt across the board. Indeed, many economies witnessed massive lay-offs as firms, facing stiffer international competition, cut costs to remain competitive, boost efficiency, and simply improve profit (Margalit, 2011; Tripathi, 2014). Case in point: China in recent years has been grappling with urban unemployment, which is beginning to affect rural economic clusters too (Zhang and Rasiah, 2015).

Thus, it is understandable why the debate over globalization's virtues rages on. In the following sections, the good, the bad, and the dichotomy of economic well-being that globalization has brought to nations will be presented.

Globalization: The Good

The positive economic impacts of globalization can be distilled into several distinct aspects. First, despite what detractors of globalization claim, the economic growth and development of most nations in the 20th century was astounding (Crafts and O'Rourke, 2013). This accelerated ascension to economic well-being is exemplified by the Asian Tiger economies of the 1990s, who compressed the West's 200 years of economic growth into just a few decades (Rodrik, 2012). Secondly, technology has further catalyzed this phenomenon. Case in point: the Norwegian telecom giant Telenor, while justifying its foray into the South Asian market, noted the accelerating effect of globalization in helping poor nations like Bangladesh and Pakistan bypass a full land-line telephone network and embrace GSM/WCDMA mobile networks right away. Furthermore, the dissemination of knowledge brought about by globalization led to increased awareness and the transfer of technology and ideas, all of which in turn accelerated both globalization and its capacity to generate wealth (McMillan and Rodrik, 2011). So, in a way, globalization has achieved a self-perpetuating growth phenomenon.

One of the most vaunted products of globalization has been the explosion of free trade. In theory, it reduces barriers like import tariffs, VAT, government subsidies, and other non-tariff barriers (Costinot and Rodriguez-Clare, 2013). Enhanced global trade leads to economic growth in participant economies and the creation of jobs. It also makes businesses more competitive and offers more variety and options for consumers, who also benefit from lower prices. In addition, countries reliant on semi-skilled and menial labor, which typically are also economic laggards (Jensen and Lindstadt, 2013), benefit from an influx of foreign direct investment and transfer of technology, and stand to gain from economic and technical development, a higher standard of living, and lower unemployment, all of which are linked to fostering a democracy-friendly environment; this is supposed to make a country socially and politically more stable.

Since most businesses now have a global market to tap, the potential to grow is theoretically unlimited. Cultural intermingling facilitated by technology means two economies which historically had nothing in common and/or are geographically distant can now benefit from mutual trade and the exchange of information and ideas (Narula, 2014). This has also led to a higher level of tolerance and mutual understanding among heterogeneous peoples. Labor and capital also have the option of finding the most rewarding destinations. The proliferation of multinational companies enabled the installation of plants, highways, and infrastructure facilities in poorer countries, resulting in employment and basic benefits for their citizens and helping alleviate poverty (Rodrik, 2012). And lastly, as the recent TPPA reminds us, globalization has led to the forging of many regional alliances and economic integration (Kaplinsky, 2013).

Globalization: The Bad

The greatest bane of globalization has been the creation of wealth disparity. Simply put, the rich got richer, and the non-rich poorer. During the most prolific period of international trade (1960 to 2000), income inequality grew heavily between nations as well as within them. UNDP reports suggest that 86% of the world's resources are hogged by the richest 20%, and the remaining 14% is accessible to the poorest 80% (von Braun, 2012; Van de Vliert and Postmes, 2014).

Despite theoretical slogans of free trade with no barriers, in practice nearly 170 countries still impose VAT on imports (Leviner, 2014). In many European countries, taxes of nearly 25% are imposed (De-hu, 2012). For the rich countries, the loss of jobs is a big economic as well as social concern. As capital and labor seek the most economically efficient destination (usually poor economies), jobs are lost in the developed nations (Margalit, 2012; Geishecker, Riedl, and Frijters, 2012). China's emergence as the export superpower has contributed to de-industrialization and the loss of many millions of manufacturing jobs in the USA and Europe. It has also resulted in a widening trade deficit.

The loss of jobs in developed nations is aggravated by the derivative trend of employers demanding pay cuts from employees, who would otherwise lose their jobs for non-compliance (Margalit, 2012). This fosters a culture of fear for many blue-collar and white-collar workers.

The lack of uniformity in tax laws enables transnational enterprises to exploit tax havens (Leviner, 2014) like the Cayman Islands, Bermuda, etc. Spurred by an obsessive profit motive, these companies also bear the blame for imposing unjust and unfair working conditions, indentured slavery, exploitation of poor workers and economic migrants, child labor, unsafe working conditions, environmental mismanagement, and permanent ecological damage to many natural resources (von Braun, 2012). Although not strictly economic effects per se, these losses are staggering when translated into monetary units.

The instability witnessed in financial markets worldwide comes with massive socio-economic and human resource costs (Asongu, 2012; Bekaert, Ehrmann, and Fratzscher, 2014). There is also a mismatch between today's economic and financial system and the historic institutions originally designed to govern it (Chai, Liu, Zhang, and Xu, 2013). The propensity of funds to flee one market at the earliest sign of discomfort has had a cascade effect on capital flight (Briere, Chapelle, and Szafarz, 2012), increasing the volatility of equity and derivatives markets. The currency crises of 1997 and the euro crisis of the 2010s were precipitated by the inter-connectedness brought about by globalized economic integration (Beck, Claessens, and Schmukler, 2013).

Globalization: The Dichotomy

As globalization envelops the world, the economic well-being and social welfare of economies become inextricably intertwined. Such interdependence evolved over the course of the 20th century, culminating in today's impasse, where the richer nations are rich with the caveat that their richness depends on the performance of poorer countries. The number of people living in indigence today is alarming, and it is growing. The irony is that this growth of poverty is concomitant with an unprecedented human capacity to generate wealth. As cited by Frenk and Moon (2013), the Commission on Global Governance succinctly prophesied in 1996 that "a sophisticated, globalized, increasingly affluent world currently co-exists with a marginalized global underclass." Lower travel and transportation costs have spurred a wave of economic migrants seeking better fortunes for themselves and their families, leading to a migrant crisis in many countries. The result has been unsatisfactory for all: a lose-lose situation, as global governance has failed to cope with the newly rising challenges of migratory crises.

Another interesting aspect of globalization is, as Roine and Waldenstrom (2008) find, that the rich countries are now beginning to get poorer together. A figure generated from their study is attached on the next page to demonstrate the trend.

The cultural aspect of globalization has economic implications as well. The greater communication induced by globalization allows for a richer exchange of ideas and a broader vision. This interchange of ideas and culture allows one nation to shape its products and services to fill a void in another, thus benefiting both nations. The poorer economies, having weaker technological infrastructure, naturally lag in this regard.

Globalization: A Tale of Two Hemispheres

Most topical in recent years, especially around the landmark TPPA signing, has been the disproportionate distribution of income. While some studies find that the rising volume of trade between Northern and Southern nations can be linked to a reduction in income disparity among skilled workers (Sheppard, 2012), income inequality within the Northern countries rose. Manufactured exports of Southern origin increased the demand for, and wages of, low-skill jobs in the South, whereas in the North the highest-paid jobs are usually in the service sector; as such, Northern workers of semi-skilled or menial persuasion faced lay-offs (Glenn, 2012).

Topics Ripe for Globalization Research 

From an economic research perspective, the jury is still out on whether the world itself, or at least most of it, is better off due to globalization. The disagreement among scholars stems from conflicting interpretations of globalization's various economic impacts, the irregularity of empirical evidence, the uneven nature of economic costs and gains across countries with very few conspicuous trends emerging, the difficulty of quantifying trans-generational effects, the politicized nature of a nation's capacity to maneuver economic levers, geo-political vulnerability, etc. All of these factors are notoriously difficult to quantify in monetary units. In spite of such challenges, extensive studies have attempted to settle the question of whether globalization has paid off.

Most literature suggests that wealth inequality has worsened along multiple dimensions since the 1980s in both rich and non-rich economies. This phenomenon has been pitted against parameters of globalization and global trade in many empirical studies. Most findings report a positive link between higher international trade and globalization indices and income inequality, some within economies, some among them. Atkinson, Piketty, and Saez (2011) categorize such changes in a tabulated database, examining them through economic factors (capital gains, incomes, labor wages, taxation systems, financial crises, macro-economic variables), political changes, and global social themes.

Recent works have recognized considerable heterogeneity in various performance measures across firms within narrowly defined industries in both richer and non-rich economies (Tybout, 2003). This has significant consequences for firms' participation in global markets. Because exporting involves fixed costs, it is the more productive firms that choose to export and expand beyond their traditional local markets, while less proficient firms shrink their business operations (Mayer, Melitz, and Ottaviano, 2011). The former phenomenon also contributes to the upgrading of product lines and quality through technological research and innovation, as the productive firm actively seeks out newer export frontiers (Kugler and Verhoogen, 2008; Bustos, 2011).

The pervasive merchandise trade between rich and poor countries has led to studies investigating skill premiums, which create wage disparities and job losses in certain countries. Some studies examined this to verify the existence of the Stolper-Samuelson effect and found that the most exploited countries demonstrated elements of this workhorse sub-model of trade, part of the Heckscher-Ohlin framework (Goldberg and Pavcnik, 2007; Hanson, 2007; Harrison, 2006). Interestingly, the majority of such studies stem from American and European narratives. It is hardly a secret that most of those who bear the brunt of globalization's ills are in the Eastern nations, particularly Africa and Asia. Thus a huge opportunity exists in investigating whether the Stolper-Samuelson effect applies in intra-Asian, intra-African, or trans-Afro-Asian trade.
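For reference, and not drawn from the studies cited above, the effect in question can be stated compactly in the standard two-good, two-factor Heckscher-Ohlin setting, where hats denote proportional changes, good 1 is labor-intensive, $w$ is the wage, and $r$ the return to capital:

$$\hat{p}_1 > \hat{p}_2 \;\Longrightarrow\; \hat{w} > \hat{p}_1 > \hat{p}_2 > \hat{r}.$$

That is, a rise in the relative price of the labor-intensive good raises the real wage and lowers the real return to capital (the magnification effect), which is the channel the skill-premium studies above test empirically.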

Some areas still not heavily explored include factors that have a considerable bearing on the quantification of costs and benefits, such as the scope of gains from trade, free mobility of capital, income disparity, and unemployment patterns. In particular, interpersonal and inter-generational comparisons of economic pros and cons are worthy of scrutiny, especially considering the implications for eradicating dynastic poverty in many marginalized communities of the poorest countries. Other interesting areas of research surrounding globalization include the premonition of Madison (1998), who believes globalization has run its course and is now in a transitional stage; as such, the economic system is evolving into a new form of capitalism, one distinct from both 19th-century laissez-faire capitalism and 20th-century 'managed' capitalism.

Final Words

The reality of globalization is undeniable. Despite its drawbacks, arguments for reversing it are as inane as trying to put toothpaste back in the tube, or arguing for a return to the typewriter because computers have drawbacks. In the economic history of mankind, it was an inevitable period. However, as many studies cited in this paper have shown, globalization in its current form is simply untenable. Its economic defects and the resultant widespread income inequality, financial contagion, exploitation of resources, etc. render it unsustainable for lack of a self-correcting disciplinary mechanism. The proliferation of technology, transportation, communication, and cultural amalgamation has only magnified its effects, good or bad. The rising income inequality between high-income and low-income economies is a cause for great concern. Additionally, the sheer number of people living below the poverty line worldwide is on the rise, despite a reduction in percentage terms. This divergence is worrisome too.

Besides, the failure of many developing economies to benefit from the fruits of globalization is not entirely their fault. The challenges of integration for a low-income economy are substantial, and they do not arise merely from its chosen policies; many factors are beyond these countries' control. Although some experts believe that, despite the current fervor, globalization is not irreversible, it is safe to assume globalization is here to stay, at least for a few more decades. Accepting this reality, the international community ought to aid the poorest economies by strengthening the global financial system through trade, aid, and an inclusive approach. The end result will be stimulated growth and, hopefully, the eradication of poverty. Thus the benefits of globalization can be actualized truly globally.

 Bibliography

Askari, H., Iqbal, Z., & Mirakhor, A. (2010). Globalization and Islamic finance: Convergence, prospects and challenges (Vol. 778). John Wiley & Sons.

Asongu, S. A. (2012). Globalization, financial crisis and contagion: time-dynamic evidence from financial markets of developing countries. Journal of Advanced Studies in Finance (JASF), 2(III), 131-139.

Atkinson, A. B., Piketty, T., & Saez, E. (2011). Top incomes over a century or more. Journal of Economic Literature, 49, 3-71.

Beck, T., Claessens, S., & Schmukler, S. L. (2013). Financial globalization and crises: overview. The Evidence and Impact of Financial Globalization, Elsevier Inc, 1-12.

Behrens, K., & Murata, Y. (2012). Globalization and individual gains from trade. Journal of Monetary Economics, 59(8), 703-720.

Bekaert, G., Ehrmann, M., Fratzscher, M., & Mehl, A. (2014). The global crisis and equity market contagion. The Journal of Finance, 69(6), 2597-2649.

Borio, C. E. (2014). The international monetary and financial system: Its Achilles heel and what to do about it.

Brière, M., Chapelle, A., & Szafarz, A. (2012). No contagion, only globalization and flight to quality. Journal of International Money and Finance, 31(6), 1729-1744.

Bustos, P. (2011). Trade liberalization, exports, and technology upgrading: Evidence on the impact of MERCOSUR on Argentinian firms. The American Economic Review, 101(1), 304-340.

Chai, Y. X., Liu, D., Zhang, Z. B., & Xu, Y. L. (2013, September). Empirical Research on the Contagion Effect of Financial Crisis. In Advanced Materials Research (Vol. 740, pp. 364-367).

Cooper, F. (2001). What is the concept of globalization good for? An African historian’s perspective. African Affairs, 100(399), 189-213.

Costinot, A., & Rodriguez-Clare, A. (2013). Trade theory with numbers: Quantifying the consequences of globalization (No. w18896). National Bureau of Economic Research.

Crafts, N., & O’Rourke, K. (2013). Twentieth century growth.

De-hu, W. (2012). On the Future Positioning of VAT in China’s Tax System. Journal of Nanjing University (Philosophy, Humanities and Social Sciences), 5, 004.

Dolgui, A., Kovalev, S., & Pesch, E. (2015). Approximate solution of a profit maximization constrained virtual business planning problem. Omega.

Dollar, D. (2001). Is globalization good for your health? Bulletin of the World Health Organization, 79(9), 827-833.

Dunning, J. H. (2002). Global Capitalism, FDI and competitiveness (Vol. 2). Edward Elgar Publishing.

Frenk, J., & Moon, S. (2013). Governance challenges in global health. New England Journal of Medicine, 368(10), 936-942.

Geishecker, I., Riedl, M., & Frijters, P. (2012). Offshoring and job loss fears: An econometric analysis of individual perceptions. Labour Economics, 19(5), 738-747.

Glenn, J. (2012). Globalization: north-south perspectives. Routledge.

Goldberg, P. K., & Pavcnik, N. (2007). Distributional effects of globalization in developing countries (No. w12885). National bureau of economic research.

Gupta, S. D. (2006). The ideological challenge of globalization before the Indian Left and its relevance for the Third World. Socialist Perspective, 34, 55.

Hanson, G. H. (2007). Globalization, labor income, and poverty in Mexico. In Globalization and poverty (pp. 417-456). University of Chicago Press.

Harrison, A. (2006). Globalization and poverty (No. w12347). National Bureau of Economic Research.

Irwin, D. A. (2015). Free trade under fire. Princeton University Press.

Jensen, N. M., & Lindstädt, R. (2013). Globalization with whom: context-dependent foreign direct investment preferences. Working Paper.

Kaplinsky, R. (2013). Globalization, poverty and inequality: Between a rock and a hard place. John Wiley & Sons.

Kiser, E., & Laing, A. M. (2001). Have we overestimated the effects of neoliberalism and globalization? Some speculations on the anomalous stability of taxes on business. The rise of neoliberalism and institutional analysis, 51-68.

Kugler, M., & Verhoogen, E. A. (2008). Product quality at the plant level: plant size, exports, output prices and input prices in Colombia.

Kwame Sundaram, J. (2014). Globalization, imperialism and its discontents. Inter-Asia Cultural Studies, 15(1), 17-24.

Leviner, S. (2014). The Intricacies of Tax & Globalization. Columbia Journal of Tax Law, 5(2).

Madison, G. B. (1998). Globalization: Challenges and opportunities. na.

Margalit, Y. (2011). Costly jobs: Trade-related layoffs, government compensation, and voting in US elections. American Political Science Review, 105(1), 166-188.

Mayer, T., Melitz, M. J., & Ottaviano, G. I. (2011). Market size, competition, and the product mix of exporters (No. w16959). National Bureau of Economic Research.

McMichael, P. (2000). World-systems analysis, globalization, and incorporated comparison. Journal of World-Systems Research, 6(3), 68-99.

McMillan, M. S., & Rodrik, D. (2011). Globalization, structural change and productivity growth (No. w17143). National Bureau of Economic Research.

Mittelman, J. H. (2000). Globalization: captors and captive. Third World Quarterly, 21(6), 917-929.

Narula, R. (2014). Globalization and technology: Interdependence, innovation systems and industrial policy. John Wiley & Sons.

Rodrik, D. (2012). Global Poverty amid Global Plenty: Getting Globalization Right. Americas Quarterly, 40-45.

Roine, J., & Waldenström, D. (2008). The evolution of top incomes in an egalitarian society: Sweden, 1903–2004. Journal of Public Economics, 92(1), 366-387.

Satyavrata, I. (2004). ‘Glocalization’ and Leadership Development for Transforming Mission in India. Transformation, 211-217.

Shaikh, A. (2007). Globalization and the myth of free trade. In Globalization and the Myths of Free Trade: History, Theory and Empirical Evidence, 50.

Sheppard, E. (2012). Trade, globalization and uneven development: Entanglements of geographical political economy. Progress in Human Geography, 36(1), 44-71.

Smart, T. (1998). Globalization: a dream or a deadly fantasy? International Herald Tribune, 10th November.

Tally Jr, R. T. (2013). Utopia in the Age of Globalization: Space, Representation, and the World-system. Palgrave Macmillan.

Tripathi, A. (2014). Globalization and Downsizing in India. International Journal of Multidisciplinary and Current Research, 2, 932-939.

Tybout, J. R. (2003). Plant- and firm-level evidence on “new” trade theories. Handbook of International Trade, 1, 388-415.

Urata, S. (2002). Globalization and the growth in free trade agreements. Asia Pacific Review, 9(1), 20-32.

Van de Vliert, E., & Postmes, T. (2014). Democracy Does Not Promote Well-Being Except in Rich Countries With Demanding Climates. Journal of Cross-Cultural Psychology, 45(8), 1179-1195.

von Braun, J. (2012). Globalization, poverty, and food.

Young, S. L., & Makhija, M. V. (2014). Firms’ corporate social responsibility behavior: An integration of institutional and profit maximization approaches. Journal of International Business Studies, 45(6), 670-698.

Yusuf, S., Evenett, S. J., & Wu, W. (Eds.). (2001). Facets of globalization: international and local dimensions of development (Vol. 415). World Bank Publications.

Zhang, G., & Herring, S. C. (2012). Globalization or Localization? A longitudinal study of successful American and Chinese online store websites.

 

Review Remarks on an Empirical Paper Investigating Budget Forecast using AR-MIDAS

Mixed Data Sampling (MIDAS) based econometric techniques have grown in popularity in recent times owing to their superior performance in extracting informational content from data sampled at heterogeneous frequencies. It thus comes as no surprise that regulatory bodies are showing greater interest in forecasting with MIDAS instead of the orthodox approach, in which all time series must be aggregated to the lowest common frequency. I therefore commend Damane’s (2018) ongoing endeavor to test the predictive superiority of MIDAS in forecasting Lesotho’s budget. To improve the manuscript further, I proffer the following remarks.
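To illustrate the distinction for readers unfamiliar with the approach, a minimal sketch of an unrestricted MIDAS-style (U-MIDAS) regression is given below. The data are synthetic and the variable names are my own, purely for illustration; this is not the author’s code or model.

```python
# Illustrative sketch only (synthetic data, hypothetical variable names):
# an unrestricted MIDAS (U-MIDAS) regression of a quarterly target on the
# individual monthly lags of an indicator, versus the "orthodox" approach
# of aggregating the indicator to quarterly frequency first.
import numpy as np

rng = np.random.default_rng(0)
n_q = 60                                  # quarterly observations of the target
x_m = rng.normal(size=3 * n_q)            # monthly indicator (3 months per quarter)
y_q = (
    np.array([0.5 * x_m[3 * t:3 * t + 3].mean() for t in range(n_q)])
    + 0.1 * rng.normal(size=n_q)          # toy quarterly target
)

# U-MIDAS: keep each quarter's three monthly observations as separate regressors.
X_umidas = np.column_stack(
    [np.ones(n_q)] + [x_m[3 * np.arange(n_q) + j] for j in range(3)]
)
beta_umidas, *_ = np.linalg.lstsq(X_umidas, y_q, rcond=None)

# Orthodox approach: collapse the monthly indicator to a quarterly average first.
x_q = x_m.reshape(n_q, 3).mean(axis=1)
X_agg = np.column_stack([np.ones(n_q), x_q])
beta_agg, *_ = np.linalg.lstsq(X_agg, y_q, rcond=None)

print("U-MIDAS coefficients (one per monthly lag):", np.round(beta_umidas, 3))
print("Aggregated-regressor coefficients:", np.round(beta_agg, 3))
```

In practice, restricted MIDAS specifications impose a lag polynomial (e.g., exponential Almon weights) on the monthly coefficients, and an AR-MIDAS variant adds lags of the dependent variable; the sketch above only conveys the mixed-frequency idea.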

Capitalization
All currency names (Loti/Rand/Dollar) and Brent should be capitalized.

 

This approach will not only assist to give an early warning signal of risks to budgetary executions but it will also help refine fiscal surveillance and planning by narrowing forecasting errors during the government budget process and also decrease uncertainty around government revenue and expenditure.

 

This statement on Page 3 can be buttressed by the broad empirical finding of the aforementioned papers that intra-annual data, as it becomes available in real time, improves current-year forecasts relative to the traditional approach.

Figure 1
This is purely a cosmetic issue, but I feel the image (Page 4) should be reworked rather than reproduced as in Eptisa (2018). It is unclear whether the image depicts a time series of events, and the significance of the arrows is hard to comprehend. Besides, streamlining the fonts and graphics to suit the Working Paper’s official layout will make it more attractive and credible.

 

 In order to appreciate the importance of the relationship between forecasting and the government budgetary process, it is necessary to divide the budget period into three distinct periods. 

 

This is a momentous assertion on Page 8, based on Williams and Calabrese’s (2016) preferences. I feel their paper should be cited here rather than at the end of the paragraph. On a related side-note, I feel the author cites this paper too generously. It may be better to trim some of the Williams and Calabrese (2016) citations lest the reader get the erroneous impression that the current paper piggybacks on their work.

Midyear forecasts are therefore ideal for purposes of cash management or the choice among investing, holding, cashing out, and borrowing to pay for current expenditure.   

 

Could examples be provided of central banks that are exponents of midyear forecasts?

  • Footnote 11 has a slight grammatical error.

For example, conservative revenue forecasting or revenue underestimation (pessimism) may be sort for its advantage of providing a rational hedge against future revenue uncertainty.

 

I do not quite understand this sentence.

Contrary to this, the popularity maximization assumption dictates that political actors can be tempted to overestimate revenue forecasts (optimism) especially during election cycles in order to use non-binding financial planning as a marketing tool to depict a bright fiscal future and in turn generate votes.

 

This is an astute observation and an intuitively correct statement. I do feel, however, that the author may want to cite the following papers in its support.

  1. Jeffrey Frankel; Over-optimism in forecasts by official budget agencies and its implications, Oxford Review of Economic Policy, Volume 27, Issue 4, 1 December 2011, Pages 536–562, https://doi.org/10.1093/oxrep/grr025
  2. Brück, T. and Stephan, A. (2006), Do Eurozone Countries Cheat with their Budget Deficit Forecasts?. Kyklos, 59: 3-15. doi:10.1111/j.1467-6435.2006.00317.x
  • Sabaj and Kahveci’s (2018) MIDAS actually outperformed the official forecasts. This is a major selling point of their paper and yours. It is worth highlighting.
  • Can a table be designed summarizing the major findings of these papers?
  • While the empirical approaches are well-chronicled and their findings adequately described, any kind of critique is noticeably absent. As such, to avoid making the review look like a laundry list of papers, I suggest adopting a theme-based approach, organized geographically, chronologically, by performance, or by any other criterion deemed suitable.

STL is able to work on any frequency of data. It can also be calculated on time series with irregular patterns and missing values (EViews, 2015).

 

The claim in footnote 26 that STL is able to handle missing values and irregular patterns is attributed to EViews. Could an academic or professional reference be provided here?
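For concreteness, the following is a minimal sketch (synthetic data, not from the manuscript) of an STL decomposition of a monthly series using Python’s statsmodels. Note that, to my knowledge, this particular implementation expects a regular and complete series, which is one more reason an authoritative reference for the missing-value claim would be valuable.

```python
# Illustrative sketch only: STL decomposition of a synthetic monthly series
# into trend, seasonal, and remainder components.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(1)
idx = pd.date_range("2010-01-01", periods=96, freq="MS")   # 8 years of monthly data
y = pd.Series(
    0.05 * np.arange(96)                                    # slow upward trend
    + 2.0 * np.sin(2 * np.pi * np.arange(96) / 12)          # annual seasonality
    + rng.normal(scale=0.5, size=96),                       # irregular component
    index=idx,
)

result = STL(y, period=12, robust=True).fit()               # robust loess downweights outliers
print(result.trend.tail(3))
print(result.seasonal.tail(3))
print(result.resid.tail(3))
```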

Tense
There are some minor inconsistencies in the overall tense of the paper. They are particularly noticeable in the sections where the author describes adopting Sabaj and Kahveci’s (2018) measures of forecasting quality and accuracy, and Rufino’s (2018) preference for RMSE.
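As an aside for readers, the accuracy measures referred to above reduce to simple loss functions computed over a holdout window. A toy illustration (made-up numbers and hypothetical series names, not the paper’s data) follows:

```python
# Toy illustration only: comparing two sets of forecasts against realized
# values using RMSE and MAE, the kind of accuracy measures discussed above.
import numpy as np

actual        = np.array([10.2, 11.0, 9.8, 10.5])    # realized outturns
midas_fcst    = np.array([10.0, 11.3, 9.9, 10.4])    # e.g., MIDAS-based forecasts
official_fcst = np.array([9.5, 11.8, 10.6, 10.0])    # e.g., benchmark/official forecasts

def rmse(y, f):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((y - f) ** 2)))

def mae(y, f):
    """Mean absolute error."""
    return float(np.mean(np.abs(y - f)))

for name, f in [("MIDAS", midas_fcst), ("official", official_fcst)]:
    print(f"{name}: RMSE = {rmse(actual, f):.3f}, MAE = {mae(actual, f):.3f}")
```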

 

 

 

 

 

Grounded Theory

Background of Grounded Theory 

The roots of grounded theory are in sociology, originating from theoretical experimentation with symbolic interactionism[1] (Charmaz, 2014; Chamberlain-Salaun, Mills, and Usher, 2013). Symbolic interactionism portrays social processes as having concrete structures, containing implicit or explicit codes of conduct and procedures that demarcate how interactions reveal and cast meaning. Developing from this premise, grounded theory slowly emerged as an attempt to erect explanatory theories of simple, basic, and elementary social processes by studying them in their natural environments (Glaser and Strauss, 1967, 2009; Glaser, 1978; Zarif, 2012). Strauss and Corbin, among its earliest proponents, describe grounded theory as a collection of six Cs: causes, contexts, contingencies, consequences, co-variances, and conditions; these enable a researcher to understand the patterns and relationships among those elements (Boychuk Duchscher and Morgan, 2004; Klag and Langley, 2013). For grounded theorists, knowledge of social realities is achieved through watchful observation of behavior and speech practices.

Advantages of Grounded Theory

Personally, I find grounded theory to be charmingly simple, intuitively attractive, and conducive to creativity. It is also conceptually strong, and I particularly enjoy its highly structured approach to concomitant data collection and analysis, which allows rich and deep data to evolve into a theory. Before delving into the detailed pros and cons of grounded theory, however, a brief historical consideration is warranted. Within the qualitative framework, grounded theory first appeared as a protest against researchers’ passive acceptance of the “great theories” popular at the time, and against the certitude that researchers need only test those theories through quantitative scientific procedures. The earliest grounded theorists endeavored to counter this inaction by subjecting research data to rigorous analysis with a view to developing theoretical analysis (Charmaz, 2003, 2011). Thus grounded theory can be understood as a way of thinking about data in order to conceptualize it (El Hussein, 2015). To accomplish this, data is continuously interrogated until a theory surfaces (Bryant and Charmaz, 2007). With that said, let us now explore the merits of grounded theory as a method of inquiry in greater detail.

Pro: Inductive Simplicity

Unlike the hypothetico-deductive approach of the dominant positivist culture of three to four decades ago, grounded theorists extol the potency of inductive reasoning (Charmaz, 2014; Hussein and Hirst, 2016). Though grounded theorists share this affinity for inductive logic with their qualitative colleagues, they posit that researchers should not begin with a hypothesis or theory and then test whether it holds; rather, they should first collect data in the research setting’s native environment, analyze it simultaneously, and finally generate a hypothesis (Smith, 2015; Locke, 2015; Halcomb, 2016). Glaser calls this an enjoyable, meaningful, informative, and empowering venture and claims its appeal lies in its fitness and simplicity (Glaser and Strauss, 1998). Therefore, much like Nike’s slogan, a researcher can “just do it” (Glaser, 1998).

Pro: Intuitive Brilliance

Urquhart, Lehmann, and Myers (2010) point out that grounded theory is intuitively appealing to beginner researchers because it allows them to be engrossed deeply in the data, making the researcher continuously compare, code, and memo throughout the research term. This echoes Charmaz’s (2006) earlier remark that grounded theory equips neophyte researchers with the necessary principles and heuristic devices to start a research project, remain committed to it, and finish it on time. Charmaz further contrasts this with other approaches within the qualitative tradition that let researchers treat data however they want, which makes the research process haphazard. Morse, Stern, Corbin, Bowers, Clarke, and Charmaz (2009) further comment that, for the reasons stated by Charmaz (2006), grounded theory lets new researchers successfully answer their predefined questions, enlightens their thinking process, and reassures them when they hesitate during the actual research experience. Finally, the fact that grounded theory can be applied to almost all disciplines of study and is compatible with any type of data demonstrates its wide applicability (Glaser, 1992).

Pro: Creativity

A common complaint of grounded theorists about other strands of qualitative and quantitative research is the bias springing from a priori assumptions. Grounded theory sidesteps this problem by completely disregarding existing hypotheses at the beginning of research and instead utilizing empirical data to produce concepts. Pioneers of grounded theory, in particular Glaser (1978), encourage the neophyte researcher to shun preconceived theoretical notions so as to augment creativity and invest time in generating new ideas. Morse (2009) also encourages grounded theorists to tread along a process of discovery in which themes and interpretations logically surface from the data. Thus researchers can develop meaning from the data and analyze it concurrently via creative, inductive processes. Kriflik, Zanko, and Jones (2005) credit this sequence with producing more original findings from the data than grounded theory’s sister qualitative approaches.

Pro: Superior Conceptualization

As noted in the introductory section of this paper, the father of symbolic interactionism, Herbert Blumer (1969b), criticized the culture of imprecise conceptualization and blamed it for most of the scientific difficulties faced by researchers in a post-positivist era (Snow, 2001). Grounded theorists concurred (Strauss and Corbin, 1994; Bowen, 2006), pointing out that the prime obstacle to finding answers in the research process is badly defined concepts, which obstruct the generation of precise, agreeable, and realistic interpretations of empirical social (subjective) data. Blumer (1969a, 1969b) prescribed one remedy for this disease: simplification. Simplification of concepts occurs when the relevant is sieved from the irrelevant. Through abundant practice of constant comparison and frequent memo-ing, grounded theory fosters a persistent back-and-forth between data collection and data analysis (Charmaz, 2014; Creswell, 2015). This furnishes concepts with broadening power: they are easier to remember and applicable to a wide range of events. It also facilitates the transferability of those concepts to previously unexplored milieus. Grounded theorists today claim that one of the greatest achievements of grounded theory is Glaser and Strauss’s unremitting emphasis on concept generation and its legitimacy in mainstream methodological discourse (Timmermans and Tavory, 2012; Ruppel and Mey, 2015; Charmaz, 2014; Schwandt, 2015). Superior conceptualization is also a unique attribute that sets grounded theory apart from its qualitative sister methods.

Pro: Systematic Insight into Reality

Unlike many qualitative methods which rely on broad principles, making application and interpretation cumbrous, grounded theory benefits from a robust, structured, and systematic approach to data analysis. The earliest definitions of grounded theory from its pioneers[2] demonstrate the level of emphasis given to the “systematic” nature of both theory generation and the procedures that allow for inductive insight into a phenomenon. This improves the capacity of data analysis to judge, generalize, and compare the results of grounded theory studies. It also helps ensure rigor and trustworthiness in the emergent theory (Rolfe, 2006; Cooney, 2011; Starks and Trinidad, 2007). Stebbins (2001) and Denk, Kaufmann, and Carter (2012) point out that the systematic approach distinguishes accidental discovery from deliberate exploration congruent with the epistemological and ontological underpinnings of the researcher. The systematic approach is purposive, wide-ranging, and the product of forethought. The serendipity approach of accidental discovery, although still scientific, wastes time and resources because of its overt reliance on fortune, waiting for an Archimedes-like eureka moment. The grounded theorist, by contrast, places himself or herself actively on the path to discovery.

Pro: Deep and Rich Data

The adventure of research begins with finding data. This data helps dig out the context and structure of informants’ lives while revealing their feelings, opinions, perspectives, intentions, actions, etc. Thick descriptions are required to procure rich data. Geertz (1973), Charmaz and Mitchell (2001), as well as Corbin and Strauss (1990), recommend taking extensive field notes during observation, gathering narratives from interviews, and assembling respondents’ written personal accounts. Grounded theory makes use of all the techniques above to make sense of the data, but it does not stop there: it also polishes the data to produce insight into participants’ worlds. Grounded theory underscores data’s coexistence with context to expose what lies beneath the surface, making the world appear anew by extracting respondents’ social and subjective life (Charmaz, 2006). Extricating multiple views of participants’ gamut of actions empowers researchers to develop analytic categories, compare data, and distill new ideas (Holton, 2006). Grounded theory fosters this by goading the researcher to illumine the otherwise inaccessible views of informants’ lives. It stimulates researchers to return to the data and venture into analysis to decide whether more data is needed to enhance the emerging theoretical framework. This endows researchers with a renewed outlook and creates innovative categories and concepts.

Pro: Pro-Exploration

The grounded theory approach is ideal for exploration. Since it delivers a methodology for developing an understanding of social occurrences which are not pre-defined or pre-theoretically ossified by existing theories and paradigms, it offers a good platform for launching exploratory studies (Engward, 2013). This makes grounded theory suitable for examining social processes which have been neglected in the mainstream research milieu, where previous research is narrow or relatively scarce, or on topics where a fresh outlook promises deeper insight (Milliken, 2010). It also makes exploration easier because, by definition, it acknowledges the situated nature of knowledge and the contingent nature of practice, and it adapts easily to diverse phenomena.

Disadvantages of Grounded Theory

Although I am required to describe some of the weaknesses of grounded theory for the purposes of this assignment, I should disclose that I am indeed an advocate of grounded theory. Therefore, while acknowledging its cons, I should point out that none of its demerits is fatal. They are limitations at best, which can be overcome with careful attention to detail, rigorous hard work, an improved skill set, and experience. Some of grounded theory’s shortcomings are as follows.

Con: Exhaustive and Time-Consuming

Grounded theory usually produces huge amounts of data, which can be daunting to manage. Without proper training and the requisite skills, using grounded theory can be a waste of time, and new researchers can become overwhelmed by its exhaustive coding requirements (Myers, 2013). Open coding requires not only a great deal of time but can also be physically and mentally taxing (Charmaz, 2006; Walker and Myrick, 2006). The abstraction required for conceptualization is quite demanding too. It is possible for novice researchers to be so immersed in coding that they neglect the job of discovering the ideas and themes from which theory can emerge out of subjective social data. Moreover, grounded theory in the hands of inexperienced researchers usually leads to the generation of lower-level theories. To overcome this, Annells (1996) recommends that researchers be patient and recognize that grounded theory is not “so simple.” It can take months, if not years, of working through the data to generate a theory around core themes. Annells (1997) also recommends undertaking grounded theory under the wing of an experienced mentor to share the burden of the journey.

Con: Methodological Pitfalls

Annells (1996, 1997) and Myers (2013) hint at the possibility of novice researchers blurring methodological lines by using purposeful sampling and ignoring theory. Charmaz (2006) points out that while it is acceptable to begin research with purposive sampling, the researcher should switch to theoretical sampling once the data collection process comes to be guided by the emerging theory. Failure in this department will result in weakened conceptual rigor. Researchers also often rely on a small number of data sources and interviews. To avoid this problem, Glaser (1992) suggests using both observation and interviews for data collection; failure to do so shifts the focus from social processes toward the lived experiences of subjects.

Con: Literature Review

Grounded theorists themselves are divided on whether to conduct a literature review prior to beginning the actual research. Its founders, Glaser and Strauss (1967), unambiguously urged researchers to write the literature review after finishing the analysis so that it does not pollute the findings of the study. Four decades later, Corbin and Strauss (2008) echoed the sentiment by saying that there is always something new to discover; thus they deem it unessential to conduct a full literature review before embarking on a study. Schreiber and Stern (2001) disagree with this position, claiming that theoretical sensitivity is important to avert personal biases, which can threaten the validity and credibility of a study. By sensitivity, Schreiber and Stern mean the researcher’s own insight into the data, which prepares him or her to interpret it through his or her own professional background, knowledge, and perspective (McCallin, 2003).

Con: Sub-schools within Grounded Theory

Having multiple traditions inside a school of thought is not inherently negative. However, the divergent educational backgrounds of Glaser and Strauss provoked a fissure inside the grounded theory tradition owing to the founders’ contrasting ontological and epistemological underpinnings. To date, a minimum of four sub-schools[3] can be listed, the most prominent being the “original” school of Glaser and Strauss (now upheld only by Glaser, who assiduously continues to promulgate the original version), the rebellious Strauss and Corbin school, and the constructivist school. The intra-school debates mainly concern verification and conceptual differences in understanding what theory entails. This can be very confusing to the new researcher, both prior to beginning the research, when determining the research question and objectives, and midway through it, during coding and analysis.

Con: Generalizability

Broadly speaking, qualitative researchers care less about generalizability; it is more of a concern in the quantitative domain. Research questions explored through the lenses of grounded theory enable a unique scope of exploration and reveal potentially intricate, high-level concepts which are not restricted to a particular respondent or environment. Glaser (2002) and Ayres, Kavanaugh, and Knafl (2003) disagree with their qualitative-minded colleagues, arguing that, just as in quantitative research, the final product of qualitative studies should also be generalization, no matter what language is used to represent it (Buetow, 2014). Similarly, Lipscomb (2012), Sandelowski and Leeman (2012), and Polit and Beck (2010) berate grounded theorists for their disregard for theory-testing. They believe that knowledge grows through the testing and confirmation of theories; simply building theory after theory without testing them contributes little to the overall progress of science. Such confirmation takes place through systematic replication, the result of which either confirms or rejects the theory. A lack of concern for external validity and generalizability can thus threaten the acceptance of a grounded-theory-oriented study.

Parting Remarks

Among the children of the qualitative family, grounded theory is arguably the most rebellious and controversial. It stands out not only for its unorthodox meaning, understanding, and description of phenomena, but also for its inordinate emphasis on theory production. Like most scientific endeavors, this approach has changed, refined, and evolved over the decades along with the scientific community’s perception and treatment of knowledge. Compared to the 1960s zeitgeist, science is much more humanized today; facts are seldom taken at face value unless they run the gauntlet of analysis and critique. Glaser’s original realist ontology is the polar opposite of Strauss and Corbin’s ontology, which is more in tune with that of conventional qualitative researchers. This shift in attitude reflects the overall shift in the scientific climate, which is now more tolerant of subjectivist and constructed orientations of truth. As a former finance professional and current researcher, I can appreciate the richness the grounded theory movement has brought to business schools. Finance professors no longer balk outright at the prospect of research using grounded theory; some admire, if not apply, it for its flexibility in data collection, analysis, and interpretation in complex environments, particularly in my personal interest area of capital markets. Admitting my membership in the post-positivist club, I acknowledge that using grounded theory as a method of inquiry has the potential to furnish unparalleled insights into the experiences of stakeholders in my own discipline of finance, and in business and economics in general.

 

References

Annells, M. (1997). Grounded theory method, part II: Options for users of the method. Nursing Inquiry, 4(3), 176-180.

Ayres, L., Kavanaugh, K., & Knafl, K. A. (2003). Within-case and across-case approaches to qualitative data analysis. Qualitative Health Research, 13(6), 871-883.

Blumer, H. (1969a). The nature of symbolic interactionism. Basic Readings in Communication Theory, 102-120.

Blumer, H. (1980). Mead and Blumer: The convergent methodological perspectives of social behaviorism and symbolic interactionism. American Sociological Review, 409-419.

Bowen, G. A. (2006). Grounded theory and sensitizing concepts. International Journal of Qualitative Methods, 5(3), 12-23.

Boychuk Duchscher, J. E., & Morgan, D. (2004). Grounded theory: reflections on the emergence vs. forcing debate. Journal of advanced nursing, 48(6), 605-612.

Bryant, A., & Charmaz, K. (2007). Grounded theory research: Methods and practices. The Sage handbook of grounded theory, 1-28.

Buetow, S. (2014). How Can a Family Resemblances Approach Help to Typify Qualitative Research? Exploring the Complexity of Simplicity. SAGE Open, 4(4), 2158244014556604.

Chamberlain-Salaun, J., Mills, J., & Usher, K. (2013). Linking symbolic interactionism and grounded theory methods in a research design. Sage Open, 3(3), 2158244013505757.

Charmaz, K. (2003). Grounded theory. Qualitative psychology: A practical guide to research methods, 81-110.

Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative research. Sage Publications Ltd, London.

Charmaz, K. (2011). Grounded theory methods in social justice research. The Sage handbook of qualitative research, 4, 359-380.

Charmaz, K. (2014). Constructing Grounded Theory. SAGE.

Charmaz, K., & Mitchell, R. G. (2001). Grounded theory in ethnography. Handbook of ethnography, 160-174.

Cooney, A. (2011). Rigour and grounded theory. Nurse researcher, 18(4), 17-22.

Corbin, J. M., & Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative sociology, 13(1), 3-21.

Corbin, J., & Strauss, A. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage publications.

Creswell, J. W. (2015). Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, Enhanced Pearson eText with Loose-Leaf Version–Access Card Package. Pearson Education, Inc.

El Hussein, M. (2015). Nurse-Client Situated Interaction (NCSI): A Constructivist Grounded Theory of the Indicators and Clinical Reasoning Processes that Registered Nurses Use to Recognize Delirium in Older Adults in Acute Care Settings (Doctoral dissertation, University of Calgary).

Engward, H. (2013). Understanding grounded theory. Nursing Standard, 28(7), 37-41.

Geertz, C. (1973). The interpretation of cultures: Selected essays (Vol. 5019). Basic books.

Glaser, B. G. (1978). Theoretical sensitivity: Advances in the methodology of grounded theory. Sociology Pr.

Glaser, B. G. (1992). Emergence vs forcing: Basics of grounded theory analysis. Sociology Press.

Glaser, B. G. (1998). Doing grounded theory: Issues and discussions. Sociology Press.

Glaser, B. G. (2002). Conceptualization: On theory and theorizing using grounded theory. International Journal of Qualitative Methods, 1(2), 23-38.

Glaser, B. G., & Strauss, A. L. (1998). Grounded theory. Strategien qualitativer Forschung. Bern, 53-84.

Glaser, B. G., & Strauss, A. L. (2009). The discovery of grounded theory: Strategies for qualitative research. Transaction Publishers.

Glaser, B., & Strauss, A. (1967). The discovery of grounded theory. London: Weidenfeld and Nicholson.

Halcomb, E. (2016). Understanding the importance of collecting qualitative data creatively: Elizabeth Halcomb considers how innovative methods of data collection can engage participants and enrich the information gathered. Nurse Researcher, 23(3), 6-7.

Blumer, H. (1969b). Symbolic interactionism: Perspective and method. Berkeley (USA): University of California.

Holton, J. A. (2006). Rehumanising knowledge work through fluctuating support networks: a grounded theory. University of Northampton.

Hussein, M. E., & Hirst, S. (2016). Tracking the footsteps: a constructivist grounded theory of the clinical reasoning processes that registered nurses use to recognise delirium. Journal of clinical nursing, 25(3-4), 381-391.

Klag, M., & Langley, A. (2013). Approaching the conceptual leap in qualitative research. International Journal of Management Reviews, 15(2), 149-166.

Kriflik, G., Zanko, M., & Jones, M. L. (2005). Grounded Theory: A theoretical and practical application in the Australian Film Industry.

Lipscomb, M. (2012). Questioning the use value of qualitative research findings. Nursing Philosophy, 13(2), 112-125.

Locke, K. (2015). Pragmatic reflections on a conversation about grounded theory in management and organization studies. Organizational Research Methods, 1094428115574858.

McCallin, A. (2003). Grappling with the literature in a grounded theory study. Contemporary Nurse, 15(1-2), 61-69.

Milliken, P. J. (2010). Grounded theory. Encyclopedia of research design, 1, 548-553.

Morse, J. M. (2009). Tussles, tensions, and resolutions. Developing grounded theory: The second generation, 13-22.

Morse, J. M., Stern, P. N., Corbin, J., Bowers, B., Clarke, A. E., & Charmaz, K. (2009). Developing grounded theory: The second generation (developing qualitative inquiry).

Myers, M. D. (2013). Qualitative research in business and management. Sage.

Polit, D. F., & Beck, C. T. (2010). Generalization in quantitative and qualitative research: Myths and strategies. International Journal of Nursing Studies, 47(11), 1451-1458.

Rolfe, G. (2006). Validity, trustworthiness and rigour: quality and the idea of qualitative research. Journal of advanced nursing, 53(3), 304-310.

Ruppel, P. S., & Mey, G. (2015). Grounded Theory Methodology—Narrativity Revisited. Integrative Psychological and Behavioral Science, 49(2), 174-186.

Sandelowski, M., & Leeman, J. (2012). Writing usable qualitative health research findings. Qualitative Health Research, 22(10), 1404-1413.

Schreiber, R. S., & Stern, P. N. (Eds.). (2001). Using grounded theory in nursing. Springer Publishing Company.

Schwandt, T. A. (2015). The Sage dictionary of qualitative inquiry. Sage Publications.

Smit, J. (2007). Book review: Kathy Charmaz, Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis (London: Sage, 2006). Qualitative Research, 7(4), 553-555.

Smith, A. (2015). Introduction “What Grounded Theory Is…”. Organizational Research Methods, 18(4), 578-580.

Snow, D. A. (2001). Extending and broadening Blumer’s conceptualization of symbolic interactionism. Symbolic interaction, 24(3), 367-377.

Starks, H., & Trinidad, S. B. (2007). Choose your method: A comparison of phenomenology, discourse analysis, and grounded theory. Qualitative health research, 17(10), 1372-1380.

Stebbins, R. A. (2001). Exploratory research in the social sciences (Vol. 48). Sage.

Strauss, A., & Corbin, J. (1994). Grounded theory methodology. Handbook of qualitative research, 273-285.

Stryker, S. (1980). Symbolic interactionism: A social structural version. Benjamin-Cummings Publishing Company.

Timmermans, S., & Tavory, I. (2012). Theory construction in qualitative research: From grounded theory to abductive analysis. Sociological Theory, 30(3), 167-186.

Urquhart, C., Lehmann, H., & Myers, M. D. (2010). Putting the ‘theory’ back into grounded theory: guidelines for grounded theory studies in information systems. Information systems journal, 20(4), 357-381.

Walker, D., & Myrick, F. (2006). Grounded theory: An exploration of process and procedure. Qualitative Health Research, 16(4), 547-559.

Woods, P., Gapp, R., & King, M. A. (2016). Generating or developing grounded theory: methods to understand health and illness. International journal of clinical pharmacy, 1-8.

Zarif, T. (2012). Grounded theory method: An overview. Interdisciplinary Journal of Contemporary Research in Business, 4(5), 969.

[1] A term coined by Herbert Blumer describing how people act toward things based on the meanings those things hold for them, and how those meanings are rooted in social interaction and modified continuously through interpretation. For a seminal overview of the symbolic interactionism school, readers may consult Stryker (1980).

[2] Glaser (1978) defined grounded theory as the systematic generation of theory from data obtained systematically through social research. Strauss and Corbin (1994) defined it as a qualitative research method that uses a systematized set of procedures.

[3] Discussing the diversity of intra-grounded-theory traditions is beyond the scope of this assignment. Nonetheless, an interested reader is invited to read an enlightening exposé on this by Ralph, Birks and Chapman (2015): The Methodological Dynamism of Grounded Theory. International Journal of Qualitative Methods, 14(4), 1609406915611576.

Circuit Breakers as Stability Levers

ABSTRACT

The circuit breaker, an automated regulatory instrument employed to deter panic, temper volatility, and prevent crashes, remains controversial in financial markets. Proponents claim it provides a propitious time-out when price levels are stressed and persuades traders to make rational trading decisions. Opponents question its potency, dubbing it a barrier to the laissez-faire price discovery process. Since its conceptualization in the 1970s and adoption in practice from the 1980s, researchers have focused mostly on its ability to allay panic, its interference in trading, volatility transmission, the prospect of a self-fulfilling prophecy through gravitational pull towards the trigger threshold, and delayed dissemination of information. Though financial economists are split on circuit breakers' usefulness, they are a clear favorite among regulators, who downplay the reliability of anti-circuit-breaker findings citing, inter alia, suspect methodology and lack of statistical power. Against the backdrop of the 2007-2008 crisis and the 2010 Flash Crash, the drumbeats for more regulatory intervention in markets grew louder. Hence, it is unlikely that intervening mechanisms like circuit breakers will ebb. But are circuit breakers worth it? This paper synthesizes three decades of theoretical and empirical work, underlines the limitations, issues, and methodological shortcomings undermining the findings, attempts to explain the regulatory rationale, and provides direction for future research in an increasingly complex market climate.

PROS & CONS

The integrity of a financial market relies heavily on the integrity of pricing. Prices determine worth; they dictate where savings will be mobilized and channeled, where resources will be allocated, and where liquidity will be sought or provided. Thus, when a market fails to facilitate price discovery and signaling to the extent that supply and demand are no longer the key determinants, a problem emerges. Therefore, regulatory intervention to iron out such kinks has merit. Nonetheless, whether the employed mechanisms have untoward consequences or fail to achieve their professed objectives (or, worse, impair market quality) warrants examination. The debate over whether circuit breakers are merited depends on a variety of factors:

• Type of circuit breaker
o Is it a price limit or a trading halt?
o Is it a call auction?
o Is it discretionary or rule based?

• Triggering mechanism
o Is it induced by order?
o Is it induced by volume?
o Is it induced by price?

As of 2018, the time of writing this paper, data collected for Table 1 indicates 48 trading halts, 98 price limits, and 31 volatility interruption mechanisms active among the 152 exchanges studied, with some venues opting for multiple, overlapping, and/or discretionary schemes. In 16 cases, continuous trading is paused, leading to an auction. Meanwhile, with regard to the trigger parameter, 11 venues activate the circuit breaker on a discretionary basis, meaning the pre-set values are not publicly disclosed.

Historically, the Brady Commission Report, sanctioned by the Reagan administration in 1988 to uncover why the 1987 crash happened, was the first formal document to advocate market-wide and individual circuit breakers. Later proponents invoked the cooling-off hypothesis propounded by Ma, Rao, and Sears (1989), which argued that circuit breakers could enforce price stability by curbing large price swings caused by speculative overreaction, avert panic, and dissuade price manipulation. Advocates also argue that traders' ability to modify or withdraw standing limit orders during a halt enables informed traders to manage their risk without incurring losses, which in turn increases liquidity and participation around the equilibrium price upon resumption (Copeland & Galai, 1983). Moreover, circuit breakers are hoped to educate the market when channels of information transmission (i.e., quotes) are absent (Greenwald & Stein, 1988) and, in so doing, promote price discovery and decrease information asymmetry.

Both price limits and trading halts brake the price change mechanism. Whether this slowing-down is beneficial depends on the source of volatility. If the source is newly available fundamental information, the halt only delays the inevitable. Trade stoppage conveys no information on price expectations and could fuel further panic speculation, resulting in greater transitory volatility when trade resumes. However, halting trade before noise traders execute panic-driven orders would reduce transitory volatility and hence be desirable. Similarly, if excessive noise trades cause an order imbalance, halts could benefit the market by protecting noise traders from losses stemming from operating in a market with sub-optimal depth. Moreover, this allows market makers a chance to enter the market and provide liquidity (Kodres & O'Brien, 1994). In this scenario, Greenwald and Stein's (1991) theoretical model shows that when prices move fast due to uninformed trades, transactional risk rises and value-seeking traders cautiously retreat because they are unsure at what price their trades will execute; a halt reduces this risk. Consequently, market participants have a greater incentive to become informed before opening or closing a position.

Trading halts can also give brokers more time to collect margins. Brennan (1986) argued that circuit breakers can act as a partial substitute for margin requirements if market participants are unsure about the eventual equilibrium price during the timeout period. For example, a commodity trader posting an 8% margin will lose more than his margin if the price falls by 20%. Should that 20% fall happen immediately, although the trader should bear the full 20% loss, in practice he loses only the 8% margin he posted, and the broker is left to collect the additional 12% later. However, with a 10% circuit breaker in place, and neither the trader nor the broker knowing that the eventual price trajectory is 20% lower, the trader has an opportunity to meet the first margin call voluntarily. Circuit breakers thus give the broker multiple opportunities to collect margin, and if the trader fails to meet a margin call, the broker has more time to close out the position and stop the loss. Moreover, when a security or the market is in duress, stop-loss orders aggravate an already stressed order book flooded with uninformed orders; by interrupting this cascade, trading halts can decrease transitory volatility.

An order-driven market may also boast higher liquidity with a trading halt mechanism. In these markets, traders posting standing limit orders supply liquidity. In normal circumstances, if the price drops fast, a trader with standing limit orders incurs losses as the price continues to drop. A halt, however, changes the mechanism from continuous trading to a single-price auction upon resumption, where all orders are executed at the same settlement price. If a large sell-side order imbalance exists, all limit-order buyers execute at the same low clearing price. This protects the limit-order buyers and encourages them to provide more liquidity in calmer market circumstances.
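To make the margin arithmetic above concrete, here is a minimal Python sketch using the same hypothetical figures (8% margin, 10% daily limit, 20% cumulative decline). The function name and the assumption that the trader restores the margin after every limit-down day are illustrative simplifications, not part of Brennan's (1986) formal model.

def broker_shortfall(total_decline, margin, daily_limit=None, meets_calls=True):
    """Broker's uncollected loss, as a fraction of the position's value.

    Without a limit the full decline arrives at once: the trader defaults and
    the broker absorbs everything beyond the posted margin. With a limit the
    decline arrives in daily installments, and a trader who is unsure whether
    the fall will continue may keep meeting margin calls along the way.
    """
    if daily_limit is None or not meets_calls:
        # Single-step crash, or the trader walks away immediately.
        return max(total_decline - margin, 0.0)

    collected = margin          # margin posted up front
    fallen = 0.0
    while fallen < total_decline:
        step = min(daily_limit, total_decline - fallen)
        fallen += step
        collected += step       # trader tops the account back up each limit day
    return max(total_decline - collected, 0.0)

print(broker_shortfall(0.20, 0.08))                    # ~0.12: broker left chasing 12%
print(broker_shortfall(0.20, 0.08, daily_limit=0.10))  # 0.0: calls met along the way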

Contrarily, price limits and halts can increase transitory volatility if traders fear that trading will stop before they can submit orders, leading them to hasten order placement to increase the likelihood of execution. This triggers greater volatility, and rational traders recoil from trading amid fast-changing quotes. This phenomenon is known as the magnet effect, a term coined by Subrahmanyam (1994), who expanded on Lehmann's (1989) predictions and demonstrated the effect theoretically. He later postulated that rule-based halts are more susceptible to the magnet effect than discretion-based halts because of their higher predictability (Subrahmanyam, 1997).
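The magnet-effect intuition can be illustrated with a toy Monte Carlo. The sketch below is not Subrahmanyam's (1994) model: the random-walk return process, the hypothetical 10% down-limit, and the "urgency" feedback term that adds selling pressure as the price nears the limit are all arbitrary assumptions, chosen only to show how such feedback raises the probability of actually hitting the limit.

import numpy as np

rng = np.random.default_rng(0)

def prob_limit_hit(feedback, limit=-0.10, sigma=0.02, steps=100, trials=20_000):
    """Probability that cumulative returns touch the limit within one session.

    `feedback` scales the extra selling pressure applied as the price nears
    the limit, mimicking traders rushing orders to beat an anticipated halt.
    """
    ret = np.zeros(trials)                           # cumulative return per path
    hit = np.zeros(trials, dtype=bool)
    for _ in range(steps):
        proximity = np.clip(ret / limit, 0.0, 1.0)   # 0 far away, 1 at the limit
        drift = -feedback * sigma * proximity        # rush-to-sell pressure
        shock = sigma * rng.standard_normal(trials)
        ret = np.where(hit, ret, ret + drift + shock)  # freeze paths already halted
        hit |= ret <= limit
    return hit.mean()

print("no feedback :", prob_limit_hit(0.0))   # baseline random walk
print("with magnet :", prob_limit_hit(0.5))   # higher hit probability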

There is a possibility that informed traders devote less time to monitoring the market if they know they will be notified when trade is halted. Market liquidity may therefore worsen between trade halts, exacerbating transitory volatility and eventually leading to more halts. For countries with multiple exchanges, a circuit breaker triggered in one market may create perils in the others: if only one market's trading stops, order flow diverts to the remaining open market(s). Thus, solitary circuit breaker regimes may be counterproductive, and many early researchers suggested coordinating regulation among exchanges to help meet the higher demand for liquidity across multiple markets instead of one (Lauterbach & Ben-Zion, 1993).

Through a sequential microstructure trade model, Glosten and Milgrom (1985) demonstrate that uninformed traders acquire information by observing the trade process. Trade thus carries informational content, which is learnable only while trading is active. This leads opponents to argue that the absence of trade delays price discovery by postponing informed and uninformed agents' reactions to new information (Fama, 1989). Moreover, if large price moves are induced by heavy one-sided order flow (i.e., an order imbalance) and trigger a halt, informed traders are forced to postpone part or all of their trading strategies, and whatever volatility was due to take place is spread over subsequent trading sessions, typically with reduced liquidity (Chordia, Roll, & Subrahmanyam, 2002; Seasholes & Wu, 2007). Regarding this, Roll (1989) remarks: “…most investors would see little difference between a market that went down 20 percent in one day and a market that hit a 5 percent down limit four days in a row. Indeed, the former might very well be preferable.”
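The informational role of trading can be seen in a minimal, Glosten-Milgrom style sketch. The parameter values below (share of informed traders, the two possible asset values, the 50/50 prior) are arbitrary illustrations, not taken from the 1985 paper; the point is simply that the public belief about value, and hence the quotes, update only when a trade is observed, so a halt freezes that learning.

import random

V_L, V_H = 90.0, 110.0    # the two possible asset values (assumed)
MU = 0.3                  # assumed share of informed traders
random.seed(1)

def posterior(p_high, side):
    """Bayes-update P(V = V_H) after observing a buy or a sell."""
    if side == "buy":
        like_h, like_l = MU + (1 - MU) * 0.5, (1 - MU) * 0.5
    else:
        like_h, like_l = (1 - MU) * 0.5, MU + (1 - MU) * 0.5
    return p_high * like_h / (p_high * like_h + (1 - p_high) * like_l)

def quotes(p_high):
    """Competitive market maker: ask = E[V | buy], bid = E[V | sell]."""
    pa, pb = posterior(p_high, "buy"), posterior(p_high, "sell")
    return pa * V_H + (1 - pa) * V_L, pb * V_H + (1 - pb) * V_L

true_value = V_H          # realized value, known only to informed traders
belief = 0.5              # public prior P(V = V_H)
for t in range(1, 21):
    ask, bid = quotes(belief)
    if random.random() < MU:                       # informed trader arrives
        side = "buy" if true_value > ask else "sell"
    else:                                          # noise trader arrives
        side = random.choice(["buy", "sell"])
    belief = posterior(belief, side)               # everyone learns from the trade
    print(f"t={t:2d}  {side:4s}  bid={bid:6.2f}  ask={ask:6.2f}  P(V=V_H)={belief:.3f}")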

To sum up, opponents' views on circuit breakers can be condensed into four points: volatility spillover into subsequent trading sessions, trading interference, delayed information transmission, and the gravitational-pull or magnet-effect hypothesis. After the flash crash of 2010, fresh questions surfaced as to whether trading halt devices designed three decades ago, in rather simpler trading environments, are still relevant in today's high-frequency zeitgeist and, if not, to what extent they should be tailored, or whether regulators should go back to the drawing board and start anew. The computerization of trading and liquidity provision, coupled with trade decentralization, has led to a distinct rise in volume and volatility in the new climate (Brogaard, 2011). Regulators and markets in the US coordinated on a large-scale pilot project after the crash to investigate the efficacy of the classical circuit breaker regime versus a recalibrated narrow band of single-stock price limits. Concurrently, European regulators took steps to move away from the endemic discrete circuit breaker regimes towards a unified framework allowing circuit breakers to operate across venues. To what extent this will succeed remains to be seen, since exchanges have a vested interest in setting individual rules in a competitive environment to attract order flow. Nonetheless, Biais and Woolley (2011) posit that without tailor-made cross-platform streamlining across markets, circuit breakers can no longer be effective, since in the modern age of high-frequency trading arbitrage occurs across markets, and suspending trade in the underlying spot while allowing the derivative to trade can be dangerous.

UPSHOT

In the process of surveying for this paper, we noted a striking similarity between the post-1987 and post-2010 understanding of circuit breakers. In both cases, much hullabaloo ensued in favor of greater regulation of equity and derivatives markets. In the 1990s, a slew of theoretical studies followed to assess whether the new measures would be beneficial, with the few empirical studies yielding ambiguous results. A paucity of data and a wide array of possible alternative explanations for the sources of volatility meant that the empirical studies lacked statistical power. In the next phase, the new millennium, academia appeared to lose interest in the debate and very few theoretical studies were attempted, coinciding with a lack of global crashes. It is interesting to note, however, that the dot-com bubble and crash did not inspire any circuit breaker studies. During 2000-2010, with the growing availability of high-frequency data, a slew of empirical studies followed, mostly on Asian markets. Once again the results were inconclusive, though on average tilted against regulators. Meanwhile, around the same time, widespread adoption of circuit breakers took place (figure 2). In the final phase, 2010 to today, empirical studies became more feasible due to greater availability of long time-series granular datasets, access to order books, limit-up-limit-down databases, and the cooperation of exchanges. A noteworthy advancement in this stage has been the consideration of insightful new alternative explanations for regulatory intervention and a more nuanced look at circuit breaker regimes. Synthetic, laboratory market studies were also conducted, though some of their assumptions are questioned by researchers and industry practitioners. And yet the number of exchanges adopting circuit breakers kept growing, which is not surprising considering the post-crisis market climate.

The upshot, therefore, is that our understanding of circuit breaker mechanisms remains limited, and compared to the 1990s the progress has not been significant. In light of the failure to address the core methodological constraints plaguing empirical studies, and a relative disinterest in theoretical development, regulators defy or downplay academic findings and continue the practice, albeit with periodic tweaking. From a regulatory perspective, the circuit breaker issue is a foregone conclusion, although most regulators are yet to provide empirical justification beyond hopeful claims and windy rhetoric. It is not disputed that trading halts may ease the margin collection process for brokers (though others argue that benefits arising primarily in futures markets cannot be generalized to equities), and that halts give traders an opportunity to reflect in turbulent times while protecting them from an informationally disorganized market when order flows surpass the market's processing capacity. Innovations in processing capacity and trading algorithms, however, have meant that many of these benefits are less relevant today. Moreover, the benefits should be weighed against the risk of neglecting the needs of hedgers, arbitrageurs, and investors participating for utilitarian motives, without whom the markets would be empty; instruments impeding their trading ability are unlikely to be ultimately advantageous. Besides, speculation is necessary for market survival, as it incentivizes competition for liquidity.

Irrational as uninformed traders may be, denying them the chance to participate hurts informed traders in the long run, since informed traders miss out on the profitable opportunities that come from exploiting the uninformed traders' erroneous strategies. The interests of limit order traders should be protected as well: they perform the thankless job of providing liquidity and deserve to be compensated for it. Finally, the closing price of a security is a beacon for the economy. Intervention that distorts or impedes the process of discovering a rational price for a security imposes economic costs on society. Large interday or intraday price swings are not necessarily bad or irrational; the cause of a price change matters more than its manifestation. Thus, demonizing volatility indiscriminately only serves to slow price discovery and, by extension, results in subpar resource allocation in the economy.

Despite all the shortcomings of circuit breaker schemes, innovative measures are on the way. After years of minor fine-tuning and incremental improvements, conceptually complex and potentially superior mechanisms are garnering more attention. A one-size-fits-all mechanism may not be far away. However, until such a solution arrives, circuit breakers are the best ad hoc measures available to stymie the erosion of market confidence. Hence, we conclude in the words of Leinweber (2017): “Critics call them “Band-aids”, but for now, band-aids work.”

DOWNLOAD

The full version of the paper is downloadable at the journal’s webpage: https://doi.org/10.1002/ijfe.1709 

CITATION

Sifat, I. M., & Mohamad, A. (2018). Circuit Breakers as Stability Levers: A Survey of Research, Praxis, and Challenges. International Journal of Finance and Economics. https://doi.org/10.1002/ijfe.1709

Ex-Post Effects of Circuit Breakers in Crisis and Calm Markets: Long Horizon Evidence from Wide-Band Malaysian Price Limits

Abstract

Despite regulatory claims of restraining volatility and preventing crashes, evidence on circuit breakers' ability to do so is inconclusive. While previous scholars study the general performance of circuit breakers, we examine whether Malaysian price limits aggravate volatility, impede price discovery, and interfere with trading activity in both tranquil and stressful periods. For calm markets, we find significant success of upper limits in tempering volatility with low trading interference; lower limits show mixed results. Conversely, in crisis markets the limits fare poorly in nearly all aspects, particularly the lower limits. We highlight the implications of our findings for stakeholders and suggest future research avenues.


Under review at Journal of Economic Studies

JEL Classifications: D43, D47, D53

Keywords: Price Limits, Trading Halts, Malaysia, Financial Crisis, Circuit Breakers, Emerging Markets

Molding Malaysian Business School Curriculum in Light of 21st Century Skills with Special Emphasis on Risk Awareness

Paper / E-Poster to be presented at the University of Malaya in April 2018.

Abstract

The traditional theory of economics, and by extension of business schools, stems from the assumption of a rational man: homo economicus. This man, representative of the average citizen, has for centuries been presumed to be narrowly self-interested and a utility maximizer. Multidisciplinary research rooted in psychology began to challenge these assumptions in the 1980s and 1990s, leading to the emergence of a discipline now called behavioral finance. This school of thought is more cognizant of, and incorporates, the innate cognitive biases exhibited by the average person. Its most celebrated advocates also note that when facing risk and uncertainty, humans tend to resort to heuristics rather than deep contemplation to arrive at decisions. Moreover, contrary to prior belief, humans are loss-averse, not risk-averse. In today's age of information overload, where uncertainty is far more replete than certainty, the types and ambit of risks that business school graduates must handle appear to grow rapidly. While the number of schools offering business degrees flourishes, the Malaysian experience of incorporating elements of behavioral finance into program curricula has been sluggish; in fact, no business school yet offers a full-fledged degree, a trend that is slowly but surely changing in the West. This paper, by surveying endemic business school curricular practice in Malaysia, advances a case for introducing behavioral finance content at the undergraduate level as a stepping stone toward full-fledged programs as the discipline grows in breadth and depth. The arguments of this theoretical paper, I contend, are extendable to other disciplines where groundbreaking paradigm shifts threaten to render obsolete the traditional means of navigating life's travails in a risk-savvy fashion.

Keywords

Curriculum

Business School

Malaysia

Behavioral Economics

Behavioral Finance