Airline Service Quality

The airline industry, like any other service provider, has had to confront issues associated with service quality. Service quality involves understanding customers' expectations and comparing them with the company's performance. The quality of a service determines customer satisfaction, which is essential in a competitive market. Customer satisfaction depends heavily on the dimensions of service quality known as SERVQUAL, which include reliability, tangibles, assurance, empathy, and responsiveness. Attention to these dimensions supports not only customer satisfaction but also loyalty, retention, and superior company performance (Parasuraman, 1988). This shows that quality is vital to a service industry's marketing strategy.

In the airline industry, service quality has been a key consideration in marketing strategy since airline deregulation began. Competition has driven improvements in the services the industry offers. To meet the required standards, and to keep improving, companies have to deal with issues associated with service quality (Mason, 2001). These issues involve customer satisfaction, expectations, service performance, and changes in services.

A competitive airline must first and foremost address complaints from unsatisfied customers. This helps it identify shortcomings and determine what needs to be improved. For instance, time is one of the major factors contributing to airline satisfaction: customers gauge an airline by its on-time performance, so delayed take-offs and landings must be avoided if an airline is to survive in the industry. Identifying dissatisfied customers involves research, which helps in understanding the areas that need improvement. Every company faces a hard task in implementing improvements, since different customers hold different opinions about the services offered. In trying to satisfy all customers, a company has to increase its expenditure on first, economy, or tourist classes in order to accommodate all types of clients. This is one reason many airlines that emerged after deregulation in the United States exited the market for failing to meet customers' needs (Rhoades & Waguespack, 1998).

Discovering customers' requirements and expectations is also an important aspect of researching improvements in airline service quality. Seat comfort is a requirement customers use in selecting an airline. An airline has to discover what customers require in their seats, such as adjustability, texture, and proximity to the window. Providing only window seats is uneconomical, since it would mean reducing the number of seats: each row usually has two or at least three seats, and removing those not adjacent to the windows reduces the number of passengers the aircraft can carry, which in turn reduces the airline's income. This issue can be tackled by offering different classes that suit customers' needs according to what they can pay (Caves, 1962).

An airline also has to forecast future expectations. This involves both its own expectations about the market and customers' future expectations of the firm. Forecasting helps in improving service quality and in understanding what changes should be made. Most customers expect prices to fall in the future. In dealing with prices, an airline has to devise marketing strategies that meet these expectations. One such strategy is the Frequent Flier Program (FFP), which offers low fares to frequent customers. FFPs work especially well for small or emerging airlines competing with large, established carriers.

Change in service is another issue associated with service quality. A change of routes is one example: an airline may decide to change a route for many reasons, such as weather, war, cost, or growth in customers on a certain route. Such a change may benefit some customers and disadvantage others. Changes need to be gauged for effectiveness: a change in service has to fulfil the company's requirements while still guaranteeing customer satisfaction, and gauging the effectiveness of any change calls for expert assessment (Simpson & Sultan, 2000). In addition to making changes, a company has to keep track of their repercussions; monitoring changes is essential in determining their effectiveness.

In service quality, a company's performance has to be assessed against its competitors. Since airline deregulation, companies have constantly sought to outdo each other, and any company's performance is vital to its survival in the industry. An airline that performs well increases its customer base. The need for expansion depends entirely on current performance: if an airline is to expand its operations, it has to assess its financial position in the targeted market.

In addition, service performance can be assessed through the appraisals and awards a company receives. One example of an award an airline can win is the World Airline Award. Such an award is a measure of performance showing that the quality of the company's services is among the best by worldwide standards. Appraisal of a company's employees also motivates its staff to perform.

References

Caves, R. (1962). Air Transport and its Regulators: An Industry Study. Cambridge, MA: Harvard University Press.

Rhoades, D. L. & Waguespack, B. (1998). Service quality in the US airline industry: progress and problems. Managing Service Quality, 8(5), 306-311.

Mason, K. J. (2001). Marketing low-cost airline services to business travellers. Journal of Air Transport Management, 7, 103-109.

Simpson, M. C. Jr & Sultan, F. (2000). International service variants: airline passenger expectations and perceptions of service quality. Journal of Services Marketing, 14(3), 145-148.

Parasuraman, A., Berry, L. L., & Zeithaml, V. A. (1988). SERVQUAL: A multiple-item scale for measuring customer perceptions of service quality. Journal of Retailing, 64(1), 12-40.

Airline Sets Deadline on Talks with Alitalia (International Business)


Etihad Airways, the national airline of the UAE, has enjoyed considerable success over the years, mainly through expansion. In 2012, for instance, it expanded its commercial team into Saudi Arabia and added new sales positions in two major offices, appointing Aiyaz Khot as manager of the Central Province, based in Riyadh, and Imran Rakhangi as manager of the Western Province, based in Jeddah.

This was a core step for Etihad. As one of the airline's front-line officials put it, Saudi Arabia was a key part of the airline's network, and the Saudi sales force reflected the growing significance of that market for Etihad Airways. The year 2012 proved significant for Etihad: shortly after the Saudi expansion, the experience of the two appointed managers, both with extensive backgrounds in sales and business analysis, broadly increased sales in the Western and Central Provinces.

The main point here is that Etihad has made a serious impact on the airline industry, and the Alitalia investment should be no different from the other commercial expansions and stakes Etihad has acquired over time (Clark 1). Etihad bought a twenty-four percent stake in India's Jet Airways (JETIN) and a forty-nine percent stake in Air Serbia; in addition, the carrier increased its stake in Virgin Australia (VAH) by ten percent, holds a twenty-nine percent stake in Air Berlin, forty percent of Air Seychelles, and three percent of Aer Lingus (AERL), and recently added a stake in a Swiss regional carrier.

Etihad's investment in Alitalia will mark its presence in the European segment of the business. Even though it has stakes in Air Berlin and a Swiss-based airline, a full-service legacy carrier would be good for the airline's reputation. Moreover, Alitalia has been trying to recover from major losses that have painted a negative picture of the airline to the international market (Kamel 1). Air France initially owned about twenty-five percent of Alitalia, and the relationship worked until early last year, when Air France declined to invest further after Alitalia sought to raise funds, on the grounds that the Italian carrier had not yet met its conditions.

After Etihad announced its interest in investing in Alitalia, Air France refrained from commenting on the matter, which could mean many things: it might consider the move a bad corporate decision it would not recommend but does not want to obstruct, or it might feel outdone, since Etihad is willing to take on a task that Air France could not. All the same, Etihad's decision and aim remain to be seen, though there is no way an internationally recognized airline like Etihad would enter such a deal without having done more than enough homework. James Hogan, Etihad's Chief Executive, backed this notion by saying that, as a successful corporation, they do not invest for the sake of it and only step in when they are sure of a solution; additionally, Etihad has claimed to have already devised measures it will apply to get Alitalia back on its feet.

Conversely, Alitalia also needs the investment, even though it appears reluctant; this is mainly the management avoiding further embarrassment, and Alitalia's desperation can be seen in its compliant dealings with Etihad's management (Clark 1). Beyond Alitalia itself, everyone else supports Etihad's desire to invest. The Italian government, which helped prop up Alitalia, has promised to step aside and let things go according to plan, saying it would not want to stand in the way of the agreement (Kamel 1). A partner at Clyde and Co was also of the view that the investment would be a breakthrough for Alitalia, describing the company as being given 'a shot' after struggling for a significant period; Etihad's proven practices, plus the three hundred million euros, will create a platform for the airline's development and its future.

Generally, buying a stake in Alitalia will enable Etihad's management to tap a major economy in western Europe that has been drawing major airlines such as Emirates and the low-cost carrier EZI, while at the same time capitalizing on the local airline's afflictions.

Works Cited

Clark, Nicola. "Airline Sets Deadline on Talks with Alitalia: International Business." The New York Times, 2014. Web. February 3, 2014. <http://www.nytimes.com/2014/02/03/business/international/etihad-sets-deadline-on-investment-talks-with-alitalia.html?ref=international&_r=0>

Kamel, Deena. "Etihad Sets Deadline to Decide on Alitalia Investment." Bloomberg News, 2014. Web. February 3, 2014. <http://www.bloomberg.com/news/2014-02-02/etihad-sets-30-day-deadline-to-decide-on-investment-in-alitalia.html>

Evaluation of the sources


To spank your child or not

While spanking a child would not be necessary in an ideal world, it may be necessary on some occasions. Based on the scholarly articles reviewed here, however, is spanking effective in disciplining children?

Evaluation of the sources

The sources used in this article are relevant, reliable, up to date, and accurate. Most of the information in the sources is footnoted correctly, and the articles are referenced to acknowledge the copyright owners of the works they draw from. The sources are also consistent with the existing body of knowledge on child punishment and discipline.

Day, R.; Peterson, G. W.; McCracken, C. (1998). “Predicting Spanking of Younger and Older Children by their Mothers and Fathers”. Journal of Marriage and the Family 60 (1): 79–94.

The effectiveness of spanking is relative to the person doing the spanking; for example, while it is common in schools, it may not be effective at home. This is because most children do not view parental spanking as fair punishment but as a way for parents to show dislike and rejection, and many children become even worse after spanking. Longitudinal studies show marked negative outcomes in spanked children. The argument of this study is that spanking may be effective if administered by authorities other than the parents.

Taylor, CA. Manganello, JA. Lee, SJ.; Rice, JC. (May 2010). “Mothers’ spanking of 3-year-old children and subsequent risk of children’s aggressive behavior”. Pediatrics 125 (5): e1057–65.

Delivery of punishment should never be personal. Many punished children end up becoming aggressive, as the punishment remains in the child's memory and has lasting psychological effects; such children base their reactions to daily experiences on the memory of their parents' punishments and remain persistently aggressive.

Statistics Sweden. (1996). Spanking and other forms of physical punishment. Stockholm: Statistics Sweden

According to Statistics Sweden (1243), there is a strong correlation between child spanking and juvenile delinquency. Children who are spanked heavily by their parents tend to become delinquent at a very young age; many run away from home after punishment and end up on the streets, where they meet peers and are initiated into crime. This article proposes that child punishment should be commensurate with the mistake made and, if possible, delivered by authorities such as school heads.

Park, Alice (3 May 2010). "The Long-Term Effects of Spanking". Time (New York). <http://www.time.com/time/magazine/article/0,9171,1983895,00.html>

Spanking should also be reserved for the worst offenders, not just every child. Spanking children for minor mistakes might lead to disastrous outcomes such as suicide and aggressiveness; spanking applied wrongly might not produce the desired results in children and may also lead to loss of respect.

Baumrind, Diana, Cowan, P., & Larzelere, Robert. (2002). "Ordinary Physical Punishment: Is It Harmful?" Psychological Bulletin, American Psychological Association, Vol. 128, No. 4, 580–58. <http://www.apa.org/journals/releases/bul1284580.pdf>

Children should be warned that negative behaviors will lead to quick and certain discipline. Unlike spanking, warnings are effective and non-humiliating, and they help children learn clear boundaries. This article also argues against physical punishment.

Straus, Murray A. (1971). “Some Social Antecedents of Physical Punishment: a linkage theory interpretation”. Journal of Marriage and the Family 33 (4): 658–663.

This article allows for spanking but argues that it may not be effective if the behaviors in question do not deserve it; children may view such spanking as a personal fight rather than disciplinary action. It is important to understand and determine the kinds of behavior that deserve spanking. It also argues that toddlers and teens must be allowed to make mistakes and to display age-appropriate behaviors, and that punishment should be geared toward establishing authority in a mature way and with love.

Conclusion

Most parents love their children and may feel pain in spanking them, at least until they realize that their love is being taken for granted and the child is becoming spoilt beyond help. It is also worth noting that the use of non-abusive punishment is healthy. On the other hand, many child psychologists support spanking as a disciplinary action, and spanking administered with fairness, love, and care can be an effective disciplinary technique. The best conclusion is that the usefulness of spanking is relative to the parents administering it.

Works Cited

Day, R.; Peterson, G. W.; McCracken, C. (1998). “Predicting Spanking of Younger and Older Children by their Mothers and Fathers”. Journal of Marriage and the Family 60 (1): 79–94.

Straus, Murray A. (1971). “Some Social Antecedents of Physical Punishment: a linkage theory interpretation”. Journal of Marriage and the Family 33 (4): 658–663.

Baumrind, Diana, Cowan, P., & Larzelere, Robert. (2002). "Ordinary Physical Punishment: Is It Harmful?" Psychological Bulletin, American Psychological Association, Vol. 128, No. 4, 580–58. <http://www.apa.org/journals/releases/bul1284580.pdf>

Park, Alice (3 May 2010). "The Long-Term Effects of Spanking". Time (New York). <http://www.time.com/time/magazine/article/0,9171,1983895,00.html>

Statistics Sweden. (1996). Spanking and other forms of physical punishment. Stockholm: Statistics Sweden.

Taylor, CA. Manganello, JA. Lee, SJ.; Rice, JC. (May 2010). “Mothers’ spanking of 3-year-old children and subsequent risk of children’s aggressive behavior”. Pediatrics 125 (5): e1057–65.

Evaluation Of Two Empirical Studies

In the study "Prevalence of Prostate Cancer among Men with a Prostate-Specific Antigen Level ≤4.0 ng per Milliliter", Thompson et al. (2004) set out to investigate the prevalence of prostate cancer among men with a prostate-specific antigen level less than or equal to 4.0 ng/mL. In a slightly different fashion, Kelly et al. (2008) undertook a study titled "Learner Outcomes for English Language Learner Low Readers in an Early Intervention", which aimed to investigate the "efficacy of Reading Recovery (RR) with first grade English language learners (ELLs) in U.S. schools by evaluating the literacy effects of ELLs in comparison with their native English- speaking (NES) counterparts, who were also enrolled in the same RR." Focusing on these two studies, this discourse examines the approaches each used in terms of research rationale, philosophical basis, methodology, the extent to which the research objectives were formulated and achieved, and the conclusions derived from each study. To achieve this, the discourse addresses each subject matter in turn, beginning with the first study before moving on to the second: the study by Thompson et al. (2004) opens the assessment, followed by the study by Kelly et al. (2008).

Thompson, I. M. et al. (2004). Prevalence of Prostate Cancer among Men with a Prostate-Specific Antigen Level ≤4.0 ng per Milliliter. New England Journal of Medicine, 350:2239-2246.

With respect to the study objective, the main objective was to investigate the prevalence of prostate cancer among men with a prostate-specific antigen level of 4.0 nanograms per milliliter or less. From the objective, it is evident that data gathering had to involve actual measurement to establish whether antigen levels were within the specified range.

Approach and Methodology

The first part of the methodology critique looks at the study conducted by Thompson et al. (2004), which sought to investigate the prevalence of prostate cancer among men with a prostate-specific antigen level less than or equal to 4.0 ng/mL. The study identifies its sponsor as the National Cancer Institute. In terms of sampling technique, the study used random sampling to draw a sample from 18,882 men who fulfilled an initial selection criterion of having no more than 3.0 ng per milliliter of prostate-specific antigen. Other qualifications included being at least 55 years of age, having a normal digital rectal examination, having an American Urological Association symptom score, and possessing no clinically significant coexisting conditions. This was a good start, as the selection criteria reduced the possibility of erroneous conclusions.

Research ethics and consent are very important aspects of any study involving human subjects, since participants must not be coerced (Creswell, 2009). The study points out that participants gave written informed consent after being provided with the details of the study. Since the study had different stages, each phase mattered for the purposes of informed consent; hence some participants who were eligible for inclusion in the final sample did not consent to undergo biopsy. The study nonetheless drew a connection between declining biopsy and age, noting that those who declined were older than 75 years, at a 99% confidence level.

As for methodology, as already explained above, stratified random sampling was used, with participants randomly selected after meeting the initial qualifications. The study design was randomized, with a placebo-controlled group included (Creswell, 2009). The PSA (prostate-specific antigen) measurements were carried out in one central laboratory, a sound measure to ensure that the same conditions prevailed for all measurements and that accountability was easier to track. The study ran for over 7 years. While the study reported eligibility as entailing, among other factors, a PSA level of no more than 3.0 ng/mL at one point and no more than 4.0 ng/mL at another, those found to have PSA levels above 4.0 ng/mL were recommended for prostate biopsy.

From the initial setting of standards, then, there seems to be some confusion as to whether the study took 3.0 ng/mL or 4.0 ng/mL as the PSA threshold. Nevertheless, a reason is provided for including those with PSA levels above 4.0 ng/mL: they were included to ensure the generalizability of prostate cancer prevalence among men with PSA ≤ 4.0 ng/mL to the general population. Only the control group (which was given the placebo) was used for the analysis. Among other features, the study assessed the relationships between baseline characteristics and prostate cancer; prostate cancer was therefore the dependent variable, while the independent variables included prostate-specific antigen.

Analysis and Results

The study used a 99% confidence level, so rejection or acceptance of a hypothesis was based on a significance threshold of 0.01. The final sample size included in the data analysis was 2,950 men. The average PSA level for 96.2% of the initial participants is reported to have exceeded 7. At the end-of-study biopsy, about 15.2% of the 2,950 men were found to have prostate cancer. However, there was no significant difference in cancer detection between those who underwent sextant biopsy and those whose biopsy took more than six samples.

The study identifies limiting factors that might have hindered its ability to detect the correlation between age and risk of cancer, given that cancer risk increases with age: setting an age limit for participation and setting the PSA ceiling at 4.0 ng/mL. These factors did, however, help define the control group, which was administered the placebo. The study found a positive association between PSA and risk of prostate cancer, noting an annual increase in PSA levels over the seven-year period among the 449 men who had prostate cancer; the correlation was, however, not significant. The 449 men with prostate cancer had a mean PSA of 1.78 ng/mL (standard deviation 0.92 ng/mL), compared with a mean of 1.34 ng/mL (standard deviation 0.86 ng/mL) among the 2,501 men without prostate cancer.
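To give a sense of the scale of this difference, a standardized effect size can be computed from the summary statistics quoted above. This is our own back-of-the-envelope calculation using the standard pooled-standard-deviation formula, not a figure reported by Thompson et al.:

\[
s_p=\sqrt{\frac{(n_1-1)s_1^2+(n_2-1)s_2^2}{n_1+n_2-2}}
=\sqrt{\frac{448(0.92)^2+2500(0.86)^2}{2948}}\approx 0.87\ \text{ng/mL},
\qquad
d=\frac{1.78-1.34}{s_p}\approx 0.51 .
\]

By Cohen's conventional benchmarks this is a medium-sized standardized difference between the two groups' mean PSA levels; note that an effect size describes magnitude only, not statistical significance.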

It can be concluded that the study effectively accomplished its objective of investigating the relationship between PSA levels and prostate cancer, and its stage-wise approach allowed it to yield supplementary findings as well. As recommendations for future inquiry and professional nursing practice, the study suggests that even men with lower PSA levels should be screened for cancer, as the results point to the possibility of such men having cancer (see also Krumholtz et al. 2002). The authors therefore recommend a change in the practice of cancer diagnosis.

Kelly, E. P., et al. (2008). "Learner Outcomes for English Language Learner Low Readers in an Early Intervention". TESOL Quarterly, Vol. 42, No. 2, June 2008.

Approach and Methodology

From the outset, the study by Kelly et al. takes a mixed-methodology approach to investigate how learner outcomes for English language learner low readers differ from those of their native English-speaking counterparts. Even though the researchers do not explicitly state that they used a mixed methodology, a few characteristics of the study are sufficiently indicative of it. First, the researchers used both qualitative and quantitative approaches to achieve their objectives: for instance, measurements are carried out to establish the triple jeopardy of socio-demographic risk, low-reader status, and ELL status, and the outcomes for the students concerned.

In matters concerning the sampling procedure, the researchers mention that inclusion in the study involved careful purposive sampling to ensure that the readers included had reading levels in the bottom 20%. Using this technique, the researchers arrived at 8,581 ELLs and 121,961 native English speakers who undertook the RR program in the 2002-2003 school year. It is notable that from the start the researchers endeavored to place the study in theoretical settings and to explain it using theoretical frameworks, such that even the categorization of children's English readership into four levels helps in classifying the participants at the beginning of the research and again at the end, when it is necessary to gauge the efficacy of the RR program.

From the qualitative point of view, the study has several elements that clearly fit a phenomenological design, in that it involves the collection of in-depth information and insights relating to English language readership, achieved initially through extensive observation of the participants using several inductive, qualitative techniques. Other methods the researchers use that support the claim of a phenomenological qualitative approach include interviews, participant observation, and discussions. Creswell (2009) notes that phenomenological qualitative approaches differ from other qualitative approaches by focusing on illuminating the precise subject matter and identifying phenomena through the way they are perceived by the actors in the situation.

Creswell (2009) further notes that, when conducted in the human sphere, a phenomenological approach usually translates into collecting deep information and insights through inductive, qualitative techniques, including interviews, participant observation, and discussions, with the information presented from the research participants' perspective. The phenomenological approach thus focuses on studying experience from the perspective of the individual and the individual's subjective knowledge, emphasizing the paradigm of personal perspective and interpretation. The approach is helpful in understanding individuals' motivations and actions while cutting through the muddle of taken-for-granted assumptions and conventional wisdom. It is further reasonable to see the phenomenological perspective in the formulation of the research questions reported with the results at the end of the study. For example:

"Is the rate of students who discontinue successfully their series of lessons comparable between ELLs and NESs? 1b. Do both groups have similar outcomes on the text reading and phonemic awareness tasks?" (Kelly et al. 2008, 247)

Comparison of Methodologies

With respect to comparing the two studies, the study by Thompson et al. (2004) used quantitative research in which numerical data about the patients was collected and then quantitatively analyzed. The study by Kelly et al. (2008), on the other hand, employs a mixed-methodology approach with a triangulation design through the convergence model. Creswell and Plano Clark (2011) describe triangulation mixed methodology as a single-phase research design in which both qualitative and quantitative methods are implemented within the same time frame. This brings out a clear understanding of triangulation as involving qualitative and quantitative techniques concurrently.

While the triangulation design has various forms, the convergence model is employed when the quantitative and qualitative data are to be merged concurrently (Bazeley, 2007). This approach allows for thorough comparison of data and hence a better understanding of the phenomenon being investigated. For the study by Kelly et al. (2008), it must therefore have been desirable to merge the qualitative aspects of the study, such as lesson design and structure, with the quantitative aspects, such as measurements of words read, to enable the categorization the researchers used.

With respect to whether the researchers take a deductive or inductive approach, the study by Kelly uses a deductive approach, a type of reasoning that works from the more general to the more specific: the research begins by formulating theory, from which hypotheses are developed, and these are then evaluated and confirmed against the observations generated by the study. The study by Thompson et al., by contrast, employs an inductive approach, which works in the opposite direction: the researchers work from specific observations of the various patients and participants and develop these into broader generalizations that can be applied in other settings and situations with similar conditions. In inductive research of this kind, the investigators collect data from the field to enable them to develop a theory (Creswell & Plano Clark, 2011).

An important question is how the two studies deal with the generalizability of their findings. The two studies, while addressing entirely different topics and areas of study, were chosen for the way they approach generalization despite drawing greatly diverse kinds of participants. In the study by Thompson et al. (2004), the researchers based inclusion on PSA levels less than or equal to 4.0 nanograms per milliliter, but also included individuals with PSA levels above 4.0 ng/mL as a measure to ensure the generalizability of prostate cancer prevalence among men with PSA ≤ 4.0 ng/mL to the general population. Only the control group (which was given the placebo) was used for the analysis. For that study, prostate cancer was the dependent variable, while the independent variables included prostate-specific antigen.

The study by Kelly et al. (2008), on the other hand, focused on comparing native English speakers with English learners whose first language is Spanish. This alone would not warrant generalizing the results and conclusions to the wider population. Nevertheless, Kelly and the fellow researchers still found grounds to generalize the findings to the general population of English low learners, which they achieved by invoking the theoretical models introduced at the beginning of the study and by referring to past studies with similar findings that targeted English low learners of other backgrounds.

In summary, while the two studies took entirely different research approaches and dealt with topics of very different natures, both clearly realized that the implications of study findings must be well articulated and generalization well established and justified. This justification need not be expressly highlighted; it can be deduced from the way the generalization issue is handled. Secondly, the study conducted by Thompson was basically quantitative, and issues of validity and reliability are addressed through scrutiny of the instrumentation used.

In contrast, the study by Kelly is more qualitative than quantitative, and issues of validity and reliability are handled through scrutiny of the credibility, dependability, and transferability of the results. In qualitative research, internal validity is treated as credibility, which involves ascertaining that the results are credible from the participants' point of view. External validity, as understood in quantitative research, corresponds to transferability in qualitative research and concerns the extent to which results can be generalized to other related contexts and settings; it is strengthened through careful judgment and thorough description of the context and underlying research settings.

With respect to validity, qualitative research also calls for dependability, which addresses the researcher's accountability for the ever-changing context within which studies are conducted. Therefore, while Thompson et al. (2004) were justified in randomly sampling participants, securing consent, and taking measurements rigorously to ensure validity and credibility, Kelly et al. (2008) were justified in thoroughly describing the underpinning contexts and settings of their study and providing an account of the ever-changing context within which it was conducted. These measures gave the generalizability of both studies a strong philosophical backing.

References

Bazeley, P. (2007). Qualitative data analysis with NVivo. Thousand Oaks, CA: Sage Publications, Inc

Creswell, J. W., (2009). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, (3rd Ed.). Thousand Oaks, CA: Sage Publications.

Creswell J. W., & Plano Clark, V. L. (2011) Designing and Conducting Mixed methods research. (2nd ed.) Thousand Oaks, CA: Sage

Krumholtz, J. S., Carvalhal, G. F., Ramos, C. G., et al. (2002). Prostate-specific antigen cutoff of 2.6 ng/mL for prostate cancer screening is associated with favorable pathologic tumor features. Urology, 60:469-473.

Pauler, D. K., Gower, K. B., Goodman, P. J., Crowley, J. J., & Thompson, I. M. (2003). Biomarker-based methods for determining noncompliance in a prevention trial. Controlled Clinical Trials, 23:675-685.

Thompson, I. M., Goodman, P. J., Tangen, C. M., et al. (2003). The influence of finasteride on the development of prostate cancer. New England Journal of Medicine, 349:215-224.

Thompson, I. M., Tangen, C., & Goodman, P. (2003). The Prostate Cancer Prevention Trial: design, status, and promise. World J Urol, 21:28-3.

Thompson, I. M., Pauler, D. K., Goodman, P. J., Tangen, C. M., Lucia, M. S., Parnes, H. L., Minasian, L. M., Ford, L. G., Lippman, S. M., Crawford, E. D., Crowley, J. J., & Coltman, C. A., Jr. (2004). Prevalence of Prostate Cancer among Men with a Prostate-Specific Antigen Level ≤4.0 ng per Milliliter. New England Journal of Medicine, 350:2239-2246.

Airport Recovery Report

Executive Summary

Estimates suggest that, around the world, up to around two billion metric tons of dust are carried up into the air every year, mainly by sandstorms. A single sandstorm can lift and deposit more than 200 metric tons of dust (Wallace & Webber, 2010). Abu Dhabi airport has been recorded as lying in one of five major dust-producing regions, and sandstorms are frequent in the territory. Apart from being a hazard and nuisance to the general public, sandstorms, with their attendant poor visibility and gusty winds, are a danger to aircraft landing and taking off. They can lead to diverted flights, delayed departures, and consequent airport operational problems. Other effects include the abrasion of aircraft surfaces and damage to engines, as well as the hampering of ground operations.

A sandstorm is a mass of dust or sand particles vigorously lifted to a great height by a strong and turbulent wind, with visibility reduced to below 1,000 meters (Wallace & Webber, 2010). Visibility is likely to be at its worst during daylight hours, when the wind is at its strongest. Criteria for defining a sandstorm in the region vary. At Abu Dhabi Airport the definition is that the 10-meter wind must be in excess of 17 knots and the surface horizontal visibility below 1,000 meters. Safar (1985) uses the same stipulations, but adds that when visibility falls below 200 meters the storm is classified as severe.

Table of Contents

Executive Summary
1. Background
2. Issues
2.1 Risk to Aircraft during Landing and Take-Off
2.2 Increased Surface Friction
2.3 Inversion Pathways
2.4 Loss of Life
2.5 Financial Impacts
3. Recovery
4. Summary and Recommendations
5. Glossary
5.1 Sandstorm
5.2 Disaster Recovery Plan
5.3 Catastrophes
5.4 Financial Impacts
5.5 Dust Bowl

1. Background

A particularly severe sandstorm occurred on the 12th and 13th of March 2003, when a well-developed surface low-pressure cell passed close by to the north of the UAE. Apart from the poor visibility experienced, the event was also notable in that the storm lasted for two days. Usually such a system moves through quickly enough that the storm does not last beyond the daylight hours of a single day (Vanholder, Gibney, Luyckx, & Sever, 2010). Normally, the diurnal land and sea warming differential around Abu Dhabi airport causes the evening sea breeze from the north to overcome the southerly desert wind. It was therefore curious, during the evening of the first day, to see smudged street and vehicle lights through a murk caused by dust rather than the more usual humidity haze.

Visibility on the 12th at Abu Dhabi Airport deteriorated to 900 meters in a southerly wind that averaged 15 to 20 knots. At Abu Dhabi airport the visibility fell to 3,500 meters in a gusty wind that reached an average speed of 25 knots. On the second day at Abu Dhabi the average wind was 25 to 35 knots, and the observed visibility intermittently dropped to between 300 and 600 meters for nearly 10 hours during daylight. At Abu Dhabi airport the still-gusty wind reached 39 knots, with visibility down to 1,200 meters.

The wind off the hot, dry desert caused the temperature to peak at 52°C and 51°C on the first and second days respectively at Abu Dhabi, with the relative humidity down to 10% and never above 30% during the day. Meanwhile, at Abu Dhabi airport the peak temperature was 40°C and 42°C on the two days respectively, and the daytime relative humidity was around 23% (Snedaker, 2013).

2. Issues

2.1 Risk to Aircraft during Landing and Take-Off

Low-level wind shear is a critical hazard to aircraft during landing and take-off. A distinct danger exists if the wind change is sudden enough and large enough to exceed the aircraft's capacity to accelerate or decelerate, and large enough to erode its airspeed safety margin over the minimum approach or climb speed. On the morning of the 13th, an aircraft approaching to land into the wind on runway 13 at Abu Dhabi Airport would have encountered a rapid loss of airspeed below 200 meters, because the headwind dropped from 40 knots to 5 knots. This means loss of lift and rapid sink; if not quickly checked, it results at best in a hard landing and at worst in a crash landing short of the runway. Taking off from runway 31 with an acceptable 4-knot tailwind results in a tailwind of 40 knots at 250 meters, with similar loss-of-lift effects (Snedaker, 2013).
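As a rough arithmetic illustration of the shear just described (the headwind figures are from the paragraph above; the assumption of constant groundspeed through the shear layer is ours):

\[
\Delta V_{\text{IAS}} \approx \Delta V_{\text{headwind}} = 40\ \text{kt} - 5\ \text{kt} = 35\ \text{kt}.
\]

Because lift varies with the square of airspeed, an aircraft approaching at, say, 130 knots (a hypothetical approach speed) that suddenly loses 35 knots retains only \((95/130)^2 \approx 53\%\) of its lift at that instant, which is why an uncorrected shear of this magnitude produces the rapid sink described above.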

2.2 Increased Surface Friction

A comparable problem exists when there is a large temperature shear, for example one associated with a surface temperature inversion. On the morning of the 13th the temperature increased by 5°C at 300 meters, while on the morning of the 12th it increased by 8°C at 180 meters. An increase in temperature means a decrease in air density, less power available to the aircraft's engines, a loss of lift, and aircraft sink. Hence a heavily laden aircraft taking off from runway 31 could potentially have found itself in serious difficulty if the crew were not aware of the tailwind and higher temperatures aloft.
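A short sketch of why the temperature rise matters, using the ideal gas law and the standard lift equation (textbook physics, not figures from the report):

\[
\rho = \frac{p}{RT}, \qquad
\frac{\Delta\rho}{\rho} \approx -\frac{\Delta T}{T} \approx -\frac{8}{300} \approx -2.7\%,
\qquad
L = \tfrac{1}{2}\,\rho V^2 S C_L .
\]

With the 8°C rise quoted above and an ambient temperature near 300 K, air density at the inversion drops by roughly 2.7% at constant pressure, so lift at a fixed true airspeed falls by about the same fraction; engine power output also declines with density, compounding the effect.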

2.3 Inversion Pathways

A similar potential for danger exists in any situation where strong wind shear and/or strong surface temperature inversions occur, such as those associated with storms, land and sea breezes and, by implication, the Shamal conditions described in the previous section. The atmospheric soundings at 0000 UTC and 1200 UTC on the 13th are illustrative of the conditions that prevailed on both days (Olshansky, Hopkins & Johnson, 2012), the main difference being that the surface temperature inversion was more pronounced on the morning of the 12th. The soundings confirm the Eta model's prognostic vertical profiles.

During major sandstorms, the deposition of dust over the airport can be far-reaching, often affecting surrounding cities and towns. Sandstorms can bring down trees, bury equipment, and damage houses. In the final years of the Dust Bowl, farm animals were found dead in the fields and people began suffering from "dust pneumonia".

2.4 Loss of Life

While the loss of human life during sandstorms is relatively small compared with other natural catastrophes, long-term health concerns have emerged in recent years. This is primarily due to the increased number of storms originating from areas of desertification. The dust in these storms has been shown to contain pollutants and toxins such as salt, sulfur, heavy metals, pesticides, and carbon monoxide, to name a few. The pollutant-laden dust can spread many miles, affecting a huge number of people who may never directly experience the acute events of the storm.

2.5 Financial Impacts

The immediate financial impact of sandstorms is large, though it does not rival the major natural disasters that devastate whole cities. For example, the damage from sandstorms in China averages about $8.5 billion in lost asset value, while a single major earthquake can do damage of roughly five times that figure. Nonetheless, experts argue that the true financial impact of sandstorms, especially those that start in zones of desertification, is hard to pin down because of their long-term consequences for the livelihoods of people in the affected areas. When sandstorms kick up in degraded agricultural drylands, they strip away the topsoil, which causes further desertification. Thus, farmers are forced to watch the topsoil, and with it their livelihood, literally blow away. This cycle, if left unchecked, threatens to displace whole communities in some areas.

Some sandstorm activity can be prevented, but dust storms will always be an integral part of the natural ecosystem. The following section considers what can be done to prevent and live with sandstorms.

3. Recovery

Our recovery plan is designed so that recovery time is close to zero; the longer the downtime, the more money is lost. Operational recovery is equally essential: we always have operational recovery capacity available so that we can prioritize incidents in customer-facing situations. In addition, we maintain backups and RAID arrays that are stored and kept safe in separate locations. "Disaster recovery will continue to evolve with the banking industry (Olshansky, Hopkins & Johnson, 2012). As banks become more sophisticated technology users, disaster recovery plans will follow. But banks must plan for disaster recovery at all times. The key to successful disaster recovery is what happens long before a disaster strikes. With a workable recovery plan, properly tested and endorsed by senior management, banks can effectively maintain operations while providing for the safety of people and assets."

The airport's disaster recovery plan is highly effective. We back up all data religiously and test the backups semi-annually (Chandra & Acosta, 2010). We have been practicing every step of the disaster plan on servers and even on whole systems to verify that we can recover the data we need at any given time. Moreover, we keep a list of every conceivable threat and catastrophe, along with step-by-step procedures for mitigating those risks. We have backups of almost everything we use to run our day-to-day operations. For instance, our dedicated servers and machines are critical to the organization; therefore, we keep backups of all the hardware and software utilities needed to keep our servers up and running, which helps us run the business smoothly and without a single point of failure (Olshansky, Hopkins & Johnson, 2012). After what we learned from a large earthquake in Northern California, we have a very solid, newly drafted plan that focuses specifically on maintaining operations and computer systems.
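As a concrete illustration of the kind of restore test described above, the following minimal sketch checks a restored backup against a checksum manifest recorded at backup time. It is a hypothetical example, not the airport's actual tooling; the paths and the manifest format (lines of "<sha256>  <relative path>") are our assumptions.

# Hypothetical sketch of the semi-annual restore test described above.
# Paths and manifest format are illustrative assumptions, not actual tooling.
import hashlib
import pathlib

def sha256(path: pathlib.Path) -> str:
    # Hash the file in 1 MiB chunks so large backups do not exhaust memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(restored_dir: str, manifest: str) -> bool:
    # The manifest is assumed to hold lines of "<sha256>  <relative path>",
    # written when the backup was taken.
    root = pathlib.Path(restored_dir)
    ok = True
    for line in pathlib.Path(manifest).read_text().splitlines():
        digest, _, rel = line.partition("  ")
        target = root / rel
        if not target.is_file() or sha256(target) != digest:
            print(f"MISMATCH: {rel}")
            ok = False
    return ok

if __name__ == "__main__":
    # Restore the latest backup into a scratch area first (with whatever
    # backup tool is in use), then verify it against the stored manifest.
    print("restore verified:", verify_restore("/scratch/restore", "/backups/manifest.sha256"))

In practice such a script would run after restoring the latest backup into a scratch area, with a scheduler invoking it on the semi-annual test cycle the plan describes.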

4. Summary and Recommendations

Management has to back up complete systems, servers, software, and even hardware. We keep spare machine parts on hand in case anything needs replacing. Since we are migrating to cloud backups, we also keep backups of the backups to verify that the data is preserved.

Management must maintain a disaster recovery planning team. The team comprises members from different departmental levels and different positions. Our disaster recovery teams always ensure that the organization's policies and procedures are followed and documented at all times, and we test each of our plans semi-annually to verify that the plan works in reality. The airport has taken all the steps required to keep its assets and people safe from any disaster, whether large or small.

5. Glossary

5.1 Sandstorm

A strong wind carrying clouds of sand with it, especially in a desert.

5.2 Disaster Recovery Plan

Sometimes referred to as a business continuity plan (BCP) or business process contingency plan (BPCP); describes how an organization is to deal with potential disasters.

5.3 Catastrophes

An event causing great and often sudden damage or suffering; a disaster.

5.4 Financial Impacts

Effects relating to the economic and monetary consequences of a sandstorm.

5.5 Dust Bowl

The period of severe dust storms in the 1930s that blanketed the air over the American prairies, causing severe damage and loss of life.

References

Chandra, A., & Acosta, J. D. (2010). Disaster recovery also involves human recovery. JAMA, 304(14), 1608-1609.

Olshansky, R. B., Hopkins, L. D., & Johnson, L. A. (2012). Disaster and recovery: Processes compressed in time. Natural Hazards Review, 13(3), 173-178.

Snedaker, S. (2013). Business continuity and disaster recovery planning for IT professionals. Newnes.

Vanholder, R., Gibney, N., Luyckx, V. A., & Sever, M. S. (2010). Renal disaster relief task force in Haiti earthquake. The Lancet, 375(9721), 1162-1163.

Wallace, M., & Webber, L. (2010). The disaster recovery handbook: A step-by-step plan to ensure business continuity and protect vital operations, facilities, and assets. AMACOM Div American Mgmt Assn.

Air Pollution In Beijing

Beijing is one of the largest cities in the world, with a population of over 20 million people. The city has a remarkable number of heavy industrial complexes powered by coal energy. This makes it a major economic hub in China, contributing a considerable proportion of the country's GDP. Given its level of industrialization and population density, Beijing is arguably a major employment center in China, giving many people a source of living. Due to the use of coal as the primary source of energy, however, Beijing suffers some of the worst air pollution in the world. Li Qiong of CCTV reports that the city ranks third out of 113 cities worldwide in terms of air pollution levels (1). The city's phenomenal blanket of smog persists even as the government claims to have stepped up its efforts to control the pollution. This clearly shows that the government and the city authorities are not doing enough to reduce air pollution in Beijing.

The causes of pollution are mainly industrial and have become persistent with increased urbanization. Phys.org reports that the condition of the air in Beijing deteriorated as the city rapidly became an industrial complex, which increased the output of pollutants (2). Wang et al. report that the biggest cause of air pollution is particulate matter (PM), which they categorize into PM10 and PM2.5 (1). Particulate matter emanates from different natural and human processes and has differing effects on the safety of the air. The PRC government uses various measures to reduce air pollution in the city. Most of its strategies aim at cutting the emission of pollutants, with particular interest in reducing the use of coal. For instance, Wang T. et al. indicate that in 2008 the PRC government engaged in a series of controls aimed at achieving air clean enough for it to host the 2008 Olympic marathon, among them banning heavily polluting vehicles from the municipality and closing down a number of heavily polluting factories (Wang et al. 7603).

PM10 particles are particles suspended in the air with an aerodynamic diameter of less than 10 µm, while PM2.5 particles have an aerodynamic diameter of less than 2.5 µm (Wang et al. 1). PM10 particles may occur naturally, as with dust, mold, pollen, dirt, and spores, or may result from human activities, such as smoke. PM2.5 mainly emanates from toxic organic matter, smoke from the combustion of factory material, and heavy metals escaping from smelting furnaces (Wang et al. 1). The government faces various hurdles in achieving its air pollution goals. One challenge is non-compliance by producers who fail to observe the set production limits: a report by China Daily indicates that producers often exceed the legal limits of pollution, resulting in higher than projected pollution levels (1). The compliance problem is compounded by a lack of transparency in the setting of standards by the authorities; with low transparency, the standards set are open to compromise, making the process inefficient.

The only way the PRC government can overcome air pollution in Beijing is by acknowledging that most of its efforts so far have failed to achieve worthwhile results. The government also needs to accept that trying to resolve pollution issues only in the winter season, when the skies demonstrate to the whole world how polluted they are, is vain. All stakeholders should participate in drafting measures to tackle air pollution throughout the year, and the participation of factory owners and their management is essential in order to draw up a collective policy for handling air pollution problems.

Alles gives a comprehensive evaluation of extreme air pollution events for the period 2010-2013. From the report, it is clear that Beijing has polluted its air to an unprecedented degree. The full extent of air pollution in Beijing manifests itself in the winter season every year. Burgess reports that a thick blanket of smog, reducing visibility to 100 meters, characterizes Beijing winters (1). Measurements of air pollution levels indicate that the toxicity of Beijing's air exceeds all the limits set by the World Health Organization and other international bodies (Greenpeace 1).

According to Alles, Beijing's air pollution levels are consistently high during the winter. In January 2010, the recorded pollution levels in Beijing and many other Chinese cities exceeded an Air Pollution Index (API) of 100, with extreme conditions extending for long hours per day (Alles 3). The pollution levels exhibited by the concentrations of PM10 and PM2.5 imply that the Beijing air remained heavily polluted for long hours each day; Alles records that on "18 January 2010 = 143 avg. API PM10; PM 2.5 = 319 to 435 conc.; for 7hrs at 500 AQI" (Alles 4). Such pollution levels prevailed in most parts of China, with Shanghai recording the lowest PM10 concentration at 44 and Chifeng the highest at 343 (Alles 4).
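For context on how raw concentrations map to index values like those quoted, the widely used US EPA air quality index interpolates linearly between concentration breakpoints. The formula below is the EPA's standard one; applying it to the PM2.5 reading of 319 quoted above uses the EPA's 24-hour PM2.5 breakpoint table rather than anything in Alles's report:

\[
I = \frac{I_{hi}-I_{lo}}{BP_{hi}-BP_{lo}}\,(C-BP_{lo}) + I_{lo},
\qquad
I = \frac{400-301}{350.4-250.5}\,(319-250.5) + 301 \approx 369 .
\]

Here \(C\) is the measured concentration, \(BP_{lo}\) and \(BP_{hi}\) bracket it in the breakpoint table, and \(I_{lo}\), \(I_{hi}\) are the corresponding index bounds; a 24-hour PM2.5 concentration of 319 µg/m³ thus lands deep in the "hazardous" 301-400 band, consistent with the near-500 AQI episodes Alles describes.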

The most toxic component of the pollution is PM2.5 particles, which are capable of causing serious respiratory problems when inhaled. The toxicity of PM2.5 stems from the particles' small size, which allows them to adhere to the lungs, posing health risks (Center for Chinese Studies; Burgess 1). The premise that air pollution in Beijing has grown persistent and is slipping out of control might be countered by arguing that smog only becomes problematic during the winter season: reports on extreme air pollution all cover the winter, when the prevailing weather conditions and the heavy use of coal exaggerate the overall picture. "The PM2.5 mass concentration peak during February was most likely due to emissions from coal consumption for heating purposes…this was the month with the lowest temperatures and slowest winds in 2011" (Wang 5).

Greenpeace indicates that PM2.5 particles carry traces of "toxic heavy metals, acid oxides, organic pollutants and other chemicals, as well as microorganisms such as bacteria and viruses" (2). This makes PM2.5 a particularly hazardous form of air pollution. Greenpeace reports that modern toxicology research findings consistently show that the "heavy metals and PAHs (polycyclic aromatic hydrocarbons) carried by PM2.5 can enter and deposit in human alveoli, causing inflammation and lung diseases" (1).

Other than affecting the lungs, PM2.5 also affects the functioning of the human circulatory and cardiovascular systems. This implies that “exposure to PM2.5 can lead to significantly increased mortality due to cardiovascular, cerebrovascular and respiratory diseases, as well as greater cancer risks” (Greenpeace 1).

Although not as toxic as PM2.5, PM10 particles affect people and the environment in equally disastrous ways. The most evident effect of PM10 particles is their influence on visibility. Burgess reports that smog reduced visibility in Beijing so severely in April 2012 that over 150 flights to and from the city were cancelled (1).

Wang T. et al. observe that the levels of air pollution in Beijing were still high in 2008. They also indicate that the air condition had the potential to affect economic activities in the city: an outlook on the Beijing 2008 Olympic marathon revealed that concerns about the weather were bound to increase among both local citizens and international consumers (Wang et al. 7603). Wang et al. indicate that some pollutants fell in concentration after the government adopted control measures to reduce air pollution in anticipation of the Olympics: "Vehicle-related nitrogen oxides (NOx) and volatile organic compounds (VOCs) at an urban site dropped by 25% and 20–45% in the first two weeks after full control was put in place" (7603).

Some pollutants, however, have remained consistently high despite the government's efforts to curb air pollution. For instance, Wang T. et al. report that the concentration "levels of ozone, sulfate, and nitrate in PM2.5 particles increased by 16%, 64%, 37%, respectively, compared to the period prior to the full control" (7603). This indicates that these pollutants increased in the very period in which the government was spearheading pollution control programs in preparation for hosting the 2008 Olympics.

Wu et al. add further evidence that Beijing has a long-standing history of heavily polluted air. They segment the incidences of air pollution by the sources of the pollutants, observing that local sources dominate pollution in the surface layer (up to 30m in height), accounting for "65 % of SO2, 75 % of PM10 and nearly 90% of NO2" (5997). Pollutants observed in the higher layer (above 1.1km), on the other hand, emanate from neighboring regions to the south of Beijing and make up "more than half of the SO2 and PM10 concentrations" (Wu et al. 5997).

The PRC government uses various measures to reduce air pollution in the city. Most of the strategies laid down by the government aim at cutting the emission of pollutants, with particular interest in reducing the use of coal. For instance, Wang T. et al. indicate that in 2008 the PRC government engaged in a series of controls aimed at achieving air clean enough for it to host the 2008 Olympic Games. Among the government's strategies for controlling air pollution in 2008 were banning heavily polluting vehicles from accessing the municipality and closing down a number of heavily polluting factories (Wang et al. 7603).

The long-term solution for controlling air pollution may lie in replacing coal as the primary source of energy for Beijing's industries. This is a position that the government understands fully and intends to execute within the shortest time possible. CCTV reports that the government seeks to initiate a program that replaces coal heating in the domestic sector and adopts technologies that reduce pollutant emissions from the combustion of coal (1). This is a sustainable step towards a long-term strategy. Natural gas, apart from being cleaner than coal, has a higher energy content, implying that its use will achieve results in both environmental and economic terms (CCTV 1).

The government faces various hurdles in achieving its goals as far as reduced air pollution is concerned. One challenge is non-compliance: a report by China Daily indicates that producers often exceed the legal limits of pollution, resulting in higher than projected pollution levels (1). Compliance problems are compounded by a lack of transparency in the setting of standards by the authorities. With low transparency, the standards set are subject to compromise, resulting in inefficiencies in the process.

The only way the PRC government can overcome air pollution in Beijing is by acknowledging that most of its efforts so far have failed to achieve worthwhile results. The government must also accept that it is futile to address pollution only in the winter season, when the skies demonstrate to the whole world how polluted they are. All stakeholders should participate in drafting measures to tackle air pollution throughout the year. The participation of factory owners and their management is essential in order to draw up a collective policy for handling the air pollution problem.

An argument that smog only becomes problematic during the winter season may seem to refute the premise that air pollution in Beijing has grown persistent and is slipping out of control. Reports on extreme air pollution all cover the situation during the winter season, when the prevailing weather conditions and the heavy use of coal exaggerate the real picture: "The PM2.5 mass concentration peak during February was most likely due to emissions from coal consumption for heating purposes…this was the month with the lowest temperatures and slowest winds in 2011" (Wang 5).

While it is plausible to argue that air pollution becomes noticeable during the winter season alone, it would be illogical to argue that air pollution only occurs in the cold months. Although coal use increases during the winter, since most people use it for heating their houses, coal is the primary source of energy in China, and factories and small-scale domestic users burn it throughout the year. Greenpeace reports that industrial use of coal remains high and accounts for over fifty percent of total coal use in China (13). Even in generating other forms of power, China's power companies use coal, albeit employing cleaner methods than the industrial boilers (Greenpeace 13).

Considering that levels of air pollution in Beijing exceed the pollution thresholds set by the World Health Organization by as big a margin as recorded by Burgess, it is plausible to argue that even in other seasons Beijing's air is still heavily polluted. Burgess reports that official readings for PM2.5 in Beijing in early 2013 "suggested pollution levels of over 400 micrograms (µg) per cubic meter, while an unofficial reading from the US embassy monitors recorded levels of over 800µg" (1). These readings are nowhere near the WHO guidelines, which stipulate that countries should keep the average concentration of PM2.5 particles in the air at a maximum of 25µg per cubic meter and which deem air above 100µg/m3 unhealthy (Burgess 1). This implies that air pollution in Beijing is high not only during the cold seasons but throughout the year. Arguably, it is the favorable weather conditions that help conceal the extent of air pollution during other seasons.
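As a hedged illustration of the arithmetic (the thresholds and readings are the ones quoted above from Burgess; the snippet itself is only a worked example, not part of any cited source), the gap between these readings and the guideline can be made explicit:

    # Worked example: compare the quoted Beijing PM2.5 readings against the
    # WHO guideline figures cited above (all values in µg per cubic meter).
    WHO_GUIDELINE = 25   # maximum average PM2.5 per the quoted guideline
    UNHEALTHY = 100      # level above which the air is deemed unhealthy

    readings = {"official reading, early 2013": 400,
                "US embassy monitor": 800}

    for label, value in readings.items():
        factor = value / WHO_GUIDELINE
        status = "unhealthy" if value > UNHEALTHY else "within limits"
        print(f"{label}: {value} is {factor:.0f}x the guideline ({status})")

Even the lower, official reading sits sixteen times above the guideline, which is the quantitative core of the argument that the pollution cannot be a winter-only phenomenon.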

Air pollution in Beijing may persist for a long time, as there are no solid indicators that China will resolve it any time soon. Many factors work together to ensure that pollutants will characterize Beijing's skies for at least as long as it takes to completely replace coal energy use in the city. Although the government is working towards reducing pollution levels, there are grounds to believe that it could achieve more if it involved producers in the process of achieving environmental sustainability. With the winter season setting in, it is only a matter of days before the world knows whether the phenomenal smog blanket will engulf Beijing again.

The Burgess report indicates that there is a possibility that air pollution has not increased in the recent past but has rather remained static (2). The report theorizes that the recent move by the PRC government to allow the public access to more national data may contribute to the impression that pollution is high. The argument is that the levels of air pollution in China have always been high but concealed, and that it is the sudden public awareness of pollution issues that makes the situation appear to have veered out of control (2). Burgess even suggests that air pollution in Beijing might have abated in recent years (2).

Works Cited

Alles, David L., ed. "Extreme Air Pollution Events in Beijing China 2010 & 2013." Western Washington University, 24 Apr. 2013. Web. 21 Nov. 2013. <http://fire.biol.wwu.edu/trent/alles/ExtremeAirPollutionEventsBeijing.pdf>.

Burgess, Meryl. "Beijing Smog: An Annual Affair." CCS Commentary, 18 Feb. 2013. Web. 21 Nov. 2013. <http://www.ccs.org.za/wp-content/uploads/2013/02/CCS_Commentary_Beijing_Smog_an_Annual_Affair_MB.pd>.

CCTV.com. "Beijing Starts Shift from Coal to Gas." CCTV.com, 8 Nov. 2013. Web. 21 Nov. 2013. <http://english.cntv.cn/program/china24/20131108/101036.shtml>.

China Daily. "Polluters Still Flouting Law: Inspection." China Daily, 19 Nov. 2013. Web. 21 Nov. 2013. <http://www.chinadaily.com.cn/m/hebei/2013-11/19/content_17115404.htm>.

Greenpeace. Dangerous Breathing. PM2.5: Measuring the Human Health and Economic Impacts on China's Largest Cities. Dongcheng, Beijing: Greenpeace, 2013. Print.

Phys.org. "Smog-Blanketed Beijing Urges Residents to Stay Indoors." Phys.org, 30 Jan. 2013. Web. 2013. <http://phys.org/pdf278741284.pdf>.

Qioing, Li. "Air Pollution: Top Concern in Beijing." CCTV.com, 29 Jan. 2013. Web. 21 Nov. 2013. <http://english.cntv.cn/program/china24/20130129/103339.shtml>.

Wang, Jin-Feng, et al. "Estimation of Citywide Air Pollution in Beijing." PLoS ONE 8.1 (2013): e53400. Web. <http://www.researchgate.net/publication/234134928_Estimation_of_citywide_air_pollution_in_beijing/file/d912f50f932ab97688.pdf>.

Wang, Tao, et al. "Air Quality during the 2008 Beijing Olympics: Secondary Pollutants and Regional Impact." Atmospheric Chemistry and Physics 10.16 (2010): 7603-7615. Print.

Wu, Q. Z., et al. "A Numerical Study of Contributions to Air Pollution in Beijing during CAREBeijing-2006." Atmospheric Chemistry and Physics 11.12 (2011): 5997-6011. Print.

Evaluation of Meta-analysis

Meta-analysis

Name

Institution

Course

Date

Executive summary

Although the advantages of using meta-analysis seem obvious, criticisms have been raised. These criticisms concern the possibility of introducing bias into the sampling of findings and the emphasis placed on individual effects. There are various disadvantages associated with meta-analysis, as witnessed in this research. This paper analyses a meta-analysis of psychodynamic psychotherapy outcomes and evaluates the effects of research-specific procedures.

The first weakness of the DeCoster (2009) analysis is personal bias in choosing which existing studies to include. As DeCoster (2009) illustrates, there is no single database that includes all empirical studies on the topic of interest, and no computer-assisted search can identify every journal article on the chosen topic. Many good studies are unavailable simply because they were never published. There may also be a publication bias; that is, significant results are more likely to be published, whereas non-significant results are relegated to file drawers. Meta-analysis researchers in the DeCoster (2009) paper need to set a clear and consistent inclusion standard and to make a concerted effort to include all valid studies that meet it.

As a recommendation, the researchers behind the DeCoster (2009) paper must also avoid personal bias in deciding which studies from the literature to include in the analysis (Joel et al., 2012).

The second drawback of meta-analysis in DeCoster (2009) originates from the huge variation among existing studies. Even on the same research topic and question, existing empirical studies may vary considerably in theoretical foundations and methodological issues, such as sampling strategy, measurement of the variables of interest, data analysis techniques, and reporting formats and contents. It is evident that considerable differences exist among studies.

Evaluation of Meta-analysis

Various scholars have defended meta-analysis by stating that it helps synthesize disparate research. They argue that even though past studies vary in methodology, a well-designed meta-analysis accounts for such variation by treating it as moderator variables. Meta-analysis researchers ought to be careful when aggregating studies with different participants and sampling methods. They should also consider how the variables of interest were operationalized and measured, which was not done in the DeCoster (2009) paper. When combining studies, scholars should be attentive to appropriate moderator variables that can cause alterations in research outcomes (Joel et al., 2012).
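To make the moderator idea concrete, here is a minimal, purely illustrative sketch (the studies, effect sizes, variances, and grouping variable below are invented for demonstration and are not drawn from DeCoster (2009)): effect sizes are pooled within each level of a suspected moderator instead of being mixed blindly.

    # Hypothetical sketch: treat a study characteristic as a moderator by
    # pooling inverse-variance-weighted effect sizes within each subgroup.
    studies = [
        # (effect size, sampling variance, moderator level)
        (0.50, 0.020, "randomized"),
        (0.45, 0.030, "randomized"),
        (0.15, 0.020, "non-randomized"),
        (0.10, 0.040, "non-randomized"),
    ]

    def pooled_effect(group):
        # Larger (lower-variance) studies receive more weight.
        weights = [1.0 / var for _, var, _ in group]
        effects = [d for d, _, _ in group]
        return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

    for level in ["randomized", "non-randomized"]:
        subgroup = [s for s in studies if s[2] == level]
        print(level, round(pooled_effect(subgroup), 3))

If the two subgroup means differ markedly, the grouping variable is a candidate moderator rather than noise, which is exactly the variation the paragraph above says a well-designed meta-analysis should model instead of averaging away.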

The other drawback of meta-analysis in the DeCoster (2009) paper is its dependence on the individual effects of various predictors on a dependent variable. In the DeCoster (2009) paper, meta-analysis systematically measures only individual relations between dependent and independent variables and cannot give a broad picture. Rosenthal and DiMatteo (2001) argue that this simple, systematic approach is essential in most research domains, stating that individual effects and correlations provide a foundation for building a comprehensive model that integrates many individual variables. In addition, meta-analysis tends to be a powerful tool for examining the combinations and interactions of individual predictor variables; such analysis is an essential condition for realizing multi-level and multi-factorial models. Meanwhile, meta-analysis scholars working with the DeCoster (2009) paper ought to be aware of the loss of information when they concentrate on one effect at a time in the analysis, while also taking into consideration probable interactions among predictor variables.

Internal Validity

Although the list of threats to validity in DeCoster (2009) may seem devastating, there is good reason to continue (Joel et al., 2012). Unlike the pilot's checklist, on which any item overlooked could spell disaster, it is generally agreed by even the most finicky statisticians that a meta-analysis with a shortcoming or two in validity is not an automatic failure; to fail, its validity problems must be serious or multiple. Nevertheless, after looking at the validity threats to meta-analysis, one can only agree with Ingram Olkin that doing a meta-analysis may be easy, but doing one well is hard.

Whether the implementation of the primary studies and the meta-analysis justifies the claims the researchers are making should be assessed; that is, whether the primary studies and meta-analysis actually test what the researchers say is being tested. Over a dozen threats to internal validity have been identified in the DeCoster (2009) paper. In primary studies, most can be avoided by the use of proper methods, with random assignment of subjects to treatment and control conditions being the most obvious. In meta-analyses, scholars can protect against most of the key threats to internal validity by using only primary studies that employ randomization or make statistical adjustments compensating for its lack, an action that was not carried out in the DeCoster (2009) paper.

Two other significant threats to the internal validity of the meta-analysis in the DeCoster (2009) paper are incomplete literature searches, which produce biased samples of studies, and unreliable coding of information from the various studies. These threats can be guarded against by more complete methods of collecting studies and by the checks on coding discussed earlier, actions that were never carried out.

Number of studies

Piot (2003) points out the possibility that poorly conducted studies included in his previously published meta-analysis could have artificially inflated or deflated the magnitude of the aforementioned effect sizes. However, due to the limited number of studies in the 2001 meta-analysis, Piot (2003) was unable to statistically test the impact of study quality on effect size. To address this issue, Jones (2006) re-analysed the now larger video game literature and identified studies that contained few methodological flaws. In order to identify methodologically sound studies, Kimball (2006) made use of a nine-item coding scheme. Included within this scheme were the following weaknesses: a nonviolent video game condition that involved the playing of a game that actually contained violence; a violent video game condition that contained little violence; differences between the violent and nonviolent conditions in terms of difficulty, frustration level, or generated interest; and so on. When Owens (2006) limited the revised meta-analysis to best-practice studies, the effect sizes for aggressive behaviour, aggressive cognitions, hostile affect, and physiological arousal all increased. This indicates weaknesses of meta-analysis and thus bears on its applicability in DeCoster (2009).

The quantitative process of the DeCoster (2009) meta-analysis can be employed to address the challenges posed by the presence of diverse research findings on a particular question. It allows researchers to combine numerical results from a number of studies to accurately estimate descriptive statistics. The researchers are also able to explain the inconsistencies of findings in the literature and to discover moderator variables for a dependent variable of interest (Rosenthal & DiMatteo, 2001). The major strength of the DeCoster (2009) meta-analysis comes from its capacity to help researchers reach accurate and credible conclusions that other research approaches, such as a single primary study or a qualitative or narrative literature review, cannot provide (Joel et al., 2012). The meta-analytic research strategy nonetheless has both benefits and drawbacks.
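For concreteness, the basic arithmetic behind such combining can be sketched as follows (a fixed-effect, inverse-variance model with hypothetical numbers; this illustrates the general technique, not DeCoster's (2009) actual procedure):

    # Minimal fixed-effect meta-analysis: pool per-study effect sizes into
    # one estimate, weighting each study by its precision (1 / variance).
    effects = [0.42, 0.31, 0.55]        # hypothetical per-study effect sizes
    variances = [0.020, 0.015, 0.030]   # hypothetical sampling variances

    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5  # standard error of pooled estimate

    print(f"pooled effect = {pooled:.3f} (SE = {pooled_se:.3f})")

Weighting by the inverse variance is what the later discussion of weighting studies' results by sample size refers to: precise (large) studies pull the pooled estimate harder than imprecise ones.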

Moderator check

Lastly, another drawback of meta-analysis evident in the DeCoster (2009) paper is its limited capability of including variables that are dramatically dissimilar from current theory (Yang, 2002). Meta-analysis researchers cannot form new theoretical ideas beyond the variables and study characteristics that have been examined in existing studies, and they are unable to discover different effects unless existing studies have reported the relevant features (Joel et al., 2012). Consequently, a meta-analytic approach to theory building tends to be more applicable to a "research-then-theory" than a "theory-then-research" strategy of theory building. Meta-analysis, consequently, has its constraints in developing and validating pioneering theory.

Reliability check

The principal difficulty of meta-analysis, as witnessed in DeCoster (2009), is individual bias in choosing which existing studies to include in the analysis. In DeCoster (2009), there is no one database that contains all empirical studies under examination, and no computer-assisted search can identify all journal articles on the research topic. Various studies are unavailable because they were never published, so there can be a publication bias; that is, significant results have a higher chance of being published, while non-significant outcomes are relegated to file drawers (Rosenthal, 1979). Meta-analysis scholars should set a clear and reliable standard for including empirical studies and make a concerted effort to include all valid studies that meet these conditions. Researchers should also avoid individual bias in determining which studies from past research to include in the analysis.

The other drawback of meta-analysis in DeCoster (2009) comes from the large variation among existing research. Even on a similar research theme and question, current empirical studies may vary considerably in theoretical foundations and methodological issues, for example sampling approach, measurement of the variables of concern, data analysis methods, and reporting formats and subjects (Joel et al., 2012). It is evident that there are substantial differences in quality among published studies. Subsequently, some scholars have argued against the practice of meta-analysis for mixing good and bad studies.

Other criticisms have been raised concerning the treatment of various kinds of studies as the same, likened to mixing different research designs (Hunt, 1997). Rosenthal (2001), though, defends the combination of dissimilar studies. The argument is that even though studies differ methodologically, a well-designed meta-analysis accounts for such variance by treating it as moderator variables. When combining studies, researchers should pay attention to appropriate moderator variables that can cause variances in research outcomes.

External Validity

Hall and Rosenthal (1995) suggest three basic principles to guide meta-analysis, and certain processes are vital to a meta-analytic study. A typical meta-analysis has the following steps:

Define the variables of interest, and formulate the research questions. This is evident in the DeCoster (2009) paper.

Search the literature, and identify adequate empirical studies in a systematic way. Though there is a thorough literature search in DeCoster (2009), the search is biased.

Construct validity has to do with whether the measure used to appraise the outcome is a trustworthy indicator of effect or whether it distorts the true intervention-outcome connection. The DeCoster (2009) meta-analysis, for example, examined how well a mental test of job abilities correlated with individuals' actual job performance. Of the measures used in the primary studies, the mental test was known to be fairly reliable, while the job performance rating, based on a single supervisor's opinion in each case, was generally unreliable.

The last group deals with threats to statistical validity in DeCoster (2009). These include errors in the way the data analysis is carried out. At the primary-study level, such errors comprise using statistical tests that are unsuitable for the kind of data, capitalizing on chance, and not reporting statistical tests that were done. At the meta-analytic level, statistical drawbacks include unsuitable assumptions when effect sizes must be estimated, bias in converting effect sizes, and failing to weight the studies' results by sample size and other conditions.

One of the most widely cited meta-analytic studies on media violence is Anderson and Berkowitz's (2003) systematic review of over 215 empirical studies. Each of the individual studies included in this meta-analysis assessed the negative effects of violent imagery seen in movies and on television on aggressive behaviour. This proved to be an important piece of work, as the impact of violent television and movies on aggressive behaviour produced medium-sized effect sizes, far larger than many critics had suspected. More recently, Funk et al. (2002) conducted a systematic review of the impact of violent video games on aggression and aggression-related constructs using 35 independent research projects. The results of this meta-analysis provided empirical support for the contention that violent video games influence aggressive behaviour. This shows the strength of meta-analysis and thus its applicability in DeCoster (2009) (Joel et al., 2012).

The other demerit of meta-analysis in DeCoster (2009) and elsewhere is its dependence on the individual effects of various predictors on a dependent variable (Joel et al., 2012). Meta-analysis systematically measures only individual relations between dependent and independent variables and cannot afford a broad picture. Rosenthal (2001) argues that this simple, methodical approach is important in the majority of research domains, stating that individual effects and associations give a foundation for erecting a complete model that assimilates many individual variables.

Theoretical contribution

One advantage of the DeCoster (2009) meta-analytic design is its capacity to integrate and synthesize current empirical studies bearing on a research question. Meta-analysis lets investigators assimilate the present findings with sophisticated tools, for instance combined tests. Because different existing studies may come from various empirical areas, a combined test cumulates the existing findings in a scientific way and thus presents the results with more generalizability (Joel et al., 2012). Researchers understand that it is crucial to conduct a literature review, yet they often encounter inconsistent or even conflicting findings. A qualitative or narrative review of the literature cannot deal with such findings, and thus such a review can sometimes be quite confusing.

The DeCoster (2009) meta-analysis provides a cumulative view of a specific research topic by carefully analyzing similarities and differences of methodologies and findings across many studies. In other words, meta-analysis aims at getting the whole picture. By coding existing studies quantitatively, meta-analysis researchers can keep track of a large amount of potential information and then conduct a more detailed analysis (Joel et al., 2012). The DeCoster (2009) meta-analysis can easily summarize multiple variables from hundreds of studies that most narrative reviews cannot handle, which is a clear advantage. In addition, meta-analysis allows researchers to examine a wider range of relationships, interactions, and other complex analyses that are normally not possible with qualitative research techniques.

Moderator

The second benefit of the DeCoster (2009) meta-analysis for getting good research outcomes originates from its nature as an analysis of analyses. The DeCoster (2009) meta-analysis does not only cumulate outcomes from individual studies but can also be used to test multifaceted theories involving various variables (Joel et al., 2012). Since social and administrative phenomena are complicated, diverse theories from different domains have been put in place to help explain them, and there may be various competing philosophies or theoretical frameworks in one research domain. For instance, scholars can recognize various predictors of the efficiency of training in firms, including the design of training, various methods of training, ability or task features, and evaluation features (Arthur et al., 2003). The DeCoster (2009) meta-analysis gives an essential method of estimating the relative effects of the predictors on the dependent variable, and this yields aggregated empirical results for studying and judging existing studies.

Another strength of the DeCoster (2009) meta-analysis is its ability to offer strategies for the selection of variables and the design of coming studies. Meta-analysis surveys the chosen literature with its empirical evidence, and such an overview has various uses. For instance, investigators can use the data to reflect on present designs and identify promising variables for future research. The researchers in DeCoster (2009) used meta-analysis to develop new theoretical ideas founded on the empirical evidence revealed in the meta-analysis, for example moderator and interaction effects (Joel et al., 2012). In summary, meta-analysis permits researchers to improve theoretical ideas based on the likely attributes and features of all available current studies. That is to say, meta-analysis can follow a research-then-theory strategy of building theory (Reynolds, 1971). Compared with other strategies, the chief advantage of meta-analysis is that it is based on a number of proven empirical studies, as witnessed in DeCoster (2009); it rests on a body of published works and not only one piece of research.

The other advantage of employing meta-analysis as a research method originates from its function in the continuous development and refinement of existing theory. Through discovering and examining the important moderators and likely interaction effects, meta-analysis gives solid grounds for including other proven variables or discarding old, less significant variables in existing theories and theoretical models. Even though the merits of employing meta-analysis are obvious, criticisms have been raised, including the possibility of introducing bias into the sampling of findings (Joel et al., 2012). Meta-analysis researchers ought to be alert to the drawbacks connected with this method.

When weighing the current developments in the field of self-regulated learning on the one hand against the strengths and restrictions of meta-analysis on the other, some implications for additional research become apparent.

The outcomes of the DeCoster (2009) meta-analysis alone do not give a sufficient basis for making complete recommendations for an intervention. Obviously, other evidence not included in DeCoster (2009) is also needed (Joel et al., 2012). Subsequent primary research should re-evaluate the integrative outcomes of the meta-analysis in an experimental setting. An intervention study ought to be established which considers the outcomes of the meta-analysis of training studies dealing with the improvement of educational self-regulation. In such a study, an intervention with ideal characteristics ought to be assessed.

In the same way, the DeCoster (2009) meta-analysis ought to be considered an empirical study which synthesizes individual experiments quantitatively on the basis of previous study objectives. Because meta-analyses are founded on a larger sample size, the analysis tends to be more exact and dependable than any of the primary studies that it assesses.

With regard to further exploration of the concept of self-regulated learning, it would be interesting to conduct a meta-analytical evaluation of the training studies. The evaluation should be aimed at fostering self-regulated learning. Furthermore, the evaluation should also be designed to synthesize, on a smaller scale, the experimental research on how learners acquire self-regulated learning skills. The main aim is to investigate single strategies in more depth, an action that was thoroughly carried out in DeCoster (2009). This is one of the greatest strengths of the DeCoster (2009) meta-analysis (Joel et al., 2012).

Further, in order to advance the scope of research in DeCoster (2009)'s field, conclusions established in adjacent research areas should be considered. The meta-synthesis of Hattie (2009), for instance, discovered that the effect of teachers on learners is bigger when comparing the combined influences of learner, teacher, instructional, and school factors. This outcome should be considered in research on learners' attainment of self-regulation by examining the effect of teacher feedback on learners' self-regulation. In the assessment of training programs, these issues should be incorporated into the interventions meant to foster self-regulated learning.

Meta-analysis is no longer a new method. Among the main difficulties associated with meta-analysis is its dependence on the primary studies involved. The information as reported by the authors is often not complete; for instance, information on certain important decisions taken may not be revealed (Joel et al., 2012). One way of solving this problem could be for authors to provide the meta-analyst with their raw data when their study is published (see e.g., Glass, 2000; Hattie et al., in press). This would allow reviewers to work with these primary data and carry out additional analyses, whereby at least some of the limitations of meta-analyses mentioned earlier could be overcome.

Until more sophisticated methods of synthesizing literature are established, the DeCoster (2009) meta-analysis approach can offer an important overview of the state of the art in a particular research field. It can also provide useful information about the effectiveness of interventions or, more generally, about the relationship between two variables under investigation, as witnessed in DeCoster (2009). Especially in a growing and evolving research field such as the area of self-regulated learning, procedures are needed that can structure the existing body of evidence and help to find comprehensive answers, while also facilitating the formulation of new hypotheses for future studies.

Interpretation of findings

In analysing the meta-analysis for validity, Rosenthal and Rubin ensured that their outcomes would not leak away under close examination. The validity of a meta-analysis refers to the soundness of the original studies and the processes used in combining the data, and a minimum of three dozen possible validity leaks were identified in the processes. Studying all the likely faults, one may feel that the entire enterprise in the DeCoster (2009) paper is hopeless, but the situation is no worse, to change metaphors, than that of an aircraft pilot reading a preflight checklist (Joel et al., 2012).

References

Yang, B. (2009). Research in organizations (c. 21). Business & Economics / Management Science. Berrett-Koehler Publishers.

Swanson, R. (2005). Research in organizations: Foundations and methods of inquiry. The Berrett-Koehler Organizational Performance Series. Berrett-Koehler Publishers.

Diana, B. (1999). Meta-analysis, decision analysis, and cost-effectiveness analysis: Methods for quantitative synthesis in medicine (Monographs in Epidemiology and Biostatistics, Vol. 31). Oxford University Press.

Polit, D. F., & Cheryl, T. (2013). Essentials of nursing research: Appraising evidence for nursing practice (8th ed., rev.). Lippincott Williams & Wilkins.

Ringquist, E. (2013). Meta-analysis for public management and policy. John Wiley & Sons.

Cooper, H. M., & Hedges, L. V. (1994). The handbook of research synthesis (Vol. 236). Russell Sage Foundation.

Joel, M., Marc, J., & Allan, A. (2012). A meta-analysis of psychodynamic psychotherapy outcomes: Evaluating the effects of research-specific procedures.

Evaluation essay

Name

Professor

Course

Date

Evaluation essay

Introduction

People have differing views regarding the characteristics of a musician. For one, an accomplished musician should have superior technical and aural skills. Additionally, an accomplished musician should be able to blend with the audience and create a rapport. However, the most notable characteristic of a good musician is the ability to stay in control of his performance.

Controlling the audience

An accomplished musician should be in control of his concerts. One of the most outstanding things is that an accomplished musician should not be moved emotionally by the audience. James can make his audience cry joyously or frighten them; however, he must be able to control his own emotions. He does not feel any of these during a performance. He is moved by his performance and consumed by its flow, but he controls everything. It may be detrimental for a musician to be emotionally moved by the crowd. James Morrison has fire in his voice but ice in his veins. This is what keeps him going even when the audience is dropping. During the launch of his track "One Life", he moved women to tears, men dropped, and pandemonium broke loose. He kept on singing until the end of the show, not moved a bit himself.

Technical and aural skill

While a talented musician should be proficient at playing a number of instruments, the policy of division of labor is simple: one does what he does best. Specialization is tremendously valuable. It is necessary for players not only to know the right tunes but to play them, by playing the right tune at the right time. This is mostly demanded in the orchestra, but it applies to all fields. Once players know when to play the right tune, they can do it naturally. The examples on point are Elvis Presley and James Morrison. Neither of these players ever went to a music school; they play the guitar with a natural gusto. In his hit single, "You Give Me Something", James sets the tempo with his syrupy backing track of guitar, piano, and brass. He knows how to set the tempo with his guitar riffing, and he can easily improvise notes and tunes without creating discord in his performance.

Ability to create a rapport

For a musician to excel in his performance, he must know how to blend with the crowd. The audience is an extremely significant factor in the success of a musician; for one, the musician must be able to determine the mood of the crowd and leverage it to his advantage. Rapport is extremely valuable. Once James has built a rapport with his audience, he knows which tracks to start with to set the mood, and once in gear, he introduces other lines to create a mood swing in his audience. The chemistry is not something imaginable or practicable in advance, so it is never rehearsed before the concert; it comes naturally. This may have something to do with his extroverted nature, but it rests not on personality characteristics but on psychology.

Choice of chords and style

A musician should be able to make the right choice of chords to capture the crowd; however, the style of accompaniment is also noteworthy, though this is a matter of personal taste. It is also necessary to choose the right backing: a mixture of common chords may not move the audience, but chords chosen well are likely to endear the performance to the crowd. One can decide to use the common three-chord trick, but additional voicings and relevant minor chords, in addition to modal chords, are particularly appropriate in music.

Decision criteria

While the criteria are many, the choice of the best criterion depends mainly on who is evaluating whom. If it is the crowd evaluating the singer, then the ability to create a rapport is much more valuable. This is because the singer easily identifies with the audience and provides them with exactly what they want: just the right chemistry to keep them engaged and committed. Some of the self-confessed diehard fans of James Morrison say that they love him because of the chemistry between him and his audience.

On the other hand, if an event management company carries out the evaluation, then the ability to control the audience is extremely beneficial, because the event management team is much more interested in ensuring that the audience is moved. If an artist moves the audience, this may ensure the longevity of contracts and shows. However, if the artist is self-evaluating, he is likely to consider rapport and technical and aural skill as well as the ability to control the crowd. The more easily the artist can tap the audience's emotions, the better, because at the end of the day the audience is the X factor.

Conclusion

The ability to create and develop a rapport with the audience or crowd is the most notable criterion for selecting the best artist. The audience is the reason the show goes on; the revenue comes from the audience, and this makes rapport the most significant criterion.

Evaluation Of Methods And Introduction Of Complementary Research Devices To Improve Research Robustness

Evaluation Of Methods And Introduction Of Complementary Research Devices To Improve Research Robustness

Abstract

According to Kisfalvi (2003), entrepreneurship has been conceptualized in a number of ways in the literature, but common ground has still not been found. Social scientists have embarked on studying the effects and causes of entrepreneurial actions to find the essence of entrepreneurship (Kisfalvi, 2003). Research on 'true' entrepreneurial behavior is scarce, though researchers agree that investigating this field could help solidify entrepreneurship theory, sharpen its delimitation from management, and lay the groundwork for studies on entrepreneurial efficacy and competence.

Referring to Edmondson & McManus (2007), this paper proposes that a research design based on field methods, in particular methods of observation, can be used to study and consequently code and evaluate behavioral scenarios in entrepreneurship. The method for studying managerial behavior postulated by Henry Mintzberg (1968) proves a valuable starting point; however, an evaluation of the method based on Yin (1998) and Gibbert et al. (2008) shows an apparent lack of rigor in his procedure. Consequently, the research design is updated and remedies as well as complementary methods are introduced. These measures are integrated into a research approach which enables the generation of reliable and valid data on entrepreneurial behavior.

Table of Contents

List of Abbreviations
1 Introduction
2 Conceptual foundations
2.1 The job of the entrepreneur as research focus
2.2 Methodological fit of systematic observation in entrepreneurial settings
2.3 Key concepts of scientific observation
2.4 Using observation to generate data
3 Evaluation of Mintzberg's observation approach
3.1 Internal validity
3.2 Construct validity
3.3 External validity
3.4 Reliability
3.5 Attainments of Mintzberg's work
4 Measures to improve rigor of systematic observation
4.1 Clear definitions, epistemological foundations as well as pattern matching to improve internal validity
4.2 Social learning theory as theoretical foundation for the research approach
4.3 Multi-rater methods, in particular Delphi processes to code behaviour
4.4 Methodological triangulation to enhance construct validity
4.5 Larger sample size and clear rationale for case selection to enhance external validity
4.6 Comprehensive and comprehensible documentation to improve reliability
4.7 Overview of approach to studying entrepreneurial behaviour with systematic observation
5 Conclusion
5.1 Summary
5.2 Directions for future research
References

List of Abbreviations

e.g. — exempli gratia (for example)
et al. — et alii (and others)
i.e. — id est (that is)
p. — page
pp. — pages

Introduction

The purpose of this paper is to analyze, evaluate and improve the methodological rigor of field research for deployment in entrepreneurial settings. Field research is defined as "systematic studies that rely on the collection of original data in real organizations" (Edmondson & McManus, 2007, p. 1155). This paper mainly focuses on systematic observation techniques, complemented by further research devices to gather original data, and on a scientific process to generate activity categories. These methods fit the study of a research gap in entrepreneurship particularly well (see Section 2.2). Systematic observation has been used to explore and analyse a variety of subject matters, such as educational, healthcare and military institutions as well as political and economic actors (Sulsky & Kline, 2007; Yukl, 2005).

Henry Mintzberg pioneered the deployment of direct systematic observation in his seminal PhD thesis on managerial behaviour, "The Manager at Work – Determining His Activities, Roles, and Programs by Structured Observation" (Mintzberg, 1968). Since then, researchers have repeatedly relied upon systematic observation as a method to study and understand managerial behaviour (Kotter, 1982; Kurke & Aldrich, 1983; Luthans, 1987; Tengblad, 2001a). Brown & Hanlon (2004), Gartner (1989) and Schwehm (2007) argue that entrepreneurship is a young discipline in which the actual activities, tasks and behaviours of entrepreneurs have not yet been studied comprehensively, leaving a research gap which could be addressed by systematic analysis of data from systematic observation (Brown & Hanlon, 2004; Gartner, 1989; Schwehm, 2007).

The incorporation of typical advantages of case study research, such as the coverage of events in their natural context and in real time and the generation of detailed, voluminous evidence (Yin, 1998), has not saved the situation, because observation approaches have still been criticized for several reasons in the literature. Brown & Hanlon (2004), Hales (1986) and Martinko & Gardner (1985) note that the limitations of observational research normally relate to problems of validity, lack of reliability checks, subjective coding methodology and a number of other conceptual problems. The underlying research question for this chiefly methodological paper is thus:

R: How can we address the limitations of observational field research while preserving its methodological advantages?

To answer this question in the context of management and entrepreneurship, three main objectives follow. First, direct, systematic observation has been widely used to study managers but not to study entrepreneurs (Brown & Hanlon, 2004). Therefore, identifying and addressing the methodological issues in the scientific observation of managers can bring about research designs which could validly reveal significant aspects of managerial or entrepreneurial behaviour (Gartner, 1989; Schwehm, 2007). Second, this paper aims at enhancing the rigor of direct behaviour observation and behaviour coding approaches by methodically identifying weaknesses and strengths and consequently putting in place measures to improve and advance the methodology for future fruitful use in the social sciences. Third, a step-by-step research framework shall be devised which can be applied to bridge a concrete research gap in entrepreneurial settings, based on the evaluation and development of systematic observation methods.

After the introduction, Section II will lay the foundation for the remainder of the paper. To substantiate the identified research gap, current issues in entrepreneurship theory will be reviewed briefly. Thereafter, the methodological fit of the approach, in particular systematic observation and subsequent activity coding procedures, to address the research gap shall be assessed. Finally, observation approaches as deployed in field research in management will be introduced as scientific methodologies. Subsequent to the fundamental groundwork laid out in Section II, Section III will evaluate the rigor of studies relying on structured observation by deploying Yin's (1994) and Gibbert et al.'s (2008) suggested validity and reliability criteria. The analysis shall focus, as an exemplar, on Mintzberg's 1973 case studies on managerial behaviour, since they incorporate common weaknesses but also strengths of systematic observation and behavioural coding (Brown & Hanlon, 2004). Based on the identified methodological and conceptual shortcomings, Section IV will suggest incremental remedies and complementary research devices to advance research designs featuring observation as a method to generate valid and reliable data on entrepreneurial behaviour for research and practice. Section V concludes the paper by providing a summary as well as directions for future research.

Conceptual Foundations

The Job of the Entrepreneur as Research Focus

Entrepreneurship, although a relatively young discipline, has been conceptualized in a number of different ways over the last decades (Gartner, 2008). More recently, its definition in the literature has been predominantly processual: according to Fueglistaller et al. (2008), entrepreneurship is a process initiated and executed by individuals to identify, evaluate, and exploit opportunities. The entrepreneur is hence an individual performing these processes, succeeding with new products or production methods in the market and establishing new economic structures (Blanchflower & Oswald, 1998; Gartner et al., 1994).

However, academic discourse about the terms and concepts of entrepreneurship and the entrepreneur has not settled (Fueglistaller et al., 2008; Gartner, 2001, 2008; Schwehm, 2007). As a result, researchers have embarked in a variety of directions to explain and conceptualize entrepreneurship phenomena, which can be summarized under the following three questions (Stevenson & Jarillo, 1990):

What happens when entrepreneurs act? (Results of entrepreneurial action)

Why do entrepreneurs act? (Predispositions of entrepreneurial action)

How do entrepreneurs act? (Behaviours / actions of entrepreneurs)

According to Gartner (1989) and Timmons (2002), the first two questions are investigated in the disciplines of economics and psychology / sociology, respectively. The third question is best examined in the field of business management research (Gartner, 1989; Timmons, 2002). Stevenson and Jarillo (1990) suggest that management researchers should consider what entrepreneurs do behaviourally and how they succeed as entrepreneurs. Little has been published on this topic to this day: although almost 20 years have passed, recent research by Schwehm (2007) still calls for a focus on the direct observation of entrepreneurs.

Ahmaud Arbery’s Murder Case

Abstract

Ahmaud Arbery’s murder, the selected case, is a recent crime event that has drawn significant national attention across the United States. His fatal shooting in February 2020 sparked heated debates on gun rights and the limits of self-defense. Moreover, the delayed arrest of the culprits triggered nationwide discourse and criticism over racial profiling in the country (Ellis, 2020; Wootson & Brice-Saddler, 2020). With the trial over Arbery’s death commencing more than a year after his murder, media coverage of the case has been instrumental in broadcasting live trial updates to the public and illuminating what transpired.

Ahmaud Arbery’s Murder Case

The Crime in Its Historical Setting

This section highlights the incident, the evidence, the arrests, and the events that influenced the court, law enforcement agencies, and the media.

The Incident

Ahmaud Arbery, a 25-year-old African American man, was killed on February 23, 2020, in Satilla Shores, a neighborhood in Brunswick, Georgia, while out jogging (Fausset, 2021). The main culprits were two white men who chased Arbery while acting as vigilantes (Vila et al., 2021). They suspected that Arbery had committed theft or burglary in Satilla Shores, assuming he was responsible for burglaries witnessed and reported earlier (Wootson & Brice-Saddler, 2020). Evidence later showed that these two white men were Gregory McMichael and his son Travis McMichael (Mckay, 2021).

The Evidence

Two pieces of evidence are critical to the historical setting of Arbery’s murder. The first is a half-minute-long video recording on William “Roddie” Bryan’s cell phone. Bryan, the third perpetrator, used his cell phone to record Gregory and Travis, armed and in one vehicle, chasing and shooting at the unarmed Arbery. Bryan made this recording while also pursuing Arbery in his own truck. In the video, Travis is seen opening the driver’s side door, exiting the vehicle, wielding the shotgun, engaging in a confrontation with Arbery, and eventually shooting him (Fausset, 2021). This evidence has been crucial in the culprits’ trial because it shows Arbery was unarmed, besides revealing that Gregory and Travis indeed ambushed him (Ellis, 2020; Kasakove & Heyward, 2021). The second piece of evidence consists of police interview transcripts showing that Gregory confessed to initiating the pursuit after seeing Arbery run past his home and suspecting he was a burglar (Sayers & Pamela, 2021).

Influential Events

Three major events have influenced the court, the case’s media coverage, and law enforcement actions. The first is the circulation of the viral video detailing how Gregory, Travis, and Bryan pursued and shot Arbery. The second encompasses the public outcry, widespread national protests, community outrage, and demonstrations after Bryan’s video was leaked and publicized (Kasakove & Heyward, 2021; Laughland, 2021). The last is the alarm raised by legal experts, activists, and politicians across the country over the delayed arrest of the culprits.

The Arrest

Following the public outcry, justice activism, and review of the video evidence, the Georgia Bureau of Investigation arrested Travis, Bryan, and Gregory for the murder (Kasakove & Heyward, 2021). Currently, the Bureau is investigating some local prosecutors (including Tom Durden) and law enforcement agencies for failing to make arrests after obtaining the video evidence (Ellis, 2020; Wootson & Brice-Saddler, 2020).

Crime Event’s Theoretical Analysis

Assemblage theory helps conceptualize and analyze Arbery’s murder case.

The Theory’s Premise

Félix Guattari and Gilles Deleuze originally developed assemblage theory in the 1980s, positing that social constructions are assemblages: complex configurations of discrete, contingent, heterogeneous, partial, and unstable social elements that sequentially play roles in other, more extended configurations (Rey, 2012). On this proposition, a stable ontology for the social world is nonexistent. Since its development, assemblage theory has been applied to various social structures, fields, and formations, such as race and racialization, explaining them as machinic and enunciative assemblages. While machinic assemblages concern the appropriations, utilizations, and transpositions of social elements, enunciative assemblages deal with the laws and transformations associated with those elements (Yu, 2013).

Within the racialization context, Alexander Weheliye adopted assemblage theory to develop his concept of racializing assemblages. He held that race encompasses sociopolitical processes that anchor political hierarchies in human flesh to discipline humanity into full humans, nonhumans, and not-quite-humans (Conley, 2017; White, 2018). Accordingly, race is a social construction in which an assemblage of stereotypical hardships, indignities against African Americans, their racial profiling, and other elements of racism culminate in racialization (Vila & Avery-Natale, 2020; Vila et al., 2021). That is, racialization occurs via a complex race-centered process that emerges immanently as the outcome of a social assemblage. In this assemblage, affects and emotions circulate to alter the assemblage’s elements, causing race to materialize. These affects and emotions also dictate the enactment, materialization, and performance of specific identitarian articulations that deny racialized groups certain privileges and capacities (Vila & Avery-Natale, 2020).

Theory Application in Arbery’s Case

Arbery’s is only one among many cases of white people murdering Black people in America under unclear circumstances. Assemblage theory and its linkage with the concept of identitarian articulations provide useful tools for better comprehending how this crime event occurred. In particular, Travis, Bryan, and Gregory operated in a Neighborhood Watch assemblage in which a Black individual in that vicinity could only be seen as felonious (Vila et al., 2021). When Arbery entered this assemblage while jogging, his overly racialized role as a criminal constituted the identitarian articulation of his victimization. Since such an identitarian articulation emerges as an arrangement of bodies, identifications, objects, affects, and emotions that collectively actualize race on bodies (Vila et al., 2021), his identification by the culprits as a Black body articulated him as a racialized object. Accordingly, the perpetrators’ emotions of hate against the minority materialized their notion that he was a burglar, and, tragically, death by murder became this assemblage’s capacity.

Media Involvement and Coverage of the Case

Numerous media outlets across the U.S. and other parts of the world have extensively covered this case since its occurrence. Examples include the New York Times, CNN, the Washington Post, and USA Today, which have covered the case’s stages from investigation to disposition.

Role of Media Coverage in Case Disposition

The accounts and reports given by different authors and editors affiliated with these media outlets show considerable consensus about how this crime started, transpired, and advanced to the final verdict. Their coverage also demonstrates unanimity in reporting the case facts, the investigations, the trial, the jury’s deliberations and reasoning, the justifications for the verdict and determinations, and other phases of the criminal prosecution. For instance, most of the media channels covering this case agree that, during the investigations that led to the arrest of Gregory, Travis, and Bryan, Bryan’s video recording played a key role in the arrest, trial, and convictions (Fausset, 2021; Kasakove & Heyward, 2021; Mckay, 2021; Kennedy & Diaz, 2021). Indeed, the video evidence was central to the arraignment, indictment, and preliminary hearing, together with the trial and sentencing of the three, who were found guilty of murdering Ahmaud Arbery and face the possibility of life imprisonment, with a federal trial to follow in February 2022. The media exposure of this evidence was crucial in affirming to the public that the victim posed no imminent threat to the three because he was unarmed.

Relevance of Media Coverage of Arbery’s Murder

The case’s coverage by diverse media outlets has been relevant in four respects. First, the media have shed light on rising incidents of racial profiling. While evidence existed for months to warrant the filing of charges against Bryan and the McMichaels, the media unmasked the district attorney offices’ deliberate failure to make arrests. This sparked and fuelled public uproar countrywide over what has been publicized as racial profiling and racial injustice against Arbery’s family while the killers remained free from early 2020 (Ellis, 2020; Kennedy & Diaz, 2021). Second, the media brought to light judicial bias and intentional discrimination in the handling of murder cases involving minority groups. As reported by some media outlets (Kennedy & Diaz, 2021; Laughland, 2021; Yang, 2021), there were already substantial concerns at the trial’s onset because defense attorneys eliminated nearly all Black jurors, leaving an almost all-white jury of eleven white jurors and only one Black juror.

Third, the media drew a clear picture of the traumatic and costly outcomes of assumption-based decision making in citizens’ arrests. The defense attorneys maintained that the culprits were attempting a lawful citizen’s arrest (Fausset, 2021; Sayers & Pamela, 2021; Wootson & Brice-Saddler, 2020). However, the evidence presented and reported in the media clearly shows that the perpetrators based their decisions and actions principally on the assumption that Arbery was a burglar. Such assumptions cost a life through the mistaken identification of a regular jogger. Lastly, the media have highlighted what has been termed the public lynching of the accused (Kennedy & Diaz, 2021; Laughland, 2021). Defense attorneys sought a mistrial, arguing that media coverage and the presence of outside activists were influencing court proceedings, amounting to a contemporary public lynching (Yang, 2021).

Perspective on the Media’s Effect on the Case Outcomes

The media coverage of murder cases and trials is essential in exposing instances of judicial unfairness. In Arbery’s case, the media have been pivotal to this end, exposing the intentional discrimination in jury selection, where almost all Black jurors were eliminated from the panel (Fausset, 2021; Sayers & Pamela, 2021). The media coverage was also indispensable in ensuring adequate validation of the evidence. Through the media circulation of the video evidence that fuelled public unrest, the Georgia Bureau of Investigation engaged in deeper investigations and established the authenticity of the evidence, leading to the arrests and, possibly, a more impartial administration of the murder case. Lastly, the media coverage helps reflect on laxity and failure in the fight against racism, seeing that allegations of intentional bias and racism have continued to surround this court case.

References

Conley, T. L. (2017). Decoding black feminist hashtags as becoming. The Black Scholar, 47(3), 22-32. doi:10.1080/00064246.2017.1330107

Ellis, N. T. (2020, May 7). Why it took more than 2 months for murder charges and arrests in the death of Ahmaud Arbery. USA Today. Retrieved November 26, 2021, from https://www.usatoday.com/story/news/2020/05/07/ahmaud-arbery-shooting-video-prosecutor-arrest-mcmichael/3089040001/

Fausset, R. (2021, November 24). What we know about the shooting death of Ahmaud Arbery. The New York Times. Retrieved November 26, 2021, from https://www.nytimes.com/article/ahmaud-arbery-shooting-georgia.html

Kasakove, S., & Heyward, G. (2021, November 24). In the Arbery killing trial, video evidence once again played a crucial role. The New York Times. Retrieved November 26, 2021, from https://www.nytimes.com/2021/11/24/us/arbery-video-evidence-murder-trial.html

Kennedy, M., & Diaz, J. (2021, November 24). 3 white men are found guilty of murder in the killing of Ahmaud Arbery. NPR Special Series: America Reckons With Racial Injustice. Retrieved November 26, 2021, from https://www.npr.org/2021/11/24/1058240388/ahmaud-arbery-murder-trial-verdict-travis-greg-mcmichael

Laughland, O. (2021, November 24). Ahmaud Arbery verdict: Three white men found guilty of murdering Black man as he jogged. The Guardian. Retrieved November 26, 2021, from https://www.theguardian.com/us-news/2021/nov/24/ahmaud-arbery-verdict-guilty

Mckay, R. (2021, October 27). Factbox: Why a viral video is key evidence in trial of men accused of killing Ahmaud Arbery. Reuters. Retrieved November 26, 2021, from https://www.reuters.com/world/us/why-viral-video-is-key-evidence-trial-men-accused-killing-ahmaud-arbery-2021-10-25/

Rey, P. J. (2012). Assemblage theory. In The Wiley-Blackwell Encyclopedia of Globalization. doi:10.1002/9780470670590.wbeog032

Sayers, D. M., & Pamela, K. (2021, November 10). Detective testifies that Gregory McMichael told him he did not see Ahmaud Arbery commit a crime. CNN. Retrieved November 26, 2021, from http://lite.cnn.com/en/article/h_84b94e956775e785ec542f70c5eacd76

Vila, P., & Avery-Natale, E. (2020). Towards an affective understanding of processes of racialization. Ethnicities, 20(5), 844-862. doi:10.1177/1468796820909453

Vila, P., Ford, M., & Avery-Natale, E. (2021). Ahmaud Arbery: Murder as the outcome of an assemblage’s enactment. Social Identities, 27(6), 1-17. doi:10.1080/13504630.2021.1975536

White, E. J. (2018). Alexander G. Weheliye. Habeas Viscus: Racializing assemblages, biopolitics, and black feminist theories of the human. Seminar: A Journal of Germanic Studies, 54(4), 541-543. doi:10.3138/seminar.54.4.541

Wootson, C. R., Jr., & Brice-Saddler, M. (2020, May 9). It took 74 days for suspects to be charged in the death of a black jogger. Many people are asking why it took so long. The Washington Post. Retrieved November 26, 2021, from https://www.washingtonpost.com/national/outraged-by-the-delayed-arrests-in-killing-of-black-jogger-protesters-in-georgia-demand-justice/2020/05/08/8e7d212a-90a9-11ea-9e23-6914ee410a5f_story.html

Yang, M. (2021, November 24). Ahmaud Arbery murder: Five key moments from the trial. The Guardian. Retrieved November 26, 2021, from https://www.theguardian.com/us-news/2021/nov/24/ahmaud-arbery-trial-key-moments-what-happened

Yu, J. E. (2013). The use of Deleuze’s theory of assemblage for process-oriented methodology. Historical Social Research, 38(2), 196-217. doi:10.12759/hsr.38.2013.2.196-217