Thursday, December 26, 2019

Electronic Health Records Breaches Within Security

Tracey M Wright, ENG 1101, Benjamin Kolenda, December 1, 2014

Electronic Health Records Breaches in Security

Research Focus

Working in the medical field with Electronic Health Records (EHRs), many of my responsibilities depend on Health Insurance Portability and Accountability Act (HIPAA) compliance, EHR updates, and template building. EHR security breaches are a constant concern in this age of modern, sophisticated technology. Recent security breaches at major corporations have caused technology experts to strengthen their encryption to prevent further breaches. The increasing concern over the security of health information stems from the rise of EHRs, the increased use of mobile devices such as smartphones, medical identity theft, and the widely anticipated exchange of data among organizations, clinicians, federal agencies, and patients. If patients' trust is undermined, they may not be forthright with the physician. For the patient to trust the clinician, records in the office must be protected. Knowing that these security breaches are on the rise increases my awareness of the security protection of health records.

Previous Research

As with any online digital format, concerns about breaches exist. Internet hackers possess a digital power that frightens individuals looking to conceal sensitive data. There have been cases in which medical information has been accessed by unauthorized users. While this does not occur all too frequently,

Information Governance: The Center of the Healthcare Industry

information is handled (Hutchinson & Sharples, 2006). IG includes the protection of data, personal health records (PHR), electronic health records (EHR), and medical information exchanged via telemedicine. Breaches of personal information have been occurring more often, and the time for information governance is indeed now.
This paper will explain what information governance is, give examples of data breaches and how the particular organization was affected, and explain the importance of implementing information

Computer Security in the Health Care Sector

Medical records are a very desirable asset on the black market, valued at $50.00 each. This is much higher than other personal information, including credit card numbers and Social Security numbers, which are valued at $1.50 and $3.00 respectively (Robonsin). The Health Information Technology for Economic and Clinical Health Act (HITECH) has encouraged the health care industry to embrace information technology by adopting electronic health records and electronic

Data Breaches and the Healthcare Industry

Summary of data breaches in the hospital industry: Data breaches have become common nowadays, especially in the healthcare industry. For example, a number of hacking events have been reported in past years (Croll, 2007). Such events in the healthcare industry have threatened the safety of private medical records. Since the healthcare environment possesses the most valuable information about patients, these establishments are the most likely to suffer from hackers. Most importantly, patients worry

Essay on Healthcare Data Breaches

Abstract: In today's age of healthcare, health informatics innovations such as the health information exchange have allowed electronically available healthcare data, such as clinical, administrative, and financial information, to be shared within healthcare systems, hospital networks, and other healthcare settings. As organizations begin to share sensitive information across political, geographical, and institutional boundaries, there is a constant risk of patient data being compromised.
Therefore

Essay on Patient Data Breaches

The nature of healthcare is constantly evolving due to innovations in technology which enable health records to be electronically exchanged between healthcare systems, hospital networks, and other healthcare settings. This is referred to as a healthcare information exchange. This electronic exchange of information has the potential to enhance the quality of healthcare. Health records can be transmitted between patients, doctors, hospitals, and other providers at the time of service. Despite the effort

Information Security in the Healthcare Industry

The rapid changes in technology over the past few decades have left the healthcare industry ill-prepared to operate in today's environment. Most substantial protections of sensitive consumer information have come as a result of federal regulation, most notably in 1996 with the Health Insurance Portability and Accountability Act and in 2009 as part of the American Recovery and Reinvestment Act. Protection of information in the healthcare industry has

Using Electronic Medical Records for Patient Care

Introduction: Patient data security in hospitals and every healthcare organization is facing issues with breaches, which challenge the healthcare industry to provide quality care to its patients. Improving patient data security should be a top priority. The focus of this paper is to examine four quantitative research studies on the threats and challenges hospitals are facing due to patient data security breaches.

Quantitative Research Study 1: The purpose of this study was to decide how

Health Issues of Health Informatics

Health informatics has been around for ages, but over the past ten-plus years demand for the profession has increased.
Health informatics is one of the nation's largest growth industries. Health informatics has grown as a discipline with specialization in areas within the health profession. This field of study incorporates procedures, theories, and concepts from computer and information science. As the medical profession grows, health data security and privacy have become a major growing

The Right to Privacy

useful, some see them as a situation where anyone can watch and record the actions of every individual, and where the individual has lost control over information about herself and thus over her very life. As a reaction to these concerns, new regulations have been formulated to define the rights of individuals and the limits on the use of technology with respect to personal information. Among the categories of personal information, health information is of particular interest for a number of reasons

Improper Admission Orders from Morphine Overdose and Death

Introduction: The Department of Health and Human Services protects and guides the health and well-being of individuals here in America (Thacker, 2014). It fulfills these duties by providing Americans with adequate and efficient health and human services and by monitoring services designed to increase the efficiency of care in the health system (Thacker, 2014). One of the services being monitored by the Department of Health and Human Services is the electronic health record system, which carries private

Tuesday, December 17, 2019

Emergency Department Essay - 1828 Words

Aiding Throughput by Adding Advanced Practice Providers to Saint Joseph's Emergency Department

Cavan Quam, California State University, Stanislaus

Introduction

Emergency departments (EDs) across the country are inundated with too many patients based upon staffing and resources (Sun et al., 2013). This is a problem, as overcrowding in the ED has been shown to increase the likelihood of multiple negative outcomes. Patients who go to overcrowded EDs experience a higher likelihood of mortality, have a longer length of stay in the hospital, and incur higher costs (Sun et al., 2013). While the problem of overcrowding is a multifaceted one, one contributing factor is that low-acuity patients go to EDs because they lack access to primary

Approach - Urgent Care in ED Model

Since overcrowded EDs must provide care to patients across the acuity spectrum, it is imperative that the appropriate level of care is delivered as often as possible. In other words, a physician does not need to perform assessments on a patient who can be managed by a nurse practitioner (NP) or a physician assistant (PA). Similarly, a high-acuity patient should not be relegated to a longer time to assessment because the ED physicians are already assigned to too many patients. In order to appropriately match patients with the level of care required, EDs must increase NP and PA staffing to treat those who seek primary care and urgent care treatment at the ED. Urgent care models utilize multiple NPs and PAs because these providers can deliver care at the level the patient requires. It has been shown that utilizing NPs in the ED can result in lower-acuity patients being seen and discharged faster, as well as help relieve overcrowding in the ED overall (Burlingame & Simpson, 2009). In concert with an effective front-end and ambulance triage model, NPs and other advanced practice providers can lessen the burden on ED physicians.
Importantly, nurse practitioners and other advanced providers contribute to throughput in the ED even if they are not the final provider signing off on the patient's discharge. Rapid access to NPs and PAs

Emergency Department Bottleneck Proposal

Middletown Hospital is a 200-bed, not-for-profit general hospital that has an emergency department with 20 emergency beds. The emergency department handles on average 100 patients per day. The hospital's CEO has authorized the Six Sigma Team (SST) to address complaints received from patients seeking treatment between 6:00 p.m. and 10:00 p.m. The complaints are centered on waiting times and poor service. During this time the data indicate that approximately

Essay on Trauma C-Spine

This essay is not intended to criticize any emergency medical or hospital staff. I am writing this essay out of concern for patients who come into the emergency room who may have a jeopardized spinal cord resulting from an injury or suspected injury to their cervical spine. I am a certified emergency medical technician, farm-medic instructor, and currently a medical diagnostic student doing clinicals. In the United States each year there are approximately

Improving Patient Throughput in the Emergency Department

St. Vincent's Medical Center, a 501-bed facility located in Jacksonville, Florida, provides general medical and surgical care to the North Florida region. St. Vincent's admits over 26,000 patients annually. The average occupancy rate is approximately 84%, with the Emergency Department (ED) peaking at 100% for approximately 4-12 hours daily. The hospital is struggling with availability of bed space.
This shortage of available

Emergency Management and Emergency Managers

Introduction: There are many roles an emergency manager will have to take on in today's emergency response and management field. The fact that emergency management covers such a wide field of concern means that the roles themselves will not be cookie-cutter standardized throughout the world. In the United States there are emergency managers at the local, state, and federal levels, and each of these roles is similar. The fact that these emergency managers deal with separate issues that are not similar

Why Are the Waiting Times in Public Hospital Emergency Departments So Long?

8/04/11 1:22 AM, Jasmin Charles: Essay. Why are the waiting times in public hospital emergency departments so long? What contributes to this? What are we doing to address this problem? Waiting times in public hospitals have been a big issue in the media lately. Politicians are addressing these issues and using them as a bargaining point in their campaigns, making promises to fix the current health care problems through extra funding or a reform of health care. Public health patients featuring in

Security Plan

Clifton Liquor Store, located in Clifton, Colorado. This essay will explain the entire floor plan of the store. Moving forward, we will discuss the threats and evaluate the risk of each threat. We will point out the times at which the store is most vulnerable to each threat, as well as counter-measures for each threat. We will then discuss the security measures the liquor store has put into place. Lastly, we will point out the plans in place for an emergency situation such as a fire or a bomb threat. This

The Threat of Emergency Response Operations

the crux of the U.S.
all-hazard approach to homeland security (HS), but this approach appears to be inherently flawed." is not true. This essay will argue that emergency response operations are at the crux of the U.S. all-hazard approach to homeland security, and that this approach is not inherently flawed, just not all-encompassing. This is because emergency response operations such as crisis and consequence management directly correlate with the length and overall effect of vulnerabilities and

Disaster Preparedness at the Houston Methodist Hospital

a contingent and well-detailed disaster preparedness plan and procedures. Healthcare systems, on a day-to-day basis, are faced with emergencies in the form of disasters. As a result, the majority of medical centers have well-structured exit plans in the event of a disaster occurring (Hospital Disaster Preparedness: Your Guide to Getting Started - Emergency Preparedness, 2011). However, this essay will aim at interviewing one of the top disaster preparedness staff at the Houston Methodist Hospital. In the interview,

Health Benefits Appeal Process

insurance claims will have been filed in 2011 (U.S. Department of Treasury, 2010, p. 43343). If the government sector and the market for individual coverage are included, an additional 70 and 62 million claims, respectively, were expected to be filed. Of these, 48.1 million or 12.6% will be denied. Only a small percentage of denied claims are expected to be appealed, approximately 162,300 or 0.34%, but nearly 40% of these should be successful. This essay describes the appeal process and its benefits.

Essay about Emergency Disaster

Write a clear and well-thought-out essay on the following problem: "Your County Manager has just gone to a federally sponsored program on getting volunteers to support the management and administrative side of disaster and recovery portions of emergency management.
The Manager attended a seminar on a program called 'Professional Volunteer Disaster Survey Team (PRO-V-DST)', which had been developed in Texas in the mid-1990s. She was quite enthusiastic about the program as it provides:

Monday, December 9, 2019

PhDs in the Humanities and Social Sciences

Questions:

1. Argue the criteria for selection of a research method.
2. Justify quantitative research designs when appropriate.
3. Justify qualitative research designs when appropriate.

Answers:

1. Introduction

The report shall evaluate the NCU degree CP 2013 applications. The course details shall be explained, along with the use of the attached templates to identify and elaborate on the qualitative and quantitative questions in the survey used to obtain the specific answers. The queries are the ten of the proposed framework. The pros and cons of the application, the costing, and the investments required are all taken into consideration. The NCU templates suggest that the qualitative and quantitative data are indicators of the problem under recognition, while the impact of the exercises is used in finding the consumer perspective needed to fix the problem areas. The research methodologies are used to examine both the qualitative and quantitative issues, and to have a sound working structure, both kinds of data are used to design the projects.

Students applying for postdoctoral study in subjects such as the humanities, as compared with science and engineering courses, are declining in number, and the number of awardees in the humanities is shrinking relative to contemporary subjects such as science or engineering. The attitudes of educational institutions, social changes, and the rate of attrition are the reasons being evaluated in order to identify the way forward.

The Problem Statement

The problem statement is the identified issue that the assignment seeks to address. The guideline for any research is determined by the issues that are expected to be answered, which give the researcher the line of argument to follow in determining the needed solutions. The qualitative aspect of the course lies in its ability to make the concepts and evidence clear to the learner so that the issues can be relayed back with suitable qualitative examples.
The descriptive answers or observations generally follow the same descriptive style of explanation. The quantitative answers for the solution determine the evidence based on calculated mathematical interpretations, which may be the density, identity, or calculated observations of a research study, demonstrated in numerical terms to express the idea. Statistical implications in determining the quantitative and qualitative data are a major focus in the case of the course structure, where peer reviews and contrasts with other course structures from a qualitative perspective are elaborated. The problem statement is thus prompted by the idea of comparing the course structure and its evidence in giving the learners a quality education (Stanford.edu, 2014).

Doctoral degrees awarded to students of the humanities from 1998 to 2008 were 12%, while in science and engineering the figure was 20.4%. Further, doctoral study in the humanities had an attrition rate of 32%, compared to 27% and 26% in science and engineering subjects respectively. The idea is to determine the reasons for such attrition and the unequal results in doctoral courses. Hence, the problem statement guiding the research would try to find the reasons behind such results and examine the quality of education provided, retention problems, institutional characteristics and their influence, etc., to find a solution for such disparity. So the logic map of the research direction identifies the problem, which is the disparity in the attractiveness of doctoral courses and the purpose of such education, and hence designs research questions to propose a method and design for the study (Wang & Moran, 1999).

Purpose of the Study

The study is done with the purpose of finding the reasons behind the problem statement as identified.
Qualitative and quantitative data shall be used to identify the difference in attractiveness of science versus the humanities, and to identify the reasons for such diversity both in retention and in building social interest in doctoral study among students. The qualitative and quantitative data are compared to reflect the reasons for the declining interest in the humanities, where the rate of awardees is also lower.

The Purpose Statement

Mixed methods using both qualitative and quantitative data are used in the research. The purpose is to reflect upon the varying degrees of deviation of interest, incorporating the phenomenon of social variables and ideological changes in students' minds that lead to the quantitative and qualitative results. The design is thus formed to apply both the qualitative and quantitative statistics from the US university study results about promoting the subjects to generate substantial interest. The peers or the students, as well as observations from various published journals and secondary information sources, shall be used in the derivation of results.

The logical conclusion is that the number of people choosing the humanities as doctoral subjects is declining, as shown in previous research, while the proportion of science and engineering doctoral awardees is larger by comparison. The gap may be small when compared in percentile terms, but the reason the humanities are unpopular relative to the latter is the objective of the research to evaluate. The study will uncover the reasons and find a solution for future course designers and teachers to make the subjects more interesting and so reduce attrition (Ssrc.org, 2014).

Research Questions

The research questions are designed to answer the objective of the study. The questions are designed to give the reader an idea of what answers the study is going to provide.
The research shall investigate the course structure of the humanities in the context of the study, to examine the need to portray higher studies in the humanities in a new light. Thus the questions are set on both the qualitative and quantitative aspects of research. The quality of the educational approach, and the quantity of students enrolling and then facing attrition, are discussed in a mixed method to suitably explain the same.

1. Why are the humanities losing students' interest in pursuing higher education?
2. What are the determinants of a student's success in the humanities, and what changes may help it become popular among students?
3. How does the system need to change its approach towards the humanities for them to be a productive and popular subject, where the rate of attrition is lower and success rates are higher?

The aforesaid questions are designed to be answered in the research, which shall also recommend the needed changes in course curriculum and approach towards higher studies. The allied social, economic, and job market evaluation shall also be presented briefly to justify the problem statement in context.

Hypothesis

A hypothesis can be said to be an explanation of a phenomenon. When a previous observation or happening is being tested, the observations cannot always be explained by existing theories; the proposed explanation is called a hypothesis. For the study, the hypotheses are explained and tested for correlation. The theses that can be explained from the prepared results may be kept, while those which show no correlation are negated. Each of the hypotheses used makes a prediction which can be negated or accepted based on the objectivity of the course. The hypothesis is said to follow a mathematical model which enables prediction through deductive reasoning. The deductions from the findings support the statistically found results, helping the deduction.
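The deductive, mathematical side of hypothesis testing described above can be sketched in a few lines of Python. This is a minimal, stdlib-only illustration and not part of the study: the two attrition samples below are invented placeholder numbers (loosely echoing the 32% versus 26-27% figures cited earlier), and the pooled two-sample t statistic is computed by hand.

```python
# Sketch: a two-sample t test "by hand" (stdlib only).
# The attrition samples are invented placeholders, not the study's data.
from statistics import mean, variance

def t_statistic(a, b):
    """Pooled two-sample t statistic: how far apart the group means are,
    measured in units of the estimated standard error."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    std_err = (pooled_var * (1 / na + 1 / nb)) ** 0.5
    return (mean(a) - mean(b)) / std_err

humanities_attrition = [0.30, 0.34, 0.31, 0.33, 0.32, 0.35, 0.30, 0.33]
science_attrition    = [0.26, 0.27, 0.25, 0.28, 0.26, 0.27, 0.25, 0.28]

t = t_statistic(humanities_attrition, science_attrition)
print(round(t, 2))  # a large |t| is evidence against the null hypothesis
```

With these invented samples the statistic comes out well above the usual critical values, so the null hypothesis of equal mean attrition would be rejected in favor of the alternative. Note also that shrinking either sample inflates the standard error, making the null harder to reject, which is the sample-size caution attributed to Hariri (2008) above.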
However, there may be a qualitative aspect which suggests experiences or thoughts that have no mathematical explanation; this is called the hypothesis of experience. Nevertheless, the modern scientific approach to any testable hypothesis suggests that the alternative hypothesis has to have testable aspects, give proper numerical values, be applicable to a number of other issues in order to be held true, and be fit for the purpose it was designed for (Fassbender, 1994). Qualitative research has exploratory data as the primary source for its conceptual framework and entails exploration of the subject; thus a working hypothesis is considered a workable source for the research process. Deductive reasoning is derived from the research hypothesis, where the deductive measurable model is used as the deduction formulator.

Again, correlations between phenomena are framed as a null hypothesis and an alternative hypothesis. The null hypothesis suggests that the findings show no correlation between phenomena and have no bearing on the alternative hypothesis; the alternative hypothesis shows that there is a relation, while the null hypothesis has none. Schulenberg (2007) observes that the decision criteria for rejecting the null hypothesis in favor of the alternative need to be set before the observations are framed into a result for the research subject. This gives the correct realization of the topic and helps to have the direction and alternatives collected. The alternative hypothesis assumes a relation in the data and its correlation to a hypothesis considered prior to the deduction of results. However, Hariri (2008) argues that whether the null or the alternative hypothesis prevails depends on the sample size: if the sample size used is too small, the test may lack the power to reject the null hypothesis. Thus a substantial number of participants for the result findings is proposed.

2.
The Key Terminologies

The terminologies used in the assignment proposal contain a few phrases and concepts that are elaborated here from their defined state. The assignment suggests that the topic shall use qualitative data, quantitative analysis, hypothesis, correlation of findings, ANOVA, regression, APA referencing, etc. The study of the humanities being unpopular compared with doctoral student enrollment in engineering or science subjects requires defining what each of these expressions means.

Humanities: The study of the humanities is a field of study that relates to the creative arts and is concerned with human civilization, growth, history, literature, culture, mythologies, etc. The study of international politics, behavior, and decision making, the reasons for cultural differences, and the essence of human creativity in terms of art and thought are called the study of the humanities.

Science and Engineering: Subjects that are founded on scientific observation and based on quantitative and qualitative findings are the subjects of science and engineering. The methodologies used relate to findings about phenomena that are real and based upon mathematical conditions. Engineering is the part of science that explains why a proposed solution to a problem should follow a specific approach and what its benefits are. The best part of science and engineering is that probable outcomes are measured against existing theories and calculations; a phenomenon is repeatable under certain conditions, and the precondition of the study is to create something better with the use of technology and knowledge (McKelvie, 1994).

ANOVA: The simple explanation of ANOVA is analysis of variances. The collection of data to explain a phenomenon and its variance from one state to another, based upon the results of a survey, is what ANOVA examines.
Thus it is a tool to explain the observation. The quantitative analysis gives a result; the explanation of that result, drawn from the tally made of those observations, gives a resulting observation, simply termed ANOVA (Careereducation.columbia.edu, 2015).

Qualitative research: Research on a subject based upon open-ended surveys, where the analysis rests on observations with no quantitative aspect, is called qualitative research. Expressions and observations are explained from a perspective of feeling and experience in such a research survey.

Quantitative analysis: Close-ended questions, whose answers are limited to given choices, form a mathematical observation of the percentile for each option in the findings. The quantifiable effect gives the observation mathematical connotations for assessing the strength or preference of each option in the survey.

Correlation: The relationship between the findings of the research conducted and the hypothesis of the subject is the correlation, which connotes the link between the two: the observed findings and the hypothesis assumed prior to the study.

Review of the Literature

The literature on the current trend of doctoral students preferring science and engineering over the humanities ascribes the trend to various reasons. The findings of the literature vary depending upon social, geographical, economic, gender, political, and cultural issues; thus the pattern present in one nation or society does not necessarily explain the phenomenon present in another (Editors-in-Chief, 2008). The humanities and their usefulness in securing a job is an issue that concerns many a society. The other part is the economics of it: studying is an expensive activity in itself, which only a privileged few can aspire to continue.
Grants and part-time jobs apart, the study consumes a great deal of the time needed for a humanities doctoral project. The same applies to science and engineering doctoral applicants as well, but the aspiration to pursue a field with better options for securing a job or a career ahead is preferred. The other issue is motivation, which was suggested to be a main driver of attrition for humanities studies in the US. Engineering and science doctoral students choose topics in which firms in the same line of business have an interest, and those firms give grants and postdoctoral positions to further assist their research and development wings. For the humanities, the findings would suggest the viability, where the researcher has built a hypothesis of the field being unattractive compared to the former.

3. Research Methodology

The demography for the research shall consist of students who aspire to be doctoral students and of postdoctoral individuals for a quantitative survey. The qualitative angle of the hypotheses is to be found from teachers, industry observers, and peers from various backgrounds who have postdoctoral study experience. The sample size for quantitative data has been chosen to be 100, where the postdoctoral individuals and aspiring individuals shall be equal, or 50 each. The qualitative descriptive answers will be sought from 20 people of the demography described. The qualitative questions shall be open-ended, where seven to eight formal questions shall be used; depending on the quest for clarity, the number may increase, subject to the interview data collection. The quantitative survey shall have 20 closed questions with only one answer to select out of the three to five options given. The answers shall be accumulated to form a graph indicating closeness to the hypothesis. The primary and secondary data will thus be compared with the literature to assume a stance and describe the findings.
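The accumulation of closed-question answers into a graph-ready tally, as proposed above, can be sketched in a few lines of Python. This is a hypothetical illustration only: the response strings below are invented placeholders, not collected survey data.

```python
# Sketch: tallying one closed survey question into percentages.
# The response strings are invented placeholders, not collected data.
from collections import Counter

responses = ["career prospects", "funding", "career prospects",
             "course design", "career prospects", "funding"]

counts = Counter(responses)          # votes per option
total = len(responses)
percentages = {option: 100 * n / total for option, n in counts.items()}

# Print options from most to least chosen, ready for plotting.
for option, pct in sorted(percentages.items(), key=lambda kv: -kv[1]):
    print(f"{option}: {pct:.1f}%")
```

Each of the 20 closed questions would be tallied this way, and the resulting per-option percentages plotted to show how closely the responses lean toward the hypothesis.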
Variations and Construction

The data may vary for many reasons, such as an individual's inability to interpret a question or deliberate misleading. These can only be addressed by being present and exercising controls so that quality and validity are at their best for a good and true evaluation of the problem questions formulated. A variability constant can be used if the number of participants is larger, where the deviation constant could be chosen as a fraction in percentile for a valid finding with unbiased resulting observations. Thus constant observation while collecting the data is needed to comply with ethical requirements and keep the use of the research methodology undiluted, as proposed.

List of References

Careereducation.columbia.edu. (2015). Non-Academic Career Options for PhDs in the Humanities and Social Sciences | Center for Career Education. Retrieved 5 June 2015, from https://www.careereducation.columbia.edu/resources/tipsheets/non-academic-career-options-phds-and-mas

Editors-in-Chief, H. (2008). Food Science Gone Bad. Hypothesis, 5(1). doi:10.5779/hypothesis.v5i1.59

Fassbender, P. (1994). Intercultural differences in the bioethical assessment of abortion: Preliminary results and a proposal for further research. Perceptual and Motor Skills, 79(3), 1375-1381. doi:10.2466/pms.1994.79.3.1375

Hariri, M. (2008). Hypothesis Big Picture. Hypothesis, 3(2). doi:10.5779/hypothesis.v3i2.38

McKelvie, S. (1994). Guidelines for judging psychometric properties of imagery questionnaires as research instruments: A quantitative proposal. Perceptual and Motor Skills, 79(3), 1219-1231. doi:10.2466/pms.1994.79.3.1219

Schulenberg, J. (2007). Analysing police decision making: Assessing the application of a mixed method/mixed model research design. International Journal of Social Research Methodology, 10(2), 99-119. doi:10.1080/13645570701334050

Ssrc.org. (2014).
Dissertation Proposal Development Fellowship (DPDF) Program Programs Social Science Research Council. Retrieved 5 June 2015, from https://www.ssrc.org/programs/dpdf/ Stanford.edu,. (2014). Subject: Humanities. Retrieved 5 June 2015, from https://shc.stanford.edu/why-do-humanities-matte Tight, M. (2015). Phenomenography: the development and application of an innovative research design in higher education research. International Journal Of Social Research Methodology, 1-20. doi:10.1080/13645579.2015.1010284 Wang, C., Moran, M. (1989). Proposal for an Integrated Minicourse on Research Skills. Community Junior College Libraries, 6(2), 61-70. doi:10.1300/j107v06n02_08

Monday, December 2, 2019

Lord Of The Flies Essays (1067 words) - English-language Films

Lord of the Flies A running theme in Lord of the Flies is that man is savage at heart, always ultimately reverting back to an evil and primitive nature. The cycle of man's rise to power, or righteousness, and his inevitable fall from grace is an important point that the book proves again and again, often comparing man with characters from the Bible to give a more vivid picture of his descent. Lord of the Flies symbolizes this fall in different manners, ranging from the illustration of the mentality of actual primitive man to the reflections of a corrupt seaman in purgatory. The novel is the story of a group of boys of different backgrounds who are marooned on an unknown island when their plane crashes. As the boys try to organize and formulate a plan to get rescued, they begin to separate, and as a result of the dissension a band of savage tribal hunters is formed. Eventually the "stranded boys in Lord of the Flies almost entirely shake off civilized behavior" (Riley 1: 119). When the confusion finally leads to a manhunt [for Ralph], the reader realizes that despite the strong sense of British character and civility that has been instilled in the youth throughout their lives, the boys have backpedaled and shown the underlying savage side existent in all humans. "Golding senses that institutions and order imposed from without are temporary, but man's irrationality and urge for destruction are enduring" (Riley 1: 119). The novel shows the reader how easy it is to revert back to the evil nature inherent in man. If a group of well-conditioned schoolboys can ultimately wind up committing various extreme travesties, one can imagine what adults, leaders of society, are capable of doing under the pressures of trying to maintain world relations.
Lord of the Flies's apprehension of evil is such that it touches the nerve of contemporary horror as no English novel of its time has done; it takes us, through symbolism, into a world of active, proliferating evil which is seen, one feels, as the natural condition of man and which is bound to remind the reader of the vilest manifestations of Nazi regression (Riley 1: 120). In the novel, Simon is a peaceful lad who tries to show the boys that there is no monster on the island except the fears that the boys have. "Simon tries to state the truth: there is a beast, but 'it's only us'" (Baker 11). When he makes this revelation, he is ridiculed. This is an uncanny parallel to the misunderstanding that Christ had to deal with throughout his life. Later in the story, the savage hunters are chasing a pig. Once they kill the pig, they put its head on a stick and Simon experiences an epiphany in which he "sees the perennial fall which is the central reality of our history: the defeat of reason and the release of... madness in souls wounded by fear" (Baker 12). As Simon rushes to the campfire to tell the boys of his discovery, he is hit in the side with a spear, his prophecy rejected and the word he wished to spread ignored. Simon falls to the ground dead and is described as beautiful and pure. The description of his death, the manner in which he died, and the cause for which he died are remarkably similar to the circumstances of Christ's life and ultimate demise. The major difference is that Christ died on the cross, while Simon was speared. However, a reader familiar with the Bible recalls that Christ was stabbed in the side with a spear before his crucifixion. William Golding discusses man's capacity for fear and cowardice. In the novel, the boys on the island first encounter a natural fear of being stranded on an uncharted island without the counsel of adults. Once the boys begin to organize and begin to feel more adult-like themselves, the fear of monsters takes over.
It is understandable that boys ranging in ages from toddlers to young teenagers would have fears of monsters, especially when it is taken into consideration that the children are stranded on the island. The author wishes to show, however, that

Tuesday, November 26, 2019

Apply for the CPP Retirement Pension

Apply for the CPP Retirement Pension The application for the Canada Pension Plan (CPP) retirement pension is quite simple. However, there are a lot of things to learn and decide before you apply. What is the CPP Retirement Pension? The CPP retirement pension is a government pension based on workers' earnings and contributions. Just about everybody over the age of 18 who works in Canada (except in Quebec) contributes to the CPP. (In Quebec, the Quebec Pension Plan (QPP) is similar.) The CPP is designed to cover about 25 percent of pre-retirement earnings from work. Other pensions, savings and interest income are expected to make up the other 75 percent of your retirement income. Who is Eligible for a CPP Retirement Pension? To qualify, you must have made at least one valid contribution to the CPP. Contributions are based on employment income between a set minimum and maximum. How much and how long you contribute to the CPP affects the amount of your pension benefits. Service Canada maintains a Statement of Contributions and can provide an estimate of what your pension would be if you were eligible to take it now. Register for and visit My Service Canada Account to see and print a copy. You can also get a copy by writing to:

Contributor Client Services
Canada Pension Plan
Service Canada
PO Box 9750, Postal Station T
Ottawa, ON K1G 3Z4

The standard age to start receiving a CPP retirement pension is 65. You can receive a reduced pension at the age of 60 and an increased pension if you delay starting your pension until after the age of 65. You can see some of the changes that are taking place in the reductions and increases in CPP retirement pensions in the article Canada Pension Plan (CPP) Changes. Important Considerations There are numerous situations that can affect your CPP retirement pension, and some may increase your pension income.
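The early-start reduction and late-start increase can be sketched numerically. The sketch below is illustrative only: the 0.6% per month reduction before 65 and 0.7% per month increase after 65 are the commonly published CPP adjustment factors, not figures taken from this article, and your actual amount depends on your contribution history.

```python
# Illustrative sketch of how a CPP start age adjusts the monthly pension.
# Assumed rates (commonly published, not from this article):
#   -0.6% per month for each month before age 65
#   +0.7% per month for each month after age 65

def adjusted_cpp_pension(base_at_65: float, start_age: int) -> float:
    """Estimate the monthly pension for a start age between 60 and 70."""
    if not 60 <= start_age <= 70:
        raise ValueError("CPP retirement pension starts between ages 60 and 70")
    months = (start_age - 65) * 12
    if months < 0:
        factor = 1 + 0.006 * months   # reduction for starting early
    else:
        factor = 1 + 0.007 * months   # increase for starting late
    return round(base_at_65 * factor, 2)

print(adjusted_cpp_pension(1000.00, 60))  # 640.0  (36% reduction)
print(adjusted_cpp_pension(1000.00, 70))  # 1420.0 (42% increase)
```

So, under these assumed rates, a pension worth $1,000 per month at 65 would be about $640 if started at 60 and about $1,420 if delayed to 70.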
Some of those are:

Child rearing provision can be requested if you stopped working or received a lower income as the primary caregiver of your children under the age of seven, which could increase your retirement pension.
Pension sharing with your spouse or common-law partner could mean tax savings for you.
Credit splitting after a divorce or separation allows CPP contributions made by you and your spouse or common-law partner to be equally divided.
International social security agreements may make you eligible for a pension if you've lived and worked in certain countries.

How to Apply for the CPP Retirement Pension You must apply for the CPP retirement pension. It is not automatic. For your application to be eligible:

You must be at least a month past your 59th birthday.
You must have contributed to the CPP.
You must want your pension payments to begin within 11 months.

You can apply online. This is a two-part process. You can submit your application electronically. However, you must print a signature page that you then sign and mail to Service Canada. You could also print and complete the ISP1000 application form and mail it to the appropriate address. Don't miss the detailed information sheet that comes with the application form. After You Apply for the CPP Retirement Pension You can expect to receive your first CPP payment approximately eight weeks after Service Canada receives your application. Service Canada has other useful information to be aware of once you start receiving your benefits.

Saturday, November 23, 2019

This January I Switched to Apple. What are You Tolerating in Your Life

This January I Switched to Apple. What are You Tolerating in Your Life I’d say it was a long time coming, given that I’ve had nothing but problems with my Dell PCs for the last … oh … 20 years? For the most recent 3 or so of those 20, several of my friends and colleagues have been begging and pleading with me to convert to Apple. Did I listen? No. Changing just seemed like way too much work. Then, in January, I hit a limit. My 1-year-old Windows 8 computer, whose operating system I had just reinstalled, was not working any better than it was before I reinstalled it. My programs were constantly going to “Not Responding.” Tech support could not fix the problem and was telling me I needed a more powerful computer with more RAM. Sales was telling me the 8 GB of RAM on my current computer should be plenty. I figured either sales was wrong or tech support was wrong, and Dell should either fix the problem or give me some amount of credit toward a new computer. They claimed to be unable to do either. It was decision time, and I was DONE with Dell. Perhaps you are celebrating, along with many of my friends, colleagues and even distant acquaintances, that I waltzed into an Apple store and bought a MacBook Pro. In the end, this change happened in an instant. It wasn’t easy getting up to speed on the MacBook. The delete button drives me crazy. The command button is located in the most inconvenient spot I can imagine. My files are all organized differently now. Outlook was downloading all my email repeatedly and I had to get tech support to get a duplicate deletion program. I needed a new way to access my accountant’s server so I could use my QuickBooks program. I had to call HP support to get my printer working wirelessly. I blew out two adaptors trying to connect the Mac to an external monitor. And there’s more. This is why I did not want to switch to a Mac. But get this: The computer doesn’t use battery power while it’s asleep. It wakes up immediately.
I can leave my house carrying my laptop and no power cord and trust that the battery will last. The programs work and don’t slow down on me ever. And iCal integrates with Google Calendar without a 3rd party program! Most of the issues I faced were ramping up issues and are all resolved. And I get all the good stuff. I’m starting to be a proud Mac user. My question out of all of this is, “Why the heck did I wait so long?” You can ask any of my close friends and relatives and they will attest to the fact that I was spending hours upon hours with Dell tech support for years. I have never been happy with a Dell computer! And yet, I resisted change. Pure and simple. I kept choosing to upgrade to a “better” Dell, hoping it would solve my problems. It never did. People do this. Look at how many people stay in relationships that require hours of conversation to try to make them work. Look at how many of these people move in together, or get married, thinking that the “upgrade” will help. Or they have children in order to fix their relationship. Now that’s an upgrade! We so often avoid the risk of starting over with someone else because it would require an unknown amount of work – even if we have a strong inkling that ultimately the benefits would justify the investment. We resist change even if all our friends are telling us to “switch to Apple.” Many of us stay in jobs that are not a good fit. Even if we’re miserable, at least we’re dealing with a known quantity. I myself kept working for 10 years as a lawyer, because it was safe and provided a living wage, even though there was no amount of adjusting and mind talk that could make me enjoy that job. I even accepted a promotion (my “upgrade”) before reaching my breaking point and starting something new. The February issue of LeaderMag featured an article by Bruce Hodes, Five Ogres and an Angel, about the resistance to change in organizations.
I love this quote which he shares: “Change is hard because people overestimate the value of what they have and underestimate the value of what they may gain by giving that up.” James Belasco and Ralph Stayer, Flight of the Buffalo (1994). Hodes asserts that two of the main elements blocking change are “comfort” and “drift.” Comfort is something we’re all familiar with. We humans like things to stay the same. We get attached to our routines like a warm blanket, even if they aren’t serving us. And drift, the pull of the current always in the same direction (toward the status quo), affects us whether in our homes or workplaces. Hodes’ advice: “Trust your intuition - be convinced that even in the face of resistance this is the way forward.” The payoff according to Hodes is Performance Improvement. I certainly got that with my MacBook Pro. My question to you is: Where in your life are you resisting change, falling victim to comfort and drift, when you really know it’s time to make a move? Where is there room for performance improvement in your life? Maybe it’s time to stop “upgrading” what you already have and to start something new.

Thursday, November 21, 2019

What are the social benefits of using lie detectors Research Paper

What are the social benefits of using lie detectors - Research Paper Example The danger of using a lie detector is that innocent people will be mistakenly pronounced guilty, since the test only measures physiological responses. Such responses may be caused by a number of emotions, for instance fear and anger, of which guilt is only one. The device has an equally unacceptable rate of falsely accusing innocent people. One question that is emerging is: how reliable are the polygraph tests? Cheating is human. There has never been, nor will there ever be, a wholly honest society. So long as human beings lack the means to quantify lies or weigh hypocrisies, there is no basis for any individual or society to suppose that any other society is more dishonest than another. The various cultures of the world can be distinguished on the basis of how each copes with deceit: the types of lies it denounces, and the types of institutions it fashions to expose deceit (Messer and Jones, p. 108). The lie detector and its uses have had great influence in modern society. The instrument has become one of the great projects of the twentieth century, aiming to transform the central moral question of our collective life: how to fashion a just society. The instrument also drew its legitimization from two noble half-truths about our political life, which state that democracy depends on transparency in public life and that justice depends on equal treatment for all persons. As a society based on political principles rather than a common history or shared kinship, modern society has decided to resolve social conflicts with public rules regardless of any other factors taking place behind the scenes. Social rules are often in conflict, and society is quick to justify them in the name of science. Science, in itself, is considered the least arbitrary and most transparent form of rule making. This has led to the treatment of deceit and

Tuesday, November 19, 2019

Deepening Essay Example | Topics and Well Written Essays - 1750 words

Deepening - Essay Example This has been interpreted differently: while some have assimilated, others did not have the capacity to go through the assimilation process. Such individuals eventually gave up and held on to their familiar ways of life. We begin with a story of two close friends, Amy and Jeehyun, who have had similar experiences in high school. At the beginning, the two friends separated, and one travelled to Korea. It is apparent that the two have been friends from their childhood days, and their friendship culminates in high school, at a stage where their lives are close together. The future is uncertain, and the two do not know what to expect when they relocate to a different environment. For Amy, she has a different experience in regards to her college life, which leaves her filled with mixed emotions as events unfold. However, she starts off nicely, gets new friends and finally begins to catch up with the new environment. For Jeehyun, the college years have been tough, and most of the time she would try to assimilate into the environment without much progress. It is evident that the two friends are experiencing culture shock: a situation in which a person feels disoriented as a result of being exposed to a new way of life. There are many causes of this scenario, the most notable and familiar one being exposure to a foreign land (Ward, Bochner, and Furnham). Culture shock is addressed in four defining stages: honeymoon, negotiation, adjustment and mastery. The honeymoon stage comprises a healthy view of the differences that are evident in the subjects. The strange land at this stage seems good and at times adventurous. It is at this stage that the subjects would find nationals that are familiar and at times speak their language. This stage does not last long; it runs about two to three months. The next stage

Sunday, November 17, 2019

The Group of 20 Essay Example for Free

The Group of 20 Essay The Group of 20 was created in 1999 to develop solutions to economic hardship and financial crisis. On November 10-11, 2010, leaders from around the world gathered in Seoul, Korea. The commitments included securing the current economic recovery by creating jobs, which would balance the recovery and increase wealth. Other commitments included strengthening the international financial regulatory system and institutions, which would sustain global growth and prevent future crises. With this, a summit was convened and plans were made. Cameron, the prime minister of the UK, told the Commons that the UK had four priorities at the talks: tackling deficits, fighting protectionism, encouraging development issues and dealing with trade imbalances. During the summit Cameron said, "The G20 has been a vital forum in fighting to keep markets open. Increasing trade is the biggest boost and the biggest stimulus we can give to the world economy. It doesn't cost any money, it is not a zero-sum game and it creates wealth and jobs." After the proceedings, the prime minister said that the summit had been very successful. Korea brought new perspectives and issues to the G20 as well, introducing plans for a global financial safety net system and development issues as additional agenda items. In Korea's view, a credible global financial safety net would reduce pressure on governments to continue adding to smaller reserves. Korea is also interested in furthering renewed discussions on development issues, proposing them as essential to rebalancing the global economy and to ensuring that the agenda can operate as intended. This also includes addressing gaps in income and development. Korea is concerned with future economic growth and therefore organized a Business Summit gathering close to 100 key corporate chairmen and CEOs from around the world just prior to the G20 Summit, on November 10-11, 2010.
The Business Summit is intended to be a forum for business leaders to exchange views on how to boost the recovery and put the global economy back on the path to greater growth. World leaders agreed to develop new guidelines to prevent so-called currency wars. The EU helped to build a consensus on cooperative solutions to tensions over currency issues and trade. A joint letter created during the summit stated that leaders agreed to move toward more market-based exchange rates, to enhance exchange rate flexibility to reflect essential economic fundamentals, and to refrain from competitive devaluations. Leaders also welcomed the commitment to fight protectionism. It was important for the EU to place development securely on the agenda of the G20. This was achieved with the Seoul Development Consensus for Shared Growth, interlinking development, trade and investment. This was the fifth G20 summit since the global financial crisis hit in 2008. The summit was chaired by Korea and appears to have been successful. In this summit several actions and plans were developed: the Seoul Action Plan, the joint letter, and the Seoul Development Consensus for Shared Growth. The solution to the global economic and financial crisis is underway.

Thursday, November 14, 2019

To Kill a Mockingbird - The Powerful Character of Atticus Finch Essay

To Kill a Mockingbird - The Powerful Character of Atticus Finch In the epigraph of To Kill a Mockingbird, Harper Lee quotes a statement made by Charles Lamb: "Lawyers, I suppose, were children once." As told through the eyes of the rambunctious elementary school child Scout Finch, we see not only how she and her brother's lives are affected by their community, but also how they develop and mature under the watch of their father, lawyer Atticus Finch. As a wise role model to his town of Maycomb as well as his children, Atticus Finch becomes a prominently admirable character. As a father, lawyer, friend, and foe, Atticus Finch proves himself to be an honest, selfless, and courageous human being. Throughout many of the book's main events, it is always clear that Atticus Finch holds true to his morals and beliefs despite all obstacles. After returning from reading to Mrs. Dubose, a morphine addict, with her brother, Jem, Scout asks her father, "Atticus, what exactly is a nigger-lover?" Having heard the term used many times referring to her father it is easy t...

Tuesday, November 12, 2019

Philips vs Panasonic: Facing the 2008 Economic Crisis Essay

1. Introduction Panasonic and Philips are two of the main consumer electronics companies in the world, with different origins but similar international paths. Several hurdles were faced by both companies in their evolution. This paper will analyze how the administrative heritage of Philips and Panasonic caused problems in changing their strategies and in implementing the respective strategic decisions. Moreover, it tries to explain how time contingencies and the external environment influenced the strategy of the two MNEs and shaped their organizational structures, sometimes leading to success, sometimes losing ground in the market. Nowadays the main tasks of the two companies are in conflict: for Panasonic, defending its leadership position; for Philips, challenging the global leader. How to reach these goals? The last paragraph will address these objectives, providing the CEOs of the respective companies with further steps to consider in order to remain competitive in the market, from 2008 looking forward. 2. How the environment and culture shape companies’ strategy and organization 2.1 Philips evolution: from responsive to integrated Philips’s strategy right after WWII led the company to its success. The economic situation during the 30s forced the company to transfer part of its assets and laboratories abroad. This led to a dispersion of responsibilities. National protectionism, high tariffs and trade barriers required local production facilities. These problems led to the adoption of a decentralized federation with independent, self-sufficient units and autonomous marketing. The contingent environment spurred the management to rebuild their structures upon National Operations (NOs). Philips built its success on a worldwide portfolio of responsive national organizations.
Economic conditions, tastes and preferences at that time differed across countries; corporate management treated subsidiaries as independent national businesses with the aim of satisfying local needs. National Organizations were so vigilant that they managed to anticipate products required by customers, launching products such as the first color TV, the first stereo TV and the first TV with teletext. Innovation and R&D were the core strengths of the company. They were flexible, responsive and fast in their approach to the market. Entrepreneurial initiatives derived not from top-down imposition, but from every single division. The company succeeded in managing its innovation and bringing it to the next level, making innovation and customers’ needs the purpose of its business. The focus on national responsiveness was appropriate until the 50s-60s, when it started to become the firm’s limitation. The great focus given to tailoring solutions to customers’ tastes increased the cost of production and led to a dispersion of subsidiaries across too many countries. Problems of efficiency and coordination arose. The company took several years to get rid of its matrix structure. Attempts to shift the company to a slimmer organization in order to become more efficient in production were slow and cumbersome. The National Operations continued to retain major responsibilities. The company was captive of its past. Contrary to management’s predictions, the matrix structure created more problems than it solved. It was more complex than either the “worldwide area structure” or the “worldwide product structure”, and it created conflicts of responsibility. Market signals warned the company to implement changes in the way business was conducted. However, in the 1990s the company was still going through major losses. The structure was too costly, and value added was higher compared to Japanese production facilities.
Even if good objectives were set in the strategic planning, as history showed, the further step of re-organization failed due to difficulties encountered in rebuilding the organization. It was not a zero-base reconfiguration. For more than 35 years, from 1971 onwards, different CEOs tried to take action to reduce the power of the NOs and create an agile and simple divisional organization that could create efficiencies, while trying not to neglect their source of innovation: responding to country-specific markets. 2.2 Matsushita evolution: from integration to responsiveness Matsushita’s point of departure was completely different from the one undertaken by Philips. Matsushita employed a divisional structure with strongly centralized decision making. The adoption of the divisional organization was well thought out, given the large and highly differentiated product range manufactured by the company, aimed at different target clients. This allowed the company to reduce organizational complexity and reduce transaction costs within the company. Matsushita exploited the favorable characteristics of the post-war era, such as a convergence of tastes across borders and “uncontrollable” globalization. In the 60s, the firm managed to expand its product range. It created self-funded research laboratories to develop new product solutions. At the time, Matsushita had a strong distribution channel directly owned by the company, with more than 25,000 domestic retail stores. However, when demand in the domestic market slowed down, what was a competitive advantage domestically was not replicable abroad. The company made a lot of effort in expanding its international presence, without being demoralized. However, as time passed, the highly centralized control structure created problems in its offshore operations.
By the 1980s, the company, which had concentrated mainly on global integration instead of localization, decided to re-tailor its strategy. Decentralization of responsibilities was more than a necessity. Matsushita’s strategy was too focused on global efficiency, neglecting innovation, learning and flexibility. R&D was vital in this industry. Several CEOs took action to address these issues, changing the company’s strategy from copycat to promoter of invention. However, just as Philips encountered problems in its “transformation”, so did Matsushita. The firm had difficulties in promoting self-initiative among different divisions. The former centralized-hub organizational structure was still present in the employees’ mentality. Initiatives such as the introduction of local managers with stronger responsibilities in key positions and locations did not give the expected outcome in the short run. It took several years and a lot of effort to see some results. Again the company was captive of its past. Today, Matsushita’s configuration is mainly built around three main divisions: Digital Networks, Home Appliances and Components, reduced from the 36 product divisions used in the 1980s. The simplification was clear, and so were its advantages. Such a structure made the company more flexible to local needs, reduced costs of duplication and achieved economies of scope. The products developed within each division were highly linked. However, the company was not safe from competition. With the crisis approaching, the economic situation of the company did not remain untouched. What else is left to do? 2.3 Core issues today in the consumer electronic industry The current economic situation differs drastically from the one encountered 10 or 20 years ago. Phenomena like globalization have given companies such as Philips and Panasonic the chance to provide their products to an extensive market, raising, however, new challenges.
The global economic recession has created new needs. Consumers are more price sensitive and less willing to pay high prices for low-quality products. Companies need to be aware of their cost structure and be ready to respond to rapid technological changes and changing consumer preferences with timely and cost-effective introductions of new products in markets that are highly competitive in terms of both price and technology. Access to low factor costs, such as cheap labor in emerging countries, has become crucial. Moreover, developing markets such as India and China are not only locations where firms can manufacture products at lower cost, but also new opportunities where companies can market their own products. In the consumer electronics industry, competition has become extremely high, and innovation, now more than ever, has become crucial as a new source of revenue. 3. Recommendation for Philips: Exploit contingencies of the time you are in The 2008 situation of Philips is not flourishing. In 2008, the company recorded a net loss of €260 million. This was probably due to the financial crisis that affected the company drastically. However, the company’s loss cannot be attributed only to external factors. The company has made some mistakes in recent years and should take action to correct its own strategy, remembering its origins. Coherence with the past is crucial to tailor an effective strategy that could lead to innovative solutions for the future. That’s why Philips should continue to develop consumer-centric solutions. Research laboratories should remain independent but linked to one another. Philips should tackle the recession without sacrificing its long-term strategic ambition: “Improving people’s lives through meaningful innovation”. That is why, now more than ever, the company should be able to forecast market trends. Philips should continue, more efficiently, to re-allocate resources to growth opportunities and emerging markets.
To do so, it should transfer part of its qualified personnel to target markets such as India and China, building strong teams of both local and expatriate managers and engineers. The combination of foreign and local figures could help the firm continue its strategy of local responsiveness. To do so, it is necessary to include selective mergers and acquisitions in the company agenda. M&A, however, should not be out of focus. Philips has reduced its current divisions to three main domains: healthcare, lighting and consumer lifestyle. My recommendation would be to stay true to these three core segments, acquiring key strategic companies in foreign markets. How to finance these new objectives? Disinvestments in peripheral activities and less profitable plants are still needed in order to obtain liquidity to reinvest in developing countries. This tactic would not pay off in the short term, but long-term profits would offset the high level of investment. Moreover, leveraging local subsidiaries in emerging markets can be a springboard to target developed markets with low-cost products. 4. Recommendation for Matsushita: defending worldwide dominance The economic situation of Panasonic is different from the one faced by Philips. Panasonic is the world’s leading plasma maker in the industry and has to defend its worldwide dominance. Matsushita has based its competitive advantage on low-cost production. However, particularly during a financial crisis, several other initiatives have to be implemented. First, the company has to reduce its cost structure in order to face the deep decrease in demand and continue to make profits. Panasonic is already a leader in these activities; however, there are always margins for improvement. To achieve cost reduction, it has to downsize its workforce. The firm should prioritize certain businesses over less profitable ones. Moreover, it should adopt lean management in order to minimize waste in the value chain.
Secondly, it should restructure its organization to make it as clear and simple as possible, in order to avoid cost duplication, slow processes and the cost of bureaucracy. In addition, the shift of demand and the focus on emerging markets, particularly Asian markets, should be the highest priority. The market has signalled opportunities in untapped markets, and the firm should move early in order to gain an advantage over the competition. There is a need to shut down plants in Japan and overseas and to transfer workforce and capital to new sites; this initiative can also mitigate currency exchange risk. Distribution channels in these countries should also be enhanced. However, the company should not forget its heritage: leveraging efficiency. The company should pursue a "transnational strategy", increasing its local responsiveness and its innovation and learning at the same time. Even during a recession it is essential to continue to invest in R&D, with the objective of growing faster than competitors when the market recovers. The company's main revenues come from distinct technologies. Local responsiveness could be achieved by delegating even more profit responsibility to the three main divisions. This could enhance the company's performance and keep it on the podium. 6. Conclusions Structural variety is a function of environmental characteristics: an organization has to adapt to its external environment. Both companies have gone through several changes in their evolution. Environmental factors pushed the companies to rethink their strategies and, consequently, their organizational structures. In some cases the transition, for instance from a highly integrated company to a more responsive one, was slow and cumbersome. Both companies have proved reluctant to change.
It is true that the set of strategies available to a firm is limited by the decisions the company has made in the past; however, these limitations should not be considered insurmountable. Change can occur, and rapidly. Philips and Panasonic lacked the ambition to design 360-degree solutions. Firms should learn from their past and continue to leverage their own competitive advantages, building on existing infrastructure. Do not be captive to your own past, but use your past experience to create a strong advantage, compensating for your deficiencies.

Sunday, November 10, 2019

Response to William Wordsworth’s ‘I Wandered Lonely as a Cloud’

Response to William Wordsworth’s ‘I Wandered Lonely as a Cloud’ It is most difficult, I feel, to compose a response to William Wordsworth’s classic and idolised poem, ‘I Wandered Lonely as a Cloud’, in so few words. A response to a poem may be seen as a reflection on features such as the language, the imagery and, certainly, how the poem made me feel. I will, however, attempt to outline the influence this poem has had on me, considering the aforementioned features. This poem has evidently stood the test of time. It has breezed through generation after generation, being read and reread, and this, I believe, is due to the simplistic yet compelling story it tells. The first three stanzas are a collection of beautiful images painted by the use of comprehensible yet rich language. The language may be considered plain; however, I feel it echoes a calm and tranquil atmosphere, as it does not busy the reader. Wordsworth describes this truly memorable experience by personifying ‘the host of golden daffodils’. He does this throughout the poem, for example at the close of the second stanza: ‘Tossing their heads in sprightly dance.’ This, I believe, adds life to the poem. The personification links the language with incredible imagery. I, as a reader, have entered Wordsworth’s memory and I envisage this alluring scene: the countless daffodils ‘dancing in the breeze’. This further creates a more wondrous and astounding ambience that fulfils the reader as it fulfils Wordsworth in stanza three: ‘A poet could not but be gay’. William Wordsworth stated that “Poetry is a spontaneous overflow of emotion, not the emotion of the actual experience, but the emotion recollected in tranquillity”. I believe his poem ‘I Wandered Lonely as a Cloud’ is a justified example of this quote. I truly admire this poem, as it tells a story of Wordsworth’s feelings toward nature.
It, in turn, gave me a positive and joyous outlook on the art of nature, unleashing feelings similar to Wordsworth’s in the final stanza: ‘And then my heart with pleasure fills’.

Thursday, November 7, 2019

Free Essays on Imperial Nations

In The Decades Prior To 1914, The Culture Of The European Great Powers Was Profoundly Marked By Their Self-image As "Imperial" Nations, With All That Implied. Discuss To address this question, an understanding needs to be established of what is meant by imperialism prior to 1914. The European events that occurred will then need to be clarified, paying special attention to Britain and France. To illustrate the culture of Britain and France and how it related to their self-image as imperial nations, a case study of the events in Africa will be included. Imperialism changed in the latter part of the 19th century towards what is now referred to as New Imperialism. New Imperialism has no set definition but attracts many views, such as the Marxist capitalist view taken by Luxemburg and Lenin, for example, that capital was the seed of imperialism. Woolf, however, believes that there was a more nationalistic drive behind new imperialism. New imperialism is sometimes described as the internal development of society and relations with other countries; this is a simplistic form of the term, however. A more accurate description is a system in which countries pursue the acquisition or maintenance of territories and claim sovereignty over that territory and the people inside its boundaries. This is often to facilitate economic domination over their resources, labour and goods markets. This definition therefore supports the Marxist idea of new imperialism being based on the culture of money and capitalism. The tradition of empire is also important to the definition of new imperialism. This is because, for an empire, it was vital to be the most powerful and therefore be the one idolised by other countries; Britain is a good example in this century. The tradition of empire meant there was great economic and political rivalry between countries, especially in Western Europe, to become a member of the European great powers. Cultural changes took place that would...

Tuesday, November 5, 2019

Freelance Editors for Indie Authors

Freelance Editors for Indie Authors Do you want to make your book the best it can be? Of course you do. But how do you do that? If you are an indie author, as so many of us are these days, it is essential that you work with both a professional editor and an experienced cover designer. The process for finding these pros is pretty similar, but today we are going to take a look at how to find the perfect-fit editor for you and your book. Editors come in two general categories: a.) developmental editors, who work with you right out of the gate to help you understand your market, conceptualize your book, organize the material, avoid repetition, and keep it moving, engaging, fresh and original, and b.) copyeditors and proofreaders, who make sure your completed manuscript is error-free and professionally designed. How Do You Find A Good Developmental Editor? Good is the key word here. A good editor is likely to be a professional editor who has had lots of experience in the traditional book publishing world. Once you have located an editor you feel may be right for your book, find out where she has worked and what published books she has worked on. Make sure she has had experience editing the kind of book you are writing. This is very important. And where do you find this person? = Get a referral. Check around: ask other writers, your friends, agents at writers conferences. = Look online. Go to Google and type in Freelance Editors. Of course, you can type in Freelance Copyeditors and Proofreaders or Freelance Book Designers, too, if that's what you happen to be looking for. Also, you can check out our recently published guide, The Self-Publisher's Ultimate Resource Guide. www.bookdocs.com The Independent Editors Group (full disclosure: I'm a member of this group.)
www.bookworks.com The Self-Publishers Association www.bibliocrunch.com Bibliocrunch www.consulting-editors.com Consulting Editors Alliance www.digitalbookworld.com Digital Book World www.elance.com Elance www.the-efa.org Editorial Freelancers Association www.mediabistro.com Media Bistro www.publishersmarketplace.com Publishers Marketplace = Chat with the editor. Define your goals. Feel free to ask him for his credentials and for the titles of a few of the published books he has worked on. If he has not worked on any books that have been published = Be clear about the fee structure. Does she charge = Ask about the timeline. How long does she expect the editing process to take? This depends on how quickly you get the revisions back to her, but make sure you both agree on what is a reasonable amount of time for this project to take. = Ask if he has had experience with self-publishing. This is not essential, but sometimes an editor who knows his way around self-publishing, or who can at least refer you to others who do, can be a big help for first-time indie authors. Always remember, this is your book. You are the creator, the artist, and your name is on the cover. When you disagree with your editor's suggestions, trust your instincts and go with what you feel is right. You are the boss, and that's as it should be.

Sunday, November 3, 2019

Job Description for a Retail Sales Associate Research Paper

Job Description for a Retail Sales Associate - Research Paper Example The three behaviors necessary for job performance are a customer-focused attitude, taking initiative, and excellent communication skills. All these behaviors are extremely important for job performance at the retail store. A customer-focused attitude should be reinforced by rewarding employees on a per-sale basis. Employees who take initiative should also be recognized, and an employee-of-the-month competition should be used to encourage employees to take initiative. Taking initiative involves doing something that is not a necessary part of the job description. Communication skills should also be rewarded, and customers can be asked to fill out a feedback form to evaluate employees' communication skills. Organizational behavior modification is known to influence the performance of employees and can therefore be used effectively to enhance the performance of retail store employees (Stajkovic & Luthans, 1997). Evaluating Performance and Informing Employees Performance of employees can be evaluated using different techniques. One obvious way is to observe changes in sales patterns in order to understand whether employees are exhibiting the key behaviors. If employees have a customer-focused attitude, then sales should definitely improve. It is also important to see whether employee behavior is having a positive financial effect on the business. A second method of evaluating employee performance is to observe the results of the 'employee of the month' competition: employees who are lagging behind in such competitions are the ones who are not exhibiting the key behaviors. Another method is to take customer feedback, as it can provide first-hand information regarding the performance of employees.
It is also essential to inform employees about the new performance standards set in the organization so that they can adjust their behavior to the new standards. One simple way is to email employees or give them a short pep talk in which the manager explains the new performance standards and how employee performance will be measured from now on. A better approach would be to conduct a training session in which all employees are told why the new performance standards are being implemented and how these standards might improve the overall business of the store. All the details of the performance standards should be explained, and the manager should communicate what he expects from his employees. It is important to communicate with employees in a detailed manner on a regular basis when an organization is going through a change process (Vakola & Nikolaou, 2005). This is why communication with employees regarding performance management is so important for managers. Feedback Plan and Methods to Reinforce Positive Behavior The two methods for providing feedback to employees will be email and face-to-face interaction. These two methods can be used together to inform employees about their performance. Email is important because employees should have a record of their performance. It is also a personalized way to provide feedback, as other employees will not know about the performance

Friday, November 1, 2019

Preventing biodiversity reduction in the coastal zone Essay

Preventing biodiversity reduction in the coastal zone - Essay Example Biodiversity has three levels: genetic diversity (i.e. diversity of genes within a species and between species), species diversity (the differences in populations within a species, between populations, and between the various species), and ecosystem diversity (the various habitats, biological communities and ecological systems, including differences within ecosystems) (Ecological Society 1997). An ecosystem remains stable and balanced due to the variety and richness of its organisms and species. The relationships among species within an ecosystem are well documented: the destruction or extinction of one species may affect other living things, and may even lead to the extinction of other species in the ecosystem. Thus, preserving and protecting one species in a particular area similarly protects the rest of the species. This is equally true of land-based species and of sea and coastal creatures. Biodiversity denotes ecosystem balance and the survival of the species within the system. Degradation in Biodiversity Generally, the ecosystem remains balanced in its usual natural course. Imbalance occurs when outside elements interfere with the usual processes. Man has interfered with the natural ecosystem for a long time. One form of invasion is establishing habitation in coastal areas (Water ecology 2009). It is reported that an estimated two-thirds of the total population lives near or along the coasts (Water ecology 2009). In fact, wetlands and some coastal areas are being drained by people in order to reclaim land for urban expansion (Water ecology 2009). The wetlands are also converted for farming, mining, gas and oil extraction, and highways for land transportation (Water ecology 2009). Sewage run-off and toxic contaminants (e.g. pesticides, heavy metals) are passed to coastal zones and become concentrated over a period of time (Water ecology 2009).
These chemicals threaten aquatic life and biodiversity. Over-fishing of a certain species without proper regulation likewise degrades biodiversity (Water ecology 2009). The coral reefs, which comprise a great number of plants and animals, are important to the coastal ecosystem balance. Ten percent of the coral reefs worldwide are being destroyed by human beings, and only half of the countries around the world are capable of protecting them, due to the expense that this entails (Water ecology 2009). According to the Island Resources Foundation (1996), tourism makes a large indirect contribution to the degradation of coastal waters in terms of oil, fertilizer and pesticide pollution. The foundation's report cited US Virgin Islands waters that received oil spills from motorized vessels such as yachts, ferries and cruise ships (Island Resources Foundation 1996). In the Sarasota Bay and Corpus Christi National Estuary Program assessments, the care practices for golf courses and condominium resorts release nitrates and phosphates to the waters during storm-water run-off (Sarasota, 1993, cited in Island Resources Foundation 1996). It was also reported that the top 20 percent of countries that depend on tourism (e.g. Cayman Islands, Northern Netherlands Antilles, Anguilla, etc.) suffered environmental degradation that includes reef, mangrove and related ecosystem damage (e.g. damage caused by anchors, clearing of mangroves, use of dynamite, littering, etc.) (Hoagland, et al. 1995, cited in Island Resources Foundation 1996). An alteration in the coastal zone, such as construction of piers and wharves, which

Tuesday, October 29, 2019

History of fast food restaurants in America Research Paper

History of fast food restaurants in America - Research Paper Example Fast foods include tacos, ice cream, hot dogs, fried chicken, juices, chicken nuggets, meat pies, pizzas, sausages, chips and sandwiches. Other foods that are often served in fast food restaurants are mashed potatoes, salads and chilli. One of the main characteristics of fast food restaurants is that they often maintain a limited menu, with or without seating space. This paper will analyse the history of fast food restaurants in the US, tracing their development especially from the 1920s to date. Before fast food restaurants gained ground, foods such as hamburger sandwiches and hot dogs had been big business in the early 1900s, their popularity bolstered by the St Louis World's Fair. The first pizzeria in the United States of America opened in 1905, setting stable ground for the establishment of fast food restaurants (Famouswhy, 2010). Before the establishment of what is today known as the fast food restaurant in the US, White Castle, founded in Wichita, Kansas, was already established in 1921 (Howstuffworks, 2010). At the time, most people assumed that the burgers being sold in circuses, lunch counters, carts and fairs were of low quality. The assumption was based on the belief that hamburgers were made of spoiled meat and scraps from slaughterhouses. Taking note of this damning misconception, White Castle's owners endeavoured to destroy it. The founders of White Castle started to prepare their hamburgers in a manner that customers would appreciate: their restaurants prepared the hamburgers such that clients saw how the ingredients were mixed and the food cooked (Howstuffworks, 2010). They also painted their restaurants white and gave them names that suggested high levels of hygiene. With time, the popularity of the restaurant chain grew, especially in the East and Midwest of the US. The

Sunday, October 27, 2019

Alternative Volatility Forecasting Method Evaluation

Alternative Volatility Forecasting Method Evaluation For many financial market applications, including option pricing and investment decisions, volatility forecasting is crucial. Research on volatility forecasting has therefore been an active area of study for many years. In recent years, the emergence of many financial time series methods for volatility forecasting has underlined the importance of understanding the nature of volatility in any financial instrument. Often, people think of price as the indicator of stock market performance. Due to the non-stationary nature of stock market price series, most researchers actually work with series of price changes (returns) or absolute price changes (absolute returns) in their studies. There is a difference between the terms return and volatility. Volatility is used as a crude measure of the total risk of financial assets: it is the standard deviation or the variance of returns, whereas return is merely the change in prices. An increasingly commonly adopted tool for measuring the risk exposure associated with a particular portfolio of assets, known as Value at Risk (VaR), involves calculation of the expected losses that might result from changes in the market prices of particular securities (Jorion, 2001; Bessis, 2002). Thus, the VaR of a particular portfolio is defined as the maximum loss on the portfolio occurring within a specified time and with a given (small) probability. Under this approach, the validity of a bank's internally modeled VaR is backtested by comparing actual daily trading gains or losses with the estimated VaR and noting the number of exceptions occurring, in the sense of days when the VaR estimate was insufficient to cover actual trading losses, with concerns naturally arising where such exceptions occur frequently; this can result in a range of penalties for the financial institution concerned (Saunders & Cornett, 2003).
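The parametric VaR calculation and the exception-counting backtest described above can be sketched as follows. This is a minimal illustration, not the proposal's own implementation; the function names and the fixed z-score table are assumptions for the example, and a normal return distribution is assumed.

```python
import math

def parametric_var(sigma, value=1.0, confidence=0.99):
    """One-day parametric (normal) VaR: the loss exceeded with
    probability (1 - confidence), assuming normally distributed returns.
    sigma is the forecast daily return volatility (standard deviation)."""
    # standard normal quantiles for two common confidence levels
    z = {0.95: 1.645, 0.99: 2.326}[confidence]
    return z * sigma * value

def count_exceptions(daily_losses, daily_var):
    """Backtest: count the days on which the realised trading loss
    exceeded that day's VaR estimate (the 'exceptions' in the text)."""
    return sum(1 for loss, var in zip(daily_losses, daily_var) if loss > var)
```

For example, with a 2% daily volatility forecast and a portfolio of 1,000,000, `parametric_var(0.02, 1_000_000)` gives a 99% one-day VaR of about 46,520; frequent exceptions in `count_exceptions` would signal an inadequate volatility model.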
A crucial parameter in the implementation of parametric VaR calculation methods is an estimate of the volatility parameter that describes the asset or portfolio, or more accurately a forecast of that volatility once the simplifying assumption of constancy is relaxed and time-varying volatility is acknowledged. While it has long been recognized that returns volatility exhibits clustering, such that large (small) returns follow large (small) returns of random sign (Mandelbrot, 1963; Fama, 1965), it is only following the introduction of the generalized autoregressive conditional heteroskedasticity (GARCH) model (Engle, 1982; Bollerslev, 1986) that financial economists have modeled and forecast these temporal dependencies using econometric techniques, and a variety of adaptations of the basic GARCH framework are now widely used in modeling time-varying volatility. In particular, the significance of asymmetric effects in stock index returns has been widely documented, such that equity return volatility increases by a greater amount following negative shocks, usually associated with the leverage effect, whereby a firm's debt-to-equity ratio increases when equity values decline, and holders of that equity perceive the firm's future income streams as being more risky (Black, 1976; Christie, 1982). Such variance asymmetry has been successfully modeled and forecast in a variety of market contexts (Henry, 1998) using the threshold-GARCH (TGARCH) model (Glosten et al., 1993) and the exponential-GARCH (EGARCH) model (Nelson, 1991) in particular. Problem Statement While risk management practices in financial institutions often rely on simpler volatility forecasting approaches based on heuristics and moving average, smoothing or RiskMetrics techniques, symmetric and asymmetric GARCH models have also recently begun to be considered in the VaR context.
However, the standard GARCH model and variants within that class impose rapid exponential decay in the effect of shocks on conditional variance. In contrast, empirical evidence has suggested that volatility tends to change slowly and that shocks take a considerable time to decay (Ding et al., 1993). The fractionally integrated GARCH (FIGARCH) model (Baillie et al., 1996; Chung, 1999) has provided a popular means of capturing and forecasting such non-integrated but highly persistent long-memory dynamics in volatility in the recent empirical literature, as has its exponential (FIEGARCH) variant (Bollerslev & Mikkelsen, 1996), which parallels the EGARCH extension of the basic GARCH form and therefore provides a generalization capable of capturing both the volatility asymmetry and the long memory in volatility that are potential characteristics of emerging equity markets. Research Objectives This paper therefore seeks to extend previous research concerned with the evaluation of alternative volatility forecasting methods under VaR modeling, in the context of the Basle Committee criterion for determining the adequacy of the resulting VaR estimates, in two ways. First, by broadening the class of GARCH models under consideration to include more recently proposed models such as the FIGARCH and FIEGARCH representations described above, which are capable of accommodating potential fractional integration and the associated long-memory characteristics of return volatility, as well as the simpler and computationally less intensive methods commonly used in financial institutions. Second, by extending the scope of previous research through evaluative application of these methods to daily index data for nine stock market indexes. Significance of this study The extensive research on volatility forecasting plays an important role in investment, financial risk management, security valuation, and business decision-making.
Without proper forecasting tools and research in this field, many financial decisions would be difficult and risky to implement. The positive contribution of volatility forecasting to the field of finance is beyond doubt, as it has given many practitioners guidelines for estimating and managing risk in areas such as option pricing, hedging and estimating investment risk. Therefore, it is crucial to study the performance of different forecasting approaches and models to determine the most suitable practical application for each situation. The most common financial instrument is the stock, and stock indices consist of a particular country's most prominent stocks. Thus, in this study our aim is to focus on forecasting the volatility of eight different stock indices, which gives us the ability to test the forecasting approaches. Quite a number of forecasting models have emerged in recent years. However, a newer concern is the performance of these models when higher-frequency data are incorporated through the realized volatility method. There is still a gap in research on the effect of intra-day data on forecasting models, an area that is comparatively new relative to daily-data volatility forecasting. A significant aim of this study is also to determine whether intra-day data can really help improve the performance of models forecasting stock index volatility. Review of Chapters In this proposal, the report is subdivided into three chapters. Chapter 1 gives an overview of the research, including the background of the study, the research objectives, the problem statement, and the significance of the study. Chapter 2 presents the literature review on volatility forecasting, GARCH models, exponential smoothing and realized volatility. CHAPTER 2: LITERATURE REVIEW 2.1 Volatility forecasting Volatility forecasts are produced by either market-based or time-series methods.
Market-based forecasting involves calculating implied volatility from current option prices by solving the Black-Scholes option pricing model for the volatility that results in a price equal to the market price. In this paper, our focus is on the development of a new time series method. These methods provide estimates of the conditional variance, σ²_t = var(r_t | I_{t−1}), of the log return, r_t, at time t conditional on I_{t−1}, the information set of all observed returns up to time t − 1. This can be viewed as the variance of an error (or residual) term, ε_t, defined by ε_t = r_t − E(r_t | I_{t−1}), where E(r_t | I_{t−1}) is a conditional mean term, which is often assumed to be zero or a constant. ε_t is often referred to as the price "shock" or "news". 2.2 Overview of standard volatility forecast models 2.2.1 GARCH model GARCH models (Engle, 1982; Bollerslev, 1986) are the most widely used statistical models for volatility. GARCH models express the conditional variance as a linear function of lagged squared error terms and lagged conditional variance terms. For example, the GARCH(1, 1) model is given by σ²_t = ω + α ε²_{t−1} + β σ²_{t−1}, where ω, α and β are parameters. The multiperiod variance forecast is calculated as the sum of the variance forecasts for each of the k periods making up the holding period, built up from the one-step-ahead variance forecast. Empirical results for the GARCH(1, 1) model have often shown that β ≈ (1 − α). The model in which β = (1 − α) is termed integrated GARCH (IGARCH) (Nelson, 1990). Exponential smoothing has the same formulation as the IGARCH(1, 1) model with the additional restriction that ω = 0.
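The GARCH(1, 1) recursion and the multiperiod holding-period forecast described above can be sketched as follows. This is an illustrative sketch, not the paper's estimation code; the function names and example parameter values are assumptions, and the k-step forecast iterates the standard expectation E[σ²_{t+j}] = ω + (α + β) E[σ²_{t+j−1}].

```python
def garch11_one_step(omega, alpha, beta, eps_prev_sq, sigma_prev_sq):
    """One-step-ahead GARCH(1,1) variance forecast:
    sigma^2_t = omega + alpha * eps^2_{t-1} + beta * sigma^2_{t-1}."""
    return omega + alpha * eps_prev_sq + beta * sigma_prev_sq

def garch11_multistep(omega, alpha, beta, sigma1_sq, k):
    """Variance forecast for a k-period holding period: the sum of the
    variance forecasts for each period, where each successive period's
    expected variance follows E[s2_{j}] = omega + (alpha + beta) * E[s2_{j-1}].
    sigma1_sq is the one-step-ahead variance forecast."""
    total, var_j = 0.0, sigma1_sq
    for _ in range(k):
        total += var_j
        var_j = omega + (alpha + beta) * var_j
    return total
```

With ω = 0.1, α = 0.1, β = 0.8 the unconditional variance ω/(1 − α − β) equals 1.0, so starting the forecast at 1.0 keeps every period's expected variance at 1.0 and the k-period forecast is simply k.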
The IGARCH(1, 1) multiperiod forecast follows from the same recursion. Stock return volatility is often found to be greater following a negative return than a positive return of equal size. This leverage effect has prompted the development of a number of GARCH models that allow for asymmetry. The first asymmetric formulation was the exponential GARCH model of Nelson (1991). In this log formulation for volatility, the impact of lagged squared residuals is exponential, which may exaggerate the impact of large shocks. A simpler asymmetric model is the GJR-GARCH model of Glosten et al. (1993). The GJR-GARCH(1, 1) model has parameters ω, α, γ and β, with an indicator function I[.] applied to the sign of the lagged shock. Typically, it is found that α > γ, which indicates the presence of the leverage effect. The assumption that the median of the distribution of ε_t is zero implies that the expectation of the indicator function is 0.5, which enables the derivation of a multiperiod forecast expression. GARCH parameters are estimated by maximum likelihood, which requires the assumption that the standardized errors, ε_t / σ_t, are independent and identically distributed (i.i.d.). Although a Gaussian assumption is common, the distribution is often fat-tailed, which has prompted the use of the Student-t distribution (Bollerslev, 1987) and the generalized error distribution (Nelson, 1991). Stochastic volatility models provide an alternative statistical volatility modelling approach (Ghysels et al., 1996). However, estimation of these models has proved difficult and, consequently, they are not as widely used as GARCH models. Andersen et al. (2003) show how daily exchange rate volatility can be forecast by fitting long-memory, or fractionally integrated, autoregressive and vector autoregressive models to the log realized daily volatility constructed from half-hourly returns.
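The GJR-GARCH(1, 1) forecast and the role of the indicator expectation E[I] = 0.5 can be sketched as follows. The original equations were lost in extraction, so this uses one common parameterization, in which the asymmetry coefficient loads on negative shocks; the exact coefficient convention of the source may differ, and the function names are illustrative.

```python
def gjr_one_step(omega, alpha, gamma, beta, eps_prev, sigma_prev_sq):
    """GJR-GARCH(1,1) one-step forecast under a common parameterization:
    sigma^2_t = omega + (alpha + gamma * I[eps_{t-1} < 0]) * eps^2_{t-1}
                + beta * sigma^2_{t-1},
    so negative shocks raise variance by the extra gamma term (leverage)."""
    indicator = 1.0 if eps_prev < 0 else 0.0
    return omega + (alpha + gamma * indicator) * eps_prev ** 2 + beta * sigma_prev_sq

def gjr_multistep(omega, alpha, gamma, beta, sigma1_sq, k):
    """Cumulative k-period forecast. A zero-median shock distribution gives
    E[I] = 0.5, so the expected persistence factor is alpha + gamma/2 + beta."""
    persistence = alpha + gamma / 2.0 + beta
    total, var_j = 0.0, sigma1_sq
    for _ in range(k):
        total += var_j
        var_j = omega + persistence * var_j
    return total
```

Note the asymmetry: a shock of −1 raises next-period variance more than a shock of +1 of the same magnitude, which is precisely the leverage effect discussed in the text.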
Although results for this approach are impressive, such high-frequency data are not available to many forecasters, so there is still great interest in methods applied to daily data. A useful review of the volatility forecasting literature is provided by Poon and Granger (2003).

2.2.2 Exponential smoothing

The Exponentially Weighted Moving Average (EWMA) is a simple and well-known volatility forecasting method. It estimates the variance forecast as a weighted average of past squared residuals, with the weights declining exponentially so that the most recent observations have a stronger impact on the forecast than older observations. Written as exponential smoothing in recursive form, the EWMA equation is:

σ̂²t+1 = αε²t + (1 − α)σ̂²t,

where α is the smoothing parameter. There is no established statistical model underpinning the choice of the smoothing parameter. The literature generally suggests minimising the sum of in-sample one-step-ahead forecast errors (Gardner, 1985, cited in Taylor, 2004). RiskMetrics (1996) recommends estimating the smoothing parameter for volatility forecasting using the following minimisation:

min over α of Σt (ε²t − σ̂²t)²,

where the in-sample squared residual, ε²t, acts as the proxy for the actual variance, which is not observable. As a proxy for the variance, the squared residual is unbiased but noisy. Andersen and Bollerslev (1998) showed that variance forecasts can be evaluated more accurately using realised volatility as the proxy. The next section discusses the literature on realised volatility in more detail. The use of high-frequency data to construct realised volatility for forecast evaluation can also be applied to parameter estimation for exponential smoothing, with the following minimisation expression:

min over α of Σt (RVt − σ̂²t)²,

where RVt is the daily realised volatility.
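The two minimisations above differ only in the proxy used for the actual variance. A minimal sketch, with illustrative function names of our own and a simple grid search in place of a numerical optimiser:

```python
import numpy as np

def ewma_forecasts(eps2, alpha, init=None):
    """One-step-ahead EWMA variance forecasts:
    sigma2[t+1] = alpha * eps2[t] + (1 - alpha) * sigma2[t]."""
    f = np.empty(len(eps2))
    f[0] = eps2.mean() if init is None else init  # start at the sample mean
    for t in range(1, len(eps2)):
        f[t] = alpha * eps2[t - 1] + (1 - alpha) * f[t - 1]
    return f

def fit_alpha(eps2, proxy, grid=np.linspace(0.01, 0.99, 99)):
    """Choose alpha by minimising the in-sample sum of squared
    one-step-ahead forecast errors against a variance proxy: squared
    residuals for ES, daily realised volatility for ES-RV."""
    sse = [np.sum((proxy - ewma_forecasts(eps2, a)) ** 2) for a in grid]
    return grid[int(np.argmin(sse))]
```

Passing squared daily residuals as `proxy` gives the ES estimate of α, while passing the realised volatility series gives the ES-RV estimate, mirroring the two minimisation expressions above.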
2.2.3 Realised volatility

Recent research interest in alternative volatility estimators has produced a significant literature on volatility models that incorporate high-frequency data. One of the emerging approaches is the so-called realised volatility. Realised volatility refers to volatility calculated from short-period, higher-frequency returns. Andersen and Bollerslev (1998) showed that high-frequency data can be used to compute daily realised volatility, which provides a more accurate measure of the true variance than the usual squared daily return. This concept was adopted by Andersen, Bollerslev, Diebold and Labys (2003) to forecast daily stock volatility; they found that the additional intraday information produces better forecasts, particularly on low-volume and up-market days. Realised volatility has also been employed by Taylor (2004) for parameter estimation in weekly volatility forecasting, using realised volatility derived from daily data. Encouraging results were obtained with the smooth transition exponential smoothing method, which was compared with GARCH models on weekly volatility forecasts for eight stock indices (Taylor, 2004). The concept of realised volatility has been employed by many researchers in forecasting many other financial assets, such as foreign exchange rates, individual stocks, and stock indices. One early application used the Deutschemark-US dollar and Japanese yen-US dollar spot exchange rates to demonstrate the superiority of intraday data as a realised volatility measure: the sum of squared five-minute high-frequency returns incorporated into the forecasting model outperformed daily squared returns as a volatility measure (Andersen et al., 1998).
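The construction itself is simple: daily realised variance is the sum of squared intraday returns within each trading day. A minimal sketch (the function name is ours):

```python
import numpy as np

def realised_volatility(intraday_returns):
    """Daily realised variance: the sum of squared intraday returns
    within each trading day. `intraday_returns` is a 2-D array with
    one row per day and one column per intraday interval."""
    r = np.asarray(intraday_returns)
    return np.sum(r ** 2, axis=1)
```

The example of a day with returns +1% and -1% shows why this proxy is less noisy than the squared daily return: the daily return nets to zero, so its square reports no volatility, while the realised measure correctly records the intraday movement.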
A similar study by Martens (2001) adopted realised volatility in forecasting daily exchange rate volatility using intraday returns; the results showed that using the highest available frequency of intraday returns leads to superior daily volatility forecasts. The realised volatility approach has also been extended to studies of the risk-return trade-off using high-frequency data. Bali et al. (2005) found a strong positive relation between risk and return in the stock market using high-frequency data: daily realised volatility, which incorporates valuable information from intraday returns, produces a more accurate measure of market risk. In addition, Tzang et al. (2009) applied the realised volatility approach as a proxy for market volatility, rather than squared daily returns, to assess the efficiency of various model-based volatility forecasts. Finally, Andersen, Bollerslev, Diebold and Labys (2001) showed that, under certain conditions, realised volatility is free of measurement error and is an unbiased estimator of return volatility. These findings have prompted much recent work on forecasting intraday volatility to apply realised volatility, as can be observed in McMillan and Garcia (2009), Fuertes et al. (2009), Frijns et al. (2008) and Martens (2001). Many researchers exploit realised volatility both as an unbiased estimator for intraday data and as a simple way to incorporate additional information into other forecast models. McMillan et al. (2009) used realised volatility to capture intraday volatility itself, as opposed to the daily realised approach taken by most researchers, and found the Hyperbolic Generalized Autoregressive Conditional Heteroskedasticity (HYGARCH) model to be the best forecast model of intraday volatility.
2.3 Forecast models used in this study

The forecast models presented in this study are:

1. Random Walk (RW)
2. 30-day Moving Average (MA30)
3. Exponentially Weighted Moving Average with α = 0.06 (RiskMetrics) (EWMA)
4. Exponential smoothing with α optimised (ES)
5. Integrated GARCH using daily data (IGARCH)
6. Exponentially Weighted Moving Average (RiskMetrics) on daily realised volatility calculated from intraday data (EWMA-RV)
7. Exponential smoothing with α optimised on daily realised volatility calculated from intraday data (ES-RV)
8. GARCH with intraday data using the realised volatility approach (INTRA-GARCH)
9. Integrated GARCH with intraday data using the realised volatility approach (INTRA-IGARCH)
10. GARCH with daily realised volatility (RV-GARCH)

CHAPTER 3: DATA AND METHODOLOGY

3.1 Sample selection and description of the study

Various comparative forecast models are used in order to evaluate the performance of incorporating intraday data. This study uses datasets from nine stock indices: Malaysia (FTSE-BMKLCI), Singapore (STI), Frankfurt, Germany (DAX30), Hong Kong (Hang Seng Index), London, United Kingdom (FTSE100), France (CAC40), Shanghai, China (SSE), Shenzhen, China (SZSE), and the United States (S&P 100). Each series consists of the daily closing prices and the hourly intraday last prices of the respective index. The daily closing prices were retrieved using DataStream Advance 4.0 and from Yahoo Finance (http://finance.yahoo.com), whereas the hourly intraday last prices were retrieved from a Bloomberg Terminal at Bursa Malaysia. Each stock index has its own trading hours, which produces a different number of observations for each series.
The total number of trading hours per day differs across the stock indices. The sample period spans approximately 300 trading days, from 15 October 2009 to 15 March 2011. In order to simplify the study, the focus is on one-step-ahead volatility forecasts. The log returns of the first 200 trading days were used to estimate the parameters of the various forecast models (the in-sample period), and the log returns of the remaining 100 trading days were used for post-sample evaluation. This study aims to forecast the volatility of daily log returns with the various forecasting methods, using daily realised volatility as the proxy for actual volatility. The next subsections present the data description and the 10 forecast methods considered in the study.

3.2 Data Analysis

3.2.1 Forecasting methods

This subsection describes the methodology used to evaluate the in-sample and out-of-sample performance of the various forecast models. The models include the Random Walk (RW), moving average, GARCH models, and exponential smoothing techniques.

3.2.1.1 Standard volatility forecast models using daily returns

This study adopts, as performance benchmarks, the simple moving average of the squared residuals over the most recent 30 daily observations, labelled MA30, and the Random Walk (RW). The 30-day simple moving average is given by:

σ̂²t+1 = (1/30) Σ(i=0 to 29) ε²t−i,

where ε²t = (rt − μ)², as defined in the previous section. The moving average smooths out short-run fluctuations and emphasizes long-run trends or cycles by averaging successive subsets of the data. The Random Walk (RW), by contrast, sets the forecast equal to the actual value of the most recent period, where the actual value used in this study is the squared residual, ε²t. Tomorrow's forecast value equals yesterday's actual value:

σ̂²t+1 = ε²t.
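The two benchmarks above can be sketched in a few lines of Python (the function names are ours, for illustration only):

```python
import numpy as np

def ma30_forecast(eps2, window=30):
    """MA30 benchmark: forecast tomorrow's variance as the simple
    average of the squared residuals over the previous `window` days."""
    eps2 = np.asarray(eps2)
    return eps2[-window:].mean()

def rw_forecast(eps2):
    """Random walk benchmark: tomorrow's forecast equals the most
    recent squared residual."""
    return np.asarray(eps2)[-1]
```

Because MA30 weights the last 30 observations equally while RW uses only the latest one, the two benchmarks bracket the EWMA, whose exponentially declining weights fall between these extremes.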
3.2.1.2 GARCH models for hourly and daily returns

Many different GARCH models for forecasting volatility could be included in this research; however, for practicality, the study is limited to two: GARCH and IGARCH, both with GARCH(1, 1) specifications. The three resulting forecast models are labelled IGARCH, INTRA-IGARCH, and INTRA-GARCH. The IGARCH model is estimated from daily residuals, as daily data are easily obtained from the sources mentioned above. The general IGARCH forecast model is given by:

σ²t = αε²t−1 + (1 − α)σ²t−1.

The parameter estimates generated by EViews 7, however, use the following expression:

σ²t = ω + αε²t−1 + βσ²t−1, subject to α + β = 1.

The INTRA-IGARCH and INTRA-GARCH models use hourly residual data to estimate the forecast of daily realised volatility. For these models, the volatility forecast over a span of N trading hours is taken as the forecast of daily volatility, where N depends on the trading hours of the particular stock index. The daily realised volatility for a stock index with N trading hours in a day is calculated as (equation 3):

RVt = Σ(i=1 to N) ε²t,i,

where i indexes the higher-frequency hourly periods and ε²t,i is the squared residual of hour i on day t. For example, if the KLCI index has 7 trading hours per day, its daily realised volatility is calculated as the sum of the squared residuals of those 7 hours. The INTRA-IGARCH and INTRA-GARCH forecast models apply equation 3 to obtain the daily realised volatility forecast by replacing the squared residuals, ε²t,i, with the values forecast by these models.
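As a sketch of how the IGARCH(1, 1) parameter can be estimated outside EViews, the following fits α by Gaussian maximum likelihood with a one-dimensional bounded optimiser. This is our own illustrative implementation, not the EViews procedure; it assumes a zero-ω IGARCH and a Gaussian likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def igarch_neg_loglik(alpha, eps):
    """Gaussian negative log-likelihood of IGARCH(1,1) with omega = 0:
    sigma2[t] = alpha * eps[t-1]**2 + (1 - alpha) * sigma2[t-1]."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = np.mean(eps ** 2)  # initialise at the sample variance
    for t in range(1, len(eps)):
        sigma2[t] = alpha * eps[t - 1] ** 2 + (1 - alpha) * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + eps ** 2 / sigma2)

def fit_igarch(eps):
    """Estimate alpha by minimising the negative log-likelihood."""
    res = minimize_scalar(igarch_neg_loglik, bounds=(0.01, 0.99),
                          args=(eps,), method="bounded")
    return res.x
```

The same routine applies unchanged to the INTRA-IGARCH case by passing hourly rather than daily residuals, with the fitted recursion then aggregated over the N trading hours of the day.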
3.2.1.3 GARCH model using realised volatility

The GARCH model can also be estimated using the daily realised volatility derived from the hourly squared residuals with equation 3. In order to apply realised volatility in the GARCH forecast model, the square root of equation 3 is taken so that the required parameter estimates can be obtained using EViews 6:

√RVt = √(Σ(i=1 to N) ε²t,i).

In this study, the GARCH model that uses daily realised volatility as input data is labelled RV-GARCH.

3.2.1.4 Exponential smoothing and EWMA methods

The exponential smoothing forecast method is implemented in two ways. The first uses the minimisation of equation 3 to optimise the smoothing parameter and is labelled ES; the actual value (the squared residual, ε²t) is obtained from the daily data. The second approach, which is said to provide the better variance proxy, applies equation 4 for the minimisation; this model is termed ES-RV and adopts the daily realised volatility derived from hourly data. The study also considers fixing the smoothing parameter at α = 0.06, as recommended by RiskMetrics (1996), for models using daily data and daily realised volatility derived from hourly data; these models are termed EWMA and EWMA-RV, respectively. Using equation 2 shown previously, EWMA uses the daily squared residual as the ε²t−1 input, while EWMA-RV uses the daily realised volatility as the ε²t−1 input.

3.3 Research Design (Gantt Chart)

The project schedule runs from July to March and covers, in order: literature review, methodology, research proposal, data collection, data analysis, and discussion and conclusion.