Degree, Academic

I INTRODUCTION

Degree, Academic, title granted by a college or university, usually signifying completion of an established course of study. Honorary degrees are conferred as marks of distinction, not necessarily of scholarship.

II HISTORY

Institutions of higher learning have granted degrees since the 12th century. The word itself was then used for the baccalaureate and licentiate, the two intermediate steps that led to the certificates of master and doctor, requisites for teaching in a medieval university. During the same period, honorary degrees were sometimes conferred by a pope or an emperor. In England, the Archbishop of Canterbury, by an act passed during the reign of King Henry VIII, acquired the authority to grant honorary Lambeth degrees.

During the Middle Ages, the conferring of a doctorate also allowed the recipient to practise the profession in which the certificate was awarded; this condition still holds true for the legal and medical professions in European countries, such as France, in which the government controls the universities.

III DEGREE SYSTEMS

In Germany and at most Continental universities, only the doctor’s degree is conferred, except in theology, in which the licentiate, or master’s degree, is also presented. Granting of the doctorate is contingent upon the acceptance of a dissertation and the passing of examinations. The baccalaureate is not usually a university degree in Europe. In France, it is acquired by passing a state examination at the completion of secondary education; the only university-conferred baccalaureate is that awarded by the faculty of law.

Most British universities grant the bachelor’s degree after the satisfactory completion of a three- or four-year course. Some, such as Oxford and Cambridge, have special honours examinations (known at Cambridge as the tripos). The master’s degree in arts or science is granted after a further period of residence and study and the payment of fees. Other English universities grant the master’s degree only after a candidate has passed a series of examinations and presented an approved thesis. Doctorates (Ph.D. or D.Phil.) are awarded for individual research contributions in the arts or sciences. Postgraduate work leads to the writing of a thesis and the passing of oral (viva voce) and written examinations. Honorary degrees, such as the D.Litt., are sometimes given to prominent public figures.

In Australia, a bachelor’s degree precedes a master’s degree, with the latter being earned after an additional year or two of study. Degree courses vary between three and six years of study, with a doctor’s requiring a further two to five years. The most commonly granted degrees in the United States are the BA, or bachelor of arts, and the B.Sc., or bachelor of science, both generally given after the completion of a four-year course of study and sometimes followed by a mark of excellence, such as cum laude (“with praise”); magna cum laude (“with great praise”); or summa cum laude (“with highest praise”). The master’s degree is granted after one or two years of postgraduate work and may require the writing of a thesis or dissertation. The doctorate requires two to five years of postgraduate work, the writing of a thesis, and the passing of oral and written examinations.

IV ACADEMIC COSTUME

The academic dress worn at degree-granting ceremonies consists of a long, full-cut gown and a mortarboard, a stiff square-shaped cap with a tassel. In the United Kingdom, a hood lined with coloured silk indicating the graduate’s institution is worn. More ornate gowns are worn for the D.Phil. and other higher degrees, and for ceremonies such as the Encaenia (the ceremony at which honorary degrees are granted). Full subfusc is still required at Oxford for examinations; this black and white outfit consists of dark suits for men and a white blouse and black skirt for women, worn with a gown and mortarboard.

Literacy

I INTRODUCTION

Literacy, the ability to read and write at a level that enables an individual to operate and progress in the society in which they live. It is sometimes further defined as the ability to decode written or printed signs, symbols, or letters combined into words. In 1958 UNESCO defined an illiterate person as someone “who cannot with understanding both read and write a short simple statement on his/her everyday life”.

II THE DEVELOPING WORLD

Most literacy surveys use this basic definition, particularly surveys of literacy levels in developing countries. On this definition, about 4 in 5 of the world’s population over 15 years of age would be considered literate. According to information released by UNESCO in 2003, more women are now literate than ever before.

The proportion of adults in the world who are illiterate fell from 22.4 per cent in 1995 to 20.3 per cent in 2000; in absolute terms, from about 872 million adults to 862 million. If this trend continues, the number of illiterate adults should drop to 824 million, or 16.5 per cent, by 2010. The largest falls in illiteracy have been in Africa and Asia.

Although women continue to make up 2 in every 3 of the illiterate adults in the world, the number of illiterate women is falling and the percentage of illiterate women has dropped from 28.5 per cent to 25.8 per cent. This tendency is particularly marked in Africa, where, for the first time, most women are now literate. Although discrimination is one major reason why girls and women lack access to education, countries in the developing world increasingly recognize the benefits of providing access to education for girls and women, particularly as the children of educated women are more likely to become educated themselves.

Progress is slow, however, and about 20 per cent of adults remain illiterate. Worryingly, at the present rate, the number of illiterate adults is likely to fall by only about 5 per cent by the year 2015. Just as worryingly, the United Nations Children’s Fund (UNICEF) reports that 121 million children in the world are not in school, and that most of these are girls.

Although the Universal Declaration of Human Rights in 1948 and the 1989 Convention on the Rights of the Child established education as a basic human right, about 1 in 5 adults in the world were unable to read and write at the beginning of the 21st century.

III THE INDUSTRIALIZED WORLD

Far more than basic literacy as defined by UNESCO is necessary for any adult living in an industrialized society. Recognition of this has led to the use of more complex definitions of literacy in most industrialized countries and the use of the term “functionally illiterate” rather than “illiterate”. This term—functionally illiterate—is usually used to refer to adults who are unable to use a variety of skills beyond the reading or writing of a simple sentence. In the industrialized world someone is considered to be functionally literate if they can “use reading, writing, and calculation for his or her own and the community’s development” rather than if they can merely read and write to some limited extent.

There have been rather fewer surveys of functional illiteracy in industrialized countries than of illiteracy in the developing world. However, beginning in 1994 governments, national statistical agencies, research institutions, and the Organisation for Economic Co-operation and Development (OECD) undertook a large-scale assessment of the literacy skills of adults, called the International Adult Literacy Survey (IALS). The IALS considered literacy in three areas: the first, “prose literacy”, focused on reading and interpreting prose in newspaper articles, magazines, and books; the second area, “document literacy”, focused on identifying and using information located in documents, such as forms, tables, charts, and indexes; the third, “quantitative literacy”, considered how well adults could apply numerical operations to information contained in printed material, such as a menu, a chequebook, or an advertisement.

The IALS considered how well adults in industrialized countries could use information to function in society and the economy rather than just classifying adults as either functionally literate or functionally illiterate. This definition was much more about the ability to operate in a print-based industrialized world than about the simple ability to decode print.

Nine countries—Canada (English- and French-speaking populations), France, Germany, Ireland, the Netherlands, Poland, Sweden, Switzerland (German- and French-speaking regions), and the United States—took part in the IALS in 1994. Two years later, in 1996, five more areas—Australia, the Flemish Community in Belgium, Great Britain, New Zealand, and Northern Ireland—administered the IALS assessment tests to samples of their adults. Finally, Chile, the Czech Republic, Denmark, Finland, Hungary, Italy, Norway, Slovenia, and the Italian-speaking region of Switzerland took part in the IALS in 1998.

The IALS established five levels of literacy:

• Level 1 suggested that a person had very poor skills. Adults at this level would be unable to determine the correct amount of medicine to give a child from information printed on the package.
• Level 2 adults could deal only with simple material that was clearly laid out, and with tasks that were not very complex. Adults at this level had weak skills: they could read, but tested poorly.
• Level 3 was considered the minimum for coping with the demands of everyday life and work in an industrialized country. It suggests the approximate level of literacy required for successfully completing secondary education and for moving on to higher education. At this level an adult should be able to integrate several sources of information and solve more complex problems.
• Levels 4 and 5 were used to describe adults who had higher-order information processing skills.

What is clear from the IALS is that there are considerable differences in the average level of literacy both within and between countries. In every country some adults had low-level literacy skills, although the proportion varied from country to country. Common factors that influenced literacy level were home background and previous educational attainment. Recently, however, researchers and other experts have questioned the method used in the IALS and have suggested that it may not give an accurate picture of literacy in industrialized countries.

Contributed By:
Alan Wells

Educational Broadcasting

I INTRODUCTION

Educational Broadcasting, use of radio and television to assist teaching and learning.

Pioneers of wireless telegraphy (radio), such as Marconi, working 100 or so years ago, believed the new technology would soon be put to useful purposes. Shortly after it was set up in 1922, the British Broadcasting Company, later the British Broadcasting Corporation (BBC), began to see how it could use “wireless”, as it was first called, to assist learning. John Reith, who as a public service broadcaster aimed to “educate, inform, and entertain”, set up the first National Advisory Committee on Education in 1923 and appointed a Director of Education, a school inspector who, in 1924, wrote an article in the BBC’s programme listings magazine, the Radio Times, proposing a Broadcasting University.

The earliest experimental broadcasts to schools emanated from Glasgow and London in 1924, and by the autumn of that year regular secondary school and adult education broadcasts were in place, with regular supporting publications coming soon after. A new weekly publication, The Listener, began publishing transcripts of educational talks in 1929 and developed into a magazine in its own right before ceasing publication in 1991. As the services grew, education officers were appointed to liaise with the educational world and to advise on policy. Separate Advisory Committees for School and Adult Education were set up and, for the latter, a Group Listening movement was encouraged. During the 1930s the whole system flourished, with most subjects on the curriculum treated. Mathematics was, interestingly, an exception.

Among initiatives at this time were new ways of learning, emphasizing a more imaginative, child-centred approach. Programmes in Gaelic and Welsh were introduced for children in Scotland and Wales. In the early 1930s, it was not thought appropriate to make broadcasts for younger pupils. However, largely because of improved broadcasting practices, using drama and music in place of straight “talk”, such broadcasts quickly became successful later in the decade. History and foreign language teaching series were firm favourites.

II THE IMPACT OF EDUCATIONAL BROADCASTING

The British system of educational broadcasting, transmitted nationally, was soon widely studied and used as a model in many countries, notably those in what, later, became the Commonwealth, and in more distant countries, such as Japan. It was realized that radio had great potential in both formal and informal education by adding to what teachers could provide, reaching isolated groups of learners, filling in for non-existent teachers, and acting as an agent of in-service training. Before long, “radio schools” were operating in countries with dispersed and remote populations, a notable example being the School of the Air in Australia, where two-way radio supplemented correspondence courses.

World War II severely disrupted life in Britain, but school broadcasting flourished and by 1945 some 2,000 more schools were using the service than in 1939, with 30 weekly series offered. Music, drama, and arts and crafts became popular subjects, along with civics and current affairs. Educational broadcasts became an anchor for teachers who, in the words of one headmistress, saw them as “lifebuoys in a queer, turbulent, scholastic sea”. As the war ended, Forces Educational Broadcasts were devised to help demobilized service people with their return to civilian life. Television then became the centre of interest, with pilot experiments in school television in 1952 leading to a permanent service in 1957.

By the mid-1960s a comprehensive system of school broadcasts was being provided both by the BBC and Independent Television (ITV), which had started school television broadcasts just ahead of the BBC in 1957. Channel 4, which was launched in 1982, now handles school broadcasting for commercial television. The visual medium added a new dimension to learning and, significantly, mathematics became a successful subject for educational broadcasting, helping pupils and teachers to deal with the “new maths”, then much in the news. Apart from continuing series in such subjects as modern language teaching, science, and history, adult education broadcasts began to address social issues such as parenting, old age, illiteracy, and unemployment—what became known as Social Action broadcasts.

In less-developed countries, educational broadcasts became a tool in social and political development, with campaigns treating health and farming issues. By the 1960s there was considerable evidence that educational broadcasting was a powerful branch of distance learning.

Much work had been done worldwide when, in 1969, a major step was taken in the United Kingdom with the setting up of the Open University (OU). This combined the practice of correspondence learning, a well-proven distance learning technique, with educational broadcasts. It was effected by an alliance with the BBC, which created a department to make the radio and television programmes. These were first broadcast in 1971, accompanying the OU printed courses, which were prepared by course teams that included the BBC producers. The OU model is now used worldwide and draws on a very broad range of new educational technology.

In the late 1990s the consultative method of deciding educational output with councils, started in the 1920s, was changed. An important strand of programmes supported pupils taking new national examinations, and the Internet came into play. The “new” technology was now more interactive.

III TECHNOLOGY IN EDUCATIONAL BROADCASTING

The Internet is the latest “new” technology to propel a development in educational broadcasting. Prominent examples in the past have been the transistor, which allowed many more people to use radio conveniently, especially in developing countries where teachers were scarce; stereo sound; black and white television, followed by colour television, which was an important addition to programmes dealing with subjects such as natural history and geography; improvements in recording techniques, including cassettes and disc recordings (CD-ROMs); and the growth of transmission systems, such as cable and satellite, resulting in much increased coverage.

There have been many attempts, successful and unsuccessful, to use satellites to transmit educational radio and television. India, in 1975, saw the Satellite Instructional Television Experiment, aimed at small, distant villages and supported by money and know-how from the United States. Canada and Australia, among many other countries, devised distance learning projects. Most of these earlier, often ambitious, schemes faced the problems experienced by their earthbound forerunners in the 1920s, namely inefficient transmitting and receiving apparatus, unreliable liaison between users and providers, inadequate backup print material, inappropriate syllabuses, and ultimately the need for human contact.

Most of these problems have been solved to some degree in developed countries, with even liaison and interactivity partially solved. Efficient postal systems, the telephone (fixed and mobile), and various recording and playback systems based on computer technology all have a part to play in educational broadcasting in the new millennium. The Internet and its websites are now familiar to many children in developed countries and among educational elites elsewhere, but they remain of little significance to the very many more who lack the most basic means of subsistence.

Reviewed By:
John Cain

Education Act 1944

Education Act 1944, legislation that radically changed the structure of education in England and Wales. Rab Butler, as Minister of Education, championed the passage of the Act, and it is after him that the Act is commonly known as the “Butler Act”. The Act affected education in four main ways.

First, it increased the role and powers of the Minister of Education. Before 1944, the Minister was simply responsible for “the superintendence of certain matters relating to education in England and Wales”. Under the 1944 Act, the Minister was charged with a positive duty to “promote” education and to “secure the effective execution by local authorities, under his control and direction” of the national policy for education. These strengthened ministerial powers created an education system that came to be known as “a national service, locally administered”.

Second, the administration of education was itself re-shaped: 169 of the 315 local education authorities that existed before 1944 were abolished on April 1, 1945. That left the local administration of all forms of public education in the hands of 146 county councils and county borough councils.

Third, the Act restructured the school system. Previously, publicly funded schools and colleges were either “elementary” or “higher”, with “higher” institutions including everything that was not elementary. So secondary schools, to which about 20 per cent of children went at the age of 11, and junior technical schools, to which only about 1 per cent of children were admitted, were both distinguished from “elementary” schools, attended by nearly 80 per cent of the school population. Compulsory education ended at the age of 14, so the country’s 11-to-14-year-olds were in two overlapping systems leading to very different job and further education opportunities. The Act put an end to this. It organized schools “in three progressive stages to be known as primary education, secondary education, and further education”. This structural change, often described as “secondary education for all”, was perhaps the Act’s greatest achievement. Part of that achievement, at a time of great economic difficulty, was to raise the school leaving age, initially from 14 to 15, in 1947.

Finally, Butler, the government minister responsible for seeing the Education Bill through parliament, brought about a compromise between schools maintained by councils and denominational schools. The essence of this “dual control” system, which still exists, was that nearly all church (denominational) schools chose either to become “controlled” or “aided”. Controlled schools were entirely funded by the local council, whereas the governors or managers of aided schools remained responsible for capital expenditure on the fabric of the buildings, which they continued to own, and had increased rights over staffing and the curriculum (see Religious Schools).

Much of the framework of the 1944 Education Act remained in place until the end of the 20th century. Although it achieved much, the Act left two important issues unresolved. One was the structure of the newly created secondary schools. Initially, the emphasis was on a tripartite system: grammar schools for some, technical schools for a very few, and secondary modern schools for most. Selection for these schools was managed by local education authorities. The 11+ tests, as they came to be known, were taken in the last year of primary education by all children, and usually consisted of a general intelligence test, a test of attainment in English and arithmetic, and a report from each child’s primary school teacher. The pattern of schools that followed the 1944 Act has increasingly been replaced by schools that combine all three elements in one “comprehensive” school, and by January 2000 the 11+ examination was retained only in the areas served by the 164 remaining grammar schools. Specialisms within comprehensive schools have, however, been encouraged, and aptitude testing related to specialisms, such as music, foreign languages, and technology, has steadily increased.

A second unresolved issue was the school curriculum. The Act deliberately left the curriculum to be determined locally by schools and local education authorities. It was not until 1988 that the notion of a national curriculum, defining a common entitlement for all children in whatever type of school they were enrolled, was given legislative force.

Contributed By:
Peter Anthony Newsam

Indo-European Languages

I INTRODUCTION

Indo-European Languages, the most widely spoken family of languages in the world (although not the largest language family in the world), containing the following nine subfamilies: Albanian, Armenian, Baltic, Celtic, Germanic, Greek, Indo-Iranian, Italic (including the Romance languages), Slavic; and five extinct subfamilies, Anatolian (including Hittite), Phrygian, Thracian, Tocharian, and an Unclassified group (including Venetic, which some linguists believe to be an Italic language). Indo-European languages were first spoken in Europe and southern Asia and, because of European colonialism, are now widespread throughout the world.

II ESTABLISHMENT OF THE FAMILY

Proof that these highly diverse languages are members of a single family was largely accumulated during a 50-year period around the turn of the 19th century. The extensive Sanskrit and Ancient Greek literatures (older than those of any other Indo-European language except the then-undeciphered Hittite) preserved characteristics of the basic Indo-European forms and pointed to the existence of a common parent language. By 1800 the close relationship between Sanskrit, Ancient Greek, and Latin had been demonstrated. Hindu grammarians had systematically classified the formative elements of their ancient language, and to their studies were added extensive grammatical and phonetic comparisons of European languages. Further studies led to specific conclusions about the sounds and grammar of the assumed parent language (called Proto-Indo-European), the reconstruction of that hypothetical language, and estimates about when it began to break up into separate languages. (By 2000 bc, for example, Greek, Hittite, and Sanskrit were distinct languages, but the differences between them are such that the original tongue must still have been fairly unified about a millennium earlier, in about 3000 bc.) The decipherment of Hittite texts (identified as Indo-European in 1915) and the discovery in the 1890s of Tocharian (spoken in medieval Chinese, or Eastern, Turkistan, and identified as Indo-European in 1908) added new insights into the development of the family and the probable character of Proto-Indo-European.

The early Indo-European studies established many principles basic to comparative linguistics. One of the most important of these was that the sounds of related languages correspond to one another in predictable ways under specified conditions (see Grimm’s Law and Verner’s Law for examples). According to one such pattern, in some Indo-European subfamilies—Albanian, Armenian, Indo-Iranian, Slavic, and (partially) Baltic—certain presumed k sounds of Proto-Indo-European became sibilants such as s and ś (a sh sound). The common example of this pattern is the Avestan (ancient Iranian) word satem (“100”), as opposed to the Latin word centum (“100”, pronounced “kentum”). Formerly, the Indo-European languages were routinely characterized as belonging either to a Western (centum) or an Eastern (satem) division. Most linguists, however, no longer automatically divide the family in two in this way, partly because they wish to avoid implying that the family underwent an early split into two major branches, and partly because this trait, although prominent, is only one of several significant patterns that cut across different subfamilies.

III EVOLUTION

In general, the evolution of the Indo-European languages displays a progressive decay of inflection. Thus, Proto-Indo-European seems to have been highly inflected, as are ancient languages such as Sanskrit, Avestan, and classical Greek; in contrast, comparatively modern languages, such as English, French, and Persian, have moved towards an analytic system (using prepositional phrases and auxiliary verbs). In large part the decay of inflection was a result of the loss of the final syllables of many words over time, so that modern Indo-European words are often much shorter than the ancestral Proto-Indo-European words. Many languages also developed new forms and grammatical distinctions. Changes in the meanings of individual words have been extensive.

IV ANCIENT CULTURE

The original meanings of only a limited number of hypothetical Proto-Indo-European words can be stated with much certainty; derivatives of these words occur with consistent meanings in most Indo-European languages. This small vocabulary suggests a New Stone Age or perhaps an early metal-using culture with farmers and domestic animals. The identity and location of this culture have been the object of much speculation. Archaeological discoveries in the 1960s, however, suggest that it may have been the prehistoric Kurgan culture. Located in the steppes west of the Ural Mountains between 5000 and 3000 bc, this culture had diffused as far as Eastern Europe and northern Iran by about 2000 bc.

See also Franz Bopp; Jacob Grimm; Ferdinand de Saussure; Philology.

Selected statistical data from Ethnologue: Languages of the World, SIL International.

Education, Military

I INTRODUCTION

Education, Military, training of the officers and enlisted (or conscripted) personnel of a nation’s military and naval forces. The goal of such training is to equip members of the services with the basic skills and discipline needed for appropriate action under the stress of combat. Qualified personnel may receive more formal education to enable them to advance professionally.

II DEVELOPMENT OF SERVICE ACADEMIES

Until the mid-18th century, military training was restricted to drill and acquiring knowledge of weaponry and tactics primarily from practical experience in battle. The technological evolution of warfare, however, led to the establishment of formal military and naval academies specifically for training professional officers. Prussia was a pioneer in this field with the establishment in 1810 of the Kriegsakademie (War Academy) in Berlin. Today, in Germany, the universities of the Bundeswehr (German Armed Forces) are primarily technical and scientific in orientation.

In France, a two-track system evolved. L’École Polytechnique serves as a general technical university for army, naval, and civil service personnel, whereas L’École Spéciale Militaire, founded in 1808 by Napoleon, provides two years of initial military training. Originally located at St Cyr, it is now based at Coëtquidan. Advanced military training is provided by the École Supérieure de Guerre.

In 1947 the Royal Military College at Sandhurst, founded in 1802, was combined with the older Royal Military Academy at Woolwich, founded in 1741, to form the Royal Military Academy, Sandhurst. Until World War II broke out in 1939, British army officers had been trained in artillery, communications, and engineering at Woolwich, or received cavalry and infantry training at Sandhurst. Naval cadets are trained at the Royal Naval College, Dartmouth; air force cadets at the RAF College, Cranwell. The Imperial Defence College serves as the school for senior service personnel. The emphasis on military élitism in the Soviet Union was reflected in the maintenance of 21 strongly research-orientated military academies on a level with civilian universities.

In the United States, the Reserve Officers Training Corps (ROTC) programmes in high schools and colleges prepare students for commissions in the Army Reserve. The United States Military Academy is a four-year engineering college whose graduates are commissioned into the Regular Army. Corresponding service institutions are the United States Naval Academy, the United States Air Force Academy, the United States Coast Guard Academy, and the US Merchant Marine Academy (see Merchant Marine of the United States).

The US Department of Defense maintains the National Defense University in Washington, D.C., which trains senior officers and selected civilians in theories and procedures relevant to national security.

III MODERN MILITARY EDUCATION

Current trends in military education reflect rapid changes brought about by technological innovation and are geared to national security requirements. Advanced courses taken at civilian colleges and universities play a major role in supplementing the educational experience of selected personnel (primarily officers). In the United States, integration of women into the service academies was begun in 1976. On fulfilling their mandatory obligations or on retiring from the services, many embark on civilian careers in management or in professions requiring the skills acquired in the services.


Education, Medical

Education, Medical, a process by which individuals acquire and maintain the skills necessary for the effective practice of medicine.

To train as a conventional doctor in the Western world a person needs to have achieved a good level of understanding in the sciences (for example, physics, chemistry, biology), either at senior (high) school or at college. Medical schools are usually part of a university (although not all universities have medical schools) and they offer only a limited number of training places in any one year. This results in fierce competition for places, with only the best students being admitted.

Most medical schools offer a training course of between three and six years in duration. The curriculum is traditionally divided into two parts: a preclinical course in which the basic science of how the human body works is studied; and a clinical course in which the student is introduced to actual patient care in a hospital. The former is usually taught in science departments at the university and the latter at a hospital affiliated with the university.

The preclinical course involves such areas of study as the gross and microscopic appearance and connections of the human body (anatomy), the organization and basic functions of different types of human cell (cell biology), the function and underlying biochemical processes of parts of human cells (biochemistry), the integrated functions of tissues, organs, and body fluids (physiology), the principal actions, distribution, and elimination of drugs in the body (pharmacology), the general principles underlying disease processes and such disease-related micro-organisms as viruses, bacteria, and parasites (pathology), the defence mechanisms of the body (immunology), and the structure and function of genetic material in living and infected cells (genetics).

The clinical part of the course involves medical students working with experienced doctors in general practice and hospitals to learn family practice and general medicine, and such specialized areas of health care as surgery (removal, reconnection, or transplantation of parts of the body), obstetrics (pregnancy and childbirth), paediatrics (diagnosis and treatment of childhood complaints), gynaecology (diagnosis and treatment of ailments of the reproductive system), geriatrics (diagnosis and treatment of ailments suffered by elderly people), and psychiatry (diagnosis and treatment of mental ill-health). During this time, medical students observe and learn from doctors working with patients on the wards and in specialist clinics, and gradually, under their supervision, become involved directly in the provision of health care (for example, diagnosis and administration of therapy).

Students have to pass examinations in all of these different aspects of the course, which take the form of written, practical, and oral tests. Upon graduating, they receive a Doctor of Medicine (MD), Bachelor of Medicine (BM), or an equivalent degree. New doctors swear the Hippocratic Oath (or an equivalent professional statement) to adhere at all times to high standards of medical practice and ethics, and to protect the right of every patient to life, dignity, and confidentiality.

It is usual for “junior” doctors to serve at least one year as an “intern” or “house officer”, with responsibility for both diagnosing and treating patients in the hospital. At this point, they may choose to move to a new hospital. Such a post, however, is considered an extension of their training, with overall responsibility for their work resting with the senior colleagues who supervise them. In most countries, “junior” doctors often complain that they work excessively long hours for relatively poor pay (that is, relative to other professionals after several years of training).

During his or her time as a junior doctor, an individual must decide whether to work in general medicine or in a specialist branch of medicine. If the latter, the doctor applies to work with a particular specialist and his or her team and, once accepted, embarks upon a training course lasting several years, the training being obtained largely through the experience of working with other, more experienced doctors in the group. During this time, he or she is called a “registrar” or “intern”, and the training culminates in both written and oral exams set by an official body for that subject (for example, the Royal College of Pathologists or the Royal College of Surgeons in the United Kingdom, both of which decide whether a doctor is sufficiently knowledgeable and able to practise as a specialist in that particular area of medicine). If successful, the doctor is awarded “membership” of the college.

It is important that doctors keep up with medical progress (the results of medical research concerning new forms of diagnosis and treatment). Most often this takes the form of reading medical journals and books, attending conferences, and discussing medical matters with other specialists in the same or different fields. More recently, doctors have been able to communicate with one another and receive the latest medical information using the Internet (often referred to as the “information superhighway”), which can link computers used by doctors in different hospitals and/or general practices around the world.

Some doctors, especially those in general practice, choose to incorporate such unorthodox medical techniques as acupuncture or reflexology (see Complementary Medicine) into their medical practice and offer these to their patients, where appropriate, usually in parallel with more conventional treatments; these are seldom offered as an alternative to conventional Western medicine. So popular are some of these unorthodox methods that some medical schools are now offering training courses on these topics for both trainee and postgraduate (that is, experienced, practising) doctors.

Contributed By:
Claire Elizabeth Lewis


Education, Physical

I INTRODUCTION

Education, Physical, instruction in various kinds of physical activity to promote the physical development and well-being of the individual. Physical education is generally taught in schools from nursery to secondary level, and in some countries, including Britain, is a compulsory part of the curriculum. It involves organized sports, gymnastics, dance, athletic activities, swimming, and outdoor and adventurous activities. The United Nations Educational, Scientific, and Cultural Organization (UNESCO) considers physical education programmes an important part of its mission.

II HISTORY

The nature of physical education and sport today has been influenced by many cultures. For example, in ancient times, physical education consisted of gymnastics to improve strength, agility, flexibility, and endurance. The Greeks considered the human body to be a temple that housed the mind and the soul, and gymnastics kept that temple healthy and functional. Eventually, structured gymnastic and callisthenic exercises were abandoned in favour of sports.

Traditionally, the objectives of physical education have been categorized as either promoting “education of the physical” or “education through the physical”. Education of the physical focuses on the actual development of the body and physical skills rather than any results that can be achieved through physical activities, while education through the physical emphasizes the acquisition of physical skills and bodily development, as well as nurturing emotional, intellectual, and social skills in the process. The latter approach utilizes carefully selected physical activity as a medium through which desirable objectives can be met.

III PHYSICAL EDUCATION TODAY

The scope of physical education and sport in society has widened considerably in the latter part of the 20th century. The two traditional approaches have become more closely interrelated, a trend that looks set to continue into the 21st century. Physical education and sporting opportunities, in general, have become more widely available, not just to the school-age population, but to people of all ages, in non-school settings, such as community and fitness centres.

With the increased awareness of the importance of an active lifestyle, physical education is seen as laying the foundations in young people for long-term health and improved quality of life. Many educationalists, administrators, policy-makers, and activity providers view physical education and sport as occurring at various levels. For most people, introduction to the traditional major sports usually takes place at school, and as the range of sports opportunities widens, children increasingly encounter new sports for the first time. Having been introduced to a sport, some become irregular participants, following what is called the “recreation route”, while others may join a club and strive to improve personal performance, the “performance development route”.

Physical education, sports studies, and sports sciences are well recognized now as examination subjects at school, pre-university, and university level. Universities offer degree courses in areas such as leisure studies, community sport/arts/outdoor pursuits, recreation management, human movement studies, and physical education teacher training. National Vocational Qualifications (NVQs) in sport and recreation also provide routes into university education and the leisure industry. See also Physical Fitness.

Reviewed By:
Chris Laws


Education, Multicultural

I INTRODUCTION

Education, Multicultural, educational approach that celebrates the cultural diversity of contemporary society. Its basic premise is that by exposing all children to the social and cultural customs of ethnic minority communities living in their country, they will develop a greater understanding and tolerance of people from different backgrounds. This article deals only with multicultural education in Britain.

The concept of multiculturalism in schools is part of a continuing debate about how to address the inequalities among different ethnic groups that exist in the education system, as well as how to engender tolerance and understanding between them. For decades, British educational theorists have been split between multiculturalism and anti-racism, the latter being a more direct challenge to racist structures in society. Since the publication in 1999 of the Macpherson Report into London’s Metropolitan Police Service’s handling of the murder of black teenager Stephen Lawrence, the concept of “institutional racism” has superseded the previous debates.

II HISTORY

Ever since the first post-war wave of immigration from the Caribbean in the 1950s, the British education establishment has explored ways of catering for children from ethnic minorities. In the 1950s and 1960s, this took an assimilationist line, in which the emphasis was on teaching English as a second language where needed. The social, cultural, and economic factors that were barriers to ethnic minority children’s acceptance into British society were largely ignored.

The Race Relations Act of 1976 coincided with widespread concern among many black parents and educationalists about the failure of their children in school. Their growing dissatisfaction with the education system led to the spontaneous establishment of a large number of Saturday or supplementary schools in inner cities around the country. These schools, still in existence, are designed to boost children’s achievement in curricular subjects as well as give them a grounding in their own cultural heritage.

The introduction of multicultural education in schools was largely a response to this threat of separatism, coupled with the impact of academic studies contending that black children’s low achievement could be tackled by developing curricula that reflected cultural diversity. The white Eurocentrism of learning materials was challenged, leading to the introduction of more images and stories of black people in books and the celebration of Asian and Caribbean festivals. Ethnic minority storytellers, musicians, poets, dancers, and theatre groups were frequent visitors to schools, and teachers or parents cooked food from different countries. The teaching in mother tongue or community languages was also introduced in some schools.

III ISSUES AND CONTROVERSIES

While these innovations were a first step in addressing the ethnic diversity of post-war Britain, they had many critics. On the political right, head teachers and parents vehemently opposed left-of-centre local education authorities’ impositions of multicultural policies. The long-awaited publication of the Education for All report in 1985, commissioned by the government and written by the Swann (formerly Rampton) Committee to investigate the “educational needs and attainments of pupils of West Indian origin”, represented the single most important argument for multicultural education. In the highly controversial document, Lord Swann and his team highlighted the need for multiculturalism in the curriculum as an important means of combating the racism that existed in schools. Although the report was derided by many of its critics and its findings ultimately rejected, educationalists in practice have implemented its recommendations on weaving multicultural themes and issues into their teaching ever since. As well as recommending curriculum content that reflected the cultural diversity of modern Britain, the report stressed the need for all schools, regardless of their ethnic make-up, to take a multicultural approach.

Disagreement over strategies for multicultural education has, however, continued on both sides of the political arena. On one side, it was argued that anti-racist policies disadvantaged white pupils and cluttered the curriculum with irrelevancies; on the other, that multiculturalism reinforced racial stereotypes. Some critics also argued that the exclusive focus on cultural diversity ignored the more fundamental issue of the institutionalized racism in schools against children of ethnic minorities. The anti-racism and multiculturalism that became opposing forces in the academic world remain so, to a certain extent, to this day.

The Education Reform Acts of 1988 and 1993 have also played their part in taking multiculturalism and anti-racism off the curriculum agenda in practical terms. By generalizing equality of opportunity to all who are perceived to be socially and economically disadvantaged, issues around race and ethnicity have moved out of the classroom. Instead, through ethnic monitoring of exclusions and attainment at local authority and school levels, they have become the concerns of school management. While this may help to raise awareness about different levels of achievement between ethnic groups, many feel that it leaves the question of how to educate children to live in a multicultural society unanswered.


Contributed By:
Reva Klein
