Education Act 1944

Education Act 1944, legislation that radically changed the structure of education in England and Wales. Rab Butler, as Minister of Education, championed the Act’s passage through Parliament, and it is after him that the Act is commonly known as “the Butler Act”. The Act affected education in four main ways.


First, it increased the role and powers of the Minister of Education. Before 1944, the Minister was simply responsible for “the superintendence of certain matters relating to education in England and Wales”. Under the 1944 Act, the Minister was charged with a positive duty to “promote” education and to “secure the effective execution by local authorities, under his control and direction” of the national policy for education. These strengthened ministerial powers created an education system that came to be known as “a national service, locally administered”.

Second, the administration of education was itself re-shaped: 169 of the 315 local education authorities that existed before 1944 were abolished on April 1, 1945. That left the local administration of all forms of public education in the hands of 146 county councils and county borough councils.

Third, the Act restructured the school system. Previously, publicly funded schools and colleges were either “elementary” or “higher”, with “higher” institutions comprising everything that was not elementary. So secondary schools, to which about 20 per cent of children went at the age of 11, and junior technical schools, to which only about 1 per cent of children were admitted, were both distinguished from “elementary” schools, attended by nearly 80 per cent of the school population. Compulsory education ended at the age of 14, so the country’s 11-to-14-year-olds were in two overlapping systems leading to very different job and further-education opportunities. The Act put an end to this. It organized schools “in three progressive stages to be known as primary education, secondary education, and further education”. This structural change, often described as “secondary education for all”, was perhaps the Act’s greatest achievement. Part of that achievement, at a time of great economic difficulty, was to raise the school leaving age from 14 to 15 in 1947.

Finally, Butler, the government minister responsible for seeing the Education Bill through parliament, brought about a compromise between schools maintained by councils and denominational schools. The essence of this “dual control” system, which still exists, was that nearly all church (denominational) schools chose either to become “controlled” or “aided”. Controlled schools were entirely funded by the local council, whereas the governors or managers of aided schools remained responsible for capital expenditure on the fabric of the buildings, which they continued to own, and had increased rights over staffing and the curriculum (see Religious Schools).

Much of the framework of the 1944 Education Act remained in place until the end of the 20th century. Although it achieved much, the Act left two important issues unresolved. One was the structure of the newly created secondary schools. Initially, the emphasis was on a tripartite system: grammar schools for some, technical schools for a very few, and secondary modern schools for most. Selection for these schools was managed by local education authorities. The 11+ tests, as they came to be known, were taken in the last year of primary education by all children, and usually consisted of a general intelligence test, a test of attainment in English and arithmetic, and a report from each child’s primary school teacher. As the pattern of schools that followed the 1944 Act was increasingly replaced by schools combining all three elements in one “comprehensive” school, the 11+ examination declined; by January 2000 it was retained only in the areas served by the 164 remaining grammar schools. Specialisms within comprehensive schools have, however, been encouraged, and aptitude testing related to specialisms, such as music, foreign languages, and technology, has steadily increased.

A second unresolved issue was the school curriculum. The Act deliberately left the curriculum to be determined locally by schools and local education authorities. It was not until 1988 that the notion of a national curriculum, defining a common entitlement for all children in whatever type of school they were enrolled, was given legislative force.

Contributed By:
Peter Anthony Newsam


Nursery Education


Nursery Education, term applied to the education of children aged two to six before they enter primary school. The ages of the children admitted depend on the admission requirements of the nurseries and schools, the availability of places in the area, and the educational policy of the country.

Most nurseries and playgroups operate policies of learning through play, and activities are informal and flexible to suit the needs of young children. Sand, water, and paint form the basis of many learning exercises. Storytelling, nursery rhymes, and the development of early literacy and numeracy skills are also an important part of the curriculum.

State and private nurseries in the United Kingdom are staffed by qualified teachers and nursery nurses. Playgroups are run by workers with qualifications in the education of young children, helped by parents and volunteers.

In developing countries, by contrast, nurseries are either non-existent or very few in number.


The day nursery movement began in Europe in the early 19th century as a response to the increasing employment of women in industry. The absence of large numbers of mothers from their homes during the day led to child neglect, which in turn stimulated a variety of charitable agencies to seek ways of caring for the children of working parents.

In 1816 the socialist and philanthropist Robert Owen opened one of the first nursery schools in the world at the New Lanark cotton mills in Scotland to improve his workers’ quality of life. Another early leader of this movement was the French philanthropist Jean-Baptiste Firmin Marbeau, who in 1846 founded the Crèche (French, “cradle”) Society of France, with the aim of fostering child care. Within a relatively short period, day nurseries were established in many parts of France and in several other European countries. In the United States, the first day nursery was opened in 1854 by the Nursery and Child’s Hospital of New York. Many nurseries were wholly or partly supported by local and national governments. A large number were set up in factories so that mothers could be near their children while they were at work.

Most of the nurseries established in the latter half of the 19th century were supported by charitable organizations. Both in Europe and in the United States, the day nursery movement received great impetus during World War I, when unprecedented numbers of women replaced men in industry. In Britain, France, Germany, and Italy nurseries were established even in munitions plants, under direct government sponsorship.

As studies of children revealed the importance of the early years in physical, social, emotional, and intellectual development, the nursery school movement gained momentum in Britain and other European countries.

With the outbreak of World War II, the number of day nurseries increased rapidly as women were again called on to work in industry. The expectation that most employed mothers would leave their jobs at the end of the war was only partially fulfilled, and during the post-war years, a widespread movement developed, headed by sociologists, social workers, teachers, and other groups, which sought renewed government aid to meet the need for a comprehensive day-care programme.

Today there is a consensus that education for the under-fives is important in its own right, as well as being beneficial for later schooling.


Nurseries throughout the world normally educate young children in small, informal groups with higher adult-to-child ratios than in later schooling. Nursery education is not compulsory, but it has become the norm in many developed countries. In the United Kingdom, for example, where children start primary school at the age of four or five, every three- and four-year-old child is entitled to a free part-time nursery place. In the Republic of Ireland, although school is not compulsory until the age of six, well over half of four-year-olds and almost all five-year-olds are in infant classes provided by state-funded primary schools. However, in less affluent countries such as India, although the demand for nurseries is high, there are not enough places. In Indonesia, primary school is compulsory from the age of six but enrolment is low. Development agencies are encouraging nurseries in an attempt to boost primary school rolls.

Early years experts regard Denmark and New Zealand as examples of good practice because these countries have developed national early childhood policies that have led to state-funded and integrated education and care services.

In England some experts are concerned about the pressure to formalize the education of children at an increasingly young age. In 2005, as the Labour government sought to encourage breakfast clubs and after-school care in the nation’s primary schools, critics, while applauding the investment in the early years, claimed that young children were spending too many hours of the day in institutions.

In South Africa, Indonesia, and Australia, for example, children stay at nursery school until around the age of six, and in Singapore young children often do not start primary school until seven. In the United Kingdom, by contrast, although primary school is not compulsory until the age of five, many parents feel pressured into placing their children in primary school nursery classes to secure a primary place.


In most countries there is no national curriculum for nurseries, although governments often promote guidelines and examples of good practice. Learning holistically is important, as is learning through enjoyable activities that lay the foundation for life-long learning. Typical nursery curricula in many countries include personal, social, and emotional development; communication, language, and literacy; mathematical development; physical and creative development; and responsibility and integrity.


In 2005 the Labour government in England and Wales published the Childcare Bill. Hailed as a “once in a life-time bill” by early years experts, it promised to guarantee by law accessible, high-quality childcare and education for all children under five.

From 2008 the bill will require childminders and nurseries to follow an early years curriculum from birth until the age of five. Three- to five-year-olds already have to learn core skills, but this is the first time the British government has prescribed what children under the age of three should learn. Critics dubbed the new measure “a national curriculum for babies”.

The bill sets out four different stages of development, from the youngest babies who are “lookers and communicators”, through the stages of explorers (8 to 18 months), players (18 to 24 months), and walkers, talkers, and pretenders (25 to 36 months).

Regular inspections by government inspectors will ensure children receive the early years foundation curriculum.

The bill also sets out a framework for “extended” schools, whereby in 2010 all children will be entitled to wraparound care from 8 a.m. to 6 p.m.

Contributed By:
Linda Ann Blackburne


Degree, Academic


Degree, Academic, title granted by a college or university, usually signifying completion of an established course of study. Honorary degrees are conferred as marks of distinction, not necessarily of scholarship.



Institutions of higher learning have granted degrees since the 12th century. The word itself was then used for the baccalaureate and licentiate, the two intermediate steps that led to the certificates of master and doctor, requisites for teaching in a medieval university. During the same period, honorary degrees were sometimes conferred by a pope or an emperor. In England, the Archbishop of Canterbury, by an act passed during the reign of King Henry VIII, acquired the authority to grant honorary Lambeth degrees.

During the Middle Ages, the conferring of a doctorate also allowed the recipient to practise the profession in which the certificate was awarded; this condition still holds true for the legal and medical professions in European countries, such as France, in which the government controls the universities.


In Germany and at most Continental universities, only the doctor’s degree is conferred, except in theology, in which the licentiate, or master’s degree, is also presented. Granting of the doctorate is contingent upon the acceptance of a dissertation and the passing of examinations. The baccalaureate is not usually a university degree in Europe. In France, it is acquired by passing a state examination at the completion of secondary education; the only university-conferred baccalaureate is that awarded by the faculty of law.

Most British universities grant the bachelor’s degree after the satisfactory completion of a three- or four-year course. Some have separate honours examinations; at Cambridge these are known as the tripos. At Oxford and Cambridge the master’s degree in arts or science is granted after a further period of residence and the payment of fees. Other English universities grant the master’s degree only after a candidate has passed a series of examinations and presented an approved thesis. Doctorates (Ph.D. or D.Phil.) are awarded for individual research contributions in the arts or sciences. Postgraduate work leads to the writing of a thesis and the passing of oral (viva voce) and written examinations. Honorary degrees, such as the D.Litt., are sometimes given to prominent public figures.

In Australia, a bachelor’s degree precedes a master’s degree, the latter being earned after an additional year or two of study. Degree courses vary between three and six years of study, with a doctorate requiring a further two to five years. In the United States, the most commonly granted degrees are the BA, or bachelor of arts, and the B.Sc., or bachelor of science, both generally given after the completion of a four-year course of study and sometimes accompanied by a mark of excellence, such as cum laude (“with praise”), magna cum laude (“with great praise”), or summa cum laude (“with highest praise”). The master’s degree is granted after one or two years of postgraduate work and may require the writing of a thesis or dissertation. The doctorate requires two to five years of postgraduate work, the writing of a thesis, and the passing of oral and written examinations.


The academic dress worn at degree-granting ceremonies consists of a long, full-cut gown and a mortarboard, a stiff square-shaped cap with a tassel. In the United Kingdom, a hood lined with coloured silk indicating the graduate’s institution is worn. More ornate gowns are worn for the D.Phil. and other higher degrees, and for ceremonies such as the Encaenia (the Oxford ceremony at which honorary degrees are granted). Full subfusc is still required at Oxford for examinations; this black-and-white outfit consists of a dark suit for men and a white blouse and black skirt for women, worn with a gown and mortarboard.


Education, Postgraduate


Education, Postgraduate, courses of study in colleges and universities, professional schools, and other postsecondary institutions offered after completion of an undergraduate curriculum. Specific programmes of postgraduate education usually require a baccalaureate or bachelor’s degree or its equivalent as a prerequisite for admission. Education beyond the undergraduate years is often directed towards preparation for entrance into a profession such as law, medicine, or dentistry, in which advanced training is necessary for recognition as a practitioner. Although some professions, such as engineering or teaching, require only a baccalaureate degree for entrance, further education is frequently needed for advancement.



Formal professional training in law and engineering originated in ancient times in Egypt, Greece, and Rome. Medieval universities offered instruction in law, medicine, and theology. Beginning in the 16th century, great impetus was given to advanced technical and medical education as a result of scientific discoveries.



Postgraduate study ranges from courses emphasizing intensive training in a specific aspect of professional practice to degree programmes of several years’ duration, either in an academic discipline or a professional field. Many professions also require periodic postgraduate study in order to maintain certification for practice.

Graduate schools generally award master’s degrees or doctorates to those who have satisfactorily completed prescribed courses of study. A year is usually required to obtain a master’s degree, which demands the acquisition of a higher level of knowledge than is needed for a baccalaureate. The doctoral degree involves a longer period of study and requires participation in and summation of some type of original research, as well as written and oral (viva voce) examinations.

The demands for specific courses of postgraduate study change with the needs of society. In most developing nations, for example, professional training in engineering and the health sciences is in great demand. Preparation for a career in medicine represents one of the most intensive curricula, as a medical degree requires at least four years beyond the baccalaureate, and entry into a medical specialty can require four or more additional years of study. Most postgraduate students require funding of some sort. In the United Kingdom, a certain number of postgraduate grants are provided by research bodies, such as the Economic and Social Research Council or the Medical Research Council. Highly competitive scholarships are also sometimes available to support students from industry or from developing countries.



An ever-increasing number of women are now students in higher education programmes throughout the world. Traditionally, many professions, including engineering, law, and medicine, were dominated by men. Women are now demanding and acquiring equal access to the postgraduate education necessary for entry into all professions. This trend is likely to continue as political, economic, and social barriers to equal opportunities for women are removed.

As per capita income increases in a society, the demand for professional training in technical and human services also increases. Foreign aid from developed nations and educational programmes sponsored by the United Nations have done much to support the expansion of postgraduate education in developing countries. Many nations now include plans for the development of postgraduate studies as part of their own systems of higher education rather than supporting professional training abroad for citizens who may or may not return to their own countries.


Education, Military


Education, Military, training of the officers and enlisted (or conscripted) personnel of a nation’s military and naval forces. The goal of such training is to equip members of the services with the basic skills and discipline needed for appropriate action under the stress of combat. Qualified personnel may receive more formal education to enable them to advance professionally.


Until the mid-18th century, military training was restricted to drill and acquiring knowledge of weaponry and tactics primarily from practical experience in battle. The technological evolution of warfare, however, led to the establishment of formal military and naval academies specifically for training professional officers. Prussia was a pioneer in this field with the establishment in 1810 of the Kriegsakademie (War Academy) in Berlin. Today, in Germany, the universities of the Bundeswehr (German Armed Forces) are primarily technical and scientific in orientation.

In France, a two-track system evolved. L’École Polytechnique serves as a general technical university for army, naval, and civil service personnel, whereas L’École Spéciale Militaire, founded in 1808 by Napoleon, provides two years of initial military training. Originally located at St Cyr, it is now based at Coëtquidan. Advanced military training is provided by the École Supérieure de Guerre.

In 1947 the Royal Military College at Sandhurst, founded in 1802, was combined with the older Royal Military Academy at Woolwich, founded in 1741, to form the Royal Military Academy Sandhurst. Until World War II broke out in 1939, British army officers had been trained in artillery, communications, and engineering at Woolwich, or received cavalry and infantry training at Sandhurst. Naval cadets are trained at the Royal Naval College, Dartmouth; air force cadets at the RAF College, Cranwell. The Imperial Defence College serves as the school for senior service personnel. The emphasis on military élitism in the Soviet Union was reflected in its maintenance of 21 strongly research-orientated military academies on a level with civilian universities.

In the United States, the Reserve Officers Training Corps (ROTC) programmes in high schools and colleges prepare students for commissions in the Army Reserve. The United States Military Academy is a four-year engineering college whose graduates are commissioned into the Regular Army. Corresponding service institutions are the United States Naval Academy, the United States Air Force Academy, the United States Coast Guard Academy, and the US Merchant Marine Academy (see Merchant Marine of the United States).

The US Department of Defense maintains the National Defense University in Washington, D.C., which trains senior officers and selected civilians in theories and procedures relevant to national security.


Current trends in military education reflect rapid changes brought about by technological innovation and are geared to national security requirements. Advanced courses taken at civilian colleges and universities play a major role in supplementing the educational experience of selected personnel (primarily officers). In the United States, integration of women into the service academies began in 1976. On fulfilling their mandatory obligations or on retiring from the services, many personnel embark on civilian careers in management or in professions requiring the skills acquired in the services.


Education, Medical

Education, Medical, a process by which individuals acquire and maintain the skills necessary for the effective practice of medicine.

To train as a conventional doctor in the Western world a person needs to have achieved a good level of understanding in the sciences (for example, physics, chemistry, biology), either at senior (high) school or at college. Medical schools are usually part of a university (although not all universities have medical schools) and they offer only a limited number of training places in any one year. This results in fierce competition for places, with only the best students being admitted.

Most medical schools offer a training course of between three and six years in duration. The curriculum is traditionally divided into two parts: a preclinical course in which the basic science of how the human body works is studied; and a clinical course in which the student is introduced to actual patient care in a hospital. The former is usually taught in science departments at the university and the latter at a hospital affiliated with the university.

The preclinical course involves such areas of study as the gross and microscopic appearance and connections of the human body (anatomy), the organization and basic functions of different types of human cell (cell biology), the function and underlying biochemical processes of parts of human cells (biochemistry), the integrated functions of tissues, organs, and body fluids (physiology), the principal actions, distribution, and elimination of drugs in the body (pharmacology), the general principles underlying disease processes and such disease-related micro-organisms as viruses, bacteria, and parasites (pathology), the defence mechanisms of the body (immunology), and the structure and function of genetic material in living and infected cells (genetics).

The clinical part of the course involves medical students working with experienced doctors in general practice and hospitals to learn family practice and general medicine, and such specialized areas of health care as surgery (removal, reconnection, or transplantation of parts of the body), obstetrics (pregnancy and childbirth), paediatrics (diagnosis and treatment of childhood complaints), gynaecology (diagnosis and treatment of ailments of the reproductive system), geriatrics (diagnosis and treatment of ailments suffered by elderly people), and psychiatry (diagnosis and treatment of mental ill-health). During this time, medical students observe and learn from doctors working with patients on the wards and in specialist clinics, and gradually, under their supervision, become involved directly in the provision of health care (for example, diagnosis and administration of therapy).

Students have to pass examinations in all of these different aspects of the course, which take the form of written, practical, and oral tests. Upon graduating, they receive a Doctor of Medicine (MD), Bachelor of Medicine (BM), or an equivalent degree. New doctors swear the Hippocratic Oath (or an equivalent professional statement) to adhere at all times to high standards of medical practice and ethics, and to protect the right of every patient to life, dignity, and confidentiality.

It is usual for “junior” doctors to serve at least one year as an “intern” or “house officer”, with responsibility for both diagnosing and treating patients in the hospital. At this point, they may choose to move to a new hospital. Such a post, however, is considered an extension of their training, with overall responsibility for their work resting with the senior colleagues supervising them. In most countries, “junior” doctors often complain that they work excessively long hours for relatively poor pay (that is, relative to other professionals after several years of training).

During his or her time as a junior doctor, an individual must decide whether to work in general medicine or in a specialist branch. If the latter, the doctor applies to work with a particular specialist and his or her team and, once accepted, embarks upon a training course lasting several years, the training being obtained largely through the experience of working with other, more experienced doctors in the group. During this time, he or she is called a “registrar”, and the training culminates in written and oral exams set by an official body for that specialty (for example, the Royal College of Pathologists or the Royal College of Surgeons in the United Kingdom, which decide whether a doctor is sufficiently knowledgeable and able to practise as a specialist in that particular area of medicine). If successful, the doctor is awarded “membership” of the college.

It is important that doctors keep up with medical progress (the results of medical research concerning new forms of diagnosis and treatment). Most often this takes the form of reading medical journals and books, attending conferences, and discussing medical matters with other specialists in the same or different fields. More recently, doctors have been able to communicate with one another and receive the latest medical information using the Internet (often referred to as the “information superhighway”), which can link computers used by doctors in different hospitals and/or general practices around the world.

Some doctors, especially those in general practice, choose to incorporate such unorthodox medical techniques as acupuncture or reflexology (see Complementary Medicine) into their medical practice and offer these to their patients, where appropriate, usually in parallel with more conventional treatments; they are seldom offered as an alternative to conventional Western medicine. So popular are some of these unorthodox methods that some medical schools now offer training courses on these topics for both trainee and postgraduate (that is, experienced, practising) doctors.

Contributed By:
Claire Elizabeth Lewis


Education, Physical


Education, Physical, instruction in various kinds of physical activity to promote the physical development and well-being of the individual. Physical education is generally taught in schools from nursery to secondary level, and in some countries, including Britain, is a compulsory part of the curriculum. It involves organized sports, gymnastics, dance, athletic activities, swimming, and outdoor and adventurous activities. The United Nations Educational, Scientific, and Cultural Organization (UNESCO) considers physical education programmes an important part of its mission.


The nature of physical education and sport today has been influenced by many cultures. In ancient times, for example, physical education consisted of gymnastics to improve strength, agility, flexibility, and endurance. The Greeks considered the human body a temple that housed the mind and the soul, and gymnastics kept that temple healthy and functional. Eventually, structured gymnastic and callisthenic exercises were abandoned in favour of sports.

Traditionally, the objectives of physical education have been categorized as either promoting “education of the physical” or “education through the physical”. Education of the physical focuses on the actual development of the body and physical skills rather than any results that can be achieved through physical activities, while education through the physical emphasizes the acquisition of physical skills and bodily development, as well as nurturing emotional, intellectual, and social skills in the process. The latter approach utilizes carefully selected physical activity as a medium through which desirable objectives can be met.


The scope of physical education and sport in society has widened considerably in the latter part of the 20th century. The two traditional approaches have become more closely interrelated, a trend that looks set to continue into the 21st century. Physical education and sporting opportunities, in general, have become more widely available, not just to the school-age population, but to people of all ages, in non-school settings, such as community and fitness centres.

With the increased awareness of the importance of an active lifestyle, physical education is seen as laying the foundations in young people for long-term health and improved quality of life. Many educationalists, administrators, policy-makers, and activity providers view physical education and sport as occurring at various levels. For most people, introduction to the traditional major sports takes place at school, and as the range of sporting opportunities widens, children encounter an increasing variety of sports there for the first time. Having been introduced to a sport, some become occasional participants, following what is called the “recreation route”, while others may join a club and strive to improve their personal performance, the “performance development route”.

Physical education, sports studies, and sports sciences are well recognized now as examination subjects at school, pre-university, and university level. Universities offer degree courses in areas such as leisure studies, community sport/arts/outdoor pursuits, recreation management, human movement studies, and physical education teacher training. National Vocational Qualifications (NVQs) in sport and recreation also provide routes into university education and the leisure industry. See also Physical Fitness.

Reviewed By:
Chris Laws


Education, Multicultural


Education, Multicultural, educational approach that celebrates the cultural diversity of contemporary society. Its basic premise is that exposing all children to the social and cultural customs of the ethnic minority communities living in their country will give them a greater understanding and tolerance of people from different backgrounds. This article deals only with multicultural education in Britain.

The concept of multiculturalism in schools is part of a continuing debate about how to address the inequalities among different ethnic groups that exist in the education system, as well as how to engender tolerance and understanding between them. For decades, British educational theorists have been split between multiculturalism and anti-racism, the latter being a more direct challenge to racist structures in society. Since the publication in 1999 of the Macpherson Report into the Metropolitan Police Service’s handling of the investigation into the murder of the black teenager Stephen Lawrence, the concept of “institutional racism” has superseded the previous debates.


Ever since the first post-war wave of immigration from the Caribbean in the 1950s, the British education establishment has explored ways of catering for children from ethnic minorities. In the 1950s and 1960s, this took an assimilationist line, in which the emphasis was on teaching English as a second language where needed. The social, cultural, and economic factors that were barriers to ethnic minority children’s acceptance into British society were largely ignored.

The Race Relations Act of 1976 coincided with widespread concern among many black parents and educationalists about the failure of their children in school. Their growing dissatisfaction with the education system led to the spontaneous establishment of a large number of Saturday or supplementary schools in inner cities around the country. These schools, still in existence, are designed to boost children’s achievement in curricular subjects as well as give them a grounding in their own cultural heritage.

The introduction of multicultural education in schools was largely a response to this threat of separatism, coupled with the impact of academic studies contending that black children’s low achievement could be tackled by developing curricula that reflected cultural diversity. The white Eurocentrism of learning materials was challenged, leading to the introduction of more images and stories of black people in books and the celebration of Asian and Caribbean festivals. Ethnic minority storytellers, musicians, poets, dancers, and theatre groups were frequent visitors to schools, and teachers or parents cooked food from different countries. Teaching in mother-tongue or community languages was also introduced in some schools.


While these innovations were a first step in addressing the ethnic diversity of post-war Britain, they had many critics. On the political right, head teachers and parents vehemently opposed left-of-centre local education authorities’ impositions of multicultural policies. The long-awaited publication of the Education for All report in 1985, commissioned by the government and written by the Swann (formerly Rampton) Committee to investigate the “educational needs and attainments of pupils of West Indian origin”, represented the single most important argument for multicultural education. In the highly controversial document, Lord Swann and his team highlighted the need for multiculturalism in the curriculum as an important means of combating the racism that existed in schools. Although the report was derided by many of its critics and its findings ultimately rejected, educationalists in practice have implemented its recommendations on weaving multicultural themes and issues into their teaching ever since. As well as recommending curriculum content that reflected the cultural diversity of modern Britain, the report stressed the need for all schools, regardless of their ethnic make-up, to take a multicultural approach.

Disagreement over strategies for multicultural education has, however, continued on both sides of the political arena. On one side, it was argued that anti-racist policies disadvantaged white pupils and cluttered the curriculum with irrelevancies; on the other, that multiculturalism reinforced racial stereotypes. Some critics also argued that the exclusive focus on cultural diversity ignored the more fundamental issue of the institutionalized racism in schools against children of ethnic minorities. The anti-racism and multiculturalism that became opposing forces in the academic world remain so, to a certain extent, to this day.

The Education Reform Acts of 1988 and 1993 have also played their part in taking multiculturalism and anti-racism off the curriculum agenda in practical terms. By generalizing equality of opportunity to all who are perceived to be socially and economically disadvantaged, issues around race and ethnicity have moved out of the classroom. Instead, through ethnic monitoring of exclusions and attainment at local authority and school levels, they have become the concerns of school management. While this may help to raise awareness about different levels of achievement between ethnic groups, many feel that it leaves the question of how to educate children to live in a multicultural society unanswered.

Contributed By:
Reva Klein


Education, Adult


Education, Adult, any organized and sustained learning programme designed for and appropriate to the needs of adults. Usually, adults need to fit in study alongside other domestic and work responsibilities; they bring a diversity of experience to their studies, and they study voluntarily. “Adult education” is an inclusive term covering all types of education and training activities for adults—formal and informal, whether offered by schools, colleges, universities, voluntary organizations, industry, or public service bodies.


Adult education takes different forms in different places at different times, reflecting the different social functions given to adult learning, and the different groups with access to opportunities. In ancient Greece, Athenian society was organized to enable a small class of people to pursue learning as the central occupation of their adult lives. However, adult learning was not then seen to be universally useful. In Denmark, adult education was central to the regeneration of a poor agrarian economy, inspired in the 19th century by the Danish poet and educator N. F. S. Grundtvig, and built on the development of and support for active and participative democracy. That commitment to popular participation and social justice remains central to adult education in the Nordic countries. In Britain, “adult education” has often been taken to mean part-time studies that do not lead to certification; in the United States, it is seen as a generic, all-inclusive term. However, in more than half the world, it is synonymous with adult literacy, with programmes of reading and writing for people with no initial schooling.


For much of the English-speaking world, the forms of adult education developed during and after colonialism draw on British experience. Widespread adult education developed in Britain along with industrialization and the growth of the demand for popular democracy, yet its roots stretch back in religious education to the beginnings of organized Christianity in the British Isles and, in secular education, to the Renaissance. King Alfred, in the 9th century, was himself a passionate and committed adult learner who sought to share learning with others, establishing educational institutions to spread it among the population; however, books were scarce before the invention of the printing press, and popular knowledge was mainly shared through the pulpit and the troubadour.

The Renaissance acted as a fillip to secular as well as religious inquiry, and public lectures on scientific subjects, attracting large attendances, are recorded in London from the 16th and 17th centuries, but more widely from 1700. During the period leading up to the English Civil War, thousands of pamphlets on how the world should be organized stimulated debate. Later, coffee clubs, newspapers, and libraries all fostered a learning culture; and a wide range of bodies, including the Society for the Promotion of Christian Knowledge, the Welsh circulating schools, and dissenting academies, all contributed to spreading literacy.

Nevertheless, it was, in Britain, the Industrial Revolution and the growing concentration of population in towns that extended the opportunity for ordinary working people to gain instruction “in the principles of the Arts they practise, and in the various branches of science and useful Knowledge”. The Mechanics’ Institutes were founded on these principles. They started in Glasgow and London in 1823 and spread rapidly across Britain and to Australia. Like many later initiatives, the Institutes attracted radical manifestos and reformist practice in the debate about what constituted really useful knowledge. The Christian Socialist Working Men’s College was founded in 1854; Quaker-influenced adult schools followed later in the century, and, with the rise of the new unionism, the Workers’ Educational Association (WEA) was established in 1903. Parallel initiatives to bring education to workers prompted the rise of university extramural provision, and from 1919, following a key policy report, local government provided mass adult education opportunities for people to gain qualifications through “night school” or to keep fit, extend their creativity, and stretch tight budgets through a crafts and domestic skills curriculum. From the 1920s, community schools, based on the Cambridge Village Colleges, involved adults and children in complementary studies on single sites. Together, the WEA, the universities, and local authorities offered a rich and varied menu of education for self-improvement. However, they also marked a clear separation of learning for pleasure from vocational education.

World War II saw the largest-scale general education programme ever mounted by an employer, when the Army Bureau of Current Affairs provided compulsory adult education sessions in which soldiers discussed the shape of the post-war world.

After World War II there was a marked shift from practical to leisure-based learning. Increasing affluence led to a demand for languages and lifestyle courses, and rapid expansion of provision overall, but adult education failed to attract those people who had benefited least from initial education. A series of measures addressed this issue from the 1970s. In 1975 a major campaign was launched to teach literacy and numeracy to the six million adults in Britain with basic skills needs. English programmes for speakers of other languages settling in Britain, programmes targeting people with disabilities, and women’s studies initiatives followed as providers targeted excluded groups. Access courses, which developed in the 1980s, offer adults a one-year preparation for entry to university. However, adult education in Britain, and in many other industrialized countries, remains more effective at reaching the affluent and those with extended initial education.

A Broadcasting

Just as the growth of libraries had a major impact on adult learning in the 19th century, broadcasting had a comparable impact in the 20th century. It brought people access to information and the stimulus to learn, free at the point of use in their own homes. The literacy campaign was launched on prime-time television. The Open University, which opened to students in 1971, exploited this power, with a broadcasting-led distance education degree programme, delivered in modules, with high-quality print materials, supported by face-to-face tutorials, and an exclusively adult, part-time student population.

B Industrial and Technological Change

By the 1980s millions of adults were participating in formal or informal opportunities for learning, yet adult education was almost invisible to policy-makers. In public debate, education was interpreted as schools and universities, and training was concentrated on new, young entrants to the labour market. However, changes in the structure of the economies of industrial states have made lifelong learning more central to social policy. Demographic, technological, and industrial change, the emergence of information economies, and of global markets combine to make lifelong learning vital to international competitiveness. This has led to a demand for credit-bearing courses and for opportunities to have prior learning formally recognized. It has led to the need for qualifications that are transferable, and for modes of study flexible enough to be fitted round the other pressures on adults’ lives (see also Education, Vocational).

Because industries now have a shorter life, and because there is a high level of international mobility, there are pressures for qualifications to be harmonized. In Australia, with 40 percent of professional workers coming from abroad, the new national qualifications system has been built around the National Office for Overseas Skills Recognition. In South Africa, the new South African Adult Basic Education and Training strategy is based on a qualifications system, with a clear competence statement for every standard. Similar measures are part of the policy frame of the European Union, too.

Adults now make up the majority of participants in post-compulsory education in Britain and the United States. Their participation is increasingly in qualifications-bearing and work-related study. The prospects are that they will demand and get increasingly adult-friendly structures in which to study. In Britain, though, an increased commitment to vocational opportunities for adults has been bought at a price, with weakened public commitment to courses offering learning for its own sake.


In Australia, as in Canada, Scandinavia, and much of Europe, these forces are also evident, but there is a robust and continuing commitment to programmes that support the personal development of individuals and the democratic development of communities. In Australia, this commitment is recognized in the formal identification of adult education as a fourth sector of the education system, complementing primary, secondary, and tertiary education. As in Britain, provision in Australia varies widely from state to state, with highly developed courses offered in Victoria and New South Wales. As in Britain, there is a recognition of the need to invest in adult learning for economic prosperity. However, as the influential report of the Senate Standing Committee on Employment, Education, and Training, “Come in Cinderella”, recognized in 1991: “The adult and community education sector has demonstrated its capacity to respond to the needs and circumstances of millions of Australians, to provide educational opportunity where it has been previously denied and to create pathways out of powerlessness.”

In the new economies the old distinctions between vocational and leisure-based learning blur, and in information-rich societies there is a powerful case for investment in any kind of learning people can be persuaded to undertake.

A Literacy

In most countries, however, the struggle for literacy continues to dominate provision. The content of literacy programmes depends on context. Those programmes sponsored by the United Nations Educational, Scientific, and Cultural Organization or the World Bank have often focused on national economic priorities. Functional literacy programmes are designed to help people to become literate in order to support health, agricultural, and industrial development. By contrast, programmes inspired by Latin American popular education movements, and in particular by the Brazilian popular educator Paulo Freire, focus on power relations: on “reading the world” as well as reading words. As one of his students in Recife, Brazil, explained: “I want to learn to read and write to stop being the shadow of other people.” In the Soviet Union, Lenin argued that “literacy is not a political problem, but the fundament without which there can be no politics”. In Tanzania, Cuba, Ethiopia, and Nicaragua literacy programmes have been made central to overall government policy. In China and India, millions have learned through mass campaigns. Yet illiteracy remains a major problem in many parts of the world, particularly for women and for the rural poor.

B Prospects

Adult education for many will continue to involve the struggle for skills in reading and access to wider learning opportunities. In the industrial world, new technologies hold the prospect of offering individuals and communities access to unrivalled stores of knowledge, though the growth of information industries carries the risk that access to those sources may privilege those who can afford to pay. Whatever the structures, the common histories of adult learners suggest they will find new forms to satisfy their curiosities in the new circumstances of the new millennium.


Contributed By:
Alan Tuckett


Colleges and Universities


Colleges and Universities, degree-granting institutions of higher education. In the original sense of the word, a college was a group of students who gathered to share academic and residential facilities. Each college was a component part of a corporate body called a university, the word being an abbreviation of the Latin universitas magistrorum et scholarium (“guild [or union] of masters and students”), organized for mutual advantage and legal protection. Today, a college may be affiliated with a university or independent.

In some universities, particularly European institutions, students begin their higher education with specialized studies because their general education is completed in secondary school. In general, European universities have no prescribed courses, attendance requirements, or course grades. Students may attend lectures, but do their work directly with tutors who prepare them for examinations. Programmes may be completed in two to six years, with the academic year usually split into three terms. In the United States, students are traditionally required to take general survey courses before they specialize in major areas of concentration; the undergraduate programme generally lasts four years, with each year split into two or three semesters.

Typical first degrees include the Bachelor of Arts (B.A.) and Bachelor of Science (B.Sc. or B.S.) degree, while those who want additional education may enrol in programmes leading to a Master of Arts (M.A., occasionally a first degree, as in some Scottish universities) or a Doctor of Philosophy (Ph.D.) degree.


Although modern colleges and universities evolved from Western European institutions of the Middle Ages, significant types of higher learning existed in ancient times, in the Middle and the Far East as well as in Europe. Some of these Eastern institutions still flourish.

A Historical Antecedents

In Greece, the Academy of Plato and the Lyceum of Aristotle were advanced schools of philosophy. During the Hellenistic period, which began in the 4th century BC, Athens attracted many Roman students, including, later, the statesmen and writers Julius Caesar, Cicero, Augustus, and Horace. Also important during this period was the Egyptian city of Alexandria, with its great library (see Alexandria, Library of) and museum, which attracted scholars from the Middle East. The Jewish academies in Palestine and Babylonia, which produced the Talmud, promoted religious and secular intellectual pursuits from about AD 70 through to the 13th century. The University of Nalanda, in northern India, where native and Chinese students studied Buddhism, functioned until the 12th century. Institutions of higher education flourished in China itself from the 7th century onwards, and in Korea from the 14th century. Al-Azhar University in Cairo, now more than 1,000 years old, is the central authority for Islam. Another Islamic institution of equal antiquity is Al Qarawiyin University in Fès, Morocco.

B Medieval Universities

Western European universities developed as students migrated to various places where noted teachers lectured on subjects of particular interest to them. Language was no barrier because lectures and disputation were conducted in the universal tongue, Latin. By the 12th century, Paris was established as the centre for theology and philosophy, and the University of Paris became the model for later universities in northern Europe. Bologna, Italy, was the centre for the study of law, and the University of Bologna set the pattern for Italian and Spanish universities. From the 13th century onwards, universities were established in France, England, Scotland, Germany, Bohemia, and Poland. Students migrating from the same country banded together into so-called nations for mutual aid and protection. From these communities developed the concept of the college (Latin collegium, “society”). Medieval universities had the right to suspend studies when conditions in their towns and cities were unfavourable and to confer degrees that included the privilege of teaching in any Christian country.

C From the Renaissance to the 18th Century

Italian universities such as Ferrara helped to transmit Renaissance humanistic ideas to northern European institutions. Bologna was the great 17th-century centre for medicine and biology. Leiden University in Holland, established in 1575, attracted students from all over the Continent to investigate the new sciences; in the 18th century it became an important centre for legal studies, attracting many students from Scotland. The University of Salamanca, in Spain, founded about 1230, set the pattern for the establishment of institutions in Central and South America in the 16th and 17th centuries.

The University of Wittenberg was the scene of the beginning of the Protestant Reformation (1517), started by Martin Luther, a professor there. His disciples went on to teach in all parts of Germany, Scandinavia, and eastern Europe. The Calvinist Reformation in Switzerland involved the University of Geneva, whose faculty and students helped to spread the doctrines of the theologian John Calvin throughout Europe and North America.

In New England, Calvinists founded Harvard College (later Harvard University), the oldest American university. The Calvinist tradition also led to the establishment of Yale College (later Yale University) and the College of New Jersey (now Princeton University). Other colonial establishments included King’s College (Columbia University), Queen’s College (now Rutgers, the State University of New Jersey), and Dartmouth College. During the colonial period, however, many well-to-do American students chose to study abroad, primarily at universities in Scotland, Holland, France, and Italy.

The first institution of higher secular education in Russia was the Moscow State University, founded in 1755 by the scientist Mikhail Vasilyevich Lomonosov, after whom it is now named; it developed, along with other Russian secular universities, under German and other foreign influences. The universities of Vilna and Dorpat, although founded earlier, were primarily religious in orientation.

D The 19th Century to the Present

The post-Industrial Revolution era, with the growth of the middle class, provided much of the impetus for expanding European higher education. During the 19th century, German universities became influential sources of scholarly research and examples of academic freedom. The University of Berlin was noted for philosophy; Göttingen for literature and mathematics; Heidelberg for mathematics and the classics; Leipzig for psychology; and Jena for pedagogy. Many students from foreign countries obtained their Doctor of Philosophy (Ph.D.) degrees from German universities.

British institutions founded during this period include the universities of London and Durham (the first new English universities established after the Middle Ages), as well as the universities of Manchester, Liverpool, Leeds, and Wales. Unlike the University of Oxford and the University of Cambridge (founded in the 12th and 13th centuries, respectively), which represented the Establishment, social prestige, and relatively conservative views, these and other institutions familiarly referred to as “red brick universities” attracted students and faculty with advanced social and political ideas, as typified later by the post-World War II “angry young men” writers who studied or taught in these schools.

In Canada, in the 19th century, McGill University and the universities of Toronto and Montreal were founded.

Among the new 19th-century universities on the Continent were those in Berlin, St Petersburg, Athens, Bucharest (Romania), and Sofia (Bulgaria). In India, the universities of Calcutta, Bombay, and Madras, all established in 1857, were formed as examining bodies along the lines of the University of London. Today the University of Sydney (1850), the oldest university in Australia, has one of the highest enrolments among Australia’s institutions of higher education, which include Monash, in Victoria, and the universities of Melbourne, Adelaide, and Queensland.

The growth of universities in China was slowed down by civil unrest during the 19th and early 20th centuries. The University of Beijing was founded in 1896; most of the other colleges and technical institutions date from the 1920s or after World War II. Japanese universities include Tokyo (1877) and Kyoto (1897).

Throughout the 19th century and up to the present, college and university students were generally in the vanguard of radical and revolutionary thought. Russian universities grew in number and influence in the 19th century, and until the Revolution of 1917, they offered studies in the classics, science, Russian literature, and history. They were also centres of radical and revolutionary political doctrines and activities. The government periodically withdrew academic privileges and imprisoned faculty members and students, but this control could not stem the tide of revolutionary thought. Restrictive and repressive measures by the administration and government authorities, as in tsarist Russia, and in Germany during the 1920s and 1930s, often led to student protests and riots and to school closures.

In the post-World War II era, particularly during the 1950s and 1960s, many universities were established in the United Kingdom and Germany, as well as in the developing nations of Asia and Africa.

The 1960s also saw periods of student unrest as, for example, in the United States, where protests were held against the Vietnam War. More recently, in 1989, Tiananmen Square in China was the scene of student prodemocracy demonstrations, resulting in widely televised, violent clashes.

The 1970s saw the establishment of the Open University, which offers degree courses to people from all walks of life through lessons delivered on television and radio and by post. The first Open University was created in the United Kingdom in 1971; various countries, including India and South Africa, have followed suit.

One of the major problems faced by universities and students in the past decade or so has been that of funding, with reduced grants and the introduction of student loans (in the United Kingdom).