User:TimRJordan/sandbox/Approaches to Knowledge/2020-21/Seminar group 3/History


The History of Computer Art as a sub-discipline


Computer Art as a sub-discipline


The words “computer art” bring several pictures to mind, but certainly not an image of a Renaissance painter with a blank canvas in front of them. Perhaps an artist in front of a computer screen, developing algorithms that generate experimental art. Or maybe even an Artificial Intelligence (AI) robot that creates paintings just as good as the ones in the National Gallery. All of the above are examples of Computer Art in the modern world. Art and computer science as disciplines use markedly different methodological approaches to arrive at knowledge. In recent years, however, a unique overlap between the two has led to the formation of a sub-discipline, Computer Art, which is an increasingly popular bachelor's programme at universities across the world.

Computer Art could be viewed as a branch of both art and computer science. As a branch of art, the sub-discipline explores the development of new art forms through the use of technology. In the context of computer science, it focuses on the development of technology (for example, algorithms) to create art. An increasing number of artists nowadays learn to code in order to develop algorithms for aesthetic purposes.[1] Going further, advances in AI have allowed for the production of AI-generated paintings, which can be viewed online and even purchased.[2] Every day, the intersection between art and computer science grows and history is made.
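To make the idea of “algorithms for aesthetic purposes” concrete, here is a minimal generative-art sketch in Python. It is only an illustration of the general technique, not a method described in the sources: the colour formula and output filename are invented, and the image is written in the simple PPM format so that no external library is assumed.

```python
# A tiny generative-art sketch: every pixel's colour is a pure function of
# its coordinates, and the result is written as a binary PPM image that most
# image viewers can open. The colour formula and file name are arbitrary
# illustrative choices, not a method described in this chapter.
import math

WIDTH, HEIGHT = 512, 512

def pixel(x, y):
    """Map a pixel coordinate to an RGB colour using interfering sine waves."""
    u, v = x / WIDTH, y / HEIGHT
    r = 0.5 + 0.5 * math.sin(12 * u + 4 * math.sin(6 * v))
    g = 0.5 + 0.5 * math.sin(9 * v + 3 * math.cos(7 * u))
    b = 0.5 + 0.5 * math.sin(15 * (u + v))
    return int(255 * r), int(255 * g), int(255 * b)

with open("algorithmic_art.ppm", "wb") as f:
    f.write(f"P6 {WIDTH} {HEIGHT} 255\n".encode("ascii"))
    for y in range(HEIGHT):
        for x in range(WIDTH):
            f.write(bytes(pixel(x, y)))
```

Changing the formula, or seeding it with randomness, is essentially how a coding artist explores a space of possible images.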

Formation of the sub-discipline and its development


The exact point in time when Computer Art first emerged as a sub-discipline is a matter of academic debate. However, a possible answer would be the founding of the non-governmental organization “Experiments in Art and Technology” (E.A.T.) in 1966 by the electrical engineers Billy Klüver and Fred Waldhauer and the artists Robert Rauschenberg and Robert Whitman.[3] E.A.T. sought to forge collaboration between engineers and artists through industrial projects.[3] The organization quickly gained popularity and, a few years after its establishment, had over 4,000 participants: artists, engineers, programmers, researchers, and scientists, whose collaborative efforts led to the development of new computer art forms, such as video and digital image.[3] Earlier examples of technology in art do exist, but they are not mentioned here because they do not bear on the history of the sub-discipline and its methodologies.

Nowadays, we consider digital image and video (and the technology involved in producing them) an integral part of art; sixty years ago, however, they were a revolutionary concept. Coding, as a means of creating art, is also being normalized. Today, we tremble with excitement (and fear) at the thought of AI using complex algorithms to produce art by itself. In sixty years, however, we may see AI as an inseparable part of art.

The History of Museology


Definition of Museology


Museology, in general, is the theoretical study of the history and social roles of museums, as well as of the activities related to their operation and management, including preservation, curating, and other practices.[4]

Origin of Museums


The term "museum", from the Greek "museion" meaning "the seat of the Muses", emerged during the Classical Period. However, human beings have a longer history of collecting objects for the purposes of inquiry and acquisition. The earliest evidence can be seen in the large quantities of grave goods found in Palaeolithic burials.[5] Founded by Ptolemy Soter in the 3rd century BCE, the great museum at Alexandria contained various collections related to botany and zoology; however, it functioned as a philosophical institute supporting scholars in their studies.[6] The prototype of modern museums originated in the cabinets of curiosity that appeared in Europe from the 16th century onwards. With the process of colonization, more overseas objects entered Europe, forming a large basis for collections. In the 18th century there was a boom in museums, including the British Museum and the Louvre in France. In the 19th century, the function of museums shifted towards educating and “civilizing“ the general public, and several world’s fairs were held to serve this purpose.[7]

Establishment of the Discipline


Though the history of collections is long, the systematic study of museums is a relatively new discipline. In the 16th century, a Belgian doctor published a book on museum practices, providing guidance for organizing a collection. The term "museology" was first introduced by Georg Rathgeber, who formulated methods for assembling art collections in museums in one of his books. The first attempt at education in museum practice can be traced back to 1856 in Spain, when the government established an institution to train professionals in the skills of preserving national heritage.[8] In 1889, the Museums Association was founded in London, and annual conferences were held to discuss museological topics. In 1901, the association published the Museums Journal, the first academic journal in the field.[9]

Problems and Development of Museology


The research interests and methodology of museology underwent large changes in the second half of the 20th century, shifting focus from museum operations to museums' social roles. Stimulated by social unrest in Western countries and the popularisation of environmentalism, political activism, and postmodernism, researchers started to rethink the role of museums in society. It was pointed out that the ideology of traditional museums was isolated from the public and tended to be elitist. Most museums were also curator-centred and building-bound, positioning them as superior to their audiences. In 1980, the idea of “new museology’’ was introduced by the French museologist André Desvallées in contrast to the “old museology’’, focusing on the roles of museums in social and political contexts and on the engagement of the whole community in curatorial practices. There was also the “ecomuseum’’ movement, which sought to reconstruct museums as democratic places without boundaries, rooted in the local community, thus increasing the involvement of the general public.[10]

Another contemporary field in museology is critical museology, which emerged in the late 20th century and the beginning of the 21st century. Since many collections in Western museums were historically acquired through colonization and war, it is important to re-evaluate the history of those collections and to reconstruct museum practices critically. New, interdisciplinary methodologies are being developed to strip traditional museology of its assumptions of cultural superiority. For example, with help from historians and anthropologists, repatriations are under way on a global scale to decolonize museums.[11]

Future Fields of Research


The methodologies and research scope of museology are still actively changing as the field absorbs elements from other disciplines. For example, with the rapid development of information technology, the study of digital museology is becoming a trend. Many virtual museums are under construction, allowing more people to see collections through the Internet and thus improving the efficiency with which knowledge is conveyed. Documenting collections with digital technologies can also help with the organization of museums and the preservation of collections.[12]

The Loss of Stenography as a sub-discipline


Stenography, the practice of shorthand, is a method of abbreviated writing that allows dictation to be taken in real time. Classes in stenography were highly respected and common in the mid-1900s, since shorthand was considered a valuable skill for multiple careers, such as police work, journalism, and law. These classes also became gateways for women to enter the clerical workforce in the '50s.[13] Since there have been classes with students and teachers, and academic writings on the importance and history of shorthand, I am inclined to state that it was once a discipline, or perhaps a sub-discipline of English language studies.

Shorthand has played an important role in documenting history since perhaps around 400 BC, as implied when Diogenes Laërtius stated that Xenophon “was the first who took down conversations as they occurred”.[14] Through historians’ analyses of the different branches of shorthand, we have been able to translate some of the works of Cicero, known as one of the greatest orators, who also introduced shorthand writers into his senate house[15]; Samuel Pepys’ diary, which gives us an insight into English politics and the daily life of a Member of Parliament in the 1600s; and even some accounts of Sir Isaac Newton’s works, which gave way to some of the greatest advancements in mathematics and physics.

Whilst there are still a few people trained in shorthand, it is very rarely used in today’s society. Shorthand is no longer a desirable or necessary skill in almost any profession. So, what changed? What has replaced the need for shorthand? Starting with the work of the acoustic engineer Homer Dudley at Bell Laboratories in the 1930s, through to the initial release of the speech recognition software Dragon Dictate in 1997, automated speech recognition developed from being able to process short sounds to handling fluent and sophisticated speech covering a wide range of dialects and languages. This development was accelerated by WWII, when Dudley focused his efforts on finding a secure method of sending voice transmissions.

Now in most computers, phones, courtrooms, offices - and even homes with the likes of Amazon’s Alexa - we have highly accurate automated speech recognition devices and software that can record and take dictation with great speed and minimal clarification required.[16] There is no need for shorthand anymore. The emergence of and developments in the disciplines of artificial intelligence, technology, and acoustic engineering have made shorthand redundant and have replaced it as a discipline in the modern day.

The History of Art Conservation as a scientific discipline


Conservation of art is a multi-disciplinary study incorporating fine art, chemistry and scientific techniques. It has only emerged as a discipline in the last century, alongside developments in science. Prior to this scientific revolution, conservation of art was considered a craft, and the focus was placed on cleaning and repairing rather than on scientific methods of conserving artwork.[17]

The discipline of art conservation, and philosophies around its intentions, came to the fore towards the end of the eighteenth century and during the nineteenth century. During this time there was a change of emphasis, looking beyond the mere physical rectification of an artwork or artefact to its material history and the future value of its cultural heritage.[18]

One of the first notable conservation theorists was John Ruskin. In his 1849 book 'The Seven Lamps of Architecture', Ruskin argued against repairing and rebuilding old Gothic buildings, introducing the idea of 'trusteeship'.[19] His views countered those of the contemporaneous architect Eugène Viollet-le-Duc, who believed that these buildings should be kept in the best condition possible. The two men thus represented opposing approaches to conservation.[20]

Harold Plenderleith, in his 1998 journal article 'A History of Conservation', attributed the origin of scientific conservation to post-First World War Britain.[21] Historical artefacts from the British Museum, which had been held temporarily in the London Underground system, showed signs of serious damage. The scale of the subsequent operation to restore and preserve these artefacts was unlike any previous restoration project and, for the first time, was assisted by the Department of Scientific and Industrial Research. An emergency laboratory was set up under the instruction of Alexander Scott, and this laboratory was officially incorporated as a department of the Museum in 1931.

By this stage, more sophisticated scientific equipment and examination techniques were being used as a means to study artworks and artefacts. One such example is X-radiography, a technique used to examine the composition and condition of paintings and various objects.[22] X-radiography was championed by Edward W. Forbes, an art historian and director of the Fogg Art Museum between 1909 and 1944.

By 1950, the science of art conservation had developed further with the founding of the International Institute for the Conservation of Museum Objects (renamed in 1959 the International Institute for Conservation of Historic and Artistic Works).[23]

Innovative, science-based conservation techniques have continued to evolve. One example is the use of nanotechnology, first applied in the restoration of the Brancacci Chapel in Florence, where a microemulsion was used as an alternative to solvent cleaning for the removal of beeswax. As a scientific discipline, art conservation now has to take into account both the health and safety of conservators and the environmental impact of conservation processes.[24]

The History of Computer Science as a discipline


The birth of computer science


Though advances in computing go back to the work of Charles Babbage and Ada Lovelace in the 1830s, the emergence of computer science as a discipline occurred much later. Before it was recognized as a discipline in its own right, the mathematical foundations of computing were explored within mathematics. In 1928, the Entscheidungsproblem challenge was posed by the German mathematician David Hilbert. It asked whether there exists an algorithm that could take a statement and a set of axioms as inputs and then determine whether the statement was true or false, i.e., whether it followed from the axioms.[25] This challenge in the field of mathematical logic prompted the work of Alan Turing, who set out to prove that no such algorithm could exist. Turing broke computations down step by step and explained how these steps could be executed by a theoretical machine, defining algorithms as computations that could be carried out by such machines.[26] These machines were later termed Turing machines by Alonzo Church in 1937.[27] Turing’s work on the theory of computation within mathematics is considered the foundation of the theory behind modern-day computers.
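The idea of a machine executing a computation step by step can be captured in a few lines of code. The sketch below is a generic Turing-machine simulator written in Python for illustration only; the transition-table format and the tiny example machine (a unary incrementer) are invented here and are not taken from Turing's paper.

```python
# A minimal sketch of a Turing machine: a finite transition table driving a
# read/write head over an unbounded tape. The table format and the example
# machine (a unary incrementer) are invented for illustration.
from collections import defaultdict

def run_turing_machine(transitions, tape, state="start", accept="halt", max_steps=10_000):
    """transitions maps (state, symbol) -> (new_state, symbol_to_write, head_move),
    where head_move is -1, 0 or +1 and "_" is the blank symbol."""
    cells = defaultdict(lambda: "_", enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        state, cells[head], move = transitions[(state, cells[head])]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: scan right over a block of 1s, append one more 1, then halt.
increment = {
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("halt", "1", 0),
}
print(run_turing_machine(increment, "111"))  # prints "1111"
```

The essential point is the one Turing made: an algorithm is nothing more than a finite table of such state transitions, applied mechanically.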

The formalization of computer science as a discipline in its own right began in the 1940s. Spurred by the work of Alan Turing and the construction of the Turing-complete computer ENIAC in 1945, computing rose to prominence as an academic interest.[28] In 1961, George E. Forsythe coined the term "computer science". Forsythe identified computer science as the study of "the theory of programming, numerical analysis, data processing, and the design of computer systems",[29] setting its disciplinary identity apart from fields it was historically intertwined with, such as mathematics and engineering. In this period there was an increased demand for instruction in computing, and the first degree programs and academic departments dedicated to computer science were established. In 1962, the first computer science department was officially formed at Purdue University, and in 1965, Richard Wexelblat of the University of Pennsylvania became the first person to receive a Ph.D. from a computer science department.[30]

Until the mid-1980s, the focus of the computer science field was on advancing the power of computers and increasing their efficiency and effectiveness. However, as the personal computer and the Internet became more ubiquitous, questions about how computers interact with other fields of study arose and demanded academic attention. As computers became tools used for study and fundamentally changed how work was carried out in other disciplines, ethical discussions about how they were being used started to take place.[31]

Ethics in the computer science discipline


Work in computing has been around for nearly two centuries; the discussion around computer ethics, however, is relatively new, having started only in the 1940s. Computer ethics as a concept originated during WW2, when the MIT professor Norbert Wiener was investigating the science of information feedback systems that enabled different parts of a cannon to communicate with each other.[32] This new branch of science, which Wiener termed “cybernetics”, would later influence artificial intelligence. In 1950, in his book The Human Use of Human Beings, Wiener warned against the negative consequences of technology on society and encouraged the development of technology that enhances human well-being.[33]

There was little academic interest in this new area of applied ethics until the mid-1960s, when there was a series of computer-enabled bank robberies and privacy invasions by authoritarian government agencies. As the social consequences of technology became apparent, academic interest in computer-related ethical issues increased.[34] The term ‘computer ethics’ was coined in 1976 by Walter Maner when he noticed that the use of computers in the medical field created a whole new branch of ethical considerations. He named this new branch of applied ethics “computer ethics” and defined it as the study of ethical problems “aggravated, transformed or created by computer technology.”[35] He made efforts to encourage the teaching of computer ethics at universities, developing courses and conducting workshops.[34] By the early 1980s, the concept of computer ethics had caught the attention of other scholars, who began to contribute to this new field.

Recently, infamous controversies such as the Cambridge Analytica scandal have gained widespread mainstream attention, further emphasizing the importance of including computer ethics education in the computer science discipline.

The History of Neuroscience as a Discipline


Neuroscience focuses on the study of the body's nervous system, especially the brain.[36]

The Slow Study of Neuroscience


It is believed that the Egyptians, around 1700 BC, were the first to study the brain and its function. Indeed, the Edwin Smith Papyrus contains proof of their basic knowledge of several parts of the nervous system.[37] Over the centuries, many scientists continued to investigate this complex part of the human body, but they were very limited by a lack of adequate material. In 1543, Vesalius published a medical textbook that covered the nervous system. During the 18th century, philosophers reflecting on the link between the mind and the body encouraged research.[38] Scientists started to understand the concept of neurons and to perform neurosurgeries.

20th Century: Neuroscience's rise as a discipline


While neuroscience used to be a subject limited to the study of the nervous system, in the 20th century it became a discipline in its own right. It started encompassing other fields of study, such as computer science, biology, chemistry, medicine, psychology, linguistics and mathematics.[39] Scientists now also work on the link between the mind and human behavior, and on the influence of our environment on the brain.[40] Up to 13 different branches of neuroscience have appeared, from cognitive neuroscience to neurophysiology. The rise of neuroscience owes much to the discoveries of nuclear magnetic resonance in 1938 and of functional magnetic resonance imaging in 1992, which allowed neuroscientists to carry out more precise research. Progress was made on the structure and activity of the brain, and many Nobel Prizes were awarded. The term “neuroscience” finally appeared around 1960,[41] and in 1969 the Society for Neuroscience was founded, marking the emergence of this new discipline.[42]

The Society for Neuroscience and Interdisciplinarity


Neuroscience has been interdisciplinary from the outset: the creation of the Society for Neuroscience (SfN) in 1969 marked the institutionalisation and recognition of neuroscience as a formal discipline. The SfN was a product of growing scientific interest in the nervous system across a blurred set of disciplines, along with new technology, which led the Committee on Brain Sciences (CBS) to create this inclusive new organisation. The SfN’s interest in attracting a variety of scientists from different but linked fields was apparent in the care given to naming the society. The CBS wanted the name to represent the wide scope of disciplines it sought to welcome, encompassing behavioural, biological and psychiatric aspects of neuroscience among others, without suggesting any hierarchy between them. There was debate, for example, over whether the name should include the term “brain”; it was decided that this would not be representative of the whole community the Society was aimed at, its most urgent aims being to gain members and funding for new interdisciplinary research related to the nervous system.[43]

Neuroscience and Philosophy: The Birth of New Disciplines


The development of neuroscience as a discipline also led to the emergence of new disciplines in the late 20th century, with particular links being made between neuroscience and philosophy. Indeed, new studies and research about the brain, which is recognised as being associated with consciousness and “The Self”, were of importance and interest to philosophy in exploring corresponding old philosophical questions.[44][45] A new discipline known as neurophilosophy thus emerged, accompanied by other new fields such as neuroethics, which is linked to both neuroscience and philosophy but also to marketing, with neuromarketing itself developing as a new discipline.[46]

The Future of Neuroscience


Today, neuroscience has a bright future ahead. It is playing a key role in the understanding of many diseases, such as Alzheimer’s and Parkinson’s, and also of mental disabilities.[47] Working with other disciplines such as computer science, neuroscientists contribute significantly to artificial intelligence, a field that tries to recreate natural intelligence and the structure of the brain in machines.[48]

The History of Philosophy as a discipline


Introduction


Philosophy takes its origin from Ancient Greece and the word φιλοσοφία, which translates as philo + sophia, meaning the ‘love or desire for wisdom’. It is important to note that the meanings of the words ‘desire’ and ‘love’ for the ancient Greeks were slightly different from our definitions today. For them, ‘desire’, or ‘love’ in this case, described things that were unattainable. Philosophy was for them a state of great wisdom they would pursue, getting as close to it as possible while never being able to fully achieve it.[49][50]

In that regard, philosophy tried to answer the great questions of our world: what it means to be human, what our world is made of, what the different elements are, what truth and reality are, and so on. While many of these questions can never truly be answered, philosophy teaches every single one of us to develop our thinking skills and to challenge what we know.

Early Days


Unlike many other fields of study, in which the discipline was practiced long before it was taught in schools or at university level, philosophy has always closely interlinked practitioners and tutors. One of the first well-known philosophers, Socrates, is famous not only as the father of Western philosophy but also for his status as a teacher of the discipline. He himself never actually wrote any text, focusing instead on transmitting his knowledge and teaching his disciples the fundamentals of philosophy. His student Plato expanded this idea of teaching the discipline by creating the Platonic Academy, where students could come to study and think among other scholars and philosophers.[51] This trend continued with Plato’s student Aristotle, who opened his own school, known as the Peripatetic School or Lyceum, and with Epicurus, who opened another school in his garden, known as Epicurus’s Garden. The same principle held in the Far East, where Confucian schools were created in honor of Confucius to keep his teachings alive and reflect on his ideas.[52]

Through the centuries until today


Philosophy as a discipline has changed very little since its birth around 600 BCE in Ancient Greece. Because of the nature of the subject, many of the questions asked by Plato, Epicurus, or Aristotle are very similar to questions faced by humans through the centuries and even today. As a result, the way of teaching philosophy has remained more or less the same. While in Europe philosophical writings became less common after the decline of Greece’s influence, owing to a fall in literacy, Greek philosophy continued to be studied in the Islamic world and translated into Arabic. Al-Kindi, a famous philosopher and mathematician, is known to have translated and taught many philosophical writings in the library of Baghdad known as the House of Wisdom around the 9th century AD.[53] In Europe, the discipline continued to be practiced and taught, but now with a strong Christian influence, as in the Italian Accademia Platonica and the resurgence of Plato’s ideas in 15th-century Florence.[54] In England, the department of philosophy made its debut at the University of Oxford at the start of the 1620s, marking the resurgence of philosophy being taught formally.[55] Ever since, the discipline has continuously expanded to most universities around the world and is greatly respected in the world of academia. Today UCL hosts one of the biggest philosophy departments in the UK.[56]

History of Law as a discipline


The Cambridge Dictionary defines law as the system of rules of a particular country, group, or area of activity.[57] It is a highly interdisciplinary discipline: it sets the rules of our societies and has therefore, in some form, always existed.

Origins


The first origins of law as an academic discipline may go back to the 5th century BC in Athens, where citizens could be taught related subjects such as the philosophy of law and argumentation. Around 160 AD, Gaius, a Roman jurist, wrote the Institutes, a teaching book inspired by Roman law and Greek philosophy and intended for future lawyers.[58] Law has different roots. In England, for instance, the famous universities of Oxford and Cambridge followed the new system of adjudication established after the Norman invasion, which became the common law.[59] It can be argued that law as an academic discipline has existed since the creation of the university itself: it was taught at the University of Constantinople (founded in 425), considered the first university in the world.[60]

Evolution of Law since the 16th century


By the 16th century, European countries had started to realize that law had to change alongside history, and each wrote its own version, including Grotius's Introduction to Dutch Law (1619-1621) and, later, Napoleon's Civil Code (1804). Around the 18th century, as dedicated law schools were created, a wave of criticism of legal education arose and new ideas emerged.[61] Some, like Sir William Blackstone, tried to make things evolve, but no significant change occurred before the 19th century.[62] The US model is now the most commonly used system for teaching law, offering its students both an academic discipline and a professional approach.[63]

The History of History as a Discipline


It is difficult to pinpoint the emergence of history as a discipline because engagement with the past has always been a fundamental aspect of human culture. Chronographic texts such as king lists and annals can be classified as historical documents, and these can be found among the archaeological evidence of civilisations as early as Ancient Egypt and Mesopotamia.[64] One artifact that can be classified as evidence of historical thinking among ancient civilisations is the Palermo Stone, a fragment of an Ancient Egyptian stele inscribed with a list of five dynasties of rulers (c. 2925–c. 2325 BCE) and a year-by-year record of significant events.[65] Such texts relied on written language to record the past, so although humans may have engaged with their past to a certain extent through oral tradition and storytelling, the elementary signs of the emergence of history as a discipline in a recognisable form can arguably be traced back no further than the invention of writing.

Ancient Greece


The Ancient Greeks are credited with significant contributions to the discipline of history in the traditional sense. They helped cultivate the discipline particularly through their contribution to the genre of historical writing, as they developed forms of recording the past that permitted more sophisticated accounts than the chronographic texts of earlier civilisations.[66] Although the epic poetry of texts such as Homer's Iliad and Odyssey conveys the importance the Greeks placed in the stories of their ancestors, it is Herodotus' Histories that is widely regarded as the foundational text of the genre of historical writing. It is with Herodotus that the word "history," deriving from the Greek historia meaning "inquiry,"[67] first appears in the context of the study of the past.[68] Herodotus set out in the Histories to investigate the events of the Persian Wars and to apply this knowledge to an understanding of his contemporary world. The examination of cause-and-effect relationships and the idea that the past helps illuminate the present are features of Herodotus' work that still play a defining role in the discipline of history.[69]

Thucydides was another prominent historian to come out of Ancient Greece. Influenced by Herodotus, his History of the Peloponnesian War recorded the events of a war in a factual and analytic manner.[70] However, Thucydides differed from Herodotus in that he was a contemporary of the period he wrote about. Thucydides was highly critical of Herodotus' attempts to record the nature of foreign places and societies, as well as events that he had not personally witnessed.[71] Thucydides' criticisms raised the issue of reliability in history, which remains an important consideration within the discipline today. Other historians were inclined to agree with Thucydides' suspicion that Herodotus was more of a storyteller than a credible source of knowledge of the past. As a result of Thucydides' influence, for much of antiquity historians did not concern themselves with the direct acquisition of knowledge of the past, as this was perceived as futile; instead, the practice of history focused on building on previous historical work and on recording contemporary events to which the historians had themselves borne witness.[72] Therefore, although the origins of the discipline are evident in the analytical perspective the Ancient Greeks applied to studying the past, as well as in their focus on core historical issues such as cause and effect and reliability, a large part of what today constitutes history was neglected.

The Enlightenment


The emphasis on reason and empiricism that dominated the pursuit of knowledge during the Enlightenment period had a significant impact on the discipline of history as key thinkers pushed for a more scientific approach to the study of the past.[73]

History of Animal Magnetism


Though Edgar Allan Poe’s The Facts in the Case of M. Valdemar[74] was a work of fiction, when it was published in 1845 many saw it as a scientific report about Mesmerism, also known as Animal Magnetism. Mesmerism had been a popular discipline from the mid-eighteenth century through to the beginning of the nineteenth century, emerging in 1776 with the publication of De planetarum influxu in corpus humanum (« The influence of the planets on the human body ») by Franz Mesmer, a German doctor. By the mid-1800s, new phenomena such as electromagnetism and hypnosis were gaining in popularity. As these were strongly founded scientifically, Mesmerism became sidelined and was no longer considered a credible theory.

According to the Encyclopædia Britannica, Animal Magnetism is a « presumed intangible or mysterious force that is said to influence human beings ».[75] It is a form of alternative medicine which gained popularity throughout Europe from the mid-18th century to the beginning of the 19th century. Mesmer defined it as the ability to heal others thanks to a natural fluid, a force present within ourselves which links man, the earth and the universe.[76]

Although Animal Magnetism was never backed by scientists, many works and reviews were published on the subject, such as Puységur’s Du magnétisme animal considéré dans ses rapports avec diverses branches de la physique générale (Animal magnetism considered through its relations with diverse branches of general physics) in 1807, or, in 1814, the Annales du Magnétisme (Annals of magnetism) by François Deleuze, a review of European experiments at the time. Furthermore, in 1782 the « Société de l’Harmonie Universelle » (Society of Universal Harmony) was created to ensure the future of the doctrine, which was threatened by academics and the French government for lacking scientific evidence. Animal Magnetism was also taught in several universities in Germany and in England, attracting public interest until the 1840s as works and reviews were translated into many different languages.

Promoters of Animal Magnetism were keen to ensure that the discipline was not seen as lacking a scientific approach. Instead, they advocated a new form of rationality that stayed away from fixed ideas of the possible and the impossible. With the advancement of medicine and physics during the 19th century, the theories at the origin of Animal Magnetism were proved wrong, and Mesmerism started to be discredited.[77]

Today, Animal Magnetism has lost its past influence and is considered an obscure, alternative medicine, often associated with hypnosis. In recent years, however, the Law of Attraction theory, which picks up on an important concept of Magnetism, has gained momentum.

The History of Feminism and Feminist Literature as sub-disciplines


Feminist literature can be seen as a sub-discipline of literature, and it is also important in gender studies, which studies feminism, therefore making feminism a discipline in its own right.

The beginnings of feminist literature


The roots of feminist literature begin in convents (around the 11th century), as religious women were the only women who were taught to read and write, unlike other women of the time, who were destined for marriage.[78] The first feminist authors were therefore religious women who were able to start questioning the social hierarchy of their (European) societies, like Jane Anger, who wrote in 1589 that women, contrary to what the vast majority of Europeans believed during the 16th century, might actually be superior to men. Indeed, her interpretation of the creation of Eve in the Bible was that, as Eve was created from Adam’s rib whereas he was created from dirt, Eve, and therefore women, was a better version of the human being.[78]


The development of feminism outside of the religious context, however, was much harder, as women intellectuals were not tolerated unless they were seen as gifted with a “divine inspiration”, as in the previous religious context.[78]

First Circles of Women Intellectuals


Circles of women intellectuals still managed to appear progressively, like that of Mary Astell, an author of the 17th and 18th centuries.[78] Through the literature emerging from these circles, women were encouraged to develop their own judgement and ways of thinking, as in Astell’s A Serious Proposal to the Ladies, for the Advancement of Their True and Greatest Interest (1694), which called for the intellectual emancipation of women and challenged women's education.

The birth of a real “movement”


The appearance of a real movement became clearer at the end of the 19th century, based on the education of girls, the legal situation of married women and the lack of access to employment for women.[78] Another cause then became central to this movement: the right to vote, and with it came the term “suffragette”. One of the first countries to give women the right to vote was New Zealand, in 1893.[79] The majority of European countries then gave women the right to vote at the end of WW1: Britain and Germany in 1918, Austria and the Netherlands in 1919, and the US in 1920. Women now have the right to vote in every country where elections take place, since the introduction of women's suffrage in Saudi Arabia in 2015.[80]

Importance of World War I


The First World War was a key moment for feminism. As women engaged in the war effort, the world realized the possibility of, and need for, women contributing to society in the world of labour too. It challenged the idea of the inferiority of women anchored in Western societies and made it hard to support the claim that women were unfit to vote. Finally, it allowed women to enter the “public arena” they had been deprived of for centuries.[81] The feminist movement and its literature continued to evolve through the 20th century with ‘second-wave’ feminism, and the UN established a Commission on the Status of Women in 1947, proof of the progress of the movement.[78]

Diverse Feminism(s)


As the movement gained importance and reached women across the entire world, the necessity for a diversity of feminisms to answer the different issues faced by different women became clear. This need was expressed, for instance, by the author Simone de Beauvoir in The Second Sex (1949), which explores a large variety of categories of women (with chapters on the girl child, the wife, the mother, the prostitute, the narcissist, the lesbian, and the woman in love).

This necessity to recognize different forms of feminism was also expressed later by Ien Ang, who argued in 1995 that the lack of understanding and relating between different feminists should be accepted rather than suppressed in the name of an unreal, united feminism.[82]

Studying Feminism


The increasing importance of feminism in our world is also shown by the increasing number of courses focused on this discipline at university level. For instance, the Gender Studies degree at SOAS focuses on issues from all around the world, emphasizing the importance of the diverse feminisms mentioned earlier. Additionally, the master's degree at Sussex includes feminist research in its programme, whereas the University of York focuses more on its cultural, historical, political and sociological aspects, offering a master's in "Women's Studies" as well as a master's called "Women, Violence and Conflict".[83] All these different programmes are proof of the expansion of feminism's importance in our society, and reflect the diversity of approaches to feminism.

History of Economics


Definition


According to the Cambridge Dictionary, the definition of economics today is "the study of the way in which economies work, for example, the way in which they make money and produce and distribute goods and services".[84] Economics is a discipline covering not only international issues through macroeconomics (imports and exports, valuation and devaluation of currencies, tax barriers) and international organizations (the IMF, for example), but also the actions of individual people or firms through microeconomics (supply and demand, consumer choice, and so on). It is therefore a broad discipline connected to many others, such as sociology, politics, and international relations.

Origins


Today, many consider Adam Smith (1723 - 1790) to be the founder of the discipline of economics. He was the first to write about and develop the now well-known notion of a liberal economy. Indeed, he was against any intervention by governments in the market and believed that in a free market, supply and demand would balance themselves on their own. This prompted many other economists to emerge, such as Marx, Malthus, and later Keynes, who either elaborated on or contested Smith's liberal point of view.

However, many argue that economics was studied well before Adam Smith, in Ancient Greece, by philosophers such as Hesiod (who discusses the scarcity of resources and the concept of competition), Xenophon (who explores management and leadership), and Aristotle ("money are a substance that has a telos, that individuals have devised a unit that supplies a measure on the basis of which just exchange can take place").[85] Furthermore, many of Adam Smith's ideas were taken from the French writers who explored the idea of mercantilism (which holds that countries should get richer through exports, forcing them to develop and become more productive and less dependent; these writers also introduced the notion of protectionism). The origins of economics therefore remain a matter of debate; however, Adam Smith is still known as the creator of modern economics, having united and expanded on the theories put forward before him.[86]

Economics Today


Today, economics has become more mathematical and less purely theoretical. It is now possible to test many theories used in the past through calculations and graphs. Furthermore, with the advances technology has made, most trades (whether on the stock market or other markets) rely less on individual instinct and more on calculations and predictions. This means that economics is now closely related to the discipline of mathematics and borders on being a science.
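To illustrate the kind of calculation that is now routine in the discipline, the sketch below solves a simple linear supply-and-demand model for its equilibrium price and quantity in Python. The functional form and the coefficients are invented for illustration and are not drawn from this chapter's sources.

```python
# A minimal sketch of the kind of calculation now routine in economics:
# solving a linear supply-and-demand model for its market equilibrium.
# The coefficients below are invented purely for illustration.

def equilibrium(a, b, c, d):
    """Demand Qd = a - b*p and supply Qs = c + d*p intersect where
    Qd = Qs, giving p* = (a - c) / (b + d)."""
    price = (a - c) / (b + d)
    quantity = a - b * price
    return price, quantity

# Example: demand Qd = 100 - 2p, supply Qs = 10 + 4p
p_star, q_star = equilibrium(a=100, b=2, c=10, d=4)
print(f"equilibrium price = {p_star:.2f}, quantity = {q_star:.2f}")
# -> equilibrium price = 15.00, quantity = 70.00
```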

Moreover, the theories that remain most important are those of Adam Smith and John Maynard Keynes, used by many countries (Marxism lost much of its importance towards the end of the 20th century with the fall of the USSR). Smith's liberal view is used mostly during times of economic prosperity but has been losing importance with the multiple crises that have hit the Western world. Keynes, by contrast, promotes government intervention during times of crisis through the lowering of interest rates and the funding of firms in difficulty (the exact opposite of Smith's theory). We have seen the USA use this method several times (the 1929 crisis, the 1970s oil shocks, the 2008 Wall Street crash). With the COVID-19 crisis, this theory has never been more applicable, as almost every country has provided aid to firms and people, through direct payments instead of wages from their employer, reductions of taxes, or delays on debt repayment, for example.[87]

History of Alchemy


Emergence of Alchemy


Alchemy emerged before the common era as a proto-scientific philosophical tradition.[88] It is difficult to trace the evolution of alchemy, but it is thought to have emerged independently at three different points: first in Hellenistic Egypt; second on the Indian subcontinent, potentially as early as the second millennium BCE;[89] and third in China, potentially as early as the fourth century BCE, although the evidence for this is anecdotal and linguistic study contradicts the claim, as there was no word for gold at that time.[90] In Egypt, the earliest attributable author is Zosimos of Panopolis, who published his treatises c. 300 CE and in them emphasised the influence of Egyptian metallurgy and religion on the development of alchemy.[91] Alchemical knowledge is thought to have been lost when the Roman emperor Diocletian ordered the burning of alchemical texts following a revolt in Alexandria in 272 CE.[92]


Aims of Alchemy


Alchemy involved the production of “noble” metals from “base” metals,[93] a common example being the transmutation of lead into gold (known as chrysopoeia). Across all alchemical schools of thought the process of creating gold remained central, whether to gain wealth or, especially in the Chinese and Indian traditions, to produce an ‘elixir of life’ which would grant immortality. The transmutation of metals was also used as an allegory for the spiritual transformation of man.[94]

Modern Alchemy


Alchemy is often thought of as a precursor to modern chemistry, and it existed alongside modern science before its claims could be fully refuted, owing to its close ties with the Church, which controlled much of European society well into the industrial era;[95] Isaac Newton devoted much of his study to chrysopoeia even alongside his much more fruitful study of physics.[96] Alchemy has also become an important concept in modern fiction, with the idea of the ‘Philosopher’s Stone’ being a core concept of the first Harry Potter book, and an exaggerated set of alchemical principles ruling the world of the popular anime ‘Fullmetal Alchemist’.

References

  1. Ornes S. Science and Culture: Computers take art in new directions, challenging the meaning of "creativity" [Internet]. PNAS. National Academy of Sciences; 2019 [cited 2020Oct16]. Available from: https://www.pnas.org/content/116/11/4760
  2. AI Generated Paintings [Internet]. ART AI. [cited 2020Oct16]. Available from: https://www.artaigallery.com/?gclid
  3. a b c E.A.T. - Archive of published document. Daniel Langlois Foundation [Internet]. 2000 [cited 2020Oct16]; Available from: https://www.fondation-langlois.org/html/e/page.php?NumPage=306
  4. Carbonell B. Museum studies. Oxford: Blackwell; 2004.
  5. Lewis, G., 2020. Museum - The First Museum Boom. [online] Encyclopedia Britannica. Available at: <https://www.britannica.com/topic/museum-cultural-institution/The-first-museum-boom> [Accessed 16 October 2020].
  6. Lewis, G. Manual of Curatorship: a guide to museum practice. Second edition. Butterworth & Museums Association; 1992.
  7. Maroević, I. and Edson, G., 1998. Introduction To Museology. Munich: Müller-Straten.
  8. Popadić M. The beginnings of museology. Muzeológia a kultúrne dedičstvo. 2020; 8;2:5-16. doi: 10.46284/mkd.2020.8.2.1.
  9. Museums Association. 2020. Our Story - Museums Association. [online] Available at: <https://www.museumsassociation.org/about/our-story/> [Accessed 16 October 2020].
  10. McCall, V. and Gray, C., 2013. Museums and the ‘new museology’: theory, practice and organizational change. Museum Management and Curatorship, 29(1), pp.19-35. doi: 10.1080/09647775.2013.869852.
  11. Shelton, A., 2013. Critical Museology: A Manifesto. Museum Worlds, 1(1), pp.7-23. doi: 10.3167/armw.2013.010102.
  12. Biedermann B. ‘Virtual museums’ as digital collection complexes. A museological perspective using the example of Hans-Gross-Kriminalmuseum. Museum Management and Curatorship. 2017;32(3):281-297. doi:10.1080/09647775.2017.1322916.
  13. Wiley, Jeanne. The History of Gregg Shorthand. [Internet] 2014 Aug, cited 2020 Oct, Available here: https://www.cookandwiley.com/2014/08/04/history-gregg-shorthand/?doing_wp_cron=1603033929.5854520797729492187500
  14. Pitman, Sir Isaac. A History of Shorthand, Fourth Edition, pg 1. Available from https://books.google.co.uk/books?hl=en&lr=&id=C7JLHQY9QSAC&oi=fnd&pg=PA1&dq=history+shorthand&ots=aCq51pzIfd&sig=WTOwLunJHi_3pA0QBLs6ygmGMFU&redir_esc=y#v=onepage&q=history%20shorthand&f=false
  15. Pitman, Sir Isaac. A History of Shorthand, Fourth Edition, pg 5. Available from https://books.google.co.uk/books?hl=en&lr=&id=C7JLHQY9QSAC&oi=fnd&pg=PA1&dq=history+shorthand&ots=aCq51pzIfd&sig=WTOwLunJHi_3pA0QBLs6ygmGMFU&redir_esc=y#v=onepage&q=history%20shorthand&f=false
  16. McLoughlin, Ian and Sharifzadeh, Hamid Reza, Nanyang Technological University Singapore. Speech Recognition for Smart Homes. Section 4.3 pg 482. Available from: https://www.researchgate.net/publication/221702135_Speech_Recognition_for_Smart_Homes
  17. Vivian Van Saaze, 2013, Installation Art and the Museum: Presentation and Conservation of Changing Artworks, Chapter 1 Key Concepts and Developments in Conservation Theory and Practice, [Internet], Available at <https://www.jstor.org/stable/j.ctt46n18r.5>
  18. Paul Eggert, 2019, Securing the Past. Conservation in Art, Architecture and Literature, [Internet], Available at <https://www.jstor.org/stable/10.2979/tex.2009.4.2.113>
  19. David Watt, Belinda Colston, 2002, The Building Conservation Society, Science and Conservation, [Internet], Avaliable at <https://www.buildingconservation.com/articles/sciencecons/sciencecons.htm>
  20. Vivian Van Saaze, 2013, Installation Art and the Museum: Presentation and Conservation of Changing Artworks, Chapter 1 Key Concepts and Developments in Conservation Theory and Practice, [Internet] Available at <https://www.jstor.org/stable/j.ctt46n18r.5>
  21. Harold J Plenderleith, 1998, A History of Conservation, [Internet] Avaliable at <https://www.jstor.org/stable/1506740>
  22. Invaluable, The Science Behind the Restoration of A Painting, [Internet], Avaliable at <https://www.invaluable.com/blog/the-science-behind-art-restoration/>
  23. International Institute for Conservation of Historic and Artistic Works, History, [Internet], Avaliable at <https://www.iiconservation.org/about/history>
  24. Rachel Brazil, 2014, Modern Chemistry Techniques Save Ancient Art, [Internet], Available at <https://www.scientificamerican.com/article/modern-chemistry-techniques-save-ancient-art/>
  25. Hilbert, David; Ackermann, Wilhelm (1928). Grundzüge der Theoretischen Logik. Translated by Hammond, Lewis M.; Luce, Robert E.; Leckie, George Gaines; Steinhardt, Fritz; Principles of Mathematical Logic. American Mathematical Society, 1950
  26. Turing, A. M. (1937). "On Computable Numbers, with an Application to the Entscheidungsproblem". Proceedings of the London Mathematical Society. 2 (1): 230–265. doi:10.1112/plms/s2-42.1.230. Retrieved 8 November 2020.
  27. Church, Alonzo (1937). "Review of: On Computable Numbers with An Application to the Entscheidungsproblem by A.M. Turing". The Journal of Symbolic Logic. Association for Symbolic Logic. 2 (1): 42-43. doi:10.2307/2268810.
  28. Tedre, Matti (2007). "Know Your Discipline: Teaching the Philosophy of Computer Science" (PDF). Journal of Information Technology Education. 6: 105–122. doi:10.28945/204. Retrieved 18 October 2020.
  29. Knuth, Donald E. (1972). "George Forsythe and the Development of Computer Science". Communications of the ACM. 15 (8): 721–726. doi:10.1145/361532.361538. Retrieved 18 October 2020.
  30. Shallit, Jeffrey (1995). "A Very Brief History of Computer Science". University of Waterloo. Retrieved 18 October 2020.
  31. Denning, Peter J. (2000). "Computer Science: The Discipline" (PDF). Encyclopedia of Computer Science. Archived from the original (PDF) on May 25, 2006.
  32. Wiener, Norbert (1948). Cybernetics: Or Control and Communication in the Animal and the Machine. Technology Press.
  33. Wiener, Norbert (1950). The Human Use of Human Beings: Cybernetics and Society. Houghton Mifflin.
  34. a b Bynum, Terrell Ward (2000). "A Very Short History of Computer Ethics". Newsletter on Philosophy and Computing. American Philosophical Association. Retrieved 18 October 2020.
  35. Maner, Walter (1996). "Unique Ethical Problems in Information Technology". Science and Engineering Ethics. 2: 137–154. doi:10.1007/BF02583549. Retrieved 18 October 2020.
  36. Brain Basics - The fundamentals of neuroscience [Internet]. Bris.ac.uk. 2011 [cited 8 November 2020]. Available from: http://www.bris.ac.uk/synaptic/basics/basics-0.html
  37. Hunter, A., 2017. A (Very) Brief History Of Neuroscience » Brain World. [online] Brain World Magazine. Available at: <https://brainworldmagazine.com/a-very-brief-history-of-neuroscience/> [Accessed 19 October 2020].
  38. Fleming, D., 2019. The Role Of Neuroscience In Psychology. [online] Grey Matters. Available at: <https://www.greymattersintl.com/role-neuroscience-psychology/> [Accessed 19 October 2020].
  39. Nordvqist, C., n.d. About Neuroscience - Department Of Neuroscience. [online] Georgetown University. Available at: <https://neuro.georgetown.edu/about-neuroscience/> [Accessed 19 October 2020].
  40. Ruttimann Oberst, J., 2011. The Many Fields Of Neuroscience: Shifting From Synapses To Society. [online] Science Mag. Available at: <https://www.sciencemag.org/features/2011/11/many-fields-neuroscience-shifting-synapses-society> [Accessed 19 October 2020].
  41. M Abi-Rached, J. and Rose, N., 2014. Historiciser Les Neurosciences. [online] Scholar.harvard.edu. Available at: <https://scholar.harvard.edu/files/jabirached/files/abirached_rose_chap2_-_moutaud_chamak.pdf> [Accessed 22 October 2020].
  42. Hunter, A., 2017. A (Very) Brief History Of Neuroscience » Brain World. [online] Brain World Magazine. Available at: <https://brainworldmagazine.com/a-very-brief-history-of-neuroscience/> [Accessed 19 October 2020]
  43. "Chapter II: Establishing the Society for Neuroscience, 1968-1970". www.sfn.org. Retrieved 2020-10-20.
  44. Klar, Philipp (2020-10-17). "What is neurophilosophy: Do we need a non-reductive form?". Synthese. doi:10.1007/s11229-020-02907-6. ISSN 1573-0964.
  45. "Neurophilosophy and Its Discontents". Institute for Advanced Study. Retrieved 2020-10-20.
  46. "Neurophilosophy and the philosophy of neuroscience (II) – Buyer Brain". Retrieved 2020-10-20.
  47. Sampson, S. and Brazette, Y., 2018. What Is Neuroscience?. [online] Medicalnewstoday.com. Available at: <https://www.medicalnewstoday.com/articles/248680#why-is-it-important> [Accessed 19 October 2020].
  48. Hong Jing, J., 2020. Fascinating Relationship Between AI And Neuroscience. [online] towards data science. Available at: <https://towardsdatascience.com/the-fascinating-relationship-between-ai-and-neuroscience-89189218bb05> [Accessed 19 October 2020].
  49. philosophy|Origin and meaning of Philosophy by online Etymology Dictionary https://www.etymonline.com/word/philosophy .
  50. What is Philosophy?|Department of Philosophy https://philosophy.fsu.edu/undergraduate-study/why-philosophy/What-is-Philosophy .
  51. Platonic Academy - New World Encyclopedia https://www.newworldencyclopedia.org/entry/Platonic_Academy .
  52. Confucius, The School of Life Articles . https://www.theschooloflife.com/thebookoflife/confucius/ . 2014-11-12
  53. Al-Kindi - Biography. Maths History. https://mathshistory.st-andrews.ac.uk/Biographies/Al-Kindi/ .
  54. Platonic Academy|Italian Scholars. Encyclopedia Britannica. https://www.britannica.com/topic/Platonic-Academy .
  55. History of Oxford Philosophy. https://www.philosophy.ox.ac.uk/history-oxford-philosophy#collapse387181 .
  56. UCL.- University College London. UCL Philosophy. https://www.ucl.ac.uk/philosophy/ .
  57. The Cambridge Dictionary https://dictionary.cambridge.org/dictionary/english/law
  58. Greenidge Abel Hendy Jones (Introduction) and Poste Edward (Translation) https://oll.libertyfund.org/titles/gaius-institutes-of-roman-law
  59. Brouwer, René The Study of Law as an Academic Discipline, Utrecht Law Review. 2017.13(3). https://www.utrechtlawreview.org/articles/abstract/10.18352/ulr.405/
  60. Kyriakis, Michael J. The University: Origin and Early Phases in Constantinople, Peeters Publishers (1971). Vol.41, p161-182. https://www.jstor.org/stable/44170312?seq=1
  61. Sheridan Lionel Astor, Glendon Mary Ann, Alford William P. et al Legal Education(2020) Encyclopaedia Britannica Available from: https://www.britannica.com/topic/legal-education
  62. The Editors of Encyclopaedia Britannica, Sir William Blackstone(2020) Encyclopaedia Britannica Available from: https://www.britannica.com/biography/William-Blackstone
  63. Sheridan Lionel Astor, Glendon Mary Ann, Alford William P. et al Legal Education(2020) Encyclopaedia Britannica Available from: https://www.britannica.com/topic/legal-education
  64. Woolf, Daniel (2019). A Concise History of History: Global Historiography from Antiquity to the Present. Cambridge: Cambridge University Press. pp. 15–16. ISBN 1108426190.
  65. "Palermo Stone". Encyclopædia Britannica. April 1, 2020. Retrieved October 20, 2020.
  66. Woolf, Daniel (2019). A Concise History of History: Global Historiography from Antiquity to the Present. Cambridge: Cambridge University Press. p. 20. ISBN 1108426190.
  67. Hall, Edith (2015). The Ancient Greeks Ten Ways they Shaped the Modern World. London: Vintage. p. 120. ISBN 978-0-099-58364-6.
  68. Woolf, Daniel (2019). A Concise History of History: Global Historiography from Antiquity to the Present. Cambridge: Cambridge University Press. p. 21. ISBN 1108426190.
  69. Hall, Edith (2015). The Ancient Greeks Ten Ways they Shaped the Modern World. London: Vintage. pp. 120–121. ISBN 978-0-099-58364-6.
  70. Hall, Edith (2015). The Ancient Greeks Ten Ways they Shaped the Modern World. London: Vintage. p. 149. ISBN 978-0-099-58364-6.
  71. Momigliano, Arnaldo (1958). "The Place of Herodotus in the History of Historiography". History. 43: 1–13 – via JSTOR.
  72. Momigliano, Arnaldo (1958). "The Place of Herodotus in the History of Historiography". History. 43: 1–13 – via JSTOR.
  73. Munslow, Alan (2012). A History of History. Oxford: Routledge. p. 22. ISBN 1136240594.
  74. « The facts in the Case of M. Valdemar », Edgar Allan Poe, available at: https://poestories.com/read/facts
  75. The Britannica Encyclopedia: https://www.britannica.com/science/animal-magnetism
  76. Nicholas Spanos and Jack Gottlieb, 1979, « Demonic Possession, Mesmerism, and Hysteria: A Social Psychological Perspective on Their Historical Interrelations », Journal of Abnormal Psychology
  77. Carlos S Alvarado, 2009, Late 19th- and early 20th- century discussions of animal magnetism, https://pubmed.ncbi.nlm.nih.gov/20182996/#:~:text=While%20the%20concept%20of%20animal,it%20did%20not%20disappear%20completely.&text=This%20association%2C%20and%20the%20belief,a%20history%20that%20is%20incomplete.
  78. a b c d e f Walters M. Feminism: A Very Short Introduction (Very short introductions ; 141). Oxford University Press; 2005.
  79. Women MPs - Parliament's people | NZHistory, New Zealand history online [Internet]. Nzhistory.govt.nz. 2020 [cited 19 October 2020]. Available from: https://nzhistory.govt.nz/politics/parliaments-people/women-mps)
  80. Salami M. Sensuous knowledge; 2020
  81. Hume L. The national union of women's suffrage societies 1897-1914. Abingdon: Routledge; 2016.
  82. Ien Ang, ‘I’m a Feminist but …’, in Transitions: New Australian Feminisms , ed. B. Caine and R. Pringle (Sydney: Allen and Unwin, 1995).
  83. Postgraduate Women’s Studies Courses in the UK [Internet]. SI-UK: Move Forward. Be Great. 2020 [cited 23 October 2020]. Available from: https://www.studyin-uk.com/study-guide/best-womens-studies-programmes-uk/
  84. Cambridge online dictionary, the definition can be found at https://dictionary.cambridge.org/fr/dictionnaire/anglais/economics
  85. "Three Ancient Greek Philosophers that shaped economy as we know today" by DocumentaryTube.com, https://www.documentarytube.com/articles/three-ancient-greek-philosophers-that-shaped-economy-as-we-know-today
  86. "A Brief History of Economics", by Investopedia, https://www.investopedia.com/articles/economics/08/economic-thought.asp
  87. Professor Claude M'Fuka M'Lambi
  88. Principe LM. The Secrets of Alchemy. University of Chicago press. 2012;:9–14.
  89. Gilbert RA, Multhauf RP. Alchemy. Encyclopedia Britannica [Internet]. 2019Mar6 [cited 2020Nov10]; Available from: https://www.britannica.com/topic/alchemy
  90. Silvin N. Chinese Alchemy: Preliminary Studies. Harvard University Press. 1968;:21–2.
  91. Martelli M. L'alchemisto antica. Editrice Bibliografica. 2019;:73–86.
  92. Partington JR. A short history of chemistry. New York: Dover Publ.; 1989.
  93. Pereira M. Alchemy. Routledge Encyclopedia of Philosophy. 1998;
  94. Hanegraaff WJ, Faivre A. Western esotericism and the science of religion. Place of publication not identified: Peeters; 1999.
  95. Principe LM. The End of Alchemy? Osiris. 2014;29(1):96–116.
  96. Dobbs BJT. The Foundations of Newton's Alchemy or "The Hunting of the Greene Lyon". Cambridge University Press. 1975;