User:TimRJordan/sandbox/Approaches to Knowledge/2020-21/Seminar group 4/History


The History of Psychology as a Branch of Medicine

Psychology can be defined simply as the study of experience and behaviour, according to Sonja Hunt in The relationship between psychology and medicine[1], which gives psychology a point in common with medicine: an interest in human functioning. The involvement of psychology in medicine has not been continuous or unanimously encouraged, and the interdependence between body and mind was questioned throughout the centuries in parallel with the interdependence between psychology and medicine. In Antiquity, mind and body were believed to be linked. The saying "Mens sana in corpore sano", a quotation from Juvenal's Satires[2], is still used today; it shows how important it has long seemed that, in order to be well, one must be not only in good physical shape but also in a good state of mind. The link between the two disciplines was questioned, however, when the view of Man changed: body and mind became two distinct areas of study in the 17th century, largely through the work of Descartes (Meditationes de prima philosophia[3], Meditation VI).

The view of the link between mind and body has since changed. Psychology is now considered essential and has come to be a medical discipline, even though it does not rely on prescriptions and medicines as psychiatry does. It may still be marginalised, however, because of its sometimes abstract character, less concrete than other branches of medicine; Hunt claims this can undermine the credibility of the discipline[4]. Nevertheless, psychology is increasingly recognised, especially with the growing awareness of the importance of mental health.

References

  1. Hunt SM. The relationship between psychology and medicine. Social Science & Medicine. 1974;8(2):105-109. Available from: https://doi.org/10.1016/0037-7856(74)90040-7
  2. Wikipedia. Mens sana in corpore sano. Available from: https://fr.wikipedia.org/wiki/Mens_sana_in_corpore_sano [Accessed 17th October 2020]
  3. Descartes R. Meditation VI. Of the Existence of Material Things, and of the real distinction between the Soul and Body of Man. In: Meditations on First Philosophy. Cambridge University Press; 1911. Available from: https://yale.learningu.org/download/041e9642-df02-4eed-a895-70e472df2ca4/H2665_Descartes%27%20Meditations.pdf
  4. Hunt SM. The relationship between psychology and medicine. Social Science & Medicine. 1974;8(2):105-109. Available from: https://doi.org/10.1016/0037-7856(74)90040-7

The History of Material Culture

Analysing objects is not a recent way of understanding other communities; it has always been implicit in ethnographic work. Its consideration as a discipline in its own right, however, may be dated only to the late 1990s, with the creation of the Journal of Material Culture in 1996, first edited by members of the UCL Department of Anthropology. The journal shows that academics had begun to exchange their conceptions of material culture, evidencing the field's beginnings as a proper discipline.[1]

At first, in anthropological research, the aim of material culture studies was to prove the 'modernism' of Western culture by comparing its evolved objects with the 'primitive' objects of non-Western cultures: European culture was shown as superior.[2] Colonialism thus entrenched the supremacy of Western assumptions, which also led to a 'masculine' hierarchy of the senses that placed sight at the top of the scale. This traditional 'ocularcentric' mode of analysis has been questioned since the assertion of non-Western cultures through decolonisation and the emergence of material culture studies as a distinct discipline. The development of material culture studies makes it possible, by criticising a single approach to objects, to avoid discriminating against any culture or ethnic group.[3] The social change brought by new pluricultural communities increasingly challenged the foundational ways of interacting with objects, especially in museums, where only sight is engaged. Although each culture has its own sensory model, only the Western model was considered relevant, which again showed the desire to distinguish European culture from others. Material culture studies as a discipline seeks to teach people to apprehend objects from different societies through several senses at once, "not only to see objects but to sense objects", as stated in Sensible Objects: Colonialism, Museums and Material Culture.

Material culture studies will therefore keep changing, challenging traditional modes of analysis rooted in colonial history. The question of moving beyond a purely visual approach to the artefacts exhibited in museums is a complex one, since the issue of physical preservation also has to be raised.

References

  1. Woodward I. Understanding material culture. New York: SAGE Publications Ltd; 2007.
  2. Woodward S. Material Culture [Internet]. obo. 2020 [cited 18 October 2020]. Available from: https://www.oxfordbibliographies.com/view/document/obo-9780199766567/obo-9780199766567-0085.xml
  3. Edwards E, Gosden C, Phillips R. Sensible Objects: Colonialism, Museums and Material Culture. Berg; 2006. p. 1–32. Available from: doi:10.5040/9781474215466.0006

The History of Logic in the West

Logic as a discipline

The word "logic" originates in the Greek word logos. Traditionally, logos is translated as "reason", but this translation has been contested by scholars; as a result, its entry in A Greek-English Lexicon (Liddell–Scott–Jones, LSJ) has over 60 translations, including "speech", "word", and "argument".[1] A broad definition of logic as "the appraisal and analysis of arguments" has also been proposed.[2] Whilst humans have always reasoned and engaged with arguments, and therefore used logic, medieval universities distinguished logica utens (the use of logic in thought, speech, and writing) from logica docens (the formal study of logic as a discipline), with Aristotle considered the founder of the latter in the West.[3]

Significant Work

Aristotle wrote six books on logic, collectively called the Organon. In the Topics, his first attempt at a logic textbook, he discusses the invention of arguments based on endoxa, "consensus", and calls this the art of dialectic.[4] Aristotelian logic continued to be studied in medieval Europe, and most works on logic remained based on Aristotle until the mid-thirteenth to mid-fourteenth century, when more original work was developed.[5]

The most influential works in logic after Aristotle include the Port-Royal Logic, published in 1662 by Antoine Arnauld and Pierre Nicole and considered the start of traditional logic,[6] and John Stuart Mill's A System of Logic, published in 1843, which prompted a new perspective on logic as a branch of psychology.[7] Since then, several modern logical systems have been developed, especially in mathematical logic, which after the Second World War split into model theory, proof theory, computability theory, and set theory.[8]

Logic in Higher Education

Whilst logic classes and courses are common at universities, logic degrees are relatively rare and new. These are usually interdisciplinary and taught in departments of Philosophy or Mathematics, and more recently also Computer Science.[9]

References

  1. Moss J. Right Reason in Plato and Aristotle: On the Meaning of "Logos". Phronesis [Internet]. 2014 [cited 2020];59(3):182. Available from: www.jstor.org/stable/24767942
  2. Gensler, H. J. (2017) [2002]. "Chapter 1: Introduction". Introduction to logic (3rd ed.). New York: Routledge. p. 1. doi:10.4324/9781315693361. ISBN 9781138910591. OCLC 957680480.
  3. Houser RE. Lesson 5 Aristotle Invents Logic—Twice. In: Logic as a liberal art: an introduction to rhetoric & reasoning. Washington, D.C.: The Catholic University of America Press; 2020. p. 50–.
  4. Cellucci, C. (2013). Rethinking Logic: Logic in Relation to Mathematics, Evolution, and Method. Dordrecht: Springer Science & Business Media. p. 82. ISBN 9789400760905.
  5. Boehner, Philotheus. Review: Jan Lukasiewicz, Aristotle's Syllogistic from the Standpoint of Modern Formal Logic. J. Symbolic Logic 17 (1952). p 1.
  6. Oxford Companion p. 504, article "Traditional logic"
  7. Adamson R. In: Short history of logic. Nabu Press; 2010. p. 242.
  8. Barwise, Jon, (ed.), Handbook of Mathematical Logic, Studies in Logic and the Foundations of Mathematics, Amsterdam, North-Holland, 1982 ISBN 978-0-444-86388-1 .
  9. Logic Degree Programs [Internet]. Study.com. 2020 [cited 2020Oct19]. Available from: https://study.com/articles/logic_degree_programs.html

History of Artificial Intelligence within Computer Science

Oxford Languages defines artificial intelligence (AI) as 'the theory and development of computer systems able to perform tasks normally requiring human intelligence'.[1]

The Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI), held in 1956, saw the emergence of AI as a discipline. It was here where John McCarthy, who first coined the term 'artificial intelligence', met with other academics such as Marvin Minsky, Nathaniel Rochester, and Claude Shannon to research and discuss AI.[2]

Before its establishment as a discipline, AI was already a popular subject amongst researchers. Perhaps the most notable early work was the Logic Theorist, a computer program created by Allen Newell, Herbert A. Simon and Cliff Shaw in 1955, which was later presented at the DSRPAI.[3]

Significant Work

Alan Turing was a British mathematician and computer scientist who famously helped crack the Enigma code during the Second World War, and who achieved feats in the field of AI before its formal debut. In his 1950 paper, 'Computing Machinery and Intelligence', he describes the Turing test, which sought to answer the question "Can machines think?".[4] Turing presents 'The Imitation Game', in which an examiner questions a computer and a human and tries to tell one from the other. Building on Turing's work, the CAPTCHA is now widely used to determine whether a user is a computer or a human.

At the MIT Artificial Intelligence Laboratory, Joseph Weizenbaum developed ELIZA, a natural language processing computer program released in 1966. Many users believed ELIZA could understand them, but the program was simply "pattern matching". Despite this, ELIZA was one of the first programs capable of attempting the Turing test.[5]
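To illustrate what "pattern matching" means here, the following is a minimal, hypothetical sketch in the spirit of ELIZA. The rules and responses are invented for illustration and are not Weizenbaum's original script: each rule pairs a regular expression with a response template that reuses the user's own words.

```python
import re

# Hypothetical, simplified ELIZA-style rules: a pattern plus a template
# that echoes back the captured fragment of the user's input.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    """Return the response for the first rule whose pattern matches,
    filling the template with the captured text; otherwise fall back
    to a stock phrase, much as ELIZA did."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Please go on."

print(respond("I am feeling anxious"))  # How long have you been feeling anxious?
print(respond("Hello"))                 # Please go on.
```

The illusion of understanding comes entirely from reflecting the user's phrasing back at them; no meaning is represented anywhere in the program.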

In 1997, the reigning world chess champion Garry Kasparov was defeated by IBM's supercomputer Deep Blue in a six-game match. Three games were drawn, one was won by Kasparov, and two were won by Deep Blue.[6]

Education and the Future of AI

In higher education, artificial intelligence is primarily taught as part of a computer science degree, but it is now increasingly offered as a standalone course.[7] AI has roots in many subject areas, including philosophy, mathematics, psychology, and biology. Building artificially intelligent machines, for example, requires an understanding of how the human brain functions. The versatility of the discipline is reflected in the multiple sub-fields of AI: neural networks, robotics, speech processing, machine learning, and so on.[8]

As AI expands as a discipline, the worldwide AI software market grows year on year (by approximately 54% from 2019 to 2020).[9] However, there are concerns about AI and what it means for our future, often centred on unemployment, security, inequality, and singularity.[10] These concerns point AI in a different direction, one that interests the public sector.[11]

References

  1. artificial intelligence [Internet]. Oxford Reference. 2020 [cited 20 October 2020]. Available from: https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095426960
  2. Moor J. The Dartmouth College Artificial Intelligence Conference: The Next Fifty Years. AI Magazine. 2006;(27).
  3. Gugerty L. Newell and Simon's Logic Theorist: Historical Background and Impact on Cognitive Modeling. Clemson; 2006.
  4. Turing A. Computing Machinery and Intelligence. Mind; 1950.
  5. Before Siri and Alexa, there was ELIZA [Internet]. 2017 [cited 18 October 2020]. Available from: https://www.youtube.com/watch?v=RMK9AphfLco
  6. IBM100 - Deep Blue [Internet]. Ibm.com. [cited 18 October 2020]. Available from: https://www.ibm.com/ibm/history/ibm100/us/en/icons/deepblue/
  7. Search - UCAS [Internet]. Digital.ucas.com. 2020 [cited 20 October 2020]. Available from: https://digital.ucas.com/coursedisplay/results/providers?searchTerm=artificial%20intelligence&studyYear=2021&destination=Undergraduate&postcodeDistanceSystem=imperial&pageNumber=2&sort=ProviderAtoZ&clearingPreference=None
  8. Bullinaria J. IAI: The Roots, Goals, and Sub-fields of AI. Birmingham; 2005.
  9. Liu S. Forecast growth of the artificial intelligence (AI) software market worldwide from 2019 to 2025. Statista; 2020.
  10. Bossmann J. Top 9 ethical issues in artificial intelligence. World Economic Forum. 2016;.
  11. Office for Artificial Intelligence [Internet]. GOV.UK. 2020 [cited 20 October 2020]. Available from: https://www.gov.uk/government/organisations/office-for-artificial-intelligence

History of Forensic Linguistics as a Discipline

Forensic linguistics is a sub-discipline of applied linguistics which involves applying linguistic understanding as forensic evidence in legal proceedings.[1] As with most sciences, themes of forensic linguistics can be traced back to Ancient Greece, and author identification (a key pillar of the discipline) has been a topic of debate in public discourse since literature began.[2] There were attempts to establish means of author attribution in the nineteenth and early twentieth century, with methods rooted in mathematics and statistics that used quantitative features such as average word and sentence length; these, however, fall outside the definition of forensic linguistics, lacking both forensic application and established linguistic technique.[3]
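The crude quantitative features mentioned above are easy to compute, which is precisely why the early statistical methods relied on them. The sketch below is a hypothetical illustration, not a reconstruction of any historical study; the function name and the tokenisation rules are invented for the example.

```python
import re

def stylometric_profile(text: str) -> dict:
    """Compute two simple quantitative style markers of the kind used in
    early author-attribution work: average word length (in characters)
    and average sentence length (in words)."""
    # Split on sentence-ending punctuation, discarding empty fragments.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # A deliberately naive notion of "word": runs of letters/apostrophes.
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "avg_sentence_len": len(words) / len(sentences),
    }

profile = stylometric_profile("The cat sat. The dog barked loudly!")
print(profile["avg_sentence_len"])  # 3.5 (7 words over 2 sentences)
```

Such counts characterise a text numerically but, as the passage notes, say nothing about its forensic context or linguistic structure, which is why they are not considered forensic linguistics proper.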

The term forensic linguistics itself was coined in 1968 by Jan Svartvik, when he was commissioned to analyse a body of statements given to Notting Hill Police Station in the case of Timothy John Evans, who was accused of murdering his wife and child and was convicted and hanged in 1950. Evans' statements had troubled many experts, who questioned their authenticity, and Svartvik's analysis showed that they could not have been dictated by Evans. The case demonstrated the effectiveness of methodically examining bodies of text, and the technique began to gain traction.[4]

The American linguist Roger Shuy is widely regarded as the founder of modern forensic linguistics, and his work, along with that of others, has shaped many aspects of civil and criminal practice, solidifying the discipline's place and use.[5]

The International Association of Forensic Linguistics (IAFL) was formed in the late 20th century and is the primary organisation for forensic linguists. The IAFL edits the International Journal of Speech, Language and the Law, a peer-reviewed journal focused on all aspects of forensic language and audio analysis, and also holds biennial conferences.[6]

Additionally, forensic linguistics is taught internationally at universities, sometimes as a stand-alone degree but more often within joint honours degrees or in specific modules.

References

  1. Centre for Forensic Linguistics. Aston University. Archived from the original on 27 September 2010. [Accessed 17 October 2020]
  2. Olsson J. Forensic Linguistics. 2nd ed. London: Continuum; 2008. ISBN 978-0-8264-6109-4.
  3. Olsson J. What is Forensic Linguistics? Available from: https://www.thetext.co.uk/what_is.pdf [Accessed 26 October 2020]
  4. Olsson J. What is Forensic Linguistics? Available from: https://www.thetext.co.uk/what_is.pdf [Accessed 26 October 2020]
  5. Battistella E, Shuy RW. The Language of Confession, Interrogation, and Deception. Language. 2000;76(3):731. Shuy R. Curriculum Vitae. RogerShuy.com; 2018. [Accessed 18 September 2020]
  6. International Association of Forensic Linguistics. Available from: https://www.iafl.org/ [Accessed 18 October 2020]

The Evolution of Beauty and Visual Culture within the Arts

Many of us spend hours every day curating and maintaining our social media presence; Instagram is a conduit for modern visual culture. Our perceptions of ourselves, society and the world are all mediated by consumption and aesthetic pleasures. Western society has a long history with the visual, and it is interwoven into many aspects of our worldly perspective.[1] This history permeates our everyday lives through the influence of magazines, YouTube, Instagram and television on our self-esteem, how we feel about our bodies, and our identity. The cultural anthropologist Seremetakis argues that the Western fetishisation of sight and aestheticism is seductive and pushes the other senses to our cultural periphery.[1] The documentation of social expectations drives our digital economy: what one must purchase and achieve in order to be 'normal'.

Literature and art have changed and manipulated our perspective on beauty over time, not only in terms of ourselves but also of objects and architecture. These disciplines act as gateways to new perspectives on popular visual culture, such as new genres of literature or new art movements like the Bauhaus. In his poem 'Ode on a Grecian Urn', John Keats states that 'beauty is truth, truth beauty', suggesting that a more realistic representation of life and human experience should be the principal subject of art.[2]

However, this realism within the creative disciplines was interrupted by the Aesthetes of the Pre-Raphaelite era (such as Dante Gabriel Rossetti and William Morris).[3][4] This movement upturned expectations of art's subject matter, holding that art exists simply 'for art's sake' and that the primary function of a work is to exemplify beauty and give pleasure to the senses.[5] Danto theorised that once beauty is made the epicentre of an art piece, the work becomes redundant in terms of meaning, the disclosure of knowledge and its function within capitalism.[6] The history of the creative disciplines is filled with discussion and dispute over the function of the disciplines themselves. Beauty has been moulded and shaped by literature through a wide range of texts. Virginia Woolf's Orlando[7], for example, discloses the privilege of beauty and how beauty can be attained and maintained through wealth and power. Woolf's main subject is the beauty of the human body seen through the eyes of a gender-fluid protagonist and his possessions.[8]

In an age and milieu where the importance of the visual, beautiful or not, galvanises disciplines such as art and literature but also causes division between smaller subcategories such as the Aesthetes and didactic influences, the concept of beauty can be seen as both limiting and liberating. The history and progression of visual culture gives academics of the discipline the tools to analyse the current visual pandemic of Instagram. When one considers that 500 million people globally post an Instagram story every day, making it the epicentral platform for visual culture of the current epoch, there is no doubt that academic and statistical analysis of the visual is fertile ground.[9]

References

  1. Edwards E, Gosden C, Phillips RB. Introduction. In: Sensible Objects: Colonialism, Museums and Material Culture. London: Bloomsbury Academic; 2006. p. 1–32.
  2. Keats J. 'Ode on a Grecian Urn'. In: The Eve of St. Agnes: And Other Poems with Biographical Sketch, Introduction and Notes. Boston: Houghton Mifflin Co; 1915.
  3. Wikipedia. William Morris. Available from: https://en.wikipedia.org/wiki/William_Morris
  4. Wikipedia. Dante Gabriel Rossetti. Available from: https://en.wikipedia.org/wiki/Dante_Gabriel_Rossetti
  5. Tate. Aesthetic movement. Available from: https://www.tate.org.uk/art/art-terms/a/aesthetic-movement
  6. Danto AC. The End of Art: A Philosophical Defense. History and Theory. 1998;37(4):127–143. Available from: www.jstor.org/stable/2505400 [Accessed 19 October 2020]
  7. Woolf V. Orlando. London: Hogarth Press; 1928.
  8. Scutts J. Orlando Is the Virginia Woolf Novel We Need Right Now. 2018. Available from: https://www.vulture.com/2018/10/why-virginia-woolfs-orlando-feels-essential-right-now.html
  9. Newberry C. 37 Instagram Stats That Matter to Marketers in 2020. Hootsuite; 2019. Available from: https://blog.hootsuite.com/instagram-statistics/

The History of Inuit Studies

The study of the Inuit people, previously known as 'Eskimology', can be traced back to 1745, to the missionary works of Hans Egede in Greenland, as well as Ivan Veniaminov (1840) in Alaska. At this time, studies were exclusively dominated by non-native intellectuals, emphasising 'facts’ instead of theoretical discourse about how these details may best be represented.'[1] The history of the discipline shows how direction of change can be 'influenced by the dilemma of being loyal to the colonised society ... studied while remaining a part of the ... colonising society.'[2]

Between the 1850s and 1920s, the study shifted from dispersed, independent research to a more coherent scholarly community with increased exchange of knowledge. It mostly comprised early scientific knowledge based on the explorations of natural scientists and colonial administrators. In 1894, Denmark established its first colony in eastern Greenland, and an imperial project directed research initiatives in Greenland in the 19th and early 20th centuries.[3] Consequently, throughout its history the discipline has been a tool for colonialism as well as, more recently, for decolonisation.[2]

From the 1920s to the 1950s, 'Eskimology' focused on the origins of Inuit culture and people, drawing prehistory and archaeology to its core. In 1920, 'Eskimology' was institutionalised at the University of Copenhagen when a teaching position was given to William Thalbitzer, an ethnographer and philologist, introducing ethnographic and philological approaches to the field.

In the 1950s to 1980s, 'Eskimology' transformed into 'Inuit studies'. The name 'Eskimo' was rejected by natives, as it was imposed upon the people by colonisers and is said to mean 'eaters of raw meat' in the language of Atlantic Inuit (though this is a matter of contention)[4]. The change in the discipline's very nomenclature 'reflects a shift in the status of the Inuit – being (tentatively) reversed from objects to subjects.'[5] The establishment of a Department of Eskimology in 1967 at the University of Copenhagen furthered the discipline's engagement with contemporary political developments in Greenland, such as indigenous rights. Inuit people are now 'acknowledged as co-producers of knowledge and project initiators' within the discipline.[5] The department also introduced discourse around cultural identity and ethnicity, taking a more anthropological approach.[3] These developments were driven by social pressure, including from within, by students and lecturers, as indigenous rights and decolonisation became increasingly integrated into mainstream consciousness.

However, the significant delay in the discipline’s institutionalisation and introduction of ethnographic, anthropological approaches may be symbolic of intentional silencing of Inuit culture and the colonial process of ‘cultural conversion’[1].


References

  1. Fienup-Riordan A. Eskimo Essays: Yup'ik Lives and How We See Them. Rutgers University Press; 1990. ISBN 978-0-8135-1589-2.
  2. Krupnik I. Early Inuit Studies: Themes and Transitions, 1850s-1980s. Smithsonian Institution Scholarly Press; 2016. ISBN 978-1-935623-71-7.
  3. Thuesen S. Eskimology. In: Nuttall M (ed.). Encyclopedia of the Arctic. Routledge; 2005. ISBN 978-1-136-78680-8.
  4. Parrott Z. 'Eskimo'. The Canadian Encyclopedia. 2008. Available from: https://www.thecanadianencyclopedia.ca/en/article/eskimo
  5. Pongerard J. Review of 'Early Inuit Studies: Themes and Transitions, 1850s-1980s', edited by Igor Krupnik. History of Anthropology Newsletter. 2018;42. Available from: https://histanthro.org/reviews/early-inuit-studies/

History of Digital Anthropology

Rise of Digital Anthropology

The development of online and digital phenomena such as social media, online politics, big data, search engines, and artificial intelligence opened a completely new chapter in the work of human scientists. The diversity and dynamism of internet environments led to a quick differentiation of cybercultures, which became a subject of scientific inquiry. As a result, the Digital Revolution gave birth to a number of new sub-disciplines, among which we can distinguish digital anthropology, which aims to understand how digital environments influence the way people communicate, relate, and interact with each other.

According to Philipp Budka and Manfred Kremser, the beginnings of digital anthropology can be traced to Arturo Escobar's 1994 article 'Welcome to Cyberia', published in the journal Current Anthropology[1]. In the article, Escobar formulated several fundamental questions regarding cybercultures and created basic guidelines for ethnographic research in cyberspace.[2]

Areas of Study

Despite the relatively short existence of the discipline, it is now possible to distinguish several main areas of anthropological investigation in digital spaces. Daniel Miller, in his article for the Cambridge Encyclopedia of Anthropology, characterised the following trends in digital studies[3]:

  1. The study of technologies themselves via specific populations such as hackers and creators, which focuses on the exploration of closed online communities.
    • Examples of such work are Gabriella Coleman's investigation of hacker cultures (2014)[4] and Tom Boellstorff's ethnography of the community of the online computer game Second Life (2008)[5].
  2. The study of the effect of ubiquitous digital platforms, such as social media, on ordinary populations, which focuses on producing traditional holistic ethnographies of people whose lives have been substantially influenced by digital technologies.
    • Examples of such work are Madianou and Miller's research on transnational communication between mothers and children (2012)[6] and Nicolescu's ethnography of Southern Italians, in which he explored the impact of social media platforms on the public sphere and people's interests in Southeast Italy (2016)[7].
  3. The study of digital technologies as anthropological methodology. The digital revolution implied changes within the discipline regarding the methods of gaining and gathering knowledge.
    • Access to new technology, e.g. the use of visual and audio recordings and downloads, introduced new ways of conducting anthropological investigation, such as online ethnographies, which are not subject to space and time constraints.

References

  1. Escobar A, Hess D, Licha I, Sibley W, Strathern M, Sutz J. Welcome to Cyberia: Notes on the Anthropology of Cyberculture [and Comments and Reply]. Current Anthropology. 1994;35(3):211–231. doi:10.1086/204266.
  2. Budka P. Cyber Anthropology – Anthropology of CyberCulture. Contemporary Issues in Socio-Cultural Anthropology. 2020. p. 2.
  3. Miller D. Digital Anthropology. Cambridge Encyclopedia of Anthropology. 2018. doi:10.29164/18digital.
  4. Coleman EG. Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous. London; 2014. ISBN 978-1-78168-583-9.
  5. Boellstorff T. Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton University Press; 2015. doi:10.2307/j.ctvc77h1s. ISBN 978-1-4008-7410-1.
  6. Madianou M, Miller D. Migration and New Media: Transnational Families and Polymedia. Abingdon, Oxon: Routledge; 2012. ISBN 978-0-415-67928-2.
  7. Nicolescu R. Social Media in Southeast Italy: Crafting Ideals. London: UCL Press; 2016. ISBN 978-1-910634-75-2.

History of Entrepreneurship

Early developments

The word 'entrepreneurship' has always been linked with the word 'entrepreneur', which comes from the thirteenth-century French verb 'entreprendre', meaning "to undertake". It was not until the 18th and 19th centuries, however, that the concept of entrepreneurship was used in academia, most notably by the economists Richard Cantillon, Jean-Baptiste Say and John Stuart Mill. Owing to its origins in classical economics, entrepreneurship was considered part of that field, being tightly connected with many economic concepts such as productivity, risk-taking and management.[1]

Entrepreneurship as a field of study

"In 1934, Schumpeter first identified entrepreneurs as distinct from business owners and managers."[2] Thanks to his account, entrepreneurship started to be considered necessary for economic growth and development. By the end of the 20th century, the body of theory behind entrepreneurship and the number of studies in the field[3] distinguished entrepreneurship as a discipline in its own right.[4] Consequently, today one can find entrepreneurship departments in many social science faculties and business schools around the world.[5]

Further developments

Entrepreneurship research has shifted from studying the effects of entrepreneurship in business to studying entrepreneurial processes in various contexts. While one still cannot disregard its strong ties to business fields[6], it can be argued that the study of entrepreneurship differs greatly from generic and small business management and no longer focuses only on the supply side or on the individual traits that drive business formation.[7] Some of its salient implications extend beyond the understanding of business and the functioning of complex organisations, to inequality, social mobility, transition economies, social networks, family and working life.[8]

References

  1. Entrepreneurship [Internet]. Econlib. Available from: https://www.econlib.org/library/Enc/Entrepreneurship.html
  2. Keister LA. Entrepreneurship. Emerald Group Publishing Limited; 2005. ISBN 9780762311910.
  3. Urban B. Entrepreneurship as a discipline and field of study. 2010.
  4. Ronstadt R, Vesper KH, McMullan WE. Entrepreneurship: Today Courses, Tomorrow Degrees? Entrepreneurship Theory and Practice. 1988;13(1):7–13.
  5. https://ent.ut.ac.ir/en/-/لیست-دانشکده-های-کارافرینی-در-دنیا
  6. Eryılmaz. Encyclopedia of Information Science and Technology. IGI Global; 2017.
  7. Thornton PH. The Sociology of Entrepreneurship. Annual Review of Sociology. 1999;25(1):19–46.
  8. Keister LA. Entrepreneurship. Emerald Group Publishing Limited; 2005. ISBN 9780762311910.

History of Ethics as a Branch of Philosophy[edit | edit source]

Ethics is a discipline that deals with questions of morality, of what is good and what is bad, and tries to establish a set of moral values: it "involves systematizing, defending, and recommending concepts of right and wrong behavior"[1]. The Ancient Greek word ēthos means "habitual character and disposition; moral character; habit, custom"[2]. The origin of ethics as a system of moral norms cannot be described in the same sense as the origin of, for example, science or philosophy. There was no particular point on a timeline when morality arose; morality was inherent in society, in one form or another, at all stages of its development. People living together were bound by different moral norms shaped by accepted beliefs, tendencies, and presuppositions.[citation needed]

In the Western tradition, philosophical reflection on morality is considered to have started with the Sophists in the fifth century B.C.E. The Sophists were teachers who taught a wide range of subjects, from philosophy and rhetoric to mathematics, focusing on arete, "virtue" or "excellence".[3] Socrates challenged the subjectivism of Sophist ethics: he held that ethical principles were universal and could be identified, examined, and improved within the individual[4]. Our knowledge of Socrates is derived mainly from Plato’s dialogues, such as The Republic and Gorgias[5]. Plato’s student Aristotle built his view of ethics upon his teacher’s beliefs, but with significant differences. Unlike Plato, he regarded not the good itself but a flourishing life (eudaimonia[6]) as the highest good, and held that eudaimonia can only be achieved through a virtuous life.

With the development of the discipline, philosophers started to distinguish between deontological ethics (from Greek: δέον, 'obligation, duty' + λόγος, 'study') and consequentialism: the first holds that the morality of an action is determined by the action itself, while the second focuses on the consequences or results of an action. In this sense, deontology is often associated with Immanuel Kant, who believed that ethical actions follow universal laws[7], while utilitarians, such as Jeremy Bentham and John Stuart Mill, held that the right action is the one that maximizes utility[8].

Ethics is now part of most degrees in philosophy; however, it is rarely offered as a separate degree.[citation needed]

References[edit | edit source]

  1. Ethics | Internet Encyclopedia of Philosophy [Internet]. Iep.utm.edu. 2020 [cited 19 October 2020]. Available from: https://iep.utm.edu/ethics/
  2. ethos | Origin and meaning of ethos by Online Etymology Dictionary [Internet]. Etymonline.com. 2020 [cited 19 October 2020]. Available from: https://www.etymonline.com/word/ethos?ref=etymonline_crossreferenc
  3. Liddell, H.G. & Scott, R. A Greek–English Lexicon, 9th ed. (Oxford, 1940), s.v.ἀρετή
  4. Parry R. Ancient Ethical Theory (Stanford Encyclopedia of Philosophy) [Internet]. Plato.stanford.edu. 2014 [cited 19 October 2020]. Available from: https://plato.stanford.edu/entries/ethics-ancient/#2
  5. Plato, Cooper J. Complete works. Indianapolis: Hackett; 2009.
  6. Duignan B. eudaimonia | Definition & Facts [Internet]. Encyclopedia Britannica. 2020 [cited 20 October 2020]. Available from: https://www.britannica.com/topic/eudaimonia
  7. Critique of Practical Reason - Wikisource, the free online library [Internet]. En.wikisource.org. [cited 20 October 2020]. Available from: https://en.wikisource.org/wiki/Critique_of_Practical_Reason
  8. Duignan B, R. West H. utilitarianism | Definition, Philosophy, Examples, & Facts [Internet]. Encyclopedia Britannica. 2020 [cited 20 October 2020]. Available from: https://www.britannica.com/topic/utilitarianism-philosophy

The History of Psychoanalysis[edit | edit source]

Psychoanalysis is a theory of the human mind and a method of psychotherapy[1] developed by the Austrian neurologist Sigmund Freud. It is based on the idea that there are conscious and subconscious parts of the mind, and that they interact with each other. Psychoanalytical therapy consists of one-on-one sessions with a psychoanalyst, during which the patient speaks freely of their past and present experiences, uncovering repressed subconscious thoughts which are discussed with the therapist.[2] Freud began theorizing psychoanalysis in the 1880s and 1890s after working with Josef Breuer on the treatment of hysteria under hypnosis.[3] Their collaboration ended, however, and for almost a decade Freud was the only person working on psychoanalysis[4]. He published The Interpretation of Dreams, considered to be his most important work, in 1899. In 1902, Freud was granted a professorship in neuropathology at the University of Vienna, which he held until his exile in 1938[5]. That same year, Freud also started the “Wednesday Psychological Society”, in which he met with 5 to 15 members every week to discuss psychoanalysis until about 1908, when the International Psychoanalytical Association was officially founded (even though an international psychoanalytical congress, the “First Congress for Freudian Psychology”, had already been held in 1907)[6]. Psychoanalysis was becoming a more widespread discipline and was slowly gaining acceptance in the academic realm, attracting more and more scholars and theorists: its concepts were even being used in other disciplines, such as the history of art.[7]

Criticisms of Psychoanalysis[edit | edit source]

But along with this growing prominence came many criticisms, such as those formulated by Karl Popper, an epistemologist who considered psychoanalysis a “pseudo-science” because its theories could not be proven or disproven.[8] Indeed, psychoanalytical methods were, despite Freud’s claims, not entirely scientific: psychoanalysts would often start with a theory and try to find an example in a clinical case, rather than starting with an observation and coming to conclusions empirically. As psychology shifted to a more scientific approach in the second half of the 20th century, psychoanalysis started losing its prestige.

Psychoanalysis in Higher Education[edit | edit source]

Today, some universities still offer courses and degrees relating to psychoanalysis, such as the University of Essex[9], the University of Oxford[10], University College London[11], University College Dublin[12], and the International University for Graduate Studies in Dominica[13]. Among these, only the last offers a doctorate in “Psychoanalysis”; the others use different names, such as “BA Psychosocial and Psychoanalytic Studies” at Essex or “MSc in Developmental Psychology and Clinical Practice” at UCL. The University of Oxford offers a course in “Psychodynamic Counselling”, but it is a Postgraduate Certificate and not a Master’s degree[14]. Some of these courses are approved by the IPA and allow students to become psychoanalysts, but nowadays psychoanalysis is mostly studied in relation to other disciplines. The Psychoanalysis Unit at UCL states: "Our mission is to break the mould of traditional approaches to psychoanalysis, taking inspiration from the discipline's ideas to meet the challenges of the modern world. Our interdisciplinary research applies psychoanalysis to contemporary issues, including mental and physical health, financial instability, gender, technology and the arts."[15]

The role of Psychoanalysis in Psychiatry[edit | edit source]

Despite these controversies, psychoanalysis continues to influence the field of psychiatry, with the NHS listing it as a possible treatment for depression[16], and it remains an important part of our collective psyche: Time magazine named Freud one of the 100 most influential people of the past century[17].

References[edit | edit source]

  1. Oxford University Press. Definition of Psychoanalysis; 2020. Available at: https://www.lexico.com/definition/psychoanalysis
  2. American Psychoanalytical Association. Psychoanalytic Theory & Approaches; 2020. Available at: https://apsa.org/content/psychoanalytic-theory-approaches.
  3. BBC. Historic Figures - Sigmund Freud; 2014. Available at: http://www.bbc.co.uk/history/historic_figures/freud_sigmund.shtml
  4. International Psychoanalytical Association. History of the IPA. Available at: https://www.ipa.world/IPA/en/IPA1/ipa_history/history_of_the_ipa.aspx
  5. Gay, P. Freud: A life of our time. 2nd ed. W W Norton & Co; 2006. pp 136-137
  6. International Psychoanalytical Association. History of the IPA. Available at: https://www.ipa.world/IPA/en/IPA1/ipa_history/history_of_the_ipa.aspx
  7. Adams L. S. Art and Psychoanalysis. New York: Icon editions; 1993.
  8. Popper K. Conjectures and refutations. New York (NY): Harper Torch; 1968
  9. University of Essex. BA Psychosocial and Psychoanalytic Studies; 2020. Available at: https://www.essex.ac.uk/courses/ug01035/1/ba-psychosocial-and-psychoanalytic-studies
  10. University of Oxford. PGCert in Psychodynamic Counselling; 2020. Available at: https://www.conted.ox.ac.uk/about/postgraduate-certificate-in-psychodynamic-counselling
  11. UCL Psychoanalysis Unit. Courses; 2020. Available at: https://www.ucl.ac.uk/drupal/site_psychoanalysis/node/3419/
  12. University College Dublin. MSc in Psychoanalytic Psychotherapy; 2020. Available at: https://www.ucd.ie/medicine/studywithus/graduatestudies/psychotherapy/mscpsychoanalyticpsychotherapy/
  13. International University for Graduate Studies. The Doctorate in Psychoanalysis; 2020. Available at: https://iugrad.edu.dm/psychoanalysis/
  14. University of Oxford. PGCert in Psychodynamic Counselling; 2020. Available at: https://www.conted.ox.ac.uk/about/postgraduate-certificate-in-psychodynamic-counselling
  15. UCL. Psychoanalysis; 2020. Available at: https://www.ucl.ac.uk/psychoanalysis/
  16. National Health Service. Treatment - Clinical Depression; 2019. Available at: https://www.nhs.uk/conditions/clinical-depression/treatment/
  17. Time Magazine. Time 100 Persons of the Century; 1999. Available at: http://content.time.com/time/magazine/article/0,9171,26473,00.html

The History of Parkour[edit | edit source]

Defining Parkour[edit | edit source]

Defining parkour is a complex task. Parkour is an acrobatic discipline, a visually striking sport consisting of jumping, running, rolling, climbing, and balancing in an urban environment[1]. As David Belle, one of its founders, puts it, parkour is a “type of freedom”[2]. The purpose of the sport is to move from one point to another as quickly as possible while expressing and developing one’s freedom. It is also an art form, in that its values and philosophy are built on teamwork, solidarity, fraternity, trust, rigorous physical training, and a mindset that pushes back the body’s boundaries.

The beginning[edit | edit source]

The predecessor of parkour was developed by Georges Hébert, a member of the French marines, who wrote a book on it. In this book, La Méthode Naturelle, he describes the physical fitness of the indigenous tribes he met in Africa. Hébert established a “natural method” consisting of activities such as running, climbing, and jumping, designed to make people physically fit[3]. An obstacle course for the French military was created based on Hébert’s writings. During the First Indochina War, a French soldier named Raymond Belle trained on this obstacle course for hours, and his stamina and physical aptitude were said to have surged because of it, enabling him to achieve exploits in the army and the fire service. It was his son, David Belle, originally a gymnast, who is considered the pioneer of the parkour discipline.

The 1980s[edit | edit source]

The English word parkour is derived from the French word parcours, meaning pathway. In a documented interview, David Belle recalled asking his father, Raymond Belle, to characterise what the word meant to him. Belle Senior replied: ‘Parcours is like in life, you have obstacles and you train to overcome them, you search for the best technique, you try all the techniques, you keep doing your best, you repeat it and then you get better.’ From this point on, David Belle and fellow practitioners formed a group named ‘Yamakasi’, which means “strong spirit” in Lingala, and the discipline was officially practised as ‘parkour’[4].

Parkour today[edit | edit source]

Today, there are parkour associations and groups internationally. Its popularity has risen significantly, with many people around the world, particularly among the younger generations, learning about the discipline. The International Parkour Federation (or World Freerunning and Parkour Federation), established in 2014, also encourages the teaching of parkour in war zones and underprivileged areas, to bring communities together through physical activity and the teaching of some of parkour's values. It is also the WFPF that licenses parkour teachers.[5]

The original philosophical construct and mindset of the practice, with its hard training, stamina, and brotherhood, now tend to be more anecdotal, as the vast majority of practitioners consider parkour a sport or hobby rather than an art form[6]. In other words, parkour today is more of an acrobatic sport than a discipline, which is what it was considered to be in the 1980s and 1990s under the Yamakasi group's training and mindset.

References[edit | edit source]

  1. Mould, O., 2009. Parkour, the City, the Event. Environment and Planning D: Society and Space, 27(4), pp.738-750, DOI: 10.1068/d11108
  2. Generation Yamakasi, 2006, France. A documentary directed by M. Daniels and produced by Philippe Alfonsi & Bruno Girard.
  3. Angel, J. (2011). Ciné Parkour: A cinematic and theoretical contribution to the understanding of the practice of parkour. Accessed on 23 October 2020, from http://bura.brunel.ac.uk/handle/2438/6119
  4. Edwardes, D. (2020). Parkour History | Parkour Generations. Available from 6 April 2018, accessed on 23 October 2020, from https://parkourgenerations.com/parkour-history
  5. WFPF. Peace through Parkour. [Accessed 10 November 2020] Available at: https://internationalparkourfederation.org/peace-through-parkour/
  6. Alister O'Loughlin (2012). A door for creativity – art and competition in parkour, Theatre, Dance and Performance Training, 3:2, 192-198, DOI: 10.1080/19443927.2012.689131