Amputation

In this post by our Digitisation Project Intern, we look at our amputation instruments, while referring to the work of Maister Peter Lowe, College founder and 16th century surgeon.

The surgical procedure of an amputation involves the removal of a section of a limb of the body. The volume of tissue removed from the body depends on a variety of factors, including the severity of the patient’s condition.


Woodcut illustration, 2nd ed. of Lowe’s Chirurgerie (1612)

 

It is uncertain how long amputations have been a regular form of surgical treatment; however, the term can be traced back to the 16th century. For example, Peter Lowe uses the term “amputation” when describing how to treat a gangrenous limb in his 1597 work The Whole Course of Chirurgerie [1]. Here he explains how the operation should be carried out, referencing the works of previous scholars:

“The judgements are, that it is for the most part incurable, and the patient will die in a cold sweat. The cure, in so much as may be, consists only in amputation of the member, which shall be done in this manner, for the patient must first be told of the danger, because often death ensues, as you have heard, either from apprehension, weakness, or loss of blood.”

It has only been within the last 170 years that amputations, and surgical procedures in general, have been performed in a safe manner, e.g. with the patient under anaesthesia. Prior to this, the limb was removed as quickly as possible. A successful and speedy amputation required precision, strength, skill, and a steady hand, as well as a set of sharp amputation instruments!


Mid 19th century amputation set

 

Within the museum collection are examples of amputation sets from the 1800s and early 1900s.

Several components make up a set, from trephine heads to amputation saws to tourniquets. Each instrument would be used at a different stage of the surgical procedure. Let’s take a look at how a lower limb amputation would be performed.

First of all, the patient would be prepped for the surgery. In the days before pain relief, alcohol was used to calm the nerves: the patient would be given some rum or whisky, and then wheeled into the surgical theatre. Most likely the theatre would be arranged with the operating table in the centre of the room, surrounded by rows and rows of stands for spectators. Spectators would include the students of the chief surgeon involved in the procedure: not only was this a surgical operation, it was also a lesson. Once the patient was placed on the operating table, the chief surgeon would enter the theatre and the operation would commence.

One of the major dangers of amputating a limb is blood loss. Several blood vessels must be carefully secured during the procedure in order to limit haemorrhaging [1]. To enable the surgeon to operate on a bloodless area of the body, a tourniquet was applied proximal to the site of amputation (a couple of inches above the site of incision).

“The use of the ribband is diverse. First it holds the member hard, that the instrument may curve more surely. Secondly, that the feeling of the whole part is stupefied and rendered insensible. Thirdly, the flow of blood is stopped by it. Fourthly, it holds up the skin and muscles, which cover the bone after it is loosed, and so makes it easier to heal.”[1]


Example of a tourniquet from an amputation set

 

The tourniquet would have been tightened in order to restrict blood flow and reduce haemorrhaging. It would also have reduced sensation in the limb, providing slight pain relief. However, it also meant that the limb’s oxygen supply was restricted – another reason why amputations were performed as quickly as possible.


The initial incision would have been made with a sharp amputation knife. Amputation knives evolved in shape over the years, from a curved blade to a straight blade. Peter Lowe comments on the use of a curved blade for the procedure:

“…we cut the flesh with a razor or knife, that is somewhat crooked like a hook…”[1]

The blade was curved so that it could easily cut in a circular manner around the bone (see the image from Lowe’s book above) [2]. Amputation blades became straighter as the incision technique evolved. An example of a straight amputation knife is the Liston knife. This straight, sharp-bladed knife was named after the Scottish surgeon Robert Liston, who is best known as the first surgeon in Europe to perform an amputation with the patient under anaesthesia [3].


Liston knife, mid 19th century

 

The straight blades enabled the surgeon to dissect more precisely in order to form the flap of skin and muscle that would become the new limb stump.

As one can imagine, bone tissue could not easily be cut with an amputation knife. Instead, an amputation saw was required to separate the bone. Amputation saws were similar to those found in carpentry, with sharp teeth that bit into the bone tissue for a quick procedure.


Amputation saw, mid 19th century

 

Aside from the major dissecting tools, there are more specialised instruments within an amputation set that we must consider. One of the main risks of an amputation operation was death by haemorrhaging. For years, the letting of blood had been used to treat certain ailments according to the ancient teaching of the “Four Humours”. In a surgical procedure, however, major blood loss was something to be avoided. To prevent haemorrhage from dissected vessels, the surgeon would use a ligature to tie off each vessel and stop its blood flow. This technique was pioneered by the French surgeon Ambroise Paré during the 1500s [4].

Found within our amputation sets are trephine heads with accompanying handles. Rather than being used during an amputation procedure, trephine heads were used to drill into the skull to treat conditions by relieving intracranial pressure. Nowadays, access to the brain via the skull is achieved with the use of electric drills.


Trephine, mid 19th century

 

Amputation procedures have changed dramatically since the days before anaesthesia and antiseptics, but the risks have remained. Blood loss, sepsis, and infection can still occur today; thankfully, their likelihood is much lower than it was 170 years ago.

References

  1. Lowe, P., 1597. The Whole Course of Chirurgerie.
  2. Science Museum, 2016. Amputation Knife, Germany, 1701-1800. Brought to Life: Exploring the History of Medicine. [online] Available at: http://www.sciencemuseum.org.uk/broughttolife/objects/display?id=5510
  3. Liston, R., 1847. To the Editor. The Lancet, 1, p. 8.
  4. Hernigou, P., 2013. Ambroise Paré II: Paré’s contribution to amputation and ligature. International Orthopaedics, 37(4), pp. 769-772.

Glasses through the Ages: What’s Your Style?

Our digitisation intern, Kirsty Earley, takes a look at some of the fascinating items in our spectacles collection.

Eric Morecambe, John Lennon, Harry Potter – all figures who made glasses iconic. Scores of celebrities (and fictional characters!) have had an impact on the popularity of certain styles of glasses, a phenomenon largely unique to the 20th century. From monocles to aviators, the styles of glasses and frames available today are as varied as ever. What were once simply aids for magnifying written text can now be worn as a fashion accessory. And as with any item of fashion, there have been clear trends in eyewear throughout history.


Folding pince-nez style spectacles

It is uncertain who invented spectacles, but it is known that they originated in Italy between the 13th and 14th centuries [1]. Originally, glasses did not have sides that hooked over the ears [2], but were instead held on the bridge of the nose. This style was known as the Rivet Spectacle (Figure 1). Due to the lack of support, it wasn’t uncommon for the wearer to have to hold their head in a position where the glasses wouldn’t fall off [3]!


Figure 1: Illustration of Rivet Spectacles

From the rivet emerged the Scissor Spectacles, originally manufactured during the 1700s [4]. Named for their resemblance to a pair of scissors, these glasses were mounted on a handle for easier use. Understandably, they were not for constant wear, but rather for occasional viewing. Their design led the way for a style of spectacle popular amongst opera fans – the Lorgnette.

The lorgnette was invented in 1770 by George Adams [3]. Although inspired by the design of the scissor spectacles, the lorgnette differs in that the handle is attached directly to one lens rather than to both (Figure 2).


Figure 2: Lorgnettes

The example of a lorgnette in the College museum collection contains a spring mechanism, making it easier to carry (Figure 3).


Figure 3: The lorgnettes had a spring mechanism which enabled the wearer to fold them away when not needed.

Another style of spectacle held within the museum collection is the Pince-Nez (Figure 4). Literally meaning “pinch the nose”, pince-nez glasses were famously worn by President Theodore Roosevelt. Although lacking sides, the pince-nez stayed in place through the pinch of its bridge on the nose, and could be saved from loss or damage by an ear chain secured to one side.


Figure 4: Pince-nez

Pince-nez remained in production well into the 20th century, and are still worn by some people today.

Frames with sides passing over the ears had been around since the early 18th century, but the style became increasingly popular during the 20th century. By this time, plastic as well as metal frames had become available for purchase, and these tended to be much more durable and comfortable.


Wire framed spectacles

After the founding of the National Health Service in 1948, members of the public could receive free eye tests and claim a free pair of glasses through the NHS [3]. Although NHS spectacles were not the most aesthetically pleasing, the number of people wearing glasses rose considerably. This ultimately led to glasses being seen as a fashion accessory rather than a sign of poor eyesight. Indeed, today the style and colour of glasses worn by an individual can reflect part of their personality, their identity.

Although the primary function of glasses in aiding eyesight has not changed over the years, the styles and designs have. There is now greater choice than ever, with even more emphasis on how the glasses look rather than what they do.

References

  1. Stein, H.A., Stein, R.M., and Freeman, M.I., 2012. The Ophthalmic Assistant: A Text for Allied and Associated Ophthalmic Personnel. Elsevier Health Sciences: China.
  2. The College of Optometrists, 2016. A bit on the side – The development of spectacle sides. [online] Available at: http://www.college-optometrists.org/en/college/museyeum/online_exhibitions/spectacles/side.cfm.
  3. The College of Optometrists, 2016. Rivet Spectacles. [online] Available at: http://www.college-optometrists.org/en/college/museyeum/online_exhibitions/spectacles/rivet.cfm.
  4. American Academy of Ophthalmology, 2016. Museum of Vision: Spectacles 1700s. [online] Available at: http://www.museumofvision.org/collection/sets/?key=26.

Update (22nd December 2016): Many thanks to Neil Handley (@neilhandleyuk), curator of the British Optical Association Museum at the College of Optometrists, for lending his expertise to correct some inaccuracies in an earlier version of this post.

Helping others

Latest blog post from our digitisation intern, Kirsty Earley.

A recent focus of research for the digitisation project has been a medical bag dating from the 1930s. More specifically, the focus has been on the doctor who owned the medical bag, Dr. Maud Perry Menzies.


Medical bag of Dr Maud Perry Menzies

Dr. Menzies earned her medical degree at the University of Glasgow, graduating in 1934. Not only was she one of the few women graduating in medicine at that time, she was also the top-ranking student in Surgery, receiving the Sir William Macewen Medal for her efforts.

Menzies had a passion for helping and healing members of the public, which was evident in her work as a general practitioner and a medical officer [1]. The outbreak of war led to the spread of many infectious diseases across Europe, including diphtheria, an airborne disease caused by the bacterium Corynebacterium diphtheriae. An indicator of infection is the appearance of a grey membrane at the back of the throat, which can lead to breathing and eating problems.

Due to its mode of transmission, diphtheria was particularly prevalent in major cities such as Glasgow. The child death rate in Glasgow was the highest in Europe at the time, primarily due to the crowded and unsanitary conditions of the slums [2]. Thankfully, the vaccine for diphtheria had been in use since the 1920s, so action could be taken to prevent the spread of the disease. The vaccination would have been administered by intramuscular injection using a syringe and hypodermic needle, such as the one pictured below (Fig 1).


Fig 1: Hypodermic needle

Other infectious diseases would have required multi-puncture vaccinations, with several needles puncturing the skin simultaneously. This was accomplished using vaccinators (Fig 2). The needles were dipped in the vaccine, which entered the patient’s skin as the needles punctured it.


Fig 2: Vaccinator

As an assistant medical officer of health, Dr. Menzies launched an immunisation campaign for diphtheria in Rutherglen. She also went on to serve with the RAMC during the European Campaign of the Second World War, returning to Glasgow to become the principal medical officer for the school health service [1]. Such was her drive to help others.

References
1. Dunn, M., and Wilson, T.S. 1997. Obituaries: Maud Perry Menzies. The British Medical Journal, 2, 433.
2. Reid, E., 2014. The lesson my father taught me. Herald Scotland [online]. Available at: http://www.heraldscotland.com/opinion/13168642.The_lessons_my_father_taught_me/

Seeing the Invisible: Microscope Collection

Latest blog post from our digitisation intern, Kirsty Earley.

Today, there are a variety of methods that enable us to visualize objects of microscopic proportions, from electron microscopes to light microscopes. However, the physical mechanisms of magnification were once a mystery to the human race.

Thousands of years ago, it was understood that water affected the view of an object. This was due to the manner in which water bent light, a concept known as refraction. Centuries later, the philosopher Roger Bacon described the magnifying properties of lenses [1]. His major work Opus Majus was a milestone in the field of optics, and the first optical microscope was developed in the 16th century.

Within the College’s museum collection are several types of microscopes from the 18th to 20th centuries. Designs vary, reflecting the progression and improvement of microscope technology. The Wilson-type microscope was designed by James Wilson in 1702, not as a replacement for other microscopes, but simply as an alternative magnification tool [2].


Wilson-type microscope

Samples to be examined were placed onto a slide, and lenses of different magnification strengths could be fitted. The position of the eyepiece could then be adjusted by a screw mechanism, allowing the viewer to see different components of the target object more clearly.


Wilson-type microscope

Also within the collection is a Culpeper-style microscope (1725), whose design is not dissimilar to a Galilean microscope. Edmund Culpeper was an English instrument maker working in the late 17th and early 18th centuries. Although he had made simple microscopes before, his signature design was a compound microscope with a tripod stand [3]. The instrument was so popular that it continued to be manufactured for the next century [4].


Culpeper-type microscope

The College has many resources on the life and works of Lord Lister, the pioneer of antiseptic surgery, but it also contains an example of his father’s work. Pictured below is an achromatic microscope manufactured by Andrew Pritchard, an optician and instrument maker of the mid-1800s. Joseph Jackson Lister, Lord Lister’s father, was a wine merchant with an interest in the study of optics [4]. His creation of a more accurate achromatic lens allowed for higher-resolution viewing, and earned him a fellowship of the Royal Society. Achromatic lenses focus light of different wavelengths in the same plane, producing a sharper microscopic image. This development in microscope technology was truly revolutionary [5].


Achromatic microscope manufactured by Andrew Pritchard

The final example of a microscope within the College collection is a monocular microscope from the 1900s. This microscope is most similar in design to those seen in laboratories today, although many of those are now binocular. It has a stage onto which a microscope slide is mounted, kept in place by two pegs on either side. Light is directed up through the specimen by a mirror at the base, which reflects the light of its surroundings. Unlike the other microscopes, this model has a simple switch mechanism that allows the magnification to be altered between 2/3” and 1/6”.


Monocular microscope c1900

Before the invention of the microscope, the only observations of the body were those visible to the human eye. However, under the microscope a whole new world was discovered.

References
1. Bacon, R., 1267. Opus Majus.
2. Wilson, J., 1702. The description and manner of using a late-invented set of small pocket microscopes, made by James Wilson; which with great ease are apply’d in viewing opake, transparent and liquid objects: as the farina of the flowers of plants etc. The circulation of blood in living creatures etc. The animalcula in semine, etc. Philosophical Transactions of the Royal Society, 23, pp. 1241-1247.
  3. Clay, R.S., and Court, T.H., 1925. The development of the Culpeper microscope. Journal of the Royal Microscopical Society, 45(2), pp. 167-173.
4. Allen, E., and Turk, J.L., 1982. Microscopes in the Hunterian Museum. Annals of the Royal College of Surgeons of England, 64(6), pp. 414-418.
5. Bracegirdle, B., 1977. J.J. Lister and the establishment of histology. Medical History, 21(2), pp. 187-191.

The semi-flexible gastroscope

In her latest blog post, Digitisation Project Intern Kirsty Earley looks at the technology behind a mid 20th century gastroscope.

Gastroscopy and endoscopy developed over the course of the 19th century. Philipp Bozzini, in the early 1800s, is regarded as the first to have attempted to see inside the body using a light source – at that stage, candlelight and mirrors. The use of electric light in the later 19th century advanced the procedure, and in 1868 Adolph Kussmaul tested a rigid gastroscope on a sword-swallower to establish a straight line from mouth to stomach.


Rigid gastroscope in Mayer & Meltzer catalogue, c1914

Prior to any form of recording technology, visualization of the gastrointestinal tract could only be achieved via rigid gastroscopes. These were essentially long telescopes through which the physician could view the inside of the patient’s stomach (see the illustrations above and below).

Illustration of a rigid gastroscope

Because of this lack of flexibility, the patient had to be positioned so that the gastroscope could simply slide down the oesophagus towards the stomach. It would then be rotated to visualize all areas of the stomach. Not the easiest of procedures. For gastroscopy to advance, something had to be done about the gastroscope itself.

Rudolf Schindler (1888-1968) was a German doctor who specialised in gastroenterology. Considered the “father of gastroscopy”, Schindler made incredible efforts to promote the use of gastroscopy as a diagnostic technique for gastrointestinal conditions [1].

Schindler was the brains behind the first ever semi-flexible gastroscope, created in 1931 [2]. He constructed the gastroscope in such a manner that the distal end could be rotated, while the proximal end remained stationary (see image below). This allowed easier access to all areas of the stomach. But how did he test his design? Often, his instruments were tested on his own children, especially his daughter Ursula as she had a strong gag reflex [3].


One of our mid 20th century gastroscopes

To ensure that procedures were being carried out safely, Schindler trained practitioners in how to use his gastroscope as a diagnostic tool. He argued for many years that gastroscopy should not become a specialised field of medicine, but an examination technique performed by any level of practitioner.


Detail of mid 20th century gastroscope

Ultimately, the semi-flexible gastroscope was replaced by fibreoptic endoscopes [4]. Instead of only a flexible distal end, the entire length of the fibreoptic endoscope was flexible. This allowed the patient to adopt a more natural position, e.g. sitting up, during the examination [5].

Gastroscopy today involves examining components of the gastrointestinal system by inserting a wire-like endoscope down the patient’s throat. The endoscope contains a camera and light, and is controlled by the physician performing the examination. The images from the camera are then fed to a monitor screen for visualization.

References

  1. Gerstner, P., 1991. The American Society for Gastrointestinal Endoscopy: a history. Gastrointestinal Endoscopy, 37(2).
  2. Olympus, date unknown. Olympus History: VOL 1 The Origin of Endoscopes. [online] Available at: http://www.olympus-global.com/en/corc/history/story/endo/origin/.
  3. Schindler Gibson, U., 1988. Rudolf Schindler, MD: living with a Renaissance man. Gastrointestinal Endoscopy, 34(5).
  4. DiMarino, A.J., and Benjamin, S.B., 2002. Gastrointestinal Disease: An Endoscopic Approach. Slack Incorporated: New Jersey.
  5. Hirschowitz, B., 1961. Endoscopic Examination of the Stomach and Duodenal Cap with the Fiberscope. The Lancet, 277(7186).

Dr Harry R. Lillie

We recently received an unusual donation, and one that holds an incredible story. A medical bag belonging to Dr Harry R. Lillie was generously given to the College, along with a copy of his book The Path through Penguin City (1955). In this blog post our Digitisation Project Intern Kirsty Earley explains its significance.


Dr Lillie’s medical bag

 

Dr Harry Russell Lillie was a surgeon and medical officer aboard British whaling ships in the Antarctic during the 1940s. Originally from Dundee, Lillie received his MB ChB from the University of St Andrews in 1939, having previously graduated with a BSc in Engineering in 1926.


Dr Lillie’s Baumanometer

He began his career at sea during the whaling season of 1946-1947. Serving up to 600 sailors at a time, Lillie put his surgical skills to good use [1]. Life at sea was always busy, and certainly not a 9-5 job. Surgeons and medical officers had to be ready to deal not only with common illnesses contracted at sea, but also with the severe injuries of the whaling profession. It wasn’t unheard of for sailors to find themselves inside the mouth of the whale they were trying to hunt:

“Trapped with only his boots sticking out as the jaws came together, he got off with a moderately crushed chest and emphysema from the neck to the waist, but was back on his job in six weeks.” [1]


Dr Lillie’s surgical kit

As well as exercising his medical skills, Lillie was able to observe the conditions and methods of whaling in the Antarctic. Whales have been hunted since prehistoric times; however, the reasons for hunting them have changed over time. Whales have been targeted as a food source by some communities, as well as being killed for oil and blubber.

The tools used to kill whales have evolved over the years. Lillie describes in detail the specific methods sailors used to take down their prey, and, true scientist that he was, didn’t leave out any details. “Explosive harpoons” were used instead of the standard iron harpoons of earlier years. These had a delayed-action mechanism: the spear pierced the whale’s tissue, and the implanted grenade exploded a few seconds later. As would be expected with such a large mammal, death wasn’t immediate; the whale often took several hours to die, even after more than one harpoon had been fired.

Such scenes were the cause of Lillie’s campaigning for new whaling laws. He reported the horrific methods used to kill whales to make a clear point: things had to change. And things did change. His book The Path through Penguin City was published in 1955 and remains one of the most influential books in whale conservation. In it he uses vivid imagery to convey just how horrific whaling was:

“If we can imagine a horse having two or three explosive spears stuck in its stomach and being made to pull a butcher’s truck through the streets of London while it pours blood into the gutter, we shall have an idea of the method of killing. The gunners themselves admit that if whales could scream the industry would stop, for nobody would be able to stand it.” [2]

This work influenced the efforts of several conservation groups, including the International Whaling Commission [3]. In fact, Sir David Attenborough has quoted Lillie’s work when discussing the inhumane whaling methods still in use today [4].

With such an interesting background, it is safe to say that there is still much to discover about H.R. Lillie and his work as both surgeon and conservationist.

References

  1. Lillie, H.R., 1949. With whales and seals. The British Medical Journal, 2(4642), p.1467-1468.
  2. Lillie, H.R., 1955. The Path through Penguin City. Benn Publishers.
  3. Society for the Advancement of Animal Wellbeing. Whaling. Available at: http://www.saawinternational.org/whaling.htm.
  4. Kirby, A., 2004. Whaling too cruel to continue. BBC News. [online] Available at: http://news.bbc.co.uk/1/hi/sci/tech/3542987.stm.

Cause of Death?

Latest update on our Uncovering our Medical Instruments project by our Digitisation intern, Kirsty Earley.

Death is an unfortunate certainty for us all, but how people die often differs. Sometimes it is even a mystery. Mysterious deaths are not only found in crime novels, but also in real life, and it is the job of pathologists to solve these mysteries.

The process by which pathologists determine the cause of death is known as a post-mortem examination, or autopsy. They examine every inch of the body for any clues as to how the individual died. As would be expected, the examination involves several stages, with several different techniques used to investigate the tissue. Tissue is sampled and checked for any abnormalities, and the pathologist may also test for any poisons present in the victim’s system.

 


Wooden case containing a 20th century post-mortem kit.

 

The findings can then be used to assist in a court case, e.g. to convict a suspected murderer. Biological evidence like this, along with DNA profiling, gives the court a strong indication of what happened to the victim, and of whether to convict or release the suspect.

This is clearly a combination of medicine and law working together to discover the truth. However, this relationship hasn’t always existed. There was a time when post-mortem examinations were rarely carried out by medical professionals, but instead by those with some form of legal background. This all changed due to the efforts of one man: Thomas Wakley.

Thomas Wakley (1795-1862) was a surgeon based in London and also a coroner for Middlesex [1]. During the 1800s, the care of employees was much more relaxed, and workers in industrial environments were at high risk of injury, even death. For example, when the railway lines were first built, many of the men constructing them died while on shift, and this death rate increased over time. Hence, Wakley took it upon himself to campaign for medical coronerships. This would mean that coroners would need some form of medical training, ensuring that the cause of death would be investigated on both a legal and a medical level, and hopefully reducing the number of deaths.


Scalpel blades (c.1900) used in post-mortem examinations

Unfortunately, Wakley did not live to see the day his fight was won. It was not until 1926 that the Coroners (Amendment) Act was passed, making it compulsory for coroners to be legally or medically trained [2]. This is still the case today: coroners must be either qualified lawyers or doctors with years of previous experience [3].

An example of a post-mortem examination kit was found within the collection of medical instruments here at the College. It contains an array of instruments used during a post-mortem examination, including a bone chisel, a cartilage knife, and a solid steel saw. This particular kit dates from around 1900; it is uncertain who it belonged to, or whether they were trained in medicine.

Hooks and papers contained in the post-mortem kit

Despite the debate over exactly who should investigate the cause of death, the role of the coroner remained of vital importance in bringing justice to the courtroom.

References
1. Cawthon, E.A., 2004. Medicine on Trial: A Handbook with Cases, Laws, and Documents. ABC-CLIO: California.
2. Sprigge, S., and Morland, E., 1926. House of Lords: Coroners Bill. The Lancet, 1, p. 630.
3. http://www.inputyouth.co.uk/jobguides/job-coroner.html