This morning I read a tweet that made me stop and think, “Wait… what?”
The word that got my attention was ‘bivouacked’. Despite the fact that I am a passionate reader and a scholar and teacher of History, I had no idea what this word meant. Obviously, I wasn’t the only one: plenty of people responded that they had to look the word up.
My trusty Macquarie Dictionary gave me the definition.
Etymonline explains that the use of bivouac in English dates back to 1702, meaning an “encampment of soldiers that stays up on night watch in the open air, dressed and armed.”
It is an image of readiness to defend and protect, which was exactly the context of the tweet. These images of bivouacked soldiers in the Capitol building, Washington DC, are confronting and comforting at the same time. That it is even necessary is heartbreaking, yet in the current political climate, I am thankful they are there.
The word came from French, and before that from the 17th century Swiss/Alsatian word ‘biwacht’ which meant “night guard”.
By 1853, bivouac was also used as a noun to mean an outdoor or open-air camp.
The verb ‘to bivouac’, meaning to post troops in the night, dates to 1809; its sense of camping or sleeping out of doors without tents dates to 1814. It should be no surprise that the noun became a verb in the context of the Napoleonic Wars and the War of 1812, during both of which the practice would have been common.
Freedom of speech is a human right. It is the right to express one’s ideas and opinions verbally or in writing, either publicly or privately. It is the right to engage in public conversation about personal and public issues and events. It is the right to communicate meaningfully with other people.
Even so, it has its ethical limitations.
All individuals have freedom of speech. It is not just the domain of one person, or one group. This means that the right is also accompanied by the responsibility of listening to, and responding thoughtfully to, the ideas and opinions of others. Freedom of speech is a two-way street.
It is not the right to cause harm or injury to other people. It is not the right to incite violence. It is not the right to abuse, slander, or misrepresent situations or other people. It is not the right to spread dangerous disinformation. It is not the right to break the law or commonly accepted rules.
The people decrying Twitter and Facebook for banning Trump need to understand these things.
When he opened his social media accounts, he agreed to the terms and conditions. Nobody can have those accounts without agreeing to those rules, which clearly state that one cannot use that social media platform to break the law or encourage anyone else to do so. There is a clearly stated warning that infringement of those rules will result in your account being suspended or cancelled.
There is no doubt that these are the rules invoked when the accounts belonging to a range of criminals and terrorists were cancelled in the past. People and governments actively and rightly demanded that this should be the case in response to the manifesto and live streaming of the actions of the Christchurch mosque terrorist, for example.
In many jurisdictions, it is illegal to use social media to promote illegal activity or to post certain kinds of offensive material.
Why, then, should Trump not be banned for inciting a riot or encouraging sedition? Why should his followers not be banned for plotting violence and premeditating murder and insurrection?
The clear answer is that they absolutely should.
Anyone using social media to plan or conduct a criminal act should be banned and then prosecuted to the full extent of the law.
Twitter, Facebook, and Instagram have acted rightly. They have not assaulted anyone’s free speech. It is not censorship. Those on the quiet end of a ban have invited that consequence for themselves.
A Few Home Truths About #FreedomOfSpeech #Rights2021 #SocialMedia
I love tinsel. It’s so glittery and cheerful and colourful. It’s instant Christmas decoration that you can pull out of a bag and strew around the room and it immediately feels more like December.
Tinsel seems like a fairly recent invention, and in its current form, it is. Its history, though, goes back some four hundred years to the very fine strands of hammered silver used in Nuremberg, Germany, in the early 1600s. At first, it was used more often to decorate sculptures or statues than trees, but its ability to sparkle and magnify the light from the candles used to illuminate Christmas trees caused its popularity to grow.
Flawed by both brittleness and tarnish, early types of tinsel were nowhere near as hardy or long-lasting as what we have now. Over time, various other tinsel-like decorations were made using various different shiny or sparkly materials: silver or gold thread, or pieces of shiny fabric, and foil made from lead, copper or aluminium. During the 20th century, the advent of plastics made production of what we now know as tinsel cheaper and easier, while the dangers of other more flammable or toxic materials caused them to decrease in popularity.
The word tinsel dates back to the mid-1400s when it was used to describe cloth with gold or silver thread woven through it.
It is this sense of the word that is used by Shakespeare in ‘Much Ado About Nothing’, where Margaret describes Hero’s fine wedding gown as being enhanced with a “bluish tinsel”.
The word came from Old French estencele, or estincelle (the es- was not pronounced), which meant ‘sparkle’ or ‘spangle’. From the 1590s onwards, tinsel was the name given to very thin sheets, strips or strands of shiny metal or fabric. The Old French word is related to the Latin scintilla, meaning ‘spark’, which in turn most likely came from the PIE root *ski-nto-, from which English also gets ‘shine’ and ‘scintillate’. It is also related to ‘stencil’.
By the mid-17th century, tinsel was also used in a non-literal sense to mean something showy or shiny but of no real value.
I recently heard someone insisting that there was a difference between Christmas carols, which were all about baby Jesus and the angels, the star and the wise men, and Christmas songs, such as Jingle Bells or Rudolph the Red Nosed Reindeer.
It sounded like a feasible explanation, and the guy put up what seemed like a good argument, mostly due to his confidence and the underlying implication that he knew more about it than anyone else. (See malapert and ultracrepidarian.)
That’s what triggered me to research the question. I confess it was more out of my desire to possibly prove him wrong than to actually know the answer that I took out my phone and searched Etymonline for ‘carol’. To my delight, he was wrong! It does seem to be a popular belief, but it’s not consistent with the etymology of the word carol.
Carol is a very old word that dates back to about 1300 in both its noun and verb forms.
At this time, the noun meant both a joyful song and a form of dance in a circle or ring. Both of these meanings probably came from the Old French word carole that referred to that kind of circular dance, which was sometimes accompanied by singers. The origins of the word before that are unclear, but it certainly does paint a festive picture.
It wasn’t until about 1500 AD – two centuries later – that the word had also come to refer to a hymn or song of joy sung at Christmas. Thus, the religious connotations of the word came much later than the secular meaning.
The verb form to carol first meant to dance in a ring or circular formation. The sense of the word that meant to sing with joy or celebration had developed by the late 14th century.
The verb carol did not mean to sing Christmas songs, often moving from place to place to do so, until the late 1800s. It seems, though, that carolling itself is a much older tradition, one outlawed in Britain, along with the celebration of Christmas itself, by the Puritans who governed in the mid-1600s.
So, Christmas songs are called carols because of their festive and joyful nature. Given that (a) the word was originally far more specific about the type of dance than about the songs being sung, other than that they were joyful, and (b) Jingle Bells and Rudolph are as festive in their own ways as Hark the Herald Angels Sing or Joy to the World, there is no reason to classify them differently. They’re all Christmas carols, and that’s that.
Today, November 11th, is Remembrance Day. It’s also called Armistice Day.
It is a day of remembrance of the fact that at the 11th hour of the 11th day of the 11th month, November 11th, 1918, the Armistice that brought an end to World War I was signed.
Back then, they called World War I ‘The Great War’ and ‘The War to End All Wars’. It was called ‘The Great War’ because of its size and scale, not because it was good in any way. And, being the overachievers that humans are, we have since proven that it didn’t prevent any further wars at all.
Today is a day for acknowledging the devastation, loss of life, and tragedy of the war not just for Australia, or the Allies, but for all the nations involved.
It is a day for remembering the fallen soldiers, and those who came back broken and maimed. It is a day for remembering those who mourned them.
It is a day for giving thanks for their legacy.
The soldiers’ commitment to fighting was anything but selfish: they fought for their country. Their service and sacrifice were for the sake of defending and preserving our freedoms.
Today, let us contemplate the horrors of war and how we can avoid them in the future. Let us reflect on those who gave their lives in loyal service of their country.
Lest we forget.
Remembrance Day #RemembranceDay #LestWeForget #blogpost
The practice of leaving a preposition at the end of a sentence, often referred to as preposition stranding, has long been considered to be “against the rules”. Generations of teachers and grammarians have condemned it as a grammatical taboo.
That isolated, lonely preposition, separated from its noun, is known as a terminal preposition, and may also be described as dangling, hanging or stranded.
Albeit with the best of intentions, this rule was drummed into me as a child, so I simply accepted it and tried to avoid stranding prepositions in whatever I wrote.
As I got older, though, I came to realise that it’s something we do very naturally in speaking. In fact, avoiding it in spoken English can make what one is saying seem very formal and stilted.
When I was in high school, one of my History teachers told us a story about one of Winston Churchill’s famous comebacks. On receiving a correction about finishing a sentence with a preposition in the draft of a speech, he responded, “This is nonsense, up with which I shall not put.”
As it turned out, it probably wasn’t Churchill who first made the joke; I don’t know if he ever did. The line has been attributed to various people, and it was said to have been delivered in several different forms, so it’s hard to know who said what, and when.
Either way, the story demonstrates that the rule is actually a bit ridiculous.
So where did this rule come from? And is it something we still have to abide by?
Back in the 1600s, a grammarian named Joshua Poole developed some principles about how and where in a sentence prepositions should be used, based on Latin grammar.
A few years later, the poet John Dryden, a contemporary of John Milton, took those rules one step further when he openly criticised Ben Jonson, another great poet, for ending a sentence with a preposition. Dryden decreed that this was something that should never be done. Nobody bothered to correct or oppose Dryden, and Ben Jonson certainly couldn’t, because he had been dead for years, so Dryden’s strident and public protestations popularised the principle into a rule. Over time, strict grammarians and pedants began to actively oppose the practice, and the rule became widely accepted and firmly established.
Ironically, despite all the wise and clever plays, poetry and essays written by John Dryden, it was his consistent complaint about the terminal preposition that became his most enduring legacy.
Fowler’s A Dictionary of Modern English Usage, published in 1926, calls it a “cherished superstition that prepositions must, in spite of the incurable English instinct for putting them late… be kept true to their name and placed before the word they govern.” Fowler goes on to assert that even Dryden had to go back and edit all of his work to eliminate the terminal prepositions in his own writing.
In the last century or so, people have become progressively less fussy and worried about it, but some still seem determined to cling to the rule no matter what.
I advise my students that in formal writing such as essays, speeches, official letters and submissions, it is best to avoid the terminal preposition just in case their reader is someone who might judge them for it. Any other time, in keeping with standard spoken English, they are free to use their prepositions wherever they feel most natural and make the most sense.
Nobody in the 21st century is going to naturally ask someone “On which chair did you sit?” rather than “Which chair did you sit on?”, nor will they say “I wonder for whom that parcel is intended” instead of “I wonder who that parcel is for.”
In the 21st century, that really is nonsense up with which we do not have to put.
This post reminds me of the lessons I’ve been doing with my Year 8 History class about Medieval Europe and the Black Death.
My students were very interested in the plague, and surprised by the fact that this was when quarantine, social distancing, and the wearing of masks became the go-to modes of dealing with contagious disease. They were also surprised by the time it took the Europeans to understand the importance of basic hygiene, and how very long it took to develop good medical knowledge.
These lessons were highly relevant in These Times, and helped the kids to understand why we’re being reminded to wear masks, wash and sanitise our hands, and keep away from other people. It was good to be able to discuss how relevant history can be.
We all agreed we are incredibly thankful for modern medicine, science, vaccines and health care.
It does strike me as bizarre, however, that with all the scientific and technological advances we’ve made, we still have to remind people to wash their hands. Some things, it seems, never change.
In this time of Covid-19, when we don’t know why it seems to affect men more than women, and some ethnicities more than others, it is interesting that back in the 14th century the tsunami of the Great Pestilence of 1348 was followed by lesser waves that differed in many ways from the original. The first of these, in the England of 24 Edward III (January 1360 to January 1361), was called the secunda pestilencia and appeared to affect mostly the very young, babies and adolescents. Women were not affected in the same way.
The Chronicle of the Greyfriars of King’s Lynn notes: “…In that year began a plague among Londoners at about the feast of St Michael, where at first infants…
Celebrating Mary Shelley’s Birth Date, August 30, 1797
“Invention, it must be humbly admitted, does not consist in creating out of void, but out of chaos …” Mary Shelley
Every year, the most ardent Mary Shelley fans remember this author on August 30. Frankenstein has remained one of the most popular and enduring novels since its publication in 1818. We spend time reading her short stories and browsing her biographies, maybe discovering a new fact about her life and writing.
Did you know Frankenstein was inspired by a nightmare? In the preface of the third edition of the novel, Mary says that Frankenstein came to her in a dream. During a sleepless night in her dark room, behind closed shutters “with the moonlight struggling to get through … I saw with shut eyes, but acute mental vision – I saw the pale student of unhallowed arts…