Foundations of Amateur Radio

The first official telegram to pass between two continents was a letter of congratulations from Queen Victoria of the United Kingdom to United States President James Buchanan on 16 August 1858. The text is captured in the collection of the US Library of Congress as a low-resolution image of a photo of a wood engraving. Based on my own count, the text from the Queen to the President is about 650 characters. The IEEE reports it as 98 words, where my count gives 103 or 95 words, depending on how you count the address.

Due to a misunderstanding between the operators at either end of the 3,200 km cable, the message took 16 hours to transmit and 67 minutes to repeat back. If you use the shortest duration, the effective speed is just over one and a half Words Per Minute, or WPM. That's not fast in comparison with the speeds we use today. Until 2003, the ITU expected that emergency and meteorological messages should not exceed 16 WPM, that a second class operator could achieve 20 WPM, and that a first class operator could achieve 25 WPM.

To put the message speed in the context of the era, in 1856 RMS Persia, an iron paddle wheel steamship and at the time the largest ship in the world, won the so-called "Blue Riband" for the fastest westbound transatlantic voyage between Liverpool and Sandy Hook. The journey took nine days, 16 hours and 16 minutes. Similarly, it wasn't until 1861 that a transcontinental telegraph was established across the United States. In 1841 it took 110 days for the news of the death in office of President William Henry Harrison to reach Los Angeles. Today that distance is covered by a 39 hour drive, a 5 hour flight, or about 12 milliseconds on HF radio. So, while the speed of the message might not be anything to write home about today, at the time it was world changing.

Speed in Morse code is measured in a specific way. Using International Morse code, which is what I'm referring to throughout this discussion, if you send the word "PARIS" a dozen times in a minute and the next repetition starts right on the next minute, you officially sent Morse at 12 WPM.

Looking inside the word "PARIS", it's made up of a collection of dits and dahs. If a dit is one unit of time, then the letter "a", represented by dit-dah, is six units long when you include the spacing. In total, the word "PARIS", including the space after it, is exactly 50 units long. When you send at 12 WPM, you're effectively sending 600 dit units per minute, or ten units or bits per second, each lasting a tenth of a second.

Unfortunately, there is no one-to-one relationship between Morse speed and ASCII, the American Standard Code for Information Interchange, for a number of reasons. Firstly, Morse is made from symbols of varying lengths, whereas ASCII, the encoding we really want to compare speeds with, has symbols of a fixed length. You cannot simply count symbols in both and compare their speeds, since communication speed is about what you send, how fast you send it, and how readable it is at the other end.

Thanks go to Aiden, AD8GM, who, inspired by my initial investigation, shared the idea and Python code to encode Morse dits, dahs and spacing using a one for a dit, one-one-one for a dah, and zeros for the spacing. This means that the letter "e" can be represented by "10" and the letter "t" by "1110". You can do this for the standard Morse word "PARIS" and end up with a combination of 50 zeros and ones, or exactly 50 bits.
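To make that concrete, here's a minimal Python sketch of the same idea. To be clear, this is my own reconstruction for illustration, not Aiden's actual code; the lookup table and function names are mine and cover only the letters we need.

    # Morse as bits: "1" is key-down for one unit, "0" is key-up for one unit.
    # A sketch of the idea, not Aiden's actual code.
    MORSE = {"P": ".--.", "A": ".-", "R": ".-.", "I": "..", "S": "..."}

    def element_bits(element):
        # A dit is one unit on, a dah is three, each followed by one unit off.
        return ("1" if element == "." else "111") + "0"

    def letter_bits(code):
        # So "e" (dit) becomes "10" and "t" (dah) becomes "1110".
        return "".join(element_bits(e) for e in code)

    def word_bits(word):
        # Two extra zeros stretch each letter gap to three units, and the
        # trailing six stretch the final word space to seven.
        return "00".join(letter_bits(MORSE[c]) for c in word) + "000000"

    paris = word_bits("PARIS")
    print(len(paris))              # 50, the standard Morse word
    print(60 / (12 * len(paris)))  # at 12 WPM, each unit lasts 0.1 seconds
    print(len("PARIS ") * 8)       # the same word in 8-bit ASCII: 48 bits

That last line is the comparison we're about to make: six characters, space included, at eight bits each.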
I've been extending the code that Aiden wrote to include other encoding systems. When I have something to show, it will be on my GitHub page. However, using Aiden's idea, we gain the ability to directly compare sending Morse bits with ASCII bits, since they share the same zero and one encoding.

If you use standard binary-encoded ASCII, each letter takes up eight bits, so the six characters of the word "PARIS", including the space, take up 48 bits. Given that I just told you that the Morse version of the same message takes up 50 bits, you could now smile and say, see, ASCII is faster. Wait, what? Yes, if you send the word "PARIS " using 8-bit binary-coded ASCII, it's two bits shorter than if you use Morse. Job done, roll the press, headline reads: "Morse is four percent slower than binary coded ASCII".

Not so fast, grasshopper. If you recall, American Morse code, the one that has Samuel Morse's name written all over it, was replaced by a different code, made by Friedrich Gerke, which in turn was modified to become what we now know as International Morse code. Ask yourself, why did Gerke change the code?

It turns out that one of the biggest issues with getting a message across an undersea cable was decoding the message at the other end. Let me give you an example. Using American Morse, consider the encoding of "e", dit, and "o", dit, extra space, dit, and now try sending the word "seed" across a noisy line. Did you convey "seed", or was it "sod"? In other words, there is room for ambiguity in the message, and when you're talking about commerce, that's never a good basis for coming to a mutually binding agreement. It turns out that encoding needs to be more subtle than just creating a sequence of bits.

Something else to consider: 10 bits per second is another way of saying 10 Hz. As in, this is not just switching, we're dealing with frequencies, and because we're not sending lovely sinusoidal waves, but, from a signal processing perspective, a very horrible square wave, we're also dealing with harmonics, lots of harmonics, and more of them as we speed things up.

So, if you send binary-coded ASCII and compare it to Morse code, will your message actually arrive?

I'm Onno VK6FLAB