No, often they just don’t. Let’s look at a simple example. The 1828 Webster’s American Dictionary of the English Language defined a computer as:
“One who computes; a reckoner; a calculator.”
Then the 1913 edition said:
“One who computes; a machine which computes.”
Now the online Merriam-Webster says:
“One that computes; specifically: a programmable usually electronic device that can store, retrieve, and process data.”
Almost anyone you’d ask today would say that a computer is a thing you buy rather than a person. So don’t assume the definition you once learned is still the only correct one!
That original meaning is illustrated in the 1949 photograph of the “computer room” at NACA Dryden shown above. It is just a historical leftover. The transition from person to electronic thing happened during the second half of the twentieth century.
My father was a highly skilled computer. Harold taught chemical engineering at universities for two decades and then worked in industry, armed with a detailed knowledge of applied mathematics. When I was a young child he had a black Marchant mechanical calculator on his desk at home, with an electric motor as big as a one-pound coffee can on the back. I grew up playing with it. The Marchant handled long division by repeated subtraction; a single division could take many seconds, producing sounds like a heavy machine gun.
Later he had a Hewlett-Packard 9100A desktop electronic calculator, with a CRT display showing a stack of three registers. It used Reverse Polish notation (RPN), so there was no “=” key. I became familiar with RPN, and all three pocket calculators I’ve owned have come from Hewlett-Packard.
I learned to use electronic computers like the exotic Bendix G21 and IBM System 360/67 back in my teens in the 1960s. The Boy Scout troop and Explorer post I belonged to were both run mainly by Carnegie Tech computer science grad students.
Images of the Marchant EB9 and HP9100A calculators came from Wikimedia Commons.