ABSTRACT

The first known computer is believed to date from around the first century B.C.E. Called the Antikythera mechanism, it is a mechanical computer [39] that showcases the engineering prowess of the ancient Greeks as well as their impressive knowledge of astronomy. Archaeologists believe it was aboard a sunken ship carrying it to Rome, and the Dark Ages may have ended its further development. Much more recently, computer programming began with instructions and data entered via punch cards. Early programs were coded in symbolic languages that used mnemonic symbols to represent instructions; they had to be translated into machine language before the computer could execute them, and translation is likewise what allows computers to run high-level languages, which are closer to natural languages. The first programmer was Lord Byron’s daughter, Augusta Ada Byron, Countess of Lovelace, famously known as Ada Lovelace (1815-1852); the programming language Ada was named in her honour. Navy Commodore Grace Murray Hopper (1906-1992) developed one of the first translation programs for the Mark I computer in 1944; the machine code was recorded on a magnetic drum. Computer programming languages have since been used for translations and adaptations of artificial intelligence (AI) systems and genetic algorithms to digital media. However, as Margaret Boden [4] notes, a perfect translation is not a simple matter, whether translating one computer language into another for artificial intelligence research or translating between human languages: “Even Please give me six cans of baked beans will cause problems, if one of the languages codes the participants’ social status by the particular word chosen for Please” [4, p. 4]. There are two types of translators: a compiler converts a program into a low-level language (machine or assembly language) to be executed later, while an interpreter converts and executes each statement in turn [13].
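
To make the compiler/interpreter distinction concrete, the following minimal sketch (illustrative only, not part of the paper’s method) uses Python’s built-in compile() and exec() as stand-ins for the two kinds of translators: a compiler-style pass that translates the whole program before running it, and an interpreter-style loop that translates and executes one statement at a time. The source string and file name "<demo>" are assumptions made purely for the example.

# Minimal illustration: one source text handled compiler-style and interpreter-style.
source = "x = 2 + 3\nprint('x is', x)"

# Compiler-style: translate the whole program into bytecode first, execute it later.
program = compile(source, "<demo>", "exec")
exec(program)

# Interpreter-style: translate and execute one statement at a time.
for statement in source.splitlines():
    exec(compile(statement, "<demo>", "exec"))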