Classical music has a lighter, clearer texture than Baroque music and is less complex. It is mainly homophonic: a clear melody above a subordinate chordal accompaniment. Counterpoint was by no means forgotten, especially later in the period, and composers still used it in religious pieces such as Masses. Classical music also makes use of the galant style, which contrasted with the heavy strictures of the Baroque: it emphasized light elegance in place of the Baroque's dignified seriousness and impressive grandeur.
Variety and contrast within a piece became more pronounced than before. Composers used a variety of keys, melodies, rhythms and dynamics. Classical pieces featured dynamic changes such as crescendo (an instruction to gradually get louder), diminuendo (an instruction to gradually get softer) and sforzando (a sudden strong, loud attack). Classical pieces had frequent changes of dynamics, mood and timbre, in contrast to Baroque music. Melodies tended to be shorter than those of Baroque music, with clear-cut phrases and distinct cadences. The orchestra increased in size and range; the basso continuo role of the harpsichord or pipe organ in the orchestra gradually fell out of use between 1750 and 1800. In addition, the woodwinds became a self-contained section, consisting of clarinets, oboes, flutes and bassoons. As a solo instrument, the harpsichord was replaced by the piano (or fortepiano, the first type of piano, invented ca. 1700). Early piano music was light in texture, often with Alberti bass accompaniment, which used arpeggios in the left hand to state the harmonies. Over the Classical period, piano pieces became richer, more sonorous and more powerful.
While vocal music such as comic opera was popular, great importance was given to instrumental music. The main kinds of instrumental music were the sonata, trio, string quartet, symphony, concerto (usually for a virtuoso solo instrument accompanied by orchestra), and light pieces such as serenades and divertimentos. Sonata form developed and became the most important form. It was used to build up the first movement of most large-scale works in symphonies and string quartets. Sonata form was also used in other movements and in single, standalone pieces such as overtures.
Early binary repertoires include Bacon's cipher, Braille, International maritime signal flags, and the 4-digit encoding of Chinese characters for a Chinese telegraph code (Hans Schjellerup, 1869). Common examples of character encoding systems include Morse code, the Baudot code, the American Standard Code for Information Interchange (ASCII) and Unicode.
Morse code was introduced in the 1840s and is used to encode each letter of the Latin alphabet, each Arabic numeral, and some other characters via a series of long and short presses of a telegraph key. Representations of characters encoded using Morse code varied in length.
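The variable lengths mentioned above can be seen directly in the code table itself. A minimal sketch in Python (the table excerpt and helper function are illustrative, not a complete Morse implementation):

```python
# A small excerpt of the International Morse Code table. Note that the
# encoded length varies from character to character: common letters
# such as E and T get the shortest codes, rarer letters get longer ones.
MORSE = {
    "E": ".",     # shortest code in the alphabet
    "T": "-",
    "A": ".-",
    "S": "...",
    "O": "---",
    "Q": "--.-",  # a rarer letter, hence a longer code
}

def encode_morse(text: str) -> str:
    """Encode a string, separating letters with spaces."""
    return " ".join(MORSE[c] for c in text.upper())

print(encode_morse("sos"))  # ... --- ...
```

Because representations vary in length, a decoder needs explicit separators (the pauses a telegraph operator leaves between letters) to know where one character ends and the next begins.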
The Baudot code, a five-bit encoding, was created by Émile Baudot in 1870, patented in 1874, modified by Donald Murray in 1901, and standardized by CCITT as International Telegraph Alphabet No. 2 (ITA2) in 1930.
Fieldata, a six- or seven-bit code, was introduced by the U.S. Army Signal Corps in the late 1950s.
IBM's Binary Coded Decimal (BCD) was a six-bit encoding scheme used by IBM as early as 1959 in its 1401 and 1620 computers and in its 700/7000 series (for example, the 704, 7040, 709 and 7090 computers), as well as in associated peripherals. BCD extended existing simple four-bit numeric encoding to include alphabetic and special characters, mapping it easily to punch-card encoding, which was already in widespread use. It was the precursor to EBCDIC.
ASCII was introduced in 1963 and is a seven-bit encoding scheme used to encode letters, numerals, symbols, and device control codes as fixed-length codes using integers.
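In contrast to Morse code, every ASCII character occupies the same fixed width. A short Python sketch of that fixed-length, seven-bit mapping:

```python
# Each ASCII character corresponds to a fixed 7-bit integer code.
for ch in ["A", "a", "0", "\n"]:
    code = ord(ch)
    assert 0 <= code < 128              # 7 bits cover codes 0..127
    print(f"{ch!r} -> {code} -> {code:07b}")

# The mapping is reversible: chr() recovers the character from its code.
assert chr(ord("A")) == "A"
```

Device control codes (such as the newline above, code 10) occupy the same fixed-width space as printable letters and numerals.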
IBM's Extended Binary Coded Decimal Interchange Code (usually abbreviated as EBCDIC) is an eight-bit encoding scheme developed in 1963.
The limitations of such sets soon became apparent, and a number of ad hoc methods were developed to extend them. The need to support more writing systems for different languages, including the CJK family of East Asian scripts, required support for a far larger number of characters and demanded a systematic approach to character encoding rather than the previous ad hoc approaches.
In trying to develop universally interchangeable character encodings, researchers in the 1980s faced the dilemma that on the one hand, it seemed necessary to add more bits to accommodate additional characters, but on the other hand, for the users of the relatively small character set of the Latin alphabet (who still constituted the majority of computer users), those additional bits were a colossal waste of then-scarce and expensive computing resources (as they would always be zeroed out for such users).
The compromise solution that was eventually found and developed into Unicode was to break the assumption (dating back to telegraph codes) that each character should always directly correspond to a particular sequence of bits. Instead, characters would first be mapped to a universal intermediate representation in the form of abstract numbers called code points. Code points would then be represented in a variety of ways and with various default numbers of bits per character (code units) depending on context. To encode code points higher than the length of the code unit, such as above 256 for 8-bit units, the solution was to implement variable-width encodings where an escape sequence would signal that subsequent bits should be parsed as a higher code point.
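UTF-8 is the most widespread modern instance of this design, and Python's built-in string encoding makes the behaviour easy to observe (the examples below assume UTF-8 specifically; the Unicode code points shown are standard):

```python
# Code points up to 127 fit in a single 8-bit code unit, so ASCII
# text costs no extra space under UTF-8.
assert "A".encode("utf-8") == b"\x41"          # U+0041 -> 1 byte

# Higher code points use a multi-byte sequence. The high bits of the
# lead byte act as the "escape", signalling that continuation bytes
# follow and should be parsed as part of one larger code point.
e_acute = "\u00e9"                             # U+00E9, above 127
assert e_acute.encode("utf-8") == b"\xc3\xa9"  # 2 code units
assert len("\u4e2d".encode("utf-8")) == 3      # CJK U+4E2D -> 3 code units

# Decoding inspects the lead byte, consumes the indicated number of
# continuation bytes, and reassembles the original code point.
assert b"\xc3\xa9".decode("utf-8") == e_acute
```

The same code point can thus occupy one, two, three, or four code units depending on its value, which is exactly the break from the old one-character-one-bit-pattern assumption described above.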