35mm film was the most common film gauge for cinematography, and was also used in still photography (in the form of 135 film).
The name derives from the width of the film strip. When used for motion pictures, the image runs across the width of the film and each frame spans four perforations, giving 16 frames per foot; when used for still photography, the image runs lengthways along the strip and each frame spans eight perforations. The conventional motion picture frame measures 22 × 16 mm, an aspect ratio of 1.375:1 known as the 'Academy ratio'. The shape and spacing of the perforations varied in the early years.
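The figures above (four perforations per frame, 16 frames per foot) fix how much film a given runtime consumes. A minimal sketch, assuming the standard 64 perforations per foot and the 24 fps projection speed that became standard with sound (the constant and function names are illustrative, not from any real library):

```python
PERFS_PER_FRAME = 4    # standard 4-perf motion picture pulldown
PERFS_PER_FOOT = 64    # standard 35mm perforation pitch yields 64 perforations per foot
FRAMES_PER_FOOT = PERFS_PER_FOOT // PERFS_PER_FRAME  # = 16, as stated above
FPS = 24               # standard sound-era projection speed; silent-era speeds varied

def feet_for_runtime(minutes: float) -> float:
    """Feet of 35mm film needed for a given runtime at 24 fps, 4-perf."""
    return minutes * 60 * FPS / FRAMES_PER_FOOT

# 24 fps over 16 frames/foot means film runs at 90 feet per minute,
# so a 90-minute feature consumes 90 * 90 = 8100 feet of film.
print(feet_for_runtime(90))  # 8100.0
```

The same arithmetic gives the familiar projectionist's rule of thumb of 90 feet per minute, and scales directly for 3-perf or 2-perf camera formats by changing `PERFS_PER_FRAME`.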
The 35mm format was introduced in 1892, soon after the introduction of transparent flexible film in 1889, at a time when a wide range of film gauges was in use. By 1909 it had become accepted as the international standard gauge, and it remained so until it was largely displaced by digital cinematography. Although other gauges have been used for cinematography, 35mm remained the most popular among professional filmmakers, offering a good trade-off between cost and image quality.
Until the 1950s, 35mm film was made of cellulose nitrate, which was highly flammable and difficult to extinguish once alight. It was replaced by 'safety film' with a cellulose triacetate base, and from the 1990s film stock was made with a synthetic polyester safety base.
Sound was introduced around 1926, when Warner Bros. began using synchronised phonograph discs. Later sound-on-film systems include optical analogue, optical digital, and magnetic tracks. DTS soundtracks use a timecode printed on the film to synchronise playback with separate compact discs.
Between 2005 and 2015, most cinemas converted rapidly to digital projection, and in 2014 Paramount Pictures announced that it would no longer supply 35mm prints of movies in the US. Whilst 35mm film is still used both for shooting and for showing movies, it is rapidly becoming a niche format.