What is the difference between a still image and a video? The difference, of course, is movement over time. A digital video file or analog film clip is made up of multiple images captured in quick succession and then played back at high speed to trick the eye into seeing motion. These individual images are called frames, and the speed at which they’re played back is referred to as the frame rate. Frame rate is usually measured in frames per second, or FPS. In a modern context, most videos are displayed at 23.976, 24, 29.97, 30, or 60 FPS for NTSC systems, and 25 or 50 FPS for PAL systems. Newer monitors designed for gaming are built with refresh rates that reach figures such as 144 or 240 Hz, which enables more accurate control and smoother motion.
What frame rate is best? What is required? Where do these figures come from? All these burning questions and more will be answered shortly, but first we need to go back to before photography was even invented.
History of Frame Rate and Motion
Film is an art of storytelling and communication, both of which have foundations within human evolution as a whole. Much like our own ancestors, film’s origins can be traced all the way back to prehistoric times, where techniques now referred to as shadowgraphy and camera obscura used light and shadow to create art and tell stories in caves, tents, and other early settlements. Hand shadows, for example, were used to depict creatures dancing along the walls, fueled by the creativity of the artist and the imagination of the audience. Illusions made with light are still being discovered and utilized today as technology continues to develop, such as with HDR TVs and the latest color reproduction devices.
The idea of multiple images being blended together at speed to create the illusion of motion has been used for centuries, with some notable examples being stroboscopic disks (1833), the flip book (1868), and the praxinoscope (1877). Although the first photo-etching was created in 1822, the first daguerreotype camera developed for commercial manufacture was built in 1839 by Alphonse Giroux. These early cameras required a very long exposure time of 5 - 30 minutes. During this exposure the subject could not move, or it would appear blurry, which made photographing motion impossible. It wasn’t until 1878 that camera exposure times became fast enough to photograph fast-moving subjects. That same year, Eadweard Muybridge used a row of cameras and trip wires to capture a galloping horse. When stitched together using modern technology, these frames could be considered the very first GIF.
As film cameras were developed and silent movies took the world by storm, there were no universal standards for frame rate, so most films were played back somewhere between 20 and 26 frames per second. The speed of capture also varied, which is why films starring Charlie Chaplin appear sped up.
How Frame Rate Tricks Our Eye
The earliest silent films were played back between 16 - 20 FPS. This is traditionally thought to be the speed at which the illusion of motion begins: where your eye stops seeing individual images and starts seeing blended movement. Shooting at the bare minimum frame rate was also important because of the stamina of the cameraman, who had to crank the film through the camera by hand. Of course this isn’t a hard rule, as there are many variables that play into how the eye can be tricked into seeing smooth motion, such as motion blur and artistic direction. The actual threshold probably sits closer to 10 - 12 FPS. Modern animations (usually found in hand-drawn mediums) still use frame rates as low as 6 - 10 FPS, while some mediums like games utilize frame rates even higher than 240 FPS. Essentially, the faster the frame rate, the smoother the motion will blend and appear.
What are the modern standards for Frame Rate and where do they come from?
There are a few historical reasons why the industry shifted to a standard frame rate, but the main one was sound. Towards the end of the silent film era, films were displayed at roughly 20 - 26 FPS. The variation between different films, projectors, and especially early hand-cranked devices didn’t matter too much when audio was only present via a live instrumentalist. Once audio started being incorporated into playback (circa 1926), variations in playback speed would alter the pitch of the audio. Since our ears are much more sensitive than our eyes, the result was very jarring. A standard playback speed had to be decided on, and 24 FPS was chosen because it sat in the middle of that range and was also the slowest (cheapest) frame rate that could support audio from a 35 mm reel.
Now, modern video standards can get very complicated and messy because of specifics around broadcast, the editing process, audio, localization, interlacing, dot crawl, pull-down, etc. The rabbit hole goes deep. Essentially, all you need to worry about are the two main frame rate formats, NTSC and PAL. These two formats are based on the mains frequency of a location’s electric grid. Analog TV broadcast was developed around 50 Hz, which covers most of the world, while other locations such as the US and Japan use 60 Hz. In order for cameras and footage to sync with the grid, NTSC broadcasts displayed at 60 FPS (actually 60 / 1.001 = 59.94, but that’s another story) while PAL displayed at 50. Actual film was shot at 24 (or 30) FPS for NTSC and 25 FPS for PAL, then displayed through broadcast using pull-down processing, which is why these slower frame rates became the standard. Because of other broadcast quirks and audio syncing during editing, 23.976 became a standard frame rate for broadcast productions and cinema, which is why your modern digital camera probably has settings for both 23.976 and 24 FPS.
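If you are curious where those fractional rates come from, they are simply the nominal integer rates divided by 1.001. Here is a minimal Python sketch of the arithmetic (the helper name `ntsc_rate` is just for illustration):

```python
# NTSC-family rates are the nominal integer rates divided by 1.001,
# a correction dating back to the addition of color to analog broadcast.

def ntsc_rate(nominal_fps: int) -> float:
    """Return the actual NTSC-style rate for a nominal frame rate."""
    return nominal_fps / 1.001

for nominal in (24, 30, 60):
    print(f"{nominal} FPS nominal -> {ntsc_rate(nominal):.3f} FPS actual")
```

Running this prints 23.976, 29.970, and 59.940, which is exactly where the odd-looking numbers in your camera’s menu come from.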
As technology has improved, frame rates have increased to 50 and 60 FPS productions or even higher. Cinema, however, is still usually shot at 24 FPS, due to the stylistic look of that frame rate compared to the sometimes uncanny look of 60 FPS. Some films have utilized more unconventional frame rates, such as The Hobbit: An Unexpected Journey (2012, 48 FPS) and Gemini Man (2019, 120 FPS; 24 on Blu-ray and 60 on the UHD release). Sports TV, however, does utilize high frame rate broadcasts in order to present more clearly defined motion of the subject.
What frame rate should I use?
When you are working on your own production, there are a few things to consider regarding project frame rate.
First is how the content will be delivered and to what platform. Thanks to the internet and online video hosting platforms, hosting various frame rates is no problem at all. Whether NTSC or PAL, you will be able to upload your content and view it perfectly fine on modern displays and systems. The only thing to check in this regard is what maximum and minimum frame rates a platform supports. Platforms stick to the standard modern frame rates rather than hosting strange or variable frame rate (VFR) content that may have a rate of something like 43.8.
Once you decide on where you are going to display the project, you need to work out what exact frame rate the project should be rendered at. There are various things to consider here. First is your recording equipment and what it’s capable of shooting in. Second is the purpose of the content (is it an action sports documentary, a noir thriller, or an anime?). Third is where you are located, though this last point is not AS relevant as it once was. Traditionally, you would shoot content in either PAL or NTSC depending on where you lived, syncing your frame rate to the mains frequency to avoid flicker from light sources. These days, with LEDs and more universal displays being common, these issues are less frequent, though many typical lights can still present this problem. Dedicated film lights flash at a MUCH higher rate to mitigate it, so try to film with one of these for any production. Once you decide on a frame rate for the project, do not deviate from that format (e.g. by using PAL footage in an NTSC project). 25 does not divide cleanly into 24, so forcing that footage into a 24 FPS project will drop frames and create jankier movement. Shooting 120 FPS for a 30 FPS project, however, is fine, since frames can be dropped evenly, or the footage can be slowed down 4x to create slow motion.
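The "divides evenly" rule above comes down to simple modular arithmetic, which can be sketched in a few lines of Python (the helper `conversion_factor` is hypothetical, purely for illustration):

```python
def conversion_factor(source_fps: int, project_fps: int):
    """If the source rate divides evenly into the project rate, return
    how many source frames map to each project frame (which is also the
    maximum clean slow-motion factor). Otherwise return None: the
    conversion must drop or duplicate frames unevenly, and motion judders."""
    if source_fps % project_fps == 0:
        return source_fps // project_fps
    return None

print(conversion_factor(120, 30))  # 4 -> keep every 4th frame, or slow down 4x
print(conversion_factor(25, 24))   # None -> uneven conversion, janky movement
```

The same check explains why 60 FPS footage sits comfortably in a 30 FPS timeline but 25 FPS (PAL) footage does not sit in a 24 FPS one.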
Another factor that will affect your project’s frame rate is file size and storage. The higher the frame rate, the larger the file will be, and the more taxing playback will be on the system. Check out our article on bitrate for a more detailed breakdown of the process involved. In short, 60 FPS means 60 individual images are shown per second, while 24 FPS obviously means only 24 per second. A 10-second clip will therefore contain more images, and thus more data, at a higher frame rate than a comparable clip (setting aside things like compression and resolution).
24 or 25 FPS is perfectly fine for any modern video production. If it works for the latest and greatest blockbuster, it will work for your webinar or team update. If you have a marketing video with beautifully animated motion graphics, then by all means render that clip in 50 or 60 FPS to really make that motion pop!
If you have any other questions or would like to learn more, please don’t hesitate to contact Customer Success.