Science tells us that we theoretically perceive 40 moments per second, based on our brain's 40-hertz wave cycle. Our eyes may be grabbing 66 fps, but our minds only register 40 of those moments. So why do we need 8 extra frames if we can only perceive 40 out of those 48? And why do some people perceive footage shot at 60 fps as "better" than footage shot at 48 fps? If the theory is in fact true, anything shot over 40 fps is overkill. Furthermore, if we perceive reality as 40 moments per second and anything below that allows us to suspend our disbelief, that leaves us with the question: what is the ideal frame rate that will reduce motion blur while still enabling viewers to suspend their disbelief? Something in the 30-35 fps range is likely the sweet spot. As we encroach upon the threshold of reality (reality being 40 hertz, or 40 fps), we prevent our minds from suspending disbelief. As I watched The Hobbit, I found that certain film techniques suffered from my failure to be captivated the way I would have been by fake reality (good old 24 fps).
Quick cutting, inserts, long sweeping establishing shots of landscapes, fast action, and certain uses of CG and lighting didn't sit well with me. Gandalf conjured fire between his fingers that resembled stage-play magic. Embers burned unnaturally against a deep blue sky. Jackrabbits and massive puppy dogs were pulled across rolling green hills by a string over an air hockey table. For that matter, they weren't lit properly (at least to my perception); they were slightly too bright and unnatural. The extra clarity overall may contribute to this impression, and yes, I say it against the DP's better judgment; my perception will differ from others'. I hate to say that some of the CG was just plain gimmicky. Despite my CG distaste, Gollum looked amazing, though his was also one of the most stationary scenes in the film, and one that would have received far more attention in the studio. In fact, anything shot with little to no camera or character movement was welcome to the 24 fps eye. Azog, the Pale Orc, was also stunningly rendered. Maybe it's my gamer's eye, accustomed to anything running over 60 fps, but most of the CG characters looked pleasing to me. When I say pleasing, I'm not referring to texture, but rather fluidity and believability, almost as if the characters were right in front of me (bring on 4D smell-o-vision and rumble seats).
Not only will we see a push in the film industry for 48 fps, we'll find that it becomes the "new thing" for some directors to attempt. It's a stylistic choice, in my opinion. For example, Michael Mann's Public Enemies was shot digitally and met with mixed opinions as to whether film would have better suited its time period. Quentin Tarantino once stated he would never stray from shooting on film, and that if he had to, he would stop directing and become a writer permanently.
Technical mumbo-jumbo aside, it's all too subjective to draw a solid conclusion. We all experience and think about film in different ways. Take the plunge and go see a film in 3D HFR. It's honestly a wonderful experience overall. Some people get sick from the 3D/48 fps combo; others feel it's easier on their eyes. I personally never have a problem with 3D and motion sickness. However, whether it was good use of 3D or the added fluidity and clarity, I found myself reacting (blinking, mostly) at objects that flew toward my face more than usual. I wish it were so real that I threw up... I also had the privilege of watching it in passive 3D, which resulted in a brighter image. I personally have an easier time adjusting to 3D with active glasses, but I couldn't tell the difference this time around.
So, are we supposed to pool together the complaints about 48 fps and adjust accordingly, or fall back on the 24 fps mentality? Maybe an evaluation is necessary, and extremely valuable to the future of 48 fps filmmaking. I say we try to find that sweet spot and make everyone happy.
Written by: Alex Zarnoski