Another Example of Aspect Ratio Conversion

To further illustrate my previous post, here is another example of how different versions of the same scene look on different TVs and with different settings.

From Friends’ “The One with Ross’s Teeth” (Season 6):

As originally broadcast, on a 4:3 TV.

This scene makes full use of the 4:3 aspect ratio. To evoke the claustrophobic setting of the elevator, there is barely any empty space on the sides or the top.


As seen on blu-ray, on a widescreen TV.

Because the scene was filmed on a constricted set, there is nothing on either side to open the image up to, so this is one of the rare cases where the widescreen aspect ratio was achieved entirely through cropping:


This means that Rachel and Ralph Lauren’s hands, which in the original version did much to show their respective levels of comfort on this elevator ride, had to be sacrificed.


Preserving the original 4:3 aspect ratio on a widescreen TV set.

The pillarboxed example above shows how opening up the image, while keeping the hands in frame, makes the elevator feel a lot less cramped. It’s a trade-off, and I don’t blame whoever decided to crop the image for doing it that way, especially when you consider how the 4:3 version would look on most widescreen TVs:


A widescreen TV automatically zooming in on a 4:3 image.

The Real-Life Drawbacks of Aspect Ratio Preservation

(Update: Another Example of Aspect Ratio Conversion.)

A scene from “The One with Joey’s New Brain,” off the 7th season DVD set of “Friends,” as seen on an old 4:3 television.

There’s been some noise (at least in my filter bubble) about aspect ratios lately. First when FXX ran old episodes of The Simpsons cropped to fit widescreen TVs, then again this week when HBO announced they are going to rebroadcast The Wire in HD and reformatted to 16:9. (Wire creator David Simon has stated that, while the episodes were composed for 4:3, he’s basically okay with the new versions.)

The same scene, this time from the “Friends” blu-ray set, on a 16:9 widescreen TV.

Outrage over re-formatting old TV shows for widescreen is nothing new. Forums and blogs are filled with examples of shots showing either too little (when important elements get cut out of a frame) or too much (when crew members or equipment become visible) when the image is cropped or opened up (in some cases a little of both) to accommodate the wider format.

Direct comparison of the aspect ratios from the DVD (yellow) and the blu-ray (pink).

My preference on this matter is clear: I want to watch shows in the aspect ratio in which they were originally shown or, overriding that, the way their creators intended them to look.

A great example of this is Star Trek: The Next Generation, whose seven seasons are presented on blu-ray in beautiful, remastered high definition and the original 4:3 aspect ratio.


Preserving the original 4:3 aspect ratio by placing black bars left and right of the image.

But most studios or networks, like HBO and FXX, are going another way, and one oft-circulated reason for not pillarboxing old 4:3 shows is that viewers prefer to have their screens filled, and with widescreen TVs now the norm, that means presenting content in 16:9.

I don’t believe that most people prefer full screens over black bars. What I do believe is that most people don’t actually have a preference, because they have never even thought about it. The reality that the vast majority of people wouldn’t even know how to change the aspect ratio settings on their televisions might be a sad one to some of us, but that’s the way it is. And I’ve been around enough widescreen TVs in the wild to know that, without actively telling them not to, they will do whatever they can to avoid showing black bars on the sides of the image.

A 4:3 image stretched to fit a 16:9 screen.

Filling a widescreen TV by zooming in on a 4:3 image.

I don’t think anyone will argue in favor of stretching the image – it’s easy to see why that’s a bad idea – but looking at the second example some might wonder what’s so bad about it. While it might not be that obvious on a show like Friends, cropping an image always means that information gets lost at the top and bottom of the frame. A widescreen TV zooming in on a 4:3 image will just automatically show you the 16:9-shaped middle of the frame, without regard for what’s being cut off.
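To put a rough number on it, here’s a little Python sketch (the pixel dimensions are made up for illustration): center-cropping any 4:3 frame down to a 16:9 window always throws away a quarter of the picture, all of it from the top and bottom.

```python
# How much of a 4:3 frame survives a center-crop to 16:9?
# Illustrative dimensions only: a 4:3 source frame at 1440x1080.
src_w, src_h = 1440, 1080

crop_h = src_w * 9 / 16        # keep the full width, trim the height to 16:9
kept = crop_h / src_h          # fraction of the original picture that remains

print(f"cropped height: {crop_h:.0f}px of {src_h}px")  # 810px of 1080px
print(f"kept: {kept:.0%}, lost: {1 - kept:.0%}")       # kept: 75%, lost: 25%
```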

Friends was actually filmed in a wider aspect ratio than how it first aired on TV, so the producers had the opportunity to open up the image on either side when they converted the series to widescreen for the blu-ray release. But the image still had to be cropped, and in most cases they decided to keep the top part of the frame rather than the bottom, mostly to preserve the negative space above characters’ heads, which makes an image feel more open and less cramped.


Had Friends been released in the original 4:3 aspect ratio on blu-ray, the scene above would look like this on most (not yours or mine, of course!) widescreen televisions:


A 4:3 image automatically zoomed and centered to fit a 16:9 screen.

And that’s why some creators would rather release a show in widescreen than preserve the original 4:3 aspect ratio. It’s not that viewers prefer their screens to be filled; it’s that producers like David Simon know that they are going to be filled, and they prefer to at least have a say in how.

I am in no way promoting this practice – like I said, my preference for original aspect ratios is clear – but I can certainly understand why it’s done. Looking at the bigger picture, it’s the most practical way for artists to control the way their shows are watched.

Can a Computer Play the Perfect Game of Super Mario?

Maybe you’ve seen these videos online of video game “speed-runs,” where the goal is to complete, let’s say, a game of “Super Mario Bros.” as fast as possible. They generally fall into one of two categories: actual humans playing the game in real time, punching actual buttons and all that, or the “tool-assisted” variant, where players use an emulator to play the game slowed down, use button combinations that wouldn’t be physically possible on a real controller, and, the big one, “rewind” the game and do moves over and over until they get them perfect.

I’ve played around with an NES and SNES emulator a bit, and it’s basically a combination of Neo’s powers at the end of “The Matrix” and the ability to travel back in time to fix mistakes that those aliens in “Edge of Tomorrow” have. After playing that way for a few hours, I got so used to this strange power that I had to remind myself the real world doesn’t quite work that way. Which is a good thing to keep in mind when you’re riding your bike down a busy street…

There are huge communities around both tool-assisted and non-assisted speed-runners, and one thing that kinda blows my mind is that even for games that have been around for two decades, players are still setting new records on a regular basis. These guys and girls don’t rest on their laurels; they’re constantly looking for ways to improve the current runs, even if the end result’s just one frame faster than the old record. It’s fascinating.

The human element, even with tool-assisted speed-runs, is a huge, important part of the experience, and I immensely enjoy both the skill and the creativity on display there. But I was wondering if a computer program, on its own, would be able to find the fastest way through a video game. If we take something relatively simple like the first Super Mario Bros. on the NES and just let a computer play through all possible ways to play the first level, how long would it take to master it?

The options aren’t unlimited. The NES controller only has a few inputs: four directions (up, down, left, right), an “A” button that makes Mario jump, and a “B” button that, when held down, makes Mario run. Ignoring physical restrictions (I don’t think it’s even possible to press both “up” and “down” simultaneously on a real controller), our computer would have, on each frame of the game, the option to press no button at all, any one of the six buttons, any combination of two of the six, and so on. That’s 2^6 = 64 unique button combinations per frame. (Honoring the D-pad’s restrictions would cut it to 36, but let’s stick with the round 64.)
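If you want to check those counts yourself, here’s a quick Python sketch (my own toy model, not anything the actual speed-run tools use): it counts every subset of the six inputs, and, for good measure, how many survive the D-pad’s physical restriction.

```python
from itertools import product

BUTTONS = ["up", "down", "left", "right", "A", "B"]

# Each button is either held or not on a given frame: 2**6 = 64 raw states.
all_states = list(product([0, 1], repeat=len(BUTTONS)))
print(len(all_states))  # 64

# A real D-pad can't report up+down or left+right at the same time,
# so the physically possible count is 3 * 3 * 2 * 2 = 36.
def physically_possible(state):
    held = {b for b, on in zip(BUTTONS, state) if on}
    return not ({"up", "down"} <= held or {"left", "right"} <= held)

print(sum(physically_possible(s) for s in all_states))  # 36
```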

The longest the level can be played is until the timer runs out, which I gather is around 200 seconds. (I’ve just spent more time than I’d like to admit trying to come up with exact numbers on this. It gets pretty messy with frame rates and other stuff that I don’t really understand. Feel free to write in with the correct numbers.) Let’s say the game runs at 60 frames per second (again, not an exact number, just something to work with); that would mean a maximum of 12,000 frames per run-through. And because the 64 per-frame options multiply rather than add, that’s 64^12,000 unique input sequences, a number more than 21,000 digits long. Slightly fewer, actually, if you stop the run each time Mario either dies or reaches the goal before the timer is up.

The world record for Level 1-1 of Super Mario Bros., as far as I can tell, is somewhere around 32 seconds (including the little victory animation at the end), so we can just tell our computer to stop trying if it hasn’t reached the goal after that time. That caps each run at about 1,920 frames, so we’re “only” dealing with the 64^1,920 input sequences, a number about 3,500 digits long, that contain the fastest possible way (or ways) through the level.

If the per-frame options merely added up, 64 times 1,920 would be about 123,000 combinations, and playing each one for 32 seconds in real time our computer would be done within 45 days. But they multiply, and even checking a billion full runs per second wouldn’t make a dent in 64^1,920 before the heat death of the universe. Playing through literally every possibility is out; the computer would have to prune, for instance by cutting off any run the moment Mario dies or falls behind a known-faster one.
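Here’s the back-of-the-envelope arithmetic in Python, so you can poke at the (very approximate) numbers yourself:

```python
from math import log10

FPS = 60               # assumed frame rate, just something to work with
RUN_SECONDS = 32       # roughly the 1-1 record, victory animation included
STATES = 64            # button states per frame (see above)

frames = FPS * RUN_SECONDS                  # 1,920 frames per capped run

# If the per-frame options merely added up:
naive_runs = STATES * frames                # 122,880 runs
days = naive_runs * RUN_SECONDS / 86_400    # ~45.5 days of real-time play
print(f"{naive_runs:,} runs -> {days:.1f} days")

# But they multiply: 64**1920 distinct input sequences.
digits = int(frames * log10(STATES)) + 1
print(f"64**{frames} is a {digits:,}-digit number")  # a 3,468-digit number
```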

And why stop there? The record for all of Super Mario Bros. is under 5 minutes, which by the naive additive math should take our computer no longer than 10 years to beat (or confirm). By the real, exponential math, rather longer.

Let’s get on that.

Bright/Kauffman/Crane Productions

This is a screenshot of the Bright/Kauffman/Crane Productions logo as it appeared on the “Friends” DVD sets:

Bright/Kauffman/Crane Productions


This is a screenshot of the Bright/Kauffman/Crane Productions logo as it appeared on episode 15 of season 6 of the “Friends” blu-ray set:

Bright/Kauffman/Crane Productions high definition


And this is a screenshot of the Bright/Kauffman/Crane Productions logo as it appeared on every single other episode on the “Friends” blu-ray set:

Bright/Kauffman/Crane Productions


Wha?

Motion Capture and Movie Future

Something I’ve been thinking about quite a lot recently is the current state and possible future of motion capture performances. We’re already seeing a wide range of incredibly detailed and realistic computer-generated creatures performed and inhabited by actors in movies like the “of the Planet of the Apes”-series, the recent “The Hobbit” films, or this summer’s “Guardians of the Galaxy.”

Technology is also being used to render human likenesses: to alter an actor’s physique (Chris Evans in “Captain America: The First Avenger”), to make him appear like his much younger self (Jeff Bridges in “TRON: Legacy”), or even to raise the dead (Paul Walker in the upcoming “Furious 7”).

In the past these things haven’t always been entirely convincing (I wish Patrick Stewart had shot me with amnesia bullets so I wouldn’t have to remember his CGI face, somehow seeming to float an inch above where it should be, in that “Wolverine” sequel), but no doubt there has been progress, and if the last few decades have taught us anything it’s that computers will continue to get better and faster and cheaper. I am convinced that there will even come a day when you can watch a movie from ten years ago, look at the computer-generated effects, and say, yeah, that looks alright.

Over the last few years – and definitely in the wake of the untimely deaths of franchise-attached actors Paul Walker and Philip Seymour Hoffman – studios have been taking precautionary steps to ensure that audiences will still get to pay to see their favorite characters on screen even after they’ve… become unavailable. Stepping onto that platform and having a thousand points of light scan and digitize your face and body (mostly face, I’m guessing) is probably as much a part of an A-lister’s contractual obligations by now as a promotional appearance on The View.

I think in the coming years and decades we will see more and more examples of performers donning MoCap-suits not only to be transformed into animals (maybe in Jon Favreau’s “Jungle Book”?) or fantastical creatures, but ‘regular’ human beings, too. From younger versions of themselves to likenesses of people who have died or never even lived. Anything goes.

Instead of trying to predict any specific movies or performances, I thought it would be a fun exercise to look at cinema’s (and some of TV’s) history and imagine how it might have looked if advanced imaging and capturing had been around and affordable. (I am not actually saying any of these movies would have been better this way or need to be ‘fixed’ or anything. Nor do I think that in the future every single movie should use this technology.)

Fun fact: I am not a professional photo illustrator.

  • Nothing against Robert De Niro, but what if, instead of getting another actor to play young Vito Corleone in “The Godfather: Part II,” Marlon Brando could have reprised the role himself? Using a 3D model of Brando made when he was 29 and shooting “On the Waterfront,” the actor could have offered a seamless experience of Don Corleone arriving in America and ascending to godfather rank.
  • You don’t always need your characters to appear to be 20 years younger. Sometimes it’s enough for them to just stay the same age they were when you started filming. There are plenty of examples of long-running movie or TV series that feature characters who don’t age, or age much more slowly than their human counterparts. Legolas, Bilbo, and a bunch of others in “The Lord of the Rings.” Spock in “Star Trek,” Data in “Star Trek: The Next Generation.” Angel in “Buffy the Vampire Slayer” and “Angel.” Every one of these franchises reached a point where the actors had visibly aged in ways their characters shouldn’t have. Make-up artists are already doing everything they can to reverse these immortals’ aging process, so a little digital make-up would be a welcome solution. Even Walt from “Lost” fits this category, and he might not have had to be written out of the show if his young appearance could have been kept. (And if the CGI isn’t convincing, the studios could always use the same voodoo that’s keeping Hugh Jackman’s Wolverine from aging in the “X-Men” franchise.)
  • One genre that could benefit the most from this is the biopic, which frequently faces the problem that either the actors, the subjects, or both are too familiar to audiences for them to suspend their disbelief and accept the person on the screen as the real-life figure. Sure, a Philip Seymour Hoffman or a Daniel Day-Lewis can pull it off, but sometimes you’re stuck with Leonardo DiCaprio in a J. Edgar Hoover Halloween mask. Maybe recreating Hoover’s likeness from old film and photographs wouldn’t have saved “J. Edgar,” but the possibilities in this field are endless. Working from paintings, woodcarvings, photographs, film and video material, all the presidents from Washington to Obama could be faithfully portrayed. Hitler and Anne Frank, Mozart and Salieri, the Boleyns. Busts of Caesar and Cleopatra are practically begging to be scanned and brought to life once more.

It’s a fun thought exercise, anyway. I don’t expect computers to take away all the acting jobs anytime soon, but I’m confident that ten years from now digitally enhanced performances will be a lot more accepted and commonplace, even outside of sci-fi and fantasy movies.

Or we’ll just have a ton of dead celebrities selling whiskey and shit.

Ain’t That a Bitch

Yep, that’s the one. If I had to name one, right now, at this moment in time, “Ain’t That a Bitch” from 1997’s “Nine Lives” is my favorite Aerosmith song.

But I don’t have to. I can name as many fucking Aerosmith songs as I want, and having listened to their entire discography up and down and in and out for the last two decades, I could fill a dozen mixtapes with favorite tracks.

But this was a fun way to experience the music, I must say. And I’m sure the results would have looked a lot different ten years ago, and will no doubt be something else entirely ten years from now. Or ten days, for that matter.

If you want to get a sense of where my head is at right now, or to discover a few of their songs you may not have heard before, I’ve put together this Spotify playlist to commemorate the Great Aerosmith Song-Off of 2014:

These are the songs that were the most painful to cut from the tournament, plus the winner, of course. (Two more songs, “Falling Off” and “Face,” are not available on Spotify at this time.)

This photo came up in a Google image search for “Aerosmith.” I don’t know why. I don’t care why.

Part VI: The Grind


In the penultimate installment of the series, I will present the final three Aerosmith songs that are not, at this time in history, my “favorite” Aerosmith song. They are:

  • From “Nine Lives” (1997)
    • “The Farm”
    • “Falling Off”
  • From “Honkin’ on Bobo” (2004)
    • “The Grind”

Tomorrow the “winner” will be announced, but let’s face it: we are all winners for having lived through this magical event. And just because I decided I like one thing a bit more than another thing does not mean I don’t like both things. I like a lot of things, you guys.

I like Aerosmith.