Dallas Skyline Across the Trinity River At Flood - M3U8 Version

Mouse over the poster image above to see the extra sales message pop in. This is driven by custom CSS for the player that I can include for each player I want to customize.

To see the full quality of this test rendering, use the pink "HD" button in the toolbar of the embedded media player above to be sure you are watching in 1080p (the player may default to a lower speed). Also, use the full-screen toggle in the right corner of the video toolbar to watch in full screen (hoping you have an HD monitor).

Chrome+Android users: we're tracking what appears to be a bug in Chrome on Android devices whereby the request for the HLS .m3u8 manifest file is sent with a null referer. The security features in our web application firewall reject requests that do not come with a referer from an authorized website. We have reported this bug to Google and await more news. This bug only impacts Chrome+Android. iOS devices (phones and tablets) and all desktop browsers (including Chrome) work as they should. Also, the bug only impacts .m3u8 requests (the manifest file for HLS streaming). On Chrome+Android devices, requesting an MP4 file sends the full referer and therefore plays normally. See https://www.tkach-law.com/lp-dallas-skyline-across-trinity-at-flood-mp4

About Dailies

This is just the first quick post-processing of a new time-lapse sequence. Like all of the other previews in this series, they are "rushes" or "dailies." These videos are "proof-quality" only: 720p, a single speed of low-bandwidth encoding, 4:2:0 chroma subsampling, presented as MP4 progressive downloads. Since this sequence was shot in high resolution in the RAW format, there are almost endless technical and creative possibilities that can be applied in post-processing and the final production rendering, including going up to 8K in 4:4:4 for the final production masters.

This rendering features post-processing that is "in the ballpark" but doesn't have every last bit of polishing typical of a production clip, because the final polishing/tweaking can be very time-consuming to "get it just right." For dailies, only enough post-processing was done to check and evaluate the camera work and the overall action, which is the primary purpose of dailies. Short-cutting the post-processing of dailies to the minimum required work expedites making them and allows them to be produced without delays, typically within 12-24 hours after shooting ends (though long, difficult scenes might require 36 hours). The highly expedited nature of dailies is where the industry jargon term "rushes" (a synonym for dailies) comes from. Yes, they are indeed rushed!

Dailies also provide the video editors working on the project with the full take of a sequence (with time codes) as it came out of the camera. Everything is shown: good, bad and indifferent. As such, dailies show everything that was shot, which is always much more than will actually be used in the final film. This allows editors to begin planning the integration of the sequence into the final film soon after shooting ends.
The resulting MP4 files of the dailies can also function as proxies for the finished masters (until they are done) and be brought into our video editing software for early rough cuts, to see how the resulting clip(s) work with other content as it is cut together. Not only do we get a head start by using the dailies as proxies; if we're only going to use eight seconds of a 22-second clip, we will finish and master only those eight seconds. This saves a lot of time and money in the final processing, grading and mastering of the final set of clips. When the production masters are finished, we simply replace the proxy files with the finished master files. The sequence shown in these dailies will be trimmed, fully mastered and finished in its final production form and used as a clip in an upcoming video at a later date.

This is the HLS (M3U8) version. You can see the MP4 progressive-download version at https://www.tkach-law.com/lp-dallas-skyline-across-trinity-at-flood-mp4

Production Notes

by Bill Anderton

This is literally the very first test rendering of my new workflow for my upcoming time-lapse sequences. I'm very happy with the results already, even at this stage; PLUS, there is much more quality possible! Even on first viewing, I could still tweak this a bit, but not much will be needed. It is good to go as is. But, yes, me being me, I'll play with it some more, I'm sure.

The original NEF images, Nikon's RAW format containing the images and their important metadata, were shot on a Nikon DSLR DX-format camera using a 40mm Nikkor f/2.8 lens at ISO 100. The camera was set low, near the river on the west bank of the Trinity River levee, using an old-school vintage TiltAll tripod. The Trinity River was over 38 feet into its flood stage on this night. This was not the record level for the Trinity, but it was close to it and far higher than the river has been in a while. The aperture of each shot was a constant f/2.8, but the exposure time was ramped from 1/1,250 second to 5 seconds during the complete day-to-night sequence. The interval between images was ten seconds. The sequence was shot on the evening of June 22, 2015 between 8:17 pm and 10:23 pm CDT, with Todd Tkach assisting, along with Austin Anderton assisting in earlier still-frame testing and location scouting on previous nights.
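For the curious, that shutter ramp spans roughly 12.6 stops of exposure time. A quick sketch of the arithmetic, assuming a simple geometric (equal-stops) ramp; the actual ramp curve used in the shoot isn't specified here:

```python
import math

# Shutter ramp from the production notes: 1/1,250 s up to 5 s.
t_start, t_end = 1 / 1250, 5.0
stops = math.log2(t_end / t_start)   # total exposure-time range in stops
print(f"{stops:.1f} stops")          # ~12.6 stops

# Hypothetical helper: shutter time at fraction f (0..1) through the
# ramp, assuming a geometric interpolation between the endpoints.
def shutter_at(f: float) -> float:
    return t_start * (t_end / t_start) ** f

print(f"{shutter_at(0.5):.3f} s")    # mid-ramp shutter time, ~0.063 s
```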

I have developed an entirely new workflow for post-production for this new project. The objective is to get the highest possible quality from the 90+ time lapse sequences that will be shot for this project. The sequence above is literally the first output of the new process as performed end to end.

In post-production, I used multiple software packages anchored by Adobe Lightroom at the center of the whole set of processes; passing data back and forth among the various programs was accomplished elegantly via the image metadata in sidecar files. This allowed seamless "round-tripping" of the image sequences through the various software packages for specialized processing. It also allows multiple round-trips to tweak the corrections that are applied. Each image was individually processed and smoothly blended into the next to eliminate flicker. Among the many factors processed, particular attention was paid to smoothing the luminance curves of the whole sequence to achieve a butter-smooth time lapse. The total process was non-destructive, so individual steps could be undone if needed and the processing tweaked and redone.
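The sidecar mechanism works roughly like this: develop settings live in a small XMP file next to the RAW file, and any tool can read or adjust them without touching the image data. A stripped-down sketch; real Lightroom .xmp sidecars carry far more fields, and the exact attributes shown are illustrative:

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
CRS = "http://ns.adobe.com/camera-raw-settings/1.0/"

# A toy sidecar with two illustrative Camera Raw settings.
sidecar = f'''<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="{RDF}">
    <rdf:Description xmlns:crs="{CRS}"
        crs:Exposure2012="+0.50" crs:Contrast2012="+10"/>
  </rdf:RDF>
</x:xmpmeta>'''

root = ET.fromstring(sidecar)
desc = root.find(f".//{{{RDF}}}Description")
print(desc.get(f"{{{CRS}}}Exposure2012"))   # the current exposure tweak

# A round-trip "tweak": nudge exposure in the metadata only, leaving
# the RAW file itself untouched (non-destructive editing).
desc.set(f"{{{CRS}}}Exposure2012", "+0.35")
```

Because only the metadata changes, every round-trip remains reversible, which is what makes the multi-package workflow safe to iterate on.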

After extensive post-production processing of each individual image, I made full-resolution intermediate masters of each of the resulting 673 individual images in the sequence as 16-bit TIFF files. This is very time-consuming but worth every minute of it for the resulting quality. It also saves a lot of time on the back end when rendering video. The process for going from intermediate masters to video masters, by design, uses "over-sampling" both spatially and in color space for HD and 4K. In the 6K format, the intermediate masters are 1:1 spatially with the 6K video master. In the 8K format, a slight enlargement is applied to the intermediate-master files. My newer cameras can shoot 8K natively, eliminating the need for any enlargement and improving quality. Also, the large high-quality intermediate masters can, in turn, be rendered into any size/quality of video without remaking the intermediate masters, which saves a tremendous amount of time in this workflow. Once the intermediate masters have been made, it only takes several minutes to render a video master for this 22-second clip.
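Two back-of-the-envelope numbers behind this step (the 6K source width below is an illustrative round figure, not a measured spec from the shoot):

```python
frames = 673             # 16-bit TIFF intermediate masters
fps = 29.97              # video-master frame rate
print(f"{frames / fps:.1f} s")   # ~22.5 s -- the 22-second clip

# Spatial over-sampling: rendering a 1080p master from ~6K-wide
# intermediates gives roughly 3x the pixels in each dimension
# for the downscaler to work with.
src_w, out_w = 6000, 1920
print(f"{src_w / out_w:.2f}x horizontal over-sampling")
```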

Since all processing is performed on the original RAW files non-destructively for each frame, this technique also provides maximum flexibility when color grading the final product. In high-quality productions, color grading of each scene allows a director and colorist to match each shot to a desired style or look, as well as to blend multiple shots together seamlessly so they match that style. Working with the original RAW files allows color grading on steroids!

For this very first test, the video master for this sequence was rendered only as 1080p at 29.97 fps with 4:1:0 chroma subsampling, in an MP4 file container encoded at 100 Mbps. Normally, this video master would be taken into Adobe After Effects for visual tweaking and sweetening, or into a video editor to be mated with other content for a finished piece. Here, however, the video master was uploaded directly into the distribution process on my global media platform.

The resulting video master was then transcoded for distribution using my standard HLS adaptive-streaming process, which produces eleven distribution file sets, each with a different spatial size and speed. HLS adaptive streaming automatically and dynamically throttles up and/or down every nine seconds to best match one of the eleven files to the speed of each user's connection. Click the pink "HD" button in the embedded media player above to see all distribution sizes and speeds. My media player defaults to "auto," but you can manually select any of the speeds and feeds.
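Conceptually, the file sets are tied together by an HLS master playlist that the player reads to pick and switch renditions. A minimal sketch of building one; the three-rung rendition ladder shown is hypothetical, not my actual eleven sets:

```python
# Hypothetical rendition ladder: (width, height, bits per second).
renditions = [
    (640, 360, 800_000),
    (1280, 720, 2_500_000),
    (1920, 1080, 6_000_000),
]

def master_playlist(renditions) -> str:
    """Build a minimal HLS master .m3u8: one #EXT-X-STREAM-INF entry
    per rendition, so the player can match its bandwidth."""
    lines = ["#EXTM3U"]
    for w, h, bw in renditions:
        lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bw},RESOLUTION={w}x{h}")
        lines.append(f"stream_{h}p.m3u8")
    return "\n".join(lines)

print(master_playlist(renditions))
```

Each variant playlist named in the manifest then lists the short media segments the player actually downloads, which is what lets it re-evaluate the connection every few seconds.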

Also, as part of my standard workflow, I make low-bandwidth 720p MP4 files for progressive download, for use as proxy files for rough cuts in editing.

The next test will be to render a new 1080p HD video master as a 4:2:2 ProRes 422 HQ file and re-transcode the distribution file sets from the new master. My workflow can also produce 4K and 6K formats, and even go up to the 8K spatial size (7,680 x 4,320). All sizes can be produced with 4:1:0, 4:2:2 or even 4:4:4 chroma subsampling for digital cinema use. I will try all of these to test my workflow, but I don't yet have an 8K-capable display device, so I will top out my critical-viewing testing at 4K and 6K.
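For reference, the J:a:b subsampling notation tells you how much chroma resolution each scheme keeps relative to full 4:4:4. A small sketch of the arithmetic:

```python
# J:a:b notation over a 4x2 block of luma samples:
#   a = chroma samples in the first row of 4,
#   b = additional chroma samples in the second row.
def chroma_fraction(a: int, b: int) -> float:
    """Chroma sample count relative to 4:4:4 (per chroma channel)."""
    return (a + b) / 8

for name, (a, b) in {"4:4:4": (4, 4), "4:2:2": (2, 2),
                     "4:2:0": (2, 0), "4:1:0": (1, 0)}.items():
    print(name, chroma_fraction(a, b))   # 1.0, 0.5, 0.25, 0.125
```

So the 4:1:0 proof renders keep only an eighth of the chroma samples, while the 4:4:4 cinema masters keep all of them.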

Since one of the important end uses of this project will be theatrical release and film festivals in digital cinemas, the ultra-high quality required for these venues drives a lot of my workflow. I will primarily work in the 4K formats using what is called the Digital Cinema Package (DCP), which standardizes all of the data (image reels in the JPEG 2000 codec, sound in MXF audio reels, and metadata) for cinema projectors and their digital play-out servers. I will work with both the "Flat" form of DCP at 3996 × 2160 for a 1.85:1 aspect ratio and the "Scope" form of DCP at 4096 × 1716 for a 2.39:1 aspect ratio for cinema release. The whole final film package and all of its files will be placed on a high-capacity SSD in a caddy (also standardized in DCP), put into a Pelican case and shipped off to the theater or festival showing the film. Once on-site, the SSD in its caddy is simply plugged into the play-out server and ingested at 6 Gbps. Once the content is safely in the play-out server, the SSD can be removed and returned in its case.
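A quick sanity check that the two DCP container sizes really give the stated aspect ratios:

```python
# DCP container sizes from the production notes above.
flat = (3996, 2160)    # "Flat"  -> should come out near 1.85:1
scope = (4096, 1716)   # "Scope" -> should come out near 2.39:1

for name, (w, h) in {"Flat": flat, "Scope": scope}.items():
    print(f"{name}: {w / h:.2f}:1")
```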

However, I am also hoping to play the final film in some 6K- and 8K-capable theaters. By starting with over-sampled 8K intermediate masters, I can make any format needed (even as just "one-offs") by doing only new video renders from the intermediate masters, without starting over. Easy-peasy, plus massively less expensive than the photo-mechanically made distribution prints previously used for releases. Also, these same intermediate masters can be used to produce any other type of video master required, such as the 16:9 aspect-ratio masters for making distribution files for streaming media. What would be the Black Speech for "one intermediate format to rule them all," Tolkien fans?

I will turn this sequence into a release video once its musical scoring has been completed and it is mated with other content in the coming week.

Stay tuned; more to follow! Happy, happy, happy!