Storing multiple terabytes of video content monthly is, obviously, a drive-intensive business -- particularly when using RAID for local editing scratch disks, a NAS for internal server access, and remote web backup. Rather than buy more drives and build a data library that is both impossible to manage and impossible to search, we decided to use our disks more intelligently and begin compressing B-roll as it falls into disuse. Deletion will eventually be the final step, but the compressed files are small enough that it's a non-concern right now. We're able to compress our B-roll anywhere from 50-86%, depending on the kind of content, and do so with nearly zero perceptible impact on quality. All that's required is a processor with a lot of threads -- that's what we wrote our compression script to use -- and some extra power each month.
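The thread-heavy approach can be sketched roughly like the snippet below. This is a hypothetical Python illustration, not our actual script (which is PowerShell); the file names, encoder choice, and quality setting are all placeholder assumptions.

```python
# Hypothetical sketch: fan one encoder job per file out across the
# CPU's threads. Flags shown are real HandBrakeCLI options, but the
# encoder/quality values here are illustrative, not our settings.
import os
from concurrent.futures import ThreadPoolExecutor

def build_command(path):
    out = os.path.splitext(path)[0] + "_x265.mp4"
    # The real script would run this command per worker with
    # subprocess.run(cmd, check=True); here we only build it.
    return ["HandBrakeCLI", "-i", path, "-o", out, "-e", "x265", "-q", "24"]

files = ["broll_001.mp4", "broll_002.mp4"]  # placeholder file list
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    commands = list(pool.map(build_command, files))
```

With one encode in flight per worker, a high-thread-count CPU stays saturated, which is why the monthly pass benefits from something like Threadripper.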
We recently put Threadripper to use in a temporary compression rig, as we wanted to try the CPU in a real-world use case for our day-to-day operations. The effort can be seen below:
We occasionally post less formal "site updates" that help bring everyone up to speed on what's going on.
As many of you likely know, the AMD events pertaining to RX Vega and Threadripper just ended, and so we're already working on test planning for these products. Release is the 10th for Threadripper, and so it'd be reasonable to assume review publication around that time. RX Vega is mid-August, with rumblings that August 14 is the release target. Those are our major items right now, but aren't the only things we're working on.
Part of our 4K camera upgrade was for ergonomics – better ability to handle the camera, particularly in show floor environments – with most of the other reasons centering around quality. Camera quality is superior in every technical sense, with low-light performance and noise reduction being major areas of improvement, but working with larger files at higher bit-rates means longer render times. We can now capture up to 200Mbps (up from 28Mbps) at 4K resolution, and we output at 2x the bit-rate of our previous 1080p60 videos. Render times have skyrocketed, as you’d expect, going from roughly video duration plus a few minutes to an hour per 20-minute video.
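As a rough sanity check on what that bit-rate jump means for storage (the conversion below is plain arithmetic, not figures from our workflow): 200Mbps works out to about 1.5GB per minute of footage, roughly seven times the old 28Mbps rate.

```python
def gb_per_minute(mbps):
    # megabits/s -> megabytes/s (divide by 8), then GB per 60 seconds
    return mbps / 8 * 60 / 1000

old = gb_per_minute(28)   # 0.21 GB per minute at the old 1080p60 rate
new = gb_per_minute(200)  # 1.5 GB per minute at 4K
```

That per-minute figure is also why the storage and compression concerns elsewhere in these updates exist at all.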
There’s not a lot we can do about this. Adobe Premiere, sadly, does not really do much with multi-GPU. The GPUs are accelerators, with rendering still falling on the CPU for a lot of the workload. We’re becoming more thread-limited than anything at this point, and really don’t want to build an entirely new production system right now. For now, upgrading the primary GPU to a 1080 Ti will help us out a bit in Premiere and significantly in Blender.
For those who don't follow the YouTube channel as closely as the website, it's possible that you may have missed out on our first two livestreams. Both have VODs up on the YouTube channel over here: GN Live #1 - Seidon Cooler Tear-Down & GN Live #2 - EK Open Loop & Vega Work.
We don't have any plans to start a regularly scheduled stream, but we are working on streaming whenever the team is building or tearing down hardware anyway. We already wanted to take apart and build the hardware shown in streams #1 and #2, so it made sense to set them live.
It’s a far cry from our last “major” camera purchase, which consisted of about $3000 to buy a then-new Canon XA20 and shotgun mic. That was around 2013. Since that time, we’ve invested thousands in audio equipment, sliders, tripods, and lighting – but as the video team’s skills and arsenal have grown, we’ve had one straggler: the camera. The XA20 was a fantastic camera to buy as our first major piece of video equipment, replacing our previous Canon Vixia HF S20; the XA20 permitted 1080p60 uploads, put us on the map for video, and continues to be an absolute workhorse for road production. We’re planning to keep it around for multi-cam interview shooting in the future, and it gives us the option of fielding multiple video staff on-site at an event. Logistically, it makes good sense to keep the XA20 around – again, the thing is truly a workhorse, and I’d be lying if I didn’t acknowledge a sentimental attachment.
About three blocks from our hotel during CES was a relatively new museum called "The National Atomic Testing Museum," associated with the Smithsonian. I popped down there with Patrick Stone for a quick visit as a break from CES and the carrion mobile device salesmen on the show floor.
Upon entering the museum, the first thing you see is a movie prop from the 1956 film "Forbidden Planet." The robot ("Robby," naturally) may be a reproduction, as there was no clear explanation beyond the fact that it belonged to the original prop master, Robert Kinoshita, who died quite recently at the age of 92. It sets the mood for the atomic age, when atomic testing around Las Vegas -- home to CES -- was extensive. Maybe that explains some of the mutants we saw shambling around the convention center. I still think that Forbidden Planet (based on Shakespeare's "The Tempest") is a great movie and worth a watch.
Ramping up video production for 2016 led to some obvious problems – namely, burning through tons of storage. We’ve fully consumed 4TB of video storage this year with what we’re producing, and although that might be a small amount to large video enterprises, it is not insignificant for our operation. We needed a way to handle that data without potentially losing anything that could be important later, and ultimately decided to write a custom PowerShell script for automated HandBrakeCLI compression routines that execute monthly.
Well, will execute monthly. For now, it’s still catching up and is crunching away on 4000+ video files for 2016.
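The monthly selection pass can be sketched roughly as below. This is a Python illustration of the idea, not the actual PowerShell script; the file extensions and the 30-day "disuse" threshold are assumptions for the example.

```python
# Hypothetical sketch: find video files untouched long enough to be
# candidates for the monthly compression pass.
import os
import time

AGE_DAYS = 30                          # assumed "disuse" threshold
VIDEO_EXTS = (".mp4", ".mov", ".mts")  # assumed extensions

def is_stale(name, mtime, now):
    """True if this looks like a video file old enough to compress."""
    return (name.lower().endswith(VIDEO_EXTS)
            and mtime < now - AGE_DAYS * 86400)

def stale_videos(root, now=None):
    """Walk root and return paths of videos untouched for AGE_DAYS+."""
    now = now or time.time()
    hits = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            full = os.path.join(dirpath, name)
            if is_stale(name, os.path.getmtime(full), now):
                hits.append(full)
    return hits
```

Each returned path would then be handed to the HandBrake encode step, which is what makes a multi-thousand-file backlog take a while to catch up.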
Regular visitors to the site may have noticed that we've been experiencing intermittent downtime for about a week. Following an (unrequested) update to the software stack at our server provider, we've been dropping connection at least once every few hours for about 10-15 minutes at a time. This was out of our control, and after fighting with the host over a resolution, we've decided to build a new server from the ground up.
I'll be able to share more details about this soon, but it's looking promising so far. The new server should be faster, no longer drop connection, and will improve our ability to add website functionality in the future. We've been on unmanaged hosting since 2012, which means that we've basically got a remote box accessed through shell/PuTTY, then updated entirely through the Linux equivalent of a command prompt.
Product launches haven't slowed down this year, it seems. We are about to ramp into one of the busiest seasons of the year for the site, and that means we're in high demand to work on build guides, sales guides, and news posts throughout the fourth quarter.
GamersNexus is seeking writers to assist in meeting our content demand. This is not an employment position, but a paid contract position. Just to reiterate: This is paid writing work.
This means that GamersNexus will ask you to write a piece (or you approach us with an idea), we agree on budget for the piece, and it gets delivered within a defined timeframe. That content is then published under your name as a contract writer, and we move on to the next piece.
Demand is high through December, but there is potential for continued editorial work into 2017.
We have discovered a few issues with the Gears of War 4 testing that require a revisit to the game. We are working diligently to perform those tests now, and have temporarily unpublished the original content while we work to learn more about the title.
Our apologies for the inconvenience as we work through some new tests with the game. These are important to the results, and we believe them to be critical enough to put a pause on our original content delivery.
UPDATE: We have published the revamped Gears 4 benchmark.
We moderate comments on a ~24-48 hour cycle. There will be some delay after submitting a comment.