43 min read
“Everyone Wants to Edit Remotely… But How?” – BeBop & Teradici 2020 Webinar
By: Admin on Jan 23, 2020 8:00:00 AM
In case you were unable to join Paul Austin (Teradici Director of Global Channels) and Michael Kammes (BeBop Director of Business Development) for our “Everyone Wants to Edit Remotely… But How?” Webinar which covered the ins and outs of remote and cloud editing from a creative standpoint, we’ve uploaded a copy of the video here for you.
If you have any questions or would like to find out more about how you can integrate BeBop Technology into your existing editorial or VFX workflow after watching, please click the button below, fill out the form and one of our professionals will help you get started.
Thank you for joining us today everyone!
Now all of you want to edit remotely in some capacity, and in some cases you’ve already crossed that threshold where you actually *need* to edit remotely. Whether that’s working from home, making last-minute changes for a difficult client (because none of us have ever had one of those, right?), or simply tapping talent that’s sitting around the world, today we’re going to go over how you can do that.
Let me introduce the cast of characters. First, you have me, Michael Kammes, (@michaelkammes) the Director of Business Development for BeBop Technology, and we also have Paul Austin (@PCoIP_Partners) who is the Director of Global Channels for Teradici.
As I mentioned, today we’re going to tell you how to edit remotely, both on-premises (AKA “on-prem”) and in the cloud. Whether that’s extending your local machine to wherever you want to create or simply using the cloud to have infinitely scalable hardware and software… it can be done, and it has been done, and, well, it will be done. We’re also going to cover the pros and cons of each approach so you can make the best strategic decision possible for you and your workload.
Now, to better set the ground rules, we really need to define remote editing versus cloud editing. There is a difference.
If we get really specific, remote editing is extending from your main on-premises editing system to wherever you may be. Think of it as a secure, high quality extended desktop. This could be your home office editing system or the machine that’s sitting at a post facility that you need to access remotely. This ensures you’re getting access to the machine you know and love and all the network resources that are available to that computer locally.
Cloud editing, on the other hand, leverages data centers all around the world as your creative editing system.
Because it’s in the cloud, you have access to infinitely more resources than just your local machine, and without all of the security and infrastructure extras you need in a remote editing scenario. In both of these situations, we want to ensure that you get to use the creative tools that you’ve spent thousands of hours getting really good at, and not some watered-down version. If you’re an Autodesk Maya or 3ds Max user for VFX, well, we want you to work and have access to that. If you’re a power editor in Adobe Creative Cloud / Premiere Pro or Avid Media Composer, hey, you should be able to cut with those tools as well. In addition, it’s imperative that the protocol – that is, the underlying technology that allows for accessing computers from dozens or hundreds of miles away – is up to the challenge.
A good way to think of it is as the engine in a car. Without a high-performance engine, your car will never perform as well as it could. I’m obviously recommending Teradici, but we’ll get into why later on. For now, let’s just examine both scenarios and the pros and cons of each. Let’s start this off with the fact that this is a reality. This isn’t a technology preview or a future technology demo. Cloud editing with your favorite creative tools is being done and is out in the wild.
With that said, “so how was this accomplished?”
Let’s look at how BeBop Technology handles this. First, it’s imperative that you have a decent internet connection. No external editing solution is going to work well if you’re closer to 1990 dial-up speeds than to 2020 broadband speeds. The national average in the United States is around 94 Mbps (megabits per second). We find that an average cloud editing user needs 20-50 Mbps of stable bandwidth per HD computer screen for a good editing and VFX experience. As you can see, this easily fits down the pipe that a vast majority of U.S. users have. Now, if you want to check out your own speed, take a look at sites like speedtest.net so you can see what kind of speed you’re rocking.
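To make that arithmetic concrete, here’s a minimal sketch of the bandwidth check. The 20-50 Mbps-per-HD-screen figure and the ~94 Mbps U.S. average come from the numbers above; the function names and the worst-case comparison are illustrative assumptions, not BeBop’s actual sizing logic.

```python
# Back-of-the-envelope bandwidth check for cloud editing.
# Per-screen figures are from the webinar; everything else is a sketch.

PER_HD_SCREEN_MBPS = (20, 50)  # stable bandwidth needed per HD screen (low, high)

def bandwidth_needed(num_hd_screens: int) -> tuple[int, int]:
    """Return the (low, high) Mbps range needed for N HD screens."""
    low, high = PER_HD_SCREEN_MBPS
    return low * num_hd_screens, high * num_hd_screens

def connection_ok(measured_mbps: float, num_hd_screens: int = 1) -> bool:
    """True if a measured speed covers the worst-case (high) estimate."""
    _, high = bandwidth_needed(num_hd_screens)
    return measured_mbps >= high

# A dual-monitor editor on the ~94 Mbps U.S. average connection:
print(bandwidth_needed(2))   # (40, 100)
print(connection_ok(94, 1))  # True
print(connection_ok(94, 2))  # False -- worst case for two screens is 100 Mbps
```

Note the worst-case check is deliberately conservative; a single HD screen fits comfortably, while two screens can brush up against the average connection under heavy motion.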
Now that we’ve determined that you’ve got enough bandwidth, let’s check out what your latency is.
Well, what’s latency, Michael? I’m glad you asked.
Latency in this case is how long it takes for a key press on your local keyboard to be reflected on your computer screen. Now, all computers have latency. In fact, your local keyboard, computer and monitor all introduce various amounts of latency, but usually this latency is so low that the user isn’t bothered by it. Expect 80-145ms (milliseconds) as an average for your local computer.
When we add in the additional latency due to geographical distance from where your editing machine in the cloud is, we need to ensure that the total latency is not distracting to the user. Again, that’s you. This is why BeBop targets data centers near you to ensure the total latency is seamless to you. We like to recommend under 70 milliseconds from your editing fortress of solitude to the data center where your machine is for optimal performance, which means it’s virtually imperceptible to most people’s eyes. Of course, the proof is always in the pudding, which is why we recommend that all users test out the performance ahead of time. And, honestly, I’m not kidding here folks, most users forget the system is remote. Seriously.
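The latency reasoning above boils down to a simple filter over measured round-trip times. The 70 ms target is the figure quoted above; the region names and ping values in this sketch are made-up examples, not real measurements.

```python
# A sketch of the latency-budget reasoning for picking a data center.
# The 70 ms target and the 80-145 ms local baseline are from the webinar;
# the candidate round-trip times below are hypothetical.

LOCAL_BASELINE_MS = (80, 145)  # typical keypress-to-screen latency locally
TARGET_RTT_MS = 70             # recommended max round trip to the data center

def viable_regions(measured_rtt_ms: dict[str, float]) -> list[str]:
    """Return data centers whose measured round trip fits the budget,
    sorted best (lowest latency) first."""
    ok = {region: ms for region, ms in measured_rtt_ms.items()
          if ms <= TARGET_RTT_MS}
    return sorted(ok, key=ok.get)

# Hypothetical ping results from a home office:
pings = {"us-west-1": 22.0, "us-east-1": 74.5, "eu-west-1": 143.0}
print(viable_regions(pings))  # ['us-west-1']
```

In practice you’d feed this real numbers from a ping test to each candidate region, which is exactly the pre-flight test recommended above.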
Now that we have the bandwidth and latency covered, the next thing we have to do is determine what CSP, or cloud service provider, you want to use. The CSPs have all of the horsepower you could ever need and also provide the VMs you need to run your software. VMs, or virtual machines, are what power virtualization. Virtualizing editing systems allows the data center to deliver a traditional desktop for you to interact with.
There’s a handful of major CSP players out there – namely Microsoft Azure, Google Cloud Platform, and Amazon Web Services – but there are smaller private data centers out there as well.
The benefit of using one of the big three is that they all have data centers all around the world and one is most likely close to you, and this reduces the aforementioned latency. But they also offer standard configurations of computing power, GPUs and storage. What this means is that wherever you decide to spin up a VM to work from, you’ll have access to the configurations you need.
Now, I’m not going to kid you, these configurations can get tricky.
How much horsepower do you really need?
And how much GPU do you need?
And what storage will be fast enough?
These are things that are even difficult to determine on-prem, let alone in the cloud. Again, this is where BeBop has done all the work for you. We provision horsepower appropriate for your workflow and for your users as well as storage that can deliver the data you need as fast as you need it.
Now, speaking of storage, I should probably clear up some misinformation about cloud storage. Storage that is commonly used for web backup or archival, such as Amazon Glacier or even Dropbox, is not up to the task for real-time media usage. Storage used for backup and archive is meant for a push-and-pull type workflow, not real-time usage. It’s typically object storage, if you want to get really geeky about it, which is great for backup but not fast enough for the responsiveness and throughput that your media workflows need.
Because of this, faster storage needs to be provisioned, configured and mounted on each of your cloud editing VM workstations. I find that media usage in the cloud often follows a methodology similar to on-prem storage tiers: fast storage for media that you need immediately, followed by slower storage for backup or archive.
If you follow this basic on-prem storage rule and apply it to the cloud, you’ll be able to save a chunk of money on monthly cloud storage costs.
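To show why tiering saves money, here’s a toy cost comparison. The per-GB prices below are hypothetical placeholders, not any CSP’s actual rates; only the hot-versus-archive structure reflects the tiering rule described above.

```python
# Illustrative sketch of the on-prem tiering rule applied to cloud storage.
# Both per-GB-per-month prices are assumed values for illustration only.

HOT_PER_GB_MONTH = 0.20       # fast, edit-ready storage (assumed price)
ARCHIVE_PER_GB_MONTH = 0.004  # cold object storage for backup (assumed price)

def monthly_storage_cost(hot_gb: float, archive_gb: float) -> float:
    """Monthly cost of a two-tier layout: hot for active media, cold for the rest."""
    return hot_gb * HOT_PER_GB_MONTH + archive_gb * ARCHIVE_PER_GB_MONTH

# Keep only the active 1 TB project on fast storage and park 9 TB of
# source media in archive, versus keeping everything hot:
tiered = monthly_storage_cost(1_000, 9_000)
all_hot = monthly_storage_cost(10_000, 0)
print(round(tiered, 2), round(all_hot, 2))  # 236.0 2000.0
```

Even with made-up prices, the shape of the result holds: keeping only the active project on fast storage is a fraction of the cost of keeping everything there.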
Now we’re getting to the real magic. How do we control and communicate with the cloud machine? Again, I’m glad you asked. I’m sure everyone has tried the usual suspects, right? You’ve tried TeamViewer, you’ve tried Microsoft Remote Desktop, and you’ve probably tried VNC. And while they function at a base level, they were never meant for media; they were meant for industrial IT usage. The audio is often compressed, out of sync or sometimes even non-existent. The video is compressed, and you’ll never get close to the full frame rate of your media. Now, if we change this up and use a robust protocol such as Teradici’s PCoIP or PCoIP Ultra, that ensures video and audio sync, color fidelity, plus communication for your keyboard, mice and tablets. Plus, because the PCoIP protocol compresses areas of the screen differently, the amount of bandwidth you need is much lower as opposed to streaming video, where one codec is traditionally used for the entire frame. Essentially, only the pixels that change are delivered to your local system, and in different codecs.
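To build intuition for the “only the pixels that change are delivered” idea, here’s a toy delta-update sketch. Real PCoIP is far more sophisticated (per-region codecs, build-to-lossless refinement, dynamic adaptation); this only shows why sending changed regions beats re-sending whole frames.

```python
# Toy illustration of delta-style screen updates: compare two tiny "frames"
# (2D grids of pixel values) tile by tile and ship only the tiles that changed.
# This is a teaching sketch, not how PCoIP is actually implemented.

def changed_tiles(prev: list[list[int]], curr: list[list[int]], tile: int = 2):
    """Return [( (row, col), new_tile_pixels ), ...] for tiles that differ."""
    updates = []
    for y in range(0, len(curr), tile):
        for x in range(0, len(curr[0]), tile):
            old = [row[x:x + tile] for row in prev[y:y + tile]]
            new = [row[x:x + tile] for row in curr[y:y + tile]]
            if old != new:
                updates.append(((y, x), new))
    return updates

frame_a = [[0] * 4 for _ in range(4)]     # a static 4x4 "screen"
frame_b = [row[:] for row in frame_a]
frame_b[0][0] = 255                       # a single pixel changes...
print(len(changed_tiles(frame_a, frame_b)))  # 1 -- only one tile is sent
```

On a mostly static editing UI, most tiles never change between frames, which is exactly why this style of protocol needs so much less bandwidth than streaming the full frame through one video codec.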
Once you load the PCoIP software, you can access it from any device that supports the Teradici PCoIP protocol. This can be as simple as a desktop software client or, what I prefer, a “dumb terminal” – a purpose-built zero client – which is by far the best experience possible.
One more thing before I actually show you this working: we need to get content up and down – from your local machine and storage up to the cloud, and back. This is another area where BeBop handles all of this for you with an easy-to-use interface. If you’re doing this all on your own, you need to follow the protocol and methodology that each CSP provides. This is normally accomplished via the almost 50-year-old FTP protocol or via a webpage drag and drop, and this can get very frustrating because it differs from CSP to CSP.
Now, we’ve covered a lot of technical ground so far, so let’s now tie all of this together with a live demo.
[Please watch the demo in the video!]
Here’s my local zero client screen, and the login screen. I’m connecting to the AWS US-West-1 Data Center, which is up in San Jose, about 350 miles from me. Now I’ll enter in my secure password. And in order to save time, I’ve already spun up a workstation in that data center.
I could have several running at once if needed, but for this demo we’ll do just one. What you’re looking at now is the desktop of the machine in the data center. At this point, you can use the computer as you normally would. Using BeBop I have their dashboard, which allows me to mount fast shared storage in the cloud that’s been assigned to me plus enable little things like “enable Mac keyboard”.
If you’re interested, here are the specs on the machine. It’s a base level 16 core, 2.3 GHz machine with 122GB RAM and a 16GB CUDA enabled GPU. Because it’s the cloud, if I need horsepower, it’s easy to add more. I’m going to launch Premiere Pro 2019 which will load any plugins I have installed because, again, at this point you’re just working in a Windows OS with your creative app. No additional tech hurdles to jump through. Next, we’ll open up an unrendered project with various media, and we’ll see how this plays.
Everything you’re seeing right now is happening in the cloud. That includes all storage, all media and this very Premiere Pro session, and then we screen record it locally to see what the quality is. Now, the first thing to notice is the green indicator below. If that light is green, that means no frames are being dropped on the machine in the cloud. This first clip is 4K UHD Blackmagic RAW. This is a 4K UHD Blackmagic RAW with a LUT applied. Now this, on the other hand, is a 4K UHD h.264. This is still a 4K UHD h.264, but now we have a real-time Lumetri Color effect applied. This is still a 4K UHD h.264 with the real-time Lumetri Color effect applied plus a title. And lastly, we have a four-way split with multicam, all with HD h.264s.
[End Live Demo]
Pretty cool, huh? No dropped frames by the machine in the cloud, and it looks smooth as butter here on my local machine. Now, if you didn’t see full frame rate on your end, we’ll have to chalk that up to the webinar conference service. Again, please check out this webinar as VOD later in the week, along with the full local screen recording from my computer.
Now as promised, we’re going to talk about the pros and cons.
There are a ton of good reasons to use the cloud. First, you can use as much or as little horsepower as you need at any given time. Do you have a gig coming up for only a month? Awesome. Use it as you need it and then don’t pay for it when not in use. This makes your OpEx (or “operating expenditure”) much more in line with your incoming work.
Buying brand new gear outright is CapEx (or “capital expenditure”) and may be a hard cost to swallow when you’re part of the gig economy. Also, when you’re working in the cloud, you have the ability to alter the equipment you get to work with: faster machines and storage when you need them. You also get to work virtually wherever you want and collaborate with other folks in real time, as opposed to having to export cuts and push and pull those files back and forth.
Now, let’s be honest, the cloud may not be an instant fit for you. If you’re a single freelancer who is using their own gear, then getting your gig employer to move everything to the cloud just for you, that’s going to be kind of tough. If you’re using a Mac-only application, that’s probably going to be a deal breaker too. Almost every data center out there is built for Windows and Linux deployments – not macOS. The handful of Mac data centers out there don’t support robust protocols like PCoIP and are often cost prohibitive.
So what are your costs looking like?
Well, that gets a little complicated so I’ll give you some ballpark numbers. Let’s say you’re a full-time video editor, which is only 40 hours a week, right everyone? Or a VFX artist. You can expect to pay $1,000 to $2,000 a month ballpark. You may end up paying $2,000 to $4,000 a month for a handful of specialty applications.
Now, as I mentioned, this would include full-time computer usage in the cloud, plus a TB or so of edit-worthy storage in the cloud. This number can fluctuate depending on what CSP you use and what capabilities you need, but it’s a good starting point. The cloud approach isn’t right for everyone just yet. We get that, and that’s why there’s a hybrid approach: remote editing. Remote editing, like cloud editing, uses the same underlying technology and protocol – Teradici.
Another good benefit about remote editing is that it’s built on your existing system. As mentioned, we’re just extending this system for remote usage. First, we need to take into consideration some of the points we’ve already outlined with cloud editing, that is, how fast your internet connection is and what the latency is from where you’ll be creating from to the location where your host system is. Now the same tools apply.
Next, many folks overlook the non-creative components involved in remote editing. Things like, well, I don’t know, security and networking and VLANs! Any networking person will tell you that leaving your storage with sensitive client content directly accessible from the internet is a big no-no. This is why you’ll need a fully featured firewall with Fort Knox-level security. Consumer firewalls are just not robust or secure enough for this. They may functionally work, but they’re not difficult for a hacker to penetrate. You’ll probably also want to set up a VPN, or virtual private network, from your editing system to where you’ll be working remotely. Often this is something you’ll need to license on your firewall. This is to ensure that you, and only you, can access the system in a secure and efficient manner. It’s also highly recommended that you set up a separate network for remote access to your edit machine to have an air gap between your editing machine and the rest of the network.
Also, on this network, background processes like file transfers (Aspera or Signiant), Pandora or Spotify, or even Netflix can cause additional internet traffic on that special network. This additional traffic can cause the appearance of dropped frames and can seriously degrade the user experience. A good rule of thumb for you IT professionals is to set up a separate network for remote editing, much like you would for a VoIP (or “Voice over IP”) system.
I also can’t stress enough that you should set up a QoS (Quality of Service) rule for the accessing protocol, such as Teradici’s PCoIP. It’s pretty much a necessity.
Now that we have the framework for remote editing, we need to put in the guts of the system. This is where Teradici has you covered with both software and hardware.
As I mentioned earlier, Teradici Cloud Access Software enables you to migrate workloads and all of your applications to a public cloud or – in this remote editing case – a private cloud, so you can securely access them from any PCoIP endpoint. So essentially editors can access their go-to editorial or VFX applications from anywhere, and these remote workstations can be configured for multi-monitor HD or UHD/4K displays. The PCoIP protocol effectively creates a secure connection just as you would have in the cloud, but instead with an on-premises editing system.
If you want to get even more of a robust system, you can also use Teradici’s workstation cards to put into your Windows or Linux based PC.
Now you’ve probably noticed that I haven’t spoken much about Apple. I know, it pains me too, but because of Apple’s, shall we say, gated approach, there isn’t a native hardware or software solution for Teradici on Mac (as a server). However, there is a third party that can do this for you. A company called Amulet Hotkey – a partner of Teradici’s – takes a Teradici hardware card and places it into an external enclosure, so you get the goodness of macOS and the quality of a Teradici connection.
So why is this remote editing approach good?
Well, first, your existing infrastructure forms the building blocks for a successful deployment. Because you’re simply extending your desktop and network, you can pick up where you left off without uploading and downloading to the cloud. In addition, this methodology can be used to extend your system internally. Think of the old post production facility paradigm with all of the machines in the core or server room: you can use Teradici as a KVM to extend your desktops from the server room to the edit bays.
There are some gotchas, however.
Getting content to and from your workstation when you’re outside of the building may be difficult. There’s nothing built into the PCoIP protocol for this. So you’ll need a third party tool to transfer content back and forth. Your VPN may allow for this, but often an external tool is needed. As to cost, the Teradici software is very inexpensive, usually only a few hundred dollars per year. The PCIe cards that go into your desktops are usually under a thousand dollars apiece. And for you Apple users, external enclosures are only a few thousand dollars plus the physical PCIe card and whatever your hardware or software client is.
You’re probably asking yourself, “Okay, Michael, so all of this looks great, but why Teradici?” Well, there’s a reason BeBop uses Teradici and why media entertainment mainstays like Avid standardize on Teradici. For that, I’m going to turn it over to Paul from Teradici. Okay, Paul, it’s your turn.
Thanks Michael. We’ve been in this business for a while now, a little more than 15 years, and we’ve got around 13 million desktops where we’re trusted to run these high value, and by that I mean computationally and graphically intense, workloads. And I think the answer lies in our unswerving focus on delivering the workload in such an uncompromising way that an editor really cannot distinguish between the remote or cloud experience and their physical editing bay.
Start with the workload delivery itself. Because we render it on the VDI host – Windows and Linux are supported, by the way – there are no concerns about application compatibility. The application simply runs as it is expected to. Second, the protocol itself is distinguished from other protocols by supporting multiple codecs to ensure that we achieve an optimal, pixel-perfect, build-to-lossless experience. Third among these PCoIP innovations: because network conditions are prone to change and can be the least predictable element in the architecture, our protocol can dynamically adapt to ensure optimal user experience.
Fourth, and as you’ve mentioned earlier, is our focus on security, a really critical element for the M&E workforce. We’re only transmitting encrypted pixels between the data center and the endpoint, meaning valuable assets and IP are secured in the cloud. We support dark site implementations, which add an air-gap dimension to our security. And, of course, pairing all of this with a zero client – a zero-risk, zero-attack-surface endpoint replacement – adds up to a very secure environment.
And finally, our solutions are extremely flexible and open supporting on-premises, cloud and hybrid deployment scenarios. We’re largely agnostic to the hypervisor and the hardware platform. We support many platforms today with new support being added all the time. All these innovations are being delivered in the cloud access software platform, which delivers an incredibly secure, highly performant and easily deployed and managed solution, which supports public and private cloud implementations.
This is a solution which anticipates the need for broad flexibility in the industries we serve and we deliver. Cloud access software not only delivers the desktop workloads from the broadest variety of hosting platforms to an equally broad selection of end point devices, but we also provide for brokering those desktops with connection management and secure gateway capabilities all built in.
As with many of the industries that trust Teradici to deliver their workloads, we’ve developed and continue to refine and optimize key media and entertainment workload architectures like editorial workflows, visual effects workflows, on-air playout automation, and already mentioned our focus on security.
I’ll touch on each of them briefly.
In editorial workflows like with Avid Media Composer or Adobe Premiere, we see the need for dual screens in the editorial and review workstations with high definition or 2K configurations.
The applications are working with down sampled files. And because we’re talking about an uncompromising group of users, a highly responsive experience with audio and video syncing is a must.
In the on-air playout scenario, you have the same highly performant and responsive studio pods with easily deployed secure zero clients that reduce the heat and noise of traditional workstations and provide for round-the-clock reliability.
Like editorial, VFX workflows can be delivered via either the remote workstation or cloud and feature the same no compromise experience for the creative. We add in Wacom tablet support with local end point termination and support content security policy.
The net of all this is that for media and entertainment businesses from small to large, working with Teradici cloud access software allows them to be even more responsive in addressing faster turnaround times for content, have greater access to out of region talent while enjoying a secure no compromise user experience that’s indistinguishable from the editing bay with a high fidelity interactivity, color accuracy and quality.
Thank you so much, Paul.
Being restrained by the four walls of your facility is no longer the only way to create. Forcing hybrid proxy workflows and shipping hard drives around the world is now the old way of doing things. Editing remotely with or without the cloud is now totally possible. Now’s the time in the webinar where we open it up for question and answer. So let’s get started.
Live Question and Answer
The first question we have is, “is there a possibility for project and file collaboration with other users or is this for just a single project? If a project is in the system, can a user export it out?”
I love this question because one of the reasons I started working at BeBop was because of the collaborative features of BeBop. If you’re using BeBop and you’re editing in the cloud, every one of your team members is assigned the fast storage that I talked about earlier, and that fast storage is mounted on all the team workstations with the same drive letter.
It’s essentially a NAS, so that means if you have five users, they’re all mounting the same drive letter, and they all have access to the same media and folders (if you’ve given them permission), so everyone can work on the same projects. If you’re using Avid Media Composer, for example, and we’re using Azure, Azure has the Avid NEXIS file system, which means you can do your bin locking – your green unlock, your red lock – and you can share that between different users.

If you’re using Adobe, even better! Adobe obviously has Team Projects for remote editing, and then they have shared projects for when people are on the same SAN or NAS. Yes, we support that as well. We currently have several clients right now that are using shared projects from Adobe to work on the same cut at the same time with multiple users. So, yes, that can certainly be done.

And, yes, a user can export it out. Because we’re giving you access at the OS level, you can export from Premiere or Avid or from Adobe Media Encoder or Blender or any of the apps that you’re using. We do have security functions, though. So if you are being employed by someone who has BeBop, and the person who is administering BeBop does not give you download access or internet access, that means you can export to that local machine in the cloud, but you won’t be able to download it. And, again, that’s a security feature that can be toggled on and off on a per-user basis. I hope that answers your question.
We now have a Teradici question. “Is the Teradici software optionally available as a one-time purchase per machine, as an alternative to a per-year or recurring subscription cost?” Paul, do you want to tackle that one?
Yeah, sure. Thanks Michael. The idea here is it’s really treated like the high watermark. We license the software on a concurrent-user basis, and, as somebody else has noted, we do have a minimum order quantity of five seats of Cloud Access Software, whether it’s the Standard or the Plus. And just a quick distinction on the two: Standard is the standard graphics agent, and Plus is the agent that expects to find a GPU.
Excellent. Since we have you, Paul, there was another question: “…is there the ability to purchase licenses in smaller than five packs?”
At this stage, there isn’t. However, we are open for business, and we’re very interested in understanding the business requirements of our user constituency. So I would say contact us, and let’s talk about your business case. Obviously, a minimum order quantity is a very common thing in the software industry; it simply reflects the real cost of processing orders and the other considerations in this sort of business. But, again, my answer is that we are open for business. We want to enable businesses, so come and talk to us.
Thank you, Paul!
Another question, “how many screens are supported?”
Most editors use dual monitors at a minimum and even prefer a third for video out. Great question. In case it wasn’t evident in the video, there are two PCoIP protocols. There’s the rock-solid, been-around-for-several-years PCoIP protocol, and then there’s the newer version, PCoIP Ultra, which supports even more. Currently with BeBop, we’re using the original PCoIP. That supports four HD GUI monitors – so, computer monitors at 1920 by 1080 or 1920 by 1200, up to four of them. If you’re dealing with 4K/UHD, then we do two UHD monitors.
Obviously, that is going to impact your bandwidth, so you’re going to need 20 to 50 Mbps (megabits) per HD screen and, of course, multiply that out by how many screens you have and then temper that with how much motion is on each screen because – remember – the bandwidth that’s being taken up is dynamic depending on what’s happening on the screen.
You also asked about video out. Yes, we are completely aware that a lot of editors like to have a broadcast monitor or a color or confidence monitor. Right now, the PCoIP protocol does not support that. It doesn’t handle those frame rates and at that size exceptionally well. Rest assured the BeBop engineering team is looking into it, and we’ve got several potential solutions, and as soon as we have something that we feel is going to give you a broadcast-like experience, we’ll definitely be doing that. But for now, you’re going to be doing the computer GUI screen, and if you want to go full screen on that monitor you can certainly do that, but I would expect a dropped frame here or there.
Next Question: Is this available in other countries like Brazil?
That depends on what you mean by “is this available,” because there are two different extended editing paradigms. There’s the cloud editing paradigm, and if you’re in the cloud, then we need to have a data center within 70 milliseconds of latency from where you’re going to be cutting. Some editors are less sensitive to latency, and we can bump that up, but you want to be within 70 milliseconds, and traditionally that’s several hundred miles. So you could probably push that to 400 or 500 miles, maybe even more, and be okay. But again, we’ll want to run a ping test from where you are to the closest data center, and then we can determine viability from there. If we’re talking about remote editing, well, that’s more dependent on where your office is and then where you’re editing. Most likely you’re not driving hundreds of miles to work every day, so I’d imagine that remote editing workflow would be fine in Brazil.
“How long does it take to set up a remote editorial station?”
That’s a good question. It doesn’t take long at all. What BeBop does, if we’re talking about cloud editing, is create a disk image, and that disk image has all the apps that you and your team have agreed to. All you have to do is submit a ticket through our help system, which gets our engineers on it, and we start a new instance based on that disk image. That gets you up and running very quickly.
“Can Avid DNxIQ or Blackmagic Extreme be used on the client’s system for playout?”
Unfortunately, no, not right now, Richard. As I mentioned, full screen baseband video output is not something that BeBop has in a shipping format. However, we are looking at it, and we’re looking at potential partners as well.
Next question: “Does the Avid solution support growing file formats?”
That’s a little bit of a tricky question because the storage has to support it, and whatever is capturing the live ingest has to support that. Now, BeBop has that working. We have that working in our lab from live ingest, but how we’re depositing that on the Avid NEXIS storage and then giving that access to Avid is a bit more of an engineering question, so we can certainly discuss that offline.
“What is a typical setup for BeBop and/or Teradici between an editor working on a VM desktop and remote producer who needs to watch and give feedback to the cut and edit as it progresses?”
This is another cool feature we have. If you check out the BeBop Technology website, we have something called OTS or Over the Shoulder. I’m sure all of you have dealt with producers or directors that want to come in and sit behind you while you edit, and they want to grab your mouse, and they want to tell you how to do your job. I’m sure none of you on the call have ever experienced this!
In any event, with our OTS solution, if you’re editing on BeBop in the cloud, you can invite someone who is also on BeBop to see your screen, complete with bi-directional audio conferencing. It allows them to make comments on what you’re doing, but they can’t take control. It’s kind of like a driver’s ed instructor, right? They can sit next to you, but they can’t do anything. And that’s included on our platform. And if you have a producer who isn’t editing and only wants to look over your shoulder, we offer that at a discounted rate as well, which is pretty interesting.
“Hi, can you talk about color accuracy? What is the color depth delivered by the codec used by Teradici? Would a color artist be able to work and finish remotely on a project?”
That’s, again, a loaded question, but let me dive into it. The Teradici PCoIP protocol right now supports 8-bit color replication. As we get into PCoIP Ultra, that gets more flexible, but the PCoIP protocol right now is at 8-bit. So that means if you’re doing a grade, you’re probably going to see some banding in gradients here or there. So we don’t find most people doing high-end grading on BeBop, but it’s not just because of that. It’s because normally when you’re grading, you’re using high-res material. High-res material, as you know, takes up a lot of space, and a lot of space on fast cloud storage that can deliver hundreds of megabytes a second starts to get cost-prohibitive at a certain point. So can it be done? Yes. Is it the best solution? No. And that’s why, for the time being, we’re looking at hybrid workflows where cloud editing is used for VFX, creative editorial assembly, string-outs, and producer or preditor type workflows, while audio finishing and high-end color work are traditionally done on-prem.
“Can the licensing live on an on-prem server for dispensing across geographical locations, i.e., a floating license server?”
Yes. That’s a very, very common question. A lot of facilities will have font servers or plugin license servers. As you know, all plugins are different: some do a challenge and response, some have a license key, and some have a named IP schema. So what BeBop does is set up license servers on the virtual LAN in the cloud, and those license servers can dispense Avid licenses, plugin licenses, and fonts. So that’s all set up. And if you already have an on-prem license server, we can tie into that as well, so you can have the best of both worlds.
“Can we integrate with workflows that utilize remote color grading with Blackmagic Resolve for 824 or the Mill?”
Great question, Tim. We haven’t done any testing with that, but given the fact that we’re still working on the baseband video out, I’m willing to bet that that’s going to be an engineering discussion to see if that’s something that would benefit a lot of people.
“Does this solution support remote edit sessions between the editor who may be on remote location and a producer who may be in another? Is it possible to have remote people connected to the same machine for purposes such as this?”
As I mentioned earlier, yes, we can certainly do OTS to share screens so everyone can see the editor. Right now, it’s a one-to-one ratio, so one editor can allow one additional person to connect and see their screen.
“Also, any pros and cons between BeBop VMs and a physical remote workstation or GPU limitations?”
Well, they’re two completely different creatures. One is in the cloud and virtualized for everyone, and one simply extends what you have on-prem, so it’s difficult to compare them directly. If you go back to the VOD of this presentation, you’ll see the pros and cons of each solution, and that’s probably the best way of analyzing them against one another.
“Why choose Tera2 hardware instead of Teradici software only?”
Good question. If you’re working in your machine room or server room, perhaps it’s easier to have one centralized location in the rack for you to extend things out from. Perhaps you want to have different users utilize that station. Also, anytime you can offload processing from the CPU to dedicated hardware, you’re going to get that much of a guaranteed performance boost. As an example, that’s why for video out in your editing machine, quite often we’re using third-party I/O cards. That’s also why quite often you’ll have third-party transcoders that run on their own hardware: you’re always going to get better performance from purpose-built hardware than from software.
“Any solution for remote voice over workflow?”
That’s a good question. If I’m not mistaken, if we’re using the Teradici PCoIP protocol, that supports microphone input. So, in theory, inside Adobe or inside Avid, you should be able to do a punch in and record. To be honest with you, I haven’t tried it, but I believe that’s supported, and hopefully someone can jump in and correct me again if I’m wrong.
“What is performance like over WiFi?” says Johnson Lee.
As you may have seen in the presentation, we want to cut down latency as much as possible. 70ms from you to the data center is reasonable. But when you start adding in consumer WiFi, or shall we say Starbucks WiFi, you start adding more latency and more chatter on the line; there’s more traffic going on. So will it work? Yes. I shouldn’t be saying this, but occasionally I’ve demoed over WiFi. It will work, but expect increased latency, sometimes on the order of 10 to 30 milliseconds, and I would expect dropped frames to appear more frequently. Now, the system in the cloud won’t drop frames, but it may appear locally as a dropped frame because of the traffic on the line.
For all of you tech folks in here, that’s because the Teradici protocol sends data down a UDP port. And as you’re aware, with UDP there’s no error correction, so the more chatter there is on the line and the more processes going on, the more degradation you’ll see in a viewable, constant data stream.
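To make that concrete, here is the UDP primitive itself: datagrams with no built-in retransmission or ordering. This is a toy localhost exchange for illustration only; on a real lossy link, a lost datagram is simply gone unless the application layer (as PCoIP does) recovers it.

```python
import socket

# UDP gives us bare datagrams: no retransmission, no ordering guarantee.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))            # OS assigns a free port
receiver.settimeout(2)

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"frame-0001", receiver.getsockname())

# This arrives reliably only because localhost is effectively lossless;
# over congested WiFi, the same sendto() could silently drop.
data, _ = receiver.recvfrom(1024)
print(data)

sender.close()
receiver.close()
```

The trade-off is deliberate: skipping retransmission keeps latency low, which matters more for an interactive desktop stream than perfect delivery of every packet.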
“Are there network, product description, and costing diagrams available for each: remote and cloud, hardware versus software?”
No, and there’s a reason for that. There are some diagrams of cloud editing and remote editing, but pricing changes fairly regularly, and that’s because the cloud editing CSPs, the cloud service providers, their pricing changes, so that gets a little bit more difficult to plan out the exact pricing. I recommend using the broad strokes that I’ve outlined here, and then when it comes time for you to look into this for your workflow or your facility, then you hit us up and we can give you the “Market Pricing”, and we’ll find out exactly what the pricing is for that day and time.
“I have a lot of plug-ins. How do you handle those?”
Good question. As I mentioned, because we’re giving you access to the creative apps at the OS level, we don’t abstract that process. So as long as you tell us what plug-ins you want and what versions you want, our team installs and then uses your license to authenticate them. Right now, we run just about every plug-in out there because, again, we’re not running anything to prevent those from working. Our team installs them and they just work.
“Does PCoIP work using high-latency satellite network connections? What’s the maximum latency?”
At last check, satellite was introducing around a hundred milliseconds of latency. Obviously, that changes. Heck, my wife goes hiking, and she texts me on a satellite phone, and I don’t get it for hours. So I can’t swear to you what the latency is, but when you start getting to a hundred milliseconds, that’s going to be, shall we say, an unpleasant editing experience. So if what you’re doing is more producer- or preditor-based string-outs or similar, that may be feasible, but unless we test it, my knee-jerk reaction is that may be a little high for creative editorial.
“VMware also has the Horizon Client for remote access to virtual machines. What are the key differences between this and Teradici?”
Well, that’s obviously a question we get a lot about how we differentiate ourselves. I’m going to give a quick overview of the history between VMware and Teradici. Teradici licensed our PCoIP protocol to VMware back in 2009, when they didn’t have their own performant in-house protocol. We’ve continued to develop ours, and they’ve started to produce their own stack; they want to be more self-reliant. But a lot of that tends to be based on an H.264/H.265 type of approach, whereas our protocol and our entire solution stack is oriented around the broader, multi-codec type of approach to delivery. And you covered it beautifully, Michael, when you talked about the way the workload is delivered. It’s not just the moving, changing pixels on the screen; it’s also the high fidelity, the crisp lines in wireframes if you’re a Maya user or something like that, the color matching, and all of those other things.
There are a number of industry articles created by third parties that I will dig up and send along, just to dig into it. And, obviously, there are all of the VMware… not the VMware, but the VDI Smackdowns that Ruben Spruijt used to do back in the day. There are a lot of very intelligent people out there thinking about how to compare all of these. Rachel Berry did a great piece a little while ago, so I will send those along. Obviously, this is a very long-winded answer, but hopefully that took a swing at it.
Andy asks, “how does Teradici handle things like Dolby 5.1 sound and the need to do SDI output for color grading?”
Well, we already addressed the last part, SDI output, earlier. So, Dolby 5.1: great question. Right now, the originating PCoIP protocol supports a stereo downmix. So let’s say you’re working in Adobe or in Avid Media Composer on BeBop in the cloud and you’re playing back 5.1; it will be mixed down to a stereo signal, which comes down the PCoIP connection.
But, moreover, if you’re doing 5.1, that tells me there is a little bit of post audio going on, and right now audio in the cloud is, shall we say, basic at best. If you’re doing frame-based audio cuts as you would in most NLEs, that’s fine. When you start getting down to the sample level, as you know, samples are a lot smaller than frames. So we find that for people trying to edit multi-track audio, doing 5.1 mixes, and adjusting things at the sample level, you just don’t get that kind of response in the cloud right now. That’s one reason why the 800-pound gorilla, Avid, doesn’t have Pro Tools operating in the cloud. That’s why running Audacity or REAPER or other audio tools in the cloud isn’t commonplace or talked about a lot: latency at the sample level is just very difficult.
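The back-of-the-envelope arithmetic makes the gap clear: at 24 fps a video frame spans about 42 ms, comfortably inside the ~70 ms latency budget, while a 48 kHz audio sample is 2,000 times shorter. (These are standard rates used for illustration, not figures from the webinar.)

```python
# Why sample-accurate audio tolerates far less latency than frame-based picture.
frame_ms = 1000 / 24           # one film frame at 24 fps: ~41.67 ms
sample_ms = 1000 / 48000       # one audio sample at 48 kHz: ~0.0208 ms
ratio = frame_ms / sample_ms   # frames per sample-duration: 2000x

print(f"frame: {frame_ms:.2f} ms, sample: {sample_ms:.4f} ms, ratio: {ratio:.0f}x")
```

In other words, a round trip that lands within a frame of where you clicked feels fine for picture, but the same round trip spans thousands of samples, which is why sample-level mixing over a remote link feels sluggish.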
Here’s another good question. “Do you support Avid NEXIS storage or do you only do SAN storage?”
Let me be really pedantic about this for all you three-letter-acronym fans. If you choose to use Avid – which is fantastic – Avid right now will only run in Microsoft Azure, so you’re not running that in AWS or Google; it’s running only on Azure. Avid has been able to virtualize the NEXIS file system, so if you’re running Avid Media Composer on Azure through BeBop, you completely get the NEXIS file system.
Any other apps that you run on the same machine would have access to the NEXIS file system as well. If we decide not to use Avid, and maybe we want to use Adobe on Azure, Google, or AWS, then we present you with NAS – not a SAN, but NAS. These are SMB mounts, and you’re going to get low-latency access, so you’re not editing off object storage. You’re editing off block-level storage, which is going to get you hundreds of megabytes a second (both read and write) to multiple stations, and you get the ability to manipulate that any way you want, provided your user has permissions.
“Will remote editing using Teradici work over 4G? Is the latency doable?”
I have seen people demo it over a 4G hotspot. However, you know we’re getting to a point of diminishing returns. Will it functionally work? Yeah, but is it going to give you the editing experience and responsiveness you need? That would worry me. So I’m all in favor of a science project and giving it a shot, but I think there has to be kind of a line in the sand of where performance and experience meet.
“When you say multi-codec, which codecs are you referring to, and for what purpose?”
Great question, Jeff. Let’s say you’re watching Netflix. Netflix is going to stream that to you in a frame methodology, right? They’re going to give you a frame of information. Often it’s compressed, so it’s in an H.264 codec or something along those lines, and it’s only the difference between each frame. That’s going to be one codec for every frame, so the data rate is always going to be around the same.
What Teradici does is look at the data that’s being sent and say, for example, hey, there’s more motion in your composer window or there’s more motion in your program monitor. We’re going to encode that with a different codec. But your bin window, or just the bin part of your screen, doesn’t have a lot going on: that’s static text, not a lot of colors, so we’ll use a different codec to encode that. The benefit is that you can cut the data rate down even more, so you don’t need massive amounts of throughput. That’s another reason why, when people say you can’t edit 4K, how can you stream 4K in real time? Well, remember, we’re only encoding what pixels are on the screen, so we don’t have to deliver the entire payload. It’s just the pixels that are changing at that point in time.
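The idea above can be sketched in a few lines. This is a conceptual illustration only, not Teradici’s actual algorithm: the tile size, thresholds, and codec names here are invented for the example. The principle is simply that per-tile change rate decides how each region of the screen gets encoded.

```python
# Conceptual sketch -- not Teradici's real encoder. Tiles that changed a lot
# (program monitor during playback) get a motion codec; mostly-static tiles
# (bin text, UI chrome) get a crisp lossless path; unchanged tiles send nothing.

def pick_codec(changed_ratio):
    if changed_ratio == 0.0:
        return "skip"        # tile unchanged: transmit nothing at all
    if changed_ratio < 0.1:
        return "lossless"    # mostly static text/UI: keep edges crisp
    return "video"           # heavy motion: video-style codec

def encode_frame(prev, curr, tile=8):
    """prev/curr: 2D lists of pixel values; returns a codec pick per tile."""
    picks = {}
    for y in range(0, len(curr), tile):
        for x in range(0, len(curr[0]), tile):
            total = changed = 0
            for dy in range(tile):
                for dx in range(tile):
                    if y + dy < len(curr) and x + dx < len(curr[0]):
                        total += 1
                        if prev[y + dy][x + dx] != curr[y + dy][x + dx]:
                            changed += 1
            picks[(y, x)] = pick_codec(changed / total)
    return picks
```

During still moments the dictionary fills with "skip" entries and the stream costs almost nothing, which is exactly why the bandwidth needed bears little relation to the nominal screen resolution.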
Raymond asks, “Can a printer be connected to a local thin client to handle a print queue remotely from a remote PCoIP session?”
Huh. Interesting. What BeBop does is we put a proxy server in the cloud that filters all traffic coming off that virtual machine. This is for security and your safety obviously. I can foresee an option where we whitelist an IP and then that IP is routed directly to your print server and you print. I’ve never tried it, but I think in theory it may work. Then again, it may be just as easy for you to send yourself an email or download a PDF and print it locally, but that’s an interesting use case.
“Does Teradici offer a demo program?”
Before we get to that part, I can tell you that BeBop is working on a test drive where you can kick the tires for 48 hours. We hope to roll that out soon. But back to the original point, Teradici. Paul, is there a demo program or an evaluation or a try and buy kind of thing?
Yeah, there absolutely is. Obviously, seeing is believing. One of the things that we would encourage someone who wants to evaluate cloud access software to do is to contact us. We can connect them with the appropriate sales team that can make the arrangements.
“Are you reselling Adobe licenses, or do we need to bring our own?”
This is a great question. For the most part, BeBop is an advocate of BYOL, bring your own license. So if you’re already paying for your subscription per month to Adobe, great. You subscribe to BeBop, you get on the platform, you enter in your credentials, you’re good to go. If you’re using Maya, if you’re using 3ds Max, we have the Autodesk desktop app. If you’re using Resolve, you can plug in your credentials there. That works fine. So there’s no additional cost. What we get into are the companies that have decided that if you want to use their software in the cloud, then that requires another license, and some companies are doing that.
Avid, for one, will require a newer license, and that’s something we can obviously discuss. And there are a handful of other companies that view the cloud as a case where the user should incur an additional cost. The best thing to do is reach out to BeBop. We list on our website which applications work natively, and we can answer questions about apps that we may not have tested yet.
“How does RGS on HP machines differ from Teradici?”
Oh, it looks like there was some discussion on the RGS application, which HP bundles in with their machines, and, again, that’s like comparing to VMware. They’re both different creatures, so we can certainly take that offline because that’s going to be another webinar just on that.