
2-3 Minute Read
Remote working has become much more prevalent during the pandemic period. But one of the challenges of working with high quality video is that the files are large, which makes them difficult and inconvenient to handle remotely when they have to be transferred from place to place.
In many ways the industry is still finding its way: there are a number of legitimate approaches to remote working, each with its own benefits and downsides. As yet there is no real standardisation, and no single direction that is best for everybody.
At GB Labs we have developed technology to achieve something that is effective, convenient and simple to use - because we understand that creative people don’t like to be told ‘this is the way we work here’ or ‘you need to change the way that you work.’
Remote working should be made as simple as possible, so that it is essentially the same as working in the office. Whether you are at home or on location anywhere in the world, sitting down to work should be no more complicated than it is at your desk in the office.
Our approach is that you still use central storage no matter where in the world you are located - and we endeavour to make that as painless as possible in the background. We do this with one true filesystem, accelerated at the end points, which presents the same file paths everywhere in the world while maintaining the kind of speed that you need.
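To make the idea concrete, here is a minimal sketch of what "same path everywhere, accelerated locally" can look like. This is not GB Labs' actual implementation; the mount points (`/mnt/central`, `/mnt/edge-cache`) and the `open_media` function are invented for illustration. The point is that the user always refers to one logical path, and the cache-or-fetch decision happens invisibly underneath.

```python
import shutil
from pathlib import Path

# Hypothetical locations: the central store and a local solid-state cache.
CENTRAL_STORE = Path("/mnt/central")   # assumed mount of the central storage
LOCAL_CACHE = Path("/mnt/edge-cache")  # assumed local SSD cache at this site

def open_media(relative_path: str):
    """Open a file by the same logical path used at every site.

    The caller always refers to e.g. 'projects/show01/clip001.mov';
    whether the bytes come from the local cache or the central store
    is decided here, invisibly to the user.
    """
    cached = LOCAL_CACHE / relative_path
    if not cached.exists():
        # Cache miss: pull the file from central storage once,
        # then serve all further reads locally at SSD speed.
        cached.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(CENTRAL_STORE / relative_path, cached)
    return cached.open("rb")
```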
Content needs to be shared. Media must be secure and it must deliver suitable access speed when it is needed. Most of all, it needs to be automatic and seamless: editors want to get on with cutting the show, and managers want to deliver to their clients on time and on budget.
What is needed is a way to make this geographically distributed, multi-format set of storage sub-systems appear as a single, common pool with a single sign-in. Users should have access to the material they need without being overwhelmed with too much unnecessary information. All files should be maintained in consistent synchronisation, and there should be no doubt about the master file, so no work is wasted on outdated versions.
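One way to guarantee there is "no doubt about the master file" is to check every local copy against an authoritative index before it is opened. The sketch below is illustrative only, assuming a hypothetical `master_manifest` of content digests; any real system would keep such an index itself rather than expose it like this.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Content hash used to compare a local copy against the master."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_current(local: Path, master_manifest: dict[str, str], key: str) -> bool:
    """True if the local copy matches the digest recorded for the master.

    'master_manifest' stands in for whatever authoritative index the
    storage layer keeps; an editor's tools would refuse to open, or
    transparently refresh, any file for which this returns False,
    so no work is ever done on an outdated version.
    """
    return local.exists() and file_digest(local) == master_manifest.get(key)
```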
Connectivity, data management and access controls must be addressed. The layer of automation and communication that sits below the user data must use intelligence and business rules to manage this, so it appears seamless.
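Business rules of this kind are easiest to picture as data an administrator defines once and the automation layer then enforces. The structure below is a hypothetical sketch, not a real configuration format; the project and site names are invented.

```python
from dataclasses import dataclass

@dataclass
class SyncRule:
    """One business rule: which content goes where, and by when."""
    project: str          # project the rule applies to
    destination: str      # remote site or edge cache to populate
    deadline_hours: int   # content must be in place this long before work starts

# Illustrative rules an administrator might define; all names are invented.
RULES = [
    SyncRule(project="show01", destination="edit-suite-london", deadline_hours=12),
    SyncRule(project="show01", destination="colourist-nyc", deadline_hours=6),
]
```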
Management and analytics need to be virtualised too, so that the IT manager can log on from a web browser anywhere in the world and see the state of the storage as a whole, gain metrics, troubleshoot and manage the content flows.
Those flows are dependent on acceleration. That means not only using the latest data communication techniques to access media from remote locations as quickly as possible, but also managing workflows by ensuring that the right content is pre-loaded into a remote store in time for the work to start. Hot caches of solid-state storage provide that rapid access.
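As a rough illustration of that pre-loading step (again a sketch under assumed mount points, not a description of the actual product), a cache-warming job might copy a project's files into the local solid-state cache once the lead-time window before work starts is reached:

```python
import shutil
from datetime import datetime, timedelta
from pathlib import Path

CENTRAL_STORE = Path("/mnt/central")   # assumed central storage mount
EDGE_CACHE = Path("/mnt/edge-cache")   # assumed local solid-state cache

def preload(project: str, work_starts: datetime, lead_time: timedelta):
    """Warm the edge cache so the project is local before work begins.

    In a real system this would be driven by the business rules and
    run continuously; here it simply copies anything not yet cached
    once the lead-time window is reached.
    """
    if datetime.now() < work_starts - lead_time:
        return  # too early: nothing to do yet
    for src in (CENTRAL_STORE / project).rglob("*"):
        if src.is_file():
            dst = EDGE_CACHE / src.relative_to(CENTRAL_STORE)
            if not dst.exists():
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)  # pull once; later reads hit local SSD
```

By the time the editor sits down, the material is already on fast local storage, which is what makes the remote session feel like an in-office one.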