Space takes on 400 students
“We prepare students for a career in television and film,” says Sean Thornton of Anglia Ruskin University. “As such, the technology we use has to be real-world and broadcast standard.”
The Media Services team provides equipment to undergraduate and postgraduate students and operates facilities that include TV and photographic studios, video editing rooms and media production suites. Content is created in a variety of formats using professional Panasonic and JVC camcorders, and DSLR systems. It is post-produced using network-connected Apple and Adobe software suites running on Mac hardware.
“Prior to upgrading our infrastructure with GB Labs’ Space, we had a Fibre Channel-based storage system,” continues Thornton. “It worked well and generally ran problem-free. However, it had some key shortcomings that meant it would no longer meet our needs. Firstly, it placed a cap on the number of user accounts that could access the storage; secondly, the amount of space allocated to individual users was restricted, something that became a real problem as we transitioned to multi-layer HD projects.”
The success of the film and television courses meant that an upgrade to the storage infrastructure was essential. With between 300 and 400 users and 35 Macs running Final Cut Pro and the Creative Suite, the university needed a powerful shared storage system reliable enough for very intensive use, especially at peak times in the run-up to assessment deadlines.
The university opted for GB Labs Space, a solution ideally suited to the students’ ever more demanding media projects. Technology is no longer a bar on their creativity. Thornton remarks that users are increasingly creating complex, graphics-rich programmes. With Space, they can access their personal files at any time, either by logging in with their university username or by using group access credentials supplied by the media department. Employing personal and group logins makes security easy to manage: individuals can protect their assets and identities, or share files with other users, very safely and very simply. Completed projects are transferred to parking areas on the drives to allow for group criticism, an essential part of the department’s activities.
However, security goes beyond asset privacy. The entire system is RAID-protected, and plans are in place to formalize routine system backups. Thornton outlines the vision:
“Data loss is unthinkable. So for the next phase of development, we will have the option to replicate the complete storage system automatically onto a GB Labs Echo nearline system. Alternatively, we are also investigating the new Bridge unit that would allow us to use the old Fibre array as a network attached storage silo. Whichever route we take will add a new dimension to protecting our assets.”
Both backup devices are loaded with software tools that automate data replication and archiving, with enhancements available for intelligent hierarchical storage management.
The Media Services team at Anglia Ruskin University has already upgraded its Space with EX expansion modules for additional performance and capacity. Plans are afoot to increase system RAM to accommodate growing student numbers and to support still higher stream counts. “As project deadlines approach, the students hit the network extremely hard. We always have an eye on managing storage capacities and maintaining system performance,” concludes Thornton.
By making the transition to Space, the university has the scope to grow its capacity, support more concurrent users, handle highly sophisticated video projects and secure every asset held on the network. With a sound foundation and a clear upgrade path, Anglia Ruskin’s storage network is a model worthy of many a professional facility.
Ushering in the tapeless era
“It all changed with the sudden disappearance of HDCAM SR from the shelves of tape suppliers,” says Gerry Wade, technical director at London’s Intervideo.
As programme production, storage and delivery moved rapidly from tape to file, encoding, conversion and QA facilities such as Intervideo had to evolve fast to serve the needs of their customers: broadcast, post production and production companies.
“The whole industry was set up to support tapes. At Intervideo we have almost every VTR, from DigiBeta and HDCAM to D3 and D5, with quality control, aspect ratio, standards conversion and processing equipment from the likes of Snell and Tektronix. But like the rest of the production community, we were not expecting the file to replace the tape quite so suddenly,” Wade continues.
The move to tapeless workflows demanded investment in a new infrastructure and in new non-linear tools. Initially, Final Cut edit suites were used to ingest and repurpose files, with media stored locally. Compared with linear systems, this workflow delivered little, if any, speed advantage, but it was an effective way of capturing video as files and reformatting it to a client’s instructions.
As it soon became clear that the short-term disappearance of HDCAM SR would be a turning point in the production and delivery process, and not merely a passing crisis, Wade and his team began to research how to exploit fully the potential of a tapeless world. The result of their systems review is a fast, flexible and powerful workflow that is capable of high speed ingest, processing, storage and delivery. Wade outlines the vision, and the reality:
“Our clients include producers such as National Geographic. As transnational organizations, they need their suppliers to work fast and flexibly and to support every global standard. While in the past this involved multiple racks of VTRs, powerful processing hardware and a reliable courier, the tapeless world required a wholly new approach. Now, at the heart of the operation is GB Labs Space. It’s a central network attached storage system that holds every active media asset. It’s OS agnostic, which is very important for us: each Mac user, PC workstation and Linux server on our network accesses files on Space. It is also powerful, able to serve multiple HD streams to users throughout the facility. Being compatible with third party software and systems is also vital for future development as we integrate asset management and quality control software into the network.”
The workflow is simple, yet sophisticated. Intervideo’s Signiant server receives files from National Geographic over a dedicated Gigabit fibre connection. Technicians then process the files with Telestream’s Episode software; quality control is undertaken, and the files are delivered to their final destination using high-speed file transfer software. Intervideo has invested in Space LTO, a multi-drive LTO tape system that resides on the network. It incorporates built-in tools for data management, including automated file transfer, media tracking and video archiving, which simplifies the archiving process.
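To make the automated file transfer and archiving step concrete, the sketch below shows a minimal watch-folder archiver in Python. It is purely illustrative: the directory names, the copy-verify-delete pattern and the SHA-256 checksum are our assumptions, not a description of GB Labs’ or Intervideo’s actual tooling.

```python
# Illustrative sketch only: copy files from an inbox to an archive,
# verify each copy with a checksum, and delete the original only after
# the copy has been confirmed. All names here are hypothetical.
import hashlib
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def archive_completed(inbox: Path, archive: Path) -> list[str]:
    """Copy every file in `inbox` to `archive`, verify the checksum,
    then remove the original. Returns the names of archived files."""
    archive.mkdir(parents=True, exist_ok=True)
    archived = []
    for src in sorted(p for p in inbox.iterdir() if p.is_file()):
        dst = archive / src.name
        shutil.copy2(src, dst)  # copy2 preserves timestamps
        if sha256_of(src) == sha256_of(dst):
            src.unlink()  # delete the source only after verification
            archived.append(src.name)
        else:
            dst.unlink()  # bad copy: keep the original, discard the copy
    return archived
```

A real system would add polling or filesystem notifications, tape-library control and a tracking database, but the verify-before-delete pattern is the essential safeguard when media files exist in only one place.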
“Integrating the LTO system was seamless. It gives us the potential to securely back up our clients’ files and to archive projects for quick retrieval should a customer require a copy.” However, as before, every process still relies on the expertise and experience of the operators to ensure the best possible results. Indeed, core to the organization is the quality control suite, where video is exposed to the stringent analysis of human eyes.
As well as investigating automated QC tools, Wade is watching the development of 4K with interest: “But frankly, it’s still too soon to say…” However, whenever the leap from HD to a higher resolution takes place, the facility’s network and central storage are already UltraHD-ready: with Space, Intervideo has clearly made a long-term investment.