2014/03/12: Intervideo upgrades to multi-OS storage network
2014/02/19: Snöball Film shares the knowledge with GB Labs central storage
2013/07/12: A new age at Anglia Ruskin University
2013/07/12: Intervideo goes from tape to file with GB Labs
2013/06/17: Scuola Nazionale di Cinema
2013/05/28: Central SSD Storage Delivers Olympic Success
Intervideo is delighted to announce the expansion of its GB Labs Space storage network. As a specialist encoding, conversion and content management facility, Intervideo needs to maintain its technical edge. The company has to handle every new codec and format that appears and deliver the best quality processing service at all times.
Until now, client data was managed by a central Mac-based SAN. Gerry Wade, the company’s technical director, remarks:
“Selecting best-of-breed apps that run on other platforms was always a challenge. We needed a solution which was OS agnostic and that could reliably manage very high IOPS (input/output operations per second).”
The move to a complete GB Labs Space storage network lifts this restriction. Linux, Mac and Windows workstations running any application can now connect to additional tier 1 Space and tier 2 Space Echo, installed early in 2014. This complements the original Space and central tape storage system, Space LTO, which was implemented in 2012 and 2013.
It was important to streamline operations and allow every system to work in harmony. With the storage network running on Ethernet and each client connected over IP, workflows are now far easier to manage. The result is more secure data and faster turnaround times. Plus, as Space is one of the most powerful and scalable network storage systems currently available, the storage will future-proof the facility as the industry transitions towards 4K.
“We now have seamless file exchange with our media delivery systems, our encoders and our Final Cut editing platforms,” concluded Wade.
Snöball Film opts for GB Labs Space storage
Snöball Film AS is a leader in educational video production. The company, founded in 2000, has recently witnessed strong growth and now has some 20 employees. In 2012 it produced more than 80 films and, to cope with the increased workload, took the decision to upgrade its archive and storage systems, both to improve its workflow and to ensure data security.
"Until this year we relied on external hard drives," says producer Marte. "But this is an expensive and labour-intensive way of working. It resulted in redundant double copies, complicated file management and poor security. We started a process to update our workflow, which would support Adobe Premiere Pro, and the full Creative Suite."
With assistance from local reseller Video 4, Snöball opted for a 40TB GB Labs Space storage and archive solution. Editors, graphic artists, directors and data loggers now have access to an intuitive and straightforward file system. External hard drives have been gradually phased out for greatly enhanced data management. New projects are ingested directly onto the server, with the team able to work on different projects from different workstations without having to worry about making local copies of the material.
"We are all very happy with the solution. Our data is now handled in a secure manner, enabling us to concentrate on what is, after all, the most important thing: to make good, informative educational films."
Snöball Film's storage network is based on the 24-port Brocade switch with 10GbE uplink ports. The storage itself is housed in a USYSTEMS silenced rack cabinet.
For more details: http://www.snoball.no/
Space takes on 400 students
“We prepare students for a career in television and film,” says Sean Thornton of Anglia Ruskin University. “As such, the technology we use has to be real-world and broadcast standard.”
The Media Services team provides equipment to undergraduate and postgraduate students and operates facilities that include TV and photographic studios, video editing rooms and media production suites. Content is created in a variety of formats using professional Panasonic and JVC camcorders and DSLR systems. It is post-produced using network-connected Apple and Adobe software suites, running on Mac hardware.
“Prior to upgrading our infrastructure with GB Labs’ Space, we had a fibre channel-based storage system,” continues Thornton. “It worked well and generally ran problem-free. However, it had some key shortcomings that meant that it would no longer meet our needs. Firstly, it placed a cap on the number of user accounts that could access the storage; secondly, the amount of space allocated to individual users was restricted, something that became a real problem as we transitioned to multi-layer HD projects.”
The success of the film and television courses meant that an upgrade to the storage infrastructure was essential. With between 300 and 400 users, and 35 Macs running Final Cut Pro and the Creative Suite, the university needed a powerful shared storage system that had the reliability for very intensive use, especially at peak times in the run-up to assessment deadlines.
The university opted for GB Labs Space, a solution ideally suited to the students’ ever more demanding media projects. Technology is no longer a barrier to their creativity. Thornton remarks that users are increasingly creating complex, graphic-rich programmes. With Space, they can access their personal files at any time, either by logging in via their university username or using group access credentials supplied by the media department. Employing personal and group logins means that security is easy to manage: individuals can protect their assets and identities or share files with other users very safely and very simply. Completed projects are transferred to parking areas on the drives to allow for group criticism, an essential part of the department’s activities.
However, security goes beyond asset privacy. The entire system is RAID-protected, while plans are in place to formalize routine system backups. Thornton outlines the vision:
“Data loss is unthinkable. So for the next phase of development, we will have the option to replicate the complete storage system automatically onto a GB Labs Echo nearline system. Alternatively, we are also investigating the new Bridge unit that would allow us to use the old Fibre array as a network attached storage silo. Whichever route we take will add a new dimension to protecting our assets.”
Both backup devices are loaded with software tools that automate data replication and archiving, with enhancements available for intelligent hierarchical storage management.
The Media Services team at Anglia Ruskin University has already upgraded its Space with EX expansion modules for additional performance and capacity. Plans are afoot to increase system RAM to accommodate growing student numbers and to support still higher stream counts. “As project deadlines approach, the students hit the network extremely hard. We always have an eye on managing storage capacities and maintaining system performance,” concludes Thornton.
By making the transition to Space, the university has the scope to grow its capacity, support more concurrent users, handle highly sophisticated video projects and secure every asset held on the network. With a sound foundation and a clear upgrade path, Anglia Ruskin’s storage network is a model worthy of many a professional facility.
Ushering in the tapeless era
“It all changed with the sudden disappearance of HDCAM SR from the shelves of tape suppliers,” says Gerry Wade, technical director at London’s Intervideo.
As the shift to programme production, storage and delivery moved rapidly from tape to file, encoding, conversion and QA facilities such as Intervideo had to evolve fast to service the needs of their customers: broadcast, post production and production companies.
“The whole industry was set up to support tapes. At Intervideo we have almost every VTR, from DigiBeta and HDCAM to D3 and D5, with quality control, aspect ratio, standards conversion and processing equipment from the likes of Snell and Tektronix. But like the rest of the production community, we were not expecting the file to replace the tape quite so suddenly,” Wade continues.
The move to tapeless workflows demanded investment in a new infrastructure and in new non-linear tools. Initially, Final Cut edit suites were used to ingest and re-purpose files, with files stored locally. Compared with linear systems, this workflow delivered little, if any, speed advantage but was an effective way of capturing video as files and re-formatting video to a client’s instructions.
As it soon became clear that the short-term disappearance of HDCAM SR would be a turning point in the production and delivery process, and not merely a passing crisis, Wade and his team began to research how to exploit fully the potential of a tapeless world. The result of their systems review is a fast, flexible and powerful workflow that is capable of high speed ingest, processing, storage and delivery. Wade outlines the vision, and the reality:
“Our clients include producers such as National Geographic. As transnational organizations, they need their suppliers to work fast and flexibly and to support every global standard. While in the past this involved multiple racks of VTRs, powerful processing hardware and a reliable courier, the tapeless world required a wholly new approach. Now, at the heart of the operation is GB Labs Space. It’s a central network attached storage system that holds every active media asset. It’s OS agnostic, which is very important for us: each Mac user, PC workstation and Linux server on our network accesses files on Space. It is also powerful, able to serve multiple HD streams to users throughout the facility. Being compatible with third party software and systems is also vital for future development as we integrate asset management and quality control software into the network.”
The workflow is simple, yet sophisticated. Intervideo’s Signiant server receives files from National Geographic over a dedicated Gigabit fibre connection. Technicians then process files with Telestream’s Episode software, quality control is undertaken and files are delivered to their final destination using high speed file transfer software. Intervideo has invested in Space LTO, a multi-drive tape library that resides on the network. It incorporates built-in tools for data management, including automated file transfer, media tracking and video archiving, which simplifies the archiving process.
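The verify-then-archive step at the end of such a workflow can be illustrated with a generic sketch. This is not GB Labs’ or Intervideo’s actual tooling; the function names and paths are hypothetical, and it shows only the general pattern of checksumming a deliverable before and after copying it towards the archive tier:

```python
# Illustrative sketch only, NOT the vendor's built-in archiving tools:
# hash each completed file, copy it to an archive mount point, and
# confirm the copy matches the source before treating it as archived.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def archive_file(src: Path, archive_dir: Path) -> Path:
    """Copy src into archive_dir and verify the copy by checksum."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / src.name
    shutil.copy2(src, dest)          # preserves timestamps as well
    if sha256_of(src) != sha256_of(dest):
        dest.unlink()                # discard the corrupt copy
        raise IOError(f"checksum mismatch archiving {src.name}")
    return dest
```

In a real facility this role is typically played by the storage system’s own automated transfer and media-tracking tools; the point of the sketch is simply that every archived copy is verified before the source is considered safe to retire.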
“Integrating the LTO system was seamless. It gives us the potential to securely back up our clients’ files and to archive projects for quick retrieval should a customer require a copy.” However, as before, every process still relies on the expertise and experience of the operators to ensure the best possible results. Indeed, core to the organization is the quality control suite, where video is exposed to the stringent analysis of human eyes.
As well as investigating automated QC tools, Wade is watching the development of 4K with interest: “But frankly, it’s still too soon to say…” However, whenever the leap from HD to a higher resolution takes place, the facility’s network and central storage are already UltraHD-ready: with Space, Intervideo has clearly made a long-term investment.
Scuola Nazionale di Cinema
The School and its Editors
For more than seventy years, whole generations of film-makers, including some of the most famous figures in Italian cinema, have passed through the classrooms and film studios of the Scuola Nazionale di Cinema (National Film School) in Rome. The school forms part of the Centro Sperimentale di Cinematografia (Experimental Cinematography Centre), which is dedicated to training students for careers in the cinema. The Scuola, the oldest in Western Europe, is dedicated to discovering and nurturing new talent; alumni include the Oscar-winning Néstor Almendros and Pasqualino De Santis, acclaimed editor Roberto Perpignani, and Vittorio Storaro, who was judged to be one of the top 10 most influential cinematographers of all time.
The school’s interdisciplinary teaching programme prepares students in the specific areas of directing, scriptwriting, acting, photography, editing, sound techniques, production, set-design, props and wardrobe. Film editing skills are taught via hands-on experience with Apple Final Cut Pro 7 which runs on thirteen networked Mac Pro computers.
The workgroup was originally configured with each workstation relying on direct attached external drives. While individual drives can be fast, there is no easy way to expand capacity or increase read/write performance. There were other serious drawbacks in terms of maintaining data security, project sharing and efficient media asset management: storing, retrieving and safeguarding files all involved ad hoc, manual processes. A more professional solution was required.
In order to deliver a managed storage solution, the school, in consultation with specialist reseller Bagnetti, turned to GB Labs’ Space. A 96TB RAID system was installed, with media files and projects stored and administered centrally. With Space, entire volumes can be replicated for data security, while capacity and performance can be increased without downtime using additional Space EX expansion modules. Workstations, running any OS, are attached in seconds over Ethernet, with no need for client drivers, special adapter cards or complex configuration routines.
According to Bagnetti’s Patrick Frans, the use of the latest gigabit and 10-gigabit Ethernet technology means that Space runs at peak efficiency with no performance issues whatsoever. He notes that: “There is absolutely no difference in performance compared with the fastest Fibre SAN.” With the new network attached storage installed, student editors can now easily exceed 30 concurrent streams of ProRes 422 HQ video.
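The 30-stream figure can be sanity-checked with simple arithmetic. Assuming roughly 220 Mbit/s per ProRes 422 HQ stream (an approximate rate for 1920×1080 at 29.97 fps; actual rates vary with frame rate and content), the aggregate demand works out as follows:

```python
# Rough bandwidth estimate for concurrent ProRes 422 HQ playback.
# The ~220 Mbit/s per-stream figure is an assumption, not a measured
# value from this installation; real rates depend on frame rate.

PRORES_HQ_MBIT_PER_S = 220           # assumed per-stream data rate
STREAMS = 30                         # concurrent student editors

total_mbit = STREAMS * PRORES_HQ_MBIT_PER_S
total_mbyte = total_mbit / 8         # convert to megabytes per second

print(f"Aggregate demand: {total_mbit} Mbit/s (~{total_mbyte:.0f} MB/s)")
# 30 streams need ~6,600 Mbit/s, i.e. ~825 MB/s in aggregate: well
# beyond any single Gigabit link (~125 MB/s), which is why the 10GbE
# uplinks on the central storage matter.
```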
When, at a later date, more capacity is required, the system can be expanded, while built-in tools co-ordinate automated back-up routines. Furthermore, because of the versatility of the Space environment, the cinematography school can also migrate to 4K film-making at any point, using existing equipment. Additionally, shared projects using uncompressed 4K DPX files are supported, with the installation of a Space SSD system on the same IP network.
Central SSD Storage Delivers Olympic Success
The 2012 Olympics pushed the organizers and their partners to the limit. As a highly visual form of entertainment, much of the emphasis of the Games was on creating and delivering an immersive media experience, with motion graphics displayed in and around every venue.
To meet this huge technical and creative challenge, the organizing committee, LOCOG, turned to Crystal CG International, one of the world's leading digital media companies. It relied on the skill of its team and the power of central SSD RAID storage to deliver the project.
“We have unsurpassed experience in static and dynamic imagery, digital animation and interactive virtual reality,” remarks Ed Cookson, project director. “We employ the best that technology can offer to help our employees illustrate their ideas and designs.”
For the London games, Crystal CG was tasked with producing as many as 3,000 video clips and motion graphics for everything from medal formalities to the massive-scale opening and closing ceremonies. The organization was also responsible for 3D venue visualizations, used extensively in television advertising, for broadcast fly-throughs and as training aids.
The project required 80 creatives in all, working side by side over a concerted period, to produce media of the highest quality. At the same time, data security and confidentiality were of paramount importance. Just as demanding were the guidelines from LOCOG and the creative guidance from director Danny Boyle and his team. As a result, there could be no compromise on either creative results or project deadlines.
The smooth running of the network and client computers was critical. Employing a variety of professional software tools including Adobe After Effects and Autodesk 3ds Max on the same network demanded 10 Gigabit Ethernet infrastructure to manage the massive flow of data between storage servers, render farms and client workstations. At the heart of the network was GB Labs’ Space SSD:
“We needed a central storage unit that could serve data to up to 80 online users, concurrently and at full speed. Our research into the available options came up with one viable solution: Space SSD,” notes Cookson.
The storage was installed by London-based reseller and systems integrator NMR. Its MD, Neil Anderson, commented: “Using 24 drives, Space SSD is exceptionally fast in its own right. Adding three EX SSD devices expanded the number of drives to 48 and the system capacity to 28.8TB. This also increased the speed even further. We recorded sustained read/write speeds in excess of 6,000MB/s – enough to support 200+ ProRes streams, something no SAN can match. Space SSD was the ideal match for this project and, in fact, we don’t know of any solution that could deliver anything like the performance of the GB Labs system.”
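Anderson’s quoted stream count can be cross-checked with the same kind of back-of-envelope arithmetic. Assuming roughly 220 Mbit/s per ProRes 422 HQ stream (an assumed figure for 1080p material, not one stated by NMR), the measured throughput converts as follows:

```python
# Cross-check of the quoted "200+ ProRes streams" figure.
# The ~220 Mbit/s per-stream rate is an assumption; exact rates
# depend on the ProRes flavour, resolution and frame rate.

MEASURED_MB_PER_S = 6000             # sustained read/write quoted by NMR
PRORES_MBIT_PER_S = 220              # assumed per-stream data rate

measured_mbit = MEASURED_MB_PER_S * 8
streams = measured_mbit // PRORES_MBIT_PER_S

print(f"{MEASURED_MB_PER_S} MB/s sustains about {streams} ProRes streams")
# 48,000 Mbit/s / 220 Mbit/s is roughly 218 streams, consistent with
# the "200+" claim for this class of material.
```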
Space SSD is an exceptionally high speed central storage system that is optimized for video and media workgroups. Using Ethernet protocols, it connects to Gigabit and 10 Gigabit networks without the need for special hardware adapters, client software or licences on workstations. As a super tier 1 device, it can sustain 4K DPX sequences or more than 40 uncompressed HD streams. And even with Ethernet connections saturated with data, network slowdown is avoided.
One of Space SSD’s key features was vital for the Crystal CG studio. Its use of Mac and PC workstations needed a highly versatile storage system that could serve a multi-OS environment. The system also had to be integrated with the studio’s networked animation render farm, which was in use at the same time as designers were busy at work on their respective workstations.
“From our years of experience working with highly pressurized media customers, we know that downtime, or even slowdown, is not an option. Nor is time wasted configuring and re-configuring operating systems and client adapters to fit in with the special demands of the central storage. This is why we built Space: it has none of these issues,” said Ben Pearce of GB Labs.
“We are also motivated by data security and accelerated workflows, so the 32TB Space Echo tier 2 nearline system that Crystal CG installed was configured to automate backups from the online system using tools we build into the storage itself,” Pearce continued.
Supplying graphics for all venues, in many resolutions, in varying aspect ratios, and across multiple locations, the main issue for the project team became managing assets and prioritizing jobs. Most notably, the team had to design graphics to appear on thousands of tablet panels spread across many tiers of seating in the main stadium. This kind of innovation in display technology required a new approach to creative projects.
As a consequence, Cookson concludes: “Managing the creative teams, the creative processes, the project workflows and delivery schedules was a massive challenge. Thankfully, our Space SSD storage never once added to those concerns. It is a very capable and very complete solution.”