
Non-linear editing

From Wikipedia, the free encyclopedia

[Image: A non-linear video editing studio]

Non-linear editing is a form of offline editing for audio, video, and image editing. In offline editing, the original content is not modified in the course of editing. In non-linear editing, edits are specified and modified by specialized software. A pointer-based playlist, effectively an edit decision list (EDL), for video and audio, or a directed acyclic graph for still images, is used to keep track of edits. Each time the edited audio, video, or image is rendered, played back, or accessed, it is reconstructed from the original source and the specified editing steps. Although this process is more computationally intensive than directly modifying the original content, changing the edits themselves can be almost instantaneous, and it prevents further generation loss as the audio, video, or image is edited.
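This pointer-based scheme can be sketched in a few lines of Python. The clip names, frame ranges, and the `Edit`/`render` helpers below are invented for illustration; real systems track far more (transitions, effects, timecodes), but the principle is the same: the edit is a list of pointers, and the sources are never touched.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Edit:
    """A pointer into an unmodified source: clip name plus in/out points (frames)."""
    source: str
    in_frame: int
    out_frame: int

def render(playlist, media):
    """Reconstruct the edited sequence from the untouched sources each time it plays."""
    output = []
    for edit in playlist:
        output.extend(media[edit.source][edit.in_frame:edit.out_frame])
    return output

# Two source clips (frames stood in for by integers), never modified during editing.
media = {"A": list(range(100)), "B": list(range(100, 200))}

# The "edit" is just a list of pointers; changing it is instantaneous.
playlist = [Edit("A", 10, 20), Edit("B", 0, 5)]
sequence = render(playlist, media)
assert len(sequence) == 15
assert media["A"] == list(range(100))  # sources remain untouched
```

Re-rendering after a change is as cheap as rebuilding the list, which is why edits feel instantaneous even though playback is reconstructed every time.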

A non-linear editing system (NLE) is a video editing (NLVE) program or application, or an audio editing (NLAE) digital audio workstation (DAW) system. These perform non-destructive editing on source material. The name is in contrast to 20th-century methods of linear video editing and film editing.

In linear video editing, the product is assembled from beginning to end, in that order. One can replace or overwrite sections of material but never cut something out or insert extra material. Non-linear editing removes this restriction. Conventional film editing is a destructive process because the original film must be physically cut to perform an edit.

Basic techniques


A non-linear editing approach may be used when all assets are available as files on video servers, or on local solid-state drives or hard disks, rather than recordings on reels or tapes. While linear editing is tied to the need to sequentially view film or hear tape, non-linear editing enables direct access to any video frame in a digital video clip, without having to play or scrub/shuttle through adjacent footage to reach it, as is necessary with video tape linear editing systems.

When ingesting audio or video feeds, metadata is attached to the clip. That metadata can be attached automatically (timecode, localization, take number, name of the clip) or manually (athletes' names, characters, in sports). It is then possible to access any frame by entering the timecode or the descriptive metadata directly. An editor can, for example, at the end of the day in the Olympic Games, easily retrieve all the clips related to the athletes who received a gold medal.
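Metadata-driven retrieval of this kind can be sketched as a simple query over an ingest log. The clip dictionaries, field names, and `find` helper below are hypothetical; real systems index this metadata in a media asset management database, but the lookup logic is the same.

```python
# Hypothetical ingest log: each clip carries automatic metadata (timecode,
# take number) and manual metadata (event, medal) attached at ingest time.
clips = [
    {"timecode": "10:02:15:00", "take": 1, "event": "judo",    "medal": "gold"},
    {"timecode": "11:40:03:12", "take": 2, "event": "rowing",  "medal": None},
    {"timecode": "14:05:22:08", "take": 1, "event": "fencing", "medal": "gold"},
]

def find(clips, **criteria):
    """Retrieve clips whose metadata matches every given key/value pair."""
    return [c for c in clips if all(c.get(k) == v for k, v in criteria.items())]

# Direct access by descriptive metadata, no shuttling through footage required.
gold_clips = find(clips, medal="gold")
assert [c["event"] for c in gold_clips] == ["judo", "fencing"]
```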

The non-linear editing method is similar in concept to the cut-and-paste techniques used in IT. However, with the use of non-linear editing systems, the destructive act of cutting film negatives is eliminated. It can also be viewed as the audio/video equivalent of word processing, which is why it is called desktop video editing in the consumer space.[1]

Broadcast workflows and advantages


In broadcasting applications, video and audio data are first captured to hard disk-based systems or other digital storage devices. The data are then imported into servers employing any necessary transcoding, digitizing or transfer. Once imported, the source material can be edited on a computer using any of a wide range of video editing software.

The end product of the offline non-linear editing process is a frame-accurate edit decision list (EDL) which can be taken, together with the source tapes, to an online quality tape or film editing suite. The EDL is then read into an edit controller and used to create a replica of the offline edit by playing portions of the source tapes back at full quality and recording them to a master as per the exact edit points of the EDL.

Editing software records the editor's decisions in an EDL that is exportable to other editing tools. Many generations and variations of the EDL can exist without storing many different copies, allowing for very flexible editing. It also makes it easy to change cuts and undo previous decisions simply by editing the EDL, without having to duplicate the actual film data. Generation loss is also controlled, since the data need not be repeatedly re-encoded when different effects are applied. Generation loss can still occur in digital video or audio when using lossy video or audio compression algorithms, as these introduce artifacts into the source material with each encoding or re-encoding. Codecs such as Apple ProRes, Advanced Video Coding and MP3 are very widely used, as they allow for dramatic reductions in file size while often being indistinguishable from the uncompressed or losslessly compressed original.

Compared to the linear method of tape-to-tape editing, non-linear editing offers the flexibility of film editing, with random access and easy project organization. In non-linear editing, the original source files are not lost or modified during editing. This is one of the biggest advantages of non-linear editing compared to linear editing. With the EDLs, the editor can work on low-resolution copies of the video. This makes it possible to edit both standard-definition broadcast quality and high-definition broadcast quality very quickly on desktop computers that may not have the power to process huge full-quality high-resolution data in real-time.
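The proxy workflow described above can be illustrated with a short Python sketch: the same edit decisions are applied unchanged to low-resolution proxies during editing and to full-quality masters for the final conform. The `conform` helper and the media dictionaries are hypothetical stand-ins for a real media pipeline.

```python
def conform(edl, library):
    """Apply the same edit decisions to any rendition of the sources:
    low-res proxies while editing, or full-quality masters for the final cut."""
    return [frame for src, start, end in edl for frame in library[src][start:end]]

# Hypothetical media: proxies are cheap stand-ins for the full-quality frames.
proxies = {"A": ["a-lo"] * 50}
masters = {"A": ["a-hi"] * 50}

edl = [("A", 5, 10)]                  # decisions made while viewing the proxy
offline_cut = conform(edl, proxies)   # fast, low-resolution preview
online_cut = conform(edl, masters)    # frame-accurate full-quality conform
assert len(offline_cut) == len(online_cut) == 5
```

Because the EDL stores only source names and frame ranges, the heavy full-resolution media is touched only once, at conform time.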

The costs of editing systems have dropped such that non-linear editing tools are now within the reach of home users. Some editing software can now be accessed free as web applications; some, like Cinelerra (focused on the professional market) and Blender, can be downloaded as free software; and some, like Microsoft's Windows Movie Maker or Apple Inc.'s iMovie, come included with the appropriate operating system.

Accessing the material


The non-linear editing system retrieves video media for editing. Because these media exist on a video server or other mass storage that stores the video feeds in a given codec, the editing system can use several methods to access the material:

Direct access
The video server records feeds with a codec readable by the editing system, has a network connection to the editor, and allows direct editing. The editor previews material directly on the server (which it sees as remote storage) and edits directly on the server without transcoding or transfer.
Shared storage
The video server transfers feeds to and from shared storage that is accessible by all editors. Media in the appropriate codec on the server need only be transferred. If recorded with a different codec, media must be transcoded during transfer. In some cases (depending on material), files on shared storage can be edited even before the transfer is finished.
Importing
The editor downloads the material and edits it locally. This method can be used with the previous methods.
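As a rough illustration, choosing among the three access methods above could follow logic along these lines. The function name, codec names, and flags are hypothetical simplifications; real systems weigh bandwidth, licensing, and facility layout as well.

```python
def choose_access(server_codec, editor_codecs, networked, shared_storage):
    """Pick an access method per the three options above (simplified sketch)."""
    if networked and server_codec in editor_codecs:
        return "direct access"               # edit in place on the server
    if shared_storage:
        # transfer to shared storage, transcoding on the fly if the codec differs
        if server_codec in editor_codecs:
            return "shared storage"
        return "shared storage (transcode)"
    return "import"                          # download and edit locally

assert choose_access("DNxHD", {"DNxHD", "ProRes"}, True, True) == "direct access"
assert choose_access("XDCAM", {"DNxHD"}, False, True) == "shared storage (transcode)"
assert choose_access("XDCAM", {"DNxHD"}, False, False) == "import"
```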

Editor brands

[Image: Avid Media Composer software in use]
[Image: Blackmagic DaVinci Resolve Advanced panel, with four trackballs and rotatable rings, designed for color grading]

The leading professional non-linear editing software for many years has been Avid Media Composer. This software is likely to be present[2] in almost all post-production houses globally, and it is used for feature films,[3] television programs, advertising and corporate editing. In 2011, reports indicated, "Avid is still the most-used NLE on prime-time TV productions, being employed on up to 90 percent of evening broadcast shows."[4]

Since then, the rise in semi-professional and domestic users of editing software has seen other titles become very popular in these areas. Other significant software used by many editors includes Adobe Premiere Pro (part of Adobe Creative Cloud), Apple Final Cut Pro X, DaVinci Resolve and Lightworks. The take-up of these software titles is to an extent dictated by cost and subscription licence arrangements, as well as the rise in mobile apps and free software. As of January 2019, DaVinci Resolve had risen in popularity among professional and other users alike; it had a user base of more than 2 million using the free version alone.[5] This is comparable to Apple's Final Cut Pro X, which also had 2 million users as of April 2017.[6]

Some notable NLEs are:

Home use


Early consumer applications for non-linear editing of video used a multimedia computer[1] with a video capture card to capture analog video, or a FireWire connection to capture digital video from a DV camera, together with video editing software. Various editing tasks could then be performed on the imported video before it was exported to another medium or MPEG-encoded for transfer to a DVD.

Modern web-based editing systems can take video directly from a camera phone over a mobile connection, and editing can take place through a web browser interface; strictly speaking, a computer for video editing then requires no installed hardware or software beyond a web browser and an internet connection.

Today a great deal of home editing takes place on desktops as well as tablets and smartphones. The social media revolution has put powerful editing tools and apps at everyone's disposal.

History


When videotapes were first developed in the 1950s, the only way to edit was to physically cut the tape with a razor blade and splice segments together. While the footage excised in this process was not technically destroyed, continuity was lost and the footage was generally discarded. In 1963, with the introduction of the Ampex Editec, videotape could be edited electronically with a process known as linear video editing by selectively copying the original footage to another tape called a master. The original recordings are not destroyed or altered in this process. However, since the final product is a copy of the original, there is a generation loss of quality.

First non-linear editor


The first truly non-linear editor, the CMX 600, was introduced in 1971 by CMX Systems, a joint venture between CBS and Memorex.[12][13] It recorded and played back black-and-white analog video recorded in "skip-field" mode on modified disk pack drives the size of washing machines that could store a half-hour's worth of video and audio for editing. These disk packs were commonly used to store data digitally on mainframe computers of the time. The 600 had a console with two monitors built in. The right monitor, which played the preview video, was used by the editor to make cuts and edit decisions using a light pen. The editor selected from options superimposed as text over the preview video. The left monitor was used to display the edited video. A DEC PDP-11 computer served as a controller for the whole system. Because the video edited on the 600 was in low-resolution black and white, the 600 was suitable only for offline editing.

The 1980s


Non-linear editing systems were built in the 1980s using computers coordinating multiple LaserDiscs or banks of VCRs. One example of these tape- and disc-based systems was Lucasfilm's EditDroid, which used several LaserDiscs of the same raw footage to simulate random-access editing.[a] Demonstrated at NAB in 1984,[15] EditDroid was the first system to introduce modern concepts in non-linear editing such as timeline editing and clip bins.

The LA-based post house Laser Edit[b] also had an in-house system using recordable random-access LaserDiscs.

The most popular non-linear system in the 1980s was Ediflex,[16] which used a bank of U-matic and VHS VCRs for offline editing. Ediflex was introduced in 1983 on the Universal series Still the Beaver. By 1985 it was used on over 80% of filmed network programs, and Cinedco was awarded the Technical Emmy for "Design and Implementation of Non-Linear Editing for Filmed Programs."[17][18]

In 1984, the Montage Picture Processor was demonstrated at NAB.[15] Montage used 17 identical copies of a set of film rushes on modified consumer Betamax VCRs. A custom circuit board was added to each deck that enabled frame-accurate switching and playback using vertical interval timecode. Intelligent positioning and sequencing of the source decks provided a simulation of random-access playback of a lengthy edited sequence without any re-recording. The theory was that with so many copies of the rushes, there could always be one machine cued up to replay the next shot in real time. Changing the EDL could be done easily, and the results seen immediately.
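A toy model, not the actual Montage scheduler, can illustrate why multiple identical copies make gapless playback possible: while one deck plays a shot, the others have time to re-cue. The two-slot re-cue penalty below is an invented simplification.

```python
def playable_without_gaps(sequence, num_decks):
    """Toy model of the Montage trick: with enough identical copies of the
    rushes, one deck can always be pre-cued to the next shot. A deck that
    just played is assumed busy re-cueing for the next two slots."""
    busy = [0] * num_decks                    # slots until each deck is ready
    for _shot in sequence:
        ready = [i for i, b in enumerate(busy) if b == 0]
        if not ready:
            return False                      # no deck cued: playback stalls
        deck = ready[0]
        busy = [max(0, b - 1) for b in busy]  # time passes for the other decks
        busy[deck] = 2                        # this deck must rewind/re-cue
    return True

# A single deck stalls on back-to-back shots; a few decks play seamlessly.
assert not playable_without_gaps(["shot1", "shot2"], num_decks=1)
assert playable_without_gaps(["shot%d" % i for i in range(10)], num_decks=3)
```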

The first feature edited on the Montage was Sidney Lumet's Power. Notably, Francis Coppola edited The Godfather Part III on the system, and Stanley Kubrick used it for Full Metal Jacket. It was used on several episodic TV shows (Knots Landing, for one) and on hundreds of commercials and music videos.

The original Montage system won an Academy Award for Technical Achievement in 1988.[citation needed] Montage was reincarnated as Montage II in 1987, and Montage III appeared at NAB in 1991, using digital disk technology that proved considerably less cumbersome than the Betamax system.

All of these original systems were slow, cumbersome, and limited by the computer horsepower of the time, but the mid-to-late 1980s saw a trend towards non-linear editing, moving away from film editing on Moviolas and the linear videotape method using U-matic VCRs. By the end of the 1980s, computer processing had advanced sufficiently to enable true digital imagery, and today this capability is available in personal desktop computers.

An example of computing power progressing to make non-linear editing possible was the first all-digital non-linear editing system, the "Harry" effects compositing system manufactured by Quantel in 1985. Although it was more of a video effects system, it had some non-linear editing capabilities. Most importantly, it could record (and apply effects to) 80 seconds (due to hard disk space limitations) of broadcast-quality uncompressed digital video encoded in 8-bit CCIR 601 format on its built-in hard disk array.

The 1990s


The term nonlinear editing was formalized in 1991 with the publication of Michael Rubin's Nonlinear: A Guide to Digital Film and Video Editing,[14] which popularized this terminology over other terms common at the time, including real-time editing, random-access or RA editing, virtual editing, and electronic film editing.[citation needed]

Non-linear editing with computers as it is known today was first introduced by Editing Machines Corp. in 1989 with the EMC2 editor, a PC-based non-linear off-line editing system that utilized magneto-optical disks for storage and playback of video, using half-screen-resolution video at 15 frames per second. A couple of weeks later that same year, Avid introduced the Avid/1, the first in the line of their Media Composer systems. It was based on the Apple Macintosh computer platform (Macintosh II systems were used) with special hardware and software developed and installed by Avid.

The video quality of the Avid/1 (and later Media Composer systems from the late 1980s) was somewhat low (about VHS quality), due to the use of a very early version of a Motion JPEG (M-JPEG) codec. It was sufficient, however, to provide a versatile system for offline editing. Lost in Yonkers (1993) was the first film edited with Avid Media Composer, and the first long-form documentary so edited was the HBO program Earth and the American Dream, which won a National Primetime Emmy Award for Editing in 1993.

The NewTek Video Toaster Flyer for the Amiga included non-linear editing capabilities in addition to processing live video signals. The Flyer used hard drives to store video clips and audio, and supported complex scripted playback. The Flyer provided simultaneous dual-channel playback, which let the Toaster's video switcher perform transitions and other effects on video clips without additional rendering. The Flyer portion of the Video Toaster/Flyer combination was a complete computer of its own, having its own microprocessor and embedded software. Its hardware included three embedded SCSI controllers. Two of these SCSI buses were used to store video data, and the third to store audio. The Flyer used a proprietary wavelet compression algorithm known as VTASC, which was well regarded at the time for offering better visual quality than comparable non-linear editing systems using motion JPEG.

Until 1993, the Avid Media Composer was most often used for editing commercials or other short-form, high-value projects. This was primarily because the purchase cost of the system was very high, especially in comparison to the offline tape-based systems then in general use. Hard disk storage was also expensive enough to limit the quality of footage that most editors could work with, and the amount of material that could be held digitized at any one time.[c]

Up until 1992, Apple Macintosh computers could access only 50 gigabytes of storage at once. This limitation was overcome by a digital video R&D team at the Disney Channel led by Rick Eye. By February 1993, this team had integrated a long-form system that let the Avid Media Composer running on the Apple Macintosh access over seven terabytes of digital video data. With instant access to the shot footage of an entire movie, long-form non-linear editing was now possible. The system made its debut at the NAB conference in 1993 in the booths of the three primary sub-system manufacturers, Avid, Silicon Graphics and Sony. Within a year, thousands of these systems had replaced 35mm film editing equipment in major motion picture studios and TV stations worldwide.[19]

Although M-JPEG became the standard codec for NLE during the early 1990s, it had drawbacks. Its high computational requirements ruled out software implementations, imposing the extra cost and complexity of hardware compression/playback cards. More importantly, the traditional tape workflow had involved editing from videotape, often in a rented facility. When the editor left the edit suite, they could securely take their tapes with them. But the M-JPEG data rate was too high for systems like the Avid/1 on the Apple Macintosh and Lightworks on PC to store the video on removable storage; the content had to be stored on fixed hard disks instead. The secure tape paradigm of keeping one's content with one was not possible with these fixed disks. Editing machines were often rented from facilities houses on a per-hour basis, and some productions chose to delete their material after each edit session, then ingest it again the next day, to guarantee the security of their content.[citation needed] In addition, each NLE system had storage limited by its fixed disk capacity.

These issues were addressed by a small UK company, Eidos Interactive. Eidos chose the new ARM-based computers from the UK and implemented an editing system, launched in Europe in 1990 at the International Broadcasting Convention. Because it implemented its own compression software designed specifically for non-linear editing, the Eidos system had no requirement for JPEG hardware and was cheap to produce. The software could decode multiple video and audio streams at once for real-time effects at no extra cost. Most significantly, for the first time, it supported unlimited cheap removable storage. The Eidos Edit 1, Edit 2, and later Optima systems let editors use any Eidos system, rather than being tied down to a particular one, and still keep their data secure. The Optima software editing system was closely tied to Acorn hardware, so when Acorn stopped manufacturing the Risc PC in the late 1990s, Eidos discontinued the Optima system.[citation needed]

In the early 1990s, a small American company called Data Translation took what it knew about coding and decoding pictures for the US military and large corporate clients and spent $12 million developing a desktop editor based on its proprietary compression algorithms and off-the-shelf parts. Their aim was to democratize the desktop and take some of Avid's market. In August 1993, Media 100 entered the market, providing would-be editors with a low-cost, high-quality platform.[citation needed]

Around the same period, other competitors provided non-linear systems that required special hardware, typically cards added to the computer system. Fast Video Machine was a PC-based system that first came out as an offline system and later became more online-editing capable. The Imix Video Cube was also a contender for media production companies; it had a control surface with faders to allow mixing and shuttle control. Data Translation's Media 100 came with three different JPEG codecs for different types of graphics and many resolutions. The DOS-based D/Vision Pro was released by TouchVision Systems, Inc. in the mid-1990s and worked with the Action Media II board. These competitors caused tremendous downward market pressure on Avid, which was forced to continually offer lower-priced systems to compete with the Media 100 and other systems.

Inspired by the success of Media 100, members of the Premiere development team left Adobe to start a project called "Keygrip" for Macromedia. Difficulty raising support and money for development led the team to take their non-linear editor to the NAB Show. After various companies made offers, Keygrip was purchased by Apple, as Steve Jobs wanted a product to compete with Adobe Premiere in the desktop video market. At around the same time, Avid, now with Windows versions of its editing software, was considering abandoning the Macintosh platform. Apple released Final Cut Pro in 1999, and despite not being taken seriously at first by professionals, it has evolved into a serious competitor to entry-level Avid systems.

DV


Another leap came in the late 1990s with the launch of DV-based video formats for consumer and professional use. With DV came IEEE 1394 (FireWire/iLink), a simple and inexpensive way of getting video into and out of computers. Users no longer had to convert video from analog to digital, since it was recorded as digital to start with, and FireWire offered a straightforward way to transfer video data without additional hardware. With this innovation, editing became a more realistic proposition for software running on standard computers. It enabled desktop editing, producing high-quality results at a fraction of the cost of earlier systems.

HD


In the early 2000s, the introduction of highly compressed HD formats such as HDV continued this trend, making it possible to edit HD material on a standard computer running a software-only editing system.

Avid is an industry standard used for major feature films, television programs, and commercials.[20] Final Cut Pro received a Technology & Engineering Emmy Award in 2002.

Since 2000, many personal computers have included basic non-linear video editing software free of charge. This is the case of Apple iMovie for the Macintosh platform, various open-source programs like Kdenlive, Cinelerra-GG Infinity and PiTiVi for the Linux platform, and Windows Movie Maker for the Windows platform. This phenomenon has brought low-cost non-linear editing to consumers.

The cloud


The demands of video editing in terms of the volumes of data involved mean that the proximity of the stored footage to the NLE system editing it is governed partly by the capacity of the data connection between the two. The increasing availability of broadband internet, combined with the use of lower-resolution copies of original material, provides an opportunity not just to review and edit material remotely but also to open up access to the same content to far more people at the same time. In 2004, the first cloud-based video editor, known as Blackbird and based on technology invented by Stephen Streater, was demonstrated at IBC and recognized by the RTS the following year. Since that time, a number of other cloud-based editors have become available, including systems from Avid, WeVideo and Grabyo. Despite their reliance on a network connection, the need to ingest material before editing can take place, and the use of lower-resolution video proxies, their adoption has grown. Their popularity has been driven largely by efficiencies arising from opportunities for greater collaboration and the potential for cost savings derived from using a shared platform, hiring rather than buying infrastructure, and the use of conventional IT equipment over hardware specifically designed for video editing.

4K


As of 2014, 4K video in NLE was fairly new, but it was being used in the creation of many movies throughout the world, due to the increased use of advanced 4K cameras such as the Red camera. Examples of software for this task include Avid Media Composer, Apple's Final Cut Pro X, Sony Vegas, Adobe Premiere, DaVinci Resolve, Edius, and Cinelerra-GG Infinity for Linux.[21]

8K


As of 2019, 8K video was relatively new. 8K video editing requires advanced hardware and software capable of handling the standard.[citation needed]

Image editing


For imaging software, early works such as HSC Software's Live Picture[22] brought non-destructive editing to the professional market, and current efforts such as GEGL provide an implementation used in open-source image editing software.

Quality


An early concern with non-linear editing was the picture and sound quality available to editors. Storage limitations at the time required that all material undergo lossy compression to reduce the amount of storage occupied. Improvements in compression techniques and disk storage capacity have mitigated these concerns, and the migration to high-definition video and audio has virtually removed them. Most professional NLEs are also able to edit uncompressed video with the appropriate hardware.[citation needed]


Notes

  1. ^ A compatible system called SoundDroid was developed for sound post-production. This is considered to be one of the earliest digital audio workstations.[14]
  2. ^ Laser Edit later merged with Pacific Video as Laser-Pacific.
  3. ^ In editing facilities rented by the hour or the day, a production's digitized rushes would usually be deleted at the end of the hire, so that the full amount of hard disk storage was available to the next client.

References

  1. ^ab Evans, Russell (2005). Practical DV Filmmaking. Focal Press. p. 14. ISBN 0-240-80738-3.
  2. ^ "Avid Media Composer commands 4.57% market share in Audio & Video Editing". enlyft. Retrieved 2022-03-25.
  3. ^ "Workflow Breakdown of Every Best Picture and Best Editing 2018 Oscars Nominee". Frame.io Insider. 2018-03-05. Retrieved 2022-03-25.
  4. ^ "Broadcast Engineering: Nonlinear editors". 20 September 2011.
  5. ^ Waters, Cara (2019-01-27). "The Aussie tech company behind the Oscar nominees". The Sydney Morning Herald. Retrieved 2019-09-21.
  6. ^ "The Seven Year Itch – FCP X, Zero to Hero?". Newsshooter. 2018-06-20. Retrieved 2019-09-21.
  7. ^abc "Post Technology Alliance". pta.netflixstudios.
  8. ^abc Kroll, Noam (November 7, 2014). "The Big 3 NLEs And Their Place in Today's Film Industry". The Beat: A Blog by PremiumBeat.
  9. ^ "Adobe, Avid, Blackmagic Design: The Big 3 NLEs at NAB 2019".
  10. ^ Ellis, Cat (2018-04-09). "Lightworks review". TechRadar. Retrieved 2022-03-25.
  11. ^ "Shortcut". Retrieved 2022-08-17.
  12. ^ "The History of Digital Nonlinear Editing", Facer Ezine, archived from the original on 2013-04-10.
  13. ^ "A Brief History Of Electronic Editing" (PDF), Non Linear, archived from the original (PDF) on 2007-10-21.
  14. ^ab Rubin, Michael (1991). Nonlinear: A Guide to Digital Film and Video Editing. Triad Pub. Co. ISBN 0937404853.
  15. ^ab Fraser Harrison (2013-03-14). "What was EditDroid?". Retrieved 2019-08-29.
  16. ^ Richard Seel. "Developments in Post Production 1946 - 1991". Archived from the original on 2012-03-02.
  17. ^ Buck, John (1988). Timeline, A History of Editing. Melbourne: Enriched Books. p. 448. ISBN 978-0-646-49224-7.
  18. ^ "NBC Leads Emmy Winners with 15 Honors in Behind-Scenes Categories". Associated Press. Pasadena, CA. September 8, 1986. Archived from the original on September 28, 2015. Retrieved July 30, 2013.
  19. ^ Harrington, Richard; Weiser, Mark; Pixel, RHED (12 February 2019). Producing Video Podcasts: A Guide for Media Professionals. Taylor & Francis. ISBN 9780240810294 – via Google Books.
  20. ^ "Nonlinear editors". Broadcast Engineering. September 1, 2011. Archived from the original on 2018-01-04.
  21. ^ Radev, Vlady (2014-07-11). "Popular Non-Linear Editors in 2014 Which Support 4K". 4kshooters.net. Retrieved 2023-05-26.
  22. ^ "Live Picture". Pixiq. Archived from the original on 2013-02-02.