A head-up display, or heads-up display,[1] also known as a HUD (/hʌd/) or head-up guidance system (HGS), is any transparent display that presents data without requiring users to look away from their usual viewpoints. The origin of the name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments. A HUD also has the advantage that the pilot's eyes do not need to refocus to view the outside after looking at the optically nearer instruments.

HUD of an F/A-18 Hornet

Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, and other (mostly professional) applications.

Head-up displays were a precursor technology to augmented reality (AR), incorporating a subset of the features needed for the full AR experience, but lacking the necessary registration and tracking between the virtual content and the user's real-world environment.[2]

Overview

HUD mounted in a PZL TS-11 Iskra jet trainer aircraft with a glass plate combiner and a convex collimating lens just below it

A typical HUD contains three primary components: a projector unit, a combiner, and a video generation computer.[3]

The projection unit in a typical HUD is an optical collimator setup: a convex lens or concave mirror with a cathode-ray tube, light-emitting diode display, or liquid crystal display at its focus. This setup (a design that has been around since the invention of the reflector sight in 1900) produces an image where the light is collimated, i.e. the focal point is perceived to be at infinity.
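
The collimation geometry can be illustrated with the thin-lens equation: placing the display at the focal plane of the lens drives the image distance to infinity, which is why the symbology appears focused "out there". This is an illustrative sketch only; the focal length and display positions are hypothetical, not taken from any particular HUD.

```python
# Thin-lens sketch of a collimating projector (illustrative values only):
# with 1/u + 1/v = 1/f, moving the display (object, at distance u) toward
# the focal plane pushes the image distance v toward infinity, i.e. the
# emerging rays become parallel (collimated).

def image_distance(f_mm: float, u_mm: float) -> float:
    """Image distance v from focal length f and object distance u (mm)."""
    if abs(u_mm - f_mm) < 1e-9:
        return float("inf")  # display exactly at the focus: collimated output
    return (f_mm * u_mm) / (u_mm - f_mm)

f = 100.0  # hypothetical focal length in mm
for u in (200.0, 150.0, 110.0, 101.0, 100.0):
    print(f"display at {u:6.1f} mm -> image at {image_distance(f, u):>9.1f} mm")
```

As the display approaches the focal plane, the image distance grows without bound, so the eye can stay relaxed at its infinity focus while reading the symbology.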

The combiner is typically an angled flat piece of glass (a beam splitter) located directly in front of the viewer, that redirects the projected image from the projector unit in such a way that the viewer sees the field of view and the projected infinity image at the same time. Combiners may have special coatings that reflect the monochromatic light projected onto them from the projector unit while allowing all other wavelengths of light to pass through. In some optical layouts, combiners may also have a curved surface to refocus the image from the projector.

The computer provides the interface between the HUD (i.e. the projection unit) and the systems/data to be displayed and generates the imagery and symbology to be displayed by the projection unit.

Types


Other than fixed-mounted HUDs, there are also head-mounted displays (HMDs). These include helmet-mounted displays (both abbreviated HMD), forms of HUD that feature a display element that moves with the orientation of the user's head.

Many modern fighters (such as the F/A-18, F-16, and Eurofighter) use both a HUD and HMD concurrently. The F-35 Lightning II was designed without a HUD, relying solely on the HMD, making it the first modern military fighter not to have a fixed HUD.

Generations


HUDs are split into four generations reflecting the technology used to generate the images.

  • First Generation—Use a CRT to generate an image on a phosphor screen, having the disadvantage of the phosphor screen coating degrading over time. The majority of HUDs in operation today are of this type.
  • Second Generation—Use a solid state light source, for example LED, which is modulated by an LCD screen to display an image. These systems do not fade or require the high voltages of first generation systems. These systems are on commercial aircraft.
  • Third Generation—Use optical waveguides to produce images directly in the combiner rather than use a projection system.
  • Fourth Generation—Use a scanning laser to display images and even video imagery on a clear transparent medium.

Newer micro-display imaging technologies are being introduced, including liquid crystal display (LCD), liquid crystal on silicon (LCoS), digital micro-mirrors (DMD), and organic light-emitting diode (OLED).

History

Longitudinal cross-section of a basic reflector sight (1937 German Revi C12/A)
Copilot's HUD of a C-130J

HUDs evolved from the reflector sight, a pre-World War II parallax-free optical sight technology for military fighter aircraft.[4] The gyro gunsight added a reticle that moved based on the speed and turn rate to solve for the amount of lead needed to hit a target while maneuvering.

During the early 1940s, the Telecommunications Research Establishment (TRE), in charge of UK radar development, found that Royal Air Force (RAF) night fighter pilots were having a hard time reacting to the verbal instructions of the radar operator as they approached their targets. They experimented with the addition of a second radar display for the pilot, but found they had trouble looking up from the lit screen into the dark sky in order to find the target. In October 1942 they had successfully combined the image from the radar tube with a projection from their standard GGS Mk. II gyro gunsight on a flat area of the windscreen, and later in the gunsight itself.[5] A key upgrade was the move from the original AI Mk. IV radar to the microwave-frequency AI Mk. VIII radar found on the de Havilland Mosquito night fighter. This set produced an artificial horizon that further eased head-up flying.[citation needed]

In 1955 the US Navy's Office of Naval Research and Development did some research with a mockup HUD concept unit, along with a sidestick controller, in an attempt to ease the pilot's burden flying modern jet aircraft and make the instrumentation less complicated during flight. While their research was never incorporated in any aircraft of that time, the crude HUD mockup they built had all the features of today's modern HUD units.[6]

HUD technology was next advanced by the Royal Navy in the Buccaneer, the prototype of which first flew on 30 April 1958. The aircraft was designed to fly at very low altitudes at very high speeds and drop bombs in engagements lasting seconds. As such, there was no time for the pilot to look up from the instruments to a bombsight. This led to the concept of a "Strike Sight" that would combine altitude, airspeed and the gun/bombsight into a single gunsight-like display. There was fierce competition between supporters of the new HUD design and supporters of the old electro-mechanical gunsight, with the HUD being described as a radical, even foolhardy option.

The Air Arm branch of the UK Ministry of Defence sponsored the development of a Strike Sight. The Royal Aircraft Establishment (RAE) designed the equipment, and the earliest usage of the term "head-up display" can be traced to this time.[7] Production units were built by Rank Cintel, and the system was first integrated in 1958. The Cintel HUD business was taken over by Elliott Flight Automation, and the Buccaneer HUD was manufactured and further developed, continuing up to a Mark III version with a total of 375 systems made; it was given a 'fit and forget' title by the Royal Navy and was still in service nearly 25 years later. BAE Systems, as the successor to Elliotts via GEC-Marconi Avionics, thus has a claim to the world's first head-up display in operational service.[8] A similar version that replaced the bombing modes with missile-attack modes was part of the AIRPASS HUD fitted to the English Electric Lightning from 1959.

In the United Kingdom, it was soon noted that pilots flying with the new gunsights were becoming better at piloting their aircraft.[citation needed] At this point, the HUD expanded its purpose beyond weapon aiming to general piloting. In the 1960s, French test pilot Gilbert Klopfstein created the first modern HUD and a standardized system of HUD symbols so that pilots would only have to learn one system and could more easily transition between aircraft. The modern HUD used in instrument flight rules approaches to landing was developed in 1975.[9] Klopfstein pioneered HUD technology in military fighter jets and helicopters, aiming to centralize critical flight data within the pilot's field of vision. This approach sought to increase the pilot's scan efficiency and reduce "task saturation" and information overload.

Use of HUDs then expanded beyond military aircraft. In the 1970s, the HUD was introduced to commercial aviation, and in 1988, the Oldsmobile Cutlass Supreme became the first production car with a head-up display.

Until a few years ago, the Embraer 190, Saab 2000, Boeing 727, and Boeing 737 Classic (737-300/400/500) and Next Generation aircraft (737-600/700/800/900 series) were the only commercial passenger aircraft available with HUDs. However, the technology is becoming more common, with aircraft such as the Canadair RJ, Airbus A318 and several business jets featuring the displays. HUDs have become standard equipment on the Boeing 787.[10] Furthermore, the Airbus A320, A330, A340 and A380 families are currently undergoing the certification process for a HUD.[11] HUDs were also added to the Space Shuttle orbiter.

Design factors

Headset computer

There are several factors that interplay in the design of a HUD:

  • Field of View – also "FOV", indicates the angle(s), vertically as well as horizontally, subtended at the pilot's eye, at which the combiner displays symbology in relation to the outside view. A narrow FOV means that the view (of a runway, for example) through the combiner might include little additional information beyond the perimeters of the runway environment; whereas a wide FOV would allow a 'broader' view. For aviation applications, the major benefit of a wide FOV is that an aircraft approaching the runway in a crosswind might still have the runway in view through the combiner, even though the aircraft is pointed well away from the runway threshold; whereas with a narrow FOV the runway would be 'off the edge' of the combiner, out of the HUD's view. Because human eyes are separated, each eye receives a different image. The HUD image is viewable by one or both eyes, depending on technical and budget limitations in the design process. Modern expectations are that both eyes view the same image, in other words a "binocular Field of View (FOV)".
  • Collimation – The projected image is collimated, which makes the light rays parallel. Because the light rays are parallel, the lens of the human eye focuses on infinity to get a clear image. Collimated images on the HUD combiner are perceived as existing at or near optical infinity. This means that the pilot's eyes do not need to refocus to view the outside world and the HUD display – the image appears to be "out there", overlaying the outside world. This feature is critical for effective HUDs: not having to refocus between HUD-displayed symbolic information and the outside world onto which that information is overlaid is one of the main advantages of collimated HUDs. It gives HUDs special consideration in safety-critical and time-critical manoeuvres, when the few seconds a pilot needs in order to re-focus inside the cockpit, and then back outside, are very critical: for example, in the final stages of landing. Collimation is therefore a primary distinguishing feature of high-performance HUDs and differentiates them from consumer-quality systems that, for example, simply reflect uncollimated information off a car's windshield (causing drivers to refocus and shift attention from the road ahead).
  • Eyebox – The optical collimator produces a cylinder of parallel light, so the display can only be viewed while the viewer's eyes are somewhere within that cylinder, a three-dimensional area called the head motion box or eyebox. Modern HUD eyeboxes are usually about 5 lateral by 3 vertical by 6 longitudinal inches (13 x 8 x 15 cm). This allows the viewer some freedom of head movement, but movement too far up/down or left/right will cause the display to vanish off the edge of the collimator, and movement too far back will cause it to crop off around the edge (vignette). The pilot is able to view the entire display as long as one eye is inside the eyebox.[12]
  • Luminance/contrast– Displays have adjustments inluminanceand contrast to account for ambient lighting, which can vary widely (e.g. from the glare of bright clouds to moonless night approaches to minimally lit fields.)
  • Boresight – Aircraft HUD components are very accurately aligned with the aircraft's three axes – a process called boresighting – so that displayed data conforms to reality, typically with an accuracy of ±7.0 milliradians (±24 minutes of arc), though the accuracy may vary across the HUD's FOV. In this case the word "conform" means, "when an object is projected on the combiner and the actual object is visible, they will be aligned". This allows the display to show the pilot exactly where the artificial horizon is, as well as the aircraft's projected path with great accuracy. When Enhanced Vision is used, for example, the display of runway lights is aligned with the actual runway lights when the real lights become visible. Boresighting is done during the aircraft's building process and can also be performed in the field on many aircraft.[9]
  • Scaling – The displayed image (flight path, pitch and yaw scaling, etc.) is scaled to present to the pilot a picture that overlays the outside world in an exact 1:1 relationship. For example, objects (such as a runway threshold) that are 3 degrees below the horizon as viewed from the cockpit must appear at the −3 degree index on the HUD display.
  • Compatibility – HUD components are designed to be compatible with other avionics, displays, etc.
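
Two of the factors above lend themselves to quick numeric checks. The sketch below uses the eyebox extents and boresight tolerance quoted in the list (all other values are hypothetical) to test whether an eye position falls inside the eyebox and to convert the ±7.0 mrad boresight tolerance into minutes of arc:

```python
import math

# Eyebox extents quoted in the text: 5 in lateral x 3 in vertical x 6 in
# longitudinal, centred on the design eye point. Offsets are hypothetical.
EYEBOX_EXTENTS_IN = (5.0, 3.0, 6.0)

def eye_in_eyebox(dx: float, dy: float, dz: float) -> bool:
    """True if the eye's offset from the eyebox centre (inches) stays inside."""
    return all(abs(d) <= extent / 2
               for d, extent in zip((dx, dy, dz), EYEBOX_EXTENTS_IN))

def mrad_to_arcmin(mrad: float) -> float:
    """Convert milliradians to minutes of arc."""
    return math.degrees(mrad / 1000.0) * 60.0

print(eye_in_eyebox(2.0, 1.0, 2.5))    # inside the 5 x 3 x 6 in box
print(eye_in_eyebox(3.0, 0.0, 0.0))    # a 3 in lateral offset falls outside
print(round(mrad_to_arcmin(7.0), 1))   # ~24.1', matching the quoted ±24'
```

The conversion confirms the figures in the text are consistent: 7.0 mrad is just over 24 minutes of arc.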

Aircraft

Head-up display of an F-14A Tomcat

On aircraft avionics systems, HUDs typically operate from dual independent redundant computer systems. They receive input directly from the sensors (pitot-static, gyroscopic, navigation, etc.) aboard the aircraft and perform their own computations rather than receiving previously computed data from the flight computers. On other aircraft (the Boeing 787, for example) the HUD guidance computation for Low Visibility Take-off (LVTO) and low visibility approach comes from the same flight guidance computer that drives the autopilot. Computers are integrated with the aircraft's systems and allow connectivity onto several different data buses such as the ARINC 429, ARINC 629, and MIL-STD-1553.[9]

Displayed data

Displayed data symbology of a head-up display

Typical aircraft HUDs display airspeed, altitude, a horizon line, heading, turn/bank and slip/skid indicators. These instruments are the minimum required by 14 CFR Part 91.[13]

Other symbols and data are also available in some HUDs:

  • boresight or waterline symbol — is fixed on the display and shows where the nose of the aircraft is actually pointing.
  • flight path vector (FPV) or velocity vector symbol — shows where the aircraft is actually going, as opposed to merely where it is pointed as with the boresight. For example, if the aircraft is pitched up but descending, as may occur in high angle of attack flight or in flight through descending air, then the FPV symbol will be below the horizon even though the boresight symbol is above the horizon. During approach and landing, a pilot can fly the approach by keeping the FPV symbol at the desired descent angle and touchdown point on the runway.
  • acceleration indicator or energy cue — typically to the left of the FPV symbol; it is above it if the aircraft is accelerating, and below the FPV symbol if decelerating.
  • angle of attack indicator — shows the wing's angle relative to the airflow, often displayed as "α".
  • navigation data and symbols — for approaches and landings, the flight guidance systems can provide visual cues based on navigation aids such as an Instrument Landing System or an augmented Global Positioning System such as the Wide Area Augmentation System. Typically this is a circle which fits inside the flight path vector symbol. Pilots can fly along the correct flight path by "flying to" the guidance cue.

Since being introduced on HUDs, both the FPV and acceleration symbols are becoming standard on head-down displays (HDD). The actual form of the FPV symbol on an HDD is not standardized but is usually a simple aircraft drawing, such as a circle with two short angled lines (180 ± 30 degrees) and "wings" on the ends of the descending line. Keeping the FPV on the horizon allows the pilot to fly level turns in various angles of bank.
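
The pitch-versus-flight-path distinction behind the FPV symbol can be sketched numerically. The speeds below are hypothetical; the computation simply derives the flight path angle from the velocity vector, which is what the FPV symbol depicts:

```python
import math

# Boresight vs FPV: pitch is where the nose points; the FPV symbol sits at
# the flight path angle, derived from the velocity vector. Values are
# illustrative, not from any particular aircraft.

def flight_path_angle_deg(ground_speed_kt: float, vertical_speed_fpm: float) -> float:
    """Flight path angle (deg) from ground speed (kt) and vertical speed (ft/min)."""
    horizontal_fpm = ground_speed_kt * 6076.12 / 60.0  # knots -> ft/min
    return math.degrees(math.atan2(vertical_speed_fpm, horizontal_fpm))

# High angle-of-attack case from the text: nose 5 deg above the horizon,
# yet the aircraft is descending at 700 ft/min.
pitch_deg = 5.0
gamma = flight_path_angle_deg(140.0, -700.0)
print(f"boresight at {pitch_deg:+.1f} deg, FPV at {gamma:+.1f} deg")
```

With these numbers the boresight symbol sits above the horizon while the FPV sits below it, the situation described in the FPV bullet above.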

Military aircraft specific applications

F/A-18 HUD while engaged in a mock dogfight

In addition to the generic information described above, military applications include weapons system and sensor data such as:

  • target designation (TD) indicator — places a cue over an air or ground target (typically derived from radar or inertial navigation system data).
  • Vc — closing velocity with target.
  • Range — to target, waypoint, etc.
  • weapon seeker or sensor line of sight — shows where a seeker or sensor is pointing.
  • weapon status — includes type and number of weapons selected, available, arming, etc.

VTOL/STOL approaches and landings


During the 1980s, the United States military tested the use of HUDs in vertical take off and landing (VTOL) and short take off and landing (STOL) aircraft. A HUD format was developed at NASA Ames Research Center to provide pilots of VTOL and STOL aircraft with complete flight guidance and control information for Category III C terminal-area flight operations. This includes a large variety of flight operations, from STOL flights on land-based runways to VTOL operations on aircraft carriers. The principal features of this display format are the integration of the flightpath and pursuit guidance information into a narrow field of view, easily assimilated by the pilot with a single glance, and the superposition of vertical and horizontal situation information. The display is a derivative of a successful design developed for conventional transport aircraft.[14]

Civil aircraft specific applications

The cockpit of NASA's Gulfstream GV with a synthetic vision system display. The HUD combiner is in front of the pilot (with a projector mounted above it). This combiner uses a curved surface to focus the image.

The use of head-up displays allows commercial aircraft substantial flexibility in their operations. Systems have been approved which allow reduced-visibility takeoffs and landings, as well as full manual Category III A landings and roll-outs.[15][16][17] Initially expensive and physically large, these systems were only installed on larger aircraft able to support them. These tended to be the same aircraft that supported autoland as standard (with the exception of certain turbo-prop types[clarification needed] that had HUD as an option), making the head-up display unnecessary for Cat III landings. This delayed the adoption of HUD in commercial aircraft. At the same time, studies have shown that the use of a HUD during landings decreases the lateral deviation from centerline in all landing conditions, although the touchdown point along the centerline is not changed.[18]

For general aviation, MyGoFlight expects to receive an STC and to retail its SkyDisplay HUD for $25,000 without installation for single piston-engine aircraft such as the Cirrus SR22, and for more for Cessna Caravan or Pilatus PC-12 single-engine turboprops: 5 to 10% of a traditional HUD's cost, albeit non-conformal, not matching exactly the outside terrain.[19] Flight data from a tablet computer can be projected on the $1,800 Epic Optix Eagle 1 HUD.[20]

Enhanced flight vision systems

Thermal image viewed through a head-up display

In more advanced systems, such as the US Federal Aviation Administration (FAA)-labeled 'Enhanced Flight Vision System',[21] a real-world visual image can be overlaid onto the combiner. Typically an infrared camera (either single or multi-band) is installed in the nose of the aircraft to display a conformed image to the pilot. "EVS Enhanced Vision System" is an industry-accepted term which the FAA decided not to use because "the FAA believes [it] could be confused with the system definition and operational concept found in 91.175(l) and (m)".[21] In one EVS installation, the camera is actually installed at the top of the vertical stabilizer rather than "as close as practical to the pilot's eye position". When used with a HUD, however, the camera must be mounted as close as possible to the pilot's eye point, as the image is expected to "overlay" the real world as the pilot looks through the combiner.

"Registration", or the accurate overlay of the EVS image with the real-world image, is one feature closely examined by authorities prior to approval of a HUD-based EVS. This is because of the importance of the HUD matching the real world and therefore providing accurate data rather than misleading information.

While the EVS display can greatly help, the FAA has only relaxed operating regulations[22] so that an aircraft with EVS can perform a Category I approach to Category II minimums. In all other cases the flight crew must comply with all "unaided" visual restrictions. (For example, if the runway visibility is restricted because of fog, even though EVS may provide a clear visual image, it is not appropriate (or legal) to maneuver the aircraft using only the EVS below 100 feet above ground level.)

Synthetic vision systems

A synthetic vision system display (Honeywell)

HUD systems are also being designed to display a synthetic vision system (SVS) graphic image, which uses high precision navigation, attitude, altitude and terrain databases to create realistic and intuitive views of the outside world.[23][24][25]

In the first SVS head-down image shown on the right, immediately visible indicators include the airspeed tape on the left, altitude tape on the right, and turn/bank/slip/skid displays at the top center. The boresight symbol (-v-) is in the center and directly below it is the flight path vector (FPV) symbol (the circle with short wings and a vertical stabilizer). The horizon line is visible running across the display with a break at the center, and directly to the left are numbers at ±10 degrees with a short line at ±5 degrees (the +5 degree line is easier to see) which, along with the horizon line, show the pitch of the aircraft. Unlike this color depiction of SVS on a head-down primary flight display, the SVS displayed on a HUD is monochrome – that is, typically in shades of green.

The image indicates a wings-level aircraft (i.e. the flight path vector symbol is flat relative to the horizon line and there is zero roll on the turn/bank indicator). Airspeed is 140 knots, altitude is 9,450 feet, and heading is 343 degrees (the number below the turn/bank indicator). Close inspection of the image shows a small purple circle which is displaced from the flight path vector slightly to the lower right. This is the guidance cue coming from the Flight Guidance System. When stabilized on the approach, this purple symbol should be centered within the FPV.

The terrain is entirely computer generated from a high resolution terrain database.

In some systems, the SVS will calculate the aircraft's current flight path, or possible flight path (based on an aircraft performance model, the aircraft's current energy, and surrounding terrain) and then turn any obstructions red to alert the flight crew. Such a system might have helped prevent the crash ofAmerican Airlines Flight 965into a mountain in December 1995.[citation needed]

On the left side of the display is an SVS-unique symbol, with the appearance of a purple, diminishing sideways ladder, and which continues on the right of the display. The two lines define a "tunnel in the sky". This symbol defines the desired trajectory of the aircraft in three dimensions. For example, if the pilot had selected an airport to the left, then this symbol would curve off to the left and down. If the pilot keeps the flight path vector alongside the trajectory symbol, the craft will fly the optimum path. This path would be based on information stored in the Flight Management System's database and would show the FAA-approved approach for that airport.

The tunnel in the sky can also greatly assist the pilot when more precise four-dimensional flying is required, such as the decreased vertical or horizontal clearance requirements of Required Navigation Performance (RNP). Under such conditions the pilot is given a graphical depiction of where the aircraft should be and where it should be going, rather than having to mentally integrate altitude, airspeed, heading, energy and longitude and latitude to correctly fly the aircraft.[26]
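
The geometry of flying the tunnel can be sketched as angular offsets of the next tunnel gate relative to straight ahead; the pilot steers the flight path vector toward the symbol until both offsets are nulled. The gate displacements below are hypothetical:

```python
import math

# "Tunnel in the sky" sketch: where the next gate of the desired trajectory
# appears relative to straight ahead. Displacements (feet) are illustrative.

def gate_offsets_deg(ahead_ft: float, right_ft: float, up_ft: float):
    """Angular (azimuth, elevation) offsets of a tunnel gate seen from the aircraft."""
    az = math.degrees(math.atan2(right_ft, ahead_ft))
    el = math.degrees(math.atan2(up_ft, math.hypot(ahead_ft, right_ft)))
    return az, el

# Gate 5000 ft ahead, 260 ft right, 170 ft below: the symbol appears right and
# low, so the pilot steers the flight path vector right and down to rejoin it.
az, el = gate_offsets_deg(5000.0, 260.0, -170.0)
print(f"gate at {az:+.1f} deg azimuth, {el:+.1f} deg elevation")
```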

Tanks


In mid-2017, the Israel Defense Forces were to begin trials of Elbit's Iron Vision, the world's first helmet-mounted head-up display for tanks. Israel's Elbit, which developed the helmet-mounted display system for the F-35, plans for Iron Vision to use a number of externally mounted cameras to project the 360° view of a tank's surroundings onto the helmet-mounted visors of its crew members. This allows the crew members to stay inside the tank, without having to open the hatches to see outside.[27]

Automobiles

HUD in a BMW E60
The green arrow on the windshield near the top of this picture is a head-up display on a 2013 Toyota Prius. It toggles between the GPS navigation instruction arrow and the speedometer. The arrow is animated to appear scrolling forward as the car approaches the turn. The image is projected without any kind of glass combiner.

These displays are becoming increasingly available in production cars, and usually offer speedometer, tachometer, and navigation system displays. Night vision information is also displayed via HUD on certain automobiles. In contrast to most HUDs found in aircraft, automotive head-up displays are not parallax-free. The display may not be visible to a driver wearing sunglasses with polarised lenses.

Add-on HUD systems also exist, projecting the display onto a glass combiner mounted above or below the windshield, or using the windshield itself as the combiner.

The first in-car HUD was developed by General Motors Corporation in 1988, placing vehicle information in front of the driver's line of sight. Moving into 2010, AR technology was introduced and combined with the existing in-car HUD. Based on this technology, the navigation service began to be displayed on the windshield of the vehicle.[28]

In 2012, Pioneer Corporation introduced a HUD navigation system that replaces the driver-side sun visor and visually overlays animations of conditions ahead, a form of augmented reality (AR).[29][30] Developed by Pioneer Corporation, the AR-HUD became the first aftermarket automotive head-up display to use a direct-to-eye laser beam scanning method, also known as virtual retinal display (VRD). AR-HUD's core technology involves a miniature laser beam scanning display developed by MicroVision, Inc.[31]

Motorcycle helmet HUDs are also commercially available.[32]

In recent years, it has been argued that conventional HUDs will be replaced by holographic AR technologies, such as the ones developed by WayRay that use holographic optical elements (HOE). The HOE allows for a wider field of view while reducing the size of the device and making the solution customizable for any car model.[33][34] Mercedes-Benz introduced an augmented reality-based head-up display,[35] while Faurecia invested in an eye-gaze and finger-controlled head-up display.[36]

Further development and experimental uses


HUDs have been proposed or are being experimentally developed for a number of other applications. In military settings, a HUD can be used to overlay tactical information such as the output of a laser rangefinder or squadmate locations to infantrymen. A prototype HUD has also been developed that displays information on the inside of a swimmer's goggles or of a scuba diver's mask.[37] HUD systems that project information directly onto the wearer's retina with a low-powered laser (virtual retinal display) are also being tested.[38][39]

A HUD product developed in 2012 could perform real-time language translation.[40] In an implementation of an optical head-mounted display, the EyeTap product allows superimposed computer-generated graphic files to be displayed on a lens. Google Glass was another early product.


References

  1. ^ Oxford Dictionary of English, Angus Stevenson, Oxford University Press, 2010, page 809 (head-up display (N. Amer. also heads-up display)).
  2. ^ "Augmented reality brings VR to the real world in all sorts of exciting ways". Digital Trends. 2019-06-06. Retrieved 2022-10-10.
  3. ^ Fred H. Previc; William R. Ercoline (2004). Spatial Disorientation in Aviation. AIAA. p. 452. ISBN 978-1-60086-451-3.
  4. ^ D. N. Jarrett (2005). Cockpit Engineering. Ashgate Pub. p. 189. ISBN 0-7546-1751-3. Retrieved 2012-07-14.
  5. ^ Ian White, The History of Air Intercept Radar & the British Nightfighter, Pen & Sword, 2007, p. 207.
  6. ^ "Windshield TV Screen To Aid Blind Flying." Popular Mechanics, March 1955, p. 101.
  7. ^ John Kim, Rupture of the Virtual, Digital Commons Macalester College, 2016, p. 54.
  8. ^ Rochester Avionics Archives.
  9. ^ a b c Spitzer, Cary R., ed. "Digital Avionics Handbook". Head-Up Displays. Boca Raton, FL: CRC Press, 2001.
  10. ^ Norris, G.; Thomas, G.; Wagner, M. & Forbes Smith, C. (2005). Boeing 787 Dreamliner—Flying Redefined. Aerospace Technical Publications International. ISBN 0-9752341-2-9.
  11. ^ "Airbus A318 approved for Head Up Display". Airbus.com. 2007-12-03. Archived from the original on December 7, 2007. Retrieved 2009-10-02.
  12. ^ Cary R. Spitzer (2000). Digital Avionics Handbook. CRC Press. p. 4. ISBN 978-1-4200-3687-9.
  13. ^ "14 CFR Part 91". Airweb.faa.gov. Retrieved 2009-10-02.
  14. ^ Vernon K. Merrick, Glenn G. Farris, and Andrejs A. Vanags. "A Head Up Display for Application to V/STOL Aircraft Approach and Landing". NASA Ames Research Center, 1990.
  15. ^ Order 8700.1, Appendix 3, Flight Standards Handbook Bulletin for General Aviation (HBGA), Bulletin Number HBGA 99-16: "Category III Authorization for Parts 91 and 125 Operators with Head-Up Guidance Systems (HGS); LOA and Operations". Effective date 8-31-99. Archived October 1, 2006, at the Wayback Machine.
  16. ^ "Falcon 2000 Becomes First Business Jet Certified Category III A by JAA and FAA". Aviation Week's Show News Online, September 7, 1998.
  17. ^ "Design Guidance for a HUD System is contained in Draft Advisory Circular AC 25.1329-1X, 'Approval of Flight Guidance Systems', dated 10/12/2004". Airweb.faa.gov. Retrieved 2009-10-02.
  18. ^ Goteman, Ö.; Smith, K.; Dekker, S. (2007). "HUD With a Velocity (Flight Path) Vector Reduces Lateral Error During Landing in Restricted Visibility". International Journal of Aviation Psychology. 17 (1): 91–108. doi:10.1080/10508410709336939. S2CID 219641008.
  19. ^ Matt Thurber (August 24, 2018). "A HUD For the Rest of Us". AIN Online.
  20. ^ Matt Thurber (December 26, 2018). "This HUD's For You". AIN Online.
  21. ^ a b U.S. DOT/FAA – Final Rule: Enhanced Flight Vision Systems. www.regulations.gov.
  22. ^ 14 CFR Part 91.175, change 281, "Takeoff and Landing under IFR".
  23. ^ "Slide 1" (PDF). Archived from the original (PDF) on March 9, 2008. Retrieved 2009-10-02.
  24. ^ For additional information see "Evaluation of Alternate Concepts for Synthetic Vision Flight Displays with Weather-Penetrating Sensor Image Inserts During Simulated Landing Approaches", NASA/TP-2003-212643. Archived 2004-11-01 at the Wayback Machine.
  25. ^ "No More Flying Blind". NASA. Nasa.gov. 2007-11-30. Retrieved 2009-10-02.
  26. ^ "PowerPoint Presentation" (PDF). Archived from the original (PDF) on March 9, 2008. Retrieved 2009-10-02.
  27. ^ "IDF to trial Elbit's IronVision in Merkava MBT". Peter Felstead, Tel Aviv – IHS Jane's Defence Weekly, 27 March 2017.
  28. ^ Liang, Yongshi; Zheng, Pai; Xia, Liqiao (January 2023). "A visual reasoning-based approach for driving experience improvement in the AR-assisted head-up displays". Advanced Engineering Informatics. 55: 101888. doi:10.1016/j.aei.2023.101888. ISSN 1474-0346.
  29. ^ Alabaster, Jay (June 28, 2013). "Pioneer launches car navigation with augmented reality, heads-up displays". Computerworld.
  30. ^ Ulanoff, Lance (January 11, 2012). "Pioneer AR Heads Up Display Augments Your Driving Reality". Mashable.
  31. ^ Freeman, Champion (2014). "Madhaven—Scanned Laser Pico-Projectors: Seeing the Big Picture (with a Small Device)".
  32. ^ Werner, Mike. "Test Driving the SportVue Motorcycle HUD". Motorcycles in the Fast Lane, 8 November 2005. News.motorbiker.org. Archived from the original on 30 March 2010. Retrieved 2009-10-02.
  33. ^ "WayRay's AR in-car HUD convinced me HUDs can be better". TechCrunch. Retrieved 2018-10-03.
  34. ^ "AR Smart Driving Tool Set to Replace GPS? – L'Atelier BNP Paribas". L'Atelier BNP Paribas. Retrieved 2018-10-03.
  35. ^ "Augmented reality heads-up displays for cars are finally a real thing". 10 July 2020.
  36. ^ Prabhakar, Gowdham; Ramakrishnan, Aparna; Madan, Modiksha; Murthy, L. R. D.; Sharma, Vinay Krishna; Deshmukh, Sachin; Biswas, Pradipta (2020). "Interactive gaze and finger controlled HUD for cars". Journal on Multimodal User Interfaces. 14: 101–121. doi:10.1007/s12193-019-00316-9. ISSN 1783-8738. S2CID 208261516.
  37. ^ Clothier, Julie. "Smart Goggles Easy on the Eyes". CNN.com, 27 June 2005. Edition.cnn.com. Retrieved 2009-10-02.
  38. ^ Panagiotis Fiambolis. "Virtual Retinal Display (VRD) Technology". Naval Postgraduate School. Cs.nps.navy.mil. Archived from the original on April 13, 2008. Retrieved 2009-10-02.
  39. ^ Lake, Matt (2001-04-26). "How It Works: Retinal Displays Add a Second Data Layer". The New York Times. Retrieved 2009-10-02.
  40. ^ Borghino, Dario (29 July 2012). "Augmented reality glasses perform real-time language translation". Gizmag.