Mixed reality

Clip from a mixed reality Job Simulator game

Mixed reality (MR) is a term used to describe the merging of a real-world environment and a computer-generated one. Physical and virtual objects may co-exist in mixed reality environments and interact in real time.

Mixed reality that incorporates haptics has sometimes been referred to as visuo-haptic mixed reality.[1][2]

In a physics context, the term "interreality system" refers to a virtual reality system coupled with its real-world counterpart.[3] A 2007 paper describes an interreality system comprising a real physical pendulum coupled to a pendulum that exists only in virtual reality.[4] This system has two stable states of motion: a "dual reality" state in which the motions of the two pendula are uncorrelated, and a "mixed reality" state in which the pendula exhibit stable, highly correlated phase-locked motion. The terms "mixed reality" and "interreality" are clearly defined in the context of physics and may be used slightly differently in other fields; however, they are generally understood as "bridging the physical and virtual world".[5]
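The phase-locking behaviour can be illustrated with a toy numerical model. The following is a minimal sketch, not the apparatus or parameters of the cited study: two damped, driven pendula in the classic chaotic regime are coupled by a torque proportional to their angle difference, and the correlation of their angle traces is compared with and without coupling. All values are illustrative.

```python
# Toy sketch (not the apparatus from the cited paper): two damped, driven
# pendula coupled by a torque proportional to their angle difference.
# Without coupling, chaos typically keeps their motions uncorrelated
# ("dual reality"); with strong enough coupling they phase-lock
# ("mixed reality").
import numpy as np

def simulate(coupling, steps=200_000, dt=1e-3):
    damping, drive_amp, drive_freq = 0.5, 1.2, 2.0 / 3.0  # chaotic regime
    theta = np.array([0.2, 1.0])   # initial angles (rad), deliberately different
    omega = np.zeros(2)            # initial angular velocities
    history = np.empty((steps, 2))
    for i in range(steps):
        drive = drive_amp * np.cos(drive_freq * i * dt)
        couple = coupling * (theta[::-1] - theta)   # pulls the pendula together
        accel = -np.sin(theta) - damping * omega + drive + couple
        omega += accel * dt        # semi-implicit Euler step
        theta += omega * dt
        history[i] = theta
    return history

for k in (0.0, 2.0):
    angles = simulate(k)
    corr = np.corrcoef(angles[:, 0], angles[:, 1])[0, 1]
    print(f"coupling {k}: correlation of angle traces = {corr:+.2f}")
```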

Applications

Mixed reality has been used in applications across fields including design, education, entertainment, military training, healthcare, product content management, and human-in-the-loop operation of robots.

Education

Simulation-based learning includes VR- and AR-based training and interactive, experiential learning. There are many potential use cases for mixed reality in both educational and professional training settings. In education, AR has been used to simulate historical battles, providing an immersive experience for students and potentially enhancing learning.[6] In addition, AR has shown effectiveness in university education for health science and medical students in disciplines that benefit from 3D representations of models, such as physiology and anatomy.[7][8]

Entertainment

From television shows to game consoles, mixed reality has many applications in the field of entertainment.

The 2004 British game show Bamzooki called upon child contestants to create virtual "Zooks" and watch them compete in a variety of challenges.[9] The show used mixed reality to bring the Zooks to life. The television show ran for four seasons, ending in 2010.[9]

The 2003 game show FightBox also called upon contestants to create competitive characters and used mixed reality to allow them to interact.[10] Unlike Bamzooki's generally non-violent challenges, the goal of FightBox was for contestants to create the strongest fighter and win the competition.[10]

In 2009, researchers presented their social product, "BlogWall", at the International Symposium on Mixed and Augmented Reality (ISMAR); it consisted of a projected screen on a wall.[11] Users could post short text clips or images on the wall and play simple games such as Pong.[11] BlogWall also featured a poetry mode, in which it rearranged the messages it received to form a poem, and a polling mode, in which users could ask others to answer their polls.[11]

Mario Kart Live: Home Circuit is a mixed reality racing game for the Nintendo Switch that was released in October 2020. The game allows players to use their home as a race track.[12] Within the first week of release, 73,918 copies were sold in Japan, making it the country's best-selling game of the week.[13]

Other research has examined the potential for mixed reality to be applied to theatre, film, and theme parks.[14]

Military training

The first fully immersive mixed reality system was the Virtual Fixtures platform, developed in 1992 by Louis Rosenberg at the Armstrong Laboratories of the United States Air Force.[15] It enabled human users to control robots in real-world environments that included real physical objects and 3D virtual overlays ("fixtures") added to enhance human performance of manipulation tasks. Published studies showed that introducing virtual objects into the real world could yield significant performance increases for human operators.[15][16][17]

Combat reality can be simulated and represented using complex, layered data and visual aids, most of which are delivered through head-mounted displays (HMDs), a category encompassing any display technology that can be worn on the user's head.[18] Military training solutions are often built on commercial off-the-shelf (COTS) technologies, such as Improbable's synthetic environment platform, Virtual Battlespace 3, and VirTra, with the latter two platforms used by the United States Army. As of 2018, VirTra was being used by both civilian and military law enforcement to train personnel in a variety of scenarios, including active shooter, domestic violence, and military traffic stops.[19][20] Mixed reality technologies have been used by the United States Army Research Laboratory to study how stress affects decision-making. With mixed reality, researchers can safely study military personnel in scenarios that soldiers would be unlikely to survive.[21]

In 2017, the U.S. Army was developing the Synthetic Training Environment (STE), a collection of technologies for training purposes that was expected to include mixed reality. As of 2018, STE was still in development without a projected completion date. Stated goals of STE included enhancing realism, expanding simulation training capabilities, and making STE available to other systems.[22]

It was claimed that mixed-reality environments like STE could reduce training costs,[23][24] for example by reducing the amount of ammunition expended during training.[25] In 2018, it was reported that STE would include representation of any part of the world's terrain for training purposes.[26] STE would offer a variety of training opportunities for squad, brigade, and combat teams, including Stryker, armory, and infantry teams.[27]

Blended spaces

A blended space is a space in which a physical environment and a virtual environment are deliberately and closely integrated. The aim of blended space design is to give people the experience of a sense of presence in the blended space, acting directly on its content.[28][29] Examples of blended spaces include augmented reality devices such as the Microsoft HoloLens and games such as Pokémon Go, as well as many smartphone tourism apps, smart meeting rooms, and applications such as bus tracker systems.

The idea of blending comes from the ideas of conceptual integration, or conceptual blending, introduced by Gilles Fauconnier and Mark Turner.

Manuel Imaz and David Benyon introduced blending theory to look at concepts in software engineering and human-computer interaction.[30]

The simplest implementation of a blended space requires two features. The first is input, which can range from tactile interaction to changes in the environment. The second is notifications received from the digital space. The correspondences between the physical and digital spaces have to be abstracted and exploited by the design of the blended space. Seamless integration of the two spaces is rare; blended spaces need anchoring points or technologies to link the spaces.[29]

A well-designed blended space advertises and conveys its digital content in a subtle and unobtrusive way. Presence can be measured using physiological, behavioral, and subjective measures derived from the space.[30]

Conceptual blending in mixed reality spaces

There are two main components to any space. They are:

  1. Objects – The actual distinct objects which make up the medium/space. The objects thus effectively describe the space.
  2. Agents – Correspondents/users inside the space who interact with it through the objects.[28]

For presence in a blended space, there must be a physical space and a digital space. In the context of blended spaces, the greater the communication between the physical and digital spaces, the richer the experience.[28] This communication happens through the medium of correspondents, which relay the state and nature of objects.
For the purpose of looking at blended spaces, the nature and characteristics of any space can be represented by these factors (illustrated in the sketch below):

  1. Ontology – The different types of objects present in the space, the total number of objects, and the relationships between objects and the space.
  2. Topology – The way objects are placed and positioned.
  3. Volatility – The frequency with which the objects change.
  4. Agency – The medium of communication between the objects, and between the objects and users. Agency also encompasses the users inside the space.

Physical space – Physical spaces are spaces that afford spatial interaction.[31] This kind of spatial interaction greatly impacts the user's cognitive model.[32]
Digital space – Digital space (also called the information space) consists of all the information content. This content can be in any form.[33]
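As a concrete illustration of this vocabulary, the following minimal sketch models a blended space as a physical space and a digital space linked by anchoring points, with fields standing in for ontology, topology, volatility, and agency. All class and field names are hypothetical and do not reflect any established API.

```python
# Minimal sketch of the blended-space vocabulary above. All class and field
# names are hypothetical illustrations, not an established API.
from dataclasses import dataclass, field

@dataclass
class SpaceObject:
    name: str               # ontology: what kind of object this is
    position: tuple         # topology: where it sits in the space
    update_rate_hz: float   # volatility: how often it changes

@dataclass
class Space:
    objects: list[SpaceObject] = field(default_factory=list)
    agents: list[str] = field(default_factory=list)  # agency: users/correspondents

@dataclass
class Anchor:
    """Anchoring point linking one physical object to its digital counterpart."""
    physical: SpaceObject
    digital: SpaceObject

@dataclass
class BlendedSpace:
    physical: Space
    digital: Space
    anchors: list[Anchor] = field(default_factory=list)

# Example: a smart meeting room blending a physical whiteboard with its
# digital document, anchored so activity in either space is reflected in both.
board = SpaceObject("whiteboard", (2.0, 0.0, 1.5), update_rate_hz=0.1)
doc = SpaceObject("shared_document", (0, 0, 0), update_rate_hz=5.0)
room = BlendedSpace(
    physical=Space(objects=[board], agents=["alice"]),
    digital=Space(objects=[doc], agents=["alice"]),
    anchors=[Anchor(physical=board, digital=doc)],
)
print(f"{len(room.anchors)} anchoring point(s) link the two spaces")
```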

Remote working

Mixed reality allows a global workforce of remote teams to work together on an organization's business challenges. No matter where they are physically located, employees can wear a headset and noise-canceling headphones and enter a collaborative, immersive virtual environment. Because such applications can translate speech in real time, language barriers become less of an obstacle. The approach also increases flexibility. While many employers still use inflexible models of fixed working time and location, there is evidence that employees are more productive when they have greater autonomy over where, when, and how they work. Some employees prefer loud work environments, while others need silence; some work best in the morning, others at night. Employees also benefit from autonomy in how they work because of differences in how they process information. The classic model of learning styles, for example, differentiates between visual, auditory, and kinesthetic learners.[34]

Machine maintenance can also be supported with mixed reality. Larger companies with multiple manufacturing locations and a lot of machinery can use mixed reality to educate and instruct their employees. Machines need regular checkups and occasional adjustments, which are mostly done by humans, so employees need to be informed about the adjustments required. By using mixed reality, employees from multiple locations can wear headsets and receive live instructions about the changes. Instructors can operate the representation that every employee sees, gliding through the production area, zooming in on technical details, and explaining every change needed. Employees completing a five-minute training session with such a mixed-reality program have been shown to attain the same learning results as reading a 50-page training manual.[35] An extension of this environment incorporates live data from operating machinery into the virtual collaborative space, associated with three-dimensional virtual models of the equipment. This enables training in, and execution of, maintenance, operational, and safety work processes that would otherwise be difficult in a live setting, while drawing on expertise regardless of physical location.[36]

Functional mockup

Mixed reality can be used to build mockups that combine physical and digital elements. With the use of simultaneous localization and mapping (SLAM), mockups can interact with the physical world to provide more realistic sensory experiences,[37] such as object permanence, that would normally be infeasible or extremely difficult to track and analyze without the use of both digital and physical aids.[38][39]
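The role of SLAM in such mockups can be sketched as follows. The example below is illustrative only: `get_camera_pose()` stands in for the pose a real SLAM system would report, and the virtual mockup element keeps a fixed world-space pose, so it appears to stay put (object permanence) as the tracked device moves.

```python
# Sketch of how a SLAM-tracked mockup keeps a virtual part "anchored" in the
# physical world. get_camera_pose() stands in for whatever pose a real SLAM
# library would report; here it is faked for illustration.
import numpy as np

def pose(tx, ty, tz):
    """4x4 homogeneous transform with translation only (rotation omitted)."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

anchor_world = pose(1.0, 0.0, 2.0)   # virtual component fixed in the room

def get_camera_pose(t):
    # Hypothetical stand-in for a SLAM system: camera walks along the x axis.
    return pose(0.5 * t, 0.0, 0.0)

for t in range(3):
    cam_world = get_camera_pose(t)
    # Object pose in the camera frame: inverse(camera pose) @ anchor pose.
    obj_in_camera = np.linalg.inv(cam_world) @ anchor_world
    print(f"t={t}s: object appears at {obj_in_camera[:3, 3]} relative to camera")
```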

Healthcare

Smartglasses can be incorporated into the operating room to aid in surgical procedures, for example by displaying patient data conveniently while overlaying precise visual guides for the surgeon.[40][41] Mixed reality headsets like the Microsoft HoloLens have been theorized to allow efficient sharing of information between doctors, in addition to providing a platform for enhanced training.[42][41] This can, in some situations (e.g. a patient infected with a contagious disease), improve doctor safety and reduce PPE use.[43] While mixed reality has considerable potential for enhancing healthcare, it also has drawbacks.[41] The technology may never fully integrate into scenarios where a patient is present, as there are ethical concerns about the doctor not being able to see the patient.[41][39] Mixed reality is also useful for healthcare education. For example, according to a 2022 report from the World Economic Forum, 85% of first-year medical students at Case Western Reserve University reported that mixed reality for teaching anatomy was "equivalent" or "better" than the in-person class.[44]

Product content management

Before the advent of mixed reality, product content management consisted largely of brochures, with little customer-product engagement outside of this two-dimensional realm.[45] With improvements in mixed reality technology, new forms of interactive product content management have emerged. Most notably, three-dimensional digital renderings of normally two-dimensional products have increased the reach and effectiveness of consumer-product interaction.[46]

Human-in-the-loop operation of robots

Recent advances in mixed-reality technologies have renewed interest in alternative modes of communication for human-robot interaction.[47] Human operators wearing mixed reality glasses such as the HoloLens can control and monitor machines such as robots and lifting equipment[48] on site in a digital factory setup. This use case typically requires real-time data communication between the mixed reality interface and the machine, process, or system, which can be enabled by incorporating digital twin technology.[48]
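A minimal sketch of this data flow is shown below, with all names and message fields hypothetical: the mixed reality interface sends operator commands to the machine's digital twin, and the twin streams machine state back for overlay on the headset. A real deployment would use the machine vendor's telemetry and command protocols rather than in-process queues.

```python
# Sketch of the real-time exchange between a mixed reality interface and a
# machine's digital twin. The functions and message fields are hypothetical;
# a real deployment would use the vendor's telemetry/command protocols.
import queue, threading, time

commands = queue.Queue()   # headset -> machine (e.g. "raise" the hook)
telemetry = queue.Queue()  # machine -> headset (e.g. current hook height)

def digital_twin():
    hook_height = 0.0
    for _ in range(5):
        try:
            cmd = commands.get(timeout=0.1)
            hook_height += 0.5 if cmd == "raise" else -0.5
        except queue.Empty:
            pass
        telemetry.put({"hook_height_m": round(hook_height, 2)})
        time.sleep(0.1)

def mr_interface():
    commands.put("raise")        # operator gesture mapped to a command
    commands.put("raise")
    for _ in range(5):
        state = telemetry.get()  # state to overlay on the physical machine
        print("headset overlay:", state)

t = threading.Thread(target=digital_twin)
t.start()
mr_interface()
t.join()
```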

Business firms

Mixed reality allows sellers to show customers how a given product will suit their needs. A seller may demonstrate how a product would fit into the buyer's home. With the assistance of VR, the buyer can virtually pick up the item, rotate it, and place it wherever desired. This improves the buyer's confidence in making a purchase and reduces the number of returns.[49]

Architectural firms can allow customers to virtually visit their desired homes.

Display technologies and products

While mixed reality refers, at a high level, to the intertwining of the virtual and physical worlds, a variety of digital media are used to accomplish a mixed reality environment. They range from handheld devices to entire rooms, each having practical uses in different disciplines.[50][51]

Cave automatic virtual environment

A user standing in the middle of a Cave Automatic Virtual Environment

The cave automatic virtual environment (CAVE) is an environment, typically a small room located in a larger outer room, in which a user is surrounded by projected displays around, above, and below them.[50] 3D glasses and surround sound complement the projections to provide the user with a sense of perspective that aims to simulate the physical world.[50] Since being developed, CAVE systems have been adopted by engineers developing and testing prototype products.[52] They allow product designers to test their prototypes before expending resources to produce a physical prototype, while also opening doors for "hands-on" testing of non-tangible objects such as microscopic environments or entire factory floors.[52] After developing the CAVE, the same researchers eventually released the CAVE2, which addresses the original CAVE's shortcomings.[53] The original projections were replaced with 37-megapixel 3D LCD panels, network cables integrate the CAVE2 with the Internet, and a more precise camera system allows the environment to shift as the user moves through it.[53]

Head-up display

Photograph of the head-up display of an F/A-18C

A head-up display (HUD) is a display that projects imagery directly in front of a viewer without heavily obscuring their environment. A standard HUD is composed of three elements: a projector, which overlays the HUD's graphics; a combiner, the surface onto which the graphics are projected; and a computer, which integrates the other two components and performs any real-time calculations or adjustments.[54] Prototype HUDs were first used in military applications to aid fighter pilots in combat, but eventually evolved to aid in all aspects of flight, not just combat.[55] HUDs were then standardized across commercial aviation as well, and eventually spread into the automotive industry. One of the first applications of HUDs in automotive transport came with Pioneer's heads-up system, which replaces the driver-side sun visor with a display that projects navigation instructions onto the road in front of the driver.[56] Major manufacturers such as General Motors, Toyota, Audi, and BMW have since included some form of head-up display in certain models.
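As an illustration of the kind of real-time calculation such a computer performs, the sketch below projects a navigation waypoint, known in vehicle coordinates, onto the combiner using a simple pinhole model; the distances and waypoint position are hypothetical.

```python
# Sketch of a HUD computer's calculation: project a waypoint, known in vehicle
# coordinates, onto the combiner so the symbol overlays the real waypoint seen
# through the glass. Simple pinhole model with hypothetical numbers.
def project_to_combiner(point_xyz, eye_to_combiner_m=0.7):
    x, y, z = point_xyz              # metres; z = distance ahead of the eye
    if z <= 0:
        return None                  # behind the viewer, nothing to draw
    scale = eye_to_combiner_m / z    # similar triangles
    return (x * scale, y * scale)    # symbol position on the combiner, metres

waypoint = (12.0, 1.5, 300.0)        # 300 m ahead, 12 m right, 1.5 m up
print("draw symbol at", project_to_combiner(waypoint), "on the combiner")
```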

Head-mounted display

An augmented reality head-mounted display

A head-mounted display (HMD), worn over the entire head or in front of the eyes, is a device that uses one or two optics to project an image directly in front of the user's eyes. Its applications range across medicine, entertainment, aviation, and engineering, providing a layer of visual immersion that traditional displays cannot achieve.[57] Head-mounted displays are most popular with consumers in the entertainment market, with major tech companies developing HMDs to complement their existing products.[58][59] These consumer head-mounted displays, however, are virtual reality displays and do not integrate the physical world. Augmented reality HMDs are instead favored in enterprise environments. Microsoft's HoloLens is an augmented reality HMD with applications in medicine, giving doctors more profound real-time insight, as well as in engineering, overlaying important information on top of the physical world.[60] Another notable augmented reality HMD has been developed by Magic Leap, a startup with a similar product aimed at both the private sector and the consumer market.[61]

Mobile devices

Mobile devices, including smartphones and tablets, have continued to increase in computing power and portability. Many modern mobile devices come equipped with toolkits for developing augmented reality applications.[51] These applications allow developers to overlay computer graphics over video of the physical world. The first augmented reality mobile game with widespread success was Pokémon GO, which was released in 2016 and accumulated 800 million downloads.[62] While entertainment applications utilizing AR have proven successful, productivity and utility apps have also begun integrating AR features. Google has released updates to Google Maps that include AR navigation directions overlaid onto the streets in front of the user, and has expanded its translation app to overlay translated text onto physical writing in more than 20 languages.[63] Mobile devices are a distinctive display technology because they are commonly carried at all times.
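The core overlay step behind such toolkits can be sketched as follows; the device pose, camera intrinsics, and drawing step are all faked for illustration rather than taken from any particular AR toolkit.

```python
# Sketch of what an AR toolkit does each camera frame: use the device pose to
# project a world-anchored label into pixel coordinates, then draw it over the
# video frame. Poses, intrinsics, and the draw step are all faked here.
import numpy as np

fx = fy = 800.0            # hypothetical focal lengths (pixels)
cx, cy = 640.0, 360.0      # principal point of a 1280x720 frame

def project(point_cam):
    """Pinhole projection of a point given in camera coordinates (metres)."""
    x, y, z = point_cam
    return (fx * x / z + cx, fy * y / z + cy)

label_world = np.array([0.0, 0.0, 2.0, 1.0])  # label anchored 2 m ahead of the origin

for frame in range(3):
    # Fake device pose from a tracker: camera slides 0.1 m right per frame.
    world_to_camera = np.eye(4)
    world_to_camera[0, 3] = -0.1 * frame
    u, v = project((world_to_camera @ label_world)[:3])
    print(f"frame {frame}: draw label at pixel ({u:.0f}, {v:.0f})")
```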

References

  1. ^ Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A. (January 2013). "Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration". IEEE Transactions on Visualization and Computer Graphics. 19 (1): 159–172. doi:10.1109/TVCG.2012.107. ISSN 1941-0506. PMID 22508901. S2CID 2894269.
  2. ^ Aygün, Mehmet Murat; Öğüt, Yusuf Çağrı; Baysal, Hulusi; Taşcıoğlu, Yiğit (January 2020). "Visuo-Haptic Mixed Reality Simulation Using Unbound Handheld Tools". Applied Sciences. 10 (15): 5344. doi:10.3390/app10155344. ISSN 2076-3417.
  3. ^ J. van Kokswijk, Hum@n, Telecoms & Internet as Interface to Interreality Archived 26 September 2007 at the Wayback Machine (Bergboek, The Netherlands, 2003).
  4. ^ V. Gintautas and A. W. Hubler, "Experimental evidence for mixed reality states in an interreality system", Phys. Rev. E 75, 057201 (2007).
  5. ^ Repetto, C. and Riva, G., 2020. From Virtual Reality To Interreality In The Treatment Of Anxiety Disorders. [online] Jneuropsychiatry.org. Available at: https://www.jneuropsychiatry.org/peer-review/from-virtual-reality-to-interreality-in-the-treatment-of-anxiety-disorders-neuropsychiatry.pdf [Accessed 30 October 2020].
  6. ^ Lubrecht, Anna. Augmented Reality for Education Archived 5 September 2012 at the Wayback Machine The Digital Union, The Ohio State University 24 April 2012.
  7. ^ Moro, Christian; Birt, James; Stromberga, Zane; Phelps, Charlotte; Clark, Justin; Glasziou, Paul; Scott, Anna Mae (2021). "Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis". Anatomical Sciences Education. 14 (3): 368–376. doi:10.1002/ase.2049. ISSN 1935-9780. PMID 33378557. S2CID 229929326.
  8. ^ Moro, Christian; Phelps, Charlotte; Redmond, Petrea; Stromberga, Zane (2021). "HoloLens and mobile augmented reality in medical and health science education: A randomised controlled trial". British Journal of Educational Technology. 52 (2): 680–694. doi:10.1111/bjet.13049. ISSN 1467-8535. S2CID 229433413.
  9. ^ a b "Bamzooki (TV Series 2004–2010) - IMDb", IMDb. [Online]. Available: https://www.imdb.com/title/tt2065104/. [Accessed: 01- Nov- 2020].
  10. ^ a b "FightBox (TV Series 2003–2004) - IMDb", IMDb. [Online]. Available: https://www.imdb.com/title/tt0386197/. [Accessed: 01- Nov- 2020].
  11. ^ a b c Cheok, Adrian David; Haller, Michael; Fernando, Owen Noel Newton; Wijesena, Janaka Prasad (1 January 2009). "Mixed Reality Entertainment and Art". International Journal of Virtual Reality. 8 (2): 83–90. doi:10.20870/IJVR.2009.8.2.2729. ISSN 1081-1451.
  12. ^ "Mario Kart Live: Home Circuit – Official Site". mklive.nintendo.com. Retrieved 1 November 2020.
  13. ^ Romano, Sal (22 October 2020). "Famitsu Sales: 10/12/20 – 10/18/20". Gematsu. Retrieved 22 October 2020.
  14. ^ Stapleton, C.; Hughes, C.; Moshell, M.; Micikevicius, P.; Altman, M. (December 2002). "Applying mixed reality to entertainment". Computer. 35 (12): 122–124. doi:10.1109/MC.2002.1106186. ISSN 0018-9162.
  15. ^ a b Rosenberg, Louis B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
  16. ^ Rosenberg, Louis B. (21 December 1993). Kim, Won S. (ed.). "Virtual fixtures as tools to enhance operator performance in telepresence environments". Telemanipulator Technology and Space Telerobotics. Boston, MA. 2057: 10–21. Bibcode:1993SPIE.2057...10R. doi:10.1117/12.164901. S2CID 111277519.
  17. ^ Hughes, C.E.; Stapleton, C.B.; Hughes, D.E.; Smith, E.M. (November 2005). "Mixed reality in education, entertainment, and training". IEEE Computer Graphics and Applications. 25 (6): 24–30. doi:10.1109/MCG.2005.139. ISSN 0272-1716. PMID 16315474. S2CID 14893641.
  18. ^ Pandher, Gurmeet Singh (2 March 2016). "Microsoft HoloLens Preorders: Price, Specs Of The Augmented Reality Headset". The Bitbag. Archived from the original on 4 March 2016. Retrieved 1 April 2016.
  19. ^ VirTra Inc. "VirTra's Police Training Simulators Chosen by Three of the Largest U.S. Law Enforcement Departments". GlobeNewswire News Room. Retrieved 22 August 2018.
  20. ^ "How do police use VR? Very well | Police Foundation". www.policefoundation.org. 14 August 2017. Archived from the original on 22 February 2020. Retrieved 22 August 2018.
  21. ^ Patton, Debbie; Marusich, Laura (9 March 2015). 2015 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA). pp. 145–150. doi:10.1109/COGSIMA.2015.7108190. ISBN 978-1-4799-8015-4. S2CID 46712515.
  22. ^ Eagen, Andrew (June 2017). "Expanding Simulations as a Means of Tactical Training with Multinational Partners" (PDF). A thesis presented to the Faculty of the U.S. Army Command and General Staff College. Archived (PDF) from the original on 27 March 2020.
  23. ^ Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis (1 January 2017). "A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care". INQUIRY: The Journal of Health Care Organization, Provision, and Financing. 54: 0046958016687176. doi:10.1177/0046958016687176. ISSN 0046-9580. PMC 5798742. PMID 28133988.
  24. ^ Smith, Roger (1 February 2010). "The Long History of Gaming in Military Training". Simulation & Gaming. 41 (1): 6–19. doi:10.1177/1046878109334330. ISSN 1046-8781. S2CID 13051996.
  25. ^ Shufelt, Jr., J.W. (2006) A Vision for Future Virtual Training. In Virtual Media for Military Applications (pp. KN2-1 – KN2-12). Meeting Proceedings RTO-MP-HFM-136, Keynote 2. Neuilly-sur-Seine, France: RTO. Available from: Mixed Reality (MR). Archived 13 June 2007 at the Wayback Machine.
  26. ^ "STAND-TO!". www.army.mil. Retrieved 22 August 2018.
  27. ^ "Augmented reality may revolutionize Army training | U.S. Army Research Laboratory". www.arl.army.mil. Retrieved 22 August 2018.
  28. ^ a b c Benyon, David (2014). Spaces of Interaction, Places for Experience (1 ed.). Morgan and Claypool. p. 97. ISBN 9781608457724.
  29. ^ a b Benyon, David (July 2012). "Presence in Blended Spaces". Interacting with Computers. 24 (4): 219–226. doi:10.1016/j.intcom.2012.04.005.
  30. ^ a b Benyon, David; Imaz, Manuel (2007). Designing with blends (1 ed.). Cambridge, Mass. & London: MIT Press. pp. 209–218. ISBN 9780262090421.
  31. ^ Dourish, Paul. Implications for Design. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. dl.acm.org. SIGHCI. doi:10.1145/1124772.1124855.
  32. ^ Buxton, Bill (2009). "Mediaspace – Meaningspace – Meetingspace". Media Space 20 + Years of Mediated Life. Computer Supported Cooperative Work. Springer. pp. 217–231. doi:10.1007/978-1-84882-483-6_13. ISBN 978-1-84882-482-9.
  33. ^ Benyon, David (2012). Designing Blended Spaces (PDF). BCS-HCI '12 Proceedings of the 26th Annual BCS Interaction Specialist Group Conference on People and Computers. dl.acm.org. BCS-HCI. pp. 398–403.
  34. ^ Sena, Pete (30 January 2016). "How The Growth Of Mixed Reality Will Change Communication, Collaboration And The Future Of The Workplace". TechCrunch. Retrieved 16 May 2017.
  35. ^ "Manufacturers are successfully using mixed reality today". The Manufacturer.
  36. ^ Bingham and Conner, The New Social Learning, Chapter 6: "Immersive Environments Refine Learning".
  37. ^ Bruno, Fabio; Barbieri, Loris; Muzzupappa, Maurizio (2020). "A Mixed Reality system for the ergonomic assessment of industrial workstations". International Journal on Interactive Design and Manufacturing. 14 (3): 805–812. doi:10.1007/s12008-020-00664-x. S2CID 225517293.
  38. ^ "Virtual Reality Design: User Experience Design Software". dummies. Retrieved 7 March 2024.
  39. ^ a b "Object Permanence - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 7 March 2024.
  40. ^ "Taipei hits highs in Medica 2017". healthcare-in-europe.com. Retrieved 5 April 2019.
  41. ^ a b c d "Mixed Reality vs. Augmented Reality vs. Virtual Reality: Their Differences and Use in Healthcare". Brainlab. Retrieved 7 March 2024.
  42. ^ M. Pell, Envisioning Holograms Design Breakthrough Experiences for Mixed Reality, 1st ed. 2017. Berkeley, CA: Apress, 2017.
  43. ^ Mixed-reality headsets in hospitals help protect doctors and reduce need for PPE
  44. ^ Wish-Baratz, Susanne; Crofton, Andrew R.; Gutierrez, Jorge; Henninger, Erin; Griswold, Mark A. (1 September 2020). "Assessment of Mixed-Reality Technology Use in Remote Online Anatomy Education". JAMA Network Open. 3 (9): e2016271. doi:10.1001/jamanetworkopen.2020.16271. ISSN 2574-3805. PMC 7499123. PMID 32940677.
  45. ^ Lunka, Ryan (3 November 2015). "What is product content management? | nChannel Blog". www.nchannel.com. Retrieved 7 March 2024.
  46. ^ Melroseqatar.com. 2020. MELROSE Solutions W.L.L. [online] Available at: http://www.melroseqatar.com/reality-technologies.html [Accessed 25 October 2020].
  47. ^ Chakraborti, Tathagata; Sreedharan, Sarath; Kulkarni, Anagha; Kambhampati, Subbarao (October 2018). "Projection-Aware Task Planning and Execution for Human-in-the-Loop Operation of Robots in a Mixed-Reality Workspace". 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid: IEEE. pp. 4476–4482. doi:10.1109/IROS.2018.8593830. ISBN 978-1-5386-8094-0. S2CID 13945236.
  48. ^ a b Tu, Xinyi; Autiosalo, Juuso; Jadid, Adnane; Tammi, Kari; Klinker, Gudrun (12 October 2021). "A Mixed Reality Interface for a Digital Twin Based Crane". Applied Sciences. 11 (20): 9480. doi:10.3390/app11209480. ISSN 2076-3417.
  49. ^ "Adopting New Technologies for Effective Procurement - SIPMM Publications". publication.sipmm.edu.sg. 29 January 2018. Retrieved 1 November 2022.
  50. ^ a b c Cruz-Neira, Carolina; Sandin, Daniel J.; DeFanti, Thomas A.; Kenyon, Robert V.; Hart, John C. (June 1992). "The CAVE: audio visual experience automatic virtual environment". Communications of the ACM. 35 (6): 64–72. doi:10.1145/129888.129892. ISSN 0001-0782. S2CID 19283900.
  51. ^ a b Demidova, Liliya (2016). Ivanova, S.V.; Nikulchev, E.V. (eds.). "Augmented Reality and ARToolkit for Android: the First Steps". SHS Web of Conferences. 29: 02010. doi:10.1051/shsconf/20162902010. ISSN 2261-2424.
  52. ^ a b Ottosson, Stig (June 2002). "Virtual reality in the product development process". Journal of Engineering Design. 13 (2): 159–172. doi:10.1080/09544820210129823. ISSN 0954-4828. S2CID 110260269.
  53. ^ a b Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan (4 March 2013). Dolinsky, Margaret; McDowall, Ian E. (eds.). CAVE2: a hybrid reality environment for immersive simulation and information analysis. The Engineering Reality of Virtual Reality 2013. SPIE Proceedings. Vol. 8649. Burlingame, California, US. pp. 9–20. doi:10.1117/12.2005484. S2CID 6700819.
  54. ^ "Spatial Disorientation in Aviation: Historical Background, Concepts, and Terminology", Spatial Disorientation in Aviation, Progress in Astronautics and Aeronautics, Reston, VA: American Institute of Aeronautics and Astronautics, pp. 1–36, January 2004, doi:10.2514/5.9781600866708.0001.0036, ISBN 978-1-56347-654-9, retrieved 5 November 2020
  55. ^ "Fault-Tolerant Avionics", Digital Avionics Handbook, CRC Press, pp. 481–504, 20 December 2000, doi:10.1201/9781420036879-37, ISBN 978-0-429-12485-3, retrieved 5 November 2020
  56. ^ Alabaster, Jay (28 June 2013). "Pioneer launches car navigation with augmented reality, heads-up displays". Computerworld. Retrieved 5 November 2020.
  57. ^ Shibata, Takashi (April 2002). "Head mounted display". Displays. 23 (1–2): 57–64. doi:10.1016/S0141-9382(02)00010-0.
  58. ^ "Oculus Device Specifications | Oculus Developers". developer.oculus.com. Retrieved 5 November 2020.
  59. ^ "VIVE Specs & User Guide - Developer Resources". developer.vive.com. Archived from the original on 23 October 2020. Retrieved 5 November 2020.
  60. ^ "Evaluating the Microsoft Hololens Through an Augmented Reality Assembly Application". doi:10.1117/12.2262626.5460168961001. Retrieved 5 November 2020.
  61. ^ Crecente, Brian (20 December 2017). "Magic Leap: Founder of Secretive Start-Up Unveils Mixed-Reality Goggles". Variety. Retrieved 5 November 2020.
  62. ^ Rauschnabel, Philipp A.; Rossmann, Alexander; tom Dieck, M. Claudia (November 2017). "An adoption framework for mobile augmented reality games: The case of Pokémon Go". Computers in Human Behavior. 76: 276–286. doi:10.1016/j.chb.2017.07.030. S2CID 45215074.
  63. ^ "Take off to your next destination with Google Maps". Google. 8 August 2019. Retrieved 5 November 2020.

External links

Media related to Mixed reality at Wikimedia Commons