Artificial Intelligence Technology

Alpha, the Pilot

The year 2016 ushered in an interesting development in the field of Artificial Intelligence when Alpha, an Artificial Intelligence-powered computer pilot, was pitted against a human pilot in simulated scenarios at the Air Force Research Laboratory (AFRL) in the USA. Alpha was created using a Genetic Fuzzy Tree (GFT) system, which involves the 'breeding' of several algorithms to enable self-learning amongst the systems. Instructions on weapons engagement were taught using linguistic classifications, which mirrors the way humans decide and act.
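
ALPHA's actual Genetic Fuzzy Tree implementation is proprietary, but the core idea can be sketched: human-readable linguistic rules (for instance, "if the target is close, then engage") whose fuzzy boundaries are bred by a genetic algorithm rather than set by hand. The rule, membership shape and numbers below are entirely invented for illustration.

```python
import random

def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, peaking at 1.0 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def engage_confidence(target_range_km, params):
    """One linguistic rule: 'IF target is CLOSE THEN engage'.

    params = (a, b, c) defines what 'CLOSE' means; the genetic loop evolves it.
    """
    a, b, c = params
    return triangular(target_range_km, a, b, c)

def fitness(params, scenarios):
    """Score a parameter set: reward engaging inside 10 km, punish outside."""
    score = 0.0
    for rng in scenarios:
        conf = engage_confidence(rng, params)
        score += conf if rng < 10 else -conf
    return score

# Crude genetic loop: mutate the best candidate each generation and keep
# whichever definition of 'CLOSE' scores better on the simulated scenarios.
scenarios = [random.uniform(0, 40) for _ in range(200)]
best = (0.0, 5.0, 20.0)
for _ in range(300):
    child = tuple(sorted(max(0.0, v + random.gauss(0, 1)) for v in best))
    if fitness(child, scenarios) > fitness(best, scenarios):
        best = child

print("Evolved definition of 'CLOSE' (km):", [round(v, 1) for v in best])
```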

The result of the simulated fights between Alpha and the human pilot surprised everyone: Alpha won every simulated scenario.

In subsequent simulated runs, Alpha was progressively handicapped relative to the human pilot with an inferior aircraft, lower speeds, shorter missile range and inferior sensors. The results, however, did not change: Alpha was able to both evade and attack the human pilot, switching between offensive and defensive modes seamlessly.

This test has prompted the AFRL to explore the feasibility of using Alpha for Unmanned Combat Aerial Vehicles (UCAVs). In order to further understand the technology, it is important to grasp the basic concepts.

The Definition

The definition of Artificial Intelligence is continuously evolving with the rapid advance of the technology itself, and therefore only an approximate boundary can be outlined around the concept of Artificial Intelligence. In the 1980s, Artificial Intelligence was defined as the study of how to make computers do things which, at the moment, people do better.

The modern-day definition of Artificial Intelligence has evolved with research, and the terms Artificial Intelligence and Machine Learning are now used interchangeably. Research in Artificial Intelligence is defined as the study of intelligent agents: any device that perceives its environment and takes actions that maximize its chances of successfully achieving its goals. In general open sources, the term Artificial Intelligence is also applied when machines mimic cognitive functions that humans associate with human minds, such as 'learning' and 'problem solving'. In order to better appreciate the present-day applications of Artificial Intelligence, it is prudent to understand the growth of this technology.
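
Before turning to that growth, the 'intelligent agent' definition above can be made concrete with a deliberately trivial sketch; the thermostat-style environment, goal and numbers here are invented purely to illustrate the perceive-then-act loop.

```python
def perceive(environment):
    # The agent's only sensor: read the current temperature.
    return environment["temperature"]

def act(temperature, target=21.0):
    # Choose whichever action moves the perceived state closer to the goal.
    return "heat" if temperature < target else "cool"

environment = {"temperature": 18.0}
print(act(perceive(environment)))  # -> "heat"
```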

Growth of Artificial Intelligence

The invention of the programmable digital computer, a machine based on the abstract essence of mathematical reasoning, inspired a handful of scientists to think about the possibility of a digital brain. A workshop on the campus of Dartmouth College in the summer of 1956 paved the way for future research as well as the setting up of an Artificial Intelligence Lab at the Massachusetts Institute of Technology. Limitations in terms of computing power, storage capacity and hardware then brought about a lull in Artificial Intelligence research, widely regarded as the 'AI Winters'. Elaine Rich and Kevin Knight have highlighted in their book 'Artificial Intelligence' that the main reason leading to the 'AI Winters' was the researchers' approach of making computers learn language skills and common-sense reasoning. Since there was no database like the human brain where computers could store a large amount of data to build up knowledge, natural language understanding and common sense emerged as major hurdles.

However, research in the field of expert tasks, comprising applications that could be bound within the confines of algorithms linking objects to the tasks to be undertaken, flourished, with wide-ranging applications in engineering, medicine and finance. The next breakthrough arrived with advances in the field of semiconductors, which brought faster processing speeds coupled with higher data storage capacity. This led to the emergence of Machine Learning and Deep Learning (also known as Artificial Neural Networks).

Machine Learning

Machine Learning is the practice of using algorithms to parse data, learn from it, and then make a determination or a prediction about something in the world. Therefore, instead of writing specific software code with a set of instructions to accomplish a particular task, the machine is trained using large amounts of data (which translates into a knowledge base for the computer system) and algorithms that allow the system to learn how to perform the task. Statistics and probability theory are the major tools on which Machine Learning is modelled, and therefore the more data available to the system, the better the predictions from a Machine Learning system. However, the widely acknowledged flip side of Machine Learning is that even small variations in the data (or the knowledge base) of the system can bring forth large variations in its predictions. The 'flower crown' selfies of the mobile app Snapchat, ride times and recommended routes on the mobile app Uber, the automatic spam filtering of Gmail and the content recommendations of Netflix are some of the most common applications of Machine Learning systems.
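
A short sketch can make the contrast with conventional programming concrete. Echoing the Gmail example, the toy 'spam filter' below learns its rules from labelled examples, with no hand-written 'if the message contains X' logic anywhere; it uses the scikit-learn library, and the messages and labels are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = [
    "win a free prize now", "claim your free reward",      # spam
    "meeting moved to friday", "lunch at noon tomorrow",   # legitimate
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = legitimate

# Turn each message into word counts, then let the model learn which
# words are associated with which label from the training data alone.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = MultinomialNB().fit(X, labels)

test = ["free prize waiting", "see you at the meeting"]
print(model.predict(vectorizer.transform(test)))  # e.g. [1 0]
```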

Deep Learning

Deep Learning is a subset of Machine Learning and is also known as 'Artificial Neural Network' technology. Research in the field of Artificial Neural Networks has been in progress for decades; however, the real, implementable breakthrough came about in 2006. The technology is inspired by the neurons in the human brain and their interconnections with each other and with the nervous system through synapses. Each artificial neuron receives inputs and, based on assigned weightages, produces an output that is either affirmative (one) or negative (zero). The output of a neuron can thus be controlled by adjusting its assigned weightages. Networks of such artificial neurons are built and then layered one on top of the other. Any data fed into the neural network passes through multiple layers with adjusted weightages so that the network can make sense of the input data and compare it with the desired output. The self-learning aspect is achieved as the system itself adjusts the weightage factors to achieve the most suitable output. In simpler terms, a Deep Learning system first learns, on its own, the attributes required to perform a task and then proceeds to undertake the task.
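
The weighted-neuron arithmetic described above can be sketched in a few lines. The weights below are random placeholders; a real Deep Learning system would adjust them over many training passes (via backpropagation) so that the output converges on the desired one.

```python
import numpy as np

def sigmoid(z):
    # Squashes any weighted sum into the interval (0, 1), i.e. toward
    # the "negative" or "affirmative" output described in the text.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # layer 1: 3 inputs -> 4 neurons
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # layer 2: 4 neurons -> 1 output

x = np.array([0.2, 0.7, 0.1])        # some input data
hidden = sigmoid(W1 @ x + b1)        # outputs of the first layer of neurons
output = sigmoid(W2 @ hidden + b2)   # final prediction, between 0 and 1
print(output)
```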

Deep Learning systems are known to give very accurate predictions as well as to draw correlations that are impossible for human brains to predict. Since the system learns on its own, it does take more time in the initial learning phases. However, a Deep Learning system does not provide any insight into how a prediction has been made, and this remains the most concerning aspect of the technology. In the event of an incorrect or dangerous outcome from a Deep Learning system, there would be no methodology to ascertain the reasons, since it is not possible to trace the firing of the neurons in the system. Some everyday applications include product recommendations on Amazon, voice-to-text and voice search from Google, Siri from Apple and Alexa from Amazon. The ability of Artificial Intelligence-powered systems to display quick decision-making abilities after the initial 'self-learning' process opens up applications in systems that aid the Commander of a warship at sea in decision-making processes concerning the threats or targets in the operating environment.

Threats During Naval Warfare and Credible Maritime Domain Awareness

Aerial Threats

Surface to Surface Missiles (SSMs), Anti-Ship Missiles (AShMs) and enemy aircraft are the typical aerial threats that a warship or a fleet faces whilst at sea. Measures to mitigate these threats are available either on the warship itself or within the task force that the warship is a part of. However, modern missiles adopt several detection-evasion measures coupled with very high speeds. Modern missiles already achieve Mach 5, and the development of missiles attaining speeds up to Mach 8 is likely before the end of this decade. Such high speeds translate into a very small reaction time for the ship's crew. Therefore, early detection of aerial threats, enabling the launch of threat-mitigation measures, is mandatory for a warship or a fleet, considering its inherently low speed. Launching missile decoy systems and Surface to Air Missiles (SAMs) as part of an Area Defense System or a Point Defense System provides the warship or the fleet its only measure of self-protection against the threats highlighted above. It is therefore important that a warship detects an incoming threat in the lowest possible time frame, so that self-protection measures can be initiated and the ship has a high chance of survival.
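
The reaction times implied by the speeds quoted above can be worked out with simple arithmetic. The sketch below assumes a sea-level speed of sound of roughly 343 m/s and a missile flying straight in at constant speed, a deliberate simplification of real engagement geometry.

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate value at sea level

def reaction_time_s(detection_range_km, mach):
    # Time between detection and impact for a missile closing head-on.
    return detection_range_km * 1000.0 / (mach * SPEED_OF_SOUND)

for rng_km in (40, 20, 10):
    t5 = reaction_time_s(rng_km, 5)
    t8 = reaction_time_s(rng_km, 8)
    print(f"detected at {rng_km:2d} km: Mach 5 -> {t5:5.1f} s, Mach 8 -> {t8:5.1f} s")
```

Even with detection at 40 km, a Mach 5 missile leaves the crew barely 23 seconds, and a Mach 8 missile under 15, which underlines why early detection dominates everything else.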

Underwater Threats

A warship at sea is always prone to attacks from torpedoes launched by enemy warships or submarines, as well as from different kinds of mines. Supercavitating torpedoes leave very little time for the warship to launch countermeasures and take evasive action. In this case too, early detection becomes the most critical factor, followed by immediate self-protection measures in the lowest possible timeframe. As stated earlier, establishing credible Maritime Domain Awareness (MDA) is the only solution against these threats.
