Machine Learning and Quantum Computing in Biomedical Intelligence

Pradeepta Kumar Sarangi, Shreya Kumari, Mani Sawhney, Amit Vajpayee, Mukesh Rohra, Srikanta Mallik
Copyright: © 2024 | Pages: 20
DOI: 10.4018/979-8-3693-1479-1.ch008

Abstract

The digital world is replete with data: cyber security data, internet of things (IoT) data, enterprise data, mobile data, health data, and more. To analyse these data intelligently and develop automated applications, a working knowledge of artificial intelligence (AI), machine learning (ML), and deep learning (DL) algorithms is essential; in today's technology-driven, digital world, no company can afford to ignore AI or ML. Machine learning, a subfield of artificial intelligence, is the scientific study of the algorithms and statistical models that a computer system utilises to carry out a given task effectively without explicit instructions. This chapter begins with the basics of machine learning and its diverse range of techniques, discusses various classification and clustering methods along with their applications, and concludes with real-world applications, examples, and research developments using machine learning and quantum computing in healthcare.
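As a concrete taste of the clustering methods the chapter goes on to discuss, the short sketch below groups a handful of synthetic patient-style readings with k-means. It is a minimal sketch assuming scikit-learn is available; the feature names and values are invented for illustration and do not come from the chapter.

# Minimal clustering sketch (assumes scikit-learn is installed).
# The two "patient" features (resting heart rate, systolic blood pressure)
# and the readings below are purely illustrative.
import numpy as np
from sklearn.cluster import KMeans

measurements = np.array([
    [62, 112], [65, 118], [60, 110],   # roughly "low" readings
    [88, 145], [92, 150], [85, 148],   # roughly "elevated" readings
])

# Unsupervised learning: k-means groups the readings into two clusters
# without ever being told which readings belong together.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(measurements)
print("Cluster labels:", kmeans.labels_)
print("Cluster centres:", kmeans.cluster_centers_)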

1. Introduction

Humans have been evolving and learning from experience for millions of years, whereas the era of machines and robots has only just begun. Throughout this evolution, people have used many kinds of tools to perform tasks more easily (Sarangi, Sinha, Sinha, & Dubey, 2019). The development of numerous machines is a result of the ingenuity of the human brain, and these technologies have greatly eased human existence by helping people meet a wide variety of needs. Today, such machines and robots must be programmed before they can follow instructions; but what if machines could start to learn by themselves, as shown in Figure 1? This is where machine learning plays a part.

Many promising technological developments of the future are based on ML (Trivedi et al., 2021). The artificially intelligent robot Sophia, Tesla's self-driving cars, Apple's Siri, and many more are just a few of the examples and uses of ML that surround us today.

ML is a sub-domain of AI. It mainly concentrates on using data and algorithms to emulate the way a human learns (Dhiman et al., 2022). The first practical attempt to develop a machine that imitates a living being was carried out by Thomas Ross (1930). Arthur Samuel (1959) defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed."

Figure 1. Human-machine interaction

The purpose of machine learning is to train computers by feeding them data and algorithms. The need for machine learning is rising as more and more data sets become available, and industries around the world are utilising ML to extract useful information from them. The use of algorithms to analyse data is central to machine learning: without being explicitly coded, machine learning algorithms build a mathematical model from historical data, which can then be used to make predictions or data-driven decisions. Figure 2 illustrates the three distinct types of machine learning.

Figure 2. Machine learning types
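To make the "learn from historical data, then predict" idea concrete, the sketch below fits a supervised model (one of the three types in Figure 2) and uses it to make a prediction on an unseen case. It is a minimal sketch assuming scikit-learn; the toy features, labels, and values are invented for illustration only.

# Minimal supervised-learning sketch (assumes scikit-learn is installed).
# Historical examples: [age, blood glucose] with a known outcome label (0/1);
# the numbers are illustrative, not real patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X = np.array([[35, 90], [42, 95], [51, 130], [63, 150],
              [29, 85], [58, 140], [47, 125], [39, 100]])
y = np.array([0, 0, 1, 1, 0, 1, 1, 0])

# Fit a mathematical model to the historical examples ...
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# ... then use it to make data-driven predictions on unseen cases.
print("Held-out accuracy:", model.score(X_test, y_test))
print("Prediction for [55, 135]:", model.predict([[55, 135]]))

The same workflow applies to the unsupervised case, except that no label vector is supplied, as in the clustering sketch shown earlier.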
