
DEEP LEARNING: NEURAL NETWORK AND BEYOND

Dr. S. Suganya is an Associate Professor of Electrical and Electronics Engineering at AAA College of Engineering and Technology, Sivakasi. She received her doctorate from Anna University Chennai in the field of electric vehicle charge scheduling and is the recipient of a Young Researcher Award. She is also an Anna University recognized research supervisor in Electrical Engineering. She has about 8 research publications and 2 patent publications, and has presented her work at many national and international conferences. Her research interests extend to securing funded projects and writing proposals. She has successfully completed a DST-SERB funded project as a Junior Research Fellow and has also conducted SERB-sponsored seminars for the benefit of the student community.

Dr. Sunila is a Professor in the Department of Computer Science and Engineering at Guru Jambeshwar University of Science and Technology, Hisar, Haryana, India, a UGC-recognized university accredited with an A+ grade by NAAC and designated as having Potential for Excellence. She holds a Ph.D. and an M.Tech in Computer Science and Engineering. Her areas of interest are cloud computing and machine learning. She has more than 19 years of teaching and research experience and has published more than 70 papers in Scopus- and UGC-indexed international journals and in international and national conferences.

Sivasubramanian Balasubramanian is a distinguished computer scientist and authority on deep learning technologies. With an M.Sc. in Data Science from University College Cork, Siva has dedicated his career to advancing the understanding and application of neural networks and deep learning algorithms. As a researcher, Sivasubramanian has been at the cutting edge of developing innovative deep learning models that push the boundaries of AI capabilities. His work has contributed significantly to areas such as image and speech recognition, natural language processing, and autonomous systems, paving the way for new AI applications that are more intuitive and effective.

Dr. Haewon Byeon received his DrSc degree in Biomedical Science from Ajou University School of Medicine. He currently works in the Department of Medical Big Data, Inje University. His recent interests focus on health promotion, AI in medicine, and biostatistics. He is a member of the international committee for Frontiers in Psychiatry and serves on the editorial board of the World Journal of Psychiatry. He has also worked as Principal Investigator on four projects funded by the Ministry of Education, the Korea Research Foundation, and the Ministry of Health and Welfare. Byeon has published more than 343 articles and 19 books.

Description

Deep learning has brought about a revolution in the field of artificial intelligence by providing sophisticated tools that can be used to solve difficult problems in a variety of fields. At the heart of deep learning is the neural network, a computational model inspired by the structure and function of the human brain. Neural networks are made up of interconnected nodes, or neurons, arranged in layers. Each neuron processes its input data and transmits signals to neurons in the subsequent layer, ultimately producing an output. Neural networks learn from data through backpropagation, a process that adjusts the strength of the connections between neurons in order to reduce the errors in their predictions.

The scope of deep learning, however, extends well beyond basic neural networks, and researchers are continually investigating novel architectures and methods to improve the capabilities of these models. Convolutional neural networks (CNNs), designed for processing grid-like data such as images, use convolutional layers to capture spatial hierarchies in visual input, which enables them to perform tasks such as image classification and object detection with exceptional accuracy. Recurrent neural networks (RNNs) are another key innovation, particularly well suited to sequential data processing tasks such as natural language understanding and time series prediction. In contrast to feedforward networks, RNNs have connections that form directed cycles, giving them a memory of previous inputs; this memory allows RNNs to capture temporal relationships in data, which makes them extremely useful for tasks that require context or continuity.

Beyond these well-established designs, researchers are also exploring models such as transformers and generative adversarial networks (GANs). A GAN is made up of two neural networks, a generator and a discriminator, that are trained in a process of competitive learning. This configuration allows GANs to generate realistic synthetic data, which has a wide range of applications, from drug discovery to image synthesis.
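To make the ideas of layered neurons and backpropagation concrete, the following is a minimal sketch (not taken from the book) of a small two-layer network trained with backpropagation in plain NumPy. The layer sizes, learning rate, and XOR toy data are assumptions chosen only for illustration.

# Minimal sketch of a two-layer neural network trained with backpropagation.
# Hidden size, learning rate, and the XOR toy data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data: 4 samples, 2 input features each, 1 target output each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for a 2 -> 8 -> 1 network (hidden size 8 is an arbitrary choice).
W1 = rng.normal(scale=1.0, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass: each layer transforms its input and passes it on.
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network predictions

    # Backpropagation: propagate the prediction error backwards and
    # adjust connection strengths (weights) to reduce it.
    err_out = (out - y) * out * (1 - out)       # error signal at the output
    err_hid = (err_out @ W2.T) * h * (1 - h)    # error signal at the hidden layer

    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0)
    W1 -= lr * X.T @ err_hid
    b1 -= lr * err_hid.sum(axis=0)

# Predictions should approach [0, 1, 1, 0]; exact convergence depends on the
# random initialization.
print(np.round(out, 2))

The same forward/backward pattern underlies the CNNs, RNNs, and GANs discussed above; those architectures differ mainly in how the layers are wired (convolutions, recurrent cycles, or two competing networks), not in the basic learning rule.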
