Artificial Intelligence

Hadoop Architecture in Big Data: YARN, HDFS, and MapReduce

What is Hadoop? | What is Hadoop Architecture? | HDFS Architecture | YARN Architecture | MapReduce | The Takeaway

Do you want to know more about the Hadoop architecture in Big Data? HDFS, MapReduce, and YARN are the three important components of Hadoop. In this tutorial, you will learn the Apache Hadoop HDFS and YARN architecture in detail.

Hadoop Architecture Tutorial YARN HDFS

What is Hadoop?

Hadoop is an open-source framework that allows for the distributed processing of large datasets across clusters of computers using simple programming models. It is an Apache project used to store, process, and analyze data that is very huge in volume. The summary of the Hadoop framework is as follows: Read More »Hadoop Architecture in Big Data: YARN, HDFS, and MapReduce
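The MapReduce model mentioned above can be sketched in a few lines: a mapper emits key-value pairs, a shuffle groups them by key, and a reducer aggregates each group. Below is a minimal single-machine word-count sketch of that flow in Python (the function names are illustrative, not Hadoop APIs; a real job would run the mapper and reducer as Hadoop Streaming scripts over data stored in HDFS):

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    """Map step: emit a (word, 1) pair for every word in one input line."""
    return [(word.lower(), 1) for word in line.split()]

def reducer(word, counts):
    """Reduce step: Hadoop delivers all values for one key together
    after the shuffle/sort phase; here we just sum them."""
    return word, sum(counts)

def run_job(lines):
    """Simulate map -> shuffle -> reduce on a single machine."""
    shuffled = defaultdict(list)
    for key, value in chain.from_iterable(mapper(l) for l in lines):
        shuffled[key].append(value)          # the shuffle groups by key
    return dict(reducer(k, v) for k, v in shuffled.items())
```

In a real cluster the map tasks run on the nodes holding the HDFS blocks, and YARN schedules the map and reduce containers; the data flow, however, is exactly this map, group-by-key, reduce pattern.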

AWS vs Azure vs Google Cloud: A Comparative Review

A comparative review of AWS vs Azure vs Google Cloud is very important in the modern context. A code-free, complete machine learning lifecycle is the key trend for modern cloud computing platforms.

As companies move toward code-free deep learning and other high-end IoT technologies, selecting a suitable cloud platform is vital for corporate growth. In this article, we’re going to help you decide between the three giants of cloud computing.

Best of AWS Azure and Google Cloud Platforms

Cloud Computing Foundation

Before rising to prominence in the cloud market, Google, Amazon, and Microsoft were already market leaders in their respective fields. Each has a unique advantage. All three major cloud providers have also created many general-purpose services that are relatively easy for end users to use.

Price, speed, performance, flexibility, and features are the five key criteria normally used to select the best cloud computing platform for your jobs. 

Cloud computing makes it easy for enterprises to experiment with machine learning capabilities and scale up as projects go into production and demand for those features increases.

GPUs are the key hardware and the processors of choice in cloud computing for many complex machine learning applications because they significantly reduce processing time.… Read more..

Transfer Learning: A Step-by-Step Easy Guide

Considering the lengthening timelines for deep machine learning and AI projects to fight against COVID-19, interest in transfer learning has grown significantly. Transfer learning for deep machine learning is the process of first training a base network on a benchmark dataset (like ImageNet), and then transferring the best-learned network features (the network’s weights and structures) to a second network to be trained on a target dataset. This idea has been shown to significantly improve the generalization capabilities of deep neural networks in many application areas.

Transfer learning is currently used in almost every deep learning model when the target dataset does not contain enough labeled data. Building deep learning models from scratch and training them on huge datasets is very expensive, in both time and resources. Transfer learning is very effective for rapid prototyping, resource efficiency, and high performance. Just as the human brain carries forward knowledge and wisdom and learns from others, transfer learning mimics this type of behavior.

Transfer Learning Base Models

To design an efficient neural network model, you need to know the details of the different base models, because it is from the base model that you will transfer the knowledge to your new model. Here, knowledge means the network structures and the weights.… Read more..
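The weight-transfer idea described above can be illustrated without any deep learning library. In this minimal NumPy sketch (all names, sizes, and the toy task are illustrative assumptions, not part of any real pipeline), a "pretrained" hidden layer stands in for a base network trained on a benchmark dataset; its weights are frozen and reused as a feature extractor, and only a small new head is fitted on the target task:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" base layer: these weights stand in for features a base
# network learned on a large benchmark dataset such as ImageNet.
W_base = rng.normal(size=(4, 16))            # 4 inputs -> 16 hidden features

def base_features(X):
    """Frozen feature extractor: the base network's weights, transferred as-is."""
    return np.tanh(X @ W_base)

# Target task: a small labeled dataset, too small to train from scratch.
X = rng.normal(size=(50, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Train ONLY the new head (a linear layer) on top of the frozen features --
# here with a closed-form least-squares fit instead of gradient descent.
H = base_features(X)
w_head, *_ = np.linalg.lstsq(H, y, rcond=None)

accuracy = ((H @ w_head > 0.5).astype(float) == y).mean()
```

In a real framework such as Keras or PyTorch the same pattern appears as loading pretrained weights, setting the base layers to non-trainable, and training a freshly initialized head on the target data.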


Key Artificial Intelligence Projects to Fight Against COVID-19

Dr. Amit Ray
Compassionate AI Lab

In our Compassionate AI Lab, we have broadly classified our Artificial Intelligence (AI) based research projects in the fight against COVID-19 into six groups: AI for COVID vaccine development, AI for COVID drug discovery, AI for COVID diagnosis, AI for COVID testing, AI for COVID growth-rate forecasting, and AI for social robots.

Researchers across the world are urgently searching for drugs and vaccines that can save the lives of millions of infected people and perhaps prevent infections for future generations. The excessive time, cost, and high failure rate of the traditional path of drug discovery and vaccine development have prompted the need for efficient use of machine learning techniques. In these projects, we are trying to solve one of the most complex problems humanity has ever encountered.

1. AI for COVID 19: An Overview

The massive outbreak of COVID-19 has prompted various scientists, researchers, laboratories, and organizations around the world to conduct large-scale research to help develop vaccines and other treatment strategies. The biology and medicine of the coronavirus are data-rich, complex, and often ill-understood. Problems of this nature may be particularly well suited to deep learning techniques.… Read more..

Top 10 Limitations of Artificial Intelligence and Deep Learning

Artificial Intelligence (AI) has provided remarkable capabilities and advances in image understanding, voice recognition, face recognition, pattern recognition, natural language processing, game planning, military applications, financial modeling, language translation, and search engine optimization. In medicine, deep learning is now one of the most powerful and promising tools of AI, one that can enhance every stage of patient care – from research, omics data integration, combating antibiotic-resistant bacteria, and drug design and discovery, to diagnosis and selection of appropriate therapy. It is also the key technology behind self-driving cars.

However, the deep learning algorithms of AI have several inbuilt limitations. To utilize the full power of artificial intelligence, we need to know its strengths and weaknesses, and the ways to overcome those limitations in the near future.

Now, AI-supported messaging apps and voice-controlled chatbots are helping people with deep-space communications and customer care, taking the burden off medical professionals for easily diagnosable health concerns or quickly solvable health management issues, and serving in many other applications. However, many obstacles remain, and a number of issues are unsolved.

Even with so many successes and promising results, its full application is limited, mainly because present-day AI has no common sense about the world or about human psychology. Presently, in complex application areas, one part is solved by the AI system and the other part by a human; this is often called a human-assisted AI system. The challenges are mostly in large-scale application areas such as drug discovery, multi-omics data integration, assisting elderly people, new material design and modeling, computational chemistry, quantum simulation, and aerospace physics.

This article explains the power and the challenges of current AI technologies and learning algorithms. It also provides directions for overcoming the limits of AI technologies to achieve higher-level learning capabilities.

Read More »Top 10 Limitations of Artificial Intelligence and Deep Learning

10 Quantum Machine Learning Properties By Amit Ray

Quantum Machine Learning and the Deep Intelligence Frameworks: The 10 Key Properties

Dr. Amit Ray, Compassionate AI Lab

In this article, we discuss the 10 properties and characteristics of hybrid classical-quantum machine learning approaches for our Compassionate AI Lab projects. Quantum computers with the power of machine learning will disrupt every industry. They will change the way we live in this world and the way we fight diseases, care for old and blind people, invent new medicines and new materials, and solve health, climate, and social issues. Similar to the 10 V’s of big data, we have identified 10 M’s of quantum machine learning (QML). These 10 properties of quantum machine learning can be argued, debated, and fine-tuned for further refinement.

The field of hybrid classical-quantum machine learning has been maturing rapidly with the availability of the prototype universal gate-model quantum processors of IBM, Google, and Rigetti, as well as the more sophisticated quantum annealers of D-Wave Systems. Moreover, the availability of various high-performance computing (HPC) facilities for simulating quantum circuits improves the possibilities of exploring the power of QML in various application areas. Quantum hardware dedicated to machine learning is also becoming a reality. It too can provide much faster processing power than a general-purpose quantum computer.

Classical Quantum Hybrid

Hybrid Classical Quantum Machine Learning

The Compassionate AI Lab is currently developing a hybrid classical-quantum machine learning (HQML) framework – a quantum computing virtual plugin to build a bridge between the available quantum computing facilities and classical machine learning software like TensorFlow, Scikit-learn, Keras, XGBoost, LightGBM, and cuDNN.

Presently, the hybrid classical-quantum machine learning (HQML) framework includes quantum learning algorithms such as: Quantum Neural Networks, Quantum Boltzmann Machines, Quantum Principal Component Analysis, the Quantum k-means algorithm, the Quantum k-medians algorithm, Quantum Bayesian Networks, and Quantum Support Vector Machines.
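As a toy illustration of how the classical and quantum parts of such a hybrid scheme interact (this is a plain NumPy sketch under illustrative assumptions, not the HQML framework itself), the loop below simulates a one-qubit variational circuit: the "quantum" step evaluates an expectation value, and a classical optimizer updates the gate parameter using the parameter-shift rule:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                 # initial single-qubit state |0>
Z = np.array([[1.0, 0.0], [0.0, -1.0]])     # Pauli-Z observable

def ry(theta):
    """The parameterized gate: a rotation about the y-axis."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def expectation(theta):
    """'Quantum' step: prepare RY(theta)|0> and measure <Z> (simulated here;
    on real hardware this value would come from repeated circuit runs)."""
    state = ry(theta) @ ket0
    return state @ Z @ state                # equals cos(theta)

# Classical step: gradient descent using the parameter-shift rule,
# driving <Z> toward its minimum of -1 (i.e. the |1> state).
theta = 0.1
for _ in range(100):
    grad = 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))
    theta -= 0.4 * grad
```

This quantum-evaluate / classically-update loop is the basic shape of variational hybrid algorithms; frameworks of the kind described above replace the simulated expectation with calls to real quantum backends.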

Read More »Quantum Machine Learning The 10 Key Properties 

Machine Learning to Fight Antimicrobial Resistance


The seven top machine learning projects to fight antimicrobial resistance are explained. Antimicrobial resistance is one of the key causes of human suffering in modern hospitals. Preventing microbes from developing resistance to drugs has become an important issue for treating illnesses across the world. Artificial intelligence, machine learning, genomics, and multi-omics data integration are fast-growing emerging technologies for countering antimicrobial resistance. Here, Dr. Amit Ray explains how these technologies can be used in seven key areas to counter antimicrobial resistance issues.

Five Key Benefits of Quantum Machine Learning


Here, Dr. Amit Ray discusses the five key benefits of quantum machine learning. 

Quantum machine learning is evolving very fast and gaining enormous momentum due to its huge potential. Quantum machine learning is the key technology for future compassionate artificial intelligence. In our Compassionate AI Lab, we have conducted several experiments on quantum machine learning in the areas of drug-discovery, combating antibiotic resistance bacteria, and multi-omics data integration. 

We have realized that in the areas of drug design and multi-omics data integration, the power of deep learning is very much restricted on classical computers. Hence, with limited facilities, we have conducted much hybrid classical-quantum machine learning algorithm testing at our Compassionate AI Lab.

Five Benefits of Quantum Machine Learning

Read More »Five Key Benefits of Quantum Machine Learning

What’s Holding Back Machine Learning in Healthcare

What is holding back the large-scale implementation of machine learning systems in healthcare and precision medicine? In this article, Dr. Amit Ray explains the key obstacles and challenges of implementing large-scale machine learning systems in healthcare. Dr. Ray argues that a lack of deeper integration, and an incomplete understanding of the underlying molecular processes of the diseases it is intended to treat, may limit the progress of implementing large-scale, reliable machine learning based systems in healthcare. Here, nine obstacles facing present-day machine learning systems in healthcare are discussed.

What’s Holding Back Machine Learning in Healthcare

Machine Learning in Healthcare

Recently, machine learning algorithms, especially deep learning, have shown impressive performance in many areas of medical science, especially in classifying imaging data in different clinical domains. In academic environments, the deep learning and reinforcement learning methods of Artificial Intelligence (AI) have shown tremendous success in numerous clinical areas such as: omics data integration (such as genomics, proteomics, or metabolomics), prediction of drug-disease correlations based on gene expression, and finding combinations of drugs that should not be taken together. Deep learning is very successful in predicting cancer outcomes based on tumour tissue images. Machine learning is used in medical decision support systems for ICU and critical care. Artificial Intelligence in Healthcare Current Trends discusses the current status of AI in healthcare. Read More »What’s Holding Back Machine Learning in Healthcare

Roadmap for 1000 Qubits Fault-tolerant Quantum Computers

How many qubits are needed to outperform conventional computers, how to protect a quantum computer from the effects of decoherence, and how to design fault-tolerant large-scale quantum computers with more than 1000 qubits – these are the three basic questions we want to deal with in this article. Qubit technologies, qubit quality, qubit count, qubit connectivity, and qubit architectures are the five key areas of quantum computing discussed.

Roadmap for 1000 Qubits Fault-tolerant Quantum Computers

Earlier we have discussed 7 Core Qubit Technologies for Quantum Computing, 7 Key Requirements for Quantum Computing, Spin-orbit Coupling Qubits for Quantum Computing and AI, Quantum Computing Algorithms for Artificial Intelligence, Quantum Computing and Artificial Intelligence, Quantum Computing with Many World Interpretation: Scopes and Challenges, and Quantum Computer with Superconductivity at Room Temperature. Here, we will focus on practical issues related to designing large-scale quantum computers.

Instead of running on zeros and ones, quantum computers run on an infinite number of states between zero and one. Instead of performing one calculation before moving on to the next, quantum computers can manage multiple processes simultaneously.

Unlike the binary bits of information in ordinary computers, “qubits” consist of quantum particles that have some probability of being in each of two states, represented as |0⟩ and |1⟩, simultaneously. When qubits interact, their possible states become interdependent (entangled), each one’s chances of |0⟩ and |1⟩ hinging on those of the other. Moreover, quantum information does not have to be encoded into binary bits; it can also be encoded into continuous observables.

The speed requirements of various applications grow with the complexity of the problems, and the speed advantage of quantum computers over classical computers is enormous. The key to quantum computation speed is that every additional qubit doubles the potential computing power of a quantum machine.
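The qubit behaviour described above can be made concrete with a tiny state-vector simulation (plain NumPy, for illustration only): an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the state a classical simulator must track, and a Hadamard gate followed by a CNOT entangles two qubits into the Bell state (|00⟩ + |11⟩)/√2:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                            # the |0> basis state
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# A 2-qubit register has 2**2 = 4 amplitudes; each extra qubit doubles this.
state = np.kron(ket0, ket0)                 # start in |00>
state = np.kron(H, np.eye(2)) @ state       # put the first qubit in superposition
state = CNOT @ state                        # entangle the two qubits

# state is now the Bell state (|00> + |11>)/sqrt(2): measuring the qubits
# always yields perfectly correlated outcomes, as described above.
```

The exponential 2^n growth of this amplitude vector is precisely why classical simulation breaks down beyond a few dozen qubits, and why quantum hardware can offer such a large speed advantage.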

The objective of 1000-qubit fault-tolerant quantum computing is to compute accurately even when gates have a high probability of error each time they are used. Theoretically, accurate quantum computing is possible with error probabilities above 3% per gate, which is significantly high. The resources required for quantum computing depend on the error probabilities of the gates. It is possible to implement non-trivial quantum computations at error probabilities as high as 1% per gate. Read More »Roadmap for 1000 Qubits Fault-tolerant Quantum Computers