Artificial Intelligence By Example: Acquire Advanced AI, Machine Learning, and Deep Learning Design Skills [2nd Edition]

MalwareGod

  • Implementing the KNN algorithm
  • The knn_polysemy.py program
  • Implementing the KNN function in Google_Translate_Customized.py
  • Conclusions on the Google Translate customized experiment
  • Part I – the background to blockchain technology
  • Mining bitcoins
  • Using cryptocurrency
  • Part II – using blockchains to share information in a supply chain
  • Using blockchains in the supply chain network
  • Creating a block
  • Exploring the blocks
  • Part III – optimizing a supply chain with naive Bayes in a blockchain process
  • A naive Bayes example
  • The blockchain anticipation novelty
  • The goal – optimizing storage levels using blockchain data
  • Implementation of naive Bayes in Python
  • Gaussian naive Bayes
  • The original perceptron could not solve the XOR function
  • XOR and linearly separable models
  • Linearly separable models
  • The XOR limit of a linear model, such as the original perceptron
  • Building an FNN from scratch
  • Step 1 – defining an FNN
  • Step 2 – an example of how two children can solve the XOR problem every day
  • Implementing a vintage XOR solution in Python with an FNN and backpropagation
  • A simplified version of a cost function and gradient descent
  • Linear separability was achieved
  • Applying the FNN XOR function to optimizing subsets of data
  • Introducing CNNs
  • Defining a CNN
  • Initializing the CNN
  • Adding a 2D convolution layer
  • Kernel
  • Shape
  • ReLU
  • Pooling
  • Next convolution and pooling layer
  • Flattening
  • Dense layers
  • Dense activation functions
  • Training a CNN model
  • The goal
  • Compiling the model
  • The loss function
  • The Adam optimizer
  • Metrics
  • The training dataset
  • Data augmentation
  • Loading the data
  • The testing dataset
  • Data augmentation on the testing dataset
  • Loading the data
  • Training with the classifier
  • Saving the model
  • Generating profit with transfer learning
  • The motivation behind transfer learning
  • Inductive thinking
  • Inductive abstraction
  • The problem AI needs to solve
  • The Γ gap concept
  • Loading the trained TensorFlow 2.x model
  • Loading and displaying the model
  • Loading the model to use it
  • Defining a strategy
  • Making the model profitable by using it for another problem
  • Domain learning
  • How to use the programs
  • The trained models used in this section
  • The trained model program
  • Gap – loaded or underloaded
  • Gap – jammed or open lanes
  • Gap datasets and subsets
  • Generalizing the Γ (the gap conceptual dataset)
  • The motivation of conceptual representation learning metamodels applied to dimensionality
  • The curse of dimensionality
  • The blessing of dimensionality
  • Planning and scheduling today and tomorrow
  • A real-time manufacturing process
  • Amazon must expand its services to face competition
  • A real-time manufacturing revolution
  • CRLMM applied to an automated apparel manufacturing process
  • An apparel manufacturing process
  • Training the CRLMM
  • Generalizing the unit training dataset
  • Food conveyor belt processing – positive pγ and negative nγ gaps
  • Running a prediction program
  • Building the RL-DL-CRLMM
  • A circular process
  • Implementing a CNN-CRLMM to detect gaps and optimize
  • Q-learning – MDP
  • MDP inputs and outputs
  • The optimizer
  • The optimizer as a regulator
  • Finding the main target for the MDP function
  • A circular model – a stream-like system that neither starts nor ends
  • The public service project
  • Setting up the RL-DL-CRLMM model
  • Applying the model of the CRLMM
  • The dataset
  • Using the trained model
  • Adding an SVM function
  • Motivation – using an SVM to increase safety levels
  • Definition of a support vector machine
  • Python function
  • Running the CRLMM
  • Finding a parking space
  • Deciding how to get to the parking lot
  • Support vector machine
  • The itinerary graph
  • The weight vector
  • Exploring the output of the layers of a CNN in two steps with TensorFlow
  • Building the layers of a CNN
  • Processing the visual output of the layers of a CNN
  • Analyzing the visual output of the layers of a CNN
  • Analyzing the accuracy of a CNN using TensorBoard
  • Getting started with Google Colaboratory
  • Defining and training the model
  • Introducing some of the measurements
  • Defining basic terms and goals
  • Introducing and building an RBM
  • The architecture of an RBM
  • An energy-based model
  • Building the RBM in Python
  • Creating a class and the structure of the RBM
  • Creating a training function in the RBM class
  • Computing the hidden units in the training function
  • Random sampling of the hidden units for the reconstruction and contrastive divergence
  • Reconstruction
  • Contrastive divergence
  • Error and energy function
  • Running the epochs and analyzing the results
  • Using the weights of an RBM as feature vectors for PCA
  • Understanding PCA
  • Mathematical explanation
  • Using TensorFlow's Embedding Projector to represent PCA
  • Analyzing the PCA to obtain input entry points for a chatbot
  • Basic concepts
  • Defining NLU
  • Why do we call chatbots "agents"?
  • Creating an agent to understand Dialogflow
  • Entities
  • Intents
  • Context
  • Adding fulfillment functionality to an agent
  • Defining fulfillment
  • Enhancing the cogfilmdr agent with a fulfillment webhook
  • Getting the bot to work on your website
  • Machine learning agents
  • Using machine learning in a chatbot
  • Speech-to-text
  • Text-to-speech
  • Spelling
  • Why are these machine learning algorithms important?
  • From reacting to emotions, to creating emotions
  • Solving the problems of emotional polysemy
  • The greetings problem example
  • The affirmation example
  • The speech recognition fallacy
  • The facial analysis fallacy
  • Small talk
  • Courtesy
  • Emotions
  • Data logging
  • Creating emotions
  • RNN research for future automatic dialog generation
  • RNNs at work
  • RNN, LSTM, and vanishing gradients
  • Text generation with an RNN
  • Vectorizing the text
  • Building the model
  • Generating text
  • Understanding evolutionary algorithms
  • Heredity in humans
  • Our cells
  • How heredity works
  • Evolutionary algorithms
  • Going from a biological model to an algorithm
  • Basic concepts
  • Building a genetic algorithm in Python
  • Importing the libraries
  • Calling the algorithm
  • The main function
  • The parent generation process
  • Generating a parent
  • Fitness
  • Display parent
  • Crossover and mutation
  • Producing generations of children
  • Summary code
  • Unspecified target to optimize the architecture of a neural network with a genetic algorithm
  • A physical neural network
  • What is the nature of this mysterious S-FNN?
  • Calling the algorithm cell
  • Fitness cell
  • ga_main() cell
  • Artificial hybrid neural networks
  • Building the LSTM
  • The goal of the model
  • Neuromorphic computing
  • Getting started with Nengo
  • Installing Nengo and Nengo GUI
  • Creating a Python program
  • A Nengo ensemble
  • Nengo neuron types
  • Nengo neuron dimensions
  • A Nengo node
  • Connecting Nengo objects
  • Visualizing data
  • Probes
  • Applying Nengo's unique approach to critical AI research areas
  • The rising power of quantum computers
  • Quantum computer speed
  • Defining a qubit
  • Representing a qubit
  • The position of a qubit
  • Radians, degrees, and rotations
  • The Bloch sphere
  • Composing a quantum score
  • Quantum gates with Quirk
  • A quantum computer score with Quirk
  • A quantum computer score with IBM Q
  • A thinking quantum computer
  • Representing our mind's concepts
  • Expanding MindX's conceptual representations
  • The MindX experiment
  • Preparing the data
  • Transformation functions – the situation function
  • Transformation functions – the quantum function
  • Creating and running the score
  • Using the output
  • Answers to the questions for each chapter
