Deep Learning

Fri, 08 Mar 2024 07:34:20 GMT

Properties

Key                     Value
Identifier              deep-learning
Name                    Deep Learning
Type                    Topic
Creation timestamp      Fri, 08 Mar 2024 07:34:20 GMT
Modification timestamp  Fri, 08 Mar 2024 08:02:04 GMT

Deep learning is a subset of machine learning that uses multi-layered neural networks, called deep neural networks, to simulate the complex decision-making power of the human brain.

Understanding Deep Learning

Imagine five sheets of coloured paper: red, yellow, orange, green, and blue. Stack them one on top of the other, then crumple the stack into a small ball. That crumpled paper ball is your input data, and each sheet of paper is one class of data in a classification problem. What a neural network is meant to do is figure out a transformation of the paper ball that would uncrumple it, so as to make the five classes cleanly separable again.

With deep learning, this would be implemented as a series of simple transformations of the 3D space, such as those you could apply on the paper ball with your fingers, one movement at a time.

Uncrumpling paper balls is what machine learning is about: finding neat representations for complex, highly folded data manifolds in high-dimensional spaces (a manifold is a continuous surface, like our crumpled sheet of paper). Deep learning takes the approach of incrementally decomposing a complicated geometric transformation into a long chain of elementary ones, which is pretty much the strategy a human would follow to uncrumple a paper ball. Each layer in a deep network applies a transformation that disentangles the data a little, and a deep stack of layers makes tractable an extremely complicated disentanglement process.
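The "long chain of elementary transformations" idea can be sketched directly in code. Below, each layer is one simple geometric operation (an affine map followed by a ReLU fold); a deep stack composes several of them. The weights here are random placeholders for illustration, not trained values, and NumPy stands in for a deep-learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    """One elementary transformation: rotate/scale/shift the space,
    then fold it with a ReLU nonlinearity."""
    return np.maximum(0.0, x @ W + b)

# The crumpled "paper ball": 100 points in 3D space.
x = rng.normal(size=(100, 3))

# A deep stack of five layers, each taking one small step of the
# disentanglement; training would tune these weights so the composed
# transformation actually separates the classes.
weights = [rng.normal(scale=0.5, size=(3, 3)) for _ in range(5)]
biases = [rng.normal(scale=0.1, size=3) for _ in range(5)]

for W, b in zip(weights, biases):
    x = layer(x, W, b)

print(x.shape)  # each layer maps 3D points to 3D points: (100, 3)
```

The composition of many such simple, differentiable steps is what lets gradient-based training adjust the whole chain end to end.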

Deep-learning Libraries

PyTorch

PyTorch is an open-source machine learning library primarily developed by Meta's AI research lab (FAIR, formerly Facebook AI Research). It is widely used for artificial intelligence and machine learning tasks, including deep learning. PyTorch provides a dynamic computational graph, which allows for more flexibility in model building and experimentation than the static computational graphs found in some other frameworks.

Key features of PyTorch include:

  • Dynamic Computational Graphs: PyTorch uses a dynamic computational graph, where the graph is built on-the-fly as operations are executed. This is in contrast to static computational graphs used by some other frameworks like TensorFlow.
  • Tensors: PyTorch uses tensors, which are multi-dimensional arrays, as the fundamental building blocks for data representation and manipulation. Tensors are similar to NumPy arrays and can be easily moved between CPUs and GPUs.
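Both features can be seen in a few lines. This is a minimal sketch, not a full training example: the graph is recorded on the fly as ordinary Python executes, so plain `if` statements work, and autograd differentiates through whichever branch was taken.

```python
import torch

# A tensor that records operations for automatic differentiation.
x = torch.tensor([2.0, -1.0], requires_grad=True)

y = (x ** 2).sum()
if y.item() > 1.0:   # plain Python control flow, traced dynamically
    y = y * 3.0

y.backward()          # gradients flow back through the branch taken
print(x.grad)         # d(3 * sum(x^2)) / dx = 6x
```

Because `x.grad` is just a tensor, it bridges to NumPy via `x.grad.numpy()`, and the same code runs on a GPU after `x = x.cuda()` when one is available.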

TensorFlow

TensorFlow is an open-source machine learning framework developed by the Google Brain team. It is widely used for various machine learning tasks, with a particular emphasis on deep learning. TensorFlow provides a comprehensive ecosystem of tools, libraries, and community resources to support the development and deployment of machine learning models.

Key features of TensorFlow include:

  • Computational Graph: TensorFlow 1.x used a static computational graph, where the entire computation is defined as a graph before execution; this opens up optimization opportunities and facilitates distributed computing. Since TensorFlow 2, eager execution is the default, and graphs are obtained by tracing Python functions with `tf.function`.
  • Tensors: Similar to PyTorch, TensorFlow uses tensors as the fundamental data structure. Tensors are multi-dimensional arrays that can be manipulated using various operations. TensorFlow supports efficient execution of operations on CPUs, GPUs, and TPUs (Tensor Processing Units).
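A minimal sketch of graph execution in modern TensorFlow: decorating a Python function with `tf.function` traces it once into a graph, which TensorFlow can then optimize and reuse across calls.

```python
import tensorflow as tf

@tf.function  # traced into a reusable, optimizable graph
def square_sum(x):
    return tf.reduce_sum(x ** 2)

x = tf.constant([3.0, 4.0])   # a tensor: a multi-dimensional array
print(square_sum(x).numpy())  # 3^2 + 4^2 = 25.0
```

The same traced function runs unchanged on CPUs, GPUs, or TPUs; TensorFlow places the operations on whatever device is available.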
