Starting in 2011, Google Brain built DistBelief as a proprietary machine learning system based on deep learning neural networks. Its use grew rapidly across diverse Alphabet companies in both research and commercial applications. Google assigned multiple computer scientists, including Jeff Dean, to simplify and refactor the codebase of DistBelief into a faster, more robust application-grade library, which became TensorFlow. Earlier, in 2009, a team led by Geoffrey Hinton had implemented generalized backpropagation and other improvements that allowed the generation of neural networks with substantially higher accuracy, for instance a 25% reduction in errors in speech recognition.
TensorFlow is Google Brain's second-generation system. Version 1.0.0 was released on February 11, 2017. While the reference implementation runs on single devices, TensorFlow can run on multiple CPUs and GPUs (with optional CUDA and SYCL extensions for general-purpose computing on graphics processing units). TensorFlow is available on 64-bit Linux, macOS, Windows, and mobile computing platforms including Android and iOS.
Its flexible architecture allows for the easy deployment of computation across a variety of platforms (CPUs, GPUs, TPUs), and from desktops to clusters of servers to mobile and edge devices.
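This device flexibility can be illustrated with a minimal sketch, assuming a TensorFlow 2.x installation (on a CPU-only machine only a CPU device will be listed; GPUs appear when CUDA support is available):

```python
import tensorflow as tf

# List the devices visible to the runtime (CPUs, and GPUs if present).
print(tf.config.list_physical_devices())

# Computation can also be pinned to a specific device explicitly.
with tf.device("/CPU:0"):
    x = tf.constant([1.0, 2.0]) * 2.0
print(x.numpy())  # [2. 4.]
```

The same program runs unchanged whether the available device is a CPU, GPU, or TPU; only the device placement differs.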
TensorFlow computations are expressed as stateful dataflow graphs. The name TensorFlow derives from the operations that such neural networks perform on multidimensional data arrays, which are referred to as tensors. During the Google I/O Conference in June 2016, Jeff Dean stated that 1,500 repositories on GitHub mentioned TensorFlow, of which only 5 were from Google.
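The graph model can be sketched as follows (assuming TensorFlow 2.x, where `tf.function` traces Python code into a dataflow graph):

```python
import tensorflow as tf

# A tensor is a multidimensional data array; here, 2x2 matrices of floats.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[5.0, 6.0], [7.0, 8.0]])

# tf.function traces the Python function into a dataflow graph:
# the matmul and add become graph nodes, with tensors flowing between them.
@tf.function
def compute(x, y):
    return tf.matmul(x, y) + 1.0

result = compute(a, b)
print(result.numpy())  # [[20. 23.] [44. 51.]]
```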
In May 2019, Google announced TensorFlow Graphics for deep learning in computer graphics.
Tensor processing unit (TPU)
In May 2016, Google announced its Tensor processing unit (TPU), an application-specific integrated circuit (ASIC, a hardware chip) built specifically for machine learning and tailored for TensorFlow. A TPU is a programmable AI accelerator designed to provide high throughput of low-precision arithmetic (e.g., 8-bit), and oriented toward using or running models rather than training them. Google announced they had been running TPUs inside their data centers for more than a year, and had found them to deliver an order of magnitude better-optimized performance per watt for machine learning.
In May 2017, Google announced the second-generation, as well as the availability of the TPUs in Google Compute Engine. The second-generation TPUs deliver up to 180 teraflops of performance, and when organized into clusters of 64 TPUs, provide up to 11.5 petaflops.
In May 2018, Google announced the third-generation TPUs delivering up to 420 teraflops of performance and 128 GB high bandwidth memory (HBM). Cloud TPU v3 Pods offer 100+ petaflops of performance and 32 TB HBM.
In July 2018, the Edge TPU was announced. Edge TPU is Google's purpose-built ASIC chip designed to run TensorFlow Lite machine learning (ML) models on small client computing devices such as smartphones, an approach known as edge computing.
In May 2017, Google announced a software stack specifically for mobile development, TensorFlow Lite. In January 2019, the TensorFlow team released a developer preview of a mobile GPU inference engine using OpenGL ES 3.1 Compute Shaders on Android devices and Metal Compute Shaders on iOS devices. In May 2019, Google announced that TensorFlow Lite Micro (also known as TensorFlow Lite for Microcontrollers) and ARM's uTensor would merge.
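Converting a model to the TensorFlow Lite format can be sketched as follows (a minimal example, assuming TensorFlow 2.x with its bundled Keras and TFLite converter; the tiny model here is purely illustrative):

```python
import tensorflow as tf

# Build a trivial Keras model: a single dense layer over 4 inputs.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert it to the TensorFlow Lite flat-buffer format used on
# mobile and embedded devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # serialized bytes, ready to ship

print(len(tflite_model) > 0)
```

The resulting bytes are typically written to a `.tflite` file and loaded on-device by the TensorFlow Lite interpreter.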
Pixel Visual Core (PVC)
In October 2017, Google released the Google Pixel 2 which featured their Pixel Visual Core (PVC), a fully programmable image, vision and AI processor for mobile devices. The PVC supports TensorFlow for machine learning (and Halide for image processing).
Google also released Colaboratory, a Jupyter notebook environment for TensorFlow that requires no setup to use.
Machine Learning Crash Course (MLCC)
On March 1, 2018, Google released its Machine Learning Crash Course (MLCC). Originally designed to help equip Google employees with practical artificial intelligence and machine learning fundamentals, Google rolled out its free TensorFlow workshops in several cities around the world before finally releasing the course to the public.
As TensorFlow's share of research papers declined in favor of PyTorch, the TensorFlow team announced the release of a new major version of the library in September 2019. TensorFlow 2.0 introduced many changes, the most significant being eager execution, which changed the automatic differentiation scheme from static computational graphs to the "define-by-run" scheme originally popularized by Chainer and later PyTorch. Other major changes included the removal of old libraries, cross-compatibility between models trained on different versions of TensorFlow, and significant improvements to GPU performance.
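The define-by-run scheme can be sketched as follows (assuming TensorFlow 2.x, where eager execution is the default): operations run immediately, and gradients are recorded as the Python code executes.

```python
import tensorflow as tf

# Eager execution: operations run immediately, and tf.GradientTape
# records them "define-by-run" for automatic differentiation.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x  # y = x^2, evaluated eagerly

grad = tape.gradient(y, x)  # dy/dx = 2x
print(float(grad))  # 6.0
```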
According to the TensorFlow documentation, new language support should be built on top of the C API, although not all functionality is available in C yet; some additional functionality is provided only by the Python API.
Some API functions are explicitly marked as "experimental" and can change in backward-incompatible ways between minor releases. Beyond Python, APIs and third-party bindings exist for other languages, including Swift, C#/.NET, Haskell, Julia, MATLAB, R, Scala, Rust, OCaml, and Crystal.