"Three trends are driving a resurgence in machine learning. First, data of all kinds is growing exponentially. Second, researchers have made big improvements in the mathematical models used for machine learning. Finally, GPUs have emerged as a critical computational platform for machine learning research. These drivers are resulting in game-changing improvements in the accuracy of these models. That’s because GPUs allow researchers to train these models with more data – much more data – than was possible before. Even using GPUs, the process of training these models by digesting mountains of data takes weeks. Replicating this training process using CPUs is possible – in theory. In reality it would take over a year to train a single model. That’s just too long. Reducing training time is important because the field is evolving fast. Researchers must accelerate through design and training cycles quickly to keep up. GPUs just cost less, too. The hardware is cheaper and sucks up much less power."
Digital assistants are great. An exception might be the annoying spell check on my S5, which argues with me and substitutes words I didn't write, and you don't figure it out until after you've sent the text.