News
#1 Posted: Tuesday, November 18, 2014 5:28:18 AM (UTC)


Machine learning and artificial intelligence aren't easy to grasp, but both are critically important if technology is to march on, our surroundings are to get smarter, and the digital assistants in our phones are to become more useful. Microsoft Research and NVIDIA are teaming up this week to showcase what's possible when two giants in the space join forces and share insights. Microsoft Research is already putting NVIDIA GPUs to work, and the results are impressive.


Microsoft Research employs around 1,000 scientists and engineers to make significant product contributions and address some of society’s toughest challenges. According to information released by both companies, an increasing amount of their work is focused on machine learning. Here's a bit from NVIDIA:

"Three trends are driving a resurgence in machine learning. First, data of all kinds is growing exponentially. Second, researchers have made big improvements in the mathematical models used for machine learning. Finally, GPUs have emerged as a critical computational platform for machine learning research.

These drivers are resulting in game-changing improvements in the accuracy of these models. That’s because GPUs allow researchers to train these models with more data – much more data – than was possible before. Even using GPUs, the process of training these models by digesting mountains of data takes weeks. Replicating this training process using CPUs is possible – in theory. In reality it would take over a year to train a single model. That’s just too long.

Reducing training time is important because the field is evolving fast. Researchers must accelerate through design and training cycles quickly to keep up. GPUs just cost less, too. The hardware is cheaper and sucks up much less power."
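
The claim in that last passage is easy to see in miniature. Below is a hypothetical CUDA sketch (not code from Microsoft Research or NVIDIA, just an illustration assuming plain stochastic gradient descent) of a single training-style weight update, where every parameter gets its own GPU thread. A CPU would walk the same array one element at a time, which is the gap that stretches weeks of GPU training into a year or more on CPUs.

```cuda
// Toy illustration: one stochastic-gradient-descent weight update as a CUDA
// kernel. Every parameter is updated by its own GPU thread, the kind of data
// parallelism that lets GPUs chew through training far faster than a CPU
// looping over the same array. Hypothetical example, not vendor code.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// One thread per weight: w[i] -= learning_rate * grad[i]
__global__ void sgd_update(float* weights, const float* grads,
                           float learning_rate, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        weights[i] -= learning_rate * grads[i];
    }
}

int main()
{
    const int n = 1 << 20;                 // ~1M parameters (toy size)
    std::vector<float> h_w(n, 1.0f), h_g(n, 0.5f);

    float *d_w = nullptr, *d_g = nullptr;
    cudaMalloc((void**)&d_w, n * sizeof(float));
    cudaMalloc((void**)&d_g, n * sizeof(float));
    cudaMemcpy(d_w, h_w.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_g, h_g.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every parameter.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    sgd_update<<<blocks, threads>>>(d_w, d_g, 0.01f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(h_w.data(), d_w, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("w[0] after one update: %f\n", h_w[0]);  // 1.0 - 0.01*0.5 = 0.995

    cudaFree(d_w);
    cudaFree(d_g);
    return 0;
}
```

Compile with nvcc (for example, nvcc sgd_update.cu -o sgd_update). In real training an update like this runs millions of times over far larger arrays, which is where the GPU's parallelism pays off.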

Microsoft Research has just deployed a computer system packed with NVIDIA GPUs, and it will be very interesting to see what comes out of it.
starwhite
#2 Posted: Tuesday, November 18, 2014 5:13:54 PM (UTC)


Digital assistants are great. An exception might be the annoying spell check on my S5, which argues with me and substitutes words I didn't type, and you don't figure it out until after you've sent the text.
