News
#1 Posted : Wednesday, October 8, 2014 5:56:17 PM(UTC)

A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "Told you so!" after the developer revealed that the game's 900p resolution and 30fps target on consoles is the result of weak CPU performance rather than limited GPU power.

"Technically we're CPU-bound, he [Pontbriand] said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

This has been read by many as a rather damning referendum on the capabilities of AMD's APUs. To some extent, that's justified: the Jaguar CPU inside both the Sony PS4 and the Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight cores on paper, but games can't access all of that headroom -- at least one core is reserved for the OS, and several more are typically devoted to feeding the 3D pipeline. Between the OS and the renderer, Ubisoft may have had only 4-5 cores for AI and other calculations -- scarcely more than last generation, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.

What kind of AI is Ubisoft trying to build?

I think it's worth noting that the concept of good AI is entirely game-dependent. We can separate it into two approximate camps -- games that attempt to model near-lifelike behavior for a very small number of characters, and games that attempt to create a large number of characters that aren't blitheringly, humiliatingly, obviously artificial and stupid.

What does this mean in practice? It's genuinely not clear -- and sometimes, even small things can make an enormous difference to players. Take SimCity: when Maxis launched the latest, troubled incarnation of that franchise, it quickly became obvious that its Sims were terrifyingly moronic in ways that created real problems for players. Instead of returning to their own homes, they simply walked into the first empty house they found. Rather than owning a car, they got into the first empty vehicle. This created massive traffic and housing snarls and caused job shortages -- the game treated characters as interchangeable parts that walked in and out of a machine every day, rather than as individuals with particular assignments and tasks.
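To make that contrast concrete, here's a minimal, hypothetical C++ sketch. It is not Maxis' actual code -- the structures and function names are invented for illustration -- but it shows the difference between an agent that grabs the nearest empty house and one that returns to a persistently assigned home:

#include <vector>
#include <limits>

struct House { float x, y; bool occupied = false; };

struct Sim {
    float x, y;
    int homeIndex = -1;   // persistent assignment; -1 means "no fixed home"
};

// SimCity-style behavior: walk into the closest empty house, whoever's it is.
int pickNearestEmptyHouse(const Sim& sim, const std::vector<House>& houses) {
    int best = -1;
    float bestDist = std::numeric_limits<float>::max();
    for (int i = 0; i < (int)houses.size(); ++i) {
        if (houses[i].occupied) continue;
        float dx = houses[i].x - sim.x, dy = houses[i].y - sim.y;
        float distSq = dx * dx + dy * dy;
        if (distSq < bestDist) { bestDist = distSq; best = i; }
    }
    return best;   // interchangeable-parts behavior
}

// "Individual" behavior: always return to the same assigned home.
int pickAssignedHome(const Sim& sim) {
    return sim.homeIndex;
}

The second version is trivially cheap per agent; the cost is that someone has to maintain those assignments for every Sim in the city, which is bookkeeping and design work more than raw compute.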

At the same time, however, trying to make NPCs smarter runs into its own challenges. Skyrim's NPC merchants, guards, and bandits are often mocked for ludicrously stupid behavior, but in many cases these oddities -- such as being able to put buckets over shopkeepers' heads and then steal at will -- are the result of sophisticated attempts to improve behavior by adding more flexibility. Shopkeepers aren't just invisibly aware of you when you steal something; they can only know you stole it if they see you do it.
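As a rough illustration of what "they can only know you stole it if they see you do it" implies, here is a hypothetical sketch of line-of-sight-gated crime detection (invented names, not Bethesda's code):

struct Actor {
    float x = 0.f, y = 0.f;
    float facingX = 1.f, facingY = 0.f;   // unit vector the actor is facing
    bool blinded = false;                  // e.g. a bucket over the head
};

// Very crude visibility test: not blinded, and the target is in front of us.
// A real game would also raycast against level geometry and check distance.
bool canSee(const Actor& observer, float targetX, float targetY) {
    if (observer.blinded) return false;
    float dx = targetX - observer.x, dy = targetY - observer.y;
    return (dx * observer.facingX + dy * observer.facingY) > 0.f;
}

// The theft only becomes a crime if a witness actually saw it happen.
bool theftWitnessed(const Actor& shopkeeper, float itemX, float itemY) {
    return canSee(shopkeeper, itemX, itemY);
}

The bucket trick falls straight out of a rule like this: block the visibility test and the rest of the "smarter" logic never fires.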


[Image caption: Let's face it -- some of these AI decisions really were just terrible]


Similarly, we laugh at bandits who return to normal status a few seconds after losing track of you -- even walking around a dead ally while muttering "Must've been my imagination" -- yet few people would want to play a version of the game where being detected always leads to a full camp mobilization and to being chased out of the area or killed.

Somewhere between those two extremes there's a balance -- but where that balance sits varies from game to game.

What seems increasingly obvious is that this will not be the console generation of 1080p60 as a reliable feature. Unlike some, I don't blame AMD's Jaguar for that:  AMD has variants of the core clocked at 2GHz and above, albeit at higher power consumptions. Microsoft or Sony could've specced out a variant of the core clocked at 2-2.4GHz and boosted total CPU throughput by as much as 50%. They didn't. The programmable nature of the GCN architecture inside the Xbone and PS4 is meant to compensate for the relatively lightweight core, but AI calculations may simply be something it can't do much with -- GPU calculations tend to be high latency, and AI typically requires fast response times.

So far, Sony and Microsoft's decision to minimize console costs and go with off-the-shelf designs that could reach profitability more quickly has paid off, big time -- but whether or not that continues to be the case long term is an open question.
icepick314
#2 Posted : Thursday, October 9, 2014 6:43:55 AM(UTC)

Wasn't SimCity made an online-only game so that the AI and resource calculations would run on EA's servers instead of on the player's own CPU?

And on PC there are more CPU cores and more GPU power available than consoles will ever have... if you have an Intel CPU, can't developers take advantage of Hyper-Threading for better AI calculations?

I think developers are making excuses for mediocre performance on PC because they're focused on the console release...

LuqmanHakimAzman
#3 Posted : Thursday, October 9, 2014 8:13:21 AM(UTC)

Ubisoft imposes a lot of performance caps these days.

alkarnur
#4 Posted : Thursday, October 9, 2014 8:57:38 AM(UTC)

I don't believe you can't make AI, for both large crowds and for small NPC groups or individual NPCs, that at least ~seems~ intelligent.

Regarding the example of NPCs attacking the player right after seeing him kill a high-ranking opponent, the logic for this was solved almost a decade ago in games such as Warlords Battlecry and the original Assassin's Creed. In WBC, if you killed a difficult target, such as a dragon or a daemon, nearby enemies had a percentage chance of being afflicted with a fear or terror psych effect. In AC1, killing someone sent the nearby civilian crowd running away from you. If you want to make it more realistic, add logic that compares the difficulty of the opponent you slew to that of the nearby opponents who had line of sight: killing a level 5 opponent only sends lower-level NPCs running away from you.
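A hedged sketch of that logic might look like the following (C++, with invented names and numbers; neither WBC nor AC1 necessarily implemented it this way):

#include <vector>
#include <random>

struct Npc {
    int level = 1;
    bool sawTheKill = false;   // assume line of sight was tested elsewhere
    bool fleeing = false;
};

// After the player kills an opponent of 'slainLevel', weaker witnesses
// have a percentage chance of being afflicted with fear and running away.
void propagateFear(std::vector<Npc>& nearby, int slainLevel,
                   double fearChance, std::mt19937& rng) {
    std::bernoulli_distribution afflicted(fearChance);
    for (Npc& npc : nearby) {
        if (npc.sawTheKill && npc.level < slainLevel && afflicted(rng))
            npc.fleeing = true;
    }
}

A single pass over the nearby NPCs like this is cheap; the expensive part is the visibility testing that feeds sawTheKill, not the decision itself.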

As the number of NPCs increases, they start acting more and more as a crowd and less as individuals, and the per-NPC AI can be simplified.

In scenarios with fewer NPCs, it's really just a matter of setting switches. The developer just has to have the foresight to include switches for the different scenarios, such as the NPC having previously been attacked or wronged by the player (e.g. you stole something from them). With many years of gaming now behind us, most of these scenarios are well known, some even turned into memes, so any developer should build a minimum set of switches into their NPCs' AI.

Switches are permanent, or at least longer-lived than the current AI state. So an enemy guard in a restricted area who came close to making you out would have logged those earlier events and wouldn't go back to the default 'unalarmed' state. The next time he's in the process of making you out, he'd be more insistent and investigate more aggressively, even before the "suspicion bar" fills all the way to red and he positively detects you.
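Here's a minimal sketch of that idea (C++, invented structure; just one way such switches could be kept separate from the transient suspicion value):

struct GuardMemory {        // long-lived switches; survive the current encounter
    int nearDetections = 0;
    bool wasStolenFrom = false;
};

struct GuardState {         // transient: 0 = unalarmed, 1 = positively detected
    float suspicion = 0.f;
};

void updateSuspicion(GuardState& state, GuardMemory& memory,
                     bool playerVisible, float dt) {
    float prev = state.suspicion;

    // A guard who has nearly caught the player before gets suspicious faster.
    float gain = 0.2f * (1.f + 0.5f * memory.nearDetections);
    state.suspicion += playerVisible ? gain * dt : -0.1f * dt;

    if (state.suspicion < 0.f) state.suspicion = 0.f;
    if (state.suspicion > 1.f) state.suspicion = 1.f;

    // Log the close call once, when the bar first crosses the threshold.
    if (prev < 0.75f && state.suspicion >= 0.75f)
        memory.nearDetections += 1;
}

Per guard, memory like this is only a handful of bytes, which fits with the RAM point below.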

So really, if there's any limitation it's a RAM limitation, but we have plenty of that in modern PCs.

Granted, putting a large number of NPCs on screen at one time who act like individuals rather than a crowd would still be CPU-bound. Then again, you can split the crowd into two crowds: a crowd of guards and a crowd of civilians.

KOwen
#5 Posted : Thursday, October 9, 2014 9:58:08 AM(UTC)

Consoles are dead. I shall now refer to them as the poor man's PC. I'm glad they don't have any killer console-exclusive titles this generation, because I'd rather drop $400 on a new Maxwell graphics card than on a whole console for one game.

StaticFX
#6 Posted : Thursday, October 9, 2014 10:32:54 AM(UTC)

I have both - well... 2 PCs and the Xbox One, 360, & Wii.

Consoles have their place... for me, driving games. Games like Alien Isolation... arcade-style games (Castle Crashers or Gauntlet, although I have that on the PC and use the Xbox controller). There are just some games that are great to play on a big screen from the couch.


FPS.. nope.. PC only. Diablo style games... PC... games with Mods... PC. 

KOwen
#7 Posted : Thursday, October 9, 2014 10:41:53 AM(UTC)

Ever heard of Steam Big Picture mode? I have my PC hooked up to my TV, so if I ever tire of using my 30" IPS 2560x1440 monitor I can switch to my soon-to-be 4K Vizio P-Series.

ClaytonBugeja
#8 Posted : Thursday, October 9, 2014 12:36:29 PM(UTC)

Ubisoft, I love your games, but you're always lying.

JackoDeJager
#9 Posted : Thursday, October 9, 2014 12:36:30 PM(UTC)

I'm just here for the smart comments

AdamMichael
#10 Posted : Thursday, October 9, 2014 4:31:56 PM(UTC)

W... what? I don't think they get how CPUs work.

ThiagoDantas
#11 Posted : Thursday, October 9, 2014 4:41:25 PM(UTC)

If they're right, I'm $%¨&

Joel H
#12 Posted : Thursday, October 9, 2014 6:10:50 PM(UTC)

I actually had a much longer post on this topic but deleted it because it was off-topic. First, let me say that I think you're right. I think they absolutely could've fixed or reduced the issue of absurd attacks on the player character. No question.

But the bigger point is that players don't really want *smart* AI past a certain point. Imagine if every bandit camp in Skyrim woke up as soon as you attacked, chased you off, and then relocated the following day. Or if you arrived at a fortress where you expected to find epic loot, only to find nothing -- because dragons had struck the fortress three weeks earlier, and enterprising bandits had noticed the attack and swarmed the place while the weakened garrison was at half strength.

There are people who would love to play this kind of game, to be sure -- but it's getting far afield from what Skyrim itself offered. Remember, in this case, it's not just about setting switches -- it's about setting those switches for a dozen or a hundred characters and performing sophisticated evaluation on multiple aspects of behavior *without* slowing down the main game.

What Ubisoft is saying is that they're already doing so much of this that they can't run the game any faster without the AI failing to keep up.

basroil3
#13 Posted : Thursday, October 9, 2014 7:48:58 PM(UTC)

"Unlike some, I don't blame AMD's Jaguar for that: AMD has variants of the core clocked at 2GHz and above, albeit at higher power consumptions. Microsoft or Sony could've specced out a variant of the core clocked at 2-2.4GHz and boosted total CPU throughput by as much as 50%. They didn't. "

CPU-limited doesn't mean GFLOP-limited. Most AI systems are bandwidth (GB/s) limited, with small memory footprints that can fit in L1/L2 cache. Most likely Ubisoft's methods are crossing the L2 barrier more often than they would on conventional desktop CPUs because of the chip's design, rather than because a good algorithm couldn't be implemented. Both companies should have stuck with IBM or Intel for the chips, since both are more energy efficient when limited by something other than GFLOPS.
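For what it's worth, here's a generic illustration (not anything specific to Jaguar or to Ubisoft's code) of why this kind of workload tends to be limited by memory traffic rather than FLOPS: an array-of-structs update drags cold data through the cache, while a struct-of-arrays layout streams only the fields the AI pass actually touches.

#include <array>
#include <cstdint>
#include <vector>

// Array-of-structs: each AI update pulls the whole record through the cache,
// even though this pass only needs the 4-byte alert field.
struct NpcFat {
    float x, y, z;
    float alert;
    std::array<std::uint8_t, 256> inventory;   // cold data, but it shares cache lines
};

void updateAoS(std::vector<NpcFat>& npcs, float dt) {
    for (auto& n : npcs)
        n.alert += 0.01f * dt;
}

// Struct-of-arrays: the same pass walks a dense, cache-friendly float array.
struct NpcSoA {
    std::vector<float> x, y, z, alert;
    std::vector<std::array<std::uint8_t, 256>> inventory;  // untouched here
};

void updateSoA(NpcSoA& npcs, float dt) {
    for (float& a : npcs.alert)
        a += 0.01f * dt;
}

Both versions do the same arithmetic; the difference is how many cache lines get touched per NPC, which is exactly the L2/bandwidth pressure described above.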

"The programmable nature of the GCN architecture inside the Xbone and PS4 is meant to compensate for the relatively lightweight core"

And AMD's driver implementation means you can't compensate for much anyway, due to poor OpenCL compliance. Few AI kernels could be properly compiled for AMD chips, and if the AI is as complex as they claim, it certainly won't work!

basroil3
#14 Posted : Thursday, October 9, 2014 7:56:54 PM(UTC)

AI is usually memory-limited rather than GFLOP-limited. If we all had stacked DRAM or racetrack memory integrated into the CPU it might be GFLOP-limited, but so far even AI that fits in L3 cache is too slow to be GFLOP-limited.

Joel H
#15 Posted : Friday, October 10, 2014 4:03:00 PM(UTC)

Basroil,

1). I could certainly believe that these chips are GB/s-limited -- which is why increasing the CPU clock rate would improve L2 transfer bandwidth. The Xbox One and PS4 retained Jaguar's L2 cache running at 50% of the core clock, so a 2.4GHz part would push the L2 from 800MHz up to 1.2GHz.

If you meant that the chips are crossing from L2 into main memory, raising the clock speed *still* helps -- the integrated memory controller (IMC) on Jaguar runs at the L2 speed (meaning, again, we'd be stepping from 800MHz to 1.2GHz).

2). One assumes that Microsoft and Sony investigated the cost of moving to Intel or IBM for the chips and opted not to for cost and complexity reasons. This gets back to my point about both companies prioritizing immediate profits over cutting-edge technology.

Neither IBM nor Intel owns GPU technology capable of driving a next-gen console. Microsoft and Sony would've had to license designs from Nvidia or AMD, then do the integration work themselves. That approach drove up costs on the Xbox 360 and PS3, which is why neither company went that route this time.

Now, if your point is that Sony and MS could've gotten a better chip if they'd been willing to spend more money, I agree. But they obviously weren't, and thus we find ourselves here today.

3). I don't think enough is known about the state of GCN compute or how it's exposed on either platform to speculate on this. As far as I can tell, neither the Xbox One nor the PS4 is OpenCL-compatible. The Xbox One might use DirectCompute; what Sony uses is unclear.

Sony has talked about HSA and OCL in the past, as they related to the PS4, but there's no hard information currently available that I'm aware of.
