Wasn't SimCity made as an online-only game so that AI and resource calculations would run on EA servers instead of on the player's own CPU?
And on PC there are more cores available on the CPU and GPU than consoles will ever have... and on Intel CPUs, can't developers take advantage of Hyper-Threading for better AI calculations?
I think developers are making excuses for mediocre performance on PC because they're focused on the console release...
Ubisoft imposes a lot of performance caps these days.
I don't believe you can't make AI, for both large crowds and small NPC groups / individual NPCs, that at least *seems* to be intelligent.
Regarding the example of NPCs attacking the player right after they've seen him kill a high-ranking opponent: the logic for this was solved almost a decade ago in games such as Warlords Battlecry and the original Assassin's Creed. In WBC, if you killed a difficult target, such as a dragon or daemon, nearby enemies had a percentage chance of being afflicted with the psych effect of fear or terror. In AC1, killing someone sent the nearby civilian crowd running away from you. If you want to make it more realistic, add logic that compares the difficulty of the opponent you slew to that of the nearby opponents who had line of sight. Killing a level 5 opponent then only sends lower-level NPCs running away from you.
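To show how little logic this actually takes, here's a minimal sketch of that fear rule. Everything here is invented for illustration (`Npc`, `on_kill_witnessed`, the 60% fear chance) -- it's not code from WBC or AC1, just the rule as described: bystanders with line of sight who are weaker than the slain opponent have a percentage chance of fleeing.

```python
import random

FEAR_CHANCE = 0.6  # assumed percentage chance of the fear affliction


class Npc:
    """Hypothetical NPC with just the fields this rule needs."""

    def __init__(self, name, level, has_line_of_sight=True):
        self.name = name
        self.level = level
        self.has_line_of_sight = has_line_of_sight
        self.state = "idle"


def on_kill_witnessed(victim_level, bystanders, rng=random.random):
    """Send lower-level bystanders who saw the kill running away."""
    for npc in bystanders:
        if not npc.has_line_of_sight:
            continue  # didn't see the kill, so no reaction
        # Only NPCs weaker than the slain opponent are eligible to panic.
        if npc.level < victim_level and rng() < FEAR_CHANCE:
            npc.state = "fleeing"
```

A level-7 guard stays put after you kill a level-5 target, while a level-2 civilian who saw it may bolt -- one comparison and one dice roll per bystander, which is nowhere near a CPU-breaking workload.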
As the number of NPCs increases, they increasingly act as a crowd and less as individuals, and the AI per NPC can be simplified.
In scenarios with fewer NPCs, it's really just a matter of setting switches. The developer just has to have the foresight to put in switches for different scenarios, such as the player having previously attacked or wronged an NPC (e.g. stolen something from them). With many years of gaming now behind us, most scenarios are known, even turned into memes. So any developer should build a minimum set of switches into their NPCs' AI.
Switches are permanent, or at least outlive the current AI state. So an enemy guard in a restricted area who came close to making you out would have logged those previous events and doesn't go back to the default 'unalarmed' state. The next time he's in the process of making you out, he'll be more insistent and investigate more aggressively, even before the suspicion bar fills all the way to red and he positively detects you.
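That persistent-switch idea can be sketched in a few lines. The class name, thresholds, and the +25% fill rate per incident are all assumed values for illustration, not anything from a shipped game -- the point is only that the permanent counter is separate from the transient suspicion state.

```python
class Guard:
    """Hypothetical guard AI with one persistent switch."""

    def __init__(self):
        self.suspicion = 0.0       # transient state, decays back to zero
        self.near_detections = 0   # persistent switch: never reset

    def on_near_detection(self):
        # Log the incident permanently, even after suspicion decays.
        self.near_detections += 1

    def suspicion_fill_rate(self):
        # A guard who almost caught the player before fills his suspicion
        # bar faster: +25% per logged incident (assumed tuning value).
        return 1.0 + 0.25 * self.near_detections

    def default_state(self):
        # With prior incidents on record, the baseline is 'wary' rather
        # than 'unalarmed', so he investigates more aggressively.
        return "unalarmed" if self.near_detections == 0 else "wary"
```

Storage cost is one small integer per guard, which is exactly why the limitation here is RAM (barely) rather than CPU.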
So really, if there's any limitation, it's a RAM limitation, but we have plenty of that in modern PCs.
Granted, putting a large number of NPCs on screen at once who act as individuals rather than as a crowd would still be CPU-bound. Alternatively, you can split the crowd in two: a crowd of guards and a crowd of civilians.
Consoles are dead. I shall now refer to them as the poor man's PC. I'm glad they don't have any killer console-exclusive titles this generation, because I'd rather drop $400 on a new Maxwell graphics card than on a whole console for one game.
I have both - well... two PCs and an Xbox One, a 360, and a Wii.
Consoles have their place... for me, that's driving games, games like Alien: Isolation, and arcade-style titles (Castle Crashers or Gauntlet, although I have that on the PC and use the Xbox controller). There are just some games that are great to play on a big screen from the couch.
FPS... nope, PC only. Diablo-style games... PC. Games with mods... PC.
Ever heard of Steam Big Picture mode? I have my PC hooked up to my TV, so if I ever tire of using my 30" IPS 2560x1440 monitor I can switch to my soon-to-be 4K Vizio P-Series.
Ubisoft, I love your games, but you're always lying.
I'm just here for the smart comments
W... what? I don't think they get how CPUs work.
If they're right, I'm $%¨&
I actually had a much longer post on this topic but deleted it because it was off-topic. First, let me say that I think you're right. I think they absolutely could've fixed or reduced the issue of absurd attacks on PC characters. No question.
But the bigger point is that players don't really want *smart* AI past a certain point. Imagine if every bandit camp in Skyrim woke up as soon as you attacked, chased you off, and then relocated the following day? Or if you arrived at a fortress where you expected to find epic loot, only to find nothing -- because dragons had struck the fortress three weeks ago, and enterprising bandits had noted the attack and swarmed the place while the weakened guards were at half strength?
There are people who would love to play this kind of game, to be sure -- but it's getting far afield from what Skyrim itself offered. Remember, in this case, it's not just about setting switches -- it's about setting those switches for a dozen or a hundred characters and performing sophisticated evaluation on multiple aspects of behavior *without* slowing down the main game.
What Ubisoft is saying is that they're already doing so much of this, they can't run the game faster or the AI couldn't keep up.
"Unlike some, I don't blame AMD's Jaguar for that: AMD has variants of the core clocked at 2GHz and above, albeit at higher power consumptions. Microsoft or Sony could've specced out a variant of the core clocked at 2-2.4GHz and boosted total CPU throughput by as much as 50%. They didn't. "
CPU-limited doesn't mean GFLOP-limited. Most AI systems are GB/s-limited, with small memory footprints that can fit in L1/L2 cache. Most likely Ubisoft's methods are crossing the L2 barrier more often than they would on conventional desktop CPUs due to poor chip design, rather than any inability to implement a good algorithm. Both companies should have stuck with IBM or Intel for the chips, since both are more energy-efficient when limited by something other than GFLOPs.
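As a generic illustration of that "small footprint that fits in cache" point (not Ubisoft's actual code -- names and sizes are invented), the usual trick is to keep only the hot AI fields in tightly packed, contiguous arrays, so the per-tick pass streams a few KB rather than dragging whole NPC objects through the cache:

```python
from array import array

N = 1000  # number of NPCs (assumed crowd size)

# Structure-of-arrays layout: one compact, contiguous array per hot field.
# 4 bytes each, so the whole working set is 8 bytes/NPC = 8KB for 1000
# NPCs, comfortably inside a typical 32KB L1 cache.
alert_level = array("f", [0.0] * N)   # float32 per NPC
target_id   = array("i", [-1] * N)    # int32 per NPC


def decay_alertness(dt):
    """Per-tick pass that touches only the 8 hot bytes per NPC."""
    for i in range(N):
        alert_level[i] = max(0.0, alert_level[i] - 0.1 * dt)
```

If instead each NPC were a fat object (position, inventory, animation state...), the same loop would pull all of that through L1/L2 and the system becomes bandwidth-bound -- which is exactly the failure mode being described.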
"The programmable nature of the GCN architecture inside the Xbone and PS4 is meant to compensate for the relatively lightweight core"
And AMD's driver implementation means you can't compensate for anything due to horrible OpenCL compliance. Few AI kernels would compile properly for AMD chips, and if the AI is as complex as they claim, it certainly won't work!
AI is usually memory-limited rather than GFLOP-limited. If we all had stacked DRAM integrated into the CPU it might be GFLOP-limited, but so far even AI that fits in L3 cache is too slow to be GFLOP-limited.
1). I could certainly believe that these chips are GB/s-limited -- which is why increasing the CPU's clock rate would improve L2 transfer bandwidth. The Xbox One and PS4 retained Jaguar's L2 cache running at half the core clock -- so we'd be pushing the L2 up from 800MHz to 1200MHz.
If you meant that the chips are crossing from L2 to main memory, bringing the clock speed up *still* helps -- the IMC on Jaguar runs at the L2 speed (meaning, again, we're stepping from 800MHz to 1.2GHz).
2). One assumes that Microsoft and Sony investigated the cost of moving to Intel and IBM for the chips and opted not to for cost and complexity reasons. This gets back to my point about both companies prioritizing immediate profits over cutting-edge technology.
Neither IBM nor Intel owns GPU technology capable of driving a next-gen console. Microsoft and Sony would've had to license designs from Nvidia or AMD, then do the integration work themselves. That drove up costs on the Xbox 360 and PS3, which is why neither company went this route.
Now, if your point is that Sony and MS could've gotten a better chip if they'd been willing to spend more money, I agree. But they obviously weren't, and thus we find ourselves here today.
3). I don't think enough is known about the state of GCN or how it's exposed on either platform to speculate on this. As far as I can tell, neither the Xbox One nor the PS4 is OpenCL-compatible. The Xbox One might use DirectCompute -- what Sony uses is unclear.
Sony has talked about HSA and OpenCL in the past as they relate to the PS4, but there's no hard information currently available that I'm aware of.