AC Unity PS4 gimped to 900p in order to match Xbox one

Started by Legend, Oct 06, 2014, 07:15 PM


the-pi-guy

Quote from: darkknightkryta on Oct 10, 2014, 02:39 PM
They're not separate values.  His "preference" states that coop makes games better(Specifically Assassin's Creed) and that Assassin's Creed shouldn't have coop.  P and ~P.  If he said "Coop makes games better, but I'd prefer Assassin's Creed not to have it since it doesn't enhance the experience" you'd be right.  He didn't say that.
That would be a contradiction.

Legend

Quote from: darkknightkryta on Oct 10, 2014, 02:39 PM
They're not separate values.  His "preference" states that coop makes games better(Specifically Assassin's Creed) and that Assassin's Creed shouldn't have coop.  P and ~P.  If he said "Coop makes games better, but I'd prefer Assassin's Creed not to have it since it doesn't enhance the experience" you'd be right.  He didn't say that.

So is he a shill?!?

darkknightkryta


Legend

Quote from: darkknightkryta on Oct 10, 2014, 03:28 PM
Why I quite think he is.
Hahahhahahahhahahahahahahahahahahahahahahahahahahahahahahahahahahahahahhahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha

Riderz1337

Quote from: darkknightkryta on Oct 10, 2014, 02:39 PM
They're not separate values.  His "preference" states that coop makes games better(Specifically Assassin's Creed) and that Assassin's Creed shouldn't have coop.  P and ~P.  If he said "Coop makes games better, but I'd prefer Assassin's Creed not to have it since it doesn't enhance the experience" you'd be right.  He didn't say that.
Co op makes games better but co op doesn't make games better...In the same sentence.

BRUH

Legend made me remove this. Everybody riot.

u4gReservoirDogs

I'm always willing to endure humiliation on behalf of my characters. - Ben Stiller

Legend

http://www.neogaf.com/forum/showthread.php?t=913010


"I'm happy to enlighten you guys because way too much battleship about 1080p making a difference is being thrown around.  If the game is as pretty and fun as ours will be, who cares?  Getting this game to 900p was a baby. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps.  The game was 9fps 9 months ago.  We only achieved 900p at 30fps weeks ago.  The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say.  Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles.  So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn't seem to have worked in the end.  Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there.  What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics.  By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles.  This really is about to define a next gen like no other game before.  Mordor has next gen system and gameplay, but not graphics like Unity does.  The proof comes in that game being cross gen.  Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely.  Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting.  
The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others.  Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."

the-pi-guy

Quote from: Legend on Oct 15, 2014, 07:38 PM
http://www.neogaf.com/forum/showthread.php?t=913010


"I'm happy to enlighten you guys because way too much battleship about 1080p making a difference is being thrown around.  If the game is as pretty and fun as ours will be, who cares?  Getting this game to 900p was a baby. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps.  The game was 9fps 9 months ago.  We only achieved 900p at 30fps weeks ago.  The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say.  Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles.  So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn't seem to have worked in the end.  Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there.  What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics.  By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles.  This really is about to define a next gen like no other game before.  Mordor has next gen system and gameplay, but not graphics like Unity does.  The proof comes in that game being cross gen.  Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely.  Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting.  
The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others.  Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."
Uh huh, and what do you think? 


darkknightkryta

Quote from: the-Pi-guy on Oct 15, 2014, 07:48 PM
Uh huh, and what do you think? 
If Ubisoft could get the game to 900p at 30 fps on the Xbox One, there's no reason the framerate isn't higher at 900p on PS4.  That's all there is to it.

the-pi-guy

Quote from: darkknightkryta on Oct 15, 2014, 08:57 PM
If Ubisoft could get the game to 900p at 30 fps on the Xbox One, there's no reason the framerate isn't higher at 900p on PS4.  That's all there is to it.
I could understand frame rate, but not resolution. 

NeverDies

I just hope they decide to bump it back up, cause I have a giant TV and want it to look good.

Legend

Quote from: the-Pi-guy on Oct 15, 2014, 07:48 PM
Uh huh, and what do you think? 

I think this whole situation is odd.

Right now I'm assuming the first guy who started everything was the only one being honest and speaking the unfiltered truth. Everything after that is just reactionary, with the intent of reducing public backlash.