
Fallout 4

I'm buying a Fallout game. Motherfucker.
 
Birthday present for me.

Ooh-fucking-rah
 
I really wish when they did anthologies like that they'd include the current game in them. Not that I'd buy it anyway. It looks cool and all, but I already own every one of those games on Steam, and I basically did a 100% run on the 360 version of Fallout 3. Great game, but I have no particular desire to ever go back and replay it.
 

Mod those on the PC. Use Gopher's YouTube vids to do it. It totally changes the FO3 and New Vegas experiences.
 
[attached image: c9O7oZA.png]


File this under peasantry, @Jack Brickman @gourimoko. Although I hear consoles are joining the 20th century and getting mods this go-round (with restrictions, of course).
 

Mods will be weak on consoles since a lot of what mods do (like running third-party code) won't be permitted under TOS. So nothing like ENBs or script mods.

30fps is also fucking hilarious... Lord Jesus...

Cannot understand the appeal of these next gen systems.

Anyway, still designing my Rift rig. Fallout 4 with Oculus is going to be fucking obscene.

I'm actually thinking of just building a custom external GPU dock and running the equivalent of a Titan X over multiple mPCIe links. If possible, I'm hoping to find a laptop with more than one PCIe bus so I can link two high-end Radeon cards via Crossfire and avoid as much of the x4 PCIe bottleneck associated with these types of builds.

Two cards in Crossfire should suffer less overall at sub-4K resolution on two discrete x4 links than a single card on a single x4 link, since that effectively doubles the bandwidth to an x8 link rather than x4.
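Rough back-of-envelope numbers on that, assuming the standard per-lane PCIe rates (a sketch, not a benchmark):

# Back-of-envelope PCIe bandwidth per lane (usable, after line encoding):
#   PCIe 2.0: 5 GT/s with 8b/10b encoding   -> 4.0 Gbit/s per lane
#   PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~7.9 Gbit/s per lane
GBIT_PER_LANE = {"2.0": 5.0 * 8 / 10, "3.0": 8.0 * 128 / 130}

def link_gbps(gen, lanes):
    """Usable one-direction bandwidth of a single PCIe link, in Gbit/s."""
    return GBIT_PER_LANE[gen] * lanes

single_x4 = link_gbps("3.0", 4)      # one card on one x4 link
dual_x4 = 2 * link_gbps("3.0", 4)    # two cards, each on its own x4 link

print(f"single x4 link: {single_x4:.1f} Gbit/s")  # ~31.5 Gbit/s
print(f"two x4 links:   {dual_x4:.1f} Gbit/s")    # ~63.0 Gbit/s, i.e. an effective x8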

That way I can continue to use a single laptop for all of my needs. When I wanna game, I go home, dock my laptop (as I usually do), and play without even thinking twice about it; but now I'm getting Titan X quality or better. This would hopefully also let me downgrade my laptop in the gfx department (I use SLI in my laptop today) and get something more akin to a MacBook Pro or a really good 15" IPS ultrabook.

Anything to avoid building a massive PC.
 
When you get this all set up, can you post pics of your setup?
 

Definitely. There are quite a few commercial builds for this already, like from Alienware (they make a Titan X version), and MSI has one too.

I may end up just using a PE4H card, which would work in my Lenovo since it already has two discrete PCIe buses and SLI cards.

The ultimate goal is to use an MXM->PCIe connection, which would preserve PCIe x16 just using a riser card, but I'm not sure how easy this would be. I would need to retrofit an Ultrabay card. But this would give nearly 100% performance of a desktop build.

The easier solution is to use mPCIe, which will be at x4. If mPCIe runs over both buses, then I may just go with this since it's fairly straightforward and easy to do (again using the PE4H board). This would give about 95% of the performance of a desktop build.

ExpressCards are another option, but I don't think many ultrabooks even have ExpressCard slots. This would give roughly 85% performance.

The other option, which no one seems to do, is to custom-order an MXM->MXM riser cable, rewire it from a ribbon into an actual cable, then bond the two together to form a single pair. That way, over a single cable, you'd have two MXM connections. You could then drive 2x 980Ms externally, with external power (not ATX), which would be a very small, neat, and even portable build. You'd even get them to display on your laptop's screen! This would be awesome, and would allow you to use mobile GPUs in tiny builds at full efficiency.
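To put those options side by side (the percentages are just the rough estimates above, not benchmarks), a quick Python summary:

# Rough comparison of the eGPU hookup options described above.
# "relative perf" repeats the ballpark estimates above; they are not measurements.
options = [
    # (interface,            link,           relative perf vs. a desktop build)
    ("MXM -> PCIe riser",    "x16",          1.00),
    ("mPCIe + PE4H board",   "x4",           0.95),
    ("ExpressCard",          "x1 (typical)", 0.85),
]

for name, link, perf in options:
    print(f"{name:<22} {link:<14} ~{perf:.0%} of a desktop build")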

All of these builds would work with the Thinkpad/Ideapad series of systems, but I was hoping to move away from these systems to something half the weight and with an IPS display.

P.S.
Another option is Thunderbolt 3, which is essentially a PCIe 3.0 x4 cable with a 40 Gbps bottleneck (compared to 64 Gbps max for a hard link).

That's a pretty significant drop in bandwidth (only 62.5% of an x4 connection); however, x4 3.0 can run a single Titan X at about 95% efficiency, and that's kind of the point. It's not as ideal as using MXM->PCIe (which is x16 all the way), or even as good as using mPCIe and a PE4H board (which is x4 all the way), but it's by far the easiest and cleanest solution.
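Quick math on that, taking the 40 Gbps and 64 Gbps figures above at face value:

# Thunderbolt 3 vs. a direct ("hard") PCIe hookup, using the figures above.
tb3_gbps = 40.0        # Thunderbolt 3 link rate quoted above
hard_link_gbps = 64.0  # "hard link" figure quoted above

print(f"TB3 / hard link = {tb3_gbps / hard_link_gbps:.1%}")  # 62.5%
print(f"TB3 throughput  = {tb3_gbps / 8:.1f} GB/s")          # 5.0 GB/s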

Using two separate Thunderbolt 3 -> PCIe x16 adapters to run two Radeon cards in Crossfire (not SLI), you'd get a portable 2-card (up to 4-way) Crossfire setup. It's not exactly efficient, but for that tradeoff you get to use the extremely powerful 4 GHz Intel i7 in an IPS laptop as the basis for your gaming hardware. Also, your system stays ice cold, and you can game on enthusiast settings for all games, even with Oculus.

Trust me, I'd rather use an Nvidia card over the AMD alternatives, but Crossfire would work over Thunderbolt, which is fucking astounding... So that kind of makes the decision for me to use AMD tech for this build.
 
30 fps is a fucking joke. PC will always be where my heart is, even though I've transitioned over to mainly Xbox One. I just got sick of upgrading my PC and lost interest in all the tech stuff I used to enjoy.

I'll be buying Fallout on Xbone regardless.
 

I dunno man... I wouldn't do it if I were you.

They say 1080p/30fps on Xbox One, but imma tell you flat out, I'd bet it's not really 1080p on release.

Lots of games say 1080p on the box because that's the output resolution coming out of the frame buffer (or 1080i coming out of the RAMDAC). So, legally, yeah, it's 1080 vertical lines. But in reality? It's very likely rendering to a 720p bitmap and being upscaled using a pixel shader.

Even on the PS4, Battlefront does this to get a 15% performance boost by cutting the equivalent of 180 vertical lines of resolution and simply upscaling to 1080p from 900.

Going from 900 to 1080 is one thing, but from 720 to 1080 is another. 720p content on my 65" screen is very noticeable, and I'd expect the same playing Fallout 4 on an Xbone vs a PC.
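For anyone who wants the raw pixel counts behind that comparison:

# Pixel counts for the render resolutions being compared.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

full_hd = pixels["1080p"]
for name, count in pixels.items():
    print(f"{name:>5}: {count:>9,} pixels ({count / full_hd:.0%} of 1080p)")
#  720p:   921,600 pixels (44% of 1080p)
#  900p: 1,440,000 pixels (69% of 1080p)
# 1080p: 2,073,600 pixels (100% of 1080p)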

Side-by-side image comparisons are a bit silly, because the display can't actually show both images natively. Instead it's more useful to compare the images stacked. Here is the difference between 720 and 1080p for those that don't know:
[image: 1080p vs 720p vs DVD comparison]


So what the Xbox One (and the PS4 to a lesser extent) is doing is taking the middle image and enlarging it using a nearest-neighbor pixel shader to fill in the missing data. This invariably results in a more distorted picture. The only way to avoid it is to run at 720p natively, because then no upscaling is required and you get a 100% accurate picture, but doing so on a large screen (50"+) is problematic.
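A toy sketch of what nearest-neighbor upscaling does (hypothetical example code, not the console's actual shader): 720 -> 1080 is a 1.5x scale, so some source pixels get shown once and others twice, which is where the blockiness comes from.

# Toy nearest-neighbor upscale of a single scanline.
# 720 -> 1080 is a 1.5x scale, so pixels get duplicated unevenly.
def nearest_neighbor_upscale(row, out_width):
    in_width = len(row)
    return [row[int(x * in_width / out_width)] for x in range(out_width)]

src = ["A", "B", "C", "D", "E", "F"]    # 6 "pixels" standing in for a 720-wide line
out = nearest_neighbor_upscale(src, 9)  # 9 standing in for 1080 (1.5x wider)
print(out)  # ['A', 'A', 'B', 'C', 'C', 'D', 'E', 'E', 'F']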

Lastly, 30 fps for a first-person shooter?

....

If you have a PC that can play Fallout 4, I'd strongly recommend just going that route. If you have Windows 10, they released the Xbone controller dongle, so you can play the game just as if it were on the Xbox - but it's on Windows 10 instead.

To give you an idea of how I used to do my setup in Hawaii: my PC was in the other room, and I ran a very long HDMI cable through the wall and around the floor to my TV. Even over 40 ft, HDMI will still work if the connectors are high grade and the cable is decently made.

Playing L.A. Noire on my PC with an Xbox 360 controller was no different than playing it on the 360 itself. With these new Windows 10 / Xbone games, the difference is simply non-existent as far as I know.

I just couldn't recommend anyone take that kind of hit - losing moddability, graphics, replayability, and the portability of your save game - unless they had no other options. This could apply to any game really, but Fallout is a special case.

Playing a game like New Vegas without mods is a joke of an experience compared to playing it with mods installed. I typically run 20+ mods simultaneously, and the experience is essentially an entirely new game. I highly doubt this will be possible on console platforms even if they do allow some degree of modding.
 
The save game thing is a big deal. I'm no expert, but why the fuck can I not save wherever I want in console games? Is this not the year 2015?

I hear you gouri, and I don't doubt anything you're saying. My PC is about 2 years old; built it myself in 2013. It probably can't run a graphically demanding game at top-notch graphics anymore, but I use my PC primarily for schoolwork, and gaming-wise for stuff like Blizzard games, i.e. Heroes of the Storm and Starcraft 2, which aren't demanding at all, and the fantastic recently released Pillars of Eternity. I don't think I will be tempted to upgrade my PC anytime soon. It just isn't worth the money and time to me, both of which are limited.

My Xbox One was free too, which helps. I dunno. I'm not the hardcore gaming nerd I used to be. I used to rail on console gamers nonstop from my PC throne of dominance. Now, in the little time I have for games, I play mostly Xbox.

No idea what happened. Believe me, I know that PC will always be technically superior. They've been saying for two generations now that consoles will pass PC. Haven't seen it yet.
 
