This is the carrot that is going to force me to switch from Java to Bedrock.
Buying a $1200 graphics card is crazy talk to me.
Is Java really lagging that far behind in power, or does Microsoft just want you to swap to Windows (Bedrock)?
I paid like 800 something for my 2080S, after tax, but you can get a 2060 for like 300 something I think.
It has to do with the underlying code; the Java version just doesn't support it.
The Windows 10 computer in the house blue screened last month due to an update they pushed and had to do a reinstall. I have a long history of bad luck with the OS. It stinks too because MC w/ RTX looks so good.
I'll still be on Java for a while also because of friends that have to use it and I want to be able to play with them.
I'll just drool (a bit) over these shiny videos though.
It's shitty how the community is divided like this.
I never really followed why they forked it, maybe Java is too limited?
Redstone doesn't even behave the same between the versions. Doesn't make sense.
Then in other news, Crytek is remastering Crysis, and it's supposed to ship with its own vendor-agnostic real-time ray tracing tech this summer. I was already quite impressed with the Intel agnostic solution that I got to test out in the World of Tanks tech demo. I made a fair upgrade to my rig last fall and still managed Ultra settings and a locked 1080p/60fps in the demo. Considering I've seen a lot of Nvidia RTX games require dropping down to 1080p for playable frame rates, this development pretty much cements that I won't be upgrading my monitor any time soon.
I want to get an Asus VG27BQ but everyone is out of stock.
I never really felt the urge to upgrade to 1440p, but since I have a card now that can push it, I figure why not. I can still play in 1080p if needed.
That Crysis remaster sounds sweet. I never played World of Tanks.
That's a pretty nice monitor, so I can appreciate why it's been hard to come by. I think my next one will be a Samsung QLED, unless microLED ones drop into a reasonable price range. I've always heard that higher-res monitors get blurry if you lower the resolution below native. I had to down-res my PS3 from 1080p to 720p in the past and never noticed that big of a quality hit. Maybe it's more pronounced the higher the pixel density, though?
Hmm. Not sure.
I currently have a nice Asus 1080p 144 Hz monitor. I was thinking about gifting it to my son if I get a new one, but I could always be greedy and keep it if I see a problem with that.