Do you want me to list my resume? I don't have to.
I just told you how it works, and you are trying to overwrite me with information that is incorrect. I actually make games; when I say this, I mean it.
You have potentially played some of them that contain assets of mine. I don't have to showboat, and it would be a disservice to come in here on blast trying to showcase my work; it's unethical and I don't need to do that. I don't have to prove myself.
Textures are sent to the graphics card and are the top cause of VRAM usage; the next after that is geometry.
Let's say we have a texture map of 1024 x 1024 with a bit depth of 24, and we want to know exactly how much VRAM it will use without compression.
- 1024 x 1024 = 1,048,576 total pixels
- 1,048,576 x 24 bits for each pixel = 25,165,824 total bits in that texture map
- 25,165,824 bits / 8 = 3,145,728 bytes
- 3,145,728 bytes / 1024 = 3,072 KB
The texture will take roughly 3 MB of VRAM.
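If you want to sanity-check that arithmetic yourself, here is a minimal sketch in Python (the function name and numbers are purely illustrative; this is just the raw uncompressed math above, not any engine's actual allocation):

```python
def uncompressed_texture_bytes(width, height, bits_per_pixel):
    """Raw size of one uncompressed texture map in bytes."""
    total_pixels = width * height             # 1024 x 1024 = 1,048,576
    total_bits = total_pixels * bits_per_pixel
    return total_bits // 8                    # 8 bits per byte

size = uncompressed_texture_bytes(1024, 1024, 24)
print(size / 1024)            # 3072.0 KB
print(size / (1024 * 1024))   # 3.0 MB
```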
Now PBR rendering, which is the dominant rendering style today, can have several maps:
- Color
- Normal
- Metallic
- Gloss
- Ambient occlusion
- Illumination
That is 18 MB of textures per asset. If you are spawning unique assets that have that overhead 100 times in a scene, then that is 1.8 GB of VRAM. As a baseline, I mean, without optimizations here.
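Continuing the same back-of-the-envelope math, a quick sketch (again purely illustrative, assuming six uncompressed 1024 x 1024 maps at 24 bits per pixel for every unique asset):

```python
PBR_MAPS = 6  # color, normal, metallic, gloss, ambient occlusion, illumination

def pbr_set_bytes(width, height, bits_per_pixel, maps=PBR_MAPS):
    """Uncompressed VRAM cost of one full PBR texture set."""
    return maps * (width * height * bits_per_pixel // 8)

per_asset = pbr_set_bytes(1024, 1024, 24)
print(per_asset / (1024 ** 2))        # 18.0 MB per unique asset
print(100 * per_asset / (1024 ** 3))  # ~1.76 GB for 100 unique assets
```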
This scales upwards: 2048 x 2048 texture maps are common these days, and in some games 4K texture maps are common on character models, which can easily add up to several GB of VRAM on one model.
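To see how fast that grows, memory scales with the square of the resolution (same illustrative uncompressed, 24-bit assumption as above):

```python
# doubling the resolution quadruples the memory per map
for size in (1024, 2048, 4096):
    mb = size * size * 24 / 8 / (1024 * 1024)
    print(f"{size} x {size}: {mb:.0f} MB per map")  # 3, 12, 48 MB
```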
Anyway, compression can happen on the texture or image itself, or it can even happen at runtime. All the other stuff is nuanced; I know what I am talking about.
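For a rough sense of what GPU block compression buys you, here is a sketch using the common BC1 and BC7 formats (8 and 16 bytes per 4 x 4 pixel block respectively; actual format choices and mip chains vary by engine and platform):

```python
def block_compressed_bytes(width, height, bytes_per_block):
    """Size of a block-compressed texture stored as 4x4 pixel blocks."""
    blocks = (width // 4) * (height // 4)
    return blocks * bytes_per_block

# 1024 x 1024 map: ~3 MB uncompressed at 24 bpp
print(block_compressed_bytes(1024, 1024, 8) / (1024 ** 2))   # BC1: 0.5 MB
print(block_compressed_bytes(1024, 1024, 16) / (1024 ** 2))  # BC7: 1.0 MB
```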
So basically all you could do is try to prove you know math? That is good for you. Now the thing is, your math is proving my already-proven theory: that they could do things in 2 GB of VRAM, or they can do it in 20 GB, based on how optimized their work is. They can easily render your hardware incapable just by deciding they need to take the load of texture compression and decompression off the CPU and simply fill the GPU VRAM instead, while in reality they could even use 256 x 256 textures for an NPC so small it is hardly noticed on screen. Ugh, this is turning out to be such a lame convo. You are trying to prove yourself knowledgeable while you really offer nothing. Ugh, I'm out. Continue with your rant on your own. Adios.
- - - Updated - - -
Don't understand why a post like this would drive some people sleepless?
It has. The person wants to know the average VRAM that can save him a few years before buying new hardware. The post was to demonstrate that VRAM is actually a toy in the hands of developers: he can have a title from this year that he would love to play, yet they can just make it hard for him by adopting a weird rendering strategy, so he'd end up lagging or crashing and be directed to an upgrade of their liking.
What is possible in theory doesn't help anyone that wants to buy a video card. The OP wanted real advice.
Over the years more and more games have been designed to use more and more VRAM. That is reality. It wasn't long ago that only 1 GB of VRAM was enough. Today some games need 8 GB or more of VRAM. I wouldn't be surprised to see that in 5 years 12-16 GB of VRAM will be a minimum for some games. Of course this is for demanding games. There are many games that require much less VRAM.
The only way to really answer the question of how much VRAM is needed is to see how much is needed for any particular game. The problem is most people have no way of knowing what games they will be playing in 5 years or how much VRAM will be needed.
So practically you are supporting the idea that the real answer is: if you want to play a particular title, then you may find that the developers decided to arbitrarily blow up its VRAM use to render whatever you have in hand obsolete, regardless of the time span? Then this was exactly my answer.
nIGHTmAYOR
There is no such thing as this or that being good for a few years. If a developer and a GPU maker decided to team up and make your 32 GB VRAM GPU struggle so you have to buy a new GPU to support their business model, they just can. (You can research the story of a game called Crysis and the meme that surfaced, "Can it run Crysis?")
It is an ongoing cycle where graphics cards are developed with more and more VRAM and game developers are quick to develop new games that will take advantage of that. It has been that way for many years.
Another factor is monitor resolution. As people continue to buy larger monitors with higher resolutions, more VRAM is needed to support that. I see some people buy 4K monitors and wonder why their graphics cards can't drive them. They didn't realize that a 4K monitor requires a faster graphics card with more VRAM than one that is only driving 1080p.
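Just to put a rough number on the resolution factor alone, here is a small sketch of the display-buffer cost (32-bit color and triple buffering assumed; it ignores render targets and the higher-resolution assets that usually come with a 4K target, which are the bigger cost):

```python
def swap_chain_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate VRAM for the display buffers alone."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(swap_chain_mb(1920, 1080))  # ~23.7 MB at 1080p
print(swap_chain_mb(3840, 2160))  # ~94.9 MB at 4K
```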
The problem for us mere game players is that we have no control over how much VRAM a particular game needs. If we want to play the latest and most demanding games, then we have to buy newer and faster graphics cards with more VRAM. If not, then we have to be realistic and only play games that our current graphics cards will support. That is just the way it is.