What do you develop your games on?

Whoever built that should be slapped, lol. That's like buying a £1000 desktop and SLI'ing a pair of GT 620s. Pointless.

How is the battery life?

I survived like 4 years with a broken battery. Just plug it in. I never cared about battery life; I don't want to play games on the train, and as a general rule: if I can sit down, there's an outlet.

I think you mean 1920x1080, but anyway, a 500GB SSD? I thought SSDs have only existed for a few years now (and the 2-year-old ones are trash compared to the newer Samsung 840 EVOs)? Or did you install one yourself recently?
If you did install it yourself, are you sure you plugged the SATA cable from your SSD into a 6 Gb/s SATA port on your motherboard (if it has any)? Otherwise it's quite pointless…

Nice video card for a 5.5-year-old machine though; you must have bought the best of the best hardware at the time. Mine is 4 years old and also has an i7-2600, but an older video card (not even a GTX, just a GT 545). I wanted to install an SSD as well, but my PC is a piece of trash: it doesn't even have a 6 Gb/s SATA port on its motherboard :frowning:

  • i7-4770K (yes, Hyperthreading helps, especially for Java games)
  • 16 GB of DDR3 2400MHz RAM (probably doesn’t help xd)
  • A main 2560x1440 IPS monitor overclocked to 90 Hz (yes, I see a clear difference) and a secondary 1920x1080 monitor
  • 2x GTX 770s @ 1241MHz with 4GB VRAM each (yes, I have an SLI profile working with Java games, and yes, I have gone above 2GB VRAM usage in BF4 and other games)
  • 4 TB harddrive
  • 120GB SSD used as cache drive (at current write rates it will remain under spec until 2022)
  • Windows 7

I am unsure what to make of this. >_>

Hey agent, if you aren't 'kidding' us, you just bought a new PC?
I've never heard of overclocking monitors! And how much have you overclocked your i7?
And an SSD just as a cache drive? How does that work? Are your HDD(s) now as fast as an SSD, or?

No, I'm not kidding. xd I built it myself and got the parts last summer, so it's almost a year old now. I haven't overclocked my CPU, since I'm a bit worried I might overload my PSU, which is just barely powerful enough to drive those two graphics cards. Temperature-wise it idles at around 25 degrees with that heatsink and maxes out at around 50 degrees, so I might take a look at it. The GPUs idle at around 30 degrees, but do get quite hot when gaming as the whole case heats up after a while.

Horrible picture proof:

Monitors have a clock rate as well, which effectively decides how fast they refresh. On almost all monitors I've tried this is software-capped to a 10% tolerance, which means you can only overclock by 10% (e.g. 66 Hz on a 60 Hz screen) before either getting artifacts or simply an "unsupported resolution/refresh rate" error. In addition, there are no IPS displays that officially run at over 60 Hz, most likely since IPS panels don't have good enough response times. There's also the problem of the bandwidth of a DVI cable, which is why there aren't any 2560x1440 monitors with a refresh rate higher than 60 Hz.

I bought a surprisingly cheap (300-350€) 2560x1440 IPS monitor from a Korean vendor on Amazon. They use the same panels that are in those super expensive Mac monitors, but they take the ones with minor faults and sell them without any guarantees. Some have dead pixels, backlight bleeding around the edges or similar problems, so it's a bit of a gamble that obviously isn't for everyone. The one I got has a pixel-sized dark spot from some dust stuck between the backlight and the panel. I'd gladly have paid more for a perfect display, but this is the best you can get for gaming purposes if you want IPS. Basically, these Korean vendors try to sell these monitors as cheaply as possible, so they strip out anything that isn't strictly necessary. There are 3 main "drawbacks":

  • There is no on-screen menu. The only setting you can change is the backlight brightness. Colorwise the displays are decently configured out of the box though.
  • There’s no built-in scaler, so it relies on the GPU scaling the output image to 2560x1440 (all AMD and NVIDIA cards can do that).
  • There’s only one output available, a Dual-Link DVI output.

The advantage is that removing these features drastically reduces the latency of the screen, making it a better fit for gaming. In addition, these monitors use cheaper controllers than most others, and those controllers don't have a clock-rate cap, so you can essentially force any clock rate you want. Mine was stable up to around 96 Hz, but a large number of them are stable at 120 Hz. I run mine at 90 Hz because it's a clean 1.5x of 60 and I don't want to push it too hard. Since this is an IPS monitor there is some motion blur (especially compared to a 120 Hz TN monitor with the LightBoost hack), but this is the closest you can get to the best of both worlds: IPS panels (good colors) and TN panels (good latency and refresh rate). As a game developer, I care equally about both. I'm currently considering buying two additional monitors. These monitors definitely aren't for everyone though.

My last computer used a 60GB SSD as a boot drive for Windows and drivers and a 1TB hard drive for everything else. Although the computer booted quickly at first, it quickly got bogged down by Steam and other applications I could not fit on the SSD. Game loading was horribly slow, so in my opinion I pretty much wasted the performance of the SSD by not using it enough. It was just too tedious to manage what should go on the SSD.

Using an SSD as a cache drive is awesome. It gives you a single logical drive you can just dump everything into, and the chipset (via Intel Rapid Storage Technology) handles moving data between the SSD and the hard drive. Performance-wise I get around 90% of the speed of an SSD where it matters. The first time you load a level in a game it takes some time, but the next time (even after a reboot) it's lightning fast. It's just so convenient: instead of wasting time organizing everything between an SSD and a hard drive, I get 90% of SSD performance where I need it on a 4TB hard drive. I strongly recommend it.

Enabling it is pretty easy. The most difficult part is changing the SATA mode in your BIOS to RAID (the caching is implemented by RAIDing together the SSD and the hard drive). That's tricky if you already have Windows installed, since Windows will no longer boot unless you manually take care of some things before changing the setting. It's not a problem on a new build of course, as long as you install Windows with RAID enabled. After that, you simply install Intel Rapid Storage Technology and with three clicks the cache drive is working.
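The caching idea itself is simple, even though Intel's actual algorithm isn't public. As a toy model (class and method names are mine, and real caching works on raw disk blocks, not Java objects): keep recently used blocks on the fast device and evict the least recently used one when it fills up.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy model of an SSD cache in front of a big hard drive:
// recently used blocks live on the fast device, and the least
// recently used block is evicted when the cache fills up.
public class BlockCache {
    private final int capacity;
    private final LinkedHashMap<Long, byte[]> ssd;

    public BlockCache(int capacityBlocks) {
        this.capacity = capacityBlocks;
        // accessOrder = true turns this map into an LRU structure
        this.ssd = new LinkedHashMap<Long, byte[]>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Long, byte[]> eldest) {
                return size() > capacity;
            }
        };
    }

    public byte[] read(long block) {
        byte[] data = ssd.get(block);
        if (data == null) {            // cache miss: slow HDD read...
            data = readFromHardDrive(block);
            ssd.put(block, data);      // ...then promote the block to the SSD
        }
        return data;                   // cache hit: fast SSD read
    }

    public boolean isCached(long block) {
        return ssd.containsKey(block);
    }

    private byte[] readFromHardDrive(long block) {
        return new byte[4096];         // stand-in for a real (slow) disk read
    }
}
```

That's why the second level load is fast even after a reboot: the blocks the game touched last time are still sitting on the SSD.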

And I thought the PCs I built were messy… :wink:

Your setup may be a year old, but it's still the best hardware you can get.
But you haven't overclocked your i7? Why then buy a CPU cooler like that? It's massive!
The stock cooler (especially on the K versions) is more than good enough, maybe a little loud, but that cooler is seriously overkill. Do you know its weight? (Can't be good for your motherboard, lol)

And if I'm right, you have a Kingston SSD? Kingston's SSDs are much slower than, for example, a Samsung 840 EVO? Maybe cache SSDs are another story? Maybe only reading is faster, and writing still takes the same time with or without an SSD cache? (Then I would understand why you picked a Kingston, since I thought they had slow write speeds.)
Because, really, I still can't believe those cache stories; it sounds like a fairytale, giving all your HDDs almost the speed of an SSD. If it's true, I'm going to try it with my next computer as well. I love SSDs because of their speed, but hate them because they are so small (big SSDs are expensive, and if you store games etc. on an HDD your speed is gone, as you already mentioned).

And wow, that's an interesting business, those IPS monitors. But yeah, just like you said, it isn't for everyone.
I would really hate having a screen with dead pixels; don't they make IPS monitors like that with just good panels? Then I would consider buying one, if the bigger resolution is that awesome, that is. Also, I understand the part about those monitors being overclockable further than others, but I don't understand the part about the bandwidth of the DVI cable, since (in my belief) on-screen menus, built-in scalers and cheaper controllers have nothing to do with the bandwidth of your DVI cable?

Also, thanks for your detailed and enriching (if that's a proper English word) answer.

Heh, the cables are messy as hell, but the flash of my camera made it look 100x dustier than it actually is…

Like I said, I only have a 750W power supply, so I'm a bit worried about power draw. Even a small overclock can bump it up by over 50W, especially if you raise the voltage, and I seriously don't need an overclock for anything I play (hmm, maybe for Planetside 2?). The cooler is from my previous computer, where I overclocked from 2.9GHz to 3.5GHz, so at that time a large cooler like that made much more sense. It actually has a metal backplate attached to the back of the motherboard to support its weight, so I'm not worried. ^^

SSD speed isn't really that big a deal. The main point of SSDs is that their access times are a few orders of magnitude lower than mechanical drives'. Here's a dated but relevant enough benchmark: http://www.tomshardware.com/reviews/ssd-520-sandforce-review-benchmark,3124-14.html. Sure, there's a difference between models, but it's hardly significant. I simply picked Kingston because they were so cheap at my local retailer. It's more about having ANY SSD than about which SSD you have.

I don't think you can find an overclockable, flawless IPS monitor. The Korean monitors on Amazon and eBay target a very specific audience. They rely on buyers accepting the much worse warranty and the fact that you have to pay for shipping if it breaks and you have to send it back. Since buyers accept that, the vendors don't have much to lose by allowing people to overclock the monitors (you're running them outside their specifications, after all). This, in addition to using sub-optimal panels, is what lets them sell so cheaply. A flawless monitor with the exact same panel could easily cost 3-5x as much at the time I bought mine (prices are obviously going down all the time), and no big company would want the warranty nightmare of overclockable $1000 monitors, nor a reputation for subpar service or warranties. That's most likely why they software-cap the clock rate. The Korean vendors do sell slightly more expensive "Pixel Perfect" monitors on eBay and Amazon, but you may still get annoying problems, since the panels they sell were rejected during quality testing.

Having multiple inputs means the monitor needs some hardware to switch between those inputs; I'm not sure whether that adds measurable latency. Unless you're planning on plugging a console into your screen, you shouldn't ever use anything other than DVI anyway, so this isn't really a limitation. Monitors with on-screen menus require a small dedicated chip which blits/blends the menu onto the image, and this takes a small amount of time. All images coming in from the GPU also go through the monitor's scaling unit regardless of whether they need to or not, which also adds latency.

The cooler from your previous computer? And you're sure it's better than the stock cooler? In my opinion, the way it blocks your view of the rest of the build is on its own already reason enough NOT to use it; for your crazy computer, that is :wink: I wouldn't be surprised if that cooler of yours is the same as the one I had in my PREVIOUS computer; yours is probably an upgraded version. But hardware develops fast. Today's stock coolers are way better than they used to be, and that goes double for SSDs: a 2-year-old SSD benchmark is by now (pretty much) OUTDATED. SSDs have developed a lot since they suddenly became very popular two years ago (which was when I built a PC with an SSD for the first time).

[quote]“Monitors with on-screen menus requires a small dedicated chip which blits/blends the menu onto the monitor, and this takes a small amount of time.”
[/quote]
Yes, but this doesn’t have anything to do with latency does it? I understand the latency theory and stuff, but to make clear what I meant by my question: you said in an earlier post:

[quote]“There’s also the problem with the bandwidth of a DVI cable, which is why there aren’t any 2560x1440 monitors with a refresh rate higher than 60Hz.”
[/quote]
I didn't understand why those IPS monitors would need less bandwidth than others (since on-screen menus and such have nothing to do with that), while they have a 2560x1440 resolution at sometimes 120Hz, you said. But I guess I know the answer already: Dual-Link DVI means double the bandwidth ('capacity'), doesn't it?

Regarding SSDs, I don't think it's a big deal. SSDs have shifted the bottleneck away from disk IO and somewhat towards CPU speed, so a faster SSD barely has an impact in my experience. I still use my old 60GB Corsair SSD in my old computer, and it boots up almost as fast as my SSD-cached computer, although it does have fewer programs installed. If you can find a benchmark that proves me otherwise, I'd gladly change my opinion. =P

[quote]“Yes, but this doesn’t have anything to do with latency does it?”
[/quote]
Oh, but it does! How could you possibly implement this so that it takes no time at all? The monitor needs a small GPU-like chip which can render and blend the menu on top of the images coming from your GPU. When I say latency, I don't mean gray-to-gray latency (the value usually listed for monitors), which is unaffected by this; I'm talking about what is often called input latency, i.e. the time it takes from you pressing a button on your keyboard to the result appearing on the screen.

They don’t require less bandwidth.

A single-link DVI cable carries 3.96 Gbit/s of actual pixel data (taking into account the 8b/10b TMDS encoding overhead). A dual-link cable has twice the data channels.

Basically, you're limited by the quality of the cable and by how much data the source can send and the monitor can receive, but 3.96 x 2 = 7.92 Gbit/s is the minimum that any combination of cable, monitor and GPU should be able to handle. How much is that? It's 0.99 GByte/s, or 990 MByte/s.

1920x1080 x 60 Hz x 3 bytes/pixel = 373 MByte/s.
2560x1440 x 60 Hz x 3 bytes/pixel = 664 MByte/s.
2560x1440 x 120 Hz x 3 bytes/pixel = 1327 MByte/s <— Whoops.
2560x1440 x 90 Hz x 3 bytes/pixel = 995 MByte/s. <— What I run.
2560x1440 x 96 Hz x 3 bytes/pixel = 1062 MByte/s. <— The highest my setup could go (approximately).
3840x2160 x 60 Hz x 3 bytes/pixel = 1493 MByte/s <— Why 4K displays need two cables.

It’s simply more data than the spec guarantees will work.
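If anyone wants to sanity-check the arithmetic, it's a one-liner per mode. A minimal sketch (class and method names are mine; blanking intervals are ignored, just like in the list above, and the 990 MByte/s figure is the dual-link minimum derived there):

```java
// Uncompressed video bandwidth: width * height * refresh rate * bytes per pixel.
// Values are in decimal MByte/s. Blanking intervals are ignored, so a real
// link needs a bit more headroom than these numbers suggest.
public class DviBandwidth {
    // 7.92 Gbit/s dual-link minimum -> bytes/s -> MByte/s
    static final double DUAL_LINK_LIMIT_MBPS = 7.92e9 / 8 / 1e6; // 990 MByte/s

    static double mbPerSecond(int width, int height, int hz) {
        // 3 bytes per pixel = 24-bit color
        return (double) width * height * hz * 3 / 1e6;
    }

    static boolean fitsInDualLink(int width, int height, int hz) {
        return mbPerSecond(width, height, hz) <= DUAL_LINK_LIMIT_MBPS;
    }
}
```

For example, `mbPerSecond(2560, 1440, 120)` comes out well above the 990 MByte/s minimum, which is exactly the "Whoops" line above.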

Aha, so your monitor can handle more than 96Hz, but your cables can't! So I was right: Dual-Link DVI can send twice as much data as a normal DVI cable.
So that's how an on-screen menu works. Makes sense, yes, but I thought they would use some nicer technique that renders the menu over the real image without affecting the latency :stuck_out_tongue:

If it's true that only access times matter on SSDs, I can't give you a benchmark (and it wouldn't convince you anyway), but about read/write speeds, here you go: http://ssd.userbenchmark.com/
As you can see, all Kingston SSDs are really 'unbalanced': fast at reading but slow at writing/effective speed. But I guess that's the cost of 'cheap' :wink:

I don't know. Presumably my GPU can handle it, since it doesn't crash the computer. It sounds a bit weird that the cable would be the weakest link (pun unintended), so my bet is that I simply got a monitor controller which can't handle that clock rate.

There are many more factors that decide the performance of SSDs, and sequential read and write speed is the least significant of them, since you'll rarely write large files to your little SSD. There are more important metrics like IOPS and random read/write performance, and THE most important one, real-world performance, which is hard as hell to measure accurately. I completely agree that my SSD is far from the best one, but I maintain my stance that it makes a very small difference in practice.

EDIT:
I just tried to overclock my monitor a bit more, to 100Hz this time, and it worked. I did suffer from some instability (flickering, slow fullscreen transitions) at such high refresh rates though, so I probably won’t run it that fast.

Here’s the program I’m using to overclock my monitor. USE AT YOUR OWN RISK. I AM NOT THE CREATOR OF THIS PROGRAM NOR DO I TAKE RESPONSIBILITY FOR ANY DAMAGE THIS PROGRAM MAY DO TO YOUR MONITOR OR COMPUTER.

Weird that your monitor works at 100Hz now but didn't before; your monitor is making progress! But it's unfortunate that you didn't get a good 120Hz-capable monitor, because if any computer deserves it, it's yours :smiley:

Of course random read/write is very important, but I assume you noticed that the benchmark I gave you has tested that as well? (Sorry, it's my fault I only spoke of read/write speeds.) You can select whatever 'value' (column) you want to see (4K read/write means random read/write) :slight_smile: So it's quite a handy benchmark if you want to compare everything, and 'effective speed' is the calculated real-world performance you talked about.

Thanks for the program, but my GPU already has enough work to do driving 2 full-HD screens @ 60Hz, so I'll save it for later, and when I do test it, I'll test it first on one of my screens covered in dust :smiley: BLOWING UP COMPU… Ahem, I mean: experimenting with computers, our hobby!