Saying that you have a 3.6GHz processor honestly doesn’t mean sh*t.
It’s like saying that you have an 8 megapixel camera: the number alone tells you nothing. You simply can’t compare two processors based on frequency alone (or on core count, for that matter).
For example, an AMD FX-4300 has 4 cores and a 3.8GHz base clock, whereas my Intel Core i5-3570K also has 4 cores but only a 3.4GHz base clock.
So does that mean the FX-4300 is better? No. In fact, the 3570K is about 2x faster in practice, because its Ivy Bridge cores do far more work per clock cycle (higher IPC) than the FX’s Piledriver cores.
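If you want to put numbers on it, think of single-thread speed as roughly clock speed times IPC (instructions per clock). Here’s a tiny Python sketch of that idea; the IPC figures are illustrative guesses of mine, not measured benchmarks:

```python
# Rough single-thread performance model: clock alone means nothing,
# what matters is clock * IPC (instructions per clock).
# The relative_ipc values below are illustrative guesses, NOT benchmarks.

cpus = {
    "AMD FX-4300":    {"clock_ghz": 3.8, "relative_ipc": 1.0},  # assumed baseline
    "Intel i5-3570K": {"clock_ghz": 3.4, "relative_ipc": 1.8},  # assumed ~1.8x IPC
}

for name, c in cpus.items():
    perf = c["clock_ghz"] * c["relative_ipc"]
    print(f"{name}: {perf:.2f} (arbitrary single-thread units)")

# Despite the lower clock, the i5 comes out well ahead once IPC
# is factored in.
```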
4GB of RAM is not really sufficient for gaming. This depends on the game and the OS too, of course, but AAA monsters like Battlefield or The Witcher have a tendency to eat up RAM like crazy. If your OS runs out of RAM it has to fall back on the page file/swap space on your HDD, which is obviously going to be very slow, so for gaming I’d say 8GB of RAM is the gold standard.
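If you want to check whether your own system is actually hitting swap while gaming, a few lines of Python with the psutil package (assuming you have it installed) will tell you:

```python
# Quick check of how much RAM and swap the system is using right now.
# Requires the psutil package (pip install psutil).
import psutil

ram = psutil.virtual_memory()
swap = psutil.swap_memory()

print(f"RAM:  {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB ({ram.percent}%)")
print(f"Swap: {swap.used / 2**30:.1f} / {swap.total / 2**30:.1f} GiB ({swap.percent}%)")

# If swap usage climbs while a game is running, you're out of physical
# RAM and the OS is paging to disk -- that's where the stutter comes from.
```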
For most gaming systems, however, the actual bottleneck is the graphics card. If you want to game at full HD (1080p), I’d recommend at least something like a GTX 750 Ti / AMD R7 260X with 2GB of VRAM. For 1080p, 2GB of VRAM is plenty, but if you have something like a WQHD (1440p) display, or you play games that demand a lot of VRAM (like Skyrim with all those fancy 2K and 4K texture packs), you’re better off getting a card with 3 or 4GB of VRAM.
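To get a feel for why higher resolutions and texture packs eat VRAM, here’s some back-of-the-envelope math. It assumes uncompressed RGBA at 4 bytes per pixel, so treat the results as rough bounds; real games compress textures and stream them in:

```python
# Back-of-the-envelope VRAM math for uncompressed RGBA (4 bytes/pixel).
# Real games use compressed formats, so these are rough estimates only.

def mib(n_bytes):
    return n_bytes / 2**20

def framebuffer(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

def texture(size, bytes_per_pixel=4, mipmaps=True):
    base = size * size * bytes_per_pixel
    return base * 4 / 3 if mipmaps else base  # full mip chain adds ~33%

print(f"1080p framebuffer:   {mib(framebuffer(1920, 1080)):6.1f} MiB")
print(f"1440p framebuffer:   {mib(framebuffer(2560, 1440)):6.1f} MiB")
print(f"2K texture (2048^2): {mib(texture(2048)):6.1f} MiB")
print(f"4K texture (4096^2): {mib(texture(4096)):6.1f} MiB")

# A single uncompressed 4K texture is ~85 MiB with mipmaps; load a few
# dozen of those and a 2GB card is already struggling.
```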
A good gaming PC is expensive (mine cost around $1100 and there are still parts I’m not quite satisfied with), but you can get away with something like a $750 build that plays most games at 1080p on medium/high settings at 60fps. However, if you want to run the real beasts (The Witcher, Crysis, Battlefield, Far Cry, etc.) at 1080p on ultra, you’ll have to spend at least around $1000.
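Just as a ballpark, here’s how I’d sketch out splitting a budget in Python. The percentages are my own rough guesses (the usual rule of thumb being that the GPU gets the biggest single chunk), so adjust freely:

```python
# Rough budget split for a gaming build. The shares are my own ballpark
# guesses, not a hard rule -- the GPU just tends to deserve the most.

split = {
    "GPU": 0.35, "CPU": 0.20, "Motherboard": 0.10, "RAM": 0.08,
    "Storage": 0.10, "PSU": 0.07, "Case": 0.05, "Cooling": 0.05,
}
assert abs(sum(split.values()) - 1.0) < 1e-9  # shares must add up to 100%

budget = 1000  # the "1080p ultra" tier mentioned above
for part, share in split.items():
    print(f"{part:<12} ${budget * share:7.2f}")
```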
That said, I’m not suggesting you instantly go out, buy new parts and spend a lot of money. If you could tell us your PC’s specs in a bit more detail, we could help more easily.