
Battlefield 2



Here are the system requirements of the game:

- Minimum Specification:

CPU: 1.7 GHz

RAM: 512 MB

Video Card: NVIDIA GeForce FX 5700, ATI Radeon 8500, or ATI Radeon 9500, with 128 MB of RAM

- Recommended Specification:

CPU: 2.4 GHz

RAM: 1 GB

Video Card with at least 256 MB of RAM

As you can see, your system covers the minimum requirements, so you can play the game :)
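If you want to check your own machine against those numbers, here is a tiny Python sketch of the comparison (the example system values below are made up, not anyone's actual specs):

```python
# Hypothetical spec check against the minimum requirements quoted above.
MINIMUM = {"cpu_ghz": 1.7, "ram_mb": 512, "vram_mb": 128}
RECOMMENDED = {"cpu_ghz": 2.4, "ram_mb": 1024, "vram_mb": 256}

def meets(spec, system):
    """True if every value in `system` meets or exceeds the spec."""
    return all(system[key] >= value for key, value in spec.items())

# Example system, loosely like the ones discussed in this thread (made up).
my_system = {"cpu_ghz": 2.2, "ram_mb": 1024, "vram_mb": 128}

print(meets(MINIMUM, my_system))      # True: covers the minimum spec
print(meets(RECOMMENDED, my_system))  # False: 128 MB VRAM < 256 MB
```

Swap in your own numbers; the same `meets` check works against either spec.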


It won't run, and if it does it won't be playable.

I have an AMD64 3500+, 1024 MB of OCZ low-latency memory, and two 6600GTs in SLI, and I can barely play it on medium settings, lol.

Good thing I'm getting my 7800 Ultra when it comes out.


Strange... I have a lower system than that and mine runs fine on medium settings (a few options on high)...

AMD64 3000+, 1024 MB RAM, and only one 6600GT (PCI Express).

You need to sort your system out  :P


Thanks guys

At the very least I will download the demo and double my RAM to 1 GB, then give it a go. The system seems to run BFV very well on medium, but when things get very busy it slows a little.


Meh, I know I'm right.

A friend of mine has the same setup as me but with just one 6600GT, and the only way for him to get an acceptable framerate (40-60) is to turn everything down to low with one or two settings on medium.

If you guys are playing on medium settings with just one 6600GT, then it will be laggy... maybe you're used to it, or maybe you don't even notice it. But I can assure you that your framerates won't pass 25 (which is unplayable for me).

I have a few benchmarks from several hardware sites that show this too.


I really don't see how that's possible... with a 1.86 GHz Pentium M (a single-threaded, 32-bit processor), 2 GB of RAM, and a 256 MB GeForce Go6800, I can play Battlefield 2 on maximum settings at 1440x900, including 4xS antialiasing and 4x anisotropic filtering, and maintain over 30 FPS at all times. The 6600GT can't be THAT much weaker than the Go6800.


You clearly don't know much about graphics cards.

Of course the 6800 chip is at least twice as good as its smaller sibling, the 6600.

On top of that, the Go6800 is even faster (by about 10%) than the original 6800 Ultra.

The Pentium M series is by far the best choice for gaming (for Intel, that is) because of its low power consumption, and yes, believe it or not, they actually do pretty well against AMD processors.

And 32 vs. 64 bit doesn't mean anything: the game doesn't support it, the OS doesn't either, and even if it did, it still wouldn't mean shit since the engine doesn't take advantage of it.

I'm getting bottlenecked by the 128-bit bus and the 128 MB of memory.

Which is why I can't go over medium settings. Easy. And 30 FPS sounds low for good quality; you should test it.


I don't know where on earth you're getting the idea that the Go6800 can outperform a 6800 Ultra... it can't even keep up with the 6800GT: http://www.tomshardware.com/mobile/20041108/index.html

Also, as you can see here, dual 6600GTs can reach 3DMark05 scores of comfortably over 5400; the Go6800 can reach maybe 4200 with modded drivers and substantial overclocking (I get 4062).

Also, I'm aware that the benefits of the 2 MB L2 cache, shorter pipeline, and power-optimized architecture of the Pentium M make it, clock for clock, a better gaming processor, but it does fall short once a lot of extraneous calculations are added (e.g., 15 bots in an already intensive game like Battlefield 2).

Long story short, I don't know how you're measuring playability, but my old Mobility Radeon 9600 with 64 MB of VRAM could maintain playability at medium-low settings at 1280x800, so dual 6600GTs should be able to blow that away.


You use 3DMark05 to measure your gaming performance? Very bad idea.

Your example of a 6600GT SLI system with 5400 3DMarks sounds low. Very low.

My 6600GT SLI setup manages to get 7219 in 3DMark05 on default settings at 1024x768, and my 7800GTX gets around 7000 on the same settings. When I push the limits a bit further, say supersampling at high resolution (1280x1024/70) with all settings maxed out, I was surprised it ran at ALL, but yeah, I managed to get 2030 3DMarks with my *gosu* 6600GTs.

My 7800GTX reached around 6000 3DMarks on those settings.

Long story short, don't use synthetic benchmarks; they SUCK and they won't help you make a better choice.

The CS:S stress test or a Doom 3 timedemo are good for measuring FPS, which is what it's all about in the end.
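For what it's worth, the average FPS a timedemo reports is total frames divided by total time, not the mean of per-frame FPS values. A small Python sketch with invented frame times shows why that distinction matters:

```python
# Average FPS the way a timedemo computes it: frames / total time.
# The frame times below are invented for illustration.
frame_times_ms = [16.0, 16.0, 16.0, 100.0]  # three smooth frames, one stutter

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s
print(round(avg_fps, 1))  # 27.0 -- the stutter drags the average down

# Averaging per-frame FPS instead overstates smoothness:
naive_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)
print(round(naive_fps, 1))  # 49.4 -- hides the stutter entirely
```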

The Go cards are indeed faster: they have lower power consumption, which means less heat, which allows higher clocks.

The fact that they score fewer 3DMarks doesn't mean shit; it just shows what a shitty benchmark it is.


3DMark05 is actually quite good, because it executes snippets of (poorly optimized/coded) real-time rendered 3D movies, which behave similarly to real games.

Have you even read anything I said?

@Dukeleto

I still don't see your point. They are comparing desktop cards to laptop ones, which is absurd to begin with.

Laptops always have crappy cache, RAM, and so on, so that benchmark is pointless.

The Go6800 simply PWNZ every other mobile graphics card in that list.

Anyway, I was talking about some tweaker who managed to use a Go6800 in his desktop (same idea as a Pentium M in a desktop PC), and it ROCKED: he was able to overclock it much higher than the desktop 6800 Ultra on stock cooling, which led to much better FPS for the Go card, leaving the Ultra far behind.


Have you even read anything I said?

Uh, yes, which is exactly why I said what I said...

The talk about the Go series is just nonsense. Go is a laptop card that, according to the tests at Tom's Hardware, doesn't outperform even the 6600GT.

However, what the benchmark is trying to say is that the laptop cards are closing in on the desktop cards in performance. Of course, they haven't caught up yet. Furthermore, the 6800 is not twice as good as the 6600 (it seems to be about 50% better on average). Why should it be? How do you measure performance?

There are two ways to tell if a card is twice as good as another card:

* It runs at twice the FPS with the exact same settings.

* It runs at considerably more FPS and draws nicer graphics.
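The first criterion is just an FPS ratio at identical settings. A minimal Python sketch (the FPS numbers are illustrative only, not benchmark results):

```python
# Relative performance as an FPS ratio at identical settings.
# The FPS numbers are illustrative, not measured benchmark results.
def speedup(fps_a, fps_b):
    """How many times faster card A runs than card B."""
    return fps_a / fps_b

ratio = speedup(60.0, 40.0)  # e.g. 60 FPS vs 40 FPS on the same settings
print(ratio)          # 1.5 -- "50% better", not twice as good
print(ratio >= 2.0)   # False
```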


For one, you base your opinion on one benchmark?

Secondly, Go cards ARE better, for the exact reason I mentioned.

And thirdly, of course the 6800 is twice as good, for a very simple reason: IT HAS DOUBLE OF EVERYTHING.



ya right whatever lol

Go play Doom 3 all maxed out with a 6600GT and try to get above 20 FPS; you WON'T.

The 6800 Ultra, on the other hand, will manage 40 FPS with no trouble at all.

THAT'S my point. I don't give a shit about some comparison between an ATI and an NVIDIA card; they are NOTHING alike.


Double of everything is not twice as good.

A good example is AMD vs Intel.

Why isn't the 3.8 GHz Intel almost twice as good as the 2 GHz AMD? ::)

And I don't base my opinions on one benchmark. I frequently visit Tom's Hardware, and they have countless benchmarks.
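That clock-speed point can be put in rough numbers: useful work per second is roughly instructions per clock (IPC) times clock rate. The IPC figures in this Python sketch are illustrative assumptions, not measured values for any real chip:

```python
# Rough model: useful work per second ~ IPC (instructions per clock) x clock.
# The IPC figures below are illustrative assumptions, not measured values.
def perf(ipc, clock_ghz):
    return ipc * clock_ghz

p4  = perf(ipc=1.0, clock_ghz=3.8)  # long-pipeline design, lower IPC
a64 = perf(ipc=2.0, clock_ghz=2.0)  # shorter pipeline, higher IPC

print(p4)   # 3.8
print(a64)  # 4.0 -- the 2 GHz chip edges out the 3.8 GHz chip
```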


I'm not sure why you keep insisting on comparing the 6600GT to the 6800 Ultra when there is a 6800GT. Also, you seem to have forgotten that you're running dual 6600GTs, and therefore, by your "twice-as-much-means-twice-the-performance" reasoning (however heinously flawed it may be), you should be getting the power of a 6800 Ultra.

And since we were originally trying to figure out why your dual-6600GT setup could barely handle BF2 on medium while my single Go6800 *NOT ULTRA!!!* setup can blow it away at maximum settings plus max AF and AA: the Go6800 has a slower fill rate than the 6600GT and has only 12 pixel pipelines, not 16. Multiple benchmarks show dual 6600GTs beating Go6800s by large margins. So I can only conclude that you seriously need to optimize your system.


@cyborg

Double of everything is not always twice as good; in this case, it is.

The 128-bit bottleneck of the 6600GT is so frustrating to me; it is the reason I can't play most games at high resolutions.

That is why the Ultra has double the performance: it has the 256-bit bus, which lets it render much faster.
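The bus argument can be sanity-checked with quick arithmetic: peak memory bandwidth is bus width (in bytes) times effective memory clock. This Python sketch uses the stock memory clocks I believe these cards shipped with; treat them as assumptions rather than verified specs:

```python
# Peak memory bandwidth = bus width (bytes) x effective memory clock.
# Memory clocks are assumed stock figures for these cards, not verified.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

gt_6600    = bandwidth_gb_s(128, 1000)  # 6600GT: 128-bit bus, ~1000 MHz effective
ultra_6800 = bandwidth_gb_s(256, 1100)  # 6800 Ultra: 256-bit bus, ~1100 MHz effective

print(round(gt_6600, 1))     # 16.0 GB/s
print(round(ultra_6800, 1))  # 35.2 GB/s -- more than double
```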

@dukeleto

I brought up the 6800 Ultra since you guys don't believe the Go cards are better; they are superior in every way.

Maybe not at stock speeds, I don't know.

Just to make everyone happy, I formatted and reinstalled Windows with NOTHING else except 3DMark05.

And I'm getting about the same score, 7120, which is about what the 7800GTX manages in 3DMark05.

Haven't tried Battlefield yet, but I have a feeling nothing has changed; the extremely large textures of the Battlefield maps just get choked by the 128-bit bus.

The fact that my dual 6600GTs get the same score as one 7800GTX just shows how broken 3DMark05 really is.

My guess is that having two GPU cores is why I'm getting such a high score; pretty primitive program, then.

And my twice-the-performance reasoning steps in right here: I recently purchased a 7800GTX, and it runs Battlefield smoother than ever, with a lower core frequency but double of EVERYTHING. I will measure FPS in a short while, once I find a dependable program, but I'm pretty sure it's T-W-I-C-E the FPS I was getting before, since I already tested this before I formatted.


What 3DMark05 does is test real rendering, like you find in games. That's why it is a generally accepted benchmark.

But there seems to be one thing you don't know about graphics cards, and that's the fact that settings matter a lot.

The 6600 in SLI may be just as good as the 7800GTX at some settings, but when you turn on heavy antialiasing and anisotropic filtering, you start to see a big gap. That may be why you see such a difference.

And again, double of everything is not always better.

Although, Avatar, if the doubled hardware is from the same manufacturer, and all of it actually gets used, then double of everything might lead to almost twice the performance.

