Posts posted by Freelandr

  1. According to the US Army, the number of bullets fired per hit scored in combat is ridiculous... Can't recall the exact number, but it was hundreds, even thousands fired per single hit.

    Now, Xenonauts are elite and I expect them to do better. But maybe accuracy is too high?

    According to figures released by the Department of Defense, the average number of rounds expended in Vietnam to kill one enemy soldier with the M-16 was 50,000. The average number of rounds expended by U.S. military snipers to kill one enemy soldier was 1.3 rounds. That's a cost difference of $23,000 per kill for the average soldier, vs. $0.17 per kill for the military sniper.

    According to the U.S. Army, the average soldier will hit a man-sized target 10 percent of the time at 300 meters using the M16A2 rifle.

    But that was Vietnam. I wonder what the averages are now.
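    Just as a rough sanity check on the quoted figures, here's a quick back-of-the-envelope sketch in Python. The per-round costs below are only what the quote itself implies (cost per kill divided by rounds per kill), not separate data:

    ```python
    # Back-of-the-envelope check of the quoted Vietnam-era figures.
    # All numbers come from the quote above; the per-round costs are derived,
    # not official data.
    rounds_per_kill_rifle = 50_000   # quoted average for the M-16
    rounds_per_kill_sniper = 1.3     # quoted average for snipers

    cost_per_kill_rifle = 23_000.00  # quoted cost per kill, average soldier
    cost_per_kill_sniper = 0.17      # quoted cost per kill, sniper

    # Implied cost per round, taking the quote at face value.
    print(f"rifle:  ~${cost_per_kill_rifle / rounds_per_kill_rifle:.2f} per round")
    print(f"sniper: ~${cost_per_kill_sniper / rounds_per_kill_sniper:.2f} per round")

    # And the quoted 10% hit chance at 300 m with the M16A2:
    hit_chance = 0.10
    print(f"expected shots per hit at 300 m: {1 / hit_chance:.0f}")
    ```

    So even at the quoted 10% hit chance you'd expect roughly ten shots per hit at 300 m, which is still far better than the rounds-per-kill figure for the average rifleman.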

  2. I approve of Gigabyte. Solid performance, fair price, and the GTX 670 that I use has three fans on it. Under a full load it stays under 60C at <60% fan speed. This particular card is around $400, but Gigabyte also makes a 660 Ti at $300 with the same cooler style. Personally, I never liked the blower style that EVGA seems to butter their bread with - too loud.

    I've been addicted to nVidia chipsets for years (CUDA support for Premiere certainly helps, too), but the ATI Radeon HD 7000s seem to be incredibly popular with gamers. I think it might be the large amounts of memory and bit-depth, not to mention the low price. I'm just not sold on the drivers. It seems like nVidia rolls out updates almost monthly with big performance improvements, whereas ATI does them sporadically, but again: price. I did own an ATI card back in the day and it did what I needed it to. Gigabyte has a tri-cooler-style Radeon HD 7870 for $250.

    Note: Keep in mind that most modern video cards over $200 take up two slots on the motherboard; often they'll cover up a PCI-E x1 slot or a standard PCI slot. Also keep in mind power: modern cards draw around 300W under full load (the 8800 was around 200W; manufacturers list the power-supply requirement higher to account for the CPU as well), and some require two 6-pin power connectors.

    I do a fair bit of CUDA@Home stuff too, so I want to stay with Nvidia rather than Radeon.

    I know my board can handle the size of the card; I've got a custom-built machine, even if it is old, and my 650W power supply has the additional modular plug for extra power. All will be good there.
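    For what it's worth, here's a rough power-budget sketch in Python. Only the ~300W full-load GPU figure comes from the quote above; the CPU and motherboard numbers are assumed ballpark values, not specs for my actual parts:

    ```python
    # Rough power-budget sanity check for a 650W PSU.
    # All wattages below are illustrative estimates, not manufacturer specs.
    psu_watts = 650

    estimated_draw = {
        "GPU under full load": 300,             # ballpark figure from the post above
        "CPU under load": 130,                  # assumed
        "motherboard, RAM, drives, fans": 100,  # assumed
    }

    total = sum(estimated_draw.values())
    headroom = psu_watts - total

    print(f"estimated system draw: {total} W")
    print(f"headroom on a {psu_watts} W PSU: {headroom} W "
          f"({headroom / psu_watts:.0%} spare)")
    ```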

  3. I've never had them, so I will take your word for it.

    What are those biccies called that have choc on top and an orange-flavoured jelly in the middle?

    I used to have them when I lived in the UK; don't know if I've seen them in Aus.

    Something cakes, they were called... dammit, I can't remember.

    but they were yum too

  4. At this stage of the game, though, the team would literally be saying, "sorry guys, release won't be for at least a year yet because we have to recode everything to work with the new engine." It'd be like if you had a project that you were working on in VB (God forbid) and then somebody just decided, "hey, why not use C#. C# is nice and easy and everybody seems to like C# these days."

    Let's just imagine for a few minutes that Chris and the team actually decided to do that: scrap the whole lot and recode it in a different language. What would that achieve for the game? Would it be better for any reason?

    This was just a random thought as I was trolling the forum. I have zero coding knowledge and do not pretend to have any; I have no idea why people use different languages for different things. I'm just curious.
